CN104272253A - Method and system for controlling display device and computer-readable recording medium - Google Patents


Info

Publication number
CN104272253A
CN104272253A (application CN201380018952.XA / CN201380018952A)
Authority
CN
China
Prior art keywords
display
icon
information
screen
display device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201380018952.XA
Other languages
Chinese (zh)
Inventor
全炳贞
郑渊建
申仁暎
全惠暎
崔善
崔源宗
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority claimed from PCT/KR2013/002906 external-priority patent/WO2013151399A1/en
Publication of CN104272253A publication Critical patent/CN104272253A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display

Abstract

A portable device and a method for controlling a display device include receiving first display information of a first arrangement of icons displayed on a screen of the display device, displaying the first arrangement of icons on a display of a portable device based on the first display information, modifying the first arrangement of icons displayed on the display of the portable device to generate a second arrangement of icons, generating second display information based on the second arrangement of icons, and transmitting to the display device a request to display the second arrangement of icons on the display of the display device, the request comprising the second display information.

Description

Method and system for controlling a display device, and computer-readable recording medium
Technical Field
Methods and apparatuses consistent with exemplary embodiments relate to controlling a display device, and more particularly, to a method and system for remotely editing a screen of a display device.
Background Art
The functions of display devices having communication functions have diversified. For example, a digital television (TV) can perform various functions, such as web browsing, application browsing, content browsing, and receiving broadcast content.
Summary of the invention
Technical Problem
There is a need to remotely control the various functions of a display device.
Solution to Problem
One or more exemplary embodiments provide a method and system for controlling a screen displayed on a display device by using a portable device.
One or more exemplary embodiments also provide a method and system for controlling a screen including a plurality of applications displayed on a display device by using a portable device.
One or more exemplary embodiments also provide a method and system for controlling a display device, based on a screen displayed on the display device, by using a portable device.
Advantageous Effects of the Invention
According to the above embodiments, a portable device can be used to remotely control the screen and/or functions of a display device.
Brief Description of the Drawings
The above and/or other aspects of the present invention will become more apparent from the following detailed description of exemplary embodiments with reference to the accompanying drawings, in which:
FIG. 1A is a block diagram of a system for controlling screen editing and/or a function of a display device, according to an exemplary embodiment;
FIG. 1B is a block diagram of a system for controlling screen editing and/or a function of a display device, according to another exemplary embodiment;
FIG. 2A is a block diagram of a portable device, according to an exemplary embodiment;
FIG. 2B is a block diagram of a portable device, according to another exemplary embodiment;
FIG. 3 is a block diagram of a display device, according to an exemplary embodiment;
FIG. 4 is a flowchart of a method of controlling screen editing of a display device, according to an exemplary embodiment;
FIGS. 5A through 5D are diagrams illustrating screens of a display device and a portable device according to the method of FIG. 4;
FIG. 6 is a flowchart of a process of generating the second arrangement in the method of FIG. 4, according to an exemplary embodiment;
FIG. 7 is a flowchart of an operation of a portable device in a method of controlling screen editing of a display device, according to an exemplary embodiment;
FIG. 8 is a flowchart of a method of controlling screen editing of a display device, according to another exemplary embodiment;
FIG. 9 is a flowchart of a method of controlling screen editing of a display device, according to another exemplary embodiment;
FIG. 10 is a flowchart of a method of controlling screen editing of a display device, according to another exemplary embodiment;
FIG. 11 is a flowchart of a method of controlling screen editing and a function of a display device, according to an exemplary embodiment;
FIG. 12 is a diagram illustrating controlling a function of a display device;
FIG. 13 is a flowchart of a method of controlling screen editing of a display device, according to another exemplary embodiment;
FIG. 14 is a flowchart of a method of controlling screen editing of a display device, according to another exemplary embodiment;
FIG. 15 is a flowchart of a method of controlling screen editing of a display device, according to another exemplary embodiment;
FIG. 16 is a flowchart of a method of controlling screen editing of a display device, according to another exemplary embodiment;
FIG. 17 is a flowchart of a method of controlling screen editing of a display device, according to another exemplary embodiment;
FIG. 18 is a flowchart of a method of executing a function of a display device, according to an exemplary embodiment;
FIG. 19 is a flowchart of a method of controlling screen editing of a display device, according to another exemplary embodiment;
FIG. 20 is a flowchart of a method of controlling screen editing of a display device by using a server, according to an exemplary embodiment;
FIG. 21 is a flowchart of a method of controlling screen editing of a display device, according to another exemplary embodiment;
FIG. 22 is a flowchart of a method of controlling screen editing of a display device, according to another exemplary embodiment.
Description of Embodiments
Best Mode for Carrying Out the Invention
According to an aspect of an exemplary embodiment, there is provided a method of controlling a display device, the method including: receiving first display information of a first arrangement of icons displayed on a screen of the display device (or indicating or determining the first arrangement of the icons displayed on the screen of the display device); displaying the first arrangement of the icons on a display of a portable device based on the first display information; modifying the first arrangement of the icons displayed on the display of the portable device to generate a second arrangement of the icons; generating second display information based on the second arrangement of the icons; and transmitting to the display device a request to display the second arrangement of the icons on the display of the display device, wherein the request includes the second display information.
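The claimed control flow can be summarized as: receive the first arrangement, apply the user's edits, then transmit a request carrying the second display information. The following is a minimal sketch of that flow; the edit-tuple format, function names, and request shape are illustrative assumptions, not part of the patent.

```python
def modify_arrangement(first, edits):
    """Apply move/delete/add edits to an icon arrangement.

    `first` maps icon names to (x, y) positions; `edits` is a list of
    ("move", name, pos), ("delete", name), or ("add", name, pos) tuples.
    This representation is a hypothetical example, not the patent's.
    """
    second = dict(first)
    for edit in edits:
        if edit[0] == "move":
            second[edit[1]] = edit[2]
        elif edit[0] == "delete":
            del second[edit[1]]
        elif edit[0] == "add":
            second[edit[1]] = edit[2]
    return second


def build_request(second):
    # The request transmitted to the display device carries the second
    # display information (here, the full new arrangement).
    return {"action": "display", "display_info": second}


# First arrangement received from the display device, then edited.
first = {"web": (0, 0), "tv": (1, 0)}
request = build_request(
    modify_arrangement(first, [("move", "web", (1, 1)), ("add", "cam", (0, 1))])
)
```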
The first display information may include display position information of the icons, wherein the display position information indicates positions at which the icons are displayed on the screen of the display device.
The display position information may include coordinate information indicating absolute positions of the icons, displayed at coordinate positions on the screen of the display device, with reference to the coordinates of the screen of the display device.
The display position information may include coordinate information indicating relative positions of the icons with respect to one another on the screen of the display device.
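The distinction between the two coordinate schemes above can be illustrated by converting absolute screen coordinates into grid-relative positions. This is a sketch under assumed units (pixels, fixed cell size); the patent does not fix any particular coordinate scheme.

```python
def to_relative(absolute, cell_w, cell_h):
    """Convert absolute pixel positions (x, y) into grid-relative
    (column, row) positions. Cell dimensions are assumed values."""
    return {name: (x // cell_w, y // cell_h)
            for name, (x, y) in absolute.items()}


# Hypothetical absolute positions on a display device's screen.
absolute = {"web": (0, 0), "tv": (320, 0), "cam": (0, 180)}
relative = to_relative(absolute, cell_w=320, cell_h=180)
# relative: {"web": (0, 0), "tv": (1, 0), "cam": (0, 1)}
```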
The first display information may further include icon image data of the icons.
The first display information may be extensible markup language (XML) data.
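As a concrete illustration of XML-based first display information, the sketch below parses a hypothetical document into icon positions. The patent only states that the information may be XML; this particular schema (`<screen>`, `<icon>`, `x`/`y` attributes) is an assumption.

```python
import xml.etree.ElementTree as ET

# Hypothetical XML shape for the first display information.
FIRST_DISPLAY_INFO = """\
<screen>
  <icon id="web" x="0" y="0"><image>web.png</image></icon>
  <icon id="tv" x="320" y="0"><image>tv.png</image></icon>
</screen>"""

root = ET.fromstring(FIRST_DISPLAY_INFO)
# Extract the display position information for each icon.
positions = {icon.get("id"): (int(icon.get("x")), int(icon.get("y")))
             for icon in root.iter("icon")}
```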
The second display information may be (or include, or correspond to) the first display information modified to indicate new positions at which the icons are to be displayed on the screen of the display device.
The second display information may be (or include, or correspond to) a difference between the first display information and the new positions at which the icons are to be displayed on the screen of the display device.
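The difference-based variant can be sketched as follows: only entries whose positions changed are included in the second display information, reducing what must be transmitted. The dictionary format is an illustrative assumption.

```python
def diff_display_info(first, new_positions):
    """Return only the icons whose positions changed, i.e. the
    difference between the first display information and the new
    positions at which the icons are to be displayed."""
    return {name: pos for name, pos in new_positions.items()
            if first.get(name) != pos}


first = {"web": (0, 0), "tv": (1, 0), "cam": (0, 1)}
new = {"web": (1, 1), "tv": (1, 0), "cam": (0, 1)}
second_info = diff_display_info(first, new)  # only "web" moved
```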
The method may further include displaying the second arrangement of the icons on the display of the portable device while the display device displays the first arrangement of the icons on the display of the display device.
The modifying may include at least one of: changing a position of at least one icon in the first arrangement of the icons to a new position of the at least one icon in the second arrangement of the icons; deleting at least one icon from the icons in the first arrangement; and adding a new icon to the icons in the first arrangement.
The first display information may be extensible markup language (XML) data, and the modifying may include modifying the XML data based on at least one of the changing, the deleting, and the adding.
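The three edit operations (changing a position, deleting, adding) applied directly to the XML data can be sketched with the standard library's `xml.etree.ElementTree`; the element and attribute names are the same hypothetical schema assumed earlier, not one prescribed by the patent.

```python
import xml.etree.ElementTree as ET

xml_src = ('<screen>'
           '<icon id="web" x="0" y="0"/>'
           '<icon id="tv" x="320" y="0"/>'
           '</screen>')
root = ET.fromstring(xml_src)

# Changing a position: move "web" to a new coordinate.
for icon in root.iter("icon"):
    if icon.get("id") == "web":
        icon.set("x", "320")
        icon.set("y", "180")

# Deleting: remove "tv" from the arrangement.
for icon in list(root):
    if icon.get("id") == "tv":
        root.remove(icon)

# Adding: append a new icon to the arrangement.
ET.SubElement(root, "icon", {"id": "cam", "x": "0", "y": "180"})

# The modified XML serves as the second display information.
second_display_info = ET.tostring(root, encoding="unicode")
```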
The modifying may include: receiving an input on the display of the portable device, wherein the input is for modifying the icons in the first arrangement; and generating the second arrangement of the icons based on the input.
The icons may include icons representing content to be reproduced by the display device and icons representing applications of the display device.
According to an aspect of another exemplary embodiment, there is provided a portable device including: a display; a communication unit which receives first display information of a first arrangement of icons displayed on a screen of a display device (or indicating or determining the first arrangement of the icons displayed on the screen of the display device); and a controller which controls the display to display the first arrangement of the icons based on the first display information, receives an input modifying the first arrangement of the icons displayed on the display, generates a second arrangement of the icons based on the input, generates second display information based on the second arrangement of the icons, and controls the communication unit to transmit to the display device a request to display the second arrangement of the icons on the display of the display device, wherein the request includes the second display information.
According to an aspect of another exemplary embodiment, there is provided a non-transitory computer-readable medium having recorded thereon a program which causes a portable device to execute a method of editing a screen of a display device, the method including: receiving first display information of a first arrangement of icons displayed on the screen of the display device (or indicating or determining the first arrangement of the icons displayed on the screen of the display device); displaying the first arrangement of the icons on a display of the portable device based on the first display information; modifying the first arrangement of the icons displayed on the display of the portable device to generate a second arrangement of the icons; generating second display information based on the second arrangement of the icons; and transmitting to the display device a request to display the second arrangement of the icons on the display of the display device, wherein the request includes the second display information.
Mode for Carrying Out the Invention
As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items. Expressions such as "at least one of," when preceding a list of elements, modify the entire list of elements and do not modify the individual elements of the list.
Exemplary embodiments will now be described more fully with reference to the accompanying drawings. It should be understood, however, that the exemplary embodiments are not limited to the particular forms disclosed; rather, they cover all modifications, equivalents, and substitutes falling within the spirit and scope of the disclosure. In the description of the exemplary embodiments, detailed explanations of related well-known functions or configurations are omitted when they may unnecessarily obscure the inventive concept.
Although the terms "first," "second," etc., and "primary," "secondary," etc. may be used herein, these terms do not denote any order, quantity, or importance, but are used to distinguish one element, region, component, layer, or section from another.
Most of the terms used herein are general terms widely used in the art to which the disclosure pertains. However, some terms may have been created to reflect the intentions of those skilled in the art, precedents, or new technologies, and some terms may have been arbitrarily selected. In such cases, the terms are described in detail below and should be understood based on their specific meanings and the context of the disclosure.
As used herein, the singular forms are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising" specify the presence of stated features, integers, steps, operations, elements, components, and/or combinations thereof, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or combinations thereof.
As used herein, the expressions "first arrangement" and "second arrangement" respectively indicate a one-dimensional (1D) or two-dimensional (2D) arrangement based on positions of information included in a displayed screen. Examples of the displayed screen may include a menu screen, an application browsing screen, a content browsing screen, a web browsing screen, and a peripheral device control screen.
As used herein, the term "object" indicates a piece of information included in a displayed screen. Examples of the information included in the screen may include an image of an application, an image of content, an image of a web page, and an image of a peripheral device. Each of these images may be an iconic image clearly representing the corresponding application, content, web page, or peripheral device, or an image displayed as a menu item. The image of an application may correspond to, for example, an icon of the application.
The expression "information of an object" indicates information of an application, content, a web page, or a peripheral device. The information of an object may include an iconic image (for example, an icon or a thumbnail) or an image displayed as a menu item. As used herein, the information of an object may include both information included in the screen and information not included in the screen. Examples of the information included in the screen may include at least one of an iconic image, a thumbnail, and an image displayed as a menu item. The information not included in the screen may include at least one of, for example, display position information, application specification information (or guide information), application preview information, and application edit control information (for example, move, erase, copy, arrange, add, etc.). The application specification information may include a title of the application, a type of the application, and information indicating the content of the application.
Similarly to the information of an application, the information of content, the information of a web page, and the information of a peripheral device may each include information included in the screen and information not included in the screen. The information not included in the screen indicates information that is not displayed on the screen, and may indicate metadata or attribute information.
A peripheral device indicates a device connectable to the display device. Examples of the peripheral device may include, for example, a media recording/reproducing apparatus, a printer, a camera, and an audio device.
As used herein, the expressions "first arrangement" and "second arrangement" both indicate a layout of information on a screen. That is, each of the first arrangement and the second arrangement may indicate a layout of applications displayed as menu items on the screen, a layout of content on the screen, a layout of web pages displayed as menu items on the screen, or a layout of peripheral devices displayed as menu items on the screen.
The first arrangement indicates a layout based on the screen displayed on the display device. The second arrangement indicates a layout generated by controlling screen editing from the portable device. Accordingly, the second arrangement may indicate an arrangement of information of the entire screen, including information obtained by changing the information included in the screen based on the first arrangement. Alternatively, the second arrangement may indicate only the layout of the changed information on the screen.
Controlling screen editing based on information of a plurality of objects displayed on the display device may be construed as controlling editing of the screen displayed on the display device, or as controlling the display device.
As used herein, the expression "user input information" may depend on a user's gesture. The user's gesture may be defined according to the input device. That is, when the input device is based on a touch screen, examples of the user's gesture may include, but are not limited to, a tap, a touch and hold, a double tap, a drag, a pan, a flick, a drag and drop, and a sweep. The user's gesture based on the touch screen may be performed by using the user's finger or a touch tool (for example, a stylus pen).
When the input device is based on a camera, examples of the user's gesture may include a space gesture based on an image captured by the camera. Examples of the user's gesture may also include a space gesture based on a movement of the portable device (for example, shaking of the portable device).
When the input device includes at least one of a physical button, a dial, a slide switch, a joystick, and a click wheel, the user input information may depend on the user's physical manipulation of the input device. When the input device is based on an audio input device, the user input information may depend on natural-language-based speech recognition of the user's voice.
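How touch gestures might map to edit operations can be sketched with a simple dispatch table. The gesture names follow the list above, but the particular gesture-to-action mapping is an invented example; the patent does not assign specific gestures to specific edits.

```python
# Hypothetical mapping from touch gestures to screen-edit actions.
GESTURE_ACTIONS = {
    "touch_and_hold": "select icon",
    "drag_and_drop": "move icon",
    "double_tap": "open preview",
    "flick": "scroll arrangement",
}


def handle_gesture(gesture):
    """Look up the edit action for a gesture; unknown gestures are ignored."""
    return GESTURE_ACTIONS.get(gesture, "ignored")
```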
Exemplary embodiments will now be described more fully with reference to the accompanying drawings. In the drawings, like or corresponding elements are denoted by like or corresponding reference numerals, and repeated descriptions thereof are omitted.
FIG. 1A is a block diagram of a system 100 for editing a screen of a display device 120, according to an exemplary embodiment. The block diagram of FIG. 1A may illustrate a network configuration among a portable device 110, the display device 120, and a server 130.
Referring to FIG. 1A, the system 100 includes the portable device 110, the display device 120, and the server 130.
The system is not limited to the system 100 shown in FIG. 1A, and may instead be implemented as a system 101 shown in FIG. 1B. As shown in FIG. 1B, the system 101 includes a portable device 111 and a display device 121. The portable devices 110 and 111 and the display devices 120 and 121 may be respectively referred to as first devices and second devices. The portable devices 110 and 111 may be devices for controlling screen editing of the display devices 120 and 121, or devices for controlling the display devices 120 and 121.
Examples of the portable devices 110 and 111 of FIGS. 1A and 1B may include, but are not limited to, a smartphone, a notebook computer, a tablet, a mobile device, a handheld device, a handheld PC, a phablet, and a personal digital assistant (PDA).
The portable device 110 of FIG. 1A receives, from the display device 120 or the server 130, information of a screen displayed on the display device 120. The portable device 110 generates an edit control signal for editing the screen displayed on the display device 120, based on the information of the screen. The portable device 110 may also generate a function control signal for controlling a function of the display device 120. The portable device 110 transmits the edit control signal or the function control signal to the display device 120 or the server 130.
The portable device 111 of FIG. 1B receives, from the display device 121, information of a screen displayed on the display device 121. The portable device 111 generates an edit control signal or a function control signal for the screen displayed on the display device 121, based on the information of the screen. The portable device 111 transmits the edit control signal or the function control signal to the display device 121.
Examples of the display devices 120 and 121 of FIGS. 1A and 1B may include, but are not limited to, a TV having a communication function and a digital consumer electronics (CE) device having communication and display functions. Examples of the digital CE device may include, but are not limited to, a digital TV, a refrigerator having a communication function, and an Internet protocol TV (IPTV).
When a display information request signal is received, the display device 120 of FIG. 1A transmits the information of the displayed screen to the portable device 110 or the server 130. When an edit control signal or a function control signal for the information of the displayed screen is received from the portable device 110 or the server 130, the display device 120 edits the displayed screen according to the edit control signal, or controls a function based on the displayed screen according to the function control signal. Editing the displayed screen is a control for adjusting the displayed screen so as to display new information or to arrange the information displayed on the screen differently. The function control signal based on the screen may be, for example, an application execution control signal for executing an application involved in the displayed screen.
When a display information request signal is received, the display device 121 of FIG. 1B transmits the information of the displayed screen to the portable device 111. When an edit control signal or a function control signal for the information of the displayed screen is received from the portable device 111, the display device 121 may edit the displayed screen according to the edit control signal, or control a function based on the displayed screen according to the function control signal. The function control signal based on the screen may be, for example, an application execution control signal for executing an application involved in the displayed screen.
The server 130 of FIG. 1A communicates with the portable device 110 and the display device 120 to control screen editing and/or a function of the display device 120.
The server 130 includes a communication unit 131, a storage unit 132, and a processor 133. The communication unit 131 may perform wired or wireless communication with the portable device 110 and the display device 120. Accordingly, the communication unit 131 may be configured and operated in a manner similar to a communication unit 208 included in the portable device 110, which will be described below.
The storage unit 132 stores at least one program executable by the processor 133 and program resources. In particular, the storage unit 132 stores edit history information of the screen displayed on the display device 120, thereby storing a history of the edits made to the screen displayed on the display device 120. The edit history information may be stored in association with a user, a portable device, and an edited object. In addition, the edit history information may be stored in association with a path or information through which erased or edited information can be restored.
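The edit-history store described above can be sketched as follows: each record associates the user, the portable device, and the edited object with enough pre-edit state to restore erased or edited information. The field names and in-memory list are illustrative assumptions about one possible implementation.

```python
import time


class EditHistory:
    """Minimal sketch of the storage unit's edit history."""

    def __init__(self):
        self.records = []

    def log(self, user, device, obj, before, after):
        # Each record keeps the pre-edit state so the edit can be undone.
        self.records.append({
            "user": user, "device": device, "object": obj,
            "restore_info": before,
            "result": after,
            "timestamp": time.time(),
        })

    def restore(self, obj):
        # Return the most recent pre-edit state of an object, if any.
        for rec in reversed(self.records):
            if rec["object"] == obj:
                return rec["restore_info"]
        return None
```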
The processor 133 may include at least one processor. By loading a program stored in the storage unit 132, the processor 133 receives from the portable device 110 a request signal for information of a plurality of applications displayed on the display device 120 according to a first arrangement, receives the information of the plurality of applications from the display device 120, transmits the information of the plurality of applications to the portable device 110, and transmits information of a second arrangement of the plurality of applications, received from the portable device 110, to the display device 120. The information of the plurality of applications indicates the screen information displayed on the display device 120.
When the system is the system 100 including the portable device 110, the display device 120, and the server 130 as shown in FIG. 1A, the portable device 110 and the display device 120 transmit and receive the information for controlling screen editing and/or functions through the server 130.
Examples of the server 130 may include, but are not limited to, a cloud server and a home gateway.
When the system is the system 101 including the portable device 111 and the display device 121 as shown in FIG. 1B, the information for controlling screen editing and/or functions may be transmitted and received directly between the portable device 111 and the display device 121, without using the server 130 as an intermediary.
FIG. 2A is a block diagram of each of the portable devices 110 and 111 of FIGS. 1A and 1B, according to an exemplary embodiment.
Referring to FIG. 2A, each of the portable devices 110 and 111 includes a user input unit 201, a sensing unit 202, a touch screen 203, a camera 204, an audio input unit 205, an audio output unit 206, a storage unit 207, a communication unit 208, a port 209, a processor 210, and a power supply unit 211. However, the structure of each of the portable devices 110 and 111 is not limited thereto.
The user input unit 201 receives an input (or control data) for controlling an operation of each of the portable devices 110 and 111. The user input unit 201 may include at least one of a keypad, a dome switch, a touch pad capable of replacing a mouse, a jog wheel, a jog switch, and a hardware (H/W) button. The sensing unit 202 detects a current state of each of the portable devices 110 and 111 (for example, the position of the portable device, whether the user touches the portable device, the orientation of the portable device, or the acceleration or deceleration of the portable device), and generates a sensing signal for controlling the operation of each of the portable devices 110 and 111.
The sensing unit 202 may include a proximity sensor. A proximity sensor detects, by using an electromagnetic field or infrared rays rather than mechanical contact, the presence of an object approaching a preset detection surface or an object present near the detection surface. Examples of the proximity sensor include a transmissive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.
By the user input data that the user's request or selection of depending on user's gesture produce based on touch-screen 203.One or more in the number of times that can occur according to touch, touch pattern (pattern), touch area, touch intensity or pressure defines user's gesture.Touch based on user's finger is interpreted as the touch at the user's body position of the touch area that can contact touch-screen 203.
In addition, touch-screen 203 can comprise touch for detecting touch-screen 203 or close to any sensor in the various sensors touched.The sensor be included in touch-screen 203 indicates such sensor: described sensor detects pattern on touch-screen 203 or user's gesture, and produce by sensing be identified as user's gesture dragging, flick, touch, touch and keep, touch for twice, translation or the signal that flicks and obtain.
A touch sensor is an example of a sensor for detecting a touch on the touch screen 203. The touch sensor can detect various pieces of information, such as the roughness of the contact surface, the hardness of the contacting object, and the temperature at the contact point. A touch on the touch screen 203 corresponds to the case where a pointing device touches the touch panel. A touch may include a multi-touch. A proximity touch on the touch screen 203 corresponds to the case where the pointing device does not actually touch the touch screen 203 but approaches to within a preset distance of the touch screen 203. A pointing device is an instrument for touching, or approaching, a specific portion of the touch screen 203. Examples of the pointing device include a stylus and a user's finger. The pointing device may be referred to as an external input device.
When screen editing is controlled, a stylus-based user interface (UI) may be provided. For example, a UI may be provided that displays a specific marker on information included in the screen displayed on the touch screen 203 (for example, a mark "X" when the information is erased), and that removes the information or moves the screen information by using the stylus (for example, an eyedropper effect). The eyedropper effect refers to a phenomenon in which, as if water were drawn into and then released from an eyedropper, the screen information appears to be sucked into the stylus and then dropped elsewhere.
The touch screen 203 outputs information processed by each of the portable devices 110 and 111. For example, the touch screen 203 displays a screen in response to at least one of the following: a signal detected by the sensing unit 202, user input information or control data input through the user input unit 201, or a touch pattern or user gesture sensed by the sensing unit included in the touch screen 203. The touch screen 203 may be referred to as an input/output (I/O) device. When the touch screen 203 is an I/O device, the screen displayed on the touch screen 203 includes a UI or a graphical user interface (GUI). The touch screen 203 displays screen information received from the display device 120 according to a first layout, receives user input information, and displays the changed screen information based on the user input information.
Examples of the touch screen 203 may include, but are not limited to, a liquid crystal display (LCD), a thin-film transistor (TFT) LCD, an organic light-emitting diode display, a flexible display, a 3D display, and an active-matrix organic light-emitting diode (AMOLED) display. The touch screen 203 may be referred to as a display. Two or more touch screens 203 may be provided according to the type of the portable devices 110 and 111.
The camera 204 processes image frames, such as still images or moving images obtained by an image sensor in a video call mode, a shooting mode, or the like. The processed image frames may be displayed on the touch screen 203. The image frames processed by the camera 204 may be stored in the storage unit 207 or transmitted to a destination device through the communication unit 208 or the port 209. According to the structure of each of the portable devices 110 and 111, the camera 204 may include two or more cameras. In addition, the camera 204 may be used as an input device for recognizing a user's spatial gestures.
The audio input unit 205 receives an external sound signal in a call mode, a video recording mode, a voice recognition mode, or the like, converts the external sound signal into electrical voice data, and transmits the electrical voice data to the processor 210. The audio input unit 205 may include, for example, a microphone. The audio input unit 205 may include various noise-removal algorithms for removing noise generated while receiving the external sound signal.
The voice signal input using the audio input unit 205 may be user input information for editing the screen of each of the display devices 120 and 121 or for controlling a function on the screen. That is, the voice signal input through the audio input unit 205 may be user input information obtained from natural-language-based voice recognition of the user's speech. The external sound signal input through the audio input unit 205 may be stored in the storage unit 207 or transmitted to a destination device through the communication unit 208 or the port 209.
Depending on the interface functions between the portable devices 110 and 111 and the user, each of the user input unit 201, the sensing unit 202, the touch screen 203, the camera 204, and the audio input unit 205 may be referred to as an input device or an I/O device. For example, when the interface functions between the portable devices 110 and 111 and the user include a touch recognition function, a voice recognition function, and a spatial gesture recognition function, the user input unit 201, the sensing unit 202, the camera 204, and the audio input unit 205 may each be referred to as an input device, and the touch screen 203 may be referred to as an I/O device.
The audio output unit 206 outputs a sound signal or an audio signal received from the outside in a call mode, an audio reproduction mode, or the like. The audio output unit 206 may include a speaker. The audio input unit 205 and the audio output unit 206 may be configured integrally, for example in the form of a headset.
The storage unit 207 stores at least one program, and program resources, that can be executed by the processor 210. Examples of the at least one program include: a program for performing a method of controlling a screen-based function and/or editing the screen of each of the display devices 120 and 121, an operating system program of each of the portable devices 110 and 111, applications of each of the portable devices 110 and 111, and programs needed to perform various functions of each of the portable devices 110 and 111 (for example, a communication function and a display function).
The example of program resource comprises: the information needed for executive routine, for controlling the UI screen message of on-screen editing, and the information of editor control function selected by replying.
The UI screen information for controlling screen editing may include: UI screen information for screen editing (moving, erasing, copying, adding, arranging, or backing up) of selected screen information. The UI screen information for controlling a function on the screen may include: UI screen information for executing the selected screen information. The UI screen information for controlling screen editing and the UI screen information for controlling a screen-based function may be provided together.
The program stored in the storage unit 207 for performing the method of controlling the screen editing and/or functions of each of the display devices 120 and 121 may include: an application for controlling the screen editing and/or functions of each of the display devices 120 and 121, or a remote control application that includes a screen editing menu and/or a function control menu for each of the display devices 120 and 121.
Accordingly, the method of controlling the screen editing and/or functions of each of the display devices 120 and 121 may be performed after the application for controlling screen editing and/or functions is executed, or after the remote control application is executed and the screen editing or function control is subsequently selected.
The storage unit 207 may include the following separate storage units: a storage unit that stores at least one program (including the operating system) needed to perform the various functions of each of the portable devices 110 and 111, and a storage unit that stores the applications and the at least one program and resources for performing the method of controlling the screen editing of the display devices 120 and 121.
In addition, the storage unit 207 may store edit history information. That is, the edit history information may be generated by the processor 210 as the second layout is produced, and may be stored in the storage unit 207. The edit history information may be generated on a per-edit basis, so that all edits can be known sequentially. Alternatively, the edit history information may be generated on a per-edited-screen basis, so that the content of the screen before editing and the content of the screen after editing can be known.
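The two edit-history granularities described above can be sketched as follows. This is a minimal illustration only; the class names, field names, and record shapes are assumptions for the sketch and are not specified by the patent.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class EditRecord:
    """One entry of edit history: the edit performed and the state around it."""
    action: str                    # e.g. "erase", "move", "back up"
    object_id: str
    before: Optional[dict] = None  # object/screen state before the edit
    after: Optional[dict] = None   # object/screen state after the edit

@dataclass
class EditHistory:
    records: list = field(default_factory=list)

    def log(self, action, object_id, before=None, after=None):
        self.records.append(EditRecord(action, object_id, before, after))

    def sequential_view(self):
        """Per-edit granularity: every edit, known sequentially."""
        return [(r.action, r.object_id) for r in self.records]

    def screen_view(self):
        """Per-screen granularity: screen state before and after editing."""
        if not self.records:
            return None, None
        return self.records[0].before, self.records[-1].after

history = EditHistory()
history.log("erase", "icon_3", before={"pos": (2, 0)}, after=None)
history.log("move", "icon_5", before={"pos": (2, 1)}, after={"pos": (2, 0)})
print(history.sequential_view())  # [('erase', 'icon_3'), ('move', 'icon_5')]
```

Either view could then be persisted in the storage unit 207 and replayed to undo or audit the edits.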
Examples of the storage unit 207 may include: high-speed random access memory (RAM), nonvolatile memory (such as a magnetic disk storage device or flash memory), and other nonvolatile semiconductor memory devices. Accordingly, the storage unit 207 may be referred to as a memory.
The communication unit 208 is configured to transmit data to, and receive data from, the display device 120 and the server 130 via a wired network or a wireless network, where the wireless network is, for example: the wireless Internet, a wireless intranet, a wireless telephone network, a wireless local area network (LAN), a Wi-Fi network, a Wi-Fi Direct (WFD) network, a third-generation (3G) network, a fourth-generation (4G) Long Term Evolution (LTE) network, a Bluetooth network, an Infrared Data Association (IrDA) network, a radio-frequency identification (RFID) network, an ultra-wideband (UWB) network, a ZigBee network, or a near field communication (NFC) network.
The communication unit 208 may include, but is not limited to, at least one of the following: a broadcast receiving module, a mobile communication module, a wireless Internet module, a wired Internet module, a short-range communication module, and a location information module.
The broadcast receiving module receives a broadcast signal and/or broadcast-related information from an external broadcast management server through a broadcast channel. Examples of the broadcast channel may include a satellite channel and a terrestrial channel. The mobile communication module transmits a wireless signal to, and receives a wireless signal from, at least one of a base station, an external terminal (for example, each of the display devices 120 and 121), and the server 130 via a mobile communication network. The wireless signal may include various types of data, such as a voice call signal, a video call signal, or a text/multimedia message for transmission and reception. The wireless Internet module is a module for wireless Internet access. The wired Internet module is a module for wired Internet access.
The short-range communication module is a module for short-range communication. Examples of short-range communication technologies may include: Bluetooth-based, RFID-based, infrared-communication-based, UWB-based, ZigBee-based, WFD-based, and near field communication (NFC)-based technologies.
The location information module is a module for detecting or obtaining the location of each of the portable devices 110 and 111. For example, the location information module may be a Global Positioning System (GPS) module. The GPS module receives location information from a plurality of satellites. The location information may include coordinate information including latitude and longitude.
The port 209 may transmit and receive data to and from an external device (not shown) by using a plug-and-play interface, such as a Universal Serial Bus (USB) port (not shown). A plug-and-play interface is a module that allows an external device to be connected to the portable devices 110 and 111 and to be configured automatically for access.
The power supply unit 211 supplies power to the various elements of each of the portable devices 110 and 111. The power supply unit 211 includes at least one power source, such as a battery or an alternating current (AC) power source. Alternatively, each of the portable devices 110 and 111 may omit the power supply unit 211 and instead include a connection unit (not shown) that can be connected to an external power supply unit (not shown).
The processor 210 may be at least one processor for controlling the overall operation of each of the portable devices 110 and 111. Depending on the functions of each of the portable devices 110 and 111, the at least one processor of the processor 210 may be implemented as a plurality of processors or processor cores and may operate as a plurality of processors.
The processor 210 may control the user input unit 201, the sensing unit 202, the touch screen 203, the camera 204, the audio input unit 205, the audio output unit 206, the storage unit 207, the communication unit 208, the port 209, and the power supply unit 211. Accordingly, the processor 210 may be referred to as a controller, a microprocessor, or a digital signal processor. In addition, the processor 210 may provide a user interface based on the touch screen 203 and on user input information entered by using the user input unit 201, the sensing unit 202, the camera 204, and the audio input unit 205, which correspond to input devices. The processor 210 may be connected to the elements of the portable devices 110 and 111 by at least one bus (not shown) that connects those elements to one another.
The processor 210 may execute at least one program related to the method of controlling the screen editing and/or functions of each of the display devices 120 and 121. The processor 210 may read the program from the storage unit 207 and execute it, or may download the program from an external device connected through the communication unit 208 (for example, an application providing server (not shown) or a marketplace server (not shown)) and execute it. It will be understood that the processor 210 includes interface function units between the processor 210 and the various functional modules in each of the portable devices 110 and 111. Operations of the processor 210 related to the method of controlling the screen editing and/or functions of each of the display devices 120 and 121 may be performed as shown in Figs. 4, 6-8, 9-11, and 19.
Fig. 2B is a block diagram of each of the portable devices 110 and 111 of Figs. 1A and 1B according to another exemplary embodiment. Referring to Fig. 2B, each of the portable devices 110 and 111 includes: a touch screen 220, a processor 221, a memory 222, and a communication unit 223. The touch screen 220 is configured in the same manner as the touch screen 203 of Fig. 2A, displays a plurality of objects, and receives user input information.
The communication unit 223 is configured in the same manner as the communication unit 208 of Fig. 2A, and communicates with each of the display devices 120 and 121.
The processor 221 provides a user interface based on the touch screen 220, receives information about a plurality of objects under a first layout from each of the display devices 120 and 121, displays the plurality of objects on the touch screen 220 based on that information, generates a second layout by changing the information about the plurality of objects based on user input information, and transmits information about the second layout to each of the display devices 120 and 121. Accordingly, the touch screen 220 may be used to edit at least one of the plurality of objects, and the display devices 120 and 121 may display the edited plurality of objects based on the information about the second layout.
Fig. 3 is a block diagram of a display device. The display device may be either of the display devices 120 and 121 of Figs. 1A and 1B. In Fig. 3, the display devices 120 and 121 may be televisions (TVs) having a communication function.
Referring to Fig. 3, each of the display devices 120 and 121 includes: a wireless communication unit 301, a communication unit 302, a broadcast signal receiving unit 303, a storage unit 304, a processor 305, an image processing unit 306, a display unit 307, an audio processing unit 308, an audio output unit 309, and an audio input unit 310. The display devices 120 and 121 are not limited to the structure of Fig. 3 and, for example, may not include the broadcast signal receiving unit 303.
The wireless communication unit 301 is configured to perform wireless communication with, for example, a remote controller (not shown). When the remote controller includes an IR transmitter, the wireless communication unit 301 may include a corresponding IR receiver that receives an infrared signal transmitted from the remote controller, demodulates the infrared signal, and transmits the demodulated signal to the processor 305. When the remote controller includes an RF module, the wireless communication unit 301 may include a corresponding RF module that receives an RF signal transmitted from the remote controller and transmits the RF signal to the processor 305. The wireless communication unit 301 is not limited to an IR receiver and an RF module, and may be configured to use one or more other technologies (for example, a short-range communication technology such as Bluetooth).
The communication unit 302 performs wireless data communication with each of the portable devices 110 and 111 or with the server 130 via a wireless communication module. The display devices 120 and 121 may communicate with each of the portable devices 110 and 111 and the server 130 via a wireless communication network or a wired communication network, where the wireless communication network is, for example, a Bluetooth network, an RFID network, a Wi-Fi network, an IrDA network, a UWB network, a ZigBee network, or a near field communication (NFC) network, and the wired communication network is, for example, any one of a Home Phoneline Networking Alliance (PNA) network, power line communication (PLC), IEEE 1394, the wired Internet, or various other home networks.
When a request signal for the information currently being displayed is received from one of the portable devices 110 and 111 connected through the communication unit 302, or from the server 130, the processor 305 transmits the information displayed on the display unit 307 through the communication unit 302 to the requesting portable device 110 or 111 or to the requesting server 130. Then, based on second-layout information sequentially received from each of the portable devices 110 and 111 or from the server 130, the processor 305 edits and displays at least one of the pieces of information displayed on the display unit 307, or controls a function. Like the processor 210 of Fig. 2A, the processor 305 may include at least one processor.
The broadcast signal receiving unit 303 divides a broadcast signal received from a tuner (not shown) into an image signal and a sound signal, and outputs the image signal and the sound signal. That is, the tuner selects, from among the RF broadcast signals received through an antenna, the RF broadcast signal corresponding to a channel selected by the user or to all previously stored channels. The tuner also converts the selected RF broadcast signal into an intermediate frequency signal or a baseband image or sound signal, which is input to the processor 305.
The storage unit 304 may be a memory that stores programs for processing and controlling the various signals handled by the processor 305, and may store information about each of the portable devices 110 and 111, information about each of the display devices 120 and 121, and user information of each of the display devices 120 and 121. The information about each of the portable devices 110 and 111 may be used to determine whether to permit access when an access-authorization request signal is received from one of the portable devices 110 and 111. Alternatively, the information about each of the portable devices 110 and 111 may be used to indicate which of the portable devices 110 and 111 are connected to each of the display devices 120 and 121 after the information displayed on each of the display devices 120 and 121 has been transmitted to each of the portable devices 110 and 111. The information about each of the portable devices 110 and 111 may be referred to as authorization information.
The processor 305 controls the overall functions of each of the display devices 120 and 121. The processor 305 may control the data exchange with the portable devices 110 and 111 performed through the communication unit 302. The processor 305 forms a UI screen by executing a program stored in the storage unit 304, and displays the UI screen on the display unit 307 through the image processing unit 306.
The UI screen may include a screen that displays information about each of the portable devices 110 and 111. In addition, the UI screen may include a screen that displays the screen information (for example, an application) selected by each of the portable devices 110 and 111. That is, when information about the application selected by each of the portable devices 110 and 111 is received, a UI screen that distinguishes the selected application from the non-selected applications may be provided.
The image processing unit 306 includes an image decoder (not shown) and a scaler (not shown). The image processing unit 306 processes the image signal output from the broadcast signal receiving unit 303 so that it can be displayed on the screen. The image decoder decodes the demultiplexed image signal, and the scaler performs scaling so that the decoded image signal can be output at the resolution of the display unit 307.
The display unit 307 may output the image processed by the image processing unit 306. Examples of the output image may include: an image received from the broadcast signal receiving unit 303, a UI screen, an application browsing screen including a plurality of applications, a content browsing screen including content, a web browsing screen, and a selectable peripheral-device control screen. The display unit 307 may include a touch screen and may thus also be used as an input device.
The audio processing unit 308 processes the sound signal output from the broadcast signal receiving unit 303 and the audio included in content received through the communication unit 302, and outputs the processed sound signal and audio to the audio output unit 309. The audio output unit 309 may be configured in various ways, for example to output a stereo signal, a 3.1-channel signal, or a 5.1-channel signal.
In addition, the audio processing unit 308 may process the signal input from the audio input unit 310 and transmit the processed signal to the processor 305. The audio input unit 310 may include a microphone.
Fig. 4 is a flowchart of a method of controlling the screen editing of the display devices 120 and 121 according to an exemplary embodiment. Although the method of Fig. 4 may be performed by the processor 210 of Fig. 2A or the processor 221 of Fig. 2B, for convenience of explanation, the description below is based on the relationship between the processor 210 and the display device 120.
In operation S401, the processor 210 receives information about a plurality of objects from the display device 120 through the communication unit 208. The information about the plurality of objects is screen information displayed according to a first layout in the display device 120. The display device 120 may be referred to as an external display device.
To receive the information about the plurality of objects, the processor 210 may execute a separate screen editing control application, or may execute the remote control application stored in the storage unit 207 and select a screen editing menu item, to perform the function for controlling screen editing. The processor 210 then transmits a display information request signal to the display device 120 through the communication unit 208.
The received information may include at least one of the following: display image information of each object (for example, an application or thumbnail icon); display position information of each object (for example, a relative or absolute coordinate position, or layout information indicating the arrangement of the objects relative to one another or to the display screen); details of each object (for example, an identifier of the object, metadata including a title, a URL, a file type, a file path to the object, attributes of the object, and so on); preview information of each object (preview data or a brief description of the object); and edit control information of each object (for example, the available commands that can be performed on the object, such as delete, move, hide, back up, and set).
The delete command may be a command for deleting an object. The move command may be a command for changing the position of a displayed object. The hide command may be a command for hiding an object from the user's view. The backup command may be a command for backing up an object stored in the display device 120 to the portable device 110 or the server 130. The set command may be a command for setting attributes of an object (such as the audio or video reproduction attributes of content). When content is reproduced, the audio reproduction attributes may be channel information (stereo, 5.1, 6.1, etc.), sound effect information, volume information, attenuation information, balance information, and so on. The video reproduction attributes may be contrast, color balance, brightness, screen aspect ratio, and so on. In addition, the received information may include information obtained by capturing the information displayed on the display device 120, such as the image of an icon or the layout of the objects.
The information may be formed in a format interpretable by the portable device 110, in a format interpretable by the display device 120, or in a general format interpretable by both the portable device 110 and the display device 120 (for example, an Extensible Markup Language (XML) format). If necessary, the information may be converted between formats by the portable device 110, the display device 120, or the server 130.
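As one hypothetical illustration of such a general XML format, a first-layout description could be parsed with a standard XML library. The element and attribute names below (`layout`, `object`, `x`, `y`, etc.) are invented for the sketch; the patent does not fix a schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical first-layout description: each <object> carries display
# position and image information of the kinds listed above.
layout_xml = """
<layout id="first">
  <object id="icon_1" x="0" y="0" image="mail.png" type="application"/>
  <object id="icon_2" x="1" y="0" image="clock.png" type="application"/>
</layout>
"""

def parse_layout(xml_text):
    """Turn the general XML format into a dict keyed by object id."""
    root = ET.fromstring(xml_text)
    return {
        obj.get("id"): {"x": int(obj.get("x")), "y": int(obj.get("y")),
                        "image": obj.get("image"), "type": obj.get("type")}
        for obj in root.findall("object")
    }

objects = parse_layout(layout_xml)
print(objects["icon_2"]["x"])  # 1
```

Because both devices can interpret the same markup, the portable device can modify the parsed structure and serialize it back without a device-specific converter.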
In operation S402, the processor 210 displays, based on the information received through the communication unit 208, the plurality of objects on the touch screen 203 of the portable device 110 according to the first layout. The screen first displayed on the touch screen 203 thus shows the plurality of objects according to the first layout. The touch screen 203 may reproduce the screen of the display device 120, or may reproduce the screen of the display device 120 in a sub-window displayed on the touch screen 203. The processor 210 may reconstruct the received information so that it is suitable for the touch screen 203. For example, if the screen of the display device 120 is larger than the touch screen 203, the plurality of objects may be scaled down for display on the touch screen. Accordingly, even if the display size of the display device 120 differs from the size of the touch screen 203, the plurality of objects can still be faithfully reproduced on the touch screen 203.
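The scaling step mentioned above can be sketched as a uniform scale of each object's coordinates from the display device's resolution to the touch screen's resolution. The pixel sizes and field names are assumptions for the illustration.

```python
def scale_layout(objects, src_size, dst_size):
    """Scale object positions from the display device's screen size to the
    portable device's touch screen, preserving the relative arrangement."""
    sx = dst_size[0] / src_size[0]
    sy = dst_size[1] / src_size[1]
    factor = min(sx, sy)  # uniform scale so the layout is not distorted
    return [{**o, "x": round(o["x"] * factor), "y": round(o["y"] * factor)}
            for o in objects]

# A 1920x1080 TV screen reproduced on a 960x540 touch screen
tv_objects = [{"id": "icon_1", "x": 200, "y": 100},
              {"id": "icon_2", "x": 1600, "y": 900}]
print(scale_layout(tv_objects, (1920, 1080), (960, 540)))
# [{'id': 'icon_1', 'x': 100, 'y': 50}, {'id': 'icon_2', 'x': 800, 'y': 450}]
```

Taking the minimum of the two axis factors keeps the layout faithful even when the two screens have different aspect ratios.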
Figs. 5A to 5D are diagrams of the screens displayed on the display devices 120 and 121 and on the portable devices 110 and 111 when the screen editing function for editing the screens of the display devices 120 and 121 is performed.
Referring to Fig. 5A, when a screen 510 is displayed on the display devices 120 and 121, the screen 511 displayed on the touch screen 203 of each of the portable devices 110 and 111 is identical to the screen 510 (in other words, the screens initially show the same content according to the same overall layout). In Fig. 5A, a plurality of applications is shown on the screen 510, and the layout of the plurality of applications is replicated on the screen 511. When the user selects one or more of the displayed applications, the layout of the screen 511 may subsequently be modified. As described above, the information about the plurality of applications may be display position information, display image information, and other attribute information, and may be formed in a format interpretable by the portable device 110 or the display device 120. Fig. 5A illustrates information 520 formatted according to a general XML format, which can be interpreted by the portable device 110 or the display device 120 so that the layout of the objects can be stored, modified, and copied between the portable device 110 and the display device 120.
In operation S403, the processor 210 generates a second layout. The second layout may be generated based on user input for manipulating the displayed applications. In response to receiving the user input, the processor 210 may edit the information about the plurality of objects accordingly. That is, when the user selects an object displayed on the screen 511 on the touch screen 203 of the portable device 110 of Fig. 5A, a UI screen 513_1 for controlling the editing of the selected object may be displayed on the screen 513 of Fig. 5B. Although the screen 512 shows the application selected on the screen 511 being displayed on the display device 120, the selection need not be replicated on the display device 120; only the changes to the layout may be reflected there. Accordingly, the operations performed on the portable device 110 may or may not be replicated on the display device 120, with only the layout changes being reflected.
When the user selects "erase" on the UI screen 513_1, the screen 513 displayed on the touch screen 203 of Fig. 5B is changed to the screen 515 of Fig. 5C or the screen 517 of Fig. 5D, and the second layout is generated based on the screen 515 or the screen 517. The screen 515 of Fig. 5C is obtained by erasing the selected object. The screen 517 of Fig. 5D is obtained by erasing the selected object and then moving the display positions of the remaining objects to fill the display position of the erased object. In other words, after the selected object is deleted, the array of displayed objects may be rearranged. The information about the new second layout of the objects may be edited accordingly by the portable device 110 so that it can be transmitted to the display device 120 and replicated there. Before the edit is committed on the display device 120, the information may be transmitted to the display device 120 so that the change can be confirmed by the user; the user can thus preview the change on the display device 120 before the edit is committed, and confirmation can be requested from the display device 120. The new layout may then be migrated to the display device, as shown in the screens 514 and 516.
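The two erase behaviors of Figs. 5C and 5D (erasing while leaving a gap, versus erasing and then repacking the remaining objects into the freed slot) can be sketched as follows. The slot/grid representation and helper names are assumptions for the illustration.

```python
def erase(objects, object_id):
    """Fig. 5C style: remove the object; other positions stay untouched."""
    return [o for o in objects if o["id"] != object_id]

def erase_and_compact(objects, object_id, columns=4):
    """Fig. 5D style: remove the object, then shift the remaining objects
    forward so the erased display position is filled."""
    remaining = sorted(erase(objects, object_id), key=lambda o: o["slot"])
    for new_slot, obj in enumerate(remaining):
        obj["slot"] = new_slot
        obj["x"], obj["y"] = new_slot % columns, new_slot // columns
    return remaining

grid = [{"id": f"icon_{i}", "slot": i, "x": i % 4, "y": i // 4}
        for i in range(6)]
print([o["id"] for o in erase_and_compact(grid, "icon_2")])
# ['icon_0', 'icon_1', 'icon_3', 'icon_4', 'icon_5']
```

After compaction, every object behind the erased slot moves up by one position, which is exactly the difference between the screens 515 and 517.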
The editing function of the touch-screen-based device may be used to modify the UI, and is not limited to the UIs shown in Figs. 5A to 5D.
Fig. 6 is a flowchart of a process of generating the second layout according to an exemplary embodiment.
In operation S601, the processor 210 detects, through the touch screen 203, a user selection of the image of one of the plurality of objects under the first layout. The user selection is based on a touch, but may also be performed based on various user gestures or by using a hardware button of the user input unit 201.
In operation S602, the processor 210 transmits information about the object selected by the user to the display devices 120 and 121 through the communication unit 208. Accordingly, the display devices 120 and 121 may display the information about the selected object, as shown on the screen 512 of Fig. 5B. That is, when an event related to a displayed object occurs, the processor 210 transmits event information about the event to the display devices 120 and 121 through the communication unit 208. Alternatively, as described above, the processor 210 may transmit only the information about the second layout to the display devices 120 and 121 through the communication unit 208 after the edit is completed, without transmitting event information.
In operation S603, the processor 210 displays the UI screen for controlling the screen editing of the selected object on the touch screen 203 of the portable device 110. That is, the processor 210 may display a UI screen such as the UI screen 513_1 of Fig. 5B. The UI screen may change according to the selected object; that is, the editing-function UI screen may be determined according to the selected object. For example, a first object may be associated with a backup option, a hide option, or a set option for changing the attributes of the object.
In operation S604, the processor 210 detects a user selection on the UI screen 513_1 through the touch screen 203. In operation S605, the processor 210 generates the second layout according to the detection result. The second layout may be generated by modifying the information about the first layout, or may be new information indicating only the modification. According to the second layout, the screen displayed on the touch screen 203 of the portable device 110 may be changed to the screen 515 or the screen 517, but is not limited thereto.
When the item "move" is selected on the UI screen 513_1 and, while the touch indicating the selection of the item "move" is held, the selected object is dragged to another position, the screen displayed on the touch screen 203 may be changed to a screen in which the selected object has been moved to the position where the drag stopped. In addition, when the UI screen 513_1 is provided in a region of the touch screen 203 in which no object is displayed and the item "set" on the UI screen 513_1 is selected, the processor 210 may perform a task for setting the attributes of a new object or of an existing object. Furthermore, when the item "back up" is selected on the UI screen 513_1, the processor 210 may store the information about the selected item in the storage unit 207 of the portable device 110 as backup information. The information about the selected object includes, but is not limited to, the information that can be used to manipulate the selected object.
At operation S404, the information that second arranges is sent to display device 120 and 121 by communication unit 208 by processor 210.As mentioned above, the second information of arranging is revised by the first information of arranging, or can be only the information of instruction to the change of the first information of arranging.Therefore, at least one object in described multiple object is edited by display device 120 and 121, and is presented on display unit 120 based on the second information of arranging by display device 120 and 121.That is, the screen 516 after the editor of the screen 514 after the editor of Fig. 5 C or Fig. 5 D is presented on display unit 120 by the information can arranged based on second.
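The two forms of second-arrangement information described above (a full modified copy of the first arrangement, or only the change against it) can be sketched as follows. This is an illustrative sketch, not the patent's implementation; the layout encoding and the `diff_layout` helper are assumptions.

```python
# Hypothetical sketch: the "second arrangement" information is either the
# full modified layout or only a delta against the first arrangement.
# Each layout maps an icon id to its (x, y) grid position.

def diff_layout(first, second):
    """Return only the entries that changed between two icon layouts."""
    delta = {}
    for icon_id, pos in second.items():
        if first.get(icon_id) != pos:
            delta[icon_id] = pos          # moved or newly added icon
    for icon_id in first:
        if icon_id not in second:
            delta[icon_id] = None         # deleted icon
    return delta

first = {"tv": (0, 0), "web": (1, 0), "photos": (2, 0)}
second = {"tv": (0, 0), "web": (2, 0)}    # "web" moved, "photos" deleted
print(diff_layout(first, second))         # {'web': (2, 0), 'photos': None}
```

Transmitting only the delta keeps the message small; transmitting the full layout lets the display device replace its screen state without tracking history.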
FIG. 7 is a flowchart illustrating a method of controlling screen editing of the display devices 120 and 121, according to an exemplary embodiment. Although the method of FIG. 7 may be performed by each of the processors 210 and 211 of FIGS. 2A and 2B, for convenience of explanation, the method is explained below under the assumption that it is performed by the processor 210.
FIG. 7 illustrates an example in which the information about the plurality of objects is converted into a format interpretable by the portable devices 110 and 111 and the display devices 120 and 121. Operations S701, S703, and S704 of FIG. 7 are similar to operations S401 through S403, and thus redundant explanations thereof are omitted.
At operation S702, the processor 210 receives information from the display devices 120 and 121 through the communication unit 208, and converts the information about the plurality of objects into information according to a data format of each of the portable devices 110 and 111. Accordingly, as shown in FIG. 5A, the information about the plurality of objects under the first arrangement may be interpreted by the portable devices 110 and 111, and the screen displayed on the display devices 120 and 121 may be reproduced on the portable devices 110 and 111.
At operation S705, the processor 210 converts the information about the second arrangement into information according to a data format of the display devices 120 and 121, and transmits the converted information to the display devices 120 and 121 through the communication unit 208. The information may be converted by the communication unit 208. Accordingly, the information about the plurality of objects under the second arrangement may be interpreted by the display devices 120 and 121.
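The format conversion of operation S705 can be illustrated with a minimal sketch that serializes an assumed layout structure into XML, the data format named in claims 6 and 11. The element and attribute names here are invented for the example, not taken from the patent.

```python
# Illustrative conversion of layout information into a display-device data
# format (XML). Element/attribute names ("layout", "icon", "id", "x", "y")
# are assumptions for this sketch.
import xml.etree.ElementTree as ET

def to_display_format(layout):
    """Serialize {icon_id: (x, y)} into an XML string."""
    root = ET.Element("layout")
    for icon_id, (x, y) in sorted(layout.items()):
        ET.SubElement(root, "icon", id=icon_id, x=str(x), y=str(y))
    return ET.tostring(root, encoding="unicode")

print(to_display_format({"web": (2, 0)}))
```

A symmetric parser on the display-device side would read the same elements back, so each device only ever handles its own native representation.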
FIG. 8 is a flowchart illustrating a method of controlling screen editing of the display devices 120 and 121, according to an exemplary embodiment. FIG. 8 illustrates an example in which a function of displaying the plurality of objects on each of the portable devices 110 and 111 according to the second arrangement is added to the method of FIG. 4.
Operations S801 through S803 and operation S805 of FIG. 8 are similar to operations S401 through S403 and operation S404, and thus redundant explanations thereof are omitted.
At operation S803, the processor 210 generates the second arrangement. At operation S804, the processor 210 displays the plurality of objects on the touch screen 203 of the portable device 110 or 111 according to the second arrangement. That is, the processor 210 displays the screen 515 of FIG. 5C or the screen 517 of FIG. 5D, which includes the plurality of objects under the second arrangement.
FIG. 9 is a flowchart illustrating a method of controlling screen editing of the display devices 120 and 121, according to an exemplary embodiment. FIG. 9 illustrates an example in which an operation of storing edit history information is added to the method of FIG. 4. Operations S901 through S903 and operation S905 of FIG. 9 are similar to operations S401 through S404 of FIG. 4, and thus redundant explanations thereof are omitted.
At operation S903, the processor 210 generates the second arrangement. At operation S904, the processor 210 generates edit history information of the second arrangement and stores the edit history information in the storage unit 207. The edit history information may indicate a difference between the first arrangement and the second arrangement, and the difference may be associated with an edit date or time. In this case, the processor 210 may transmit the edit history information to the display devices 120 and 121 through the communication unit 208, thereby sharing the edit history information with the display devices 120 and 121.
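A minimal sketch of an edit-history record as described for operation S904: the difference between the first and second arrangements, associated with an edit date or time. The record shape and the helper name are assumptions for illustration.

```python
# Hypothetical edit-history entry: the changes between two arrangements
# plus a timestamp. Field names ("edited_at", "changes") are assumptions.
import datetime

def make_history_entry(first, second):
    """Build one history record from two {icon_id: (x, y)} layouts."""
    changed = {k: second.get(k)
               for k in set(first) | set(second)
               if first.get(k) != second.get(k)}
    return {"edited_at": datetime.datetime.now().isoformat(),
            "changes": changed}

entry = make_history_entry({"web": (1, 0)}, {"web": (2, 0)})
print(entry["changes"])   # {'web': (2, 0)}
```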
FIG. 10 is a flowchart illustrating a method of controlling screen editing of the display devices 120 and 121, according to an exemplary embodiment. FIG. 10 illustrates an example of receiving information about a display screen of the display devices 120 and 121 instead of information about the plurality of objects.
That is, at operation S1001, the processor 210 receives information about a display screen of the display devices 120 and 121 through the communication unit 208. The portable devices 110 and 111 may transmit a display information request signal requesting the display information to the display devices 120 and 121, and receive the information about the display screen from the display devices 120 and 121 in response to the request. To this end, as described with respect to operation S401 of FIG. 4, each of the portable devices 110 and 111 may select a screen editing control menu of a screen editing control application or a remote control application for the display devices 120 and 121.
At operation S1002, the processor 210 displays the received information on the touch screen 203 of the portable device 110 or 111. At operation S1003, when user input information for adjusting the displayed information is detected through the touch screen 203, the processor 210 changes the screen based on the detected user input information to obtain a changed screen, and generates information about the changed screen. The process of changing the screen to obtain the changed screen and generating the information about the changed screen may be performed in a manner similar to that used to generate the second arrangement.
At operation S1004, the processor 210 transmits the information about the changed screen to the display devices 120 and 121 through the communication unit 208. Accordingly, the display devices 120 and 121 edit the screen and display the edited screen.
FIG. 11 is a flowchart illustrating a method of controlling screen editing and a function of each of the display devices 120 and 121, according to an exemplary embodiment. The method of FIG. 11 may be performed by the processor 210.
Operations S1101, S1102, S1104, and S1105 of FIG. 11 are similar to operations S401, S402, S403, and S404 of FIG. 4, and thus redundant explanations thereof are omitted.
At operation S1103, user input information is detected through the touch screen 203. When the detected user input information is an edit request for editing the screen displayed on the touch screen 203, the processor 210 proceeds to operations S1104 and S1105.
At operation S1103, when the user input information detected through the touch screen 203 is an execution request for executing at least one object included in the screen displayed on the touch screen 203, the processor 210 proceeds to operation S1106. At operation S1106, the processor 210 transmits an execution request signal to the display devices 120 and 121 through the communication unit 208. Accordingly, the display devices 120 and 121 execute the selected object. The selected object may represent selected screen information. For example, when the selected object is a moving image, the display devices 120 and 121 reproduce the moving image according to the received execution request signal.
That is, a function of the display devices 120 and 121 may be controlled as shown in FIG. 12, which illustrates screens for controlling a function of the display devices 120 and 121. Assume that a web page screen 1210 is displayed on the display devices 120 and 121 and a screen 1212 is displayed on the touch screen 203 of the portable devices 110 and 111, as shown in FIG. 12. When a search field window displayed on the touch screen 203 of the portable devices 110 and 111 is touched and data is input into the search field window, a keyword 1224 is displayed on a screen 1222, information about the keyword 1224 is transmitted to the display devices 120 and 121, and a window 1221 that is identical to the screen 1222 and includes a keyword 1223 is displayed on a screen 1220 of the display devices 120 and 121. That is, when information is input using the portable devices 110 and 111, the information about the keyword is transmitted to the display devices 120 and 121, and when a search is requested, a corresponding task may be performed.
FIG. 13 is a flowchart illustrating a method of controlling screen editing and a function of the display devices 120 and 121, according to an exemplary embodiment. The operations of FIG. 13 are performed by the processor 305 of FIG. 3.
At operation S1301, the processor 305 displays a plurality of objects on the display unit 307 according to a first arrangement. At operation S1302, the processor 305 transmits information about the plurality of objects to the portable devices 110 and 111 through the communication unit 302. The transmission may be performed in response to a display information request signal from the portable devices 110 and 111. The displayed plurality of objects may be objects associated with a user. Accordingly, if the user of the display devices 120 and 121 is a different user, the plurality of objects displayed on the display unit 307 may be a different object set associated with the different user.
At operation S1303, the processor 305 receives information about a second arrangement of the plurality of objects from the portable devices 110 and 111 through the communication unit 302.
At operation S1304, when the information about the second arrangement is received, the processor 305 displays the plurality of objects on the display unit 307 according to the second arrangement, based on the information about the second arrangement. As described above, the processor 305 may edit the information about the first arrangement based on a difference between the first arrangement and the second arrangement, or may receive the full information about the second arrangement.
FIG. 14 is a flowchart illustrating a method of controlling screen editing of the display devices 120 and 121, according to an exemplary embodiment. FIG. 14 illustrates an example in which a function of displaying, on the screen, information indicating an object selected via the portable devices 110 and 111 is added to the operations of FIG. 13. Accordingly, operations S1401, S1402, S1405, and S1406 of FIG. 14 are similar to operations S1301 through S1304, and thus redundant explanations thereof are omitted.
At operation S1403, the processor 305 receives, through the communication unit 302, information about at least one object selected by a user of the portable devices 110 and 111. Accordingly, at operation S1404, the processor 305 displays information indicating the selected object from among the plurality of objects displayed on the display unit 307 according to the first arrangement. For example, the selected object may be highlighted or enlarged on the display unit 307 to indicate the selection of the object.
FIG. 15 is a flowchart illustrating a method of controlling screen editing of the display devices 120 and 121, according to an exemplary embodiment. FIG. 15 illustrates an example in which a function of generating and storing history information is added to the operations of FIG. 13. Accordingly, operations S1501 through S1504 of FIG. 15 are similar to operations S1301 through S1304, and thus redundant explanations thereof are omitted.
At operation S1504, the screen displayed on the display unit 307 is edited. At operation S1505, the processor 305 generates edit history information and stores the edit history information in the storage unit 304. In this case, the processor 305 may transmit the edit history information to the portable devices 110 and 111 through the communication unit 302, thereby sharing the edit history information with the portable devices 110 and 111.
At operation S1505, the processor 305 may store and manage the edit history information according to the displayed information. For example, when the displayed screen is an application browsing screen, the edit history information is stored as edit history information of application browsing; application icons may be arranged or modified, and a history of editing the application icons may be stored. Likewise, when the displayed screen is a content browsing screen, the edit history information is stored as edit history information of content browsing; content icons or thumbnails may be arranged or modified, and a history of editing the content icons may be stored.
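The per-screen-type bookkeeping described for operation S1505 might be sketched as follows; the keys ("app_browse", "content_browse") and record shapes are assumptions made for illustration.

```python
# Hypothetical store of edit-history records keyed by the kind of screen
# that was edited (application browsing vs. content browsing).
history = {}

def store_history(screen_kind, entry):
    """Append one edit record under its screen-kind bucket."""
    history.setdefault(screen_kind, []).append(entry)

store_history("app_browse", {"icon": "web", "to": (2, 0)})
store_history("content_browse", {"thumbnail": "movie", "to": (0, 1)})
print(sorted(history))   # ['app_browse', 'content_browse']
```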
FIG. 16 is a flowchart illustrating a method of controlling screen editing of the display devices 120 and 121, according to an exemplary embodiment.
At operation S1601, the plurality of objects are displayed according to the first arrangement. At operation S1602, an information request signal requesting information about the displayed plurality of objects is received from the portable device 110. At operation S1603, in response to receiving the information request signal, information about the plurality of objects displayed according to the first arrangement is transmitted to the portable device 110.
At operation S1604, the processor 305 receives a signal from the portable devices 110 and 111 through the communication unit 302. When the received signal is information about the second arrangement, at operation S1605, the processor 305 displays the plurality of objects on the display unit 307 according to the second arrangement, based on the information about the second arrangement.
On the other hand, when the signal received through the communication unit 302 is an execution request signal for a selected object, the processor 305 proceeds to operation S1606, at which the processor 305 executes the selected object.
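The branch in operations S1604 through S1606 (re-display on second-arrangement information, execute on an execution request signal) can be sketched as a small dispatch routine; the signal shapes are assumptions, not the patent's wire format.

```python
# Hypothetical dispatch on the display-device side: a received signal is
# either second-arrangement information (re-render the screen) or an
# execution request for a selected object (run it).

def handle_signal(signal, layout, executed):
    if signal["type"] == "second_arrangement":
        layout.update(signal["layout"])       # re-display per new layout
    elif signal["type"] == "execute":
        executed.append(signal["object_id"])  # perform the selected object
    return layout, executed

layout, executed = {"web": (1, 0)}, []
handle_signal({"type": "second_arrangement", "layout": {"web": (2, 0)}},
              layout, executed)
handle_signal({"type": "execute", "object_id": "movie"}, layout, executed)
print(layout, executed)   # {'web': (2, 0)} ['movie']
```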
FIG. 17 is a flowchart illustrating a method of controlling screen editing of a display device, according to an exemplary embodiment.
At operation 1701, the display device 120 displays a plurality of objects on the display unit 307. At operation 1702, the portable device 110 transmits an access authorization request signal to the display device 120 through the communication unit 208. At operation 1703, the display device 120 determines whether to allow access by the portable device 110. The access control may be an authorization process based on information about the portable device 110 stored in the storage unit 304.
If it is determined that access by the portable device 110 is allowed, the method proceeds to operation 1704. At operation 1704, the display device 120 notifies the portable device 110, through the communication unit 302, whether access is allowed.
At operation 1705, the portable device 110 requests, from the display device 120, information about the plurality of objects displayed on the display device 120. At operation 1706, the display device 120 transmits the information about the plurality of objects to the portable device 110.
At operation 1707, the portable device 110 edits the plurality of objects. The editing may be performed as described with reference to FIGS. 5A through 5D.
At operation 1708, the portable device 110 generates information about a second arrangement based on the editing result. At operation 1709, the portable device 110 transmits the information about the second arrangement to the display device 120. At operation 1710, the display device 120 correspondingly edits the plurality of objects displayed on the display unit 307. Next, at operation 1711, the display device 120 stores the editing result in the storage unit 304. At operation 1712, the display device 120 notifies the portable device 110 that the editing has been completed.
FIG. 18 is a flowchart illustrating a method of executing a function of the display device 120, according to an exemplary embodiment.
Referring to FIG. 18, at operation 1801, the display device 120 displays a plurality of objects. At operation 1802, the portable device 110 transmits an access authorization request signal to the display device 120. At operation 1803, similarly to operation 1703 of FIG. 17, the display device 120 determines whether to allow access by the portable device 110.
At operation 1804, the display device 120 transmits, to the portable device 110, a confirmation signal indicating that access is allowed. At operation 1805, the portable device 110 requests, from the display device 120, information about the objects displayed on the display device 120.
At operation 1806, the display device 120 transmits the information about the objects to the portable device 110. At operation 1807, the portable device 110 generates an execution control signal for executing at least one selected displayed object. At operation 1808, the portable device 110 transmits the execution control signal to the display device 120 so as to execute the at least one selected displayed object. Accordingly, at operation 1809, the display device 120 executes the selected object.
FIG. 19 is a flowchart illustrating a method of controlling screen editing of the display device 120 by using the portable device 110, according to an exemplary embodiment. The method of FIG. 19 is performed by the processor 210.
At operation S1901, the processor 210 receives, from the server 130 through the communication unit 208, information about a plurality of objects displayed on the display device 120 according to a first arrangement.
At operation S1902, the processor 210 displays the plurality of objects according to the first arrangement, based on the information about the plurality of objects received from the server 130.
At operation S1903, the processor 210 generates a second arrangement by changing the information about the plurality of objects based on detected user input information. At operation S1904, the processor 210 transmits information about the second arrangement to the server 130 through the communication unit 208.
FIG. 20 is a flowchart illustrating a method of controlling screen editing of the display device 120, according to an exemplary embodiment. The method of FIG. 20 is performed by the processor 133 of FIG. 1A.
At operation S2001, the communication unit 131 receives, from the portable device 110, an information request signal for information about the objects displayed on the display device 120. Information for communicating with the display device 120 is included in the information request signal. The information for communicating with the display device 120 may include identification information of the display device 120 or an address of the display device 120, for example, an Internet Protocol (IP) address. At operation S2002, the processor 133 requests, from the display device 120 through the communication unit 131, the information about the objects displayed on the display device 120, by using the information for communicating with the display device 120 that is included in the information request signal received at operation S2001.
At operation S2003, information about the plurality of objects displayed according to the first arrangement is received from the display device 120 through the communication unit 131. At operation S2004, the processor 133 transmits the information about the plurality of objects to the portable device 110 through the communication unit 131. At operation S2005, information about a second arrangement of the plurality of objects is received from the portable device 110 through the communication unit 131. At operation S2006, the processor 133 transmits the information about the second arrangement to the display device 120 through the communication unit 131.
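The server-side relay of operations S2001 and S2002, in which the display device's address (for example, an IP address) is embedded in the information request signal and used for forwarding, might look like the following stubbed sketch. Networking is replaced by a callback, and all names are assumptions for illustration.

```python
# Hypothetical relay step: the server pulls the display device's address
# out of the request and forwards a display-information request to it.
# "display_ip" and the message shape are invented for this sketch.

def relay_request(request, send):
    """Forward a display-information request to the address in `request`."""
    display_addr = request["display_ip"]   # address embedded in the request
    send(display_addr, {"type": "display_info_request"})
    return display_addr

sent = []
addr = relay_request({"display_ip": "192.168.0.10"},
                     lambda a, m: sent.append((a, m)))
print(addr, sent[0][1]["type"])   # 192.168.0.10 display_info_request
```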
The server 130 may classify the information about the plurality of objects displayed on the display device 120 in association with user information, store the associated information in the storage unit 132, and manage the associated information. In this case, when the information about the plurality of objects displayed on the display device 120 is not information about objects intended only for the user of the portable device 110, the user of the portable device 110 may request the server 130 to display, on the display device 120, only the information about the plurality of objects intended for the user. In other words, user-specific information, for example, only the objects associated with the user, may be displayed. The user-specific information may be determined according to an identifier of the user or an identifier of the portable device 110 associated with the user.
According to the request, the server 130 transmits the information about the plurality of objects intended only for the user of the portable device 110, which is stored in the server 130, to the display device 120 and the portable device 110. Accordingly, the portable device 110 and the display device 120 simultaneously display the information about the plurality of objects received from the server 130.
The portable device 110 displays the information about the plurality of objects, and controls screen editing or a function based on the information about the plurality of objects. Whenever a control event occurs or an editing operation is completed, a result of the control of the screen editing or the function is transmitted to the display device 120 through the server 130, or may be transmitted directly from the portable device 110 to the display device 120.
The display device 120 edits the screen according to a screen editing control signal, or controls a function of a displayed object according to a function control signal.
FIG. 21 is a flowchart illustrating a method of controlling screen editing of the display device 120, according to an exemplary embodiment. The method of FIG. 21 is performed by the processor 305.
At operation S2101, the processor 305 displays a plurality of objects according to a first arrangement. At operation S2102, the processor 305 receives, from the server 130 through the communication unit 302, a request signal for the displayed information. At operation S2103, the processor 305 transmits the information about the displayed plurality of objects to the server 130.
Next, at operation S2104, information about a second arrangement is received from the server 130 through the communication unit 302. At operation S2105, the processor 305 displays a screen obtained by editing at least one of the plurality of objects displayed on the display unit 307, based on the information about the second arrangement.
FIG. 22 is a flowchart illustrating a method of controlling screen editing of the display device 120, according to an exemplary embodiment.
At operation 2201, the display device 120 displays a plurality of objects. At operation 2202, the portable device 110 transmits an access authorization request signal to the server 130.
The plurality of objects displayed on the display device 120 may be objects provided by the server 130 only for the user of the portable device 110. That is, the objects may be user-specific objects.
At operation 2203, the server 130 determines whether to allow access by the portable device 110. The determination may be performed based on information about the portable device 110. The information about the portable device 110 may include identification information of the portable device 110 and/or user information of the portable device 110.
If it is determined that access is allowed, the method proceeds to operation 2207. At operation 2207, the server 130 may notify the portable device 110 that access is allowed. Alternatively, the access control may be performed by the display device 120 instead of the server 130.
In that case, at operation 2204, the server 130 relays the access authorization request signal to the display device 120. At operation 2205, the display device 120 determines whether to allow access. At operation 2206, the display device 120 notifies the server 130 that access is allowed. At operation 2207, the server 130 may then notify the portable device 110 that access is allowed.
At operation 2208, the portable device 110 transmits an information request signal to the server 130. At operation 2209, the server 130 transmits an information request signal for the plurality of objects to the display device 120. The information request signal for the information about the plurality of objects may represent a display information request signal.
At operation 2210, the display device 120 transmits, to the server 130, the information about the plurality of objects currently displayed on the display device 120. At operation 2211, the server 130 transmits the information about the plurality of objects received from the display device 120 to the portable device 110.
At operation 2212, the portable device 110 performs screen editing. At operation 2213, the portable device 110 transmits a control command corresponding to the screen editing (for example, information about a second arrangement of the objects) to the server 130.
At operation 2214, the server 130 converts the control command into a control command having a data format that can be processed by the display device 120. At operation 2215, the server 130 transmits the converted control command to the display device 120. At operation 2216, the display device 120 performs editing according to the converted control command received from the server 130. At operation 2217, when the editing is completed, the display device 120 displays the edited screen and transmits an edit completion message to the server 130. At operation 2218, the server 130 stores the result, and at operation 2219, the server 130 transmits a completion notification message to the portable device 110. The stored result may include history information according to the editing. The server 130 may transmit the edit history information to the portable device 110 and/or the display device 120, thereby sharing the edit history information with the portable device 110 and/or the display device 120.
At least one program including commands for performing a method of controlling screen editing of a display device according to one or more exemplary embodiments may be implemented as computer-readable code on a computer-readable recording medium. The computer-readable recording medium includes any storage device that can store data readable by a computer system. Examples of the computer-readable recording medium include read-only memory (ROM), random-access memory (RAM), CD-ROMs, magnetic tapes, floppy disks, and optical data storage devices. The computer-readable recording medium can also be distributed over network-coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion.
While exemplary embodiments have been particularly shown and described, the exemplary embodiments and the terms used herein are provided merely to enable a thorough understanding of the present disclosure and should not be construed as limiting. The exemplary embodiments should be considered in a descriptive sense only and not for purposes of limitation. Therefore, the scope of the inventive concept is defined not by the detailed description but by the appended claims, and all differences within the scope will be construed as being included therein.

Claims (15)

1. A method of controlling a display device, the method comprising:
receiving first display information of a first arrangement of icons displayed on a screen of the display device;
displaying the first arrangement of the icons on a display of a portable device based on the first display information;
modifying the first arrangement of the icons displayed on the display of the portable device to produce a second arrangement of the icons;
generating second display information based on the second arrangement of the icons; and
transmitting, to the display device, a request to display the second arrangement of the icons on the display of the display device, wherein the request comprises the second display information.
2. The method of claim 1, wherein the first display information comprises display position information of the icons, wherein the display position information indicates positions at which the icons are displayed on the screen of the display device.
3. The method of claim 2, wherein the display position information comprises coordinate information indicating, with reference to coordinate positions of the screen of the display device, absolute positions at which the icons are displayed on the screen of the display device.
4. The method of claim 2 or claim 3, wherein the display position information comprises coordinate information indicating relative positions of the icons with respect to each other on the screen of the display device.
5. The method of claim 2 or claim 3, wherein the first display information further comprises icon image data of the icons.
6. The method of claim 5, wherein the first display information is extensible markup language (XML) data.
7. The method of claim 2 or claim 3, wherein the second display information comprises the first display information modified to indicate new positions at which the icons are to be displayed on the screen of the display device.
8. The method of claim 2 or claim 3, wherein the second display information comprises a difference between the first display information and new positions at which the icons are to be displayed on the screen of the display device.
9. The method of claim 1, claim 2, or claim 3, further comprising displaying the second arrangement of the icons on the display of the portable device while the first arrangement of the icons is displayed on the display of the display device by the display device.
10. The method of claim 1, claim 2, or claim 3, wherein the modifying comprises at least one of the following steps:
changing a position of at least one of the icons under the first arrangement of the icons to a new position of the at least one icon under the second arrangement of the icons;
deleting at least one of the icons under the first arrangement of the icons; and
adding a new icon to the icons under the first arrangement of the icons.
11. The method of claim 10, wherein the first display information is extensible markup language (XML) data,
wherein the modifying comprises modifying the XML data based on at least one of the changing, the deleting, and the adding.
12. The method of claim 1, claim 2, or claim 3, wherein the modifying comprises:
receiving an input on the display of the portable device, wherein the input is for modifying the icons under the first arrangement of the icons; and
producing the second arrangement of the icons based on the input.
13. The method of claim 1, claim 2, or claim 3, wherein the icons comprise an icon representing content to be reproduced by the display device and an icon representing an application of the display device.
14. A portable device comprising:
a display;
a communication unit which receives first display information of a first arrangement of icons displayed on a screen of a display device; and
a controller which controls the display to display the first arrangement of the icons based on the first display information, receives an input modifying the first arrangement of the icons displayed on the display, produces a second arrangement of the icons based on the input, generates second display information based on the second arrangement of the icons, and controls the communication unit to transmit, to the display device, a request to display the second arrangement of the icons on the display of the display device, wherein the request comprises the second display information.
15. A non-transitory computer-readable medium having recorded thereon a program that causes a portable device to perform a method of controlling a display device, the method comprising:
receiving first display information indicating a first arrangement of icons displayed on a screen of the display device;
displaying the first arrangement of icons on a display of the portable device based on the first display information;
modifying the first arrangement of icons displayed on the display of the portable device to generate a second arrangement of icons;
generating second display information based on the second arrangement of icons; and
transmitting, to the display device, a request to display the second arrangement of icons on a display of the display device, wherein the request includes the second display information.
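Claims 14 and 15 describe a round trip: the portable device receives the first display information, produces a second arrangement from user input, and sends the display device a request carrying the second display information. The following is a minimal sketch of that control flow with plain Python stand-ins for the communication unit and display device; all class and method names are illustrative assumptions, not the patent's implementation.

```python
class DisplayDevice:
    """Stand-in for the display device (e.g. a TV) whose screen shows icons."""

    def __init__(self, arrangement):
        self.arrangement = arrangement  # icon ids in display order

    def first_display_info(self):
        # "First display information" describing the current icon arrangement.
        return list(self.arrangement)

    def handle_request(self, second_display_info):
        # The request includes the second display information (claims 14-15).
        self.arrangement = list(second_display_info)


class PortableDevice:
    """Stand-in for the portable device acting as the controller."""

    def __init__(self, display_device):
        self.display_device = display_device

    def control(self, modify):
        first = self.display_device.first_display_info()  # receive first arrangement
        second = modify(first)                            # modify into second arrangement
        self.display_device.handle_request(second)        # transmit the request


tv = DisplayDevice(["tv_app", "browser", "music"])
phone = PortableDevice(tv)
# Simulated user input: swap the first two icons and drop the third.
phone.control(lambda icons: [icons[1], icons[0]])
```

After the call, the display device shows the second arrangement produced on the portable device, which is the behavior the claims recite.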
CN201380018952.XA 2012-04-07 2013-04-08 Method and system for controlling display device and computer-readable recording medium Pending CN104272253A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
KR20120036403 2012-04-07
KR10-2012-0036403 2012-04-07
KR1020130036173A KR102037415B1 (en) 2012-04-07 2013-04-03 Method and system for controlling display device, and computer readable recording medium thereof
KR10-2013-0036173 2013-04-03
PCT/KR2013/002906 WO2013151399A1 (en) 2012-04-07 2013-04-08 Method and system for controlling display device and computer-readable recording medium

Publications (1)

Publication Number Publication Date
CN104272253A true CN104272253A (en) 2015-01-07

Family

ID=49634285

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201380018952.XA Pending CN104272253A (en) 2012-04-07 2013-04-08 Method and system for controlling display device and computer-readable recording medium

Country Status (2)

Country Link
KR (1) KR102037415B1 (en)
CN (1) CN104272253A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104918135A (en) * 2015-06-01 2015-09-16 无锡天脉聚源传媒科技有限公司 Video station caption generating method and device
CN105163160A (en) * 2015-08-29 2015-12-16 天脉聚源(北京)科技有限公司 Method and device for improving information synthesis security
CN105630286A (en) * 2015-12-18 2016-06-01 小米科技有限责任公司 Icon arranging method and device
CN106020751A * 2015-12-14 2016-10-12 G思玛特有限公司 Display screen driving system and method using mobile device

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102197886B1 (en) * 2014-03-18 2021-01-04 주식회사 엘지유플러스 Method for controlling wearable device and apparatus thereof
KR102208047B1 (en) * 2014-03-18 2021-01-26 주식회사 엘지유플러스 Method for controlling wearable device and apparatus thereof
KR20200097012A (en) * 2019-02-07 2020-08-18 주식회사 엔씨소프트 System and method for terminal device control
KR20220081723A (en) * 2020-12-09 2022-06-16 삼성전자주식회사 Electronic apparatus, order management system and controlling method thereof

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1355994A * 1999-06-11 2002-06-26 联合视频制品公司 Interactive television application system with hand-held application device
CN101202827A (en) * 2006-12-12 2008-06-18 索尼株式会社 Portable terminal, displaying method, and storage medium
US20100223563A1 (en) * 2009-03-02 2010-09-02 Apple Inc. Remotely defining a user interface for a handheld device
US20120042272A1 (en) * 2010-08-12 2012-02-16 Hong Jiyoung Mobile terminal and method of controlling the same

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120081299A1 (en) * 2010-10-04 2012-04-05 Verizon Patent And Licensing Inc. Method and apparatus for providing remote control via a touchable display

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104918135A (en) * 2015-06-01 2015-09-16 无锡天脉聚源传媒科技有限公司 Video station caption generating method and device
CN104918135B * 2015-06-01 2018-05-08 无锡天脉聚源传媒科技有限公司 Video station caption generation method and device
CN105163160A (en) * 2015-08-29 2015-12-16 天脉聚源(北京)科技有限公司 Method and device for improving information synthesis security
CN106020751A * 2015-12-14 2016-10-12 G思玛特有限公司 Display screen driving system and method using mobile device
CN106020751B (en) * 2015-12-14 2017-10-31 天津中节能智能玻显科技有限公司 Display screen driving system and method using mobile device
CN105630286A (en) * 2015-12-18 2016-06-01 小米科技有限责任公司 Icon arranging method and device
CN105630286B (en) * 2015-12-18 2019-10-11 小米科技有限责任公司 Icon arrangement method and device

Also Published As

Publication number Publication date
KR102037415B1 (en) 2019-10-28
KR20130113987A (en) 2013-10-16

Similar Documents

Publication Publication Date Title
US10175847B2 (en) Method and system for controlling display device and computer-readable recording medium
CN104272253A (en) Method and system for controlling display device and computer-readable recording medium
CN104281430B Method and apparatus for executing a function related to information displayed on an external device
CN105683894B (en) Application execution method of display device and display device thereof
JP5233708B2 (en) Information processing apparatus, information processing method, and program
CN110362246B (en) Method of controlling electronic device, and storage medium
EP3343412B1 (en) Method and system for reproducing contents, and computer-readable recording medium thereof
US20120159340A1 (en) Mobile terminal and displaying method thereof
CN104765584A (en) User terminal apparatus and control method thereof
CN103530032A (en) Mobile terminal, image display device and user interface providing method using the same
CN104603763A (en) Information transmission method and system, device, and computer readable recording medium thereof
CN103853427A (en) Display device for executing a plurality of applications and method for controlling the same
KR20160018001A (en) Mobile terminal and method for controlling the same
KR20130113983A (en) Method and system for playing contents, and computer readable recording medium thereof
KR101943988B1 (en) Method and system for transmitting content, apparatus and computer readable recording medium thereof
CN104395877A (en) Method and apparatus for performing auto-naming of content, and computer-readable recording medium thereof
CN108521595A Selection position recommendation method and apparatus based on voice interaction, and smart television
KR20180020452A (en) Terminal and method for controlling the same
KR20170099088A (en) Electronic device and method for controlling the same
KR20130001826A (en) Mobile terminal and control method therof
KR20160090709A (en) Mobile terminal and method for controlling the same
CN108540851A Selection position recommendation method and apparatus based on voice interaction, and smart television
AU2017202560B2 (en) Method and system for reproducing contents, and computer-readable recording medium thereof
KR20180057035A (en) Mobile terminal and method for controlling the same
KR20170032095A (en) Mobile terminal and method for controlling the same

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20150107

RJ01 Rejection of invention patent application after publication