WO2014035123A1 - User terminal apparatus and control method thereof - Google Patents

User terminal apparatus and control method thereof

Info

Publication number
WO2014035123A1
Authority
WO
WIPO (PCT)
Prior art keywords
area
window
application window
terminal apparatus
user
Prior art date
Application number
PCT/KR2013/007695
Other languages
English (en)
Inventor
Kang-Tae Kim
Eun-Young Kim
Duck-Hyun Kim
Chul-Joo Kim
Kwang-Won Sun
Jae-Yeol Lee
Original Assignee
Samsung Electronics Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd.
Publication of WO2014035123A1

Classifications

    • G06F 3/04886 — Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 1/1626 — Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G06F 3/0412 — Digitisers structurally integrated in a display
    • G06F 3/0416 — Control or interface arrangements specially adapted for digitisers
    • G06F 3/04812 — Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G06F 3/14 — Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 2203/04803 — Split screen, i.e. subdividing the display area or the window area into separate subareas
    • G06F 2203/04808 — Several contacts: gestures triggering a specific function when the user establishes several contacts with the surface simultaneously, e.g. using several fingers or a combination of fingers and pen

Definitions

  • Methods and apparatuses consistent with the exemplary embodiments relate to a user terminal apparatus and control method thereof, and more particularly, to a touch based user terminal apparatus and a control method thereof.
  • Display apparatuses such as TVs, PCs, laptop computers, tablet PCs, mobile phones, and MP3 players are widely used in most households.
  • Mobile terminals such as tablet PCs and mobile phones also provide multi-tasking environments which execute a plurality of applications at the same time.
  • The purpose of the present invention is to provide a user terminal apparatus which automatically adjusts an application window to satisfy a user’s needs in a multi-tasking environment, and a control method thereof.
  • a user terminal apparatus includes a display which displays an application window on a screen; a user interface unit which receives an input of a user command; and a controller which controls so that a title area is displayed on one area of the application window when a first user command for displaying the title area on the application window is input and that the title area automatically disappears when a predetermined event occurs.
  • The predetermined event may be an event where a predetermined time passes from the display point of the title area.
  • The controller may display the title area when the first user command is input while the application window is displayed on the entire area of the screen.
  • The first user command may be a user manipulation of touching a predetermined area of the application window.
  • The one area of the application window may be an area adjacent to the predetermined area.
  • The controller may control the apparatus to enter a multi-window mode for displaying a plurality of application windows on the screen at the same time, when a second user command which manipulates the title area is input while the title area is displayed on the application window.
  • The controller may reduce the application window to a predetermined size in the multi-window mode, and display the title area so that it is regularly included in one area of the reduced application window.
  • The second user command may be a touch manipulation regarding the title area.
  • The user terminal apparatus may enter a multi-window mode for displaying a plurality of application windows on the screen at the same time, when a double-tap manipulation regarding a predetermined area of the application window or a touch-and-flick manipulation regarding the predetermined area is input.
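The gesture handling described in the claims above — a double tap or a touch-and-flick on a predetermined area triggering the multi-window mode — could be sketched roughly as follows. This is an illustrative model only; the class, method, and gesture names are assumptions, not taken from the patent.

```python
# Hypothetical sketch: mapping touch gestures on a predetermined area of an
# application window to a screen-mode change, as described in the claims.

class WindowManager:
    def __init__(self):
        self.mode = "normal"  # one application window on the entire screen

    def on_gesture(self, gesture, in_predetermined_area):
        """Enter multi-window mode on a double tap or touch-and-flick
        performed on the predetermined (e.g. information display) area."""
        if not in_predetermined_area:
            return self.mode
        if gesture in ("double_tap", "touch_and_flick"):
            self.mode = "multi_window"  # several windows shown at once
        return self.mode

wm = WindowManager()
print(wm.on_gesture("double_tap", in_predetermined_area=True))  # multi_window
```

Gestures outside the predetermined area leave the mode unchanged, matching the claim's restriction to a specific area of the window.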
  • The user terminal apparatus may be a touch-based mobile terminal.
  • a control method of a user terminal apparatus may include displaying an application window on a screen; displaying a title area on one area of the application window, when a first user command for displaying the title area on the application window is input; and controlling so that the title area automatically disappears when a predetermined event occurs.
  • The predetermined event may be an event where a predetermined time passes from the display point of the title area.
  • In the displaying, the title area may be displayed when the first user command is input while the application window is displayed on the entire area of the screen.
  • The first user command may be a user manipulation of touching a predetermined area of the application window.
  • The control method of a user terminal apparatus may further include entering a multi-window mode for displaying a plurality of application windows on the screen at the same time, when a second user command which manipulates the title area is input while the title area is displayed on the application window.
  • FIG. 1 is a view illustrating a user terminal apparatus according to an exemplary embodiment of the present disclosure
  • FIG. 2 is a block diagram illustrating a detailed configuration of a user terminal apparatus illustrated in FIG. 1;
  • FIGs. 3A and 3B are views for explaining a software configuration stored in a storage unit;
  • FIG. 4 is a view for explaining a method of displaying an application which supports a multi window mode according to an exemplary embodiment of the present disclosure
  • FIGs. 5A and 5B are views for explaining an application window display format in a case of executing an application in a multi-window mode and a general mode according to another exemplary embodiment of the present disclosure;
  • FIGs. 6A and 6B are views for explaining a method of moving an application window in a multi-window mode according to another exemplary embodiment of the present disclosure;
  • FIG. 7 is a view for explaining a method of adjusting a size of an application window in a multi window mode according to another exemplary embodiment of the present disclosure
  • FIGs. 8 to 10 are views for explaining a method of displaying a title area in an application window according to various exemplary embodiments of the present disclosure.
  • FIG. 11 is a view for explaining a control method of a user terminal apparatus according to an exemplary embodiment of the present disclosure.
  • FIG. 1 is a view illustrating a user terminal apparatus according to an exemplary embodiment of the present disclosure.
  • FIG. 1(a) is a schematic diagram for explaining an embodiment example of a user terminal apparatus according to an exemplary embodiment of the present disclosure.
  • The user terminal apparatus 100 may be embodied as a tablet PC, but is not limited thereto, and thus may be embodied as various types of apparatuses which are portable and have display functions, such as smart phones, PMPs, and PDAs.
  • The user terminal apparatus 100 includes a touch screen, and thus may be embodied to execute a program using a finger or a pen.
  • A tablet PC, which is an embodiment example of the user terminal apparatus 100 of the present disclosure, is an apparatus where the portability of a PDA and the functions of a notebook computer are combined.
  • A tablet PC may have the functions of a desktop and at the same time use wireless internet; the main input apparatus may be a touch screen, but an existing keyboard or mouse may also be connected and used.
  • A tablet PC may have a function of perceiving a user’s handwriting and storing it as data.
  • The user terminal apparatus 100 may provide a multi-tasking environment executing a plurality of applications at the same time and performing operations.
  • Hereinbelow, a method of providing an application window in a multi-tasking environment according to various exemplary embodiments of the present disclosure will be explained.
  • FIG. 1(b) is a block diagram illustrating a configuration of a user terminal apparatus according to an exemplary embodiment of the present disclosure.
  • the user terminal apparatus 100 includes a display 110, user interface unit 120 and controller 130.
  • the display 110 displays a screen.
  • The screen may include an application execution screen, a GUI (Graphic User Interface) screen, etc., which includes various objects such as an image, video, and text.
  • The display 110 may display a plurality of application execution screens, that is, a plurality of application windows, at the same time. Such a screen display mode is called a multi-window mode hereinbelow.
  • An application window provided in such a multi-window mode may be provided in a format where location moving, size adjustment, pin-up functions, etc. are possible.
  • The application window may include a title area (or title bar) which includes menu items for providing the corresponding functions. More specifically, a maximizing button, an end button, a pin-up button, etc. may be provided on the title area. Accordingly, through a manipulation of each button, it is possible to receive a window maximizing command, a window end command, a window pin-up command, etc.
  • The application window provided in the multi-window mode may be embodied to have a small size which occupies a partial area of the entire screen of the display 110, to enable easy location moving and size adjustment.
  • Such a window mode is called a mini mode.
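A mini-mode window as described above — a reduced window whose location and size can be adjusted within the screen — might be modelled as in the following sketch. All names and the clamping policy are illustrative assumptions, not the patent's implementation.

```python
# Illustrative sketch of a mini-mode window: a reduced window occupying a
# partial screen area, supporting location moving and size adjustment.

class MiniWindow:
    def __init__(self, x, y, w, h, screen_w=1280, screen_h=800):
        self.x, self.y, self.w, self.h = x, y, w, h
        self.screen_w, self.screen_h = screen_w, screen_h

    def move(self, dx, dy):
        # Keep the window fully inside the screen while moving.
        self.x = max(0, min(self.x + dx, self.screen_w - self.w))
        self.y = max(0, min(self.y + dy, self.screen_h - self.h))

    def resize(self, w, h):
        # Clamp to an assumed minimum size and to the remaining screen area.
        self.w = max(100, min(w, self.screen_w - self.x))
        self.h = max(100, min(h, self.screen_h - self.y))
```

For example, moving a window past the left screen edge simply pins it at x = 0, so the mini window can never be dragged off screen.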
  • The display 110 may display one application window maximized to the entire screen.
  • Such a screen display mode is called a normal mode.
  • The application window provided in the normal mode may be provided in a format which does not include a title area, so as to provide an application execution screen as wide as possible.
  • The application window provided in the normal mode is called a maximization mode hereinbelow, in that it has a maximized size which occupies the entire screen area.
  • The display 110 may be embodied as an LCD (Liquid Crystal Display) panel, an OLED (Organic Light Emitting Diode) display, etc., but is not limited thereto.
  • The display 110 may be embodied in a touch screen format forming a layered structure with a touch pad.
  • In this case, the display 110 may be used not only as an output device but also as the user interface unit 120 to be explained hereinbelow.
  • The touch screen may be formed to detect not only a touch input location and area but also touch input pressure.
  • the user interface unit 120 receives various user commands.
  • the user interface unit 120 may receive a user command for displaying a title area on the application window.
  • the user interface unit 120 may receive a user command for displaying a title area on the application window with the application window displayed in a maximization mode occupying the entire area of the screen.
  • The user command may be a user manipulation of touching a predetermined area of the application window, especially a one-tap manipulation, a flick manipulation, etc.
  • The predetermined area may be an information display area (or menu display area) provided in a top end area of the application window, but is not limited thereto.
  • On the title area, a restoration button for restoring the window to a mini mode and an end button for ending the window may be provided. That is, as aforementioned, it is possible to receive not only a touch manipulation regarding an area of the title area excluding the areas where the buttons are provided, but also a user command for restoring the maximization mode to a mini mode through a touch manipulation regarding the restoration button provided on the title area.
  • The user interface unit 120 may receive a user command for restoring the size of the application window to a mini mode reduced to a predetermined size, with the title area displayed on the application window.
  • The user command may be a touch manipulation regarding an area of the title area excluding the areas where the buttons are provided, or a touch manipulation regarding a window return button provided on the title area.
  • The user interface unit 120 may also receive a user command for restoring the window to a mini mode with the application window maximized and the title area not displayed.
  • The user command may be a double-tap manipulation, a flick-down manipulation, etc. regarding the information display area.
  • the user interface unit 120 may receive a user command for moving the location of the application window in a multi window mode, a user command for adjusting a size of the application window, and a user command for pinning up of the application window.
  • A pin-up command may be embodied to toggle pin-up ON/OFF through a tap operation regarding the pin-up button. That is, while pin-up is ON, the application window may be displayed in a fixed state, and while pin-up is OFF, it may be embodied such that location moving and size adjustment are possible.
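The pin-up toggle described above can be sketched as a simple state flip: each tap on the pin-up button inverts the pinned state, and the pinned state in turn gates whether the window may be moved or resized. The names here are hypothetical.

```python
# Illustrative sketch of the pin-up ON/OFF behaviour: tapping the pin-up
# button toggles the state; while pinned, the window stays fixed.

class PinnableWindow:
    def __init__(self):
        self.pinned = False  # pin-up OFF initially

    def tap_pin_button(self):
        # Each tap on the pin-up button toggles pin-up ON/OFF.
        self.pinned = not self.pinned

    @property
    def movable(self):
        # While pin-up is ON the window is fixed; OFF allows move/resize.
        return not self.pinned
```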
  • the controller 130 controls the overall operations of the user terminal apparatus 100.
  • When the first user command is input, the controller 130 displays a title area on one area of the application window, and when a predetermined event occurs, the controller 130 may control so that the title area automatically disappears.
  • Here, the predetermined event may be an event where a predetermined time passes, but is not limited thereto.
  • The controller 130 may receive the corresponding user command while the application window is displayed in a maximization mode.
  • On the title area, a return button for returning to a mini mode and an end button for ending the window may be provided, unlike in the mini mode.
  • The title area may be displayed on a top end area of the application window.
  • A title area is displayed on an area adjacent to and above the information display area, and after a predetermined time passes, the title area may automatically disappear.
  • The predetermined time may be, for example, within 3 seconds, but is not limited thereto.
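The auto-hide behaviour just described — the title area appears on a user command and disappears once a predetermined time (3 seconds is given only as an example) has passed — could be implemented with a simple timestamp check, sketched below with hypothetical names.

```python
# Illustrative sketch: a title area shown on command that automatically
# disappears after a predetermined timeout, per the description above.

class TitleArea:
    TIMEOUT = 3.0  # seconds; the patent gives 3 s only as an example

    def __init__(self):
        self.visible = False
        self.shown_at = None

    def show(self, now):
        """Display the title area; `now` is the current time in seconds."""
        self.visible = True
        self.shown_at = now

    def tick(self, now):
        # Called periodically; hides the title area once the timeout passes.
        if self.visible and now - self.shown_at >= self.TIMEOUT:
            self.visible = False

t = TitleArea()
t.show(now=0.0)
t.tick(now=1.0)   # still visible
t.tick(now=3.5)   # timeout passed -> hidden
```

In a real UI the `tick` calls would come from the framework's timer or animation loop rather than explicit timestamps.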
  • The controller 130 may control the apparatus to enter the multi-window mode for displaying a plurality of application windows on the screen.
  • The user command may be a flick manipulation, especially a flick down of the title area, but is not limited thereto, and may also be embodied as a one-tap manipulation or a drag manipulation.
  • The controller 130 may control so that the application window is displayed in a mini mode reduced to a predetermined size.
  • The title area may be displayed regularly on one area of the mini window. That is, in the maximization mode where the application window is displayed on the entire screen, the title area is not displayed regularly but is displayed only according to the user command, whereas in the multi-window mode, the title area may be displayed regularly.
  • In the above description, the title area is displayed with the application window displayed on the entire screen, that is, in a maximization mode, and then automatically disappears, but the disclosure is not limited thereto. That is, depending on the embodiment, the technology may also be applied to the mini window format.
  • The controller 130 may control to display a guide GUI for guiding location moving and size adjustment of the application window according to the user command in the multi-window mode. In this case, the controller 130 may control to provide a haptic feedback while the guide GUI is displayed.
  • The controller 130 may provide various menu items for executing the multi-window mode and the normal mode.
  • FIG. 2 is a block diagram illustrating a detailed configuration according to an exemplary embodiment of a user terminal apparatus illustrated in FIG. 1.
  • The user terminal apparatus 100 includes a display 110, user interface unit 120, controller 130, storage unit 140, application driver 150, feedback provider 160, communication unit 170, audio processor 180, video processor 185, speaker 190, button 191, USB port 192, camera 193, and microphone 194.
  • Operations of the aforementioned controller 130 may be performed by programs stored in the storage unit 140.
  • The storage unit 140 stores an O/S (Operating System) software module for driving the user terminal apparatus 100, various applications, and various data and contents that are input or set during execution of the applications.
  • The applications stored in the storage unit 140 may be classified into applications which do and do not support the multi-window mode.
  • the storage unit 140 may store various formats of templates where the layout for arranging a plurality of application windows on the screen in the multi window mode is predefined.
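The predefined layout templates mentioned above — arrangements for placing several application windows on the screen in the multi-window mode — could look like the following sketch. The template names and the specific layouts are assumptions for illustration; the patent only states that various template formats may be stored.

```python
# Illustrative sketch: predefined layout templates that partition the screen
# into rectangles, one per application window, for the multi-window mode.

def split_layout(template, screen_w, screen_h):
    """Return a list of (x, y, w, h) rects, one per window slot."""
    if template == "side_by_side":
        half = screen_w // 2
        return [(0, 0, half, screen_h), (half, 0, screen_w - half, screen_h)]
    if template == "top_bottom":
        half = screen_h // 2
        return [(0, 0, screen_w, half), (0, half, screen_w, screen_h - half)]
    if template == "quad":
        hw, hh = screen_w // 2, screen_h // 2
        return [(0, 0, hw, hh), (hw, 0, screen_w - hw, hh),
                (0, hh, hw, screen_h - hh),
                (hw, hh, screen_w - hw, screen_h - hh)]
    raise ValueError(f"unknown template: {template}")

print(split_layout("side_by_side", 1280, 800))
```

Each template tiles the whole screen, so the slot areas always sum to the full screen area regardless of odd dimensions.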
  • The application driver 150 performs a function of driving and executing an application which may be provided in the user terminal apparatus 100.
  • Here, an application is an application program which may be executed by itself, and may include various multimedia contents.
  • The multimedia contents include text, audio, still images, animation, video, interactivity contents, EPG (Electronic Program Guide) contents from contents providers, electronic messages received from users, and information on current events, but are not limited thereto.
  • the feedback provider 160 performs a function of providing various feedback according to the function executed in the user terminal apparatus 100.
  • the feedback provider 160 may provide a haptic feedback regarding the GUI displayed on the screen.
  • Haptic feedback is a technology of enabling a user to sense touch by generating vibration, force, or impact in the user terminal apparatus 100, and may also be called computer touch technology.
  • The feedback provider 160 may provide haptic feedback regarding the corresponding guide GUI when a guide GUI for guiding location moving and size adjustment of the application window is displayed according to the user command in the multi-window mode.
  • The feedback provider 160 may apply different vibration conditions (for example, vibration frequency, vibration length, vibration intensity, vibration wave, vibration location, etc.) according to the control by the controller 130, and provide various feedbacks.
  • A method of applying vibration conditions differently and generating various haptic feedbacks is known in the art, and thus detailed explanation thereof is omitted.
  • Here, the feedback provider 160 uses a vibration sensor to provide haptic feedback, but this is merely an exemplary embodiment; the feedback provider 160 may instead use a piezo sensor to provide haptic feedback.
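The vibration conditions listed above (frequency, length, intensity, etc.) could be organized as a simple lookup from UI events to vibration parameters, as in this sketch. The event names and all numeric values are invented for illustration; the patent does not specify them.

```python
# Illustrative sketch: mapping UI events to vibration conditions
# (frequency, length, intensity), per the description of the feedback
# provider 160. Event names and values are hypothetical.

VIBRATION_PATTERNS = {
    # event: (frequency_hz, length_ms, intensity_0_to_1)
    "guide_gui_shown": (150, 40, 0.4),
    "window_snapped":  (200, 25, 0.6),
    "pin_up_toggled":  (120, 60, 0.5),
}

def feedback_for(event):
    """Return the vibration condition for an event, or None for silence."""
    return VIBRATION_PATTERNS.get(event)
```

A real feedback provider would hand these parameters to a vibration driver; keeping them in one table makes it easy to tune patterns per event.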
  • the communication unit 170 is a configuration of performing communication with various types of external devices according to various types of communication methods.
  • The communication unit 170 includes a Wi-Fi chip 171, a Bluetooth chip 172, a wireless communication chip 173, etc.
  • The Wi-Fi chip 171 and the Bluetooth chip 172 perform communication in a Wi-Fi method and a Bluetooth method, respectively.
  • The wireless communication chip 173 refers to a chip which performs communication according to various communication standards such as IEEE, ZigBee, 3G (3rd Generation), 3GPP (3rd Generation Partnership Project), and LTE (Long Term Evolution).
  • The communication unit 170 may further include an NFC chip which operates in an NFC (Near Field Communication) method using the 13.56 MHz band from among various RF-ID frequency bands such as 135 kHz, 13.56 MHz, 433 MHz, 860-960 MHz, and 2.45 GHz.
  • The audio processor 180 is a configurative element which performs processing of audio data.
  • In the audio processor 180, various processing such as decoding, amplifying, and noise filtering may be performed on audio data.
  • The video processor 185 is a configurative element which performs processing of video data.
  • The video processor 185 may perform various image processing such as decoding, scaling, noise filtering, frame rate conversion, and resolution conversion on video data.
  • The speaker 190 is a configurative element which outputs not only various audio data processed in the audio processor 180 but also various notification sounds, voice messages, etc.
  • The button 191 may be a button of various types such as a mechanical button, touch pad, wheel, etc., formed in an arbitrary area such as the front surface, side surface, or rear surface of the exterior of the main body.
  • A button for turning the power of the user terminal apparatus 100 ON/OFF may be provided.
  • the USB port 192 may perform communication with various external apparatuses or charging with various external apparatuses through a USB cable.
  • the camera 193 is a configuration of photographing a still image or video according to a control by a user.
  • a plurality of cameras 193 may be embodied such as a front camera and rear camera etc.
  • The microphone 194 is a configuration for receiving a user voice or other sound and converting the received sound into audio data.
  • The controller 130 may use a user voice input through the microphone 194 in a call process, or convert the input voice into audio data and store it in the storage unit 140.
  • The controller 130 may perform control operations according to a user voice input through the microphone 194 or a user motion perceived by the camera 193. That is, the user terminal apparatus 100 may operate in a motion control mode or a voice control mode. In a case of operating in the motion control mode, the controller 130 activates the camera 193 to photograph the user, and tracks changes of the user’s motion to perform a control operation corresponding thereto. In a case of operating in the voice control mode, the controller 130 may analyze the user voice input through the microphone 194, and operate in a voice recognition mode for performing a control operation according to the analyzed user voice.
  • various external input ports for connecting with various external terminals such as a headset, mouse, and LAN etc. may be further included.
  • The controller 130 uses various programs stored in the storage unit 140 to control the overall operations of the user terminal apparatus 100.
  • the controller 130 may execute an application stored in the storage unit 140 and configure and display its execution screen, and may also reproduce various content stored in the storage unit 140.
  • the controller 130 may perform communication with external devices through the communication unit 160.
  • the controller 130 includes a RAM 131, ROM 132, main CPU 133, graphic processor 134, 1st to nth interfaces 135-1 ⁇ 135-n, and bus 136.
  • the RAM 131, ROM 132, main CPU 133, graphic processor 134, and 1st to nth interfaces 135-1 ⁇ 135-n etc. may be connected to one another through a bus 136.
  • The 1st to nth interfaces 135-1 to 135-n are connected to the various configurative elements described above.
  • One of the interfaces may be a network interface which is connected to an external device through a network.
  • the main CPU 133 accesses the storage unit 140, and performs booting using the O/S stored in the storage unit 140. In addition, the main CPU 133 uses various programs, contents, and data etc. stored in the storage unit 140 to perform various operations.
  • a command set etc. for system booting is stored in the ROM 132.
  • the main CPU 133 copies the O/S stored in the storage unit 140 to the RAM 131 according to the command stored in the ROM 132, and executes the O/S to boot the system.
  • the main CPU 133 copies various application programs stored in the storage unit 140 to the RAM 131, and executes the application program copied to the RAM 131 to perform various operations.
  • the graphic processor 134 uses a calculator(not illustrated) and renderer(not illustrated) to generate a screen which includes various objects such as an icon, image, and text etc.
  • The calculator uses a control command received from the input apparatus to calculate feature values such as a coordinate value, format, size, and color with which each object is to be displayed according to a layout of the screen.
  • the renderer generates various screens of various layouts which include objects based on the feature values calculated in the calculator. The screen generated in the renderer is displayed within the display area of the display 110.
  • the user terminal apparatus 100 may include a sensor(not illustrated).
  • the sensor(not illustrated) may sense various operations such as a touch, rotation, inclination, pressure, and approach etc. regarding the user terminal apparatus 100.
  • the sensor(not illustrated) may include a touch sensor which senses a touch.
  • the touch sensor may be embodied into a capacitive or a pressure-sensitive touch sensor.
  • The capacitive touch sensor is a type which, when a part of the user’s body touches the surface of the display 110, uses a dielectric coated on the surface of the display 110 to sense micro electricity conducted through the user’s body and thereby calculates a touch coordinate.
  • The pressure-sensitive touch sensor includes two electrode panels; when a user touches the screen, it senses the current flowing due to contact between the upper and lower panels at the touched point to calculate a touch coordinate.
  • the touch sensor may be embodied in various formats.
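For the pressure-sensitive (resistive) type described above, the touch coordinate is commonly derived from the voltage measured where the two panels contact: the voltage divides in proportion to the touch position along each axis. The sketch below assumes a simple linear voltage-to-pixel mapping; the function name and calibration are illustrative, not from the patent.

```python
# Rough sketch of coordinate calculation for a resistive (pressure-sensitive)
# touch sensor: the voltage read on each axis divides proportionally to the
# touch position, so a linear mapping recovers pixel coordinates.

def touch_coordinate(v_x, v_y, v_ref, width, height):
    """Map per-axis voltages (0..v_ref) to pixel coordinates."""
    x = v_x / v_ref * width
    y = v_y / v_ref * height
    return (round(x), round(y))

# A touch at mid-screen horizontally and a quarter down vertically:
print(touch_coordinate(1.65, 0.825, 3.3, 1280, 800))  # (640, 200)
```

Real controllers additionally calibrate for panel non-linearity and debounce the readings, which this sketch omits.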
  • the sensor may further include a geomagnetic sensor for sensing a rotation state and moving direction etc. of the user terminal apparatus 100, and an acceleration sensor for sensing an inclination degree of the user terminal apparatus 100.
  • FIG. 2 is an example of a detailed configuration included in the user terminal apparatus 100, and thus depending on exemplary embodiments, part of the configurative elements illustrated in FIG. 2 may be omitted or changed, or other configurative elements may be further added.
  • a GPS receiver (not illustrated) for receiving a GPS (Global Positioning System) signal from a GPS satellite and calculating a present location of the user terminal apparatus 100, and
  • a DMB receiver (not illustrated) for receiving and processing a DMB (Digital Multimedia Broadcasting) signal, may be further included.
  • FIG. 3A is a view for explaining a configuration of software stored in the storage unit 140.
  • in the storage unit 140, software which includes a base module 141, a sensing module 142, a communication module 143, a presentation module 144, a web browser module 145, and a service module 146 may be stored.
  • a base module 141 refers to a basic module which processes a signal transmitted from each hardware included in the user terminal apparatus 100 and transmits the processed signal to an upper layer module.
  • the base module 141 includes a storage module 141-1, security module 141-2, and network module 141-3 etc.
  • the storage module 141-1 is a program module which manages a database (DB) or a registry.
  • the main CPU 133 may use the storage module 141-1 to access the database within the storage unit 140 and read various data.
  • the security module 141-2 is a program module which supports certification, permission, secure storage, etc. regarding hardware.
  • a network module 141-3 is a module for supporting network connection, and includes a DNET module and UpnP module etc.
  • the sensing module 142 is a module which collects information from various sensors, and analyzes and manages the collected information.
  • the sensing module 142 may include a face recognition module, voice recognition module, motion recognition module, and NFC recognition module etc.
  • the communication module 143 is a module for performing communication with the outside.
  • the communication module 143 may include a messaging module 143-1 such as a messenger program, SMS(Short Message Service) & MMS(Multimedia Message Service) program, and email program etc. and a telephone module 143-2 which includes a Call Info Aggregator program module and VoIP module etc.
  • a presentation module 144 is a module for configuring a display screen.
  • the presentation module 144 includes a multimedia module 144-1 for reproducing and outputting multimedia contents, and a UI rendering module 144-2 which performs UI and graphic processing.
  • the multimedia module 144-1 may include a player module, camcorder module, and sound processing module etc. Accordingly, it performs an operation of reproducing various multimedia contents and generating a screen and sound.
  • the UI rendering module 144-2 may include an Image Compositor module which combines images, a coordinate combination module which combines coordinates on the screen for displaying an image, an X11 module which receives various events from hardware, and a 2D/3D UI toolkit which provides a tool for configuring a UI of a 2D or 3D format.
  • the web browser module 145 refers to a module which performs web browsing and accesses the web server.
  • the web browser module 145 may include various modules such as a web view module which configures a web page, a download agent module which performs downloading, a bookmark module, and webkit module etc.
  • the service module 146 is a module which includes various applications for providing various services. More specifically, the service module 146 may include various program modules such as a navigation program, contents reproducing program, game program, e-book program, calendar program, alarm management program, and other widget program etc.
  • the user terminal apparatus 100 may be embodied in a format which further includes a location based module which is interlocked to hardware such as a GPS chip and supports base service.
  • in FIG. 3B, an example of a multi-window framework architecture on a system (e.g., an Android® system) used by the user terminal apparatus 100 according to an exemplary embodiment of the present invention in order to display a plurality of application windows on a screen is provided.
  • the framework architecture of FIG. 3B may be one component of the software illustrated in FIG. 3A; however, this will be explained by referring to a separate drawing for convenient explanation.
  • the multi-window framework architecture may include an application framework 310 and a multi-window framework 320.
  • the multi-window framework 320 may operate separately from the application framework 310.
  • the application framework 310 may include an activity manager 311, a window manager 312, and a view system 313.
  • the multi-window framework 320 may include a multi-window manager 321.
  • the activity manager 311 may request, from the multi-window framework, information corresponding to the executing window of the executed application.
  • the activity manager 311 may receive information regarding display mode, size and position of an application executing window based on a life cycle of the application executing window from the multi-window framework.
  • the activity manager 311 may call information regarding display mode, size and position of the application executing window during a creation phase of the life cycle of the application executing window.
  • the window manager 312 may confirm the application executing window corresponding to a touch inputted by a user.
  • the window manager 312 may provide position information on a display corresponding to a touch input of a user to the multi-window framework, and receive information of the application executing window corresponding to the touch input determined by the multi-window framework from the multi-window framework.
  • the window manager 312 may receive information regarding the position and size of an application executing window from the multi-window framework, and determine the application executing window corresponding to a touch input of a user based on the received position and size.
  • the view system 313 may confirm the positions and sizes of a widget window and a pop-up window.
  • the multi-window framework 320 may determine the sizes and positions of the widget window and the pop-up window, and the view system 313 may receive information regarding those sizes and positions from the multi-window framework.
  • the multi-window manager 321 included in the multi-window framework manages various operations regarding the multi-window functions provided by the user terminal apparatus 100, and provides various Application Programming Interfaces (APIs) regarding multi-window functions. Further, the multi-window service may store various APIs regarding multi-window functions. An API regarding functions common to the single window and multi-window may be implemented as a common class, and an API regarding functions applied only in multi-window may be implemented so as to be divided according to display mode.
  • the application framework 310 may further include a content provider 314, a package manager 315, a telephone manager 316, a resource manager 317, a position manager 318, a notice manager 319, and the like.
  • the multi-window framework 320 may also include a multi-window service 322, which will not be explained further because its service provision has been described above.
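The window manager’s hit-test described above (determining which application window corresponds to a touch, using per-window position and size kept by the multi-window framework) can be sketched as follows. This is a minimal, platform-neutral Python sketch, not code from the patent or the Android framework; all class and field names are hypothetical.

```python
# Hypothetical sketch: the window manager receives a touch position and asks
# which application window it falls in, using the position/size information
# the multi-window framework keeps for each window.
from dataclasses import dataclass

@dataclass
class WindowInfo:
    name: str
    x: int
    y: int
    width: int
    height: int
    z: int  # stacking order; higher means closer to the user

class MultiWindowFramework:
    def __init__(self):
        self.windows = []

    def add_window(self, info):
        self.windows.append(info)

    def window_at(self, tx, ty):
        """Return the topmost window containing the touch point, or None."""
        hits = [w for w in self.windows
                if w.x <= tx < w.x + w.width and w.y <= ty < w.y + w.height]
        return max(hits, key=lambda w: w.z) if hits else None

fw = MultiWindowFramework()
fw.add_window(WindowInfo("video", 0, 0, 400, 300, z=0))
fw.add_window(WindowInfo("memo", 200, 100, 300, 200, z=1))
print(fw.window_at(250, 150).name)  # overlapping area -> topmost window: "memo"
```

Where two windows overlap, the stacking order decides which one receives the touch, mirroring how the framework returns the window "corresponding to the touch input determined by the multi-window framework".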
  • FIG. 4 is a view for explaining a method of displaying an application which supports a multi window mode according to an exemplary embodiment of the present disclosure.
  • various menu items may be displayed in an icon interface format on the initial screen according to an exemplary embodiment of the present disclosure.
  • a first menu item 10 may be displayed on a central bottom area of the screen, and a second menu item 20 may be displayed on an upper right area of the screen.
  • the first menu item 10 plays a function of displaying an application which is capable of supporting a multi window mode on a particular area of a screen.
  • the second menu item 20 plays a function of displaying all applications which may be provided in the user terminal apparatus 100 on the entire area of the screen.
  • icons of the applications which may support the multi window mode are displayed in a row on the bottom of the screen.
  • the corresponding area is referred to as a “mini tray” for convenience of explanation.
  • a third menu item 30 and a fourth menu item 40 may be displayed on the right and left sides of the first menu item 10.
  • the third menu item 30 plays a function of providing templates of various formats which predefine a layout for arranging a plurality of application windows in the multi window mode.
  • the fourth menu item 40 plays a function of providing a list of the applications currently being executed in the multi window mode.
  • in FIG. 4(c), the applications move in the corresponding drag direction.
  • when the user’s touch manipulation is released, the movement of the applications stops and the display is maintained as it is.
  • the application 425 moves by the dragged distance, and the other icons on the mini tray 420 also move in the same direction.
  • when the user manipulation is released while the third to ninth icons 423 to 429 are displayed on the mini tray 420, the movement of the applications on the mini tray 420 stops and they remain displayed as they are.
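The mini-tray behaviour above (every icon shifts together by the dragged distance, and releasing the touch freezes the tray where it is) can be sketched as a simple offset update. A hypothetical Python sketch, not from the patent; the class and its fields are invented for illustration.

```python
# Hypothetical sketch: dragging the mini tray moves all icons by the dragged
# distance in the drag direction; on release, the positions simply stay put.
class MiniTray:
    def __init__(self, icon_xs):
        self.icon_xs = list(icon_xs)  # x-coordinate of each icon on the tray

    def drag(self, dx):
        # all icons move together by the drag distance (negative dx = drag left)
        self.icon_xs = [x + dx for x in self.icon_xs]

tray = MiniTray([0, 80, 160, 240])
tray.drag(-80)          # drag left by 80 px
print(tray.icon_xs)     # [-80, 0, 80, 160]; on release, positions remain as-is
```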
  • FIG. 5 is a view for explaining an application window display format when an application is executed in a multi window mode or general mode according to other exemplary embodiments of the present disclosure.
  • an application which may support a multi window mode may be displayed on the bottom of the screen.
  • the corresponding application window 510 may be displayed in a mini mode format.
  • the title area 511 may be displayed on the application window 510.
  • various menu items for supporting the multi window mode may be displayed on the title area 511. For example, as illustrated, a maximization button 511-1, an end button 511-2, and a pinup button 511-3 may be displayed. Their functions have been described above and thus are omitted here.
  • the application window may be displayed in different formats according to the execution intention of the user. That is, when the user executes an application in the multi window mode, the window may be displayed in a mini mode in which moving the window, adjusting its size, and the pinup function are possible; when the user executes an application in a general mode, the window may be displayed in a maximization mode. This enables a display in which the user’s intention is reflected.
  • FIGs. 6A and 6B are views for explaining a method of moving the application window in a multi window mode according to other exemplary embodiments of the present disclosure.
  • a guide GUI 612 for guiding the window movement and size adjustment may be displayed in the circumference portion of the application window 610.
  • the guide GUI 612 may be highlighted to differentiate it from the circumference of the application window 610, or be displayed in an identifiable color.
  • the guide GUI 612 may move in the corresponding direction and be displayed.
  • the application window 610 may be displayed without being moved in the originally displayed area.
  • the application window 610 may be moved to the area where the guide GUI 612 is displayed at the point where the touch manipulation is released, and be displayed.
  • when a user manipulation of touching the title area 611 of the application window 610 displayed in the mini mode in the multi window mode and dragging in a particular direction is input, the application window 610 itself may be moved in the drag direction and be displayed.
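The move interaction above (while dragging, only the guide GUI follows the finger; the window itself stays put and then jumps to the guide’s location when the touch is released) can be sketched as two positions updated at different times. A hypothetical Python sketch, not from the patent; names are invented.

```python
# Hypothetical sketch: the window position stays fixed during the drag while
# the guide GUI position follows the finger; on release, the window moves to
# the area where the guide GUI was left.
class MovableWindow:
    def __init__(self, x, y):
        self.x, self.y = x, y              # window position (fixed while dragging)
        self.guide_x, self.guide_y = x, y  # guide GUI position (tracks the drag)

    def drag(self, dx, dy):
        self.guide_x += dx
        self.guide_y += dy

    def release(self):
        # the window is moved to where the guide GUI is displayed
        self.x, self.y = self.guide_x, self.guide_y

w = MovableWindow(100, 100)
w.drag(50, 20)
assert (w.x, w.y) == (100, 100)   # window has not yet moved during the drag
w.release()
print((w.x, w.y))                 # (150, 120)
```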
  • FIG. 7 is a view for explaining a method of adjusting the size of the application window in the multi window mode according to other exemplary embodiments of the present disclosure.
  • the guide GUI 712 for guiding movement and size adjustment may be displayed in the circumference portion of the application window 710.
  • the guide GUI 712 may be expanded in the drag direction and be displayed.
  • the guide GUI 712 of the portion under user manipulation may be expanded and moved, while the guide GUI 712 of the portion without user manipulation remains fixed at the circumference area of the application window.
  • the size of the application window may be expanded by a size corresponding to the guide GUI 712 and be displayed.
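The resize interaction above can be sketched the same way: only the manipulated edge of the guide moves in the drag direction while the untouched edges stay fixed, and on release the window grows to the guide’s size. A hypothetical Python sketch, not from the patent; names are invented.

```python
# Hypothetical sketch: dragging one edge expands the guide in the drag
# direction; the edges without user manipulation remain fixed; on release
# the window takes on the guide's expanded size.
class ResizableWindow:
    def __init__(self, left, top, right, bottom):
        self.rect = [left, top, right, bottom]
        self.guide = list(self.rect)

    def drag_edge(self, edge, delta):
        # only the manipulated edge of the guide moves; the rest stay fixed
        i = {"left": 0, "top": 1, "right": 2, "bottom": 3}[edge]
        self.guide[i] += delta

    def release(self):
        self.rect = list(self.guide)

w = ResizableWindow(100, 100, 300, 250)
w.drag_edge("right", 60)   # drag the right edge outward by 60 px
w.release()
print(w.rect)              # [100, 100, 360, 250] -- width grew, other edges fixed
```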
  • FIGs. 8 to 10 are views for explaining a method of displaying the title area on the application window according to various exemplary embodiments of the present disclosure.
  • the title area 811 may be displayed on an upper area adjacent to the information display area 813.
  • the first user manipulation may be a touch manipulation regarding the information display area 813, for example, a tap manipulation, but it is not limited thereto, and thus the first user manipulation may be a flick down, flick up, drag-down and drag-up manipulation.
  • a return button 811-3 and end button 811-5 may be included in the title area.
  • the displayed title area 811 may disappear from the application window 810.
  • the predetermined event may be an event where a predetermined time passes.
  • when no external event regarding the title area 811 occurs after the title area 811 is displayed, the title area 811 may be made to disappear, so as to provide the user with an application window that is as large as possible.
  • the title area 911 may be displayed as the application window 910 is changed into mini mode.
  • the application window 910 is displayed in a maximization mode.
  • the title area 911 may be displayed as the application window 910 is restored in the mini mode.
  • the flicking down or dragging down manipulation may be a manipulation of touching the information display area 913 and then flicking or dragging down the information display area 913.
  • the user manipulation may be a flick down manipulation regarding the title area 1011, but is not limited thereto, and thus it may be a tap, flick up, drag down, or drag up manipulation.
  • the application window may be restored to the mini mode and be displayed.
  • the user manipulation may be a tap manipulation regarding the return button 1011-1.
  • the title area 1011 may be displayed constantly rather than disappearing automatically. This enables easy movement and size adjustment of the application window in the multi window mode, which displays a plurality of application windows at the same time.
  • the aforementioned exemplary embodiment may also be applied to the multi window mode, that is, to a case where the application window 1010 is in the mini mode. That is, the present disclosure may be embodied in such a manner that the title area is not displayed in the mini mode, but is displayed for a predetermined time only when there is a user manipulation, and then disappears automatically.
  • FIG. 11 is a view for explaining a method of controlling a user terminal apparatus according to an exemplary embodiment of the present disclosure.
  • an application window is displayed on a screen (S1110)
  • the predetermined event may be an event where a predetermined time passes from the display point of the title area, but is not limited thereto.
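The title-area logic above (a first user command shows the title area; it disappears automatically once a predetermined time passes) can be sketched as a simple timeout. A hypothetical Python sketch, not from the patent; the timeout value and all names are assumptions made for illustration.

```python
# Hypothetical sketch: the first user command displays the title area and
# starts a countdown; the predetermined event is the passing of a
# predetermined time, after which the title area disappears automatically.
HIDE_AFTER = 3.0  # seconds; an assumed value, not specified by the patent

class TitleArea:
    def __init__(self):
        self.visible = False
        self.shown_at = None

    def on_first_user_command(self, now):
        self.visible = True
        self.shown_at = now   # (re)start the auto-hide countdown

    def tick(self, now):
        # predetermined event: HIDE_AFTER seconds elapsed since display
        if self.visible and now - self.shown_at >= HIDE_AFTER:
            self.visible = False

t = TitleArea()
t.on_first_user_command(now=0.0)
t.tick(now=2.0); print(t.visible)   # True  -- still within the timeout
t.tick(now=3.5); print(t.visible)   # False -- disappeared automatically
```

In a real implementation the `tick` would be driven by a UI timer rather than explicit timestamps; the point is only that the hide is triggered by elapsed time, not by a second user command.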
  • the displaying (S1110) may display the title area in a state where the application window is displayed on the entire area of the screen, that is, when the first user command is input in the maximization mode. When the application window is in the mini mode, the title area may be displayed constantly on the application window.
  • the first user command may be a user manipulation of touching the predetermined area of the application window.
  • the touch manipulation may be a tap manipulation, but is not limited thereto.
  • the one area of the application window where the title area is displayed may be an area adjacent to the predetermined area which receives the first user command.
  • when a second user command is input while the title area is displayed on the application window, the apparatus may be controlled to enter a multi window mode for displaying a plurality of application windows on the screen at the same time.
  • the second user command may be a touch manipulation regarding the title area, but is not limited thereto.
  • when a double tap manipulation regarding the predetermined area of the application window, or a touch-and-flick manipulation regarding the predetermined area, is input, the apparatus may be controlled to enter a multi window mode for displaying a plurality of application windows on the screen at the same time. That is, depending on the exemplary embodiment, even when the title area is not displayed, the application window may be changed from the maximization mode to the mini mode with one predetermined user manipulation.
  • a controlling method may be embodied as a program and be provided in the user terminal apparatus.
  • there may be provided a non-transitory computer readable medium which stores a program performing: displaying an application window on a screen; displaying a title area on one area of the application window when a first user command for displaying the title area on the application window is input; and controlling the title area to automatically disappear when a predetermined event occurs.
  • a non-transitory computer readable medium refers to a medium which stores data semi-permanently, rather than for a short period of time as in a register, cache, memory, etc. More specifically, the aforementioned various applications or programs may be stored in and provided on a non-transitory computer readable medium such as a CD, a DVD, a hard disk, a Blu-ray disc, a USB memory, a memory card, a ROM, etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A user terminal device is disclosed. The user terminal device comprises: a display which displays an application window on a screen; a user interface unit which receives input of a user command; and a controller which controls such that a title area is displayed on one area of the application window when a first user command for displaying the title area on the application window is input, and such that the title area automatically disappears when a predetermined event occurs.
PCT/KR2013/007695 2012-08-28 2013-08-28 User terminal device and control method therefor WO2014035123A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2012-0094502 2012-08-28
KR1020120094502A KR20140028383A (ko) 2012-08-28 2012-08-28 User terminal apparatus and control method thereof

Publications (1)

Publication Number Publication Date
WO2014035123A1 true WO2014035123A1 (fr) 2014-03-06

Family

ID=50183869

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2013/007695 WO2014035123A1 (fr) 2013-08-28 User terminal device and control method therefor

Country Status (2)

Country Link
KR (1) KR20140028383A (fr)
WO (1) WO2014035123A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106411651A (zh) * 2016-10-31 2017-02-15 安徽汇顿电子科技有限公司 Smart home debugging system based on network communication
CN107179840A (zh) * 2017-05-25 2017-09-19 上海传英信息技术有限公司 Controller and control method thereof
CN108259803A (zh) * 2017-07-20 2018-07-06 青岛海信电器股份有限公司 Electronic terminal device, television terminal, signal input circuit and method
US11232057B2 (en) 2017-07-20 2022-01-25 Hisense Visual Technology Co., Ltd. Terminal device and control method thereof

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101525882B1 (ko) * 2014-05-02 2015-06-04 에스코어 주식회사 Computer-executable method for providing a multi-display, multi-display providing apparatus performing the same, and recording medium storing the same

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5473745A (en) * 1994-12-14 1995-12-05 International Business Machines Corporation Exposing and hiding a title bar behind its window using a visual cue
US6304261B1 (en) * 1997-06-11 2001-10-16 Microsoft Corporation Operating system for handheld computing device having program icon auto hide
US20030237043A1 (en) * 2002-06-21 2003-12-25 Microsoft Corporation User interface for media player program
US20050183017A1 (en) * 2001-01-31 2005-08-18 Microsoft Corporation Seekbar in taskbar player visualization mode
EP2346220A1 (fr) * 2010-01-15 2011-07-20 Research In Motion Limited Method and portable electronic device for processing images


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106411651A (zh) * 2016-10-31 2017-02-15 安徽汇顿电子科技有限公司 Smart home debugging system based on network communication
CN107179840A (zh) * 2017-05-25 2017-09-19 上海传英信息技术有限公司 Controller and control method thereof
CN108259803A (zh) * 2017-07-20 2018-07-06 青岛海信电器股份有限公司 Electronic terminal device, television terminal, signal input circuit and method
CN108259803B (zh) * 2017-07-20 2021-02-02 海信视像科技股份有限公司 Electronic terminal device, television terminal, signal input circuit and method
US11232057B2 (en) 2017-07-20 2022-01-25 Hisense Visual Technology Co., Ltd. Terminal device and control method thereof

Also Published As

Publication number Publication date
KR20140028383A (ko) 2014-03-10

Similar Documents

Publication Publication Date Title
WO2014035147A1 (fr) Appareil terminal d'utilisateur et son procédé de commande
WO2015119485A1 (fr) Dispositif de terminal utilisateur et son procédé d'affichage
WO2014069750A1 (fr) Appareil de terminal utilisateur et son procédé de commande
WO2014088355A1 (fr) Appareil de terminal utilisateur et son procédé de commande
EP3105649A1 (fr) Dispositif de terminal utilisateur et son procédé d'affichage
WO2015119463A1 (fr) Dispositif de terminal utilisateur et son procédé d'affichage
WO2015119480A1 (fr) Dispositif terminal utilisateur et son procédé d'affichage
WO2017095040A1 (fr) Dispositif terminal d'utilisateur et son procédé d'affichage
WO2014017722A1 (fr) Dispositif d'affichage permettant une exécution de multiples applications et son procédé de commande
WO2016167503A1 (fr) Appareil d'affichage et procédé pour l'affichage
WO2016052940A1 (fr) Dispositif terminal utilisateur et procédé associé de commande du dispositif terminal utilisateur
WO2014175692A1 (fr) Dispositif terminal utilisateur pourvu d'un stylet et procédé de commande associé
WO2013180454A1 (fr) Procédé d'affichage d'un élément dans un terminal et terminal utilisant celui-ci
WO2014182086A1 (fr) Appareil d'affichage et méthode fournissant un écran d'interface utilisateur pour celui-ci
WO2014017790A1 (fr) Dispositif d'affichage et son procédé de commande
WO2014107011A1 (fr) Procédé et dispositif mobile d'affichage d'image
EP3207460A1 (fr) Procédé pour partager un écran entre des dispositifs et dispositif l'utilisant
WO2015178677A1 (fr) Dispositif formant terminal utilisateur, procédé de commande d'un dispositif formant terminal utilisateur et système multimédia associé
WO2014098539A1 (fr) Appareil de terminal utilisateur et son procédé de commande
WO2014035123A1 (fr) Dispositif terminal utilisateur et procédé de commande associé
WO2016072678A1 (fr) Dispositif de terminal utilisateur et son procédé de commande
WO2015020288A1 (fr) Appareil d'affichage et méthode associée
WO2015099300A1 (fr) Procédé et appareil de traitement d'objet fourni par le biais d'une unité d'affichage
EP2995075A1 (fr) Appareil d'affichage a pluralite d'ecrans et son procede de commande
WO2016167610A1 (fr) Terminal portatif pouvant commander la luminosité de ce dernier, et son procédé de commande de luminosité

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13833011

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13833011

Country of ref document: EP

Kind code of ref document: A1