WO2014109445A1 - Contents display method and mobile terminal implementing the same - Google Patents

Contents display method and mobile terminal implementing the same

Info

Publication number
WO2014109445A1
Authority
WO
WIPO (PCT)
Prior art keywords
page
input device
touch input
touch
pages
Application number
PCT/KR2013/006221
Other languages
French (fr)
Inventor
Sang Hyup Lee
Shin Jun Lee
Original Assignee
Samsung Electronics Co., Ltd.
Priority claimed from US13/739,777 external-priority patent/US20130198678A1/en
Application filed by Samsung Electronics Co., Ltd. filed Critical Samsung Electronics Co., Ltd.
Priority to EP13870559.5A priority Critical patent/EP2943867A4/en
Publication of WO2014109445A1 publication Critical patent/WO2014109445A1/en


Classifications

    • G06F3/0416: Control or interface arrangements specially adapted for digitisers
    • G06F3/0412: Digitisers structurally integrated in a display
    • G06F3/04817: Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F3/0483: Interaction with page-structured environments, e.g. book metaphor
    • G06F3/04842: Selection of displayed objects or displayed text elements
    • G06F3/04883: Interaction using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G09G5/34: Control arrangements or circuits for visual indicators, for rolling or scrolling
    • G09G2354/00: Aspects of interface with display user
    • G09G2380/14: Electronic books and readers

Definitions

  • Methods and apparatuses consistent with exemplary embodiments of the present disclosure relate to a contents display method and a mobile terminal implementing the same.
  • a mobile terminal provides various contents.
  • the contents may be displayed for each of a plurality of pages.
  • a contents display method and an apparatus thereof in the related art do not provide the feeling of operating pages on the mobile terminal which is similar to the feeling of operating an actual paper book for a user.
  • in a contents display method and an apparatus thereof of the related art, if a user provides input information associated with a page skip (e.g., a push of a next page button), the input is detected and a currently displayed page is replaced with a next page. Such a replacement scheme does not actually skip the currently displayed page but simply browses to a next page.
  • a recently developed mobile terminal may include a touch screen. The mobile terminal detects a user gesture on the touch screen and skips pages in response to the detected gesture.
  • the mobile terminal When the user skips the pages, the mobile terminal according to the related art provides an animation which gradually folds a current page (that is, a front surface of the page) and shows a next page (that is, a back surface of the page) regardless of a touched point or a direction of drag.
  • One or more exemplary embodiments provide a contents display method capable of achieving a realistic feeling for a user when the user operates a screen on which a page is displayed by a touch input device (e.g., finger or pen), and an apparatus thereof.
  • One or more exemplary embodiments also provide a contents display method in which an animation of pages being skipped provides a realistic feeling, and an apparatus thereof.
  • According to an aspect of an exemplary embodiment, there is provided a method of displaying contents of pages displayed by a mobile terminal including a display unit in which a touch panel is installed, the method including: displaying a page; detecting movement of a touch input device with respect to the displayed page; and displaying the page so that the page is convexly deformed and skipped in response to the movement of the touch input device.
  • According to an aspect of another exemplary embodiment, there is provided a mobile terminal including: a display unit in which a touch panel is installed and which is configured to display contents for each of a plurality of pages; a memory configured to store the pages; and a controller configured to control the display unit such that one of the pages is displayed, detect movement of a touch input device with respect to the displayed page, and control the display unit such that the page is displayed as convexly deformed and skipped in response to the movement of the touch input device.
  • According to an aspect of still another exemplary embodiment, there is provided a method to display pages, the method including: displaying a page on a device including a touch input unit; generating a page mesh corresponding to the displayed page, the page mesh including a plurality of nodes having respective weights; detecting, using the touch input unit, movement of a touch input device with respect to the displayed page; and changing an appearance of the page according to the detected movement and the page mesh.
  • FIG. 1 is a block diagram illustrating a configuration of a mobile terminal according to an exemplary embodiment
  • FIGS. 2A and 2B are diagrams illustrating a page mesh according to an exemplary embodiment
  • FIG. 3 is a flowchart illustrating a page display method according to an exemplary embodiment
  • FIG. 4 is a flowchart illustrating a page deforming method according to an exemplary embodiment
  • FIG. 5 is a flowchart illustrating a page skipping method according to an exemplary embodiment
  • FIG. 6 is a flowchart illustrating page setting according to an exemplary embodiment
  • FIG. 7A is an exemplary diagram of a screen for setting an environment of a mobile terminal according to an exemplary embodiment
  • FIG. 7B is an exemplary diagram of a screen for setting an environment of a page according to an exemplary embodiment
  • FIGS. 8A to 23 are exemplary diagrams illustrating screens for describing a page display method according to an exemplary embodiment
  • FIG. 24 is a flowchart illustrating a page editing method according to an exemplary embodiment.
  • FIGS. 25A to 26 are diagrams of screens illustrating a page editing method according to an exemplary embodiment.
  • the contents display method according to exemplary embodiments may be implemented by a mobile terminal, for example, a smart phone, a tablet PC, an e-book reader, a navigation device, or a moving image player.
  • Hereinafter, the contents display method and the mobile terminal implementing the same will be described in detail.
  • the term 'contents' may refer to photographs, videos, audio, images, calendars, contact points, memos, documents, e-books, web pages, thumbnails, and icons, in addition to many other types of contents.
  • the contents are displayed for each page.
  • the pages according to exemplary embodiments may be convexly and stereoscopically deformed in response to a user gesture. Accordingly, when the user operates a screen on which pages are displayed by a touch input device (e.g., finger or pen), the user may feel as if the user is controlling real paper pages.
  • the term 'page mesh' refers to geometrical information of the pages.
  • the page mesh includes a plurality of nodes and links connecting the nodes with each other.
  • a weight is allocated to each node, and an elastic value is allocated to each link.
  • the elastic value may be allocated differently according to properties of the pages in order to achieve a realistic feeling for the user.
  • when a page is set to be thick (that is, when a large weight is set), a large elastic value may be allocated. When a page is set to be relatively thin, a relatively small elastic value may be allocated.
  • a large weight may be allocated to nodes located at an inner side (e.g., the spine) of the page, and a small weight may be allocated to relatively outer nodes (e.g., at the page edge). Alternatively, the same weight may be allocated to all nodes. One possible representation is sketched below.
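  • By way of illustration only, the following minimal sketch shows one way such a mesh of weighted nodes and elastic links could be represented (Python; the class name PageMesh, the grid size, the numeric weights, and the linear weight falloff are assumptions of this sketch, not values from the disclosure):

        import numpy as np

        class PageMesh:
            """Grid of nodes with per-node weights and a shared elastic value.

            Column 0 corresponds to the spine (central axis); weights fall off
            linearly toward the free page edge, as described above.
            """
            def __init__(self, rows=16, cols=12, spine_weight=5.0,
                         edge_weight=1.0, elastic=0.8):
                # Rest positions (x, y, z) of the nodes, arranged in a matrix.
                xs, ys = np.meshgrid(np.arange(cols), np.arange(rows))
                self.rest = np.stack(
                    [xs, ys, np.zeros_like(xs)], axis=-1).astype(float)
                self.pos = self.rest.copy()
                self.vel = np.zeros_like(self.pos)
                # Heavier nodes near the spine, lighter nodes near the page edge.
                self.weight = np.tile(
                    np.linspace(spine_weight, edge_weight, cols), (rows, 1))
                self.elastic = elastic  # one spring constant for all links here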
  • Virtual force applied to each node may be classified into two types.
  • the first virtual force is an internal virtual force (hereinafter also referred to as an 'internal force'), such as the elastic force of the links.
  • the second virtual force is an external virtual force (hereinafter also referred to as an 'external force'), such as gravity or human power.
  • the virtual gravity of the external force is defined as a force pulling a node down. If a screen on which the pages are displayed is arranged in an XY plane and a user's viewpoint is along a positive direction of the Z axis above the XY plane, the virtual gravity pulls the node down toward the XY plane.
  • the Z axis is vertical (orthogonal) to the XY plane.
  • the Z axis is not an actual axis, but is a virtual axis for stereoscopically controlling the virtual page.
  • the virtual gravity may act equally on all nodes.
  • the gravity may have a different effect according to a property of a page to achieve a realistic feeling for the user. For example, in a case where the user lifts and releases a page of a real paper book, the page falls slowly when it is thin and falls rapidly when it is relatively thick.
  • Table 1 below illustrates thicknesses by types of pages. For example, referring to Table 1, a pamphlet falls faster than a leaflet. That is, a deformation degree of the page may be changed according to the thickness and material set for the displayed page.
  • An artificial force is a force which the user applies to the page.
  • a user gesture with respect to the screen may be the artificial force.
  • a target node touched by the touch input device is moved in a direction in which the touch input device is moved.
  • the artificial force is transferred to other nodes through links.
  • a sum of the internal force and the external force is applied to each node.
  • a controller of the mobile terminal calculates forces applied to each node based on the artificial force applied to the displayed page.
  • the force may be obtained in various ways, for example, by multiplying a moving distance of a target node by its speed to obtain an acceleration, and by multiplying the acceleration by a weight of the corresponding target node, as sketched below.
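  • A minimal sketch of this force heuristic follows (Python; the function name and the normalized direction argument are illustrative, and the distance-times-speed rule is taken from the description above rather than from classical mechanics):

        import numpy as np

        def artificial_force(moved_distance, speed, weight, direction):
            """Force on the target node per the heuristic above:
            acceleration ~ moved_distance * speed, force = weight * acceleration,
            applied along the (nonzero) drag direction vector."""
            d = np.asarray(direction, dtype=float)
            d = d / np.linalg.norm(d)          # unit vector along the drag
            return weight * (moved_distance * speed) * d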
  • the mobile terminal reflects a deformed page mesh on a page to generate an animation.
  • a procedure of generating the animation based on the artificial force is defined by a physically-based simulation.
  • the physically-based simulation may be executed by various components, such as, for example, an Application Processor (AP), a Central Processing Unit (CPU), or a Graphics Processing Unit (GPU).
  • FIG. 1 is a block diagram illustrating a configuration of a mobile terminal according to an exemplary embodiment.
  • the mobile terminal 100 includes a display unit 110, a key input unit 120, a memory 130, a radio frequency (RF) communication unit 140, an audio processor 150, a speaker SPK, a microphone MIC, a controller 160, and a pen 170.
  • the display unit 110 displays contents on a screen under control of the controller 160. That is, when the controller 160 processes (e.g., decodes or resizes) the contents and stores them in a buffer, the display unit 110 converts the contents stored in the buffer into an analog signal and displays it on the screen. When power is supplied to the display unit 110, the display unit 110 displays a lock image (e.g., a login image) on the screen. If lock release information (e.g., a password) is detected in a state in which the lock image is displayed, the controller 160 releases the lock. That is, the display unit 110 terminates the displaying of the lock image and displays another image, for example, a home image, under control of the controller 160.
  • the home image includes a background image and a plurality of icons displayed thereon. Icons indicate applications or contents, respectively. If the user selects an icon, for example, an application icon (e.g., taps the icon), the controller 160 executes a corresponding application (e.g., gallery), and controls the display unit 110 to display an execution image of the corresponding application (e.g., page including a plurality of thumbnails).
  • the display unit 110 may be implemented as various types, for example, a Liquid Crystal Display (LCD), an Organic Light Emitting Diode (OLED), an Active Matrix Organic Light Emitting Diode (AMOLED), or a flexible display.
  • the touch panel 111 is installed on a screen of the display unit 110.
  • the touch panel 111 may be implemented as an add-on type located on a screen of the display unit 110 or an on-cell type or an in-cell type inserted into an inside of the display unit 110.
  • the touch panel 111 generates an analog signal (e.g., touch event) in response to touch of a touch input device (e.g., finger or pen) with respect to a screen, and a touch IC 112 converts the analog signal into a digital signal, and transfers the digital signal to the controller 160.
  • the touch event includes a touch coordinate (x, y).
  • the touch IC 112 determines a representative coordinate of a plurality of touch coordinates, stores a determined touch coordinate in an internal memory of the touch IC 112, and transfers the touch coordinate stored in the internal memory to the controller 160 in response to a request of the controller 160.
  • the touch coordinate may be expressed in pixel units. For example, an X axis coordinate may range from 0 to 640, and a Y axis coordinate may range from 0 to 480.
  • when a touch coordinate is received from the touch IC 112, the controller 160 determines that the touch input device (e.g., finger or pen) touches the touch panel 111.
  • the controller 160 computes a location variation amount (dx, dy) of the touch and moving speed of the touch input device in response to movement of the touch input device.
  • the controller 160 determines a user gesture as one of various different types of gestures, for example, touch, multi-touch, tap, double tap, long tap, tap & touch, drag, flick, press, pinch in, and pinch out based on presence of touch release of the touch input device, presence of movement of the touch input device, a location variation amount of the touch input device, and moving speed of the touch input device.
  • the touch is a gesture where a user makes the touch input device contact with one point of a touch panel 111 on a screen.
  • the multi-touch is a gesture where the user makes a plurality of touch input devices (e.g., thumb and index finger) contact the touch panel 111.
  • the tap is a gesture where the user touches one point with the touch input device and then releases the touch from the corresponding point without movement.
  • the double tap is a gesture where a user continuously taps one point twice.
  • the long tap is a gesture where the user touches one point longer than a tap and then releases the touch from the corresponding point without moving the touch input device.
  • the tap & touch is a gesture where the user taps one point of a screen and then touches the corresponding point again within a predetermined time (e.g., 0.5 seconds).
  • the drag is a gesture that moves the touch input device in a predetermined direction in a state in which one point is touched.
  • the flick is a gesture where the user moves the touch input device at a higher speed than a drag and then releases the touch.
  • the press is a gesture to maintain the touch without movement for a predetermined time (e.g., 2 seconds) after touching one point.
  • the pinch in is a gesture where the user reduces an interval between touch input devices after simultaneously multi-touching two points by the two touch input devices.
  • the pinch out is a gesture for increasing the interval between the touch input devices. That is, the touch is a gesture in which the user contacts the touch screen, and the other gestures are defined by variations in that touch; a rough classification is sketched below.
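  • As a rough illustration, the classification just described could be sketched as follows (Python; apart from the 0.5-second and 2-second examples given above, every threshold here is an assumption):

        def classify_gesture(duration_s, moved, released, speed_cm_s,
                             tap_time=0.5, press_time=2.0, flick_speed=30.0):
            """Classify a single-touch gesture from its duration, whether the
            touch point moved, whether it was released, and its speed."""
            if not moved:
                if not released:
                    return "press" if duration_s >= press_time else "touch"
                return "long tap" if duration_s >= tap_time else "tap"
            if released and speed_cm_s >= flick_speed:
                return "flick"
            return "drag"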
  • the touch panel 111 may be a converged touch panel including a hand touch panel detecting a hand gesture and a pen touch panel detecting a pen gesture.
  • the hand touch panel may include a capacitive type touch panel.
  • the hand touch panel may also include a resistive type touch panel, an infrared type touch panel, or an ultrasonic type touch panel. Further, the hand touch panel does not generate a touch event only based on a hand gesture, but may also generate the touch event based on other objects touching the touch panel 111 (e.g., conductive material capable of providing variation in capacitance).
  • the pen touch panel may include an electromagnetic induction type touch panel. Accordingly, the pen touch panel generates a touch event in response to a specially manufactured touch pen 170 which is capable of forming a magnetic field.
  • the touch event generated by the pen touch panel includes a value indicating a type of the touch together with a touch coordinate. For example, when a first voltage level is received from the pen touch panel, the controller 160 determines a touch of the touch input device as an indirect touch (that is, hovering). When a second voltage level greater than the first voltage level is received from the touch panel 111, the controller 160 determines the touch of the touch input device as a direct touch.
  • the touch event generated by the pen touch panel may further include a value indicating the presence of pushing of a button installed at the pen 170. For example, when a button installed at the pen 170 is pushed, a magnetic field generated from a coil of the pen 170 varies. In response to the variation in the magnetic field, the pen touch panel generates a third voltage level, and transfers the third voltage level to the controller 160.
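  • A compact sketch of this mapping follows (Python; the concrete voltage values, and the assumption that the third level exceeds the second, are illustrative, since the disclosure only states that the second level is greater than the first):

        def pen_state(level, v1=1.0, v2=2.0, v3=3.0):
            """Map a voltage level reported by the pen touch panel to a pen state."""
            if level >= v3:
                return "button pressed"   # third voltage level: pen button pushed
            if level >= v2:
                return "direct touch"     # second voltage level: pen on the screen
            if level >= v1:
                return "hovering"         # first voltage level: indirect touch
            return "no pen"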
  • the key input unit 120 may include a plurality of input keys and function keys for receiving numeric or character information and setting various functions.
  • the keys may include a menu loading key, a screen on/off key, a power on/off key, and a volume control key.
  • the key input unit 120 generates a key event associated with user settings and function control of the mobile terminal 100 and transfers the key event to the controller 160.
  • the key event may include a power on/off event, a volume control event, a screen on/off event, and a shutter event.
  • the controller 160 controls the foregoing constituent elements in response to the key event.
  • a key of the key input unit 120 may be referred to as a hard key, and a virtual key displayed on the display unit 110 may be referred to as a soft key.
  • the secondary memory 130 may include various components, such as a disk, a RAM, a ROM, and a flash memory.
  • the secondary memory 130 stores contents generated by the mobile terminal 100 or contents received from an external device (e.g., server, desktop PC, tablet PC) through the RF communication unit 140.
  • the secondary memory 130 may temporarily store data copied from a message, a photograph, a web page, and a document by the user for performing a copy and paste operation.
  • the secondary memory 130 stores various preset values (e.g., screen brightness, presence of vibration upon generation of a touch, presence of automatic rotation of screen). Further, the secondary memory 130 stores history information, for example, information associated with a most recently displayed page before the application is terminated.
  • the secondary memory 130 stores a booting program, at least one operating system, and applications (e.g., a gallery, an address book, a video player, a calendar, a note pad, an electronic book viewer, a music player, and a web browser).
  • the operating system serves as an interface between hardware and applications and further serves as an interface between applications, and manages various computer resources, such as a CPU, a graphic processing unit (GPU), a main memory, and the secondary memory 130.
  • the applications may be classified into an embedded application and a third party application.
  • the embedded application includes a web browser, an e-mail program, and an instant messenger. If power of a battery is supplied to the controller 160 of the mobile terminal 100, the booting program is loaded into a main memory of the controller 160.
  • the booting program loads host and guest operating systems into the main memory 161.
  • the operating systems load the application into the main memory 161.
  • the RF communication unit 140 performs voice calls, image calls, and data communications with an external device through a network under the control of the controller 160.
  • the RF communication unit 140 may include an RF transmitter for up-converting a frequency of a transmitted signal and amplifying the converted signal, and an RF receiver for low-noise-amplifying a frequency of a received signal and down-converting the amplified signal.
  • the RF communication unit 140 may include a mobile communication module (e.g., a 3rd-generation, 3.5th-generation, or 4th-generation mobile communication module), a digital broadcasting module (e.g., a DMB module), and a near field communication module.
  • the audio processor 150 inputs and outputs an audio signal (e.g., voice data) for voice recognition, voice recording, digital recording, and call operations.
  • the audio processor 150 receives an audio signal from the controller 160, converts the received audio signal into an analog signal, amplifies the analog signal, and outputs the amplified analog signal through the speaker SPK.
  • the audio processor 150 converts an audio signal received from the microphone MIC into digital data, and provides the converted digital signal to the controller 160.
  • the speaker SPK converts an audio signal received from the audio processor 150 into a sound wave and outputs the sound wave.
  • the MIC converts the sound wave received from a person or other sound source into the audio signal.
  • the controller 160 controls overall operations and signal flows between internal constituent elements of the mobile terminal 100, processes data, and controls supply of power from a battery to the constituent elements.
  • the controller 160 includes at least one CPU.
  • the CPU is a core control unit of a computer system and performs computation and comparison of data, and interpretation and execution of commands.
  • the CPU includes various registers which temporarily store data and commands.
  • the controller 160 may include at least one Graphic Processing Unit (GPU).
  • the GPU is a graphic control unit which performs computation and comparison of data associated with graphics, and interpretation and execution of commands, in place of the CPU.
  • each of the CPU and the GPU may be configured by integrating at least two independent cores (e.g., quad-core) into one package formed as a single integrated circuit (IC).
  • the CPU and the GPU may be implemented as a System on Chip (SoC).
  • the CPU and the GPU may be a package of a multi-layer structure.
  • a configuration including the CPU and the GPU may be referred to as an Application Processor (AP).
  • the GPU of the controller 160 deforms a page mesh in response to a gesture (e.g., drag) of a touch input device, and generates an animation by reflecting a page on the deformed page mesh.
  • the GPU receives information associated with a touch gesture from the touch IC 112.
  • the GPU deforms the page mesh using the received information. If a touch of the touch input device is released from a screen, the GPU restores the page mesh to an original state. That is, the deformed page mesh is restored to an original state by the elastic force of links and gravity applied to each node.
  • the GPU accesses the secondary memory 130 to read a page therefrom.
  • the GPU reflects deformation information of the page mesh on the read page to generate the animation.
  • the deformation information of the page mesh includes coordinates (x, y, z) of each node constituting the page mesh.
  • the GPU controls the display unit 110 to display the animation.
  • the animation may be generated by a CPU or an application processor (AP).
  • the controller 160 includes a main memory, for example, a RAM.
  • the main memory may store various programs, for example, a booting program, a host operation system, guest operating systems, and applications loaded from the secondary memory 130.
  • the CPUs and GPUs of the controller 160 access the foregoing programs to decode their commands and execute functions (e.g., generation of an animation) according to the interpretation result.
  • the controller 160 temporarily stores data to be written in the secondary memory 130 and temporarily stores data read out from the secondary memory 130.
  • a cache memory may be further provided as temporary data storage.
  • the pen 170 is an accessory of the mobile terminal 100 which can be separated from the mobile terminal 100, and may include, for example, a penholder, a rib disposed at an end of the penholder, a coil disposed inside the penholder adjacent to the rib to generate a magnetic field, and a button which varies the magnetic field.
  • the coil of the pen 170 forms the magnetic field around the rib.
  • a pen touch panel of the touch panel 111 detects the magnetic field, and generates a touch event corresponding to the magnetic field.
  • the mobile terminal 100 may further include constituent elements which are not described above, such as a Global Positioning System (GPS) module, a vibration motor, and an acceleration sensor.
  • FIGS. 2A and 2B are diagrams illustrating a page mesh according to exemplary embodiments.
  • the controller 160 and particularly, a CPU of the controller 160, configures a page mesh.
  • the page mesh includes a plurality of nodes and a plurality of links connecting the nodes with each other.
  • reference numeral 210 represents a plurality of nodes
  • reference numeral 220 represents a plurality of links.
  • a plurality of nodes may be arranged in a matrix pattern, and locations thereof may be represented by XY coordinates. Further, as described above, a suitable weight is allocated to each node, and a suitable elastic value is allocated to each link (spring).
  • a relatively heavy weight may be allocated to nodes which are located at a center 230 of the page mesh.
  • a relatively light weight may be allocated to outer nodes located relatively far from the center 230 as compared with nodes located near the center 230. Accordingly, the movement of the nodes becomes gradually lighter in the outward direction, and the nodes react gradually more sensitively to a touch gesture toward the outer edge. When the page is skipped (turned), nodes located at a central axis (Y axis) 230 are fixed, unlike the other nodes.
  • the same weight may be allocated to all nodes. As such, the entire movement of the page mesh may be heavier than a previous case in which different weights are allocated to the nodes. That is, a deformation degree of the page may be changed according to attribute information (e.g., thickness, weight, material) set to a corresponding page. Further, the deformation degree of the page may be changed according to a calculated gradient.
  • the user touches a right bottom peripheral point 240 of a page with a touch input device (e.g., finger or pen). Then, the controller 160 detects a target node touched by the touch input device. Next, the user moves the touch input device from the right bottom peripheral point 240 in a left direction. Accordingly, the controller 160 moves the target node in the left direction on the XY plane according to the movement of the touch input device. That is, the target node is moved in a direction perpendicular to gravity.
  • the controller 160 calculates displacement of the moved target node.
  • the displacement may be represented as a vector having a size and a direction.
  • the size of the displacement includes at least one of a current location of the target node, a moved distance of the target node, and speed of the target node.
  • the size of the displacement may include only a current location of the target node, only a moved distance of the target node, only speed of the target node, or a combination of the current location of the target node, the moved distance of the target node, and the speed of the target node.
  • the controller 160 deforms the page mesh according to the computed gradient, and reflects the deformed page mesh on the page to generate the animation.
  • the controller 160 calculates forces applied to each node using the calculated displacement.
  • the force is a vector having a size and a direction. As described above, the force is a sum of an elastic force, gravity, and an artificial force.
  • the controller 160 calculates location values of the respective nodes using the calculated forces, and generates the animation (the deformed page shown in FIG. 2B) using the calculated location values. Meanwhile, the controller 160 may move the target node in a direction perpendicular to gravity. That is, a value of the Z axis may vary or be '0' according to variations in the X axis value and the Y axis value of the target node.
  • the controller 160 fixes a node located at a central axis 230 so that the node located at the central axis 230 functions in a different fashion from other nodes.
  • exemplary embodiments achieve a realistic effect of the user pushing and moving a page of a paper book in a real-time manner.
  • the deformed page has a convex shape.
  • the page mesh may be variously deformed in a realistic fashion according to the touch coordinates, a moving direction, and moving speed of the touch input device. Accordingly, the user may experience a feeling of interacting with pages of a real paper book according to the exemplary embodiments.
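  • Reusing the PageMesh sketch above, one physically-based frame consistent with this description might look as follows (Python with NumPy; the explicit Euler integration, the spring-to-rest simplification of the per-link elastic forces, and the constants are assumptions of this sketch):

        def simulate_step(mesh, ext_force, dt=1.0 / 60.0, g=9.8):
            """Advance the page mesh one frame: internal (elastic) force plus
            virtual gravity plus the artificial force, then pin the spine.
            ext_force is an array of shape (rows, cols, 3)."""
            f = mesh.elastic * (mesh.rest - mesh.pos) + ext_force
            f[..., 2] -= g * mesh.weight                 # gravity pulls along -Z
            mesh.vel += f / mesh.weight[..., None] * dt
            mesh.pos += mesh.vel * dt
            mesh.pos[:, 0] = mesh.rest[:, 0]             # central-axis nodes fixed
            mesh.vel[:, 0] = 0.0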
  • FIG. 3 is a flowchart illustrating a page display method according to an exemplary embodiment.
  • a controller 160 may initially be in an idle state.
  • a display unit 110 displays a home image including a plurality of icons.
  • the controller 160 detects a tap of the touch input device with respect to the icon.
  • the controller 160 executes a corresponding application (e.g., loads pages of a corresponding application to a main memory) in response to the tap (operation 301).
  • the executed application may include a gallery, an address book, a video player, an electronic book viewer, a music player, or a web browser.
  • the controller 160 selects at least one page (e.g., most recently displayed page before a corresponding application is terminated) of pages of the executed application, and controls the display unit 110 to display the selected page (operation 302).
  • the controller 160 determines whether a touch is detected (operation 303). When no touch is detected, the controller 160 determines whether a threshold time has elapsed (operation 304). The threshold time is set to automatically turn off the screen. If no touch is detected by the time the threshold time elapses, the controller 160 turns off the screen (operation 305). The threshold time may be set to many different values, e.g., 1 minute, and may be changed according to a selection of the user. When a touch is detected, the controller 160 determines whether the touch input device is moved (e.g., a drag or flick) (operation 306). When the touch input device is moved, the controller 160 controls the display unit 110 to display a convexly deformed page in response to the movement of the touch input device (operation 307). That is, the controller 160 deforms the page mesh in response to the movement of the touch input device, and reflects the deformed page mesh on the page to generate the animation. A detailed process of operation 307 will be described with reference to FIG. 4.
  • the controller 160 determines whether the touch of the touch input device is released from the screen (operation 308). If the touch of the touch input device is maintained without releasing the touch, the process returns to operation 306. Conversely, if the touch input device is touch-released, the process proceeds to operation 309.
  • the controller 160 determines whether the touch release is an event corresponding to a page skip (operation 309). That is, the controller 160 determines whether a page skip is generated based on at least one of a moving direction of the touch input device, a touch coordinate, and a speed before generation of the touch release. When the page skip is generated, the controller 160 controls the display unit 110 to skip the currently displayed page and to display another page (operation 310). When the page skip is not generated, the controller 160 maintains the displaying of the current page (operation 311). Next, the controller 160 determines whether execution of the application is terminated (operation 312). When the execution of the application is not terminated, the process returns to operation 303. This flow is sketched below.
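  • The loop of operations 303 to 312 can be compressed into a small dispatch sketch (Python; the event representation and the handler names are hypothetical, not part of the disclosure):

        def page_event_loop(events, handlers):
            """Skeleton of FIG. 3: `events` is an iterable of (kind, data) pairs
            and `handlers` maps a name to a callback supplied by the caller."""
            for kind, data in events:
                if kind == "timeout":              # operations 304-305: screen off
                    handlers["screen_off"](data)
                elif kind == "move":               # operations 306-307: deform page
                    handlers["deform"](data)
                elif kind == "release":            # operations 309-311: skip or keep
                    key = "skip" if handlers["is_skip"](data) else "restore"
                    handlers[key](data)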
  • FIG. 4 is a flowchart illustrating a page deforming method according to an exemplary embodiment.
  • a controller 160 detects a target node touched by a touch input device. Further, the controller 160 detects a moving direction of the touch input device. The controller 160 moves the target node in the moving direction of the touch input device (operation 401). Particularly, the controller 160 may move the target node in a direction perpendicular to the gravity direction. The controller 160 may move the target node at a determined gradient (e.g., -30° to +30°) based on the gravity direction. Next, the controller 160 calculates displacement of the moved target node (operation 402).
  • the displacement is represented by a vector having a size and a direction.
  • the size of the displacement may include at least one of a current location of the target node, a moved distance of the target node, and speed of the target node.
  • the size of the displacement may include only a current location of the target node, only a moved distance of the target node, only a speed of the target node, or a combination of the current location of the target node, the moved distance of the target node, and the speed of the target node.
  • the controller 160 calculates forces applied to each node using the calculated displacement of the target node (operation 403).
  • the calculation of the forces is generally known in the art. That is, the controller 160 calculates a magnitude of forces applied to each node and a direction to which the forces are applied (operation 403).
  • the controller 160 applies the calculated forces to each node to deform a page mesh (operation 404). That is, the controller 160 calculates location values of respective nodes using the calculated forces (operation 404).
  • the controller 160 applies the deformed page mesh to a page to generate an animation (operation 405).
  • the generated animation is displayed such that the page is convexly deformed as the target node is moved in a direction perpendicular to gravity or in a determined gradient direction.
  • if the touch of the touch input device is then released, the page is restored to an original state, that is, an open state.
  • the page may be skipped or may not be skipped and returned to an original position.
  • Such a result is determined by the forces applied to the respective nodes of the page mesh. That is, if the artificial force disappears, only the elastic force and the gravity are applied to the page mesh.
  • the controller 160 calculates a sum of forces applied to respective nodes of the page mesh.
  • the controller 160 determines a moved direction of the page based on the sum of the forces.
  • the controller 160 moves the page along the determined direction. For example, the page is moved in a direction towards which a mass center of the page mesh faces.
  • for example, the moving direction of the page may be determined as the moving direction of the touch input device immediately before the touch input device is separated from the screen (that is, the page). A detailed example will be described with reference to FIG. 5.
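  • One way to read the mass-center rule above is sketched below (Python; interpreting the mass center as the weight-averaged x coordinate of the mesh nodes is an assumption of this sketch):

        import numpy as np

        def skip_direction(pos, weight, spine_x):
            """Direction the released page moves, from where the weighted mass
            center of the mesh lies relative to the spine (central axis)."""
            cx = float((pos[..., 0] * weight).sum() / weight.sum())
            return "left" if cx < spine_x else "right"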
  • FIG. 5 is a flowchart illustrating a page skipping method according to an exemplary embodiment.
  • a display unit 110 displays a page, and a touch input device of a user touches the displayed page (operation 501). While the touch input device touches the page, the controller 160 detects a current touch coordinate (x, y) (operation 502).
  • the X axis is a horizontal axis based on a viewpoint at which the user views a screen. It is assumed that two pages are displayed, one page at a left side and the other page at a right side, based on a center line of a screen. Further, a right direction is a positive direction of an X axis, and a left direction is a negative direction of the X axis.
  • the controller 160 determines whether |x - old_x| >= th (operation 503), where the value 'x' is a current touch coordinate, the value 'old_x' is a previous touch coordinate, and 'th' is a preset threshold value. The threshold 'th' may be set to, for example, 5 mm, although it is not limited thereto.
  • the controller 160 determines whether the current touch coordinate is greater than the previous touch coordinate (operation 504). When the current touch coordinate is greater than the previous touch coordinate, the controller 160 determines a moving direction of the touch input device as a 'right direction' (operation 505). When the current touch coordinate is less than or equal to the previous touch coordinate, the controller 160 determines the touch direction as a 'left direction' (operation 506). After the moving direction is determined, the controller 160 sets the current touch coordinate to the previous touch coordinate (operation 507). Next, the controller 160 determines whether a touch of the touch input device is released from the screen (operation 508). When the touch of the touch input device is not released, the process returns to operation 502.
  • the controller 160 determines whether the determined touch direction is a right direction (operation 509). When the touch direction is the right direction, the controller 160 moves the touched page in the right direction (operation 510). If the touched page is a left page, operation 510 corresponds to an operation of skipping back to a previous page. When the touched page is a right page, operation 510 corresponds to an operation of maintaining the displaying of the touched page without skipping to the next page. When the touch direction is the left direction, the controller 160 moves the touched page in the left direction (operation 511). If the touched page is the left page, operation 511 corresponds to an operation of maintaining the displaying of the touched page without skipping the page back. Conversely, if the touched page is the right page, operation 511 corresponds to an operation of skipping to a next page. The direction tracking of operations 502 to 507 is sketched below.
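  • Operations 502 to 507 amount to updating a remembered direction only when the x coordinate has changed by at least the threshold; a sketch follows (Python; the list-of-samples input format is hypothetical):

        def track_direction(x_samples, th=5.0):
            """Follow FIG. 5: compare each new x coordinate with the previously
            accepted one and update the direction when |x - old_x| >= th."""
            direction, old_x = None, x_samples[0]
            for x in x_samples[1:]:
                if abs(x - old_x) >= th:                          # operation 503
                    direction = "right" if x > old_x else "left"  # 504-506
                    old_x = x                                     # operation 507
            return direction          # consumed at touch release (operation 509)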
  • FIG. 6 is a flowchart illustrating page setting according to an exemplary embodiment.
  • a controller 160 may control a display unit 110 to display a home image (operation 620).
  • the home image includes an icon corresponding to an environment setting.
  • the controller 160 detects a touch of an icon corresponding to the environment setting (operation 621).
  • the controller 160 controls the display unit 110 to display an environment setting image of the mobile terminal 100 (operation 622).
  • the controller 160 sets an environment of the mobile terminal, particularly, an environment with respect to the page, according to an operation of a user with respect to the screen (operation 623).
  • Preset values associated with the page are stored in the secondary memory 130. When the page stored in the secondary memory 130 is displayed, the preset values are used by the controller 160.
  • FIG. 7A is an exemplary screen diagram for setting an environment of a mobile terminal according to an exemplary embodiment.
  • the display unit 110 displays an environment setting image 730 under the control of the controller 160.
  • the environment setting image 730 includes items such as a wireless network 731, a location service 732, a sound 733, a display 734, a security 735, and a page setting 736.
  • the user touches the page setting 736 from the foregoing items.
  • the controller 160 controls the display unit 110 to display a page setting image for setting the environment of the page.
  • FIG. 7B is an exemplary screen diagram for setting an environment of a page.
  • the display unit 110 displays a page setting screen 740 under control of the controller 160.
  • the page setting screen 740 includes items such as page thickness/material 741, touch gesture change 742, an allowable gradient range 743, feedback 744, and a screen change time 745.
  • the page thickness/material 741 may be set to, for example, 75 g/m² and printing paper.
  • the page thickness/material 741 may be set by a manufacturer of an electronic book, and may not be changeable by the user.
  • the touch gesture change 742 is an item for changing the touch gesture by which the page may be skipped.
  • the touch gesture which is allowed for page skip may be changed from flick to drag or vice versa.
  • An allowable gradient range 743 where the target node is movable may be in the range of -30° to +30°.
  • the feedback 744 is an item for determining feedback to be provided to the user when the page is skipped.
  • the user may set a vibration effect and a sound effect as the feedback.
  • the screen change time 745 may be set to, for example, 0.5 seconds.
  • a display mode of a screen according to the exemplary embodiments may be divided into a landscape mode and a portrait mode.
  • in the landscape mode, the mobile terminal 100 displays two pages side by side in the left and right directions.
  • in the portrait mode, the mobile terminal displays one page.
  • the exemplary embodiments are not limited thereto.
  • a sensor (e.g., an acceleration sensor) included in the mobile terminal 100 detects rotation information and transfers the rotation information to the controller 160.
  • the controller 160 may determine the display mode of the mobile terminal 100 using the rotation information.
  • the exemplary embodiments may use the landscape mode and the portrait mode as the display mode.
  • FIGS. 8A to 23 are exemplary screen diagrams illustrating a page display method according to an exemplary embodiment.
  • the controller 160 moves a target node to convexly deform the page.
  • although the overall shape of the page is convex, the concrete shape of the page is changed according to touch information (e.g., touched location, moving direction, moving distance, and speed).
  • the term 'contents' may refer to, for example, photographs, videos, audio, images, calendars, contact points, memos, documents, e-books, web pages, thumb-nails, and icons.
  • the contents are displayed for each page.
  • Contents for which the number of pages varies may include photographs, videos, audio, images, contact points, memos, documents, thumbnails, and icons.
  • Contents for which the number of pages does not vary may include calendars, e-books, and web pages.
  • FIGS. 8A to 16 illustrate screens on which pages of a gallery are displayed.
  • each page displays a plurality of icons (e.g., 12 icons).
  • the number of icons displayed for each page may be changed by a user.
  • Each icon corresponds to a photograph or a moving image. For example, when the user taps an icon, the corresponding photograph (or moving image) is displayed on the screen.
  • the page may include other contents, for example, photographs or thumbnails instead of icons.
  • the page may include various types of contents (e.g., icons, photographs, thumbnails).
  • the user touches the touch input device on a right bottom peripheral point 810 of a right page (e.g., 4/11 page) of the gallery. Then, the controller 160 detects the target node (or touch coordinate) corresponding to the right bottom peripheral point 810. The user moves the touch input device in a left bottom direction in a state in which the right bottom peripheral point 810 is touched. Then, the controller 160 moves the target node in a left direction from a right bottom area. The controller 160 calculates displacement of the moved target node. In detail, the controller 160 calculates a current location value of the target node (that is, a current touch coordinate of the touch input device), moving speed and direction of the target node.
  • FIG. 8A illustrates an animation (that is, a deformed shape of the page) when the touch input device moves from the right bottom peripheral point 810 in a left direction along a bottom area and is located at a first lower side point 820. As shown in FIG. 8A, the page is greatly deformed and becomes convex in the moving direction (810 -> 820) of the target node. The corner region 815 containing the target node approaches the spine 840, as compared with another corner region 830.
  • FIG. 8B illustrates an animation in which the touch input device is located at a second lower side pointer 850.
  • the page of FIG. 8B is more convex than the page of FIG. 8A. Accordingly, if the touch of the touch input device is released, the page of FIG. 8A is not skipped but the page of FIG. 8B may be skipped. In other words, in the case shown in FIG. 8A, when the user releases the touch from the first lower side point 820, a direction of the force (that is, of the mass center) may act in a rightward direction. Accordingly, the page is not skipped but instead returns to an original position. In the case shown in FIG. 8B, when the user releases the touch from the second lower side point 850, the direction of the force may act in a leftward direction. Accordingly, the page may be skipped to an opposite side.
  • a direction of the mass center may be associated with a current touch coordinate.
  • there may also be a separate skip condition. An example of such a condition will be described in detail.
  • the page skip may be determined according to a speed at which the touch input device is moved from the right bottom peripheral point 810 to the first lower side point 820. For example, when the touch of the touch input device is released after the touch input device is moved at a speed greater than or equal to 30 cm/sec, the page may be skipped. When the speed is less than 30 cm/sec, the page may not be skipped. Many other page skip conditions may also be implemented; the speed condition is sketched below.
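  • A sketch of the speed condition (Python; the 30 cm/sec figure is the example given above, and the function name is illustrative):

        def speed_skip_condition(release_speed_cm_s, threshold=30.0):
            """Skip the page only if the drag speed at release meets the
            example 30 cm/sec threshold described above."""
            return release_speed_cm_s >= threshold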
  • the user may move the touch input device from the second lower side point 850 to the left side in a state in which the touch is continuously maintained. That is, the user moves the touch input device to a first left point 860 across a central line. Then, the controller 160 controls the display unit 110 to display a part 870 of a rear surface (e.g., page 5) of the currently operated page. If the user releases the touch from the first left point 860, as shown in FIG. 8D, the controller 160 displays the entire rear surface on the left side. In this manner, when the touch input device is moved from the right side to the left side across the central line, a rear surface of the page is displayed. When the touch is released after passing the central line, the page is skipped.
  • for example, when the touch input device passes the central line by more than a threshold (e.g., 10 mm), a rear surface of the currently operated page may be displayed; that is, the controller 160 may control the display unit 110 to display the rear surface.
  • the threshold for displaying the rear surface may also be set to values other than 10 mm.
  • FIG. 9 illustrates an animation when the touch input device is moved from the right bottom peripheral point 910 towards a left top periphery and is located at the third bottom side point 920.
  • the touch input devices in FIG. 8A and FIG. 9 start from the same right bottom peripheral point but moving directions thereof are different from each other. Accordingly, shapes of deformed pages are different from each other.
  • unlike the case of FIG. 8A, when the user releases the touch, the page in FIG. 8A is not skipped but the page in FIG. 9 may be skipped to the left side. Touches shown in FIGS. 8A and 9 start from the same right bottom periphery of the page. However, the moving direction of the case shown in FIG. 8A is along the periphery, while the moving direction of the case shown in FIG. 9 is along the center of the page. Accordingly, in the case shown in FIG. 8A, the lower portion of the page has a mass center at the left side, and the upper portion of the page has a mass center at the right side. In this case, the overall mass center may be towards the right side, and as a result the page is not skipped. In contrast, in the case shown in FIG. 9, the moving direction of the touch is towards the center of the page rather than along the periphery, so both the upper and lower portions of the page have a mass center towards the left side. As a result, the page is skipped.
  • a direction of the mass center of the page may be associated with the moving direction of the touch input device as well as a current touch coordinate and speed of the touch input device.
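One hypothetical way to realize this mass-center rule is to take the weighted average of the node positions of the deformed mesh and compare it with the spine: if the center lies left of the fixed central axis, the page falls (skips) to the left; otherwise it returns to its original position. The node type, the weights, and the decision itself are illustrative assumptions, not the patent's implementation.

```java
/** Hypothetical mesh node reduced to what the mass-center test needs. */
class WeightedNode {
    final double x;      // current x position of the node
    final double weight; // weight allocated to the node
    WeightedNode(double x, double weight) { this.x = x; this.weight = weight; }
}

class MassCenterRule {
    /** Returns true if the deformed page would fall (skip) to the left of the spine. */
    static boolean skipsLeft(WeightedNode[] nodes, double spineX) {
        double weighted = 0, total = 0;
        for (WeightedNode n : nodes) {
            weighted += n.x * n.weight;
            total += n.weight;
        }
        return total > 0 && (weighted / total) < spineX;
    }
}
```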
  • referring to FIGS. 10A and 10B, the user touches the touch input device at a right side point 1010 of a center of the page (e.g., page 4/11), and moves the touch input device to an opposite side (left side). That is, FIG. 10A illustrates an animation where the touch input device is moved from the right side point 1010 to a left side and is located at a central point 1020. As shown in FIG. 10A, when the user moves the touch input device to a left side after the user touches the right side point 1010 of the center of the page, upper and lower portions of the page may be symmetrically and uniformly deformed.
  • as shown in FIG. 10B, the user moves the touch input device from the central point 1020 in a leftward direction to a left side point 1030. That is, FIG. 10B illustrates an animation where the touch input device is located at the left side point 1030.
  • as shown by a comparison with FIG. 10A, the page shown in FIG. 10B is more convex than the page shown in FIG. 10A. Accordingly, when the user releases the touch, the page of FIG. 10A is not skipped, but the page of FIG. 10B may be skipped.
  • moving directions of the touch input device in FIGS. 8A and 10A are the same (towards a left side), but initial touch coordinates thereof are different from each other. Accordingly, shapes of the deformed pages are different from each other.
  • as shown by a comparison of FIG. 10B with FIG. 8B, if the user releases a touch, the page of FIG. 8B is not skipped but the page of FIG. 10B may be skipped. Touches of FIGS. 8B and 10B start from a right side of the page. However, the touch of FIG. 10B starts from the center, and the touch of FIG. 8B starts from a lower portion below the center. Accordingly, in the case shown in FIG. 10B, upper and lower portions of the page may have a mass center towards a left side, so the page is skipped. In contrast, in the case shown in FIG. 8B, a lower portion of the page has a mass center towards a left side but an upper portion of the page may have a mass center towards a right side, so the page may not be skipped.
  • a direction of the mass center in the page may be associated with an initial touch coordinate as well as a current touch coordinate and a moving direction of the touch input device.
  • referring to FIGS. 11A and 11B, the user touches a touch input device at a right top peripheral point 1110 of a page (e.g., page 4/11), and moves the touch input device from that point along the top periphery. That is, FIG. 11A illustrates an animation where the touch input device is moved from the right top peripheral point 1110 towards a left top periphery and is located at a first top side point 1120. As shown in FIG. 11B, the user moves the touch input device from the first top side point 1120 along the top periphery in a left direction. That is, FIG. 11B illustrates an animation where the touch input device is located at a second top side point 1130.
  • FIG. 12 illustrates an animation where the touch input device is moved from a right top peripheral point 1210 towards a left bottom periphery and is located at a top side point 1220.
  • touch input devices in FIGS. 11A and 12 start from a right top peripheral point but moving directions thereof are different from each other. Accordingly, shapes of deformed pages are different from each other.
  • FIG. 13 illustrates an animation where the touch input device is moved from a first left bottom peripheral point 1310 of the page (e.g., page 4/11) to a left side and is located at a second left bottom peripheral point 1320.
  • FIG. 14 illustrates an animation where the touch input device is moved from the first left side point 1410 to the left side and is located at a second left side point 1420.
  • FIG. 15 illustrates an animation where the touch input device is moved from the first left top peripheral point 1510 and is located at the second left top peripheral point 1520.
  • the user may touch any part of the page, not only the touch coordinates described in FIGS. 8 to 15. Accordingly, it is understood that deformation of the page may be changed according to a touch coordinate, a moving direction, and speed.
  • in FIG. 16, a display mode is set to a portrait mode. In this case, the display unit 110 may display one page in the portrait mode.
  • a user touches the touch input device at a right bottom peripheral point 1610 of a page (e.g., page 4/11).
  • the controller 160 detects a target node corresponding to the right bottom peripheral point 1610.
  • the user moves the touch input device towards a left bottom periphery in a state in which the right bottom peripheral point 1610 is touched.
  • the controller 160 moves the target node towards the left bottom periphery.
  • the controller 160 calculates displacement of the moved target node.
  • the controller 160 calculates a current location, moving speed, and a moving direction of the target node.
  • the controller 160 calculates forces applied to respective nodes using the calculated displacement of the target node.
  • the controller 160 calculates location values of the respective nodes using the calculated forces.
  • the controller 160 generates an animator using the calculated location values of the nodes.
  • the controller 160 controls the display unit 110 to display the generated animation.
  • FIG. 16 illustrates an animation in which the touch input device is moved from the right bottom peripheral point 1610 towards a left bottom periphery and is located at a bottom side point 1620. If the touch input device approaches a left side within a preset threshold value (e.g., 10 mm from a left side of a screen), the controller 160 skips the page and controls the display unit 110 to display a next page (e.g., page 5/11).
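A sketch of this edge-proximity test follows; converting the millimeter threshold to pixels requires the screen density, and all names here are illustrative assumptions.

```java
/** Sketch of the left-edge proximity test used to trigger a page skip. */
class EdgeProximity {
    /**
     * @param touchXPx    current touch x coordinate, in pixels from the left edge
     * @param thresholdMm preset threshold, e.g., 10 mm
     * @param pixelsPerMm screen density, e.g., DPI / 25.4
     */
    static boolean nearLeftEdge(float touchXPx, float thresholdMm, float pixelsPerMm) {
        return touchXPx <= thresholdMm * pixelsPerMm;
    }
}
```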
  • FIG. 17 illustrates a screen displaying a page of a music player, the number of pages of the music player being changeable.
  • one page includes a plurality of icons (e.g., 12 icons as illustrated in FIG. 17).
  • the number of icons displayed per page may be changed by the user.
  • the user touches a touch input device at a right bottom peripheral point 1710 of a right page (e.g., page 4/11), and moves the touch input device from the right bottom peripheral point 1710 towards a left bottom periphery.
  • the controller 160 generates an animation and controls the display unit 110 to display the generated animation.
  • FIG. 17 illustrates an animation (that is, a shape in which a page of a music player is convexly deformed) in which the touch input device is moved from the right bottom peripheral point 1710 towards a left periphery and is located at a bottom side point 1720.
  • One page includes a plurality of icons (e.g., 12 icons).
  • the icons may be displayed in many different ways, e.g., a list form, and are not limited to being displayed in a grid form as illustrated in FIG. 17.
  • the page may include a music folder, a thumbnail or a music playback image. If the user taps the music folder, a corresponding music playback image is displayed.
  • the music playback image includes an album cover photograph, a playback button, and a pause button. If the user taps the playback button, music included in the corresponding music folder may be sequentially or randomly played.
  • FIG. 18 illustrates a screen which displays a page of a video player, the number of pages of the video player being changeable.
  • in response to movement 1810 of the touch input device, a page of the video player is convexly deformed from the touched page (e.g., page 4/11).
  • the page of the video player includes a plurality of icons (e.g., 12 icons).
  • the number of icons displayed per page may be changed by the user. If the user taps one of the icons, a corresponding video is played.
  • the music player and the video player may be integrated with each other. That is, the music icon and the video icon may be displayed on one page. In the case of an integral type configuration, the page may also be convexly deformed.
  • FIG. 19 illustrates a screen displaying a page of an address book, the number of pages of the address book being changeable.
  • in response to movement 1910 of the touch input device, a page of the address book is convexly deformed from the touched page (e.g., page 4/11).
  • the page of the address book includes a plurality of pieces of contact point information.
  • the contact point information may be displayed in a form of a list, although it is not limited thereto; the contact point information may also be displayed in another form, for example, in a form of a grid. If the user taps contact point information, corresponding detailed information (e.g., phone numbers, e-mail addresses, home addresses, office addresses) is displayed.
  • the page of the address book may also be displayed in a portrait mode as illustrated in FIG. 16, rather than the landscape mode.
  • FIG. 20 illustrates a screen displaying a page of a memo note, the number of pages of the memo note being changeable.
  • in response to movement of the touch input device, a page of the memo note is convexly deformed from the touched page (e.g., page 4/11).
  • FIG. 21 illustrates a screen displaying a page of a calendar, the number of pages of the calendar being fixed.
  • in response to movement of the touch input device, a page of the calendar (e.g., February 2012) is convexly deformed.
  • FIG. 22 illustrates a screen displaying a page of a web browser, the number of pages of the web browser being fixed.
  • the page of the web browser is convexly deformed from the touched page (e.g., page 4/11).
  • FIG. 23 illustrates a screen displaying a page of an electronic book, the number of pages of the electronic book being fixed.
  • the page of the electronic book is convexly deformed from the touched page (e.g., page 4/11).
  • FIG. 24 is a flowchart illustrating a page editing method according to an exemplary embodiment.
  • FIGS. 25A to 26 are diagrams of screens illustrating a page editing method according to an exemplary embodiment.
  • a controller 160 detects an event requesting the display of a page (operation 2410).
  • the request event is a tap of a touch input device with respect to an application icon.
  • the controller 160 determines a page to be displayed among stored pages of a corresponding application (operation 2420).
  • the controller 160 may determine, as the page to be displayed, the page most recently displayed before the corresponding application was terminated.
  • the controller 160 controls the display unit 110 to display the determined page and additional information thereof (operation 2430).
  • the additional information includes a number 2510 indicating an order of the corresponding page and a number 2520 indicating the total number of pages.
  • the controller 160 detects an event requesting editing of the page (operation 2440).
  • the request event is a tap of the touch input device on a delete button 2530.
  • the controller 160 reconfigures pages in response to the editing request event and stores the reconfigured pages in a secondary memory 130 (operation 2450). Further, the controller 160 controls the display unit 110 to display at least one of the reconfigured pages and additional information thereof (operation 2460).
  • the controller 160 reconfigures the pages using the remaining contents, except for the contents P5, P9, P11, and P19 to P24 selected by the user from the contents list, in response to a tap of the delete button 2530. Accordingly, the total number of the pages is changed from 11 to 10. Further, in the examples shown in FIGS. 25A to 26, the order of the currently displayed pages may be maintained, but it is understood that the order of currently displayed pages may be changed.
  • contents such as videos, audio, images, contact points, memos, documents, thumbnails, and icons as well as photographs may be edited (e.g., deleting contents from the page, adding contents to the page, or moving the contents to another page).
  • the pages may be configured in the order of time. For example, when a shooting time of a first photograph is earlier than that of a second photograph, the first photograph is arranged on an earlier page than the second photograph. Further, the pages may be configured by places. For example, a first page is configured by photographs shot in Seoul, and a second page is configured by photographs shot in New York. If an arrangement scheme of contents is changed from "time" to "place" or vice versa by the user, the controller 160 may reconfigure pages, and accordingly, at least one of an order of currently displayed pages and a total number of pages may be changed.
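As an illustration of such reconfiguration, the sketch below rebuilds the page list from a flat photo list under either scheme. Photo, takenAt, place, and itemsPerPage are hypothetical names; the patent does not specify this structure.

```java
import java.util.*;

/** Sketch of page reconfiguration when the arrangement scheme changes. */
class PageBuilder {
    record Photo(long takenAt, String place) {} // hypothetical content item

    /** "Time" scheme: sort by shooting time, then cut into fixed-size pages. */
    static List<List<Photo>> byTime(List<Photo> photos, int itemsPerPage) {
        List<Photo> sorted = new ArrayList<>(photos);
        sorted.sort(Comparator.comparingLong(Photo::takenAt));
        List<List<Photo>> pages = new ArrayList<>();
        for (int i = 0; i < sorted.size(); i += itemsPerPage) {
            pages.add(new ArrayList<>(sorted.subList(i, Math.min(i + itemsPerPage, sorted.size()))));
        }
        return pages;
    }

    /** "Place" scheme: one page per place, in first-seen order. */
    static List<List<Photo>> byPlace(List<Photo> photos) {
        Map<String, List<Photo>> groups = new LinkedHashMap<>();
        for (Photo p : photos) {
            groups.computeIfAbsent(p.place(), k -> new ArrayList<>()).add(p);
        }
        return new ArrayList<>(groups.values());
    }
}
```

Either rebuild can change both the total number of pages and the order of the currently displayed page, which is why the displayed additional information is refreshed afterwards.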
  • the number of contents included in one page may also be changed. In this case, the controller 160 reconfigures the pages, and accordingly, at least one of an order of currently displayed pages and a total number of pages may be changed. For example, each of the pages shown in FIGS. 25A and 25B includes 12 contents, whereas the page of FIG. 26 includes 9 contents.
  • an arrangement scheme of contents in the page may be changed. That is, the contents may be arranged in a form of a grid, a list, or some other form.
  • the number of contents included in the page may be changed.
  • the information indicating an order of currently displayed pages may be changed and the changed information may be displayed.
  • information indicating the total number of pages may be changed and the changed information may be displayed.
  • the foregoing method for displaying contents according to exemplary embodiments may be implemented in an executable program command form by various computer components and may be recorded in a computer readable recording medium.
  • the computer readable recording medium may include a program command, a data file, and a data structure individually or a combination thereof.
  • the program command recorded in the recording medium may be specially designed or configured for the exemplary embodiments, or may be known to and used by a person having ordinary skill in the computer software field.
  • the computer readable recording medium may include magnetic media such as a hard disk, floppy disk, or magnetic tape, optical media such as a Compact Disc Read Only Memory (CD-ROM) or Digital Versatile Disc (DVD), magneto-optical media such as a floptical disk, and a hardware device such as ROM, RAM, and flash memory storing and executing program commands.
  • the program command may include a machine language code created by a compiler and a high-level language code executable by a computer using an interpreter.
  • the foregoing hardware device may be configured to be operated as at least one software module to perform an operation according to the exemplary embodiments.
  • the exemplary embodiments provide a highly realistic feeling to a user when the user operates a screen on which pages are displayed.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method and a mobile terminal for displaying contents are provided. The method of displaying contents for pages of a mobile terminal including a display unit in which a touch panel is installed, includes: displaying a page; detecting movement of a touch input device with respect to the displayed page; and displaying the page so that the page is convexly deformed and skipped in response to the movement of the touch input device.

Description

CONTENTS DISPLAY METHOD AND MOBILE TERMINAL IMPLEMENTING THE SAME
Methods and apparatuses consistent with exemplary embodiments of the present disclosure relate to a contents display method and a mobile terminal implementing the same.
A mobile terminal provides various contents. The contents may be displayed for each of a plurality of pages. However, a contents display method and an apparatus thereof in the related art do not provide the user with a feeling of operating pages on the mobile terminal which is similar to the feeling of operating an actual paper book. According to a contents display method and an apparatus thereof of the related art, if input information associated with a page skip (e.g., a push of a next page button) is detected, a currently displayed page is replaced with a next page. Such a replacement scheme does not actually skip the currently displayed page but simply replaces it, as in browsing to a next web page. Meanwhile, a recently developed mobile terminal may include a touch screen. The mobile terminal detects a touch gesture and skips pages in response to the detected gesture. When the user skips the pages, the mobile terminal according to the related art provides an animation which gradually folds a current page (that is, a front surface of the page) and shows a next page (that is, a back surface of the page) regardless of a touched point or a direction of a drag.
One or more exemplary embodiments provide a contents display method capable of achieving a realistic feeling for a user when the user operates a screen on which a page is displayed by a touch input device (e.g., finger or pen), and an apparatus thereof.
One or more exemplary embodiments also provide a contents display method in which an animation of pages being skipped provides a realistic feeling, and an apparatus thereof.
In accordance with an aspect of an exemplary embodiment, there is provided a method of displaying contents of pages displayed by a mobile terminal including a display unit in which a touch panel is installed, the method including: displaying a page; detecting movement of a touch input device with respect to the displayed page; and displaying the page so that the page is convexly deformed and skipped in response to the movement of the touch input device.
In accordance with an aspect of another exemplary embodiment, there is provided a mobile terminal including: a display unit in which a touch panel is installed and configured to display contents for each of a plurality of pages; a memory configured to store the pages; and a controller configured to control the display unit such that one of the pages is displayed, detect movement of a touch input device with respect to the displayed page, and control the display unit such that the page is displayed as convexly deformed and skipped in response to the movement of the touch input device.
In accordance with an aspect of another exemplary embodiment, there is provided a method to display pages including: displaying a page on a device including a touch input unit, generating a page mesh corresponding to the displayed page, the page mesh including a plurality of nodes having respective weights, detecting movement of a touch input device with respect to the displayed page, using the touch input unit, and changing an appearance of the page according to the detected movement and the page mesh.
The above and/or other aspects will be more apparent from the following detailed description of exemplary embodiments in conjunction with the accompanying drawings, in which:
FIG. 1 is a block diagram illustrating a configuration of a mobile terminal according to an exemplary embodiment;
FIGS. 2A and 2B are diagrams illustrating a page mesh according to an exemplary embodiment;
FIG. 3 is a flowchart illustrating a page display method according to an exemplary embodiment;
FIG. 4 is a flowchart illustrating a page deforming method according to an exemplary embodiment;
FIG. 5 is a flowchart illustrating a page skipping method according to an exemplary embodiment;
FIG. 6 is a flowchart illustrating page setting according to an exemplary embodiment;
FIG. 7A is an exemplary diagram of a screen for setting an environment of a mobile terminal according to an exemplary embodiment;
FIG. 7B is an exemplary diagram of a screen for setting an environment of a page according to an exemplary embodiment;
FIGS. 8A to 23 are exemplary diagrams illustrating screens for describing a page display method according to an exemplary embodiment;
FIG. 24 is a flowchart illustrating a page editing method according to an exemplary embodiment; and
FIGS. 25A to 26 are diagrams of screens illustrating a page editing method according to an exemplary embodiment.
Exemplary embodiments are described with reference to the accompanying drawings in detail. The same reference numbers are used throughout the drawings to refer to the same or like parts. Detailed descriptions of well-known functions and structures incorporated herein may be omitted to avoid obscuring the subject matter of the exemplary embodiments.
The contents display method according to exemplary embodiments may be implemented by a mobile terminal, for example, a smart phone, a tablet PC, an e-book reader, a navigation device, or a moving image player. Hereinafter, the contents display method and the mobile terminal thereof will be described in detail.
As used herein, according to exemplary embodiments, the term 'contents' may refer to photographs, videos, audio, images, calendars, contact points, memos, documents, e-books, web pages, and thumb-nails and icons, in addition to many other types of contents. The contents are displayed for each page. The pages according to exemplary embodiments may be convexly and stereoscopically deformed in response to a user gesture. Accordingly, when the user operates a screen on which pages are displayed by a touch input device (e.g., finger or pen), the user may feel as if the user is controlling real paper pages.
As used herein, according to exemplary embodiments, the term 'page mesh' refers to geometrical information of the pages. The page mesh includes a plurality of nodes and links connecting the nodes with each other. A weight is allocated to each node, and an elastic value is allocated to each link. The elastic value may be allocated differently according to properties of the pages in order to achieve a realistic feeling for the user.
For example, when the pages are thickly set (that is, when the weight is set to be great), a great elastic value may be allocated. When the page is relatively thinly set, a relatively small elastic value may be allocated. A great weight may be allocated to nodes located in an inner side (e.g., spine) of the page. So that location variation in relatively outer nodes (e.g., at a page edge) is greater than that of inner nodes, a small weight may be allocated to the relatively outer nodes. Alternatively, the same weight may be allocated to all nodes.
Virtual force applied to each node may be classified into two types. The first virtual force is an internal virtual force (hereinafter also referred to as an 'internal force'), such as an elastic force of the links. The second virtual force is an external virtual force (hereinafter also referred to as an 'external force') such as gravity or human power. The virtual gravity of the external force is defined as a force pulling the node down. If a screen on which the pages are displayed is arranged in an XY plane and a user's viewpoint is along a positive direction of the Z axis with respect to the XY plane, the virtual gravity pulls the node down toward a lower portion of the XY plane. The Z axis is vertical (orthogonal) to the XY plane. The Z axis is not an actual axis, but is a virtual axis for stereoscopically controlling the virtual page. The virtual gravity may act equally on all nodes. The gravity may have a different effect according to a property of a page to achieve a realistic feeling for the user. For example, in a case where the user lifts and puts down a page of a real paper book, when the page is thin, the page falls down slowly; if the page is relatively thick, the page falls down rapidly. The following table 1 illustrates thicknesses (grammages) by types of pages. For example, referring to table 1, a pamphlet falls down faster than a leaflet. That is, a deformation degree of the page may be changed according to the thickness and material set for the displayed page.
Table 1
Leaflet interposed in newspaper: 52.3 g/m2
Magazine body, advertising papers: 64 g/m2
Tickets, weekly magazines, pamphlets: 127.9 g/m2
Cover of fashion magazine, name cards: 157 g/m2
Sketchbooks: 200 g/m2
Printing papers: 75 g/m2
An artificial force is a force which the user applies to the page. For example, a user gesture with respect to the screen may be the artificial force. A target node touched by the touch input device is moved in a direction in which the touch input device is moved. In this case, the artificial force is transferred to other nodes through links. As a result, a sum of the internal force and the external force is applied to each node. If the artificial force is applied to a displayed page, a controller of the mobile terminal calculates forces applied to each node based on the artificial force applied to the displayed page. The force may be obtained in various ways, for example, by multiplying a moving distance of a target node by speed to obtain acceleration, and by multiplying the acceleration by a weight of a corresponding target node. The calculation of the force is generally known in the art, and thus a detailed description thereof is omitted. The mobile terminal reflects a deformed page mesh on a page to generate an animation. A procedure of generating the animation based on the artificial force is defined by a physically-based simulation. The physically-based simulation may be executed by various components, such as, for example, an Application Processor (AP), a Central Processing Unit (CPU), or a Graphics Processing Unit (GPU).
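As a rough illustration of such a physically-based simulation, the sketch below accumulates the elastic (internal) force per link and gravity (external) per node, then advances positions with explicit Euler integration. The node and link types, the integration scheme, and all constants are illustrative assumptions rather than the patent's implementation; the artificial force from the touch would be applied by pinning the target node to the touch position before each step, and damping is omitted for brevity.

```java
/** Hypothetical node of the page mesh. */
class MeshNode {
    double x, y, z;    // position (z is the virtual depth axis)
    double vx, vy, vz; // velocity
    double fx, fy, fz; // accumulated force
    double mass;       // weight allocated to the node
    boolean fixed;     // true for nodes on the spine (central axis)
}

/** Hypothetical link (spring) between two nodes. */
class Link {
    MeshNode a, b;
    double restLength, stiffness; // elastic value allocated to the link
}

class Simulator {
    static final double GRAVITY = -9.8; // pulls nodes down along the virtual z axis

    /** One simulation step: sum forces per node, then integrate positions. */
    static void step(MeshNode[] nodes, Link[] links, double dt) {
        for (MeshNode n : nodes) { n.fx = 0; n.fy = 0; n.fz = n.mass * GRAVITY; }
        for (Link l : links) {
            double dx = l.b.x - l.a.x, dy = l.b.y - l.a.y, dz = l.b.z - l.a.z;
            double len = Math.sqrt(dx * dx + dy * dy + dz * dz);
            if (len == 0) continue;
            double f = l.stiffness * (len - l.restLength); // Hooke's law
            double ux = dx / len, uy = dy / len, uz = dz / len;
            l.a.fx += f * ux; l.a.fy += f * uy; l.a.fz += f * uz;
            l.b.fx -= f * ux; l.b.fy -= f * uy; l.b.fz -= f * uz;
        }
        for (MeshNode n : nodes) {
            if (n.fixed) continue; // spine nodes do not move
            n.vx += n.fx / n.mass * dt; n.vy += n.fy / n.mass * dt; n.vz += n.fz / n.mass * dt;
            n.x += n.vx * dt; n.y += n.vy * dt; n.z += n.vz * dt;
        }
    }
}
```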
FIG. 1 is a block diagram illustrating a configuration of a mobile terminal according to an exemplary embodiment.
Referring to FIG. 1, the mobile terminal 100 includes a display unit 110, a key input unit 120, a memory 130, a radio frequency (RF) communication unit 140, an audio processor 150, a speaker SPK, a microphone MIC, a controller 160, and a pen 170.
The display unit 110 displays contents on a screen under control of the controller 160. That is, when the controller 160 processes (e.g., performs a decoding or resizing operation) and stores the contents in a buffer, the display unit 110 converts the contents stored in the buffer into an analog signal, and displays the analog signal on the screen. When power is supplied to the display unit 110, the display unit 110 displays a lock image (e.g., login image) on the screen. If lock release information (e.g., password) is detected in a state that a lock image is displayed, the controller 160 releases the lock. That is, the display unit 110 terminates the displaying of the lock image, and displays another image, for example, a home image under control of the controller 160. The home image includes a background image and a plurality of icons displayed thereon. Icons indicate applications or contents, respectively. If the user selects an icon, for example, an application icon (e.g., taps the icon), the controller 160 executes a corresponding application (e.g., gallery), and controls the display unit 110 to display an execution image of the corresponding application (e.g., page including a plurality of thumbnails).
The display unit 110 may be implemented as various types, for example, a Liquid Crystal Display (LCD), an Organic Light Emitting Diode (OLED), an Active Matrix Organic Light Emitting Diode (AMOLED), or a flexible display.
The touch panel 111 is installed on a screen of the display unit 110. For example, the touch panel 111 may be implemented as an add-on type located on a screen of the display unit 110 or an on-cell type or an in-cell type inserted into an inside of the display unit 110.
The touch panel 111 generates an analog signal (e.g., touch event) in response to touch of a touch input device (e.g., finger or pen) with respect to a screen, and a touch IC 112 converts the analog signal into a digital signal, and transfers the digital signal to the controller 160. The touch event includes a touch coordinate (x, y). For example, the touch IC 112 determines a representative coordinate of a plurality of touch coordinates, stores a determined touch coordinate in an internal memory of the touch IC 112, and transfers the touch coordinate stored in the internal memory to the controller 160 in response to a request of the controller 160. The touch coordinate may be in pixel units. For example, when resolution of a screen is 640 (the number of horizontal pixels) * 480 (the number of vertical pixels), an X axis coordinate may be, for example, (0, 640), and a Y axis coordinate may be, for example, (0, 480).
When the touch coordinate is received from the touch IC 112, the controller 160 determines that the touch input device (e.g., finger or pen) touches the touch panel 111. When the touch coordinate is not received from the touch IC 112, the controller 160 determines that the touch of the touch input device is released. Further, for example, when the touched coordinate varies from (x0, y0) to (x1, y1) and the variation amount D of the touched coordinate, where D² = (x0 − x1)² + (y0 − y1)², exceeds a preset moving threshold (e.g., 1 mm), the controller 160 determines that the touch input device has moved.
The controller 160 computes a location variation amount (dx, dy) of the touch and moving speed of the touch input device in response to movement of the touch input device. The controller 160 determines a user gesture as one of various different types of gestures, for example, touch, multi-touch, tap, double tap, long tap, tap & touch, drag, flick, press, pinch in, and pinch out based on presence of touch release of the touch input device, presence of movement of the touch input device, a location variation amount of the touch input device, and moving speed of the touch input device. The touch is a gesture where a user makes the touch input device contact with one point of a touch panel 111 on a screen. The multi-touch is a gesture where the user makes a plurality of touch input devices (e.g., thumb and index finger) contact the touch panel 111. The tap is a gesture where the user touches-off a corresponding point without movement after touching the touch input device on one point. The double tap is a gesture where a user continuously taps one point twice. The long tap is a gesture where touch of the touch input device is released from a corresponding point without a motion of the touch input device after touching one point longer than the tap. The tap & touch is a gesture where the user touches a corresponding point within a predetermined time (e.g., 0.5 seconds) after touching one point of a screen. The drag is a gesture that moves the touch input device in a predetermined direction in a state in which one point is touched. The flick is an operation that touches-off after moving the touch input device at a higher speed than the drag. The press is a gesture to maintain the touch without movement for a predetermined time (e.g., 2 seconds) after touching one point. The pinch in is a gesture where the user reduces an interval between touch input devices after simultaneously multi-touching two points by the two touch input devices. The pinch out is a gesture for increasing the interval between the touch input devices. That is, the touch is a gesture in which the user contacts the touch screen, and other gestures refer to variations in the touch.
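The following sketch shows one way the gestures above might be distinguished from tracked single-touch samples at release time. All thresholds except the 2-second press duration are illustrative assumptions, and a real classifier would also handle multi-touch, double tap, tap & touch, and pinch gestures, and would detect a press before release.

```java
/** Sketch of a single-touch gesture classifier; thresholds are illustrative. */
class GestureClassifier {
    static final double MOVE_THRESHOLD_PX = 6.0;    // assumed ~1 mm at ~160 dpi
    static final long LONG_TAP_MS = 500;             // assumed long-tap duration
    static final long PRESS_MS = 2000;               // press: no movement for 2 seconds
    static final double FLICK_SPEED_PX_PER_MS = 1.5; // assumed flick speed

    enum Gesture { TAP, LONG_TAP, PRESS, DRAG, FLICK }

    /** Classifies a completed single-touch stroke when the touch is released. */
    static Gesture classify(double movedPx, long heldMs, double releaseSpeedPxPerMs) {
        if (movedPx < MOVE_THRESHOLD_PX) {
            if (heldMs >= PRESS_MS) return Gesture.PRESS;
            return heldMs >= LONG_TAP_MS ? Gesture.LONG_TAP : Gesture.TAP;
        }
        return releaseSpeedPxPerMs >= FLICK_SPEED_PX_PER_MS ? Gesture.FLICK : Gesture.DRAG;
    }
}
```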
The touch panel 111 may be a converged touch panel including a hand touch panel detecting a hand gesture and a pen touch panel detecting a pen gesture. The hand touch panel may include a capacitive type touch panel. The hand touch panel may also include a resistive type touch panel, an infrared type touch panel, or an ultrasonic type touch panel. Further, the hand touch panel does not generate a touch event only based on a hand gesture, but may also generate the touch event based on other objects touching the touch panel 111 (e.g., conductive material capable of providing variation in capacitance). The pen touch panel may include an electromagnetic induction type touch panel. Accordingly, the pen touch panel generates a touch event by a specially manufactured touch pen 170 so that a magnetic field may be formed. In particular, the touch event generated by the pen touch panel includes a value indicating a type of the touch together with a touch coordinate. For example, when a first voltage level is received from the pen touch panel, the controller 160 determines a touch of the touch input device as an indirect touch (that is, hovering). When a second voltage level greater than the first voltage level is received from the touch panel 111, the controller 160 determines the touch of the touch input device as a direct touch. The touch event generated by the pen touch panel may further include a value indicating the presence of pushing of a button installed at the pen 170. For example, when a button installed at the pen 170 is pushed, a magnetic field generated from a coil of the pen 170 varies. In response to the variation in the magnetic field, the pen touch panel generates a third voltage level, and transfers the third voltage level to the controller 160.
The key input unit 120 may include a plurality of input keys and function keys for receiving numeric or character information and setting various functions. The keys may include a menu loading key, a screen on/off key, a power on/off key, and a volume control key. The key input unit 120 generates a key event associated with user settings and function control of the mobile terminal 100 and transfers the key event to the controller 160. The key event may include a power on/off event, a volume control event, a screen on/off event, and a shutter event. The controller 160 controls the foregoing constituent elements in response to the key event. Meanwhile, a key of the key input unit 120 may refer to a hard key and a virtual key displayed on the display unit 110 may refer to a soft key.
The secondary memory 130 may include various components, such as a disk, a RAM, a ROM, and a flash memory. The secondary memory 130 stores contents generated by the mobile terminal 100 or contents received from an external device (e.g., server, desktop PC, tablet PC) through the RF communication unit 140. The secondary memory 130 may temporarily store data copied from a message, a photograph, a web page, or a document by the user for performing a copy and paste operation. The secondary memory 130 stores various preset values (e.g., screen brightness, presence of vibration upon generation of a touch, presence of automatic rotation of screen). Further, the secondary memory 130 stores history information, for example, information associated with a most recently displayed page before the application is terminated.
The secondary memory 130 stores a booting program, at least one operating system, and applications (e.g., a gallery, an address book, a video player, a calendar, a note pad, an electronic book viewer, a music player, a web browser). The operating system serves as an interface between hardware and applications and further serves as an interface between applications, and manages various computer resources, such as a CPU, a graphic processing unit (GPU), a main memory, and the secondary memory 130. The applications may be classified into embedded applications and third party applications. For example, the embedded applications include a web browser, an e-mail program, and an instant messenger. If power of a battery is supplied to the controller 160 of the mobile terminal 100, the booting program is loaded into a main memory 161 of the controller 160. The booting program loads host and guest operating systems into the main memory 161. The operating systems load the applications into the main memory 161.
The RF communication unit 140 performs voice calls, image calls, and data communications with an external device through a network under the control of the controller 160. The RF communication unit 140 may include an RF transmitter for up-converting a frequency of a transmitted signal and amplifying the converted signal, and an RF receiver for low-noise-amplifying a frequency of a received signal and down-converting the amplified signal. The RF communication unit 140 may include a mobile communication module (e.g., 3-generation mobile communication module, 3.5-generation mobile communication module, 4-generation mobile communication module, etc.), a digital broadcasting module (e.g., DMB module), and a near field communication module.
The audio processor 150 inputs and outputs an audio signal (e.g., voice data) for voice recognition, voice recording, digital recording, and call operations. The audio processor 150 receives an audio signal from the controller 160, converts the received audio signal into an analog signal, amplifies the analog signal, and outputs the amplified analog signal through the speaker SPK. The audio processor 150 converts an audio signal received from the microphone MIC into digital data, and provides the converted digital signal to the controller 160. The speaker SPK converts an audio signal received from the audio processor 150 into a sound wave and outputs the sound wave. The MIC converts the sound wave received from a person or other sound source into the audio signal.
The controller 160 controls overall operations and signal flows between internal constituent elements of the mobile terminal 100, processes data, and controls supply of power from a battery to the constituent elements. The controller 160 includes at least one CPU. As is generally known in the art, the CPU is a core control unit of a computer system and performs computation and comparison of data, and interpretation and execution of commands. The CPU includes various registers which temporarily store data and commands. The controller 160 may include at least one Graphic Processing Unit (GPU). The GPU is a graphic control unit which, in place of the CPU, performs computation and comparison of data associated with graphics, and interpretation and execution of commands. Each of the CPU and the GPU may be configured by integrating at least two independent cores (e.g., quad-core) into one package as a single IC. The CPU and the GPU may be implemented as a System on Chip (SoC). The CPU and the GPU may be a package of a multi-layer structure. A configuration including the CPU and the GPU may be referred to as an Application Processor (AP).
The GPU of the controller 160 deforms a page mesh in response to a gesture (e.g., drag) of a touch input device, and generates an animation by reflecting a page on the deformed page mesh. The GPU receives information associated with a touch gesture from the touch IC 112. The GPU deforms the page mesh using the received information. If a touch of the touch input device is released from a screen, the GPU restores the page mesh to an original state. That is, the deformed page mesh is restored to an original state by the elastic force of links and gravity applied to each node. The GPU accesses the secondary memory 130 to read a page therefrom. The GPU reflects deformation information of the page mesh on the read page to generate the animation. The deformation information of the page mesh includes coordinates (x, y, z) of each node constituting the page mesh. In addition, the GPU controls the display unit 110 to display the animation. The animation may be generated by a CPU or an application processor (AP).
The controller 160 includes a main memory, for example, a RAM. The main memory may store various programs, for example, a booting program, a host operating system, guest operating systems, and applications loaded from the secondary memory 130. The CPUs and GPUs of the controller 160 access the foregoing programs to decode commands of the programs and execute functions (e.g., generation of an animation) according to the interpretation results. In addition, the controller 160 temporarily stores data to be written in the secondary memory 130 and temporarily stores data read out from the secondary memory 130. A cache memory may be further provided as a temporary data store.
The pen 170 is an accessory of the mobile terminal 100 which can be separated from the mobile terminal 100, and may include, for example, a penholder, a rib disposed at an end of the penholder, a coil disposed inside the penholder adjacent to the rib to generate a magnetic field, and a button which varies the magnetic field. The coil of the pen 170 forms the magnetic field around the rib. A pen touch panel of the touch panel 111 detects the magnetic field, and generates a touch event corresponding to the magnetic field.
According to exemplary embodiments, the mobile terminal 100 may further include constituent elements which are not described above, such as a Global Positioning System (GPS) module, a vibration motor, and an acceleration sensor.
FIGS. 2A and 2B are diagrams illustrating a page mesh according to exemplary embodiments. Referring to FIG. 2A, the controller 160, and particularly, a CPU of the controller 160, configures a page mesh. The page mesh includes a plurality of nodes and a plurality of links connecting the nodes with each other. In FIG. 2A, reference numeral 210 represents a plurality of nodes, and reference numeral 220 represents a plurality of links. As shown in FIG. 2A, the plurality of nodes may be arranged in a matrix pattern, and locations thereof may be represented by XY coordinates. Further, as described above, a suitable weight is allocated to each node, and a suitable elastic value is allocated to each link (spring). A relatively heavy weight may be allocated to nodes which are located at a center 230 of the page mesh. A relatively light weight may be allocated to outer nodes located relatively far away from the center 230 as compared with nodes located near the center 230. Accordingly, the movement of the nodes becomes gradually lighter in an outward direction; that is, nodes react gradually more sensitively to a touch gesture in the direction towards an outer node. When the page is skipped (turned), nodes located at a central axis (Y axis) 230 are fixed, which is different from other nodes.
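Reusing the MeshNode type from the earlier simulation sketch, a mesh of this kind might be constructed as follows. The grid size, the weight falloff toward the outer edge, and treating the spine as column 0 are illustrative assumptions.

```java
/** Sketch of page-mesh construction; reuses MeshNode from the earlier sketch. */
class MeshBuilder {
    /** Builds a rows x cols grid; cols and rows are assumed to be at least 2. */
    static MeshNode[][] build(int cols, int rows, double pageWidth, double pageHeight) {
        MeshNode[][] grid = new MeshNode[rows][cols];
        for (int r = 0; r < rows; r++) {
            for (int c = 0; c < cols; c++) {
                MeshNode n = new MeshNode();
                n.x = c * pageWidth / (cols - 1);
                n.y = r * pageHeight / (rows - 1);
                n.mass = 1.0 + 2.0 * (1.0 - (double) c / (cols - 1)); // heavier near the spine
                n.fixed = (c == 0); // spine column stays fixed when the page turns
                grid[r][c] = n;
            }
        }
        return grid;
    }
}
```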
Alternatively, the same weight may be allocated to all nodes. As such, the entire movement of the page mesh may be heavier than a previous case in which different weights are allocated to the nodes. That is, a deformation degree of the page may be changed according to attribute information (e.g., thickness, weight, material) set to a corresponding page. Further, the deformation degree of the page may be changed according to a calculated gradient. When an artificial force (that is, gesture of touch input device) is applied to the page, the controller 160, particularly, a GPU of the controller 160, detects a gesture, deforms the page mesh in response to the detected gesture, and reflects the deformed page mesh on the page to generate an animation. In detail, referring to FIG. 2B, the user touches a right bottom peripheral point 240 of a page by a touch input device (e.g., finger, pen). Then, the controller 160 detects a target node touched by the touch input device. Next, the user moves the touch input device from the right bottom peripheral point 240 to a left direction. Accordingly, the controller 160 moves the target node to the left direction on an XY plane according to movement of the touch input device. That is, the target node is moved in a direction which is vertical to gravity.
The controller 160 calculates displacement of the moved target node. The displacement may be represented as a vector having a size and a direction. In detail, the size of the displacement includes at least one of a current location of the target node, a moved distance of the target node, and speed of the target node. For example, the size of the displacement may include only a current location of the target node, only a moved distance of the target node, only speed of the target node, or a combination of the current location of the target node, the moved distance of the target node, and the speed of the target node. The controller 160 deforms the page mesh according to the computed gradient, and reflects the deformed page mesh on the page to generate the animation. The controller 160 calculates forces applied to each node using the calculated displacement. The force is a vector having a size and a direction. As described above, the force is a sum of an elastic force, gravity, and an artificial force. The controller 160 calculates location values of respective nodes using the calculated forces. As shown in FIG. 2B, the controller 160 generates the animation (deformed page) as shown in FIG. 2B using the calculated location values. Meanwhile, the controller 160 may move the target node in a direction which is vertical to gravity. That is, a value of a Z axis may vary or be '0' according to variations in a value of an X axis and a value of a Y axis of the target node. The controller 160 fixes a node located at a central axis 230 so that the node located at the central axis 230 functions in a different fashion from other nodes. As such, exemplary embodiments achieve a realistic effect of the user pushing and moving a page of a paper book in a real-time manner. Accordingly, as shown in FIG. 2B, the deformed page has a convex shape. As described above referring to FIGs. 2A and 2B, the page mesh may be variously deformed in a realistic fashion according to the touch coordinates, a moving direction, and moving speed of the touch input device. Accordingly, the user may experience a feeling of interacting with pages of a real paper book according to the exemplary embodiments.
FIG. 3 is a flowchart illustrating a page display method according to an exemplary embodiment.
Referring to FIG. 3, a controller 160 may initially be in an idle state. For example, a display unit 110 displays a home image including a plurality of icons. The controller 160 detects a tap of the touch input device with respect to the icon. The controller 160 executes a corresponding application (e.g., loads pages of a corresponding application to a main memory) in response to the tap (operation 301). For example, the executed application may include a gallery, an address book, a video player, an electronic book viewer, a music player, or a web browser. The controller 160 selects at least one page (e.g., most recently displayed page before a corresponding application is terminated) of pages of the executed application, and controls the display unit 110 to display the selected page (operation 302).
The controller 160 determines whether a touch is detected (operation 303). When the touch is not detected, the controller 160 determines whether a threshold time elapses (operation 304). The threshold time is set to automatically turn-off the screen. If the touch is not detected by the time the threshold time elapses, the controller 160 turns-off the screen (operation 305). The threshold time may be set to many different values, e.g., 1 minute, which may be changed according to selection of the user. When the touch is detected, the controller 160 determines whether the touch input device is moved (e.g., drag, flick) (operation 306). When the touch input device is moved, the controller 160 controls the display unit 110 to display a convexly deformed page in response to the movement of the touch input device (operation 307). That is, the controller 160 deforms the page mesh in response to the movement of the touch input device, and reflects the deformed page mesh on the page to generate the animation. A detailed process of step 307 will be described with reference to FIG. 4.
The controller 160 determines whether the touch of the touch input device is released from the screen (operation 308). If the touch of the touch input device is maintained without releasing the touch, the process returns to operation 306. Conversely, if the touch input device is touch-released, the process proceeds to operation 309. The controller 160 determines whether the touch release is an event corresponding to a page skip (operation 309). That is, the controller 160 determines whether the page skip is generated based on at least one of a moving direction of the touch input device, a touch coordinate, and a speed before generation of the touch release. When the page skip is generated, the controller 160 controls the display unit 110 to skip a currently displayed page and to display another page (operation 310). When the page skip is not generated, the controller 160 maintains the displaying of the current page (operation 311). Next, the controller 160 determines whether execution of the application is terminated (operation 312). When the execution of the application is not terminated, the process returns to operation 303.
FIG. 4 is a flowchart illustrating a page deforming method according to an exemplary embodiment. Referring to FIG. 4, a controller 160 detects a target node touched by a touch input device. Further, the controller 160 detects a moving direction of the touch input device. The controller 160 moves the target node in the moving direction of the touch input device (operation 401). Particularly, the controller 160 may move the target node in a direction vertical to a gravity direction. The controller 160 may move the target node at a determined gradient (e.g., −30° to +30°) based on the gravity direction. Next, the controller 160 calculates displacement of the moved target node (operation 402).
The displacement is represented by a vector having a size and a direction. In detail, the size of the displacement may include at least one of a current location of the target node, a moved distance of the target node, and speed of the target node. For example, the size of the displacement may include only a current location of the target node, only a moved distance of the target node, only a speed of the target node, or a combination of the current location of the target node, the moved distance of the target node, and the speed of the target node.
After calculating the displacement, the controller 160 calculates forces applied to each node using the calculated displacement of the target node; that is, the controller 160 calculates a magnitude of the forces applied to each node and a direction in which the forces are applied (operation 403). Such calculation of forces is generally known in the art. Next, the controller 160 applies the calculated forces to each node to deform a page mesh; that is, the controller 160 calculates location values of the respective nodes using the calculated forces (operation 404). Finally, the controller 160 applies the deformed page mesh to a page to generate an animation (operation 405). The generated animation is displayed such that the page is convexly deformed as the target node is moved in a direction vertical to gravity or in a determined gradient direction.
If the touch of the touch input device is released from the deformed page, the page is restored to an original state, that is, an open state. In this case, the page may be skipped, or may not be skipped and instead returned to an original position. Such a result is determined by the forces applied to the respective nodes of the page mesh. That is, if the artificial force disappears, only the elastic force and gravity are applied to the page mesh. The controller 160 calculates a sum of the forces applied to the respective nodes of the page mesh. The controller 160 determines a moved direction of the page based on the sum of the forces. The controller 160 moves the page along the determined direction. For example, the page is moved in a direction towards which a mass center of the page mesh faces. The moving direction of the page may also be determined as the moving direction just before the touch input device is separated from the screen (that is, the page). A detailed example will be described with reference to FIG. 5.
FIG. 5 is a flowchart illustrating a page skipping method according to an exemplary embodiment.
Referring to FIG. 5, a display unit 110 displays a page, and a touch input device of a user touches the displayed page (operation 501). While the touch input device touches the page, the controller 160 detects a current touch coordinate (x, y) (operation 502). It is assumed that the X axis is a horizontal axis based on a viewpoint at which the user views a screen. It is assumed that two pages are displayed, one page at a left side and the other page at a right side, based on a center line of a screen. Further, a right direction is a positive direction of the X axis, and a left direction is a negative direction of the X axis. Under these assumptions, the controller 160 determines whether "|x − old_x| > th" is satisfied (operation 503). The value "x" is a current touch coordinate, the value "old_x" is a previous touch coordinate, and "th" is a preset threshold value. For example, "th" may be set to 5 mm, although it is not limited thereto. When "|x − old_x| > th" is not satisfied, the process proceeds to operation 508. Conversely, when "|x − old_x| > th" is satisfied, that is, when a difference between the current touch coordinate and the previous touch coordinate exceeds the threshold value, the process proceeds to operation 504.
The controller 160 determines whether the current touch coordinate is greater than the previous touch coordinate (operation 504). When the current touch coordinate is greater than the previous touch coordinate, the controller 160 determines a moving direction of the touch input device as a 'right direction' (operation 505). When the current touch coordinate is less than or equal to the previous touch coordinate, the controller 160 determines the touch direction as a 'left direction' (operation 506). After the moving direction is determined, the controller 160 sets the current touch coordinate to the previous touch coordinate (operation 507). Next, the controller 160 determines whether a touch of the touch input device is released from the screen (operation 508). When the touch of the touch input device is not released, the process returns to operation 502. Conversely, when the touch of the touch input device is released, the controller 160 determines whether the determined touch direction is a right direction (operation 509). When the touch direction is the right direction, the controller 160 moves the touched page to the right direction (operation 510). If the touched page is a left page, operation 510 corresponds to an operation of skipping the page to a previous page. When the touched page is a right page, operation 510 corresponds to an operation of maintaining the displaying of the touched page without skipping the page to the next page. When the touch direction is the left direction, the controller 160 moves the touched page to the left direction (operation 511). If the touched page is the left page, operation 511 corresponds to an operation of maintaining the displaying of the touched page without skipping the page back. Conversely, if the touched page is the right page, step 511 corresponds to an operation of skipping the page to a next page.
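The direction-tracking loop of FIG. 5 can be summarized in a small state holder. The sketch below follows the operations described above (update the direction only when |x − old_x| exceeds th, then move the page in the last determined direction on release); the class and method names are illustrative assumptions.

```java
/** Sketch of the page-skipping logic of FIG. 5; names are illustrative. */
class SkipTracker {
    enum Direction { LEFT, RIGHT }

    private final double thresholdPx; // "th", e.g., 5 mm converted to pixels
    private double oldX;
    private Direction direction = Direction.LEFT;

    SkipTracker(double initialX, double thresholdPx) {
        this.oldX = initialX;
        this.thresholdPx = thresholdPx;
    }

    /** Operations 502-507: update the moving direction while the touch moves. */
    void onMove(double x) {
        if (Math.abs(x - oldX) > thresholdPx) {
            direction = (x > oldX) ? Direction.RIGHT : Direction.LEFT;
            oldX = x;
        }
    }

    /** Operations 508-511: a touched right page skips to the next page when moved left. */
    boolean skipsToNextPage(boolean touchedRightPage) {
        return touchedRightPage && direction == Direction.LEFT;
    }
}
```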
FIG. 6 is a flowchart illustrating page setting according to an exemplary embodiment. Referring to FIG. 6, a controller 160 may control a display unit 110 to display a home image (operation 620). The home image includes an icon corresponding to an environment setting. The controller 160 detects a touch of an icon corresponding to the environment setting (operation 621). Then, the controller 160 controls the display unit 110 to display an environment setting image of the mobile terminal 100 (operation 622). The controller 160 sets an environment of the mobile terminal, particularly, an environment with respect to the page, according to an operation of a user with respect to the screen (operation 623). Preset values associated with the page are stored in the secondary memory 130. When the page stored in the secondary memory 130 is displayed, the preset values are used by the controller 160.
FIG. 7A is an exemplary screen diagram for setting an environment of a mobile terminal according to an exemplary embodiment. Referring to FIG. 7A, the display unit 110 displays an environment setting image 730 under the control of the controller 160. The environment setting image 730 includes items such as a wireless network 731, a location service 732, a sound 733, a display 734, a security 735, and a page setting 736. The user touches the page setting 736 among the foregoing items. Then, the controller 160 controls the display unit 110 to display a page setting image for setting the environment of the page.
FIG. 7B is an exemplary screen diagram for setting an environment of a page. Referring to FIG. 7B, the display unit 110 displays a page setting screen 740 under the control of the controller 160. The page setting screen 740 includes items such as a page thickness/material 741, a touch gesture change 742, an allowable gradient range 743, a feedback 744, and a screen change time 745. As illustrated in Table 1, the page thickness/material 741 may be, for example, 75 g/m2 and a printing page. The page thickness/material 741 is set by a manufacturer of an electronic book, and may not be changed by the user. The touch gesture change 742 is an item which changes the touch gesture allowed for skipping the page. For example, the touch gesture allowed for a page skip may be changed from a flick to a drag, or vice versa. The allowable gradient range 743, within which the target node is movable, may be in the range of -30° to +30°. The feedback 744 is an item for determining the feedback to be provided to the user when the page is skipped. For example, the user may set a vibration effect and a sound effect as the feedback. The screen change time 745 may be set to, for example, 0.5 seconds.
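The items of the page setting screen 740 amount to a handful of preset values stored per page. A sketch of how they might be recorded follows; the field names are assumptions, and only the example values come from the text:

```python
# Hypothetical record of the FIG. 7B page-setting items; field names are
# assumptions, and the defaults are the example values given in the text.
from dataclasses import dataclass

@dataclass
class PageSettings:
    thickness_material: str = "75 g/m2, printing page"  # item 741, fixed by maker
    page_turn_gesture: str = "flick"                    # item 742, or "drag"
    gradient_range_deg: tuple = (-30, 30)               # item 743
    feedback: tuple = ("vibration", "sound")            # item 744
    screen_change_time_s: float = 0.5                   # item 745
```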
Hereinafter, exemplary embodiments will be described with reference to exemplary screen diagrams. A display mode of a screen according to the exemplary embodiments may be divided into a landscape mode and a portrait mode. In the landscape mode, the mobile terminal 100 displays two pages, one at the left side and one at the right side. In the portrait mode, the mobile terminal displays one page. However, the exemplary embodiments are not limited thereto. If the user rotates the mobile terminal 100, a sensor (e.g., an acceleration sensor) included in the mobile terminal 100 detects rotation information and transfers the rotation information to the controller 160. The controller 160 may determine the display mode of the mobile terminal 100 using the rotation information. The exemplary embodiments may use either the landscape mode or the portrait mode as the display mode.
FIGS. 8A to 23 are exemplary screen diagrams illustrating a page display method according to an exemplary embodiment. As described above, the controller 160 moves a target node to convexly deform the page. Although the deformed page is always convex, the concrete shape of the page changes according to touch information (e.g., touched location, moving direction, moving distance, and speed). As used herein, according to exemplary embodiments, the term 'contents' may refer to, for example, photographs, videos, audio, images, calendars, contact points, memos, documents, e-books, web pages, thumbnails, and icons. The contents are displayed for each page. Contents for which the number of pages may vary include photographs, videos, audio, images, contact points, memos, documents, thumbnails, and icons. Contents for which the number of pages does not vary include calendars, e-books, and web pages.
FIGS. 8A to 16 illustrate screens on which pages of a gallery are displayed. A plurality of icons (e.g., 12 icons, as illustrated in the figures) is included in each page. The number of icons displayed for each page may be changed by a user. Each icon corresponds to a photograph or a moving image. For example, when the user taps an icon, the corresponding photograph (or moving image) is displayed on the screen. Further, the page may include other contents, for example, photographs or thumbnails instead of icons. In addition, the page may include various types of contents (e.g., icons, photographs, thumbnails).
Referring to FIG. 8A, the user touches the touch input device on a right bottom peripheral point 810 of a right page (e.g., page 4/11) of the gallery. Then, the controller 160 detects the target node (or touch coordinate) corresponding to the right bottom peripheral point 810. The user moves the touch input device in a left bottom direction in a state in which the right bottom peripheral point 810 is touched. Then, the controller 160 moves the target node in a left direction from the right bottom area. The controller 160 calculates the displacement of the moved target node. In detail, the controller 160 calculates a current location value of the target node (that is, a current touch coordinate of the touch input device) and a moving speed and direction of the target node. The controller 160 calculates the forces applied to each node using the calculated displacement. The controller 160 calculates location values (coordinates) of the respective nodes using the calculated forces. The controller 160 generates the animation using the calculated location values. In addition, the controller 160 controls the display unit 110 to display the generated animation. FIG. 8A illustrates an animation (that is, a deformed shape of the page) when the touch input device moves from the right bottom peripheral point 810 in a left direction from the right bottom area and is located at a first lower side point 820. As shown in FIG. 8A, the page is greatly deformed and becomes convex in the moving direction (810 -> 820) of the target node. The corner region 815 containing the target node approaches the spine 840 more closely than the opposite corner region 830.
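The calculation chain described above — target-node displacement, forces on each node, new node coordinates, animation frames — can be illustrated with one step of a mass-spring update. This is a sketch under the assumption that the page mesh behaves as a spring network; the constants, data layout, and integration scheme are illustrative, not from the source:

```python
# One illustrative animation step: the dragged target node follows the touch,
# spring forces propagate the displacement, and the remaining nodes are
# integrated to their new coordinates.
import math

def step_mesh(nodes, springs, target_idx, touch_xy, k=0.5, damping=0.85, dt=1.0):
    """nodes: list of {"pos": [x, y], "vel": [vx, vy]};
    springs: list of (i, j, rest_length) tuples."""
    nodes[target_idx]["pos"] = list(touch_xy)      # target node tracks the touch
    forces = [[0.0, 0.0] for _ in nodes]
    for i, j, rest in springs:
        dx = nodes[j]["pos"][0] - nodes[i]["pos"][0]
        dy = nodes[j]["pos"][1] - nodes[i]["pos"][1]
        dist = math.hypot(dx, dy) or 1e-9
        f = k * (dist - rest)                      # Hooke's law along the spring
        fx, fy = f * dx / dist, f * dy / dist
        forces[i][0] += fx; forces[i][1] += fy
        forces[j][0] -= fx; forces[j][1] -= fy
    for idx, n in enumerate(nodes):
        if idx == target_idx:
            continue                               # dragged node is position-driven
        n["vel"][0] = (n["vel"][0] + forces[idx][0] * dt) * damping
        n["vel"][1] = (n["vel"][1] + forces[idx][1] * dt) * damping
        n["pos"][0] += n["vel"][0] * dt
        n["pos"][1] += n["vel"][1] * dt
    return nodes

# Rendering the node coordinates after each step yields the page-deformation
# animation that the display unit shows.
```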
Referring to FIG. 8B, the user moves the touch input device from the first lower side point 820 towards a left bottom periphery. Then, the controller 160 generates the animation, and controls the display unit 110 to display the generated animation. That is, FIG. 8B illustrates an animation in which the touch input device is located at a second lower side point 850. Upon comparison of FIG. 8B with FIG. 8A, the page of FIG. 8B is more convex than the page of FIG. 8A. Accordingly, if the touch of the touch input device is released, the page of FIG. 8A is not skipped but the page of FIG. 8B may be skipped. In other words, in the case shown in FIG. 8A, when the user releases the touch from the first lower side point 820, the direction of the force (that is, the mass center) may act in a rightward direction. Accordingly, the page is not skipped but instead returns to its original position. In the case shown in FIG. 8B, when the user releases the touch from the second lower side point 850, the direction of the force may act in a leftward direction. Accordingly, the page may be skipped to the opposite side. As a result, the direction of the mass center may be associated with the current touch coordinate. Also, even in the case shown in FIG. 8A, there may be a skip condition. For example, the page skip may be determined according to the speed at which the touch input device is moved from the right bottom peripheral point 810 to the first lower side point 820. When the touch of the touch input device is released after the touch input device is moved at a speed greater than or equal to 30 cm/sec, the page may be skipped. When the speed is less than 30 cm/sec, the page may not be skipped. Many other page skip conditions may also be implemented.
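The release-time decision contrasted here — the direction of the mass center, with drag speed as an alternative trigger — might be sketched as follows. Reducing the mass center to a single x coordinate and the helper names are assumptions:

```python
# Hypothetical skip test for a right page being dragged toward the left side.
SKIP_SPEED = 30.0  # cm/sec, the example threshold from the text

def should_skip(node_xs, center_x, release_speed):
    """node_xs: x coordinates of the page-mesh nodes at touch release."""
    mass_center_x = sum(node_xs) / len(node_xs)
    if mass_center_x < center_x:      # force acts leftward: page falls over
        return True
    return release_speed >= SKIP_SPEED  # a fast flick may skip the page anyway
```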
Referring to FIGS. 8B to 8D, the user may move the touch input device from the second lower side point 850 to a left side while the touch is continuously maintained. That is, the user moves the touch input device to a first left point 860 through the central line. Then, the controller 160 controls the display unit 110 to display a part 870 of a rear surface (e.g., page 5) of the currently operated page. If the user releases the touch from the first left point 860, as shown in FIG. 8D, the controller 160 displays the entire rear surface on the left side. In this manner, when the touch input device is moved from the right side to the left side through the central line, the rear surface of the page is displayed. When the touch is released after passing through the central line, the page is skipped. Even when the touch input device does not pass through the central line, the rear surface of the currently operated page may be displayed. For example, if the touch input device approaches the central line within a preset threshold value (e.g., 10 mm from the central line), the controller 160 may control the display unit 110 to display the rear surface. The threshold for displaying the rear surface may be set to values other than 10 mm.
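The rear-surface rule reduces to a one-line proximity test against the central line, sketched here with assumed millimeter units and names:

```python
# Show the back of a right page once the touch crosses the central line or
# comes within the preset threshold (10 mm in the example) of it.
REAR_THRESHOLD = 10.0  # mm; other values may be set

def show_rear_surface(touch_x, center_x):
    return touch_x <= center_x + REAR_THRESHOLD
```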
Hereinafter, other exemplary screens according to exemplary embodiments are described; descriptions repetitive of FIGS. 8A to 8D are omitted. Referring to FIG. 9, the user touches the touch input device at a right bottom peripheral point 910 of a page (e.g., page 4/11) and moves the touch input device from the right bottom peripheral point 910 towards a left top area. Then, the controller 160 generates an animation and controls the display unit 110 to display the generated animation. That is, FIG. 9 illustrates an animation when the touch input device is moved from the right bottom peripheral point 910 towards the left top periphery and is located at a third bottom side point 920. Upon comparison of FIG. 9 with FIG. 8A, the touch input devices in FIG. 8A and FIG. 9 start from the same right bottom peripheral point, but the moving directions thereof are different from each other. Accordingly, the shapes of the deformed pages are different from each other. Further, when the user releases the touch, the page in FIG. 8A is not skipped but the page in FIG. 9 may be skipped to the left side. The touches shown in FIGS. 8A and 9 start from the same right bottom periphery of the page. However, the moving direction in the case shown in FIG. 8A is along the periphery, whereas the moving direction in the case shown in FIG. 9 is towards the center of the page. Accordingly, in the case shown in FIG. 8A, the lower portion of the page has a mass center toward the left side, and the upper portion of the page has a mass center toward the right side. In this case, the overall mass center may be toward the right side. As a result, the page is not skipped. Conversely, in the case of FIG. 9, since the moving direction of the touch is towards the center of the page rather than along the periphery, both the upper and lower portions of the page have a mass center toward the left side. As a result, the page is skipped. Accordingly, the direction of the mass center of the page may be associated with the moving direction of the touch input device as well as the current touch coordinate and speed of the touch input device.
Referring to FIGS. 10A and 10B, the user touches the touch input device at a right side point 1010 of the center of the page (e.g., page 4/11), and moves the touch input device to the opposite (left) side. That is, FIG. 10A illustrates an animation where the touch input device is moved from the right side point 1010 to the left side and is located at a central point 1020. As shown in FIG. 10A, when the user moves the touch input device to the left side after touching the right side point 1010 of the center of the page, the upper and lower portions of the page may be symmetrically and uniformly deformed. As shown in FIG. 10B, the user moves the touch input device from the central point 1020 further in the leftward direction to a left side point 1030. That is, FIG. 10B illustrates an animation where the touch input device is located at the left side point 1030. As shown by a comparison of FIG. 10B with FIG. 10A, the page shown in FIG. 10B is more convex than the page shown in FIG. 10A. Accordingly, when the user releases the touch, the page of FIG. 10A is not skipped, but the page of FIG. 10B may be skipped. As shown by a comparison of FIG. 10A with FIG. 8A, the moving directions of the touch input devices in FIGS. 8A and 10A are the same (towards the left side), but the initial touch coordinates thereof are different from each other. Accordingly, the shapes of the deformed pages are different from each other. As shown by a comparison of FIG. 10B with FIG. 8B, if the user releases the touch, the page of FIG. 8B is not skipped but the page of FIG. 10B may be skipped. The touches of FIGS. 8B and 10B start from the right side of the page. However, the touch of FIG. 10B starts from the center, and the touch of FIG. 8B starts below the center. Accordingly, in the case shown in FIG. 10B, both the upper and lower portions of the page may have a mass center toward the left side. As a result, the page is skipped. In contrast, in the case shown in FIG. 8B, the lower portion of the page has a mass center toward the left side but the upper portion of the page may have a mass center toward the right side. In this case, if the overall mass center of the page is toward the right side, the page may not be skipped. As a result, the direction of the mass center of the page may be associated with the initial touch coordinate as well as the current touch coordinate and the moving direction of the touch input device.
Referring to FIGS. 11A and 11B, the user touches the touch input device at a right top peripheral point 1110 of a page (e.g., page 4/11), and moves the touch input device from that point along the top periphery toward the left. That is, FIG. 11A illustrates an animation where the touch input device is moved from the right top peripheral point 1110 towards the left top periphery and is located at a first top side point 1120. As shown in FIG. 11B, the user moves the touch input device further from the first top side point 1120 along the top periphery in the left direction. That is, FIG. 11B illustrates an animation where the touch input device is located at a second top side point 1130.
Referring to FIG. 12, the user touches the touch input device at a right top peripheral point 1210 of a page (e.g., page 4/11), and moves the touch input device from the right top peripheral point 1210 towards a left bottom periphery. That is, FIG. 12 illustrates an animation where the touch input device is moved from a right top peripheral point 1210 towards a left bottom periphery and is located at a top side point 1220. As shown by a comparison of FIG. 12 with FIG. 11A, touch input devices in FIGS. 11A and 12 start from a right top peripheral point but moving directions thereof are different from each other. Accordingly, shapes of deformed pages are different from each other.
Next, FIG. 13 illustrates an animation where the touch input device is moved from a first left bottom peripheral point 1310 of the page (e.g., page 4/11) to the left side and is located at a second left bottom peripheral point 1320. FIG. 14 illustrates an animation where the touch input device is moved from a first left side point 1410 to the left side and is located at a second left side point 1420. FIG. 15 illustrates an animation where the touch input device is moved from a first left top peripheral point 1510 and is located at a second left top peripheral point 1520.
The user may touch any part of the page, not only the touch coordinates described in FIGS. 8A to 15. Accordingly, it is understood that the deformation of the page may change according to the touch coordinate, the moving direction, and the speed.
Referring to FIG. 16, the display mode is set to a portrait mode. The display unit 110 may display one page in the portrait mode. The user touches the touch input device at a right bottom peripheral point 1610 of a page (e.g., page 4/11). Then, the controller 160 detects a target node corresponding to the right bottom peripheral point 1610. The user moves the touch input device towards a left bottom periphery in a state in which the right bottom peripheral point 1610 is touched. Then, the controller 160 moves the target node towards the left bottom periphery. Further, the controller 160 calculates the displacement of the moved target node. In detail, the controller 160 calculates a current location, moving speed, and moving direction of the target node. The controller 160 calculates the forces applied to the respective nodes using the calculated displacement of the target node. The controller 160 calculates the location values of the respective nodes using the calculated forces. The controller 160 generates an animation using the calculated location values of the nodes. The controller 160 controls the display unit 110 to display the generated animation. As described above, FIG. 16 illustrates an animation in which the touch input device is moved from the right bottom peripheral point 1610 towards the left bottom periphery and is located at a bottom side point 1620. If the touch input device approaches the left side within a preset threshold value (e.g., 10 mm from the left side of the screen), the controller 160 skips the page and controls the display unit 110 to display a next page (e.g., page 5/11).
FIG. 17 illustrates a screen displaying a page of a music player, the number of pages of the music player being changeable. For example, one page includes a plurality of icons (e.g., 12 icons as illustrated in FIG. 17). The number of icons displayed per page may be changed by the user. Referring to FIG. 17, the user touches the touch input device at a right bottom peripheral point 1710 of a right page (e.g., page 4/11), and moves the touch input device from the right bottom peripheral point 1710 towards a left bottom periphery. Then, the controller 160 generates an animation and controls the display unit 110 to display the generated animation. FIG. 17 illustrates an animation (that is, a shape in which a page of the music player is convexly deformed) in which the touch input device is moved from the right bottom peripheral point 1710 towards the left bottom periphery and is located at a bottom side point 1720. The icons may be displayed in many different ways, e.g., in a list form, and are not limited to being displayed in a grid form as illustrated in FIG. 17. When the user taps an icon, the corresponding music is played. According to exemplary embodiments, the page may include a music folder, a thumbnail, or a music playback image. If the user taps the music folder, a corresponding music playback image is displayed. The music playback image includes an album cover photograph, a playback button, and a pause button. If the user taps the playback button, music included in the corresponding music folder may be sequentially or randomly played.
FIG. 18 illustrates a screen which displays a page of a video player, the number of pages of the video player being changeable. Referring to FIG. 18, in response to movement 1810 of the touch input device, a page of the video player is convexly deformed from the touched page (e.g., page 4/11). The page of the video player includes a plurality of icons (e.g., 12 icons). The number of icons displayed per page may be changed by the user. If the user taps one of the icons, the corresponding video is played. According to exemplary embodiments, the music player and the video player may be integrated with each other. That is, music icons and video icons may be displayed on one page. In such an integrated configuration, the page may also be convexly deformed.
FIG. 19 illustrates a screen displaying a page of an address book, the number of pages of the address book being changeable. Referring to FIG. 19, in response to movement 1910 of the touch input device, a page of the address book is convexly deformed from the touched page (e.g., page 4/11). The page of the address book includes a plurality of pieces of contact point information. As shown in FIG. 19, the contact point information may be displayed in the form of a list, although it is not limited thereto. The contact point information may also be displayed in another form, for example, in the form of a grid. If the user taps a piece of contact point information, corresponding detailed information (e.g., phone numbers, e-mail addresses, home addresses, office addresses) is displayed. According to exemplary embodiments, the page of the address book may also be displayed in a portrait mode as illustrated in FIG. 16, rather than the landscape mode.
FIG. 20 illustrates a screen displaying a page of a memo note, the number of pages of the memo note being changeable. Referring to FIG. 20, in response to movement 2010 of the touch input device, a page of the memo note is convexly deformed from the touched page (e.g., page 4/11).
FIG. 21 illustrates a screen displaying a page of a calendar, the number of pages of the calendar being fixed. Referring to FIG. 21, in response to movement 2110 of the touch input device, a page (e.g., February 2012) of the calendar is convexly deformed from the touched page (e.g., page 4/11). FIG. 22 illustrates a screen displaying a page of a web browser, the number of pages of the web browser being fixed. Referring to FIG. 22, in response to movement 2210 of the touch input device, the page of the web browser is convexly deformed from the touched page (e.g., page 4/11).
FIG. 23 illustrates a screen displaying a page of an electronic book, the number of pages of the electronic book being fixed. Referring to FIG. 23, in response to movement 2310 of the touch input device, the page of the electronic book is convexly deformed from the touched page (e.g., page 4/11).
FIG. 24 is a flowchart illustrating a page editing method according to an exemplary embodiment.
FIGS. 25A to 26 are diagrams of screens illustrating a page editing method according to an exemplary embodiment.
Referring to FIG. 24, the controller 160 detects an event requesting the display of a page (operation 2410). For example, the request event is a tap of a touch input device on an application icon. If the request event is detected, the controller 160 determines a page to be displayed among the stored pages of the corresponding application (operation 2420). For example, the controller 160 may determine, as the page to be displayed, the page that was most recently displayed before the corresponding application was terminated. The controller 160 controls the display unit 110 to display the determined page and additional information thereof (operation 2430). For example, referring to FIG. 25A, the additional information includes a number 2510 of the corresponding page and a total number 2520 of pages. The controller 160 detects an event requesting editing of the page (operation 2440). For example, referring to FIG. 25A, the request event is a tap of the touch input device on a delete button 2530. The controller 160 reconfigures the pages in response to the editing request event and stores the reconfigured pages in the secondary memory 130 (operation 2450). Further, the controller 160 controls the display unit 110 to display at least one of the reconfigured pages and additional information thereof (operation 2460). For example, referring to FIGS. 25A and 25B, the controller 160 reconfigures the pages using the remaining contents, excluding the contents P5, P9, P11, and P19 to P24 selected by the user from the contents list, in response to a tap of the delete button 2530. Accordingly, the total number of pages is changed from 11 to 10. Further, in the examples shown in FIGS. 25A and 25B, the order of the currently displayed pages may be maintained, but it is understood that the order of currently displayed pages may be changed. According to exemplary embodiments, contents such as videos, audio, images, contact points, memos, documents, thumbnails, and icons, as well as photographs, may be edited (e.g., deleting contents from a page, adding contents to a page, or moving contents to another page).
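Operations 2450 and 2460 — rebuilding the pages from the remaining contents and refreshing the page number and total — might look like the following sketch. The 12-contents-per-page figure comes from the example of FIGS. 25A and 25B; the data layout and function names are assumptions:

```python
# Hypothetical reconfiguration after a delete request (operation 2450).
def reconfigure_pages(contents, deleted_ids, per_page=12):
    remaining = [c for c in contents if c["id"] not in deleted_ids]
    return [remaining[i:i + per_page] for i in range(0, len(remaining), per_page)]

# Additional information of operation 2460: page order and total page count.
def page_info(pages, current_index):
    return {"page": current_index + 1, "total": len(pages)}
```

For example, 121 photographs fill 11 pages of 12; deleting 9 of them leaves 112, which fill 10 pages, matching the change from 11 to 10 pages in FIGS. 25A and 25B.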
According to exemplary embodiments, the pages may be configured in the order of time. For example, when the shooting time of a first photograph is earlier than that of a second photograph, the first photograph is placed on an earlier page than the second photograph. Further, the pages may be configured by place. For example, a first page is configured with photographs shot in Seoul, and a second page is configured with photographs shot in New York. If the arrangement scheme of contents is changed from "time" to "place" or vice versa by the user, the controller 160 may reconfigure the pages, and accordingly, at least one of the order of currently displayed pages and the total number of pages may be changed.
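Switching the arrangement scheme between "time" and "place" amounts to re-sorting the contents before they are split into pages; a sketch, with assumed metadata keys:

```python
# Hypothetical re-sort when the user changes the arrangement scheme; the
# "shot_time" and "place" keys are illustrative metadata fields.
def arrange(contents, scheme="time"):
    if scheme == "time":
        return sorted(contents, key=lambda c: c["shot_time"])  # earlier shots first
    if scheme == "place":
        return sorted(contents, key=lambda c: c["place"])      # group by place
    raise ValueError(f"unknown scheme: {scheme}")
```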
If the format of the page is changed, the controller 160 reconfigures the pages, and accordingly, at least one of the order of currently displayed pages and the total number of pages may be changed. In detail, the number of contents included in one page may be changed. For example, each of the pages shown in FIGS. 25A and 25B includes 12 contents, whereas the page of FIG. 26 includes 9 contents. Accordingly, the information (e.g., a number) indicating the order of currently displayed pages is changed and the changed information is displayed, and the information indicating the total number of pages is changed and the changed information is displayed. Further, the arrangement scheme of contents in the page may be changed. That is, the contents may be arranged in the form of a grid, a list, or some other form. If the arrangement scheme is changed in this manner, the number of contents included in the page may be changed. Also, the information indicating the order of currently displayed pages may be changed and displayed, and the information indicating the total number of pages may be changed and displayed.
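Re-splitting the same contents under a new page format changes both the page a given item lands on and the total page count, as the following sketch shows (all names are illustrative):

```python
# Hypothetical repagination when the page format changes, e.g., from
# 12 contents per page (FIGS. 25A/25B) to 9 (FIG. 26).
def repaginate(contents, per_page):
    return [contents[i:i + per_page] for i in range(0, len(contents), per_page)]

def page_of(item_index, per_page):
    return item_index // per_page + 1  # 1-based page number of one item

# 112 items fill 10 pages of 12 but 13 pages of 9; the item at index 50
# moves from page 5 (12 per page) to page 6 (9 per page), so both the
# displayed page number and the displayed total change.
```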
The foregoing method for displaying contents according to exemplary embodiments may be implemented in the form of executable program commands by various computer components and may be recorded in a computer readable recording medium. According to exemplary embodiments, the computer readable recording medium may include program commands, data files, and data structures, individually or in combination. The program commands recorded in the recording medium may be specially designed or configured for the exemplary embodiments, or may be known to and usable by a person having ordinary skill in the computer software field. The computer readable recording medium may include magnetic media such as a hard disk, a floppy disk, or a magnetic tape; optical media such as a Compact Disc Read Only Memory (CD-ROM) or a Digital Versatile Disc (DVD); magneto-optical media such as a floptical disk; and hardware devices, such as a ROM, a RAM, and a flash memory, that store and execute program commands. Further, the program commands may include machine language code created by a compiler and high-level language code executable by a computer using an interpreter. The foregoing hardware device may be configured to operate as at least one software module to perform operations according to the exemplary embodiments.
As described above, the contents display method and the mobile terminal of the exemplary embodiments provide a highly realistic feeling to a user when the user operates a screen on which pages are displayed.
Although exemplary embodiments have been described in detail hereinabove, it should be clearly understood that many variations and modifications of the basic inventive concepts disclosed herein which may appear to those skilled in the present art will still fall within the spirit and scope of the exemplary embodiments, as defined in the appended claims.

Claims (20)

1. A method of displaying contents of pages displayed by a mobile terminal including a display unit in which a touch panel is installed, the method comprising:
displaying a page;
detecting movement of a touch input device with respect to the displayed page; and
displaying the page so that the page is convexly deformed and skipped in response to the movement of the touch input device.
2. The method of claim 1, further comprising:
detecting an editing request event requesting editing of the contents;
reconfiguring the page in response to the editing request event; and
displaying the reconfigured page and additional information of the reconfigured page.
3. The method of claim 2, wherein the additional information comprises information indicating a total number of pages and an order of the displayed pages.
4. The method of claim 2, wherein the editing request event comprises one of an addition request to add contents to the contents, a delete request to delete contents from the contents, a movement request to move the contents, and a change request to change a formation of the page.
5. The method of claim 4, wherein the reconfiguring of the page comprises changing at least one of a total number of the pages and an order of the displayed pages.
6. The method of claim 1, wherein the contents comprise at least one of photographs, videos, audio, images, calendars, contact points, memos, documents, e-books, web pages, thumbnails, and icons.
7. The method of claim 1, wherein the displaying of the page so that the page is convexly deformed and skipped comprises displaying a convexly deformed page corresponding to attribute information set for the page.
8. The method of claim 1, wherein the detecting of the movement of the touch input device comprises detecting at least one of a moving direction, a moving distance, and a speed of the touch input device.
9. The method of claim 1, wherein the displaying of the page so that the page is convexly deformed and skipped comprises deforming a plurality of the pages into convex forms which are different from each other and displaying the deformed pages.
10. The method of claim 1, further comprising:
determining a direction of a force applied to the convexly deformed page based on at least one of an initial touch coordinate, a current touch coordinate, a moving distance, and a moving direction of the touch input device; and
displaying the convexly deformed page such that the convexly deformed page is moved according to the determined direction of the force when a touch release of the touch input device with respect to the convexly deformed page is detected.
11. The method of claim 1, further comprising displaying the convexly deformed page so that the convexly deformed page is moved according to a moving direction of the touch input device before a touch release of the touch input device with respect to the convexly deformed page is detected.
12. A mobile terminal comprising:
a display unit in which a touch panel is installed and configured to display contents for each of a plurality of pages;
a memory configured to store the contents of the pages; and
a controller configured to control the display unit such that one of the pages is displayed, detect movement of a touch input device with respect to the displayed page, and control the display unit such that the page is displayed as convexly deformed and skipped in response to the detected movement of the touch input device.
13. The mobile terminal of claim 12, wherein the controller is further configured to detect an editing request event requesting editing of the contents, reconfigure and store the page in response to the editing request event, and control the display unit such that the reconfigured page and additional information of the reconfigured page are displayed.
14. The mobile terminal of claim 13, wherein the additional information comprises information indicating a total number of the pages and an order of the pages.
15. The mobile terminal of claim 13, wherein the editing request event comprises one of an addition request to add contents to the contents, a delete request to delete contents from the contents, a movement request to move the contents, and a change request to change a formation of the page.
16. The mobile terminal of claim 15, wherein the controller reconfigures the page by changing at least one of a total number of the pages and an order of the pages.
17. The mobile terminal of claim 12, wherein the memory is further configured to store attribute information of the page, and the controller is further configured to convexly deform the page corresponding to the attribute information.
18. The mobile terminal of claim 12, wherein the controller is further configured to control the display unit to display the page such that the page is moved according to a moving direction of the touch input device before a touch release of the touch input device with respect to the page is detected.
19. A non-transitory computer readable recording medium implemented in a mobile terminal including a display unit in which a touch panel is installed, the non-transitory computer readable recording medium causing the mobile terminal to perform operations comprising:
displaying a page;
detecting movement of a touch input device with respect to the displayed page; and
displaying the page so that the page is convexly deformed and skipped in response to the movement of the touch input device.
20. A method to display pages, the method comprising:
displaying a page on a device comprising a touch input unit;
generating a page mesh corresponding to the displayed page, the page mesh comprising a plurality of nodes having respective weights;
detecting movement of a touch input device with respect to the displayed page, using the touch input unit; and
changing an appearance of the page according to the detected movement and the page mesh,
wherein the changing the appearance of the page comprises convexly deforming the page.
PCT/KR2013/006221 2013-01-11 2013-07-11 Contents display method and mobile terminal implementing the same WO2014109445A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP13870559.5A EP2943867A4 (en) 2013-01-11 2013-07-11 Contents display method and mobile terminal implementing the same

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US13/739,777 2013-01-11
US13/739,777 US20130198678A1 (en) 2012-01-31 2013-01-11 Method and apparatus for displaying page in terminal
KR10-2013-0009788 2013-01-29
KR1020130009788A KR20140096780A (en) 2013-01-29 2013-01-29 Contents display method and mobile terminal implementing the same

Publications (1)

Publication Number Publication Date
WO2014109445A1 true WO2014109445A1 (en) 2014-07-17

Family

ID=51167078

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2013/006221 WO2014109445A1 (en) 2013-01-11 2013-07-11 Contents display method and mobile terminal implementing the same

Country Status (3)

Country Link
EP (1) EP2943867A4 (en)
KR (1) KR20140096780A (en)
WO (1) WO2014109445A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019027105A1 (en) * 2017-07-31 2019-02-07 (주)레터플라이 Terminal for automatically adding and deleting content editor page according to content creation range, and computer-readable recording medium
KR102220111B1 (en) * 2019-08-28 2021-02-24 허창용 Calendar Display Device having Smart Function

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SE525338C2 (en) * 2002-03-27 2005-02-01 Touch & Turn Ab Device and method for turning sheets in a digitized virtual document
JP5671921B2 (en) * 2010-10-04 2015-02-18 ソニー株式会社 Information processing apparatus, information processing method, and program
JP5666239B2 (en) * 2010-10-15 2015-02-12 シャープ株式会社 Information processing apparatus, information processing apparatus control method, program, and recording medium
JP2012150566A (en) * 2011-01-17 2012-08-09 Sharp Corp Display device, display method, computer program, and recording medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110110138A (en) * 2009-01-07 2011-10-06 마이크로소프트 코포레이션 Virtual page turn
US20110050594A1 (en) * 2009-09-02 2011-03-03 Kim John T Touch-Screen User Interface
KR20120034542A (en) * 2010-10-01 2012-04-12 삼성전자주식회사 Apparatus and method for turning e-book pages in portable terminal
KR20120103923A (en) * 2011-03-11 2012-09-20 한국과학기술원 Method, device for controlling user terminal having touch screen, recording medium for the same, and user terminal comprising the same
KR20120105695A (en) * 2011-03-16 2012-09-26 엘지전자 주식회사 Mobile terminal and method for controlling thereof

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2943867A4 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11322062B1 (en) 2021-01-06 2022-05-03 Microsoft Technology Licensing, Llc Dual display device control
WO2022150092A1 (en) * 2021-01-06 2022-07-14 Microsoft Technology Licensing, Llc Dual display device control

Also Published As

Publication number Publication date
KR20140096780A (en) 2014-08-06
EP2943867A4 (en) 2016-08-24
EP2943867A1 (en) 2015-11-18

Similar Documents

Publication Publication Date Title
US11422627B2 (en) Apparatus and method for providing haptic feedback to input unit
WO2013129857A1 (en) Method and apparatus for turning pages in terminal
CN103729159B (en) Multi-display apparatus and method of controlling display operation
WO2013115499A1 (en) Method and apparatus for displaying page in terminal
WO2013129858A1 (en) Method for displaying pages of e-book and mobile device adapted thereto
US10114539B2 (en) System and method for providing feedback associated with e-book in mobile device
WO2014129828A1 (en) Method for providing a feedback in response to a user input and a terminal implementing the same
KR102056175B1 (en) Method of making augmented reality contents and terminal implementing the same
CN104035672B (en) For providing the mobile device and its control method of preview by detection rubbing gesture
AU2014312481B2 (en) Display apparatus, portable device and screen display methods thereof
US20130268847A1 (en) System and method for displaying pages of e-book
KR102251834B1 (en) Method for displaying in electronic device
US9658762B2 (en) Mobile terminal and method for controlling display of object on touch screen
JP2013543621A (en) Method and system for viewing stacked screen displays using gestures
WO2014129787A1 (en) Electronic device having touch-sensitive user interface and related operating method
WO2014107079A1 (en) Content zooming method and terminal implementing the same
WO2014109445A1 (en) Contents display method and mobile terminal implementing the same
US20150019961A1 (en) Portable terminal and method for controlling data merging
US20130298068A1 (en) Contents display method and mobile terminal implementing the same
CN114461312B (en) Display method, electronic device and storage medium
US10713422B2 (en) Method of editing document in mobile terminal and mobile terminal using the same
CN116483618A (en) Data backup method and system, storage medium and electronic equipment
KR20130088695A (en) Page display method and apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 13870559; Country of ref document: EP; Kind code of ref document: A1)
REEP Request for entry into the european phase (Ref document number: 2013870559; Country of ref document: EP)
WWE Wipo information: entry into national phase (Ref document number: 2013870559; Country of ref document: EP)
NENP Non-entry into the national phase (Ref country code: DE)