US20120144293A1 - Display apparatus and method of providing user interface thereof - Google Patents

Display apparatus and method of providing user interface thereof

Info

Publication number
US20120144293A1
US20120144293A1
Authority
US
United States
Prior art keywords
data object
data
function
area
preset touch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/242,896
Inventor
Min-Soo Kim
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Assigned to SAMSUNG ELECTRONICS CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KIM, MIN-SOO
Publication of US20120144293A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0412 Digitisers structurally integrated in a display
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • the present general inventive concept relates to a display apparatus and a method of providing a user interface (UI) thereof, and more particularly, to a display apparatus and a method of providing a UI thereof which can provide an editing function of a data object
  • Functions that are frequently used when an image is edited in the related art may include “file merge”, “file divide”, “partial cut”, and the like.
  • When an edit function is performed independently in a mobile appliance, without a personal computer (PC), the following steps are generally performed in a thumbnail view state.
  • buttons for copy, partial cut, paste, and the like can be constantly positioned on a screen.
  • Although touch panels have become widespread in mobile appliances, many menu operations based on the existing five direction keys (up, down, left, right, and center) are still used, and thus it is necessary to pass through several menu steps to perform a specified function. Also, when several files are selected, even in the case of successive data, the user must select each file by touching it individually (on a PC, a mouse drag is frequently used instead).
  • an aspect of the present general inventive concept provides a display apparatus and a method of providing a user interface (UI) thereof, which can provide an intuitive UI for performing a data editing function.
  • a display apparatus which includes a display unit to display a data object in the form of a corresponding icon, a user interface unit to receive a preset touch operation corresponding to an editing function to edit the data object, an editing unit to perform the editing function to edit the data object, and a control unit operating to control the editing unit to perform the editing function to edit the data object that corresponds to the preset touch operation when the preset touch operation for the icon is performed in an editing mode.
  • the editing function may include at least one of a data object merge function, a data object division function, and a clip board function.
  • the preset touch operation that corresponds to the data object division function may be an operation of selecting and touching at least two points of the data object, widening a space between the at least two selected points, and then releasing the touch if it is intended to divide the data object.
  • the control unit may operate to convert the editing mode into a reproduction mode to separate a moving image in accordance with the preset touch operation that corresponds to the data object division function in the case where the data object is a moving image.
  • the display unit may display at least two data objects and the preset touch operation that corresponds to the data object merge function may be an operation of simultaneously touching two data objects, narrowing a space between the two data objects, and then releasing the touch if it is intended to select and merge the two data objects.
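The division (spread) and merge (pinch) operations described above both reduce to comparing the distance between two touch points at touch-down and at release. The sketch below is an illustrative model only, not code from the patent; the function name, coordinate convention, and threshold value are assumptions:

```python
import math

def classify_gesture(start_points, end_points, threshold=20.0):
    """Classify a two-finger gesture by how the distance between the
    touch points changes between touch-down and release.

    Returns "merge" for a pinch (fingers moved together), "divide" for
    a spread (fingers moved apart), or None if the change in distance
    is smaller than the threshold (in pixels)."""
    def dist(points):
        (ax, ay), (bx, by) = points
        return math.hypot(ax - bx, ay - by)

    delta = dist(end_points) - dist(start_points)
    if delta <= -threshold:
        return "merge"   # pinch: space between the points narrowed
    if delta >= threshold:
        return "divide"  # spread: space between the points widened
    return None          # too small a change to trigger either function
```

For example, two touches that start 100 px apart and are released 20 px apart would classify as a merge.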
  • the display unit may display at least three data objects and the preset touch operation that corresponds to the data object merge function may be an operation of touching a first data object, successively selecting and dragging areas in which icons that correspond to at least two other data objects to be merged are displayed to the first data object, and then releasing the touch of the first data object if it is intended to select and merge the at least three data objects.
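The successive-drag merge described above accumulates each dragged object into the held reference object and commits one combined object on release. A toy model of the result; the dictionary shape and the "+" naming scheme are invented for illustration, not taken from the patent:

```python
def merge_objects(reference, *others):
    """Combine data objects into a single new data object containing the
    reference object's contents followed by the contents of each object
    dragged onto it, in drag order."""
    name = "+".join([reference["name"]] + [o["name"] for o in others])
    contents = []
    for obj in (reference, *others):
        contents.extend(obj["contents"])
    return {"name": name, "contents": contents}
```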
  • the control unit may operate to create a clip board area of the data object in a predetermined area of a screen of the display unit and to display the data object in the form of an icon on the created area in accordance with the preset touch operation to perform the clip board function.
  • the clip board area may include at least one of a copy area and a deletion area, and the control unit may operate to copy or delete the data object by selecting and dragging the data object to the copy area or the deletion area, respectively.
  • the control unit may operate to perform a paste function of the data object by touching and dragging the data object copied onto the copy area to a desired position.
  • the control unit may operate to perform a partial cut function of the data object by selecting an area to be cut from the data object and dragging the selected area to the deletion area.
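Dropping a dragged object into the copy area or the deletion area amounts to a hit test on the release coordinates. A minimal sketch; the screen size and clipboard layout below are assumptions, since the patent does not specify where the areas are placed:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px, py):
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

# Hypothetical layout: a clipboard strip along the right edge of an
# 800x480 screen, split into a copy area (top) and a deletion area (bottom).
COPY_AREA = Rect(700, 0, 100, 240)
DELETION_AREA = Rect(700, 240, 100, 240)

def drop_action(drop_x, drop_y):
    """Map the position where a dragged data object is released to the
    clipboard function it triggers."""
    if COPY_AREA.contains(drop_x, drop_y):
        return "copy"
    if DELETION_AREA.contains(drop_x, drop_y):
        return "delete"
    return "move"  # released outside the clipboard: ordinary reposition
```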
  • a method of providing a UI of a display apparatus which includes displaying a data object in the form of an icon, receiving a preset touch operation corresponding to an editing function to edit the data object, and performing the editing function that corresponds to the preset touch operation when the preset touch operation for the icon is performed in an editing mode.
  • the editing function may include at least one of a data object merge function, a data object division function, and a clip board function.
  • the preset touch operation that corresponds to the data object division function may be an operation of selecting and touching at least two points of the data object, widening a space between the at least two selected points, and then releasing the touch if it is intended to divide the data object.
  • the method of providing the UI may further include converting the editing mode into a reproduction mode to separate a moving image in accordance with the preset touch operation that corresponds to the data object division function in the case where the data object is a moving image.
  • the displaying the data object may further include displaying at least three data objects and the preset touch operation that corresponds to the data object merge function may be an operation of touching a first data object, successively selecting and dragging areas, in which icons that correspond to at least two other data objects to be merged are displayed to the first data object, and then releasing the touch of the first data object if it is intended to select and merge the at least three data objects.
  • the displaying the data object may further include displaying at least two data objects and the preset touch operation that corresponds to the data object merge function is an operation of simultaneously touching two data objects, narrowing a space between the two data objects, and then releasing the touch if it is intended to select and merge the two data objects.
  • the method of providing the UI may further include creating a clip board area of the data object in a predetermined area of a screen of the display unit and displaying the data object in the form of an icon on the created area in accordance with the preset touch operation to perform the clip board function.
  • the clip board area may include at least one of a copy area and a deletion area, and the method of providing the UI may further include copying or deleting the data object by selecting and dragging the data object to the copy area or the deletion area, respectively.
  • the method of providing the UI may further include performing a paste function of the data object by touching and dragging the data object copied onto the copy area to a desired position.
  • the method of providing the UI may further include performing a partial cut function of the data object by selecting an area to be cut from the data object and dragging the selected area to the deletion area.
  • a display apparatus including a display unit to display at least one data object in the form of a corresponding icon, a user interface unit to receive a plurality of preset touch operations corresponding to respective editing functions to edit the data object, wherein at least one of the plurality of touch operations comprises touching at least two points on the display unit simultaneously, a control unit to determine which one of the plurality of preset touch operations is received by the user interface unit, and an editing unit to perform the respective editing function corresponding to the determined preset touch operation.
  • the plurality of preset touch operations may include at least one of a preset touch operation corresponding to a data object merge function, a preset touch operation corresponding to a data object division function, and a preset touch operation corresponding to a copy function.
  • the at least one data object may include a first data object and a second data object and the preset touch operation corresponding to the data object merge function may include touching the first data object and the second data object simultaneously, sliding the first data object and the second data object into each other, and then releasing the touches on the first data object and the second data object.
  • the at least one data object may include a plurality of data objects and the preset touch operation corresponding to the data object merge function may include touching a first data object of the plurality of data objects, and while touching the first data object, successively touching, sliding into the first data object, and releasing the touch on each of the plurality of data objects to be merged with the first data object, and then releasing the touch on the first data object.
  • the preset touch operation corresponding to the data object division function may include simultaneously touching two points of one of the at least one data object, sliding the two touched points away from each other, and then releasing the touch.
  • the display unit may display the data corresponding to one of the at least one data object and the preset touch operation corresponding to the copy function may include simultaneously touching two points within the data and releasing the touches on the two points to select an area of the data, and then touching the selected area of the data, sliding the selected area of the data to a predetermined copy area, and releasing the touch on the selected area.
  • a method of providing a UI of a display apparatus including displaying at least one data object in the form of a corresponding icon, receiving one of a plurality of preset touch operations corresponding to respective editing functions to edit the data object, wherein at least one of the plurality of preset touch operations comprises touching at least two points on the display unit simultaneously, determining which one of the plurality of preset touch operations is received, and performing the respective editing function corresponding to the determined preset touch operation.
  • the plurality of preset touch operations may include at least one of a preset touch operation corresponding to a data object merge function, a preset touch operation corresponding to a data object division function, and a preset touch operation corresponding to a copy function.
  • the at least one data object may include a first data object and a second data object and the preset touch operation corresponding to the data object merge function may include touching the first data object and the second data object simultaneously, sliding the first data object and the second data object into each other, and then releasing the touches on the first data object and the second data object.
  • the at least one data object may include a plurality of data objects and the preset touch operation corresponding to the data object merge function may include touching a first data object of the plurality of data objects, and while touching the first data object, successively touching, sliding into the first data object, and releasing the touch on each of the plurality of data objects to be merged with the first data object, and then releasing the touch on the first data object.
  • the preset touch operation corresponding to the data object division function may include simultaneously touching two points of one of the at least one data object, sliding the two touched points away from each other, and then releasing the touch.
  • the displaying may include displaying the data corresponding to one of the at least one data object and the preset touch operation corresponding to the copy function may include simultaneously touching two points within the data and releasing the touches on the two points to select an area of the data, and then touching the selected area of the data, sliding the selected area of the data to a predetermined copy area, and releasing the touch on the selected area.
  • a display apparatus including a display unit to display one or more data objects, and a control unit to select one of the displayed data objects and to separate the selected data object into at least two data objects that each include a portion of the selected data object according to a first touch operation on the selected data object, and to select at least two of the displayed data objects and to merge the at least two selected data objects into one data object according to a second touch operation on the at least two selected data objects.
  • FIG. 1 is a block diagram illustrating the configuration of a display apparatus according to an exemplary embodiment of the present general inventive concept
  • FIGS. 2A to 2D are diagrams illustrating a method of providing a UI for a data object merge function according to an exemplary embodiment of the present general inventive concept
  • FIGS. 3A to 3D are diagrams illustrating a method of providing a UI for a data object division function according to an exemplary embodiment of the present general inventive concept
  • FIGS. 4A to 4E are diagrams illustrating a method of providing a UI for a data object clip board function according to an exemplary embodiment of the present general inventive concept
  • FIGS. 5A to 5F are diagrams illustrating a method of providing a UI for a data object clip board function according to another exemplary embodiment of the present general inventive concept
  • FIGS. 6A and 6B are diagrams illustrating a method of providing a UI for a data object clip board function according to still another exemplary embodiment of the present general inventive concept
  • FIG. 7 is a diagram illustrating a method of displaying an area for a clip board function according to an exemplary embodiment of the present general inventive concept.
  • FIG. 8 is a flowchart illustrating a method of providing a UI according to an exemplary embodiment of the present general inventive concept.
  • FIG. 1 is a block diagram illustrating the configuration of a display apparatus according to an exemplary embodiment of the present general inventive concept.
  • a display apparatus 100 includes a display unit 110 , a user interface unit 120 , an editing unit 130 , and a control unit 140 .
  • the display apparatus 100 may be implemented in various forms.
  • the display apparatus 100 may be implemented by a mobile terminal, such as a portable phone, a smart phone, a notebook computer, a terminal for digital broadcasting, a PDA (Personal Digital Assistant), a PMP (Portable Multimedia Player), a navigation device, and the like, or by a terminal such as a digital TV, a desktop computer, and the like.
  • the display unit 110 may display various data objects in the form of icons.
  • the data objects may be in various data forms, such as moving images, images, texts, applications, sound data, photo slide shows, and the like.
  • the icons may be displayed in diverse forms including thumbnail forms of respective data objects.
  • the display unit 110 may be implemented in a touch screen form that forms a mutual layer structure with a touch pad.
  • the display unit 110 may be used as a user interface unit 120 , to be described later, in addition to an output device.
  • the display unit 110 may include at least one of a liquid crystal display, a thin film transistor-liquid crystal display, an organic light-emitting diode, a flexible display, and a 3D display.
  • some displays may be constructed to be transparent so that the outside can be seen through the displays. These displays may be called transparent displays, and a representative example of the transparent displays may be a TOLED (Transparent Organic Light Emitting Diode).
  • two or more display units 110 may exist in accordance with the implementation forms of the display apparatus 100 .
  • an external display unit (not illustrated) and an internal display unit (not illustrated) may be provided at the same time.
  • the touch screen may be configured to detect not only a touch input position and a touch input area but also a touch input pressure.
  • the display unit 110 displays information that is processed in the display apparatus 100 .
  • in a call mode, where the display apparatus 100 is implemented by a mobile terminal, a call-related UI (User Interface) or GUI (Graphic User Interface) is displayed.
  • a photographed and/or received image, UI, or GUI may also be displayed.
  • the user interface unit 120 functions to receive and analyze a user command that is input from a user through an input device such as a mouse or a touch screen.
  • the user interface unit 120 may receive various editing commands for various data objects.
  • the editing command may be a user command for performing an object merge, an object division, a clip board function, and the like.
  • the object merge and the object division are performed in a thumbnail viewing state, and the clip board function may be performed during execution of the subject data object.
  • the clip board function may be performed during reproduction of a moving image.
  • the user interface unit 120 may receive various user commands to control the operation of the display apparatus 100 , and may include a key pad, a dome switch, a touch pad (resistive/capacitive), a jog wheel, a jog switch, and the like.
  • the editing unit 130 may perform an editing function of a data object in accordance with a user command received from the user interface unit 120 .
  • the editing function may include at least one of a data object merge function, a data object division function, and a clip board function.
  • the control unit 140 functions to control the overall operation of the display apparatus 100 .
  • the control unit 140 may operate to enter an editing mode if at least one icon is touched for longer than a preset time in a normal mode.
  • when a preset touch operation is performed on an icon, the control unit 140 may control the editing unit 130 to perform the editing function that corresponds to the preset touch operation with respect to the corresponding data object.
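The control flow in the two bullets above, a long press to enter editing mode followed by gesture-to-function dispatch, can be modeled as a small state machine. Everything here (the threshold value, the gesture and function names) is illustrative, not from the patent:

```python
class Controller:
    """Toy model of the control unit: a long press switches from normal
    mode to editing mode; once in editing mode, preset gestures are
    dispatched to the corresponding editing function."""

    LONG_PRESS_SECONDS = 0.8  # hypothetical value for "over a preset time"

    EDIT_ACTIONS = {
        "pinch": "merge",
        "spread": "divide",
        "drag_to_copy_area": "copy",
        "drag_to_deletion_area": "delete",
    }

    def __init__(self):
        self.mode = "normal"

    def on_icon_hold(self, duration):
        """Enter editing mode if an icon is held longer than the threshold."""
        if self.mode == "normal" and duration >= self.LONG_PRESS_SECONDS:
            self.mode = "editing"

    def on_gesture(self, gesture):
        """In editing mode, return the editing function for a gesture;
        gestures are ignored in normal mode."""
        if self.mode != "editing":
            return None
        return self.EDIT_ACTIONS.get(gesture)
```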
  • the preset touch operation that corresponds to the data object merge function may be an operation of simultaneously touching two data objects, narrowing a space between the two data objects, and then releasing the touch.
  • the preset touch operation that corresponds to the data object merge function may be an operation of touching a reference data object, successively selecting and dragging areas in which icons that correspond to remaining data objects to be merged are displayed to the reference data object, and then releasing the touch.
  • the preset touch operation that corresponds to the data object division function may be an operation of selecting and touching at least two points of a data object, widening a space between the at least two selected points, and then releasing the touch.
  • the control unit 140 may operate to convert an editing mode into a reproduction mode of the corresponding data object in order to separate moving image data.
  • the control unit 140 may operate to create a clip board area of the data object in a predetermined area of a screen of the display unit 110 and to display the data object in the form of an icon on the created area in accordance with an editing command for performing the clip board function.
  • the clip board area may include at least one of a data copy area and a data deletion area.
  • a user can intuitively recognize whether data to be edited exists in the clip board area. For example, in the case where an icon form that corresponds to a predetermined data object is displayed in the copy area, the user can confirm that the corresponding data has been copied, while in the case where an icon form that corresponds to a predetermined data object is displayed in the deletion area, the user can confirm that the corresponding data has been deleted.
  • the control unit 140 may operate to store or delete a data object by selecting and positioning the data object to be stored or deleted in the storage area or the deletion area of the data object on the screen in accordance with the editing command for performing the clip board function.
  • the control unit 140 may operate to perform a paste function of the data object by touching and dragging the data object stored in the storage area of the data object to a desired position.
  • the control unit 140 may operate to perform a partial cut function of the data object by selecting an area to be cut from at least one data object and dragging the selected area to the deletion area of the data object.
  • a storage unit (not illustrated), which may store programs for processing and control through the control unit 140 and perform temporary storage of input/output data (for example, a phonebook, messages, still images, moving images, and the like), may be further included.
  • the storage unit may store data about vibration and sound of various patterns which are output when a touch is input on a touch screen.
  • the storage unit (not illustrated) may include at least one type of storage medium, such as flash memory type, hard disk type, multimedia card micro type, and card type memories (for example, an SD or XD memory, and the like), a RAM (Random Access Memory), an SRAM (Static Random Access Memory), a ROM (Read-Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), a PROM (Programmable Read-Only Memory), magnetic disks, and optical disks.
  • the display apparatus 100 may further include a radio communication unit (not illustrated) which includes one or more constituent elements that perform radio communication between the mobile terminal and a radio communication system or radio communication between the mobile terminal 100 and a network on which the mobile terminal 100 is positioned, an A/V (Audio/Video) input unit (not illustrated) to input an audio signal and/or a video signal, a sensing unit (not illustrated) to sense the current states of the mobile terminal 100 , such as an open/close state of the mobile terminal 100 , the position of the mobile terminal 100 , existence/nonexistence of user contact, the direction of the mobile terminal, acceleration/deceleration of the mobile terminal, and the like, and to create sensing signals to control the operation of the mobile terminal 100 , an interface unit (not illustrated) to serve as an interface with all external appliances connected to the mobile terminal 100 , an output unit (not illustrated) to output an audio signal, a video signal, or an alarm signal, and a power supply unit.
  • FIGS. 2A to 2D are diagrams illustrating a method of providing a UI for a data object merge function according to an exemplary embodiment of the present general inventive concept.
  • a user may enter into an editing mode by touching at least one icon over a preset time.
  • the preset time may be a range of time that includes an error range.
  • a user simultaneously touches the corresponding data objects for a predetermined time and maintains the touch without releasing the touch.
  • the merge function is an operation of combining data objects into a single new data object.
  • the two data objects are combined into a new single data object including the contents of the two data objects. For example, if the two data objects are images, execution of the merge will result in a new data object containing both images.
  • the merge state is released. That is, if the user does not release the touch when the space between the two data objects is narrowed, as illustrated in FIG. 2B , but instead widens the space between the two data objects, as illustrated in FIG. 2A , and then releases the touch, the merge is not executed and the two data objects are not combined.
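The commit-or-cancel rule above, where the merge executes only if the fingers are still close together at the moment of release, can be expressed as a check on the recorded distance samples. A sketch with an assumed commit ratio (the patent does not give a numeric criterion):

```python
def merge_outcome(distances, commit_ratio=0.5):
    """Decide whether a two-finger merge gesture commits or cancels.

    `distances` holds the distance between the two touch points sampled
    from touch-down through release. The merge commits only if the final
    distance is still below commit_ratio times the starting distance;
    widening the gap again before releasing cancels it."""
    start, final = distances[0], distances[-1]
    return "commit" if final <= commit_ratio * start else "cancel"
```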
  • the user sets a reference point by touching one data object for longer than a predetermined time, and then maintains the touch without releasing it.
  • the merge state of the data objects is released.
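The pinch-to-merge decision described above can be sketched as follows. This is a minimal illustration under assumed function names and an assumed pixel threshold; the description does not specify an implementation.

```python
import math

# Minimal sketch of the pinch-to-merge decision (assumed names and
# threshold; not specified by the description above).
MERGE_THRESHOLD = 50.0  # assumed pixel distance at which objects "meet"

def distance(p1, p2):
    """Euclidean distance between two touch points."""
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def should_merge(start_a, start_b, end_a, end_b):
    """Merge only if the two touches moved closer together and ended
    within the threshold before the touch was released (FIG. 2B);
    widening the gap again (FIG. 2A) cancels the merge."""
    start_gap = distance(start_a, start_b)
    end_gap = distance(end_a, end_b)
    return end_gap < start_gap and end_gap <= MERGE_THRESHOLD
```

Under these assumptions, touches starting 100 px apart and released 30 px apart would trigger the merge, while touches released 200 px apart would not.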
  • FIGS. 3A to 3D are diagrams illustrating a method of providing a UI for a data object division function according to an exemplary embodiment of the present general inventive concept.
  • the user touches two points of a data object to be divided and then maintains the touch state without releasing the touch.
  • the corresponding icon is divided into two pieces to achieve the division of the data object. If the user narrows the space between the two touched points in this state, the data object division function is released. That is, if the user releases the touch in the state illustrated in FIG. 3A, the data object division function is not executed.
  • the data object division function is an operation to divide the data object into two new data objects.
  • when the data object division function is executed, the data object is divided into two new data objects, each of which includes part of the contents of the original data object.
  • if the data object is an image such as a moving image, the corresponding moving image is converted into an image reproduction area, as illustrated in FIG. 3C.
  • the corresponding point is divided into two pieces, and the mode is converted into a thumbnail viewing mode, as illustrated in FIG. 3D, to complete the creation of the divided data objects.
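The spread-to-divide behavior of FIGS. 3A to 3D can likewise be sketched. The threshold, the function names, and the list model of the object's contents are assumptions made for illustration only.

```python
# Illustrative sketch of the spread-to-divide gesture (assumed names
# and threshold; not taken from the description above).
DIVIDE_THRESHOLD = 100.0  # assumed pixel gap at which division triggers

def should_divide(start_gap, end_gap):
    """Divide only if the two touched points moved apart beyond the
    threshold before release; narrowing the gap again cancels (FIG. 3A)."""
    return end_gap > start_gap and end_gap >= DIVIDE_THRESHOLD

def divide_object(contents, split_index):
    """Divide one data object into two new objects, each holding part
    of the original contents."""
    return contents[:split_index], contents[split_index:]
```

For instance, dividing the contents `list("ABCDEF")` at index 3 yields `(['A', 'B', 'C'], ['D', 'E', 'F'])`, each new object holding part of the original.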
  • FIGS. 4A to 4E are diagrams illustrating a method of providing a UI for a data object clip board function according to an exemplary embodiment of the present general inventive concept.
  • the clip board function is performed during the reproduction of the file in the case where the corresponding data object is a reproducible file such as a moving image.
  • the user selects two points of the data object desired to be copied. For example, as illustrated in FIG. 4A , the user touches a point slightly to the left of “A” and a point slightly to the right of “G” to select the text “ABCDEFG”.
  • the user drags the two selected points to a predetermined area to move the selected data object to the predetermined area.
  • the user drags the selected text “ABCDEFG” to an area in the upper left portion of the display defined as a cutting area.
  • a selection which is dragged to the cutting area may be stored and retrieved later.
  • the display may also include a predefined delete area.
  • the delete area is defined in the upper right portion of the display. A selection which is dragged to the delete area may be deleted.
  • although the cutting area and the delete area are illustrated in the upper left and upper right corners of the display in FIG. 4B, the cutting area and delete area may be located at various positions on the display.
  • an icon is created in the upper left portion of the display to represent the selected data object which was dragged to the cutting area.
  • the selected data object is the text “ABCDEFG” which was dragged to the cutting area.
  • the icon is a triangle and is displayed in the same portion of the display as the cutting area.
  • the selected data object may be represented in other forms such as a thumbnail image, text, and the like or may be displayed in various locations within the display.
  • the user may touch and drag the icon created in the upper left portion of the display onto the corresponding data object or another data object to insert the selected data at a determined insertion position.
  • the text “abcdefghijklmnopqrstuvwxyz” is displayed and the user drags the icon to the point between “k” and “l”, and then releases the touch.
  • FIG. 4E illustrates the result of executing the data object insertion as shown in FIG. 4D .
  • the selected data object is inserted in the position where the user dragged the icon.
  • the text “ABCDEFG” was inserted between “k” and “l”, resulting in the display of “abcdefghijkABCDEFGlmnopqrstuvwxyz”.
  • the icon remains displayed in the upper left part of the display so that the user may insert the selected text into other parts of the displayed data object or into other data objects.
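The text cut-and-paste flow of FIGS. 4A to 4E can be sketched with simple string operations. The function names and the half-open index convention are assumptions, not part of the original description.

```python
# Minimal sketch of the text clip board flow (assumed function names;
# selections are half-open character ranges [start, end)).

def cut_selection(text, start, end):
    """Cut text[start:end] to the clip board, as when dragging a
    selection to the cutting area; returns (clip, remainder)."""
    return text[start:end], text[:start] + text[end:]

def insert_clip(text, clip, position):
    """Insert the clipped data at the given character position, as when
    dragging the clip board icon to an insertion point."""
    return text[:position] + clip + text[position:]

clip = "ABCDEFG"  # the selection dragged to the cutting area in FIG. 4B
result = insert_clip("abcdefghijklmnopqrstuvwxyz", clip, 11)
# result == "abcdefghijkABCDEFGlmnopqrstuvwxyz", matching FIG. 4E
```

Because the clip is kept after insertion, it can be inserted again elsewhere, mirroring the icon remaining in the upper left part of the display.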
  • FIGS. 5A to 5F are diagrams illustrating a method of providing a UI for a data object clip board function according to another exemplary embodiment of the present general inventive concept.
  • FIG. 5A is a display of a moving image.
  • the title of the moving image is displayed in an upper right portion of the display.
  • the title of the moving image is “Movie 1”.
  • the content of the moving image may be displayed in a central portion of the display and a control bar may be displayed in a lower portion of the display.
  • a user may use the control bar to perform operations on the moving image such as play, pause, and stop.
  • a user may also use the control bar to select a time within the length of the moving image.
  • the user selects two desired points on the control bar to be edited during the reproduction of a moving image.
  • the two points represent a first time and a second time within the length of the moving image and the selected portion to be edited is the time between the first time and the second time.
  • the user drags the two selected points to a predetermined area (for example, at an upper end on the left side) to move the selected points to the predetermined area.
  • the user drags the selected portion of the moving image “Movie 1” to the cutting area defined as an upper left part of the display.
  • an icon is created in the upper left portion of the display to represent the selected portion of the moving image which was dragged to the cutting area.
  • the icon represents the selected portion of “Movie 1” which was dragged to the cutting area.
  • the position in which the selected and copied area of FIG. 5A is to be inserted is designated in the reproduction area of the corresponding data object or of another data object, and the icon created in the upper left portion is inserted at the designated insertion position by touching and dragging the icon, as illustrated in FIG. 5E.
  • in FIG. 5D, a new moving image titled “Movie 2” is displayed.
  • the user selects a position on the control bar, and the time within the moving image corresponding to the selected position is displayed in a central portion of the display.
  • the user drags the icon representing the selected portion of the moving image “Movie 1” to the selected position on the control bar to insert the selected portion of the moving image “Movie 1” at the selected position of the moving image “Movie 2”.
  • the designation of the insertion position illustrated in FIG. 5D may be unnecessary in some circumstances; it is also possible to insert the selected data by positioning it directly at the point in the reproduction area where it is to be inserted.
  • a user may choose not to select a position on the control bar as illustrated in FIG. 5D , and may instead drag the icon representing the selected portion of the moving image to a desired position on the control bar.
  • FIG. 5F illustrates the result of executing the data object insertion as shown in FIG. 5E .
  • the selected portion of the moving image “Movie 1” has been inserted at the selected position of the moving image “Movie 2”.
  • the cutting area is created in the area at the upper end on the left side of the display screen.
  • this is merely exemplary, and the cutting area for the clip board function may be created in various positions of the display screen.
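The moving image clip board flow of FIGS. 5A to 5F can be modeled by treating each movie as a sequence of frames indexed by time. This model and the names below are illustrative assumptions only.

```python
# Illustrative model of the moving image clip board flow: a movie is a
# list of frames, and the two points selected on the control bar are
# frame indices (an assumed model, not from the description above).

def cut_frames(frames, first_time, second_time):
    """Cut the frames between the two selected times; returns the clip
    and the remaining movie."""
    clip = frames[first_time:second_time]
    remainder = frames[:first_time] + frames[second_time:]
    return clip, remainder

def insert_frames(frames, clip, time):
    """Insert the clipped frames at the selected time, as when dragging
    the clip board icon to a position on the control bar."""
    return frames[:time] + clip + frames[time:]

movie1 = ["m1-%d" % t for t in range(10)]   # stands in for "Movie 1"
movie2 = ["m2-%d" % t for t in range(5)]    # stands in for "Movie 2"
clip, movie1 = cut_frames(movie1, 2, 5)     # frames m1-2 .. m1-4
movie2 = insert_frames(movie2, clip, 3)
# movie2 == ['m2-0', 'm2-1', 'm2-2', 'm1-2', 'm1-3', 'm1-4', 'm2-3', 'm2-4']
```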
  • FIGS. 6A and 6B are diagrams illustrating a method of providing a UI for a data object clip board function according to still another exemplary embodiment of the present general inventive concept.
  • the user may drag the corresponding icon in the cutting area to the deletion area (in the upper right portion of the display) to delete the copied data.
  • the user drags the triangle icon from the cutting area in the upper left part of the display to the deletion area in the upper right part of the display to delete any data that was previously copied to the cutting area.
  • the user may select a desired data area from the predetermined data object and drag the selected data area to the deletion area (in the upper right portion of the display) to delete the corresponding data.
  • FIG. 6B illustrates that the text “ABCDEFG” is selected.
  • the text may be selected by selecting a point slightly to the left of “A” and a point slightly to the right of “G”.
  • the user drags the selected text “ABCDEFG” to the deletion area to delete the selected text.
  • FIG. 7 is a diagram illustrating a method of displaying an area for a clip board function according to an exemplary embodiment of the present general inventive concept. As illustrated in FIG. 7, icons 710, a thumbnail 720, and text data 730 are displayed in an upper left part of the display and indicate corresponding data objects.
  • the clip board data stored in the storage unit may be displayed in the form of corresponding icons 710 .
  • the corresponding icons 710 may include a circle, a triangle, and an “X” and may be located in an upper left part of the display.
  • Different corresponding icons 710 may represent different data objects copied to the cutting area. For example, a first data object copied to the cutting area may be represented by a circle and a second data object copied to the cutting area may be represented by a triangle.
  • the clip board data stored in the storage unit may be displayed in the form of a thumbnail 720 of the corresponding data object. As illustrated in FIG. 7 , the thumbnail 720 may be located in an upper left part of the display.
  • the clip board data stored in the storage unit may be displayed in the form of text data 730 that indicates the corresponding data object. As illustrated in FIG. 7 , the text data 730 may be illustrated in an upper left part of the display.
  • FIG. 8 is a flowchart illustrating a method of providing a UI according to an exemplary embodiment of the present general inventive concept.
  • At least one data object is displayed in the form of an icon at operation S810.
  • the mode may be shifted to an editing mode.
  • the editing function may include at least one of a data object merge function, a data object division function, and a clip board function.
  • the preset touch operation that corresponds to the data object merge function may be an operation of simultaneously touching two data objects, narrowing a space between the two data objects, and then releasing the touch.
  • the preset touch operation that corresponds to the data object merge function may be an operation of touching a reference data object, successively selecting and dragging areas in which icons that correspond to remaining data objects to be merged are displayed to the reference data object, and then releasing the touch.
  • the preset touch operation that corresponds to the data object division function may be an operation of selecting and touching at least two points of a data object, widening a space between the at least two selected points, and then releasing the touch.
  • a clip board area of the data object may be created in a predetermined area of the display screen, and the data object may be displayed in the form of an icon on the created area in accordance with the preset touch operation to perform the clip board function.
  • the clip board area may include at least one of a data copy area and a data deletion area, and the data object to be copied or deleted may be selected and dragged to the copy area or the deletion area to copy or delete the corresponding data object.
  • the paste function of the data object may be performed by touching and dragging the data object copied into the copy area of the data object to a desired position.
  • the partial cut function of the data object may be performed by selecting an area to be cut from at least one data object and dragging the selected area to the deletion area of the data object.
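The dispatch implied by the flowchart of FIG. 8 — recognize the preset touch operation, then run the matching editing function — can be sketched as a lookup table. The gesture names and handlers below are hypothetical placeholders, not part of the original description.

```python
# Hypothetical sketch of the FIG. 8 dispatch step: the recognized
# preset touch operation selects which editing function runs.

def perform_editing(gesture, handlers):
    """Run the editing function registered for the recognized gesture."""
    handler = handlers.get(gesture)
    if handler is None:
        raise ValueError("no editing function for gesture: %s" % gesture)
    return handler()

handlers = {
    "pinch_two_objects": lambda: "merge",        # data object merge
    "spread_two_points": lambda: "divide",       # data object division
    "drag_to_copy_area": lambda: "copy",         # clip board copy
    "drag_to_deletion_area": lambda: "delete",   # clip board delete
}
```

Registering each editing function against its preset touch operation keeps the gesture recognition step separate from the editing step, matching the control unit/editing unit split described in the summary.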
  • the present general inventive concept can also be embodied as computer-readable codes on a computer-readable medium.
  • the computer-readable medium can include a computer-readable recording medium and a computer-readable transmission medium.
  • the computer-readable recording medium is any data storage device that can store data as a program which can be thereafter read by a computer system. Examples of the computer-readable recording medium include read-only memory (ROM), random-access memory (RAM), CD-ROMs, magnetic tapes, floppy disks, and optical data storage devices.
  • the computer-readable recording medium can also be distributed over network coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion.
  • the computer-readable transmission medium can be transmitted through carrier waves or signals (e.g., wired or wireless data transmission through the Internet). Also, functional programs, codes, and code segments to accomplish the present general inventive concept can be easily construed by programmers skilled in the art to which the present general inventive concept pertains.
  • the present general inventive concept may also be implemented using at least one of ASICs (Application Specific Integrated Circuits), DSPs (Digital Signal Processors), DSPDs (Digital Signal Processing Devices), PLDs (Programmable Logic Devices), FPGAs (Field Programmable Gate Arrays), processors, controllers, micro-controllers, microprocessors, and electric units to perform functions.
  • the exemplary embodiments such as procedures or functions may be implemented together with separate software modules that perform at least one function or operation.
  • Software codes may be implemented by software applications written in appropriate program languages. Also, software codes may be stored in a memory and executed by the control unit.
  • the data editing process is shortened, and thus editing can be performed effectively.
  • the gestures of gathering files in the case of merging the files and tearing a file in the case of dividing the file are intuitive, and thus the user can easily and quickly learn the editing method.
  • since the cut files are created and maintained in the form of icons in an upper corner portion of the screen that is not generally used, it is easy to confirm whether any cut file exists and to insert a cut file into another file without implementing a multi-window display.


Abstract

A display apparatus is provided and includes a display unit to display a data object in the form of a corresponding icon, a user interface unit to receive a preset touch operation corresponding to an editing function of the data object, an editing unit to perform editing of the data object, and a control unit operating to perform the editing of the data object that corresponds to the preset touch operation when the preset touch operation for the icon is performed in an editing mode. Accordingly, user convenience can be improved.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of priority under 35 U.S.C. §119(a) to Korean Patent Application No. 10-2010-0123500, filed on Dec. 6, 2010, in the Korean Intellectual Property Office, the disclosure of which is incorporated herein by reference in its entirety.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present general inventive concept relates to a display apparatus and a method of providing a user interface (UI) thereof, and more particularly, to a display apparatus and a method of providing a UI thereof which can provide an editing function of a data object.
  • 2. Description of the Related Art
  • Functions that are frequently used when an image is edited in the related art may include “file merge”, “file divide”, “partial cut”, and the like. For example, when an edit function is independently performed in a mobile appliance without a personal computer (PC), the following steps are generally performed in a thumbnail view state.
  • First, in the case of the “file merge”, processes are performed in the order of ① menu selection → ② merge function selection → ③ selection of files to be merged → ④ merge performance.
  • In the case of the “file divide”, processes are performed in the order of ① menu selection → ② divide function selection → ③ selection of a file to be divided → ④ selection of divide points → ⑤ divide performance (the order of processes ① and ③ may be changed depending on the appliance).
  • Also, in the case of the “partial cut”, processes are performed in the order of ① menu selection → ② partial cut function selection → ③ selection of a file to be cut → ④ selection of two cut points → ⑤ partial cut performance (the order of processes ① and ③ may be changed depending on the appliance).
  • Also, in the case of touching a touch panel in a thumbnail state in a mobile appliance using the touch panel, selection, viewing, and movement of a corresponding file can be performed, and using dragging distances between two fingers, image enlargement and reduction can be performed. Also, icons for copy, partial cut, paste, and the like, can be constantly positioned on a screen.
  • However, although touch panels have become widespread in mobile appliances, many menu operations based on the existing five direction keys (up, down, left, right, and center) are still used, and thus several menu steps must be passed through when a specified function is performed. Also, when several files are selected, even in the case of successive data, the respective files must be selected by touching each of them (on a PC, mouse dragging is frequently used instead).
  • Also, due to the limited screen size characteristic of a mobile appliance, it is difficult to display several windows on the screen, and thus implementing a paste function after a partial cut operation becomes difficult (i.e., it is difficult to confirm the existence/nonexistence of the cut data and to paste the cut data into a file different from the cut file, so area deletion is frequently used instead; on a window-based PC, by contrast, this can be performed simply with copy (Ctrl+C) or cut, followed by paste (Ctrl+V)).
  • On the other hand, since the editing function icon is constantly positioned on the screen, an actually usable screen becomes narrow.
  • SUMMARY OF THE INVENTION
  • The present general inventive concept addresses at least the above problems and/or disadvantages and provides at least the advantages described below. Accordingly, an aspect of the present general inventive concept provides a display apparatus and a method of providing a user interface (UI) thereof, which can provide an intuitive UI for performing a data editing function.
  • Additional aspects and utilities of the present general inventive concept will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the general inventive concept.
  • The foregoing and/or other features and utilities of the present general inventive concept may be realized by a display apparatus, which includes a display unit to display a data object in the form of a corresponding icon, a user interface unit to receive a preset touch operation corresponding to an editing function to edit the data object, an editing unit to perform the editing function to edit the data object, and a control unit operating to control the editing unit to perform the editing function to edit the data object that corresponds to the preset touch operation when the preset touch operation for the icon is performed in an editing mode.
  • Here, the editing function may include at least one of a data object merge function, a data object division function, and a clip board function.
  • In this case, the preset touch operation that corresponds to the data object division function may be an operation of selecting and touching at least two points of the data object, widening a space between the at least two selected points, and then releasing the touch if it is intended to divide the data object.
  • The control unit may operate to convert the editing mode into a reproduction mode to separate a moving image in accordance with the preset touch operation that corresponds to the data object division function in the case where the data object is a moving image.
  • The display unit may display at least two data objects and the preset touch operation that corresponds to the data object merge function may be an operation of simultaneously touching two data objects, narrowing a space between the two data objects, and then releasing the touch if it is intended to select and merge the two data objects.
  • Also, the display unit may display at least three data objects and the preset touch operation that corresponds to the data object merge function may be an operation of touching a first data object, successively selecting and dragging areas in which icons that correspond to at least two other data objects to be merged are displayed to the first data object, and then releasing the touch of the first data object if it is intended to select and merge the at least three data objects.
  • The control unit may operate to create a clip board area of the data object in a predetermined area of a screen of the display unit and to display the data object in the form of an icon on the created area in accordance with the preset touch operation to perform the clip board function.
  • The clip board area may include at least one of a copy area and a deletion area, and the control unit may operate to copy or delete the data object by selecting and dragging the data object to the copy area or the deletion area, respectively.
  • The control unit may operate to perform a paste function of the data object by touching and dragging the data object copied onto the copy area to a desired position.
  • The control unit may operate to perform a partial cut function of the data object by selecting an area to be cut from the data object and dragging the selected area to the deletion area.
  • The foregoing and/or other features and utilities of the present general inventive concept may also be realized by a method of providing a UI of a display apparatus, which includes displaying a data object in the form of an icon, receiving a preset touch operation corresponding to an editing function to edit the data object, and performing the editing function that corresponds to the preset touch operation when the preset touch operation for the icon is performed in an editing mode.
  • Here, the editing function may include at least one of a data object merge function, a data object division function, and a clip board function.
  • In this case, the preset touch operation that corresponds to the data object division function may be an operation of selecting and touching at least two points of the data object, widening a space between the at least two selected points, and then releasing the touch if it is intended to divide the data object.
  • The method of providing the UI may further include converting the editing mode into a reproduction mode to separate a moving image in accordance with the preset touch operation that corresponds to the data object division function in the case where the data object is a moving image.
  • Also, the displaying the data object may further include displaying at least three data objects and the preset touch operation that corresponds to the data object merge function may be an operation of touching a first data object, successively selecting and dragging areas in which icons that correspond to at least two other data objects to be merged are displayed to the first data object, and then releasing the touch of the first data object if it is intended to select and merge the at least three data objects.
  • Also, the displaying the data object may further include displaying at least two data objects and the preset touch operation that corresponds to the data object merge function is an operation of simultaneously touching two data objects, narrowing a space between the two data objects, and then releasing the touch if it is intended to select and merge the two data objects.
  • The method of providing the UI may further include creating a clip board area of the data object in a predetermined area of a screen of the display unit and displaying the data object in the form of an icon on the created area in accordance with the preset touch operation to perform the clip board function.
  • The clip board area may include at least one of a copy area and a deletion area, and the method of providing the UI may further include copying or deleting the data object by selecting and dragging the data object to the copy area or the deletion area, respectively.
  • The method of providing the UI may further include performing a paste function of the data object by touching and dragging the data object copied onto the copy area to a desired position.
  • The method of providing the UI may further include performing a partial cut function of the data object by selecting an area to be cut from the data object and dragging the selected area to the deletion area.
  • The foregoing and/or other features and utilities of the present general inventive concept may also be realized by a display apparatus including a display unit to display at least one data object in the form of a corresponding icon, a user interface unit to receive a plurality of preset touch operations corresponding to respective editing functions to edit the data object, wherein at least one of the plurality of touch operations comprises touching at least two points on the display unit simultaneously, a control unit to determine which one of the plurality of preset touch operations is received by the user interface unit, and an editing unit to perform the respective editing function corresponding to the determined preset touch operation.
  • The plurality of preset touch operations may include at least one of a preset touch operation corresponding to a data object merge function, a preset touch operation corresponding to a data object division function, and a preset touch operation corresponding to a copy function.
  • The at least one data object may include a first data object and a second data object and the preset touch operation corresponding to the data object merge function may include touching the first data object and the second data object simultaneously, sliding the first data object and the second data object into each other, and then releasing the touches on the first data object and the second data object.
  • The at least one data object may include a plurality of data objects and the preset touch operation corresponding to the data object merge function may include touching a first data object of the plurality of data objects, and while touching the first data object, successively touching, sliding into the first data object, and releasing the touch on each of the plurality of data objects to be merged with the first data object, and then releasing the touch on the first data object.
  • The preset touch operation corresponding to the data object division function may include simultaneously touching two points of one of the at least one data object, sliding the two touched points away from each other, and then releasing the touch.
  • The display unit may display the data corresponding to one of the at least one data object and the preset touch operation corresponding to the copy function may include simultaneously touching two points within the data and releasing the touches on the two points to select an area of the data, and then touching the selected area of the data, sliding the selected area of the data to a predetermined copy area, and releasing the touch on the selected area.
  • The foregoing and/or other features and utilities of the present general inventive concept may also be realized by a method of providing a UI of a display apparatus including displaying at least one data object in the form of a corresponding icon, receiving one of a plurality of preset touch operations corresponding to respective editing functions to edit the data object, wherein at least one of the plurality of preset touch operations comprises touching at least two points on the display unit simultaneously, determining which one of the plurality of preset touch operations is received, and performing the respective editing function corresponding to the determined preset touch operation.
  • The plurality of preset touch operations may include at least one of a preset touch operation corresponding to a data object merge function, a preset touch operation corresponding to a data object division function, and a preset touch operation corresponding to a copy function.
  • The at least one data object may include a first data object and a second data object and the preset touch operation corresponding to the data object merge function may include touching the first data object and the second data object simultaneously, sliding the first data object and the second data object into each other, and then releasing the touches on the first data object and the second data object.
  • The at least one data object may include a plurality of data objects and the preset touch operation corresponding to the data object merge function may include touching a first data object of the plurality of data objects, and while touching the first data object, successively touching, sliding into the first data object, and releasing the touch on each of the plurality of data objects to be merged with the first data object, and then releasing the touch on the first data object.
  • The preset touch operation corresponding to the data object division function may include simultaneously touching two points of one of the at least one data object, sliding the two touched points away from each other, and then releasing the touch.
  • The displaying may include displaying the data corresponding to one of the at least one data object and the preset touch operation corresponding to the copy function may include simultaneously touching two points within the data and releasing the touches on the two points to select an area of the data, and then touching the selected area of the data, sliding the selected area of the data to a predetermined copy area, and releasing the touch on the selected area.
  • The foregoing and/or other features and utilities of the present general inventive concept may also be realized by a display apparatus including a display unit to display one or more data objects, and a control unit to select one of the displayed data objects and to separate the selected data object into at least two data objects that each include a portion of the selected data object according to a first touch operation on the selected data object, and to select at least two of the displayed data objects and to merge the at least two selected data objects into one data object according to a second touch operation on the at least two selected data objects.
  • Accordingly, the data editing process is shortened, and editing can be performed efficiently.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • These and/or other aspects and advantages of the present general inventive concept will become apparent and more readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
  • FIG. 1 is a block diagram illustrating the configuration of a display apparatus according to an exemplary embodiment of the present general inventive concept;
  • FIGS. 2A to 2D are diagrams illustrating a method of providing a UI for a data object merge function according to an exemplary embodiment of the present general inventive concept;
  • FIGS. 3A to 3D are diagrams illustrating a method of providing a UI for a data object division function according to an exemplary embodiment of the present general inventive concept;
  • FIGS. 4A to 4E are diagrams illustrating a method of providing a UI for a data object clip board function according to an exemplary embodiment of the present general inventive concept;
  • FIGS. 5A to 5F are diagrams illustrating a method of providing a UI for a data object clip board function according to another exemplary embodiment of the present general inventive concept;
  • FIGS. 6A and 6B are diagrams illustrating a method of providing a UI for a data object clip board function according to still another exemplary embodiment of the present general inventive concept;
  • FIG. 7 is a diagram illustrating a method of displaying an area for a clip board function according to an exemplary embodiment of the present general inventive concept; and
  • FIG. 8 is a flowchart illustrating a method of providing a UI according to an exemplary embodiment of the present general inventive concept.
  • DETAILED DESCRIPTION OF THE EMBODIMENTS
  • Reference will now be made in detail to the embodiments of the present general inventive concept, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the like elements throughout. The embodiments are described below in order to explain the present general inventive concept while referring to the figures. However, the present disclosure is not restricted or limited to such embodiments. In explaining the present disclosure, well-known functions or constructions will not be described in detail so as to avoid obscuring the description with unnecessary detail.
  • FIG. 1 is a block diagram illustrating the configuration of a display apparatus according to an exemplary embodiment of the present general inventive concept.
  • Referring to FIG. 1, a display apparatus 100 according to an exemplary embodiment of the present general inventive concept includes a display unit 110, a user interface unit 120, an editing unit 130, and a control unit 140.
  • The display apparatus 100 may be implemented in various forms. For example, the display apparatus 100 may be implemented by a mobile terminal, such as a portable phone, a smart phone, a notebook computer, a terminal for digital broadcasting, a PDA (Personal Digital Assistant), a PMP (Portable Multimedia Player), a navigation device, and the like, or by a fixed terminal, such as a digital TV, a desktop computer, and the like. Hereinafter, however, for convenience of explanation, it is assumed that the display apparatus 100 is a mobile terminal. However, it can be easily understood by those of ordinary skill in the art that the configuration described hereinafter can also be applied to a fixed terminal, except for the configuration elements specially configured for mobility.
  • The display unit 110 may display various data objects in the form of icons. Here, the data objects may be in various data forms, such as moving images, images, texts, applications, sound data, photo slide shows, and the like. Also, the icons may be displayed in diverse forms including thumbnail forms of respective data objects.
  • Specifically, the display unit 110 may be implemented in a touch screen form that forms a mutual layer structure with a touch pad. In this case, the display unit 110 may be used as a user interface unit 120, to be described later, in addition to an output device. The display unit 110 may include at least one of a liquid crystal display, a thin film transistor-liquid crystal display, an organic light-emitting diode, a flexible display, and a 3D display. Among them, some displays may be constructed to be transparent so that the outside can be seen through the displays. These displays may be called transparent displays, and a representative example of the transparent displays may be a TOLED (Transparent Organic Light Emitting Diode). Also, two or more display units 110 may exist in accordance with the implementation forms of the display apparatus 100. For example, in the case of a mobile terminal, an external display unit (not illustrated) and an internal display unit (not illustrated) may be provided at the same time. Also, the touch screen may be configured to detect not only a touch input position and a touch input area but also a touch input pressure.
  • Also, the display unit 110 displays information that is processed in the display apparatus 100. For example, in a call mode in a state where the display apparatus 100 is implemented by a mobile terminal, a call-related UI (User Interface) or a GUI (Graphic User Interface) is displayed. Also, in the case where the display apparatus 100 is in a video call mode or in a photographing mode, a photographed and/or received image UI or GUI may be displayed.
  • The user interface unit 120 functions to receive and analyze a user command that is input from a user through an input device such as a mouse or a touch screen.
  • Specifically, the user interface unit 120 may receive various editing commands for various data objects. Here, the editing command may be a user command for performing an object merge, an object division, a clip board function, and the like. In this case, the object merge and the object division are performed in a thumbnail viewing state, and the clip board function may be performed during execution of the subject data object. For example, in the case where the subject data object is a moving image, the clip board function may be performed during reproduction of a moving image.
  • Also, the user interface unit 120 may receive various user commands to control the operation of the display apparatus 100, and may include a key pad, a dome switch, a touch pad (resistive/capacitive), a jog wheel, a jog switch, and the like. In particular, in the case where the touch pad forms a mutual layer structure together with the display unit 110 to be described later, this may be called a touch screen.
  • The editing unit 130 may perform an editing function of a data object in accordance with a user command received from the user interface unit 120. Here, the editing function may include at least one of a data object merge function, a data object division function, and a clip board function.
  • The control unit 140 functions to control the whole operation of the display apparatus 100.
  • Specifically, the control unit 140 may operate to enter into an editing mode if at least one icon is touched over a preset time in a normal mode.
  • Specifically, if a preset touch operation is performed with respect to an icon that corresponds to a data object in an editing mode, the control unit 140 may control the editing unit 130 to perform an editing function that corresponds to the preset touch operation with respect to the corresponding data object.
  • Here, if it is intended to select and merge two data objects, the preset touch operation that corresponds to the data object merge function may be an operation of simultaneously touching two data objects, narrowing a space between the two data objects, and then releasing the touch.
  • Also, if it is intended to select and merge at least three data objects, the preset touch operation that corresponds to the data object merge function may be an operation of touching a reference data object, successively selecting and dragging areas in which icons that correspond to remaining data objects to be merged are displayed to the reference data object, and then releasing the touch.
  • Also, if it is intended to divide at least one data object, the preset touch operation that corresponds to the data object division function may be an operation of selecting and touching at least two points of a data object, widening a space between the at least two selected points, and then releasing the touch. In this case, if the corresponding data object is a moving image, the control unit 140 may operate to convert an editing mode into a reproduction mode of the corresponding data object in order to separate moving image data.
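  • The distinction the control unit 140 draws between the merge operation (narrowing the space between two touch points) and the division operation (widening it) may be sketched as follows. This is an illustrative sketch only; the function names and the threshold value are assumptions, not part of the disclosure.

```python
# Illustrative sketch: classify a two-point touch gesture as a merge (pinch)
# or a division (spread) by comparing the distance between the touch points
# at touch-down and at touch-release. Names and threshold are assumptions.
import math

def distance(p1, p2):
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def classify_gesture(start_points, end_points, threshold=10.0):
    """start_points/end_points: ((x, y), (x, y)) at touch-down and at release."""
    d0 = distance(*start_points)
    d1 = distance(*end_points)
    if d1 < d0 - threshold:
        return "merge"    # points moved toward each other (pinch)
    if d1 > d0 + threshold:
        return "divide"   # points moved apart (spread)
    return "none"         # movement below threshold: no editing function
```

Returning "none" when the points end near their starting distance models the behavior, described below, in which widening the space back before releasing the touch cancels the merge.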
  • Also, the control unit 140 may operate to create a clip board area of the data object in a predetermined area of a screen of the display unit 110 and to display the data object in the form of an icon on the created area in accordance with an editing command for performing the clip board function. Here, the clip board area may include at least one of a data copy area and a data deletion area.
  • Accordingly, a user can intuitively recognize whether data to be edited exists in the clip board area. For example, in the case where an icon form that corresponds to a predetermined data object is displayed in the copy area, the user can confirm that the corresponding data has been copied, while in the case where an icon form that corresponds to a predetermined data object is displayed in the deletion area, the user can confirm that the corresponding data has been deleted.
  • Also, the control unit 140 may operate to store or delete a data object by selecting and positioning the data object to be stored or deleted in the storage area or the deletion area of the data object on the screen in accordance with the editing command for performing the clip board function.
  • Also, the control unit 140 may operate to perform a paste function of the data object by touching and dragging the data object stored in the storage area of the data object to a desired position.
  • Also, the control unit 140 may operate to perform a partial cut function of the data object by selecting an area to be cut from at least one data object and dragging the selected area to the deletion area of the data object.
  • Also, a storage unit (not illustrated), which may store programs for processing and control through the control unit 140 and perform temporary storage of input/output data (for example, a phonebook, messages, still images, moving images, and the like), may be further included.
  • Specifically, the storage unit (not illustrated) may store data about vibration and sound of various patterns which are output when a touch is input on a touch screen. The storage unit (not illustrated) may include at least one type of storage media, such as flash memory type, hard disk type, multimedia card micro type, and card type memories (for example, an SD or XD memory, and the like), a RAM (Random Access Memory), an SRAM (Static Random Access Memory), a ROM (Read-Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), a PROM (Programmable Read-Only Memory), magnetic memories, magnetic disks, and optical disks.
  • On the other hand, in the case where the display apparatus 100 is implemented by a mobile terminal, the display apparatus 100 may further include a radio communication unit (not illustrated) which includes one or more constituent elements that perform radio communication between the mobile terminal and a radio communication system or radio communication between the mobile terminal 100 and a network on which the mobile terminal 100 is positioned, an A/V (Audio/Video) input unit (not illustrated) to input an audio signal and/or a video signal, a sensing unit (not illustrated) to sense the current states of the mobile terminal 100, such as an open/close state of the mobile terminal 100, the position of the mobile terminal 100, existence/nonexistence of user contact, the direction of the mobile terminal, acceleration/deceleration of the mobile terminal, and the like, and to create sensing signals to control the operation of the mobile terminal 100, an interface unit (not illustrated) to serve as an interface with all external appliances connected to the mobile terminal 100, an output unit (not illustrated) to output an audio signal, a video signal, or an alarm signal, and a power supply unit to receive an external power supply and an internal power supply and to provide the power supply that is required to operate the respective constituent elements.
  • Hereinafter, a method of providing a UI for an editing function of various data objects according to an exemplary embodiment of the present general inventive concept will be described in more detail.
  • FIGS. 2A to 2D are diagrams illustrating a method of providing a UI for a data object merge function according to an exemplary embodiment of the present general inventive concept.
  • Although not illustrated in the drawing, a user may enter into an editing mode by touching at least one icon over a preset time. Here, the preset time may be a range of time that includes an error range.
  • As illustrated in FIG. 2A, if it is intended to merge two data objects that are displayed in the form of icons on the screen of the display unit 110 (or user interface unit 120), a user simultaneously touches the corresponding data objects for a predetermined time and maintains the touch without releasing the touch.
  • Then, as illustrated in FIG. 2B, if a user narrows a space between the two data objects by dragging the two data objects, and then releases the touch, the merge of the two data objects is executed.
  • The merge function is an operation of combining data objects into a single new data object. Here, when the merge is executed, the two data objects are combined into a new single data object including the contents of the two data objects. For example, if the two data objects are images, execution of the merge will result in a new data object containing both images.
  • Also, if the user widens the space between the two data objects back to the state illustrated in FIG. 2A, the merge state is released. That is, if the user does not release the touch while the space between the two data objects is narrowed as illustrated in FIG. 2B, but instead widens the space between the two data objects back to the state illustrated in FIG. 2A and then releases the touch, the merge is not executed and the two data objects are not combined.
  • On the other hand, if it is intended to merge three or more data objects that are displayed in the form of icons on the display unit 110 as illustrated in FIG. 2C, the user makes a reference point by touching one data object over a predetermined time, and then maintains the touch without releasing the touch.
  • Then, if the user successively selects and drags other data objects to be merged to the reference point and then releases the touch as illustrated in FIG. 2D, the merge of the at least three data objects is executed.
  • Also, if the user returns the merged data objects to their original positions in the state as illustrated in FIG. 2C, the merge state of the data objects is released.
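  • The merge itself, in which the contents of the dragged data objects are combined with those of the reference data object into a single new data object, may be sketched as follows. The DataObject type and function names are illustrative assumptions; the disclosure does not specify an internal representation.

```python
# Illustrative sketch: merging data objects combines their contents into one
# new data object. The class and names are assumptions, not from the patent.

class DataObject:
    def __init__(self, name, contents):
        self.name = name
        self.contents = list(contents)

def merge(reference, *others):
    """Combine the reference object and the dragged objects into one object."""
    merged = DataObject(reference.name, reference.contents)
    for obj in others:
        merged.contents.extend(obj.contents)  # append each dragged object
    return merged
```

For example, merging two image objects in this sketch yields one data object holding both images, matching the behavior described for FIG. 2B.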
  • FIGS. 3A to 3D are diagrams illustrating a method of providing a UI for a data object division function according to an exemplary embodiment of the present general inventive concept.
  • As illustrated in FIG. 3A, the user touches two points of a data object to be divided and then maintains the touch state without releasing the touch.
  • Then, if the user widens the space between the two touch points in both directions as if tearing the data object into two pieces, as illustrated in FIG. 3B, the corresponding icon is divided into two pieces to achieve the division of the data object. If the user narrows the space between the two pieces back together in this state, the data object division function is released. That is, if the user releases the touch in the state illustrated in FIG. 3A, the data object division function is not executed.
  • The data object division function is an operation to divide a data object into two new data objects. When the data object division function is executed, the data object is divided into two new data objects, each of which includes part of the contents of the original data object.
  • In the case where the data object is a reproducible file such as a moving image, if the user releases the touch in the state illustrated in FIG. 3B, the corresponding moving image is converted into an image reproduction area as illustrated in FIG. 3C.
  • If the user performs a double click (or double touch) on a desired point of the image being reproduced as illustrated in FIG. 3C, the moving image is divided into two pieces at the corresponding point, and the mode is converted into a thumbnail viewing mode as illustrated in FIG. 3D to complete the creation of the divided data objects.
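  • Dividing a moving image at the double-clicked playback position may be sketched as follows. Modelling a clip as a (start, end) pair of times is an illustrative assumption, not part of the disclosure.

```python
# Illustrative sketch: dividing a moving image at playback position t yields
# two new clips that together cover the original span. A clip is modelled as
# a (start, end) pair of times in seconds; this representation is an assumption.

def divide_clip(clip, t):
    start, end = clip
    if not (start < t < end):
        raise ValueError("division point must fall inside the clip")
    return (start, t), (t, end)  # each piece holds part of the original
```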
  • FIGS. 4A to 4E are diagrams illustrating a method of providing a UI for a data object clip board function according to an exemplary embodiment of the present general inventive concept.
  • Although the above-described object merge function (see FIGS. 2A to 2D) and object division function (see FIGS. 3A to 3D) are performed in the thumbnail viewing state, the clip board function is performed during the reproduction of the file in the case where the corresponding data object is a reproducible file such as a moving image.
  • As illustrated in FIG. 4A, the user selects two points of the data object desired to be copied. For example, as illustrated in FIG. 4A, the user touches a point slightly to the left of “A” and a point slightly to the right of “G” to select the text “ABCDEFG”.
  • Then, as illustrated in FIG. 4B, the user drags the two selected points to a predetermined area to move the selected data object to the predetermined area. As illustrated in FIG. 4B, the user drags the selected text “ABCDEFG” to an area in the upper left portion of the display defined as a cutting area. A selection which is dragged to the cutting area may be stored and retrieved later. The display may also include a predefined delete area. For example, in FIG. 4B, the delete area is defined as an upper right portion of the display. A selection which is dragged to the delete area may be deleted. Although the cutting area and the delete area are illustrated in the upper left and upper right corners of the display in FIG. 4B, the cutting area and delete area may be located at various positions on the display.
  • As illustrated in FIG. 4C, an icon is created in the upper left portion of the display to represent the selected data object which was dragged to the cutting area. Here, the selected data object is the text “ABCDEFG” which was dragged to the cutting area. As illustrated in FIG. 4C, the icon is a triangle and is displayed in the same portion of the display as the cutting area. However, the selected data object may be represented in other forms such as a thumbnail image, text, and the like or may be displayed in various locations within the display.
  • Then, as illustrated in FIG. 4D, the user may touch and drag the icon created at the upper end on the left side to the corresponding data object or another data object to insert the selected data at a determined insertion position. For example, in FIG. 4D the text “abcdefghijklmnopqrstuvwxyz” is displayed and the user drags the icon to the point between “k” and “l”, and then releases the touch.
  • FIG. 4E illustrates the result of executing the data object insertion as shown in FIG. 4D. As illustrated in FIG. 4E the selected data object is inserted in the position where the user dragged the icon. Here, the text “ABCDEFG” was inserted between “k” and “l”, resulting in the display of “abcdefghijkABCDEFGlmnopqrstuvwxyz”. Additionally, the icon remains displayed in the upper left part of the display so that the user may insert the selected text into other parts of the displayed data object or into other data objects.
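  • The text manipulation underlying the FIGS. 4A to 4E example may be sketched as follows. The function names are illustrative assumptions; only the selection and insertion behavior comes from the description above.

```python
# Illustrative sketch reproducing the FIGS. 4A-4E example: two touch points
# select "ABCDEFG", the selection is held in the cutting area, and dragging
# its icon between "k" and "l" inserts it there. Names are assumptions.

def select_range(text, start, end):
    """Return the portion of the text between the two selected points."""
    return text[start:end]

def insert_at(text, position, clip):
    """Insert the clipped selection at the designated position."""
    return text[:position] + clip + text[position:]

clip = select_range("ABCDEFG", 0, 7)   # selection copied to the cutting area
target = "abcdefghijklmnopqrstuvwxyz"
result = insert_at(target, 11, clip)   # position 11 lies between "k" and "l"
# result == "abcdefghijkABCDEFGlmnopqrstuvwxyz"
```

Note that, as in FIG. 4E, the clip remains available after insertion and may be inserted again elsewhere.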
  • FIGS. 5A to 5F are diagrams illustrating a method of providing a UI for a data object clip board function according to another exemplary embodiment of the present general inventive concept.
  • With reference to FIGS. 5A to 5F, a case where the data object is a moving image will be described.
  • FIG. 5A is a display of a moving image. Here, the title of the moving image is displayed in an upper right portion of the display. As illustrated in FIG. 5A, the title of the moving image is “Movie 1”. The content of the moving image may be displayed in a central portion of the display and a control bar may be displayed in a lower portion of the display. A user may use the control bar to perform operations on the moving image such as play, pause, and stop. A user may also use the control bar to select a time within the length of the moving image. As illustrated in FIG. 5A, the user selects two desired points on the control bar to be edited during the reproduction of a moving image. For example, the two points represent a first time and a second time within the length of the moving image and the selected portion to be edited is the time between the first time and the second time.
  • Then, as illustrated in FIG. 5B, the user drags the two selected points to a predetermined area (for example, at an upper end on the left side) to move the selected points to the predetermined area. As illustrated in FIG. 5B, the user drags the selected portion of the moving image “Movie 1” to the cutting area defined as an upper left part of the display.
  • As illustrated in FIG. 5C, an icon is created in the upper left portion of the display to represent the selected portion of the moving image which was dragged to the cutting area. Here, the icon represents the selected portion of “Movie 1” which was dragged to the cutting area.
  • Thereafter, as illustrated in FIG. 5D, the position in which the area selected and copied in FIG. 5A is to be inserted is designated on the reproduction area of the corresponding data object or another data object, and, as illustrated in FIG. 5E, the icon created at the upper end on the left side is inserted in the designated insertion position by touching and dragging the icon.
  • For example, in FIG. 5D a new moving image titled “Movie 2” is displayed. As illustrated in FIG. 5D, the user selects a position on the control bar, and the time within the moving image corresponding to the selected position is displayed in a central portion of the display. In FIG. 5E, the user drags the icon representing the selected portion of the moving image “Movie 1” to the selected position on the control bar to insert the selected portion of the moving image “Movie 1” at the selected position of the moving image “Movie 2”.
  • However, the designation of the insertion position as illustrated in FIG. 5D may be unnecessary according to circumstances; it is also possible to insert the selected data simply by positioning it at the point of the reproduction area in which the data is to be inserted. For example, a user may choose not to select a position on the control bar as illustrated in FIG. 5D, and may instead drag the icon representing the selected portion of the moving image directly to a desired position on the control bar.
  • FIG. 5F illustrates the result of executing the data object insertion as shown in FIG. 5E. As illustrated in FIG. 5F, the selected portion of the moving image “Movie 1” has been inserted at the selected position of the moving image “Movie 2”.
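  • The cut-and-insert flow of FIGS. 5A to 5F may be sketched as follows. Modelling each moving image as a list of one-second frames, and all function and variable names, are illustrative assumptions, not part of the disclosure.

```python
# Illustrative sketch of the FIGS. 5A-5F flow: a portion of "Movie 1" between
# two selected times is cut to the clip board, then inserted at a selected
# position in "Movie 2". Movies are modelled as lists of per-second frames.

def cut_segment(movie, t1, t2):
    """Remove frames [t1, t2) from the movie; return (remaining, segment)."""
    return movie[:t1] + movie[t2:], movie[t1:t2]

def insert_segment(movie, t, segment):
    """Insert the clipped segment at time t of the target movie."""
    return movie[:t] + segment + movie[t:]

movie1 = ["m1-%d" % i for i in range(5)]
movie2 = ["m2-%d" % i for i in range(3)]
movie1, seg = cut_segment(movie1, 1, 3)   # seg == ["m1-1", "m1-2"]
movie2 = insert_segment(movie2, 1, seg)
# movie2 == ["m2-0", "m1-1", "m1-2", "m2-1", "m2-2"]
```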
  • On the other hand, in the above-described embodiment, it is exemplified that the cutting area is created in the area at the upper end on the left side of the display screen. However, this is merely exemplary, and the cutting area for the clip board function may be created in various positions of the display screen.
  • FIGS. 6A and 6B are diagrams illustrating a method of providing a UI for a data object clip board function according to still another exemplary embodiment of the present general inventive concept.
  • Referring to FIG. 6A, if it is intended to delete the data copied into the cutting area (at the upper end on the left side), the user may drag the corresponding icon in the cutting area to the deletion area (at the upper end on the right side) to delete the copied data. As illustrated in FIG. 6A, the user drags the triangle icon from the cutting area in the upper left part of the display to the deletion area in the upper right part of the display to delete any data that was previously copied to the cutting area.
  • Referring to FIG. 6B, the user may select a desired data area from the predetermined data object and drag the selected data area to the deletion area (at the upper end on the right side) to delete the corresponding data. For example, FIG. 6B illustrates that the text “ABCDEFG” is selected. The text may be selected by selecting a point slightly to the left of “A” and a point slightly to the right of “G”. As illustrated in FIG. 6B, the user drags the selected text “ABCDEFG” to the deletion area to delete the selected text.
  • FIG. 7 is a diagram illustrating a method of displaying an area for a clip board function according to an exemplary embodiment of the present general inventive concept. As illustrated in FIG. 7, icons 710, a thumbnail 720, and text data 730 are displayed in an upper left part of the display and indicate corresponding data objects.
  • As illustrated in FIG. 7, the clip board data stored in the storage unit may be displayed in the form of corresponding icons 710. As illustrated in FIG. 7, the corresponding icons 710 may include a circle, a triangle, and an “X” and may be located in an upper left part of the display. Different corresponding icons 710 may represent different data objects copied to the cutting area. For example, a first data object copied to the cutting area may be represented by a circle and a second data object copied to the cutting area may be represented by a triangle.
  • Also, the clip board data stored in the storage unit may be displayed in the form of a thumbnail 720 of the corresponding data object. As illustrated in FIG. 7, the thumbnail 720 may be located in an upper left part of the display.
  • Also, the clip board data stored in the storage unit may be displayed in the form of text data 730 that indicates the corresponding data object. As illustrated in FIG. 7, the text data 730 may be illustrated in an upper left part of the display.
  • FIG. 8 is a flowchart illustrating a method of providing a UI according to an exemplary embodiment of the present general inventive concept.
  • Referring to FIG. 8, according to the method of providing a UI of a display apparatus according to an exemplary embodiment of the present general inventive concept, at least one data object is displayed in the form of an icon at operation S810.
  • Then, a preset touch operation for editing the data object is received at operation S820.
  • Thereafter, if the preset touch operation is performed with respect to the icon in an editing mode, an editing function of the data object that corresponds to the preset touch operation is performed at operation S830.
  • Also, if at least one icon is touched over a preset time in a normal mode, the mode may be shifted to an editing mode.
  • Here, the editing function may include at least one of a data object merge function, a data object division function, and a clip board function.
  • Also, if it is intended to select and merge two data objects, the preset touch operation that corresponds to the data object merge function may be an operation of simultaneously touching two data objects, narrowing a space between the two data objects, and then releasing the touch.
  • Also, if it is intended to select and merge at least three data objects, the preset touch operation that corresponds to the data object merge function may be an operation of touching a reference data object, successively selecting and dragging areas in which icons that correspond to remaining data objects to be merged are displayed to the reference data object, and then releasing the touch.
  • Also, if it is intended to divide at least one data object, the preset touch operation that corresponds to the data object division function may be an operation of selecting and touching at least two points of a data object, widening a space between the at least two selected points, and then releasing the touch.
  • Also, a clip board area of the data object may be created in a predetermined area of the display screen, and the data object may be displayed in the form of an icon on the created area in accordance with the preset touch operation to perform the clip board function.
  • Here, the clip board area may include at least one of a data copy area and a data deletion area, and the data object to be copied or deleted may be selected and dragged to the copy area or the deletion area to copy or delete the corresponding data object.
  • Also, the paste function of the data object may be performed by touching and dragging the data object copied into the copy area of the data object to a desired position.
  • Also, the partial cut function of the data object may be performed by selecting an area to be cut from at least one data object and dragging the selected area to the deletion area of the data object.
  • The present general inventive concept can also be embodied as computer-readable codes on a computer-readable medium. The computer-readable medium can include a computer-readable recording medium and a computer-readable transmission medium. The computer-readable recording medium is any data storage device that can store data as a program which can be thereafter read by a computer system. Examples of the computer-readable recording medium include read-only memory (ROM), random-access memory (RAM), CD-ROMs, magnetic tapes, floppy disks, and optical data storage devices. The computer-readable recording medium can also be distributed over network coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion. The computer-readable transmission medium can be transmitted through carrier waves or signals (e.g., wired or wireless data transmission through the Internet). Also, functional programs, codes, and code segments to accomplish the present general inventive concept can be easily construed by programmers skilled in the art to which the present general inventive concept pertains.
  • The present general inventive concept may also be implemented using at least one of ASICs (Application Specific Integrated Circuits), DSPs (Digital Signal Processors), DSPDs (Digital Signal Processing Devices), PLDs (Programmable Logic Devices), FPGAs (Field Programmable Gate Arrays), processors, controllers, micro-controllers, microprocessors, and electric units to perform functions. In some cases, such exemplary embodiments may be implemented by the control unit.
  • According to software implementation, the exemplary embodiments such as procedures or functions may be implemented together with separate software modules that perform at least one function or operation. Software codes may be implemented by software applications written in appropriate program languages. Also, software codes may be stored in a memory and executed by the control unit.
  • As described above, according to the present general inventive concept, the data editing process is shortened, and thus editing can be performed effectively.
  • Specifically, the gestures of gathering files in the case of merging the files and tearing a file in the case of dividing the file are intuitive, and thus the user can easily and quickly learn the editing method.
  • Also, because the clip board function creates and maintains cut files in the form of icons in an upper corner portion of the screen that is not generally used, it is easy to confirm whether any cut file exists and to insert the cut file into another file without implementing a multi-window.
  • Although a few embodiments of the present general inventive concept have been shown and described, it will be appreciated by those skilled in the art that changes may be made in these embodiments without departing from the principles and spirit of the general inventive concept, the scope of which is defined in the appended claims and their equivalents.
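The merge and division gestures described above can be sketched as a simple classifier over two touch points: if the distance between the points shrinks past a threshold, the gesture is treated as a merge (gathering); if it grows, as a division (tearing). This is a minimal illustrative sketch, not part of the disclosed apparatus; the function name and threshold value are assumptions.

```python
import math


def classify_gesture(start_points, end_points, threshold=40.0):
    """Classify a two-point touch gesture as 'merge' (pinch) or
    'divide' (spread), based on how the distance between the two
    touch points changes between touch-down and touch-release.

    start_points / end_points: two (x, y) tuples each.
    threshold: minimum change in distance (in pixels) to register.
    """
    def distance(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    delta = distance(*end_points) - distance(*start_points)
    if delta <= -threshold:
        return "merge"    # points moved together: gather/pinch
    if delta >= threshold:
        return "divide"   # points moved apart: tear/spread
    return None           # movement too small to classify
```

On release, the control unit would invoke the editing unit's merge or division function for the data object(s) under the classified touch points.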

Claims (33)

1. A display apparatus comprising:
a display unit to display a data object in the form of a corresponding icon;
a user interface unit to receive a preset touch operation corresponding to an editing function to edit the data object;
an editing unit to perform the editing function to edit the data object; and
a control unit to control the editing unit to perform the editing function to edit the data object that corresponds to the preset touch operation when the preset touch operation for the icon is performed in an editing mode.
2. The display apparatus as claimed in claim 1, wherein the editing function includes at least one of a data object merge function, a data object division function, and a clip board function.
3. The display apparatus as claimed in claim 2, wherein the preset touch operation that corresponds to the data object division function is an operation of selecting and touching at least two points of the data object, widening a space between the at least two selected points, and then releasing the touch if it is intended to divide the data object.
4. The display apparatus as claimed in claim 3, wherein the control unit operates to convert the editing mode into a reproduction mode to separate a moving image in accordance with the preset touch operation that corresponds to the data object division function in the case where the data object is a moving image.
5. The display apparatus as claimed in claim 2, wherein the display unit displays at least two data objects and the preset touch operation that corresponds to the data object merge function is an operation of simultaneously touching two data objects, narrowing a space between the two data objects, and then releasing the touch if it is intended to select and merge the two data objects.
6. The display apparatus as claimed in claim 2, wherein the display unit displays at least three data objects and the preset touch operation that corresponds to the data object merge function is an operation of touching a first data object, successively selecting and dragging areas in which icons that correspond to at least two other data objects to be merged are displayed to the first data object, and then releasing the touch of the first data object if it is intended to select and merge the at least three data objects.
7. The display apparatus as claimed in claim 3, wherein the control unit operates to create a clip board area of the data object in a predetermined area of a screen of the display unit and to display the data object in the form of an icon on the created area in accordance with the preset touch operation to perform the clip board function.
8. The display apparatus as claimed in claim 7, wherein the clip board area includes at least one of a copy area and a deletion area, and
the control unit operates to copy or delete the data object by selecting and dragging the data object to the copy area or the deletion area, respectively.
9. The display apparatus as claimed in claim 8, wherein the control unit operates to perform a paste function of the data object by touching and dragging the data object copied onto the copy area to a desired position.
10. The display apparatus as claimed in claim 8, wherein the control unit operates to perform a partial cut function of the data object by selecting an area to be cut from the data object and dragging the selected area to the deletion area.
11. A method of providing a UI of a display apparatus, the method comprising:
displaying a data object in the form of an icon;
receiving a preset touch operation corresponding to an editing function to edit the data object; and
performing the editing function that corresponds to the preset touch operation when the preset touch operation for the icon is performed in an editing mode.
12. The method of providing the UI as claimed in claim 11, wherein the editing function includes at least one of a data object merge function, a data object division function, and a clip board function.
13. The method of providing the UI as claimed in claim 12, wherein the preset touch operation that corresponds to the data object division function is an operation of selecting and touching at least two points of the data object, widening a space between the at least two selected points, and then releasing the touch if it is intended to divide the data object.
14. The method of providing the UI as claimed in claim 13, further comprising converting the editing mode into a reproduction mode to separate a moving image in accordance with the preset touch operation that corresponds to the data object division function in the case where the data object is a moving image.
15. The method of providing the UI as claimed in claim 12, wherein the displaying the data object comprises displaying at least three data objects and the preset touch operation that corresponds to the data object merge function is an operation of touching a first data object, successively selecting and dragging areas in which icons that correspond to at least two other data objects to be merged are displayed to the first data object, and then releasing the touch of the first data object if it is intended to select and merge the at least three data objects.
16. The method of providing the UI as claimed in claim 12, wherein the displaying the data object comprises displaying at least two data objects and the preset touch operation that corresponds to the data object merge function is an operation of simultaneously touching two data objects, narrowing a space between the two data objects, and then releasing the touch if it is intended to select and merge the two data objects.
17. The method of providing the UI as claimed in claim 13, further comprising creating a clip board area of the data object in a predetermined area of a screen of the display unit and displaying the data object in the form of an icon on the created area in accordance with the preset touch operation to perform the clip board function.
18. The method of providing the UI as claimed in claim 17, wherein the clip board area includes at least one of a copy area and a deletion area, and
the method of providing the UI further comprises copying or deleting the data object by selecting and dragging the data object to the copy area or the deletion area, respectively.
19. The method of providing the UI as claimed in claim 18, further comprising performing a paste function of the data object by touching and dragging the data object copied onto the copy area to a desired position.
20. The method of providing the UI as claimed in claim 18, further comprising performing a partial cut function of the data object by selecting an area to be cut from the data object and dragging the selected area to the deletion area.
21. A display apparatus comprising:
a display unit to display at least one data object in the form of a corresponding icon;
a user interface unit to receive a plurality of preset touch operations corresponding to respective editing functions to edit the data object, wherein at least one of the plurality of preset touch operations comprises touching at least two points on the display unit simultaneously;
a control unit to determine which one of the plurality of preset touch operations is received by the user interface unit; and
an editing unit to perform the respective editing function corresponding to the determined preset touch operation.
22. The display apparatus as claimed in claim 21, wherein the plurality of preset touch operations corresponding to respective editing functions comprises at least one of a preset touch operation corresponding to a data object merge function, a preset touch operation corresponding to a data object division function, and a preset touch operation corresponding to a copy function.
23. The display apparatus as claimed in claim 22, wherein the at least one data object comprises a first data object and a second data object and the preset touch operation corresponding to the data object merge function comprises touching the first data object and the second data object simultaneously, sliding the first data object and the second data object into each other, and then releasing the touches on the first data object and the second data object.
24. The display apparatus as claimed in claim 22, wherein the at least one data object comprises a plurality of data objects and the preset touch operation corresponding to the data object merge function comprises touching a first data object of the plurality of data objects, and while touching the first data object, successively touching, sliding into the first data object, and releasing the touch on each of the plurality of data objects to be merged with the first data object, and then releasing the touch on the first data object.
25. The display apparatus as claimed in claim 22, wherein the preset touch operation corresponding to the data object division function comprises simultaneously touching two points of one of the at least one data object, sliding the two touched points away from each other, and then releasing the touch.
26. The display apparatus as claimed in claim 22, wherein the display unit displays the data corresponding to one of the at least one data object and the preset touch operation corresponding to the copy function comprises simultaneously touching two points within the data and releasing the touches on the two points to select an area of the data, and then touching the selected area of the data, sliding the selected area of the data to a predetermined copy area, and releasing the touch on the selected area.
27. A method of providing a UI of a display apparatus, the method comprising:
displaying at least one data object in the form of a corresponding icon;
receiving one of a plurality of preset touch operations corresponding to respective editing functions to edit the data object, wherein at least one of the plurality of preset touch operations comprises touching at least two points on the display unit simultaneously;
determining which one of the plurality of preset touch operations is received; and
performing the respective editing function corresponding to the determined preset touch operation.
28. The method of providing the UI as claimed in claim 27, wherein the plurality of preset touch operations corresponding to respective editing functions comprises at least one of a preset touch operation corresponding to a data object merge function, a preset touch operation corresponding to a data object division function, and a preset touch operation corresponding to a copy function.
29. The method of providing the UI as claimed in claim 28, wherein the at least one data object comprises a first data object and a second data object and the preset touch operation corresponding to the data object merge function comprises touching the first data object and the second data object simultaneously, sliding the first data object and the second data object into each other, and then releasing the touches on the first data object and the second data object.
30. The method of providing the UI as claimed in claim 28, wherein the at least one data object comprises a plurality of data objects and the preset touch operation corresponding to the data object merge function comprises touching a first data object of the plurality of data objects, and while touching the first data object, successively touching, sliding into the first data object, and releasing the touch on each of the plurality of data objects to be merged with the first data object, and then releasing the touch on the first data object.
31. The method of providing the UI as claimed in claim 28, wherein the preset touch operation corresponding to the data object division function comprises simultaneously touching two points of one of the at least one data object, sliding the two touched points away from each other, and then releasing the touch.
32. The method of providing the UI as claimed in claim 28, wherein the displaying further comprises displaying the data corresponding to one of the at least one data object and the preset touch operation corresponding to the copy function comprises simultaneously touching two points within the data and releasing the touches on the two points to select an area of the data, and then touching the selected area of the data, sliding the selected area of the data to a predetermined copy area, and releasing the touch on the selected area.
33. A display apparatus comprising:
a display unit to display one or more data objects; and
a control unit to select one of the displayed data objects and to separate the selected data object into at least two data objects that each include a portion of the selected data object according to a first touch operation on the selected data object, and to select at least two of the displayed data objects and to merge the at least two selected data objects into one data object according to a second touch operation on the at least two selected data objects.
US13/242,896 2010-12-06 2011-09-23 Display apparatus and method of providing user interface thereof Abandoned US20120144293A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020100123500A KR20120062297A (en) 2010-12-06 2010-12-06 Display apparatus and user interface providing method thereof
KR2010-0123500 2010-12-06

Publications (1)

Publication Number Publication Date
US20120144293A1 (en) 2012-06-07

Family

ID=46163438

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/242,896 Abandoned US20120144293A1 (en) 2010-12-06 2011-09-23 Display apparatus and method of providing user interface thereof

Country Status (3)

Country Link
US (1) US20120144293A1 (en)
KR (1) KR20120062297A (en)
CN (1) CN102591562A (en)


Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5977132B2 (en) * 2012-09-28 2016-08-24 富士ゼロックス株式会社 Display control device, image display device, and program
KR102042461B1 (en) * 2012-10-31 2019-11-08 엘지전자 주식회사 Mobile terminal and method for controlling of the same
CN103970460A (en) * 2013-01-30 2014-08-06 三星电子(中国)研发中心 Touch screen-based operation method and terminal equipment using same
CN103218116A (en) * 2013-03-12 2013-07-24 广东欧珀移动通信有限公司 Implementation method and system for simultaneously editing multiple desktop elements
JP5979168B2 (en) * 2014-03-11 2016-08-24 コニカミノルタ株式会社 Screen display device, screen display system, screen display method, and computer program
CN107844226B (en) * 2016-09-19 2021-06-04 珠海金山办公软件有限公司 Method and device for switching text contents between different interfaces
KR102644092B1 (en) * 2017-12-29 2024-03-06 주식회사 피제이팩토리 Method for generating multi-depth image
CN111666025A (en) * 2020-05-29 2020-09-15 维沃移动通信(杭州)有限公司 Image selection method and device and electronic equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030214536A1 (en) * 2002-05-14 2003-11-20 Microsoft Corporation Lasso select
US20100058182A1 (en) * 2008-09-02 2010-03-04 Lg Electronics Inc. Mobile terminal and method of combining contents
US20100095234A1 (en) * 2008-10-07 2010-04-15 Research In Motion Limited Multi-touch motion simulation using a non-touch screen computer input device
US20110072344A1 (en) * 2009-09-23 2011-03-24 Microsoft Corporation Computing system with visual clipboard
US20110191719A1 (en) * 2010-02-04 2011-08-04 Microsoft Corporation Cut, Punch-Out, and Rip Gestures
US20120137231A1 (en) * 2010-11-30 2012-05-31 Verizon Patent And Licensing, Inc. User interfaces for facilitating merging and splitting of communication sessions


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Forum.MacRumors.com, "How to split clips in iMovie 1.1 for iPhone", Sep 8, 2010, pp 1-2 http://forums.macrumors.com/showthread.php?t=1010733 *

Cited By (55)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8539353B2 (en) * 2010-03-30 2013-09-17 Cisco Technology, Inc. Tabs for managing content
US20110246929A1 (en) * 2010-03-30 2011-10-06 Michael Jones Tabs for managing content
US10969908B2 (en) 2011-04-26 2021-04-06 Sentons Inc. Using multiple signals to detect touch input
US10444909B2 (en) 2011-04-26 2019-10-15 Sentons Inc. Using multiple signals to detect touch input
US10877581B2 (en) 2011-04-26 2020-12-29 Sentons Inc. Detecting touch input force
US10386968B2 (en) 2011-04-26 2019-08-20 Sentons Inc. Method and apparatus for active ultrasonic touch devices
US10198097B2 (en) 2011-04-26 2019-02-05 Sentons Inc. Detecting touch input force
US11327599B2 (en) 2011-04-26 2022-05-10 Sentons Inc. Identifying a contact type
US11907464B2 (en) 2011-04-26 2024-02-20 Sentons Inc. Identifying a contact type
US10235004B1 (en) 2011-11-18 2019-03-19 Sentons Inc. Touch input detector with an integrated antenna
US10248262B2 (en) * 2011-11-18 2019-04-02 Sentons Inc. User interface interaction using touch input force
US11016607B2 (en) 2011-11-18 2021-05-25 Sentons Inc. Controlling audio volume using touch input force
US10732755B2 (en) 2011-11-18 2020-08-04 Sentons Inc. Controlling audio volume using touch input force
US11829555B2 (en) 2011-11-18 2023-11-28 Sentons Inc. Controlling audio volume using touch input force
US10698528B2 (en) 2011-11-18 2020-06-30 Sentons Inc. Localized haptic feedback
US10353509B2 (en) 2011-11-18 2019-07-16 Sentons Inc. Controlling audio volume using touch input force
US11209931B2 (en) 2011-11-18 2021-12-28 Sentons Inc. Localized haptic feedback
US20130203468A1 (en) * 2012-02-07 2013-08-08 Research In Motion Limited Methods and devices for merging contact records
US10790046B2 (en) * 2012-02-24 2020-09-29 Perkinelmer Informatics, Inc. Systems, methods, and apparatus for drawing and editing chemical structures on a user interface via user gestures
US20130332872A1 (en) * 2012-06-11 2013-12-12 EVERYTHINK Ltd. System and method for drag hover drop functionality
US10209825B2 (en) 2012-07-18 2019-02-19 Sentons Inc. Detection of type of object used to provide a touch contact input
US10466836B2 (en) 2012-07-18 2019-11-05 Sentons Inc. Using a type of object to provide a touch contact input
US10860132B2 (en) 2012-07-18 2020-12-08 Sentons Inc. Identifying a contact type
US10067516B2 (en) 2013-01-22 2018-09-04 Opower, Inc. Method and system to control thermostat using biofeedback
US20140215364A1 (en) * 2013-01-30 2014-07-31 Samsung Electronics Co., Ltd. Method and electronic device for configuring screen
EP2763131A1 (en) * 2013-01-30 2014-08-06 Samsung Electronics Co., Ltd Method and electronic device for configuring screen
US20140245203A1 (en) * 2013-02-26 2014-08-28 Samsung Electronics Co., Ltd. Portable device and method for operating multi-application thereof
US10180767B2 (en) * 2013-02-26 2019-01-15 Samsung Electronics Co., Ltd. Portable device and method facilitating execution of multiple applications simultaneously
US9946451B2 (en) 2013-03-12 2018-04-17 Lg Electronics Inc. Terminal and method of operating the same
US10824313B2 (en) 2013-04-04 2020-11-03 P.J. Factory Co., Ltd. Method and device for creating and editing object-inserted images
US10061493B2 (en) 2013-04-04 2018-08-28 Jung Hwan Park Method and device for creating and editing object-inserted images
EP2983078A4 (en) * 2013-04-04 2016-12-21 Jung Hwan Park Method and apparatus for creating and editing image into which object is inserted
CN104461480A (en) * 2013-09-16 2015-03-25 联想(北京)有限公司 Information processing method and electronic equipment
US10386966B2 (en) 2013-09-20 2019-08-20 Sentons Inc. Using spectral control in detecting touch input
US20150277687A1 (en) * 2014-03-28 2015-10-01 An-Sheng JHANG System and method for manipulating and presenting information
USD760260S1 (en) * 2014-06-27 2016-06-28 Opower, Inc. Display screen of a communications terminal with graphical user interface
US11847292B2 (en) * 2014-09-02 2023-12-19 Samsung Electronics Co., Ltd. Method of processing content and electronic device thereof
US20240118781A1 (en) * 2014-09-02 2024-04-11 Samsung Electronics Co., Ltd. Method of processing content and electronic device thereof
US20160062557A1 (en) * 2014-09-02 2016-03-03 Samsung Electronics Co., Ltd. Method of processing content and electronic device thereof
EP2993567A1 (en) * 2014-09-02 2016-03-09 Samsung Electronics Co., Ltd. Method of processing content and electronic device thereof
CN106662969A (en) * 2014-09-02 2017-05-10 三星电子株式会社 Method of processing content and electronic device thereof
US20180024736A1 (en) * 2016-07-22 2018-01-25 Asustek Computer Inc. Electronic device and touch panel
US10908741B2 (en) 2016-11-10 2021-02-02 Sentons Inc. Touch input detection along device sidewall
US10509515B2 (en) 2016-12-12 2019-12-17 Sentons Inc. Touch input detection with shared receivers
US10296144B2 (en) 2016-12-12 2019-05-21 Sentons Inc. Touch input detection with shared receivers
US10444905B2 (en) 2017-02-01 2019-10-15 Sentons Inc. Update of reference data for touch input detection
US10585522B2 (en) 2017-02-27 2020-03-10 Sentons Inc. Detection of non-touch inputs using a signature
US11061510B2 (en) 2017-02-27 2021-07-13 Sentons Inc. Detection of non-touch inputs using a signature
US11009411B2 (en) 2017-08-14 2021-05-18 Sentons Inc. Increasing sensitivity of a sensor using an encoded signal
US11435242B2 (en) 2017-08-14 2022-09-06 Sentons Inc. Increasing sensitivity of a sensor using an encoded signal
US11340124B2 (en) 2017-08-14 2022-05-24 Sentons Inc. Piezoresistive sensor for detecting a physical disturbance
US11262253B2 (en) 2017-08-14 2022-03-01 Sentons Inc. Touch input detection using a piezoresistive sensor
US11580829B2 (en) 2017-08-14 2023-02-14 Sentons Inc. Dynamic feedback for haptics
US11768597B2 (en) * 2019-02-14 2023-09-26 Naver Corporation Method and system for editing video on basis of context obtained using artificial intelligence
CN110781480A (en) * 2019-10-17 2020-02-11 珠海格力电器股份有限公司 Information input method, device and storage medium

Also Published As

Publication number Publication date
KR20120062297A (en) 2012-06-14
CN102591562A (en) 2012-07-18

Similar Documents

Publication Publication Date Title
US20120144293A1 (en) Display apparatus and method of providing user interface thereof
KR102430623B1 (en) Application command control for small screen display
KR101733839B1 (en) Managing workspaces in a user interface
US10248305B2 (en) Manipulating documents in touch screen file management applications
US10684769B2 (en) Inset dynamic content preview pane
RU2693909C2 (en) Command user interface for displaying and scaling selected control elements and commands
KR101899819B1 (en) Mobile terminal and method for controlling thereof
US7880728B2 (en) Application switching via a touch screen interface
KR20150070282A (en) Thumbnail and document map based navigation in a document
US10248439B2 (en) Format object task pane
US11256388B2 (en) Merged experience of reading and editing with seamless transition
US20150052465A1 (en) Feedback for Lasso Selection
JP2015506522A (en) How to navigate between content items in a browser using array mode
TW201504922A (en) Managing workspaces in a user interface
KR20130064458A (en) Display apparatus for displaying screen divided by a plurallity of area and method thereof
JP5229750B2 (en) Information processing apparatus, information processing method, and program thereof
KR20160138573A (en) Sliding surface
US20070045961A1 (en) Method and system providing for navigation of a multi-resource user interface
US20140354554A1 (en) Touch Optimized UI
US9529509B1 (en) Item selection
JP2012048311A (en) Information processor, information processing method and program
US10459612B2 (en) Select and move hint
US10162492B2 (en) Tap-to-open link selection areas
JP5621866B2 (en) Information processing apparatus, information processing method, and program thereof

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KIM, MIN-SOO;REEL/FRAME:026960/0579

Effective date: 20110825

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION