US20190042070A1 - Method for editing display information and electronic device thereof - Google Patents

Method for editing display information and electronic device thereof

Info

Publication number
US20190042070A1
Authority
US
United States
Prior art keywords
electronic device
touch
edit
information
cut
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/155,499
Inventor
Dong-Hyun YEOM
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority to US16/155,499
Publication of US20190042070A1
Legal status: Abandoned

Classifications

    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F1/1626 Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G06F1/1643 Details related to the display arrangement, including those related to the mounting of the display in the housing, the display being associated to a digitizer, e.g. laptops that can be used as penpads
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F2203/04807 Pen manipulated menu
    • G06F2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously, e.g. using several fingers or a combination of fingers and pen
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/0486 Drag-and-drop

Definitions

  • the present disclosure relates to an electronic device. More particularly, the present disclosure relates to an apparatus and a method for editing display information in an electronic device.
  • With advances in communication technology, multimedia services used by portable electronic devices have increased in popularity. Accordingly, the amount of information to be processed and displayed by a portable electronic device has increased. At the same time, portable electronic devices are now typically provided with a touchscreen, which can increase the size of the display unit by improving space-utilization efficiency.
  • a touchscreen is an Input/Output (I/O) unit for performing input and output of information using one screen.
  • the portable electronic device may increase a display area by removing a separate input unit such as a keypad.
  • a manipulation method of the electronic device is different from that of an electronic device having a separate input unit.
  • although the electronic device may display more information via an expanded screen of the touchscreen, it may require a separate user interface corresponding to the touchscreen.
  • an aspect of the present disclosure is to provide an apparatus and a method for editing display information in an electronic device having a touchscreen.
  • Another aspect of the present disclosure is to provide an apparatus and a method for editing display information according to a sponge edit method in an electronic device having a touchscreen.
  • Another aspect of the present disclosure is to provide an apparatus and a method for editing display information using a touch pen in an electronic device having a touchscreen.
  • Another aspect of the present disclosure is to provide an apparatus and a method for editing display information by considering touch information by a touch pen and button input information in an electronic device having a touchscreen.
  • Another aspect of the present disclosure is to provide an apparatus and a method for editing an icon by considering touch information of a touch pen and button input information in an electronic device having a touchscreen.
  • Another aspect of the present disclosure is to provide an apparatus and a method for editing text by considering touch information of a touch pen and button input information in an electronic device having a touchscreen.
  • Another aspect of the present disclosure is to provide an apparatus and a method for editing a reproduction list by considering touch information of a touch pen and button input information in an electronic device having a touchscreen.
  • a method for editing an object displayed on a display unit in an electronic device includes, when a first edit event occurs with a first touch maintained, storing object information of a first touch point, and when detecting a second touch, displaying an object of the first touch point on a second touch point.
  • a method for editing an object displayed on a display unit in an electronic device includes determining an object group including at least one object among a plurality of objects displayed on a display unit, when a first edit event occurs with a first touch of at least one of the objects included in the object group maintained, storing information of the at least one object included in the object group, and when detecting a second touch, displaying the at least one object included in the object group on a second touch point.
  • a method for editing an object displayed on a display unit in an electronic device includes, when a first edit event occurs with a first touch maintained, determining whether an object has been touched via the first touch, and when the object has been touched via the first touch, cutting the object of the first touch point.
  • a method for operating an electronic device includes, when detecting a button input of a touch pen with a first touch maintained by the touch pen, cutting at least one object on a first touch point according to a sponge edit method, displaying cutting information of the at least one object on a display unit, and when detecting a button input of the touch pen with a second touch maintained by the touch pen, pasting the cut at least one object on a second touch point according to the sponge edit method.
  • an electronic device includes a touchscreen, at least one processor, and a memory, wherein when a first edit event occurs with a first touch maintained, the processor may store information of an object of a first touch point in the memory, and when detecting a second touch, display the object of the first touch point on a second touch point of the touchscreen.
  • an electronic device includes a touchscreen, at least one processor, and a memory, wherein the processor may determine an object group including at least one of a plurality of objects displayed on the touchscreen, when a first edit event occurs with a first touch of one of the objects included in the object group maintained, store information of the at least one object included in the object group in the memory, when detecting a second touch, display at least one object included in the object group on a second touch point of the touchscreen.
  • an electronic device includes a touchscreen, at least one processor, and a memory, wherein when a first edit event occurs with a first touch maintained, the processor may determine whether an object is touched via the first touch, and when the object is touched via the first touch, cut the object of a first touch point.
  • FIG. 1 is a block diagram illustrating an electronic device according to an embodiment of the present disclosure
  • FIG. 2 is a block diagram illustrating a processor according to an embodiment of the present disclosure
  • FIG. 3 is a flowchart illustrating a procedure for editing display information in an electronic device according to an embodiment of the present disclosure
  • FIG. 4 is a flowchart illustrating a procedure for editing display information using a sponge edit method in an electronic device according to an embodiment of the present disclosure
  • FIG. 5 is a flowchart illustrating a procedure for editing display information using a sponge edit method in an electronic device according to an embodiment of the present disclosure
  • FIG. 6 is a flowchart illustrating a procedure for editing display information using a sponge edit method in an electronic device according to an embodiment of the present disclosure
  • FIG. 7 is a flowchart illustrating a procedure for editing a plurality of display information using a sponge edit method in an electronic device according to an embodiment of the present disclosure
  • FIG. 8 is a flowchart illustrating a procedure for editing a plurality of display information using a sponge edit method in an electronic device according to an embodiment of the present disclosure
  • FIG. 9 is a flowchart illustrating a procedure for selectively editing display information using a sponge edit method in an electronic device according to an embodiment of the present disclosure
  • FIG. 10 is a flowchart illustrating a procedure for selectively editing display information using a sponge edit method in an electronic device according to an embodiment of the present disclosure
  • FIGS. 11A, 11B, 11C, 11D, 11E, 11F, and 11G are views illustrating a screen configuration for editing an icon using a touch pen in an electronic device according to an embodiment of the present disclosure
  • FIGS. 12A, 12B, 12C, 12D, and 12E are views illustrating a screen configuration for editing an icon using a hand touch in an electronic device according to an embodiment of the present disclosure
  • FIGS. 13A, 13B, 13C, 13D, 13E, 13F, and 13G are views illustrating a screen configuration for editing text using a touch pen in an electronic device according to an embodiment of the present disclosure
  • FIGS. 14A, 14B, 14C, 14D, and 14E are views illustrating a screen configuration for editing text using a hand touch in an electronic device according to an embodiment of the present disclosure
  • FIGS. 15A, 15B, 15C, 15D, 15E, 15F, and 15G are views illustrating a screen configuration for editing a reproduction list using a touch pen in an electronic device according to an embodiment of the present disclosure
  • FIGS. 16A, 16B, 16C, 16D, and 16E are views illustrating a screen configuration for editing a reproduction list using a hand touch in an electronic device according to an embodiment of the present disclosure
  • FIGS. 17A, 17B, 17C, 17D, 17E, 17F, 17G, and 17H are views illustrating a screen configuration for editing a plurality of icons using a touch pen in an electronic device according to an embodiment of the present disclosure
  • FIGS. 18A, 18B, 18C, 18D, 18E, and 18F are views illustrating a screen configuration for editing a plurality of icons using a hand touch in an electronic device according to an embodiment of the present disclosure.
  • FIGS. 19A, 19B, and 19C are views illustrating a screen configuration for displaying a plurality of edit lists in an electronic device according to an embodiment of the present disclosure.
  • the display information is information displayed on a display unit to provide a service in an electronic device and may include an object forming a screen displayed on the display unit for an interface between the electronic device and a user using graphics.
  • the term “electronic device” denotes any of a plurality of devices and may be a device such as a portable electronic device, a portable terminal, a mobile station, a Personal Digital Assistant (PDA), a laptop computer, a smart phone, a net-book, a television (TV), a Mobile Internet Device (MID), an Ultra Mobile Personal Computer (UMPC), a tablet Personal Computer (PC), a desktop computer, a smart TV, a digital camera, a wrist watch, a navigation device, an MP3 player, etc.
  • the electronic device may be an arbitrary wireless device that combines the functions of two or more devices among the above devices.
  • the electronic device may edit an object according to a sponge edit method.
  • the electronic device may perform editing such as copy, move, cut, paste, delete, etc. on an object using the sponge edit method.
  • the electronic device may provide editing such as move and copy via the sponge edit method in the same manner.
  • the sponge edit method may denote an edit method in which an object touched by a touch pen or a finger is absorbed into the touch pen or finger, similar to a sponge or pipette absorbing liquid, and the previously absorbed object is then emitted and displayed at a point touched by the touch pen or finger, similar to the sponge or pipette emitting liquid.
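  • As an illustration only (not part of the disclosure), the sponge edit method can be modeled as a small state holder: the first edit event absorbs the touched object, and the next one emits it at the new touch point. All names below (SpongeEditor, Item) are hypothetical.

```kotlin
// Minimal sketch of the sponge edit idea: absorb on cut, emit on paste.
data class Item(val id: String, var x: Int, var y: Int)

class SpongeEditor {
    private var absorbed: Item? = null   // object currently held "inside" the pen/finger

    fun absorb(item: Item) {
        absorbed = item          // cut: the object is "soaked up" by the pen or finger
        // an absorb animation and sound would be triggered here
    }

    fun emit(x: Int, y: Int): Item? {
        val item = absorbed ?: return null
        item.x = x; item.y = y   // paste: the object is "released" at the touch point
        absorbed = null
        // an emit animation and sound would be triggered here
        return item
    }
}
```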
  • FIG. 1 is a block diagram illustrating an electronic device according to an embodiment of the present disclosure.
  • the electronic device 100 may include a memory 110 , a processor unit 120 , an audio processor 130 , an Input/Output (I/O) controller 140 , a touchscreen 150 , and an input unit 160 .
  • a plurality of memories 110 may exist.
  • the memory 110 may include a program storage 111 for storing a program for controlling an operation of the electronic device 100 , and a data storage 112 for storing data occurring during execution of a program.
  • the data storage 112 may store an object cut by an object edit program 114 . In the case where a delete event occurs, the data storage 112 may delete the object information stored by the cut operation.
  • the program storage 111 may include a Graphic User Interface (GUI) program 113 , the object edit program 114 , and at least one application 115 .
  • a program included in the program storage 111 is a set of instructions and may be expressed as an instruction set.
  • the GUI program 113 may include at least one software element for providing a user interface using graphics on a display unit 152 .
  • the GUI program 113 may control to display application information driven by a processor 122 on the display unit 152 .
  • the GUI program 113 may control to represent a sponge effect of an object edit by the object edit program 114 .
  • the GUI program 113 may control to represent an effect illustrating the object being absorbed by the touch pen or the finger similar to a sponge or pipette absorbing liquid.
  • the GUI program 113 may control to represent an effect of displaying a previously absorbed object on a point touched by the touch pen or the finger like the sponge emitting liquid.
  • the object edit program 114 may include at least one software element for editing an object by considering touch information detected via a touch input unit 154 and edit event occurrence information. For example, in the case where a cut event occurs while a first touch on an object is maintained, the object edit program 114 may control to cut the object of the first touch point. After that, in the case where a second touch is detected, the object edit program 114 may control to paste the cut object on the second touch point. At this point, in the case where a paste event occurs with the second touch maintained, the object edit program 114 may control to paste the cut object on the second touch point.
  • the object edit program 114 may detect cut event occurrence and paste event occurrence using at least one input method among a button input of the touch pen, a hardware button input of the electronic device, an icon selection, motion detection of the electronic device, user gesture detection, etc. Also, a cut event and a paste event may be generated by the same input method or by different input methods.
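  • Since a cut or paste event may originate from any of these input methods, one way to model this (a sketch only, with hypothetical names) is a dispatcher that normalizes every raw input path into a single edit-event callback:

```kotlin
// Illustrative sketch: heterogeneous inputs funnel into one edit-event callback.
enum class EditEventSource { PEN_BUTTON, HARDWARE_BUTTON, ICON, DEVICE_MOTION, USER_GESTURE }

fun interface EditEventListener { fun onEditEvent(source: EditEventSource) }

class EditEventDispatcher(private val listener: EditEventListener) {
    fun onPenButtonPressed()  = listener.onEditEvent(EditEventSource.PEN_BUTTON)
    fun onHardwareButton()    = listener.onEditEvent(EditEventSource.HARDWARE_BUTTON)
    fun onEditIconSelected()  = listener.onEditEvent(EditEventSource.ICON)
    fun onShakeDetected()     = listener.onEditEvent(EditEventSource.DEVICE_MOTION)
    fun onGestureRecognized() = listener.onEditEvent(EditEventSource.USER_GESTURE)
}
```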
  • the object edit program 114 may control to cut an object included in the object group. After that, when detecting a second touch, the object edit program 114 may control to paste the cut object of the object group on the second touch point. At this point, in the case where a paste event occurs with the second touch maintained, the object edit program 114 may control to paste the cut object on the second touch point.
  • the object edit program 114 may determine whether a touch of an object occurs. For example, the object edit program 114 may determine that the touch of the object has occurred by comparing an object recognition region with a touch point. In the case where a touch of an object occurs, the object edit program 114 may control to cut the object on the touch point. Meanwhile, in the case where the touch of the object does not occur, the object edit program 114 may control to paste a previously cut object on the touch point. At this point, the object edit program 114 may recognize the object recognition region differently depending on the touch method. For example, in the case where a touch is detected via the touch pen, the object edit program 114 may recognize a narrower object recognition region than in the case where a touch is detected by a hand.
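  • A minimal sketch of this hit test follows, assuming illustrative margin values (the patent does not specify sizes): the recognition region around an object is narrower for a pen touch than for a hand touch.

```kotlin
// Sketch only: the object recognition region depends on the touch method.
enum class TouchMethod { PEN, HAND }

data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int)

fun recognitionRegion(objectBounds: Rect, method: TouchMethod): Rect {
    // A pen is more precise, so its recognition region adds less margin (assumed values).
    val margin = if (method == TouchMethod.PEN) 4 else 16
    return Rect(objectBounds.left - margin, objectBounds.top - margin,
                objectBounds.right + margin, objectBounds.bottom + margin)
}

fun isObjectTouched(objectBounds: Rect, x: Int, y: Int, method: TouchMethod): Boolean {
    val r = recognitionRegion(objectBounds, method)
    return x in r.left..r.right && y in r.top..r.bottom
}
```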
  • the object edit program 114 may control to paste at least one of the plurality of cut objects selected by a user.
  • the application 115 may include a software element for at least one application installed in the electronic device 100 .
  • the processor unit 120 may include a memory interface 121 , at least one processor 122 , and a peripheral interface 123 .
  • the memory interface 121 , the at least one processor 122 , and the peripheral interface 123 included in the processor unit 120 may be integrated in at least one Integrated Circuit (IC) or implemented as separate elements.
  • the memory interface 121 may control an access of the memory 110 by an element such as the processor 122 or the peripheral interface 123 .
  • the peripheral interface 123 may control connection between I/O peripherals of the electronic device 100 and the processor 122 and the memory interface 121 .
  • the processor 122 may control the electronic device 100 to provide various multimedia services using at least one software program. At this point, the processor 122 may control to execute at least one program stored in the memory 110 and provide a service corresponding to the relevant program. For example, the processor 122 may execute the object edit program 114 stored in the program storage 111 to edit an object by considering touch information detected via the touch input unit 154 and edit event occurrence information.
  • the audio processor 130 may provide an audio interface between a user and the electronic device 100 via a speaker 131 and a microphone 132 .
  • the audio processor 130 may generate a sound via the speaker 131 as if the object were being absorbed by a sponge or pipette.
  • the audio processor 130 may generate a sound via the speaker as if the object were emitted.
  • the I/O controller 140 may provide an interface between I/O units, such as the touchscreen 150 and the input unit 160 , and the peripheral interface 123 .
  • the touchscreen 150 is an I/O unit for performing output of information and input of information, and may include the display unit 152 and the touch input unit 154 .
  • the display unit 152 may display state information of the electronic device 100 , a character input by a user, a moving picture, a still picture, etc.
  • the display unit 152 may display application information driven by the processor 122 .
  • the display unit 152 may represent an animation effect as if the relevant object were being absorbed by the touch pen or the finger that has touched the object similar to a sponge or pipette absorbing liquid.
  • the display unit 152 may represent an animation effect as if the object were emitted from the touch pen or the finger like the sponge or pipette emitting liquid.
  • the touch input unit 154 may provide touch information detected via a touch panel to the processor unit 120 via the I/O controller 140 . At this point, the touch input unit 154 may provide touch information by the touch pen or the finger to the processor unit 120 via the I/O controller 140 .
  • the input unit 160 may provide input data generated by a user's selection to the processor unit 120 via the I/O controller 140 .
  • the input unit 160 may include only a control button for controlling the electronic device 100 .
  • the input unit 160 may include a keypad for receiving input data from a user.
  • the electronic device 100 may further include a communication system for performing a communication function for voice communication and data communication.
  • the communication system may be classified into a plurality of communication submodules supporting different communication networks.
  • the communication network may include a Global System for Mobile Communication (GSM) network, an Enhanced Data GSM Environment (EDGE) network, a Code Division Multiple Access (CDMA) network, a Wideband CDMA (WCDMA) network, a Long Term Evolution (LTE) network, an Orthogonal Frequency Division Multiple Access (OFDMA) network, a wireless Local Area Network (LAN), a Bluetooth network, Near Field Communication (NFC), etc.
  • the processor 122 may execute software elements stored in the program storage 111 inside one module to edit an object.
  • the processor 122 may be configured to include elements for editing an object as separate modules as illustrated in FIG. 2 .
  • FIG. 2 is a block diagram illustrating a processor according to an embodiment of the present disclosure.
  • the processor 122 may include an object edit controller 200 , and a display controller 210 .
  • the object edit controller 200 may execute the object edit program 114 stored in the program storage 111 to edit an object. For example, in the case where a cut event occurs with a first touch of an object maintained, the object edit controller 200 may cut an object on the first touch point. After that, when detecting a second touch, the object edit controller 200 may paste the cut object on the second touch point. At this point, in the case where a paste event occurs with the second touch maintained, the object edit controller 200 may paste the cut object on the second touch point.
  • the object edit controller 200 may detect occurrence of a cut event and occurrence of a paste event using at least one input method of a button input of the touch pen, a hardware button input of the electronic device, an icon selection, a motion detection of the electronic device, a user gesture detection, etc.
  • the cut event and the paste event may be generated by the same input method or by different input methods.
  • the object edit controller 200 may cut the object included in the object group. After that, when detecting a second touch, the object edit controller 200 may paste the cut object of the object group on the second touch point. At this point, in the case where a paste event occurs with the second touch maintained, the object edit controller 200 may paste the cut object on the second touch point.
  • the object edit controller 200 may determine whether a touch of the object occurs. At this point, the object edit controller 200 may determine whether the touch of the object occurs by comparing an object recognition region with a touch point. In the case where the touch of the object occurs, the object edit controller 200 may cut the object on the touch point. In contrast, in the case where the touch of the object does not occur, the object edit controller 200 may paste a previously cut object on the touch point. At this point, the object edit controller 200 may recognize the object recognition region differently depending on the touch method. For example, in the case where a touch is detected via the touch pen, the object edit controller 200 may recognize a narrower object recognition region than in the case where a touch is detected by a hand.
  • the object edit controller 200 may control to paste at least one of the plurality of cut objects selected by a user.
  • the display controller 210 may control to execute the GUI program 113 stored in the program storage 111 and display a user interface using graphics on the display unit 152 .
  • the display controller 210 may control to display information of an application driven by the processor 122 on the display unit 152 .
  • the display controller 210 may control to represent an animation effect as if a relevant object were absorbed into the touch pen or the finger that has touched the object similar to a sponge or pipette absorbing liquid.
  • the display controller 210 may control to represent an animation effect as if the object were emitted from the touch pen or the finger like a sponge emitting liquid.
  • the processor 122 may further include an audio controller for controlling to generate sound effects depending on the sponge edit method.
  • the audio controller may control to generate a sound via the speaker 131 as if the object were absorbed.
  • the audio controller may control to generate a sound via the speaker 131 as if the object were emitted.
  • the electronic device may edit an object by considering touch information and edit event occurrence information. Accordingly, to distinguish the object edit function from other services that may be triggered by the same touch information, the electronic device may switch to an object edit mode and then edit an object by considering touch information and edit event occurrence information.
  • the electronic device may enter the object edit mode using at least one input method of a button input of the touch pen, a hardware button input of the electronic device, an icon selection, a motion detection of the electronic device, a user gesture detection, etc.
  • the object edit controller 200 may cut an object on a first touch point and paste the cut object on a second touch point. At this point, the object edit controller 200 may paste the cut object by considering whether an object exists on the second touch point. For example, in the case where another object exists on the second touch point, the object edit controller 200 may change the position of another object existing on the second touch point and paste the cut object on the second touch point. For another example, the object edit controller 200 may change the paste position on which the cut object is to be pasted to a point on which another object does not exist and paste the cut object.
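  • The two paste strategies just described (displace the occupant, or shift the paste point) might be sketched as follows, using a hypothetical slot-based layout; the slot model and all names are assumptions for illustration.

```kotlin
// Sketch only: handling an occupied second touch point in a slot-based layout.
data class Slot(val index: Int)

fun pasteWithDisplacement(occupied: MutableMap<Slot, String>, target: Slot, cutId: String) {
    occupied[target]?.let { occupant ->
        // Strategy 1: move the object already at the target to the nearest free slot...
        val free = generateSequence(target.index + 1) { it + 1 }
            .map { Slot(it) }.first { it !in occupied }
        occupied[free] = occupant
    }
    occupied[target] = cutId   // ...then paste the cut object at the touch point.
}

fun pasteAtNearestFree(occupied: MutableMap<Slot, String>, target: Slot, cutId: String) {
    // Strategy 2: leave the occupant in place and paste at the nearest free slot.
    val slot = if (target !in occupied) target
               else generateSequence(target.index + 1) { it + 1 }
                        .map { Slot(it) }.first { it !in occupied }
    occupied[slot] = cutId
}
```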
  • FIG. 3 is a flowchart illustrating a procedure for editing display information in an electronic device according to an embodiment of the present disclosure.
  • the electronic device may proceed to operation 303 to store the first object.
  • the electronic device generates an animation effect as if the first object were absorbed into the touch pen or the finger that has touched the first object according to the sponge edit method.
  • the electronic device may detect occurrence of a first edit event by considering at least one of button input information of the touch pen, hardware button input information of the electronic device, icon selection information, motion detection information of the electronic device, user gesture detection information, etc.
  • the electronic device may proceed to operation 305 and, when detecting a second touch, proceed to operation 307 to display the first object stored in operation 303 on the second touch point.
  • the electronic device may generate an animation effect as if the first object were emitted from the touch pen or the finger according to the sponge edit method and display the first object on the second touch point.
  • FIG. 4 is a flowchart illustrating a procedure for editing display information using a sponge edit method in an electronic device according to an embodiment of the present disclosure.
  • the electronic device may determine whether a first touch of a first object is detected in operation 401 .
  • the electronic device may determine whether the first touch of the first object is detected via the touch input unit 154 .
  • the electronic device may proceed to operation 403 to determine whether a first edit event occurs while the first touch is maintained. For example, the electronic device may determine whether a cut event occurs by considering at least one of button input information of the touch pen, hardware button input information of the electronic device, icon selection information, motion detection information of the electronic device, user gesture detection information, etc., with the first touch maintained.
  • the electronic device may proceed to operation 405 to determine whether the first touch is released.
  • the electronic device may recognize that editing is not to be performed on the first object. Accordingly, the electronic device ends the present algorithm.
  • the electronic device may proceed to operation 403 to determine whether the first edit event occurs with the first touch maintained.
  • the electronic device may proceed to operation 407 to cut the first object on the first touch point.
  • the electronic device generates an animation effect as if the first object were absorbed into the touch pen or the finger that has touched the first object according to the sponge edit method.
  • the electronic device may generate a sound as if the first object were absorbed according to the sponge edit method.
  • the electronic device may store the cut first object in the data storage 112 .
  • the electronic device may proceed to operation 409 to determine whether a second touch is detected. For example, the electronic device may determine whether the second touch is detected via the touch input unit 154 .
  • the electronic device may proceed to operation 411 to paste the first object cut in operation 407 on the second touch point.
  • the electronic device may generate an animation effect as if the first object were emitted from the touch pen or the finger that has touched the second touch point according to the sponge edit method and display the first object on the second touch point.
  • the electronic device may generate a sound as if the first object were emitted according to the sponge edit method.
  • the electronic device may paste the cut first object on the second touch point.
  • the electronic device may delete cut information of the first object from the data storage 112 .
  • the electronic device may store the cut information of the first object in the data storage 112 until a cancel event of the first object occurs.
  • the electronic device may recognize that the cancel event has occurred.
  • the electronic device may store the cut information of the first object in the data storage 112 for a reference time after performing paste on the first object.
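  • The retention policies above (delete on paste, keep until a cancel event, or keep for a reference time after paste) could look roughly like the following; REFERENCE_TIME_MS and all names are assumptions for illustration.

```kotlin
// Sketch only: lifecycle of cut information after a paste.
const val REFERENCE_TIME_MS = 5_000L   // assumed "reference time"

class CutBuffer {
    private var cutInfo: String? = null
    private var pastedAt: Long? = null

    fun storeCut(info: String) { cutInfo = info; pastedAt = null }

    fun onPaste(now: Long): String? {
        pastedAt = now      // keep the cut info for a reference time after pasting,
        return cutInfo      // so the same object may be pasted again
    }

    fun onCancelEvent() { cutInfo = null }   // e.g., pen button pressed off-screen

    fun expire(now: Long) {
        val t = pastedAt ?: return
        if (now - t >= REFERENCE_TIME_MS) { cutInfo = null; pastedAt = null }
    }
}
```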
  • the electronic device may paste the first object by considering whether an object exists on the second touch point. For example, in the case where another object exists on the second touch point, the electronic device may change the position of the other object existing on the second touch point and paste the first object on the second touch point. For another example, the electronic device may change the paste position of the first object to a point on which another object does not exist and paste the first object.
  • FIG. 5 is a flowchart illustrating a procedure for editing display information using a sponge edit method in an electronic device according to another embodiment of the present disclosure.
  • the electronic device may determine whether a first touch of a first object is detected in operation 501 .
  • the electronic device may determine whether the first touch of the first object is detected via the touch input unit 154 .
  • the electronic device may proceed to operation 503 to determine whether a first edit event occurs while the first touch is maintained. For example, the electronic device may determine whether a cut event occurs by considering at least one of button input information of the touch pen, hardware button input information of the electronic device, icon selection information, motion detection information of the electronic device, user gesture detection information, etc.
  • the electronic device may proceed to operation 505 to determine whether the first touch is released.
  • the electronic device may recognize that editing is not to be performed on the first object. Accordingly, the electronic device may end the present algorithm.
  • the electronic device may proceed to operation 503 to determine whether the first edit event occurs with the first touch maintained.
  • the electronic device may proceed to operation 507 to cut the first object on the first touch point.
  • the electronic device generates an animation effect as if the first object were absorbed into the touch pen or the finger that has touched the first object according to the sponge edit method.
  • the electronic device may generate a sound as if the first object were absorbed according to the sponge edit method.
  • the electronic device may store the cut first object in the data storage 112 .
  • the electronic device may proceed to operation 509 to determine whether a second touch is detected. For example, the electronic device may determine whether the second touch is detected via the touch input unit 154 .
  • the electronic device may proceed to operation 511 to determine whether a second edit event occurs while the second touch is maintained. For example, the electronic device may determine whether a paste event occurs by considering at least one of button input information of the touch pen, hardware button input information of the electronic device, icon selection information, motion detection information of the electronic device, user gesture detection information, etc., with the second touch maintained.
  • the electronic device may proceed to operation 513 to determine whether the second touch is released.
  • the electronic device may recognize that editing is not to be performed on the first object. Accordingly, the electronic device may end the present algorithm. At this point, the electronic device may restore the cut first object to its state before the cut event occurred (see the sketch following this procedure).
  • the electronic device may proceed to operation 511 to determine whether the second edit event occurs with the second touch maintained.
  • the electronic device may proceed to operation 515 to paste the first object cut in operation 507 on the second touch point.
  • the electronic device may generate an animation effect as if the first object were emitted from the touch pen or the finger that has touched the second touch point according to the sponge edit method and display the first object on the second touch point.
  • the electronic device may generate a sound as if the first object were emitted according to the sponge edit method.
  • the electronic device may paste the cut first object on the second touch point.
  • the electronic device may delete cut information of the first object from the data storage 112 .
  • the electronic device may store the cut information of the first object in the data storage 112 until a cancel event of the first object occurs.
  • the electronic device may determine that a cancel event occurs by considering button input information of the touch pen while the touch pen is not touching the screen.
  • the electronic device may perform paste on the first object and store the cut information of the first object in the data storage 112 for a reference time.
  • the electronic device may paste the first object by considering whether an object exists on a second touch point. For example, in the case where another object exists on the second touch point, the electronic device may change the position of another object existing on the second touch point and paste the first object on the second touch point. For another example, the electronic device may change the paste position of the first object to a point on which another object does not exist and paste the first object.
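  • A sketch of the gated paste in this FIG. 5 flow, with hypothetical callbacks: the paste fires only if a second edit event arrives while the second touch is held, and releasing the touch without that event rolls the cut back.

```kotlin
// Sketch only: paste gated on a second edit event while the second touch is held.
class GatedPaste(
    private val pasteAt: (x: Int, y: Int) -> Unit,
    private val restoreCut: () -> Unit
) {
    private var heldTouch: Pair<Int, Int>? = null
    private var pasted = false

    fun onSecondTouchDown(x: Int, y: Int) { heldTouch = x to y; pasted = false }

    fun onSecondEditEvent() {                 // e.g., pen button pressed while touching
        heldTouch?.let { (x, y) -> pasteAt(x, y); pasted = true }
    }

    fun onSecondTouchUp() {
        if (!pasted) restoreCut()             // released without an edit event
        heldTouch = null
    }
}
```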
  • FIG. 6 is a flowchart illustrating a procedure for editing display information using a sponge edit method in an electronic device according to still another embodiment of the present disclosure.
  • the electronic device may determine whether a touch is detected in operation 601 .
  • the electronic device may determine whether a touch is detected via the touch input unit 154 .
  • the electronic device may proceed to operation 603 to determine whether an edit event occurs while the touch is maintained. For example, the electronic device may determine whether an edit event occurs by considering at least one of button input information of the touch pen, hardware button input information of the electronic device, icon selection information, motion detection information of the electronic device, user gesture detection information, etc., with the touch maintained.
  • the electronic device may proceed to operation 605 to determine whether the touch is released.
  • the electronic device may recognize that object editing is not to be performed. Accordingly, the electronic device may end the present algorithm.
  • the electronic device may proceed to operation 603 to determine whether an edit event occurs with the touch maintained.
  • the electronic device may proceed to operation 607 to determine whether an object exists on the touch point detected in operation 601 . For example, the electronic device may determine whether a touch of the object has occurred by comparing an object recognition region with the touch point. At this point, the electronic device may recognize the object recognition region differently depending on the touch method. More specifically, when detecting a touch by the touch pen, the electronic device may recognize that a more precise touch is possible with the touch pen than with a hand touch. Accordingly, the electronic device may recognize a narrower object recognition region than in the case of detection by a hand touch.
  • that is, the region within which a relevant object is recognized may be determined by considering the touch information.
  • the electronic device may proceed to operation 609 to cut the object on the touch point.
  • the electronic device generates an animation effect as if the object touched by the touch pen or the finger were absorbed into the touch pen or the finger according to the sponge edit method.
  • the electronic device may generate a sound as if the object were absorbed according to the sponge edit method.
  • the electronic device may store the cut object in the data storage 112 .
  • the electronic device may proceed to operation 611 to paste a previously cut object on the touch point.
  • the electronic device may generate an animation effect as if the previously cut object were emitted from the touch pen or the finger that has touched the touch point according to the sponge edit method and display the object on the touch point.
  • the electronic device may generate a sound as if the object were emitted according to the sponge edit method.
  • the electronic device may paste the cut object on the touch point where an object does not exist. At this point, the electronic device may delete cut information of the object from the data storage 112 . For another example, the electronic device may store the cut information of the object in the data storage 112 until a cancel event of the object occurs. Here, the electronic device may determine a cancel event by considering button input information of the touch pen not touching the screen.
  • FIGS. 11A, 11B, 11C, 11D, 11E, 11F, and 11G are views illustrating a screen configuration for editing an icon using a touch pen in an electronic device according to an embodiment of the present disclosure.
  • the electronic device may edit a touched object by considering touch information detected via the touch input unit 154 and edit event occurrence information. For example, as illustrated in FIG. 11A , when detecting a button input 1103 of the touch pen while a touch 1101 of an icon “Flickr” by the touch pen is maintained, the electronic device may cut the icon “Flickr”. At this point, as illustrated in FIG. 11B , the electronic device may cut the icon “Flickr” with an animation effect as if the icon “Flickr” were absorbed into the touch pen like a sponge or pipette absorbing liquid. Similar to absorbing liquid with a pipette, the electronic device may cut the icon “Flickr” using the sponge effect at the point at which the button input is released after the button input 1103 of the touch pen has been detected.
  • the electronic device may rearrange icons displayed on the display unit 152 as illustrated in FIG. 11C .
  • the electronic device may display a notice icon 1111 representing whether a cut icon exists as illustrated in FIG. 11C .
  • the notice icon 1111 may represent the number of cut icons.
  • the electronic device may paste the cut icon “Flickr” on the second touch point as illustrated in FIG. 11G .
  • the electronic device may paste the icon “Flickr” on the second touch point with an animation effect as if the icon “Flickr” were emitted from the touch pen like a sponge or pipette emitting liquid.
  • the electronic device may paste the icon “Flickr” on the second touch point.
  • the electronic device may paste the icon “Flickr” on the second touch point with an animation effect as if the icon “Flickr” were emitted from the touch pen like a sponge or pipette emitting liquid.
  • FIGS. 12A, 12B, 12C, 12D, and 12E are views illustrating a screen configuration for editing an icon using a hand touch in an electronic device according to an embodiment of the present disclosure.
  • the electronic device may cut the icon “Flickr”.
  • the electronic device may cut the icon “Flickr” with an animation effect as if the icon “Flickr” were absorbed into the finger like a sponge or pipette absorbing liquid as illustrated in FIG. 12B .
  • the electronic device may rearrange icons displayed on the display unit 152 as illustrated in FIG. 11C .
  • the electronic device may display a notice icon 1111 representing whether a cut icon exists as illustrated in FIG. 11C .
  • the electronic device may paste the cut icon “Flickr” on the second touch point.
  • the electronic device may paste the icon “Flickr” on the second touch point with an animation effect as if the icon “Flickr” were emitted from the finger like a sponge or pipette emitting liquid.
  • the electronic device may paste the cut icon “Flickr” on the second touch point.
  • the electronic device may paste the icon “Flickr” on the second touch point with an animation effect as if the icon “Flickr” were emitted from the finger like a sponge or pipette emitting liquid.
  • the electronic device may edit an icon according to the sponge edit method.
  • FIGS. 13A, 13B, 13C, 13D, 13E, 13F, and 13G are views illustrating a screen configuration for editing text using a touch pen in an electronic device according to an embodiment of the present disclosure.
  • FIGS. 14A, 14B, 14C, 14D, and 14E are views illustrating a screen configuration for editing text using a hand touch in an electronic device according to an embodiment of the present disclosure.
  • the electronic device may edit text according to the sponge edit method as illustrated in FIGS. 13A to 13G or FIGS. 14A to 14E .
  • the electronic device may cut the word “wonderful”.
  • the electronic device may cut the word “wonderful” with an animation effect as if the word “wonderful” were absorbed into the touch pen like a sponge or pipette absorbing liquid.
  • the electronic device may automatically set a text region for cutting or determine the text region depending on a user's input information. For example, when detecting a button input 1303 of the touch pen while a touch 1301 by the touch pen is maintained, the electronic device may automatically set a text region for cutting by considering the touch point and the word-spacing points of the text. At this point, the electronic device may change the size of the text region by considering input information of a text region change bar 1305 as illustrated in FIG. 13A . Similar to absorbing liquid with a pipette, the electronic device may cut the word “wonderful” using the sponge effect at the point at which the button input is released after the button input 1303 of the touch pen is detected.
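  • The automatic text-region selection described above, which expands from the touch point to the surrounding word-spacing points, might be sketched as follows; the function name and index-based model are hypothetical.

```kotlin
// Sketch only: expand the cut region from the touched character to the
// nearest word-spacing points (spaces) on either side.
fun autoTextRegion(text: String, touchIndex: Int): IntRange {
    if (touchIndex !in text.indices || text[touchIndex] == ' ') return IntRange.EMPTY
    var start = touchIndex
    while (start > 0 && text[start - 1] != ' ') start--         // scan left to a space
    var end = touchIndex
    while (end < text.length - 1 && text[end + 1] != ' ') end++ // scan right to a space
    return start..end
}

// Example: touching inside "wonderful" selects the whole word for cutting.
fun main() {
    val text = "what a wonderful world"
    val region = autoTextRegion(text, touchIndex = 10)
    println(text.substring(region))   // prints "wonderful"
}
```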
  • the electronic device may rearrange the text displayed on the display unit 152 as illustrated in FIG. 13C . As illustrated in FIG. 13C , the electronic device may display a notice icon 1311 representing whether cut text exists. At this point, the notice icon 1311 may display the number of cut text items.
  • the electronic device may paste the cut word “wonderful” on the second touch point as illustrated in FIG. 13G .
  • the electronic device may paste the word “wonderful” on the second touch point with an animation effect as if the word “wonderful” were emitted from the touch pen like a sponge or pipette emitting liquid.
  • the electronic device may paste the word “wonderful” on the second touch point.
  • the electronic device may paste the word “wonderful” on the second touch point with an animation effect as if the word “wonderful” were emitted from the touch pen like a sponge or pipette emitting liquid.
  • the electronic device may cut the word “wonderful”.
  • the electronic device may cut the word “wonderful” with an animation effect as if the word “wonderful” were absorbed into the finger like a sponge or pipette absorbing liquid.
  • the electronic device may automatically set a text region for cutting or determine the text region depending on a user's input information.
  • the electronic device may automatically set a text region for cutting by considering a touch point and a word spacing point of text. At this point, the electronic device may change the size of the text region by considering input information of a text region change bar 1405 as illustrated in FIG. 14A
  • the electronic device may rearrange the text displayed on the display unit 152 as illustrated in FIG. 13C . Also, the electronic device may display a notice icon 1311 representing whether cut text exists as illustrated in FIG. 13C .
  • the electronic device may paste the cut word “wonderful” on the second touch point.
  • the electronic device may paste the word “wonderful” with an animation effect as if the word “wonderful” were emitted from the finger like a sponge or pipette emitting liquid.
  • the electronic device may paste the cut word “wonderful” on the second touch point.
  • the electronic device may paste the word “wonderful” with an animation effect as if the word “wonderful” were emitted from the finger like a sponge or pipette emitting liquid.
  • the electronic device may edit text according to the sponge edit method.
  • FIGS. 15A, 15B, 15C, 15D, 15E, 15F, and 15G are views illustrating a screen configuration for editing a reproduction list using a touch pen in an electronic device according to an embodiment of the present disclosure.
  • FIGS. 16A, 16B, 16C, 16D, and 16E are views illustrating a screen configuration for editing a reproduction list using a hand touch in an electronic device according to an embodiment of the present disclosure.
  • the electronic device may edit a reproduction list according to the sponge edit method as illustrated in FIGS. 15A to 15G and FIGS. 16A to 16E.
  • the electronic device may cut the list “song 3”.
  • the electronic device may cut the list “song 3” with an animation effect as if the list “song 3” were absorbed into the touch pen like a sponge or pipette absorbing liquid.
  • the electronic device may cut the list “song 3” using the sponge effect at the point at which the button input is released after the button input 1503 of the touch pen is detected.
  • the electronic device may rearrange the reproduction list displayed on the display unit 152 as illustrated in FIG. 15C.
  • the electronic device may display a notice icon 1511 representing whether a cut reproduction list exists.
  • the notice icon 1511 may display the number of cut reproduction lists.
  • the electronic device may paste the cut list “song 3” on the second touch point as illustrated in FIG. 15G.
  • the electronic device may paste the list “song 3” on the second touch point with an animation effect as if the list “song 3” were emitted from the touch pen like a sponge or pipette emitting liquid.
  • the electronic device may cut the list “song 3”.
  • the electronic device may cut the list “song 3” with an animation effect as if the list “song 3” were absorbed into the finger like a sponge or pipette absorbing liquid.
  • the electronic device may rearrange reproduction lists displayed on the display unit 152 as illustrated in FIG. 15C .
  • the electronic device may display a notice icon 1511 representing whether a cut reproduction list exists.
  • the electronic device may paste the cut list “song 3” on the second touch point.
  • the electronic device may paste the list “song 3” on the second touch point with an animation effect as if the list “song 3” were emitted from the finger like a sponge or pipette emitting liquid.
  • the electronic device may rearrange objects displayed on the display unit 152 .
  • the electronic device may not rearrange the objects displayed on the display unit 152 .
  • the electronic device may leave a cut object region as a vacant space.
  • FIG. 7 is a flowchart illustrating a procedure for editing a plurality of display information using a sponge edit method in an electronic device according to an embodiment of the present disclosure.
  • the electronic device may set an object group in operation 701 .
  • the electronic device may recognize at least one object included in an object select region set by considering multi-touched points detected via the touch input unit 154 as the object group.
  • the electronic device may recognize at least one object included in an object select region set by considering a touch point and drag information detected via the touch input unit 154 as the object group.
  • the electronic device may recognize at least one object included in an object select region set by considering touched points successively detected via the touch input unit 154 as an object group 1301 .
  • the electronic device may switch to an object select mode. At this point, the electronic device may recognize at least one object selected in the object select mode as the object group.
  • the electronic device may proceed to operation 703 to determine whether a touch of at least one object among one or more objects included in the object group is detected.
  • the electronic device may proceed to operation 705 to determine whether an edit event occurs with a touch of the object included in the object group maintained.
  • the electronic device may determine whether a cut event occurs by considering at least one of button input information of the touch pen, hardware button input information of the electronic device, icon selection information, motion detection information of the electronic device, user gesture detection information, etc.
  • the electronic device may proceed to operation 707 to determine whether a touch of the object included in the object group is released.
  • the electronic device may recognize that it will not perform editing on the object group. Accordingly, the electronic device may end the present algorithm.
  • the electronic device may proceed to operation 705 to determine whether an edit event occurs with the touch of the object included in the object group maintained.
  • the electronic device may proceed to operation 709 to cut the object included in the object group.
  • the electronic device may generate an animation effect as if the objects included in the object group were absorbed into the touch pen or the finger according to the sponge edit method.
  • the electronic device may generate a sound as if the objects included in the object group were absorbed into the touch pen or the finger according to the sponge edit method.
  • the electronic device may store cut object information in the data storage 112 .
  • the electronic device may proceed to operation 711 to determine whether a second touch is detected. For example, the electronic device may determine whether the second touch is detected via the touch input unit 154 .
  • the electronic device may proceed to operation 713 to paste the object included in the object group on the second touch point. For example, the electronic device may generate an animation effect as if objects included in the object group were emitted from the touch pen or the finger that has touched the second touch point according to the sponge edit method and display the objects included in the object group on the second touch point. Additionally, the electronic device may generate a sound as if the objects included in the object group were emitted according to the sponge edit method.
  • the electronic device may paste the cut objects of the object group on the second touch point.
  • the electronic device may paste the cut objects of the object group by considering whether an object exists on the second touch point. For example, in the case where another object exists on the second touch point, the electronic device may change the position of another object existing on the second touch point and paste the cut objects of the object group on the second touch point. For another example, the electronic device may change the paste position of the cut objects of the object group to a point on which another object does not exist, and paste the objects.
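  • The FIG. 7 procedure can be summarized as a small state machine: cut the whole group when an edit event arrives while a group member is touched, abort if the touch is released first, and paste everything at the second touch. The Kotlin sketch below is a hedged illustration; the event types, class name, and callback shape are our invention, not the patent's.

```kotlin
// Hypothetical event model for the FIG. 7 flow (operations 701-713).
sealed interface TouchEvent
data class Touch(val x: Int, val y: Int, val objectId: String?) : TouchEvent
object TouchReleased : TouchEvent
object EditEvent : TouchEvent   // pen button, hardware key, icon, motion, gesture

class SpongeGroupEditor(private val group: MutableList<String>) {
    private val clipboard = mutableListOf<String>()

    // Operations 703-709: cut the group when an edit event occurs while a
    // member of the group is still touched; a release first ends the edit.
    fun onFirstTouchSequence(events: List<TouchEvent>): Boolean {
        var touching = false
        for (e in events) when (e) {
            is Touch -> touching = e.objectId in group
            TouchReleased -> touching = false       // operation 707: abort
            EditEvent -> if (touching) {            // operation 705 -> 709
                clipboard += group                  // "absorb" into pen/finger
                group.clear()
                return true
            }
        }
        return false
    }

    // Operations 711-713: emit every cut object at the second touch point.
    fun onSecondTouch(paste: (String) -> Unit) {
        clipboard.forEach(paste)
        clipboard.clear()
    }
}

fun main() {
    val editor = SpongeGroupEditor(mutableListOf("Picasa", "FC2", "Flickr"))
    if (editor.onFirstTouchSequence(listOf(Touch(10, 20, "Flickr"), EditEvent)))
        editor.onSecondTouch { println("paste $it at the second touch point") }
}
```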
  • FIG. 8 is a flowchart illustrating a procedure for editing a plurality of display information using a sponge edit method in an electronic device according to another embodiment of the present disclosure.
  • the electronic device may set an object group in operation 801 .
  • the electronic device may recognize at least one object included in an object select region set by considering multi-touched points detected via the touch input unit 154 as the object group.
  • the electronic device may recognize at least one object included in an object select region set by considering a touch point and drag information detected via the touch input unit 154 as the object group.
  • the electronic device may recognize at least one object included in an object select region set by considering touched points successively detected via the touch input unit 154 as an object group 1301 .
  • the electronic device may switch to an object select mode. At this point, the electronic device may recognize at least one object selected in the object select mode as the object group.
  • the electronic device may proceed to operation 803 to determine whether a touch of at least one object among one or more objects included in the object group is detected.
  • the electronic device may proceed to operation 805 to determine whether an edit event occurs with a touch of the object included in the object group maintained. For example, with the touch of the object included in the object group maintained, the electronic device may determine whether a cut event occurs by considering at least one of button input information of the touch pen, hardware button input information of the electronic device, icon selection information, motion detection information of the electronic device, user gesture detection information, etc.
  • the electronic device may proceed to operation 807 to determine whether a touch of the object included in the object group is released.
  • the electronic device may recognize that it will not perform editing on the object group. Accordingly, the electronic device may end the present algorithm.
  • the electronic device may proceed to operation 805 to determine whether an edit event occurs with the touch of the object included in the object group maintained.
  • the electronic device may proceed to operation 809 to cut the object included in the object group.
  • the electronic device may generate an animation effect as if the objects included in the object group were absorbed into the touch pen or the finger according to the sponge edit method.
  • the electronic device may generate a sound as if the objects included in the object group were absorbed into the touch pen or the finger according to the sponge edit method.
  • the electronic device may store cut object information in the data storage 112 .
  • the electronic device may proceed to operation 811 to determine whether a second touch is detected. For example, the electronic device may determine whether the second touch is detected via the touch input unit 154 .
  • the electronic device may proceed to operation 813 to determine whether a second edit event occurs with the second touch maintained.
  • the electronic device may determine whether a paste event occurs by considering at least one of button input information of the touch pen, hardware button input information of the electronic device, icon selection information, motion detection information of the electronic device, user gesture detection information, etc.
  • the electronic device may proceed to operation 815 to determine whether the second touch is released.
  • the electronic device may recognize that it will not perform editing on the object group. Accordingly, the electronic device may end the present algorithm. At this point, the electronic device may restore the cut object group to its state before the cut event occurred.
  • the electronic device may proceed to operation 813 to determine whether the second edit event occurs with the second touch maintained.
  • the electronic device may proceed to operation 817 to paste an object included in the object group on the second touch point.
  • the electronic device may generate an animation effect as if the objects included in the object group were emitted from the touch pen or the finger that has touched the second touch point according to the sponge edit method and display the objects included in the object group on the second touch point.
  • the electronic device may generate a sound as if the objects included in the object group were emitted according to the sponge edit method.
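  • The difference from FIG. 7 is the extra gate in operations 813 to 817: the paste fires only if a second edit event occurs while the second touch is held, and releasing the touch first cancels the edit and restores the cut objects. A hedged sketch, reusing the hypothetical event types from the FIG. 7 example:

```kotlin
// Operation 813: paste only on a second edit event while the touch is held.
// Operation 815: a release without an edit event ends the edit, and the cut
// objects are restored to their pre-cut state.
fun handleSecondTouch(events: List<TouchEvent>, paste: () -> Unit, restore: () -> Unit) {
    for (e in events) when (e) {
        EditEvent -> { paste(); return }        // operation 813 -> 817
        TouchReleased -> { restore(); return }  // operation 815: cancel and undo
        else -> {}                              // touch still maintained
    }
}
```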
  • FIGS. 17A, 17B, 17C, 17D, 17E, 17F, 17G, and 17H are views illustrating a screen configuration for editing a plurality of icons using a touch pen in an electronic device according to an embodiment of the present disclosure.
  • the electronic device may edit at least one object included in the object group by considering touch information detected via the touch input unit 154 and edit event occurrence information. For example, as illustrated in FIG. 17A, the electronic device may set an object group 1701 by considering touch points successively detected via the touch input unit 154. After that, as illustrated in FIG. 17B, when detecting a button input 1713 of the touch pen while a touch 1711 of an icon “Flickr” included in the object group by the touch pen is maintained, the electronic device may cut an icon “Picasa”, an icon “FC2”, an icon “Flickr”, an icon “Line pix”, an icon “Microsoft”, and an icon “MySpace” included in the object group.
  • At this point, the electronic device may cut the objects with an animation effect as if the objects included in the object group were absorbed into the touch pen like a sponge or pipette absorbing liquid. As with the operation of absorbing liquid into a pipette, the electronic device may cut the objects included in the object group using the sponge effect at the point at which the button input is released after the button input 1713 of the touch pen is detected.
  • the electronic device may rearrange objects displayed on the display unit 152. As illustrated in FIG. 17D, the electronic device may display a notice icon 1721 representing whether a cut object exists. At this point, the notice icon 1721 may display the number of cut objects.
  • the electronic device may paste cut objects included in the object group on the second touch point as illustrated in FIG. 17H.
  • the electronic device may paste the objects on the second touch point with an animation effect as if the objects included in the object group were emitted from the touch pen like a sponge or pipette emitting liquid.
  • the electronic device may set an object group 1801 by considering touch points successively detected via the touch input unit 154 .
  • the electronic device may cut an icon “Picasa”, an icon “FC2”, an icon “Flickr”, an icon “Line pix”, an icon “Microsoft”, and an icon “MySpace” included in the object group.
  • the electronic device may cut the objects with an animation effect as if the objects included in the object group were absorbed into the finger like a sponge or pipette absorbing liquid.
  • the electronic device may rearrange the objects displayed on the display unit 152 .
  • the electronic device may display a notice icon 1721 representing whether a cut object exists. At this point, the notice icon 1721 may represent the number of cut objects.
  • the electronic device may paste the cut objects included in the object group on the second touch point.
  • the electronic device may paste the objects on the second touch point with an animation effect as if the objects included in the object group were emitted from the finger like a sponge or pipette emitting liquid.
  • FIG. 9 is a flowchart illustrating a procedure for selectively editing display information using a sponge edit method in an electronic device according to an embodiment of the present disclosure.
  • FIGS. 19A, 19B, and 19C are views illustrating a screen configuration for displaying a plurality of edit lists in an electronic device according to an embodiment of the present disclosure.
  • the electronic device may perform a cut event on an object in operation 901 .
  • For example, when a cut event occurs while a touch of an object by the touch pen or the finger is maintained, as illustrated in FIGS. 3 to 6, the electronic device may cut the touched object.
  • the electronic device may cut the objects included in the object group.
  • the electronic device may proceed to operation 903 to display cut information of the object on the display unit 152 .
  • the electronic device may display whether a cut object exists and the number of cut objects using a notice icon as illustrated in FIGS. 11C, 13C, 15C, and 17C.
  • the electronic device may display a list of cut objects on the display unit 152 .
  • the electronic device may display a list 1901 of cut objects on the notice bar as illustrated in FIG. 19A.
  • the electronic device may display a list of cut objects using a separate popup window 1911 as illustrated in FIG. 19B.
  • the electronic device may set a portion of the display unit 152 as a region 1921 for displaying a list of cut objects and display the list of cut objects as illustrated in FIG. 19C.
  • the electronic device may proceed to operation 905 to determine the object list on which it will perform a paste event. For example, the electronic device may select at least one object to paste from among the one or more cut objects.
  • the electronic device may proceed to operation 907 to determine whether a touch is detected.
  • the electronic device may proceed to operation 909 to paste an object included in the object list determined in operation 905 on the touch point. For example, the electronic device may generate an animation effect as if the objects included in the object list were emitted from the touch pen or the finger that has touched the touch point according to the sponge edit method, and display the objects included in the object list on the touch point. Additionally, the electronic device may generate a sound as if the objects included in the object list were emitted according to the sponge edit method.
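  • In code terms, FIG. 9 amounts to keeping the cut objects in a visible list and emitting only a user-chosen subset at the next touch. The sketch below is illustrative only; the class and the operation mapping are our own assumptions.

```kotlin
// Hypothetical container for the FIG. 9 flow: cut objects are listed (notice
// bar, popup window, or a reserved screen region) and pasted selectively.
class CutList {
    private val items = mutableListOf<String>()
    fun cut(obj: String) { items += obj }            // operation 901
    fun display(): List<String> = items.toList()     // operation 903
    // Operations 905-909: paste only the selected entries at the touch point.
    fun pasteSelected(selected: Set<String>, paste: (String) -> Unit) {
        val chosen = items.filter { it in selected }
        chosen.forEach(paste)
        items.removeAll(chosen)
    }
}
```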
  • FIG. 10 is a flowchart illustrating a procedure for selectively editing display information using a sponge edit method in an electronic device according to another embodiment of the present disclosure.
  • the electronic device may perform a cut event on an object in operation 1001 .
  • For example, when a cut event occurs while a touch of an object by the touch pen or the finger is maintained, as illustrated in FIGS. 3 to 6, the electronic device may cut the touched object.
  • the electronic device may cut objects included in the object group.
  • the electronic device may proceed to operation 1003 to display cut information of the object on the display unit 152 .
  • the electronic device may display whether a cut object exists and the number of cut objects using a notice icon as illustrated in FIGS. 11C, 13C, 15C, and 17C.
  • the electronic device may display a list of cut objects on the display unit 152 .
  • the electronic device may display a list 1901 of cut objects on the notice bar as illustrated in FIG. 19A.
  • the electronic device may display a list of cut objects using a separate popup window 1911 as illustrated in FIG. 19B.
  • the electronic device may set a portion of the display unit 152 as a region 1921 for displaying a list of cut objects and display the list of cut objects as illustrated in FIG. 19C.
  • the electronic device may proceed to operation 1005 to determine the object list on which it will perform a paste event. For example, the electronic device may select at least one object to paste from among the one or more cut objects.
  • the electronic device may proceed to operation 1007 to determine whether a touch is detected.
  • the electronic device may proceed to operation 1009 to determine whether an edit event occurs with the touch maintained.
  • the electronic device may determine whether a paste event occurs by considering at least one of button input information of the touch pen, hardware button input information of the electronic device, icon selection information, motion detection information of the electronic device, user gesture detection information, etc.
  • the electronic device may proceed to operation 1011 to determine whether the touch is released.
  • the electronic device may recognize that it will not perform editing on the object. Accordingly, the electronic device may end the present algorithm. At this point, the electronic device may restore the cut object group to its state before the cut event occurred.
  • the electronic device may proceed to operation 1009 to determine whether an edit event occurs with the touch maintained.
  • the electronic device may proceed to operation 1013 to paste an object included in the object list determined in operation 1005 on the touch point.
  • the electronic device may generate an animation effect as if objects included in the object list were emitted from the touch pen or the finger that has touched the touch point according to the sponge edit method and display the objects included in the object list on the touch point.
  • the electronic device may generate a sound as if the objects included in the object list were emitted according to the sponge edit method.
  • the electronic device may end the present algorithm.
  • the electronic device may generate an animation effect as if a relevant object were absorbed into a touch pen or a finger that has touched the object. At this point, the electronic device may change the point into which the object is absorbed by considering the touch point.
  • the electronic device having the touchscreen may easily edit the display information displayed on the display unit using a sponge effect, and may arouse a user's emotional interest via the sponge effect.
  • Any such software may be stored in a non-transitory computer readable storage medium.
  • the non-transitory computer readable storage medium stores one or more programs (software modules), the one or more programs comprising instructions, which when executed by one or more processors in an electronic device, cause the electronic device to perform a method of the present disclosure.
  • Any such software may be stored in the form of volatile or non-volatile storage such as, for example, a storage device like a Read Only Memory (ROM), whether erasable or rewritable or not, or in the form of memory such as, for example, Random Access Memory (RAM), memory chips, device or integrated circuits or on an optically or magnetically readable medium such as, for example, a Compact Disk (CD), Digital Versatile Disc (DVD), magnetic disk or magnetic tape or the like.
  • the storage devices and storage media are various embodiments of non-transitory machine-readable storage that are suitable for storing a program or programs comprising instructions that, when executed, implement various embodiments of the present disclosure. Accordingly, various embodiments provide a program comprising code for implementing apparatus or a method as claimed in any one of the claims of this specification and a non-transitory machine-readable storage storing such a program.

Abstract

An apparatus and a method for editing display information in an electronic device having a touchscreen are provided. The method includes, when a first edit event occurs with a first touch maintained, storing object information of a first touch point, and when a second touch is detected, displaying an object of the first touch point on a second touch point.

Description

    CROSS-REFERENCE TO RELATED APPLICATION(S)
  • This application is a continuation application of prior application Ser. No. 14/016,799, filed on Sep. 3, 2013, which has issued as U.S. Pat. No. 10,095,401 on Oct. 9, 2018 and was based on and claimed priority under 35 U.S.C. § 119(a) of a Korean patent application filed on Sep. 14, 2012 in the Korean Intellectual Property Office and assigned Serial No. 10-2012-0101927, the entire disclosure of which is hereby incorporated by reference.
  • TECHNICAL FIELD
  • The present disclosure relates to an electronic device. More particularly, the present disclosure relates to an apparatus and a method for editing display information in an electronic device.
  • BACKGROUND
  • With advances in communication technology, multimedia services used by portable electronic devices have increased in popularity. Accordingly, the amount of information to be processed and displayed by the portable electronic device has increased. At the same time, portable electronic devices are now typically provided with a touchscreen, which makes it possible to increase the size of the display unit by improving space utilization efficiency.
  • A touchscreen is an Input/Output (I/O) unit for performing input and output of information using one screen. In the case where a portable electronic device uses a touchscreen, the portable electronic device may increase a display area by removing a separate input unit such as a keypad. In case of using the touchscreen as described above, since the electronic device does not have a separate input unit such as a keypad, a manipulation method of the electronic device is different from that of an electronic device having a separate input unit. Furthermore, since the electronic device may display more information via an expanded screen of the touchscreen, the electronic device may require a separate user interface corresponding to the touchscreen.
  • Accordingly, there is a need for an improved apparatus and method for editing display information in an electronic device having a touchscreen.
  • The above information is presented as background information only to assist with an understanding of the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the present disclosure.
  • SUMMARY
  • Aspects of the present disclosure are to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present disclosure is to provide an apparatus and a method for editing display information in an electronic device having a touchscreen.
  • Another aspect of the present disclosure is to provide an apparatus and a method for editing display information according to a sponge edit method in an electronic device having a touchscreen.
  • Another aspect of the present disclosure is to provide an apparatus and a method for editing display information using a touch pen in an electronic device having a touchscreen.
  • Another aspect of the present disclosure is to provide an apparatus and a method for editing display information by considering touch information by a touch pen and button input information in an electronic device having a touchscreen.
  • Another aspect of the present disclosure is to provide an apparatus and a method for editing an icon by considering touch information of a touch pen and button input information in an electronic device having a touchscreen.
  • Another aspect of the present disclosure is to provide an apparatus and a method for editing text by considering touch information of a touch pen and button input information in an electronic device having a touchscreen.
  • Another aspect of the present disclosure is to provide an apparatus and a method for editing a reproduction list by considering touch information of a touch pen and button input information in an electronic device having a touchscreen.
  • In accordance with an aspect of the present disclosure, a method for editing an object displayed on a display unit in an electronic device is provided. The method includes, when a first edit event occurs with a first touch maintained, storing object information of a first touch point, and when detecting a second touch, displaying an object of the first touch point on a second touch point.
  • In accordance with another aspect of the present disclosure, a method for editing an object displayed on a display unit in an electronic device is provided. The method includes determining an object group including at least one object among a plurality of objects displayed on a display unit, when a first edit event occurs with a first touch of at least one of the objects included in the object group maintained, storing information of the at least one object included in the object group, and when detecting a second touch, displaying the at least one object included in the object group on a second touch point.
  • In accordance with still another aspect of the present disclosure, a method for editing an object displayed on a display unit in an electronic device is provided. The method includes, when a first edit event occurs with a first touch maintained, determining whether an object has been touched via the first touch, and when the object has been touched via the first touch, cutting the object of the first touch point.
  • In accordance with further another aspect of the present disclosure, a method for operating an electronic device is provided. The method includes, when detecting a button input of a touch pen with a first touch maintained by the touch pen, cutting at least one object on a first touch point according to a sponge edit method, displaying cutting information of the at least one object on a display unit, and when detecting a button input of the touch pen with a second touch maintained by the touch pen, pasting the cut at least one object on a second touch point according to the sponge edit method.
  • In accordance with another aspect of the present disclosure, an electronic device is provided. The electronic device includes a touchscreen, at least one processor, and a memory, wherein when a first edit event occurs with a first touch maintained, the processor may store information of an object of a first touch point in the memory, and when detecting a second touch, display the object of the first touch point on a second touch point of the touchscreen.
  • In accordance with another aspect of the present disclosure, an electronic device is provided. The electronic device includes a touchscreen, at least one processor, and a memory, wherein the processor may determine an object group including at least one of a plurality of objects displayed on the touchscreen, when a first edit event occurs with a first touch of one of the objects included in the object group maintained, store information of the at least one object included in the object group in the memory, when detecting a second touch, display at least one object included in the object group on a second touch point of the touchscreen.
  • In accordance with another aspect of the present disclosure, an electronic device is provided. The electronic device includes a touchscreen, at least one processor, and a memory, wherein when a first edit event occurs with a first touch maintained, the processor may determine whether an object is touched via the first touch, and when the object is touched via the first touch, cut the object of a first touch point.
  • Other aspects, advantages, and salient features of the disclosure will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses various embodiments of the present disclosure.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other aspects, features, and advantages of certain embodiments of the present disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings in which:
  • FIG. 1 is a block diagram illustrating an electronic device according to an embodiment of the present disclosure;
  • FIG. 2 is a block diagram illustrating a processor according to an embodiment of the present disclosure;
  • FIG. 3 is a flowchart illustrating a procedure for editing display information in an electronic device according to an embodiment of the present disclosure;
  • FIG. 4 is a flowchart illustrating a procedure for editing display information using a sponge edit method in an electronic device according to an embodiment of the present disclosure;
  • FIG. 5 is a flowchart illustrating a procedure for editing display information using a sponge edit method in an electronic device according to an embodiment of the present disclosure;
  • FIG. 6 is a flowchart illustrating a procedure for editing display information using a sponge edit method in an electronic device according to an embodiment of the present disclosure;
  • FIG. 7 is a flowchart illustrating a procedure for editing a plurality of display information using a sponge edit method in an electronic device according to an embodiment of the present disclosure;
  • FIG. 8 is a flowchart illustrating a procedure for editing a plurality of display information using a sponge edit method in an electronic device according to an embodiment of the present disclosure;
  • FIG. 9 is a flowchart illustrating a procedure for selectively editing display information using a sponge edit method in an electronic device according to an embodiment of the present disclosure;
  • FIG. 10 is a flowchart illustrating a procedure for selectively editing display information using a sponge edit method in an electronic device according to an embodiment of the present disclosure;
  • FIGS. 11A, 11B, 11C, 11D, 11E, 11F, and 11G are views illustrating a screen configuration for editing an icon using a touch pen in an electronic device according to an embodiment of the present disclosure;
  • FIGS. 12A, 12B, 12C, 12D, and 12E are views illustrating a screen configuration for editing an icon using a hand touch in an electronic device according to an embodiment of the present disclosure;
  • FIGS. 13A, 13B, 13C, 13D, 13E, 13F, and 13G are views illustrating a screen configuration for editing text using a touch pen in an electronic device according to an embodiment of the present disclosure;
  • FIGS. 14A, 14B, 14C, 14D, and 14E are views illustrating a screen configuration for editing text using a hand touch in an electronic device according to an embodiment of the present disclosure;
  • FIGS. 15A, 15B, 15C, 15D, 15E, 15F, and 15G are views illustrating a screen configuration for editing a reproduction list using a touch pen in an electronic device according to an embodiment of the present disclosure;
  • FIGS. 16A, 16B, 16C, 16D, and 16E are views illustrating a screen configuration for editing a reproduction list using a hand touch in an electronic device according to an embodiment of the present disclosure;
  • FIGS. 17A, 17B, 17C, 17D, 17E, 17F, 17G, and 17H are views illustrating a screen configuration for editing a plurality of icons using a touch pen in an electronic device according to an embodiment of the present disclosure;
  • FIGS. 18A, 18B, 18C, 18D, 18E, and 18F are views illustrating a screen configuration for editing a plurality of icons using a hand touch in an electronic device according to an embodiment of the present disclosure; and
  • FIGS. 19A, 19B, and 19C are views illustrating a screen configuration for displaying a plurality of edit lists in an electronic device according to an embodiment of the present disclosure.
  • Throughout the drawings, like reference numerals will be understood to refer to like parts, components and structures.
  • DETAILED DESCRIPTION
  • The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of various embodiments of the present disclosure as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the various embodiments described herein can be made without departing from the scope and spirit of the present disclosure. In addition, descriptions of well-known functions and constructions are omitted for clarity and conciseness.
  • The terms and words used in the following description and claims are not limited to the bibliographical meanings, but, are merely used by the inventor to enable a clear and consistent understanding of the present disclosure. Accordingly, it should be apparent to those skilled in the art that the following description of various embodiments of the present disclosure are provided for illustration purpose only and not for the purpose of limiting the disclosure as defined by the appended claims and their equivalents.
  • It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a component surface” includes reference to one or more of such surfaces.
  • Various embodiments of the present disclosure provide a technology for editing display information in an electronic device having a touchscreen. Here, the display information is information displayed on a display unit to provide a service in an electronic device, and may include an object forming a screen displayed on the display unit for a graphical interface between the electronic device and a user.
  • In the following description, the term “electronic device” denotes any of a plurality of devices and may be a device such as a portable electronic device, a portable terminal, a mobile station, a Personal Digital Assistant (PDA), a laptop computer, a smart phone, a net-book, a television (TV), a Mobile Internet Device (MID), an Ultra Mobile Personal Computer (UMPC), a tablet Personal Computer (PC), a desktop computer, a smart TV, a digital camera, a wrist watch, a navigation device, an MP3 player, etc. Also, the electronic device may be an arbitrary wireless device that combines the functions of two or more devices among the above devices.
  • Also, in the following description, the electronic device may edit an object according to a sponge edit method. The electronic device may perform editing such as copy, move, cut, paste, delete, etc. on an object using the sponge edit method. In the following description, it may be assumed that the electronic device performs cut and paste editing on an object according to the sponge edit method. However, the electronic device may provide editing such as move and copy via the sponge edit method in the same manner. Here, the sponge edit method may denote an edit method in which an object touched by a touch pen or a finger is absorbed into the touch pen or the finger, similar to a sponge or pipette absorbing liquid, and a previously absorbed object is emitted and displayed at a point touched by the touch pen or the finger, similar to the sponge or pipette emitting liquid.
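  • Conceptually, the sponge edit method behaves like a clipboard attached to the touch instrument. The following minimal Kotlin model is our paraphrase of that metaphor, with invented names; it is not an API defined by the disclosure.

```kotlin
// "Sponge" clipboard: cut absorbs an object into the pen or finger, paste
// emits a previously absorbed object at a later touch point.
class Sponge<T> {
    private val absorbed = ArrayDeque<T>()
    fun absorb(obj: T) = absorbed.addLast(obj)     // cut, with absorb animation
    fun emit(): T? = absorbed.removeFirstOrNull()  // paste, with emit animation
    val count: Int get() = absorbed.size           // shown on the notice icon
}
```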
  • FIG. 1 is a block diagram illustrating an electronic device according to an embodiment of the present disclosure.
  • Referring to FIG. 1, the electronic device 100 may include a memory 110, a processor unit 120, an audio processor 130, an Input/Output (I/O) controller 140, a touchscreen 150, and an input unit 160. Here, a plurality of memories 110 may exist.
  • The memory 110 may include a program storage 111 for storing a program for controlling an operation of the electronic device 100, and a data storage 112 for storing data occurring during execution of a program.
  • The data storage 112 may store an object cut by an object edit program 114. In the case where a delete event occurs, the data storage 112 may delete object information stored via cut.
  • The program storage 111 may include a Graphic User Interface (GUI) program 113, the object edit program 114, and at least one application 115. Here, a program included in the program storage 111 is a set of instructions and may be expressed as an instruction set.
  • The GUI program 113 may include at least one software element for providing a user interface using graphics on a display unit 152. The GUI program 113 may control to display application information driven by a processor 122 on the display unit 152. At this point, the GUI program 113 may control to represent a sponge effect of an object edit by the object edit program 114. For example, in case of cutting an object touched by a touch pen or a finger, the GUI program 113 may control to represent an effect illustrating the object being absorbed by the touch pen or the finger similar to a sponge or pipette absorbing liquid. Also, the GUI program 113 may control to represent an effect of displaying a previously absorbed object on a point touched by the touch pen or the finger like the sponge emitting liquid.
  • The object edit program 114 may include at least one software element for editing an object by considering touch information detected via a touch input unit 154 and edit event occurrence information. For example, in the case where a cut event occurs while a first touch on an object is maintained, the object edit program 114 may control to cut the object of the first touch point. After that, in the case where a second touch is detected, the object edit program 114 may control to paste the cut object on the second touch point. At this point, in the case where a paste event occurs with the second touch maintained, the object edit program 114 may control to paste the cut object on the second touch point. Here, the object edit program 114 may detect cut event occurrence and paste event occurrence using at least one input method of a button input of the touch pen, a hardware button input of the electronic device, an icon selection, a motion detection of the electronic device, a user gesture detection, etc. Also, a cut event and a paste event may be generated by the same input method or by different input methods.
  • For another example, in the case where a cut event occurs by maintaining a first touch on at least one object included in an object group, the object edit program 114 may control to cut an object included in the object group. After that, when detecting a second touch, the object edit program 114 may control to paste the cut object of the object group on the second touch point. At this point, in the case where a paste event occurs with the second touch maintained, the object edit program 114 may control to paste the cut object on the second touch point.
  • For another example, in the case where an edit event occurs while a touch detected via the touch input unit 154 is maintained, the object edit program 114 may determine whether a touch of an object occurs. For example, the object edit program 114 may determine whether the touch of the object occurs by comparing an object recognize region with a touch point. In the case where a touch of an object occurs, the object edit program 114 may control to cut the object on the touch point. Meanwhile, in the case where the touch of the object does not occur, the object edit program 114 may control to paste a previously cut object on the touch point. At this point, the object edit program 114 may recognize the object recognize region differently depending on the touch method. For example, in the case where a touch is detected via the touch pen, the object edit program 114 may recognize a narrower object recognize region than in the case where a touch is detected by a hand.
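  • The touch-method-dependent recognize region can be pictured as hit-testing against a box whose slop depends on the input instrument: a pen gets a tighter box than a finger. The type names and slop sizes below are assumed purely for illustration.

```kotlin
// Hypothetical hit-test with a recognize region that varies by touch method.
data class Box(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    fun contains(x: Int, y: Int) = x in left..right && y in top..bottom
    fun inflate(d: Int) = Box(left - d, top - d, right + d, bottom + d)
}

enum class TouchMethod { PEN, FINGER }

fun isObjectTouched(bounds: Box, x: Int, y: Int, method: TouchMethod): Boolean {
    val slop = if (method == TouchMethod.PEN) 2 else 12  // px, assumed values
    return bounds.inflate(slop).contains(x, y)
}
// An edit event would then cut when isObjectTouched(...) is true and paste a
// previously cut object otherwise, as described above.
```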
  • In case of cutting a plurality of objects, the object edit program 114 may control to paste at least one of the plurality of cut objects selected by a user.
  • The application 115 may include a software element for at least one application installed in the electronic device 100.
  • The processor unit 120 may include a memory interface 121, at least one processor 122, and a peripheral interface 123. Here, the memory interface 121, the at least one processor 122, and the peripheral interface 123 included in the processor unit 120 may be integrated in at least one Integrated Circuit (IC) or implemented as separate elements.
  • The memory interface 121 may control an access of the memory 110 by an element such as the processor 122 or the peripheral interface 123.
  • The peripheral interface 123 may control connection between I/O peripherals of the electronic device 100 and the processor 122 and the memory interface 121.
  • The processor 122 may control the electronic device 100 to provide various multimedia services using at least one software program. At this point, the processor 122 may control to execute at least one program stored in the memory 110 and provide a service corresponding to the relevant program. For example, the processor 122 may execute the object edit program 114 stored in the program storage 111 to edit an object by considering touch information detected via the touch input unit 154 and edit event occurrence information.
  • The audio processor 130 may provide an audio interface between a user and the electronic device 100 via a speaker 131 and a microphone 132. For example, in case of cutting an object according to a sponge edit method, the audio processor 130 may generate a sound via the speaker 131 as if the object were absorbed by a sponge or pipette. For another example, in case of pasting an object according to the sponge edit method, the audio processor 130 may generate a sound via the speaker as if the object were emitted.
  • The I/O controller 140 may provide an interface between an I/O unit such as the touchscreen 150 and the input unit 160 and the peripheral interface 123.
  • The touchscreen 150 is an I/O unit for performing output of information and input of information, and may include the display unit 152 and the touch input unit 154.
  • The display unit 152 may display state information of the electronic device 100, a character input by a user, a moving picture, a still picture, etc. For example, the display unit 152 may display application information driven by the processor 122. For another example, in the case where the object edit program 114 performs a cut on an object, the display unit 152 may represent an animation effect as if the relevant object were being absorbed by the touch pen or the finger that has touched the object similar to a sponge or pipette absorbing liquid. For still another example, in the case where the object edit program 114 performs paste on an object, the display unit 152 may represent an animation effect as if the object were emitted from the touch pen or the finger like the sponge or pipette emitting liquid.
  • The touch input unit 154 may provide touch information detected via a touch panel to the processor unit 120 via the I/O controller 140. At this point, the touch input unit 154 may provide touch information by the touch pen or the finger to the processor unit 120 via the I/O controller 140.
  • The input unit 160 may provide input data generated by a user's selection to the processor unit 120 via the I/O controller 140. For example, the input unit 160 may include only a control button for controlling the electronic device 100. For another example, the input unit 160 may include a keypad for receiving input data from a user.
  • Though not shown, the electronic device 100 may further include a communication system for performing a communication function for voice communication and data communication. At this point, the communication system may be classified into a plurality of communication submodules supporting different communication networks. For example, though not limited thereto, the communication network may include a Global System for Mobile Communication (GSM) network, an Enhanced Data GSM Environment (EDGE) network, a Code Division Multiple Access (CDMA) network, a Wideband CDMA (WCDMA) network, a Long Term Evolution (LTE) network, an Orthogonal Frequency Division Multiple Access (OFDMA) network, a wireless Local Area Network (LAN), a Bluetooth network, a Near Field Communication (NFC) network, etc.
  • In the above embodiment, the processor 122 may execute software elements stored in the program storage 111 inside one module to edit an object.
  • In another embodiment, the processor 122 may be configured to include elements for editing an object as separate modules as illustrated in FIG. 2.
  • FIG. 2 is a block diagram illustrating a processor according to an embodiment of the present disclosure.
  • Referring to FIG. 2, the processor 122 may include an object edit controller 200, and a display controller 210.
  • The object edit controller 200 may execute the object edit program 114 stored in the program storage 111 to edit an object. For example, in the case where a cut event occurs with a first touch of an object maintained, the object edit controller 200 may cut an object on the first touch point. After that, when detecting a second touch, the object edit controller 200 may paste the cut object on the second touch point. At this point, in the case where a paste event occurs with the second touch maintained, the object edit controller 200 may paste the cut object on the second touch point. Here, the object edit controller 200 may detect occurrence of a cut event and occurrence of a paste event using at least one input method of a button input of the touch pen, a hardware button input of the electronic device, an icon selection, a motion detection of the electronic device, a user gesture detection, etc. At this point, the cut event and the paste event may be generated by the same input method or other input methods.
  • For another example, in the case where a cut event occurs by maintaining a first touch of at least one object included in an object group, the object edit controller 200 may cut the object included in the object group. After that, when detecting a second touch, the object edit controller 200 may paste the cut object of the object group on the second touch point. At this point, in the case where a paste event occurs with the second touch maintained, the object edit controller 200 may paste the cut object on the second touch point.
  • For another example, in the case where an edit event occurs by maintaining a touch detected via the touch input unit 154, the object edit controller 200 may determine whether a touch of the object occurs. At this point, the object edit controller 200 may determine whether the touch of the object occurs by comparing an object recognize region with a touch point. In the case where the touch of the object occurs, the object edit controller 200 may cut the object on the touch point. In contrast, in the case where the touch of the object does not occur, the object edit controller 200 may paste a previously cut object on the touch point. At this point, the object edit controller 200 may recognize an object recognize region differently depending on a touch method. For example, in the case where a touch is detected via the touch pen, the object edit controller 200 may recognize an object recognize region narrower than that in the case where a touch is detected by a hand.
  • In case of cutting a plurality of objects, the object edit controller 200 may control to paste at least one of the plurality of cut objects selected by a user.
  • The display controller 210 may control to execute the GUI program 113 stored in the program storage 111 and display a user interface using graphics on the display unit 152. For example, the display controller 210 may control to display information of an application driven by the processor 122 on the display unit 152. For another example, in the case where the object edit controller 200 performs a cut of an object, the display controller 210 may control to represent an animation effect as if a relevant object were absorbed into the touch pen or the finger that has touched the object similar to a sponge or pipette absorbing liquid. For still another example, in the case where the object edit controller 200 performs paste on an object, the display controller 210 may control to represent an animation effect as if the object were emitted from the touch pen or the finger like a sponge emitting liquid.
  • Though not shown, the processor 122 may further include an audio controller for controlling to generate sound effects depending on the sponge edit method. For example, in the case where the object edit controller 200 cuts an object, the audio controller may control to generate a sound via the speaker 131 as if the object were absorbed. For another example, in the case where the object edit controller 200 pastes an object, the audio controller may control to generate a sound via the speaker 131 as if the object were emitted.
  • As described above, the electronic device may edit an object by considering touch information and edit event occurrence information. Accordingly, to discriminate an object edit function from a different service that may occur by considering touch information, the electronic device may switch to an object edit mode and edit an object by considering touch information and edit event occurrence information. Here, the electronic device may enter the object edit mode using at least one input method of a button input of the touch pen, a hardware button input of the electronic device, an icon selection, a motion detection of the electronic device, a user gesture detection, etc.
  • In the above embodiment, the object edit controller 200 may cut an object on a first touch point and paste the cut object on a second touch point. At this point, the object edit controller 200 may paste the cut object by considering whether an object exists on the second touch point. For example, in the case where another object exists on the second touch point, the object edit controller 200 may change the position of another object existing on the second touch point and paste the cut object on the second touch point. For another example, the object edit controller 200 may change the paste position on which the cut object is to be pasted to a point on which another object does not exist and paste the cut object.
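  • Both collision policies described above can be sketched over a simple grid model: displace the occupant of the second touch point, or shift the paste target to the nearest free position. The grid representation, the names, and the rightward search are illustrative assumptions.

```kotlin
// Hypothetical icon grid keyed by cell coordinates.
fun pasteAt(grid: MutableMap<Pair<Int, Int>, String>, cell: Pair<Int, Int>,
            obj: String, displaceOccupant: Boolean) {
    val occupant = grid[cell]
    when {
        occupant == null -> grid[cell] = obj
        displaceOccupant -> {                            // move the occupant aside
            grid[nearestFreeCell(grid, cell)] = occupant
            grid[cell] = obj
        }
        else -> grid[nearestFreeCell(grid, cell)] = obj  // paste beside it instead
    }
}

// Naive scan rightward for the first unoccupied cell.
fun nearestFreeCell(grid: Map<Pair<Int, Int>, String>, from: Pair<Int, Int>): Pair<Int, Int> {
    var x = from.first
    val y = from.second
    while (grid.containsKey(x to y)) x++
    return x to y
}
```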
  • FIG. 3 is a flowchart illustrating a procedure for editing display information in an electronic device according to an embodiment of the present disclosure.
  • Referring to FIG. 3, when a first edit event occurs while a first touch of a first object is maintained in operation 301, the electronic device may proceed to operation 303 to store the first object. At this point, the electronic device may generate an animation effect as if the first object were absorbed into the touch pen or the finger that has touched the first object according to the sponge edit method. Here, the electronic device may detect occurrence of the first edit event by considering at least one of button input information of the touch pen, hardware button input information of the electronic device, icon selection information, motion detection information of the electronic device, user gesture detection information, etc.
  • After that, the electronic device may proceed to operation 305 and when detecting a second touch, proceed to operation 307 to display the first object stored in operation 303 on the second touch point. For example, the electronic device may generate an animation effect as if the first object were emitted from the touch pen or the finger according to the sponge edit method and display the first object on the second touch point.
  • FIG. 4 is a flowchart illustrating a procedure for editing display information using a sponge edit method in an electronic device according to an embodiment of the present disclosure.
  • Referring to FIG. 4, the electronic device may determine whether a first touch of a first object is detected in operation 401. For example, the electronic device may determine whether the first touch of the first object is detected via the touch input unit 154.
  • When detecting the first touch of the first object, the electronic device may proceed to operation 403 to determine whether a first edit event occurs while the first touch is maintained. For example, with the first touch maintained, the electronic device may determine whether a cut event occurs by considering at least one of button input information of the touch pen, hardware button input information of the electronic device, icon selection information, motion detection information of the electronic device, user gesture detection information, etc.
  • In the case where the first edit event does not occur, the electronic device may proceed to operation 405 to determine whether the first touch is released.
• In the case where the first touch is released, the electronic device may recognize that it is not to perform editing on the first object. Accordingly, the electronic device ends the present algorithm.
• In the case where the first touch is not released, the electronic device may proceed to operation 403 to determine whether the first edit event occurs with the first touch maintained.
  • In the case where the first edit event occurs with the first touch maintained in operation 403, the electronic device may proceed to operation 407 to cut the first object on the first touch point. At this point, the electronic device generates an animation effect as if the first object were absorbed into the touch pen or the finger that has touched the first object according to the sponge edit method. The electronic device may generate a sound as if the first object were absorbed according to the sponge edit method. Here, the electronic device may store the cut first object in the data storage 112.
  • After that, the electronic device may proceed to operation 409 to determine whether a second touch is detected. For example, the electronic device may determine whether the second touch is detected via the touch input unit 154.
  • When detecting the second touch, the electronic device may proceed to operation 411 to paste the first object cut in operation 407 on the second touch point. For example, the electronic device may generate an animation effect as if the first object were emitted from the touch pen or the finger that has touched the second touch point according to the sponge edit method and display the first object on the second touch point. The electronic device may generate a sound as if the first object were emitted according to the sponge edit method.
• As described above, the electronic device may paste the cut first object on the second touch point. At this point, the electronic device may delete cut information of the first object from the data storage 112. For another example, the electronic device may store the cut information of the first object in the data storage 112 until a cancel event of the first object occurs. Here, in case of receiving button input information from the touch pen while the pen is not touching the screen, the electronic device may recognize that the cancel event has occurred. For still another example, the electronic device may store the cut information of the first object in the data storage 112 for a reference time after pasting the first object.
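• The three retention options above might be modeled as in the sketch below; CutRetention and scheduleDelete are assumed names, not the disclosed implementation.

```kotlin
// Hypothetical sketch of the three retention policies for cut information
// kept in data storage. All names are illustrative.
sealed class CutRetention {
    object DeleteAfterPaste : CutRetention()            // remove once pasted
    object KeepUntilCancelEvent : CutRetention()        // e.g., pen button pressed in the air
    data class KeepForReferenceTime(val millis: Long) : CutRetention()
}

fun afterPaste(policy: CutRetention, deleteCutInfo: () -> Unit) {
    when (policy) {
        is CutRetention.DeleteAfterPaste -> deleteCutInfo()
        is CutRetention.KeepUntilCancelEvent -> { /* wait for a cancel event */ }
        is CutRetention.KeepForReferenceTime ->
            scheduleDelete(policy.millis, deleteCutInfo)
    }
}

// Assumed helper: delete the cut information after the reference time elapses.
fun scheduleDelete(delayMillis: Long, action: () -> Unit) {
    java.util.Timer().schedule(object : java.util.TimerTask() {
        override fun run() = action()
    }, delayMillis)
}
```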
  • The electronic device may paste the first object by considering whether an object exists on the second touch point. For example, in the case where another object exists on the second touch point, the electronic device may change the position of the other object existing on the second touch point and paste the first object on the second touch point. For another example, the electronic device may change the paste position of the first object to a point on which another object does not exist and paste the first object.
  • FIG. 5 is a flowchart illustrating a procedure for editing display information using a sponge edit method in an electronic device according to another embodiment of the present disclosure.
  • Referring to FIG. 5, the electronic device may determine whether a first touch of a first object is detected in operation 501. For example, the electronic device may determine whether the first touch of the first object is detected via the touch input unit 154.
• When detecting the first touch of the first object, the electronic device may proceed to operation 503 to determine whether a first edit event occurs with the first touch maintained. For example, the electronic device may determine whether a cut event occurs by considering at least one of button input information of the touch pen, hardware button input information of the electronic device, icon selection information, motion detection information of the electronic device, user gesture detection information, etc.
  • In the case where the first edit event does not occur, the electronic device may proceed to operation 505 to determine whether the first touch is released.
• In the case where the first touch is released, the electronic device may recognize that it is not to perform editing on the first object. Accordingly, the electronic device may end the present algorithm.
  • In the case where the first touch is not released, the electronic device may proceed to operation 503 to determine whether the first edit event occurs with the first touch maintained.
  • In the case where the first edit event occurs with the first touch maintained in operation 503, the electronic device may proceed to operation 507 to cut the first object on the first touch point. At this point, the electronic device generates an animation effect as if the first object were absorbed into the touch pen or the finger that has touched the first object according to the sponge edit method. Additionally, the electronic device may generate a sound as if the first object were absorbed according to the sponge edit method. Here, the electronic device may store the cut first object in the data storage 112.
  • After that, the electronic device may proceed to operation 509 to determine whether a second touch is detected. For example, the electronic device may determine whether the second touch is detected via the touch input unit 154.
• When detecting the second touch, the electronic device may proceed to operation 511 to determine whether a second edit event occurs with the second touch maintained. For example, with the second touch maintained, the electronic device may determine whether a paste event occurs by considering at least one of button input information of the touch pen, hardware button input information of the electronic device, icon selection information, motion detection information of the electronic device, user gesture detection information, etc.
  • In the case where the second edit event does not occur, the electronic device may proceed to operation 513 to determine whether the second touch is released.
• In the case where the second touch is released, the electronic device may recognize that it is not to perform editing on the first object. Accordingly, the electronic device may end the present algorithm. At this point, the electronic device may restore the cut first object to its state before the cut event occurred.
  • In contrast, in the case where the second touch is not released, the electronic device may proceed to operation 511 to determine whether the second edit event occurs with the second touch maintained.
• Meanwhile, in the case where the second edit event occurs with the second touch maintained in operation 511, the electronic device may proceed to operation 515 to paste the first object cut in operation 507 on the second touch point. For example, the electronic device may generate an animation effect as if the first object were emitted from the touch pen or the finger that has touched the second touch point according to the sponge edit method and display the first object on the second touch point. Additionally, the electronic device may generate a sound as if the first object were emitted according to the sponge edit method.
• As described above, the electronic device may paste the cut first object on the second touch point. At this point, the electronic device may delete cut information of the first object from the data storage 112. For another example, the electronic device may store the cut information of the first object in the data storage 112 until a cancel event of the first object occurs. Here, the electronic device may determine a cancel event by considering button input information received from the touch pen while it is not touching the screen. For still another example, the electronic device may paste the first object and store the cut information of the first object in the data storage 112 for a reference time.
• The electronic device may paste the first object by considering whether an object exists on the second touch point. For example, in the case where another object exists on the second touch point, the electronic device may change the position of the other object and paste the first object on the second touch point. For another example, the electronic device may change the paste position of the first object to a point on which another object does not exist and paste the first object.
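• The FIG. 5 variant above differs from FIG. 4 in that pasting requires a second edit event while the second touch is held, and releasing the touch without one restores the object. A sketch under the same assumed types; ConfirmSpongeEditor is a hypothetical name.

```kotlin
// Hypothetical sketch of the FIG. 5 flow. Point and ScreenObject as
// defined in the earlier sketch.
class ConfirmSpongeEditor(private val objects: MutableList<ScreenObject>) {
    private var held: ScreenObject? = null
    private var heldOrigin: Point? = null

    // Operation 507: cut the object and remember its original position.
    fun cut(target: ScreenObject) {
        objects.remove(target)
        held = target
        heldOrigin = target.position
    }

    // Operations 511/515: a second edit event while the second touch is
    // held pastes ("emits") the object at the touch point.
    fun onSecondEditEvent(touch: Point) {
        held?.let { it.position = touch; objects.add(it); held = null }
    }

    // Operation 513: the second touch is released without an edit event,
    // so restore the cut object to its pre-cut state.
    fun onSecondTouchReleased() {
        held?.let { obj ->
            heldOrigin?.let { obj.position = it }
            objects.add(obj)
            held = null
        }
    }
}
```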
  • FIG. 6 is a flowchart illustrating a procedure for editing display information using a sponge edit method in an electronic device according to still another embodiment of the present disclosure.
  • Referring to FIG. 6, the electronic device may determine whether a touch is detected in operation 601. For example, the electronic device may determine whether a touch is detected via the touch input unit 154.
• When detecting the touch, the electronic device may proceed to operation 603 to determine whether an edit event occurs with the touch maintained. For example, with the touch maintained, the electronic device may determine whether an edit event occurs by considering at least one of button input information of the touch pen, hardware button input information of the electronic device, icon selection information, motion detection information of the electronic device, user gesture detection information, etc.
• In the case where the edit event does not occur, the electronic device may proceed to operation 605 to determine whether the touch is released.
• In the case where the touch is released, the electronic device may recognize that it is not to perform object editing. Accordingly, the electronic device may end the present algorithm.
  • In the case where the touch is not released, the electronic device may proceed to operation 603 to determine whether an edit event occurs with the touch maintained.
• In the case where an edit event occurs with the touch maintained in operation 603, the electronic device may proceed to operation 607 to determine whether an object exists on the touch point detected in operation 601. For example, the electronic device may determine whether a touch of an object has occurred by comparing an object recognition region with the touch point. At this point, the electronic device may set the object recognition region differently depending on the touch method. More specifically, because the touch pen allows a more precise touch than a hand touch, the electronic device may use a narrower object recognition region when detecting a touch by the touch pen than when detecting a hand touch. That is, the region within which a relevant object is recognized may be determined by considering the touch information.
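• A minimal sketch of this touch-method-dependent recognition region; the radius values are illustrative assumptions, not taken from the disclosure.

```kotlin
// Hypothetical sketch: the object recognition region is narrower for a
// pen touch than for a finger touch, reflecting the pen's precision.
enum class TouchMethod { PEN, FINGER }

fun recognitionRadius(method: TouchMethod): Float = when (method) {
    TouchMethod.PEN -> 8f      // precise tip: small hit region (illustrative px)
    TouchMethod.FINGER -> 24f  // broad contact: larger hit region (illustrative px)
}

// An object is touched if the touch point falls within its recognition
// region, whose size depends on the touch method.
fun isObjectTouched(objX: Float, objY: Float, touchX: Float, touchY: Float,
                    method: TouchMethod): Boolean {
    val r = recognitionRadius(method)
    val dx = touchX - objX
    val dy = touchY - objY
    return dx * dx + dy * dy <= r * r
}
```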
• In the case where a touch of the object occurs in operation 607, the electronic device may proceed to operation 609 to cut the object on the touch point. At this point, the electronic device generates an animation effect as if the touched object were absorbed into the touch pen or the finger according to the sponge edit method. The electronic device may generate a sound as if the object were absorbed according to the sponge edit method. Here, the electronic device may store the cut object in the data storage 112.
  • In the case where a touch of the object does not occur in operation 607, the electronic device may proceed to operation 611 to paste a previously cut object on the touch point. For example, the electronic device may generate an animation effect as if the previously cut object were emitted from the touch pen or the finger that has touched the touch point according to the sponge edit method and display the object on the touch point. The electronic device may generate a sound as if the object were emitted according to the sponge edit method.
• As described above, the electronic device may paste the cut object on a touch point where an object does not exist. At this point, the electronic device may delete cut information of the object from the data storage 112. For another example, the electronic device may store the cut information of the object in the data storage 112 until a cancel event of the object occurs. Here, the electronic device may determine a cancel event by considering button input information received from the touch pen while it is not touching the screen.
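• The FIG. 6 flow thus reduces to a single toggle: cut when an object lies under the touch point, paste otherwise. A sketch under the same assumed types as above; ToggleSpongeEditor is a hypothetical name.

```kotlin
// Hypothetical sketch of the FIG. 6 flow. Point and ScreenObject as
// defined in the earlier sketch.
class ToggleSpongeEditor(private val objects: MutableList<ScreenObject>) {
    private var held: ScreenObject? = null

    fun onEditEvent(touch: Point, objectAt: (Point) -> ScreenObject?) {
        val target = objectAt(touch)
        if (target != null) {
            // Operation 609: an object is under the touch -> cut (absorb) it.
            objects.remove(target)
            held = target
        } else {
            // Operation 611: empty point -> paste (emit) the held object.
            held?.let {
                it.position = touch
                objects.add(it)
                held = null
            }
        }
    }
}
```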
  • FIGS. 11A, 11B, 11C, 11D, 11E, 11F, and 11G are views illustrating a screen configuration for editing an icon using a touch pen in an electronic device according to an embodiment of the present disclosure.
• As illustrated in FIGS. 3 to 6, the electronic device may edit a touched object by considering touch information detected via the touch input unit 154 and edit event occurrence information. For example, as illustrated in FIG. 11A, when detecting a button input 1103 of the touch pen with a touch 1101 of an icon "Flickr" by the touch pen maintained, the electronic device may cut the icon "Flickr". At this point, as illustrated in FIG. 11B, the electronic device may cut the icon "Flickr" with an animation effect as if the icon "Flickr" were absorbed into the touch pen like a sponge or pipette absorbing liquid. Similar to absorbing liquid with a pipette, the electronic device may cut the icon "Flickr" using the sponge effect at the point at which the button input is released after the button input 1103 of the touch pen has been detected.
  • In case of cutting the icon “Flickr”, the electronic device may rearrange icons displayed on the display unit 152 as illustrated in FIG. 11C. The electronic device may display a notice icon 1111 representing whether a cut icon exists as illustrated in FIG. 11C. At this point, the notice icon 1111 may represent the number of cut icons.
• After cutting the icon "Flickr", when detecting a second touch 1121 as illustrated in FIG. 11D, the electronic device may paste the cut icon "Flickr" on the second touch point as illustrated in FIG. 11G. At this point, as illustrated in FIG. 11F, the electronic device may paste the icon "Flickr" on the second touch point with an animation effect as if the icon "Flickr" were emitted from the touch pen like a sponge or pipette emitting liquid.
  • After cutting the icon “Flickr”, when detecting a button input 1133 of the touch pen with the second touch 1131 maintained as illustrated in FIG. 11E, the electronic device may paste the icon “Flickr” on the second touch point. At this point, as illustrated in FIG. 11F, the electronic device may paste the icon “Flickr” on the second touch point with an animation effect as if the icon “Flickr” were emitted from the touch pen like a sponge or pipette emitting liquid.
  • FIGS. 12A, 12B, 12C, 12D, and 12E are views illustrating a screen configuration for editing an icon using a hand touch in an electronic device according to an embodiment of the present disclosure.
• Referring to FIG. 12A, when detecting a hardware button input 1203 (e.g., a volume button) of the electronic device with a touch 1201 of the icon "Flickr" by a user's finger maintained, the electronic device may cut the icon "Flickr". At this point, the electronic device may cut the icon "Flickr" with an animation effect as if the icon "Flickr" were absorbed into the finger like a sponge or pipette absorbing liquid as illustrated in FIG. 12B. In case of cutting the icon "Flickr", the electronic device may rearrange icons displayed on the display unit 152 as illustrated in FIG. 11C. Also, the electronic device may display a notice icon 1111 representing whether a cut icon exists as illustrated in FIG. 11C.
  • After cutting the icon “Flickr”, when detecting a second touch 1211 as illustrated in FIG. 12C, the electronic device may paste the cut icon “Flickr” on the second touch point. At this point, as illustrated in FIG. 12E, the electronic device may paste the icon “Flickr” on the second touch point with an animation effect as if the icon “Flickr” were emitted from the finger like a sponge or pipette emitting liquid.
  • After cutting the icon “Flickr”, when detecting a hardware button input 1223 of the electronic device with the second touch 1221 maintained as illustrated in FIG. 12D, the electronic device may paste the cut icon “Flickr” on the second touch point. At this point, as illustrated in FIG. 12E, the electronic device may paste the icon “Flickr” on the second touch point with an animation effect as if the icon “Flickr” were emitted from the finger like a sponge or pipette emitting liquid.
  • In the above embodiment of the present disclosure, the electronic device may edit an icon according to the sponge edit method.
  • FIGS. 13A, 13B, 13C, 13D, 13E, 13F, and 13G are views illustrating a screen configuration for editing text using a touch pen in an electronic device according to an embodiment of the present disclosure.
  • FIGS. 14A, 14B, 14C, 14D, and 14E are views illustrating a screen configuration for editing text using a hand touch in an electronic device according to an embodiment of the present disclosure.
• In another embodiment of the present disclosure, the electronic device may edit text according to the sponge edit method as illustrated in FIGS. 13A to 13G or FIGS. 14A to 14E. For example, when detecting a button input 1303 of the touch pen with a touch 1301 of a word "wonderful" by the touch pen maintained as illustrated in FIG. 13A, the electronic device may cut the word "wonderful". At this point, as illustrated in FIG. 13B, the electronic device may cut the word "wonderful" with an animation effect as if the word "wonderful" were absorbed into the touch pen like a sponge or pipette absorbing liquid. In this case, the electronic device may automatically set a text region for cutting or determine the text region depending on a user's input information. For example, when detecting the button input 1303 of the touch pen with the touch 1301 maintained, the electronic device may automatically set a text region for cutting by considering a touch point and a word spacing point of the text. At this point, the electronic device may change the size of the text region by considering input information of a text region change bar 1305 as illustrated in FIG. 13A. Similar to absorbing liquid with a pipette, the electronic device may cut the word "wonderful" using the sponge effect at the point at which the button input is released after the button input 1303 of the touch pen is detected.
• In case of cutting the word "wonderful", the electronic device may rearrange text displayed on the display unit 152 as illustrated in FIG. 13C. As illustrated in FIG. 13C, the electronic device may display a notice icon 1311 representing whether cut text exists. At this point, the notice icon 1311 may display the number of cut text items.
• After cutting the word "wonderful", when detecting a second touch 1321 as illustrated in FIG. 13D, the electronic device may paste the cut word "wonderful" on the second touch point as illustrated in FIG. 13G. At this point, as illustrated in FIG. 13F, the electronic device may paste the word "wonderful" on the second touch point with an animation effect as if the word "wonderful" were emitted from the touch pen like a sponge or pipette emitting liquid.
  • After cutting the word “wonderful”, when detecting a button input 1333 of the touch pen with a second touch 1331 maintained as illustrated in FIG. 13E, the electronic device may paste the word “wonderful” on the second touch point. At this point, as illustrated in FIG. 13F, the electronic device may paste the word “wonderful” on the second touch point with an animation effect as if the word “wonderful” were emitted from the touch pen like a sponge or pipette emitting liquid.
• For another example, when detecting a hardware button input 1403 of the electronic device with a touch 1401 of the word "wonderful" by a user's finger maintained as illustrated in FIG. 14A, the electronic device may cut the word "wonderful". At this point, as illustrated in FIG. 14B, the electronic device may cut the word "wonderful" with an animation effect as if the word "wonderful" were absorbed into the finger like a sponge or pipette absorbing liquid. In this case, the electronic device may automatically set a text region for cutting or determine the text region depending on a user's input information. For example, when detecting the hardware button input 1403 with the touch 1401 by the finger maintained, the electronic device may automatically set a text region for cutting by considering a touch point and a word spacing point of the text. At this point, the electronic device may change the size of the text region by considering input information of a text region change bar 1405 as illustrated in FIG. 14A.
• In case of cutting the word "wonderful", the electronic device may rearrange the text displayed on the display unit 152 as illustrated in FIG. 13C. Also, the electronic device may display a notice icon 1311 representing whether cut text exists as illustrated in FIG. 13C.
  • After cutting the word “wonderful”, when detecting a second touch 1411 as illustrated in FIG. 14C, the electronic device may paste the cut word “wonderful” on the second touch point. At this point, as illustrated in FIG. 14E, the electronic device may paste the word “wonderful” with an animation effect as if the word “wonderful” were emitted from the finger like a sponge or pipette emitting liquid.
  • After cutting the word “wonderful”, when detecting a hardware button input 1423 of the electronic device with the second touch 1421 maintained as illustrated in FIG. 14D, the electronic device may paste the cut word “wonderful” on the second touch point. At this point, as illustrated in FIG. 14E, the electronic device may paste the word “wonderful” with an animation effect as if the word “wonderful” were emitted from the finger like a sponge or pipette emitting liquid.
  • In the above embodiment of the present disclosure, the electronic device may edit text according to the sponge edit method.
  • FIGS. 15A, 15B, 15C, 15D, 15E, 15F, and 15G are views illustrating a screen configuration for editing a reproduction list using a touch pen in an electronic device according to an embodiment of the present disclosure.
  • FIGS. 16A, 16B, 16C, 16D, and 16E are views illustrating a screen configuration for editing a reproduction list using a hand touch in an electronic device according to an embodiment of the present disclosure.
• In another embodiment of the present disclosure, the electronic device may edit a reproduction list according to the sponge edit method as illustrated in FIGS. 15A to 15G and FIGS. 16A to 16E. For example, when detecting a button input 1503 of the touch pen with a touch 1501 of a list "song 3" by the touch pen maintained as illustrated in FIG. 15A, the electronic device may cut the list "song 3". At this point, as illustrated in FIG. 15B, the electronic device may cut the list "song 3" with an animation effect as if the list "song 3" were absorbed into the touch pen like a sponge or pipette absorbing liquid. Similar to absorbing liquid with a pipette, the electronic device may cut the list "song 3" using the sponge effect at the point at which the button input is released after the button input 1503 of the touch pen is detected.
  • In case of cutting the list “song 3”, the electronic device may rearrange a reproduction list displayed on the display unit 152 as illustrated in FIG. 15C. As illustrated in FIG. 15C, the electronic device may display a notice icon 1511 representing whether a cut reproduction list exists. At this point, the notice icon 1511 may display the number of cut reproduction lists.
• After cutting the list "song 3", when detecting a second touch 1521 as illustrated in FIG. 15D, the electronic device may paste the cut list "song 3" on the second touch point as illustrated in FIG. 15G. At this point, as illustrated in FIG. 15F, the electronic device may paste the list "song 3" on the second touch point with an animation effect as if the list "song 3" were emitted from the touch pen like a sponge or pipette emitting liquid.
  • After cutting the list “song 3”, when detecting a button input 1533 of the touch pen with a second touch 1531 maintained as illustrated in FIG. 15E, the electronic device may paste the list “song 3” on the second touch point. At this point, as illustrated in FIG. 15F, the electronic device may paste the list “song 3” on the second touch point with an animation effect as if the list “song 3” were emitted from the touch pen like a sponge or pipette emitting liquid.
• For another example, when detecting a hardware button input 1603 of the electronic device with a touch 1601 of the list "song 3" by a user's finger maintained as illustrated in FIG. 16A, the electronic device may cut the list "song 3". At this point, as illustrated in FIG. 16B, the electronic device may cut the list "song 3" with an animation effect as if the list "song 3" were absorbed into the finger like a sponge or pipette absorbing liquid. In case of cutting the list "song 3", the electronic device may rearrange reproduction lists displayed on the display unit 152 as illustrated in FIG. 15C. Also, as illustrated in FIG. 15C, the electronic device may display a notice icon 1511 representing whether a cut reproduction list exists.
  • After cutting the list “song 3”, when detecting a second touch 1611 as illustrated in FIG. 16C, the electronic device may paste the cut list “song 3” on the second touch point. At this point, as illustrated in FIG. 16E, the electronic device may paste the list “song 3” on the second touch point with an animation effect as if the list “song 3” were emitted from the finger like a sponge or pipette emitting liquid.
• After cutting the list "song 3", when detecting a hardware button input 1623 of the electronic device with a second touch 1621 maintained as illustrated in FIG. 16D, the electronic device may paste the cut list "song 3" on the second touch point. At this point, as illustrated in FIG. 16E, the electronic device may paste the list "song 3" on the second touch point with an animation effect as if the list "song 3" were emitted from the finger like a sponge or pipette emitting liquid.
  • In case of cutting at least one object as illustrated in FIGS. 11C, 13C, and 15C, the electronic device may rearrange objects displayed on the display unit 152. In another embodiment of the present disclosure, in case of cutting at least one object, the electronic device may not rearrange the objects displayed on the display unit 152. For example, the electronic device may leave a cut object region as a vacant space.
  • FIG. 7 is a flowchart illustrating a procedure for editing a plurality of display information using a sponge edit method in an electronic device according to an embodiment of the present disclosure.
• Referring to FIG. 7, the electronic device may set an object group in operation 701. For example, the electronic device may recognize at least one object included in an object select region set by considering multi-touched points detected via the touch input unit 154 as the object group. For another example, the electronic device may recognize at least one object included in an object select region set by considering a touch point and drag information detected via the touch input unit 154 as the object group. For still another example, the electronic device may recognize at least one object included in an object select region set by considering touched points successively detected via the touch input unit 154 as the object group. For yet another example, when detecting an object select event, the electronic device may switch to an object select mode. At this point, the electronic device may recognize at least one object selected in the object select mode as the object group.
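• As one concrete reading of the grouping methods above, the following sketch treats the bounding box of successively touched points as the object select region; selectGroup is a hypothetical name and the geometry is an assumption, using the Point and ScreenObject types from the earlier sketch.

```kotlin
// Hypothetical sketch: group every object whose position falls inside the
// bounding box of the successively touched points.
fun selectGroup(touches: List<Point>, objects: List<ScreenObject>): List<ScreenObject> {
    if (touches.isEmpty()) return emptyList()
    val minX = touches.minOf { it.x }
    val maxX = touches.maxOf { it.x }
    val minY = touches.minOf { it.y }
    val maxY = touches.maxOf { it.y }
    return objects.filter {
        it.position.x in minX..maxX && it.position.y in minY..maxY
    }
}
```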
  • After setting the object group, the electronic device may proceed to operation 703 to determine whether a touch of at least one object among one or more objects included in the object group is detected.
  • When detecting a touch of an object included in the object group, the electronic device may proceed to operation 705 to determine whether an edit event occurs with a touch of the object included in the object group maintained. Here, the electronic device may determine whether a cut event occurs by considering at least one of button input information of the touch pen, hardware button input information of the electronic device, icon selection information, motion detection information of the electronic device, user gesture detection information, etc.
  • In the case where an edit event does not occur, the electronic device may proceed to operation 707 to determine whether a touch of the object included in the object group is released.
• In the case where the touch of the object included in the object group is released, the electronic device may recognize that it is not to perform editing on the object group. Accordingly, the electronic device may end the present algorithm.
  • In the case where the touch of the object included in the object group is not released, the electronic device may proceed to operation 705 to determine whether an edit event occurs with the touch of the object included in the object group maintained.
  • In the case where an edit event occurs with the touch of the object included in the object group maintained in operation 705, the electronic device may proceed to operation 709 to cut the object included in the object group. At this point, the electronic device generates an animation effect as if objects included in the object group were absorbed into the touch pen or the finger according to the sponge edit method. Additionally, the electronic device may generate a sound as if the objects included in the object group were absorbed into the touch pen or the finger according to the sponge edit method. Here, the electronic device may store cut object information in the data storage 112.
  • After that, the electronic device may proceed to operation 711 to determine whether a second touch is detected. For example, the electronic device may determine whether the second touch is detected via the touch input unit 154.
  • When detecting the second touch, the electronic device may proceed to operation 713 to paste the object included in the object group on the second touch point. For example, the electronic device may generate an animation effect as if objects included in the object group were emitted from the touch pen or the finger that has touched the second touch point according to the sponge edit method and display the objects included in the object group on the second touch point. Additionally, the electronic device may generate a sound as if the objects included in the object group were emitted according to the sponge edit method.
• As described above, the electronic device may paste the cut objects of the object group on the second touch point. At this point, the electronic device may paste the cut objects of the object group by considering whether an object exists on the second touch point. For example, in the case where another object exists on the second touch point, the electronic device may change the position of the other object and paste the cut objects of the object group on the second touch point. For another example, the electronic device may change the paste position of the cut objects of the object group to a point on which another object does not exist, and paste the objects there.
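• A short sketch of this group paste, under the same assumed types; pasteGroup, isOccupied, and freePointNear are hypothetical names, and the final per-object layout is left to the rearrangement step described earlier.

```kotlin
// Hypothetical sketch: emit each member of the cut group at (or near) the
// second touch point, skipping occupied points.
fun pasteGroup(group: List<ScreenObject>, touch: Point,
               freePointNear: (Point) -> Point, isOccupied: (Point) -> Boolean,
               display: MutableList<ScreenObject>) {
    for (obj in group) {
        obj.position = if (isOccupied(touch)) freePointNear(touch) else touch
        display.add(obj)
    }
}
```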
  • FIG. 8 is a flowchart illustrating a procedure for editing a plurality of display information using a sponge edit method in an electronic device according to another embodiment of the present disclosure.
• Referring to FIG. 8, the electronic device may set an object group in operation 801. For example, the electronic device may recognize at least one object included in an object select region set by considering multi-touched points detected via the touch input unit 154 as the object group. For another example, the electronic device may recognize at least one object included in an object select region set by considering a touch point and drag information detected via the touch input unit 154 as the object group. For still another example, the electronic device may recognize at least one object included in an object select region set by considering touched points successively detected via the touch input unit 154 as the object group. For yet another example, when detecting an object select event, the electronic device may switch to an object select mode. At this point, the electronic device may recognize at least one object selected in the object select mode as the object group.
  • After setting the object group, the electronic device may proceed to operation 803 to determine whether a touch of at least one object among one or more objects included in the object group is detected.
  • When detecting a touch of an object included in the object group, the electronic device may proceed to operation 805 to determine whether an edit event occurs with a touch of the object included in the object group maintained. For example, with the touch of the object included in the object group maintained, the electronic device may determine whether a cut event occurs by considering at least one of button input information of the touch pen, hardware button input information of the electronic device, icon selection information, motion detection information of the electronic device, user gesture detection information, etc.
  • In the case where an edit event does not occur, the electronic device may proceed to operation 807 to determine whether a touch of the object included in the object group is released.
• In the case where the touch of the object included in the object group is released, the electronic device may recognize that it is not to perform editing on the object group. Accordingly, the electronic device may end the present algorithm.
  • In the case where the touch of the object included in the object group is not released, the electronic device may proceed to operation 805 to determine whether an edit event occurs with the touch of the object included in the object group maintained.
  • In the case where an edit event occurs with the touch of the object included in the object group maintained in operation 805, the electronic device may proceed to operation 809 to cut the object included in the object group. At this point, the electronic device generates an animation effect as if objects included in the object group were absorbed into the touch pen or the finger according to the sponge edit method. Additionally, the electronic device may generate a sound as if the objects included in the object group were absorbed into the touch pen or the finger according to the sponge edit method. Here, the electronic device may store cut object information in the data storage 112.
  • After that, the electronic device may proceed to operation 811 to determine whether a second touch is detected. For example, the electronic device may determine whether the second touch is detected via the touch input unit 154.
  • When detecting the second touch, the electronic device may proceed to operation 813 to determine whether a second edit event occurs with the second touch maintained. Here, the electronic device may determine whether a paste event occurs by considering at least one of button input information of the touch pen, hardware button input information of the electronic device, icon selection information, motion detection information of the electronic device, user gesture detection information, etc.
  • In the case where the second edit event does not occur, the electronic device may proceed to operation 815 to determine whether the second touch is released.
• In the case where the second touch is released, the electronic device may recognize that it is not to perform editing on the object group. Accordingly, the electronic device may end the present algorithm. At this point, the electronic device may restore the cut object group to its state before the cut event occurred.
  • In the case where the second touch is not released, the electronic device may proceed to operation 813 to determine whether the second edit event occurs with the second touch maintained.
  • In the case where the second edit event occurs with the second touch maintained in operation 813, the electronic device may proceed to operation 817 to paste an object included in the object group on the second touch point. For example, the electronic device may generate an animation effect as if the objects included in the object group were emitted from the touch pen or the finger that has touched the second touch point according to the sponge edit method and display the objects included in the object group on the second touch point. The electronic device may generate a sound as if the objects included in the object group were emitted according to the sponge edit method.
  • FIGS. 17A, 17B, 17C, 17D, 17E, 17F, 17G, and 17H are views illustrating screen configuration for editing a plurality of icons using a touch pen in an electronic device according to an embodiment of the present disclosure.
• As described above, the electronic device may edit at least one object included in the object group by considering touch information detected via the touch input unit 154 and edit event occurrence information. For example, as illustrated in FIG. 17A, the electronic device may set an object group 1701 by considering touch points successively detected via the touch input unit 154. After that, as illustrated in FIG. 17B, when detecting a button input 1713 of the touch pen with a touch 1711 of an icon “Flickr” included in the object group by the touch pen maintained, the electronic device may cut an icon “Picasa”, an icon “FC2”, an icon “Flickr”, an icon “Line pix”, an icon “Microsoft”, and an icon “MySpace” included in the object group. At this point, as illustrated in FIG. 17C, the electronic device may cut the objects with an animation effect as if the objects included in the object group were absorbed into the touch pen like a sponge or pipette absorbing liquid. Similar to absorbing liquid with a pipette, the electronic device may cut the objects included in the object group using the sponge effect at the point at which the button input is released after the button input 1713 of the touch pen is detected.
  • In case of cutting the object group, as illustrated in FIG. 17D, the electronic device may rearrange objects displayed on the display unit 152. As illustrated in FIG. 17D, the electronic device may display a notice icon 1721 representing whether a cut object exists. At this point, the notice icon 1721 may display the number of cut objects.
  • After cutting the object group, when detecting a second touch 1731 as illustrated in FIG. 17E, the electronic device may paste cut objects included in the object group on the second touch point as illustrated in FIG. 17H. At this point, the electronic device may paste the objects on the second touch point with an animation effect as if the objects included in the object group were emitted from the touch pen like a sponge or pipette emitting liquid.
• After cutting the object group, when detecting a button input 1743 of the touch pen with a second touch 1741 maintained as illustrated in FIG. 17F, the electronic device may paste the cut objects included in the object group on the second touch point. At this point, as illustrated in FIG. 17G, the electronic device may paste the objects on the second touch point with an animation effect as if the objects included in the object group were emitted from the touch pen like a sponge or pipette emitting liquid.
  • For another example, as illustrated in FIG. 18A, the electronic device may set an object group 1801 by considering touch points successively detected via the touch input unit 154. After that, as illustrated in FIG. 18B, when detecting a hardware button input 1813 of the electronic device with a touch 1811 of an icon “Flickr” included in the object group by a user's finger maintained, the electronic device may cut an icon “Picasa”, an icon “FC2”, an icon “Flickr”, an icon “Line pix”, an icon “Microsoft”, and an icon “MySpace” included in the object group. At this point, as illustrated in FIG. 18C, the electronic device may cut the objects with an animation effect as if the objects included in the object group were absorbed into the finger like a sponge or pipette absorbing liquid. In case of cutting the object group, as illustrated in FIG. 17D, the electronic device may rearrange the objects displayed on the display unit 152. Also, as illustrated in FIG. 17D, the electronic device may display a notice icon 1721 representing whether a cut object exists. At this point, the notice icon 1721 may represent the number of cut objects.
  • After cutting the object group, when detecting a second touch 1821 as illustrated in FIG. 18D, the electronic device may paste the cut objects included in the object group on the second touch point. At this point, as illustrated in FIG. 18F, the electronic device may paste the objects on the second touch point with an animation effect as if the objects included in the object group were emitted from the finger like a sponge or pipette emitting liquid.
  • After cutting the object group, when detecting a hardware button input 1833 of the electronic device with a second touch 1831 maintained as illustrated in FIG. 18E, the electronic device may paste the cut objects included in the object group on the second touch point. At this point, as illustrated in FIG. 18F, the electronic device may paste the objects on the second touch point with an animation effect as if the objects included in the object group were emitted from the finger like a sponge or pipette emitting liquid.
  • FIG. 9 is a flowchart illustrating a procedure for selectively editing display information using a sponge edit method in an electronic device according to an embodiment of the present disclosure. FIGS. 19A, 19B, and 19C are views illustrating screen configuration for displaying a plurality of edit lists in an electronic device according to an embodiment of the present disclosure.
• Referring to FIG. 9, the electronic device may perform a cut event on an object in operation 901. For example, in the case where a cut event occurs with a touch of an object by the touch pen or the finger maintained as illustrated in FIGS. 3 to 6, the electronic device may cut the touched object. For another example, in the case where a cut event occurs with a touch of an object included in an object group by the touch pen or the finger maintained as illustrated in FIGS. 7 and 8, the electronic device may cut the objects included in the object group.
  • In case of performing a cut event on an object, the electronic device may proceed to operation 903 to display cut information of the object on the display unit 152. For example, the electronic device may display whether a cut object exists and the number of cut objects using a notice icon as illustrated in FIGS. 11C, 13C, 15C, and 17C. When detecting selection of the notice icon, the electronic device may display a list of cut objects on the display unit 152. For another example, in the case where a notice bar is executed, the electronic device may display a list 1901 of cut objects on the notice bar as illustrated in FIG. 19A. For still another example, in the case where a cut object exists, the electronic device may display a list of cut objects using a separate popup window 1911 as illustrated in FIG. 19B. For yet another example, the electronic device may set a portion of the display unit 152 as a region 1921 for displaying a list of cut objects and display the list of cut objects as illustrated in FIG. 19C.
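• The cut-information display might be backed by a simple clipboard model such as the sketch below; CutClipboard and its members are hypothetical names, with ScreenObject from the earlier sketch. A notice bar, popup window, or reserved screen region could all render the same list.

```kotlin
// Hypothetical sketch: a clipboard of cut objects driving the notice icon
// count and the selectable list of cut objects.
class CutClipboard {
    private val cutObjects = mutableListOf<ScreenObject>()

    fun add(obj: ScreenObject) { cutObjects.add(obj) }

    // Drives the badge on the notice icon (the number of cut objects).
    val noticeCount: Int get() = cutObjects.size

    // Backs the list shown when the notice icon is selected.
    fun list(): List<ScreenObject> = cutObjects.toList()
}
```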
• After displaying cut information on the display unit 152, the electronic device may proceed to operation 905 to determine the object list on which a paste event is to be performed. For example, the electronic device may select at least one object to paste among one or more cut objects.
  • After that, the electronic device may proceed to operation 907 to determine whether a touch is detected.
• When detecting a touch, the electronic device may proceed to operation 909 to paste the objects included in the object list determined in operation 905 on the touch point. For example, the electronic device may generate an animation effect as if objects included in the object list were emitted from the touch pen or the finger that has touched the touch point according to the sponge edit method and display the objects included in the object list on the touch point. Additionally, the electronic device may generate a sound as if the objects included in the object list were emitted according to the sponge edit method.
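• A sketch of this selective paste step, assuming the user's chosen entries arrive as a list; pasteSelected is a hypothetical name, and Point and ScreenObject come from the earlier sketch.

```kotlin
// Hypothetical sketch of the FIG. 9 selective paste: only the objects the
// user chose from the cut list are emitted at the touch point; the rest
// stay stored in the clipboard.
fun pasteSelected(clipboard: MutableList<ScreenObject>,
                  selected: List<ScreenObject>, touch: Point,
                  display: MutableList<ScreenObject>) {
    for (obj in selected) {
        if (clipboard.remove(obj)) {
            obj.position = touch   // emit at the touch point
            display.add(obj)       // show on the display unit
        }
    }
}
```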
  • FIG. 10 is a flowchart illustrating a procedure for selectively editing display information using a sponge edit method in an electronic device according to another embodiment of the present disclosure.
• Referring to FIG. 10, the electronic device may perform a cut event on an object in operation 1001. For example, in the case where a cut event occurs with a touch of an object by the touch pen or the finger maintained as illustrated in FIGS. 3 to 6, the electronic device may cut the touched object. For another example, in the case where a cut event occurs with a touch of an object included in an object group by the touch pen or the finger maintained as illustrated in FIGS. 7 and 8, the electronic device may cut the objects included in the object group.
  • In case of performing a cut event on an object, the electronic device may proceed to operation 1003 to display cut information of the object on the display unit 152. For example, the electronic device may display whether a cut object exists and the number of cut objects using a notice icon as illustrated in FIGS. 11C, 13C, 15C, and 17C. When detecting selection of the notice icon, the electronic device may display a list of cut objects on the display unit 152. For another example, in the case where a notice bar is executed, the electronic device may display a list 1901 of cut objects on the notice bar as illustrated in FIG. 19A. For still another example, in the case where a cut object exists, the electronic device may display a list of cut objects using a separate popup window 1911 as illustrated in FIG. 19B. For yet another example, the electronic device may set a portion of the display unit 152 as a region 1921 for displaying a list of cut objects and display the list of cut objects as illustrated in FIG. 19C.
• After displaying cut information on the display unit 152, the electronic device may proceed to operation 1005 to determine the object list on which a paste event is to be performed. For example, the electronic device may select at least one object to paste among one or more cut objects.
  • After that, the electronic device may proceed to operation 1007 to determine whether a touch is detected.
  • When detecting a touch, the electronic device may proceed to operation 1009 to determine whether an edit event occurs with the touch maintained. Here, the electronic device may determine whether a paste event occurs by considering at least one of button input information of the touch pen, hardware button input information of the electronic device, icon selection information, motion detection information of the electronic device, user gesture detection information, etc.
  • In the case where an edit event does not occur, the electronic device may proceed to operation 1011 to determine whether the touch is released.
• In the case where the touch is released, the electronic device may recognize that it is not to perform editing on the object. Accordingly, the electronic device may end the present algorithm. At this point, the electronic device may restore the cut objects to their state before the cut event occurred.
  • In the case where the touch is not released, the electronic device may proceed to operation 1009 to determine whether an edit event occurs with the touch maintained.
• In the case where an edit event occurs with the touch maintained in operation 1009, the electronic device may proceed to operation 1013 to paste the objects included in the object list determined in operation 1005 on the touch point. For example, the electronic device may generate an animation effect as if objects included in the object list were emitted from the touch pen or the finger that has touched the touch point according to the sponge edit method and display the objects included in the object list on the touch point. Additionally, the electronic device may generate a sound as if the objects included in the object list were emitted according to the sponge edit method.
  • After that, the electronic device may end the present algorithm.
• In the above embodiments, the electronic device generates an animation effect as if a relevant object were absorbed into the touch pen or the finger that has touched the object. At this point, the electronic device may change the point into which the object is absorbed by considering the touch point.
  • As described above, the electronic device having the touchscreen may easily edit display information by editing the display information displayed on the display unit using a sponge effect, and arouse a user's emotional interest via the sponge effect.
  • It will be appreciated that various embodiments of the present disclosure according to the claims and description in the specification can be realized in the form of hardware, software or a combination of hardware and software.
  • Any such software may be stored in a non-transitory computer readable storage medium. The non-transitory computer readable storage medium stores one or more programs (software modules), the one or more programs comprising instructions, which when executed by one or more processors in an electronic device, cause the electronic device to perform a method of the present disclosure.
  • Any such software may be stored in the form of volatile or non-volatile storage such as, for example, a storage device like a Read Only Memory (ROM), whether erasable or rewritable or not, or in the form of memory such as, for example, Random Access Memory (RAM), memory chips, device or integrated circuits or on an optically or magnetically readable medium such as, for example, a Compact Disk (CD), Digital Versatile Disc (DVD), magnetic disk or magnetic tape or the like. It will be appreciated that the storage devices and storage media are various embodiments of non-transitory machine-readable storage that are suitable for storing a program or programs comprising instructions that, when executed, implement various embodiments of the present disclosure. Accordingly, various embodiments provide a program comprising code for implementing apparatus or a method as claimed in any one of the claims of this specification and a non-transitory machine-readable storage storing such a program.
  • While the present disclosure has been shown and described with reference to various embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present disclosure as defined by the appended claims and their equivalents.

Claims (10)

What is claimed is:
1. A method in an electronic device, the method comprising:
when a first edit event occurs with a first touch maintained, storing object information of a first touch point; and
when detecting a second touch, displaying an object of the first touch point on a second touch point.
2. The method of claim 1, wherein the object comprises at least one of an application menu, an icon, a widget, text, and a reproduction list.
3. The method of claim 1, further comprising:
after the storing of the object information, removing display of the object on the first touch point from a display unit according to a sponge edit method.
4. The method of claim 3, further comprising:
after the removing of the display of the object, displaying display remove information of the object on the display unit.
5. The method of claim 1, wherein whether the first edit event occurs is determined by considering at least one of button input information of a touch pen, button input information of a hardware button, edit icon selection information, motion information of the electronic device, and a user's gesture information.
6. The method of claim 1, wherein the displaying of the object comprises:
when detecting the second touch, determining whether a second edit event occurs with the second touch maintained; and
when the second edit event occurs with the second touch maintained, displaying the object of the first touch point on a second touch point according to a sponge edit method.
7. The method of claim 6, wherein whether the second edit event occurs is determined by considering at least one of button input information of a touch pen, button input information of a hardware button, edit icon selection information, motion information of the electronic device, and a user's gesture information.
8. The method of claim 1, further comprising:
after the displaying of the object of the first touch point, deleting the stored object information of the first touch point.
9. The method of claim 1, further comprising:
determining an object group comprising at least one of a plurality of objects displayed on a display unit,
wherein the storing of the object comprises:
when the first edit event occurs with a first touch of at least one of one or more objects included in the object group maintained, storing information of at least one object included in the object group.
10. The method of claim 9, further comprising:
after the storing of the information of the at least one object included in the object group, removing display of the at least one object included in the object group from the display unit according to a sponge edit method.
US16/155,499 2012-09-14 2018-10-09 Method for editing display information and electronic device thereof Abandoned US20190042070A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/155,499 US20190042070A1 (en) 2012-09-14 2018-10-09 Method for editing display information and electronic device thereof

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2012-0101927 2012-09-14
KR1020120101927A KR102096581B1 (en) 2012-09-14 2012-09-14 Method for editing display information and an electronic device thereof
US14/016,799 US10095401B2 (en) 2012-09-14 2013-09-03 Method for editing display information and electronic device thereof
US16/155,499 US20190042070A1 (en) 2012-09-14 2018-10-09 Method for editing display information and electronic device thereof

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/016,799 Continuation US10095401B2 (en) 2012-09-14 2013-09-03 Method for editing display information and electronic device thereof

Publications (1)

Publication Number Publication Date
US20190042070A1 true US20190042070A1 (en) 2019-02-07

Family

ID=49231254

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/016,799 Active 2034-04-25 US10095401B2 (en) 2012-09-14 2013-09-03 Method for editing display information and electronic device thereof
US16/155,499 Abandoned US20190042070A1 (en) 2012-09-14 2018-10-09 Method for editing display information and electronic device thereof

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/016,799 Active 2034-04-25 US10095401B2 (en) 2012-09-14 2013-09-03 Method for editing display information and electronic device thereof

Country Status (6)

Country Link
US (2) US10095401B2 (en)
EP (1) EP2708999A3 (en)
KR (1) KR102096581B1 (en)
CN (2) CN109240593B (en)
AU (1) AU2013316279A1 (en)
WO (1) WO2014042470A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101531169B1 (en) * 2013-09-23 2015-06-24 삼성전자주식회사 Method and Apparatus for drawing a 3 dimensional object
US10656788B1 (en) * 2014-08-29 2020-05-19 Open Invention Network Llc Dynamic document updating application interface and corresponding control functions
KR102317619B1 (en) 2016-09-23 2021-10-26 삼성전자주식회사 Electronic device and Method for controling the electronic device thereof
KR20210123920A (en) * 2020-04-06 2021-10-14 삼성전자주식회사 Electronic device for providing editing function by air gesture, method for operating thereof and storage medium

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3007232B2 (en) 1992-10-19 2000-02-07 富士通株式会社 Drawing processing apparatus and method
US6081818A (en) * 1996-12-11 2000-06-27 International Business Machines Corporation Cut object dynamic display and method of displaying cut objects
JP2003177864A (en) 2001-12-13 2003-06-27 Fuji Photo Film Co Ltd Information terminal system
JP4074982B2 (en) 2002-07-23 2008-04-16 ソニー株式会社 Information processing apparatus and method, recording medium, and program
US7185291B2 (en) * 2003-03-04 2007-02-27 Institute For Information Industry Computer with a touch screen
US7565619B2 (en) * 2004-08-26 2009-07-21 Microsoft Corporation System and method for automatic item relocating in a user interface layout
CN100395691C (en) * 2005-09-14 2008-06-18 王学永 Numeric stored handwritting pen
US8643605B2 (en) 2005-11-21 2014-02-04 Core Wireless Licensing S.A.R.L Gesture based document editor
KR20080019742A (en) 2006-08-29 2008-03-05 주식회사 대우일렉트로닉스 Stylus pen having data input/output function and operating method for the same
US7934156B2 (en) 2006-09-06 2011-04-26 Apple Inc. Deletion gestures on a portable multifunction device
CN101529874A (en) 2006-09-06 2009-09-09 苹果公司 Incoming telephone call management for a portable multifunction device with touch screen display
CN101315586B (en) * 2008-07-21 2013-01-09 常州迪锐特电子科技有限公司 Electronic pen for interactive electronic white board and interaction control method thereof
KR101592296B1 (en) * 2008-09-03 2016-02-05 엘지전자 주식회사 Mobile terminal and method for selection and activation object thereof
KR101503835B1 (en) * 2008-10-13 2015-03-18 삼성전자주식회사 Apparatus and method for object management using multi-touch
DE112008004156B4 (en) 2008-12-15 2021-06-24 Hewlett-Packard Development Company, L.P. SYSTEM AND METHOD FOR A GESTURE-BASED EDITING MODE AND COMPUTER-READABLE MEDIUM FOR IT
JP2010256969A (en) 2009-04-21 2010-11-11 Sharp Corp User interface device and method for controlling the same
CN102299990A (en) * 2010-06-22 2011-12-28 希姆通信息技术(上海)有限公司 Gesture control cellphone
US9747270B2 (en) 2011-01-07 2017-08-29 Microsoft Technology Licensing, Llc Natural input for spreadsheet actions
US10620794B2 (en) 2010-12-23 2020-04-14 Apple Inc. Device, method, and graphical user interface for switching between two user interfaces
EP3734405A1 (en) 2011-02-10 2020-11-04 Samsung Electronics Co., Ltd. Portable device comprising a touch-screen display, and method for controlling same
US20130132878A1 (en) * 2011-09-02 2013-05-23 Adobe Systems Incorporated Touch enabled device drop zone
CN102314318A (en) * 2011-09-13 2012-01-11 深圳市同洲电子股份有限公司 Character input method applied to touch screen terminal, device and touch screen terminal
US9164599B2 (en) * 2012-08-10 2015-10-20 Adobe Systems Incorporated Multifunctional stylus

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6545669B1 (en) * 1999-03-26 2003-04-08 Husam Kinawi Object-drag continuity between discontinuous touch-screens
US20070257906A1 (en) * 2006-05-04 2007-11-08 Shimura Yukimi Virtual suction tool
US20140304599A1 (en) * 2011-10-06 2014-10-09 Sony Ericsson Mobile Communications Ab Method and Electronic Device for Manipulating a First or a Second User Interface Object

Also Published As

Publication number Publication date
CN104641342A (en) 2015-05-20
WO2014042470A1 (en) 2014-03-20
CN109240593A (en) 2019-01-18
CN104641342B (en) 2018-11-09
CN109240593B (en) 2021-10-26
AU2013316279A1 (en) 2015-01-29
KR102096581B1 (en) 2020-05-29
US20140078083A1 (en) 2014-03-20
US10095401B2 (en) 2018-10-09
EP2708999A3 (en) 2017-09-20
KR20140035581A (en) 2014-03-24
EP2708999A2 (en) 2014-03-19

Similar Documents

Publication Publication Date Title
US20190042070A1 (en) Method for editing display information and electronic device thereof
US10234951B2 (en) Method for transmitting/receiving message and electronic device thereof
US11316805B2 (en) Method for transmitting message and electronic device thereof
EP2778870B1 (en) Method and apparatus for copying and pasting of data
US9645730B2 (en) Method and apparatus for providing user interface in portable terminal
US9582188B2 (en) Method for adjusting display area and electronic device thereof
US9851898B2 (en) Method for changing display range and electronic device thereof
US20140129980A1 (en) Display method and electronic device using the same
US9891787B2 (en) Method and apparatus of operating electronic device
US20130215059A1 (en) Apparatus and method for controlling an object in an electronic device with touch screen
EP2706449A2 (en) Method for changing object position and electronic device thereof
US9489069B2 (en) Method for controlling display scrolling and zooming and an electronic device thereof
US10331329B2 (en) Electronic device and method for changing order or location of content
US9588665B2 (en) Object editing method and electronic device thereof
KR20160140795A (en) Transient user interface elements
US20140240261A1 (en) Method for determining touch input object and electronic device thereof
EP2811391A1 (en) Method for transforming an object based on motion, gestures or breath input and electronic device thereof
CN104731500A (en) Information processing method and electronic equipment
KR101423168B1 (en) Method and terminal of providing graphical user interface for viewing contents
US20140043286A1 (en) Method for identifying touch pen in electronic device, and electronic device
US11973723B2 (en) Method for transmitting message and electronic device thereof
US9823830B2 (en) Method for managing a list and electronic device thereof
CN112612400B (en) Text content processing method and electronic equipment

Legal Events

Code Title Description

STPP Information on status: patent application and granting procedure in general
Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED

STPP Information on status: patent application and granting procedure in general
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general
Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general
Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general
Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general
Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general
Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION