WO2015081739A1 - Method of performing one or more actions on electronic device - Google Patents


Info

Publication number
WO2015081739A1
WO2015081739A1 · PCT/CN2014/086192 · CN2014086192W
Authority
WO
WIPO (PCT)
Prior art keywords
item
electronic device
context based
user interface
based utilities
Prior art date
Application number
PCT/CN2014/086192
Other languages
French (fr)
Inventor
Pranshu JOSHI
Umesh S
Santosh Kumar Nath
Original Assignee
Huawei Technologies Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. filed Critical Huawei Technologies Co., Ltd.
Priority to CN201480003270.6A priority Critical patent/CN105051669A/en
Priority to EP14867972.3A priority patent/EP2965185B1/en
Priority to ES14867972T priority patent/ES2785207T3/en
Publication of WO2015081739A1 publication Critical patent/WO2015081739A1/en
Priority to US14/969,163 priority patent/US20160098166A1/en

Classifications

    • G: PHYSICS
      • G06: COMPUTING; CALCULATING OR COUNTING
        • G06F: ELECTRIC DIGITAL DATA PROCESSING
          • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
                • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
                  • G06F3/0486: Drag-and-drop
                • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
                  • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
                    • G06F3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
        • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
          • G06Q10/00: Administration; Management
            • G06Q10/10: Office automation; Time management
              • G06Q10/107: Computer-aided management of electronic mailing [e-mailing]
    • H: ELECTRICITY
      • H04: ELECTRIC COMMUNICATION TECHNIQUE
        • H04M: TELEPHONIC COMMUNICATION
          • H04M1/00: Substation equipment, e.g. for use by subscribers
            • H04M1/26: Devices for calling a subscriber
              • H04M1/27: Devices whereby a plurality of signals may be stored simultaneously
                • H04M1/274: Devices whereby a plurality of signals may be stored simultaneously, with provision for storing more than one subscriber number at a time, e.g. using toothed disc
                  • H04M1/2745: Devices whereby a plurality of signals may be stored simultaneously, using static electronic memories, e.g. chips
                    • H04M1/27467: Methods of retrieving data
                      • H04M1/2747: Scrolling on a display
            • H04M1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
              • H04M1/724: User interfaces specially adapted for cordless or mobile telephones
                • H04M1/72403: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
                  • H04M1/7243: User interfaces with interactive means for internal management of messages
                    • H04M1/72436: User interfaces with interactive means for internal management of messages for text messaging, e.g. SMS or e-mail
                • H04M1/72448: User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
                  • H04M1/72454: User interfaces adapting the functionality of the device according to context-related or environment-related conditions
          • H04M2250/00: Details of telephonic subscriber devices
            • H04M2250/22: Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Definitions

  • The present disclosure relates to the field of user interface technologies for an electronic device.
  • In particular, the present disclosure is related to a method and an electronic device for performing one or more actions using a drag-drop operation on the electronic device.
  • Touch screen enabled electronic devices include, but are not limited to, mobile phones, computers, tablets, notes, laptops, etc.
  • Electronic devices are commonly utilized to provide users a means to communicate and stay “connected” while moving from one place to another place.
  • Technology of such electronic devices has advanced to the extent where data regarding any desired content is readily available. Such information exchange can occur by way of the user entering information (e.g., text, visual, audio, and so on) into the display area of a user device and interacting with the device, utilizing that display area.
  • Figure 1 illustrates a traditional way of performing a ‘forward’ action in an e-mail conversation in the prior art.
  • Consider the email application, where the user commonly performs actions such as creating, replying to, forwarding and deleting messages.
  • To forward an email, the user first has to click on the email to forward the content or message contained in the email. Secondly, the user clicks to open the ‘Menu’ tab and then chooses ‘Options’.
  • the excess scroll or movement on the display page that is not intuitively related to hand movement causes stress on the user.
  • performing some functions can be cumbersome and might not allow a user to quickly and easily interact with the electronic device.
  • the present disclosure relates to a method of performing one or more actions on an electronic device.
  • the method comprises selecting at least one item from one or more items displayed on a user interface of the electronic device.
  • the at least one item is selected by touching the at least one item for a predetermined time.
  • the selected at least one item is dragged towards one of one or more context based utilities on the user interface.
  • Each of the one or more context based utilities corresponds to a preconfigured action.
  • The selected at least one item is dropped on the one of the one or more context based utilities to perform the corresponding preconfigured action.
  • An electronic device for performing one or more actions is also disclosed.
  • the electronic device comprises a user interface and a processing unit.
  • the user interface is used to perform drag and drop operation.
  • the processing unit is communicatively connected to the user interface, and is configured to detect selection of at least one item from one or more items displayed on the user interface of the electronic device. The at least one item is selected by touching the at least one item for a predetermined time. Then the processing unit detects dragging of the selected at least one item towards one of one or more context based utilities on the user interface. Each of the one or more context based utilities corresponds to a preconfigured action.
  • the processing unit detects dropping of the selected at least one item on the one or more context based utilities and performs preconfigured action corresponding to the one of the one or more context based utilities.
  • the present disclosure relates to a non-transitory computer readable medium including operations stored thereon that when processed by at least one processing unit cause an electronic device to perform one or more actions by performing the acts of detecting selection of at least one item from one or more items displayed on the user interface of the electronic device.
  • the at least one item is selected by touching the at least one item for a predetermined time.
  • dragging of the selected at least one item towards one of one or more context based utilities on the user interface is detected.
  • Each of the one or more context based utilities corresponds to a preconfigured action.
  • dropping of the selected at least one item on the one or more context based utilities is detected.
  • Preconfigured action corresponding to the one of the one or more context based utilities is performed.
  • the present disclosure relates to a computer program for performing one or more actions on an electronic device, said computer program comprising code segment for detecting selection of at least one item from one or more items displayed on the user interface of the electronic device, wherein the at least one item is selected by touching the at least one item for a predetermined time; code segment for detecting dragging of the selected at least one item towards one of one or more context based utilities on the user interface, wherein each of the one or more context based utilities correspond to a preconfigured action; code segment for detecting dropping of the selected at least one item on the one or more context based utilities; and code segment for performing preconfigured action corresponding to the one of the one or more context based utilities.
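The select-drag-drop flow summarized above can be sketched as a small dispatcher. This is a minimal illustrative sketch, not the claimed implementation; the names `ContextUtility`, `perform_drag_drop` and the 20 ms value (an example within the disclosed 1-30 ms range) are assumptions:

```python
LONG_PRESS_MS = 20  # assumed example inside the disclosed 1-30 ms range

class ContextUtility:
    """A context based utility: an icon bound to a preconfigured action."""
    def __init__(self, name, action):
        self.name = name
        self.action = action  # preconfigured action: callable(item) -> result

def perform_drag_drop(item, touch_duration_ms, target_utility):
    """Return the action result if the gesture completes, else None."""
    if touch_duration_ms < LONG_PRESS_MS:
        return None  # touch too short: the item was never selected
    # Dragging is implied; dropping on the utility runs its preconfigured action.
    return target_utility.action(item)

reply = ContextUtility("reply", lambda item: "reply to " + item)
```

For example, `perform_drag_drop("email from Abc", 25, reply)` would yield `"reply to email from Abc"`, while a 5 ms touch yields `None` because selection never occurs.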
  • Figure 1 shows an exemplary diagram illustrating a traditional method of performing actions in electronic communication on an electronic device in the prior art
  • Figure 2 shows an exemplary block diagram illustrating an electronic device in accordance with an embodiment of the present disclosure
  • Figures 3a-3c show user interfaces illustrating a method of performing one or more actions in an email application on an electronic device in accordance with an embodiment of the present disclosure
  • Figures 4a-4c show user interfaces illustrating a method of performing one or more actions in contact registers on an electronic device in accordance with an embodiment of the present disclosure
  • Figure 5 shows an exemplary flow chart illustrating a method of performing one or more actions on an electronic device in accordance with an embodiment of the present disclosure
  • Figure 6 is an exemplary flow chart illustrating a method of performing one or more actions by an electronic device in accordance with an embodiment of the present disclosure.
  • Figure 7 is an exemplary flow chart illustrating a method of performing one or more actions by the electronic device providing a popped up box with one or more context based utilities in accordance with an embodiment of the present disclosure.
  • Embodiments of the present disclosure relate to user interface technologies for an electronic device. More particularly, the present disclosure provides a drag-drop operation to perform one or more actions on the user interface of the electronic device.
  • An item is selected, dragged and dropped on to a context based utility to achieve a desired action.
  • An item is selected when a user touches the item for a predetermined time. Then, the user drags the item towards one of one or more context based utilities which corresponds to a preconfigured action. After dragging, the item is dropped on the one of the one or more context based utilities to achieve the preconfigured action. For example, considering the email application, an e-mail from the inbox is selected by touching the mail for a predetermined time.
  • The one or more context based utilities related to email communications include, but are not limited to, ‘reply’, ‘forward’, ‘delete’, ‘print’, ‘categorize’, ‘copy’, ‘move’, ‘compose’, etc. Then, the selected item is dragged towards a context based utility such as ‘Reply’ and dropped on the ‘Reply’ utility as desired by a user to achieve a reply action.
  • at least one contact is selected by touching the at least one contact for a predetermined time.
  • The one or more context based utilities related to contact registers include, but are not limited to, ‘edit’, ‘view’, ‘delete’, ‘add’, ‘connect to social networks’, etc. Then, the at least one contact is dragged towards the ‘view’ utility and dropped on the ‘view’ utility as desired by the user to achieve viewing of the contact selected by the user. Therefore, the drag-drop operation reduces the number of steps followed in the traditional methods illustrated in the prior art.
  • FIG. 2 shows exemplary block diagram illustrating an electronic device 202 in accordance with an embodiment of the present disclosure.
  • the electronic device 202 is a device comprising a touch screen interface.
  • The electronic device 202 includes, but is not limited to, a mobile phone, ATM, television, Personal Digital Assistant (PDA), laptop, computer, point-of-sale terminal, car navigation system, medical monitor, contactless device, industrial control panel, and any other electronic device having a touch screen panel.
  • The electronic device 202 comprises a user interface 204 and a processing unit 206.
  • The user interface 204 allows a user to input instructions using an input device (not shown in Figure 2) including, but not limited to, a stylus, finger, pen shaped pointing device, and any other device that can be used to provide input through the user interface 204.
  • the user interface 204 is a touch screen.
  • the user interface 204 is configured to perform drag-drop operation to achieve one or more actions.
  • the user interface 204 is communicatively connected to the processing unit 206 through a communication interface.
  • the information with respect to drag-drop operation can be communicated to the processing unit 206 from a machine-readable medium.
  • the term machine-readable medium can be defined as a medium providing data to a machine to enable the machine to perform a specific function.
  • the machine-readable medium can be a storage media.
  • The storage media can be the storage unit. All such media must be tangible/non-transitory to enable the instructions carried by the media to be detected by a physical mechanism that reads the instructions into the machine.
  • The processing unit 206 detects selection of an item, dragging and dropping of the item on the one or more context based utilities, and performs one or more actions related to the corresponding one or more context based utilities. Particularly, the processing unit 206 detects selection of at least one item from one or more items displayed on the user interface 204 of the electronic device 202.
  • The one or more items include, but are not limited to, electronic mails (e-mails), short message services (SMSs), messages, images, audios, videos, and electronic documents.
  • the at least one item is selected by touching the at least one item for a predetermined time. In a non-limiting embodiment, the predetermined time is in the range of about 1 millisecond to about 30 milliseconds.
  • For example, an email from the inbox of received emails is selected when the user touches the email for about 20 milliseconds.
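The predetermined-time selection can be modeled as a dwell check over touch events. The following is a hedged sketch; `detect_selection` and the `(timestamp, kind, item)` event tuples are illustrative assumptions, not the patent's API:

```python
PREDETERMINED_MS = 20  # assumed example within the disclosed 1-30 ms range

def detect_selection(events):
    """events: list of (timestamp_ms, kind, item) with kind 'down' or 'up'.
    Returns the first item whose touch dwelled >= PREDETERMINED_MS, else None."""
    down_at = {}
    for t, kind, item in events:
        if kind == "down":
            down_at[item] = t  # remember when the touch on this item began
        elif kind == "up" and item in down_at:
            if t - down_at.pop(item) >= PREDETERMINED_MS:
                return item  # held long enough: the item is selected
    return None
```

A 25 ms hold on an email selects it; a 5 ms tap does not.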
  • the processing unit 206 detects dragging of the selected at least one item towards one of one or more context based utilities on the user interface 204.
  • The one or more context based utilities are icons which are logically related to the actions. For example, in email communications, when an email is selected, the one or more utilities corresponding to emails are displayed, such as ‘reply to email’, ‘forward the email’, ‘delete the email’, etc.
  • The one or more context based utilities pop up based on the type of the at least one item being selected.
  • the one or more context based utilities are displayed adjacent to the at least one item being selected.
  • the one or more context based utilities are displayed below the emails and next to the email after the email is being selected.
  • Each of the one or more context based utilities corresponds to a preconfigured action including, but not limited to, replying, deleting, forwarding, editing, composing, adding, calling, viewing and other related actions capable of being performed on the electronic device 202.
  • For example, when the user wishes to reply to an email, the user selects the email and drags the email to the ‘reply’ utility to achieve the reply action.
  • the processing unit 206 detects dropping of the selected at least one item on the one or more context based utilities to achieve the preconfigured action corresponding to the one or more context based utilities.
  • the preconfigured action is initiated by the processing unit upon determining the dragging and dropping of the selected at least one item on the one or more context based utilities.
  • the reply action is achievable when the user drops the selected email on the ‘reply’ icon or utility.
  • The drag-drop operation, comprising selection of the at least one item and dragging and dropping of the selected at least one item, is performed without lifting the input device from the user interface 204.
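The single continuous gesture (selection, drag and drop without lifting the input device) can be read as one down/move/up event stream. A sketch under assumed names, using a simple rectangle hit-test for the drop target; none of these identifiers come from the disclosure:

```python
DWELL_MS = 20  # assumed selection threshold within the disclosed range

def run_gesture(events, utility_regions):
    """events: [(t_ms, kind, (x, y))], kind in {'down', 'move', 'up'}.
    utility_regions: {utility_name: (x0, y0, x1, y1)} screen rectangles.
    Returns the utility hit on lift, or None if selection never happened
    or the item was dropped outside every utility."""
    down_t = None
    selected = False
    for t, kind, (x, y) in events:
        if kind == "down":
            down_t = t
        elif kind == "move" and down_t is not None and t - down_t >= DWELL_MS:
            selected = True  # dwell satisfied: the item now follows the finger
        elif kind == "up":
            if not selected:
                return None
            for name, (x0, y0, x1, y1) in utility_regions.items():
                if x0 <= x <= x1 and y0 <= y <= y1:
                    return name  # dropped on this context based utility
            return None
    return None
```

Dropping inside the ‘forward’ rectangle after a sufficient dwell returns `"forward"`; lifting too early returns `None`.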
  • In the present disclosure, only limited events or applications are listed, for purposes of demonstration only. This should not be construed as a limitation.
  • Figure 3a illustrates an exemplary email application on the electronic device 202 comprising various context based utilities corresponding to the email application.
  • The user interface 204 of the electronic device 202 displays the various emails received in the ‘inbox’ section, depicted by numeral 302, using the email account “XYZ@gmail.com”.
  • the inbox displays the one or more received emails depicted by numeral 304.
  • the emails are received from ‘Abc’ , ‘HIJ’ , ‘LMN’ and ‘PQR’ and so on illustrated by numeral 304.
  • Each of the emails comprises some content.
  • email received from ‘Abc’ comprises content stated as “Hi, This is a test mail” .
  • the one or more context based utilities depicted by numeral 306 are displayed below the emails. That is, the ‘forward’ utility is depicted by 306a, ‘reply’ utility without email history (chain of emails) is depicted by 306b, ‘reply’ with email history is depicted by 306c and ‘delete’ utility is depicted by 306d.
  • Figure 3b shows user interface 204 illustrating a method of performing one or more actions, such as forwarding mails in an email application, on the electronic device 202 using the drag-drop operation in accordance with an exemplary embodiment of the present disclosure.
  • the user selects the email depicted by numeral 310.
  • The user selects the email 310 among the one or more received emails 304 by touching the email 310 for around 10-30 milliseconds on the user interface 204 using finger 308.
  • the user drags the email 310, where dragging of the email 310 by the user using the finger 308 is depicted by numeral 312.
  • the user drops the email 310 down on the ‘forward’ utility 306a to achieve the action of forwarding the email.
  • the forward dialogue box appears on the user interface 204 where the user is asked to enter the email id or name of the person to whom the user wishes to forward the email 310 along with the content depicted by numeral 314a.
  • Figure 3c shows user interface 204 illustrating a method of performing one or more actions, such as replying to mails in an email application, on the electronic device 202 using the drag-drop operation in accordance with an embodiment of the present disclosure.
  • the user selects the email depicted by numeral 316.
  • the user selects the email 316 among the one or more received emails 304 by touching the email 316 for around 1-30 milliseconds on the user interface 204 using finger 308.
  • the user drags the email 316, where dragging of the email 316 by the user using the finger 308 is depicted by numeral 318.
  • the user drops the email 316 on the ‘reply’ utility 306b to achieve the action of replying to the email 316.
  • the reply dialogue box appears on the user interface 204 where the user is asked to enter the text in text field 320a.
  • The email id of the person is retained in the “To” field when the user is replying. That is, the email id of the person from whom the email 316 was received, “Abc@gmail.com”, is retained in “To” as depicted by numeral 320 when the user is performing the reply action.
  • Various other actions including, but not limited to, deleting, composing, and editing the email are achieved by performing the drag-drop operation on the corresponding context based utility.
  • Figure 4a shows user interface 204 illustrating method of performing one or more actions in contact registers on the electronic device 202 using drag-drop operation in accordance with an embodiment of the present disclosure.
  • The contact register, depicted by numeral 402, appears on the user interface 204 when the user clicks the contact registers on the electronic device 202.
  • The contact registers contain the information of the contacts including, but not limited to, phone number, email id, picture and name.
  • the contact registers 402 contains the name and phone number of the contacts depicted by numeral 404.
  • One or more context based utilities depicted by 406 related to the contact registers 402 appear on the user interface 204:
  • ‘add’ utility 406a to add a contact
  • ‘view’ utility 406b to view the contact information
  • ‘edit’ utility 406c to edit the contact information
  • ‘forward’ utility 406d to forward the contact information
  • ‘SMS’ utility 406e to send an SMS to the contact
  • ‘delete’ utility 406f to delete the contact
  • Figure 4b shows user interface 204 illustrating a method of performing one or more actions, such as forwarding the contact information in contact registers 402, on the electronic device 202 using the drag-drop operation in accordance with an embodiment of the present disclosure.
  • the contact register 402 containing the one or more contacts along with their contact information such as name and the phone number as depicted by numeral 404.
  • the user selects the contact depicted by numeral 408.
  • The user selects the contact 408 among the one or more contacts 404 by touching the contact 408 for around 1-30 milliseconds on the user interface 204 using finger 308.
  • the user drags the contact 408, where dragging of the contact 408 by the user using the finger 308 is depicted by numeral 410.
  • The user drops the contact 408 on the forward utility 406d to achieve the action of forwarding the contact 408. After the contact 408 is dropped on the forward utility 406d, the contact 408 is forwarded as desired by the user.
  • Figure 4c shows user interface 204 illustrating a method of performing one or more actions, such as editing the contact information in contact registers 402, on the electronic device 202 using the drag-drop operation in accordance with an embodiment of the present disclosure.
  • the contact register 402 containing the one or more contacts along with their contact information such as name and the phone number as depicted by numeral 404.
  • the user selects the contact depicted by numeral 412.
  • The user selects the contact 412 among the one or more contacts 404 by touching the contact 412 for around 1-30 milliseconds on the user interface 204 using finger 308.
  • the one or more context based utilities related to the contact register 402 pops up when the user touches the contact in the contact register 402 for a predetermined time, for example, around 30 milliseconds.
  • the one or more context based utilities which is popped up is depicted by numeral 414.
  • The popped up box includes options including, but not limited to, call, SMS, add, view, delete, connect to social networks and edit.
  • The user invites the contact person present in the contact register to various social networks by choosing the ‘connect to social networks’ utility.
  • The user drags the contact 412, where dragging of the contact 412 by the user using the finger 308 is depicted by numeral 416.
  • The user drops the contact 412 on the edit utility 416a to achieve the action of editing the contact 412. After the contact 412 is dropped on the edit utility 416a, the contact 412 is edited as desired by the user.
  • Various other actions including, but not limited to, deleting, adding, viewing and connecting to the social networks are achieved by performing the drag-drop operation on the corresponding context based utility.
  • the utilities are dynamically provided on the user interface 204 based on the at least one item being selected on the user interface 204. For example, when user selects the audio clip, the utilities corresponding to audio clips such as ‘play’ , ‘stop’ , ‘pause’ , ‘delete’ , ‘forward’ etc. are provided or displayed on the user interface 204.
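One way to realize the dynamic provisioning described above is a registry keyed by item type. This is an illustrative sketch; `UTILITIES_BY_TYPE` and `utilities_for` are assumed names, and the utility lists merely echo the examples given in the disclosure:

```python
# Utilities are looked up from the type of the selected item, so an audio
# clip surfaces 'play'/'stop'/'pause' while an email surfaces 'reply' etc.
UTILITIES_BY_TYPE = {
    "email":   ["reply", "forward", "delete", "print", "categorize"],
    "contact": ["add", "view", "edit", "forward", "SMS", "delete"],
    "audio":   ["play", "stop", "pause", "delete", "forward"],
}

def utilities_for(item_type):
    """Return the context based utilities to display for this item type."""
    return UTILITIES_BY_TYPE.get(item_type, [])
```

Selecting an audio clip would thus display ‘play’, ‘stop’, ‘pause’, ‘delete’ and ‘forward’, while an unknown item type displays no utilities.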
  • The user can connect to the various social networks and electronic applications to receive and transmit the one or more items including, but not limited to, electronic mails (e-mails), short message services (SMSs), messages, images, audios, videos, and electronic documents.
  • FIG. 5 shows an exemplary flow chart illustrating a method of performing one or more actions on the electronic device 202 using a drag-drop operation in accordance with an embodiment of the present disclosure.
  • at least one item is selected from the one or more items displayed on the user interface 204 of the electronic device 202.
  • the at least one item is selected by touching the at least one item for the predetermined time.
  • Selection of the at least one item is detected by the processing unit 206 configured in the electronic device 202.
  • the selected at least one item is dragged towards one of one or more context based utilities on the user interface 204.
  • Each of the one or more context based utilities corresponds to the preconfigured action.
  • the selected at least one item is dropped on the one or more context based utilities to perform corresponding preconfigured action.
  • the preconfigured action is initiated by the processing unit 206 upon determining the dragging and dropping of the selected at least one item on the one or more context based utilities.
  • Figure 6 is an exemplary flow chart illustrating a method of performing one or more actions by the electronic device 202 in accordance with an embodiment of the present disclosure.
  • The processing unit 206 detects selection of at least one item from one or more items displayed on the user interface 204 of the electronic device 202. The at least one item is selected by touching the at least one item for the predetermined time.
  • the processing unit 206 detects dragging of the selected at least one item towards one of one or more context based utilities on the user interface 204. Each of the one or more context based utilities corresponds to the preconfigured action.
  • the processing unit 206 detects dropping of the selected at least one item on the one or more context based utilities.
  • the processing unit 206 performs the preconfigured action corresponding to the one of the one or more context based utilities. The preconfigured action is initiated upon determining the dragging and dropping of the selected at least one item on the one or more context based utilities.
  • Figure 7 is an exemplary flow chart illustrating a method of performing one or more actions by the electronic device 202 providing a popped up box with one or more context based utilities in accordance with an embodiment of the present disclosure.
  • The processing unit 206 of the electronic device 202 detects selection of at least one item from one or more items displayed on the user interface 204 of the electronic device 202 when the at least one item is touched for a predetermined time. Then, after the at least one item is touched for the predetermined time, a popped up box including the one or more context based utilities associated with the selected at least one item is provided on the user interface 204, as illustrated at step 704.
  • the processing unit 206 detects dragging of the selected at least one item towards one of one or more context based utilities in the popped up box. Each of the one or more context based utilities corresponds to the preconfigured action.
  • the processing unit 206 detects dropping of the selected at least one item on the one or more context based utilities in the popped up box.
  • the processing unit 206 performs the preconfigured action corresponding to the one of the one or more context based utilities. The preconfigured action is initiated upon determining the dragging and dropping of the selected at least one item on the one or more context based utilities.
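The Figure 7 flow can be sketched end to end as one function. `popup_flow`, the `actions` mapping and the 20 ms threshold are illustrative assumptions (only step 704's numbering comes from the text above), not the claimed implementation:

```python
def popup_flow(item_type, touch_ms, dropped_on, actions, threshold_ms=20):
    """actions: {item_type: {utility_name: callable(item_type) -> str}}.
    Returns (popup_utilities, action_result)."""
    if touch_ms < threshold_ms:
        return [], None              # touch too short: no selection, no popup
    utilities = actions.get(item_type, {})
    popup = sorted(utilities)        # step 704: contents of the popped up box
    if dropped_on not in utilities:
        return popup, None           # dropped outside the box: no action
    return popup, utilities[dropped_on](item_type)  # preconfigured action runs
```

For a contact held for 30 ms and dropped on ‘edit’, the popup lists the contact utilities and the edit action runs; a 5 ms touch produces no popup and no action.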
  • Embodiments of the present disclosure minimize the number of touches on the user interface needed to perform a desired action on the electronic device with the drag-drop operation.
  • Embodiments of the present disclosure reduce traversing multiple pages and facilitate performing the one or more actions on the same screen, thus reducing process overhead.
  • Embodiments of the present disclosure reduce the number of steps to achieve a desired action on the electronic device, i.e., with minimal movement on the display page.
  • The present disclosure reduces finger movement to random touch points, thus saving a considerable amount of time.
  • Embodiments of the present disclosure reduce excess scroll or movement on the display page, reducing hand movement and thus stress on the user.
  • the described operations may be implemented as a method, system or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof.
  • the described operations may be implemented as code maintained in a “non-transitory computer readable medium”, where a processing unit may read and execute the code from the computer readable medium.
  • the processing unit is at least one of a microprocessor and a processor capable of processing and executing the queries.
  • a non-transitory computer readable medium may comprise media such as magnetic storage media (e.g., hard disk drives, floppy disks, tape, etc.), optical storage (CD-ROMs, DVDs, optical disks, etc.), volatile and non-volatile memory devices, etc.
  • non-transitory computer-readable media comprise all computer-readable media except for transitory signals.
  • the code implementing the described operations may further be implemented in hardware logic (e.g., an integrated circuit chip, Programmable Gate Array (PGA), Application Specific Integrated Circuit (ASIC), etc.).
  • the code implementing the described operations may be implemented in “transmission signals”, where transmission signals may propagate through space or through a transmission media, such as an optical fiber, copper wire, etc.
  • the transmission signals in which the code or logic is encoded may further comprise a wireless signal, satellite transmission, radio waves, infrared signals, Bluetooth, etc.
  • the transmission signals in which the code or logic is encoded is capable of being transmitted by a transmitting station and received by a receiving station, where the code or logic encoded in the transmission signal may be decoded and stored in hardware or a non-transitory computer readable medium at the receiving and transmitting stations or devices.
  • An “article of manufacture” comprises non-transitory computer readable medium, hardware logic, and/or transmission signals in which code may be implemented.
  • a device in which the code implementing the described embodiments of operations is encoded may comprise a computer readable medium or hardware logic.
  • the term “an embodiment” means “one or more (but not all) embodiments of the invention(s)” unless expressly specified otherwise.
  • Devices that are in communication with each other need not be in continuous communication with each other, unless expressly specified otherwise.
  • devices that are in communication with each other may communicate directly or indirectly through one or more intermediaries.
  • Figures 6 and 7 show certain events occurring in a certain order. In alternative embodiments, certain operations may be performed in a different order, modified or removed. Moreover, steps may be added to the above described logic and still conform to the described embodiments. Further, operations described herein may occur sequentially, or certain operations may be processed in parallel. Yet further, operations may be performed by a single processing unit or by distributed processing units.

Abstract

Embodiments of the present disclosure relate to user interface technologies of an electronic device. The present disclosure relates to a method of performing one or more actions on an electronic device. The method comprises selecting at least one item from one or more items displayed on a user interface of the electronic device. The at least one item is selected by touching the at least one item for a predetermined time. After selecting the at least one item, the at least one item is dragged towards one of one or more context based utilities on the user interface. Each of the one or more context based utilities corresponds to a preconfigured action. Then, the at least one item is dropped on the one or more context based utilities to perform the corresponding preconfigured action.

Description

A METHOD OF PERFORMING ONE OR MORE ACTIONS ON AN ELECTRONIC DEVICE
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority of Indian Patent Application No. IN5596/CHE/2013, filed December 4, 2013 and titled “A METHOD OF PERFORMING ONE OR MORE ACTIONS ON AN ELECTRONIC DEVICE”, which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
The present disclosure relates to the field of user interface technologies for an electronic device. In particular, the present disclosure is related to a method and an electronic device for performing one or more actions using a drag-drop operation on the electronic device.
BACKGROUND
In recent years, various electronic devices with touch screens have been developed to provide user friendly interaction between a user and the electronic device. The touch screen reduces the use of hardware buttons for interacting with the electronic device. In particular, one or more utilities or icons on the interface of the touch screen provide the user with the fastest interactions. Generally, touch screen enabled electronic devices include, but are not limited to, mobile phones, computers, tablets, notes, laptops, etc. Electronic devices are commonly utilized to provide users a means to communicate and stay "connected" while moving from one place to another. The technology of such electronic devices has advanced to the extent where data regarding any desired content is readily available. Such information exchange can occur by way of a user entering information (e.g., text, visual, audio, and so on) into the display area of a user device and interacting with the device, utilizing that display area.
Usually, when a user attempts to navigate through various directories, applications, files or other functions, all the information that the user might need to navigate might not be displayed on the display area at the same time. This makes the user scroll or move through various display pages to achieve the desired result. Traditionally, the user has to perform a minimum of three steps to achieve the desired result. Figure 1 illustrates a traditional way of performing a ‘forward’ action in an e-mail conversation according to the prior art. Consider the email application, where the user commonly performs actions such as creating a message, replying, forwarding and deleting. For example, to forward an email, the user first has to click on the email to forward the content or message contained in the email. Secondly, the user clicks on the ‘Menu’ tab and then chooses ‘options’. Thirdly, the user clicks on ‘forward’ to achieve the forward action. The excess scroll or movement on the display page that is not intuitively related to hand movement causes stress on the user. In addition, performing some functions can be cumbersome and might not allow a user to quickly and easily interact with the electronic device.
Therefore, there exists a need to reduce the number of steps required to achieve a desired action on the electronic device. Hence, a drag-drop operation is provided to achieve the desired action with a minimum of movement on the display page.
SUMMARY
The shortcomings of the prior art are overcome and additional advantages are provided through the present disclosure. Additional features and advantages are realized through the techniques of the present disclosure. Other embodiments and aspects of the disclosure are described in detail herein and are considered a part of the claimed disclosure.
The present disclosure relates to a method of performing one or more actions on an electronic device. The method comprises selecting at least one item from one or more items displayed on a user interface of the electronic device. The at least one item is selected by touching the at least one item for a predetermined time. Then, the selected at least one item is dragged towards one of one or more context based utilities on the user interface. Each of the one or more context based utilities corresponds to a preconfigured action. Lastly, the selected at least one item is dropped on the one or more context based utilities to perform the corresponding preconfigured action.
An electronic device is disclosed in the present disclosure for performing one or more actions. The electronic device comprises a user interface and a processing unit. The user interface is used to perform the drag and drop operation. The processing unit is communicatively connected to the user interface, and is configured to detect selection of at least one item from one or more items displayed on the user interface of the electronic device. The at least one item is selected by touching the at least one item for a predetermined time. Then the processing unit detects dragging of the selected at least one item towards one of one or more context based utilities on the user interface. Each of the one or more context based utilities corresponds to a preconfigured action. The processing unit detects dropping of the selected at least one item on the one or more context based utilities and performs the preconfigured action corresponding to the one of the one or more context based utilities.
The present disclosure relates to a non-transitory computer readable medium including operations stored thereon that, when processed by at least one processing unit, cause an electronic device to perform one or more actions by performing the acts of detecting selection of at least one item from one or more items displayed on the user interface of the electronic device. The at least one item is selected by touching the at least one item for a predetermined time. Then, dragging of the selected at least one item towards one of one or more context based utilities on the user interface is detected. Each of the one or more context based utilities corresponds to a preconfigured action. Next, dropping of the selected at least one item on the one or more context based utilities is detected. The preconfigured action corresponding to the one of the one or more context based utilities is performed.
The present disclosure relates to a computer program for performing one or more actions on an electronic device, said computer program comprising a code segment for detecting selection of at least one item from one or more items displayed on the user interface of the electronic device, wherein the at least one item is selected by touching the at least one item for a predetermined time; a code segment for detecting dragging of the selected at least one item towards one of one or more context based utilities on the user interface, wherein each of the one or more context based utilities corresponds to a preconfigured action; a code segment for detecting dropping of the selected at least one item on the one or more context based utilities; and a code segment for performing the preconfigured action corresponding to the one of the one or more context based utilities.
The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects and features described above, further aspects, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF THE DRAWINGS
The novel features and characteristic of the disclosure are set forth in the appended claims. The embodiments of the disclosure itself, however, as well as a preferred mode of use, further objectives and advantages thereof, will best be understood by reference to the following detailed description of an illustrative embodiment when read in conjunction with the accompanying drawings. One or more embodiments are now described, by way of example only, with reference to the accompanying drawings.
Figure 1 shows an exemplary diagram illustrating a traditional method of performing actions in electronic communication on an electronic device according to the prior art;
Figure 2 shows an exemplary block diagram illustrating an electronic device in accordance with an embodiment of the present disclosure;
Figures 3a-3c show a user interface illustrating a method of performing one or more actions in an email application on an electronic device in accordance with an embodiment of the present disclosure;
Figures 4a-4c show a user interface illustrating a method of performing one or more actions in contact registers on an electronic device in accordance with an embodiment of the present disclosure;
Figure 5 shows an exemplary flow chart illustrating a method of performing one or more actions on an electronic device in accordance with an embodiment of the present disclosure;
Figure 6 is an exemplary flow chart illustrating a method of performing one or more actions by an electronic device in accordance with an embodiment of the present disclosure; and
Figure 7 is an exemplary flow chart illustrating a method of performing one or more actions by the electronic device providing a popped up box with one or more context based utilities in accordance with an embodiment of the present disclosure.
The figures depict embodiments of the disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the disclosure described herein.
DETAILED DESCRIPTION
The foregoing has broadly outlined the features and technical advantages of the present disclosure in order that the detailed description of the disclosure that follows may be better understood. Additional features and advantages of the disclosure will be described hereinafter, which form the subject of the claims of the disclosure. It should be appreciated by those skilled in the art that the conception and specific aspects disclosed may be readily utilized as a basis for modifying or designing other structures for carrying out the same purposes of the present disclosure. It should also be realized by those skilled in the art that such equivalent constructions do not depart from the spirit and scope of the disclosure as set forth in the appended claims. The novel features which are believed to be characteristic of the disclosure, both as to its organization and method of operation, together with further objects and advantages, will be better understood from the following description when considered in connection with the accompanying figures. It is to be expressly understood, however, that each of the figures is provided for the purpose of illustration and description only and is not intended as a definition of the limits of the present disclosure.
Embodiments of the present disclosure relate to user interface technologies for an electronic device. More particularly, the present disclosure provides a drag-drop operation to perform one or more actions on the user interface of the electronic device. An item is selected, dragged and dropped onto a context based utility to achieve a desired action. Typically, an item is selected when a user touches the item for a predetermined time. Then, the user drags the item towards one of one or more context based utilities, each of which corresponds to a preconfigured action. After dragging, the item is dropped on the one of the one or more context based utilities to achieve the preconfigured action. For example, in the email application, an e-mail from the inbox is selected by touching the mail for a predetermined time. The one or more context based utilities related to email communications include, but are not limited to, ‘reply’, ‘forward’, ‘delete’, ‘print’, ‘categorize’, ‘copy’, ‘move’, ‘compose’, etc. Then, the selected item is dragged towards a context based utility such as ‘Reply’ and dropped on the ‘Reply’ utility as desired by the user to achieve a reply action. As another example, in the contact registers, at least one contact is selected by touching the at least one contact for a predetermined time. The one or more context based utilities related to contact registers include, but are not limited to, ‘edit’, ‘view’, ‘delete’, ‘add’, ‘connect to social networks’, etc. Then, the at least one contact is dragged towards the ‘view’ utility and dropped on the ‘view’ utility as desired by the user to achieve viewing of the contact being selected. Therefore, the drag-drop operation reduces the number of steps followed in the traditional methods illustrated in the prior art.
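The long-press selection described above can be sketched as a simple duration check. This is an illustrative sketch only; the function name, the timestamp representation and the 20 ms threshold (picked from the disclosed range of about 1-30 milliseconds) are assumptions, not part of the disclosure.

```python
# Illustrative sketch: treat a touch as a selection only when it is held
# for at least the predetermined time. All names and the threshold value
# are assumptions for demonstration.

PREDETERMINED_TIME_MS = 20  # assumed value within the disclosed ~1-30 ms range

def is_selection(touch_down_ms, touch_up_ms, threshold_ms=PREDETERMINED_TIME_MS):
    """Return True when the touch duration meets the predetermined time."""
    return (touch_up_ms - touch_down_ms) >= threshold_ms

print(is_selection(0, 25))  # a 25 ms touch selects the item
print(is_selection(0, 5))   # a 5 ms touch is an ordinary tap
```

On a real device, the same check would typically be driven by the platform's long-press timeout rather than raw timestamps.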
Henceforth, embodiments of the present disclosure are explained with the help of exemplary diagrams and one or more examples. However, such exemplary diagrams and examples are provided for illustration purposes, for better understanding of the present disclosure, and should not be construed as a limitation on the scope of the present disclosure.
Figure 2 shows an exemplary block diagram illustrating an electronic device 202 in accordance with an embodiment of the present disclosure. The electronic device 202 is a device comprising a touch screen interface. In a non-limiting example, the electronic device 202 includes, but is not limited to, a mobile phone, ATM machine, television, Personal Digital Assistant (PDA), laptop, computer, point-of-sale terminal, car navigation system, medical monitor, contactless device, industrial control panel, and other electronic devices having a touch screen panel. The electronic device 202 comprises a user interface 204 and a processing unit 206. The user interface 204 enables a user to input instructions using an input device (not shown in figure 2) including, but not limited to, a stylus, finger, pen shaped pointing device, and any other device that can be used to provide input through the user interface 204. In an embodiment, the user interface 204 is a touch screen. In an embodiment, the user interface 204 is configured to perform the drag-drop operation to achieve one or more actions. In an embodiment, the user interface 204 is communicatively connected to the processing unit 206 through a communication interface. In an embodiment, the information with respect to the drag-drop operation can be communicated to the processing unit 206 from a machine-readable medium. The term machine-readable medium can be defined as a medium providing data to a machine to enable the machine to perform a specific function. The machine-readable medium can be a storage media. The storage media can be the storage unit. All such media must be tangible/non-transitory to enable the instructions carried by the media to be detected by a physical mechanism that reads the instructions into the machine.
The processing unit 206 detects selection of an item, dragging and dropping of the item on the one or more context based utilities, and performs one or more actions related to the corresponding one or more context based utilities. Particularly, the processing unit 206 detects selection of at least one item from one or more items displayed on the user interface 204 of the electronic device 202. In an embodiment, the one or more items include, but are not limited to, electronic mails (e-mails), short message services (SMSs), messages, images, audios, videos, and electronic documents. The at least one item is selected by touching the at least one item for a predetermined time. In a non-limiting embodiment, the predetermined time is in the range of about 1 millisecond to about 30 milliseconds. For example, an email from an inbox comprising received emails is selected when the user touches the email for about 20 milliseconds. The processing unit 206 detects dragging of the selected at least one item towards one of one or more context based utilities on the user interface 204. The one or more context based utilities are icons which are logically related to the actions. For example, in email communications, when an email is selected, the one or more utilities corresponding to emails are displayed, such as ‘reply to email’, ‘forward the email’, ‘delete the email’, etc. In an embodiment, the one or more context based utilities pop up based on the type of the at least one item being selected. In an embodiment, the one or more context based utilities are displayed adjacent to the at least one item being selected. For example, the one or more context based utilities are displayed below the emails and next to the email after the email is selected.
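The choice of which context based utilities pop up for a selected item can be sketched as a lookup keyed on the item's type. The utility lists below mirror the examples given in the disclosure (emails, contacts, audio clips), but the function name and exact contents are illustrative assumptions.

```python
# Illustrative sketch: context based utilities chosen by item type.
# The utility lists mirror examples from the disclosure; names are assumptions.
CONTEXT_UTILITIES = {
    "email":   ["reply", "forward", "delete", "print", "categorize", "copy", "move"],
    "contact": ["add", "view", "edit", "forward", "sms", "delete"],
    "audio":   ["play", "stop", "pause", "delete", "forward"],
}

def utilities_for(item_type):
    """Return the utilities to display adjacent to a selected item of this type."""
    return CONTEXT_UTILITIES.get(item_type, [])

print(utilities_for("email"))  # utilities shown when an email is selected
```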
Each of the one or more context based utilities corresponds to a preconfigured action including, but not limited to, replying, deleting, forwarding, editing, composing, adding, calling, viewing and other related actions capable of being performed on the electronic device 202. For example, when the user wishes to reply to an email, the user selects the email and drags the email to the ‘reply’ utility to achieve the reply action. Upon detecting the dragging of the at least one item, the processing unit 206 detects dropping of the selected at least one item on the one or more context based utilities to achieve the preconfigured action corresponding to the one or more context based utilities. The preconfigured action is initiated by the processing unit upon determining the dragging and dropping of the selected at least one item on the one or more context based utilities. For example, the reply action is achieved when the user drops the selected email on the ‘reply’ icon or utility.
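The binding between each utility and its preconfigured action can be sketched as a dispatch table consulted on the drop event. The stub actions and names below are hypothetical; a real device would invoke the corresponding application function instead.

```python
# Illustrative sketch: each context based utility maps to a preconfigured
# action that runs when the item is dropped on it. Actions are stubs.

def reply(item):   return f"replying to {item}"
def forward(item): return f"forwarding {item}"
def delete(item):  return f"deleting {item}"

PRECONFIGURED_ACTIONS = {"reply": reply, "forward": forward, "delete": delete}

def on_drop(item, utility):
    """Perform the preconfigured action bound to the utility receiving the drop."""
    action = PRECONFIGURED_ACTIONS.get(utility)
    if action is None:
        raise ValueError(f"no preconfigured action for utility {utility!r}")
    return action(item)

print(on_drop("email 310", "forward"))
```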
In an embodiment, the drag-drop operation, comprising selection of the at least one item and dragging and dropping of the selected at least one item, is performed without lifting the input device from the user interface 204. In the present disclosure, only a limited number of events or applications are listed, for demonstration purposes only. This should not be construed as a limitation.
Figure 3a illustrates an exemplary email application on the electronic device 202 comprising various context based utilities corresponding to the email application. Consider the email application, where the user logs in to an email account and clicks on the inbox to display one or more emails received. In the illustrated figure 3a, the user interface 204 of the electronic device 202 displays the various emails received in the ‘inbox’ section, depicted by numeral 302, using the email account “XYZ@gmail.com”. The inbox displays the one or more received emails depicted by numeral 304. For example, the emails are received from ‘Abc’, ‘HIJ’, ‘LMN’, ‘PQR’ and so on, as illustrated by numeral 304. Each of the emails comprises some content. For example, the email received from ‘Abc’ comprises content stated as “Hi, This is a test mail”. Based on the email application, the one or more context based utilities depicted by numeral 306 are displayed below the emails. That is, the ‘forward’ utility is depicted by 306a, the ‘reply’ utility without email history (chain of emails) is depicted by 306b, ‘reply’ with email history is depicted by 306c and the ‘delete’ utility is depicted by 306d.
Figure 3b shows the user interface 204 illustrating a method of performing one or more actions, such as forwarding mails in the email application on the electronic device 202 using the drag-drop operation, in accordance with an exemplary embodiment of the present disclosure. Consider the one or more emails received in the inbox section of the user’s account. To perform the drag-drop operation, the user selects the email depicted by numeral 310. The user selects the email 310 among the one or more received emails 304 by touching the email 310 for around 10-30 milliseconds on the user interface 204 using a finger 308. The user drags the email 310, where dragging of the email 310 by the user using the finger 308 is depicted by numeral 312. The user drops the email 310 on the ‘forward’ utility 306a to achieve the action of forwarding the email. After dropping of the email 310 on the ‘forward’ utility 306a is performed, the forward dialogue box appears on the user interface 204, where the user is asked to enter the email id or name of the person to whom the user wishes to forward the email 310, along with the content depicted by numeral 314a.
Figure 3c shows the user interface 204 illustrating a method of performing one or more actions, such as replying to mails in the email application on the electronic device 202 using the drag-drop operation, in accordance with an embodiment of the present disclosure. Consider the one or more emails received in the inbox section of the user’s account. To perform the drag-drop operation, the user selects the email depicted by numeral 316. The user selects the email 316 among the one or more received emails 304 by touching the email 316 for around 1-30 milliseconds on the user interface 204 using the finger 308. The user drags the email 316, where dragging of the email 316 by the user using the finger 308 is depicted by numeral 318. The user drops the email 316 on the ‘reply’ utility 306b to achieve the action of replying to the email 316. After dropping of the email 316 on the ‘reply’ utility 306b is performed, the reply dialogue box appears on the user interface 204, where the user is asked to enter the text in text field 320a. The email id of the person is retained in the “To” field when the user is replying. That is, the email id “Abc@gmail.com” of the person from whom the email 316 was received is retained in the “To” field, as depicted by numeral 320, when the user is performing the reply action.
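Retaining the sender's address in the “To” field after a drop on the ‘reply’ utility can be sketched as building a reply draft from the selected email. The dictionary shape, field names and the "Re:" subject prefix are illustrative assumptions, not details from the disclosure.

```python
# Illustrative sketch: dropping an email on 'reply' pre-fills the "To"
# field with the original sender. Field names are assumptions.
def build_reply(email):
    return {"to": email["from"],
            "subject": "Re: " + email["subject"],
            "body": ""}

draft = build_reply({"from": "Abc@gmail.com",
                     "subject": "Hi",
                     "body": "This is a test mail"})
print(draft["to"])  # the sender is retained in the "To" field
```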
Similarly, various other actions, including, but not limited to, deleting, composing, and editing the email, are achieved by performing the drag-drop operation on the corresponding context based utility.
Figure 4a shows the user interface 204 illustrating a method of performing one or more actions in contact registers on the electronic device 202 using the drag-drop operation in accordance with an embodiment of the present disclosure. Consider the contact register depicted by numeral 402, which appears on the user interface 204 when the user clicks the contact registers on the electronic device 202. The contact registers contain information of the contacts including, but not limited to, phone number, email id, picture and name. In the illustrated figure 4a, the contact registers 402 contain the name and phone number of the contacts depicted by numeral 404. One or more context based utilities depicted by 406 related to the contact registers 402 appear on the user interface 204. That is, an add utility 406a to add a contact, a view utility 406b to view the contact information, an edit utility 406c to edit the contact information, a forward utility 406d to forward the contact information, an SMS utility 406e to send an SMS to the contact and a delete utility 406f to delete the contact are displayed.
Figure 4b shows the user interface 204 illustrating a method of performing one or more actions, such as forwarding the contact information in the contact registers 402 on the electronic device 202 using the drag-drop operation, in accordance with an embodiment of the present disclosure. Consider the contact register 402 containing the one or more contacts along with their contact information, such as name and phone number, as depicted by numeral 404. To perform the drag-drop operation, the user selects the contact depicted by numeral 408. The user selects the contact 408 among the one or more contacts 404 by touching the contact 408 for around 1-30 milliseconds on the user interface 204 using the finger 308. The user drags the contact 408, where dragging of the contact 408 by the user using the finger 308 is depicted by numeral 410. The user drops the contact 408 on the forward utility 406d to achieve the action of forwarding the contact 408. After dropping of the contact 408 on the forward utility 406d is performed, the contact 408 is forwarded as desired by the user.
Figure 4c shows the user interface 204 illustrating a method of performing one or more actions, such as editing the contact information in the contact registers 402 on the electronic device 202 using the drag-drop operation, in accordance with an embodiment of the present disclosure. Consider the contact register 402 containing the one or more contacts along with their contact information, such as name and phone number, as depicted by numeral 404. To perform the drag-drop operation, the user selects the contact depicted by numeral 412. The user selects the contact 412 among the one or more contacts 404 by touching the contact 412 for around 1-30 milliseconds on the user interface 204 using the finger 308. In the illustrated figure 4c, the one or more context based utilities related to the contact register 402 pop up when the user touches the contact in the contact register 402 for a predetermined time, for example, around 30 milliseconds. The one or more context based utilities which pop up are depicted by numeral 414. The popped up box includes options including, but not limited to, call, SMS, add, view, delete, connect to social networks and edit. In an embodiment, the user invites the contact person present in the contact register to various social networks by choosing the connect to social networks utility. The user drags the contact 412, where dragging of the contact 412 by the user using the finger 308 is depicted by numeral 416. The user drops the contact 412 on the edit utility 416a to achieve the action of editing the contact 412. After dropping of the contact 412 on the edit utility 416a is achieved, the contact 412 is edited as desired by the user.
Similarly, various other actions, including, but not limited to, deleting, adding, viewing and connecting to the social network, are achieved by performing the drag-drop operation on the corresponding context based utility. A person skilled in the art would understand that the utilities are dynamically provided on the user interface 204 based on the at least one item being selected on the user interface 204. For example, when the user selects an audio clip, the utilities corresponding to audio clips, such as ‘play’, ‘stop’, ‘pause’, ‘delete’, ‘forward’, etc., are provided or displayed on the user interface 204.
In an embodiment, the user can connect to various social networks and electronic applications to receive and transmit the one or more items including, but not limited to, electronic mails (e-mails), short message services (SMSs), messages, images, audios, videos, and electronic documents.
Figure 5 shows an exemplary flow chart illustrating a method of performing one or more actions on the electronic device 202 using a drag-drop operation in accordance with an embodiment of the present disclosure. At step 502, at least one item is selected from the one or more items displayed on the user interface 204 of the electronic device 202. The at least one item is selected by touching the at least one item for the predetermined time. Selection of the at least one item is detected by the processing unit 206 configured in the electronic device 202. At step 504, the selected at least one item is dragged towards one of one or more context based utilities on the user interface 204. Each of the one or more context based utilities corresponds to the preconfigured action. At step 506, the selected at least one item is dropped on the one or more context based utilities to perform the corresponding preconfigured action. The preconfigured action is initiated by the processing unit 206 upon determining the dragging and dropping of the selected at least one item on the one or more context based utilities.
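The select-drag-drop sequence of steps 502-506 can be sketched as a small state machine. The class and method names and the 20 ms threshold are illustrative assumptions; the disclosure does not prescribe an implementation.

```python
# Illustrative state machine for Figure 5: select (step 502), drag (step 504),
# drop and perform the action (step 506). Names are assumptions.

class DragDrop:
    def __init__(self, threshold_ms=20):
        self.threshold_ms = threshold_ms  # the predetermined time
        self.selected = None              # item chosen by the long touch
        self.hover = None                 # utility currently dragged over

    def touch(self, item, held_ms):
        # Step 502: the item is selected only by a sufficiently long touch.
        if held_ms >= self.threshold_ms:
            self.selected = item
        return self.selected

    def drag_to(self, utility):
        # Step 504: drag the selected item towards a context based utility.
        if self.selected is not None:
            self.hover = utility

    def release(self):
        # Step 506: dropping on the utility triggers its preconfigured action.
        if self.selected is None or self.hover is None:
            return None
        result = f"{self.hover}:{self.selected}"
        self.selected = self.hover = None
        return result

dd = DragDrop()
dd.touch("email 310", held_ms=25)
dd.drag_to("forward")
print(dd.release())  # the forward action runs on email 310
```

Note that the whole gesture is one continuous interaction: a short touch never sets `selected`, so a subsequent drag and release perform no action.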
Figure 6 is an exemplary flow chart illustrating a method of performing one or more actions by the electronic device 202 in accordance with an embodiment of the present disclosure. At step 602, the processing unit 206 detects selection of at least one item from one or more items displayed on the user interface 204 of the electronic device 202. The at least one item is selected by touching the at least one item for the predetermined time. Then, at step 604, the processing unit 206 detects dragging of the selected at least one item towards one of one or more context based utilities on the user interface 204. Each of the one or more context based utilities corresponds to the preconfigured action. At step 606, the processing unit 206 detects dropping of the selected at least one item on the one or more context based utilities. At step 608, the processing unit 206 performs the preconfigured action corresponding to the one of the one or more context based utilities. The preconfigured action is initiated upon determining the dragging and dropping of the selected at least one item on the one or more context based utilities.
Figure 7 is an exemplary flow chart illustrating a method of performing one or more actions by the electronic device 202 providing a popped up box with one or more context based utilities in accordance with an embodiment of the present disclosure. At step 702, the processing unit 206 of the electronic device 202 detects selection of at least one item from one or more items displayed on the user interface 204 of the electronic device 202 when the at least one item is touched for a predetermined time. Then, after the at least one item is touched for the predetermined time, a popped up box including the one or more context based utilities associated with the selected at least one item is provided on the user interface 204, as illustrated at step 704. At step 706, the processing unit 206 detects dragging of the selected at least one item towards one of the one or more context based utilities in the popped up box. Each of the one or more context based utilities corresponds to the preconfigured action. At step 708, the processing unit 206 detects dropping of the selected at least one item on the one or more context based utilities in the popped up box. At step 710, the processing unit 206 performs the preconfigured action corresponding to the one of the one or more context based utilities. The preconfigured action is initiated upon determining the dragging and dropping of the selected at least one item on the one or more context based utilities.
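The flow of Figures 5-7 (long-press selection, display of the popped up box, and drop-triggered action) can be modeled as a small state machine. The sketch below is illustrative only: the `DragDropHandler` class, the 0.5-second hold threshold, and the callback-based action dispatch are assumptions for demonstration, not the disclosed implementation.

```python
class DragDropHandler:
    """Minimal sketch of the drag-drop flow: a long press selects an item,
    a popped up box of context based utilities appears, and dropping the
    item on a utility performs its preconfigured action."""

    def __init__(self, hold_threshold=0.5, actions=None):
        self.hold_threshold = hold_threshold  # predetermined touch time (seconds)
        self.actions = actions or {}          # utility name -> preconfigured action
        self.selected = None
        self.popup = []

    def touch(self, item, held_for):
        # Steps 502/602/702: selection is detected only when the item has
        # been touched for at least the predetermined time.
        if held_for >= self.hold_threshold:
            self.selected = item
            # Step 704: show the popped up box of utilities for this item.
            self.popup = sorted(self.actions)
        return self.selected is not None

    def drop_on(self, utility):
        # Steps 506/608/710: perform the preconfigured action corresponding
        # to the utility on which the selected item is dropped.
        if self.selected is None or utility not in self.actions:
            return None
        result = self.actions[utility](self.selected)
        self.selected, self.popup = None, []  # reset after the action
        return result
```

A brief touch does not select the item; only a touch held past the threshold populates the popped up box, after which dropping on a utility dispatches the corresponding action and clears the selection.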
Additionally, advantages of the present disclosure are illustrated herein.
Embodiments of the present disclosure minimize the number of touches on the user interface required to perform a desired action on the electronic device with the drag-drop operation.
Embodiments of the present disclosure reduce the need to traverse multiple pages and facilitate performing the one or more actions on the same screen, thus reducing processing overhead.
Embodiments of the present disclosure reduce the number of steps needed to achieve a desired action on the electronic device, i.e., with a minimum of movement on the display page. Thus, the present disclosure reduces finger movement to random touch points, saving a considerable amount of time.
Embodiments of the present disclosure reduce excess scrolling or movement on the display page, thereby reducing hand movement and stress on the user.
The described operations may be implemented as a method, system or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof. The described operations may be implemented as code maintained in a "non-transitory computer readable medium", where a processing unit may read and execute the code from the computer readable medium. The processing unit is at least one of a microprocessor and a processor capable of processing and executing the queries. A non-transitory computer readable medium may comprise media such as magnetic storage media (e.g., hard disk drives, floppy disks, tape, etc.), optical storage (CD-ROMs, DVDs, optical disks, etc.), and volatile and non-volatile memory devices (e.g., EEPROMs, ROMs, PROMs, RAMs, DRAMs, SRAMs, Flash memory, firmware, programmable logic, etc.). Non-transitory computer-readable media comprise all computer-readable media except for transitory, propagating signals. The code implementing the described operations may further be implemented in hardware logic (e.g., an integrated circuit chip, Programmable Gate Array (PGA), Application Specific Integrated Circuit (ASIC), etc.). Still further, the code implementing the described operations may be implemented in "transmission signals", where transmission signals may propagate through space or through a transmission medium, such as an optical fiber, copper wire, etc. The transmission signals in which the code or logic is encoded may further comprise a wireless signal, satellite transmission, radio waves, infrared signals, Bluetooth, etc.
The transmission signals in which the code or logic is encoded are capable of being transmitted by a transmitting station and received by a receiving station, where the code or logic encoded in the transmission signal may be decoded and stored in hardware or in a non-transitory computer readable medium at the receiving and transmitting stations or devices. An "article of manufacture" comprises a non-transitory computer readable medium, hardware logic, and/or transmission signals in which code may be implemented. A device in which the code implementing the described embodiments of operations is encoded may comprise a computer readable medium or hardware logic. Of course, those skilled in the art will recognize that many modifications may be made to this configuration without departing from the scope of the invention, and that the article of manufacture may comprise any suitable information bearing medium known in the art.
The terms "an embodiment" , "embodiment" , "embodiments" , "the embodiment" , "the embodiments" , "one or more embodiments" , "some embodiments" , and "one embodiment" mean "one or more (but not all) embodiments of the invention (s) " unless expressly specified otherwise.
The terms "including" , "comprising" , “having” and variations thereof mean "including but not limited to" , unless expressly specified otherwise.
The enumerated listing of items does not imply that any or all of the items are mutually exclusive, unless expressly specified otherwise.
The terms "a" , "an" and "the" mean "one or more" , unless expressly specified otherwise.
Devices that are in communication with each other need not be in continuous communication with each other, unless expressly specified otherwise. In addition, devices that are in communication with each other may communicate directly or indirectly through one or more intermediaries.
A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention.
Further, although process steps, method steps, algorithms or the like may be described in a sequential order, such processes, methods and algorithms may be configured to work in alternate orders. In other words, any sequence or order of steps that may be described does not necessarily indicate a requirement that the steps be performed in that order. The steps of processes described herein may be performed in any order practical. Further, some steps may be performed simultaneously.
When a single device or article is described herein, it will be readily apparent that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether or not they cooperate) , it will be readily apparent that a single device/article may be used in place of the more than one device or article or a different number of devices/articles may be used instead of the shown number of devices or programs. The functionality and/or the features of a device may be alternatively embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the invention need not include the device itself.
The illustrated operations of Figures 6 and 7 show certain events occurring in a certain order. In alternative embodiments, certain operations may be performed in a different order, modified or removed. Moreover, steps may be added to the above described logic and still conform to the described embodiments. Further, operations described herein may occur sequentially or certain operations may be processed in parallel. Yet further, operations may be performed by a single processing unit or by distributed processing units.
The foregoing description of various embodiments of the invention has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. It is intended that the scope of the invention be limited not by this detailed description, but rather by the claims appended hereto. The above specification, examples and data provide a complete description of the manufacture and use of the composition of the invention. Since many embodiments of the invention can be made without departing from the spirit and scope of the invention, the invention resides in the claims hereinafter appended.
Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments of the invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
With respect to the use of substantially any plural and/or singular terms herein, those having skill in the art can translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations may be expressly set forth herein for sake of clarity.
In addition, where features or aspects of the disclosure are described in terms of Markush groups, those skilled in the art will recognize that the disclosure is also thereby described in terms of any individual member or subgroup of members of the Markush group.
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.

Claims (13)

  1. A method of performing one or more actions on an electronic device, said method comprising:
    selecting at least one item from one or more items displayed on a user interface of the electronic device, wherein the at least one item is selected by touching the at least one item for a predetermined time;
    dragging the selected at least one item towards one of one or more context based utilities on the user interface, wherein each of the one or more context based utilities corresponds to a preconfigured action; and
    dropping the selected at least one item on the one or more context based utilities to perform the corresponding preconfigured action.
  2. The method as claimed in claim 1, wherein the one or more items are selected from a group comprising electronic mails (e-mails), short message services (SMSs), messages, images, audios, videos, and electronic documents.
  3. The method as claimed in claim 1, wherein the one or more context based utilities are displayed adjacent to the at least one item being selected.
  4. The method as claimed in claim 1, wherein the preconfigured action corresponding to each of the context based utilities is selected from at least one of replying, deleting, forwarding, editing, composing, adding, calling, viewing and other related actions capable of being performed on the electronic device.
  5. The method as claimed in claim 1, wherein the predetermined time is in the range of about 1 millisecond to about 30 milliseconds.
  6. The method as claimed in claim 1, wherein selecting of the at least one item is detected by a processing unit configured in the electronic device.
  7. The method as claimed in claim 6, wherein the preconfigured action is initiated by the processing unit upon determining the dragging and dropping of the selected at least one item on the one or more context based utilities.
  8. The method as claimed in claim 1, wherein the electronic device is a device comprising touch screen interface selected from at least one of mobile phone, ATM machine, Television, Personal Digital Assistant (PDA), laptop, computer, point-of-sale terminals, car navigation systems, medical monitors, contactless device, industrial control panels, and other electronic device having touch screen panel.
  9. An electronic device for performing one or more actions, said electronic device comprising:
    a user interface to perform drag and drop operation; and
    a processing unit, communicatively connected to the user interface, and configured to:
    detect selection of at least one item from one or more items displayed on the user interface of the electronic device, wherein the at least one item is selected by touching the at least one item for a predetermined time;
    detect dragging of the selected at least one item towards one of one or more context based utilities on the user interface, wherein each of the one or more context based utilities corresponds to a preconfigured action;
    detect dropping of the selected at least one item on the one or more context based utilities; and
    perform the preconfigured action corresponding to the one of the one or more context based utilities.
  10. The electronic device as claimed in claim 9, wherein the user interface displays the one or more context based utilities adjacent to the at least one item being selected.
  11. The electronic device as claimed in claim 9, wherein the electronic device is a device comprising touch screen selected from at least one of mobile phone, ATM machine, Television, Personal Digital Assistant (PDA), laptop, computer, point-of-sale terminals, car navigation systems, medical monitors, contactless device, industrial control panels, and other electronic device having touch screen panel.
  12. A non-transitory computer readable medium including operations stored thereon that when processed by at least one processing unit cause an electronic device to perform one or more actions by performing the acts of:
    detecting selection of at least one item from one or more items displayed on the user interface of the electronic device, wherein the at least one item is selected by touching the at least one item for a predetermined time;
    detecting dragging of the selected at least one item towards one of one or more context based utilities on the user interface, wherein each of the one or more context based utilities corresponds to a preconfigured action;
    detecting dropping of the selected at least one item on the one or more context based utilities; and
    performing the preconfigured action corresponding to the one of the one or more context based utilities.
  13. A computer program for performing one or more actions on an electronic device, said computer program comprising a code segment for detecting selection of at least one item from one or more items displayed on the user interface of the electronic device, wherein the at least one item is selected by touching the at least one item for a predetermined time; a code segment for detecting dragging of the selected at least one item towards one of one or more context based utilities on the user interface, wherein each of the one or more context based utilities corresponds to a preconfigured action; a code segment for detecting dropping of the selected at least one item on the one or more context based utilities; and a code segment for performing the preconfigured action corresponding to the one of the one or more context based utilities.
PCT/CN2014/086192 2013-12-04 2014-09-10 Method of performing one or more actions on electronic device WO2015081739A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201480003270.6A CN105051669A (en) 2013-12-04 2014-09-10 Method of performing one or more actions on electronic device
EP14867972.3A EP2965185B1 (en) 2013-12-04 2014-09-10 Method of performing one or more actions on electronic device
ES14867972T ES2785207T3 (en) 2013-12-04 2014-09-10 Procedure to perform one or more actions on the electronic device
US14/969,163 US20160098166A1 (en) 2013-12-04 2015-12-15 Method of Performing One or More Actions on an Electronic Device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
INCH55962013 2013-12-04
ININ5596/CHE/2013 2013-12-04

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/969,163 Continuation US20160098166A1 (en) 2013-12-04 2015-12-15 Method of Performing One or More Actions on an Electronic Device

Publications (1)

Publication Number Publication Date
WO2015081739A1 (en) 2015-06-11



