US20130246930A1 - Touch gestures related to interaction with contacts in a business data system - Google Patents
- Publication number
- US20130246930A1 (U.S. application Ser. No. 13/754,896)
- Authority
- US
- United States
- Prior art keywords
- contact
- user
- business data
- touch gesture
- contact information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
Definitions
- CRM customer relations management
- ERP enterprise resource planning
- LOB line-of-business
- These types of systems often enable creation and maintaining of business data records.
- Some of these records include customer records that have details about customers, vendor records that include details of vendors, sales records, sales proposals, quotes, order records, records that contain product or service information, and records related to business contacts, among many others.
- the system can also include workflows that enable users to perform various tasks using the business data system.
- An example of a workflow provided in some business data systems is one that allows users or organizations to track various business opportunities. For instance, if there is an opportunity to make a sale of products or services to another organization, the business data system allows users to enter information that may be helpful in converting that opportunity into an actual sale. Similarly, some such systems allow many other types of tasks or workflows to be performed as well. For instance, some systems allow users to prepare a quote for a potential customer. Then, when the customer accepts the terms of the quote, the user can convert the quote into an actual order. These are merely two examples of a wide variety of different types of tasks and workflows that can be performed within a business data system.
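The quote-to-order conversion described above can be sketched as follows. This is an illustrative sketch only; the function name, field names, and status values are assumptions for the example, not part of the patent.

```python
# Illustrative sketch (all names assumed) of converting an accepted
# quote into an order, as in the workflow described above.

def convert_quote_to_order(quote):
    """Turn an accepted quote record into an order record."""
    if quote.get("status") != "accepted":
        raise ValueError("quote must be accepted before conversion")
    return {
        "record_type": "order",
        "customer": quote["customer"],
        "items": quote["items"],
        "source_quote": quote["id"],
    }

quote = {"id": "Q-1", "customer": "ACME", "items": ["widget"],
         "status": "accepted"}
order = convert_quote_to_order(quote)
```

The guard on the quote's status mirrors the described flow: the quote becomes an order only after the customer accepts its terms.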
- some users may wish to contact other people associated with the business data records being operated on. For instance, where a customer has a primary contact, it may be that the user wishes to call or otherwise communicate with that person in order to discuss the terms of a proposal or order. Therefore, some business data systems allow a user to search for contacts, and communicate with a given contact.
- a desktop computer may have user interface displays with user input mechanisms that can be actuated by a point and click device (such as a mouse or track ball) or a hardware keyboard.
- mobile devices often have touch sensitive screens. This enables a user to actuate user input mechanisms using touch gestures, such as by using a finger, a stylus, or other device.
- a business data system generates a user interface display showing a business data record.
- the business data system receives a touch gesture user input to manipulate a contact within the business data system.
- the business data system manipulates the contact based on the touch gesture user input.
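The three steps above (display a business data record, receive a touch gesture targeting a contact, manipulate the contact) can be sketched as a simple dispatcher. This is a hypothetical illustration; the class, gesture names, and action mapping are assumptions, not the patent's implementation.

```python
# Hypothetical sketch of the three-step flow: display a record, receive
# a touch gesture aimed at a contact, and manipulate that contact.

class BusinessDataSystem:
    def __init__(self):
        self.contacts = {"Phil B.": {"phone": "555-0100"}}
        self.events = []  # record of what the system did

    def display_record(self, record_id):
        self.events.append(("display", record_id))

    def receive_touch_gesture(self, gesture, contact):
        # Dispatch a recognized gesture to a contact-manipulation action.
        actions = {
            "tap": self._open_contact,
            "touch_and_hold": self._edit_contact,
            "swipe": self._delete_contact,
        }
        actions[gesture](contact)

    def _open_contact(self, name):
        self.events.append(("open", name))

    def _edit_contact(self, name):
        self.events.append(("edit", name))

    def _delete_contact(self, name):
        self.contacts.pop(name, None)
        self.events.append(("delete", name))

system = BusinessDataSystem()
system.display_record("opportunity-1")
system.receive_touch_gesture("tap", "Phil B.")
```

A tap opens the contact; other gestures could map to edit or delete, in the same spirit as the contact manipulations described later in the specification.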
- FIG. 1 is a block diagram of one illustrative business data environment.
- FIG. 2A is a flow diagram of one embodiment of the operation of the system shown in FIG. 1 in manipulating contact information based on a touch gesture.
- FIG. 2B is a flow diagram illustrating one embodiment of the operation of the system shown in FIG. 1 in manipulating a contact, within a business record, using touch gestures.
- FIGS. 3A-3F show exemplary user interface displays.
- FIG. 4 shows one embodiment of the system shown in FIG. 1 in different architectures.
- FIGS. 5-9 illustrate various mobile devices.
- FIG. 10 is a block diagram of one illustrative computing environment.
- FIG. 1 shows one illustrative embodiment of a business data architecture 90 .
- Business data architecture 90 includes CRM system 100 , CRM data store 102 and user device 104 .
- User device 104 is shown generating user interface displays 106 for interaction by user 108 .
- CRM system 100 can be any business data system (such as a CRM system, an ERP system, an LOB system, or another business data application or business data system); it is described herein as a CRM system for the sake of example only.
- CRM system 100 illustratively includes processor 110 , user interface component 112 , communication component 114 , workflow/task component 118 and other CRM components 120 .
- Processor 110 is illustratively a computer processor with associated memory and timing circuitry (not separately shown). It is a functional part of CRM system 100 and is activated by, and facilitates the functionality of, the other components and items in CRM system 100 . It will also be noted that while only a single processor 110 is shown, processor 110 can actually be multiple different computer processors as well. In addition, the multiple different computer processors used by system 100 can be local to system 100 , or remote from system 100 , but accessible by system 100 .
- User interface component 112 illustratively generates user interface displays with user input mechanisms that can be actuated by user 108 .
- the user interface displays 106 (that user 108 interacts with) can be generated by user interface component 112 in CRM system 100 and passed to device 104 where they can be displayed (by device 104 , as user interface displays 106 ) for interaction by user 108 .
- Communication component 114 illustratively facilitates communication among various users of CRM system 100 , or between users of CRM system 100 and other individuals who may not necessarily be users of system 100 . For instance, if user 108 wishes to communicate with a contact who may not necessarily have access to CRM system 100 (such as by initiating a phone call, an instant message, etc.), communication component 114 illustratively facilitates this type of communication. Therefore, communication component 114 can illustratively facilitate email communication, telephone or cellular telephone communication, instant message communication, chat room communication, or other types of communication.
- Workflow/task component 118 illustratively uses user interface component 112 to generate user interface displays 106 so that user 108 can perform tasks and carry out workflows within CRM system 100 .
- workflow/task component 118 illustratively allows user 108 to add contact information to CRM system 100 , to track opportunities within system 100 , to convert quotes to orders, or to input various other types of information or perform other tasks or workflows.
- CRM components 120 illustratively provide the functionality for other things that can be done in CRM system 100 . There are a wide variety of other things that users can do within CRM system 100 , and these various functions are provided by other components 120 .
- CRM system 100 has access to CRM data store 102 .
- CRM data store 102 illustratively stores a variety of different business data records. While data store 102 is shown as a single data store, it can be multiple different data stores. It can be local to system 100 or remote therefrom. Where it includes multiple different data stores, they can all be local to or remote from system 100 , or some can be local while others are remote.
- the data records can include, by way of example only, proposals 124 , opportunities 126 , quotes 128 , customer data records 130 , orders 132 , product/service information 134 , vendor records 136 , contacts 138 , workflows 140 , and other business data records 142 .
- Each of the business data records may be an object or entity, or another type of record.
- the records can include links to other records, or stand by themselves. All of these types of structures, and others are contemplated herein.
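The record structures described above (records that stand alone or link to other records) can be sketched as follows. The field names and the flat dictionary store are assumptions made for illustration only.

```python
# Illustrative-only sketch of business data records that either stand
# alone or carry links to other records, as described above.
from dataclasses import dataclass, field

@dataclass
class Record:
    record_type: str                          # e.g. "contact", "opportunity"
    data: dict
    links: list = field(default_factory=list)  # ids of related records

store = {}

def add_record(record_id, record_type, data, links=()):
    store[record_id] = Record(record_type, data, list(links))
    return store[record_id]

add_record("c1", "contact", {"name": "Phil B."})
add_record("o1", "opportunity", {"title": "ACME sale"}, links=["c1"])

# Resolve the contacts linked from an opportunity record:
linked_contacts = [store[i] for i in store["o1"].links
                   if store[i].record_type == "contact"]
```

The link resolution step shows how a business data record such as an opportunity could surface its associated contacts, which is the basis for the contact manipulation flows described below.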
- Proposals 124 illustratively include business information for a proposal that can be made to a customer.
- Opportunities 126 illustratively include a wide variety of different types of information (some of which is described below with respect to FIGS. 3A-3F ) that enable user 108 to track a sales opportunity within CRM system 100 .
- Quotes 128 illustratively include information defining quotes that can be provided to customers.
- Customers 130 include customer information, such as contact information, address, billing information, etc. for different customers.
- Orders 132 illustratively include order information that reflects orders that have actually been made by various customers.
- Product/service information 134 illustratively includes information that describes products or services in CRM system 100 .
- Vendors 136 illustratively include information describing vendors that are used by the organization in which CRM system 100 is deployed.
- Contacts 138 illustratively include contact information for various people that are either users of CRM system 100 , or that are related to any of the other business data records in CRM data store 102 (for instance they can be contacts at vendors, customers, other users, etc.).
- Workflows 140 illustratively define the various workflows that user 108 can perform within CRM system 100 .
- the workflows can take a wide variety of different forms. For instance, they may simply be data entry workflows, workflows posting information to a ledger, workflows fleshing out proposals or quotes, or a wide variety of other things.
- CRM system 100 accesses workflows 140 in order to generate the user interface displays 106 that can be manipulated by user 108 , in order to perform the different workflows.
- User device 104 illustratively includes user interface component 122 , client CRM system 144 , and processor 146 .
- Client CRM system 144 is illustratively used by user device 104 in order to access CRM system 100 .
- client CRM system 144 can be a stand-alone system as well, in which case it has access to CRM data store 102 , or a different CRM data store. As described herein, however, it is simply used in order to access CRM system 100 . This is but one option.
- User interface component 122 illustratively generates the user interface displays 106 on user device 104 .
- device 104 has a touch sensitive user interface display screen. Therefore, user interface component 122 illustratively generates the displays for display on the user interface display screen.
- the displays 106 have user input mechanisms 107 that can be actuated using touch gestures by user 108 .
- Processor 146 is illustratively a computer processor with associated memory and timing circuitry (not separately shown). Processor 146 is illustratively a functional part of device 104 and is activated by, and facilitates the functionality of, the other systems, components and items in device 104 . While processor 146 is shown as a single processor, it could be multiple processors as well.
- user interface displays 106 are illustratively user interface displays that are provided for interaction by user 108 .
- User input mechanisms 107 can be a wide variety of different types of user input mechanisms. For instance, they can be buttons, icons, text boxes, dropdown menus, soft keyboards or virtual keyboards or keypads, links, check boxes, active tiles that function as a link to underlying information and that actively or dynamically show information about the underlying information or a wide variety of other user input mechanisms that can be actuated using touch gestures.
- FIG. 2A is a flow diagram illustrating one embodiment of the operation of the architecture shown in FIG. 1 in manipulating contacts using touch gestures within CRM system 100 .
- User 108 first illustratively provides an input indicating that he or she wishes to access CRM system 100 .
- This can launch client CRM system 144 which provides access to CRM system 100 , or it can launch CRM system 100 and provide direct or indirect access.
- CRM system 100 uses user interface component 112 to generate a user interface display 106 that displays a wall or other CRM display.
- the CRM display illustratively includes user input mechanisms 107 that allow user 108 to manipulate them and thus control and manipulate CRM system 100 .
- FIG. 3A is one illustrative example of a user interface display 200 that shows a wall or a CRM start screen.
- Display 200 is shown on user device 202 which is illustratively a tablet computer.
- Tablet computer 202 illustratively includes touch sensitive display screen 204 .
- device 202 could be any other type of device that has a touch sensitive display screen.
- Start screen (or wall) 200 is shown with a plurality of tiles, or icons 206 .
- the icons are divided generally into two different sections.
- the first section is a personal section 208
- the second section is a business section 210 .
- These sections are exemplary only and may or may not be used.
- the tiles in section 208 are illustratively user actuatable links which, when actuated by a user, cause a corresponding function to happen. For example, when either of a pair of browser tiles 210 or 212 is actuated by the user, it launches a browser.
- When store tile 214 is actuated by the user, it launches an on-line store application or portal.
- Other tiles are shown for navigating to the control panel, for viewing weather, for viewing stock information of identified companies, or that indicate popular browsing sessions.
- the tiles shown in the personal section 208 are exemplary only and a wide variety of other tiles could be shown as well.
- the business section 210 of start display 200 also includes a plurality of tiles which, when actuated by the user, cause the CRM system to take action.
- contact tile 216 , when actuated by the user, opens a contact menu for the user.
- Opportunities tile 218 , when actuated by the user, opens opportunity records or an opportunities menu that allows the user to navigate to individual opportunity records.
- the “my leads” tile 220 , when actuated by the user, causes the CRM system 100 to open a menu or records corresponding to leads for the given user.
- a news tile 222 provides news about one or more items that have taken place in CRM system 100 , and that are of interest to the user. In the example shown in FIG. 3A , tile 222 shows that an opportunity for the ACME Company has been closed by another sales person.
- the CRM system 100 navigates the user to additional information about that closed opportunity. For instance, it may navigate the user to the opportunity record or to the sales record, or simply to the ACME Company general record.
- the other tiles, when actuated by the user, cause the CRM system to navigate the user to other places of interest or to launch other components of the CRM system. Those displayed are shown for the sake of example only.
- CRM system 100 receives a user touch gesture to manipulate a contact in CRM system 100 .
- the user can simply touch contacts tile 216 .
- This causes CRM system 100 to display a contact menu that allows the user to take a variety of other actions, such as to open a contact 155 , edit a contact 157 , add or delete contacts 159 , initiate communication with one or more contacts 161 , schedule a meeting with a contact 163 , touch a search button to begin a search 191 for a contact, or perform other contact manipulation steps 165 .
- CRM system 100 manipulates the contact based on the touch gestures. This is indicated by block 167 in FIG. 2A .
- the user can manipulate contacts in other ways as well. For instance, instead of actuating contact tile 216 , or one of the specific contacts represented by the photos or images on tile 216 , the user may open up other business data records in CRM system 100 . Many of those business data records may have individual people, or contacts, associated with them. Therefore, user 108 can manipulate contacts from within those business data records as well.
- FIG. 2B is a flow diagram illustrating one embodiment of this type of contact manipulation.
- the first two blocks in FIG. 2B are similar to the first two blocks shown in FIG. 2A , and they are similarly numbered. Therefore, at block 150 , the user launches the CRM system and at block 152 the CRM system displays a start display or wall or other CRM user interface display.
- FIG. 3B shows one example of this.
- the user has illustratively actuated tile 218 , such as by touching it.
- CRM system 100 displays an opportunities tile 224 .
- Opportunities tile 224 is illustratively indicative of a new opportunity that has been created.
- the user then actuates tile 224 , using a touch gesture (e.g., by touching it) with his or her finger 226 .
- This causes CRM system 100 to open another user interface display, such as user interface display 228 shown in FIG. 3C , corresponding to the newly created opportunity.
- Receiving the user input to open the CRM record is indicated by block 154 in FIG. 2B
- having the CRM system 100 display the record is indicated by block 156 .
- FIG. 3C shows that the business record display 228 displays tiles or links (or other icons or user-actuatable items) that allow the user to view a variety of different kinds of information.
- display 228 includes a “people” or “contact” tile 230 .
- Tile 230 identifies people either at the organization for which the opportunity has been generated, or at the organization that employs the CRM system, that are somehow related to the opportunity.
- the opportunity tile 230 may link user 108 to other people in the company that employs the CRM system, who are working on converting the opportunity into an actual sale.
- tile 230 , when actuated by the user, may navigate the user to contact information for individuals at the company for which the opportunity was developed.
- the CRM system 100 illustratively navigates user 108 to either a contact menu or a specific contact and allows the user to manipulate the contact in a similar way as described above with respect to FIG. 2A .
- the user can open a contact, delete or edit it, initiate communication, etc.
- FIG. 3C also shows examples of other information that can be shown in a business data record.
- user interface display 228 includes a wide variety of actuatable items that take the user to other information corresponding to the opportunity.
- Invoices tile 232 , when actuated by the user, navigates the user to another display where the user can view information related to invoices that correspond to this opportunity.
- Quotes tile 234 , when actuated by the user, navigates the user to additional information about quotes generated for this company or somehow related to this opportunity.
- Document tile 236 illustratively navigates the user to other related documents corresponding to this opportunity, and activity tile 238 shows, in general, the amount of activity related to this opportunity.
- CRM system 100 can navigate the user to additional displays showing the specific activity represented by the tile 238 .
- User interface display 228 also illustratively includes a “What's new” section 240 .
- What's new section 240 can display posts by user 108 , or other users of the CRM system, that are related to the opportunity being displayed.
- display 228 is illustratively pannable in the directions indicated by arrow 242 .
- display 228 illustratively pans to the left or to the right based on the touch gesture.
- User interface display 228 also illustratively includes an information section 244 that displays a primary contact tile 246 corresponding to a primary contact for this opportunity.
- a plurality of additional tiles 248 are displayed below the primary contact tile 246 , and provide information corresponding to the individual represented by primary contact tile 246 .
- the tiles 248 for instance, provide a preferred contact method for the primary contact, an amount of revenue generated by the primary contact, an indicator of the availability of the primary contact, a reputation or rating for the primary contact, a date when the opportunity corresponding to the primary contact closes, and a credit limit for the primary contact.
- all of the tiles 248 are exemplary only, and additional or different information corresponding to the primary contact, or other information, can be displayed as well.
- CRM system 100 would illustratively provide a user input mechanism that allows user 108 to navigate to contact information corresponding to the displayed business data record. Determining whether contact information is displayed on the business data record represented by user interface display 228 is indicated by block 158 in FIG. 2B . If not, receiving the user touch gesture to show contact information is indicated by block 160 .
- both the contact tile 230 and the primary contact tile 246 are shown in user interface display 228 . Therefore, the user need not provide an additional touch gesture to see contact information.
- FIG. 3C also shows that the user is using his or her finger 226 to actuate tile 246 .
- user 108 is selecting primary contact 246 , by actuating the corresponding tile.
- Receiving a touch gesture selecting a contact is indicated by block 162 in FIG. 2B .
- Actuation of tile 246 causes CRM system 100 to generate another user interface display that allows the user to manipulate the contact information. As described above with respect to FIG. 2A , this can take a wide variety of different forms. However, in the embodiment discussed with respect to FIG. 2B , actuating primary contact tile 246 causes CRM system 100 to generate a display, such as user interface display 250 , shown in FIG. 3D . It can be seen that a number of the items in user interface display 250 are the same as those shown in user interface display 228 in FIG. 3C , and they are similarly numbered. However, FIG. 3D also shows that, because the user actuated tile 246 , CRM system 100 displays communication bar 252 .
- Communication bar 252 displays the specific contact options for the selected contact, who was selected when the user actuated tile 246 .
- Communication bar 252 itself illustratively includes a plurality of user actuatable items, each of which represents a method for contacting the primary contact represented by tile 246 .
- contact bar 252 includes phone button 166 , email button 168 , instant messenger button 170 and other button 172 . Displaying the specific contact options for the selected contact is indicated by block 164 in FIG. 2B .
- FIG. 3D shows that the user 108 has used his or her finger 226 to actuate the phone button 166 . In the embodiment shown, the user simply touches button 166 to actuate it. Receiving the user touch gesture selecting a contact option is indicated by block 174 in FIG. 2B .
- communication component 114 in CRM system 100 illustratively initiates a phone call to the primary contact “Phil B.” represented by tile 246 and generates a suitable user interface display indicating that the call has been initiated.
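The dispatch from a selected contact option to an initiation routine, as just described for the phone button, can be sketched as follows. The handler names and contact fields are assumptions for illustration; they are not the patent's implementation of communication component 114.

```python
# A minimal sketch (names assumed) of a communication component that
# dispatches the contact option selected on the communication bar to a
# matching initiation routine.

def initiate_communication(option, contact):
    handlers = {
        "phone": lambda c: f"calling {c['name']} at {c['phone']}",
        "email": lambda c: f"emailing {c['email']}",
        "im":    lambda c: f"messaging {c['name']}",
    }
    if option not in handlers:
        raise ValueError(f"unsupported contact option: {option}")
    return handlers[option](contact)

contact = {"name": "Phil B.", "phone": "555-0100",
           "email": "phil@example.com"}
result = initiate_communication("phone", contact)
```

Each entry in the table corresponds to one button on the communication bar (phone, email, instant messenger), so actuating a button reduces to a single lookup and call.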
- FIG. 3E shows one exemplary user interface display 300 that illustrates this. It can be seen in display 300 that a phone call is underway to Phil B. This is indicated generally at 302 . Display 300 shows the identity of the person being called, an indication that it is a phone call, and the elapsed time of the call. Of course, this information is exemplary only and a wide variety of additional or different information could be shown as well. In any case, user interface display 300 illustrates that a call has been placed.
- a number of other exemplary things are shown in display 300 .
- a list of objectives to be accomplished is shown generally at 306 .
- a status bar 304 shows how many of the objectives for the phone call have been completed.
- the objectives listed are “product requirements”, “key decision makers”, “budget”, and “notes”. In one embodiment, these are the agenda items for the phone call. Of course, they may be simply “to do” items or a variety of other listed items as well.
- FIG. 3E also shows that a soft keyboard is displayed generally at 308 . This allows user 108 to type information into the text boxes at 306 , or to otherwise enter alphanumeric information, using touch.
- the communication can proceed until one of the parties stops the communication. This can be done, in one embodiment, by user 108 simply touching an appropriate button on the user interface display.
- FIG. 3F shows one illustrative way of doing this.
- FIG. 3F shows user interface display 310 , which is similar to user interface display 300 shown in FIG. 3E , and similar items are similarly numbered. However, it can be seen in FIG. 3F that the parties to the call have accomplished two of the agenda items, and therefore status bar 304 shows that two out of four items have been completed.
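The call-objective tracking shown in FIGS. 3E and 3F can be sketched as a checklist plus a status summary. This is a hypothetical illustration; the data layout and summary format are assumptions.

```python
# Hypothetical sketch of call-objective tracking: a checklist of agenda
# items and a summary like the status bar in FIGS. 3E-3F.

objectives = {
    "product requirements": True,   # completed during the call
    "key decision makers": True,    # completed during the call
    "budget": False,
    "notes": False,
}

def status_summary(items):
    """Return a summary such as '2 of 4 completed'."""
    done = sum(1 for complete in items.values() if complete)
    return f"{done} of {len(items)} completed"

summary = status_summary(objectives)
```

With two of the four agenda items marked complete, the summary matches the state the figure describes for status bar 304.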
- Display 310 also shows that the user has touched a “hang up” button 312 . Hang up button 312 allows user 108 to terminate the call, simply by actuating button 312 .
- Receiving a user touch gesture to end the communication is indicated by block 178 in FIG. 2B .
- communication component 114 of CRM system 100 hangs up the call, or disconnects the call, or otherwise discontinues the telephone communication. This is indicated by block 180 in FIG. 2B .
- a user can quickly and easily manipulate contact information within a CRM system, or other business data system.
- When contact information is displayed, the user can use a touch gesture to manipulate it. This can make manipulation of contact information much easier and less cumbersome.
- touch gestures mentioned herein can take a wide variety of different forms. They can be simple touches or taps, swipes, slides, multi-touch inputs, positional gestures (gestures at a specific position or location on the screen), brushing, multi-finger gestures, touch and hold gestures, etc.
- the speed of the gestures can be used for control as well (e.g., a quick swipe can pan quickly while a slow swipe pans slowly, etc.).
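Speed-dependent control, as described above for panning, can be sketched with a simple velocity-scaled offset. The function, units, and gain value are assumptions made for the example.

```python
# Illustrative sketch of speed-dependent panning: a quick swipe pans
# farther per unit time than a slow one. The gain is an assumed tuning
# constant, not a value from the patent.

def pan_offset(velocity_px_per_s, duration_s, gain=1.0):
    """Distance (in pixels) to pan for a swipe of a given speed."""
    return velocity_px_per_s * duration_s * gain

fast = pan_offset(1200.0, 0.25)   # quick swipe
slow = pan_offset(200.0, 0.25)    # slow swipe over the same duration
```

Because the offset scales linearly with velocity, the same duration of contact produces a larger pan for a quick swipe than for a slow one, which is the behavior the passage describes.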
- FIG. 4 is a block diagram of system 100 , shown in FIG. 1 , except that it is disposed in a cloud computing architecture 500 .
- Cloud computing provides computation, software, data access, and storage services that do not require end-user knowledge of the physical location or configuration of the system that delivers the services.
- cloud computing delivers the services over a wide area network, such as the internet, using appropriate protocols.
- cloud computing providers deliver applications over a wide area network and they can be accessed through a web browser or any other computing component.
- Software or components of system 100 as well as the corresponding data can be stored on servers at a remote location.
- the computing resources in a cloud computing environment can be consolidated at a remote data center location or they can be dispersed.
- Cloud computing infrastructures can deliver services through shared data centers, even though they appear as a single point of access for the user.
- the components and functions described herein can be provided from a service provider at a remote location using a cloud computing architecture.
- they can be provided from a conventional server, or they can be installed on client devices directly, or in other ways.
- Cloud computing, both public and private, provides substantially seamless pooling of resources, as well as a reduced need to manage and configure underlying hardware infrastructure.
- a public cloud is managed by a vendor and typically supports multiple consumers using the same infrastructure. Also, a public cloud, as opposed to a private cloud, can free up the end users from managing the hardware.
- a private cloud may be managed by the organization itself and the infrastructure is typically not shared with other organizations. The organization still maintains the hardware to some extent, such as installations and repairs, etc.
- FIG. 4 specifically shows that CRM system 100 (or, of course, another business data system such as an ERP system, LOB application, etc.) is located in cloud 502 (which can be public, private, or a combination where portions are public while others are private). Therefore, user 108 uses a user device 104 to access those systems through cloud 502 .
- FIG. 4 also depicts another embodiment of a cloud architecture.
- FIG. 4 shows that it is also contemplated that some elements of business system 100 (or architecture 90 ) are disposed in cloud 502 while others are not.
- data store 102 can be disposed inside of cloud 502 (with CRM system 100 ) or outside of cloud 502 , and accessed through cloud 502 .
- communication component 114 is also outside of cloud 502 . Regardless of where they are located, they can be accessed directly by device 104 , through a network (either a wide area network or a local area network), they can be hosted at a remote site by a service, or they can be provided as a service through a cloud or accessed by a connection service that resides in the cloud. All of these architectures are contemplated herein.
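The access options just described — a component reached directly, through a network, or through a cloud connection service — can be sketched as a small resolver. The class, field, and return strings below are illustrative assumptions, not part of the described architecture.

```python
# Hedged sketch of resolving how a client device reaches a system component:
# hosted in the cloud, outside the cloud but reached through it, or accessed
# directly over a network. All names here are assumptions for illustration.
from dataclasses import dataclass


@dataclass
class ComponentLocation:
    name: str
    in_cloud: bool   # component is hosted inside the cloud
    via_cloud: bool  # component is outside the cloud but accessed through it


def access_path(loc: ComponentLocation) -> str:
    """Return a description of how a client device reaches the component."""
    if loc.in_cloud:
        return f"{loc.name}: hosted in cloud, accessed through cloud"
    if loc.via_cloud:
        return f"{loc.name}: outside cloud, accessed through cloud connection service"
    return f"{loc.name}: accessed directly over a local or wide area network"
```

Any of the three paths can apply to any component, which is what the paragraph above means by "all of these architectures are contemplated."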
- system 100 can be disposed on a wide variety of different devices. Some of those devices include servers, desktop computers, laptop computers, tablet computers, or other mobile devices, such as palm top computers, cell phones, smart phones, multimedia players, personal digital assistants, etc.
- FIG. 5 is a simplified block diagram of one illustrative embodiment of a handheld or mobile computing device that can be used as a user's or client's hand held device 16 (e.g., device 104 ), in which the present system (or parts of it) can be deployed.
- FIGS. 6-9 are examples of handheld or mobile devices.
- FIG. 5 provides a general block diagram of the components of a client device 16 that can run components of system 100 or that interacts with system 100 , or both.
- a communications link 13 is provided that allows the handheld device to communicate with other computing devices and under some embodiments provides a channel for receiving information automatically, such as by scanning.
- Examples of communications link 13 include an infrared port, a serial/USB port, a cable network port such as an Ethernet port, and a wireless network port allowing communication through one or more communication protocols, including General Packet Radio Service (GPRS), LTE, HSPA, HSPA+ and other 3G and 4G radio protocols, 1Xrtt, and Short Message Service, which are wireless services used to provide cellular access to a network, as well as 802.11 and 802.11b (Wi-Fi) protocols and the Bluetooth protocol, which provide local wireless connections to networks.
- Under other embodiments, applications or systems are received on a removable Secure Digital (SD) card that is connected to an SD card interface 15.
- SD card interface 15 and communication links 13 communicate with a processor 17 (which can also embody processor 146 from FIG. 1 ) along a bus 19 that is also connected to memory 21 and input/output (I/O) components 23 , as well as clock 25 and location system 27 .
- I/O components 23 are provided to facilitate input and output operations.
- I/O components 23, for various embodiments of the device 16, can include input components such as buttons, touch sensors, multi-touch sensors, optical or video sensors, voice sensors, touch screens, proximity sensors, microphones, tilt sensors, and gravity switches, and output components such as a display device, a speaker, and/or a printer port.
- Other I/O components 23 can be used as well.
- Clock 25 illustratively comprises a real time clock component that outputs a time and date. It can also, illustratively, provide timing functions for processor 17 .
- Location system 27 illustratively includes a component that outputs a current geographical location of device 16 .
- This can include, for instance, a global positioning system (GPS) receiver, a LORAN system, a dead reckoning system, a cellular triangulation system, or other positioning system. It can also include, for example, mapping software or navigation software that generates desired maps, navigation routes and other geographic functions.
- Memory 21 stores operating system 29 , network settings 31 , applications 33 , application configuration settings 35 , data store 37 , communication drivers 39 , and communication configuration settings 41 .
- Memory 21 can include all types of tangible volatile and non-volatile computer-readable memory devices. It can also include computer storage media (described below).
- Memory 21 stores computer readable instructions that, when executed by processor 17 , cause the processor to perform computer-implemented steps or functions according to the instructions.
- System 100 or the items in data store 102, for example, can reside in memory 21.
- device 16 can have a client business system 24 (e.g., client CRM system 144 ) which can run various business applications or embody parts or all of business system 100 .
- Processor 17 can be activated by other components to facilitate their functionality as well.
- Examples of the network settings 31 include things such as proxy information, Internet connection information, and mappings.
- Application configuration settings 35 include settings that tailor the application for a specific enterprise or user.
- Communication configuration settings 41 provide parameters for communicating with other computers and include items such as GPRS parameters, SMS parameters, connection user names and passwords.
- Applications 33 can be applications that have previously been stored on the device 16 or applications that are installed during use, although these can be part of operating system 29 , or hosted external to device 16 , as well.
- FIG. 6 shows one embodiment in which device 16 is a tablet computer 600 (also shown in FIGS. 3A-3F ).
- Screen 602 can be a touch screen (so touch gestures from a user's finger can be used to interact with the application) or a pen-enabled interface that receives inputs from a pen or stylus. It can also use an on-screen virtual keyboard. Of course, it might also be attached to a keyboard or other user input device through a suitable attachment mechanism, such as a wireless link or USB port, for instance.
- Computer 600 can also illustratively receive voice inputs as well.
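A touch sensitive screen such as screen 602 ultimately reports raw pointer events that must be classified into gestures before the application can act on them. The following is a minimal sketch of such classification; the thresholds, gesture names, and function signature are illustrative assumptions, not taken from this disclosure.

```python
# Hedged sketch: classify a raw touch event (press duration plus total finger
# movement) into a simple gesture name. Thresholds are assumed for illustration.
def classify_gesture(duration_ms: float, dx: float, dy: float,
                     move_threshold: float = 10.0,
                     long_press_ms: float = 500.0) -> str:
    distance = (dx ** 2 + dy ** 2) ** 0.5
    if distance >= move_threshold:
        # dominant axis of movement decides the swipe direction
        if abs(dx) >= abs(dy):
            return "swipe-right" if dx > 0 else "swipe-left"
        return "swipe-down" if dy > 0 else "swipe-up"
    return "long-press" if duration_ms >= long_press_ms else "tap"
```

A pen or stylus input stream could be classified the same way, with different thresholds for the finer-grained pointer.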
- FIGS. 7 , 8 and 9 provide additional examples of devices 16 that can be used, although others can be used as well.
- a mobile phone 45 (or feature phone) is provided as the device 16 .
- Phone 45 includes a set of keypads 47 for dialing phone numbers, a display 49 capable of displaying images including application images, icons, web pages, photographs, and video, and control buttons 51 for selecting items shown on the display.
- the phone includes an antenna 53 for receiving cellular phone signals such as General Packet Radio Service (GPRS) and 1Xrtt, and Short Message Service (SMS) signals.
- phone 45 also includes a Secure Digital (SD) card slot 55 that accepts a SD card 57 .
- the mobile device of FIG. 8 is a personal digital assistant (PDA) 59 or a multimedia player or a tablet computing device, etc. (hereinafter referred to as PDA 59 ).
- PDA 59 includes an inductive screen 61 that senses the position of a stylus 63 (or other pointers, such as a user's finger) when the stylus is positioned over the screen. This allows the user to select, highlight, and move items on the screen as well as draw and write.
- PDA 59 also includes a number of user input keys or buttons (such as button 65 ) which allow the user to scroll through menu options or other display options which are displayed on display 61 , and allow the user to change applications or select user input functions, without contacting display 61 .
- PDA 59 can include an internal antenna and an infrared transmitter/receiver that allow for wireless communication with other computers as well as connection ports that allow for hardware connections to other computing devices. Such hardware connections are typically made through a cradle that connects to the other computer through a serial or USB port. As such, these connections are non-network connections.
- mobile device 59 also includes a SD card slot 67 that accepts a SD card 69 .
- FIG. 9 is similar to FIG. 8 except that the phone is a smart phone 71 .
- Smart phone 71 has a touch sensitive display 73 that displays icons or tiles or other user input mechanisms 75 .
- Mechanisms 75 can be used by a user to access a business data system (like CRM system 100), run applications, make calls, perform data transfer operations, etc.
- smart phone 71 is built on a mobile operating system and offers more advanced computing capability and connectivity than a feature phone.
- FIG. 10 is one embodiment of a computing environment in which system 100 (for example) can be deployed.
- an exemplary system for implementing some embodiments includes a general-purpose computing device in the form of a computer 810 .
- Components of computer 810 may include, but are not limited to, a processing unit 820 (which can comprise processor 110 ), a system memory 830 , and a system bus 821 that couples various system components including the system memory to the processing unit 820 .
- the system bus 821 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
- such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus also known as Mezzanine bus.
- Computer 810 typically includes a variety of computer readable media.
- Computer readable media can be any available media that can be accessed by computer 810 and includes both volatile and nonvolatile media, removable and non-removable media.
- Computer readable media may comprise computer storage media and communication media.
- Computer storage media is different from, and does not include, a modulated data signal or carrier wave. It includes hardware storage media including both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data.
- Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 810 .
- Communication media typically embodies computer readable instructions, data structures, program modules or other data in a transport mechanism and includes any information delivery media.
- modulated data signal means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
- communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
- the system memory 830 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 831 and random access memory (RAM) 832 .
- A basic input/output system 833 (BIOS), containing the basic routines that help to transfer information between elements within computer 810, such as during start-up, is typically stored in ROM 831.
- RAM 832 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 820 .
- FIG. 10 illustrates operating system 834 , application programs 835 , other program modules 836 , and program data 837 .
- the computer 810 may also include other removable/non-removable volatile/nonvolatile computer storage media.
- FIG. 10 illustrates a hard disk drive 841 that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive 851 that reads from or writes to a removable, nonvolatile magnetic disk 852 , and an optical disk drive 855 that reads from or writes to a removable, nonvolatile optical disk 856 such as a CD ROM or other optical media.
- removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like.
- the hard disk drive 841 is typically connected to the system bus 821 through a non-removable memory interface such as interface 840 .
- magnetic disk drive 851 and optical disk drive 855 are typically connected to the system bus 821 by a removable memory interface, such as interface 850 .
- the drives and their associated computer storage media discussed above and illustrated in FIG. 10 provide storage of computer readable instructions, data structures, program modules and other data for the computer 810 .
- hard disk drive 841 is illustrated as storing operating system 844 , application programs 845 , other program modules 846 , and program data 847 .
- operating system 844 , application programs 845 , other program modules 846 , and program data 847 are given different numbers here to illustrate that, at a minimum, they are different copies.
- a user may enter commands and information into the computer 810 through input devices such as a keyboard 862 , a microphone 863 , and a pointing device 861 , such as a mouse, trackball or touch pad.
- Other input devices may include a joystick, game pad, satellite dish, scanner, or the like.
- These and other input devices are often connected to the processing unit 820 through a user input interface 860 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB).
- a visual display 891 or other type of display device is also connected to the system bus 821 via an interface, such as a video interface 890 .
- computers may also include other peripheral output devices such as speakers 897 and printer 896 , which may be connected through an output peripheral interface 895 .
- the computer 810 is operated in a networked environment using logical connections to one or more remote computers, such as a remote computer 880 .
- the remote computer 880 may be a personal computer, a hand-held device, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 810 .
- the logical connections depicted in FIG. 10 include a local area network (LAN) 871 and a wide area network (WAN) 873 , but may also include other networks.
- Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet.
- When used in a LAN networking environment, the computer 810 is connected to the LAN 871 through a network interface or adapter 870 .
- When used in a WAN networking environment, the computer 810 typically includes a modem 872 or other means for establishing communications over the WAN 873 , such as the Internet.
- the modem 872 which may be internal or external, may be connected to the system bus 821 via the user input interface 860 , or other appropriate mechanism.
- program modules depicted relative to the computer 810 may be stored in the remote memory storage device.
- FIG. 10 illustrates remote application programs 885 as residing on remote computer 880 . It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.
Abstract
A business data system generates a user interface display showing a business data record. The business data system receives a touch gesture user input manipulating a contact within the business data system. The business data system manipulates the contact based on the touch gesture user input.
Description
- There are a wide variety of different types of business data systems. Some such systems include customer relations management (CRM) systems, enterprise resource planning (ERP) systems, line-of-business (LOB) applications, and other business systems. These types of systems often enable the creation and maintenance of business data records. Some of these records include customer records that have details about customers, vendor records that include details of vendors, sales records, sales proposals, quotes, order records, records that contain product or service information, and records related to business contacts, among many others. The system can also include workflows that enable users to perform various tasks using the business data system.
- An example of a workflow provided in some business data systems is one that allows users or organizations to track various business opportunities. For instance, if there is an opportunity to make a sale of products or services to another organization, the business data system allows users to enter information that may be helpful in converting that opportunity into an actual sale. Similarly, some such systems allow many other types of tasks or workflows to be performed as well. For instance, some systems allow users to prepare a quote for a potential customer. Then, when the customer accepts the terms of the quote, the user can convert the quote into an actual order. These are merely two examples of a wide variety of different types of tasks and workflows that can be performed within a business data system.
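The quote-to-order conversion described above can be sketched roughly as follows. The record shapes, field names, and the "ORD-" identifier scheme are assumptions made for illustration, not the disclosure's data model.

```python
# Hedged sketch of a quote-to-order workflow: once a customer accepts a quote,
# it is converted into an order that records its origin. Names are assumed.
from dataclasses import dataclass


@dataclass
class Quote:
    quote_id: str
    customer: str
    lines: list
    accepted: bool = False


@dataclass
class Order:
    order_id: str
    customer: str
    lines: list
    source_quote: str  # link back to the quote this order came from


def convert_quote_to_order(quote: Quote) -> Order:
    """Convert an accepted quote into an actual order."""
    if not quote.accepted:
        raise ValueError("quote must be accepted before conversion")
    return Order(order_id=f"ORD-{quote.quote_id}",
                 customer=quote.customer,
                 lines=list(quote.lines),
                 source_quote=quote.quote_id)
```

The link from the order back to its source quote mirrors how business data records in such systems can include links to other records.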
- In performing these types of tasks and workflows, some users may wish to contact other people associated with the business data records being operated on. For instance, where a customer has a primary contact, it may be that the user wishes to call or otherwise communicate with that person in order to discuss the terms of a proposal or order. Therefore, some business data systems allow a user to search for contacts, and communicate with a given contact.
- The use of mobile devices is also increasing rapidly. For instance, some mobile devices include smart phones, cellular telephones, and tablet computers, to name a few. These types of devices often have different types of user input mechanisms than desktop computers. For example, a desktop computer may have user interface displays with user input mechanisms that can be actuated by a point and click device (such as a mouse or track ball) or a hardware keyboard. However, mobile devices often have touch sensitive screens. This enables a user to actuate user input mechanisms using touch gestures, such as by using a finger, a stylus, or other device.
- The discussion above is merely provided for general background information and is not intended to be used as an aid in determining the scope of the claimed subject matter.
- A business data system generates a user interface display showing a business data record. The business data system receives a touch gesture user input to manipulate a contact within the business data system. The business data system manipulates the contact based on the touch gesture user input.
- This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter. The claimed subject matter is not limited to implementations that solve any or all disadvantages noted in the background.
- FIG. 1 is a block diagram of one illustrative business data environment.
- FIG. 2A is a flow diagram of one embodiment of the operation of the system shown in FIG. 1 in manipulating contact information based on a touch gesture.
- FIG. 2B is a flow diagram illustrating one embodiment of the operation of the system shown in FIG. 1 in manipulating a contact, within a business record, using touch gestures.
- FIGS. 3A-3F show exemplary user interface displays.
- FIG. 4 shows one embodiment of the system shown in FIG. 1 in different architectures.
- FIGS. 5-9 illustrate various mobile devices.
- FIG. 10 is a block diagram of one illustrative computing environment. -
FIG. 1 shows one illustrative embodiment of a business data architecture 90. Business data architecture 90 includes CRM system 100, CRM data store 102 and user device 104. User device 104 is shown generating user interface displays 106 for interaction by user 108. While CRM system 100 can be any business data system (such as a CRM system, an ERP system, an LOB system, or another business data application or business data system), it is described herein as a CRM system, for the sake of example only. CRM system 100 illustratively includes processor 110, user interface component 112, communication component 114, workflow/task component 118 and other CRM components 120. -
Processor 110 is illustratively a computer processor with associated memory and timing circuitry (not separately shown). It is a functional part of CRM system 100 and is activated by, and facilitates the functionality of, the other components and items in CRM system 100. It will also be noted that while only a single processor 110 is shown, processor 110 can actually be multiple different computer processors as well. In addition, the multiple different computer processors used by system 100 can be local to system 100, or remote from system 100 but accessible by system 100. -
User interface component 112 illustratively generates user interface displays with user input mechanisms that can be actuated by user 108. The user interface displays 106 (that user 108 interacts with) can be generated by user interface component 112 in CRM system 100 and passed to device 104, where they can be displayed (by device 104, as user interface displays 106) for interaction by user 108. -
Communication component 114 illustratively facilitates communication among various users of CRM system 100, or between users of CRM system 100 and other individuals who may not necessarily be users of system 100. For instance, if user 108 wishes to communicate with a contact who may not necessarily have access to CRM system 100 (such as by initiating a phone call, an instant message, etc.), communication component 114 illustratively facilitates this type of communication. Therefore, communication component 114 can illustratively facilitate email communication, telephone or cellular telephone communication, instant message communication, chat room communication, or other types of communication. - Workflow/task component 118 illustratively uses user interface component 112 to generate user interface displays 106 so that user 108 can perform tasks and carry out workflows within CRM system 100. For instance, workflow/task component 118 illustratively allows user 108 to add contact information to CRM system 100, to track opportunities within system 100, to convert quotes to orders, or to input various other types of information or perform other tasks or workflows. -
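The channel selection performed by a communication component like component 114 might be sketched as a simple dispatch; the channel names, contact fields, and return strings below are assumptions made for illustration only, not the disclosure's interface.

```python
# Hedged sketch: initiate communication with a contact over a requested
# channel (email, phone, instant message). All names are assumed.
def initiate_communication(channel: str, contact: dict) -> str:
    handlers = {
        "email": lambda c: f"composing email to {c['email']}",
        "phone": lambda c: f"dialing {c['phone']}",
        "im": lambda c: f"opening instant message with {c['name']}",
    }
    if channel not in handlers:
        raise ValueError(f"unsupported channel: {channel}")
    return handlers[channel](contact)
```

A real component would hand the request to a mail client, dialer, or chat application rather than returning a string, but the per-channel dispatch is the point of the sketch.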
Other CRM components 120 illustratively provide the functionality for other things that can be done in CRM system 100. There are a wide variety of other things that users can do within CRM system 100, and these various functions are provided by other components 120. -
CRM system 100 has access to CRM data store 102. CRM data store 102 illustratively stores a variety of different business data records. While data store 102 is shown as a single data store, it can be multiple different data stores. It can be local to system 100 or remote therefrom. Where it includes multiple different data stores, they can all be local to or remote from system 100, or some can be local while others are remote. - The data records can include, by way of example only,
proposals 124, opportunities 126, quotes 128, customer data records 130, orders 132, product/service information 134, vendor records 136, contacts 138, workflows 140, and other business data records 142. Each of the business data records may be an object or entity, or another type of record. The records can include links to other records, or stand by themselves. All of these types of structures, and others, are contemplated herein. -
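One hedged sketch of how such linked records might be modeled follows; all class and field names are illustrative assumptions, not the disclosure's schema.

```python
# Hedged sketch of business data records as linked entities: a contact record
# that stands by itself, and an opportunity record that links to contacts.
from dataclasses import dataclass, field


@dataclass
class Contact:
    name: str
    phone: str = ""
    email: str = ""


@dataclass
class Opportunity:
    title: str
    customer: str
    contacts: list = field(default_factory=list)  # links to Contact records


def contacts_for(record) -> list:
    """Return the names of the contacts linked to a business data record."""
    return [c.name for c in record.contacts]
```

The same linking pattern would apply to proposals, quotes, orders, and vendor records, each of which can reference the people associated with it.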
Proposals 124 illustratively include business information for a proposal that can be made to a customer. Opportunities 126 illustratively include a wide variety of different types of information (some of which is described below with respect to FIGS. 3A-3F) that enable user 108 to track a sales opportunity within CRM system 100. Quotes 128 illustratively include information defining quotes that can be provided to customers. Customers 130 include customer information, such as contact information, address, billing information, etc. for different customers. Orders 132 illustratively include order information that reflects orders that have actually been made by various customers. Product/service information 134 illustratively includes information that describes products or services in CRM system 100. Vendors 136 illustratively include information describing vendors that are used by the organization in which CRM system 100 is deployed. Contacts 138 illustratively include contact information for various people that are either users of CRM system 100, or that are related to any of the other business data records in CRM data store 102 (for instance, they can be contacts at vendors, customers, other users, etc.). Workflows 140 illustratively define the various workflows that user 108 can perform within CRM system 100. - The workflows can take a wide variety of different forms. For instance, they may simply be data entry workflows, workflows posting information to a ledger, workflows fleshing out proposals or quotes, or a wide variety of other things. In any case,
CRM system 100 accesses workflows 140 in order to generate the user interface displays 106 that can be manipulated by user 108, in order to perform the different workflows. -
User device 104 illustratively includes user interface component 122, client CRM system 144, and processor 146. Client CRM system 144 is illustratively used by user device 104 in order to access CRM system 100. Of course, client CRM system 144 can be a stand-alone system as well, in which case it has access to CRM data store 102, or a different CRM data store. As described herein, however, it is simply used in order to access CRM system 100. This is but one option. -
User interface component 122 illustratively generates the user interface displays 106 on user device 104. In the embodiment described herein, device 104 has a touch sensitive user interface display screen. Therefore, user interface component 122 illustratively generates the displays for display on the user interface display screen. The displays 106 have user input mechanisms 107 that can be actuated using touch gestures by user 108. -
Processor 146 is illustratively a computer processor with associated memory and timing circuitry (not separately shown). Processor 146 is illustratively a functional part of device 104 and is activated by, and facilitates the functionality of, the other systems, components and items in device 104. While processor 146 is shown as a single processor, it could be multiple processors as well. - As briefly discussed above, user interface displays 106 are illustratively user interface displays that are provided for interaction by
user 108. User input mechanisms 107 can be a wide variety of different types of user input mechanisms. For instance, they can be buttons, icons, text boxes, dropdown menus, soft keyboards or virtual keyboards or keypads, links, check boxes, active tiles that function as a link to underlying information and that actively or dynamically show information about the underlying information, or a wide variety of other user input mechanisms that can be actuated using touch gestures. -
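The "active tile" mechanism mentioned above — a link to underlying information that also dynamically shows information about it — might be sketched as follows, with all names assumed for illustration rather than taken from the disclosure.

```python
# Hedged sketch of an active tile: it renders a live summary of its underlying
# information, and a touch gesture on it navigates to that information.
class ActiveTile:
    def __init__(self, label, fetch_summary, on_activate):
        self.label = label
        self._fetch_summary = fetch_summary  # callable returning a live summary
        self._on_activate = on_activate      # navigation callback for a tap

    def render(self) -> str:
        # dynamically shows information about the underlying information
        return f"{self.label}: {self._fetch_summary()}"

    def tap(self):
        # a touch gesture on the tile navigates to the underlying information
        return self._on_activate()
```

A news tile like tile 222, for instance, could pass a `fetch_summary` callable that reports recent CRM events and an `on_activate` callback that opens the related record.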
FIG. 2A is a flow diagram illustrating one embodiment of the operation of the architecture shown in FIG. 1 in manipulating contacts using touch gestures within CRM system 100. User 108 first illustratively provides an input indicating that he or she wishes to access CRM system 100. This can launch client CRM system 144, which provides access to CRM system 100, or it can launch CRM system 100 and provide direct or indirect access. In response, CRM system 100 uses user interface component 112 to generate a user interface display 106 that displays a wall or other CRM display. The CRM display illustratively includes user input mechanisms 107 that allow user 108 to manipulate them and thus control and manipulate CRM system 100. -
FIG. 3A is one illustrative example of a user interface display 200 that shows a wall or a CRM start screen. Display 200 is shown on user device 202, which is illustratively a tablet computer. Tablet computer 202 illustratively includes touch sensitive display screen 204. Of course, it will be noted that device 202 could be any other type of device that has a touch sensitive display screen. Start screen (or wall) 200 is shown with a plurality of tiles, or icons 206. - In the embodiment shown in
FIG. 3A, the icons (or tiles) are divided generally into two different sections. The first section is a personal section 208, and the second section is a business section 210. These sections are exemplary only and may or may not be used. The tiles in section 208 are illustratively user actuatable links which, when actuated by a user, cause a corresponding function to happen. For example, when either one of a pair of browser tiles is actuated, a browsing session is launched, and when store tile 214 is actuated by the user, it launches an on-line store application or portal. Other tiles are shown for navigating to the control panel, for viewing weather, for viewing stock information of identified companies, or that indicate popular browsing sessions. Of course, the tiles shown in the personal section 208 are exemplary only and a wide variety of other tiles could be shown as well. - The
business section 210 of start display 200 also includes a plurality of tiles which, when actuated by the user, cause the CRM system to take action. For instance, contact tile 216, when actuated by the user, opens a contact menu for the user. Opportunities tile 218, when actuated by the user, opens opportunity records or an opportunities menu that allows the user to navigate to individual opportunity records. The "my leads" tile 220, when actuated by the user, causes the CRM system 100 to open a menu or records corresponding to leads for the given user. A news tile 222 provides news about one or more items that have taken place in CRM system 100, and that are of interest to the user. In the example shown in FIG. 3A, tile 222 shows that an opportunity for the ACME Company has been closed by another sales person. When the user actuates tile 222, the CRM system 100 navigates the user to additional information about that closed opportunity. For instance, it may navigate the user to the opportunity record or to the sales record, or simply to the ACME Company general record. The other tiles, when actuated by the user, cause the CRM system to navigate the user to other places of interest or to launch other components of the CRM system. Those displayed are shown for the sake of example only. - Once the CRM system is launched and the start screen is displayed,
CRM system 100 then receives a user touch gesture to manipulate a contact in CRM system 100. This is indicated by block 153 in FIG. 2A. By way of example, the user can simply touch contacts tile 216. This causes CRM system 100 to display a contact menu that allows the user to take a variety of other actions, such as to open a contact 155, edit a contact 157, add or delete contacts 159, initiate communication with one or more contacts 161, schedule a meeting with a contact 163, touch a search button to begin a search 191 for a contact, or perform other contact manipulation steps 165. In response, CRM system 100 manipulates the contact based on the touch gestures. This is indicated by block 167 in FIG. 2A. - It should also be noted that the user can manipulate contacts in other ways as well. For instance, instead of actuating
contact tile 216, or one of the specific contacts represented by the photos or images on tile 216, the user may open up other business data records in CRM system 100. Many of those business data records may have individual people, or contacts, associated with them. Therefore, user 108 can manipulate contacts from within those business data records as well. -
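The gesture-to-action flow of FIG. 2A (receive a touch gesture selecting a contact operation, then manipulate the contact accordingly) can be sketched in a few lines. This is a minimal illustrative sketch only; the class and method names are assumptions for illustration and do not come from the patent.

```python
# Illustrative sketch of blocks 153-167 of FIG. 2A: a gesture-selected action
# ("add", "edit", "delete", "open") is dispatched to the matching contact
# operation. All names here are hypothetical, not the patent's API.

class CRMSystem:
    def __init__(self):
        self.contacts = {}

    def manipulate_contact(self, action, name, **fields):
        # Route the action chosen by the user's touch gesture.
        if action == "add":
            self.contacts[name] = dict(fields)
        elif action == "edit":
            self.contacts[name].update(fields)
        elif action == "delete":
            self.contacts.pop(name, None)
        elif action == "open":
            return self.contacts.get(name)
        else:
            raise ValueError(f"unsupported action: {action}")

crm = CRMSystem()
crm.manipulate_contact("add", "Phil B.", phone="555-0100")
crm.manipulate_contact("edit", "Phil B.", email="phil@example.com")
record = crm.manipulate_contact("open", "Phil B.")
```

A real implementation would bind each action to a touch target (a tile or menu item) rather than a string argument; the dispatch structure is the point of the sketch.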
FIG. 2B is a flow diagram illustrating one embodiment of this type of contact manipulation. The first two blocks in FIG. 2B are similar to the first two blocks shown in FIG. 2A, and they are similarly numbered. Therefore, at block 150, the user launches the CRM system and at block 152 the CRM system displays a start display or wall or other CRM user interface display. - In the embodiment shown in
FIG. 2B, the user then provides a touch gesture to open a CRM record. FIG. 3B shows one example of this. In the embodiment shown in FIG. 3B, the user has illustratively actuated tile 218, such as by touching it. In response, CRM system 100 displays an opportunities tile 224. Opportunities tile 224 is illustratively indicative of a new opportunity that has been created. The user then actuates tile 224, using a touch gesture (e.g., by touching it) with his or her finger 226. This causes CRM system 100 to open another user interface display, such as user interface display 228 shown in FIG. 3C, corresponding to the newly created opportunity. Receiving the user input to open the CRM record is indicated by block 154 in FIG. 2B, and having the CRM system 100 display the record is indicated by block 156. -
FIG. 3C shows that the business record display 228 displays tiles or links (or other icons or user-actuatable items) that allow the user to view a variety of different kinds of information. For instance, display 228 includes a “people” or “contact” tile 230. Tile 230 identifies people either at the organization for which the opportunity has been generated, or at the organization that employs the CRM system, that are somehow related to the opportunity. By way of example, the opportunity tile 230 may link user 108 to other people in the company that employs the CRM system, who are working on converting the opportunity into an actual sale. In addition, tile 230, when actuated by the user, may navigate the user to contact information for individuals at the company for which the opportunity was developed. In any case, if the user actuates tile 230, the CRM system 100 illustratively navigates user 108 to either a contact menu or a specific contact and allows the user to manipulate the contact in a similar way as described above with respect to FIG. 2A. For instance, the user can open a contact, delete or edit it, initiate communication, etc. -
FIG. 3C also shows examples of other information that can be shown in a business data record. For instance, user interface display 228 includes a wide variety of actuatable items that take the user to other information corresponding to the opportunity. Invoices tile 232, when actuated by the user, navigates the user to another display where the user can view information related to invoices that correspond to this opportunity. Quotes tile 234, when actuated by the user, navigates the user to additional information about quotes generated for this company or somehow related to this opportunity. Document tile 236 illustratively navigates the user to other related documents corresponding to this opportunity, and activity tile 238 shows, in general, the amount of activity related to this opportunity. When the user actuates tile 238, CRM system 100 can navigate the user to additional displays showing the specific activity represented by the tile 238. -
User interface display 228 also illustratively includes a “What's new” section 240. What's new section 240 can display posts by user 108, or other users of the CRM system, that are related to the opportunity being displayed. - In addition, as shown in
FIG. 3C, display 228 is illustratively pannable in the directions indicated by arrow 242. By way of example, if the user uses his or her finger 226 and makes a swiping motion to the left or to the right, display 228 illustratively pans to the left or to the right based on the touch gesture. -
User interface display 228 also illustratively includes an information section 244 that displays a primary contact tile 246 corresponding to a primary contact for this opportunity. A plurality of additional tiles 248 are displayed below the primary contact tile 246, and provide information corresponding to the individual represented by primary contact tile 246. The tiles 248, for instance, provide a preferred contact method for the primary contact, an amount of revenue generated by the primary contact, an indicator of the availability of the primary contact, a reputation or rating for the primary contact, a date when the opportunity corresponding to the primary contact closes, and a credit limit for the primary contact. Of course, all of the tiles 248 are exemplary only, and additional or different information corresponding to the primary contact, or other information, can be displayed as well. - Since the opportunity record represented by
user interface 228 has a primary contact (or tile) 246 that represents the primary contact for the displayed opportunity, the user can manipulate that contact information from within the opportunity business record displayed in user interface display 228. If there were no contact information corresponding to the business opportunity displayed on display 228, CRM system 100 would illustratively provide a user input mechanism that allows user 108 to navigate to contact information corresponding to the displayed business data record. Determining whether contact information is displayed on the business data record represented by user interface display 228 is indicated by block 158 in FIG. 2B. If not, receiving the user touch gesture to show contact information is indicated by block 160. - As described above, in the embodiment shown in
FIG. 3C, both the contact tile 230 and the primary contact tile 246 are shown in user interface display 228. Therefore, the user need not provide an additional touch gesture to see contact information. -
FIG. 3C also shows that the user is using his or her finger 226 to actuate tile 246. Thus, user 108 is selecting primary contact 246, by actuating the corresponding tile. Receiving a touch gesture selecting a contact is indicated by block 162 in FIG. 2B. - Actuation of
tile 246 causes CRM system 100 to generate another user interface display that allows the user to manipulate the contact information. As described above with respect to FIG. 2A, this can take a wide variety of different forms. However, in the embodiment discussed with respect to FIG. 2B, actuating primary contact tile 246 causes CRM system 100 to generate a display, such as user interface display 250, shown in FIG. 3D. It can be seen that a number of the items in user interface display 250 are the same as those shown in user interface display 228 in FIG. 3C, and they are similarly numbered. However, FIG. 3D also shows that, since the user actuated tile 246, this causes CRM system 100 to display communication bar 252. Communication bar 252 displays the specific contact options for the selected contact, who was selected when the user actuated tile 246. Contact bar 252, itself, illustratively includes a plurality of user actuatable items each of which represents a method for contacting the primary contact represented by tile 246. For instance, contact bar 252 includes phone button 166, email button 168, instant messenger button 170 and other button 172. Displaying the specific contact options for the selected contact is indicated by block 164 in FIG. 2B. - When the user actuates any of the buttons in
contact bar 252, this causes CRM system 100 to illustratively initiate communication with the primary contact using the selected method of communication. FIG. 3D shows that the user 108 has used his or her finger 226 to actuate the phone button 166. In the embodiment shown, the user simply touches button 166 to actuate it. Receiving the user touch gesture selecting a contact option is indicated by block 174 in FIG. 2B. - In response to the user actuating the
phone button 166, communication component 114 in CRM system 100 illustratively initiates a phone call to the primary contact “Phil B.” represented by tile 246 and generates a suitable user interface display indicating that the call has been initiated. -
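The dispatch from a touched communication-bar button to a communication method can be sketched as follows. The handler names and return strings are illustrative assumptions, not the patent's API; a real communication component would invoke the platform's telephony, email, or messaging stack.

```python
# Illustrative sketch of contact bar 252 and blocks 164-174 of FIG. 2B:
# each touched button maps to a communication method for the selected contact.

def initiate_communication(contact, method):
    # One handler per communication-bar button (phone, email, IM).
    handlers = {
        "phone": lambda c: f"calling {c}",
        "email": lambda c: f"emailing {c}",
        "im": lambda c: f"messaging {c}",
    }
    if method not in handlers:
        raise ValueError(f"no handler for {method}")
    return handlers[method](contact)

result = initiate_communication("Phil B.", "phone")
```

Keeping the buttons in a dictionary mirrors the communication bar itself: adding an “other” option is one more entry rather than a new branch.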
FIG. 3E shows one exemplary user interface display 300 that illustrates this. It can be seen in display 300 that a phone call is underway to Phil B. This is indicated generally at 302. Display 300 shows the identity of the person being called, an indication that it is a phone call, and the elapsed time of the call. Of course, this information is exemplary only and a wide variety of additional or different information could be shown as well. In any case, user interface display 300 illustrates that a call has been placed. - A number of other exemplary things are shown in
display 300. A list of objectives to be accomplished is shown generally at 306. A status bar 304 shows how many of the objectives for the phone call have been completed. The objectives listed are “product requirements”, “key decision makers”, “budget”, and “notes”. In one embodiment, these are the agenda items for the phone call. Of course, they may be simply “to do” items or a variety of other listed items as well. -
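The objectives list 306 and status bar 304 amount to a small completion tracker, which can be sketched as follows. The function names and the count format are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch of objectives 306 and status bar 304: agenda items for
# the call, with the status bar reporting how many are completed.

objectives = {
    "product requirements": False,
    "key decision makers": False,
    "budget": False,
    "notes": False,
}

def complete(objective):
    # Mark an agenda item as accomplished during the call.
    objectives[objective] = True

def status():
    # The status bar's "n of m completed" summary.
    done = sum(objectives.values())
    return f"{done} of {len(objectives)} completed"

# After two agenda items are covered, the status bar would show two of four.
complete("product requirements")
complete("key decision makers")
```

In a touch UI, `complete` would be triggered by tapping the objective's check box; the status bar simply re-renders from the same dictionary.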
FIG. 3E also shows that a soft keyboard is displayed generally at 308. This allows user 108 to type information into the text boxes at 306, or to otherwise enter alphanumeric information, using touch. - The communication (e.g., the telephone call) can proceed until one of the parties stops the communication. This can be done, in one embodiment, by
user 108 simply touching an appropriate button on the user interface display. FIG. 3F shows one illustrative way of doing this. FIG. 3F shows user interface display 310, which is similar to user interface display 300 shown in FIG. 3E, and similar items are similarly numbered. However, it can be seen in FIG. 3F that the parties to the call have accomplished two of the agenda items, and therefore status bar 304 shows that two out of four items have been completed. Display 310 also shows that the user has touched a “hang up” button 312. Hang up button 312 allows user 108 to terminate the call, simply by actuating button 312. Receiving a user touch gesture to end the communication is indicated by block 178 in FIG. 2B. In response, communication component 114 of CRM system 100 hangs up the call, or disconnects the call, or otherwise discontinues the telephone communication. This is indicated by block 180 in FIG. 2B. - It can thus be seen that a user can quickly and easily manipulate contact information within a CRM system, or other business data system. When contact information is displayed, the user can use a touch gesture to manipulate it. This can make manipulation of contact information much easier and less cumbersome.
- It will be noted that the touch gestures mentioned herein can take a wide variety of different forms. They can be simple touches or taps, swipes, slides, multi-touch inputs, positional gestures (gestures at a specific position or location on the screen), brushing, multi-finger gestures, touch and hold gestures, etc. The speed of the gestures can be used for control as well (e.g., a quick swipe can pan quickly while a slow swipe pans slowly, etc.). These and other gestures are all contemplated herein.
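As one illustration of the speed-sensitive control mentioned above (a quick swipe panning quickly, a slow swipe panning slowly), the pan offset can be derived from swipe velocity. The formula and scale factor below are invented for illustration and are not specified in the patent.

```python
# Illustrative sketch of velocity-based panning: a faster swipe (same
# distance, shorter duration) yields a larger pan offset.

def pan_offset(swipe_distance_px, swipe_duration_s, scale=2.0):
    """Return a pan offset proportional to swipe velocity (px/s)."""
    if swipe_duration_s <= 0:
        raise ValueError("duration must be positive")
    velocity = swipe_distance_px / swipe_duration_s
    return velocity * scale

fast = pan_offset(300, 0.1)  # quick swipe -> larger pan
slow = pan_offset(300, 1.0)  # slow swipe -> smaller pan
assert fast > slow
```

Real gesture recognizers usually also apply a minimum-velocity threshold to distinguish a swipe from a drag, and clamp the offset to the scrollable extent.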
-
FIG. 4 is a block diagram of system 100, shown in FIG. 1, except that it is disposed in a cloud computing architecture 500. Cloud computing provides computation, software, data access, and storage services that do not require end-user knowledge of the physical location or configuration of the system that delivers the services. In various embodiments, cloud computing delivers the services over a wide area network, such as the internet, using appropriate protocols. For instance, cloud computing providers deliver applications over a wide area network and they can be accessed through a web browser or any other computing component. Software or components of system 100 as well as the corresponding data, can be stored on servers at a remote location. The computing resources in a cloud computing environment can be consolidated at a remote data center location or they can be dispersed. Cloud computing infrastructures can deliver services through shared data centers, even though they appear as a single point of access for the user. Thus, the components and functions described herein can be provided from a service provider at a remote location using a cloud computing architecture. Alternatively, they can be provided from a conventional server, or they can be installed on client devices directly, or in other ways. - The description is intended to include both public cloud computing and private cloud computing. Cloud computing (both public and private) provides substantially seamless pooling of resources, as well as a reduced need to manage and configure underlying hardware infrastructure.
- A public cloud is managed by a vendor and typically supports multiple consumers using the same infrastructure. Also, a public cloud, as opposed to a private cloud, can free up the end users from managing the hardware. A private cloud may be managed by the organization itself and the infrastructure is typically not shared with other organizations. The organization still maintains the hardware to some extent, such as installations and repairs, etc.
- In the embodiment shown in
FIG. 4, some items are similar to those shown in FIG. 1 and they are similarly numbered. FIG. 4 specifically shows that CRM system 100 (or, of course, another business data system such as an ERP system, LOB application, etc.) is located in cloud 502 (which can be public, private, or a combination where portions are public while others are private). Therefore, user 108 uses a user device 104 to access those systems through cloud 502. -
FIG. 4 also depicts another embodiment of a cloud architecture. FIG. 4 shows that it is also contemplated that some elements of business system 100 (or architecture 90) are disposed in cloud 502 while others are not. By way of example, data store 102 can be disposed inside of cloud 502 (with CRM system 100) or outside of cloud 502, and accessed through cloud 502. In another embodiment, communication component 114 is also outside of cloud 502. Regardless of where they are located, they can be accessed directly by device 104, through a network (either a wide area network or a local area network), they can be hosted at a remote site by a service, or they can be provided as a service through a cloud or accessed by a connection service that resides in the cloud. All of these architectures are contemplated herein. - It will also be noted that
system 100, or portions of it, can be disposed on a wide variety of different devices. Some of those devices include servers, desktop computers, laptop computers, tablet computers, or other mobile devices, such as palm top computers, cell phones, smart phones, multimedia players, personal digital assistants, etc. -
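The mixed cloud and on-premise arrangement of FIG. 4 can be sketched as a simple component-to-location mapping. The component names mirror the figures, but the mapping, URLs, and function names are purely illustrative assumptions.

```python
# Illustrative sketch of the FIG. 4 split deployment: some elements of
# system 100 live in cloud 502, others (e.g., data store 102) outside it.
# The endpoints below are hypothetical placeholders.

DEPLOYMENT = {
    "crm_system": "cloud",               # CRM system 100 in cloud 502
    "data_store": "on_premise",          # data store 102 kept outside the cloud
    "communication_component": "on_premise",
}

def endpoint_for(component):
    # Resolve where a component is reached, based on its deployment location.
    location = DEPLOYMENT.get(component)
    if location == "cloud":
        return f"https://cloud.example.com/{component}"
    if location == "on_premise":
        return f"https://intranet.local/{component}"
    raise KeyError(f"unknown component: {component}")
```

Centralizing the mapping keeps the rest of the client indifferent to where each component is hosted, which is the point of the architectures described above.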
FIG. 5 is a simplified block diagram of one illustrative embodiment of a handheld or mobile computing device that can be used as a user's or client's hand held device 16 (e.g., device 104), in which the present system (or parts of it) can be deployed. FIGS. 6-9 are examples of handheld or mobile devices. -
FIG. 5 provides a general block diagram of the components of a client device 16 that can run components of system 100 or that interacts with system 100, or both. In the device 16, a communications link 13 is provided that allows the handheld device to communicate with other computing devices and under some embodiments provides a channel for receiving information automatically, such as by scanning. Examples of communications link 13 include an infrared port, a serial/USB port, a cable network port such as an Ethernet port, and a wireless network port allowing communication through one or more communication protocols including General Packet Radio Service (GPRS), LTE, HSPA, HSPA+ and other 3G and 4G radio protocols, 1Xrtt, and Short Message Service, which are wireless services used to provide cellular access to a network, as well as 802.11 and 802.11b (Wi-Fi) protocols, and Bluetooth protocol, which provide local wireless connections to networks. - Under other embodiments, applications or systems (like system 100) are received on a removable Secure Digital (SD) card that is connected to a
SD card interface 15. SD card interface 15 and communication links 13 communicate with a processor 17 (which can also embody processor 146 from FIG. 1) along a bus 19 that is also connected to memory 21 and input/output (I/O) components 23, as well as clock 25 and location system 27. I/O components 23, in one embodiment, are provided to facilitate input and output operations. I/O components 23 for various embodiments of the device 16 can include input components such as buttons, touch sensors, multi-touch sensors, optical or video sensors, voice sensors, touch screens, proximity sensors, microphones, tilt sensors, and gravity switches and output components such as a display device, a speaker, and/or a printer port. Other I/O components 23 can be used as well. -
Clock 25 illustratively comprises a real time clock component that outputs a time and date. It can also, illustratively, provide timing functions for processor 17. -
Location system 27 illustratively includes a component that outputs a current geographical location of device 16. This can include, for instance, a global positioning system (GPS) receiver, a LORAN system, a dead reckoning system, a cellular triangulation system, or other positioning system. It can also include, for example, mapping software or navigation software that generates desired maps, navigation routes and other geographic functions. -
Memory 21 stores operating system 29, network settings 31, applications 33, application configuration settings 35, data store 37, communication drivers 39, and communication configuration settings 41. Memory 21 can include all types of tangible volatile and non-volatile computer-readable memory devices. It can also include computer storage media (described below). Memory 21 stores computer readable instructions that, when executed by processor 17, cause the processor to perform computer-implemented steps or functions according to the instructions. System 100 or the items in data store 102, for example, can reside in memory 21. Similarly, device 16 can have a client business system 24 (e.g., client CRM system 144) which can run various business applications or embody parts or all of business system 100. Processor 17 can be activated by other components to facilitate their functionality as well. - Examples of the
network settings 31 include things such as proxy information, Internet connection information, and mappings. Application configuration settings 35 include settings that tailor the application for a specific enterprise or user. Communication configuration settings 41 provide parameters for communicating with other computers and include items such as GPRS parameters, SMS parameters, connection user names and passwords. -
Applications 33 can be applications that have previously been stored on the device 16 or applications that are installed during use, although these can be part of operating system 29, or hosted external to device 16, as well. -
FIG. 6 shows one embodiment in which device 16 is a tablet computer 600 (also shown in FIGS. 3A-3F). Screen 602 can be a touch screen (so touch gestures from a user's finger can be used to interact with the application) or a pen-enabled interface that receives inputs from a pen or stylus. It can also use an on-screen virtual keyboard. Of course, it might also be attached to a keyboard or other user input device through a suitable attachment mechanism, such as a wireless link or USB port, for instance. Computer 600 can also illustratively receive voice inputs as well. -
FIGS. 7, 8 and 9 provide additional examples of devices 16 that can be used, although others can be used as well. In FIG. 7, a mobile phone 45 (or feature phone) is provided as the device 16. Phone 45 includes a set of keypads 47 for dialing phone numbers, a display 49 capable of displaying images including application images, icons, web pages, photographs, and video, and control buttons 51 for selecting items shown on the display. The phone includes an antenna 53 for receiving cellular phone signals such as General Packet Radio Service (GPRS) and 1Xrtt, and Short Message Service (SMS) signals. In some embodiments, phone 45 also includes a Secure Digital (SD) card slot 55 that accepts an SD card 57. - The mobile device of
FIG. 8 is a personal digital assistant (PDA) 59 or a multimedia player or a tablet computing device, etc. (hereinafter referred to as PDA 59). PDA 59 includes an inductive screen 61 that senses the position of a stylus 63 (or other pointers, such as a user's finger) when the stylus is positioned over the screen. This allows the user to select, highlight, and move items on the screen as well as draw and write. PDA 59 also includes a number of user input keys or buttons (such as button 65) which allow the user to scroll through menu options or other display options which are displayed on display 61, and allow the user to change applications or select user input functions, without contacting display 61. Although not shown, PDA 59 can include an internal antenna and an infrared transmitter/receiver that allow for wireless communication with other computers as well as connection ports that allow for hardware connections to other computing devices. Such hardware connections are typically made through a cradle that connects to the other computer through a serial or USB port. As such, these connections are non-network connections. In one embodiment, mobile device 59 also includes an SD card slot 67 that accepts an SD card 69. -
FIG. 9 is similar to FIG. 8 except that the phone is a smart phone 71. Smart phone 71 has a touch sensitive display 73 that displays icons or tiles or other user input mechanisms 75. Mechanisms 75 can be used by a user to access a business data system (like CRM system 100), run applications, make calls, perform data transfer operations, etc. In general, smart phone 71 is built on a mobile operating system and offers more advanced computing capability and connectivity than a feature phone. - Note that other forms of the
devices 16 are possible. -
FIG. 10 is one embodiment of a computing environment in which system 100 (for example) can be deployed. With reference to FIG. 10, an exemplary system for implementing some embodiments includes a general-purpose computing device in the form of a computer 810. Components of computer 810 may include, but are not limited to, a processing unit 820 (which can comprise processor 110), a system memory 830, and a system bus 821 that couples various system components including the system memory to the processing unit 820. The system bus 821 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus also known as Mezzanine bus. Memory and programs described with respect to FIG. 1 can be deployed in corresponding portions of FIG. 10. -
Computer 810 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 810 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media is different from, and does not include, a modulated data signal or carrier wave. It includes hardware storage media including both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 810. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media. - The
system memory 830 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 831 and random access memory (RAM) 832. A basic input/output system 833 (BIOS), containing the basic routines that help to transfer information between elements within computer 810, such as during start-up, is typically stored in ROM 831. RAM 832 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 820. By way of example, and not limitation, FIG. 10 illustrates operating system 834, application programs 835, other program modules 836, and program data 837. - The
computer 810 may also include other removable/non-removable volatile/nonvolatile computer storage media. By way of example only, FIG. 10 illustrates a hard disk drive 841 that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive 851 that reads from or writes to a removable, nonvolatile magnetic disk 852, and an optical disk drive 855 that reads from or writes to a removable, nonvolatile optical disk 856 such as a CD ROM or other optical media. Other removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like. The hard disk drive 841 is typically connected to the system bus 821 through a non-removable memory interface such as interface 840, and magnetic disk drive 851 and optical disk drive 855 are typically connected to the system bus 821 by a removable memory interface, such as interface 850. - The drives and their associated computer storage media discussed above and illustrated in
FIG. 10, provide storage of computer readable instructions, data structures, program modules and other data for the computer 810. In FIG. 10, for example, hard disk drive 841 is illustrated as storing operating system 844, application programs 845, other program modules 846, and program data 847. Note that these components can either be the same as or different from operating system 834, application programs 835, other program modules 836, and program data 837. Operating system 844, application programs 845, other program modules 846, and program data 847 are given different numbers here to illustrate that, at a minimum, they are different copies. - A user may enter commands and information into the
computer 810 through input devices such as a keyboard 862, a microphone 863, and a pointing device 861, such as a mouse, trackball or touch pad. Other input devices (not shown) may include a joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 820 through a user input interface 860 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB). A visual display 891 or other type of display device is also connected to the system bus 821 via an interface, such as a video interface 890. In addition to the monitor, computers may also include other peripheral output devices such as speakers 897 and printer 896, which may be connected through an output peripheral interface 895. - The
computer 810 is operated in a networked environment using logical connections to one or more remote computers, such as a remote computer 880. The remote computer 880 may be a personal computer, a hand-held device, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 810. The logical connections depicted in FIG. 10 include a local area network (LAN) 871 and a wide area network (WAN) 873, but may also include other networks. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet. - When used in a LAN networking environment, the
computer 810 is connected to the LAN 871 through a network interface or adapter 870. When used in a WAN networking environment, the computer 810 typically includes a modem 872 or other means for establishing communications over the WAN 873, such as the Internet. The modem 872, which may be internal or external, may be connected to the system bus 821 via the user input interface 860, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 810, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation, FIG. 10 illustrates remote application programs 885 as residing on remote computer 880. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used. - It should also be noted that features from different embodiments can be combined. That is, one or more features from one embodiment can be combined with one or more features of other embodiments. This is contemplated herein.
- Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
Claims (20)
1. A computer-implemented method of manipulating contact information in a business data system, comprising:
displaying a user interface display showing business information, from the business data system;
receiving a user touch gesture on the user interface display, to manipulate contact information in the business data system; and
manipulating the contact information in the business data system based on the user touch gesture.
2. The computer-implemented method of claim 1 wherein displaying a user interface display comprises:
displaying a user actuatable contact user input mechanism showing a contact in the business data system.
3. The computer-implemented method of claim 2 wherein receiving the user touch gesture comprises:
receiving a user touch on the contact user input mechanism.
4. The computer-implemented method of claim 3 wherein manipulating the contact information comprises:
accessing additional contact information corresponding to the contact; and
displaying the additional contact information.
5. The computer-implemented method of claim 4 wherein the additional contact information comprises a plurality of contact options for the contact and wherein displaying the additional contact information comprises:
displaying a user actuatable input mechanism corresponding to each of the contact options.
6. The computer-implemented method of claim 5 and further comprising:
receiving a user touch gesture on a given one of the input mechanisms corresponding to a given contact option; and
initiating communication with the contact using the given contact option.
7. The computer-implemented method of claim 1 wherein receiving a user touch gesture to manipulate contact information comprises:
receiving a user touch gesture requesting a business data record from the business data system; and
displaying the business data record with a contact user input mechanism for a contact corresponding to the business data record.
8. The computer-implemented method of claim 7 wherein receiving a user touch gesture to manipulate contact information further comprises:
receiving a user touch gesture actuating the contact user input mechanism on the business data record.
9. The computer-implemented method of claim 8 wherein displaying the business data record with the contact user input mechanism for the contact comprises:
displaying a contact menu with an input mechanism corresponding to a contact search mechanism, and wherein receiving the user touch gesture actuating the contact user input mechanism comprises receiving a touch gesture initiating a search for a specific contact.
10. The computer-implemented method of claim 8 wherein receiving a user touch gesture to manipulate contact information further comprises:
receiving the user touch gesture to perform one of adding, deleting or editing the contact information.
11. The computer-implemented method of claim 8 wherein receiving a user touch gesture to manipulate contact information further comprises:
receiving the user touch gesture to initiate communication with an entity represented by the contact information.
12. The computer-implemented method of claim 8 wherein receiving a user touch gesture to manipulate contact information further comprises:
receiving the user touch gesture to schedule a meeting with an entity represented by the contact information.
13. A business data system, comprising:
a user interface component;
a business data component generating a user interface display, using the user interface component, showing a business data record with a user input mechanism showing a contact corresponding to the business data record, receiving a user touch gesture through the user input mechanism to manipulate contact information for the contact and manipulating the contact information in the business data system based on the touch gesture; and
a computer processor being a functional part of the business data system and activated by the user interface component and the business data component to facilitate generating the user interface display, receiving the touch gesture and manipulating the contact information.
14. The business data system of claim 13 and further comprising:
a communication component, the touch gesture initiating communication with the contact, using the communication component.
15. The business data system of claim 14 wherein the business data component displays a communication input mechanism representing a communication option for communicating with the contact, and wherein the touch gesture actuates the communication input mechanism, the communication component initiating communication with the contact using the communication option represented by the communication input mechanism.
16. The business data system of claim 15 wherein the communication option comprises a selected one of electronic mail, telephone, cellular telephone, and instant messaging.
17. The business data system of claim 13 wherein the business data component manipulates the contact information, based on the user touch gesture by performing one of adding contact information, deleting contact information, editing contact information and displaying additional contact information.
18. A mobile device, comprising:
a touch sensitive display screen;
a user interface component displaying a user interface display from a business data system on the touch sensitive display screen and receiving a user touch gesture through the user interface display;
a business data component manipulating contact information in the business data system based on the touch gesture; and
a computer processor being a functional component of the mobile device and activated by the user interface component and the business data component to facilitate displaying, receiving the user touch gesture, and manipulating the contact information.
19. The mobile device of claim 18 wherein the user interface component receives a touch gesture requesting display of a business data record from the business data system, the business data record including contact information for a contact corresponding to the business data record.
20. The mobile device of claim 19 wherein the business data component manipulates the contact information by at least one of initiating communication with the contact or scheduling a meeting with the contact.
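The touch flow recited in claims 1 through 6 (a touch on a contact input mechanism reveals additional contact information as per-option input mechanisms, and a touch on one of those options initiates communication) can be illustrated with a minimal sketch. This is not the patent's implementation; all class and field names (`Contact`, `ContactPane`, `options`) are hypothetical, introduced only to make the claimed sequence concrete.

```python
from dataclasses import dataclass, field

@dataclass
class Contact:
    """A contact stored in the business data system (hypothetical model)."""
    name: str
    # Contact options per claim 5: option name -> address for that option.
    options: dict = field(default_factory=dict)

class ContactPane:
    """Sketch of the touch flow in claims 1-6: a touch on a contact
    displays its contact options; a touch on an option initiates
    communication using that option."""

    def __init__(self, contacts):
        self.contacts = {c.name: c for c in contacts}
        self.displayed_options = {}   # options shown after a touch (claims 4-5)
        self.log = []                 # communications initiated (claim 6)

    def touch_contact(self, name):
        """Claims 3-5: touching the contact input mechanism displays
        additional contact information, one input mechanism per option."""
        self.displayed_options = dict(self.contacts[name].options)
        return list(self.displayed_options)

    def touch_option(self, option):
        """Claim 6: touching a given option initiates communication
        with the contact using that option."""
        address = self.displayed_options[option]
        self.log.append((option, address))
        return f"initiating {option} to {address}"

pane = ContactPane([Contact("Ana Ruiz", {"email": "ana@example.com",
                                         "phone": "+1-555-0100"})])
shown = pane.touch_contact("Ana Ruiz")   # first gesture: reveal options
result = pane.touch_option("email")      # second gesture: actuate an option
```

A real system would route `touch_option` into a communication component (claim 14); here the call is only logged so the claimed sequence stays visible.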
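Claims 7 through 12 describe a gesture that requests a business data record and further gestures that add, delete, or edit contact information, initiate communication, or schedule a meeting. The sketch below models only the data-side effects of those gestures; the store, record shape, and method names are assumptions for illustration, not the disclosed system.

```python
class BusinessDataStore:
    """Sketch of the record-centric manipulations in claims 7-12.
    Each method corresponds to the effect of one recited touch gesture."""

    def __init__(self):
        self.records = {}   # record id -> {"title": ..., "contacts": {...}}
        self.meetings = []  # meetings scheduled via claim 12 gestures

    def request_record(self, record_id):
        # Claim 7: return the record with its contacts so the display can
        # render a contact input mechanism for each corresponding contact.
        return self.records[record_id]

    def add_contact(self, record_id, name, info):
        # Claim 10: adding contact information.
        self.records[record_id]["contacts"][name] = info

    def edit_contact(self, record_id, name, **changes):
        # Claim 10: editing contact information.
        self.records[record_id]["contacts"][name].update(changes)

    def delete_contact(self, record_id, name):
        # Claim 10: deleting contact information.
        del self.records[record_id]["contacts"][name]

    def schedule_meeting(self, record_id, name, when):
        # Claim 12: scheduling a meeting with the entity the contact represents.
        self.meetings.append((record_id, name, when))

store = BusinessDataStore()
store.records["opp-17"] = {"title": "Contoso renewal", "contacts": {}}
store.add_contact("opp-17", "Ana Ruiz", {"email": "ana@example.com"})
store.edit_contact("opp-17", "Ana Ruiz", phone="+1-555-0100")
store.schedule_meeting("opp-17", "Ana Ruiz", "2013-02-01T10:00")
record = store.request_record("opp-17")
```

The record identifier and contact values are placeholders; the point is that each claimed gesture reduces to one well-defined mutation of the business data.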
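The system and device claims (13-20) separate a user interface component, which receives touch gestures, from a business data component, which manipulates contact information based on them. One plausible wiring, sketched under the assumption of a simple gesture-to-handler registry (the gesture names and handler signatures are invented for illustration):

```python
class BusinessDataComponent:
    """Maps recognized touch gestures to contact-information operations
    (the role of the business data component in claims 13 and 18)."""

    def __init__(self):
        self.handlers = {}

    def register(self, gesture, fn):
        self.handlers[gesture] = fn

    def handle(self, gesture, **params):
        return self.handlers[gesture](**params)

class UserInterfaceComponent:
    """Displays the UI and forwards touch gestures to the business data
    component (the role of the user interface component in claim 18)."""

    def __init__(self, business_data):
        self.business_data = business_data

    def on_touch(self, gesture, **params):
        # The UI component does not interpret the gesture itself; it
        # delegates so contact manipulation stays in the data component.
        return self.business_data.handle(gesture, **params)

data = BusinessDataComponent()
data.register("tap_contact", lambda name: f"show options for {name}")
data.register("tap_call", lambda number: f"dialing {number}")
ui = UserInterfaceComponent(data)
```

Keeping the gesture registry inside the business data component mirrors the claims' division of labor: the display layer stays thin while all contact-information behavior is defined in one place.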
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/754,896 US20130246930A1 (en) | 2012-03-16 | 2013-01-31 | Touch gestures related to interaction with contacts in a business data system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261612148P | 2012-03-16 | 2012-03-16 | |
US13/754,896 US20130246930A1 (en) | 2012-03-16 | 2013-01-31 | Touch gestures related to interaction with contacts in a business data system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20130246930A1 true US20130246930A1 (en) | 2013-09-19 |
Family
ID=49157144
Family Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/541,785 Active 2032-11-18 US9310888B2 (en) | 2012-03-16 | 2012-07-05 | Multimodal layout and rendering |
US13/754,896 Abandoned US20130246930A1 (en) | 2012-03-16 | 2013-01-31 | Touch gestures related to interaction with contacts in a business data system |
US13/773,630 Abandoned US20130241852A1 (en) | 2012-03-16 | 2013-02-21 | Use of touch and gestures related to tasks and business workflow |
US13/827,813 Active 2034-01-17 US9645650B2 (en) | 2012-03-16 | 2013-03-14 | Use of touch and gestures related to tasks and business workflow |
US13/827,759 Abandoned US20130246111A1 (en) | 2012-03-16 | 2013-03-14 | Use of touch and gestures related to tasks and business workflow |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/541,785 Active 2032-11-18 US9310888B2 (en) | 2012-03-16 | 2012-07-05 | Multimodal layout and rendering |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/773,630 Abandoned US20130241852A1 (en) | 2012-03-16 | 2013-02-21 | Use of touch and gestures related to tasks and business workflow |
US13/827,813 Active 2034-01-17 US9645650B2 (en) | 2012-03-16 | 2013-03-14 | Use of touch and gestures related to tasks and business workflow |
US13/827,759 Abandoned US20130246111A1 (en) | 2012-03-16 | 2013-03-14 | Use of touch and gestures related to tasks and business workflow |
Country Status (1)
Country | Link |
---|---|
US (5) | US9310888B2 (en) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150121241A1 (en) * | 2013-10-31 | 2015-04-30 | Bank Of America Corporation | Visual representation for permission to contact |
US9310888B2 (en) | 2012-03-16 | 2016-04-12 | Microsoft Technology Licensing, Llc | Multimodal layout and rendering |
USD763893S1 (en) * | 2015-07-28 | 2016-08-16 | Microsoft Corporation | Display screen with graphical user interface |
USD763892S1 (en) * | 2015-07-28 | 2016-08-16 | Microsoft Corporation | Display screen with animated graphical user interface |
USD765708S1 (en) * | 2015-07-27 | 2016-09-06 | Microsoft Corporation | Display screen with animated graphical user interface |
USD765709S1 (en) * | 2015-07-28 | 2016-09-06 | Microsoft Corporation | Display screen with animated graphical user interface |
USD768689S1 (en) * | 2015-07-27 | 2016-10-11 | Microsoft Corporation | Display screen with animated graphical user interface |
USD770506S1 (en) * | 2015-07-15 | 2016-11-01 | Microsoft Corporation | Display screen with animated graphical user interface |
US20170090747A1 (en) * | 2015-09-24 | 2017-03-30 | International Business Machines Corporation | Input device interaction |
USD844657S1 (en) | 2017-11-27 | 2019-04-02 | Microsoft Corporation | Display screen with animated graphical user interface |
USD845982S1 (en) | 2017-11-27 | 2019-04-16 | Microsoft Corporation | Display screen with graphical user interface |
USD845989S1 (en) | 2017-11-27 | 2019-04-16 | Microsoft Corporation | Display screen with transitional graphical user interface |
USD846568S1 (en) | 2017-11-27 | 2019-04-23 | Microsoft Corporation | Display screen with graphical user interface |
US10452260B2 (en) * | 2014-11-21 | 2019-10-22 | Nintex Pty Ltd | Managing workflow tasks in touch screen mobile devices |
Families Citing this family (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10223710B2 (en) | 2013-01-04 | 2019-03-05 | Visa International Service Association | Wearable intelligent vision device apparatuses, methods and systems |
US9436300B2 (en) * | 2012-07-10 | 2016-09-06 | Nokia Technologies Oy | Method and apparatus for providing a multimodal user interface track |
WO2015112108A1 (en) * | 2012-11-28 | 2015-07-30 | Visa International Service Association | Multi disparate gesture actions and transactions apparatuses, methods and systems |
US9507480B1 (en) * | 2013-01-28 | 2016-11-29 | Amazon Technologies, Inc. | Interface optimization application |
US8984439B2 (en) * | 2013-02-14 | 2015-03-17 | Citibank, N.A. | Methods and systems for managing a graphical user interface |
US10025459B2 (en) | 2013-03-14 | 2018-07-17 | Airwatch Llc | Gesture-based workflow progression |
US10664870B2 (en) * | 2013-03-14 | 2020-05-26 | Boxer, Inc. | Email-based promotion for user adoption |
US9804749B2 (en) * | 2014-03-03 | 2017-10-31 | Microsoft Technology Licensing, Llc | Context aware commands |
KR101610872B1 (en) * | 2014-08-04 | 2016-05-12 | 티더블유모바일 주식회사 | Access icon information managing system and method of the same |
EP3015977A1 (en) * | 2014-10-29 | 2016-05-04 | Hewlett-Packard Development Company, L.P. | Rendering a user interface |
US10877714B2 (en) | 2015-03-10 | 2020-12-29 | Zoho Corporation Private Limited | Methods and apparatus for enhancing electronic presentations |
US20160321218A1 (en) * | 2015-04-27 | 2016-11-03 | Neatly Co. | System and method for transforming image information for a target system interface |
CN105824921A (en) | 2016-03-16 | 2016-08-03 | 广州彩瞳网络技术有限公司 | User social relation recognition device and method |
US10140017B2 (en) | 2016-04-20 | 2018-11-27 | Google Llc | Graphical keyboard application with integrated search |
US10222957B2 (en) * | 2016-04-20 | 2019-03-05 | Google Llc | Keyboard with a suggested search query region |
US10078673B2 (en) | 2016-04-20 | 2018-09-18 | Google Llc | Determining graphical elements associated with text |
US10305828B2 (en) | 2016-04-20 | 2019-05-28 | Google Llc | Search query predictions by a keyboard |
US9965530B2 (en) | 2016-04-20 | 2018-05-08 | Google Llc | Graphical keyboard with integrated search features |
US9846052B2 (en) | 2016-04-29 | 2017-12-19 | Blackriver Systems, Inc. | Electronic route creation |
US10664157B2 (en) | 2016-08-03 | 2020-05-26 | Google Llc | Image search query predictions by a keyboard |
CN114168236A (en) * | 2020-09-10 | 2022-03-11 | 华为技术有限公司 | Application access method and related device |
CN112269510B (en) * | 2020-10-29 | 2022-03-25 | 维沃移动通信(杭州)有限公司 | Information processing method and device and electronic equipment |
Citations (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090100321A1 (en) * | 2007-10-12 | 2009-04-16 | Microsoft Corporation | Universal contextual actions menu across windows applications |
US20100162171A1 (en) * | 2008-12-19 | 2010-06-24 | Verizon Data Services Llc | Visual address book and dialer |
US20100318921A1 (en) * | 2009-06-16 | 2010-12-16 | Marc Trachtenberg | Digital easel collaboration system and method |
US20110078184A1 (en) * | 2009-09-28 | 2011-03-31 | Lg Electronics Inc. | Mobile terminal and method of searching a contact in the mobile terminal |
US20110087990A1 (en) * | 2009-10-13 | 2011-04-14 | Research In Motion Limited | User interface for a touchscreen display |
US20110113348A1 (en) * | 2009-11-06 | 2011-05-12 | Cisco Technology, Inc. | Method and apparatus for visualizing and navigating within an immersive collaboration environment |
US20110268418A1 (en) * | 2010-04-30 | 2011-11-03 | American Teleconferencing Services Ltd. | Record and Playback in a Conference |
US20110275418A1 (en) * | 2007-01-07 | 2011-11-10 | Scott Forstall | Portable Multifunction Device, Method, and Graphical User Interface for Conference Calling |
US20110296312A1 (en) * | 2010-05-26 | 2011-12-01 | Avaya Inc. | User interface for managing communication sessions |
US20110313805A1 (en) * | 2010-06-18 | 2011-12-22 | Microsoft Corporation | Customizable user interface including contact and business management features |
US20120030627A1 (en) * | 2010-07-30 | 2012-02-02 | Nokia Corporation | Execution and display of applications |
US20120144320A1 (en) * | 2010-12-03 | 2012-06-07 | Avaya Inc. | System and method for enhancing video conference breaks |
US20120159355A1 (en) * | 2010-12-15 | 2012-06-21 | Microsoft Corporation | Optimized joint document review |
US20120192090A1 (en) * | 2011-01-25 | 2012-07-26 | Bank Of America Corporation | Single identifiable entry point for accessing contact information via a computer network |
US8244851B1 (en) * | 2011-10-18 | 2012-08-14 | Clintelica AB | Group network connector |
US20130227461A1 (en) * | 2011-08-24 | 2013-08-29 | Salesforce.Com, Inc. | Systems and methods for promoting related lists |
US20130321340A1 (en) * | 2011-02-10 | 2013-12-05 | Samsung Electronics Co., Ltd. | Portable device comprising a touch-screen display, and method for controlling same |
Family Cites Families (94)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5321750A (en) | 1989-02-07 | 1994-06-14 | Market Data Corporation | Restricted information distribution system apparatus and methods |
US5339392A (en) | 1989-07-27 | 1994-08-16 | Risberg Jeffrey S | Apparatus and method for creation of a user definable video displayed document showing changes in real time data |
US5297032A (en) | 1991-02-01 | 1994-03-22 | Merrill Lynch, Pierce, Fenner & Smith Incorporated | Securities trading workstation |
US5836011A (en) | 1995-01-20 | 1998-11-10 | International Business Machines Corporation | Implementation of teams and roles within a people oriented work environment |
US5819284A (en) | 1995-03-24 | 1998-10-06 | At&T Corp. | Personalized real time information display as a portion of a screen saver |
US6480194B1 (en) | 1996-11-12 | 2002-11-12 | Silicon Graphics, Inc. | Computer-related method, system, and program product for controlling data visualization in external dimension(s) |
US6216141B1 (en) | 1996-12-06 | 2001-04-10 | Microsoft Corporation | System and method for integrating a document into a desktop window on a client computer |
US5959621A (en) | 1996-12-06 | 1999-09-28 | Microsoft Corporation | System and method for displaying data items in a ticker display pane on a client computer |
US6111573A (en) * | 1997-02-14 | 2000-08-29 | Velocity.Com, Inc. | Device independent window and view system |
WO1999026127A1 (en) | 1997-11-14 | 1999-05-27 | Avesta Technologies, Inc. | System and method for displaying multiple sources of data in near real-time |
US6449638B1 (en) | 1998-01-07 | 2002-09-10 | Microsoft Corporation | Channel definition architecture extension |
US6311058B1 (en) | 1998-06-30 | 2001-10-30 | Microsoft Corporation | System for delivering data content over a low bit rate transmission channel |
US6278448B1 (en) | 1998-02-17 | 2001-08-21 | Microsoft Corporation | Composite Web page built from any web content |
US6832355B1 (en) | 1998-07-28 | 2004-12-14 | Microsoft Corporation | Web page display system |
US6188405B1 (en) | 1998-09-14 | 2001-02-13 | Microsoft Corporation | Methods, apparatus and data structures for providing a user interface, which exploits spatial memory, to objects |
US6510553B1 (en) | 1998-10-26 | 2003-01-21 | Intel Corporation | Method of streaming video from multiple sources over a network |
US7216351B1 (en) | 1999-04-07 | 2007-05-08 | International Business Machines Corporation | Systems and methods for synchronizing multi-modal interactions |
US6456334B1 (en) | 1999-06-29 | 2002-09-24 | Ati International Srl | Method and apparatus for displaying video in a data processing system |
US6976210B1 (en) | 1999-08-31 | 2005-12-13 | Lucent Technologies Inc. | Method and apparatus for web-site-independent personalization from multiple sites having user-determined extraction functionality |
US6724403B1 (en) | 1999-10-29 | 2004-04-20 | Surfcast, Inc. | System and method for simultaneous display of multiple information sources |
US7987431B2 (en) | 1999-10-29 | 2011-07-26 | Surfcast, Inc. | System and method for simultaneous display of multiple information sources |
US7028264B2 (en) | 1999-10-29 | 2006-04-11 | Surfcast, Inc. | System and method for simultaneous display of multiple information sources |
US7082576B2 (en) * | 2001-01-04 | 2006-07-25 | Microsoft Corporation | System and process for dynamically displaying prioritized data objects |
US20020084991A1 (en) | 2001-01-04 | 2002-07-04 | Harrison Edward R. | Simulating mouse events with touch screen displays |
US7895522B2 (en) | 2001-09-28 | 2011-02-22 | Ntt Docomo, Inc. | Layout of platform specific graphical user interface widgets migrated between heterogeneous device platforms |
US7912792B2 (en) | 2002-07-12 | 2011-03-22 | Vendavo, Inc. | Systems and methods for making margin-sensitive price adjustments in an integrated price management system |
WO2004017584A1 (en) | 2002-08-16 | 2004-02-26 | Nuasis Corporation | Contact center architecture |
US20040073571A1 (en) * | 2002-10-10 | 2004-04-15 | International Business Machines Corporation | Console flight management system and method |
US20040093343A1 (en) | 2002-11-12 | 2004-05-13 | Scott Lucas | Enhanced client relationship management systems and methods |
US8572058B2 (en) | 2002-11-27 | 2013-10-29 | Accenture Global Services Limited | Presenting linked information in a CRM system |
JP2006510135A (en) * | 2002-12-16 | 2006-03-23 | マイクロソフト コーポレーション | System and method for interfacing with a computer device |
US20040122693A1 (en) * | 2002-12-23 | 2004-06-24 | Michael Hatscher | Community builder |
US8195631B2 (en) * | 2002-12-23 | 2012-06-05 | Sap Ag | Resource finder tool |
US20040254805A1 (en) | 2003-03-14 | 2004-12-16 | Sven Schwerin-Wenzel | Benefits and compensation realignment |
JP3935856B2 (en) * | 2003-03-28 | 2007-06-27 | インターナショナル・ビジネス・マシーンズ・コーポレーション | Information processing apparatus, server, method and program for creating a digest of a document with a defined layout |
US20040210468A1 (en) | 2003-04-18 | 2004-10-21 | Ralf Rubel | System and method for providing a territory management tool |
US7685016B2 (en) | 2003-10-07 | 2010-03-23 | International Business Machines Corporation | Method and system for analyzing relationships between persons |
US7085590B2 (en) * | 2003-12-31 | 2006-08-01 | Sony Ericsson Mobile Communications Ab | Mobile terminal with ergonomic imaging functions |
EP1569087A3 (en) * | 2004-02-17 | 2007-04-25 | Canon Kabushiki Kaisha | Data processing apparatus, data processing method, program for implementing the method, and storage medium storing the program |
US7460134B2 (en) * | 2004-03-02 | 2008-12-02 | Microsoft Corporation | System and method for moving computer displayable content into a preferred user interactive focus area |
US20060026502A1 (en) | 2004-07-28 | 2006-02-02 | Koushik Dutta | Document collaboration system |
US20060080468A1 (en) | 2004-09-03 | 2006-04-13 | Microsoft Corporation | Smart client add-in architecture |
US7617450B2 (en) | 2004-09-30 | 2009-11-10 | Microsoft Corporation | Method, system, and computer-readable medium for creating, inserting, and reusing document parts in an electronic document |
US20060085245A1 (en) * | 2004-10-19 | 2006-04-20 | Filenet Corporation | Team collaboration system with business process management and records management |
US7818672B2 (en) * | 2004-12-30 | 2010-10-19 | Microsoft Corporation | Floating action buttons |
US8819569B2 (en) | 2005-02-18 | 2014-08-26 | Zumobi, Inc | Single-handed approach for navigation of application tiles using panning and zooming |
US20060235884A1 (en) | 2005-04-18 | 2006-10-19 | Performance Assessment Network, Inc. | System and method for evaluating talent and performance |
US7933632B2 (en) | 2005-09-16 | 2011-04-26 | Microsoft Corporation | Tile space user interface for mobile devices |
US7945531B2 (en) * | 2005-09-16 | 2011-05-17 | Microsoft Corporation | Interfaces for a productivity suite application and a hosted user interface |
US20070100845A1 (en) | 2005-10-31 | 2007-05-03 | Juergen Sattler | Customer relationship management integration system and method |
US20070211293A1 (en) * | 2006-03-10 | 2007-09-13 | Kabushiki Kaisha Toshiba | Document management system, method and program therefor |
US7689583B2 (en) * | 2006-09-11 | 2010-03-30 | Microsoft Corporation | Flexible data presentation enabled by metadata |
DE102006048182A1 (en) * | 2006-10-10 | 2008-04-17 | Navigon Ag | Navigation device and method for displaying a road map with isolines |
US20090070744A1 (en) | 2007-08-28 | 2009-03-12 | Sugarcrm Inc. | CRM SYSTEM AND METHOD HAVING DRILLDOWNS, ACLs, SHARED FOLDERS, A TRACKER AND A MODULE BUILDER |
JP4870601B2 (en) * | 2007-03-17 | 2012-02-08 | 株式会社リコー | Screen data generation apparatus, image processing apparatus, screen data generation method and program |
US7937663B2 (en) | 2007-06-29 | 2011-05-03 | Microsoft Corporation | Integrated collaborative user interface for a document editor program |
US8352966B2 (en) * | 2007-09-11 | 2013-01-08 | Yahoo! Inc. | System and method of inter-widget communication |
US20090070333A1 (en) | 2007-09-12 | 2009-03-12 | Bailey Andrew G | Method for Document Management Across Multiple Software Applications |
US20090076878A1 (en) | 2007-09-19 | 2009-03-19 | Matthias Woerner | Efficient project staffing |
KR101472844B1 (en) * | 2007-10-23 | 2014-12-16 | 삼성전자 주식회사 | Adaptive document displaying device and method |
US7966566B2 (en) | 2007-11-12 | 2011-06-21 | Sap Ag | Systems and methods to create follow-up meetings |
US8370160B2 (en) | 2007-12-31 | 2013-02-05 | Motorola Mobility Llc | Methods and apparatus for implementing distributed multi-modal applications |
US8296161B2 (en) | 2008-09-05 | 2012-10-23 | Salesforce.Com, Inc. | Method and system for wealth management |
US8209341B2 (en) * | 2008-09-30 | 2012-06-26 | International Business Machines Corporation | Configurable transformation macro |
US8255825B2 (en) * | 2008-10-07 | 2012-08-28 | Microsoft Corporation | Content aware adaptive display |
US9026918B2 (en) | 2008-10-16 | 2015-05-05 | Accenture Global Services Limited | Enabling a user device to access enterprise data |
US20100114698A1 (en) * | 2008-10-31 | 2010-05-06 | Goel Kavi J | Advertising meta-keywords |
US8176096B2 (en) | 2008-12-18 | 2012-05-08 | Microsoft Corporation | Data visualization interactivity architecture |
TWI401600B (en) * | 2009-05-11 | 2013-07-11 | Compal Electronics Inc | Method and user interface apparatus for managing functions of wireless communication components |
JP2011010275A (en) * | 2009-05-26 | 2011-01-13 | Sanyo Electric Co Ltd | Image reproducing apparatus and imaging apparatus |
US9298336B2 (en) * | 2009-05-28 | 2016-03-29 | Apple Inc. | Rotation smoothing of a user interface |
US8825509B2 (en) | 2009-10-23 | 2014-09-02 | Salesforce.Com, Inc. | System, method and computer program product for leveraging a customer relationship management system to send meeting invitations |
US20110175826A1 (en) | 2010-01-15 | 2011-07-21 | Bradford Allen Moore | Automatically Displaying and Hiding an On-screen Keyboard |
US9405426B2 (en) * | 2010-03-01 | 2016-08-02 | Salesforce.Com, Inc. | Method and system for providing an adaptive input user interface for data entry applications |
US8589815B2 (en) | 2010-03-10 | 2013-11-19 | Microsoft Corporation | Control of timing for animations in dynamic icons |
US8996978B2 (en) | 2010-05-14 | 2015-03-31 | Sap Se | Methods and systems for performing analytical procedures by interactions with visual representations of datasets |
US8544075B2 (en) | 2010-06-15 | 2013-09-24 | Microsoft Corporation | Extending a customer relationship management eventing framework to a cloud computing environment in a secure manner |
US20120036204A1 (en) | 2010-08-09 | 2012-02-09 | Bank Of America Corporation | Social media engagement system integration |
US8868582B2 (en) * | 2010-08-23 | 2014-10-21 | Sap Ag | Repository infrastructure for on demand platforms |
US8306849B2 (en) | 2010-09-16 | 2012-11-06 | International Business Machines Corporation | Predicting success of a proposed project |
EP2625660A4 (en) * | 2010-10-05 | 2014-06-11 | Centric Software Inc | Interactive collection book for mobile devices |
US8775955B2 (en) | 2010-12-02 | 2014-07-08 | Sap Ag | Attraction-based data visualization |
US8360308B2 (en) * | 2010-12-30 | 2013-01-29 | Cerner Innovation, Inc. | Protocol driven image acquisition |
US20120254791A1 (en) * | 2011-03-31 | 2012-10-04 | Apple Inc. | Interactive menu elements in a virtual three-dimensional space |
US20120290351A1 (en) | 2011-05-10 | 2012-11-15 | Oracle International Corporation | System for automated sales team identification and/or creation |
US20130007586A1 (en) * | 2011-06-29 | 2013-01-03 | Renjit Tom Thomas | Method and system for creating and using web feed display templates |
US9311426B2 (en) * | 2011-08-04 | 2016-04-12 | Blackberry Limited | Orientation-dependent processing of input files by an electronic device |
US20130154947A1 (en) * | 2011-12-14 | 2013-06-20 | International Business Machines Corporation | Determining a preferred screen orientation based on known hand positions |
US9170780B2 (en) * | 2011-12-15 | 2015-10-27 | Sap Se | Processing changed application metadata based on relevance |
JP5818672B2 (en) * | 2011-12-21 | 2015-11-18 | インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation | Information processing apparatus, display processing method, program, and recording medium |
US20130167110A1 (en) * | 2011-12-27 | 2013-06-27 | René Gross | Modeled user interface controllers |
US9310888B2 (en) | 2012-03-16 | 2016-04-12 | Microsoft Technology Licensing, Llc | Multimodal layout and rendering |
US8601019B1 (en) * | 2012-04-03 | 2013-12-03 | Google Inc. | Presenting autocomplete suggestions |
US9025195B2 (en) * | 2012-04-30 | 2015-05-05 | Hewlett-Packard Indigo B.V. | Work flow and finishing for print production of photograph images |
2012
- 2012-07-05 US US13/541,785 patent/US9310888B2/en active Active

2013
- 2013-01-31 US US13/754,896 patent/US20130246930A1/en not_active Abandoned
- 2013-02-21 US US13/773,630 patent/US20130241852A1/en not_active Abandoned
- 2013-03-14 US US13/827,813 patent/US9645650B2/en active Active
- 2013-03-14 US US13/827,759 patent/US20130246111A1/en not_active Abandoned
Patent Citations (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110275418A1 (en) * | 2007-01-07 | 2011-11-10 | Scott Forstall | Portable Multifunction Device, Method, and Graphical User Interface for Conference Calling |
US20090100321A1 (en) * | 2007-10-12 | 2009-04-16 | Microsoft Corporation | Universal contextual actions menu across windows applications |
US20100162171A1 (en) * | 2008-12-19 | 2010-06-24 | Verizon Data Services Llc | Visual address book and dialer |
US20100318921A1 (en) * | 2009-06-16 | 2010-12-16 | Marc Trachtenberg | Digital easel collaboration system and method |
US20110078184A1 (en) * | 2009-09-28 | 2011-03-31 | Lg Electronics Inc. | Mobile terminal and method of searching a contact in the mobile terminal |
US20110087990A1 (en) * | 2009-10-13 | 2011-04-14 | Research In Motion Limited | User interface for a touchscreen display |
US20110113348A1 (en) * | 2009-11-06 | 2011-05-12 | Cisco Technology, Inc. | Method and apparatus for visualizing and navigating within an immersive collaboration environment |
US20110268418A1 (en) * | 2010-04-30 | 2011-11-03 | American Teleconferencing Services Ltd. | Record and Playback in a Conference |
US20110296312A1 (en) * | 2010-05-26 | 2011-12-01 | Avaya Inc. | User interface for managing communication sessions |
US20110313805A1 (en) * | 2010-06-18 | 2011-12-22 | Microsoft Corporation | Customizable user interface including contact and business management features |
US20120030627A1 (en) * | 2010-07-30 | 2012-02-02 | Nokia Corporation | Execution and display of applications |
US20120144320A1 (en) * | 2010-12-03 | 2012-06-07 | Avaya Inc. | System and method for enhancing video conference breaks |
US20120159355A1 (en) * | 2010-12-15 | 2012-06-21 | Microsoft Corporation | Optimized joint document review |
US20120192090A1 (en) * | 2011-01-25 | 2012-07-26 | Bank Of America Corporation | Single identifiable entry point for accessing contact information via a computer network |
US20130321340A1 (en) * | 2011-02-10 | 2013-12-05 | Samsung Electronics Co., Ltd. | Portable device comprising a touch-screen display, and method for controlling same |
US20130227461A1 (en) * | 2011-08-24 | 2013-08-29 | Salesforce.Com, Inc. | Systems and methods for promoting related lists |
US8244851B1 (en) * | 2011-10-18 | 2012-08-14 | Clintelica AB | Group network connector |
Non-Patent Citations (1)
Title |
---|
"Yelp for Android - App review," available at https://www.youtube.com/watch?v=7vP3w_4mh-s, posted: May 4, 2011; Duration: 2m28s. * |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9310888B2 (en) | 2012-03-16 | 2016-04-12 | Microsoft Technology Licensing, Llc | Multimodal layout and rendering |
US9645650B2 (en) | 2012-03-16 | 2017-05-09 | Microsoft Technology Licensing, Llc | Use of touch and gestures related to tasks and business workflow |
US9203844B2 (en) * | 2013-10-31 | 2015-12-01 | Bank Of America Corporation | Visual representation for permission to contact |
US20150121241A1 (en) * | 2013-10-31 | 2015-04-30 | Bank Of America Corporation | Visual representation for permission to contact |
US10452260B2 (en) * | 2014-11-21 | 2019-10-22 | Nintex Pty Ltd | Managing workflow tasks in touch screen mobile devices |
USD770506S1 (en) * | 2015-07-15 | 2016-11-01 | Microsoft Corporation | Display screen with animated graphical user interface |
USD765708S1 (en) * | 2015-07-27 | 2016-09-06 | Microsoft Corporation | Display screen with animated graphical user interface |
USD768689S1 (en) * | 2015-07-27 | 2016-10-11 | Microsoft Corporation | Display screen with animated graphical user interface |
USD763892S1 (en) * | 2015-07-28 | 2016-08-16 | Microsoft Corporation | Display screen with animated graphical user interface |
USD765709S1 (en) * | 2015-07-28 | 2016-09-06 | Microsoft Corporation | Display screen with animated graphical user interface |
USD763893S1 (en) * | 2015-07-28 | 2016-08-16 | Microsoft Corporation | Display screen with graphical user interface |
US20170090747A1 (en) * | 2015-09-24 | 2017-03-30 | International Business Machines Corporation | Input device interaction |
US10416776B2 (en) * | 2015-09-24 | 2019-09-17 | International Business Machines Corporation | Input device interaction |
US10551937B2 (en) | 2015-09-24 | 2020-02-04 | International Business Machines Corporation | Input device interaction |
USD844657S1 (en) | 2017-11-27 | 2019-04-02 | Microsoft Corporation | Display screen with animated graphical user interface |
USD845982S1 (en) | 2017-11-27 | 2019-04-16 | Microsoft Corporation | Display screen with graphical user interface |
USD845989S1 (en) | 2017-11-27 | 2019-04-16 | Microsoft Corporation | Display screen with transitional graphical user interface |
USD846568S1 (en) | 2017-11-27 | 2019-04-23 | Microsoft Corporation | Display screen with graphical user interface |
Also Published As
Publication number | Publication date |
---|---|
US20130241852A1 (en) | 2013-09-19 |
US9645650B2 (en) | 2017-05-09 |
US20130246913A1 (en) | 2013-09-19 |
US9310888B2 (en) | 2016-04-12 |
US20130241951A1 (en) | 2013-09-19 |
US20130246111A1 (en) | 2013-09-19 |
Similar Documents
Publication | Title |
---|---|
US20130246930A1 (en) | Touch gestures related to interaction with contacts in a business data system |
US20140365952A1 (en) | Navigation and modifying content on a role tailored workspace |
US20140365961A1 (en) | Unified worklist |
US20140365263A1 (en) | Role tailored workspace |
US9589057B2 (en) | Filtering content on a role tailored workspace |
US11416948B2 (en) | Image tagging for capturing information in a transaction |
CN105339957B (en) | Method and system for displaying different views of an entity |
US10026132B2 (en) | Chronological information mapping |
US11734631B2 (en) | Filtering records on a unified display |
US20160261577A1 (en) | Analysis with embedded electronic spreadsheets |
WO2016014322A1 (en) | Taking in-line contextual actions on a unified display |
US20160026943A1 (en) | Unified threaded rendering of activities in a computer system |
WO2015134301A1 (en) | Retrieval of enterprise content that has been presented |
US9804749B2 (en) | Context aware commands |
US20160026953A1 (en) | In-line creation of activities on a unified display |
US20140002377A1 (en) | Manipulating content on a canvas with touch gestures |
US20150248227A1 (en) | Configurable reusable controls |
US20160026373A1 (en) | Actionable steps within a process flow |
US20140365963A1 (en) | Application bar flyouts |
US20160381203A1 (en) | Automatic transformation to generate a phone-based visualization |
US20150301987A1 (en) | Multiple monitor data entry |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | AS | Assignment | Owner name: MICROSOFT CORPORATION, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PAUSHKINA, ANASTASIA;CYREK, TED A.;HEYDEMANN, CHRISTIAN;AND OTHERS;SIGNING DATES FROM 20130107 TO 20130114;REEL/FRAME:029727/0202 |
 | AS | Assignment | Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034544/0541 Effective date: 20141014 |
 | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |