US20140046729A1 - Triggering, conducting, and analyzing an automated survey - Google Patents

Triggering, conducting, and analyzing an automated survey

Info

Publication number
US20140046729A1
Authority
US
United States
Prior art keywords
survey
customer
automated
computer
servicer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/057,396
Inventor
Karl Meyer
Jonathan Turner
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
RXO Last Mile Inc
Original Assignee
3PD Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 3PD Inc
Priority to US14/057,396
Publication of US20140046729A1
Assigned to XPO LAST MILE, INC. Change of name (see document for details). Assignors: 3PD, INC.
Assigned to MORGAN STANLEY SENIOR FUNDING, INC. Patent security agreement. Assignors: CON-WAY FREIGHT, INC., XPO LAST MILE, INC.
Priority to US16/382,257
Assigned to XPO LAST MILE, INC. Partial release of security interest in patents at Reel 037108, Frame 0894. Assignors: MORGAN STANLEY SENIOR FUNDING, INC., as agent
Assigned to XPO LAST MILE, INC. Partial release of security interest in patents at Reel 037108, Frame 0885. Assignors: MORGAN STANLEY SENIOR FUNDING, INC., as agent

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00: Administration; Management
    • G06Q 10/06: Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q 10/063: Operations research, analysis or management
    • G06Q 10/0631: Resource planning, allocation, distributing or scheduling for enterprises or organisations
    • G06Q 10/06311: Scheduling, planning or task assignment for a person or group
    • G06Q 10/08: Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q 10/083: Shipping
    • G06Q 10/0833: Tracking
    • G06Q 10/10: Office automation; Time management
    • G06Q 10/109: Time management, e.g. calendars, reminders, meetings or time accounting
    • G06Q 10/1093: Calendar-based scheduling for persons or groups
    • G06Q 10/1095: Meeting or appointment
    • G06Q 30/00: Commerce
    • G06Q 30/01: Customer relationship services
    • G06Q 30/015: Providing customer assistance, e.g. assisting a customer within a business location or via helpdesk
    • G06Q 30/016: After-sales
    • G06Q 30/02: Marketing; Price estimation or determination; Fundraising
    • G06Q 30/0201: Market modelling; Market analysis; Collecting market data
    • G06Q 30/0203: Market surveys; Market polls
    • G06Q 30/0282: Rating or review of business operators or products
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/123: Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles; Managing passenger vehicles circulating according to a fixed timetable, e.g. buses, trains, trams

Definitions

  • the present disclosure generally relates to surveys, and more particularly relates to survey automation.
  • Businesses often use surveys to obtain feedback from customers. The survey responses can help a business understand the customer's level of satisfaction. Also, a business can use data from surveys to track patterns and trends in customer service. In response, the business can make changes as necessary in areas where improvements can be made. Businesses that can keep operations running smoothly and focused on customer satisfaction may typically have a better chance of long-term success.
  • a computer-implemented method may comprise the steps of conducting, by a processing device, an automated survey on a survey recipient associated with a customer location.
  • the automated survey may be configured to prompt the survey recipient regarding the quality of a service performed for a customer associated with the customer location.
  • the method also includes receiving survey result information from the survey recipient in response to conducting the automated survey and analyzing the survey result information.
  • the method also includes determining if one or more subsequent follow-up actions are warranted based in part on analyzing the survey result information.
  • a survey result analysis system that comprises a processing device associated with a computing system, the processing device configured to execute a survey program, and a memory device in communication with the processing device.
  • the memory device is configured to store the survey program, wherein the survey program is configured to enable the processing device to conduct an automated survey on a survey recipient associated with a customer location.
  • the automated survey for instance, may be configured to prompt the survey recipient regarding the quality of a service performed for a customer associated with the customer location.
  • the survey program also enables the processing device to receive survey result information from the survey recipient in response to conducting the automated survey, perform an analysis of the survey result information, and determine if one or more subsequent follow-up actions are warranted based in part on the analysis of the survey result information.
  • Some implementations may also include another computer-implemented method enabling a processing device to conduct an automated survey configured to prompt a survey recipient regarding the quality of a service performed associated with a customer location.
  • the method also enables the processing device to obtain survey result information in response to conducting the automated survey, to determine if a voice message was received in response to the automated survey, and to present the voice message to a person for listening to the voice message.
  • a survey result analysis system which comprises a processing device and a memory device.
  • the processing device is associated with a computing system and is configured to execute a survey program.
  • the memory device is in communication with the processing device and is configured to store the survey program.
  • the survey program is configured to enable the processing device to retrieve survey result information extracted from an automated survey offered to a survey recipient.
  • the processing device is also configured by the survey program to determine if a voice message was received in response to the automated survey and to present the voice message to a person for listening to the voice message.
  • FIG. 1 is a block diagram illustrating a first embodiment of general business interactions.
  • FIG. 2 is a block diagram illustrating a second embodiment of general business interactions.
  • FIG. 3 is a block diagram illustrating an embodiment of a service group according to various implementations of the present disclosure.
  • FIG. 4 is a block diagram illustrating a survey network system according to various implementations of the present disclosure.
  • FIG. 5 is a block diagram illustrating an embodiment of the automated survey system shown in FIG. 4 , according to various implementations of the present disclosure.
  • FIG. 6 is a block diagram illustrating an embodiment of the survey program shown in FIG. 5 , according to various implementations of the present disclosure.
  • FIG. 7 is a diagram illustrating an embodiment of data segments stored in the database shown in FIG. 5 , according to various implementations of the present disclosure.
  • FIG. 8 is a flow diagram illustrating general operations of a survey system according to various implementations of the present disclosure.
  • FIG. 9 is a flow diagram illustrating an embodiment of a survey method according to various implementations of the present disclosure.
  • FIG. 10 is a block diagram illustrating an embodiment of a method for creating a survey according to various embodiments.
  • FIG. 11 is a screen shot of a user interface for creating an automated survey according to various implementations of the present disclosure.
  • FIG. 12 is a diagram illustrating a sample script for an automated survey according to various implementations of the present disclosure.
  • FIG. 13 is a flow diagram illustrating a method for triggering and conducting a survey according to various implementations of the present disclosure.
  • FIG. 14 is a flow diagram illustrating a method for conducting an automated survey according to various implementations of the present disclosure.
  • FIG. 15 is a flow diagram illustrating a method for handling survey result information according to various implementations of the present disclosure.
  • FIG. 16 is a flow diagram illustrating a method for performing survey follow-up actions according to various implementations of the present disclosure.
  • FIG. 17 is a screen shot of a user interface for enabling access to voice messages according to various implementations of the present disclosure.
  • FIGS. 18A and 18B include combinable parts of a screen shot of a user interface for enabling input of follow-up actions according to various implementations of the present disclosure.
  • FIG. 19 is a screen shot of a user interface for enabling access to survey result information according to various implementations of the present disclosure.
  • FIG. 20 is a screen shot of the user interface of FIG. 19 according to various implementations.
  • FIG. 21 is a screen shot of a user interface for searching and tracking survey responses according to various implementations of the present disclosure.
  • FIGS. 22A and 22B are screen shots of a service issue report according to various implementations of the present disclosure.
  • FIG. 23 is a screen shot of a quality report according to various implementations of the present disclosure.
  • FIG. 24 is a screen shot of a survey result report according to various implementations of the present disclosure.
  • FIG. 25 is a screen shot of a survey response report according to various implementations of the present disclosure.
  • FIG. 26 is a screen shot of a summary quality report according to various implementations of the present disclosure.
  • the present disclosure describes systems and methods for conducting surveys in response to interactions between businesses and customers.
  • Surveys may be created and utilized for obtaining feedback about products sold to customers and/or about services provided for the customers.
  • the survey systems and methods herein may also be configured to be conducted in response to products or other offerings by a company or business.
  • various implementations herein describe many services as being delivery services, but it should be understood that the present disclosure also may include other types of services without departing from the principles described herein.
  • Other features and advantages will be apparent to one of ordinary skill in the art upon consideration of the general principles described herein, and all such features and advantages are intended to be included in the present disclosure.
  • FIG. 1 is a block diagram of a business interaction between a business 10 and a customer 12 .
  • the business 10 may be any company, profit center, or other entity.
  • the business 10 may be a physical store, on-line store, service company, or other entity.
  • the customer 12 may be any individual who is to receive a service or who orders or purchases a product.
  • the business 10 provides goods and/or services directly to the customer 12 .
  • there are several opportunities for the business 10 to display customer service, including, for example, the customer 12 interacting with a salesperson, sales clerk, or cashier; the customer 12 receiving a service, such as a repair, maintenance, improvement, legal service, delivery, or other type of service; or other types of interactions.
  • When a service is to be performed in this arrangement, the business 10 employs internal servicers who provide the service directly to the customer 12 .
  • Various examples of non-limiting services may include a delivery of a purchased product, a plumbing service, tax return preparation, automobile repair, etc.
  • FIG. 2 shows another example of a general business interaction in which the customer 12 pays the business 10 for goods or services, the business 10 provides a service group 14 with information for fulfilling the service, and the service group 14 provides the service to the customer 12 on behalf of the business 10 .
  • the service group 14 includes the service professionals and other people involved in the business of offering one or more services, and is often a separate corporate entity from the business 10 .
  • the service group 14 may be responsible for delivering, building, assembling, installing, maintaining, repairing, improving, testing, demonstrating, removing, and/or other service actions.
  • the business 10 may be considered a client of the service group 14 .
  • the customer 12 may provide the business 10 with personal information, such as name, address, phone numbers, e-mail addresses, etc., which can be used for contacting the customer 12 to provide the intended services or for contacting the customer 12 as needed.
  • Other ordering information may be exchanged or created, including special instructions for delivery, unpacking or assembly requests, and/or installation requests. Orders can usually be taken in any number of ways, including transactions in person, by phone, by mail, by e-mail, by the Internet, or by other ordering methods.
  • the business 10 may provide some of this order information to the service group 14 in order that the service group 14 can perform the service properly.
  • the order information can be provided by an automatic ordering system, by facsimile device, by e-mail, by phone, or in any other manner.
  • the service group 14 may pick up products, as necessary, from the business's store, warehouse, supplier, etc., and deliver the products to one or more customers 12 .
  • the customer 12 may schedule the service directly with the service group 14 .
  • FIG. 3 is a block diagram showing an embodiment of a service group 20 , such as the service group 14 shown in FIG. 2 .
  • the managed services 22 may represent a service company, which may be responsible for the management of internal servicers 24 , who are employed by a client business, and service managers 26 , who may be employed by the managed services 22 company or may be independent contract companies.
  • the managed services 22 may include operators who manage the services for a particular client.
  • servicers 30 may be direct independent contractors to managed services 22 .
  • the managed services 22 may include an automated survey system, which automatically conducts surveys and analyzes the results of the survey. More details of the automated survey systems are described below.
  • the service managers 26 may be field managers, regional managers, or local managers who manage one or more service providers 28 , often in a particular region and/or for a specific client.
  • the service manager may also manage one or more internal servicers 24 .
  • the service providers 28 manage a number of servicers 30 , who may be employed by the service providers 28 or may be independent contractors.
  • the servicer 30 may be the individual or team representing the service group 20 (or service group 14 shown in FIG. 2 ) and who directly interacts with the customer 12 .
  • FIG. 4 is a block diagram of an embodiment of a survey network system 34 according to various implementations of the present disclosure.
  • the survey network system 34 includes an automated survey system 36 (described in more detail below), client systems 38 , service group systems 40 , and customer systems 42 . These and other systems are capable of interacting and communicating via one or more communication networks 44 .
  • the communication networks 44 may include telephone lines, such as land line or public switched telephone network (PSTN) systems, mobile phone channels and systems, communication channels for exchanging data and information, such as a local area network (LAN), wide area network (WAN), the Internet, or other data, communication, or telecommunication networks.
  • the client systems 38 may represent any business, such as the businesses described with respect to FIGS. 1 and 2 .
  • the client systems 38 represent at least a part of a business that is a client of the service group, which utilizes the service group systems 40 .
  • the service group may be responsible for performing one or more services on behalf of the clients.
  • the service group may be the service group 20 described with respect to FIG. 3 or other group of servicers, service providers, service managers, and/or managed services.
  • the automated survey system 36 may be part of the client systems 38 or may be part of the service group systems 40 .
  • the client systems 38 and service group systems 40 may be part of one company or enterprise.
  • the service group systems 40 may include equipment used by the servicers and by field managers.
  • the service group systems 40 may include handheld devices (e.g., devices carried by the servicers), mobile phones, laptop computers, or other devices.
  • the servicer may use any suitable device of the service group systems 40 to notify the automated survey system 36 that the service has been completed.
  • the servicer may call into an integrated voice response (IVR) device (or voice response unit (VRU)) of the automated survey system 36 to input information about the service or completion of the service.
  • Another example may include a telephone call, landline or mobile, to a support agent, who may be associated with the automated survey system 36 and who can manually enter the service information into the automated survey system 36 .
  • completion of the particular service may be communicated by some automated process, such as the automatic detection of a change in the servicer's location using, for example, a global positioning system (GPS) device.
  • the automated survey system 36 waits for a short amount of time (e.g., to allow the customer to reflect upon the service received). After a configurable short delay, e.g., about 10 minutes, the automated survey system 36 launches an automated survey.
  • the survey is conducted over the telephone using an IVR system, which is configured to call the customer's home telephone number using contact information obtained during the order process.
  • the survey may be sent to the customer systems 42 using the PSTN or over other communication networks, such as an e-mail system, chat session, text message system, etc.
  • the customer may delegate another individual to interact with the servicers, such as if the customer wishes for a neighbor to handle the acceptance of the delivered items. In these cases, the survey recipient may be the neighbor, who may be in a better position to rate the delivery service.
  • the automated survey system 36 may include a processing system adapted to conduct the survey when the service is complete.
  • the automated survey system 36 is further configured to analyze the results of the survey to determine if any follow-up actions with the customer are needed. For example, if the customer is dissatisfied with the service received, the customer can leave responses that can be analyzed for follow-up. In some situations, the customer may need an immediate resolution, which the service group or client can provide through follow-up. Feedback may be received in the form of key presses on a touch-tone keypad of a telephone, voice messages left over the telephone, and/or by other communication means.
  • Some follow-up actions may involve a service manager, field manager, or other representative of the service group.
  • the automated survey system 36 organizes the survey results in tables or charts to clearly communicate any issues that the customers may have. For example, if the customer indicates poor service, such as by providing low ratings on the survey or by explaining problems in a voice message, this information can be automatically or manually recorded and then provided directly to the service manager or other responsible person or team of the service group associated with the service group systems 40 . In some cases, survey feedback can be directed to the client systems 38 . In the case where follow-up actions may involve the client, the automated survey system 36 may send an automatic communication to the client systems 38 in order that the client can view the survey result information using a web-enabled browser via the Internet. Both the client and field managers of the service group can access survey result information and/or a digitized version of the voice message as needed to help resolve the customer's issues.
  • FIG. 5 is a block diagram illustrating an embodiment of the automated survey system 36 shown in FIG. 4 , according to various implementations of the present disclosure.
  • the automated survey system 36 includes a processing device 48 and a memory device 50 , which includes at least an order management program 52 , a survey program 54 , and a database 56 .
  • the automated survey system 36 further includes input/output devices 58 and interface devices 60 .
  • the components of the automated survey system 36 are interconnected and may communicate with each other via a computer bus interface 62 or other suitable communication devices.
  • each component of the automated survey system 36 as shown may include multiple components on multiple computer systems of a network.
  • the managed services 22 of the service group may comprise servers, such as application servers, file servers, database servers, web servers, etc., for performing various functions described herein.
  • the servers of the automated survey system 36 may, for example, be physically separate servers or servers in a VMware ESX 4.0 virtual environment, among other implementations.
  • the internal servicers 24 , service managers 26 , service providers 28 , and/or servicers 30 may comprise laptop or desktop computer systems, which may form part of the automated survey system 36 and may be used for accessing the servers as needed.
  • the processing device 48 may be one or more general-purpose or specific-purpose processors or microcontrollers for controlling the operations and functions of the automated survey system 36 .
  • the processing device 48 may include a plurality of processors, computers, servers, or other processing elements for performing different functions within the automated survey system 36 .
  • the memory device 50 may include one or more internally fixed storage units, removable storage units, and/or remotely accessible storage units, each including a tangible storage medium.
  • the various storage units may include any combination of volatile memory and non-volatile memory.
  • volatile memory may comprise random access memory (RAM), dynamic RAM (DRAM), etc.
  • Non-volatile memory may comprise read only memory (ROM), electrically erasable programmable ROM (EEPROM), flash memory, etc.
  • the storage units may be configured to store any combination of information, data, instructions, software code, etc.
  • the order management program 52 , survey program 54 , and database 56 may be stored in one or more memory devices 50 and run on the same or different computer systems and/or servers.
  • the input/output devices 58 may include various input mechanisms and output mechanisms.
  • input mechanisms may include various data entry devices, such as keyboards, keypads, buttons, switches, touch pads, touch screens, cursor control devices, computer mice, stylus-receptive components, voice-activated mechanisms, microphones, cameras, infrared sensors, or other data entry devices.
  • Output mechanisms may include various data output devices, such as computer monitors, display screens, touch screens, audio output devices, speakers, alarms, notification devices, lights, light emitting diodes, liquid crystal displays, printers, or other data output devices.
  • the input/output devices 58 may also include interaction devices configured to receive input and provide output, such as dongles, touch screen devices, and other input/output devices, to enable input and/or output communication.
  • the interface devices 60 may include various devices for interfacing the automated survey system 36 with one or more types of communication systems, such as the communication networks 44 .
  • the interface devices 60 may include devices for communicating the automated survey from the automated survey system 36 to the customer systems 42 .
  • a telephone/voice interface device of the interface devices 60 can be used for controlling an IVR device and accessing a telephone network.
  • interface devices 60 may include various devices for interfacing with a data network, such as the Internet, to enable the communication of data.
  • the interface devices 60 may include Dialogic cards, Dialogic Diva softIP software, Envox, a voice over Internet protocol (VoIP) device, or other hardware or software interface elements.
  • the order management program 52 stored in the memory device 50 includes any suitable instructions for processing a customer order.
  • the order management program 52 may be Dispatch Office or other software for managing orders.
  • the order management program 52 may include the capability of tracking deliveries.
  • the order management program 52 may be omitted from the automated survey system 36 in some embodiments or placed in a separate processing system according to other embodiments.
  • the survey program 54 which is described in more detail below, includes instructions and templates for enabling a user to create an automated survey.
  • the survey program 54 is also configured to detect a trigger event, such as the completion of a delivery service, and then launch the automated survey in response to the trigger.
  • the survey program 54 also may automatically analyze the feedback from the survey recipient and enable a survey monitor person to review voice messages left by the survey recipient and enter notes, a summary, and/or a transcript of the voice message. When the analysis of the survey result information is made, the survey program 54 can determine if follow-up actions are warranted. For example, if a delivered product is damaged, the survey program 54 can communicate with the appropriate person or team that can resolve the issue.
  • the survey program 54 utilizes, as needed, the database 56 , which is configured to store order information, customer information, survey information, and other types of data and information. Other implementations may omit one or more of the functions disclosed herein.
  • FIG. 6 is a block diagram showing an embodiment of the survey program 54 according to various implementations of the present disclosure.
  • the survey program 54 includes a survey assembling module 62 , a survey triggering module 64 , a survey conducting module 66 , an automated survey result analyzing module 68 , a survey result monitoring module 70 , a survey follow-up module 72 , and a survey result reporting module 74 .
  • certain functions described herein may be executed by the module explicitly described or may alternatively be distributed among one or more other modules.
  • the survey assembling module 62 is configured to record a survey script read by a professional speaker.
  • the survey assembling module 62 can record the read script in digitized form in a wav file, vox file, and/or other audio file formats.
  • a file naming convention can be used to help identify the properties of the survey scripts.
  • the file name may include an indication of the client, product, types of services, spoken language, store brand, and/or other information.
  • the survey assembling module 62 enables a user to select different scripts to combine into a complete survey.
  • each script may be a single question, single statement, or other portion of an entire survey.
  • the user may then arrange the selected scripts in a particular order. Also, the user is enabled to enter acceptable answers for each of the survey questions.
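  • As a non-limiting illustration of the assembly and naming conventions described above, the following Python sketch encodes client, brand, language, and topic in a script file name and orders selected script segments together with their acceptable answers. The function and field names (script_filename, ScriptSegment, and so on) are assumptions chosen for illustration and are not prescribed by the disclosure.

```python
from dataclasses import dataclass
from typing import List, Set


def script_filename(client: str, brand: str, language: str, topic: str, ext: str = "wav") -> str:
    """One possible naming convention: client_brand_language_topic.wav."""
    parts = [p.lower().replace(" ", "-") for p in (client, brand, language, topic)]
    return "_".join(parts) + f".{ext}"


@dataclass
class ScriptSegment:
    sequence: int                  # position of this script within the assembled survey
    question_id: str
    audio_file: str
    acceptable_answers: Set[str]   # keypad entries accepted for this question


def assemble_survey(segments: List[ScriptSegment]) -> List[ScriptSegment]:
    """Return the survey as an ordered list of recorded script segments."""
    return sorted(segments, key=lambda s: s.sequence)


# Example usage under the assumed convention:
delivery_question = ScriptSegment(
    sequence=1,
    question_id="Q_DELIVERY_ON_TIME",
    audio_file=script_filename("acme", "home store", "en", "delivery on time"),
    acceptable_answers={"1", "2", "3", "4", "5"},
)
```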
  • the survey triggering module 64 detects when a trigger event occurs that warrants the conducting of a survey.
  • the trigger event may be the completion of a delivery service or other service.
  • the survey triggering module 64 may detect when an order case is closed or when the status of a customer's order has been closed or finished (e.g., when an order has been fulfilled and properly delivered).
  • the survey triggering module 64 may detect the order status using a polling process in which the database 56 is polled. The polling process may be operated on a periodic schedule, e.g., about every 10 minutes.
  • the survey triggering module 64 may create a new survey case to indicate that a survey is to be launched.
  • the survey triggering module 64 may detect when a survey record has been created automatically or manually in the database 56 .
  • the survey triggering module 64 may be configured to receive indications when trigger events occur that warrant the initiation of surveys. For example, when a service is complete, the servicer may use a handheld device that prompts the servicer to provide input when the service job is finished. The handheld device may transmit a wireless signal to the automated survey system 36 via the interface devices 60 and this signal may be forwarded to the survey triggering module 64 . Some embodiments may also include a purchased product (e.g., a mobile phone, smart phone, cable service, etc.) that may be configured to automatically communicate notification of a trigger event (e.g., installation, registration, initiation of phone service, etc.) to the survey triggering module 64 . Other trigger events and other means of communicating a notification of the trigger events to the survey triggering module 64 may be used according to the particular design.
  • the survey triggering module 64 may then set a flag stored in the memory device 50 or provide some other type of indication that the service job is complete (or other trigger event has occurred) and that the status of a new survey case associated with that service job is now opened. In some implementations, the survey triggering module 64 may enter the time that the trigger signal was received in order to allow multiple service jobs to be recorded chronologically according to completion time.
  • the survey triggering module 64 may also be configured to perform a polling process in which the database 56 is polled to determine which entries were recorded over a past predetermined time period. For example, if surveys are to be initiated every ten minutes, the polling process can determine which service jobs were completed in the last ten minutes. The survey triggering module 64 places the polled service jobs in the survey scheduling queue 86 in the order in which the service jobs were completed. The order in which the automated surveys are conducted is based in part on the list in the survey scheduling queue 86 .
  • the survey triggering module 64 may also be configured to wait a predetermined amount of time before triggering the launch of the survey.
  • the reason for the delay is to allow the customer to have time to observe the delivered product and try running it, for example, to determine if there are any defects. Also, the delay permits time for the servicer to leave the vicinity of the customer's residence to allow the customer to provide unbiased responses to the survey questions.
  • the survey triggering module 64 instructs the survey conducting module 66 to launch the survey.
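  • A minimal sketch of the triggering behavior described above, assuming an in-memory list stands in for the database 56 and the survey scheduling queue; the ten-minute polling interval and launch delay are the illustrative figures mentioned in the disclosure, and the record layouts and function names are assumptions.

```python
from dataclasses import dataclass
from typing import List

POLL_INTERVAL_S = 10 * 60   # poll the database roughly every ten minutes (illustrative)
LAUNCH_DELAY_S = 10 * 60    # wait before calling so the customer can inspect the delivery


@dataclass
class ServiceRecord:
    order_id: str
    completed_at: float         # epoch seconds when the servicer closed the job
    queued: bool = False


@dataclass
class SurveyCase:
    order_id: str
    launch_after: float         # earliest time the automated survey may be launched


def poll_for_new_survey_cases(service_records: List[ServiceRecord],
                              queue: List[SurveyCase],
                              now: float) -> None:
    """Find jobs completed since the last poll and enqueue survey cases chronologically."""
    new_jobs = [r for r in service_records
                if not r.queued and now - r.completed_at <= POLL_INTERVAL_S]
    for record in sorted(new_jobs, key=lambda r: r.completed_at):
        record.queued = True
        queue.append(SurveyCase(record.order_id, record.completed_at + LAUNCH_DELAY_S))


def surveys_ready_to_launch(queue: List[SurveyCase], now: float) -> List[SurveyCase]:
    """Survey cases whose launch delay has elapsed, in completion order."""
    return [case for case in queue if case.launch_after <= now]
```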
  • the survey conducting module 66 is configured to retrieve the appropriate survey script for the particular client, brand, product, service, customer, order, or other criteria. Also, the survey conducting module 66 retrieves the customer contact information, such as a home telephone number or mobile phone number. The survey conducting module 66 may be configured to control the IVR device to dial the customer's number and begin playing the survey scripts when the customer answers the phone. In some embodiments, other methods of contacting the customer may be used.
  • the survey conducting module 66 is also configured to capture the touch tone entries from the customer's telephone in response to the survey questions. Customer input can also be captured by the survey conducting module 66 using other input techniques, such as by e-mail, web-based inputs, spoken answers, etc.
  • the survey conducting module 66 also gives the customer an option to leave a voice message, if desired. When a voice message is left, the survey conducting module 66 may also record the message in digital form.
  • the survey conducting module 66 may also be configured to give the customer the option of speaking with a live operator. If the customer wishes to speak with an operator, the survey conducting module 66 may redirect the call to an operator associated with the service group.
  • the survey conducting module 66 may also be configured to give the customer the option to leave a message using text, such as typing a message in an e-mail, typing a message in a text message, typing a message on a smart phone, using a chat session, or other means of leaving a non-voice message.
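  • Because the disclosure does not tie the survey conducting module 66 to any particular telephony API, the sketch below models the call flow against a hypothetical IvrSession interface: dial the customer, play each recorded script, capture touch-tone answers, and optionally record a voice message or transfer to a live operator. All method names and the "press 0 for an operator" convention are assumptions, not features of a real library.

```python
from typing import Dict, List, Optional, Protocol, Set


class IvrSession(Protocol):
    """Hypothetical IVR/telephony session; a real system might wrap dedicated voice hardware."""
    def dial(self, phone_number: str) -> bool: ...
    def play(self, audio_file: str) -> None: ...
    def capture_digit(self, timeout_s: int = 10) -> Optional[str]: ...
    def record_message(self, max_seconds: int = 120) -> Optional[str]: ...  # returns a file path
    def transfer(self, extension: str) -> None: ...


def conduct_survey(ivr: IvrSession,
                   phone_number: str,
                   script_files: List[str],
                   valid_answers: Dict[str, Set[str]]) -> dict:
    """Play each recorded script, capture keypad answers, then offer a voice message."""
    result: dict = {"answers": {}, "voice_message": None, "transferred": False, "status": "started"}
    if not ivr.dial(phone_number):
        result["status"] = "no_answer"
        return result
    for audio_file in script_files:
        ivr.play(audio_file)
        digit = ivr.capture_digit()
        if digit == "0":                                  # assumed option to reach a live operator
            ivr.transfer("operator")
            result["transferred"] = True
            return result
        if digit in valid_answers.get(audio_file, set()):
            result["answers"][audio_file] = digit
    ivr.play("leave_message_prompt.wav")                  # optional voice message at the end
    result["voice_message"] = ivr.record_message()
    result["status"] = "completed"
    return result
```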
  • the survey result information and voice messages can be analyzed to determine the customer's satisfaction with the service received. Some analysis of this information may be done automatically, while other analysis may require human involvement.
  • the automated survey result analyzing module 68 is configured to automatically analyze the feedback from the customer when the survey is completed.
  • the survey may include any number of questions, any of which may require numeric answers, such as answers on a numeric scale from 1 to 5, where 1 represents “completely dissatisfied” and 5 represents “completely satisfied.” Other scales can be used according to the particular design.
  • the automated survey result analyzing module 68 may be configured to calculate a score of the survey recipient's numeric answers.
  • the automated survey result analyzing module 68 may be configured to use the overall score to determine if it is below a threshold that indicates that the customer was generally dissatisfied with the service. With a low average score, such as if the score is below 3.0 on a scale from 1 to 5, the automated survey result analyzing module 68 may set a flag to indicate that follow-up is warranted. Thresholds other than 3.0 may also be used according to the client's wishes or based on other factors. In some embodiments, the automated survey result analyzing module 68 may be configured to automatically send an e-mail or communicate in another manner to the field manager (or others) for follow up. The field manager may then respond by calling the customer to try to resolve any issues.
  • the automated survey result analyzing module 68 may detect if one or more answers indicate the lowest level of satisfaction on the part of the customer. In this case, the automated survey result analyzing module 68 may set the flag indicating the need for follow-up. Also, an automatic e-mail may be sent to the field manager (or others). The automated survey result analyzing module 68 may be configured to analyze the feedback from the survey in any suitable manner to determine if follow-up actions are warranted.
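  • The automated scoring logic lends itself to a short, hedged example. The snippet below assumes a simple average over answers on the 1-to-5 scale, the 3.0 threshold mentioned above, an additional follow-up trigger when any single answer is the lowest rating, and a stand-in notify callback in place of an e-mail to the field manager.

```python
from statistics import mean
from typing import Callable, Sequence

LOWEST_RATING = 1          # "completely dissatisfied" on the 1-to-5 scale
DEFAULT_THRESHOLD = 3.0    # follow-up threshold used as an example in the disclosure


def analyze_survey_answers(answers: Sequence[int],
                           threshold: float = DEFAULT_THRESHOLD,
                           notify: Callable[[str], None] = print) -> bool:
    """Return True (and notify, e.g. a field manager) if the answers warrant follow-up."""
    if not answers:
        return False
    overall = mean(answers)
    # Follow up if the average score is low or any single answer is the lowest rating.
    needs_follow_up = overall < threshold or LOWEST_RATING in answers
    if needs_follow_up:
        notify(f"Follow-up warranted: average score {overall:.2f}, answers {list(answers)}")
    return needs_follow_up


# Example: one "completely dissatisfied" answer triggers follow-up even with a decent average.
print(analyze_survey_answers([5, 4, 1, 5]))   # True
print(analyze_survey_answers([4, 5, 4, 5]))   # False
```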
  • the survey result monitoring module 70 may be a web-based tool that can be accessed by a human operator (e.g., a survey monitor, service manager, field manager, or other authorized personnel of the service group).
  • the survey result monitoring module 70 may provide a user interface enabling the user to access the survey result information, analyzed results from the automated survey result analyzing module 68 , digitized voice messages, and/or other information.
  • the survey result monitoring module 70 may enable the user to access and listen to the voice messages, enter a transcript of the voice message, enter a summary of the voice message, append notes to the survey result information, select one or more predefined classifications of customer issues, and/or select or recommend one or more follow-up actions.
  • the survey result monitoring module 70 can open a follow-up case for the purpose of monitoring the status of follow-up actions taken until the customer issues are resolved.
  • opening cases is understood to include the creation of one or more database records.
  • survey cases and follow-up cases for the same service may be monitored simultaneously.
  • the survey result monitoring module 70 may provide a link or hyperlink to the survey information and/or voice messages.
  • the input received from the user via the user interface can be stored along with the other information of the survey record and/or follow-up record.
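  • One possible shape for the record a survey monitor might create through this user interface is sketched below: a link to the digitized voice message, a transcript or summary, free-form notes, predefined issue classifications, and recommended follow-up actions. The field names and classification values are illustrative assumptions only.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

# Illustrative predefined issue classifications; the disclosure leaves the actual set to the operator.
ISSUE_CLASSES = {"damaged_product", "late_delivery", "servicer_conduct", "billing", "other"}


@dataclass
class MonitorReview:
    survey_case_id: str
    reviewed_by: str
    voice_message_link: Optional[str] = None        # hyperlink to the digitized voice message
    transcript: Optional[str] = None
    summary: Optional[str] = None
    notes: List[str] = field(default_factory=list)
    issue_classes: List[str] = field(default_factory=list)
    recommended_actions: List[str] = field(default_factory=list)
    follow_up_opened_at: Optional[datetime] = None

    def classify(self, issue: str) -> None:
        if issue not in ISSUE_CLASSES:
            raise ValueError(f"unknown classification: {issue}")
        self.issue_classes.append(issue)

    def open_follow_up(self) -> None:
        """Opening a follow-up case is modeled here as time-stamping the review record."""
        self.follow_up_opened_at = datetime.utcnow()
```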
  • the survey follow-up module 72 may be configured to track the follow-up actions that are taken to resolve customer issues.
  • the survey follow-up module 72 may record and organize information related to the status of the follow-up case, such as, for example, the age of the follow-up case from the start of an opened follow-up case to the present.
  • the survey follow-up module 72 enables access to this information and allows the user to use a searching tool associated with the survey follow-up module 72 to search for specific groups of follow-up cases, based on any factors, such as client, age, region, etc.
  • the survey follow-up module 72 is configured to initiate follow-up actions. For example, if the survey feedback contains certain scores or marks that fit the specified criteria for needing follow-up, the survey follow-up module 72 may automatically send an e-mail to the field manager responsible for that servicer or service team. In this way, the field manager is informed that follow-up is needed and is incentivized to act quickly to resolve the issues. Along with the e-mail, the survey follow-up module 72 can also transmit the survey result information and recorded voice messages and/or links to the information and voice messages. In some cases, the issues may require the involvement of the client. Depending on how the client decides to establish follow-up routines, the survey follow-up module 72 may communicate information to the client directly or to both the client and the field manager.
  • the survey follow-up module 72 may be configured to determine the age of a follow-up case and track the progress being made to resolve the issues.
  • the survey follow-up module 72 may be monitored by the survey monitor person to determine if certain issues need to be revisited.
  • the survey follow-up module 72 may enable the transmission or re-transmission of an e-mail as a reminder as necessary to notify the field manager or other responsible party for resolving an older issue.
  • the reminder can be sent automatically by the survey follow-up module 72 based on predetermined conditions.
  • the survey follow-up module 72 may be further configured to calculate incentive payments based in part on survey scores, survey result information, compliments, or other information that is received with respect to the performance by a servicer or service team.
  • the survey follow-up module 72 may calculate bonuses for managers based on survey result numbers. In this respect, the servicers and managers can receive bonus compensation for high quality customer service.
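  • The aging, reminder, and incentive behaviors of the survey follow-up module 72 might be expressed as in the sketch below. The three-day reminder threshold and the bonus formula are purely illustrative assumptions; the disclosure only states that such calculations may be based in part on survey scores and result information.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Callable, List, Optional

REMINDER_AFTER = timedelta(days=3)    # illustrative: remind if a case is still open after 3 days


@dataclass
class FollowUpCase:
    case_id: str
    field_manager_email: str
    opened_at: datetime
    resolved_at: Optional[datetime] = None

    def age(self, now: datetime) -> timedelta:
        return (self.resolved_at or now) - self.opened_at


def send_reminders(cases: List[FollowUpCase],
                   now: datetime,
                   send_email: Callable[[str, str], None]) -> None:
    """Re-notify the responsible field manager for stale, unresolved follow-up cases."""
    for case in cases:
        if case.resolved_at is None and case.age(now) > REMINDER_AFTER:
            send_email(case.field_manager_email,
                       f"Reminder: follow-up case {case.case_id} is still open "
                       f"({case.age(now).days} days old)")


def servicer_bonus(average_scores: List[float], rate_per_survey: float = 5.0) -> float:
    """Hypothetical incentive: a flat bonus for each survey averaging 4.5 or better."""
    return rate_per_survey * sum(1 for score in average_scores if score >= 4.5)
```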
  • the survey result reporting module 74 may be configured to send reports to one or more clients to inform them of the survey result information, types of issues encountered, overall scores, or other information or data.
  • the reports may be sent automatically to the clients based in part on the client's preferences. Some reports may be communicated daily, monthly, quarterly, or for any time period.
  • the survey result reporting module 74 may be configured to communicate with different groups of people who may be responsible for different aspects of a particular service. For example, when the results of surveys indicate defective products from a client, the survey result reporting module 74 may be configured to send a notice to an individual or department about the defective products.
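  • The reporting cadence can be illustrated with a short sketch that filters survey results by a client-chosen window (daily, monthly, quarterly, or any period) and hands a summary to a delivery callback; the data layout and the period-in-days convention are assumptions rather than requirements of the disclosure.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Callable, Dict, List


@dataclass
class SurveyResultSummary:
    client: str
    completed_on: date
    overall_score: float
    issues: List[str]


def build_report(results: List[SurveyResultSummary], client: str, period_days: int) -> dict:
    """Summarize a client's results over the chosen period (e.g. 1 for daily, 30 for monthly)."""
    cutoff = date.today() - timedelta(days=period_days)
    window = [r for r in results if r.client == client and r.completed_on >= cutoff]
    average = sum(r.overall_score for r in window) / len(window) if window else None
    return {
        "client": client,
        "surveys": len(window),
        "average_score": average,
        "issues": sorted({issue for r in window for issue in r.issues}),
    }


def send_scheduled_reports(results: List[SurveyResultSummary],
                           client_periods: Dict[str, int],           # client -> period in days
                           deliver: Callable[[str, dict], None]) -> None:
    """Send each client a report at the cadence it prefers."""
    for client, period_days in client_periods.items():
        deliver(client, build_report(results, client, period_days))
```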
  • the survey program 54 of the present disclosure may be implemented in hardware, software, firmware, or any combinations thereof.
  • the survey program 54 may be implemented in software or firmware that is stored on a memory device and that is executable by a suitable instruction execution system.
  • the survey program 54 may be implemented as one or more computer programs stored on different memory devices or different computer systems of a network. If implemented in hardware, the survey program 54 may be implemented using discrete logic circuitry, an application specific integrated circuit (ASIC), a programmable gate array (PGA), a field programmable gate array (FPGA), or any combinations thereof.
  • FIG. 7 is a diagram showing an embodiment of the database 56 shown in FIG. 5 .
  • the database 56 may contain various information and data. As illustrated, the database 56 may include order information 78 , customer information 80 , service information 82 , survey scripts 84 , a survey scheduling queue 86 , survey result information 88 , voice messages 90 , and survey follow-up action information 91 , and may further include other types of data.
  • the service information 82 may be related to any type of service, such as a delivery service, installation service, repair service, or other services.
  • the voice messages 90 may instead be stored in a separate file system associated with the memory device 50 .
  • the order information 78 may include the store name, product purchases, type of services to be provided, date and time of order, etc.
  • the customer information 80 may include the customer's name, mailing address, billing address, delivery address, telephone and mobile phone numbers, e-mail addresses, preferred means of contact, etc.
  • the service information 82 (e.g., when related to a delivery service) may include the product ordered, shipping identification information of the product, the delivery driver, the carrier, the servicer, the promised delivery time, the actual arrival time, status of delivery, etc.
  • the survey scripts 84 may include digitized voice scripts of portions of one or more surveys, complete surveys, or other survey information.
  • the survey scheduling queue 86 is a queue for recording the time when survey cases are opened, the sequence of surveys to be conducted, etc.
  • the survey result information 88 may include the results, feedback, responses, etc., provided by the customer during the survey.
  • the survey result information 88 may also include result of the analysis by the automated survey result analyzing module 68 , such as overall scores.
  • the voice messages 90 may include digitized voice messages recorded during the survey.
  • the voice messages 90 may be stored as files (e.g., on a separate file server) that may be accessed by hyperlinks via the network.
  • the survey follow-up action information 91 may include a record of a classification of customer issues that warrant follow-up actions in addition to a record of follow-up actions to be taken to resolve the customer issues.
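  • The data segments of the database 56 suggest a simple illustrative schema, sketched below with in-memory dataclasses; field names and types are distilled from the descriptions above and are assumptions rather than a prescribed layout (the survey scripts 84 and scheduling queue 86 are omitted for brevity).

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional


@dataclass
class OrderInfo:              # order information 78
    store_name: str
    products: List[str]
    service_types: List[str]
    ordered_at: datetime


@dataclass
class CustomerInfo:           # customer information 80
    name: str
    delivery_address: str
    phone: str
    email: Optional[str] = None
    preferred_contact: str = "phone"


@dataclass
class ServiceInfo:            # service information 82 (delivery example)
    servicer: str
    carrier: str
    promised_time: datetime
    actual_arrival: Optional[datetime] = None
    status: str = "open"


@dataclass
class SurveyResultInfo:       # survey result information 88
    answers: dict
    overall_score: Optional[float] = None
    voice_message_path: Optional[str] = None   # voice messages 90 may live on a separate file server


@dataclass
class FollowUpActionInfo:     # survey follow-up action information 91
    classification: str
    action: str
    resolved: bool = False
```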
  • FIG. 8 is a flow diagram illustrating an overview of the automated survey process according to various implementations.
  • Customer Order 92 represents the process when the customer orders a product or service from the client.
  • the client collects contact information associated with the customer during the ordering process. This contact information can be used for contacting the customer in order to run the survey.
  • Service Interaction 94 is the process when a service of any kind is performed for the customer.
  • the service may be a delivery of goods or packages, building and/or installing a product, maintenance, repair, improvement, communication with a service manager or customer service representative, a product registration process, or other services.
  • the client or service group may conduct a survey to collect information about the customer's satisfaction with the service. The collected information can be used to help the service group improve the quality of their services.
  • a survey may be triggered. This is indicated by block 96 .
  • One way in which the survey is triggered may include a servicer calling into an IVR device indicating that the job is complete or closed.
  • Another way of triggering a survey may include the servicer using a handheld device to close the job and the handheld device being configured to send a trigger signal to the automated survey system 36 .
  • Another way may include the servicer calling a support center to close the job using a landline telephone or mobile phone.
  • the closed status may be detected in the database by a program that creates a survey call record that initiates the deployment of the survey.
  • an Automated Survey may be conducted.
  • the survey may be conducted automatically via a phone call to the customer using an IVR device, e-mail, chat, or other means of communication.
  • the automated survey may include pre-recorded questions and may respond to the answers captured by a numeric keypad, an alphanumeric keyboard, touch screen device, or other data entry device on the customer's telephone, mobile phone, computer, or other device.
  • Responses may be received via telephone, in a return e-mail or chat session, or by other digital entry device.
  • Responses to survey questions may also be in the form of voice messages received via telephone, VoIP, or other voice recording device or system.
  • the customer may be given the option to wait for live customer care if desired.
  • an option may be given to allow the customer to enter a message other than a voice message, such as, for example, a text message, e-mail message, or other textual based message.
  • the survey may be started within about ten minutes of the trigger event and completed within about two minutes.
  • the automated survey system is configured to analyze the results. This analysis can be done automatically by the processing system and/or manually by a survey monitor person.
  • the automated analysis may include analysis of the customer data, product data, survey responses, and/or other information.
  • the survey responses may be collected using finite answers, such as an answer 1, 2, 3, 4, or 5 for a ranking in response to a specific survey question.
  • the survey response may include a voice message, which can be manually analyzed and entered according to certain defined classifications.
  • in some cases, the results of a survey do not require follow-up with the customer, and these survey cases can be closed.
  • the customer may enter certain responses or leave a voice message that prompts the automated survey system to begin a follow-up process to resolve any issues that the customer may have.
  • when the answers are analyzed, either automatically or manually, the issues may be identified.
  • a follow-up process is opened to ensure that the issues are treated sensitively.
  • the follow-up may include inquiries to gather additional information from the customer, if needed. Countermeasures may be followed as needed to resolve the issues.
  • Follow-up actions may be acted upon internally within the service group or, if necessary, reported to client management and/or client teams. Information from the analysis and the follow-up may be collected and reported to internal teams for future use, such as performance management, improving processes, services and products, tracking costs and issues, billing, etc. Reports may include hyperlinks to voice messages for easy access and review.
  • FIG. 9 is a flow diagram of an embodiment of a method for executing a service case, survey case, and follow-up case.
  • a service case is opened.
  • order information may be received from the client or business.
  • the order information may be related to the specific service order and the customer's personal information.
  • the servicer performs the service for the customer.
  • the service case is closed.
  • the closing of the service case causes the opening of a survey case.
  • the completion of the service job triggers the initiation of the survey case.
  • the survey case includes the conducting of an automated survey.
  • after responses are received from the survey recipient, the survey case is closed.
  • a follow-up case is opened to determine if follow-up to the survey is needed. Any issues fed back by the customer are analyzed to determine if follow-up actions are needed. If so, the appropriate people are contacted in order to resolve the issues. When the issues are resolved, the follow-up case is closed.
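  • The case lifecycle of FIG. 9 (service case, then survey case, then follow-up case) can be summarized as a small state machine; the sketch below is one hedged reading of that flow, with the transition triggers taken from the description above.

```python
from enum import Enum, auto


class CaseStage(Enum):
    SERVICE_OPEN = auto()      # service case opened with the order information
    SURVEY_OPEN = auto()       # service complete; automated survey case opened
    FOLLOW_UP_OPEN = auto()    # responses received; follow-up case opened when issues exist
    CLOSED = auto()


def next_stage(stage: CaseStage, needs_follow_up: bool = False) -> CaseStage:
    """Advance a case through the FIG. 9 lifecycle."""
    if stage is CaseStage.SERVICE_OPEN:
        return CaseStage.SURVEY_OPEN           # closing the service case opens the survey case
    if stage is CaseStage.SURVEY_OPEN:
        return CaseStage.FOLLOW_UP_OPEN if needs_follow_up else CaseStage.CLOSED
    return CaseStage.CLOSED                    # follow-up cases close once the issues are resolved
```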
  • FIG. 10 is a flow diagram illustrating an embodiment of a method for creating a survey.
  • the method includes digitally recording voice scripts as indicated in block 106 .
  • each voice script may be one or more survey questions and/or one or more statements or sentences.
  • file names for the voice scripts are established. This process may include automatically naming the files based on the spoken language, store, store brand, product information, or other information.
  • Block 110 includes enabling a user to select one or more voice scripts from the recorded scripts that may be used to form a completed survey. The user may be enabled to add and/or delete scripts.
  • certain scripts may be automatically selected depending on client preferences, based on a bill code associated with a client brand (if the client has multiple brands), based on order criteria, or based on other factors.
  • a service order in one particular example may include a delivery and assembly, and hence automatic selection of both delivery-related questions and assembly-related questions can be made.
  • the method further includes enabling a user to arrange the scripts in a particular order, as desired, to form a certain logical sequence of scripts for the survey, as suggested in block 112 .
  • the user is enabled to enter the answers from the survey recipient that are acceptable for the particular survey questions.
  • FIG. 11 is a screen shot of a user interface 118 for creating an automated survey according to various implementations of the present disclosure.
  • the user interface 118 includes, among other things, a sequence column 120 that displays a sequence of survey scripts that form the entire survey and enables the user to change the sequence as needed.
  • a question ID column 122 identifies the respective survey scripts (i.e., questions and/or statements).
  • a question description column 124 includes a description of the respective survey script.
  • An answer options column 126 enables the user to enter the acceptable feedback responses, based in part on the questions being asked.
  • Column 128 enables the user to select which answers to the respective questions are to be shown on a web-enabled user interface that reports the survey result information to the appropriate individuals responsible for handling customer issues.
  • the user interface 118 also includes an add button 132 , enabling the user to add a selected question or statement to the survey.
  • a delete button 134 enables the user to delete one or more questions, and a save button 136 enables the user to save the survey when it is complete.
  • the user interface 118 may also include a “sample playback” button allowing the user to listen to how the created survey might sound.
  • FIG. 12 is a diagram illustrating an example of a completed survey 140 according to various implementations.
  • the survey 140 in this example includes an introduction, survey instructions, a list of questions, and a statement giving the survey recipient an opportunity to leave a voice message. It should be understood that the wording of sentences and questions, the sequence and types of questions asked, and other aspects of the survey can be modified to meet the particular client's needs.
  • the survey 140 can be formed using preset elements. The survey 140 can be read and recorded, and then accessed for playback during the survey. Elements to allow time for answers to be entered by the survey recipient can be added as needed.
  • FIG. 13 is a flow diagram illustrating an embodiment of a method for triggering and conducting a survey according to various implementations.
  • the method includes receiving notification of the occurrence of a trigger event associated with a service record in accordance with block 144 .
  • the trigger event may be the completion of the designated service.
  • the method includes changing the status of the service record to closed.
  • the survey record is then created, as indicated in block 148 , and is placed in a survey scheduling queue, as indicated in block 150 .
  • in decision block 152, it is determined whether or not a periodic time for performing a polling function has arrived.
  • the polling function may be configured to operate every 10 or 15 minutes. If the proper time has not yet arrived, the flow path loops back to itself and block 152 is repeated until the time arrives.
  • the database is polled to detect new survey records, as indicated in block 154 .
  • Block 156 indicates that the method includes conducting an automated survey. The order that the automated surveys are launched may be based in part on the sequence of survey records in the survey scheduling queue. The process of conducting the automated survey is described in more detail below.
  • survey result information is received. The survey result information may be choices entered by the survey recipient, voice messages, or other useful data.
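  • The queue-and-poll behavior of blocks 148 through 156 might be sketched in Python as follows; the database accessor, the conduct_survey callback, and the ten-minute period are assumptions used only for illustration.

    import time
    from collections import deque

    POLL_INTERVAL_SECONDS = 10 * 60   # e.g., a 10-minute polling period, as noted above

    def scheduling_loop(database, conduct_survey):
        survey_queue = deque()
        while True:
            # Poll the database for newly created survey records (block 154).
            for record in database.fetch_new_survey_records():   # hypothetical accessor
                survey_queue.append(record)
            # Launch automated surveys in the order they were queued (block 156).
            while survey_queue:
                conduct_survey(survey_queue.popleft())
            time.sleep(POLL_INTERVAL_SECONDS)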
  • FIG. 14 is a flow diagram illustrating an embodiment of a method for conducting an automated survey according to various implementations.
  • the automated survey conducting method includes determining, according to decision block 162, whether or not a new survey record has been found. If not, the flow path returns to block 162 until one is found.
  • an automated survey is prepared, as indicated in block 164 .
  • the preparing of the survey may include, for example, accessing scripts and questions, accessing contact information, or other functions for forming an appropriate survey.
  • the information gathered together to prepare the survey may include field manager case information, client order information, client product information, a library of survey scripts and questions, and other suitable information.
  • in decision block 166, it is determined whether or not the survey recipient is on a do-not-call list. If so, the method skips ahead to block 168, which indicates that the survey case is closed with a status of “no contact made—DNC.” If the survey recipient is not on the do-not-call list, the method flows to block 170, which indicates that an attempt is made to contact the survey recipient. According to decision block 172, it is determined whether or not contact is made with the survey recipient. If not, then the flow proceeds to decision block 184. If contact is made, the flow proceeds to block 174, which indicates that the automated survey is launched and responses by the survey recipient are captured.
  • the survey recipient is given the option to speak with a live operator. If it is determined in decision block 176 that the survey recipient requests to speak to someone live, then the flow branches to block 178 . As indicated in block 178 , the survey recipient is connected with an operator, such as a customer service agent, for the completion of the survey. When the live survey is completed, the survey analysis status is set to “ready” as indicated in block 180 . If in block 176 it is determined that the survey recipient does not wish to talk with a live operator, the flow proceeds to decision block 182 . According to block 182 , it is determined whether or not the survey was completed successfully. If so, the flow proceeds to block 180 to set the survey analysis status to “ready.” If the survey did not complete successfully, as determined in block 182 , flow proceeds to decision block 184 .
  • Block 184 is reached when the survey recipient could not be contacted (decision block 172) or when the survey was not completed successfully (decision block 182). At this point, it is determined whether or not the number of contact attempts is equal to a predetermined threshold. If the number of contact attempts is determined to be equal to the threshold, flow proceeds from block 184 to block 186 and the survey is closed with the status of “no contact made.” If not, then the method goes to block 188, in which the survey is rescheduled for another attempt, and the flow then proceeds back to block 170.
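  • A rough Python rendering of the contact logic in blocks 162 through 188 is shown below; the dialer interface, the record attributes, and the three-attempt limit are illustrative assumptions rather than required elements.

    MAX_CONTACT_ATTEMPTS = 3   # illustrative predetermined threshold

    def conduct_automated_survey(record, do_not_call_list, dialer):
        if record.phone_number in do_not_call_list:
            record.status = "no contact made - DNC"        # block 168
            return record
        attempts = 0
        while attempts < MAX_CONTACT_ATTEMPTS:
            attempts += 1
            call = dialer.call(record.phone_number)        # block 170 (assumed API)
            if not call.connected:
                continue                                   # retry, up to the threshold (block 188)
            if call.recipient_requested_operator:
                dialer.transfer_to_operator(call)          # complete the survey live (block 178)
                record.analysis_status = "ready"           # block 180
                return record
            if call.survey_completed:
                record.analysis_status = "ready"           # block 180
                return record
            # Contact was made but the survey did not complete; retry if attempts remain.
        record.status = "no contact made"                  # block 186
        return record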
  • FIG. 15 is a flow diagram illustrating a method for handling survey result information according to various implementations of the present disclosure.
  • the method includes receiving survey result information from an automated survey, as indicated in block 192 .
  • the method includes analyzing the survey result information (e.g., averaging the survey result information) to obtain a survey score. It is determined, according to decision block 196 , whether the analysis reveals that follow-up actions are warranted or not, such as by automatically comparing an average score to a defined threshold. If so, a flag is set to open a follow-up case as indicated in block 198 . However, if no follow-up is warranted based on the analysis, flow proceeds from block 196 to decision block 200 .
  • In block 200, it is determined whether or not a voice message was received. If so, the voice message is made available for access by a survey monitor person according to block 202. Also, input may be received from the survey monitor person, as indicated in block 204. The input received from the survey monitor person may include a summary of the voice message, selection of one or more customer issues from a list, selection of one or more follow-up actions from a list, a flag set to open a follow-up case, and/or other inputs. The flag to open the follow-up case may be set in response to the content and interpretation of the voice message.
  • Block 206 indicates that the survey results are made available for reporting to various individuals, teams, departments, or others and for tracking the progress of the follow-up actions.
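  • The automatic scoring and follow-up decision of blocks 192 through 204 might look like the following Python sketch; the 3.0 cutoff and the dictionary layout are example choices, not prescribed values.

    from statistics import mean

    FOLLOW_UP_THRESHOLD = 3.0   # example cutoff on a five-point scale

    def analyze_survey_results(numeric_answers, voice_message_received):
        # Average the numeric responses to obtain a survey score (block 194).
        score = mean(numeric_answers) if numeric_answers else None
        # Flag a follow-up case when the average falls below the threshold (block 198).
        open_follow_up = score is not None and score < FOLLOW_UP_THRESHOLD
        return {
            "score": score,
            "open_follow_up": open_follow_up,
            # A received voice message is routed to a survey monitor person (block 202).
            "voice_message_review": voice_message_received,
        }

    # Example: analyze_survey_results([5, 4, 2, 1], voice_message_received=True) returns a
    # score of 3.0, no automatic follow-up flag, and a voice message queued for review.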
  • FIG. 16 is a flow diagram illustrating an embodiment of a method for performing survey follow-up actions according to various implementations of the present disclosure.
  • in decision block 210, it is determined whether or not a follow-up flag has been set. If not, which indicates that no follow-up is needed, then the flow of the method skips to block 212 and the follow-up case is closed. If a flag is set, the flow proceeds to block 214, which indicates that the survey result information and survey scores are received.
  • in decision block 216, it is determined whether the survey result information meets certain criteria for sending an auto-notification to the client. The client may request to receive automatic notification based on any suitable conditions or criteria associated with the survey result information.
  • regarding block 216, for example, if the client requests to receive notification of compliments and one or more compliments are recorded in the survey results, then the criteria in this case are met. If it is determined in block 216 that the criteria are met, an auto-notification of the survey details is sent to the client, as described in block 218. In some embodiments, block 218 may be omitted if the client chooses not to receive auto-notifications.
  • decision block 220 indicates that a determination is made whether the survey score warrants one or more follow-up actions. If not, then the flow skips to block 212 and the follow-up case is closed. However, if follow-up is warranted, the method flows on to decision block 222 , which determines whether involvement by a field manager is needed. If so, the survey result information (which may include any of the survey answers, survey scores, and voice messages) is made available to the field manager, according to block 224 . When the survey result information is received, the field manager may be enabled to add or edit follow-up information, as indicated in block 225 . For example, the field manager may log any follow-up actions taken to resolve the issues.
  • the field manager may also set classifications of issues and set follow-up actions that were not previously recorded.
  • the field manager may also be enabled to mark when the follow-up case is closed, e.g., when all the issues have been resolved.
  • the method also includes checking if client involvement is needed, as indicated in decision block 226 . If so, the flow is directed to block 228 and the survey result information is made available to the client.
  • the client is enabled to add and/or edit follow-up information. In some embodiments, the client's name may be logged during the modification process.
  • the types of follow-up information that can be modified in this method may be different for the field manager, client, and others who may be given access to the information and authority to change the information, depending on the particular design.
  • the information made available to the client may be different than that made available to the field manager, depending on the particular design.
  • the field managers and clients, when given the information, may be responsible for contacting the customer, service group members, or others by any available communication devices in order to help resolve the issues.
  • Decision block 230 indicates that it is determined whether or not any issues remain. This determination may be made by the field manager, who may set a flag, mark an item on a checklist, enter a summary, or other operation that may be detectable by the survey program 54 . These indications can be analyzed to determine that the issues are resolved. If no issues remain, the flow goes to block 212 and the follow-up case is closed. If issues still remain, the flow loops back to block 220 to repeat follow-up actions until the issues can be resolved.
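  • One possible Python outline of the follow-up handling in blocks 210 through 230 follows; the notification callback, the case attributes, and the helper methods are assumptions made for illustration.

    def run_follow_up_case(case, client_prefs, notify_client):
        # No flag set: close the follow-up case immediately (block 212).
        if not case.follow_up_flag:
            case.status = "closed"
            return case
        # Auto-notify the client when requested and the survey result
        # information meets the client's criteria (blocks 216-218).
        if client_prefs.get("auto_notify") and case.meets_notification_criteria:
            notify_client(case)
        # Route the survey result information to the field manager and/or
        # client and repeat until no issues remain (blocks 220-230).
        while case.score_warrants_follow_up and case.issues_remaining:
            if case.needs_field_manager:
                case.route_to_field_manager()   # hypothetical helper
            if case.needs_client:
                case.route_to_client()          # hypothetical helper
            case.refresh_issue_status()
        case.status = "closed"
        return case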
  • each block may represent a module, segment, portion of code, etc., which comprises one or more executable instructions for performing the specified logical functions. It should be noted that the functions described with respect to the blocks may occur in a different order than shown. For example, two or more blocks may be executed substantially concurrently, in a reverse order, or in any other sequence depending on the particular functionality involved.
  • the survey program 54, which comprises an ordered listing of executable instructions for implementing logical functions, may be embodied in any computer-readable medium for use by any combination of instruction execution systems or devices, such as computer-based systems, processor-controlled systems, etc.
  • the computer-readable medium may include one or more suitable physical media components configured to store the software, programs, or computer code for a measurable length of time.
  • the computer-readable medium may be any medium configured to contain, store, communicate, propagate, or transport programs for execution by the instruction execution systems or devices.
  • FIG. 17 is a screen shot of a user interface 231 for enabling access to voice messages according to various embodiments.
  • User interface 231 lists the survey responses that include a voice message that needs to be verified. More specifically, verifying a voice message may include screening, filtering, sorting, searching, or other actions taken with respect to the voice message.
  • Section 233 of the user interface 231 includes information about the profit center (business), job identification numbers, customer, and the time and date when each respective survey was completed.
  • Column 234 shows whether the respective survey feedback included a low overall score, such as one below a minimum threshold, representing poor quality service.
  • Column 235 includes a link to the different voice messages. If the user wishes to hear the message, the user may click on the “Listen” link to retrieve the voice message file. If the voice message warrants follow-up actions, the user can select either yes or no in the response required column 236 .
  • the details of the surveys can be retrieved by the user by clicking on the respective “Details” link.
  • FIGS. 18A and 18B are parts of a screen shot of a user interface 238 for enabling a user to enter follow-up actions to be taken, according to various implementations of the present disclosure.
  • the user interface 238 may be opened, for example, by clicking on the detail link in column 237 shown in FIG. 17 . Also, the user interface 238 may be opened when the user clicks on the “listen” link in column 235 .
  • in section 240 of the user interface 238, information about the order, profit center, servicer, customer, etc. is displayed. Within section 240 is a link 241 that enables a user to access a voice message, if one is left.
  • information about the survey questions is displayed.
  • Section 244 of the user interface 238 enables the user to check certain listed items to define the customer's issues and categorize them into classification categories.
  • the list of issues included in section 244 may be customized for the client based on the client's needs, based on the particular service provided, based on the particular product being delivered, or based on any other factors.
  • Some non-limiting examples of customer issue items listed in section 244 may include a scheduling issue, an incorrect phone number, an issue with the contract carrier, a delivery fee issue, a schedule notification issue, poor service at the store, a damaged product, the product missing items, the wrong product delivered, the wrong address, a store or client issue, a voice message compliment, or any other service issues.
  • the selection of at least one of the classification items can be required before a case is closed. By listening to the voice message, the user may be able to determine the classification of issues described audibly.
  • Section 246 includes a list of possible ways to resolve the issues marked in section 244. This list may also be customized for the particular client depending on various factors. Some non-limiting examples of resolution items listed in section 246 may include the issuing of a gift card to the customer, passing the information on to the store or client, leaving a voice message for the client, recording a voicemail summary, or other ways of reaching resolution. Other items may also include closure of the follow-up case based on a failure to contact the customer, a representative speaking with the customer to resolve an issue, addressing the issue with the delivery team, or the customer misunderstanding the survey.
  • the user interface 238 enables the user to check the appropriate boxes of section 246 as needed. The user interface 238 may display certain additional information fields depending on the selections made in section 246 .
  • the user interface 238 may prompt the user to enter the name of the person to whom the survey result information is passed.
  • the user interface 238 may prompt the user to enter the monetary amount of the gift card to be issued.
  • if a voice message is left, the user may listen to the message by clicking on the link 241 and then may enter a summary of the voice message in window 248.
  • the window 248 can also be used to record steps that were taken by different people of the service group to resolve issues or any other notes that may be necessary for understanding the issues of the case.
  • the summaries entered in window 248 are displayed in section 250 when inserted by the user.
  • the actions selected in section 246 are also automatically displayed adjacent to section 250. If the follow-up case is to be closed, the user may check the box 252.
  • FIG. 19 is a screen shot of an embodiment of a user interface 256 for enabling access to survey result information.
  • the user interface 256 may be created automatically when the survey recipient leaves a voice message.
  • the user interface 256 displays a table 258 having details of the order, store, carrier, driver, customer, customer contact information, promised delivery time window, actual arrival time, etc.
  • the user interface 256 also includes a table 260 displaying the survey questions, response options, and the answers provided.
  • Table 260 also displays a calculation of the average score.
  • Window 262 shows a summary of the voice message left by the survey recipient and textual entries made by the service team monitoring the status of the survey and follow-up.
  • the user interface 256 also includes a link 257 allowing the user to respond to the survey recipient. Also, the user interface 256 includes a link 264 , which allows the user to listen to a recording of the voice message left by the survey recipient.
  • the voice message may be stored in any suitable file format, such as a .wav file, a .vox file, etc.
  • FIG. 20 is a screen shot of an embodiment of a user interface 266 .
  • the user interface 266 may be created automatically when the overall score 267 displayed in a survey result section is below an acceptable threshold. For example, if the survey questions are based on a five-point scale with “5” representing complete satisfaction and “1” representing complete dissatisfaction, then a threshold of about 3.0 (or any other suitable number) may be set. Therefore, an overall score below 3.0 (in this case) may initiate the generation of the user interface 266 .
  • the user interface 266 may also include a delivery notes section 268 and an order history section 269 .
  • the delivery notes section 268 may include notes that were recorded when the customer placed an order. As an example, the delivery notes 268 may be useful for the completion of certain services.
  • the order history section 269 may include a history of the order case, survey case, and/or follow-up case of a service order. Information in the order history section 269 may be entered manually and/or automatically.
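  • The threshold comparison that generates the user interface 266 can be expressed very simply, as in the Python sketch below; the 3.0 value mirrors the example given above and is not a required setting.

    LOW_SCORE_THRESHOLD = 3.0   # example threshold on the five-point scale described above

    def should_generate_low_score_view(overall_score, threshold=LOW_SCORE_THRESHOLD):
        # The low-score user interface is generated only when the overall
        # survey score falls below the acceptable threshold.
        return overall_score < threshold

    # Example: an overall score of 2.4 triggers the interface; a score of 4.2 does not.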
  • FIG. 21 is a screen shot of an embodiment of a user interface 270 for enabling a search of survey responses.
  • the user interface 270 may be made available to each of the service managers and other personnel responsible for monitoring the orders, surveys, and follow-up cases for a service company.
  • the user interface 270 allows the user to search for follow-up cases and view the details of the follow-up cases. If box 272 is checked, only the follow-up cases that are still open (or pending) are searched.
  • the user can select one or more profit centers (or business segments) depending on the need. Also, the user can select the option to search all the profit centers of the service company. Fields 276 and 278 allow the user to enter the timeframe in which the search is made.
  • when the search button 280 is selected, the user interface 270 is configured to search the database for follow-up cases that match the search criteria and display the results in table 282.
  • the table 282 includes rows of different entries arranged with columns for the profit center, the customer receiving the service (“ship to”), the job number, the time and date the follow-up case was opened (“reported at”), the deadline, the age of the follow-up case, whether a low score was received in the survey, whether a voice message link is available, the number of responses, whether the follow-up case has been closed, and a details link linking to the details of the survey.
  • the table 282 may list the follow-up cases in a sequence from the oldest case to the newest, ordered according to the age column.
  • the age column may work with a suitable clock or timing device to update the age of opened cases every six minutes (0.1 hours). The age may be used by the service team to give priority to older issues.
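  • The age tracking and oldest-first ordering described for table 282 could be sketched as follows; the function names are hypothetical, and the rounding mirrors the 0.1-hour (six-minute) resolution noted above.

    from datetime import datetime

    def case_age_hours(opened_at, now, resolution_hours=0.1):
        # Age of an open follow-up case, rounded to the 0.1-hour resolution.
        hours = (now - opened_at).total_seconds() / 3600.0
        return round(hours / resolution_hours) * resolution_hours

    def prioritize_open_cases(cases, now):
        # Oldest cases first, so the service team can give them priority.
        return sorted(cases, key=lambda case: case_age_hours(case["opened_at"], now), reverse=True)

    # Example: a case opened at 8:00 is reported as 2.5 hours old at 10:31.
    print(case_age_hours(datetime(2010, 3, 11, 8, 0), datetime(2010, 3, 11, 10, 31)))  # 2.5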
  • FIGS. 22A and 22B are screen shots of a service issue report 286 according to various implementations of the present disclosure.
  • the service issue report 286 may be communicated to the client or store to report issues regarding the order or service that need attention by the store. The clients may be given the option to receive such a report at different stages of the follow-up or when certain situations occur.
  • the service issue report 286 includes an information table 288 , a survey result table 290 , a voice message link 292 , and a voice message response table 294 shown in FIG. 22B .
  • the information table 288 includes information about the order, service, customer, etc.
  • the survey result table 290 includes respective responses to the survey questions.
  • the voice message response table 294 shows the voice message summary and a summary of follow-up calls (e.g., by JBROWN in this example) to the customer to resolve the issues.
  • FIG. 23 is a diagram of an embodiment of a quality report 300 .
  • the quality report 300 may be communicated to the client (e.g., “Acme” in this example).
  • the quality report 300 includes the client's survey scores broken up among the different regions (e.g., starting in this example with the New York region). Also, the quality report 300 divides each region down to the individual servicers. With this report, the client can obtain useful information about the overall success of the delivery teams, the success of teams within each region, and success of individual servicers.
  • FIG. 24 is a diagram of a survey feedback report 304 according to various implementations of the present disclosure.
  • the survey feedback report 304 may include a table 306 showing the daily survey results, a table 308 showing the month-to-date survey results, and a table 310 showing a response classification matrix.
  • the survey result reporting module 74 ( FIG. 6 ) may be configured to send the survey feedback report 304 to the service managers and other teams of the service group. The report may be transmitted with an e-mail or may be accessed using a hyperlink.
  • the survey feedback report 304 may be sent on a periodic basis to keep the managers and teams up to date. For example, it may be sent on a daily basis, issued on the morning following the day of service being reported.
  • the next-day information can be useful for training or coaching purposes, such as for use by a manager to coach service teams to practice proper technique and behavior that may better please the customers. In this way, service teams can be given immediate feedback based on the previous day's survey responses.
  • the daily survey result table 306 may include numbers broken out by region.
  • the columns of the daily survey result table 306 include the number of service orders (e.g., deliveries), the number of surveys completed, the percentages of customers completing the survey, and an average score goal.
  • the daily survey result table 306 also may include the particular questions of the survey, such as whether the customer would desire to have the delivery team back, the appearance of the delivery team, on-time success, call ahead success, whether the delivery team properly tested and demonstrated the product, the professional courtesy of the team, and the overall average score.
  • the month-to-date results table 308 may include the same column divisions as the daily report but for the longer time period from the first of the month to the present.
  • the response classification matrix table 310 includes columns for each of a number of specific customer issues.
  • the table 310 may include scheduling issues, incorrect phone number issues, contract carrier issues, notification of delivery time issues, damaged product issues, address issues, store/client issues, etc.
  • the survey result reporting module 74 may generate one or more reports describing the issuance of gift cards.
  • the follow-up action of issuing a gift card may be initiated by a user selecting the item labeled “Issued Gift Card” in the follow-up actions section 246 of the user interface 238 ( FIG. 18 ).
  • Gift cards may be issued when service mistakes, mishaps, or other problems occur.
  • the gift cards may be used to reimburse, compensate, or in some way appease the customer for the service problems. Since the service group is representing the client, the issued gift cards may be valid only at the client's stores, for example, or in other embodiments may be valid at any stores.
  • the service managers may analyze the events surrounding the service problems and determine if a particular servicer is at fault or responsible. If it is determined that a particular servicer is responsible for the service problem, such as for arriving outside the promised time window, failing to complete the service, and/or other problems, then the automated survey system 36 may be configured to automatically subtract the gift card amount from the servicer's pay. In this respect, the amount that the managed services 22 pays for gift card issuance is charged back to the servicer. However, if the problem is not caused by the servicer but is caused by other operators or systems, then the servicer is not held responsible.
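  • As a simple illustration of the chargeback rule just described, the following Python sketch applies the deduction only when a servicer is found responsible; the function name and the example amounts are hypothetical.

    def apply_gift_card_chargeback(servicer_pay, gift_card_amount, servicer_at_fault):
        # The fault determination is made by a service manager after reviewing
        # the events; this helper only applies the resulting pay adjustment.
        if servicer_at_fault:
            return servicer_pay - gift_card_amount
        return servicer_pay

    # Example: a $25 gift card issued for a servicer-caused late arrival reduces a $500
    # settlement to $475; if another operator or system caused the problem, pay is unchanged.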
  • FIG. 25 is a diagram of an embodiment of a survey response report 314 .
  • the survey response report 314 includes tables for follow-up cases that remain open and those that are closed.
  • the survey response report 314 also includes for each case the customer, classifications of issues, overall score on the survey, age of the case, summary notes, and access to more details of the case.
  • FIG. 26 is a diagram of an embodiment of a summary quality report 318 according to various implementations.
  • the summary quality report 318 includes a first column 320 , which includes each of the stores of a client, divided regionally.
  • a second column 322 includes the overall average survey scores for the respective stores, and a third set of columns 324 includes the average scores and number of surveys completed for each of the questions asked on the surveys.
  • the survey result reporting module 74 may be configured to distinguish between the average scores that meet a particular goal and those that do not meet the goal. For example, a first score 326 may be displayed in one manner (e.g., black) while another score 328 may be displayed in a different manner (e.g., red).
  • One benefit might be the ability to provide rapid follow-up actions to customers who have issues. In some cases, it may be possible to resolve the customer's issue within two hours, which is a desirable service goal for a service company. By responding to issues quickly, the overall customer satisfaction level of a service group can be kept high.
  • Another advantage might be the aspect of performing the survey using an automated system as opposed to a survey conducted by a live operator.
  • An automated system may allow the survey recipient to answer more truthfully and may also lead to a high survey participation rate. For example, many surveys have a participation rate of under 10%. However, with the automated survey system 36 described herein, a participation rate of about 40% or higher can be achieved. In this respect, the service company can obtain a larger sample of data that may better define the satisfaction level of the customers. Also, by conducting the survey at an advantageous time, which is controlled by the automated survey system 36, customers are more likely to take the survey and to answer more accurately, because the service experience might still be fresh in their minds.
  • conditional language such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more particular embodiments or that one or more particular embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.


Abstract

Systems and methods for analyzing results of an automated survey are provided. In one implementation, a computer-implemented method comprises conducting, by a processing device, an automated survey on a survey recipient associated with a customer location. The automated survey is configured to prompt the survey recipient regarding the quality of a service performed for a customer associated with the customer location. The method also includes receiving survey result information from the survey recipient in response to conducting the automated survey. The method further includes analyzing the survey result information and determining if one or more subsequent follow-up actions are warranted based in part on analyzing the survey result information.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 12/722,455, filed Mar. 11, 2010, which claimed the benefit of U.S. Provisional Application No. 61/266,599, filed Dec. 4, 2009, the entire disclosures of which are hereby incorporated by reference herein.
  • TECHNICAL FIELD
  • The present disclosure generally relates to surveys, and more particularly relates to survey automation.
  • BACKGROUND
  • Businesses often use surveys to obtain feedback from customers. The survey responses can help a business understand the customer's level of satisfaction. Also, a business can use data from surveys to track patterns and trends in customer service. In response, the business can make changes as necessary in areas where improvements can be made. Businesses that can keep operations running smoothly and focused on customer satisfaction may typically have a better chance of long-term success.
  • SUMMARY
  • The present disclosure describes various systems and methods for analyzing results of an automated survey. According to some embodiments, a computer-implemented method may comprise the steps of conducting, by a processing device, an automated survey on a survey recipient associated with a customer location. For example, the automated survey may be configured to prompt the survey recipient regarding the quality of a service performed for a customer associated with the customer location. The method also includes receiving survey result information from the survey recipient in response to conducting the automated survey and analyzing the survey result information. The method also includes determining if one or more subsequent follow-up actions are warranted based in part on analyzing the survey result information.
  • According to some implementations, a survey result analysis system is provided that comprises a processing device associated with a computing system, the processing device being configured to execute a survey program, and a memory device in communication with the processing device. The memory device is configured to store the survey program, wherein the survey program is configured to enable the processing device to conduct an automated survey on a survey recipient associated with a customer location. The automated survey, for instance, may be configured to prompt the survey recipient regarding the quality of a service performed for a customer associated with the customer location. The survey program also enables the processing device to receive survey result information from the survey recipient in response to conducting the automated survey, perform an analysis of the survey result information, and determine if one or more subsequent follow-up actions are warranted based in part on the analysis of the survey result information.
  • Some implementations may also include another computer-implemented method enabling a processing device to conduct an automated survey configured to prompt a survey recipient regarding the quality of a service performed associated with a customer location. The method also enables the processing device to obtain survey result information in response to conducting the automated survey, to determine if a voice message was received in response to the automated survey, and to present the voice message to a person for listening to the voice message.
  • Another implementation of the present disclosure includes a survey result analysis system, which comprises a processing device and a memory device. The processing device is associated with a computing system and is configured to execute a survey program. The memory device is in communication with the processing device and is configured to store the survey program. The survey program is configured to enable the processing device to retrieve survey result information extracted from an automated survey offered to a survey recipient. The processing device is also configured by the survey program to determine if a voice message was received in response to the automated survey and to present the voice message to a person for listening to the voice message.
  • Various implementations described in the present disclosure may include additional systems, methods, features, and advantages, which may not necessarily be expressly disclosed herein but will be apparent to one of ordinary skill in the art upon examination of the following detailed description and accompanying drawings. It is intended that all such systems, methods, features, and advantages be included within the present disclosure and protected by the accompanying claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The features and components of the following figures are illustrated to emphasize the general principles of the present disclosure. Corresponding features and components throughout the figures may be designated by matching reference characters for the sake of consistency and clarity.
  • FIG. 1 is a block diagram illustrating a first embodiment of general business interactions.
  • FIG. 2 is a block diagram illustrating a second embodiment of general business interactions.
  • FIG. 3 is a block diagram illustrating an embodiment of a service group according to various implementations of the present disclosure.
  • FIG. 4 is a block diagram illustrating a survey network system according to various implementations of the present disclosure.
  • FIG. 5 is a block diagram illustrating an embodiment of the automated survey system shown in FIG. 4, according to various implementations of the present disclosure.
  • FIG. 6 is a block diagram illustrating an embodiment of the survey program shown in FIG. 5, according to various implementations of the present disclosure.
  • FIG. 7 is a diagram illustrating an embodiment of data segments stored in the database shown in FIG. 5, according to various implementations of the present disclosure.
  • FIG. 8 is a flow diagram illustrating general operations of a survey system according to various implementations of the present disclosure.
  • FIG. 9 is a flow diagram illustrating an embodiment of a survey method according to various implementations of the present disclosure.
  • FIG. 10 is a block diagram illustrating an embodiment of a method for creating a survey according to various embodiments.
  • FIG. 11 is a screen shot of a user interface for creating an automated survey according to various implementations of the present disclosure.
  • FIG. 12 is a diagram illustrating a sample script for an automated survey according to various implementations of the present disclosure.
  • FIG. 13 is a flow diagram illustrating a method for triggering and conducting a survey according to various implementations of the present disclosure.
  • FIG. 14 is a flow diagram illustrating a method for conducting an automated survey according to various implementations of the present disclosure.
  • FIG. 15 is a flow diagram illustrating a method for handling survey result information according to various implementations of the present disclosure.
  • FIG. 16 is a flow diagram illustrating a method for performing survey follow-up actions according to various implementations of the present disclosure.
  • FIG. 17 is a screen shot of a user interface for enabling access to voice messages according to various implementations of the present disclosure.
  • FIGS. 18A and 18B include combinable parts of a screen shot of a user interface for enabling input of follow-up actions according to various implementations of the present disclosure.
  • FIG. 19 is a screen shot of a user interface for enabling access to survey result information according to various implementations of the present disclosure.
  • FIG. 20 is a screen shot of the user interface of FIG. 19 according to various implementations.
  • FIG. 21 is a screen shot of a user interface for searching and tracking survey responses according to various implementations of the present disclosure.
  • FIGS. 22A and 22B are screen shots of a service issue report according to various implementations of the present disclosure.
  • FIG. 23 is a screen shot of a quality report according to various implementations of the present disclosure.
  • FIG. 24 is a screen shot of a survey result report according to various implementations of the present disclosure.
  • FIG. 25 is a screen shot of a survey response report according to various implementations of the present disclosure.
  • FIG. 26 is a screen shot of a summary quality report according to various implementations of the present disclosure.
  • DETAILED DESCRIPTION
  • The present disclosure describes systems and methods for conducting surveys in response to interactions between businesses and customers. Surveys may be created and utilized for obtaining feedback about products sold to customers and/or about services provided for the customers. Although various implementations of the present disclosure are described with respect to surveys conducted in response to a service, the survey systems and methods herein may also be configured to be conducted in response to products or other offerings by a company or business. In addition, various implementations herein describe many services as being delivery services, but it should be understood that the present disclosure also may include other types of services without departing from the principles described herein. Other features and advantages will be apparent to one of ordinary skill in the art upon consideration of the general principles described herein, and all such features and advantages are intended to be included in the present disclosure.
  • FIG. 1 is a block diagram of a business interaction between a business 10 and a customer 12. The business 10 may be any company, profit center, or other entity. The business 10 may be a physical store, on-line store, service company, or other entity. The customer 12 may be any individual who is to receive a service or who orders or purchases a product. In such an interaction as illustrated in FIG. 1, the business 10 provides goods and/or services directly to the customer 12. During this interaction, there are several opportunities for the business 10 to display customer service, including, for example, the customer 12 interacting with a salesperson, sales clerk, or cashier, the customer 12 receiving a service, such as a repair, maintenance, improvement, legal service, delivery or other type of service, or other types of interactions. When a service is to be performed in this arrangement, the business 10 employs internal servicers who provide the service directly to the customer 12. Various examples of non-limiting services may include a delivery of a purchased product, a plumbing service, tax return preparation, automobile repair, etc.
  • FIG. 2 shows another example of a general business interaction in which the customer 12 pays the business 10 for goods or services, the business 10 provides a service group 14 with information for fulfilling the service, and the service group 14 provides the service to the customer 12 on behalf of the business 10. The service group 14 includes the service professionals and other people involved in the business of offering one or more services, and is often a separate corporate entity from the business 10. For example, the service group 14 may be responsible for delivering, building, assembling, installing, maintaining, repairing, improving, testing, demonstrating, removing, and/or other service actions. In the arrangement of FIG. 2, the business 10 may be considered a client of the service group 14.
  • According to various implementations, the customer 12 may provide the business 10 with personal information, such as name, address, phone numbers, e-mail addresses, etc., which can be used for contacting the customer 12 to provide the intended services or for contacting the customer 12 as needed. Other ordering information may be exchanged or created, including special instructions for delivery, unpacking or assembly requests, and/or installation requests. Orders can usually be taken in any number of ways, including transactions in person, by phone, by mail, by e-mail, by the Internet, or by other ordering methods. The business 10 may provide some of this order information to the service group 14 in order that the service group 14 can perform the service properly. The order information can be provided by an automatic ordering system, by facsimile device, by e-mail, by phone, or in any other manner. The service group 14 may pick up products, as necessary, from the business's store, warehouse, supplier, etc., and deliver the products to one or more customers 12. In some embodiments, the customer 12 may schedule the service directly with the service group 14.
  • FIG. 3 is a block diagram showing an embodiment of a service group 20, such as the service group 14 shown in FIG. 2. In this implementation, managed services 22 may represent a service company, which may be responsible for the management of internal servicers 24, who are employed by a client business, and service managers 26, who may be employed by the managed services 22 company or may be independent contract companies. In some cases, the managed services 22 may include operators who manage the services for a particular client. In other implementations, servicers 30 may be direct independent contractors to managed services 22. According to various implementations of the present disclosure, the managed services 22 may include an automated survey system, which automatically conducts surveys and analyzes the results of the survey. More details of the automated survey systems are described below.
  • The service managers 26 may be field managers, regional managers, or local managers who manage one or more service providers 28, often in a particular region and/or for a specific client. The service manager may also manage one or more internal servicers 24. The service providers 28 manage a number of servicers 30, who may be employed by the service providers 28 or may be independent contractors. The servicer 30 may be the individual or team representing the service group 20 (or service group 14 shown in FIG. 2) and who directly interacts with the customer 12.
  • FIG. 4 is a block diagram of an embodiment of a survey network system 34 according to various implementations of the present disclosure. The survey network system 34 includes an automated survey system 36 (described in more detail below), client systems 38, service group systems 40, and customer systems 42. These and other systems are capable of interacting and communicating via one or more communication networks 44. The communication networks 44 may include telephone lines, such as land line or public switched telephone network (PSTN) systems, mobile phone channels and systems, communication channels for exchanging data and information, such as a local area network (LAN), wide area network (WAN), the Internet, or other data, communication, or telecommunication networks.
  • The client systems 38 may represent any business, such as the businesses described with respect to FIGS. 1 and 2. In the environment of the survey network system 34 of FIG. 4, the client systems 38 represent at least a part of a business that is a client of the service group, which utilizes the service group systems 40. The service group may be responsible for performing one or more services on behalf of the clients. The service group may be the service group 20 described with respect to FIG. 3 or other group of servicers, service providers, service managers, and/or managed services. In some embodiments, the automated survey system 36 may be part of the client systems 38 or may be part of the service group systems 40. As suggested in FIG. 1, the client systems 38 and service group systems 40 may be part of one company or enterprise.
  • According to various embodiments of FIG. 4, the service group systems 40 may include equipment used by the servicers and by field managers. For example, the service group systems 40 may include handheld devices (e.g., devices carried by the servicers), mobile phones, laptop computers, or other devices. When the servicer completes a service, the servicer may use any suitable device of the service group systems 40 to notify the automated survey system 36 that the service has been completed. For example, the servicer may call into an interactive voice response (IVR) device (or voice response unit (VRU)) of the automated survey system 36 to input information about the service or completion of the service. Another example may include a telephone call, landline or mobile, to a support agent, who may be associated with the automated survey system 36 and who can manually enter the service information into the automated survey system 36. In some implementations, completion of the particular service may be communicated by some automated process, such as the automatic detection of a change in the servicer's location using, for example, a global positioning system (GPS) device.
  • After notification of service completion has been received, the automated survey system 36 waits for a short amount of time (e.g., to allow the customer to reflect upon the service received). After a configurable short delay, e.g., about 10 minutes, the automated survey system 36 launches an automated survey. In some implementations, the survey is conducted over the telephone using an IVR system, which is configured to call the customer's home telephone number using contact information obtained during the order process. The survey may be sent to the customer systems 42 using the PSTN or over other communication networks, such as an e-mail system, chat session, text message system, etc. In some cases, the customer may delegate another individual to interact with the servicers, such as if the customer wishes for a neighbor to handle the acceptance of the delivered items. In these cases, the survey recipient may be the neighbor, who may be in a better position to rate the delivery service.
  • In some implementations, the automated survey system 36 may include a processing system adapted to conduct the survey when the service is complete. The automated survey system 36 is further configured to analyze the results of the survey to determine if any follow-up actions with the customer are needed. For example, if the customer is dissatisfied with the service received, the customer can leave responses that can be analyzed for follow-up. In some situations, the customer may need immediate resolution, for which the service group or client can provide follow-up. Feedback may be received in the form of key strokes on a touch tone key pad of a telephone, voice messages left over the telephone, and/or by other communication means.
  • Some follow-up actions may involve a service manager, field manager, or other representative of the service group. The automated survey system 36 organizes the survey results in tables or charts to clearly communicate any issues that the customers may have. For example, if the customer indicates poor service, such as by providing low ratings on the survey or by explaining problems in a voice message, this information can be automatically or manually recorded and then provided directly to the service manager or other responsible person or team of the service group associated with the service group systems 40. In some cases, survey feedback can be directed to the client systems 38. In the case where follow-up actions may involve the client, the automated survey system 36 may send an automatic communication to the client systems 38 in order that the client can view the survey result information using a web-enabled browser via the Internet. Both the client and field managers of the service group can access survey result information and/or a digitized version of the voice message as needed to help resolve the customer's issues.
  • FIG. 5 is a block diagram illustrating an embodiment of the automated survey system 36 shown in FIG. 4, according to various implementations of the present disclosure. As shown in this embodiment, the automated survey system 36 includes a processing device 48 and a memory device 50, which includes at least an order management program 52, a survey program 54, and a database 56. The automated survey system 36 further includes input/output devices 58 and interface devices 60. The components of the automated survey system 36 are interconnected and may communicate with each other via a computer bus interface 62 or other suitable communication devices.
  • In some embodiments, each component of the automated survey system 36 as shown may include multiple components on multiple computer systems of a network. For example, the managed services 22 of the service group may comprise servers, such as application servers, file servers, database servers, web servers, etc., for performing various functions described herein. The servers of the automated survey system 36 may, for example, be physically separate servers or servers in a VMware ESX 4.0 virtual environment, among other implementations. In addition, the internal servicers 24, service managers 26, service providers 28, and/or servicers 30 may comprise laptop or desktop computer systems, which may form part of the automated survey system 36 and may be used for accessing the servers as needed.
  • The processing device 48 may be one or more general-purpose or specific-purpose processors or microcontrollers for controlling the operations and functions of the automated survey system 36. In some implementations, the processing device 48 may include a plurality of processors, computers, servers, or other processing elements for performing different functions within the automated survey system 36.
  • The memory device 50 may include one or more internally fixed storage units, removable storage units, and/or remotely accessible storage units, each including a tangible storage medium. The various storage units may include any combination of volatile memory and non-volatile memory. For example, volatile memory may comprise random access memory (RAM), dynamic RAM (DRAM), etc. Non-volatile memory may comprise read only memory (ROM), electrically erasable programmable ROM (EEPROM), flash memory, etc. The storage units may be configured to store any combination of information, data, instructions, software code, etc. The order management program 52, survey program 54, and database 56 may be stored in one or more memory devices 50 and run on the same or different computer systems and/or servers.
  • The input/output devices 58 may include various input mechanisms and output mechanisms. For example, input mechanisms may include various data entry devices, such as keyboards, keypads, buttons, switches, touch pads, touch screens, cursor control devices, computer mice, stylus-receptive components, voice-activated mechanisms, microphones, cameras, infrared sensors, or other data entry devices. Output mechanisms may include various data output devices, such as computer monitors, display screens, touch screens, audio output devices, speakers, alarms, notification devices, lights, light emitting diodes, liquid crystal displays, printers, or other data output devices. The input/output devices 58 may also include interaction devices configured to receive input and provide output, such as dongles, touch screen devices, and other input/output devices, to enable input and/or output communication.
  • The interface devices 60 may include various devices for interfacing the automated survey system 36 with one or more types of communication systems, such as the communication networks 44. The interface devices 60 may include devices for communicating the automated survey from the automated survey system 36 to the customer systems 42. For example, when the survey is communicated via telephone, a telephone/voice interface device of the interface devices 60 can be used for controlling an IVR device and accessing a telephone network. Also, interface devices 60 may include various devices for interfacing with a data network, such as the Internet, to enable the communication of data. In some examples, the interface devices 60 may include Dialogic cards, Dialogic Diva softIP software, Envox, a voice over Internet protocol (VoIP) device, or other hardware or software interface elements.
  • The order management program 52 stored in the memory device 50 includes any suitable instructions for processing a customer order. For example, the order management program 52 may be Dispatch Office or other software for managing orders. In some implementations, the order management program 52 may include the capability of tracking deliveries. The order management program 52 may be omitted from the automated survey system 36 in some embodiments or placed in a separate processing system according to other embodiments.
  • The survey program 54, which is described in more detail below, includes instructions and templates for enabling a user to create an automated survey. The survey program 54 is also configured to detect a trigger event, such as the completion of a delivery service, and then launch the automated survey in response to the trigger. The survey program 54 also may automatically analyze the feedback from the survey recipient and enable a survey monitor person to review voice messages left by the survey recipient and enter notes, a summary, and/or a transcript of the voice message. When the analysis of the survey result information is made, the survey program 54 can determine if follow-up actions are warranted. For example, if a delivered product is damaged, the survey program 54 can communicate with the appropriate person or team that can resolve the issue. The survey program 54 utilizes, as needed, the database 56, which is configured to store order information, customer information, survey information, and other types of data and information. Other implementations may omit one or more of the functions disclosed herein.
  • FIG. 6 is a block diagram showing an embodiment of the survey program 54 according to various implementations of the present disclosure. As illustrated in FIG. 6, according to some embodiments, the survey program 54 includes a survey assembling module 62, a survey triggering module 64, a survey conducting module 66, an automated survey result analyzing module 68, a survey result monitoring module 70, a survey follow-up module 72, and a survey result reporting module 74. In some implementations, certain functions described herein may be executed by the module explicitly described or may alternatively be executed by one or more modules.
• The survey assembling module 62 is configured to record a survey script read by a professional speaker. The survey assembling module 62 can record the read script in digitized form in a .wav file, .vox file, and/or other audio file format. A file naming convention can be used to help identify the properties of the survey scripts. For example, the file name may include an indication of the client, product, types of services, spoken language, store brand, and/or other information. When the scripts are recorded, the survey assembling module 62 enables a user to select different scripts to combine into a complete survey. In this respect, each script may be a single question, single statement, or other portion of an entire survey. The user may then arrange the selected scripts in a particular order. Also, the user is enabled to enter acceptable answers for each of the survey questions.
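• As an editorial illustration (not part of the original disclosure), the following minimal Python sketch shows one way the file naming convention and script assembly described above might be modeled; the class, property, and file names are hypothetical assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class SurveyScript:
    """One recorded script (a single question or statement) stored as an audio file."""
    client: str
    product: str
    service_type: str
    language: str
    brand: str
    question_id: str
    acceptable_answers: list = field(default_factory=list)

    @property
    def file_name(self) -> str:
        # Hypothetical naming convention: script properties joined into the audio file name.
        parts = [self.client, self.brand, self.product, self.service_type,
                 self.language, self.question_id]
        return "_".join(parts) + ".wav"

def assemble_survey(selected_scripts: list) -> list:
    """Arrange the user-selected scripts into an ordered, playable survey definition."""
    return [
        {"sequence": i + 1,
         "audio_file": script.file_name,
         "acceptable_answers": script.acceptable_answers}
        for i, script in enumerate(selected_scripts)
    ]

if __name__ == "__main__":
    q1 = SurveyScript("acme", "appliance", "delivery", "en", "storebrand",
                      "Q_ON_TIME", acceptable_answers=["1", "2", "3", "4", "5"])
    q2 = SurveyScript("acme", "appliance", "delivery", "en", "storebrand",
                      "Q_COURTESY", acceptable_answers=["1", "2", "3", "4", "5"])
    for entry in assemble_survey([q1, q2]):
        print(entry)
```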
  • The survey triggering module 64 detects when a trigger event occurs that warrants the conducting of a survey. For example, the trigger event may be the completion of a delivery service or other service. In some embodiments, the survey triggering module 64 may detect when an order case is closed or when the status of a customer's order has been closed or finished (e.g., when an order has been fulfilled and properly delivered). The survey triggering module 64 may detect the order status using a polling process in which the database 56 is polled. The polling process may be operated on a periodic schedule, e.g., about every 10 minutes. When the order case is detected as being closed, the survey triggering module 64 may create a new survey case to indicate that a survey is to be launched. According to some embodiments, the survey triggering module 64 may detect when a survey record has been created automatically or manually in the database 56.
  • In some embodiments, the survey triggering module 64 may be configured to receive indications when trigger events occur that warrant the initiation of surveys. For example, when a service is complete, the servicer may use a handheld device that prompts the servicer to provide input when the service job is finished. The handheld device may transmit a wireless signal to the automated survey system 36 via the interface devices 60 and this signal may be forwarded to the survey triggering module 64. Some embodiments may also include a purchased product (e.g., a mobile phone, smart phone, cable service, etc.) that may be configured to automatically communicate notification of a trigger event (e.g., installation, registration, initiation of phone service, etc.) to the survey triggering module 64. Other trigger events and other means of communicating a notification of the trigger events to the survey triggering module 64 may be used according to the particular design.
  • When the survey triggering module 64 determines that an authentic trigger event has occurred, the survey triggering module 64 may then set a flag stored in the memory device 50 or provide some other type of indication that the service job is complete (or other trigger event has occurred) and that the status of a new survey case associated with that service job is now opened. In some implementations, the survey triggering module 64 may enter the time that the trigger signal was received in order to allow multiple service jobs to be recorded chronologically according to completion time.
• The survey triggering module 64 may also be configured to perform a polling process in which the database 56 is polled to determine which entries were recorded over a past predetermined time period. For example, if surveys are to be initiated every ten minutes, the polling process can determine which service jobs were completed in the last ten minutes. The survey triggering module 64 places the polled service jobs in the survey scheduling queue 86 in the order in which the service jobs were completed. The order in which the automated surveys are conducted is based in part on the list in the survey scheduling queue 86.
  • The survey triggering module 64 may also be configured to wait a predetermined amount of time before triggering the launch of the survey. The reason for the delay is to allow the customer to have time to observe the delivered product and try running it, for example, to determine if there are any defects. Also, the delay permits time for the servicer to leave the vicinity of the customer's residence to allow the customer to provide unbiased responses to the survey questions. When the predetermined lag time has elapsed, the survey triggering module 64 instructs the survey conducting module 66 to launch the survey.
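• A minimal Python sketch, assuming in-memory lists in place of the database 56 and hypothetical field names, of the polling, chronological queueing, and lag time described for the survey triggering module 64:

```python
from collections import deque
from datetime import datetime, timedelta

POLL_INTERVAL = timedelta(minutes=10)   # assumed polling schedule ("about every 10 minutes")
LAG_TIME = timedelta(minutes=30)        # assumed delay before launching the survey

# In-memory stand-ins for database 56; a real implementation would query the database.
closed_orders = []      # each entry: {"order_id": ..., "closed_at": datetime}
survey_queue = deque()  # survey scheduling queue, ordered chronologically by completion time

def poll_for_closed_orders(since: datetime) -> list:
    """Return order records closed since the last poll, oldest first."""
    return sorted((o for o in closed_orders if o["closed_at"] >= since),
                  key=lambda o: o["closed_at"])

def enqueue_new_survey_cases(now: datetime) -> None:
    """Open a survey case for each newly closed order and schedule it after the lag time."""
    for order in poll_for_closed_orders(now - POLL_INTERVAL):
        survey_queue.append({"order_id": order["order_id"],
                             "launch_at": order["closed_at"] + LAG_TIME,
                             "status": "open"})

def due_survey_cases(now: datetime) -> list:
    """Survey cases whose lag time has elapsed and which may now be launched."""
    return [case for case in survey_queue if case["launch_at"] <= now]
```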
  • In response to a trigger to launch, the survey conducting module 66 is configured to retrieve the appropriate survey script for the particular client, brand, product, service, customer, order, or other criteria. Also, the survey conducting module 66 retrieves the customer contact information, such as a home telephone number or mobile phone number. The survey conducting module 66 may be configured to control the IVR device to dial the customer's number and begin playing the survey scripts when the customer answers the phone. In some embodiments, other methods of contacting the customer may be used.
  • The survey conducting module 66 is also configured to capture the touch tone entries from the customer's telephone in response to the survey questions. Customer input can also be captured by the survey conducting module 66 using other input techniques, such as by e-mail, web-based inputs, spoken answers, etc. The survey conducting module 66 also gives the customer an option to leave a voice message, if desired. When a voice message is left, the survey conducting module 66 may also record the message in digital form. In some embodiments, the survey conducting module 66 may also be configured to give the customer the option of speaking with a live operator. If the customer wishes to speak with an operator, the survey conducting module 66 may redirect the call to an operator associated with the service group. The survey conducting module 66 may also be configured to give the customer the option to leave a message using text, such as typing a message in an e-mail, typing a message in a text message, typing a message on a smart phone, using a chat session, or other means of leaving a non-voice message.
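• A minimal Python sketch of the conducting flow, using a hypothetical IVRSession stand-in for the telephony/IVR hardware; the "0" key for requesting a live operator and the prompt file name are assumptions, not details from the disclosure:

```python
class IVRSession:
    """Hypothetical stand-in for an IVR/telephony interface device."""
    def play(self, audio_file: str) -> None: ...
    def read_keypress(self) -> str: ...          # returns a DTMF digit, or "" on timeout
    def record_voice_message(self) -> bytes: ...
    def transfer_to_operator(self) -> None: ...

def conduct_survey(session: IVRSession, survey: list) -> dict:
    """Play the ordered scripts and capture the recipient's touch-tone responses."""
    responses = {}
    for item in survey:
        session.play(item["audio_file"])
        answer = session.read_keypress()
        if answer == "0":                        # assumed key for "speak with a live operator"
            session.transfer_to_operator()
            return {"responses": responses, "transferred": True}
        if answer in item["acceptable_answers"]:
            responses[item["audio_file"]] = answer
    session.play("leave_message_prompt.wav")     # option to leave a voice message
    voice_message = session.record_voice_message()
    return {"responses": responses, "voice_message": voice_message, "transferred": False}
```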
  • When the survey is finished, the survey result information and voice messages can be analyzed to determine the customer's satisfaction with the service received. Some analysis of this information may be done automatically, while other analysis may require human involvement.
• The automated survey result analyzing module 68 is configured to automatically analyze the feedback from the customer when the survey is completed. For example, the survey may include any number of questions, any of which may require numeric answers, such as answers on a numeric scale from 1 to 5, where 1 represents "completely dissatisfied" and 5 represents "completely satisfied." Other scales can be used according to the particular design. The automated survey result analyzing module 68, according to some implementations, may be configured to calculate a score from the survey recipient's numeric answers.
• All the scores on the five-point scale can be averaged together to determine an overall score for the survey. The automated survey result analyzing module 68 may be configured to determine whether the overall score is below a threshold that indicates that the customer was generally dissatisfied with the service. If the average score is low, for example below 3.0 on a scale from 1 to 5, the automated survey result analyzing module 68 may set a flag to indicate that follow-up is warranted. Thresholds other than 3.0 may also be used according to the client's wishes or based on other factors. In some embodiments, the automated survey result analyzing module 68 may be configured to automatically send an e-mail or communicate in another manner to the field manager (or others) for follow up. The field manager may then respond by calling the customer to try to resolve any issues.
  • According to some embodiments, the automated survey result analyzing module 68 may detect if one or more answers indicate the lowest level of satisfaction on the part of the customer. In this case, the automated survey result analyzing module 68 may set the flag indicating the need for follow-up. Also, an automatic e-mail may be sent to the field manager (or others). The automated survey result analyzing module 68 may be configured to analyze the feedback from the survey in any suitable manner to determine if follow-up actions are warranted.
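• A minimal Python sketch of the automated analysis described above, assuming a 1-to-5 scale and the example 3.0 threshold; the notification helper is hypothetical and simply prints in place of sending an e-mail:

```python
FOLLOW_UP_THRESHOLD = 3.0     # assumed; clients may configure a different value
LOWEST_SCORE = 1

def analyze_survey_results(numeric_answers: list) -> dict:
    """Average the numeric answers and decide whether follow-up is warranted."""
    if not numeric_answers:
        return {"overall_score": None, "follow_up": False, "reason": "no numeric answers"}
    overall = sum(numeric_answers) / len(numeric_answers)
    if overall < FOLLOW_UP_THRESHOLD:
        return {"overall_score": overall, "follow_up": True, "reason": "low average score"}
    if LOWEST_SCORE in numeric_answers:
        return {"overall_score": overall, "follow_up": True,
                "reason": "lowest satisfaction level on at least one question"}
    return {"overall_score": overall, "follow_up": False, "reason": "scores acceptable"}

def notify_field_manager(result: dict, email: str) -> None:
    """Hypothetical notification; a real system would use an e-mail or messaging service."""
    print(f"Would e-mail {email}: follow-up needed ({result['reason']}), "
          f"overall score {result['overall_score']:.1f}")
```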
  • The survey result monitoring module 70 may be a web-based tool that can be accessed by a human operator (e.g., a survey monitor, service manager, field manager, or other authorized personnel of the service group). The survey result monitoring module 70 may provide a user interface enabling the user to access the survey result information, analyzed results from the automated survey result analyzing module 68, digitized voice messages, and/or other information. According to various implementations of the present disclosure, the survey result monitoring module 70 may enable the user to access and listen to the voice messages, enter a transcript of the voice message, enter a summary of the voice message, append notes to the survey result information, select one or more predefined classifications of customer issues, and/or select or recommend one or more follow-up actions. When follow-up actions are selected or recommended, the survey result monitoring module 70 can open a follow-up case for the purpose of monitoring the status of follow-up actions taken until the customer issues are resolved. As used herein, opening cases is understood to include the creation of one or more database records. In some embodiments, survey cases and follow-up cases for the same service may be monitored simultaneously. The survey result monitoring module 70 may provide a link or hyperlink to the survey information and/or voice messages. The input received from the user via the user interface can be stored along with the other information of the survey record and/or follow-up record.
  • The survey follow-up module 72 may be configured to track the follow-up actions that are taken to resolve customer issues. The survey follow-up module 72 may record and organize information related to the status of the follow-up case, such as, for example, the age of the follow-up case from the start of an opened follow-up case to the present. The survey follow-up module 72 enables access to this information and allows the user to use a searching tool associated with the survey follow-up module 72 to search for specific groups of follow-up cases, based on any factors, such as client, age, region, etc.
  • When analysis of the survey result information has been done, a follow-up case can be opened if necessary. If the survey is flagged as needing follow-up, the survey follow-up module 72 is configured to initiate follow-up actions. For example, if the survey feedback contains certain scores or marks that fit the specified criteria for needing follow-up, the survey follow-up module 72 may automatically send an e-mail to the field manager responsible for that servicer or service team. In this way, the field manager is informed that follow-up is needed and is incentivized to act quickly to resolve the issues. Along with the e-mail, the survey follow-up module 72 can also transmit the survey result information and recorded voice messages and/or links to the information and voice messages. In some cases, the issues may require the involvement of the client. Depending on how the client decides to establish follow-up routines, the survey follow-up module 72 may communicate information to the client directly or to both the client and the field manager.
• The survey follow-up module 72 may be configured to determine the age of a follow-up case and track the progress being made to resolve the issues. The survey follow-up module 72 may be monitored by the survey monitor person to determine if certain issues need to be revisited. The survey follow-up module 72 may enable the transmission or re-transmission of an e-mail as a reminder as necessary to notify the field manager or other responsible party for resolving an older issue. The reminder can be sent automatically by the survey follow-up module 72 based on predetermined conditions. In some embodiments, the survey follow-up module 72 may be further configured to calculate incentive payments based in part on survey scores, survey result information, compliments, or other information that is received with respect to the performance by a servicer or service team. Also, the survey follow-up module 72 may calculate bonuses for managers based on survey result numbers. In this respect, the servicers and managers can receive bonus compensation for high quality customer service.
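• A minimal Python sketch of an incentive calculation; the bonus amount and score cut-offs are invented assumptions, since the disclosure does not specify a formula:

```python
def servicer_bonus(average_score: float, base_bonus: float = 100.0) -> float:
    """Hypothetical rule: full bonus at 4.5+, prorated between 3.5 and 4.5, none below 3.5."""
    if average_score >= 4.5:
        return base_bonus
    if average_score >= 3.5:
        return base_bonus * (average_score - 3.5)
    return 0.0

# Example: a servicer averaging 4.0 earns half of the assumed base bonus.
print(servicer_bonus(4.0))   # 50.0
```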
• The survey result reporting module 74 may be configured to send reports to one or more clients to inform them of the survey result information, types of issues encountered, overall scores, or other information or data. The reports may be sent automatically to the clients based in part on the client's preferences. Some reports may be communicated daily, monthly, quarterly, or for any time period. The survey result reporting module 74 may be configured to communicate with different groups of people who may be responsible for different aspects of a particular service. For example, when the results of surveys indicate defective products from a client, the survey result reporting module 74 may be configured to send a notice to an individual or department about the defective products.
  • The survey program 54 of the present disclosure may be implemented in hardware, software, firmware, or any combinations thereof. In the disclosed embodiments, the survey program 54 may be implemented in software or firmware that is stored on a memory device and that is executable by a suitable instruction execution system. The survey program 54 may be implemented as one or more computer programs stored on different memory devices or different computer systems of a network. If implemented in hardware, the survey program 54 may be implemented using discrete logic circuitry, an application specific integrated circuit (ASIC), a programmable gate array (PGA), a field programmable gate array (FPGA), or any combinations thereof.
  • FIG. 7 is a diagram showing an embodiment of the database 56 shown in FIG. 5. The database 56 may contain various information and data. As illustrated, the database 56 may include order information 78, customer information 80, service information 82, survey scripts 84, a survey scheduling queue 86, survey result information 88, voice messages 90, and survey follow-up action information 91, and may further include other types of data. The service information 82 may be related to any type of service, such as a delivery service, installation service, repair service, or other services. In some embodiments, the voice messages 90 may instead be stored in a separate file system associated with the memory device 50.
  • The order information 78 may include the store name, product purchases, type of services to be provided, date and time of order, etc. The customer information 80 may include the customer's name, mailing address, billing address, delivery address, telephone and mobile phone numbers, e-mail addresses, preferred means of contact, etc. The service information 82 (e.g., when related to a delivery service) may include the product ordered, shipping identification information of the product, the delivery driver, the carrier, the servicer, the promised delivery time, the actual arrival time, status of delivery, etc.
• The survey scripts 84 may include digitized voice scripts of portions of one or more surveys, complete surveys, or other survey information. The survey scheduling queue 86 is a queue for recording the time when survey cases are opened, a sequence of surveys to be conducted, etc. The survey result information 88 may include the results, feedback, responses, etc., provided by the customer during the survey. The survey result information 88 may also include results of the analysis by the automated survey result analyzing module 68, such as overall scores. The voice messages 90 may include digitized voice messages recorded during the survey. The voice messages 90 may be stored as files (e.g., on a separate file server) that may be accessed by hyperlinks via the network. The survey follow-up action information 91 may include a record of a classification of customer issues that warrant follow-up actions in addition to a record of follow-up actions to be taken to resolve the customer issues.
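• A minimal Python sketch, using SQLite from the standard library, of tables paralleling the data categories of FIG. 7; the table and column names are illustrative assumptions rather than the patent's schema:

```python
import sqlite3

SCHEMA = """
CREATE TABLE IF NOT EXISTS orders        (order_id TEXT PRIMARY KEY, store TEXT,
                                          product TEXT, ordered_at TEXT);
CREATE TABLE IF NOT EXISTS customers     (customer_id TEXT PRIMARY KEY, name TEXT,
                                          phone TEXT, email TEXT, preferred_contact TEXT);
CREATE TABLE IF NOT EXISTS services      (order_id TEXT, servicer TEXT, promised_time TEXT,
                                          arrival_time TEXT, status TEXT);
CREATE TABLE IF NOT EXISTS survey_queue  (order_id TEXT, opened_at TEXT, launch_at TEXT);
CREATE TABLE IF NOT EXISTS survey_results(order_id TEXT, question_id TEXT, answer TEXT,
                                          overall_score REAL);
CREATE TABLE IF NOT EXISTS voice_messages(order_id TEXT, file_path TEXT);
CREATE TABLE IF NOT EXISTS follow_up     (order_id TEXT, classification TEXT, action TEXT,
                                          opened_at TEXT, closed_at TEXT);
"""

def init_db(path: str = ":memory:") -> sqlite3.Connection:
    """Create an empty database with the illustrative schema above."""
    conn = sqlite3.connect(path)
    conn.executescript(SCHEMA)
    return conn
```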
  • FIG. 8 is a flow diagram illustrating an overview of the automated survey process according to various implementations. Customer Order 92 represents the process when the customer orders a product or service from the client. In some implementations, the client collects contact information associated with the customer during the ordering process. This contact information can be used for contacting the customer in order to run the survey.
  • Service Interaction 94 is the process when a service of any kind is performed for the customer. For example, the service may be a delivery of goods or packages, building and/or installing a product, maintenance, repair, improvement, communication with a service manager or customer service representative, a product registration process, or other services. When the service is complete, it may be advantageous for the client or service group to conduct a survey to collect information about the customer's satisfaction with the service. The collected information can be used to help the service group improve the quality of their services.
  • When the status of the service case has changed due to the completion of the service job, a survey may be triggered. This is indicated by block 96. One way in which the survey is triggered may include a servicer calling into an IVR device indicating that the job is complete or closed. Another way of triggering a survey may include the servicer using a handheld device to close the job and the handheld device being configured to send a trigger signal to the automated survey system 36. Another way may include the servicer calling a support center to close the job using a landline telephone or mobile phone. When the job is recorded as being closed, the closed status may be detected in the database by a program that creates a survey call record that initiates the deployment of the survey.
• After receiving notification of the Trigger Event 96, an Automated Survey may be conducted. The survey may be conducted automatically via a phone call to the customer using an IVR device, e-mail, chat, or other means of communication. The automated survey may include pre-recorded questions and may respond to answers captured via a numeric keypad, an alphanumeric keyboard, a touch screen device, or another data entry device on the customer's telephone, mobile phone, computer, or other device. Responses may be received via telephone, in a return e-mail or chat session, or by another digital entry device. Responses to survey questions may also be in the form of voice messages received via telephone, VoIP, or other voice recording device or system. In some embodiments, the customer may be given the option to wait for live customer care if desired. Also, an option may be given to allow the customer to enter a message other than a voice message, such as, for example, a text message, e-mail message, or other text-based message. According to some implementations, the survey may be started within about ten minutes of the trigger event and completed within about two minutes.
  • When the survey results are received, the automated survey system is configured to analyze the results. This analysis can be done automatically by the processing system and/or manually by a survey monitor person. The automated analysis may include analysis of the customer data, product data, survey responses, and/or other information. The survey responses may be collected using finite answers, such as an answer 1, 2, 3, 4, or 5 for a ranking in response to a specific survey question. In addition, the survey response may include a voice message, which can be manually analyzed and entered according to certain defined classifications.
  • In many cases, the results of a survey do not require follow-up with the customer and these survey cases can be closed. However, in some cases, the customer may enter certain responses or leave a voice message that prompts the automated survey system to begin a follow-up process to resolve any issues that the customer may have. When the answers are analyzed, either automatically or manually, the issues may be identified. When these exceptions are identified, a follow-up process is opened to ensure that the issues are treated sensitively. The follow-up may include inquiries to gather additional information from the customer, if needed. Countermeasures may be followed as needed to resolve the issues.
• Follow-up actions may be acted upon internally within the service group or, if necessary, reported to client management and/or client teams. Information from the analysis and the follow-up may be collected and reported to internal teams for future use, such as performance management, improving processes, services and products, tracking costs and issues, billing, etc. Reports may include hyperlinks to voice messages for easy access and review.
  • FIG. 9 is a flow diagram of an embodiment of a method for executing a service case, survey case, and follow-up case. When a customer enters into a business deal with a business in which service is to be provided to the customer in some way, a service case is opened. In some implementations, the client (or business) sends order information to a servicer who acts on the client's behalf. The order information may be related to the specific service order and the customer's personal information. At a scheduled service time, the servicer performs the service for the customer. When the service is complete, the service case is closed.
  • The closing of the service case, as illustrated in FIG. 9, causes the opening of a survey case. In this respect, the completion of the service job triggers the initiation of the survey case. After a lag time, the survey case includes the conducting of an automated survey. When responses are received from the survey recipient, the survey case is closed.
  • When the survey case is closed, a follow-up case is opened to determine if follow-up to the survey is needed. Any issues fed back by the customer are analyzed to determine if follow-up actions are needed. If so, the appropriate people are contacted in order to resolve the issues. When the issues are resolved, the follow-up case is closed.
• FIG. 10 is a flow diagram illustrating an embodiment of a method for creating a survey. In this embodiment, the method includes digitally recording voice scripts as indicated in block 106. For example, each voice script may be one or more survey questions and/or one or more statements or sentences. As indicated in block 108, file names for the voice scripts are established. This process may include automatically naming the files based on the spoken language, store, store brand, product information, or other information. Block 110 includes enabling a user to select one or more voice scripts from the recorded scripts that may be used to form a completed survey. The user may be enabled to add and/or delete scripts. In some embodiments, certain scripts may be automatically selected depending on client preferences, based on a bill code associated with a client brand (if the client has multiple brands), based on order criteria, or based on other factors. Regarding the selection based on order criteria, a service order in one particular example may include both a delivery and an assembly, and hence both delivery-related questions and assembly-related questions can be selected automatically. The method further includes enabling a user to arrange the scripts in a particular order, as desired, to form a certain logical sequence of scripts for the survey, as suggested in block 112. As indicated in block 114, the user is enabled to enter the answers from the survey recipient that are acceptable for the particular survey questions.
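• A minimal Python sketch of the automatic script selection of block 110, assuming a hypothetical mapping from service types and client bill codes to question groups:

```python
# Hypothetical mapping from service types on an order to groups of question IDs.
QUESTIONS_BY_SERVICE = {
    "delivery": ["Q_ON_TIME", "Q_CALL_AHEAD", "Q_TEAM_APPEARANCE"],
    "assembly": ["Q_ASSEMBLY_QUALITY", "Q_DEMONSTRATION"],
}

# Hypothetical mapping from client bill codes (brands) to brand-specific intro scripts.
BRAND_INTRO = {"bill_code_123": "INTRO_STOREBRAND_A", "bill_code_456": "INTRO_STOREBRAND_B"}

def select_scripts(order_services: list, bill_code: str) -> list:
    """Automatically pick an intro plus the question groups matching the order criteria."""
    scripts = [BRAND_INTRO.get(bill_code, "INTRO_GENERIC")]
    for service in order_services:
        scripts.extend(QUESTIONS_BY_SERVICE.get(service, []))
    scripts.append("CLOSING_VOICE_MESSAGE_PROMPT")
    return scripts

# Example: a delivery-plus-assembly order pulls both question groups automatically.
print(select_scripts(["delivery", "assembly"], "bill_code_123"))
```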
  • FIG. 11 is a screen shot of a user interface 118 for creating an automated survey according to various implementations of the present disclosure. The user interface 118 includes, among other things, a sequence column 120 that displays a sequence of survey scripts that form the entire survey and enables the user to change the sequence as needed. A question ID column 122 identifies the respective survey scripts (i.e., questions and/or statements). A question description column 124 includes a description of the respective survey script. An answer options column 126 enables the user to enter the acceptable feedback responses, based in part on the questions being asked. Column 128 enables the user to select which answers to the respective questions are to be shown on a web-enabled user interface that reports the survey result information to the appropriate individuals responsible for handling customer issues.
• The user interface 118 also includes an add button 132, enabling the user to add a selected question or statement to the survey. A delete button 134 enables the user to delete one or more questions, and a save button 136 enables the user to save the survey when it is complete. The user interface 118 may also include a "sample playback" button allowing the user to listen to how the created survey might sound.
  • FIG. 12 is a diagram illustrating an example of a completed survey 140 according to various implementations. The survey 140 in this example includes an introduction, survey instructions, list of questions, and a statement giving the survey recipient an opportunity to leave a voice message. It should be understood that other wording of sentences, the wording of questions, the sequence and types of questions asked, and other aspects of the survey can be modified to meet the particular client's needs. In some implementations, the survey 140 can be formed using preset elements. The survey 140 can be read and recorded, and then accessed for playback during the survey. Elements to allow time for answers to be entered by the survey recipient can be added as needed.
• FIG. 13 is a flow diagram illustrating an embodiment of a method for triggering and conducting a survey according to various implementations. As illustrated in FIG. 13, the method includes receiving notification of the occurrence of a trigger event associated with a service record in accordance with block 144. Particularly, the trigger event may be the completion of the designated service. As indicated in block 146, the method includes changing the status of the service record to closed. The survey record is then created, as indicated in block 148, and is placed in a survey scheduling queue, as indicated in block 150.
• According to decision block 152, it is determined whether or not a periodic time for performing a polling function has arrived. For example, the polling function may be configured to operate every 10 or 15 minutes. If the proper time has not yet arrived, the flow path loops back to itself and block 152 is repeated until the time arrives. When it is time for polling, the database is polled to detect new survey records, as indicated in block 154. Block 156 indicates that the method includes conducting an automated survey. The order in which the automated surveys are launched may be based in part on the sequence of survey records in the survey scheduling queue. The process of conducting the automated survey is described in more detail below. As indicated in block 158, survey result information is received. The survey result information may be choices entered by the survey recipient, voice messages, or other useful data.
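• A minimal Python sketch of the FIG. 13 polling cycle (decision block 152 through block 158), with the database poll and survey launch passed in as hypothetical callables and an assumed 10-minute period:

```python
import time
from datetime import datetime, timedelta

POLLING_PERIOD = timedelta(minutes=10)   # assumed; FIG. 13 mentions every 10 or 15 minutes

def polling_loop(poll_database, conduct_survey, stop_after: int = 1) -> None:
    """Run the trigger-and-conduct cycle of FIG. 13 a bounded number of times."""
    last_poll = datetime.now() - POLLING_PERIOD
    cycles = 0
    while cycles < stop_after:
        now = datetime.now()
        if now - last_poll < POLLING_PERIOD:       # decision block 152: not yet time to poll
            time.sleep(1)
            continue
        last_poll = now
        for survey_record in poll_database(now - POLLING_PERIOD):   # block 154: detect new records
            conduct_survey(survey_record)                           # block 156: launch in queue order
        cycles += 1
```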
  • FIG. 14 is a flow diagram illustrating an embodiment of a method for conducting an automated survey according to various implementations. The automated survey conducting method includes determining, according to decision block 162, whether or not a new survey record has been found. If not, the flow path returns back to block 162 until one is found. When found, an automated survey is prepared, as indicated in block 164. The preparing of the survey may include, for example, accessing scripts and questions, accessing contact information, or other functions for forming an appropriate survey. The information gathered together to prepare the survey may include field manager case information, client order information, client product information, a library of survey scripts and questions, and other suitable information.
  • According to decision block 166, it is determined whether or not the survey recipient is on a do-not-call list. If so, the method skips ahead to block 168, which indicates that the survey case is closed with a status of “no contact made—DNC.” If the survey recipient is not on the do-not-call list, the method flows to block 170, which indicates that an attempt is made to contact the survey recipient. According to decision block 172, it is determined whether or not contact is made with the survey recipient. If not, then the flow proceeds to decision block 184. If contact is made, the flow proceeds to block 174, which indicates that the automated survey is launched and responses by the survey recipient are captured.
  • During the automated survey, the survey recipient is given the option to speak with a live operator. If it is determined in decision block 176 that the survey recipient requests to speak to someone live, then the flow branches to block 178. As indicated in block 178, the survey recipient is connected with an operator, such as a customer service agent, for the completion of the survey. When the live survey is completed, the survey analysis status is set to “ready” as indicated in block 180. If in block 176 it is determined that the survey recipient does not wish to talk with a live operator, the flow proceeds to decision block 182. According to block 182, it is determined whether or not the survey was completed successfully. If so, the flow proceeds to block 180 to set the survey analysis status to “ready.” If the survey did not complete successfully, as determined in block 182, flow proceeds to decision block 184.
• Block 184 is reached when the survey recipient could not be contacted (decision block 172) or when the survey was not completed successfully (decision block 182). At this point, it is determined whether or not the number of contact attempts is equal to a predetermined threshold. If the number of contact attempts is determined to be equal to the threshold, flow proceeds from block 184 to block 186 and the survey is closed with the status of "no contact made." If not, then the method goes to block 188, in which the survey is rescheduled for another attempt, and the flow then proceeds back to block 170.
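• A minimal Python sketch of the contact logic of FIG. 14; the do-not-call check, dialing, and survey launch are hypothetical callables, and the maximum number of attempts is an assumed value:

```python
MAX_ATTEMPTS = 3   # assumed threshold for contact attempts

def run_survey_case(record: dict, on_dnc_list, attempt_contact, launch_survey) -> str:
    """Return the closing status of a survey case per the FIG. 14 flow."""
    if on_dnc_list(record["phone"]):                  # decision block 166
        return "no contact made - DNC"                # block 168
    for _ in range(MAX_ATTEMPTS):                     # blocks 170/172/184/188
        if not attempt_contact(record["phone"]):
            continue                                  # reschedule another attempt
        if launch_survey(record):                     # blocks 174-182: survey completed
            return "ready"                            # survey analysis status set to "ready"
    return "no contact made"                          # block 186
```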
  • FIG. 15 is a flow diagram illustrating a method for handling survey result information according to various implementations of the present disclosure. The method includes receiving survey result information from an automated survey, as indicated in block 192. According to block 194, the method includes analyzing the survey result information (e.g., averaging the survey result information) to obtain a survey score. It is determined, according to decision block 196, whether the analysis reveals that follow-up actions are warranted or not, such as by automatically comparing an average score to a defined threshold. If so, a flag is set to open a follow-up case as indicated in block 198. However, if no follow-up is warranted based on the analysis, flow proceeds from block 196 to decision block 200. In block 200, it is determined whether or not a voice message was received. If so, the voice message is made available for access by a survey monitor person according to block 202. Also, input may be received from the survey monitor person, as indicated in block 204. The input received from the survey monitor person may include a summary of the voice message, selection of one or more customer issues from a list, selection of one or more follow-up actions from a list, a flag set to open a follow-up case, and/or other inputs. The flag to open the follow-up case may be set in response to the content and interpretation of the voice message. Block 206 indicates that the survey results are made available for reporting to various individuals, teams, departments, or others and for tracking the progress of the follow-up actions.
  • FIG. 16 is a flow diagram illustrating an embodiment of a method for performing survey follow-up actions according to various implementations of the present disclosure. As indicated in decision block 210, it is determined whether or not a follow-up flag has been set. If not, which indicates that no follow-up is needed, then the flow of the method skips to block 212 and the follow-up case is closed. If a flag is set, the flow proceeds to block 214, which indicates that the survey result information and survey scores are received. As indicated in decision block 216, it is determined whether the survey result information meets certain criteria for sending an auto-notification to the client. The client may request to receive automatic notification based on any suitable conditions or criteria associated with the survey result information. For example, if the client requests to receive notification of compliments and if one or more compliments are recorded in the survey results, then the criteria in this case are met. If it is determined in block 216 that the criteria are met, an auto-notification of the survey details is sent to the client, as described in block 218. In some embodiments, block 218 may be omitted if the client chooses not to receive auto-notifications.
  • After compliments are handled, the flow proceeds to decision block 220, which indicates that a determination is made whether the survey score warrants one or more follow-up actions. If not, then the flow skips to block 212 and the follow-up case is closed. However, if follow-up is warranted, the method flows on to decision block 222, which determines whether involvement by a field manager is needed. If so, the survey result information (which may include any of the survey answers, survey scores, and voice messages) is made available to the field manager, according to block 224. When the survey result information is received, the field manager may be enabled to add or edit follow-up information, as indicated in block 225. For example, the field manager may log any follow-up actions taken to resolve the issues. The field manager may also set classifications of issues and set follow-up actions that were not previously recorded. The field manager may also be enabled to mark when the follow-up case is closed, e.g., when all the issues have been resolved. The method also includes checking if client involvement is needed, as indicated in decision block 226. If so, the flow is directed to block 228 and the survey result information is made available to the client. As indicated in block 229, the client is enabled to add and/or edit follow-up information. In some embodiments, the client's name may be logged in during the modification process. The types of follow-up information that can be modified in this method may be different for the field manager, client, and others who may be given access to the information and authority to change the information, depending on the particular design.
  • The information made available to the client may be different than that made available to the field manager, depending on the particular design. The field managers and clients, when given the information, may be responsible for contacting the customer, service group members, or others by any available communication devices in order to help resolve the issues. Decision block 230 indicates that it is determined whether or not any issues remain. This determination may be made by the field manager, who may set a flag, mark an item on a checklist, enter a summary, or other operation that may be detectable by the survey program 54. These indications can be analyzed to determine that the issues are resolved. If no issues remain, the flow goes to block 212 and the follow-up case is closed. If issues still remain, the flow loops back to block 220 to repeat follow-up actions until the issues can be resolved.
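• A minimal Python sketch of the FIG. 16 routing; the case fields and notification helpers are hypothetical, and a real system would re-evaluate the case asynchronously rather than in a single loop:

```python
def process_follow_up(case: dict, notify_client, notify_field_manager, issues_remaining) -> str:
    """Route a follow-up case per the FIG. 16 flow and return its closing status."""
    if not case.get("follow_up_flag"):                 # decision block 210: no follow-up needed
        return "closed"                                # block 212
    if case.get("client_auto_notify_criteria_met"):    # decision block 216 (e.g., compliments)
        notify_client(case)                            # block 218: auto-notification of survey details
    while case.get("score_warrants_follow_up"):        # decision block 220
        if case.get("needs_field_manager"):            # decision block 222
            notify_field_manager(case)                 # blocks 224-225: share results, log actions
        if case.get("needs_client"):                   # decision block 226
            notify_client(case)                        # blocks 228-229: share results, edit follow-up
        if not issues_remaining(case):                 # decision block 230
            break                                      # all issues resolved
        # otherwise loop back to block 220 and repeat follow-up actions
    return "closed"                                    # block 212
```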
  • The flow diagrams of FIGS. 9, 10, and 13-16 show the architecture, functionality, and operation of possible implementations of the survey program 54. In this regard, each block may represent a module, segment, portion of code, etc., which comprises one or more executable instructions for performing the specified logical functions. It should be noted that the functions described with respect to the blocks may occur in a different order than shown. For example, two or more blocks may be executed substantially concurrently, in a reverse order, or in any other sequence depending on the particular functionality involved.
  • The survey program 54, which comprises an ordered listing of executable instructions for implementing logical functions, may be embodied in any computer-readable medium for use by any combination of instruction execution systems or devices, such as computer-based systems, processor-controlled systems, etc. The computer-readable medium may include one or more suitable physical media components configured to store the software, programs, or computer code for a measurable length of time. The computer-readable medium may be any medium configured to contain, store, communicate, propagate, or transport programs for execution by the instruction execution systems or devices.
• FIG. 17 is a screen shot of a user interface 231 for enabling access to voice messages according to various embodiments. User interface 231 lists the survey responses that include a voice message that needs to be verified. More specifically, verifying a voice message may include screening, filtering, sorting, searching, or other actions with respect to the voice message. Section 233 of the user interface 231 includes information about the profit center (business), job identification numbers, customer, and the time and date when each respective survey was completed. Column 234 shows if the respective survey feedback included a low overall score, representing poor quality service, such as one below a minimum threshold. Column 235 includes a link to the different voice messages. If the user wishes to hear the message, the user may click on the "Listen" link to retrieve the voice message file. If the voice message warrants follow-up actions, the user can select either yes or no in the response required column 236. In column 237, the details of the surveys can be retrieved by the user by clicking on the respective "Details" link.
  • FIGS. 18A and 18B are parts of a screen shot of a user interface 238 for enabling a user to enter follow-up actions to be taken, according to various implementations of the present disclosure. The user interface 238 may be opened, for example, by clicking on the detail link in column 237 shown in FIG. 17. Also, the user interface 238 may be opened when the user clicks on the “listen” link in column 235. In section 240 of user interface 238, information about the order, profit center, servicer, customer, etc. is displayed. Within section 240 is a link 241 that enables a user to access a voice message, if one is left. In section 242, information about the survey questions is displayed.
• Section 244 of the user interface 238 enables the user to check certain listed items to define the customer's issues and categorize them into classification categories. The list of issues included in section 244 may be customized for the client based on the client's needs, based on the particular service provided, based on the particular product being delivered, or based on any other factors. Some non-limiting examples of customer issue items listed in section 244 may include a scheduling issue, an incorrect phone number, an issue with the contract carrier, a delivery fee issue, a schedule notification issue, poor service at the store, a damaged product, the product missing items, the wrong product delivered, the wrong address, a store or client issue, a voice message compliment, or any other service issues. In some embodiments, the selection of at least one of the classification items can be required before a case is closed. By listening to the voice message, the user may be able to determine the classification of issues described audibly.
• Section 246 includes a list of possible ways to resolve the issues marked in section 244. This list may also be customized for the particular client depending on various factors. Some non-limiting examples of resolution items listed in section 246 may include the issuing of a gift card to the customer, passing the information on to the store or client, leaving a voice message for the client, recording a voicemail summary, or other ways of reaching resolution. Other items may also include the closure of the follow-up case based on a failure to contact the customer, a representative speaking with the customer to resolve some issue, addressing the issue with the delivery team, or the customer misunderstanding the survey. The user interface 238 enables the user to check the appropriate boxes of section 246 as needed. The user interface 238 may display certain additional information fields depending on the selections made in section 246. For example, if the user selects "Passed to Store/Client", the user interface 238 may prompt the user to enter the name of the person to whom the survey result information is passed. According to another example, if the user selects "Issue Gift Card", the user interface 238 may prompt the user to enter the monetary amount of the gift card to be issued.
  • If a voice message is left, the user may listen to the message by clicking on the link 241 and then may enter a summary of the voice message in window 248. The window 248 can also be used to record steps that were taken by different people of the service group to resolve issues or any other notes that may be necessary for understanding the issues of the case. The summaries entered in window 248 are displayed in section 250 when inserted by the user. The Actions selected in section 246 are also automatically displayed adjacent section 250. If the follow-up case is to be closed, the user may check the box 252.
  • FIG. 19 is a screen shot of an embodiment of a user interface 256 for enabling access to survey result information. The user interface 256 may be created automatically when the survey recipient leaves a voice message. In this embodiment, the user interface 256 displays a table 258 having details of the order, store, carrier, driver, customer, customer contact information, promised delivery time window, actual arrival time, etc. The user interface 256 also includes a table 260 displaying the survey questions, response options, and the answers provided. Table 260 also displays a calculation of the average score. Window 262 shows a summary of the voice message left by the survey recipient and textual entries made by the service team monitoring the status of the survey and follow-up.
• The user interface 256 also includes a link 257 allowing the user to respond to the survey recipient. Also, the user interface 256 includes a link 264, which allows the user to listen to a recording of the voice message left by the survey recipient. For example, the voice message may be stored in any suitable file format, such as a .wav file, a .vox file, etc.
  • FIG. 20 is a screen shot of an embodiment of a user interface 266. The user interface 266 may be created automatically when the overall score 267 displayed in a survey result section is below an acceptable threshold. For example, if the survey questions are based on a five-point scale with “5” representing complete satisfaction and “1” representing complete dissatisfaction, then a threshold of about 3.0 (or any other suitable number) may be set. Therefore, an overall score below 3.0 (in this case) may initiate the generation of the user interface 266. The user interface 266 may also include a delivery notes section 268 and an order history section 269. The delivery notes section 268 may include notes that were recorded when the customer placed an order. As an example, the delivery notes 268 may be useful for the completion of certain services. The order history section 269 may include a history of the order case, survey case, and/or follow-up case of a service order. Information in the order history section 269 may be entered manually and/or automatically.
  • FIG. 21 is a screen shot of an embodiment of a user interface 270 for enabling a search of survey responses. The user interface 270 may be made available to each of the service managers and other personnel responsible for monitoring the orders, surveys, and follow-up cases for a service company. The user interface 270 allows the user to search for follow-up cases and view the details of the follow-up cases. If box 272 is checked, only the follow-up cases that are still open (or pending) are searched. In field 274, the user can select one or more profit centers (or business segments) depending on the need. Also, the user can select the option to search all the profit centers of the service company. Fields 276 and 278 allow the user to enter the timeframe in which the search is made. When the search button 280 is selected, the user interface 270 is configured to search the database for follow-up cases that match the search criteria and display the results in table 282.
  • The table 282 includes rows of different entries arranged with columns for the profit center, the customer receiving the service (“ship to”), the job number, the time and date the follow-up case was opened (“reported at”), the deadline, the age of the follow-up case, whether a low score was received in the survey, whether a voice message link is available, the number of responses, whether the follow-up case has been closed, and a details link linking to the details of the survey. The table 282 may list the follow-up cases in a sequence from the oldest case to the newest, ordered according to the age column. The age column may work with a suitable clock or timing device to update the age of opened cases every six minutes (0.1 hours). The age may be used by the service team to give priority to older issues.
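• A minimal Python sketch of the age calculation and oldest-first ordering described above, assuming ages are reported in hours at the stated 0.1-hour granularity:

```python
from datetime import datetime
from typing import Optional

def case_age_hours(opened_at: datetime, now: Optional[datetime] = None) -> float:
    """Age of an open follow-up case in hours, rounded to 0.1 h (six-minute) granularity."""
    now = now or datetime.now()
    return round((now - opened_at).total_seconds() / 3600.0, 1)

def sort_oldest_first(cases: list) -> list:
    """Order open follow-up cases so the oldest (highest-priority) cases appear first."""
    return sorted(cases, key=lambda c: c["opened_at"])
```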
  • FIGS. 22A and 22B are screen shots of a service issue report 286 according to various implementations of the present disclosure. The service issue report 286 may be communicated to the client or store to report issues regarding the order or service that need attention by the store. The clients may be given the option to receive such a report at different stages of the follow-up or when certain situations occur. In this embodiment, the service issue report 286 includes an information table 288, a survey result table 290, a voice message link 292, and a voice message response table 294 shown in FIG. 22B. The information table 288 includes information about the order, service, customer, etc., and the survey result table 290 includes respective responses to the survey questions. The user can click on the voice message link 292 to accept the voice message file and listen to the recording. In some implementations, the voice message response table 294 shows the voice message summary and a summary of follow-up calls (e.g., by JBROWN in this example) to the customer to resolve the issues.
  • FIG. 23 is a diagram of an embodiment of a quality report 300. In this example, the quality report 300 may be communicated to the client (i.e., “Acme”). The quality report 300 includes the client's survey scores broken up among the different regions (e.g., starting in this example with the New York region). Also, the quality report 300 divides each region down to the individual servicers. With this report, the client can obtain useful information about the overall success of the delivery teams, the success of teams within each region, and success of individual servicers.
  • FIG. 24 is a diagram of a survey feedback report 304 according to various implementations of the present disclosure. The survey feedback report 304 may include a table 306 showing the daily survey results, a table 308 showing the month-to-date survey results, and a table 310 showing a response classification matrix. The survey result reporting module 74 (FIG. 6) may be configured to send the survey feedback report 304 to the service managers and other teams of the service group. The report may be transmitted with an e-mail or may be accessed using a hyperlink. The survey feedback report 304 may be sent on a periodic basis to keep the managers and teams up to date. For example, it may be sent on a daily basis, issued on the morning following the day of service being reported. The next-day information can be useful for training or coaching purposes, such as for use by a manager to coach service teams to practice proper technique and behavior that may better please the customers. In this way, service teams can be given immediate feedback based on the previous day's survey responses.
  • The daily survey result table 306 may include numbers broken out by region. The columns of the daily survey result table 306 include the number of service orders (e.g., deliveries), the number of surveys completed, the percentages of customers completing the survey, and an average score goal. The daily survey result table 306 also may include the particular questions of the survey, such as whether the customer would desire to have the delivery team back, the appearance of the delivery team, on-time success, call ahead success, whether the delivery team properly tested and demonstrated the product, the professional courtesy of the team, and the overall average score.
• The month-to-date results table 308 may include the same column divisions as the daily report but for the longer time period from the first of the month to the present. The response classification matrix table 310 includes columns for each of a number of specific customer issues. For example, the table 310 may include scheduling issues, incorrect phone number issues, contract carrier issues, notification of delivery time issues, damaged product issues, address issues, store/client issues, etc.
• According to various implementations, the survey result reporting module 74 (FIG. 6) may generate one or more reports describing the issuance of gift cards. For example, the follow-up action of issuing a gift card may be initiated by a user selecting the item labeled "Issued Gift Card" in the follow-up actions section 246 of the user interface 238 (FIGS. 18A and 18B). Gift cards may be issued when service mistakes, mishaps, or other problems occur. The gift cards may be used to reimburse, compensate, or in some way appease the customer for the service problems. Since the service group is representing the client, the issued gift cards may be valid only at the client's stores, for example, or in other embodiments may be valid at any store.
  • The service managers may analyze the events surrounding the service problems and determine if a particular servicer is at fault or responsible. If it is determined that a particular servicer is responsible for the service problem, such as for arriving outside the promised time window, failing to complete the service, and/or other problems, then the automated survey system 36 may be configured to automatically subtract the gift card amount from the servicer's pay. In this respect, the amount that the managed services 22 pays for gift card issuance is charged back to the servicer. However, if the problem is not caused by the servicer but is caused by other operators or systems, then the servicer is not held responsible.
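• A minimal Python sketch of the gift card chargeback rule described above, assuming a simple flat deduction; an actual adjustment would flow through the payroll or settlement system:

```python
def apply_gift_card_chargeback(servicer_pay: float, gift_card_amount: float,
                               servicer_at_fault: bool) -> float:
    """Deduct the gift card amount from the servicer's pay only when the servicer is at fault."""
    if servicer_at_fault:
        return servicer_pay - gift_card_amount
    return servicer_pay

# Example: a $25 gift card issued for a late arrival attributed to the servicer.
print(apply_gift_card_chargeback(500.00, 25.00, servicer_at_fault=True))   # 475.0
```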
• FIG. 25 is a diagram of an embodiment of a survey response report 314. The survey response report 314 includes tables for follow-up cases that remain open and those that are closed. The survey response report 314 also includes, for each case, the customer, classifications of issues, overall score on the survey, age of the case, summary notes, and access to more details of the case.
  • FIG. 26 is a diagram of an embodiment of a summary quality report 318 according to various implementations. The summary quality report 318 includes a first column 320, which includes each of the stores of a client, divided regionally. A second column 322 includes the overall average survey scores for the respective stores, and a third set of columns 324 includes the average scores and number of surveys completed for each of the questions asked on the surveys. In some embodiments, the survey result reporting module 74 may be configured to distinguish between the average scores that meet a particular goal and those that do not meet the goal. For example, a first score 326 may be displayed in one manner (e.g., black) while another score 328 may be displayed in a different manner (e.g., red).
• A service business or other entity might gain many advantages from the use of the survey network system 34, and particularly the automated survey system 36 and survey program 54. For example, one benefit might be the ability to provide rapid follow-up actions to customers that have issues. In some cases, it may be possible to resolve the customer's issue within two hours, which is a desirable service goal for a service company. By responding to issues quickly, the overall customer satisfaction level of a service group can remain high.
• Another advantage might be the aspect of performing the survey using an automated system as opposed to a survey conducted by a live operator. An automated system may allow the survey recipient to answer more truthfully and may also lead to a high survey participation rate. For example, many surveys have a participation rate of under 10%. However, with the automated survey system 36 described herein, a participation rate of about 40% or higher can be achieved. In this respect, the service company can obtain a larger sample of data that may better define the satisfaction level of the customers. Also, by conducting the survey at an advantageous time, which is controlled by the automated survey system 36, customers are more likely to take the survey and more likely to answer accurately, because the service experience might still be fresh in their minds.
  • One should note that conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more particular embodiments or that one or more particular embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.
  • It should be emphasized that the above-described embodiments are merely possible examples of implementations, merely set forth for a clear understanding of the principles of the present disclosure. Any process descriptions or blocks in flow diagrams should be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process, and alternate implementations are included in which functions may not be included or executed at all, may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present disclosure. Many variations and modifications may be made to the above-described embodiment(s) without departing substantially from the spirit and principles of the present disclosure. Further, the scope of the present disclosure is intended to cover any and all combinations and sub-combinations of all elements, features, and aspects discussed above. All such modifications and variations are intended to be included herein within the scope of the present disclosure, and all possible claims to individual aspects or combinations of elements or steps are intended to be supported by the present disclosure.

Claims (30)

We claim:
1. A computer-implemented method for conducting an automated survey regarding quality of a service performed by a servicer for a customer associated with a customer location and determining if at least one follow-up action is warranted based in part on analyzing survey result information, the computer-implemented method comprising the steps of:
conducting of a survey recipient associated with the customer location the automated survey regarding the quality of the service performed by the servicer for the customer associated with the customer location, the automated survey being configured by a computer utilizing a processing device to automatically prompt the survey recipient for survey result information regarding the quality of the service performed by the servicer for the customer associated with the customer location;
after conducting of the survey recipient the automated survey regarding the quality of the service performed by the servicer for the customer associated with the customer location, automatically receiving, by the computer, the survey result information regarding the quality of the service performed by the servicer for the customer associated with the customer location;
in response to receiving the survey result information regarding the quality of the service performed by the servicer for the customer associated with the customer location, automatically analyzing by the computer the survey result information by performing a numerical analysis of the survey result information regarding the quality of the service performed by the servicer for the customer associated with the customer location; and
in response to the performing of the numerical analysis of the survey result information regarding the quality of the service performed by the servicer for the customer associated with the customer location, determining by the computer if at least one subsequent follow-up action is warranted.
2. The computer-implemented method of claim 1, further comprising the steps of:
receiving, by a computer, a notification of an occurrence of a trigger event associated with a completion of the service performed for the customer associated with the customer location;
waiting a configurable amount of time after receiving the notification of the occurrence of the trigger event; and
after waiting the configurable amount of time, initiating the conducting of the automated survey.
3. The computer-implemented method of claim 1, wherein the performing the numerical analysis includes calculating a score based in part on answers provided by the survey recipient in response to one or more survey questions in the automated survey.
4. The computer-implemented method of claim 3, wherein the answers provided by the survey recipient are numeric answers ranging from a number that substantially represents “completely dissatisfied” to another number that substantially represents “completely satisfied.”
5. The computer-implemented method of claim 4, wherein the score is an average of the numeric answers, and wherein the step of determining if at least one subsequent follow-up action is warranted includes determining by the computer if the score is below a threshold.
6. The computer-implemented method of claim 3, wherein the step of determining if at least one subsequent follow-up action is warranted includes determining if at least one numeric answer is below a threshold.
7. The computer-implemented method of claim 1, wherein the step of conducting the automated survey on a survey recipient includes utilizing a survey scheduling queue to determine when the automated survey will take place.
8. The computer-implemented method of claim 1, wherein the customer is also the survey recipient.
9. The computer-implemented method of claim 1, wherein all the claimed steps are performed by means for accomplishing the claimed steps.
10. A survey result analysis system for conducting an automated survey regarding quality of a service performed by a servicer for a customer associated with a customer location and determining if at least one follow-up action is warranted based in part on analyzing survey result information, the survey result analysis system comprising:
a processing device associated with a computing system, the processing device configured to execute a survey program; and
a memory device in communication with the processing device, the memory device configured to store a database and the survey program, the survey program including
a survey conducting module configured to automatically conduct on a survey recipient associated with the customer location the automated survey regarding the quality of the service performed by the servicer for the customer associated with the customer location, the automated survey being configured to prompt the survey recipient for survey result information regarding the quality of the service performed by the servicer for the customer associated with the customer location, and
an automated survey result analyzing module configured to perform a numerical analysis on received survey result information regarding the quality of the service performed by the servicer associated with the customer location, the automated survey result analyzing module further configured to determine if at least one subsequent follow-up action is warranted based in part on performing the numerical analysis of the survey result information from the survey.
11. The survey result analysis system of claim 10, wherein the survey program is further configured to enable the processing device to receive a notification of an occurrence of a trigger event associated with a completion of the service performed for the customer associated with the customer location and to wait a configurable amount of time after receiving the notification before conducting the automated survey.
12. The survey result analysis system of claim 10, wherein the survey program is further configured to enable the processing device to calculate scores based in part on numeric answers provided by the survey recipient in response to a plurality of survey questions in the automated survey, wherein acceptable numeric answers range from 1 to 5, where an answer of 1 represents “completely dissatisfied” and an answer of 5 represents “completely satisfied.”
13. The survey result analysis system of claim 12, wherein the processing device determines whether at least one subsequent follow-up action is warranted if the processing device determines that an average of the numeric answers is below 3.0 or if one or more numeric answers are 1.
14. The survey result analysis system of claim 10, wherein the survey conducting module includes a survey scheduling queue to determine when the automated survey will take place.
15. The survey result analysis system of claim 10, wherein the survey program is further configured to set a flag to open a follow-up case upon determining that at least one subsequent follow-up action is warranted.
16. The survey result analysis system of claim 10, wherein all of the claimed functions are performed by means for accomplishing the claimed functions.
17. A computer-implemented method for conducting an automated survey regarding quality of a service performed by a servicer for a customer associated with a customer location, the computer-implemented method comprising the steps of:
conducting of a survey recipient associated with the customer location the automated survey regarding the quality of the service performed by the servicer for the customer associated with the customer location, the automated survey being configured by a computer utilizing a processing device to automatically prompt the survey recipient for survey result information regarding the quality of the service performed by the servicer for the customer associated with the customer location;
after conducting of the survey recipient the automated survey regarding the quality of the service performed by the servicer for the customer associated with the customer location, automatically receiving, by the computer, the survey result information regarding the quality of the service performed by the servicer for the customer associated with the customer location;
after receiving the survey result information regarding the quality of the service performed by the servicer for the customer associated with the customer location, automatically determining by the computer if a voice message was received in response to the conducting of the automated survey regarding the quality of the service performed by the servicer for the customer associated with the customer location; and
responsive to determining that a voice message was received in response to the conducting of the automated survey regarding the quality of the service performed by the servicer for the customer associated with the customer location, presenting by the computer the voice message to a person for listening to the voice message.
18. The computer-implemented method of claim 17, wherein the presenting by the computer the voice message to a person for listening to the voice message includes designating from other survey result information a survey results record associated with the voice message, and wherein the computer-implemented method comprises receiving by the computer follow-up action information from the person after the voice message is presented to the person.
19. The computer-implemented method of claim 17, further comprising automatically analyzing the survey result information by the computer with a numeric analysis based in part on numeric answers provided by the survey recipient in response to survey questions in the automated survey.
20. The computer-implemented method of claim 19, wherein the numeric analysis includes calculation of a score based on the numeric answers.
21. The computer-implemented method of claim 17, wherein the step of conducting the automated survey on a survey recipient includes utilizing a survey scheduling queue to determine when the automated survey will take place.
22. The computer-implemented method of claim 17, wherein all the claimed steps are performed by means for accomplishing the claimed steps.
23. A survey result analysis system for conducting an automated survey regarding quality of a service performed by a servicer for a customer associated with a customer location, the survey result analysis system comprising:
a processing device associated with a computing system, the processing device configured to execute a survey program; and
a memory device in communication with the processing device, the memory device configured to store a database and the survey program, the survey program including
a survey conducting module configured to automatically conduct on a survey recipient associated with the customer location the automated survey regarding the quality of the service performed by the servicer for the customer associated with the customer location, the automated survey being configured to prompt the survey recipient for survey result information regarding the quality of the service performed by the servicer for the customer associated with the customer location, and
an automated survey result analyzing module configured to automatically determine if a voice message was received in response to the conducting of the automated survey regarding the quality of the service performed by the servicer for the customer associated with the customer location, the automated survey result analysis module being further configured to present the voice message to a person for listening to the voice message in response to a determination that the voice message was received in response to the conducting of the automated survey regarding the quality of the service performed by the servicer for the customer associated with the customer location.
24. The survey result analysis system of claim 23, wherein the automated survey result analysis module is further configured to designate from other survey result information a survey results record associated with the voice message, and wherein the survey program further includes a module for receiving by the computer follow-up action information from the person after the voice message is presented to the person.
25. The survey result analysis system of claim 23, wherein the automated survey result analysis module is further configured to calculate scores based in part on numeric answers to survey questions in the automated survey.
26. The survey result analysis system of claim 23, wherein the automated survey result analysis module is further configured to determine if at least one follow-up action is warranted if an average of numeric answers is below 3.0.
27. The survey result analysis system of claim 23, wherein the automated survey result analysis module is further configured to determine if at least one follow-up action is warranted if at least one numeric answer is below a threshold.
28. The survey result analysis system of claim 23, wherein the survey conducting module utilizes a survey scheduling queue to determine when the automated survey will take place.
29. The survey result analysis system of claim 23, wherein the survey program is further configured to set a flag to open a follow-up case based on the follow-up action information received from the person after the voice message is presented to the person.
30. The survey result analysis system of claim 23, wherein all the claimed functions are performed by means for accomplishing the claimed functions.
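Purely as an illustrative reading of the scoring limitations recited in claims 12 and 13 above (numeric answers from 1, “completely dissatisfied,” to 5, “completely satisfied,” with a follow-up warranted when the average falls below 3.0 or any single answer is 1), a minimal sketch might look like the following; the function name and structure are assumptions, not the claimed implementation.

```python
def follow_up_warranted(answers, average_threshold=3.0):
    """Return True when a follow-up case should be opened.

    Mirrors the example thresholds of claims 12-13: answers range from
    1 ("completely dissatisfied") to 5 ("completely satisfied"); a
    follow-up is warranted when the average is below 3.0 or when any
    single answer is 1.
    """
    if not answers:
        return False
    average = sum(answers) / len(answers)
    return average < average_threshold or any(a == 1 for a in answers)


print(follow_up_warranted([4, 5, 4]))  # False
print(follow_up_warranted([5, 5, 1]))  # True  (a single "1" answer)
print(follow_up_warranted([2, 3, 2]))  # True  (average 2.33 is below 3.0)
```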
US14/057,396 2009-12-04 2013-10-18 Triggering, conducting, and analyzing an automated survey Abandoned US20140046729A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US14/057,396 US20140046729A1 (en) 2009-12-04 2013-10-18 Triggering, conducting, and analyzing an automated survey
US16/382,257 US10664853B2 (en) 2009-12-04 2019-04-12 Triggering, conducting, and analyzing an automated survey

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US26659909P 2009-12-04 2009-12-04
US12/722,455 US20110137709A1 (en) 2009-12-04 2010-03-11 Triggering and conducting an automated survey
US14/057,396 US20140046729A1 (en) 2009-12-04 2013-10-18 Triggering, conducting, and analyzing an automated survey

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/722,455 Continuation US20110137709A1 (en) 2009-12-04 2010-03-11 Triggering and conducting an automated survey

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/382,257 Continuation US10664853B2 (en) 2009-12-04 2019-04-12 Triggering, conducting, and analyzing an automated survey

Publications (1)

Publication Number Publication Date
US20140046729A1 true US20140046729A1 (en) 2014-02-13

Family

ID=44082903

Family Applications (16)

Application Number Title Priority Date Filing Date
US12/722,474 Abandoned US20110137696A1 (en) 2009-12-04 2010-03-11 Performing follow-up actions based on survey results
US12/722,455 Abandoned US20110137709A1 (en) 2009-12-04 2010-03-11 Triggering and conducting an automated survey
US12/722,463 Abandoned US20110137808A1 (en) 2009-12-04 2010-03-11 Analyzing survey results
US12/959,900 Abandoned US20110137698A1 (en) 2009-12-04 2010-12-03 Service call-ahead system and method
US13/245,858 Abandoned US20120016720A1 (en) 2009-12-04 2011-09-26 Analyzing survey results
US13/245,859 Abandoned US20120022905A1 (en) 2009-12-04 2011-09-26 Performing follow-up actions based on survey results
US13/245,854 Active US8515803B2 (en) 2009-12-04 2011-09-26 Triggering and conducting an automated survey
US13/246,494 Abandoned US20120059681A1 (en) 2009-12-04 2011-09-27 Service call-ahead system and method
US14/057,396 Abandoned US20140046729A1 (en) 2009-12-04 2013-10-18 Triggering, conducting, and analyzing an automated survey
US14/255,626 Abandoned US20140229238A1 (en) 2009-12-04 2014-04-17 Performing follow-up actions based on survey results
US14/664,385 Active 2032-09-30 US11769163B2 (en) 2009-12-04 2015-03-20 Service call-ahead system and method
US14/746,207 Active 2030-12-10 US10262329B2 (en) 2009-12-04 2015-06-22 Triggering and conducting an automated survey
US16/278,507 Active US10650397B2 (en) 2009-12-04 2019-02-18 Triggering and conducting an automated survey
US16/382,257 Active US10664853B2 (en) 2009-12-04 2019-04-12 Triggering, conducting, and analyzing an automated survey
US16/382,264 Active US10657549B2 (en) 2009-12-04 2019-04-12 Performing follow-up actions based on survey results
US16/828,485 Active 2030-04-18 US11288687B2 (en) 2009-12-04 2020-03-24 Triggering and conducting an automated survey

Family Applications Before (8)

Application Number Title Priority Date Filing Date
US12/722,474 Abandoned US20110137696A1 (en) 2009-12-04 2010-03-11 Performing follow-up actions based on survey results
US12/722,455 Abandoned US20110137709A1 (en) 2009-12-04 2010-03-11 Triggering and conducting an automated survey
US12/722,463 Abandoned US20110137808A1 (en) 2009-12-04 2010-03-11 Analyzing survey results
US12/959,900 Abandoned US20110137698A1 (en) 2009-12-04 2010-12-03 Service call-ahead system and method
US13/245,858 Abandoned US20120016720A1 (en) 2009-12-04 2011-09-26 Analyzing survey results
US13/245,859 Abandoned US20120022905A1 (en) 2009-12-04 2011-09-26 Performing follow-up actions based on survey results
US13/245,854 Active US8515803B2 (en) 2009-12-04 2011-09-26 Triggering and conducting an automated survey
US13/246,494 Abandoned US20120059681A1 (en) 2009-12-04 2011-09-27 Service call-ahead system and method

Family Applications After (7)

Application Number Title Priority Date Filing Date
US14/255,626 Abandoned US20140229238A1 (en) 2009-12-04 2014-04-17 Performing follow-up actions based on survey results
US14/664,385 Active 2032-09-30 US11769163B2 (en) 2009-12-04 2015-03-20 Service call-ahead system and method
US14/746,207 Active 2030-12-10 US10262329B2 (en) 2009-12-04 2015-06-22 Triggering and conducting an automated survey
US16/278,507 Active US10650397B2 (en) 2009-12-04 2019-02-18 Triggering and conducting an automated survey
US16/382,257 Active US10664853B2 (en) 2009-12-04 2019-04-12 Triggering, conducting, and analyzing an automated survey
US16/382,264 Active US10657549B2 (en) 2009-12-04 2019-04-12 Performing follow-up actions based on survey results
US16/828,485 Active 2030-04-18 US11288687B2 (en) 2009-12-04 2020-03-24 Triggering and conducting an automated survey

Country Status (2)

Country Link
US (16) US20110137696A1 (en)
CA (3) CA2696345C (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140095259A1 (en) * 2012-10-01 2014-04-03 Cadio, Inc. Offering survey response opportunities for sale
US9491291B1 (en) * 2011-04-20 2016-11-08 Confinement Telephony Technology, Llc Systems and methods for institutional messaging
WO2016210114A1 (en) * 2015-06-25 2016-12-29 Alibaba Group Holding Limited System, device, and method for making automatic calls
US10191895B2 (en) * 2014-11-03 2019-01-29 Adobe Systems Incorporated Adaptive modification of content presented in electronic forms
US10423905B2 (en) * 2015-02-04 2019-09-24 Hexagon Technology Center Gmbh Work information modelling
US10643223B2 (en) 2015-09-29 2020-05-05 Microsoft Technology Licensing, Llc Determining optimal responsiveness for accurate surveying
US10650397B2 (en) 2009-12-04 2020-05-12 Xpo Last Mile, Inc. Triggering and conducting an automated survey
US11170876B2 (en) * 2010-10-09 2021-11-09 MEI Research, Ltd. System to dynamically collect and synchronize data with mobile devices

Families Citing this family (248)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8135331B2 (en) * 2006-11-22 2012-03-13 Bindu Rama Rao System for providing interactive user interest survey to user of mobile devices
US10803474B2 (en) 2006-11-22 2020-10-13 Qualtrics, Llc System for creating and distributing interactive advertisements to mobile devices
US11256386B2 (en) 2006-11-22 2022-02-22 Qualtrics, Llc Media management system supporting a plurality of mobile devices
US8700014B2 (en) 2006-11-22 2014-04-15 Bindu Rama Rao Audio guided system for providing guidance to user of mobile device on multi-step activities
US8478250B2 (en) 2007-07-30 2013-07-02 Bindu Rama Rao Interactive media management server
US9348804B2 (en) * 2008-05-12 2016-05-24 The Nielsen Company (Us), Llc Methods and apparatus to provide a choice selection with data presentation
US20100262463A1 (en) * 2009-04-14 2010-10-14 Jason Tryfon Systems, Methods, and Media for Management of a Survey Response Associated with a Score
US8694358B2 (en) * 2009-04-14 2014-04-08 Vital Insights Inc. Systems, methods, and media for survey management
US9230292B2 (en) 2012-11-08 2016-01-05 Uber Technologies, Inc. Providing on-demand services through use of portable computing devices
CA2782611C (en) 2009-12-04 2018-07-10 Uber Technologies, Inc. System and method for arranging transport amongst parties through use of mobile devices
US20130246301A1 (en) * 2009-12-04 2013-09-19 Uber Technologies, Inc. Providing user feedback for transport services through use of mobile devices
US8660965B1 (en) * 2010-03-09 2014-02-25 Intuit Inc. System and method for mobile proximity ordering
CA2699871A1 (en) * 2010-04-09 2011-10-09 121Qa Inc. Customer satisfaction analytics system using on-site service quality evaluation
US8817969B1 (en) * 2010-06-25 2014-08-26 Amazon Technologies, Inc. Systems and methods for query input via telephony devices
US9213980B2 (en) * 2010-11-12 2015-12-15 Ebay Inc. Using behavioral data in rating user reputation
US20120213404A1 (en) 2011-02-18 2012-08-23 Google Inc. Automatic event recognition and cross-user photo clustering
US20130096985A1 (en) * 2011-04-05 2013-04-18 Georgia Tech Research Corporation Survey systems and methods useable with mobile devices and media presentation environments
US10528914B2 (en) 2012-04-27 2020-01-07 Benbria Corporation System and method for rule-based information routing and participation
US9009067B1 (en) * 2012-04-30 2015-04-14 Grubhub Holdings Inc. System, method and apparatus for managing made-to-order food tickets for a restaurant service
WO2013182965A1 (en) 2012-06-06 2013-12-12 Ben Volach Survey sampling prior to message publishing
US9391792B2 (en) 2012-06-27 2016-07-12 Google Inc. System and method for event content stream
US8983972B2 (en) * 2012-10-01 2015-03-17 Sap Se Collection and reporting of customer survey data
US20140100918A1 (en) * 2012-10-05 2014-04-10 Lightspeed Online Research, Inc. Analyzing market research survey results using social networking activity information
US20140114733A1 (en) * 2012-10-23 2014-04-24 Thomas A Mello Business Review Internet Posting System Using Customer Survey Response
US9418370B2 (en) * 2012-10-23 2016-08-16 Google Inc. Obtaining event reviews
US9671233B2 (en) 2012-11-08 2017-06-06 Uber Technologies, Inc. Dynamically providing position information of a transit object to a computing device
US8892359B2 (en) 2013-01-11 2014-11-18 Toyota Motor Engineering & Manufacturing North America, Inc. Systems and methods for estimating time of arrival for vehicle navigation
US9071677B2 (en) 2013-02-12 2015-06-30 Unify Square, Inc. Enhanced data capture, analysis, and reporting for unified communications
US20190287122A9 (en) * 2013-03-15 2019-09-19 Benbria Corporation Real-time survey and scoreboard systems
US20140278783A1 (en) * 2013-03-15 2014-09-18 Benbria Corporation Real-time customer engagement system
US20140324728A1 (en) * 2013-04-24 2014-10-30 Lifeworx, Inc. System and method for progressive feedback collection and review
US20140330638A1 (en) * 2013-05-02 2014-11-06 Go Daddy Operating Company, LLC System and method for management of marketing allocations using a return on investment metric
US9747630B2 (en) 2013-05-02 2017-08-29 Locu, Inc. System and method for enabling online ordering using unique identifiers
US20140337100A1 (en) * 2013-05-10 2014-11-13 Mark Crawford System and method of obtaining customer feedback
US9607311B2 (en) * 2013-06-10 2017-03-28 Chian Chiu Li System and methods for conducting surveys
US10264113B2 (en) 2014-01-10 2019-04-16 Onepin, Inc. Automated messaging
US10298740B2 (en) 2014-01-10 2019-05-21 Onepin, Inc. Automated messaging
SG2014014773A (en) * 2014-02-13 2015-09-29 Service Bureau Pte Ltd Method and system for managing customer feedback survey responses
US20150324821A1 (en) * 2014-05-07 2015-11-12 Rincon & Associates Multicultural Survey Response System
US10181051B2 (en) 2016-06-10 2019-01-15 OneTrust, LLC Data processing systems for generating and populating a data inventory for processing data access requests
US9729583B1 (en) 2016-06-10 2017-08-08 OneTrust, LLC Data processing systems and methods for performing privacy assessments and monitoring of new versions of computer code for privacy compliance
US10289867B2 (en) 2014-07-27 2019-05-14 OneTrust, LLC Data processing systems for webform crawling to map processing activities and related methods
US9824323B1 (en) 2014-08-11 2017-11-21 Walgreen Co. Gathering in-store employee ratings using triggered feedback solicitations
US20160048799A1 (en) 2014-08-15 2016-02-18 Xpo Last Mile, Inc. Cascading call notification system and method
US10753751B2 (en) * 2014-09-19 2020-08-25 Arris Enterprises Llc Systems and methods for street level routing
US10007918B1 (en) * 2014-12-26 2018-06-26 Sprint Communications Company L.P. Customer care automation system
WO2016123381A1 (en) * 2015-01-28 2016-08-04 Kamensky William Concepts for determining attributes of a population of mobile device users
US10026506B1 (en) 2015-02-06 2018-07-17 Brain Trust Innovations I, Llc System, RFID chip, server and method for capturing vehicle data
EP3269159A4 (en) 2015-03-09 2019-01-16 OnePin, Inc. Automatic event-based network monitoring
WO2016179197A1 (en) 2015-05-04 2016-11-10 Onepin, Inc. Automatic aftercall directory and phonebook entry advertising
US9451083B1 (en) 2015-08-03 2016-09-20 International Business Machines Corporation Communication answering time estimation
WO2017058826A1 (en) 2015-09-28 2017-04-06 Google Inc. Sharing images and image albums over a communication network
US9711836B1 (en) 2015-10-30 2017-07-18 The United States of America as represented by the Secretary of the Air Force Tunable high isolation circulator
US9894342B2 (en) * 2015-11-25 2018-02-13 Red Hat Israel, Ltd. Flicker-free remoting support for server-rendered stereoscopic imaging
US10176502B2 (en) 2016-04-01 2019-01-08 OneTrust, LLC Data processing systems and methods for integrating privacy information management systems with data loss prevention tools or other tools for privacy design
US9892444B2 (en) 2016-04-01 2018-02-13 OneTrust, LLC Data processing systems and communication systems and methods for the efficient generation of privacy risk assessments
US10176503B2 (en) 2016-04-01 2019-01-08 OneTrust, LLC Data processing systems and methods for efficiently assessing the risk of privacy campaigns
US11004125B2 (en) 2016-04-01 2021-05-11 OneTrust, LLC Data processing systems and methods for integrating privacy information management systems with data loss prevention tools or other tools for privacy design
US9898769B2 (en) 2016-04-01 2018-02-20 OneTrust, LLC Data processing systems and methods for operationalizing privacy compliance via integrated mobile applications
US20220164840A1 (en) 2016-04-01 2022-05-26 OneTrust, LLC Data processing systems and methods for integrating privacy information management systems with data loss prevention tools or other tools for privacy design
US10423996B2 (en) 2016-04-01 2019-09-24 OneTrust, LLC Data processing systems and communication systems and methods for the efficient generation of privacy risk assessments
US11244367B2 (en) 2016-04-01 2022-02-08 OneTrust, LLC Data processing systems and methods for integrating privacy information management systems with data loss prevention tools or other tools for privacy design
US9892443B2 (en) 2016-04-01 2018-02-13 OneTrust, LLC Data processing systems for modifying privacy campaign data via electronic messaging systems
US10706447B2 (en) 2016-04-01 2020-07-07 OneTrust, LLC Data processing systems and communication systems and methods for the efficient generation of privacy risk assessments
US10885485B2 (en) 2016-06-10 2021-01-05 OneTrust, LLC Privacy management systems and methods
US10783256B2 (en) 2016-06-10 2020-09-22 OneTrust, LLC Data processing systems for data transfer risk identification and related methods
US10776518B2 (en) 2016-06-10 2020-09-15 OneTrust, LLC Consent receipt management systems and related methods
US10503926B2 (en) 2016-06-10 2019-12-10 OneTrust, LLC Consent receipt management systems and related methods
US11025675B2 (en) 2016-06-10 2021-06-01 OneTrust, LLC Data processing systems and methods for performing privacy assessments and monitoring of new versions of computer code for privacy compliance
US10769301B2 (en) 2016-06-10 2020-09-08 OneTrust, LLC Data processing systems for webform crawling to map processing activities and related methods
US10467432B2 (en) 2016-06-10 2019-11-05 OneTrust, LLC Data processing systems for use in automatically generating, populating, and submitting data subject access requests
US10242228B2 (en) 2016-06-10 2019-03-26 OneTrust, LLC Data processing systems for measuring privacy maturity within an organization
US10846433B2 (en) 2016-06-10 2020-11-24 OneTrust, LLC Data processing consent management systems and related methods
US11438386B2 (en) 2016-06-10 2022-09-06 OneTrust, LLC Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US11228620B2 (en) 2016-06-10 2022-01-18 OneTrust, LLC Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US10685140B2 (en) 2016-06-10 2020-06-16 OneTrust, LLC Consent receipt management systems and related methods
US10509920B2 (en) 2016-06-10 2019-12-17 OneTrust, LLC Data processing systems for processing data subject access requests
US10839102B2 (en) 2016-06-10 2020-11-17 OneTrust, LLC Data processing systems for identifying and modifying processes that are subject to data subject access requests
US10282559B2 (en) 2016-06-10 2019-05-07 OneTrust, LLC Data processing systems for identifying, assessing, and remediating data processing risks using data modeling techniques
US10353674B2 (en) 2016-06-10 2019-07-16 OneTrust, LLC Data processing and communications systems and methods for the efficient implementation of privacy by design
US10796260B2 (en) 2016-06-10 2020-10-06 OneTrust, LLC Privacy management systems and methods
US10848523B2 (en) 2016-06-10 2020-11-24 OneTrust, LLC Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US10181019B2 (en) 2016-06-10 2019-01-15 OneTrust, LLC Data processing systems and communications systems and methods for integrating privacy compliance systems with software development and agile tools for privacy design
US10289870B2 (en) 2016-06-10 2019-05-14 OneTrust, LLC Data processing systems for fulfilling data subject access requests and related methods
US11366909B2 (en) 2016-06-10 2022-06-21 OneTrust, LLC Data processing and scanning systems for assessing vendor risk
US10565236B1 (en) 2016-06-10 2020-02-18 OneTrust, LLC Data processing systems for generating and populating a data inventory
US11544667B2 (en) 2016-06-10 2023-01-03 OneTrust, LLC Data processing systems for generating and populating a data inventory
US10346638B2 (en) 2016-06-10 2019-07-09 OneTrust, LLC Data processing systems for identifying and modifying processes that are subject to data subject access requests
US10430740B2 (en) 2016-06-10 2019-10-01 One Trust, LLC Data processing systems for calculating and communicating cost of fulfilling data subject access requests and related methods
US10740487B2 (en) 2016-06-10 2020-08-11 OneTrust, LLC Data processing systems and methods for populating and maintaining a centralized database of personal data
US10586075B2 (en) 2016-06-10 2020-03-10 OneTrust, LLC Data processing systems for orphaned data identification and deletion and related methods
US11416798B2 (en) 2016-06-10 2022-08-16 OneTrust, LLC Data processing systems and methods for providing training in a vendor procurement process
US10510031B2 (en) 2016-06-10 2019-12-17 OneTrust, LLC Data processing systems for identifying, assessing, and remediating data processing risks using data modeling techniques
US10944725B2 (en) 2016-06-10 2021-03-09 OneTrust, LLC Data processing systems and methods for using a data model to select a target data asset in a data migration
US11475136B2 (en) 2016-06-10 2022-10-18 OneTrust, LLC Data processing systems for data transfer risk identification and related methods
US10565397B1 (en) 2016-06-10 2020-02-18 OneTrust, LLC Data processing systems for fulfilling data subject access requests and related methods
US11625502B2 (en) 2016-06-10 2023-04-11 OneTrust, LLC Data processing systems for identifying and modifying processes that are subject to data subject access requests
US11481710B2 (en) 2016-06-10 2022-10-25 OneTrust, LLC Privacy management systems and methods
US11328092B2 (en) 2016-06-10 2022-05-10 OneTrust, LLC Data processing systems for processing and managing data subject access in a distributed environment
US10437412B2 (en) 2016-06-10 2019-10-08 OneTrust, LLC Consent receipt management systems and related methods
US11138242B2 (en) 2016-06-10 2021-10-05 OneTrust, LLC Data processing systems and methods for automatically detecting and documenting privacy-related aspects of computer software
US10585968B2 (en) 2016-06-10 2020-03-10 OneTrust, LLC Data processing systems for fulfilling data subject access requests and related methods
US11074367B2 (en) 2016-06-10 2021-07-27 OneTrust, LLC Data processing systems for identity validation for consumer rights requests and related methods
US10565161B2 (en) 2016-06-10 2020-02-18 OneTrust, LLC Data processing systems for processing data subject access requests
US10346637B2 (en) 2016-06-10 2019-07-09 OneTrust, LLC Data processing systems for the identification and deletion of personal data in computer systems
US11295316B2 (en) 2016-06-10 2022-04-05 OneTrust, LLC Data processing systems for identity validation for consumer rights requests and related methods
US11562097B2 (en) 2016-06-10 2023-01-24 OneTrust, LLC Data processing systems for central consent repository and related methods
US11057356B2 (en) 2016-06-10 2021-07-06 OneTrust, LLC Automated data processing systems and methods for automatically processing data subject access requests using a chatbot
US10496803B2 (en) 2016-06-10 2019-12-03 OneTrust, LLC Data processing systems and methods for efficiently assessing the risk of privacy campaigns
US11675929B2 (en) 2016-06-10 2023-06-13 OneTrust, LLC Data processing consent sharing systems and related methods
US12118121B2 (en) 2016-06-10 2024-10-15 OneTrust, LLC Data subject access request processing systems and related methods
US11636171B2 (en) 2016-06-10 2023-04-25 OneTrust, LLC Data processing user interface monitoring systems and related methods
US11651106B2 (en) 2016-06-10 2023-05-16 OneTrust, LLC Data processing systems for fulfilling data subject access requests and related methods
US10592692B2 (en) 2016-06-10 2020-03-17 OneTrust, LLC Data processing systems for central consent repository and related methods
US11210420B2 (en) 2016-06-10 2021-12-28 OneTrust, LLC Data subject access request processing systems and related methods
US10997315B2 (en) 2016-06-10 2021-05-04 OneTrust, LLC Data processing systems for fulfilling data subject access requests and related methods
US10949170B2 (en) 2016-06-10 2021-03-16 OneTrust, LLC Data processing systems for integration of consumer feedback with data subject access requests and related methods
US11138299B2 (en) 2016-06-10 2021-10-05 OneTrust, LLC Data processing and scanning systems for assessing vendor risk
US10776517B2 (en) 2016-06-10 2020-09-15 OneTrust, LLC Data processing systems for calculating and communicating cost of fulfilling data subject access requests and related methods
US10878127B2 (en) 2016-06-10 2020-12-29 OneTrust, LLC Data subject access request processing systems and related methods
US10452866B2 (en) 2016-06-10 2019-10-22 OneTrust, LLC Data processing systems for fulfilling data subject access requests and related methods
US10318761B2 (en) 2016-06-10 2019-06-11 OneTrust, LLC Data processing systems and methods for auditing data request compliance
US11586700B2 (en) 2016-06-10 2023-02-21 OneTrust, LLC Data processing systems and methods for automatically blocking the use of tracking tools
US11392720B2 (en) 2016-06-10 2022-07-19 OneTrust, LLC Data processing systems for verification of consent and notice processing and related methods
US11087260B2 (en) 2016-06-10 2021-08-10 OneTrust, LLC Data processing systems and methods for customizing privacy training
US11651104B2 (en) 2016-06-10 2023-05-16 OneTrust, LLC Consent receipt management systems and related methods
US10726158B2 (en) 2016-06-10 2020-07-28 OneTrust, LLC Consent receipt management and automated process blocking systems and related methods
US11301796B2 (en) 2016-06-10 2022-04-12 OneTrust, LLC Data processing systems and methods for customizing privacy training
US10496846B1 (en) 2016-06-10 2019-12-03 OneTrust, LLC Data processing and communications systems and methods for the efficient implementation of privacy by design
US11144622B2 (en) 2016-06-10 2021-10-12 OneTrust, LLC Privacy management systems and methods
US12052289B2 (en) 2016-06-10 2024-07-30 OneTrust, LLC Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US10853501B2 (en) 2016-06-10 2020-12-01 OneTrust, LLC Data processing and scanning systems for assessing vendor risk
US10169609B1 (en) 2016-06-10 2019-01-01 OneTrust, LLC Data processing systems for fulfilling data subject access requests and related methods
US11151233B2 (en) 2016-06-10 2021-10-19 OneTrust, LLC Data processing and scanning systems for assessing vendor risk
US11520928B2 (en) 2016-06-10 2022-12-06 OneTrust, LLC Data processing systems for generating personal data receipts and related methods
US10706131B2 (en) 2016-06-10 2020-07-07 OneTrust, LLC Data processing systems and methods for efficiently assessing the risk of privacy campaigns
US10282692B2 (en) 2016-06-10 2019-05-07 OneTrust, LLC Data processing systems for identifying, assessing, and remediating data processing risks using data modeling techniques
US11416109B2 (en) 2016-06-10 2022-08-16 OneTrust, LLC Automated data processing systems and methods for automatically processing data subject access requests using a chatbot
US10607028B2 (en) 2016-06-10 2020-03-31 OneTrust, LLC Data processing systems for data testing to confirm data deletion and related methods
US11238390B2 (en) 2016-06-10 2022-02-01 OneTrust, LLC Privacy management systems and methods
US10896394B2 (en) 2016-06-10 2021-01-19 OneTrust, LLC Privacy management systems and methods
US10708305B2 (en) 2016-06-10 2020-07-07 OneTrust, LLC Automated data processing systems and methods for automatically processing requests for privacy-related information
US11222139B2 (en) 2016-06-10 2022-01-11 OneTrust, LLC Data processing systems and methods for automatic discovery and assessment of mobile software development kits
US10642870B2 (en) 2016-06-10 2020-05-05 OneTrust, LLC Data processing systems and methods for automatically detecting and documenting privacy-related aspects of computer software
US11146566B2 (en) 2016-06-10 2021-10-12 OneTrust, LLC Data processing systems for fulfilling data subject access requests and related methods
US10873606B2 (en) 2016-06-10 2020-12-22 OneTrust, LLC Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US10289866B2 (en) 2016-06-10 2019-05-14 OneTrust, LLC Data processing systems for fulfilling data subject access requests and related methods
US10235534B2 (en) 2016-06-10 2019-03-19 OneTrust, LLC Data processing systems for prioritizing data subject access requests for fulfillment and related methods
US10803200B2 (en) 2016-06-10 2020-10-13 OneTrust, LLC Data processing systems for processing and managing data subject access in a distributed environment
US10762236B2 (en) 2016-06-10 2020-09-01 OneTrust, LLC Data processing user interface monitoring systems and related methods
US11416589B2 (en) 2016-06-10 2022-08-16 OneTrust, LLC Data processing and scanning systems for assessing vendor risk
US10706379B2 (en) 2016-06-10 2020-07-07 OneTrust, LLC Data processing systems for automatic preparation for remediation and related methods
US12045266B2 (en) 2016-06-10 2024-07-23 OneTrust, LLC Data processing systems for generating and populating a data inventory
US10606916B2 (en) 2016-06-10 2020-03-31 OneTrust, LLC Data processing user interface monitoring systems and related methods
US11294939B2 (en) 2016-06-10 2022-04-05 OneTrust, LLC Data processing systems and methods for automatically detecting and documenting privacy-related aspects of computer software
US10592648B2 (en) 2016-06-10 2020-03-17 OneTrust, LLC Consent receipt management systems and related methods
US11200341B2 (en) 2016-06-10 2021-12-14 OneTrust, LLC Consent receipt management systems and related methods
US10438017B2 (en) 2016-06-10 2019-10-08 OneTrust, LLC Data processing systems for processing data subject access requests
US11366786B2 (en) 2016-06-10 2022-06-21 OneTrust, LLC Data processing systems for processing data subject access requests
US10997318B2 (en) 2016-06-10 2021-05-04 OneTrust, LLC Data processing systems for generating and populating a data inventory for processing data access requests
US10776514B2 (en) 2016-06-10 2020-09-15 OneTrust, LLC Data processing systems for the identification and deletion of personal data in computer systems
US11188615B2 (en) 2016-06-10 2021-11-30 OneTrust, LLC Data processing consent capture systems and related methods
US10353673B2 (en) * 2016-06-10 2019-07-16 OneTrust, LLC Data processing systems for integration of consumer feedback with data subject access requests and related methods
US10454973B2 (en) 2016-06-10 2019-10-22 OneTrust, LLC Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US10284604B2 (en) 2016-06-10 2019-05-07 OneTrust, LLC Data processing and scanning systems for generating and populating a data inventory
US11403377B2 (en) 2016-06-10 2022-08-02 OneTrust, LLC Privacy management systems and methods
US10706174B2 (en) 2016-06-10 2020-07-07 OneTrust, LLC Data processing systems for prioritizing data subject access requests for fulfillment and related methods
US10509894B2 (en) 2016-06-10 2019-12-17 OneTrust, LLC Data processing and scanning systems for assessing vendor risk
US11336697B2 (en) 2016-06-10 2022-05-17 OneTrust, LLC Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US11354435B2 (en) 2016-06-10 2022-06-07 OneTrust, LLC Data processing systems for data testing to confirm data deletion and related methods
US10706176B2 (en) 2016-06-10 2020-07-07 OneTrust, LLC Data-processing consent refresh, re-prompt, and recapture systems and related methods
US11134086B2 (en) 2016-06-10 2021-09-28 OneTrust, LLC Consent conversion optimization systems and related methods
US10798133B2 (en) 2016-06-10 2020-10-06 OneTrust, LLC Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US10275614B2 (en) 2016-06-10 2019-04-30 OneTrust, LLC Data processing systems for generating and populating a data inventory
US11023842B2 (en) 2016-06-10 2021-06-01 OneTrust, LLC Data processing systems and methods for bundled privacy policies
US10572686B2 (en) 2016-06-10 2020-02-25 OneTrust, LLC Consent receipt management systems and related methods
US10416966B2 (en) 2016-06-10 2019-09-17 OneTrust, LLC Data processing systems for identity validation of data subject access requests and related methods
US11227247B2 (en) 2016-06-10 2022-01-18 OneTrust, LLC Data processing systems and methods for bundled privacy policies
US10614247B2 (en) 2016-06-10 2020-04-07 OneTrust, LLC Data processing systems for automated classification of personal information from documents and related methods
US10949565B2 (en) 2016-06-10 2021-03-16 OneTrust, LLC Data processing systems for generating and populating a data inventory
US11727141B2 (en) 2016-06-10 2023-08-15 OneTrust, LLC Data processing systems and methods for synching privacy-related user consent across multiple computing devices
US11418492B2 (en) 2016-06-10 2022-08-16 OneTrust, LLC Data processing systems and methods for using a data model to select a target data asset in a data migration
US11461500B2 (en) 2016-06-10 2022-10-04 OneTrust, LLC Data processing systems for cookie compliance testing with website scanning and related methods
US10713387B2 (en) 2016-06-10 2020-07-14 OneTrust, LLC Consent conversion optimization systems and related methods
US10909488B2 (en) 2016-06-10 2021-02-02 OneTrust, LLC Data processing systems for assessing readiness for responding to privacy-related incidents
US11100444B2 (en) 2016-06-10 2021-08-24 OneTrust, LLC Data processing systems and methods for providing training in a vendor procurement process
US11354434B2 (en) 2016-06-10 2022-06-07 OneTrust, LLC Data processing systems for verification of consent and notice processing and related methods
US11157600B2 (en) 2016-06-10 2021-10-26 OneTrust, LLC Data processing and scanning systems for assessing vendor risk
US11038925B2 (en) 2016-06-10 2021-06-15 OneTrust, LLC Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US10440062B2 (en) 2016-06-10 2019-10-08 OneTrust, LLC Consent receipt management systems and related methods
US11343284B2 (en) 2016-06-10 2022-05-24 OneTrust, LLC Data processing systems and methods for performing privacy assessments and monitoring of new versions of computer code for privacy compliance
US10909265B2 (en) 2016-06-10 2021-02-02 OneTrust, LLC Application privacy scanning systems and related methods
US11416590B2 (en) 2016-06-10 2022-08-16 OneTrust, LLC Data processing and scanning systems for assessing vendor risk
US10678945B2 (en) 2016-06-10 2020-06-09 OneTrust, LLC Consent receipt management systems and related methods
US11277448B2 (en) 2016-06-10 2022-03-15 OneTrust, LLC Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US11341447B2 (en) 2016-06-10 2022-05-24 OneTrust, LLC Privacy management systems and methods
US11222309B2 (en) 2016-06-10 2022-01-11 OneTrust, LLC Data processing systems for generating and populating a data inventory
US10204154B2 (en) 2016-06-10 2019-02-12 OneTrust, LLC Data processing systems for generating and populating a data inventory
US11222142B2 (en) 2016-06-10 2022-01-11 OneTrust, LLC Data processing systems for validating authorization for personal data collection, storage, and processing
US10282700B2 (en) 2016-06-10 2019-05-07 OneTrust, LLC Data processing systems for generating and populating a data inventory
US11188862B2 (en) 2016-06-10 2021-11-30 OneTrust, LLC Privacy management systems and methods
US10452864B2 (en) 2016-06-10 2019-10-22 OneTrust, LLC Data processing systems for webform crawling to map processing activities and related methods
FR3053192A1 (en) * 2016-06-23 2017-12-29 Orange METHOD FOR TRANSMITTING A DIGITAL SIGNAL FOR A SYSTEM HAVING AT LEAST ONE DYNAMIC HALF-DUPLEX RELAY WITH SELECTIVE LOGIC, PROGRAM PRODUCT AND CORRESPONDING RELAY DEVICE
EP3523790A4 (en) 2016-10-04 2020-06-10 E*Dray 20/20 LLC System and method for collaborative and dynamic coordination of transportation of shipping containers
US10423991B1 (en) * 2016-11-30 2019-09-24 Uber Technologies, Inc. Implementing and optimizing safety interventions
US10977604B2 (en) 2017-01-23 2021-04-13 Uber Technologies, Inc. Systems for routing and controlling vehicles for freight
US11392970B2 (en) * 2017-02-15 2022-07-19 Qualtrics, Llc Administering a digital survey over voice-capable devices
CN106982304B (en) 2017-02-16 2019-04-19 平安科技(深圳)有限公司 A kind of score information matching process and device
US10440180B1 (en) 2017-02-27 2019-10-08 United Services Automobile Association (Usaa) Learning based metric determination for service sessions
US11334476B2 (en) * 2017-03-28 2022-05-17 Microsoft Technology Licensing, Llc Client-side survey control
WO2018212815A1 (en) 2017-05-17 2018-11-22 Google Llc Automatic image sharing with designated users over a communication network
US10013577B1 (en) 2017-06-16 2018-07-03 OneTrust, LLC Data processing systems for identifying whether cookies contain personally identifying information
US11250372B2 (en) 2017-09-22 2022-02-15 Uber Technologies, Inc Freight network system using modularized trailers
US10293832B2 (en) 2017-10-25 2019-05-21 Uber Technologies, Inc. Network computer system to evaluate an operator of a freight vehicle
US11455644B2 (en) 2017-11-03 2022-09-27 Microsoft Technology Licensing, Llc Dynamic governance of exposing inquiries and notifications at client devices
US10902033B2 (en) * 2017-12-01 2021-01-26 Uber Technologies, Inc. Point of interest accuracy using tickets
JP6550162B1 (en) * 2018-03-29 2019-07-24 中村 一人 Taxi allocation system and taxi allocation method
US11392881B2 (en) 2018-04-16 2022-07-19 Uber Technologies, Inc. Freight vehicle matching and operation
CN108711028B (en) * 2018-05-28 2022-11-29 环境保护部华南环境科学研究所 Distributed computation-based solid waste classified transportation multistage cooperative decision making system
US11303632B1 (en) * 2018-06-08 2022-04-12 Wells Fargo Bank, N.A. Two-way authentication system and method
US10740536B2 (en) * 2018-08-06 2020-08-11 International Business Machines Corporation Dynamic survey generation and verification
US10803202B2 (en) 2018-09-07 2020-10-13 OneTrust, LLC Data processing systems for orphaned data identification and deletion and related methods
US11144675B2 (en) 2018-09-07 2021-10-12 OneTrust, LLC Data processing systems and methods for automatically protecting sensitive data within privacy management systems
US11544409B2 (en) 2018-09-07 2023-01-03 OneTrust, LLC Data processing systems and methods for automatically protecting sensitive data within privacy management systems
US10997013B2 (en) 2018-12-10 2021-05-04 Microsoft Technology Licensing, Llc Systems and methods of analyzing user responses to inquiries to diagnose and mitigate reported performance issues on a client device
US11155263B2 (en) 2019-03-08 2021-10-26 Uber Technologies, Inc. Network computer system to control freight vehicle operation configurations
US11386441B2 (en) * 2019-04-17 2022-07-12 Citrix Systems, Inc. Enhancing employee engagement using intelligent workspaces
CN110077676B (en) * 2019-04-19 2021-11-23 宝鸡胜利现代农业开发有限公司 Energy-saving industrial edible mushroom production system
US11810135B2 (en) 2019-06-25 2023-11-07 Otsuka America Pharmaceutical, Inc. System and method for generating transaction trigger data structures for aggregated reporting
CN111080109B (en) * 2019-12-06 2023-05-05 中信银行股份有限公司 Customer service quality evaluation method and device and electronic equipment
US11797528B2 (en) 2020-07-08 2023-10-24 OneTrust, LLC Systems and methods for targeted data discovery
WO2022026564A1 (en) 2020-07-28 2022-02-03 OneTrust, LLC Systems and methods for automatically blocking the use of tracking tools
US20220037004A1 (en) * 2020-07-31 2022-02-03 Hennepin Healthcare System, Inc. Healthcare worker burnout detection tool
WO2022032072A1 (en) 2020-08-06 2022-02-10 OneTrust, LLC Data processing systems and methods for automatically redacting unstructured data from a data subject access request
US11436373B2 (en) 2020-09-15 2022-09-06 OneTrust, LLC Data processing systems and methods for detecting tools for the automatic blocking of consent requests
WO2022061270A1 (en) 2020-09-21 2022-03-24 OneTrust, LLC Data processing systems and methods for automatically detecting target data transfers and target data processing
US11397819B2 (en) 2020-11-06 2022-07-26 OneTrust, LLC Systems and methods for identifying data processing activities based on data discovery results
US11687528B2 (en) 2021-01-25 2023-06-27 OneTrust, LLC Systems and methods for discovery, classification, and indexing of data in a native computing system
WO2022170047A1 (en) 2021-02-04 2022-08-11 OneTrust, LLC Managing custom attributes for domain objects defined within microservices
US11494515B2 (en) 2021-02-08 2022-11-08 OneTrust, LLC Data processing systems and methods for anonymizing data samples in classification analysis
US20240098109A1 (en) 2021-02-10 2024-03-21 OneTrust, LLC Systems and methods for mitigating risks of third-party computing system functionality integration into a first-party computing system
WO2022178089A1 (en) 2021-02-17 2022-08-25 OneTrust, LLC Managing custom workflows for domain objects defined within microservices
WO2022178219A1 (en) 2021-02-18 2022-08-25 OneTrust, LLC Selective redaction of media content
US11533315B2 (en) 2021-03-08 2022-12-20 OneTrust, LLC Data transfer discovery and analysis systems and related methods
US20220292519A1 (en) * 2021-03-15 2022-09-15 Ncr Corporation Item return data integration processing
US11562078B2 (en) 2021-04-16 2023-01-24 OneTrust, LLC Assessing and managing computational risk involved with integrating third party computing functionality within a computing system
US11637793B2 (en) 2021-05-11 2023-04-25 Zipwhip, Llc Communication system facilitating delivery of a predefined message to end user devices on initial communications with an entity
US11620142B1 (en) 2022-06-03 2023-04-04 OneTrust, LLC Generating and customizing user interfaces for demonstrating functions of interactive user environments

Family Cites Families (202)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4153874A (en) * 1977-08-26 1979-05-08 Kaestner Erwin A Mobile perpetually self-correcting estimated time of arrival calculator
US4937439A (en) * 1988-05-13 1990-06-26 National Computer Systems, Inc. Method and system for creating and scanning a customized survey form
US6684195B1 (en) * 1989-05-01 2004-01-27 Catalina Marketing International, Inc. Method and system for selective incentive point-of-sale marketing in response to customer shopping histories
US5131024A (en) * 1990-05-16 1992-07-14 Messager Partners Method and apparatus for providing proactive call services following call completion
US5309505A (en) * 1991-05-20 1994-05-03 Inventions, Inc. Automated voice system for improving agent efficiency and improving service to parties on hold
US7133834B1 (en) * 1992-08-06 2006-11-07 Ferrara Ethereal Llc Product value information interchange server
WO1994027264A1 (en) * 1993-05-14 1994-11-24 Worldwide Notification Systems, Inc. Apparatus and method of notifying a recipient of an unscheduled delivery
US6952645B1 (en) * 1997-03-10 2005-10-04 Arrivalstar, Inc. System and method for activation of an advance notification system for monitoring and reporting status of vehicle travel
US6278936B1 (en) * 1993-05-18 2001-08-21 Global Research Systems, Inc. System and method for an advance notification system for monitoring and reporting proximity of a vehicle
US6492912B1 (en) * 1993-05-18 2002-12-10 Arrivalstar, Inc. System and method for efficiently notifying users of impending arrivals of vehicles
US5668543A (en) * 1993-05-18 1997-09-16 Global Research Systems, Inc. Advance notification system and method utilizing passenger calling report generator
US6748318B1 (en) * 1993-05-18 2004-06-08 Arrivalstar, Inc. Advanced notification systems and methods utilizing a computer network
US20030098802A1 (en) * 1999-03-01 2003-05-29 Jones Martin Kelly Base station apparatus and method for monitoring travel of a mobile vehicle
US5623260A (en) * 1993-05-18 1997-04-22 Global Research Systems, Inc. Advance notification system and method utilizing passenger-definable notification time period
US5400020A (en) * 1993-05-18 1995-03-21 Global Research Systems, Inc. Advance notification system and method
US6700507B2 (en) * 1993-05-18 2004-03-02 Arrivalstar, Inc. Advance notification system and method utilizing vehicle signaling
US6618668B1 (en) * 2000-04-26 2003-09-09 Arrivalstar, Inc. System and method for obtaining vehicle schedule information in an advance notification system
US5657010A (en) * 1993-05-18 1997-08-12 Global Research Systems, Inc. Advance notification system and method utilizing vehicle progress report generator
US6363323B1 (en) * 1993-05-18 2002-03-26 Global Research Systems, Inc. Apparatus and method for monitoring travel of a mobile vehicle
US6486801B1 (en) * 1993-05-18 2002-11-26 Arrivalstar, Inc. Base station apparatus and method for monitoring travel of a mobile vehicle
US6683542B1 (en) * 1993-05-18 2004-01-27 Arrivalstar, Inc. Advanced notification system and method utilizing a distinctive telephone ring
US6748320B2 (en) * 1993-05-18 2004-06-08 Arrivalstar, Inc. Advance notification systems and methods utilizing a computer network
US5517405A (en) * 1993-10-14 1996-05-14 Aetna Life And Casualty Company Expert system for providing interactive assistance in solving problems such as health care management
US5471382A (en) * 1994-01-10 1995-11-28 Informed Access Systems, Inc. Medical network management system and process
US5703935A (en) * 1994-03-29 1997-12-30 Mci Communications Corporation Automated telephone operator services
AUPM813394A0 (en) * 1994-09-14 1994-10-06 Dolphin Software Pty Ltd A method and apparatus for preparation of a database document in a local processing apparatus and loading of the database document with data from remote sources
US5724243A (en) * 1995-02-10 1998-03-03 Highwaymaster Communications, Inc. Method and apparatus for determining expected time of arrival
US5706441A (en) * 1995-06-07 1998-01-06 Cigna Health Corporation Method and apparatus for objectively monitoring and assessing the performance of health-care providers
US5884032A (en) * 1995-09-25 1999-03-16 The New Brunswick Telephone Company, Limited System for coordinating communications via customer contact channel changing system using call centre for setting up the call between customer and an available help agent
US5924073A (en) * 1995-11-14 1999-07-13 Beacon Patient Physician Association, Llc System and method for assessing physician performance using robust multivariate techniques of statistical analysis
US5838774A (en) * 1996-07-01 1998-11-17 Bellsouth Corporation Telephone polling method
US5956693A (en) * 1996-07-19 1999-09-21 Geerlings; Huib Computer system for merchant communication to customers
US6513014B1 (en) * 1996-07-24 2003-01-28 Walker Digital, Llc Method and apparatus for administering a survey via a television transmission network
AUPO214096A0 (en) * 1996-09-04 1996-09-26 Telefonaktiebolaget Lm Ericsson (Publ) A telecommunications system and method for automatic call recognition and distribution
US6151581A (en) * 1996-12-17 2000-11-21 Pulsegroup Inc. System for and method of collecting and populating a database with physician/patient data for processing to improve practice quality and healthcare delivery
WO1998040837A1 (en) * 1997-03-10 1998-09-17 Global Research Systems, Inc. Advanced notification systems and methods utilizing a computer network
US6032177A (en) * 1997-05-23 2000-02-29 O'donnell; Charles A. Method and apparatus for conducting an interview between a server computer and a respondent computer
US6990458B2 (en) * 1997-08-28 2006-01-24 Csg Systems, Inc. System and method for computer-aided technician dispatch and communication
US6574621B1 (en) * 1997-09-23 2003-06-03 Unisys Corporation Survey analysis system and method
US6151584A (en) * 1997-11-20 2000-11-21 Ncr Corporation Computer architecture and method for validating and collecting and metadata and data about the internet and electronic commerce environments (data discoverer)
CA2223597A1 (en) * 1998-01-06 1999-07-06 Ses Canada Research Inc. Automated survey kiosk
US5943416A (en) * 1998-02-17 1999-08-24 Genesys Telecommunications Laboratories, Inc. Automated survey control routine in a call center environment
US6477504B1 (en) * 1998-03-02 2002-11-05 Ix, Inc. Method and apparatus for automating the conduct of surveys over a network system
EP1062559A2 (en) * 1998-03-12 2000-12-27 DMW Worldwide, Inc. Operational system for operating on client defined rules
US6101483A (en) * 1998-05-29 2000-08-08 Symbol Technologies, Inc. Personal shopping system portable terminal
US6535743B1 (en) * 1998-07-29 2003-03-18 Minorplanet Systems Usa, Inc. System and method for providing directions using a communication network
US6167255A (en) * 1998-07-29 2000-12-26 @Track Communications, Inc. System and method for providing menu data using a communication network
US6363254B1 (en) * 1998-09-30 2002-03-26 Global Research Systems, Inc. System and method for enciphering and communicating vehicle tracking information
CA2287768C (en) * 1998-11-02 2004-01-13 Ahmed Abdoh Method for automated data collection, analysis and reporting
US6370231B1 (en) * 1998-11-24 2002-04-09 Bellsouth Intellectual Property Corporation Method and system for calculating the estimated time of arrival of a service technician
US6665395B1 (en) * 1998-12-11 2003-12-16 Avaya Technology Corp. Automatic call distribution system using computer network-based communication
US6311190B1 (en) * 1999-02-02 2001-10-30 Harris Interactive Inc. System for conducting surveys in different languages over a network with survey voter registration
US6970831B1 (en) * 1999-02-23 2005-11-29 Performax, Inc. Method and means for evaluating customer service performance
US6415207B1 (en) * 1999-03-01 2002-07-02 Global Research Systems, Inc. System and method for automatically providing vehicle status information
CN1345413A (en) * 1999-03-01 2002-04-17 环球研究系统公司 Base station system and method for monitoring travel of mobile vehicles and communication notification messages
US20020052774A1 (en) * 1999-12-23 2002-05-02 Lance Parker Collecting and analyzing survey data
US7191142B1 (en) * 1999-12-30 2007-03-13 General Electric Company Internet based goods delivery system
US6724887B1 (en) * 2000-01-24 2004-04-20 Verint Systems, Inc. Method and system for analyzing customer communications with a contact center
US6999565B1 (en) * 2000-02-01 2006-02-14 Envoyworldwide, Inc. Multi-mode message routing and management
US20040128155A1 (en) * 2000-02-15 2004-07-01 Lalitha Vaidyanathan System and method for resolving a dispute in electronic commerce and managing an online dispute resolution process
AU2001241510A1 (en) * 2000-02-16 2001-08-27 Askit Systems Inc. Customer service system and method
US6510383B1 (en) * 2000-03-01 2003-01-21 Arrivalstar, Inc. Vehicular route optimization system and method
US6975998B1 (en) * 2000-03-01 2005-12-13 Arrivalstar, Inc. Package delivery notification system and method
US20010037206A1 (en) * 2000-03-02 2001-11-01 Vivonet, Inc. Method and system for automatically generating questions and receiving customer feedback for each transaction
US20020035474A1 (en) * 2000-07-18 2002-03-21 Ahmet Alpdemir Voice-interactive marketplace providing time and money saving benefits and real-time promotion publishing and feedback
US6934684B2 (en) * 2000-03-24 2005-08-23 Dialsurf, Inc. Voice-interactive marketplace providing promotion and promotion tracking, loyalty reward and redemption, and other features
US6539392B1 (en) * 2000-03-29 2003-03-25 Bizrate.Com System and method for data collection, evaluation, information generation, and presentation
US7962359B2 (en) * 2000-04-06 2011-06-14 Autopoll, Inc. Method and system for collecting and disseminating survey data over the internet
US20020032613A1 (en) * 2000-04-18 2002-03-14 Buettgenbach Thomas H. Methods and systems for the physical delivery of goods ordered through an electronic network
WO2001084433A1 (en) * 2000-05-01 2001-11-08 Mobliss, Inc. System for conducting electronic surveys
US6994253B2 (en) * 2000-05-11 2006-02-07 United Parcel Service Of America Systems and methods of item delivery utilizing a delivery notice
US20020016726A1 (en) * 2000-05-15 2002-02-07 Ross Kenneth J. Package delivery systems and methods
US7606726B2 (en) * 2000-05-31 2009-10-20 Quality Data Management Inc. Interactive survey and data management method and apparatus
US7509266B2 (en) * 2000-05-31 2009-03-24 Quality Data Management Inc. Integrated communication system and method
US6870900B1 (en) * 2000-06-16 2005-03-22 Bellsouth Intellectual Property Corporation Proactive maintenance application
US7216145B2 (en) * 2000-06-23 2007-05-08 Mission Communications, Llc Event notification system
US20030167197A1 (en) * 2000-06-30 2003-09-04 Walker Information Customer relationship measurement and management system and method
US7925524B2 (en) * 2000-07-14 2011-04-12 United Parcel Service Of America, Inc. Method and system of delivering items using overlapping delivery windows
CA2386080A1 (en) 2000-08-07 2002-02-14 General Electric Company Computerized method and system for guiding service personnel to select a preferred work site for servicing transportation equipment
US20020062251A1 (en) * 2000-09-29 2002-05-23 Rajan Anandan System and method for wireless consumer communications
US6980131B1 (en) * 2000-10-24 2005-12-27 @Road, Inc. Targeted impending arrival notification of a wirelessly connected location device
US6999987B1 (en) * 2000-10-25 2006-02-14 America Online, Inc. Screening and survey selection system and method of operating the same
US6962531B2 (en) * 2000-11-03 2005-11-08 Harrah's Operating Company, Inc. Automated service scheduling system
IL156038A0 (en) 2000-11-29 2003-12-23 Method and system for conducting fully automated survey research
US6496775B2 (en) * 2000-12-20 2002-12-17 Tracer Net Corporation Method and apparatus for providing automatic status information of a delivery operation
US7644057B2 (en) * 2001-01-03 2010-01-05 International Business Machines Corporation System and method for electronic communication management
US7526434B2 (en) * 2001-01-30 2009-04-28 Linda Sharp Network based system and method for marketing management
US20020103693A1 (en) * 2001-01-30 2002-08-01 Horst Bayer System and method for aggregating and analyzing feedback
US6848142B2 (en) 2001-02-02 2005-02-01 Trynex, Inc. Quick-release bucket adapter
US6483433B2 (en) * 2001-02-20 2002-11-19 International Business Machines Corporation Method and apparatus for notifying of receipt
US20020138338A1 (en) * 2001-03-23 2002-09-26 Trauth Gregory L. Customer complaint alert system and method
US20030018510A1 (en) * 2001-03-30 2003-01-23 E-Know Method, system, and software for enterprise action management
US20020173934A1 (en) * 2001-04-11 2002-11-21 Potenza John J. Automated survey and report system
US6677096B2 (en) * 2001-04-27 2004-01-13 Kao Corporation Positively chargeable toner for two-component development
AU2002305577A1 (en) * 2001-05-15 2002-11-25 Dominic A. Marasco System and method for managing interactions between healthcare providers and pharma companies
US20020194047A1 (en) * 2001-05-17 2002-12-19 International Business Machines Corporation End-to-end service delivery (post-sale) process
US6912521B2 (en) * 2001-06-11 2005-06-28 International Business Machines Corporation System and method for automatically conducting and managing surveys based on real-time information analysis
FI119168B (en) * 2006-04-21 2008-08-15 Jukka Tapio Aula SMS delivery method and system for queries and invitations
US20030114206A1 (en) * 2001-08-24 2003-06-19 United Parcel Service Of America, Inc. Portable data acquisition and management system and associated device and method
US7444298B2 (en) * 2001-08-28 2008-10-28 United Parcel Service Of America, Inc. Order and payment visibility process
US20030065522A1 (en) * 2001-10-01 2003-04-03 John Wepfer Electronic system for equipment maintenance and repair orders
US7493274B2 (en) * 2001-10-31 2009-02-17 Amazon.Com, Inc. Marketplace system in which users generate and browse user-to-user preorder listings via a definitive products catalog
US7584115B2 (en) 2001-12-19 2009-09-01 Rightnow Technologies, Inc. System for automated control and reporting of sales processes
US20030135405A1 (en) * 2002-01-17 2003-07-17 Vehnet, Inc. System, method, application to maximize electronic commerce and sales in retail automotive industry
US7698162B2 (en) * 2002-02-25 2010-04-13 Xerox Corporation Customer satisfaction system and method
US20030200135A1 (en) * 2002-04-19 2003-10-23 Wright Christine Ellen System and method for predicting and preventing customer churn
JP2003316885A (en) * 2002-04-23 2003-11-07 Oak Lawn Marketing Inc Safety-in-disaster confirmation service system
US7395221B2 (en) * 2002-05-09 2008-07-01 International Business Machines Corporation Intelligent free-time search
US20030220827A1 (en) * 2002-05-21 2003-11-27 Neil Murphy System and Method for Scheduling Service Technicians
US6928156B2 (en) * 2002-05-31 2005-08-09 Sbc Properties, L.P. Automated operator assistance with menu options
US20030229533A1 (en) * 2002-06-06 2003-12-11 Mack Mary E. System and method for creating compiled marketing research data over a computer network
US6807274B2 (en) * 2002-07-05 2004-10-19 Sbc Technology Resources, Inc. Call routing from manual to automated dialog of interactive voice response system
US6898435B2 (en) * 2002-07-16 2005-05-24 David A Milman Method of processing and billing work orders
US7233907B2 (en) * 2002-08-07 2007-06-19 United Parcel Service Of America, Inc. Parcel or service delivery with partially scheduled time windows
US20040059624A1 (en) * 2002-08-27 2004-03-25 John Wantulok Systems and methods for analyzing customer surveys
US7809595B2 (en) * 2002-09-17 2010-10-05 Jpmorgan Chase Bank, Na System and method for managing risks associated with outside service providers
US20040117470A1 (en) * 2002-12-16 2004-06-17 Rehm William A Temporal service level metrics system and method
US20040122721A1 (en) * 2002-12-18 2004-06-24 Lasorsa Peter M. Calendar travel time module
WO2004059547A1 (en) * 2002-12-26 2004-07-15 Japan Tobacco Inc. Analyzing system, analyzing method in that system, and system for collecting examination results used for analyzing
US20040128183A1 (en) * 2002-12-30 2004-07-01 Challey Darren W. Methods and apparatus for facilitating creation and use of a survey
US20040143478A1 (en) * 2003-01-18 2004-07-22 Ward Andrew David Method and process for capturing, storing, processing and displaying customer satisfaction information
US20040172323A1 (en) * 2003-02-28 2004-09-02 Bellsouth Intellectual Property Corporation Customer feedback method and system
US20040181443A1 (en) * 2003-03-10 2004-09-16 Horton Carl A. Method and apparatus for the management of infrastructure assets, work orders, service requests, and work flows, utilizing an integrated call center, database, GIS system, and wireless handheld device
US20040220848A1 (en) * 2003-04-28 2004-11-04 Leventhal Jeffrey P. System and method for managing requests for services
US7856406B2 (en) * 2003-04-28 2010-12-21 Onforce, Inc. System and method for managing accounts payable and accounts receivable
US7877265B2 (en) * 2003-05-13 2011-01-25 At&T Intellectual Property I, L.P. System and method for automated customer feedback
US7418496B2 (en) * 2003-05-16 2008-08-26 Personnel Research Associates, Inc. Method and apparatus for survey processing
US7119716B2 (en) * 2003-05-28 2006-10-10 Legalview Assets, Limited Response systems and methods for notification systems for modifying future notifications
US7341186B2 (en) * 2003-06-20 2008-03-11 United Parcel Service Of America, Inc. Proof of presence and confirmation of parcel delivery systems and methods
US20050027666A1 (en) * 2003-07-15 2005-02-03 Vente, Inc Interactive online research system and method
US7050569B1 (en) * 2003-08-08 2006-05-23 Sprint Spectrum L.P. Selecting an interactive application to run while a caller is on hold depending on the caller's expected wait time
US7251312B2 (en) * 2003-09-06 2007-07-31 Intrado Inc. Method and system for availing participants in a special number call event and others of information contained in a plurality of data stores
US7558380B2 (en) * 2003-09-25 2009-07-07 Ateb, Inc. Methods, systems and computer program products for providing targeted messages for pharmacy interactive voice response (IVR) systems
US8094804B2 (en) * 2003-09-26 2012-01-10 Avaya Inc. Method and apparatus for assessing the status of work waiting for service
US7609832B2 (en) * 2003-11-06 2009-10-27 AT&T Intellectual Property I, L.P. Real-time client survey systems and methods
US20050114167A1 (en) * 2003-11-21 2005-05-26 Mcevoy Dean Booking system and method
US7672444B2 (en) * 2003-12-24 2010-03-02 At&T Intellectual Property, I, L.P. Client survey systems and methods using caller identification information
EP1704523A4 (en) * 2003-12-30 2009-09-30 United Parcel Service Inc Integrated global tracking and virtual inventory system
US20050154626A1 (en) * 2004-01-09 2005-07-14 Mike Jones Dynamic window vehicle tracking method
US20050159977A1 (en) * 2004-01-16 2005-07-21 Pharmacentra, Llc System and method for facilitating compliance and persistency with a regimen
US7536321B2 (en) * 2004-01-30 2009-05-19 Canon U.S.A., Inc. Estimated time of arrival (ETA) systems and methods
US7613627B2 (en) * 2004-02-02 2009-11-03 Ford Motor Company Computer-implemented method and system for collecting and communicating inspection information for a mechanism
US7421546B2 (en) * 2004-02-12 2008-09-02 Relaystar Sa/Nv Intelligent state engine system
US20050192848A1 (en) * 2004-02-26 2005-09-01 Vocantas Inc. Method and apparatus for automated post-discharge follow-up of medical patients
US7909241B2 (en) * 2004-03-09 2011-03-22 Lowe's Companies, Inc. Systems, methods and computer program products for implementing processes relating to retail sales
IL162113A0 (en) * 2004-05-23 2005-11-20 Ori Einhorn Method and system for managing customer relations
US7487018B2 (en) * 2004-08-04 2009-02-03 Verifacts Automotive, Llc Data management systems for collision repair coaching
US20060053058A1 (en) * 2004-08-31 2006-03-09 Philip Hotchkiss System and method for gathering consumer feedback
US20060047419A1 (en) * 2004-09-02 2006-03-02 Diendorf John R Telematic method and apparatus for managing shipping logistics
WO2006036660A2 (en) * 2004-09-27 2006-04-06 Roger Cook Moving ornamental design element
US7428502B2 (en) * 2004-10-06 2008-09-23 United Parcel Service Of America, Inc. Delivery systems and methods involving verification of a payment card from a handheld device
US20060085203A1 (en) * 2004-10-19 2006-04-20 Ford Motor Company Computer-implemented method and system for determining vehicle delivery estimated time of arrival
US20080275582A1 (en) 2004-11-19 2008-11-06 Nettles Steven C Scheduling AMHS pickup and delivery ahead of schedule
US20060111955A1 (en) * 2004-11-24 2006-05-25 Agilis Systems, Inc. System and method for mobile resource management with customer confirmation
US8165773B1 (en) * 2005-03-29 2012-04-24 Avaya Inc. Destination arrival estimates auto-notification based on cellular systems
US7353230B2 (en) * 2005-04-18 2008-04-01 Cisco Technology, Inc. Dynamic distributed customer issue analysis
US20060259347A1 (en) * 2005-05-13 2006-11-16 Zentaro Ohashi Automatic gathering of customer satisfaction information
US8885812B2 (en) * 2005-05-17 2014-11-11 Oracle International Corporation Dynamic customer satisfaction routing
US7945041B2 (en) * 2005-05-27 2011-05-17 International Business Machines Corporation Method, system and program product for managing a customer request
US7634598B2 (en) * 2005-08-17 2009-12-15 Permanent Solution Industries, Inc. Dynamic total asset management system (TAMS) and method for managing building facility services
US20070061471A1 (en) * 2005-09-09 2007-03-15 Preserving Sentiments, Inc. Future delivery apparatus and method
US20070071184A1 (en) * 2005-09-28 2007-03-29 Clift Jeffrey C Automated Voice Activated Telephone Reminder System
US7411942B1 (en) * 2005-11-29 2008-08-12 At&T Corp. Method and apparatus for automated calendar selections
AU2005338854A1 (en) * 2005-12-06 2007-06-14 Daniel John Simpson Interactive natural language calling system
US20070179805A1 (en) * 2006-01-27 2007-08-02 Gilbert Richard L Method and system for improving the quality of service and care in a healthcare organization
US7860803B1 (en) * 2006-02-15 2010-12-28 Google Inc. Method and system for obtaining feedback for a product
US8112298B2 (en) * 2006-02-22 2012-02-07 Verint Americas, Inc. Systems and methods for workforce optimization
US20070239353A1 (en) * 2006-03-03 2007-10-11 David Vismans Communication device for updating current navigation contents
US7792278B2 (en) * 2006-03-31 2010-09-07 Verint Americas Inc. Integration of contact center surveys
US7525429B2 (en) * 2006-04-21 2009-04-28 Persage, Inc. Delivery notification system
WO2007140321A2 (en) * 2006-05-26 2007-12-06 Mix & Meet, Inc. System and method for scheduling meetings and user interface
US7818195B2 (en) * 2006-07-06 2010-10-19 International Business Machines Corporation Method, system and program product for reporting a call level view of a customer interaction with a contact center
US8255248B1 (en) * 2006-07-20 2012-08-28 Intuit Inc. Method and computer program product for obtaining reviews of businesses from customers
US20080040129A1 (en) * 2006-08-08 2008-02-14 Capital One Financial Corporation Systems and methods for providing a vehicle service management service
US20080077468A1 (en) * 2006-08-10 2008-03-27 Yahoo! Inc. Managing responses to extended interviews to enable profiling of mobile device users
US20080040189A1 (en) * 2006-08-14 2008-02-14 Cisco Technology, Inc. Automatic selection of meeting participants
IL177617A (en) * 2006-08-22 2013-11-28 Israel Beniaminy System for mobile workforce, vehicle, asset and service management
US20080082257A1 (en) * 2006-09-05 2008-04-03 Garmin Ltd. Personal navigational device and method with automatic call-ahead
US7899700B2 (en) 2006-09-29 2011-03-01 Knowledge Networks, Inc. Method and system for providing multi-dimensional feedback
US8223953B2 (en) * 2006-11-17 2012-07-17 At&T Intellectual Property I, L.P. Methods, systems, and computer program products for rule-based direction of customer service calls
JP2008152575A (en) * 2006-12-18 2008-07-03 Fujitsu Ltd Complaint handling method and device
US8099085B2 (en) * 2007-01-16 2012-01-17 At&T Intellectual Property I, Lp Method and system for communicating with users of wireless devices when approaching a predetermined destination
WO2008107880A2 (en) 2007-03-05 2008-09-12 Invoke Solutions Inc. Point of experience survey
US8290808B2 (en) * 2007-03-09 2012-10-16 Commvault Systems, Inc. System and method for automating customer-validated statement of work for a data storage environment
US8255158B2 (en) 2007-03-23 2012-08-28 Verizon Patent And Licensing Inc. Travel route adjustment
US8718254B2 (en) * 2007-06-26 2014-05-06 At&T Intellectual Property I, L.P. Techniques for conference scheduling
US8095395B2 (en) * 2007-09-25 2012-01-10 United Parcel Service Of America, Inc. Method, system, and computer readable medium for analyzing damage to a package in a shipping environment
US8209209B2 (en) * 2007-10-02 2012-06-26 Incontact, Inc. Providing work, training, and incentives to company representatives in contact handling systems
US20090125425A1 (en) * 2007-11-01 2009-05-14 Hunter E. Riley, Llc Auditable merchandise delivery using an electronic bill of lading
US20090150217A1 (en) * 2007-11-02 2009-06-11 Luff Robert A Methods and apparatus to perform consumer surveys
US20090150206A1 (en) * 2007-12-07 2009-06-11 Mci Communications Services Notification system and method
US8805724B2 (en) * 2007-12-18 2014-08-12 Verizon Patent And Licensing Inc. Intelligent customer retention and offer/customer matching
US8100322B1 (en) * 2007-12-24 2012-01-24 Symantec Corporation Systems, apparatus, and methods for obtaining satisfaction ratings for online purchases
US20090187460A1 (en) * 2008-01-23 2009-07-23 Your Fast Track, Inc., D/B/A Qualitick System and method for real-time feedback
US9659299B2 (en) * 2008-04-30 2017-05-23 Hartford Fire Insurance Company Computer system and method for interim transaction diagnosis for selective remediation and customer loyalty enhancement
US8489472B2 (en) * 2008-05-08 2013-07-16 United Parcel Service Of America, Inc. Proactive monitoring and intervention capabilities in a package delivery system
US20090306967A1 (en) * 2008-06-09 2009-12-10 J.D. Power And Associates Automatic Sentiment Analysis of Surveys
US7940172B2 (en) * 2008-12-04 2011-05-10 International Business Machines Corporation Combining time and GPS locations to trigger message alerts
EP3499507A1 (en) * 2009-07-21 2019-06-19 Zoll Medical Corporation System for providing role-based data feeds for caregivers
US20110082721A1 (en) 2009-10-02 2011-04-07 International Business Machines Corporation Automated reactive business processes
US20110137696A1 (en) 2009-12-04 2011-06-09 3Pd Performing follow-up actions based on survey results
US20160048799A1 (en) 2014-08-15 2016-02-18 Xpo Last Mile, Inc. Cascading call notification system and method

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6380928B1 (en) * 1997-12-31 2002-04-30 Kenneth J. Todd Dynamically configurable electronic survey response alert system
US6963848B1 (en) * 2000-03-02 2005-11-08 Amazon.Com, Inc. Methods and system of obtaining consumer reviews
US7346505B1 (en) * 2001-09-28 2008-03-18 At&T Delaware Intellectual Property, Inc. System and method for voicemail transcription
US20060258347A1 (en) * 2001-11-30 2006-11-16 Interdigital Technology Corporation Cognition models for wireless communication systems and method and apparatus for optimal utilization of a radio channel based on cognition model data
US20050130110A1 (en) * 2003-12-16 2005-06-16 Gosling Martin M. System and method to give a true indication of respondent satisfaction to an electronic questionnaire survey
US20070083472A1 (en) * 2005-10-06 2007-04-12 Israel Max L Customer Satisfaction Reporting
US20070168241A1 (en) * 2006-01-19 2007-07-19 Benchmark Integrated Technologies, Inc. Survey-based management performance evaluation systems
US20100080365A1 (en) * 2008-09-29 2010-04-01 Microsoft Corporation Offline voicemail
US20100262463A1 (en) * 2009-04-14 2010-10-14 Jason Tryfon Systems, Methods, and Media for Management of a Survey Response Associated with a Score

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11288687B2 (en) 2009-12-04 2022-03-29 Xpo Last Mile, Inc. Triggering and conducting an automated survey
US10650397B2 (en) 2009-12-04 2020-05-12 Xpo Last Mile, Inc. Triggering and conducting an automated survey
US10657549B2 (en) 2009-12-04 2020-05-19 Xpo Last Mile, Inc. Performing follow-up actions based on survey results
US10664853B2 (en) 2009-12-04 2020-05-26 Xpo Last Mile, Inc. Triggering, conducting, and analyzing an automated survey
US11170876B2 (en) * 2010-10-09 2021-11-09 MEI Research, Ltd. System to dynamically collect and synchronize data with mobile devices
US11915801B2 (en) 2010-10-09 2024-02-27 MEI Research, Ltd. System to dynamically collect and synchronize data with mobile devices
US9491291B1 (en) * 2011-04-20 2016-11-08 Confinement Telephony Technology, Llc Systems and methods for institutional messaging
US20140095259A1 (en) * 2012-10-01 2014-04-03 Cadio, Inc. Offering survey response opportunities for sale
US10191895B2 (en) * 2014-11-03 2019-01-29 Adobe Systems Incorporated Adaptive modification of content presented in electronic forms
US10762288B2 (en) 2014-11-03 2020-09-01 Adobe Inc. Adaptive modification of content presented in electronic forms
US10423905B2 (en) * 2015-02-04 2019-09-24 Hexagon Technology Center Gmbh Work information modelling
CN106303102A (en) * 2015-06-25 2017-01-04 阿里巴巴集团控股有限公司 Automated outbound calling method, apparatus and system
WO2016210114A1 (en) * 2015-06-25 2016-12-29 Alibaba Group Holding Limited System, device, and method for making automatic calls
US10643223B2 (en) 2015-09-29 2020-05-05 Microsoft Technology Licensing, Llc Determining optimal responsiveness for accurate surveying

Also Published As

Publication number Publication date
CA3073908C (en) 2023-09-26
CA2696345A1 (en) 2011-06-04
US20190180299A1 (en) 2019-06-13
US10657549B2 (en) 2020-05-19
US10650397B2 (en) 2020-05-12
US11769163B2 (en) 2023-09-26
CA2723506A1 (en) 2011-06-04
US20120016720A1 (en) 2012-01-19
US10664853B2 (en) 2020-05-26
US20120022905A1 (en) 2012-01-26
US20150193778A1 (en) 2015-07-09
US20110137696A1 (en) 2011-06-09
US20110137698A1 (en) 2011-06-09
CA3073908A1 (en) 2011-06-04
US20190236624A1 (en) 2019-08-01
US11288687B2 (en) 2022-03-29
US20120059681A1 (en) 2012-03-08
US10262329B2 (en) 2019-04-16
US20120016719A1 (en) 2012-01-19
US20140229238A1 (en) 2014-08-14
US20150287063A1 (en) 2015-10-08
US20200226625A1 (en) 2020-07-16
US20110137709A1 (en) 2011-06-09
US20190236623A1 (en) 2019-08-01
US8515803B2 (en) 2013-08-20
CA2696345C (en) 2016-12-20
US20110137808A1 (en) 2011-06-09
CA2723506C (en) 2020-04-21

Similar Documents

Publication Publication Date Title
US11288687B2 (en) Triggering and conducting an automated survey
US10306055B1 (en) Reviewing portions of telephone call recordings in a contact center using topic meta-data records
US10789039B2 (en) Call visualization
US11915248B2 (en) Customer management system
US11423410B2 (en) Customer management system
US8209218B1 (en) Apparatus, system and method for processing, analyzing or displaying data related to performance metrics
US8467518B2 (en) Systems and methods for analyzing contact center interactions
US8588395B2 (en) Customer service methods, apparatus and report/alert generation based on customer service call information
AU785168B2 (en) End-to-end service delivery (post-sale) process
JP5209267B2 (en) Sales representative workbench with account-based interface
US11528362B1 (en) Agent performance measurement framework for modern-day customer contact centers
US20130046581A1 (en) System and methods for strategically ranking and allocating results of web-based inquiries
US20140207521A1 (en) Systems and methods for enhanced preselection and confirmation process for potential candidates for approvals to multiple potential matching transaction partners
US9037481B2 (en) System and method for intelligent customer data analytics
McDonald et al. Managing Campaigns
Dosunmu et al. A knowledge-based expert system for campus helpdesk request processing
Piroontanapisarn Advanced help desk for enterprise system
Karagöl Successful CRM application model building in telecom sector
Kenmoku Pursuing Stable Operation by Reforming Service Quality

Legal Events

Date Code Title Description
AS Assignment

Owner name: XPO LAST MILE, INC., GEORGIA

Free format text: CHANGE OF NAME;ASSIGNOR:3PD, INC.;REEL/FRAME:033278/0872

Effective date: 20140612

AS Assignment

Owner name: MORGAN STANLEY SENIOR FUNDING, INC., MARYLAND

Free format text: PATENT SECURITY AGREEMENT;ASSIGNORS:XPO LAST MILE, INC.;CON-WAY FREIGHT, INC.;REEL/FRAME:037108/0894

Effective date: 20151030

Owner name: MORGAN STANLEY SENIOR FUNDING, INC., MARYLAND

Free format text: PATENT SECURITY AGREEMENT;ASSIGNORS:XPO LAST MILE, INC.;CON-WAY FREIGHT, INC.;REEL/FRAME:037108/0885

Effective date: 20151030

STCV Information on status: appeal procedure

Free format text: BOARD OF APPEALS DECISION RENDERED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION

AS Assignment

Owner name: XPO LAST MILE, INC., CONNECTICUT

Free format text: PARTIAL RELEASE OF SECURITY INTEREST IN PATENTS AT REEL 037108, FRAME 0885;ASSIGNOR:MORGAN STANLEY SENIOR FUNDING, INC., AS AGENT;REEL/FRAME:061838/0413

Effective date: 20221101

Owner name: XPO LAST MILE, INC., CONNECTICUT

Free format text: PARTIAL RELEASE OF SECURITY INTEREST IN PATENTS AT REEL 037108, FRAME 0894;ASSIGNOR:MORGAN STANLEY SENIOR FUNDING, INC., AS AGENT;REEL/FRAME:061838/0952

Effective date: 20221101