US20200234193A1 - Cloud computing scoring systems and methods - Google Patents
- Publication number
- US20200234193A1 (application US16/842,987)
- Authority
- US
- United States
- Prior art keywords
- category
- unstructured
- sentiment
- data
- service provider
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS › G06—COMPUTING; CALCULATING OR COUNTING › G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS › G06N20/00—Machine learning
- G—PHYSICS › G06—COMPUTING; CALCULATING OR COUNTING › G06F—ELECTRIC DIGITAL DATA PROCESSING › G06F40/00—Handling natural language data › G06F40/10—Text processing › G06F40/166—Editing, e.g. inserting or deleting › G06F40/169—Annotation, e.g. comment data or footnotes
- G—PHYSICS › G06—COMPUTING; CALCULATING OR COUNTING › G06F—ELECTRIC DIGITAL DATA PROCESSING › G06F40/00—Handling natural language data › G06F40/30—Semantic analysis
Definitions
- SMIs: service measurement indices
- FIG. 1 illustrates a system block diagram for cloud computing scoring systems and methods, in accordance with an embodiment of the present disclosure.
- FIG. 2 illustrates a block diagram for a learning seed file of cloud computing scoring systems and methods, in accordance with an embodiment of the present disclosure.
- FIG. 3 illustrates a distributed computing architecture for cloud computing scoring systems and methods, in accordance with an embodiment of the present disclosure.
- FIGS. 4-12 illustrate data processing calculations for cloud computing scoring systems and methods, in accordance with embodiments of the present disclosure.
- FIG. 13 illustrates a data application displaying normalized scores for cloud computing scoring systems and methods, in accordance with an embodiment of the present disclosure.
- FIGS. 14-19 illustrate a mobile application for data acquisition and display for cloud computing scoring systems and methods, in accordance with embodiments of the present disclosure.
- the subject matter may be embodied as devices, systems, methods, and/or computer program products. Accordingly, some or all of the subject matter may be embodied in hardware and/or in software (including firmware, resident software, micro-code, state machines, gate arrays, etc.). Furthermore, the subject matter may take the form of a computer program product on a computer-usable or computer-readable storage medium having computer-usable or computer-readable program code embodied in the medium for use by or in connection with an instruction execution system.
- a computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
- the computer-usable or computer-readable medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium.
- computer readable media may comprise computer storage media and communication media.
- Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data.
- Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by an instruction execution system.
- the computer-usable or computer-readable medium could be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory.
- Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media.
- modulated data signal means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
- communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
- the embodiment may comprise program modules, executed by one or more systems, computers, or other devices.
- program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
- functionality of the program modules may be combined or distributed as desired in various embodiments.
- a cloud computing scoring system (“scoring system”) 10 may comprise a data acquisition and feed component 30 feeding data automatically 16 into a data analysis component 32 for performing sentiment analysis on the collected commentary regarding various cloud computing service providers (not shown) to be scored.
- Data acquisition block 30 may also manually 18 deliver structured analytics data 24 , such as benchmarking measurements or performance reports, for further processing.
- Data processing block 34 may receive sentiment results 60 and analytics data 24 for weighting, combining, and normalizing, which may then produce a normalized score 78 rating the quality of each service category of one or more cloud computing service providers ( FIG. 13 ).
- Displayed score 68 in data application component 36 may be displayed to web application 15 and/or mobile application 72 .
- the compute, memory, storage, and software resources of scoring system 10 may be distributed according to best performance, price, and resource availability known to those skilled in the art.
- server 86 may coordinate the operation of resources within data analysis 32 and data processing 34 components, while analytics database 26 may be connected to processing 32 from another location within internet cloud 28 .
- Unstructured sentiment data 20 and structured sentiment data 22 may be connected from different locations within internet cloud 28 .
- Normalized score 78 may be delivered to web application 15 and mobile application 72 in various locations via internet cloud 28 .
- portions of data analysis 32 and data processing 34 may be executed by separate servers or remotely.
- keyword dictionary 42 may exist in the cloud and may be accessed by parser 40 for performing sentiment analysis.
- unstructured sentiment data 20 may be fed automatically 16 from a variety of data sources 12 in order to mine opinion data for reporting on the performance of various service categories of a cloud service provider.
- Service categories are illustrated in FIG. 13.
- Unstructured sentiment data 20 may be widely available, copious, and may comprise text commentary, yet may generally lack an identified service category or service provider structured to the sentiment data.
- Data sources 12 may include Twitter®, Facebook®, unstructured data from a crowd-sourcing application called CloudVibe™, unstructured analytics trending data from a service provider assessment platform, and other data sources 12 such as social networking feeds, internet articles, news, and blogs. Because cloud platforms may be rapidly changing due to competitive and technology churn, the availability of unstructured sentiment data 20 may present an opportunity to update and improve service provider scoring without the cost of elaborate marketing surveys or benchmarking measurement programs. Unfortunately, unstructured sentiment data 20 may not have been structured according to service provider or service category, nor classified as to a positive or negative polarity classification indicating attitude or observation.
- a crowd-sourcing application database 14 may store structured sentiment data from a mobile application 72 such as CloudVibe™ ( FIGS. 14-19 ).
- Mobile application 72 may allow registered users to score a service provider by entering a "thumbs up" or "thumbs down" polarity classification 80 associated with a service category and a service provider, including entering a brief text comment.
- the CloudVibe™ mobile application 72 may display scoring for various service providers to the registered users and may thereby present one useful perspective for choosing a service provider.
- the disclosure that follows describes how CloudVibe™ or other structured sentiment data may be utilized to update and improve service provider scoring beyond what a dedicated crowd-sourcing application or structured analytics may do alone.
- parser 40 may receive unstructured sentiment data 20 commenting on a service provider (a scored service) to be scored and may identify a service category (not shown). Parser 40 may also identify a scored service. For example, a list of service categories and scored services may be stored in parser 40 or in the associated keyword dictionary 42 , and which may be matched to words in the unstructured sentiment data for identification thereof. Parser 40 may select from the unstructured sentiment data 20 text relating to the service category and matching one or more opinionative words and phrases listed in keyword dictionary 42 , thereby producing structured comment 50 associated with the service category.
- Keyword dictionary 42 may be a generic sentiment database with a thorough list of words and phrases indicative of unambiguous opinion, or may be domain-specific, such as for engineering or computers, and may include terms and jargon common in the field of cloud computing in order to identify the service category and affect.
- structured comment 50 may then be classified by classifier 46 as positive or negative according to a list of exemplary sentiment data sets 38 contained in learning seed file 44 , and may thereby generate classified sentiment result 52 .
- the exemplary sentiment data sets 38 may be manually assigned a positive or a negative polarity 80 by an industry expert 88. This manual assignment may be a kind of training process performed when scoring system 10 is installed, or may be periodically performed. Additionally, in an embodiment, a third classification may be a neutral classification in the case of a weak or ambiguous opinion. Alternatively, data sets 38 may be assigned a positive or negative strength on a scale, such as from −10 to +10.
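By way of illustration only, classifying a structured comment against manually labeled exemplary sentiment data sets 38 might be sketched as below. The seed examples and the word-overlap heuristic are invented assumptions for this sketch, not the disclosed algorithm:

```python
# Invented seed examples standing in for the exemplary sentiment data
# sets 38, manually labeled +1 (positive) or -1 (negative).
SEED_FILE = {
    +1: ["uptime has been great", "fast reliable storage"],
    -1: ["constant outages", "slow and flaky network"],
}

def classify(comment):
    """Return +1 (positive), -1 (negative), or 0 (neutral/ambiguous),
    judged by word overlap with the labeled seed examples."""
    words = set(comment.lower().split())
    overlap = {
        polarity: len(words & set(" ".join(examples).split()))
        for polarity, examples in SEED_FILE.items()
    }
    if overlap[+1] == overlap[-1]:
        return 0  # weak or ambiguous opinion -> neutral classification
    return +1 if overlap[+1] > overlap[-1] else -1
```

A comment sharing more words with the negative seed examples, such as "storage is slow and flaky", would classify as −1 under this heuristic.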
- the learning seed file may provide lists of commentary specific to each service category and associated with an industry-trained polarity, whereas the keyword dictionary may identify and structure the target categories to an opinionative subset of the commentary in the sentiment data.
- the classifying function 46 may be contained in the learning seed file 44 storing exemplary data sets 38 .
- learning seed file 44 may be configured to be enhanced by the ongoing addition of structured sentiment data 22 commenting on the scored service and having a polarity classification 80 .
- the learning seed file 44 may learn additional words, phrases, and/or word constellations which may appear in unstructured sentiment data 20 and whose addition to exemplary sentiment data sets 38 updates and improves the accuracy of service provider scoring.
- learning seed file 44 through its algorithm 48 , may identify opinionative words and phrases within structured sentiment data 22 that enhance exemplary sentiment data sets 38 , and may add the sentiment data to the learning seed file 44 .
- crowd-sourcing database 14 may provide sentiment data structured in a form ready to be added directly to exemplary sentiment data sets 38 should the addition improve the quality of the scoring.
- a structured sentiment may strongly indicate an opinion for a service category largely missing from exemplary sentiment data sets 38 , and learning seed file 44 may determine that adding the strong sentiment data will therefore enhance the scoring system's ability to benefit from the receipt of unstructured data 20 .
- the use of pre-classified, structured sentiment data 22 to update the industry-tuned 88 exemplary sentiment data sets 38 may act as a continuous self-training, making better contextual use of social networking data and thereby providing aggregate scoring from the user's perspective.
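The ongoing-enhancement step might be sketched as follows; the coverage threshold and the data layout are assumptions for illustration only:

```python
# Assumed threshold: a category is "thinly covered" with fewer than
# three seed examples. The real criterion is not specified here.
MIN_EXAMPLES = 3

def maybe_enhance(seed_sets, comment, category, polarity):
    """Add a pre-classified structured comment to the seed sets only
    when its service category is thinly covered; return True if the
    seed file was enhanced."""
    examples = seed_sets.setdefault(category, [])
    if len(examples) >= MIN_EXAMPLES:
        return False  # category already well represented
    examples.append((comment, polarity))
    return True

seed_sets = {"storage": [("fast reliable storage", +1)]}
added = maybe_enhance(seed_sets, "networking drops packets constantly", "networking", -1)
```

Under this sketch, a strongly opinionated comment for a category largely missing from the seed sets is retained, while well-covered categories are left unchanged.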
- the steps of parsing, classifying, and enhancing the sentiment analysis of unstructured social networking data 20 may provide an advantage over existing methods of parsing and classifying against a list of words after training the sentiment analysis algorithm prior to initial deployment.
- simple sentiment analysis block 58 may associate a service category with a polarity classification 80 of structured sentiment data 22 for delivering a structured sentiment result 56 to data processing component 34 .
- Sentiment results pool 60 may collect structured sentiment result 56 and classified sentiment result 52 for weighting, combining, and normalizing. Alternately, the format of structured sentiment data 22 may not require formatting by simple sentiment analysis block 58 if data 22 is ready for combining.
- classified sentiment result 52 may be processed 34 without being combined with structured sentiment data 22 , score 78 having already benefited by the ongoing addition of structured sentiment data 22 to learning seed file 44 .
- simple sentiment analysis block 58 may be simple because data 22 has already been parsed and classified with a polarity 80 .
- weighting block 64 may weight results 56 and 52 according to a relative influence of each. The relative weighting may depend on the reliability of each result to generate an accurate score for normalized score 78 .
- analytics data 24 quantifying the scored service may be processed by analytics scoring block 62 to generate a structured analytics result 54 compatible in format with the format of sentiment results 56 and 52 .
- analytics data 24 may quantify several analytics performance factors 82 ( FIGS. 4 and 5 ) that need to be formatted to associate with a particular service category being scored such as reliability or performance.
- Analytics performance factors 82 such as geographic coverage or benchmarking data may be collected by various analytics processes such as technology surveys, or benchmarking measurements of read/write latency in a cloud storage device.
- analytics result 54 may be combined with sentiment result 52 in order to stabilize and broaden the perspective of scoring system 10 .
- analytics data 24 may not be combined with sentiment results for providing a normalized score 78 , and the decision to combine analytic result 54 may be dependent on the service category being scored.
- the results being combined for a particular service category may be weighted 64 according to a relative influence of each result.
- the results being combined may include at least one of classified sentiment result 52 , structured sentiment result 56 , and structured analytics result 54 .
- a weighting 64 factor of 0.1 (10%) may be applied to each of five analytics performance factors 82 making up structured analytics result 54.
- a weighting 64 factor of 0.4 (40%) may be applied to structured sentiment result 56 from the CloudVibe™ crowd-sourcing application.
- a weighting 64 factor of 0.1 (10%) may be applied to classified sentiment result 52 from Twitter®.
- 111 points may be chosen as the maximum weighted result 84 for any category and for normalized score 78 .
- any scale value may be used for the maximum normalized score 78 .
- all of the weighted results 84 may be combined into weighted sum 90 and normalized 66 to a standard scale, such as 1000.
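The example weights above (0.1 for each of five analytics performance factors, 0.4 for the crowd-sourced result, 0.1 for the classified result) can be worked through numerically; the raw result values below are invented for illustration:

```python
# Invented raw values, each assumed to lie on a 0-100 scale.
analytics_factors = [70, 80, 90, 60, 75]   # structured analytics result 54
crowd_result = 85                          # structured sentiment result 56
classified_result = 65                     # classified sentiment result 52

# Weight and combine into weighted sum 90.
weighted_sum = (sum(0.1 * f for f in analytics_factors)
                + 0.4 * crowd_result
                + 0.1 * classified_result)

# Maximum achievable weighted sum if every raw value were 100.
max_weighted = 5 * 0.1 * 100 + 0.4 * 100 + 0.1 * 100

# Normalize 66 to the standard scale of 1000.
normalized_score = round(weighted_sum / max_weighted * 1000)
```

With these invented inputs the weighted sum is 78 of a possible 100, giving a normalized score 78 of 780 on the 1000-point scale.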
- classified sentiment result 52 may be combined with structured sentiment result 56 , as shown in FIGS. 6, 8, and 9 , to yield a normalized score 78 ( FIG. 13 ).
- structured analytics result 54 may be combined with both sentiment results 52 and 56 , as shown in FIGS. 4, 5, 7 , and 10 - 12 , to yield a normalized score 78 .
- structured analytics result 54 may be combined with one of sentiment result 52 and 56 to yield a normalized score 78 .
- Normalization 66 may be set to a standard scale, such as 1000, for matching the scale of a user interface used to display 68 scores.
- Scores 78 may be displayed 68 on user interfaces such as the CloudSphere™ and CloudVibe™ products.
- a mobile phone having the CloudVibe™ mobile application 72 may display normalized scores 78 for each of five scored services 74 according to service categories 76 on a standard scale of 1000 ( FIGS. 13 and 16 ).
- Each normalized score 78 may be color coded according to a low (e.g. 114 ), medium (e.g. 422 ) or high (e.g. 790 ) score.
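The color coding might be sketched as below; the cutoff values 333 and 667 are assumptions, since only example scores (114 low, 422 medium, 790 high) are given on the 1000-point scale:

```python
def score_color(score):
    """Map a normalized score 78 on the 1000-point scale to a display
    color; the band boundaries are assumed, not disclosed."""
    if score < 333:
        return "red"     # low score
    if score < 667:
        return "yellow"  # medium score
    return "green"       # high score
```

The example scores then map as expected: 114 to red, 422 to yellow, and 790 to green.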
- scores 78 may be displayed on any user interface or by any communications means, such as displaying scores 78 on a web application, by a text message, by an email message, or through a paper report.
- FIG. 14 illustrates a log-in screen through which a user may access mobile application 72 .
- Options may be presented to the user and may include posting sentiment 22 ( FIG. 17 ), viewing scores 78 ( FIG. 18 ), or viewing trend reports ( FIG. 19 ).
- a user of mobile application 72 may be asked to classify a service category with a "thumbs up" or "thumbs down" polarity classification 80 associated with brief sentiment data 22.
- Various embodiments of the present systems and methods may be used as a tool internally by a cloud consultant as input into a final report for a client.
- Various embodiments of the present systems and methods may be integrated into upstream or downstream supply chain or provisioning systems in the form of an OEM offering.
Description
- The present application is a continuation of U.S. application Ser. No. 14/687,748, filed Apr. 15, 2015, which claims priority to U.S. Provisional Application No. 61/980,928, filed on Apr. 17, 2014 and entitled CLOUD COMPUTING SCORING SYSTEMS AND METHODS; the entire contents of the foregoing applications are expressly incorporated by reference herein.
- As businesses and enterprises migrate to the Cloud for accessing IT resources, they require reliable, contextual data for choosing a service provider that will best suit their particular constellation of needs. Evaluating cloud providers may be difficult because the service measurement indices (SMIs) used to evaluate performance may vary widely from one service provider to the next. One method of comparing cloud service providers is to gather individual reports through word of mouth, blogs, and social networking. However, individual reports are highly unstructured, lack context, and do not address all of the SMIs.
- Another method of choosing a cloud service provider may be to process and integrate social sentiment data from a variety of social networking sources such as Twitter®. However, sentiment analysis may have substantial inaccuracies, especially if generic and not tailored to a specific domain like cloud computing. Additionally, generic opinion mining may lack structured detail on specific service categories. Alternately, benchmarking services may be able to periodically measure the fine details of the many technical components of a cloud platform, reporting the performance to a consumer. Unfortunately, benchmarking is expensive, and the results lack an aggregate user's perspective for "how all the pieces fit together" to make a good cloud computing experience.
- This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key aspects or essential aspects of the claimed subject matter. Moreover, this Summary is not intended for use as an aid in determining the scope of the claimed subject matter.
- In an embodiment, there is disclosed a computer-implemented cloud computing scoring system which may comprise a parser receiving unstructured sentiment data commenting on a scored service. The parser may identify in the unstructured sentiment data a service category of the scored service. The parser may select from the unstructured sentiment data text relating to the service category and matching one or more opinionative words and phrases listed in a keyword dictionary, thereby producing a structured comment associated with the service category. The structured comment may be classified as positive or negative according to a list of exemplary sentiment data sets contained in a learning seed file. The exemplary sentiment data sets may be manually assigned a positive or a negative polarity. The learning seed file may be configured to be enhanced by the ongoing addition of structured sentiment data, the structured sentiment data commenting on the scored service and having a polarity classification.
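By way of illustration only, the parsing and structuring steps of this embodiment might be sketched as follows; the category list, keyword dictionary, and sample comment are invented for the sketch and are not taken from the disclosure:

```python
import re

# Invented, illustrative vocabularies -- the contents of the keyword
# dictionary and the list of service categories are not specified here.
SERVICE_CATEGORIES = {"storage", "compute", "networking"}
KEYWORD_DICTIONARY = {"slow", "fast", "reliable", "flaky", "down", "great"}

def parse_unstructured(comment):
    """Identify a service category in the unstructured sentiment data
    and select opinionative text matching the keyword dictionary,
    returning a structured comment, or None if nothing matches."""
    tokens = re.findall(r"[a-z']+", comment.lower())
    category = next((t for t in tokens if t in SERVICE_CATEGORIES), None)
    opinions = [t for t in tokens if t in KEYWORD_DICTIONARY]
    if category is None or not opinions:
        return None
    return {"category": category, "keywords": opinions, "text": comment}

structured = parse_unstructured("Their storage tier has been slow and flaky all week")
```

The returned structured comment associates the matched category ("storage") with the selected opinionative words ("slow", "flaky"), ready for polarity classification.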
- In another embodiment, there is disclosed a computer-implemented cloud computing scoring system which may comprise a data acquisition component gathering data reporting on a scored service in a service category. The data may be gathered from at least two of unstructured sentiment data, structured sentiment data, and structured analytics data. A data analysis component may perform sentiment analysis on the sentiment data which generates a classified sentiment result from the unstructured sentiment data and a structured sentiment result from the structured sentiment data. The data analysis component may manually score the structured analytics data to generate a structured analytics result. A data processing component may weight the structured analytics result, the classified sentiment result, and the structured sentiment result according to a relative influence of each. The weighted results may be combined and normalized into a normalized score on a standard scale. A data application component may display the normalized score for the scored service within the service category.
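A minimal sketch of how the four components of this embodiment (acquisition, analysis, processing, application) might compose; every value, weight, and function body below is an invented stand-in, not the disclosed implementation:

```python
def acquire():
    # Data acquisition component: invented polarity votes and one
    # analytics measurement, each source gathered separately.
    return {"classified": [1, 1, -1], "structured": [1, -1], "analytics": [0.8]}

def analyze(raw):
    # Data analysis component: reduce each source to a 0..1 result.
    positive_share = lambda xs: sum(1 for x in xs if x > 0) / len(xs)
    return {"classified": positive_share(raw["classified"]),
            "structured": positive_share(raw["structured"]),
            "analytics": raw["analytics"][0]}

def process(results, weights, scale=1000):
    # Data processing component: weight, combine, normalize to scale.
    combined = sum(results[k] * weights[k] for k in results) / sum(weights.values())
    return round(combined * scale)

def display(score, service, category):
    # Data application component: render the normalized score.
    return f"{service}/{category}: {score}"

score = process(analyze(acquire()),
                {"classified": 0.1, "structured": 0.4, "analytics": 0.5})
```

With the invented inputs, the classified result is 2/3 positive, the structured result 1/2, and the analytics result 0.8, combining to a normalized score of 667 on the 1000-point scale.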
- In yet another embodiment, there is disclosed a computer-implemented cloud computing scoring method which may comprise parsing unstructured sentiment data commenting on a scored service, thereby identifying a service category of the scored service. The method may further include selecting from the unstructured sentiment data text that matches one or more opinionative words and phrases listed in a keyword dictionary, thereby producing a structured comment associated with the service category. The method may further include classifying, using a learning seed file, the structured comment as positive or negative according to a list of exemplary sentiment data sets contained in the learning seed file, the exemplary sentiment data sets being manually assigned a positive or a negative polarity, said classifying thereby generating a classified sentiment result. The method may further include configuring the learning seed file to be enhanced by the ongoing addition of structured sentiment data, the structured sentiment data commenting on the scored service and having a polarity classification.
- Additional objects, advantages and novel features of the technology will be set forth in part in the description which follows, and in part will become more apparent to those skilled in the art upon examination of the following, or may be learned from practice of the technology.
- Non-limiting and non-exhaustive embodiments of the present invention, including the preferred embodiment, are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified. Illustrative embodiments of the invention are illustrated in the drawings, in which:
-
FIG. 1 illustrates a system block diagram for a cloud computing scoring systems and methods, in accordance with an embodiment of the present disclosure. -
FIG. 2 illustrates a block diagram for a learning seed file of a cloud computing scoring systems and methods, in accordance with an embodiment of the present disclosure. -
FIG. 3 illustrates a distributed computing architecture for a cloud computing scoring systems and methods, in accordance with an embodiment of the present disclosure. -
FIG. 4 illustrates a data processing calculation for a cloud computing scoring systems and methods, in accordance with an embodiment of the present disclosure. -
FIG. 5 illustrates a data processing calculation for a cloud computing scoring systems and methods, in accordance with an embodiment of the present disclosure. -
FIG. 6 illustrates a data processing calculation for a cloud computing scoring systems and methods, in accordance with an embodiment of the present disclosure. -
FIG. 7 illustrates a data processing calculation for a cloud computing scoring systems and methods, in accordance with an embodiment of the present disclosure. -
FIG. 8 illustrates a data processing calculation for a cloud computing scoring systems and methods, in accordance with an embodiment of the present disclosure. -
FIG. 9 illustrates a data processing calculation for a cloud computing scoring systems and methods, in accordance with an embodiment of the present disclosure. -
FIG. 10 illustrates a data processing calculation for a cloud computing scoring systems and methods, in accordance with an embodiment of the present disclosure. -
FIG. 11 illustrates a data processing calculation for a cloud computing scoring systems and methods, in accordance with an embodiment of the present disclosure. -
FIG. 12 illustrates a data processing calculation for a cloud computing scoring systems and methods, in accordance with an embodiment of the present disclosure. -
FIG. 13 illustrates a data application displaying normalized scores for a cloud computing scoring systems and methods, in accordance with an embodiment of the present disclosure. -
FIG. 14 illustrates a mobile application for data acquisition and display for a cloud computing scoring systems and methods, in accordance with an embodiment of the present disclosure. -
FIG. 15 illustrates a mobile application for data acquisition and display for a cloud computing scoring systems and methods, in accordance with an embodiment of the present disclosure. -
FIG. 16 illustrates a mobile application for data acquisition and display for a cloud computing scoring systems and methods, in accordance with an embodiment of the present disclosure. -
FIG. 17 illustrates a mobile application for data acquisition and display for a cloud computing scoring systems and methods, in accordance with an embodiment of the present disclosure. -
FIG. 18 illustrates a mobile application for data acquisition and display for a cloud computing scoring systems and methods, in accordance with an embodiment of the present disclosure. -
FIG. 19 illustrates a mobile application for data acquisition and display for a cloud computing scoring systems and methods, in accordance with an embodiment of the present disclosure. - Embodiments are described more fully below in sufficient detail to enable those skilled in the art to practice the system and method. However, embodiments may be implemented in many different forms and should not be construed as being limited to the embodiments set forth herein. The following detailed description is, therefore, not to be taken in a limiting sense.
- When elements are referred to as being “connected” or “coupled,” the elements can be directly connected or coupled together or one or more intervening elements may also be present. In contrast, when elements are referred to as being “directly connected” or “directly coupled,” there are no intervening elements present.
- The subject matter may be embodied as devices, systems, methods, and/or computer program products. Accordingly, some or all of the subject matter may be embodied in hardware and/or in software (including firmware, resident software, micro-code, state machines, gate arrays, etc.). Furthermore, the subject matter may take the form of a computer program product on a computer-usable or computer-readable storage medium having computer-usable or computer-readable program code embodied in the medium for use by or in connection with an instruction execution system. In the context of this document, a computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
- The computer-usable or computer-readable medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media.
- Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by an instruction execution system. Note that the computer-usable or computer-readable medium could be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory.
- Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
- When the subject matter is embodied in the general context of computer-executable instructions, the embodiment may comprise program modules, executed by one or more systems, computers, or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Typically, the functionality of the program modules may be combined or distributed as desired in various embodiments.
- In an embodiment, referring to
FIGS. 1 and 3, a cloud computing scoring system (“scoring system”) 10 may comprise a data acquisition and feed component 30 feeding data automatically 16 into a data analysis component 32 for performing sentiment analysis on the collected commentary regarding various cloud computing service providers (not shown) to be scored. Data acquisition block 30 may also manually 18 deliver structured analytics data 24, such as benchmarking measurements or performance reports, for further processing. Data processing block 34 may receive sentiment results 60 and analytics data 24 for weighting, combining, and normalizing, which may then produce a normalized score 78 rating the quality of each service category of one or more cloud computing service providers (FIG. 13). Displayed score 68 in data application component 36 may be displayed to web application 15 and/or mobile application 72. - Continuing with
FIGS. 1 and 3, in various embodiments, the compute, memory, storage, and software resources of scoring system 10 may be distributed according to best performance, price, and resource availability known to those skilled in the art. In an embodiment, referring to FIG. 3, server 86 may coordinate the operation of resources within data analysis 32 and data processing 34 components, while analytics database 26 may be connected to processing 32 from another location within internet cloud 28. Unstructured sentiment data 20 and structured sentiment data 22 may be connected from different locations within internet cloud 28. Normalized score 78 may be delivered to web application 15 and mobile application 72 in various locations via internet cloud 28. Alternately, portions of data analysis 32 and data processing 34 may be executed by separate servers or remotely. For example, keyword dictionary 42 may exist in the cloud and may be accessed by parser 40 for performing sentiment analysis. - Now referring to
FIG. 1, in an embodiment, unstructured sentiment data 20 may be fed automatically 16 from a variety of data sources 12 in order to mine opinion data for reporting on the performance of various service categories of a cloud service provider. Service categories (FIG. 13) for which scoring is desired may include infrastructure, security, reliability, service level, customer service, usability, price, performance, technology, and an overall rating for the service provider. Unstructured sentiment data 20 may be widely available, copious, and may comprise text commentary, yet may generally lack an identified service category or service provider structured to the sentiment data. Data sources 12 may include Twitter®, Facebook®, unstructured data from a crowd-sourcing application called CloudVibe™, unstructured analytics trending data from a service provider assessment platform, and other data sources 12 such as social networking feeds, internet articles, news, and blogs. Because cloud platforms may be rapidly changing due to competitive and technology churn, the availability of unstructured sentiment data 20 may present an opportunity to update and improve service provider scoring without the cost of elaborate marketing surveys or benchmarking measurement programs. Unfortunately, unstructured sentiment data 20 may not have been structured according to service provider or service category, nor classified as to a positive or negative polarity classification indicating attitude or observation. - Continuing with
FIG. 1, in an embodiment, a crowd-sourcing application database 14 may store structured sentiment data from a mobile application 72 such as CloudVibe™ (FIGS. 14-19). Mobile application 72 may allow registered users to score a service provider by entering a “thumbs up” or “thumbs down” polarity classification 80 associated with a service category and a service provider, including entering a brief text comment. The CloudVibe™ mobile application 72 may display scoring for various service providers to the registered users and may thereby present one useful perspective for choosing a service provider. The disclosure that follows describes how CloudVibe™ or other structured sentiment data may be utilized to update and improve service provider scoring beyond what a dedicated crowd-sourcing application or structured analytics may do alone. - Referring to
FIGS. 1 and 2, in an embodiment, parser 40 may receive unstructured sentiment data 20 commenting on a service provider (a scored service) to be scored and may identify a service category (not shown). Parser 40 may also identify a scored service. For example, a list of service categories and scored services may be stored in parser 40 or in the associated keyword dictionary 42, which may be matched to words in the unstructured sentiment data for identification thereof. Parser 40 may select from the unstructured sentiment data 20 text relating to the service category and matching one or more opinionative words and phrases listed in keyword dictionary 42, thereby producing structured comment 50 associated with the service category. Keyword dictionary 42 may be a generic sentiment database with a thorough list of words and phrases indicative of unambiguous opinion, or may be domain-specific, such as for engineering or computers, and may include terms and jargon common in the field of cloud computing in order to identify the service category and affect.
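The parsing step can be sketched as follows. This is an illustrative Python sketch, not the claimed implementation: the service category names follow the disclosure, but the keyword dictionary contents, function name, and return format are hypothetical.

```python
import re

# Hypothetical keyword dictionary 42: category cue words plus opinionative
# terms.  The service categories follow the disclosure; the word lists are
# invented for illustration only.
CATEGORIES = {
    "reliability": {"uptime", "outage", "downtime", "reliable"},
    "performance": {"latency", "fast", "slow", "throughput"},
    "price": {"cheap", "expensive", "pricing", "cost"},
}
OPINION_WORDS = {"love", "hate", "great", "terrible", "slow", "fast", "reliable"}

def parse_comment(text):
    """Reduce unstructured sentiment data 20 to a structured comment 50:
    identify a service category by cue words, then keep only the
    opinionative tokens.  Returns None when nothing matches."""
    tokens = re.findall(r"[a-z']+", text.lower())
    for category, cues in CATEGORIES.items():
        if cues.intersection(tokens):
            opinions = [t for t in tokens if t in OPINION_WORDS]
            if opinions:
                return {"category": category, "opinions": opinions, "text": text}
    return None
```

Under these assumptions, a comment such as “Love this provider, but the storage latency is slow.” would structure to the performance category with the opinionative tokens “love” and “slow”.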
FIGS. 1 and 2, in various embodiments, structured comment 50 may then be classified by classifier 46 as positive or negative according to a list of exemplary sentiment data sets 38 contained in learning seed file 44, and may thereby generate classified sentiment result 52. The exemplary sentiment data sets 38 may be manually assigned a positive or a negative polarity 80 by an industry expert 88. This manual assignment may be a kind of training process performed when scoring system 10 is installed, or may be periodically performed. Additionally, in an embodiment, a third classification may be a neutral classification in the case of a weak or ambiguous opinion. Alternatively, data sets 38 may be assigned a positive or negative strength on a scale, such as from −10 to +10. The learning seed file may provide lists of commentary specific to each service category and associated with an industry-trained polarity, whereas the keyword dictionary may identify and structure the target categories to an opinionative subset of the commentary in the sentiment data. Alternately, the classifying function 46 may be contained in the learning seed file 44 storing exemplary data sets 38.
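One minimal way to classify against labeled seed examples is word overlap with the nearest exemplar. The disclosure does not specify the classifier's internals, so the following Python sketch is an assumption: the seed comments, function name, and overlap heuristic are all illustrative stand-ins for classifier 46.

```python
# Illustrative learning seed file 44: exemplary sentiment data sets 38,
# manually labeled with a polarity.  The example comments are invented.
SEED = [
    ("constant outages this month", "negative"),
    ("rock solid uptime all year", "positive"),
    ("support resolved my ticket fast", "positive"),
    ("billing errors and hidden fees", "negative"),
]

def classify(comment):
    """Classify a structured comment 50 by word overlap with each labeled
    seed example; the polarity of the best-overlapping example wins.
    Falls back to 'neutral' for a weak or ambiguous opinion, per the
    third-classification embodiment."""
    words = set(comment.lower().split())
    best_label, best_overlap = "neutral", 0
    for example, label in SEED:
        overlap = len(words & set(example.split()))
        if overlap > best_overlap:
            best_label, best_overlap = label, overlap
    return best_label
```

A production classifier would more likely be a trained statistical model, but the shape is the same: unseen commentary is scored against manually labeled exemplars.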
FIGS. 1 and 2, in an embodiment, learning seed file 44 may be configured to be enhanced by the ongoing addition of structured sentiment data 22 commenting on the scored service and having a polarity classification 80. By the use of structured data 22, the learning seed file 44 may learn additional words, phrases, and/or word constellations which may appear in unstructured sentiment data 20 and whose addition to exemplary sentiment data sets 38 updates and improves the accuracy of service provider scoring. In an embodiment, learning seed file 44, through its algorithm 48, may identify opinionative words and phrases within structured sentiment data 22 that enhance exemplary sentiment data sets 38, and may add the sentiment data to the learning seed file 44. Alternatively, crowd-sourcing database 14 may provide sentiment data structured in a form ready to be added directly to exemplary sentiment data sets 38 should the addition improve the quality of the scoring. For example, a structured sentiment may strongly indicate an opinion for a service category largely missing from exemplary sentiment data sets 38, and learning seed file 44 may determine that adding the strong sentiment data will therefore enhance the scoring system's ability to benefit from the receipt of unstructured data 20. - Advantageously, the use of pre-classified, structured
sentiment data 22 to update the industry-tuned 88 exemplary sentiment data sets 38 may act as a continuous self-training, making better contextual use of social networking data and thereby providing aggregate scoring from the user's perspective. In summary, the steps of parsing, classifying, and enhancing the sentiment analysis of unstructured social networking data 20 may provide an advantage over existing methods of parsing and classifying against a list of words after training the sentiment analysis algorithm prior to initial deployment.
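The seed-enhancement loop described above can be sketched in a few lines. This Python sketch is an assumption, not the claimed algorithm 48: the tuple layout of seed entries, the field names of the incoming item, and the coverage threshold are all illustrative.

```python
from collections import Counter

def maybe_enhance(seed, item, min_examples=2):
    """Sketch of the seed-enhancement step: add an incoming pre-classified
    structured sentiment item (from crowd-sourcing database 14) to the seed
    file when its (category, polarity) pair is underrepresented, so later
    unstructured data classifies better.  Seed entries are
    (text, category, polarity) tuples; names and threshold are illustrative."""
    counts = Counter((cat, pol) for _, cat, pol in seed)
    key = (item["category"], item["polarity"])
    if counts[key] < min_examples:
        seed.append((item["text"], *key))
        return True   # the seed file learned a new exemplar
    return False      # this category/polarity is already well covered

seed_file = [("fees too high", "price", "negative"),
             ("surprise charges", "price", "negative")]
added = maybe_enhance(seed_file, {"text": "rock solid uptime",
                                  "category": "reliability",
                                  "polarity": "positive"})
```

Here the reliability/positive pair is missing from the seed, so the incoming item is appended; a further price/negative item would be declined as already covered.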
FIG. 1 , in an embodiment, simplesentiment analysis block 58 may associate a service category with apolarity classification 80 ofstructured sentiment data 22 for delivering astructured sentiment result 56 todata processing component 34. Sentiment results pool 60 may collectstructured sentiment result 56 andclassified sentiment result 52 for weighting, combining, and normalizing. Alternately, the format ofstructured sentiment data 22 may not require formatting by simplesentiment analysis block 58 ifdata 22 is ready for combining. In an embodiment,classified sentiment result 52 may be processed 34 without being combined withstructured sentiment data 22, score 78 having already benefited by the ongoing addition ofstructured sentiment data 22 to learningseed file 44. Advantageously, simplesentiment analysis block 58 may be simple becausedata 22 has already been parsed and classified with apolarity 80. - Referring still to
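Because structured sentiment data 22 arrives already parsed and classified, the simple sentiment analysis step reduces to a field mapping. The record field names in this Python sketch are assumptions for illustration.

```python
def simple_sentiment(record):
    """Simple sentiment analysis block 58: map a thumbs-up/down crowd-sourced
    record to a structured sentiment result 56 ready for weighting.  No
    parsing or classification is needed; the polarity 80 is already present."""
    polarity = "positive" if record["thumb"] == "up" else "negative"
    return {"category": record["category"], "polarity": polarity}
```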
FIG. 1, in an embodiment, weighting block 64 may weight the incoming results for determining normalized score 78. In an embodiment, analytics data 24 quantifying the scored service may be processed by analytics scoring block 62 to generate a structured analytics result 54 compatible in format with the format of sentiment results 56 and 52. For example, analytics data 24 may quantify several analytics performance factors 82 (FIGS. 4 and 5) that need to be formatted to associate with a particular service category being scored such as reliability or performance. Analytics performance factors 82 such as geographic coverage or benchmarking data may be collected by various analytics processes such as technology surveys, or benchmarking measurements of read/write latency in a cloud storage device. In an embodiment, analytics result 54 may be combined with sentiment result 52 in order to stabilize and broaden the perspective of scoring system 10. Alternately, analytics data 24 may not be combined with sentiment results for providing a normalized score 78, and the decision to combine analytic result 54 may be dependent on the service category being scored. - Referring to
FIG. 1 and FIGS. 4 through 13, in various embodiments, the results being combined for a particular service category may be weighted 64 according to a relative influence of each result. The results being combined may include at least one of classified sentiment result 52, structured sentiment result 56, and structured analytics result 54. In an embodiment depicted in FIG. 4, for example, a weighting 64 factor of 0.1 (10%) may be applied to each of five analytics performance factors 82 making up structured analytics result 54, whereas a weighting 64 factor of 0.4 (40%) may be applied to structured sentiment result 56 from the CloudVibe™ crowd-sourcing application. And, a weighting 64 factor of 0.1 (10%) may be applied to classified sentiment result 52 from Twitter™. 111 points may be chosen as the maximum weighted result 84 for any category and for normalized score 78. Alternately, any scale value may be used for the maximum normalized score 78. After weighting 64, all of the weighted results 84 may be combined into weighted sum 90 and normalized 66 to a standard scale, such as 1000.
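The weighting-and-normalization arithmetic can be sketched directly from the FIG. 4 weights. This Python sketch assumes, for illustration only, that each input result has already been scaled to the range 0..1; the function name and example factor values are invented.

```python
def category_score(analytics_factors, crowd, social, scale=1000):
    """Weighted combination mirroring the FIG. 4 example: five analytics
    performance factors 82 weighted 0.1 each, the crowd-sourced structured
    sentiment result 56 weighted 0.4, and the classified sentiment result 52
    weighted 0.1 (weights sum to 1.0).  Inputs are assumed pre-scaled to
    0..1; the output is normalized 66 to the standard display scale."""
    weighted = sum(0.1 * factor for factor in analytics_factors)  # 5 x 0.1
    weighted += 0.4 * crowd + 0.1 * social
    return round(weighted * scale)

# Invented factor values for one service category:
score = category_score([0.8, 0.6, 0.9, 0.7, 0.5], crowd=0.75, social=0.6)
```

With these invented inputs the weighted sum is 0.71, giving 710 on the standard scale of 1000.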
FIG. 1 and FIGS. 4 through 13, in various embodiments, classified sentiment result 52 may be combined with structured sentiment result 56, as shown in FIGS. 6, 8, and 9, to yield a normalized score 78 (FIG. 13). In other embodiments, structured analytics result 54 may be combined with both sentiment results 52 and 56, as shown in FIGS. 4, 5, 7, and 10-12, to yield a normalized score 78. In an embodiment not shown in FIGS. 4-12, structured analytics result 54 may be combined with only one of the sentiment results 52 and 56 to yield a normalized score 78. Normalization 66 may be set to a standard scale, such as 1000, for matching the scale of a user interface used to display 68 scores. Scores 78 may be displayed 68 on user interfaces such as the CloudSphere™ and CloudVibe™ products. - Referring now to
FIGS. 13 through 19, in various embodiments, a mobile phone having the CloudVibe™ mobile application 72 may display normalized scores 78 for each of five scored services 74 and according to service categories 76 on a standard scale of 1000 (FIGS. 13 and 16). Each normalized score 78 may be color coded according to a low (e.g. 114), medium (e.g. 422) or high (e.g. 790) score. Alternatively, scores 78 may be displayed on any user interface or by any communications means, such as displaying scores 78 on a web application, by a text message, by an email message, or through a paper report. FIG. 14 illustrates a log-in screen through which a user may access mobile application 72. Options may be presented to the user and may include posting sentiment 22 (FIG. 17), viewing scores 78 (FIG. 18), or viewing trend reports (FIG. 19). In an embodiment shown in FIG. 17, a user of mobile application 72 may be asked to classify a service category with a “thumbs up” or “thumbs down” polarity classification 80 associated with brief sentiment data 22. - Although the above embodiments have been described in language that is specific to certain structures, elements, compositions, and methodological steps, it is to be understood that the technology defined in the appended claims is not necessarily limited to the specific structures, elements, compositions and/or steps described. Rather, the specific aspects and steps are described as forms of implementing the claimed technology. Since many embodiments of the technology can be practiced without departing from the spirit and scope of the invention, the invention resides in the claims hereinafter appended.
- Various embodiments of the present systems and methods may be used as a tool internally by a cloud consultant as input into a final report for a client. Various embodiments of the present systems and methods may be integrated into upstream or downstream supply chain or provisioning systems in the form of OEM.
- Various embodiments of the present systems and methods may be the foundation for a cloud marketplace resource trading or bidding system. The foregoing description of the subject matter has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the subject matter to the precise form disclosed, and other modifications and variations may be possible in light of the above teachings. The embodiment was chosen and described in order to best explain the principles of the invention and its practical application to thereby enable others skilled in the art to best utilize the invention in various embodiments and various modifications as are suited to the particular use contemplated. It is intended that the appended claims be construed to include other alternative embodiments except insofar as limited by the prior art.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/842,987 US20200234193A1 (en) | 2014-04-17 | 2020-04-08 | Cloud computing scoring systems and methods |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201461980928P | 2014-04-17 | 2014-04-17 | |
US14/687,748 US10621505B2 (en) | 2014-04-17 | 2015-04-15 | Cloud computing scoring systems and methods |
US16/842,987 US20200234193A1 (en) | 2014-04-17 | 2020-04-08 | Cloud computing scoring systems and methods |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/687,748 Continuation US10621505B2 (en) | 2014-04-17 | 2015-04-15 | Cloud computing scoring systems and methods |
Publications (1)
Publication Number | Publication Date |
---|---|
US20200234193A1 true US20200234193A1 (en) | 2020-07-23 |
Family
ID=54322290
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/687,748 Active 2038-05-08 US10621505B2 (en) | 2014-04-17 | 2015-04-15 | Cloud computing scoring systems and methods |
US16/842,987 Pending US20200234193A1 (en) | 2014-04-17 | 2020-04-08 | Cloud computing scoring systems and methods |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/687,748 Active 2038-05-08 US10621505B2 (en) | 2014-04-17 | 2015-04-15 | Cloud computing scoring systems and methods |
Country Status (1)
Country | Link |
---|---|
US (2) | US10621505B2 (en) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10084913B2 (en) * | 2014-08-18 | 2018-09-25 | Wells Fargo Bank, N.A. | Sentiment management system |
KR101755227B1 (en) * | 2015-08-10 | 2017-07-06 | 숭실대학교산학협력단 | Apparatus and method for prodict type classification |
US10108465B1 (en) | 2016-06-23 | 2018-10-23 | EMC IP Holding Company LLC | Automated cloud service evaluation and workload migration utilizing standardized virtual service units |
US11087085B2 (en) * | 2017-09-18 | 2021-08-10 | Tata Consultancy Services Limited | Method and system for inferential data mining |
CN109671487A (en) * | 2019-02-25 | 2019-04-23 | 上海海事大学 | A kind of social media user psychology crisis alert method |
US10963639B2 (en) * | 2019-03-08 | 2021-03-30 | Medallia, Inc. | Systems and methods for identifying sentiment in text strings |
CN112836515A (en) * | 2019-11-05 | 2021-05-25 | 阿里巴巴集团控股有限公司 | Text analysis method, recommendation device, electronic equipment and storage medium |
CN110888983B (en) * | 2019-11-26 | 2022-07-15 | 厦门市美亚柏科信息股份有限公司 | Positive and negative emotion analysis method, terminal equipment and storage medium |
CN113507399B (en) * | 2021-07-09 | 2022-07-26 | 西安电子科技大学 | Network performance evaluation device and method for different levels of cloud platform |
Family Cites Families (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070011183A1 (en) * | 2005-07-05 | 2007-01-11 | Justin Langseth | Analysis and transformation tools for structured and unstructured data |
US7930302B2 (en) * | 2006-11-22 | 2011-04-19 | Intuit Inc. | Method and system for analyzing user-generated content |
US20080249764A1 (en) * | 2007-03-01 | 2008-10-09 | Microsoft Corporation | Smart Sentiment Classifier for Product Reviews |
US7996210B2 (en) * | 2007-04-24 | 2011-08-09 | The Research Foundation Of The State University Of New York | Large-scale sentiment analysis |
US7987188B2 (en) * | 2007-08-23 | 2011-07-26 | Google Inc. | Domain-specific sentiment classification |
US20100119053A1 (en) * | 2008-11-13 | 2010-05-13 | Buzzient, Inc. | Analytic measurement of online social media content |
US7987262B2 (en) | 2008-11-19 | 2011-07-26 | Accenture Global Services Limited | Cloud computing assessment tool |
US8271615B2 (en) | 2009-03-31 | 2012-09-18 | Cloud Connex, Llc | Centrally managing and monitoring software as a service (SaaS) applications |
US8504443B2 (en) | 2009-08-31 | 2013-08-06 | Red Hat, Inc. | Methods and systems for pricing software infrastructure for a cloud computing environment |
US8533208B2 (en) * | 2009-09-28 | 2013-09-10 | Ebay Inc. | System and method for topic extraction and opinion mining |
US20120316916A1 (en) * | 2009-12-01 | 2012-12-13 | Andrews Sarah L | Methods and systems for generating corporate green score using social media sourced data and sentiment analysis |
US20110213712A1 (en) | 2010-02-26 | 2011-09-01 | Computer Associates Think, Ink. | Cloud Broker and Procurement System and Method |
WO2011120211A1 (en) * | 2010-03-29 | 2011-10-06 | Nokia Corporation | Method and apparatus for seeded user interest modeling |
US8032846B1 (en) | 2010-03-30 | 2011-10-04 | Synopsys, Inc. | Efficient provisioning of resources in public infrastructure for electronic design automation (EDA) tasks |
US20110270968A1 (en) | 2010-04-30 | 2011-11-03 | Salsburg Michael A | Decision support system for moving computing workloads to public clouds |
US9239996B2 (en) | 2010-08-24 | 2016-01-19 | Solano Labs, Inc. | Method and apparatus for clearing cloud compute demand |
JP2012053853A (en) | 2010-09-03 | 2012-03-15 | Ricoh Co Ltd | Information processor, information processing system, service provision device determination method and program |
US9536269B2 (en) * | 2011-01-19 | 2017-01-03 | 24/7 Customer, Inc. | Method and apparatus for analyzing and applying data related to customer interactions with social media |
US10678602B2 (en) | 2011-02-09 | 2020-06-09 | Cisco Technology, Inc. | Apparatus, systems and methods for dynamic adaptive metrics based application deployment on distributed infrastructures |
US8949270B2 (en) * | 2011-03-10 | 2015-02-03 | Salesforce.Com, Inc. | Methods and systems for processing social media data |
US8532798B2 (en) * | 2011-08-23 | 2013-09-10 | Longitude Llc | Predicting outcomes of future sports events based on user-selected inputs |
US8819171B2 (en) | 2011-09-07 | 2014-08-26 | Microsoft Corporation | Monitoring and benchmarking client performance from the server-side |
US9781205B2 (en) | 2011-09-12 | 2017-10-03 | Microsoft Technology Licensing, Llc | Coordination engine for cloud selection |
US20130117157A1 (en) | 2011-11-09 | 2013-05-09 | Gravitant, Inc. | Optimally sourcing services in hybrid cloud environments |
EP2807622A4 (en) | 2012-01-29 | 2015-08-19 | Hewlett Packard Development Co | Best-deal list generation |
WO2013119200A1 (en) | 2012-02-06 | 2013-08-15 | Empire Technology Development, Llc | Maintaining application performances upon transfer between cloud services |
US9071613B2 (en) | 2012-04-06 | 2015-06-30 | International Business Machines Corporation | Dynamic allocation of workload deployment units across a plurality of clouds |
US20130346227A1 (en) | 2012-06-22 | 2013-12-26 | Microsoft Corporation | Performance-Based Pricing for Cloud Computing |
US20130346161A1 (en) | 2012-06-25 | 2013-12-26 | Sap Ag | Benchmarking with peer groups in a cloud environment |
US20140006369A1 (en) * | 2012-06-28 | 2014-01-02 | Sean Blanchflower | Processing structured and unstructured data |
US9588820B2 (en) | 2012-09-04 | 2017-03-07 | Oracle International Corporation | Cloud architecture recommender system using automated workload instrumentation |
US20140074647A1 (en) | 2012-09-07 | 2014-03-13 | Xerox Corporation | Methods and systems for recommending cloud configuration in a cloud service market place |
US20150269234A1 (en) * | 2014-03-19 | 2015-09-24 | Hewlett-Packard Development Company, L.P. | User Defined Functions Including Requests for Analytics by External Analytic Engines |
US20150286627A1 (en) * | 2014-04-03 | 2015-10-08 | Adobe Systems Incorporated | Contextual sentiment text analysis |
-
2015
- 2015-04-15 US US14/687,748 patent/US10621505B2/en active Active
-
2020
- 2020-04-08 US US16/842,987 patent/US20200234193A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
US10621505B2 (en) | 2020-04-14 |
US20150302304A1 (en) | 2015-10-22 |
Similar Documents
Publication | Title |
---|---|
US20200234193A1 (en) | Cloud computing scoring systems and methods |
US20200335002A1 (en) | Guiding creation of an electronic survey |
US11430013B2 (en) | Configurable relevance service test platform |
US10664777B2 (en) | Automated recommendations for task automation |
US8788442B1 (en) | Compliance model training to classify landing page content that violates content item distribution guidelines |
US8374983B1 (en) | Distributed object classification |
US7660786B2 (en) | Data independent relevance evaluation utilizing cognitive concept relationship |
US20170154307A1 (en) | Personalized data-driven skill recommendations and skill gap prediction |
US20150006414A1 (en) | Social network for employment search |
US20130332385A1 (en) | Methods and systems for detecting and extracting product reviews |
US9881344B2 (en) | User characteristics-based sponsored company postings |
US20170032386A1 (en) | Growth-based ranking of companies |
US20200279077A1 (en) | System and method for detecting portability of sentiment analysis system |
US10628511B2 (en) | Machine learning system and method of classifying an application link as broken or working |
US10290032B2 (en) | Blacklisting based on image feature analysis and collaborative filtering |
US20140195312A1 (en) | System and method for management of processing workers |
JP2019503522A (en) | Optimized digital component analysis system |
Jørgensen | A strong focus on low price when selecting software providers increases the likelihood of failure in software outsourcing projects |
CN117541401A (en) | Information pushing method, device, electronic equipment and storage medium |
US20210097424A1 (en) | Dynamic selection of features for training machine learning models |
US20150235281A1 (en) | Categorizing data based on cross-category relevance |
US9218420B1 (en) | Detecting new businesses with unrecognized query terms |
US10586046B1 (en) | Automated security feed analysis for threat assessment |
US20160171608A1 (en) | Methods and systems for finding similar funds |
US20160034854A1 (en) | Job hosting service for paid and unpaid job postings |
Legal Events
Code | Title | Description |
---|---|---|
AS | Assignment | Owner name: XOCUR, INC., CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: MONDEN, JASON PETER; KARMAZYN, DANIEL DAVID; SUTTON, PERRON RICHARD; AND OTHERS; SIGNING DATES FROM 20150430 TO 20150803; REEL/FRAME: 052341/0201 |
AS | Assignment | Owner name: HYPERGRID, INC., CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: XOCUR, INC.; REEL/FRAME: 052341/0290. Effective date: 20170413 |
STPP | Information on status: patent application and granting procedure in general | APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED |
STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED |