WO2019163823A1 - Système de traitement d'informations (Information processing system) - Google Patents

Système de traitement d'informations (Information processing system)

Info

Publication number
WO2019163823A1
Authority
WO
WIPO (PCT)
Prior art keywords
learning
model
unit
data
function
Prior art date
Application number
PCT/JP2019/006319
Other languages
English (en)
Japanese (ja)
Inventor
敏弥 河崎
Original Assignee
株式会社Abeja
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Abeja filed Critical 株式会社Abeja
Priority to JP2019542650A, published as JP6689507B2
Publication of WO2019163823A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00: Machine learning

Definitions

  • FIG. 3 is a diagram illustrating an example of a learning job progress confirmation screen.
  • The learning job progress confirmation screen of the example of FIG. 3 is displayed on the output unit 17. The same applies to the example of FIG. 4 and the example of FIG. 5 described later.
  • The user assigns each of a plurality of learning jobs to one learning definition (Job Definition) for the learning executed by the learning unit 14 (FIG. 2). The learning jobs can be executed independently of one another.
  • The learning definition is versioned, for example, Ver. 11 or Ver. …
  • A learning definition of a predetermined version corresponds to the above-mentioned "one learning definition".
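  • As a minimal illustrative sketch (the publication itself contains no code; the names LearningDefinition, LearningJob, and assign_job are hypothetical), the relationship between one versioned learning definition and several independently executable learning jobs could be modeled as follows.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LearningJob:
    """One learning run; jobs under the same definition run independently."""
    job_id: str
    dataset_id: str
    hyperparameters: dict
    status: str = "pending"   # pending / running / finished

@dataclass
class LearningDefinition:
    """A versioned Job Definition to which multiple learning jobs are assigned."""
    name: str
    version: str              # e.g. "Ver. 11"
    jobs: List[LearningJob] = field(default_factory=list)

    def assign_job(self, job: LearningJob) -> None:
        self.jobs.append(job)

# Example: two jobs assigned to one learning definition, executed independently.
definition = LearningDefinition(name="image-classifier", version="Ver. 11")
definition.assign_job(LearningJob("job-1", "dataset-a", {"lr": 0.01}))
definition.assign_job(LearningJob("job-2", "dataset-b", {"lr": 0.001}))
```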
  • The characteristics of function (2) are described below. Conventionally, local development requires manual operations such as directory management. With function (2), by contrast, the data set, code, parameters, and logs used for learning are version-controlled, and the source code of the model is linked to them on the cloud. As a result, the learning result and the source code can be managed separately, which resolves licensing issues. Furthermore, function (2) can manage the copyright of the software and the rights to the model separately.
  • Function (2) is based on the idea that it is important for the platform to clarify the relationship of rights between the source code created by the developer (the user) and the learning result generated on the cloud.
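  • One way to picture the linkage described for function (2) is the sketch below, which records the version-controlled inputs of a training run separately from its output; the function name register_training_run and the record fields are assumptions, not taken from the publication.

```python
import hashlib
import json

def register_training_run(source_commit: str, dataset_version: str,
                          parameters: dict, log_uri: str, artifact_uri: str) -> dict:
    """Link the version-controlled inputs of a training run to its output model.

    The source code stays in the developer's repository (identified only by a
    commit hash), while the learning result (artifact_uri) is tracked separately,
    so the rights to each can be managed independently.
    """
    record = {
        "source_commit": source_commit,        # code stays with the developer
        "dataset_version": dataset_version,    # version-controlled data set
        "parameters": parameters,              # hyperparameters used for learning
        "log_uri": log_uri,                    # training log location
        "artifact_uri": artifact_uri,          # learning result stored on the cloud
    }
    record["record_id"] = hashlib.sha1(
        json.dumps(record, sort_keys=True).encode()).hexdigest()[:12]
    return record
```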
  • Function (3) performs preprocessing of input/output data while hiding it from the user.
  • Function (3) is a mechanism for making the user program independent of the execution environment.
  • Function (3) is based on HTTP (Hypertext Transfer Protocol).
  • Function (3) can be executed in an event-driven manner on the cloud.
  • Function (3) can also be executed on an edge device. Function (3) can reduce the man-hours required of the developer as a user.
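  • As an illustration of the decoupling described for function (3), the sketch below wraps a user program behind an HTTP-style entry point that performs hidden preprocessing; the names user_handler and platform_entrypoint and the JSON request format are assumptions, not part of the publication.

```python
import base64
import json

def user_handler(image_bytes: bytes) -> dict:
    """The user's program: it only sees already-preprocessed input."""
    return {"label": "cat", "score": 0.97}   # placeholder inference

def platform_entrypoint(http_body: bytes) -> bytes:
    """Platform-side wrapper: decodes the HTTP request, runs the hidden
    preprocessing, invokes the user program, and encodes the response.
    The same wrapper could be driven by cloud events or run on an edge device."""
    request = json.loads(http_body)
    image_bytes = base64.b64decode(request["image"])   # hidden preprocessing
    result = user_handler(image_bytes)
    return json.dumps(result).encode()

# Example call with a dummy request body.
body = json.dumps({"image": base64.b64encode(b"\x00\x01").decode()}).encode()
print(platform_entrypoint(body))
```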
  • A billing plan function according to the degree of achievement of model learning (hereinafter referred to as “function (4)”) will be described.
  • Function (4) can solve the problem that, in machine learning / deep learning, it is difficult to predict in advance how much accuracy will be achieved and in how much time when training a model, and that the user therefore wants to keep the cost down until learning is completed.
  • the function (4) can achieve the following effects.
  • the operations “open” and “commit” are defined as follows in the model learning in the cloud. “Open” represents the start of learning. “Commit” represents that learning is completed and a model is created from the learning result.
  • Function (4) charges for model learning at the time of “open” and at the time of “commit”, and keeps the price at “open” lower than the price at “commit”. This achieves the effect that the user's cost until learning is completed can be kept down.
  • Function (4) is based on the idea that, as machine learning / deep learning model training shifts to the cloud, users will become less conscious of the GPU 14 itself, and the time should come when a billing system tied to the GPU 14 is no longer used.
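  • The “open” / “commit” billing scheme of function (4) could be sketched as follows; the concrete prices and the class BillableTraining are hypothetical and only illustrate that the “open” charge is kept lower than the “commit” charge.

```python
# Hypothetical prices; the publication only states that the "open" price
# is kept lower than the "commit" price.
PRICE_TABLE = {"open": 10, "commit": 100}

class BillableTraining:
    """Charge model learning at "open" (start) and "commit" (model created)."""

    def __init__(self):
        self.charges = []

    def open(self) -> None:
        # Learning starts: a small charge is recorded.
        self.charges.append(("open", PRICE_TABLE["open"]))

    def commit(self) -> None:
        # Learning finished and a model is created: the larger charge applies.
        self.charges.append(("commit", PRICE_TABLE["commit"]))

    def total(self) -> int:
        return sum(amount for _, amount in self.charges)

training = BillableTraining()
training.open()          # cheap while the outcome is still uncertain
training.commit()        # charged in full only when a model is actually produced
print(training.total())  # 110 with the hypothetical prices above
```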
  • With function (5), the version to be used can be switched instantaneously on a UI (User Interface).
  • Function (5) can compare the accuracy of different versions by sending one or more test data sets as requests to each version's endpoint. In function (5), the accuracy comparison is visualized, and the best version can be selected as the primary.
  • The accuracy of the model is an important index in a machine learning / deep learning API.
  • Function (5) can use the results of comparing the accuracy of multiple versions under the same conditions for Blue/Green deployment.
  • Function (5) is based on the idea that, as described above, the accuracy of the model is important in a machine learning / deep learning API, and that general Blue/Green deployment alone is not sufficient.
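  • A minimal sketch of the version comparison described for function (5) is shown below; compare_versions, the in-process stand-ins for endpoints, and the test data are all hypothetical, whereas a real system would send HTTP requests to each deployed version.

```python
from typing import Callable, Dict, List, Tuple

def compare_versions(endpoints: Dict[str, Callable[[object], object]],
                     test_set: List[Tuple[object, object]]) -> str:
    """Send the same test data to every version's endpoint, compute accuracy
    under identical conditions, and return the best version to promote to primary."""
    accuracies = {}
    for version, predict in endpoints.items():
        correct = sum(1 for x, y in test_set if predict(x) == y)
        accuracies[version] = correct / len(test_set)
    best = max(accuracies, key=accuracies.get)
    print("accuracy per version:", accuracies)   # visualized on the UI in practice
    return best

# Example with two stand-in "endpoints".
endpoints = {"v1": lambda x: x > 0.5, "v2": lambda x: x > 0.4}
test_set = [(0.45, True), (0.6, True), (0.2, False)]
primary = compare_versions(endpoints, test_set)   # "v2" in this toy example
```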
  • A function that provides a Web API confirmation interface (hereinafter referred to as “function (6)”) will be described.
  • Function (6) can solve the problem of wanting to confirm that the inputs and outputs of the machine learning / deep learning model, including the teacher data, are consistent, where the inference results and the teacher data are unstructured text data.
  • the function (6) can achieve the following effects.
  • a confirmation interface that can be used in the Web browser can be prepared. This interface is divided into an input data drawing area and an output data drawing area, and each can be scrolled independently. Thereby, the visibility is improved while maintaining the ease of comparing the input image and the output.
  • the following data (a) to (c) can be used as input data.
  • (c) Images and teacher data (unstructured text data) stored in the data lake. When teacher data is present for data (c), the unstructured text data is parsed and superimposed on the image as color-coded rectangles to improve visibility.
  • Unstructured text data or an image can be acquired as the inference result.
  • When an image is acquired, it is drawn as it is.
  • When unstructured text data is acquired, it is parsed and superimposed on the image as color-coded rectangles. Since function (6) can provide these interfaces, it has the effect of facilitating comparison between input and output.
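  • As one possible reading of the drawing step of function (6), the sketch below parses annotation text (assumed here to be JSON, which the publication does not specify) and overlays color-coded rectangles on the input image using Pillow.

```python
import json
from PIL import Image, ImageDraw   # Pillow

# Hypothetical annotation format and color mapping.
COLORS = {"car": "red", "person": "blue"}

def overlay_annotations(image_path: str, annotation_text: str) -> Image.Image:
    """Parse the unstructured text (assumed JSON) and superimpose one
    color-coded rectangle per annotated object on the input image."""
    image = Image.open(image_path).convert("RGB")
    draw = ImageDraw.Draw(image)
    for obj in json.loads(annotation_text):
        x1, y1, x2, y2 = obj["box"]
        draw.rectangle([x1, y1, x2, y2],
                       outline=COLORS.get(obj["label"], "green"), width=3)
    return image

# Example usage with a hypothetical annotation string:
# annotation_text = '[{"label": "car", "box": [10, 10, 120, 80]}]'
# overlay_annotations("input.jpg", annotation_text).save("check.png")
```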
  • A function for managing the same model for each application using the concept of “deployment” (hereinafter referred to as “function (7)”) will be described.
  • Function (7) has the feature that deployments are divided, for example, per store and per edge device.
  • Function (7) can solve the problem that, although it is desirable to reuse the same model for different applications, it is difficult to manage the different settings and capacities required by each application. As a result, function (7) can execute a given model on a plurality of infrastructures, and the scale of the infrastructure and the program settings necessary for execution can be managed in units of “deployment”.
  • Function (7) is based on the idea that this is important because the performance and settings of the same model need to be changed in response to different requirements in a SaaS service or the like.
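  • The “deployment” unit of function (7) can be pictured as a record that ties one model version to application-specific scale and settings; the Deployment dataclass and its fields below are illustrative assumptions rather than the publication's data model.

```python
from dataclasses import dataclass

@dataclass
class Deployment:
    """One deployment: a model version plus the infrastructure scale and
    program settings needed to run it for a particular application."""
    name: str            # e.g. "store-A" or "edge-device-3"
    model_version: str   # the same model can appear in many deployments
    replicas: int        # infrastructure scale for this application
    settings: dict       # application-specific program settings

# The same model version reused with different capacity and settings.
deployments = [
    Deployment("store-A", "classifier:ver11", replicas=4, settings={"threshold": 0.8}),
    Deployment("edge-device-3", "classifier:ver11", replicas=1, settings={"threshold": 0.6}),
]
```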
  • A function for automatically saving the inference request (raw data) and the inference response (inference result) (hereinafter referred to as “function (8)”) will be described.
  • With function (8), inference is not simply performed and finished; whether the inference result is correct is confirmed later. Therefore, the inference request (raw data) and the inference response (inference result) are stored automatically so that they can be used, among other purposes, as learning data for improving accuracy.
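  • A sketch of the automatic saving in function (8) is given below; the JSON-lines log file and the function names are assumptions, the point being only that every raw request and its inference result are appended for later review.

```python
import json
import time
import uuid

def save_for_later_review(record: dict, path: str = "inference_log.jsonl") -> None:
    """Append the raw request and its inference result so they can be checked
    later and reused as learning data for improving accuracy."""
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")

def infer_with_logging(model, raw_request: dict) -> dict:
    """Run inference through any callable model and log request and response."""
    response = model(raw_request)                      # the actual inference
    save_for_later_review({
        "id": str(uuid.uuid4()),
        "timestamp": time.time(),
        "request": raw_request,                        # raw data
        "response": response,                          # inference result
    })
    return response
```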
  • Function (9) has the feature of graphing whether the accuracy of the inference results satisfies the intended standard, and sending an alert when the standard is not satisfied.
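  • Function (9) could be approximated by the following sketch, which checks the latest accuracy value against a threshold and raises an alert; the threshold value, monitor_accuracy, and send_alert are hypothetical, and the real system also graphs the history.

```python
def send_alert(message: str) -> None:
    # Placeholder: in practice this could be an e-mail or chat notification.
    print("ALERT:", message)

def monitor_accuracy(accuracy_history: list, threshold: float = 0.9) -> None:
    """Track whether inference accuracy meets the intended standard and raise
    an alert when it does not (the real system graphs the history on the UI)."""
    latest = accuracy_history[-1]
    if latest < threshold:
        send_alert(f"accuracy {latest:.2%} fell below the {threshold:.0%} standard")

monitor_accuracy([0.95, 0.93, 0.88])   # triggers an alert for the last value
```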
  • Function (10) can display a label indicating how an image was classified as a result of inference, together with a thumbnail of the image. In addition to classification, segmentation and bounding boxes are also supported.
  • Function (10) can solve the following problem. That is, when it is impossible to measure numerically whether an inference result is correct, the accuracy must be checked by human eyes, which requires saving the data and presenting it in a form that is easy to confirm.
  • As a result, function (10) has the effect that a separate development effort and workflow for checking the data become unnecessary.
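  • As an illustration of the visual check supported by function (10), the sketch below renders labels and thumbnails into a simple HTML page; the review_page function and the result format are assumptions, not the publication's interface.

```python
import html

def review_page(results: list) -> str:
    """Build a minimal HTML page that shows, for each inferred image, its
    thumbnail and the predicted label, so a person can check the results by eye."""
    rows = "".join(
        f'<div><img src="{html.escape(r["thumbnail"])}" width="96">'
        f'<span>{html.escape(r["label"])}</span></div>'
        for r in results
    )
    return f"<html><body>{rows}</body></html>"

page = review_page([{"thumbnail": "thumbs/001.jpg", "label": "defective"}])
```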
  • A function for checking the true value of the inference result (hereinafter referred to as “function (11)”) will be described.
  • The features of function (11) are described below. That is, when it cannot be measured numerically whether an inference result is correct, the accuracy must be checked by human eyes. Annotation is therefore performed manually on a portion of the data obtained by sampling the inference results, and the true values are checked.
  • Function (11) can solve the problem of how to check accuracy when it is impossible to determine numerically whether an inference result is correct. As a result, function (11) has the effect of reducing the cost of assigning and managing the personnel who check the true values.
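  • The sampling-and-annotation flow of function (11) might look like the sketch below; the sampling rate and the functions sample_for_annotation and estimated_accuracy are illustrative assumptions.

```python
import random

def sample_for_annotation(inference_results: list, rate: float = 0.05,
                          seed: int = 0) -> list:
    """Randomly sample a small portion of the inference results; these samples
    are then annotated by hand so the true values can be compared with the
    model's output."""
    rng = random.Random(seed)
    k = max(1, int(len(inference_results) * rate))
    return rng.sample(inference_results, k)

def estimated_accuracy(samples: list) -> float:
    """Fraction of sampled results whose prediction matches the human-checked
    true value (each sample is a dict with "prediction" and "true_value")."""
    correct = sum(1 for s in samples if s["prediction"] == s["true_value"])
    return correct / len(samples)
```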
  • the series of processes described above can be executed by hardware or can be executed by software.
  • the functional configuration of FIG. 2 is merely an example and is not particularly limited. That is, it is sufficient that the machine learning management operation system has a function capable of executing the above-described series of processes as a whole.
  • The configuration of the machine learning management operation system is not particularly limited to the illustrated example. Which functional blocks are used for the implementation, and where each functional block is located, are likewise not particularly limited to the illustrated example.
  • The machine learning management operation system may include a server and a client. In this case, the functional blocks of the information processing apparatus 1 in FIG. 1 may be distributed between the server and the client.
  • one functional block may be constituted by hardware alone, software alone, or a combination thereof.
  • A recording medium containing such a program is constituted not only by a removable medium (not shown) that is distributed separately from the apparatus main body in order to provide the program to the user, but also by a recording medium that is provided to the user in a state of being pre-installed in the apparatus main body.
  • In this specification, the steps describing the program recorded on the recording medium include not only processing performed in time series along the described order, but also processing that is not necessarily performed in time series and is executed in parallel or individually.
  • the term “system” means an overall apparatus configured by a plurality of devices, a plurality of means, and the like.
  • An information processing system to which the present invention is applied includes:
  • data input means for inputting learning data (for example, the learning data input unit 101 in FIG. 2);
  • learning method input means for inputting a learning method (for example, the learning method input unit 102 in FIG. 2);
  • learning condition selection means for selecting a set of predetermined learning data input by the data input means and a predetermined learning method input by the learning method input means (for example, the learning condition selection unit 103 in FIG. 2);
  • learning means for executing learning using the data and/or the learning method selected by the learning condition selection means (for example, the learning unit 104 in FIG. 2);
  • adoption determination means for determining whether or not to adopt the result of the learning performed by the learning means (for example, the adoption determination unit 105 in FIG. 2);
  • inference input means for inputting an inference method (for example, the inference input unit 106 in FIG. 2);
  • model management means for combining the learning result determined to be adopted by the adoption determination means and the inference method input by the inference input means into a model version, and for managing one or more such model versions for each task (for example, the model management unit 107 in FIG. 2);
  • execution selection means, associated with the model management means, for selecting a model version managed by the model management means and an execution environment (for example, the execution selection unit 108 in FIG. 2); and
  • execution means for executing the model version selected by the execution selection means in the execution environment selected by the execution selection means (for example, the execution execution unit 109 in FIG. 2).
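  • To summarize the flow formed by the units 101 to 109, the following sketch strings the steps together in one class; the class and method names are hypothetical, and the callable stand-ins for learning methods, inference methods, and execution environments are simplifications of the publication's units.

```python
class MachineLearningOperationSystem:
    """Minimal sketch of the flow formed by the units 101-109 in FIG. 2
    (class and method names here are illustrative, not from the publication)."""

    def __init__(self):
        self.model_versions = {}  # task -> list of {"result": ..., "infer": ...}

    def run_learning(self, learning_data, learning_method):
        # 101/102/103: input learning data and a learning method, select a set.
        selected = (learning_data, learning_method)
        # 104: execute learning with the selected data and/or method.
        learning_result = learning_method(learning_data)
        return selected, learning_result

    def register_model(self, task, learning_result, inference_method, adopt=True):
        # 105: decide whether to adopt the learning result.
        if not adopt:
            return None
        # 106/107: combine the adopted result with an inference method into a
        # model version, managed per task.
        version = {"result": learning_result, "infer": inference_method}
        self.model_versions.setdefault(task, []).append(version)
        return version

    def execute(self, task, version_index, environment, request):
        # 108/109: select a managed model version and an execution environment,
        # then execute the selected version in that environment.
        version = self.model_versions[task][version_index]
        return environment(version["infer"], version["result"], request)

# Example wiring with trivial stand-ins:
# system = MachineLearningOperationSystem()
# _, result = system.run_learning([1, 2, 3], learning_method=sum)
# system.register_model("demo-task", result, inference_method=lambda r, x: r + x)
# print(system.execute("demo-task", 0, lambda infer, r, x: infer(r, x), 10))
```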
  • DESCRIPTION OF SYMBOLS: 1 ... information processing apparatus, 11 ... CPU, 14 ... GPU, 101 ... learning data input unit, 102 ... learning method input unit, 103 ... learning condition selection unit, 104 ... learning unit, 105 ... adoption determination unit, 106 ... inference input unit, 107 ... model management unit, 108 ... execution selection unit, 109 ... execution execution unit, ... DB, 300, 400 ... model DB, 500 ... execution environment data DB

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • Stored Programmes (AREA)

Abstract

The objective of the present invention is to simplify the management of a machine learning model. A learning data input unit (101) receives input of learning data. A learning method input unit (102) receives input of a learning method. A learning condition selection unit (103) selects a set comprising predetermined data and a predetermined learning method. A learning unit (104) performs learning using the data and/or learning method selected by the learning condition selection unit (103). An adoption determination unit (105) determines whether or not to adopt a learning result produced by the learning unit (104). An inference input unit (106) receives input of an inference method. A model management unit (107) combines an adopted result with an inference method to define a model version, and prepares and manages one or more model versions for each task. An execution selection unit (108) linked to the model management unit (107) selects an execution environment and a model version managed by the model management unit (107), and an execution execution unit (109) executes the model version in the execution environment.
PCT/JP2019/006319 2018-02-20 2019-02-20 Système de traitement d'informations WO2019163823A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019542650A JP6689507B2 (ja) 2018-02-20 2019-02-20 情報処理システム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-028235 2018-02-20
JP2018028235 2018-02-20

Publications (1)

Publication Number Publication Date
WO2019163823A1 true WO2019163823A1 (fr) 2019-08-29

Family

ID=67688294

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/006319 WO2019163823A1 (fr) 2018-02-20 2019-02-20 Système de traitement d'informations

Country Status (2)

Country Link
JP (2) JP6689507B2 (fr)
WO (1) WO2019163823A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022210017A1 (fr) * 2021-03-31 2022-10-06 日本電気株式会社 Système d'analyse par ia, procédé de calcul de frais d'utilisation, et support d'enregistrement

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018010475A (ja) * 2016-07-13 2018-01-18 富士通株式会社 機械学習管理プログラム、機械学習管理装置および機械学習管理方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"Operate machine learning models in production environment with TensorFlow Serving", FREEE DEVELOPERS BLOG, 23 December 2017 (2017-12-23), XP055633108, Retrieved from the Internet <URL:https://developers.freee.co.jp/entry/serve-ml-model-by-tensorflow-serving> [retrieved on 20190419] *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021060940A (ja) * 2019-10-09 2021-04-15 株式会社日立製作所 運用支援システム及び方法
KR20230101855A (ko) 2020-11-10 2023-07-06 도쿄엘렉트론가부시키가이샤 모델 관리 시스템, 모델 관리 방법 및 모델 관리 프로그램
KR20240049620A (ko) 2021-08-31 2024-04-16 도쿄엘렉트론가부시키가이샤 정보 처리 방법, 정보 처리 장치, 및 기판 처리 시스템
KR20240058131A (ko) 2021-08-31 2024-05-03 도쿄엘렉트론가부시키가이샤 정보 처리 방법, 정보 처리 장치, 및 정보 처리 시스템
JP7285417B1 (ja) 2022-03-04 2023-06-02 株式会社エクサウィザーズ 情報処理方法、情報処理システム及びプログラム
WO2023166803A1 (fr) * 2022-03-04 2023-09-07 株式会社エクサウィザーズ Procédé de traitement d'informations, système de traitement d'informations et programme
JP2023129109A (ja) * 2022-03-04 2023-09-14 株式会社エクサウィザーズ 情報処理方法、情報処理システム及びプログラム

Also Published As

Publication number Publication date
JP6689507B2 (ja) 2020-04-28
JP7281427B2 (ja) 2023-05-25
JPWO2019163823A1 (ja) 2020-04-09
JP2020113319A (ja) 2020-07-27

Similar Documents

Publication Publication Date Title
JP6689507B2 (ja) 情報処理システム
US9740479B2 (en) Complexity reduction of user tasks
US20110185315A1 (en) Simplified user controls for authoring workflows
Asadi et al. The effects of visualization and interaction techniques on feature model configuration
KR101312446B1 (ko) 사용자의 행위 로그를 이용한 모바일 어플리케이션의 사용성 분석 장치 및 방법
US20080270197A1 (en) Project status calculation algorithm
JP6341531B2 (ja) 組織改善活動支援装置、組織改善活動支援方法および組織改善活動支援プログラム
JP2018067286A (ja) モデル妥当性確認システムおよび方法
EP2608031A1 (fr) Cartographie de gestion de projet
WO2021049365A1 (fr) Dispositif de traitement d&#39;informations, procédé de traitement d&#39;informations et programme
Vasilecas et al. Analysis of using resources in business process modeling and simulation
Ahrens et al. Improving requirements specification use by transferring attention with eye tracking data
JP6395852B2 (ja) 業務状況管理システム、及び業務状況管理方法
JP2006235872A (ja) プロジェクト管理装置
US20140089224A1 (en) Modeling an enterprise
Jespersen Dashboard design guidelines for improved evidence based decision making in public health in developing countries
JP6299579B2 (ja) プログラム、情報処理装置、評価方法
Janes Non-distracting, continuous collection of software development process data
Weber et al. Measuring and explaining cognitive load during design activities: A fine-grained approach
JP5098632B2 (ja) 活動管理装置、活動管理システムおよび活動管理プログラム
Kedziora et al. The Cascade Analysis Tool: software to analyze and optimize care cascades
Sellier et al. Managing requirements inter-dependency for software product line derivation
Malik Appsheet vs React Native: evaluation of performance and development of Android Apps
US20230021249A1 (en) Dynamically generating platform resources by guiding user responses
KR102444933B1 (ko) 전자 장치 및 그 제어 방법

Legal Events

Date Code Title Description
ENP: Entry into the national phase. Ref document number: 2019542650; Country of ref document: JP; Kind code of ref document: A
121: Ep: the epo has been informed by wipo that ep was designated in this application. Ref document number: 19757369; Country of ref document: EP; Kind code of ref document: A1
NENP: Non-entry into the national phase. Ref country code: DE
122: Ep: pct application non-entry in european phase. Ref document number: 19757369; Country of ref document: EP; Kind code of ref document: A1