CN117130780A - Service execution method, device, electronic equipment and computer readable storage medium - Google Patents

Service execution method, device, electronic equipment and computer readable storage medium

Info

Publication number
CN117130780A
CN117130780A (application number CN202311063954.2A)
Authority
CN
China
Prior art keywords
model
target
service
execution
identification
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311063954.2A
Other languages
Chinese (zh)
Inventor
滕安琪 (Teng Anqi)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Innovation Qizhi Zhejiang Technology Co ltd
Original Assignee
Innovation Qizhi Zhejiang Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Innovation Qizhi Zhejiang Technology Co ltd
Priority to CN202311063954.2A
Publication of CN117130780A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 Multiprogramming arrangements
    • G06F 9/50 Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F 9/5005 Allocation of resources, e.g. of the central processing unit [CPU] to service a request
    • G06F 9/5027 Allocation of resources, e.g. of the central processing unit [CPU] to service a request the resource being a machine, e.g. CPUs, Servers, Terminals
    • G06F 9/5038 Allocation of resources, e.g. of the central processing unit [CPU] to service a request the resource being a machine, e.g. CPUs, Servers, Terminals considering the execution order of a plurality of tasks, e.g. taking priority or time dependency constraints into consideration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/22 Indexing; Data structures therefor; Storage structures
    • G06F 16/2282 Tablespace storage structures; Management thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/24 Querying
    • G06F 16/245 Query processing
    • G06F 16/2458 Special types of queries, e.g. statistical queries, fuzzy queries or distributed queries
    • G06F 16/2462 Approximate or statistical queries
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 Administration; Management
    • G06Q 10/10 Office automation; Time management
    • G06Q 10/103 Workflow collaboration or project management
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • Strategic Management (AREA)
  • Data Mining & Analysis (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Databases & Information Systems (AREA)
  • Probability & Statistics with Applications (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Computational Linguistics (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The application provides a service execution method, an apparatus, an electronic device and a computer-readable storage medium. The method includes: acquiring target service information, where the target service information includes a target service identifier; searching a database for a target model orchestration flow corresponding to the target service identifier, where the database stores a plurality of model orchestration flows, each model orchestration flow corresponds to one service identifier, and each model orchestration flow includes a plurality of recognition models required for service execution, an execution order of those recognition models, and an execution action of each recognition model; and sequentially invoking a plurality of target recognition models to perform their corresponding execution actions, according to the target execution order of the recognition models in the found target model orchestration flow, so as to execute the target service. This improves service execution efficiency and reduces resource waste for base models shared by different services.

Description

Service execution method, device, electronic equipment and computer readable storage medium
Technical Field
The present application relates to the field of artificial intelligence technology, and in particular, to a service execution method, apparatus, electronic device, and computer readable storage medium.
Background
As artificial intelligence is increasingly used in practice, the number of models that need to be deployed in a production system is also increasing.
When an application service scenario requires multiple models, several recognition models must be invoked to take part in executing that scenario. At present, however, the recognition models to be invoked are generally selected manually, which makes service processing inefficient.
Disclosure of Invention
Embodiments of the application aim to provide a service execution method, an apparatus, an electronic device and a computer-readable storage medium, so as to solve the problem of low service processing efficiency caused by manually selecting the recognition models that need to be invoked for a given service scenario.
In a first aspect, the present application provides a service execution method, including: acquiring target service information, where the target service information includes a target service identifier; searching a database for a target model orchestration flow corresponding to the target service identifier, where the database stores a plurality of model orchestration flows, each model orchestration flow corresponds to one service identifier, and each model orchestration flow includes a plurality of recognition models required for service execution, an execution order of those recognition models, and an execution action of each recognition model; and sequentially invoking a plurality of target recognition models to perform their corresponding execution actions, according to the target execution order of the recognition models in the found target model orchestration flow, so as to execute the target service.
In this service execution method, a mapping between each service identifier and its corresponding model orchestration flow is established in advance and stored in the database for unified management. When a service is to be executed, the corresponding target model orchestration flow can be found from the target service identifier of that service, and a plurality of target recognition models are then invoked in sequence, according to the target execution order of the recognition models in the found flow, to perform their corresponding execution actions. This improves service execution efficiency and reduces resource waste for base models shared by different services.
In an optional implementation of the first aspect, before acquiring the target service information, the method further includes: obtaining model orchestration flow information input by a user, where the model orchestration flow information includes a plurality of recognition models, an execution order of those recognition models, and an execution action of each recognition model; and acquiring a service identifier, establishing a mapping relationship between the service identifier and the model orchestration flow information, and storing the mapping relationship in the database.
In this implementation, the user enters the model orchestration flow information directly and a corresponding service identifier is configured for it, so that the service content and its model orchestration flow can be deployed quickly.
In an optional implementation of the first aspect, before acquiring the target service information, the method further includes: acquiring one or more pieces of service description text information input by a user; generating a corresponding model orchestration flow from each piece of service description text information; acquiring a service identifier corresponding to each piece of service description text information; establishing a mapping table between the service identifier of each piece of service description text information and the corresponding model orchestration flow; and storing the mapping table in the database.
In this implementation, the model orchestration flow is generated automatically from text information by a model orchestration flow recognition model, which greatly lowers the threshold for orchestration deployment, and the generated flows become more accurate and efficient as user feedback accumulates.
In an optional implementation of the first aspect, generating a corresponding model orchestration flow from each piece of service description text information includes: inputting each piece of service description text information into a trained model orchestration flow recognition model to obtain the model orchestration flow that the model orchestration flow recognition model outputs for that piece of text.
In an optional implementation of the first aspect, the method further includes: acquiring a plurality of service description text samples, where each service description text sample includes service description text sample information and the model orchestration flow corresponding to that sample information; and training the model orchestration flow recognition model on the plurality of service description text samples to obtain the trained model orchestration flow recognition model.
In an optional implementation of the first aspect, after sequentially invoking the plurality of target recognition models to perform their corresponding execution actions according to the target execution order of the recognition models in the found target model orchestration flow, the method further includes: monitoring and counting the call volume of each stored recognition model. Based on this call volume data, the number of instances of each recognition model can be scaled down or up, achieving elastic scaling so that limited machine resources can host more model instances.
In an optional implementation of the first aspect, after sequentially invoking the plurality of target recognition models to perform their corresponding execution actions according to the target execution order of the recognition models in the found target model orchestration flow, the method further includes: acquiring the stored latest invocation time of each recognition model; calculating an idle duration from the latest invocation time and the current time of each recognition model; and, if there is a recognition model whose idle duration exceeds a preset duration, releasing the system resources occupied by that recognition model. By monitoring and counting the service calls of the different models, more model instances can be deployed on limited machine resources, rarely used models no longer need to be scaled down or taken offline manually, frequently used models no longer need to be scaled up or brought online manually, and the operation and maintenance cost of allocating resources across different service peaks is greatly reduced.
In a second aspect, the present application provides a service execution apparatus, including an acquisition module, a lookup module and an execution module. The acquisition module is configured to acquire target service information, where the target service information includes a target service identifier. The lookup module is configured to search a database for a target model orchestration flow corresponding to the target service identifier, where the database stores a plurality of model orchestration flows, each model orchestration flow corresponds to one service identifier, and each model orchestration flow includes a plurality of recognition models required for service execution, an execution order of those recognition models, and an execution action of each recognition model. The execution module is configured to sequentially invoke a plurality of target recognition models to perform their corresponding execution actions, according to the target execution order of the recognition models in the found target model orchestration flow, so as to execute the target service.
In this service execution apparatus, a mapping between each service identifier and its corresponding model orchestration flow is established in advance and stored in the database for unified management. When a service is to be executed, the corresponding target model orchestration flow can be found from the target service identifier of that service, and a plurality of target recognition models are then invoked in sequence, according to the target execution order of the recognition models in the found flow, to perform their corresponding execution actions. This improves service execution efficiency and reduces resource waste for base models shared by different services.
In an optional implementation of the second aspect, the acquisition module is further configured to obtain model orchestration flow information input by a user, where the model orchestration flow information includes a plurality of recognition models, an execution order of those recognition models, and an execution action of each recognition model, and to acquire a service identifier. The apparatus further includes a storage module configured to establish a mapping relationship between the service identifier and the model orchestration flow information and store the mapping relationship in the database.
In an optional implementation of the second aspect, the acquisition module is further configured to acquire one or more pieces of service description text information input by a user. The apparatus further includes a generation module configured to generate a corresponding model orchestration flow from each piece of service description text information; the acquisition module is further configured to acquire the service identifier corresponding to each piece of service description text information; an establishment module is configured to establish a mapping table between the service identifier of each piece of service description text information and the corresponding model orchestration flow; and the storage module is further configured to store the mapping table in the database.
In an optional implementation of the second aspect, the generation module is specifically configured to input each piece of service description text information into a trained model orchestration flow recognition model and obtain the model orchestration flow that the model orchestration flow recognition model outputs for that piece of text.
In an optional implementation of the second aspect, the acquisition module is further configured to acquire a plurality of service description text samples, where each service description text sample includes service description text sample information and the model orchestration flow corresponding to that sample information. The apparatus further includes a training module configured to train the model orchestration flow recognition model on the plurality of service description text samples to obtain a trained model orchestration flow recognition model.
In an optional implementation of the second aspect, the apparatus further includes a monitoring and statistics module configured to monitor and count the call volume of each stored recognition model.
In an optional implementation of the second aspect, the acquisition module is further configured to acquire the stored latest invocation time of each recognition model. The apparatus further includes a calculation module configured to calculate an idle duration from the latest invocation time and the current time of each recognition model, and a resource release module configured to release, if there is a recognition model whose idle duration exceeds a preset duration, the system resources occupied by that recognition model.
In a third aspect, the present application provides an electronic device comprising a memory storing a computer program and a processor executing the computer program to perform the method of any of the alternative implementations of the first aspect.
In a fourth aspect, the present application provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the method of any of the alternative implementations of the first aspect.
In a fifth aspect, the present application provides a computer program product which, when run on a computer, causes the computer to perform the method of any of the alternative implementations of the first aspect.
The foregoing is only an overview of the technical solutions of the present application. Specific embodiments are described below so that the technical means of the application can be understood more clearly and implemented according to the contents of the specification, and so that the above and other objects, features and advantages of the application become more apparent.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings show only some embodiments of the present application and should therefore not be regarded as limiting its scope; other related drawings can be derived from them by a person of ordinary skill in the art without inventive effort.
Fig. 1 is a first schematic flowchart of a service execution method according to an embodiment of the present application;
Fig. 2 is a second schematic flowchart of a service execution method according to an embodiment of the present application;
Fig. 3 is a third schematic flowchart of a service execution method according to an embodiment of the present application;
Fig. 4 is a fourth schematic flowchart of a service execution method according to an embodiment of the present application;
Fig. 5 is a fifth schematic flowchart of a service execution method according to an embodiment of the present application;
Fig. 6 is a schematic structural diagram of a service execution apparatus according to an embodiment of the present application;
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Reference numerals: 600-acquisition module; 610-lookup module; 620-execution module; 630-storage module; 640-generation module; 650-establishment module; 660-training module; 670-monitoring and statistics module; 680-calculation module; 690-resource release module; 7-electronic device; 701-processor; 702-memory; 703-communication bus.
Detailed Description
Embodiments of the technical scheme of the present application will be described in detail below with reference to the accompanying drawings. The following examples are only for more clearly illustrating the technical aspects of the present application, and thus are merely examples, and are not intended to limit the scope of the present application.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs; the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application; the terms "comprising" and "having" and any variations thereof in the description of the application and the claims and the description of the drawings above are intended to cover a non-exclusive inclusion.
In the description of embodiments of the present application, the technical terms "first," "second," and the like are used merely to distinguish between different objects and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated, a particular order or a primary or secondary relationship. In the description of the embodiments of the present application, the meaning of "plurality" is two or more unless explicitly defined otherwise.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
In the description of the embodiments of the present application, the term "and/or" is merely an association relationship describing an association object, and indicates that three relationships may exist, for example, a and/or B may indicate: a exists alone, A and B exist together, and B exists alone. In addition, the character "/" herein generally indicates that the front and rear associated objects are an "or" relationship.
In the description of the embodiments of the present application, the term "plurality" means two or more (including two), and similarly, "plural sets" means two or more (including two), and "plural sheets" means two or more (including two).
In the description of the embodiments of the present application, the orientation or positional relationship indicated by the technical terms "center", "longitudinal", "transverse", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", "axial", "radial", "circumferential", etc. are based on the orientation or positional relationship shown in the drawings, and are merely for convenience of description and simplification of the description, and do not indicate or imply that the apparatus or element referred to must have a specific orientation, be configured and operated in a specific orientation, and therefore should not be construed as limiting the embodiments of the present application.
In the description of the embodiments of the present application, unless explicitly specified and limited otherwise, the terms "mounted," "connected," "secured" and the like should be construed broadly; for example, a connection may be a fixed connection, a detachable connection or an integral formation; it may be a mechanical connection or an electrical connection; it may be a direct connection or an indirect connection through an intermediate medium, and it may be an internal communication between two elements or an interaction between two elements. The specific meanings of the above terms in the embodiments of the present application can be understood by those of ordinary skill in the art according to the specific circumstances.
As artificial intelligence is used more and more widely in practice, the number of models that need to be deployed in production systems keeps growing. Machine learning applications often train many models to provide a personalized experience: a news classification service may train a custom model for each news category, and a recommendation service may train on each user's usage history to personalize its suggestions. A main reason for training so many separate models is to protect the privacy and security of each user's model and data.
When an application service scenario requires multiple models, several recognition models must be invoked to take part in executing that scenario. At present, however, the recognition models to be invoked are generally selected manually, which makes service processing inefficient.
In this regard, the present application provides a service execution method in which a mapping table between service identifiers and their corresponding model orchestration flows is created in advance and stored for unified management. When a service is executed, the corresponding model orchestration flow can be found from the service identifier of that service, and a plurality of target recognition models are then invoked in sequence, according to the target execution order of the recognition models in the found target model orchestration flow, to perform their corresponding execution actions, so as to execute the target service. This improves service execution efficiency and reduces resource waste for base models shared by different services. In addition, the scheme can automatically generate a model orchestration flow from text information through a model orchestration flow recognition model, which greatly lowers the threshold for orchestration deployment, and the generated flows become more accurate and efficient as user feedback accumulates. Meanwhile, the scheme monitors and counts the service calls of the different models through a monitoring service, so that more model instances can be deployed on limited machine resources, rarely used models need not be scaled down or taken offline manually, frequently used models need not be scaled up or brought online manually, and the operation and maintenance cost of allocating resources across different service peaks is greatly reduced.
Specifically, the present application first provides a service execution method that can be applied to a computing device, including but not limited to a computer, a server, a chip or a processor. As shown in fig. 1, the method can be implemented through the following steps:
step S100: and obtaining target service information.
Step S110: and searching a target model arranging process corresponding to the target service identifier in the database according to the target service identifier.
Step S120: and according to the searched target execution sequence of the identification models in the target model arrangement flow, sequentially calling a plurality of target identification models to execute corresponding execution actions so as to execute target business.
For step S100, the target service information may include a target service identifier, where different service identifiers correspond to different service execution contents, for example, the service identifier ID1 has a mapping relationship with the service content H1, the service identifier ID2 has a mapping relationship with the service content H2, for example, the service content H1 may be specifically "control the robot to take out the commodity C from the location a to the shelf B through the shortest path", and the service content H2 may be specifically "control the robot to reach the residence of the user D from the location a through the shortest path, and identify the user D.
The target service information can be obtained by inputting the target service identifier at the page end by the user, for example, the user inputs the service identifier ID1 described above.
After the target service information input by the user is obtained, the scheme can search the target model arrangement flow corresponding to the target service identifier in the database according to the target service identifier. The database stores a plurality of modeling flows, and each modeling flow has a mapping relation with one service identifier, so that after the target service information input by the user is obtained, the corresponding target modeling flow can be searched according to the target service identifier input by the user. Each modeling flow contains a plurality of recognition models required to execute the business, the order in which the plurality of recognition models are executed, and the actions each recognition model performs when executing the business.
For example, assuming that the target service identifier is the service identifier ID1 described above, the corresponding service content H1 is "control the robot to take out the commodity C from the place a to the shelf B through the shortest path", and the corresponding target modeling process may include a path planning model and a commodity identification model; the execution order of the path planning model, the shelf identification model and the commodity identification model, and the execution actions of the path planning model and the commodity identification model. For example, the execution sequence is to execute a path planning model first and then execute a commodity identification model, the path planning model executes an action of performing shortest path planning based on the location a and the shelf B, and the commodity identification model executes an action of identifying the commodity C based on the photo. Each recognition model can be obtained through training in advance according to the corresponding sample.
Under the condition that the target model arrangement flow corresponding to the target service identifier is obtained, the method sequentially calls a plurality of target identification models to execute corresponding execution actions according to the target execution sequence of the identification models in the searched target model arrangement flow so as to execute the target service. For example, according to the foregoing example, the present solution may first perform path planning according to the location a and the shelf B by using the path planning model to control the movement of the robot, and after the movement is completed, identify the commodity photo on the shelf B photographed by the robot by using the commodity identification model, identify the commodity C, and further control the robot to take out the commodity C, thereby executing the service content H1 corresponding to the service identifier ID1.
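For illustration only (this sketch is not part of the original disclosure), the lookup-and-execute logic of steps S100 to S120 might be organized as in the following Python fragment. The flow record layout, the model registry and names such as MODEL_REGISTRY and execute_service are assumptions made for the example, not details taken from the patent.

    # Hypothetical illustration of steps S100-S120; field and function names are assumptions.
    MODEL_REGISTRY = {
        "path_planning": lambda action, ctx: ctx.update(path=f"shortest path {action['from']} -> {action['to']}"),
        "commodity_recognition": lambda action, ctx: ctx.update(found=action["target"] in ctx.get("photo", "")),
    }

    # One model orchestration flow per service identifier, as stored in the database (step S110).
    ORCHESTRATION_FLOWS = {
        "ID1": [  # service content H1: go from location A to shelf B and take out commodity C
            {"model": "path_planning", "action": {"from": "location A", "to": "shelf B"}},
            {"model": "commodity_recognition", "action": {"target": "commodity C"}},
        ],
    }

    def execute_service(target_service_id: str) -> dict:
        """Steps S100-S120: look up the flow for the service identifier and invoke its models in order."""
        flow = ORCHESTRATION_FLOWS.get(target_service_id)            # step S110: database lookup
        if flow is None:
            raise KeyError(f"no model orchestration flow stored for {target_service_id}")
        context = {"photo": "shelf photo containing commodity C"}    # stand-in for runtime inputs
        for step in flow:                                            # step S120: ordered invocation
            MODEL_REGISTRY[step["model"]](step["action"], context)
        return context

    print(execute_service("ID1"))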
According to this service execution method, a mapping between each service identifier and its corresponding model orchestration flow is established in advance and stored in the database for unified management. When a service is to be executed, the corresponding target model orchestration flow can be found from the target service identifier of that service, and a plurality of target recognition models are then invoked in sequence, according to the target execution order of the recognition models in the found flow, to perform their corresponding execution actions. This improves service execution efficiency and reduces resource waste for base models shared by different services.
In an optional implementation of this embodiment, as one possibility, the service identifiers stored in the database and their corresponding model orchestration flows can be deployed in the following way, as shown in fig. 2:
Step S200: obtain the model orchestration flow information input by the user.
Step S210: acquire a service identifier, establish a mapping relationship between the service identifier and the model orchestration flow information, and store the mapping relationship in the database.
In this implementation, the model orchestration flow information is entered by the user. For example, a worker may analyze the service content "control the robot to take commodity C from location A to shelf B through the shortest path" and determine that the corresponding model orchestration flow information includes a path planning model and a commodity recognition model, their execution order, and their execution actions; the user then enters this model orchestration flow information.
Then, once the model orchestration flow information is obtained, a corresponding service identifier is configured for the information entered by the user, a mapping relationship is established between the configured service identifier and that information, and the service identifier and the model orchestration flow information are finally stored in the database together with the mapping relationship. Note that when the user enters several pieces of model orchestration flow information, different pieces are configured with different service identifiers.
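A minimal sketch of steps S200 and S210 is given below, assuming a SQLite table as the database. The table schema, the function store_orchestration_flow and the use of a generated UUID as the service identifier are illustrative assumptions, not details from the patent.

    # Hypothetical sketch of steps S200-S210 using SQLite; the table schema is an assumption.
    import json
    import sqlite3
    import uuid

    def store_orchestration_flow(conn, flow_info):
        """Step S210: configure a service identifier for user-supplied flow info and persist the mapping."""
        conn.execute(
            "CREATE TABLE IF NOT EXISTS orchestration_flows (service_id TEXT PRIMARY KEY, flow_json TEXT)"
        )
        service_id = uuid.uuid4().hex  # each piece of flow information gets its own service identifier
        conn.execute(
            "INSERT INTO orchestration_flows (service_id, flow_json) VALUES (?, ?)",
            (service_id, json.dumps(flow_info)),
        )
        conn.commit()
        return service_id

    conn = sqlite3.connect(":memory:")
    flow_info = [  # step S200: model orchestration flow information entered by the user
        {"model": "path_planning", "action": {"from": "location A", "to": "shelf B"}},
        {"model": "commodity_recognition", "action": {"target": "commodity C"}},
    ]
    print(store_orchestration_flow(conn, flow_info))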
In this implementation, the user enters the model orchestration flow information directly and a corresponding service identifier is configured for it, so that the service content and its model orchestration flow can be deployed quickly.
In an optional implementation of this embodiment, as another possibility, the service identifiers stored in the database and their corresponding model orchestration flows can also be deployed in the following way, as shown in fig. 3:
step S300: one or more business description text information input by a user is obtained.
Step S310: and generating a corresponding model arrangement flow according to each service description text message.
Step S320: and acquiring a service identifier corresponding to each service description text message.
Step S330: and establishing a mapping table between the service identification of each service description text message and the corresponding modeling process.
Step S340: the mapping table is stored in a database.
In the above embodiment, the present solution may obtain one or more service description text information input by a user, where the service description text information may be text information or voice information describing the current service execution content, and so on. For example, the service description text information input by the user may be "control the robot to take out the commodity C from the place a to the shelf B through the shortest path"; or "control robot from site a to user D residence through shortest path and identify user D", etc.
After obtaining one or more service description text information input by a user, the scheme can generate a corresponding model arrangement flow according to each service description text information. Specifically, as a possible implementation manner, the scheme can input each business description text information into a trained modeling process identification model to obtain a modeling process corresponding to each business description text information output by the modeling process identification model. For example, the present solution may input the service description text information "control the robot to take out the commodity C from the place a through the shortest path to the shelf B" into the trained model programming process identification model, so as to obtain the model programming process information output by the model programming process identification model, where the model programming process information includes the path planning model and the commodity identification model; the execution order of the path planning model, the shelf identification model and the commodity identification model, and the execution actions of the path planning model and the commodity identification model.
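The interface between a piece of service description text and the generated model orchestration flow could look like the following sketch. The keyword rules merely stand in for the trained model orchestration flow recognition model described above; they are placeholders chosen for the example and are not the patent's method.

    # Illustrative stand-in for step S310; a trained text model would replace these keyword rules.
    def generate_orchestration_flow(description):
        flow = []
        if "shortest path" in description:
            flow.append({"model": "path_planning", "action": "plan the shortest path between the named locations"})
        if "commodity" in description:
            flow.append({"model": "commodity_recognition", "action": "recognize the named commodity in a photo"})
        if "identify user" in description or "identify the user" in description:
            flow.append({"model": "person_recognition", "action": "recognize the named user"})
        return flow

    print(generate_orchestration_flow(
        "control the robot to take commodity C from location A to shelf B through the shortest path"))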
The model orchestration flow recognition model is obtained through training in advance, for example as follows: acquire a plurality of service description text samples, where each sample includes service description text sample information and the model orchestration flow corresponding to that sample information; then train the model orchestration flow recognition model on these samples to obtain the trained model orchestration flow recognition model.
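A deliberately simplified training sketch follows, assuming scikit-learn is available. It reduces the task to predicting which recognition models a description needs, whereas the model described above would also have to produce the execution order and execution actions; the sample texts and model names are illustrative assumptions.

    # Simplified, hypothetical training sketch for the orchestration flow recognition model.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.multiclass import OneVsRestClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import MultiLabelBinarizer

    samples = [  # (service description text sample, recognition models in its orchestration flow)
        ("move the robot to shelf B through the shortest path and pick up commodity C",
         ["path_planning", "commodity_recognition"]),
        ("go from location A to the residence of user D through the shortest path and identify user D",
         ["path_planning", "person_recognition"]),
        ("identify user D in the photo taken at the door",
         ["person_recognition"]),
    ]
    texts, flows = zip(*samples)
    binarizer = MultiLabelBinarizer()
    labels = binarizer.fit_transform(flows)
    classifier = make_pipeline(TfidfVectorizer(), OneVsRestClassifier(LogisticRegression(max_iter=1000)))
    classifier.fit(texts, labels)                   # training on the service description text samples
    predicted = classifier.predict(["take commodity C from shelf B through the shortest path"])
    print(binarizer.inverse_transform(predicted))   # recognition models predicted for a new description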
After the model orchestration flow corresponding to each piece of service description text information is obtained from the model orchestration flow recognition model, a mapping table between the service identifier of each piece of text and its corresponding model orchestration flow is established and stored in the database, so that service identifiers and their model orchestration flows can be deployed in advance.
In this implementation, model orchestration flows are generated automatically from text information by the model orchestration flow recognition model, which greatly lowers the threshold for orchestration deployment, and the generated flows become more accurate and efficient as user feedback accumulates.
In an optional implementation of this embodiment, the call volume of the deployed recognition models can be monitored as shown in fig. 4, through the following step:
Step S400: monitor and count the call volume of each stored recognition model.
In this implementation, the call volume of each stored recognition model is monitored and counted, and the number of instances of each recognition model can then be scaled down or up based on its call volume data, achieving elastic scaling. In addition, if there is a recognition model whose call volume is lower than a preset call volume, the system resources occupied by that recognition model can be released.
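A minimal sketch of the call-volume statistics of step S400 and an elastic scaling decision derived from them; the counter, the threshold of 100 calls per instance and the function names are assumptions made for illustration.

    # Hypothetical sketch of step S400: count calls per recognition model and derive an instance target.
    from collections import Counter

    call_counter = Counter()  # call volume per recognition model

    def record_call(model_name: str) -> None:
        call_counter[model_name] += 1  # invoked by the execution logic on every model call

    def desired_replicas(model_name: str, calls_per_replica: int = 100, max_replicas: int = 5) -> int:
        """Derive an instance count from the observed call volume (elastic scaling)."""
        calls = call_counter[model_name]
        return max(1, min(max_replicas, -(-calls // calls_per_replica)))  # ceiling division

    for _ in range(230):
        record_call("commodity_recognition")
    print(desired_replicas("commodity_recognition"))  # -> 3 instances for 230 recorded calls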
In an optional implementation of this embodiment, as shown in fig. 5, reasonable resource allocation can further be achieved through the following steps:
Step S500: obtain the stored latest invocation time of each recognition model.
Step S510: calculate an idle duration from the latest invocation time and the current time of each recognition model.
Step S520: if there is a recognition model whose idle duration exceeds a preset duration, release the system resources occupied by that recognition model.
In this scheme, the invocation times of the recognition models are monitored and the latest invocation time of each recognition model, i.e. the time at which it was most recently called, is obtained. The idle duration, which indicates how long the recognition model has not been called, is calculated from the latest invocation time and the current time. For a recognition model service that has not been called for a long time, the system resources it occupies can be released, and the model is restarted the next time it is called. The system resources may be, for example, graphics computing resources.
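A minimal sketch of steps S500 to S520, assuming an in-memory record of latest invocation times; the one-hour preset duration and the function release_idle_models are illustrative assumptions.

    # Hypothetical sketch of steps S500-S520: release models that have been idle too long.
    import time
    from typing import List, Optional

    last_call_time = {"commodity_recognition": time.time() - 7200}  # latest invocation time per model
    IDLE_LIMIT_SECONDS = 3600  # preset duration; the concrete value is an assumption

    def release_idle_models(now: Optional[float] = None) -> List[str]:
        """Find models whose idle duration exceeds the preset duration and release them."""
        now = time.time() if now is None else now
        released = []
        for model_name, last_called in list(last_call_time.items()):
            if now - last_called > IDLE_LIMIT_SECONDS:       # idle duration = current time - latest call
                # A real deployment would free the graphics/compute resources of the model instance
                # here; the instance would then be restarted on its next call.
                released.append(model_name)
                del last_call_time[model_name]
        return released

    print(release_idle_models())  # -> ['commodity_recognition']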
In this implementation, the monitoring service monitors and counts the service calls of the different models, so that more model instances can be deployed on limited machine resources; rarely used models no longer need to be scaled down or taken offline manually, frequently used models no longer need to be scaled up or brought online manually, and the operation and maintenance cost of allocating resources across different service peaks is greatly reduced.
Fig. 6 shows a schematic block diagram of a service execution apparatus provided by the present application. It should be understood that the apparatus corresponds to the method embodiments of figs. 1 to 5 and can perform the steps of the foregoing method; for its specific functions, reference is made to the description above, and detailed descriptions are omitted here where appropriate to avoid redundancy. The apparatus includes at least one software functional module that can be stored in a memory in the form of software or firmware or solidified in the operating system (OS) of the apparatus. Specifically, the apparatus includes an acquisition module 600, a lookup module 610 and an execution module 620. The acquisition module 600 is configured to acquire target service information, where the target service information includes a target service identifier. The lookup module 610 is configured to search a database for the target model orchestration flow corresponding to the target service identifier, where the database stores a plurality of model orchestration flows, each model orchestration flow corresponds to one service identifier, and each model orchestration flow includes a plurality of recognition models required for service execution, an execution order of those recognition models, and an execution action of each recognition model. The execution module 620 is configured to sequentially invoke a plurality of target recognition models to perform their corresponding execution actions, according to the target execution order of the recognition models in the found target model orchestration flow, so as to execute the target service.
In this service execution apparatus, a mapping between each service identifier and its corresponding model orchestration flow is established in advance and stored in the database for unified management. When a service is to be executed, the corresponding target model orchestration flow can be found from the target service identifier of that service, and a plurality of target recognition models are then invoked in sequence, according to the target execution order of the recognition models in the found flow, to perform their corresponding execution actions. This improves service execution efficiency and reduces resource waste for base models shared by different services.
In an optional implementation of this embodiment, the acquisition module 600 is further configured to obtain model orchestration flow information input by a user, where the model orchestration flow information includes a plurality of recognition models, an execution order of those recognition models, and an execution action of each recognition model, and to acquire a service identifier. The apparatus further includes a storage module 630 configured to establish a mapping relationship between the service identifier and the model orchestration flow information and store the mapping relationship in the database.
In an optional implementation of this embodiment, the acquisition module 600 is further configured to acquire one or more pieces of service description text information input by a user. The apparatus further includes a generation module 640 configured to generate a corresponding model orchestration flow from each piece of service description text information; the acquisition module 600 is further configured to acquire the service identifier corresponding to each piece of service description text information; an establishment module 650 is configured to establish a mapping table between the service identifier of each piece of service description text information and the corresponding model orchestration flow; and the storage module 630 is further configured to store the mapping table in the database.
In an optional implementation of this embodiment, the generation module 640 is specifically configured to input each piece of service description text information into a trained model orchestration flow recognition model and obtain the model orchestration flow that the model orchestration flow recognition model outputs for that piece of text.
In an optional implementation of this embodiment, the acquisition module 600 is further configured to acquire a plurality of service description text samples, where each service description text sample includes service description text sample information and the model orchestration flow corresponding to that sample information. The apparatus further includes a training module 660 configured to train the model orchestration flow recognition model on the plurality of service description text samples to obtain a trained model orchestration flow recognition model.
In an optional implementation of this embodiment, the apparatus further includes a monitoring and statistics module 670 configured to monitor and count the call volume of each stored recognition model.
In an optional implementation of this embodiment, the acquisition module 600 is further configured to acquire the stored latest invocation time of each recognition model. The apparatus further includes a calculation module 680 configured to calculate an idle duration from the latest invocation time and the current time of each recognition model, and a resource release module 690 configured to release, if there is a recognition model whose idle duration exceeds a preset duration, the system resources occupied by that recognition model.
According to some embodiments of the present application, as shown in fig. 7, the present application provides an electronic device 7 including a processor 701 and a memory 702. The processor 701 and the memory 702 are interconnected and communicate with each other through a communication bus 703 and/or another form of connection mechanism (not shown). The memory 702 stores a computer program executable by the processor 701; when the computing device runs, the processor 701 executes the computer program to perform the method of any optional implementation described above, for example steps S100 to S120: acquire target service information, search the database for the target model orchestration flow corresponding to the target service identifier, and sequentially invoke a plurality of target recognition models to perform their corresponding execution actions, according to the target execution order of the recognition models in the found target model orchestration flow, so as to execute the target service.
The present application provides a computer readable storage medium having stored thereon a computer program which when executed by a processor performs a method according to any of the preceding alternative implementations.
The storage medium may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, magnetic disk or optical disk.
The present application provides a computer program product which, when run on a computer, causes the computer to perform the method in any of the alternative implementations.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application, and are intended to be included within the scope of the appended claims and description. In particular, the technical features mentioned in the respective embodiments may be combined in any manner as long as there is no structural conflict. The present application is not limited to the specific embodiments disclosed herein, but encompasses all technical solutions falling within the scope of the claims.

Claims (10)

1. A method for executing a service, the method comprising:
acquiring target service information; wherein the target service information comprises a target service identifier;
searching a database for a target model orchestration flow corresponding to the target service identifier according to the target service identifier; wherein the database is used for storing a plurality of model orchestration flows, each model orchestration flow corresponds to one service identifier, and each model orchestration flow comprises a plurality of recognition models required for service execution, an execution order of the plurality of recognition models and an execution action of each recognition model;
and sequentially invoking a plurality of target recognition models to perform corresponding execution actions according to the target execution order of the recognition models in the found target model orchestration flow, so as to execute a target service.
2. The method of claim 1, wherein before the acquiring the target service information, the method further comprises:
obtaining model orchestration flow information input by a user; wherein the model orchestration flow information comprises a plurality of recognition models, an execution order of the plurality of recognition models and an execution action of each recognition model;
and acquiring a service identifier, establishing a mapping relationship between the service identifier and the model orchestration flow information, and storing the mapping relationship in a database.
3. The method of claim 1, wherein before the acquiring the target service information, the method further comprises:
acquiring one or more pieces of service description text information input by a user;
generating a corresponding model orchestration flow according to each piece of the service description text information;
acquiring a service identifier corresponding to each piece of the service description text information;
establishing a mapping table between the service identifier of each piece of the service description text information and the corresponding model orchestration flow;
and storing the mapping table in a database.
4. The method according to claim 3, wherein the generating a corresponding model orchestration flow according to each piece of the service description text information comprises:
inputting each piece of the service description text information into a trained model orchestration flow recognition model to obtain the model orchestration flow, output by the model orchestration flow recognition model, corresponding to each piece of the service description text information.
5. The method according to claim 4, wherein the method further comprises:
acquiring a plurality of service description text samples; wherein each service description text sample comprises service description text sample information and a model orchestration flow corresponding to the service description text sample information;
and training the model orchestration flow recognition model according to the plurality of service description text samples to obtain the trained model orchestration flow recognition model.
6. The method according to claim 1, wherein after the sequentially invoking a plurality of target recognition models to perform corresponding execution actions according to the target execution order of the recognition models in the found target model orchestration flow so as to execute the target service, the method further comprises:
monitoring and counting a call volume of each stored recognition model.
7. The method according to claim 1, wherein after the sequentially invoking a plurality of target recognition models to perform corresponding execution actions according to the target execution order of the recognition models in the found target model orchestration flow so as to execute the target service, the method further comprises:
acquiring a latest invocation time of each stored recognition model;
calculating an idle duration according to the latest invocation time and a current time of each recognition model;
and if there is a recognition model whose idle duration exceeds a preset duration, releasing a system resource where the recognition model whose idle duration exceeds the preset duration is located.
8. A service execution apparatus, comprising: an acquisition module, a lookup module and an execution module;
wherein the acquisition module is configured to acquire target service information; the target service information comprises a target service identifier;
the lookup module is configured to search a database for a target model orchestration flow corresponding to the target service identifier according to the target service identifier; the database is used for storing a plurality of model orchestration flows, each model orchestration flow corresponds to one service identifier, and each model orchestration flow comprises a plurality of recognition models required for service execution, an execution order of the plurality of recognition models and an execution action of each recognition model;
and the execution module is configured to sequentially invoke a plurality of target recognition models to perform corresponding execution actions according to the target execution order of the recognition models in the found target model orchestration flow, so as to execute a target service.
9. An electronic device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the method of any one of claims 1 to 7 when executing the computer program.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the method of any of claims 1 to 7.
CN202311063954.2A 2023-08-22 2023-08-22 Service execution method, device, electronic equipment and computer readable storage medium Pending CN117130780A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311063954.2A CN117130780A (en) 2023-08-22 2023-08-22 Service execution method, device, electronic equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311063954.2A CN117130780A (en) 2023-08-22 2023-08-22 Service execution method, device, electronic equipment and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN117130780A (en) 2023-11-28

Family

ID=88853904

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311063954.2A Pending CN117130780A (en) 2023-08-22 2023-08-22 Service execution method, device, electronic equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN117130780A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination