CN114489937A - Mirror image caching method and device, electronic equipment and storage medium - Google Patents

Mirror image caching method and device, electronic equipment and storage medium

Info

Publication number
CN114489937A
Authority
CN
China
Prior art keywords
mirror image
task
caching
cached
cache
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111646100.8A
Other languages
Chinese (zh)
Inventor
陈羽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Intellifusion Technologies Co Ltd
Original Assignee
Shenzhen Intellifusion Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Intellifusion Technologies Co Ltd filed Critical Shenzhen Intellifusion Technologies Co Ltd
Priority to CN202111646100.8A priority Critical patent/CN114489937A/en
Publication of CN114489937A publication Critical patent/CN114489937A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/455Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
    • G06F9/45533Hypervisors; Virtual machine monitors
    • G06F9/45558Hypervisor-specific management and integration aspects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F12/00Accessing, addressing or allocating within memory systems or architectures
    • G06F12/02Addressing or allocation; Relocation
    • G06F12/08Addressing or allocation; Relocation in hierarchically structured memory systems, e.g. virtual memory systems
    • G06F12/0802Addressing of a memory level in which the access to the desired data or data block requires associative addressing means, e.g. caches
    • G06F12/0866Addressing of a memory level in which the access to the desired data or data block requires associative addressing means, e.g. caches for peripheral storage systems, e.g. disk cache
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00Arrangements for software engineering
    • G06F8/60Software deployment
    • G06F8/61Installation
    • G06F8/63Image based installation; Cloning; Build to order
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00Arrangements for software engineering
    • G06F8/60Software deployment
    • G06F8/65Updates
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/455Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
    • G06F9/45533Hypervisors; Virtual machine monitors
    • G06F9/45558Hypervisor-specific management and integration aspects
    • G06F2009/45562Creating, deleting, cloning virtual machine instances
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/455Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
    • G06F9/45533Hypervisors; Virtual machine monitors
    • G06F9/45558Hypervisor-specific management and integration aspects
    • G06F2009/45591Monitoring or debugging support
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/455Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
    • G06F9/45533Hypervisors; Virtual machine monitors
    • G06F9/45558Hypervisor-specific management and integration aspects
    • G06F2009/45595Network integration; Enabling network access in virtual machine instances

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • Memory System Of A Hierarchy Structure (AREA)

Abstract

The application provides a mirror image caching method, a mirror image caching device, electronic equipment and a storage medium, wherein the method comprises the following steps: acquiring a mirror image caching task, wherein the mirror image caching task comprises task configuration information of a mirror image to be cached, and the task configuration information comprises a target node for caching the mirror image to be cached; generating deployment arrangement of the mirror image to be cached according to the task configuration information, wherein the deployment arrangement is used for indicating the target node to pull the cache mirror image from the mirror image warehouse for caching; and under the condition that the target node successfully pulls the mirror image to be cached, updating the state of the mirror image caching task to be successful. The embodiment of the application can meet the requirement of mirror image operation on high speed and real time.

Description

Mirror image caching method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a mirror image caching method and apparatus, an electronic device, and a storage medium.
Background
With the development of computer technology, concepts such as containers and images appear on the network more and more frequently, and application containerization has become a development trend. After being containerized, an application can be delivered in the form of a mirror image and run in the form of a container, and the management of a large number of mirror images has become a key point of application development work. It should be understood that mirror image management is inseparable from mirror image pulling. A mirror image with strict timing requirements, such as one that must start up or scale out at high speed and in real time, cannot tolerate a time-consuming pulling process. The current solution usually relies on a local mirror image warehouse to pull and cache mirror images, but pulling a mirror image from the local mirror image warehouse still incurs a significant delay, and the network between a node and the local mirror image warehouse may suffer losses, so it is still difficult to meet the requirements of mirror image operation for high speed and real-time performance.
Disclosure of Invention
In view of the above problems, the present application provides a mirror image caching method, device, electronic device, and storage medium, which can meet the requirement of mirror image operation on high speed and real time.
In order to achieve the above object, a first aspect of the embodiments of the present application provides a mirror caching method, which is applied to a container arrangement platform, and the method includes:
acquiring a mirror image caching task, wherein the mirror image caching task comprises task configuration information of a mirror image to be cached, and the task configuration information comprises a target node for caching the mirror image to be cached;
generating deployment arrangement of the mirror image to be cached according to the task configuration information, wherein the deployment arrangement is used for indicating the target node to pull the cache mirror image from the mirror image warehouse for caching;
and under the condition that the target node successfully pulls the mirror image to be cached, updating the state of the mirror image caching task to be successful.
With reference to the first aspect, in one possible implementation, the method further includes:
and under the condition that the target node does not successfully pull the mirror image to be cached, updating the state of the mirror image caching task to be failure.
With reference to the first aspect, in a possible implementation manner, in a case that a target node does not successfully pull a mirror to be cached, updating a state of a mirror caching task to be a failure includes:
under the condition that the target node does not successfully pull the mirror image to be cached, re-executing the operations of obtaining the mirror image caching task and generating the deployment arrangement of the mirror image to be cached according to the task configuration information;
if it is detected that the target node has not successfully pulled the mirror image to be cached, increasing the number of unsuccessful pull attempts of the mirror image caching task by one;
if the number of unsuccessful pull attempts of the mirror image caching task has not reached the preset number of times, continuing to execute the operations of obtaining the mirror image caching task and generating the deployment arrangement of the mirror image to be cached according to the task configuration information until the number of unsuccessful pull attempts of the mirror image caching task reaches the preset number of times;
and under the condition that the number of unsuccessful pull attempts of the mirror image caching task reaches the preset number of times, ending the mirror image caching task, and updating the state of the mirror image caching task to failure.
With reference to the first aspect, in a possible implementation manner, before acquiring the mirror cache task, the method further includes:
receiving a caching request of a to-be-cached mirror image, generating a mirror image caching task according to the caching request, and adding the mirror image caching task to a mirror image caching task queue;
and/or,
and responding to the triggering of the timing task, generating a mirror image cache task according to the timing task, and adding the mirror image cache task to a mirror image cache task queue.
With reference to the first aspect, in a possible implementation manner, the generating a mirror cache task according to a cache request includes:
checking the first task list;
under the condition that the first task list passes verification, analyzing the first task list to obtain task configuration information;
and generating a mirror image caching task according to the task configuration information.
With reference to the first aspect, in a possible implementation manner, in response to triggering of a timing task, generating a mirror cache task according to the timing task includes:
responding to the trigger of the timing task, and acquiring the list information of the mirror image to be cached and the list information of the target node;
integrating the list information of the mirror image to be cached and the list information of the target node into a second task list;
analyzing the second task list to obtain task configuration information;
and generating a mirror image caching task according to the task configuration information.
With reference to the first aspect, in a possible implementation manner, the task configuration information further includes a pull policy of the mirror image to be cached, and the deployment orchestration is specifically configured to instruct the target node to pull the cache mirror image from the mirror image repository for caching according to the pull policy.
A second aspect of the embodiments of the present application provides a mirror image caching apparatus, which is applied to a container arrangement platform, and includes an obtaining unit and a processing unit; wherein,
the system comprises an acquisition unit, a cache unit and a cache unit, wherein the acquisition unit is used for acquiring a mirror image cache task, the mirror image cache task comprises task configuration information of a mirror image to be cached, and the task configuration information comprises a target node for caching the mirror image to be cached;
the processing unit is used for generating deployment arrangement of the mirror image to be cached according to the task configuration information, wherein the deployment arrangement is used for indicating the target node to pull the cache mirror image from the mirror image warehouse for caching;
and the processing unit is also used for updating the state of the mirror image caching task to be successful under the condition that the target node successfully pulls the mirror image to be cached.
A third aspect of embodiments of the present application provides an electronic device, which includes an input device, an output device, and a processor, and is adapted to implement one or more instructions; and a memory storing one or more computer programs adapted to be loaded by the processor and to perform the steps of:
acquiring a mirror image caching task, wherein the mirror image caching task comprises task configuration information of a mirror image to be cached, and the task configuration information comprises a target node for caching the mirror image to be cached;
generating deployment arrangement of the mirror image to be cached according to the task configuration information, wherein the deployment arrangement is used for indicating the target node to pull the cache mirror image from a mirror image warehouse for caching;
and under the condition that the target node successfully pulls the mirror image to be cached, updating the state of the mirror image caching task to be successful.
A fourth aspect of embodiments of the present application provides a computer storage medium having one or more instructions stored thereon, the one or more instructions adapted to be loaded by a processor and to perform the following steps:
acquiring a mirror image caching task, wherein the mirror image caching task comprises task configuration information of a mirror image to be cached, and the task configuration information comprises a target node for caching the mirror image to be cached;
generating deployment arrangement of the mirror image to be cached according to the task configuration information, wherein the deployment arrangement is used for indicating the target node to pull the cache mirror image from a mirror image warehouse for caching;
and under the condition that the target node successfully pulls the mirror image to be cached, updating the state of the mirror image caching task to be successful.
The above scheme of the present application includes at least the following beneficial effects:
in the embodiment of the application, a mirror image caching task is obtained, wherein the mirror image caching task comprises task configuration information of a mirror image to be cached, and the task configuration information comprises a target node for caching the mirror image to be cached; generating deployment arrangement of the mirror image to be cached according to the task configuration information, wherein the deployment arrangement is used for indicating the target node to pull the cache mirror image from the mirror image warehouse for caching; and under the condition that the target node successfully pulls the mirror image to be cached, updating the state of the mirror image caching task to be successful. Therefore, deployment arrangement of the mirror image to be cached is generated based on the mirror image caching task, then the target node pulls the mirror image to be cached for caching, and the purpose that the pulling cache of the mirror image to be cached can be completed before the mirror image to be cached is used can be achieved, so that time consumed by pulling the mirror image to be cached when the mirror image to be cached is used is saved, and the requirement of mirror image operation on high speed and real time can be met.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a schematic diagram of an application environment provided in an embodiment of the present application;
FIG. 2 is a block diagram of a container orchestration platform according to embodiments of the present application;
fig. 3 is a schematic flowchart of a mirror caching method according to an embodiment of the present application;
fig. 4 is a schematic diagram of task configuration information according to an embodiment of the present application;
fig. 5 is a schematic flowchart of another mirror caching method according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a mirror image caching apparatus according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "comprising" and "having," and any variations thereof, as appearing in the specification, claims and drawings of this application, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus. In addition, the terms "first", "second", and "third", etc. are used to distinguish different objects, and are not used to describe a particular order.
The following is a brief analysis of the related art to which the embodiments of the present application relate.
Mirror image pulling typically has the following characteristics: the time consumed by the process cannot be tolerated, that is, the pulling process has strict requirements on low latency; the dependence on the mirror image warehouse is high, so a single-point failure of the mirror image warehouse means that a mirror image cannot be pulled and therefore cannot be deployed; the dependence on the network is high, and for applications running on edge devices, such as Internet of Things applications, the network connection between the edge device and the mirror image warehouse is intermittent, so pulling a mirror image from the mirror image warehouse is often blocked; and during a rolling upgrade, a failure to pull a mirror image can cause the application upgrade to fail. In view of these characteristics, the related art designs a local mirror image warehouse to pull and cache mirror images. In this mode, all mirror image requests pass through the local mirror image warehouse: if the local mirror image warehouse has the corresponding mirror image, the mirror image in the local mirror image warehouse is used; if not, the corresponding mirror image is pulled from a public mirror image warehouse and stored in the local mirror image warehouse. However, in the scheme based on the local mirror image warehouse, a large amount of computing resources and human resources are consumed in deploying, operating and maintaining the local mirror image warehouse, and the problem of fast startup of containerized applications is still not completely solved.
Based on the problems in the related art, an embodiment of the present application provides a mirror caching method, which can be implemented based on an application environment shown in fig. 1, please refer to fig. 1, where the application environment includes an electronic device 101 and a terminal device 102 connected to the electronic device 101 through a network. The terminal device 102 may be configured to send a mirror image cache request to the electronic device 101, where the cache request includes task configuration information of a mirror image cache task, and when the electronic device 101 receives the cache request, the format of the task configuration information is checked, and when the check is passed, a corresponding mirror image cache task is generated according to the task configuration information, and the mirror image cache task is added to a mirror image cache task queue. After determining that the target node successfully pulls the mirror image, the electronic device 101 may update the state of the mirror image caching task of the mirror image in the mirror image caching task queue to be successful, that is, complete the mirror image caching task.
For example, the electronic device 101 may be an independent server, or may be a cloud server that provides basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a web service, cloud communication, a middleware service, a domain name service, a security service, a Content Delivery Network (CDN), and a big data and artificial intelligence platform. The terminal device 102 may be a smart phone, a computer, a personal digital assistant, a kiosk, and so on.
Illustratively, a container arrangement platform is also deployed or run in the electronic device 101 shown in fig. 1, and the method steps performed by the electronic device 101 may specifically be completed by the container arrangement platform. Referring to fig. 2, fig. 2 is a block diagram of a container arrangement platform according to an embodiment of the present disclosure. As shown in fig. 2, the container arrangement platform includes an interactor and a controller. The interactor includes an analysis module and a mirror image warehouse interaction module: the analysis module is configured to check and parse a cache request sent from the outside and, after the check and parsing are completed, deliver the corresponding mirror image caching task to the controller; the mirror image warehouse interaction module is configured to interact with the mirror image warehouse, in particular to acquire a mirror image list and to acquire description information of a specified mirror image. The controller includes a mirror image management module and a task management module: the mirror image management module is configured to monitor whether a mirror image caching task to be executed exists in the mirror image caching task queue and, if so, generate a corresponding deployment arrangement according to the task configuration information of the mirror image caching task, for example a deployment arrangement under which the node specified in the task configuration information pulls the specified mirror image from the specified mirror image warehouse, and to return the execution result of the mirror image caching task when it is determined that the mirror image is pulled successfully; the task management module is configured to maintain the mirror image caching task queue, classify tasks, for example distinguishing mirror image deletion tasks from mirror image caching tasks, update and record the state and execution result of each mirror image caching task, and execute a task retry mechanism. Based on this container arrangement platform, the pulling and caching of a mirror image can be completed before the mirror image is used, so the time consumed by pulling the mirror image when it is used is saved, and the requirements of mirror image operation for high speed and real-time performance can be met. Based on the application environment shown in fig. 1 and the container arrangement platform shown in fig. 2, the mirror image caching method provided by the embodiment of the present application is described in detail below with reference to the other drawings.
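Illustratively, the module split described above can be sketched as the following Python skeleton; all class names, method names and field names here are assumptions introduced only for illustration and are not defined by the present application.
    # Hypothetical skeleton of the interactor and controller modules described
    # above; names and signatures are illustrative assumptions, not taken from
    # the patent.
    import queue

    class ParsingModule:
        """Interactor: checks and parses externally submitted cache requests."""
        REQUIRED_FIELDS = {"images", "nodes", "registry"}

        def parse_request(self, task_list):
            # Verify the format of the task configuration information.
            if not self.REQUIRED_FIELDS.issubset(task_list):
                return None  # verification failed; caller moves on
            return task_list  # parsed task configuration information

    class RepositoryInteractionModule:
        """Interactor: talks to the mirror image warehouse."""

        def list_images(self):
            return []  # placeholder: would query the warehouse's image list

        def describe_image(self, name):
            return {"name": name}  # placeholder: description of one image

    class TaskManagementModule:
        """Controller: maintains the caching task queue, states and retries."""

        def __init__(self):
            self.tasks = queue.Queue()
            self.states = {}

        def enqueue(self, task_id, config):
            self.tasks.put((task_id, config))
            self.states[task_id] = "pending"

        def update_state(self, task_id, state):
            self.states[task_id] = state  # e.g. "successful" or "failed"

    class ImageManagementModule:
        """Controller: turns queued tasks into deployment arrangements."""

        def __init__(self, task_mgr):
            self.task_mgr = task_mgr

        def poll_once(self):
            if self.task_mgr.tasks.empty():
                return
            task_id, config = self.task_mgr.tasks.get()
            # The real interaction with the orchestration engine is omitted;
            # here the generated deployment arrangement is only printed.
            print(f"deployment arrangement for task {task_id}: {config}")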
Referring to fig. 3, fig. 3 is a flowchart illustrating a mirror caching method according to an embodiment of the present application, where the method is applied to a container arrangement platform. As shown in fig. 3, the method includes steps 301 to 303:
301: and acquiring a mirror image caching task, wherein the mirror image caching task comprises task configuration information of a mirror image to be cached, and the task configuration information comprises a target node for caching the mirror image to be cached.
In the embodiment of the application, mirror image caching tasks usually reside in a mirror image caching task queue. The container arrangement platform can monitor the mirror image caching task queue in real time or periodically, and when a mirror image caching task exists in the queue, the task is taken out for execution. The mirror image caching task includes task configuration information of the mirror image to be cached, where the task configuration information includes the target node for caching the mirror image to be cached, related information of the mirror image to be cached, and related information of the mirror image warehouse. For example, as shown in fig. 4, the related information of the mirror image to be cached may include the name of the mirror image to be cached, its pull policy, and the like; the related information of the mirror image warehouse may include the address, account and the like of the mirror image warehouse; and the target node may be a server, identified by a node name or an IP (Internet Protocol) address.
The pulling policy of the mirror image to be cached may be IfNotPresent or Always: IfNotPresent indicates that the mirror image to be cached is pulled only when it does not already exist on the target node (host), and Always indicates that the mirror image to be cached is pulled every time.
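For example, one possible representation of such task configuration information is sketched below in Python; the field names are assumptions made only for this example and are not a format prescribed by the present application.
    # A hypothetical task configuration for one mirror image caching task;
    # field names and values are illustrative assumptions, not a normative format.
    task_configuration = {
        "images": [
            {"name": "example.com/library/nginx:1.21", "pullPolicy": "IfNotPresent"},
            {"name": "example.com/library/redis:6.2", "pullPolicy": "Always"},
        ],
        "registry": {
            "address": "example.com",   # address of the mirror image warehouse
            "account": "cache-user",    # account used to pull mirror images
        },
        "nodes": [
            {"name": "worker-01", "ip": "192.0.2.11"},  # target nodes identified by
            {"name": "worker-02", "ip": "192.0.2.12"},  # node name or IP address
        ],
    }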
302: and generating deployment arrangement of the mirror image to be cached according to the task configuration information, wherein the deployment arrangement is used for indicating the target node to pull the cache mirror image from the mirror image warehouse for caching.
In the specific embodiment of the application, after the task configuration information of the mirror image caching task is acquired, the corresponding deployment arrangement can be generated according to information such as the mirror image, the mirror image warehouse and the node in the task configuration information. It should be understood that orchestration refers to determining, according to the coupling relationships among the deployed objects and the dependencies of the deployment environment, the execution sequence of each action in the deployment process, the storage location and acquisition manner of the dependent files required by the deployment process, and how to verify that the deployment succeeded; deployment then refers to initializing the environment on the target machine as specified by the orchestration, storing the specified dependencies and files, running the specified deployment actions, and finally confirming, according to the rules in the orchestration, that the joint deployment succeeded. A container arrangement tool provides a most basic arrangement mode that runs an application as a container. Here, since only the pulling and caching of the mirror image to be cached is involved, the arrangement is mainly used for instructing the target node to pull the cache mirror image from the mirror image warehouse for caching; specifically, the arrangement may be used for instructing the target node to pull the cache mirror image from the mirror image warehouse according to the pull policy, for example instructing the target node which mirror image to pull from which mirror image warehouse, so the container only performs simple printing and no substantial operation.
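Illustratively, the generation of such a deployment arrangement can be sketched as follows. The sketch assumes a Kubernetes-style arrangement engine, since IfNotPresent and Always are the pull-policy values used by that platform, although the present application does not name a specific container arrangement tool; the container only runs a trivial echo command, matching the description that no substantial operation is performed.
    # Sketch: build a Kubernetes-style Pod manifest that makes one target node
    # pull one mirror image to be cached; this is an assumption about the form
    # of the deployment arrangement, not the patent's concrete implementation.
    def build_cache_pod(image, node, task_id):
        return {
            "apiVersion": "v1",
            "kind": "Pod",
            "metadata": {"name": f"image-cache-{task_id}-{node['name']}"},
            "spec": {
                "nodeName": node["name"],        # pin the pod to the target node
                "restartPolicy": "Never",
                "containers": [
                    {
                        "name": "image-cache",
                        "image": image["name"],  # mirror image to be cached
                        "imagePullPolicy": image["pullPolicy"],  # IfNotPresent / Always
                        # The container does nothing substantial: the point of the
                        # deployment is only that the node pulls and caches the image.
                        "command": ["echo", "image cached"],
                    }
                ],
            },
        }
Applying one such manifest for each pair of target node and mirror image to be cached causes every listed node to pull and cache every listed mirror image ahead of time.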
303: and under the condition that the target node successfully pulls the mirror image to be cached, updating the state of the mirror image caching task to be successful.
In the embodiment of the present application, the container arrangement platform may update the state of the mirror image caching task in the mirror image caching task queue to successful when it is determined that the target node has successfully pulled the mirror image to be cached, for example when a prompt message indicating that the pull succeeded, or that the caching succeeded, is received from the target node.
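Illustratively, one way such a success signal could be obtained, continuing the Kubernetes-style assumption above, is to poll the phase of the cache pod created from the deployment arrangement; the helper below is only a hypothetical sketch of this check, not a mechanism defined by the present application.
    # Hypothetical check for "the target node successfully pulled the image":
    # poll the phase of the cache pod created from the deployment arrangement.
    # Assumes a Kubernetes-style cluster reachable via kubectl.
    import subprocess

    def pull_succeeded(pod_name, namespace="default"):
        result = subprocess.run(
            ["kubectl", "get", "pod", pod_name, "-n", namespace,
             "-o", "jsonpath={.status.phase}"],
            capture_output=True, text=True, check=False,
        )
        # A pod whose only container ran the trivial echo command and exited
        # reports the phase "Succeeded" once the image pull and run completed.
        return result.stdout.strip() == "Succeeded"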
For example, in the case that the target node does not successfully pull the mirror image to be cached, the container arrangement platform may update the state of the mirror image caching task to failed. Specifically, when the target node does not successfully pull the mirror image to be cached, the container arrangement platform re-executes the operations of obtaining the mirror image caching task and generating the deployment arrangement of the mirror image to be cached according to the task configuration information, that is, it re-acquires the task configuration information of the mirror image caching task from the mirror image caching task queue and generates the deployment arrangement of the mirror image to be cached according to the task configuration information, thereby instructing the target node to pull the cache mirror image from the mirror image warehouse again for caching, and at the same time it increases the number of unsuccessful pull attempts of the mirror image caching task by one. If the number of unsuccessful pull attempts of the mirror image caching task has not reached the preset number of times, the operations of obtaining the mirror image caching task and generating the deployment arrangement of the mirror image to be cached according to the task configuration information continue to be executed until the number of unsuccessful pull attempts reaches the preset number of times (for example, 3 times). When the number of unsuccessful pull attempts of the mirror image caching task reaches the preset number of times, the mirror image caching task is ended, its state is updated to failed, and execution of the mirror image caching task stops. It should be appreciated that the reason the mirror image caching task fails to execute may be that the mirror image warehouse is unavailable, and so on.
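Illustratively, the retry mechanism described above can be summarised by the following sketch, where run_cache_task is a hypothetical helper standing in for "obtain the mirror image caching task and generate its deployment arrangement", returning True when the target node pulled successfully, and the preset number of attempts follows the example of 3 times given above.
    # Sketch of the task retry mechanism; run_cache_task() and PRESET_ATTEMPTS
    # are illustrative assumptions.
    PRESET_ATTEMPTS = 3  # example value from the description above

    def execute_with_retry(task_id, run_cache_task):
        unsuccessful_pulls = 0
        while unsuccessful_pulls < PRESET_ATTEMPTS:
            if run_cache_task(task_id):      # regenerate the deployment arrangement
                return "successful"          # the target node pulled the image
            unsuccessful_pulls += 1          # count one more unsuccessful pull
        # The preset number of unsuccessful pulls has been reached: end the task
        # and mark it failed (e.g. the mirror image warehouse is unavailable).
        return "failed"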
Illustratively, before obtaining the mirror caching task, the method further comprises:
receiving a caching request of a mirror to be cached, generating a mirror caching task according to the caching request, and adding the mirror caching task to a mirror caching task queue;
and/or,
and responding to the triggering of the timing task, generating a mirror image cache task according to the timing task, and adding the mirror image cache task to a mirror image cache task queue.
In this embodiment, the mirror caching task may be generated based on an external request, or may be generated by the triggering of a local timed task.
Illustratively, the cache request includes a first task list of the mirror cache task, and the mirror cache task is generated according to the cache request, including:
checking the first task list;
under the condition that the first task list passes verification, analyzing the first task list to obtain task configuration information;
and generating a mirror image caching task according to the task configuration information.
Specifically, the first task list includes the task configuration information of the mirror image to be cached. The container arrangement platform needs to verify the format of the task configuration information: if the verification succeeds, the task configuration information is obtained and the mirror image caching task is generated; if the verification fails, the next task list is processed.
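Illustratively, this check-then-parse path can be sketched as follows; the required fields, the checking rule and the enqueue callback are assumptions made only for the example.
    # Sketch: verify the format of submitted task lists and, when a check passes,
    # turn the list into a mirror image caching task; otherwise move on to the
    # next task list. Field names and rules are illustrative assumptions.
    def handle_cache_request(task_lists, enqueue):
        for task_list in task_lists:
            if not isinstance(task_list.get("images"), list) or not task_list.get("nodes"):
                continue  # verification failed: process the next task list
            config = {
                "images": task_list["images"],
                "nodes": task_list["nodes"],
                "registry": task_list.get("registry", {}),
            }
            enqueue(config)  # generate the caching task and add it to the queue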
Illustratively, in response to the triggering of the timed task, generating a mirror cache task according to the timed task includes:
responding to the trigger of the timing task, and acquiring the list information of the mirror image to be cached and the list information of the target node;
integrating the list information of the mirror image to be cached and the list information of the target node into a second task list;
analyzing the second task list to obtain task configuration information;
and generating a mirror image caching task according to the task configuration information.
Specifically, the timed task is a task that executes mirror image pulling and caching at regular intervals. When the timed task is triggered, the container arrangement platform acquires the list information of the mirror image to be cached and the list information of the target node, where the list information of the mirror image to be cached may include the name of the mirror image to be cached, the pull policy, the mirror image warehouse address and the like, and the list information of the target node may include the name of the target node, its IP address and the like. The container arrangement platform integrates the list information of the mirror image to be cached and the list information of the target node into one task list, namely the second task list, and checks the format of the task configuration information specified in the second task list. If the check succeeds, it parses the second task list, obtains the task configuration information shown in fig. 4, and generates the mirror image caching task.
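Illustratively, the timed path of integrating the two pieces of list information into a second task list and parsing it can be sketched as follows; again, all function and field names are illustrative assumptions.
    # Sketch: on a timer trigger, integrate the list information of the images to
    # be cached and of the target nodes into a second task list, check it, and
    # parse it into task configuration information. Names are assumptions.
    def on_timer_trigger(get_image_list, get_node_list, enqueue):
        images = get_image_list()   # e.g. image name, pull policy, warehouse address
        nodes = get_node_list()     # e.g. node name, IP address
        # Integrate both lists into a second task list.
        second_task_list = {"images": images, "nodes": nodes}
        if not second_task_list["images"] or not second_task_list["nodes"]:
            return                  # format check failed: nothing to cache
        # Parsing is trivial here because the list was assembled locally.
        config = second_task_list
        enqueue(config)             # generate the mirror image caching task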
It can be seen that, by acquiring a mirror image caching task, the mirror image caching task includes task configuration information of a mirror image to be cached, and the task configuration information includes a target node for caching the mirror image to be cached; generating deployment arrangement of the mirror image to be cached according to the task configuration information, wherein the deployment arrangement is used for indicating the target node to pull the cache mirror image from the mirror image warehouse for caching; and under the condition that the target node successfully pulls the mirror image to be cached, updating the state of the mirror image caching task to be successful. Therefore, deployment arrangement of the mirror image to be cached is generated based on the mirror image caching task, then the target node pulls the mirror image to be cached for caching, and the purpose that the pulling cache of the mirror image to be cached can be completed before the mirror image to be cached is used can be achieved, so that time consumed by pulling the mirror image to be cached when the mirror image to be cached is used is saved, and the requirement of mirror image operation on high speed and real time can be met. In addition, because a large amount of computing resources and human resources are not consumed to deploy and operate and maintain the local mirror image warehouse, the embodiment of the application relatively weakens the dependence on the local mirror image warehouse and the network.
Referring to fig. 5, fig. 5 is a flow chart illustrating another mirror caching method provided in the embodiment of the present application. As shown in fig. 5, the method includes steps 501 to 507:
501: judging whether a mirror image caching task exists and whether the number of unsuccessful pull attempts of the mirror image caching task is less than the preset number of times;
if yes, go to step 502, otherwise go to step 507;
502: acquiring a mirror image caching task;
the mirror image caching task comprises task configuration information of a mirror image to be cached, and the task configuration information comprises a target node for caching the mirror image to be cached;
503: generating deployment arrangement of the mirror image to be cached according to the task configuration information;
the deployment arrangement is used for indicating the target node to pull the cache mirror image from the mirror image warehouse for caching;
504: judging whether the mirror image to be cached is pulled successfully or not;
if yes, go to step 505, otherwise go to step 506;
505: updating the state of the mirror image caching task to be successful;
506: re-executing the operations of obtaining the mirror image caching task and generating the deployment arrangement of the mirror image to be cached according to the task configuration information, and increasing the number of unsuccessful pull attempts of the mirror image caching task by one;
507: and finishing the mirror image caching task and updating the state of the mirror image caching task to be failure.
The specific implementation of steps 501-507 has already been described in the embodiments shown in fig. 3 and fig. 4, and can achieve the same or similar beneficial effects, and in order to avoid repetition, the details are not repeated here.
It can be seen that, in the embodiment of the present application, a mirror image caching task is obtained, where the mirror image caching task includes task configuration information of a mirror image to be cached, and the task configuration information includes a target node for caching the mirror image to be cached; generating deployment arrangement of the mirror image to be cached according to the task configuration information, wherein the deployment arrangement is used for indicating the target node to pull the cache mirror image from the mirror image warehouse for caching; and under the condition that the target node successfully pulls the mirror image to be cached, updating the state of the mirror image caching task to be successful. Therefore, deployment arrangement of the mirror image to be cached is generated based on the mirror image caching task, then the target node pulls the mirror image to be cached for caching, and the purpose that the pulling cache of the mirror image to be cached can be completed before the mirror image to be cached is used can be achieved, so that time consumed by pulling the mirror image to be cached when the mirror image to be cached is used is saved, and the requirement of mirror image operation on high speed and real time can be met.
Based on the description of the foregoing mirror image caching method embodiment, please refer to fig. 6, and fig. 6 is a schematic structural diagram of a mirror image caching apparatus provided in the present embodiment. As shown in fig. 6, the apparatus includes an obtaining unit 601 and a processing unit 602; wherein,
an obtaining unit 601, configured to obtain a mirror image caching task, where the mirror image caching task includes task configuration information of a mirror image to be cached, and the task configuration information includes a target node for caching the mirror image to be cached;
the processing unit 602 is configured to generate deployment orchestration of the mirror image to be cached according to the task configuration information, where the deployment orchestration is used to instruct the target node to pull the cache mirror image from the mirror image warehouse for caching;
the processing unit 602 is further configured to update the state of the mirror caching task to be successful when the target node successfully pulls the mirror to be cached.
It can be seen that, in the mirror image caching device shown in fig. 6, by acquiring a mirror image caching task, the mirror image caching task includes task configuration information of a mirror image to be cached, and the task configuration information includes a target node for caching the mirror image to be cached; generating deployment arrangement of the mirror image to be cached according to the task configuration information, wherein the deployment arrangement is used for indicating the target node to pull the cache mirror image from the mirror image warehouse for caching; and under the condition that the target node successfully pulls the mirror image to be cached, updating the state of the mirror image caching task to be successful. Therefore, deployment arrangement of the mirror image to be cached is generated based on the mirror image caching task, then the target node pulls the mirror image to be cached for caching, and the purpose that the pulling cache of the mirror image to be cached can be completed before the mirror image to be cached is used can be achieved, so that time consumed by pulling the mirror image to be cached when the mirror image to be cached is used is saved, and the requirement of mirror image operation on high speed and real time can be met.
In a possible implementation, the processing unit 602 is further configured to:
and under the condition that the target node does not successfully pull the mirror image to be cached, updating the state of the mirror image caching task to be failure.
In a possible implementation manner, in a case that the target node does not successfully pull the mirror to be cached, the processing unit 602 is specifically configured to, in terms of updating the state of the mirror caching task to be a failure:
under the condition that the target node does not successfully pull the mirror image to be cached, re-executing the operations of obtaining the mirror image caching task and generating the deployment arrangement of the mirror image to be cached according to the task configuration information;
if it is detected that the target node has not successfully pulled the mirror image to be cached, increasing the number of unsuccessful pull attempts of the mirror image caching task by one;
if the number of unsuccessful pull attempts of the mirror image caching task has not reached the preset number of times, continuing to execute the operations of obtaining the mirror image caching task and generating the deployment arrangement of the mirror image to be cached according to the task configuration information until the number of unsuccessful pull attempts of the mirror image caching task reaches the preset number of times;
and under the condition that the number of unsuccessful pull attempts of the mirror image caching task reaches the preset number of times, ending the mirror image caching task, and updating the state of the mirror image caching task to failure.
In a possible implementation, the processing unit 602 is further configured to:
receiving a caching request of a mirror to be cached, generating a mirror caching task according to the caching request, and adding the mirror caching task to a mirror caching task queue;
and/or,
and responding to the triggering of the timing task, generating a mirror image cache task according to the timing task, and adding the mirror image cache task to a mirror image cache task queue.
In a possible implementation manner, in the aspect that the cache request includes a first task list of a mirror cache task and the mirror cache task is generated according to the cache request, the processing unit 602 is specifically configured to:
checking the first task list;
under the condition that the first task list passes verification, analyzing the first task list to obtain task configuration information;
and generating a mirror image caching task according to the task configuration information.
In a possible implementation manner, in responding to the trigger of the timing task, in terms of generating a mirror cache task according to the timing task, the processing unit 602 is specifically configured to:
responding to the trigger of the timing task, and acquiring the list information of the mirror image to be cached and the list information of the target node;
integrating the list information of the mirror image to be cached and the list information of the target node into a second task list;
analyzing the second task list to obtain task configuration information;
and generating a mirror image caching task according to the task configuration information.
In a possible implementation manner, the task configuration information further includes a pull policy of the mirror image to be cached, and the deployment orchestration is specifically configured to instruct the target node to pull the cache mirror image from the mirror image repository for caching according to the pull policy.
According to an embodiment of the present application, the units of the mirror cache apparatus shown in fig. 6 may be respectively or entirely combined into one or several other units to form the mirror cache apparatus, or some unit(s) of the mirror cache apparatus may be further split into multiple units with smaller functions to form the mirror cache apparatus, which may implement the same operation without affecting implementation of technical effects of the embodiment of the present application. The units are divided based on logic functions, and in practical application, the functions of one unit can be realized by a plurality of units, or the functions of a plurality of units can be realized by one unit. In other embodiments of the present application, the mirror cache apparatus may also include other units, and in practical applications, these functions may also be implemented by being assisted by other units, and may be implemented by cooperation of multiple units.
According to another embodiment of the present application, the image caching apparatus shown in fig. 6 may be constructed by running a computer program (including program code) capable of executing the steps involved in the corresponding method shown in fig. 3 or fig. 5 on a general-purpose computing device, such as a computer including a central processing unit (CPU), a random access memory (RAM), a read-only memory (ROM) and other processing and storage elements, and the image caching method of the embodiment of the present application may thus be implemented. The computer program may, for example, be recorded on a computer-readable recording medium, and loaded into and executed in the above-described computing device via the computer-readable recording medium.
Based on the description of the method embodiment and the device embodiment, the embodiment of the application further provides an electronic device. Referring to fig. 7, the electronic device at least includes a processor 701, an input device 702, an output device 703 and a memory 704. The processor 701, the input device 702, the output device 703, and the memory 704 within the electronic device may be connected by a bus or other means.
The memory 704 of the electronic device may store a computer program comprising program instructions, and the processor 701 is adapted to execute the program instructions stored in the memory 704. The processor 701 (or CPU) is the computing core and control core of the electronic device, and is adapted to implement one or more instructions, in particular to load and execute the one or more instructions so as to implement the corresponding method flow or corresponding function.
In an embodiment, the processor 701 of the electronic device provided in the embodiment of the present application may be configured to perform a series of mirror caching processes:
acquiring a mirror image caching task, wherein the mirror image caching task comprises task configuration information of a mirror image to be cached, and the task configuration information comprises a target node for caching the mirror image to be cached;
generating deployment arrangement of the mirror image to be cached according to the task configuration information, wherein the deployment arrangement is used for indicating the target node to pull the cache mirror image from the mirror image warehouse for caching;
and under the condition that the target node successfully pulls the mirror image to be cached, updating the state of the mirror image caching task to be successful.
It can be seen that, in the electronic device shown in fig. 7, by acquiring a mirror image caching task, the mirror image caching task includes task configuration information of a mirror image to be cached, and the task configuration information includes a target node for caching the mirror image to be cached; generating deployment arrangement of the mirror image to be cached according to the task configuration information, wherein the deployment arrangement is used for indicating the target node to pull the cache mirror image from the mirror image warehouse for caching; and under the condition that the target node successfully pulls the mirror image to be cached, updating the state of the mirror image caching task to be successful. Therefore, deployment arrangement of the mirror image to be cached is generated based on the mirror image caching task, then the target node pulls the mirror image to be cached for caching, and the purpose that the pulling cache of the mirror image to be cached can be completed before the mirror image to be cached is used can be achieved, so that time consumed by pulling the mirror image to be cached when the mirror image to be cached is used is saved, and the requirement of mirror image operation on high speed and real time can be met.
In yet another embodiment, the processor 701 is further configured to perform:
and under the condition that the target node does not successfully pull the mirror image to be cached, updating the state of the mirror image caching task to be failure.
In another embodiment, the updating of the state of the mirror caching task to failure by the processor 701 is performed when the target node does not successfully pull the mirror to be cached, including:
under the condition that the target node does not successfully pull the mirror image to be cached, re-executing the operations of obtaining the mirror image caching task and generating the deployment arrangement of the mirror image to be cached according to the task configuration information;
if it is detected that the target node has not successfully pulled the mirror image to be cached, increasing the number of unsuccessful pull attempts of the mirror image caching task by one;
if the number of unsuccessful pull attempts of the mirror image caching task has not reached the preset number of times, continuing to execute the operations of obtaining the mirror image caching task and generating the deployment arrangement of the mirror image to be cached according to the task configuration information until the number of unsuccessful pull attempts of the mirror image caching task reaches the preset number of times;
and under the condition that the number of unsuccessful pull attempts of the mirror image caching task reaches the preset number of times, ending the mirror image caching task, and updating the state of the mirror image caching task to failure.
In yet another embodiment, before obtaining the mirror caching task, the processor 701 is further configured to:
receiving a caching request of a mirror to be cached, generating a mirror caching task according to the caching request, and adding the mirror caching task to a mirror caching task queue;
and/or,
and responding to the triggering of the timing task, generating a mirror image cache task according to the timing task, and adding the mirror image cache task to a mirror image cache task queue.
In another embodiment, the cache request includes a first task list of a mirror cache task, and the processor 701 performs generating the mirror cache task according to the cache request, including:
checking the first task list;
under the condition that the first task list passes verification, analyzing the first task list to obtain task configuration information;
and generating a mirror image caching task according to the task configuration information.
In another embodiment, the processor 701 performs the generation of the mirror cache task according to the timed task in response to the triggering of the timed task, including:
responding to the trigger of the timing task, and acquiring the list information of the mirror image to be cached and the list information of the target node;
integrating the list information of the mirror image to be cached and the list information of the target node into a second task list;
analyzing the second task list to obtain task configuration information;
and generating a mirror image caching task according to the task configuration information.
In another embodiment, the task configuration information further includes a pull policy of the mirror image to be cached, and the deployment orchestration is specifically configured to instruct the target node to pull the cache mirror image from the mirror image repository for caching according to the pull policy.
By way of example, the electronic device includes, but is not limited to, a processor 701, an input device 702, an output device 703 and a memory 704, and may further include a power supply, an application client module and the like. The input device 702 may be a keyboard, touch screen, radio frequency receiver, etc., and the output device 703 may be a speaker, display, radio frequency transmitter, etc. It will be appreciated by those skilled in the art that the schematic diagram is merely an example of an electronic device and does not limit the electronic device, which may include more or fewer components than those shown, combine some components, or have different components.
It should be noted that, since the steps in the image caching method are implemented when the processor 701 of the electronic device executes the computer program, the embodiments of the image caching method are all applicable to the electronic device, and all can achieve the same or similar beneficial effects.
An embodiment of the present application further provides a computer storage medium (Memory), which is a Memory device in an electronic device and is used to store programs and data. It is understood that the computer storage medium herein may include a built-in storage medium in the terminal, and may also include an extended storage medium supported by the terminal. The computer storage medium provides a storage space that stores an operating system of the terminal. Also stored in this memory space are one or more instructions, which may be one or more computer programs (including program code), suitable for loading and execution by processor 701. The computer storage medium may be a high-speed RAM memory, or may be a non-volatile memory (non-volatile memory), such as at least one disk memory; alternatively, it may be at least one computer storage medium located remotely from the processor 701. In one embodiment, one or more instructions stored in a computer storage medium may be loaded and executed by processor 701 to perform the corresponding steps described above with respect to the mirror caching method.
Illustratively, the computer program of the computer storage medium includes computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like.
It should be noted that, since the computer program of the computer storage medium implements the steps of the mirror image caching method when executed by the processor, all the embodiments of the mirror image caching method are applicable to the computer storage medium and can achieve the same or similar beneficial effects.
The foregoing detailed description of the embodiments of the present application illustrates the principles and implementations of the present application; the above description of the embodiments is provided only to help understand the method and the core concept of the present application. Meanwhile, a person skilled in the art may, according to the idea of the present application, make variations to the specific embodiments and the application scope. In summary, the content of this specification should not be construed as limiting the present application.
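To tie the foregoing together, the following minimal sketch drives a caching task to a success or failure state under a preset retry limit, corresponding to the failure handling recited in claims 2 and 3 below; applyOrchestration, nodePulledImage and the retry limit of 3 are hypothetical placeholders for the platform's real interfaces and configuration.

package main

import (
	"errors"
	"fmt"
)

const presetRetries = 3 // the "preset number of times"; the value is an assumption

type CacheTask struct {
	Image      string
	TargetNode string
	Failures   int
	State      string // "pending" -> "success" or "failure"
}

// applyOrchestration and nodePulledImage stand in for the container
// orchestration platform's real APIs; both are hypothetical.
func applyOrchestration(t *CacheTask) error { return nil }
func nodePulledImage(t *CacheTask) bool     { return true }

// runCacheTask generates the deployment orchestration for the task and updates
// the task state: success once the target node has pulled the image, failure
// once the number of unsuccessful pulls reaches the preset count.
func runCacheTask(t *CacheTask) error {
	for {
		if err := applyOrchestration(t); err != nil {
			return err
		}
		if nodePulledImage(t) {
			t.State = "success"
			return nil
		}
		t.Failures++
		if t.Failures >= presetRetries {
			t.State = "failure"
			return errors.New("image pull failed after the preset number of attempts")
		}
	}
}

func main() {
	task := &CacheTask{Image: "registry.example.com/app:v1", TargetNode: "node-1", State: "pending"}
	if err := runCacheTask(task); err != nil {
		fmt.Println("caching task failed:", err)
	}
	fmt.Printf("final task state: %+v\n", task)
}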

Claims (10)

1. A mirror image caching method, applied to a container orchestration platform, the method comprising the following steps:
acquiring a mirror image caching task, wherein the mirror image caching task comprises task configuration information of a mirror image to be cached, and the task configuration information comprises a target node for caching the mirror image to be cached;
generating a deployment orchestration of the mirror image to be cached according to the task configuration information, wherein the deployment orchestration is used for instructing the target node to pull the mirror image to be cached from a mirror image repository for caching;
and under the condition that the target node successfully pulls the mirror image to be cached, updating the state of the mirror image caching task to successful.
2. The method of claim 1, further comprising:
and under the condition that the target node does not successfully pull the mirror image to be cached, updating the state of the mirror image caching task to failed.
3. The method according to claim 2, wherein the updating the state of the mirror image caching task to failed in the case that the target node does not successfully pull the mirror image to be cached comprises:
under the condition that the target node does not successfully pull the mirror image to be cached, re-executing the operations of acquiring the mirror image caching task and generating the deployment orchestration of the mirror image to be cached according to the task configuration information;
if it is detected that the target node does not successfully pull the mirror image to be cached, incrementing the number of unsuccessful pulls of the mirror image caching task by one;
if the number of unsuccessful pulls of the mirror image caching task has not reached a preset number, continuing to execute the operations of acquiring the mirror image caching task and generating the deployment orchestration of the mirror image to be cached according to the task configuration information, until the number of unsuccessful pulls of the mirror image caching task reaches the preset number;
and under the condition that the number of unsuccessful pulls of the mirror image caching task reaches the preset number, ending the mirror image caching task, and updating the state of the mirror image caching task to failed.
4. The method of claim 1, wherein before the acquiring of the mirror image caching task, the method further comprises:
receiving a cache request for the mirror image to be cached, generating the mirror image caching task according to the cache request, and adding the mirror image caching task to a mirror image caching task queue;
and/or,
and in response to the triggering of a timed task, generating the mirror image caching task according to the timed task, and adding the mirror image caching task to the mirror image caching task queue.
5. The method of claim 4, wherein the cache request comprises a first task list of the mirror image caching task, and wherein the generating the mirror image caching task according to the cache request comprises:
verifying the first task list;
under the condition that the first task list passes the verification, parsing the first task list to obtain the task configuration information;
and generating the mirror image caching task according to the task configuration information.
6. The method of claim 4, wherein the generating the mirror image caching task according to the timed task in response to the triggering of the timed task comprises:
in response to the triggering of the timed task, acquiring list information of the mirror image to be cached and list information of the target node;
integrating the list information of the mirror image to be cached and the list information of the target node into a second task list;
parsing the second task list to obtain the task configuration information;
and generating the mirror image caching task according to the task configuration information.
7. The method according to any one of claims 1 to 6, wherein the task configuration information further includes a pull policy of the mirror image to be cached, and the deployment orchestration is specifically configured to instruct the target node to pull the mirror image to be cached from the mirror image repository for caching according to the pull policy.
8. A mirror image caching device, applied to a container orchestration platform, the device comprising an acquiring unit and a processing unit; wherein:
the acquiring unit is used for acquiring a mirror image caching task, wherein the mirror image caching task comprises task configuration information of a mirror image to be cached, and the task configuration information comprises a target node for caching the mirror image to be cached;
the processing unit is configured to generate a deployment orchestration of the mirror image to be cached according to the task configuration information, wherein the deployment orchestration is used to instruct the target node to pull the mirror image to be cached from a mirror image repository for caching;
the processing unit is further configured to update the state of the mirror image caching task to successful when the target node successfully pulls the mirror image to be cached.
9. An electronic device comprising an input device and an output device, further comprising:
a processor adapted to implement one or more computer programs; and,
a memory storing one or more computer programs adapted to be loaded by the processor to perform the method according to any one of claims 1 to 7.
10. A computer storage medium having stored thereon one or more instructions adapted to be loaded by a processor and to perform the method of any of claims 1-7.
CN202111646100.8A 2021-12-30 2021-12-30 Mirror image caching method and device, electronic equipment and storage medium Pending CN114489937A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111646100.8A CN114489937A (en) 2021-12-30 2021-12-30 Mirror image caching method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111646100.8A CN114489937A (en) 2021-12-30 2021-12-30 Mirror image caching method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114489937A true CN114489937A (en) 2022-05-13

Family

ID=81508085

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111646100.8A Pending CN114489937A (en) 2021-12-30 2021-12-30 Mirror image caching method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114489937A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination