CN115380269A - Image pulling method and related product - Google Patents

Image pulling method and related product

Info

Publication number
CN115380269A
CN115380269A (application number CN202080099553.0A)
Authority
CN
China
Prior art keywords
image
cache
target
preset
pulling
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202080099553.0A
Other languages
Chinese (zh)
Inventor
徐进 (Xu Jin)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Shenzhen Huantai Technology Co Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Shenzhen Huantai Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd and Shenzhen Huantai Technology Co Ltd
Publication of CN115380269A
Legal status: Pending

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 — Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 — Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/27 — Replication, distribution or synchronisation of data between databases or within a distributed database system; Distributed database system architectures therefor
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00 — Arrangements for software engineering
    • G06F 8/60 — Software deployment
    • G06F 8/61 — Installation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

An image pulling method and related products are applied to a cache proxy repository. The method includes: receiving an image pull request initiated by a target cluster, where the cache proxy repository belongs to an image distribution system that further includes a central image repository and a message center, and the cache proxy repository is located in a target data center that also contains the target cluster (101); detecting whether the image pull request hits a preset cache (102); when the image pull request hits the preset cache, pulling the requested content from the preset cache (103); and when the image pull request misses the preset cache, pulling the requested content from the central image repository and storing it in the preset cache (104). With this method, images can be pulled according to how application images are actually used in the data center.

Description

Image pulling method and related product

Technical Field
The present application relates to the field of computers, and in particular to an image pulling method and related products.
Background
Harbor can synchronize images between different data centers and different operating environments through its policy-based Docker image replication feature, provides a friendly management interface, and greatly simplifies image management in day-to-day operations. However, Harbor's image replication feature cannot sense how application images are used within a data center, which causes meaningless image copies and wastes network bandwidth and disk space.
Disclosure of Invention
Embodiments of the present application provide an image pulling method and related products that can sense how application images are used in a data center, pull images according to that usage, and thereby save network bandwidth and disk space.
In a first aspect, an embodiment of the present application provides an image pulling method, applied to a cache proxy repository, including:
receiving an image pull request initiated by a target cluster, where the cache proxy repository belongs to an image distribution system that further includes a central image repository and a message center, and the cache proxy repository is located in a target data center that also contains the target cluster;
detecting whether the image pull request hits a preset cache;
when the image pull request hits the preset cache, pulling the requested content from the preset cache;
and when the image pull request misses the preset cache, pulling the requested content from the central image repository and storing it in the preset cache.
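The four steps of the first aspect amount to a pull-through cache. A minimal sketch, with illustrative names (`CacheProxyRepo` and plain dicts standing in for the repositories; none of these identifiers come from the patent):

```python
class CacheProxyRepo:
    """Serves image pulls for clusters in the same data center."""

    def __init__(self, central_repo):
        self.central_repo = central_repo  # stand-in: image name -> content
        self.cache = {}                   # the "preset cache"

    def pull(self, image_name):
        # Step 102: check whether the request hits the preset cache.
        if image_name in self.cache:
            # Step 103: on a hit, serve directly from the cache.
            return self.cache[image_name]
        # Step 104: on a miss, fetch from the central repository
        # and store the content for later requests.
        content = self.central_repo[image_name]
        self.cache[image_name] = content
        return content
```

A second pull of the same image is then served locally, which is the bandwidth saving the method aims for.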
In a second aspect, an embodiment of the present application provides an image pulling apparatus, applied to a cache proxy repository, where the apparatus includes a receiving unit, a detection unit, and an image pulling unit, where:
the receiving unit is configured to receive an image pull request initiated by a target cluster, where the cache proxy repository belongs to an image distribution system that further includes a central image repository and a message center, and the cache proxy repository is located in a target data center that also contains the target cluster;
the detection unit is configured to detect whether the image pull request hits a preset cache;
the image pulling unit is configured to pull, when the image pull request hits the preset cache, the requested content from the preset cache;
the image pulling unit is further configured to pull, when the image pull request misses the preset cache, the requested content from the central image repository and store it in the preset cache.
In a third aspect, an embodiment of the present application provides a server including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the programs include instructions for performing the steps of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing a computer program for electronic data exchange, where the computer program causes a computer to perform some or all of the steps described in the first aspect.
In a fifth aspect, an embodiment of the present application provides a computer program product comprising a non-transitory computer-readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps described in the first aspect. The computer program product may be a software installation package.
Drawings
To illustrate the embodiments of the present application or the prior-art solutions more clearly, the drawings needed for describing them are briefly introduced below. The drawings in the following description show only some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1A is a schematic structural diagram of a server according to an embodiment of the present application;
Fig. 1B is a schematic diagram of an architecture implementing an image pulling method according to an embodiment of the present application;
Fig. 1C is a schematic flowchart of an image pulling method disclosed in an embodiment of the present application;
Fig. 2 is a schematic flowchart of another image pulling method disclosed in an embodiment of the present application;
Fig. 3 is a schematic structural diagram of another server disclosed in an embodiment of the present application;
Fig. 4A is a schematic structural diagram of an image pulling apparatus disclosed in an embodiment of the present application;
Fig. 4B is a schematic structural diagram of another image pulling apparatus disclosed in an embodiment of the present application.
Detailed Description
To help those skilled in the art better understand the technical solutions of the present application, the technical solutions in the embodiments are described below clearly and completely with reference to the drawings. The described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art without creative effort based on the embodiments given here shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The cache proxy repository in the embodiments of the present application may itself be a server or may be deployed in a server.
The following describes embodiments of the present application in detail.
Referring to fig. 1A, fig. 1A is a schematic structural diagram of a server according to an embodiment of the present disclosure, and the server 100 may include a control circuit, which may include a storage and processing circuit 110. The storage and processing circuitry 110 may be memory, such as hard disk drive memory, non-volatile memory (e.g., flash memory or other electronically programmable read-only memory used to form a solid state drive, etc.), volatile memory (e.g., static or dynamic random access memory, etc.), etc., and embodiments of the present application are not limited thereto. Processing circuitry in storage and processing circuitry 110 may be used to control the operation of server 100. The processing circuit may be implemented based on one or more microprocessors, microcontrollers, baseband processors, power management units, audio codec chips, application specific integrated circuits, display driver integrated circuits, and the like.
The storage and processing circuitry 110 may be used to run software in the server 100, such as an internet browsing application, a Voice Over Internet Protocol (VOIP) phone call application, an email application, a media playing application, operating system functions, and so forth. Such software may be used to perform control operations such as camera-based image capture, ambient light measurement based on an ambient light sensor, proximity sensor measurement based on a proximity sensor, information display functionality based on status indicators such as status indicator lights of light emitting diodes, touch event detection based on a touch sensor, functionality associated with displaying information on multiple (e.g., layered) displays, operations associated with performing wireless communication functions, operations associated with collecting and generating audio signals, control operations associated with collecting and processing button press event data, and other functions in the server 100, for example, and without limitation.
The server 100 may also include input-output circuitry 150, which allows the server 100 to receive data from external devices and to output data to them. The input-output circuit 150 may further include a sensor 170. The sensors 170 may include ambient light sensors, light- and capacitance-based proximity sensors, touch sensors (for example optical and/or capacitive touch sensors, which may be part of a touch display screen or used independently as a touch sensor structure), acceleration sensors, gravity sensors, and other sensors.
Input-output circuitry 150 may also include one or more displays, such as display 130. Display 130 may include one or a combination of liquid crystal displays, organic light emitting diode displays, electronic ink displays, plasma displays, displays using other display technologies. Display 130 may include an array of touch sensors (i.e., display 130 may be a touch display screen). The touch sensor may be a capacitive touch sensor formed by a transparent touch sensor electrode (e.g., an Indium Tin Oxide (ITO) electrode) array, or may be a touch sensor formed using other touch technologies, such as acoustic wave touch, pressure sensitive touch, resistive touch, optical touch, and the like, and the embodiments of the present application are not limited thereto.
The audio component 140 may be used to provide audio input and output functionality for the server 100. The audio components 140 in the server 100 may include speakers, microphones, buzzers, tone generators, and other components for generating and detecting sound.
The communication circuit 120 may be used to provide the server 100 with the ability to communicate with external devices. The communication circuit 120 may include analog and digital input-output interface circuits, and wireless communication circuits based on radio frequency signals and/or optical signals. The wireless communication circuitry in communication circuitry 120 may include radio-frequency transceiver circuitry, power amplifier circuitry, low noise amplifiers, switches, filters, and antennas. For example, the wireless communication circuitry in communication circuitry 120 may include circuitry to support Near Field Communication (NFC) by transmitting and receiving near field coupled electromagnetic signals. For example, the communication circuit 120 may include a near field communication antenna and a near field communication transceiver. The communications circuitry 120 may also include a cellular telephone transceiver and antenna, a wireless local area network transceiver circuitry and antenna, and so forth.
The server 100 may further include a battery, power management circuitry, and other input-output units 160. The input-output unit 160 may include buttons, joysticks, click wheels, scroll wheels, touch pads, keypads, keyboards, cameras, light emitting diodes and other status indicators, and the like.
A user may input commands through the input-output circuitry 150 to control the operation of the server 100, and may use its output data to receive status information and other outputs from the server 100.
On this basis, please refer to fig. 1B, which shows a system architecture for implementing the method of the embodiments of the present application. The method may be applied to a cache proxy repository, which may be deployed in a server. The cache proxy repository belongs to an image distribution system that may include a central image repository, a message center, and N data centers, where N is a positive integer and each data center hosts one cache proxy repository. The central image repository stores all application images; when a client pushes an application image to it, the repository notifies the message center of the image, whose name contains the application name. The message center stores image push records and exposes a real-time monitoring interface through which clients can receive new-image push events as they happen. The cache proxy repository serves image clients in the same data center and caches application images pulled on demand. It also supports an application-aware image warming mechanism (image warming module) and an image eviction mechanism (image eviction module), and may further include a registry module. The cache proxy repository, the central image repository, and the message center may be located on different servers or platforms, and both repositories may support Docker image replication.
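The push-and-notify interaction between the central repository and the message center can be sketched as follows; the class and method names are assumptions, since the patent does not specify an API:

```python
class MessageCenter:
    """Stores image push records and notifies real-time subscribers."""

    def __init__(self):
        self.records = []       # stored push records
        self.subscribers = []   # callbacks registered via the monitoring interface

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, image_name):
        # Record the push, then fan it out to every live subscriber.
        self.records.append(image_name)
        for cb in self.subscribers:
            cb(image_name)


class CentralRepo:
    """Holds all application images; notifies the message center on every push."""

    def __init__(self, message_center):
        self.images = {}
        self.message_center = message_center

    def push(self, image_name, content):
        self.images[image_name] = content
        self.message_center.publish(image_name)
```

A warming module in each data center would register itself with `subscribe` so that every push to the central repository reaches it in real time.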
Based on the system framework shown in fig. 1B, the following method may be implemented; it applies to any cache proxy repository:
receiving an image pull request initiated by a target cluster, where the cache proxy repository belongs to an image distribution system that further includes a central image repository and a message center, and the cache proxy repository is located in a target data center that also contains the target cluster;
detecting whether the image pull request hits a preset cache;
when the image pull request hits the preset cache, pulling the requested content from the preset cache;
and when the image pull request misses the preset cache, pulling the requested content from the central image repository and storing it in the preset cache.
It can be seen that in the image pulling method described above, the cache proxy repository receives an image pull request initiated by a target cluster, detects whether the request hits a preset cache, serves the requested content from the preset cache on a hit, and on a miss pulls the content from the central image repository and stores it in the preset cache. The method can therefore sense how application images are used in the data center and pull images accordingly, saving network bandwidth and disk space.
Referring to fig. 1C, fig. 1C is a schematic flowchart of an image pulling method according to an embodiment of the present application. The method described in this embodiment is applied to the server shown in fig. 1A or the system architecture shown in fig. 1B, and includes:
101. Receiving an image pull request initiated by a target cluster, where the cache proxy repository belongs to an image distribution system that further includes a central image repository and a message center, and the cache proxy repository is located in a target data center that also contains the target cluster.
The target cluster in this embodiment may be any type of cluster, for example a Kubernetes (K8S) cluster. The image distribution system may include a central image repository, a message center, and a plurality of data centers, each of which may contain a cache proxy repository and a cluster (e.g., a K8S cluster). The cache proxy repository and the central image repository may both be Harbor registries.
In a specific implementation, the cache proxy repository and the target cluster may be located in the same server or in different servers. The cache proxy repository may receive an image pull request initiated by the target cluster.
102. Detecting whether the image pull request hits a preset cache.
The preset cache may be a cache list or a cache area; for example, it may reside in the cache proxy repository or on a local disk (the cache disk). The cache proxy repository detects whether the image pull request hits the preset cache, which may be configured by a user or defaulted by the system.
In one possible example, the cache proxy repository includes an image warming module, and detecting in step 102 whether the image pull request hits the preset cache may include the following steps:
21. monitoring image push events of the message center through the image warming module, where each image push event includes an image name;
22. parsing the image name to obtain a target application name;
23. detecting whether the application corresponding to the target application name is deployed in the target cluster of the target data center;
24. when the application corresponding to the target application name is deployed in the cluster of the target data center, confirming that the image pull request hits the preset cache;
25. when the application corresponding to the target application name is not deployed in the cluster of the target data center, confirming that the image pull request misses the preset cache.
The cache proxy repository may include an image warming module whose main purpose is to monitor image push events of the message center.
In a specific implementation, the cache proxy repository monitors image push events of the message center through the image warming module. Each event carries an image name, which is parsed to obtain a target application name; the repository then checks whether the corresponding application is deployed in the target cluster of the target data center. If it is, the image pull request is deemed to hit the preset cache; otherwise the request is deemed to miss it.
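A minimal sketch of this hit/miss decision (steps 21-25), assuming the image name embeds the application name as `<app>:<tag>`; the patent does not fix a naming scheme, so the parsing rule is an assumption:

```python
def parse_app_name(image_name):
    # Step 22: extract the application name from the image name,
    # assuming the "<app>:<tag>" convention.
    return image_name.split(":", 1)[0]


def on_push_event(image_name, deployed_apps):
    """Steps 23-25: treat the image as cache-worthy (a hit) only if its
    application is deployed in this data center's target cluster."""
    app = parse_app_name(image_name)
    return app in deployed_apps  # True -> hit / warm the cache; False -> miss
```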
103. When the image pull request hits the preset cache, pulling the requested content from the preset cache.
In a specific implementation, when the image pull request hits the preset cache, the cache proxy repository pulls the requested content from the preset cache.
104. When the image pull request misses the preset cache, pulling the requested content from the central image repository and storing it in the preset cache.
In a specific implementation, when the image pull request misses the preset cache, the cache proxy repository pulls the requested content from the central image repository and stores it in the preset cache.
In the embodiments of the present application, image synchronization across multiple data centers can be achieved through Harbor registries: Harbor supports image replication and can actively copy images to the Harbor instances of other data centers.
In a possible example, the cache proxy repository further includes an image eviction module, and before or after any of steps 101 to 104, the method further includes:
A1. acquiring a first cache disk usage;
A2. executing an image cleaning task when the first cache disk usage is greater than a preset threshold.
In a specific implementation, the cache proxy repository may obtain the first cache disk usage of the cache disk. When it is greater than the preset threshold, the disk usage is too high and cleaning is needed, so the image cleaning task is executed; otherwise, when the first cache disk usage is at or below the preset threshold, the disk cache space is sufficient and the image cleaning task can end.
In one possible example, executing the image cleaning task in step A2 may include:
A21. detecting, for each image i in the preset cache, whether an application associated with image i is deployed in the target cluster, where image i is any image in the preset cache;
A22. if so, retaining image i; if not, deleting image i.
In a specific implementation, taking image i (any image in the preset cache) as an example, the cache proxy repository detects whether an application associated with image i is deployed in the target cluster. If so, image i is retained; if not, it is deleted, which reduces disk usage.
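Steps A21-A22 can be sketched as follows, assuming a mapping from each cached image to its associated application (the form of that mapping is an assumption; the patent only says the association exists):

```python
def clean_cache(cache, image_to_app, deployed_apps):
    """Step A21-A22: keep image i if its application is deployed in the
    target cluster; otherwise delete it from the preset cache."""
    for image in list(cache):  # copy keys so we can delete while iterating
        if image_to_app.get(image) not in deployed_apps:
            del cache[image]
    return cache
```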
Further, in a possible embodiment, after step A2 the method may further include:
A3. acquiring a second cache disk usage;
A4. ending the image cleaning task when the second cache disk usage is less than or equal to the preset threshold.
After executing the image cleaning task, the cache proxy repository may obtain the second cache disk usage of the cache disk. When it is at or below the preset threshold, the disk cache space is sufficient and the image cleaning task can end.
Further, in a possible example, after step A3 the method may further include:
A4. acquiring an application list when the second cache disk usage is greater than the preset threshold;
A5. retaining the images currently in use in the application list;
A6. among the images not in use in the application list, deleting those whose version is lower than a preset version.
In a specific implementation, the preset version may be set by the user or defaulted by the system. When the second cache disk usage is greater than the preset threshold, the cache proxy repository obtains the application list, retains the images currently in use, and deletes the unused images whose version is lower than the preset version, so that unused images of sufficiently recent versions remain available to users.
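One possible reading of steps A4-A6, with a simplistic `v<N>` tag comparison; the patent leaves both the version format and the preset version unspecified, so both are assumptions here:

```python
def version_number(tag):
    # Assumed "v<N>" tag format, e.g. "v3" -> 3.
    return int(tag.lstrip("v"))


def evict_old_unused(cache, in_use_images, preset_version):
    """Step A5: retain every image currently used in the application list.
    Step A6: among unused images, delete only those below the preset version."""
    for image in list(cache):
        if image in in_use_images:
            continue  # in use -> always retained
        tag = image.split(":", 1)[1]
        if version_number(tag) < preset_version:
            del cache[image]
    return cache
```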
Further, in a possible example, after step A6 the method may further include:
A7. acquiring a third cache disk usage;
A8. deleting all unused images in the application list when the third cache disk usage is greater than the preset threshold.
In a specific implementation, the cache proxy repository may obtain the third cache disk usage of the cache disk and, when it is greater than the preset threshold, delete all unused images in the application list to free as much disk space as possible.
Further, in a possible example, after step A7 the method may further include:
A9. ending the image cleaning task when the third cache disk usage is less than or equal to the preset threshold.
In a specific implementation, when the third cache disk usage is at or below the preset threshold, the disk space is sufficient and the cache proxy repository can end the image cleaning task.
Further, in a possible example, after step A8 the method may further include:
A10. acquiring a fourth cache disk usage;
A11. triggering an alarm operation when the fourth cache disk usage is greater than the preset threshold.
In a specific implementation, the cache proxy repository may obtain the fourth cache disk usage of the cache disk and trigger an alarm operation when it is greater than the preset threshold; otherwise, when the fourth cache disk usage is at or below the preset threshold, the disk space is sufficient and the image cleaning task can end.
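Taken together, steps A1-A11 describe an escalating cleanup that re-checks disk usage after each stage and raises an alarm as the last resort. A hedged sketch, where the usage sampler, the stage callables, and the threshold value are all stand-ins rather than values from the patent:

```python
THRESHOLD = 0.8  # the "preset threshold"; an assumed value


def run_cleanup(disk_usage, stages, alarm):
    """disk_usage: callable returning current usage in [0, 1].
    stages: ordered cleanup callables, mildest first (A2, A5-A6, A8).
    alarm: called if usage is still high after every stage (A11)."""
    if disk_usage() <= THRESHOLD:
        return "skipped"          # A2 condition not met: space is sufficient
    for stage in stages:
        stage()                   # run the next, stricter cleanup stage
        if disk_usage() <= THRESHOLD:
            return "done"         # A4 / A9: usage recovered, stop early
    alarm()                       # A11: still above threshold after all stages
    return "alarm"
```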
The embodiment of the application provides a cross-machine-room mirror image pulling method, a cache acceleration method, a mirror image preheating method and a mirror image elimination mechanism. Because network bandwidth between machine rooms is limited, pulling a mirror image is often slow, especially under high-concurrency pulls. Through an application-aware mirror image distribution system, a mirror image cache agent warehouse is deployed in each machine room; the mirror image client of each machine room pulls mirror images nearby from the agent warehouse of its own machine room, and on a cache miss the agent warehouse pulls the mirror image from the central warehouse on the client's behalf. Combined with the application-aware mirror image preheating mechanism, the problem of bandwidth-limited mirror image pulling across multiple machine rooms can be effectively alleviated, thereby effectively improving application publishing efficiency.
In a specific implementation, the cache agent warehouse can support pulling mirror images on demand and effectively share the load of the central warehouse, improving mirror image pulling efficiency. In addition, the application-aware mirror image preheating method can improve first-pull efficiency, and the application-aware mirror image elimination mechanism ensures a high cache hit rate while keeping the disk usage rate under control.
For example, the mirror image preheating module may monitor mirror image push events of the message center in real time, where a mirror image push event includes the name of a newly pushed mirror image. After such an event is detected, the application name may be obtained by parsing the mirror image name, and the module queries whether the application has a deployment in the K8S cluster of the local machine room; if so, the corresponding mirror image content is pulled into the cache in advance, otherwise the event is ignored.
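The warming flow above can be sketched as follows. The helper names (`parse_app_name`, `is_deployed_locally`, `pull_to_cache`), the event format, and the image-naming convention are assumptions for illustration, not part of the original design:

```python
def parse_app_name(image_name: str) -> str:
    """Derive an application name from an image name such as
    'registry.example.com/team/app-a:v2' (naming convention assumed)."""
    repo_path = image_name.split(":", 1)[0]   # drop the tag
    return repo_path.rsplit("/", 1)[-1]       # last path segment

def on_image_push(event, is_deployed_locally, pull_to_cache):
    """Handle one push event from the message center: warm the cache only
    if the application is deployed in the local machine room's cluster."""
    app = parse_app_name(event["image"])
    if is_deployed_locally(app):
        pull_to_cache(event["image"])   # pre-pull the image content
    # otherwise the event is ignored

warmed = []
on_image_push(
    {"image": "registry.example.com/team/app-a:v2"},
    is_deployed_locally=lambda app: app == "app-a",  # stubbed K8S query
    pull_to_cache=warmed.append,                     # stubbed registry pull
)
print(warmed)  # → ['registry.example.com/team/app-a:v2']
```

In a real deployment, `is_deployed_locally` would query the K8S API and `pull_to_cache` would fetch layers from the central registry; both are stubbed here.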
In addition, the mirror image elimination module may periodically check the cache disk usage rate, and when the usage rate reaches a preset threshold (e.g., 80%), may start a mirror image cleaning task to release disk space. In a first scanning round, each mirror image in the cache is checked for an associated application deployed in the K8S cluster of the local machine room; if one exists, the mirror image is retained, otherwise it is deleted. Whether the disk usage is below the preset threshold is then checked; if so, the cleaning task ends. Otherwise a second scanning round is performed: the list of all applications deployed in the local machine room is queried through the K8S API, and for each application at most a predetermined N (e.g., N = 3) of the latest mirror image versions are retained, with in-use mirror images retained preferentially, and mirror images of other versions are deleted. The disk usage is checked again, and if it is below the threshold the cleaning task ends. Otherwise a third scanning round deletes all mirror images not currently in use, and the disk usage is checked once more; if it is below the threshold the task ends, otherwise an alarm is triggered, after which manual intervention can be carried out.
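The three scanning rounds can be condensed into the following sketch. This is a minimal illustration under stated assumptions: a flat in-memory list of image records, a `disk_usage()` probe returning a usage ratio, and placeholder values for `N` and the threshold:

```python
from collections import defaultdict

N = 3            # assumed maximum of newest versions kept per application
THRESHOLD = 0.8  # assumed disk-usage threshold (80%)

def clean(cache, disk_usage):
    """Run the three scanning rounds; cache is mutated in place.

    Each image is a dict: {"app", "version", "in_use", "deployed"}.
    """
    # Round 1: delete images whose application is not deployed locally.
    cache[:] = [img for img in cache if img["deployed"]]
    if disk_usage() <= THRESHOLD:
        return "cleaned"
    # Round 2: per application, keep at most the N newest versions,
    # retaining in-use images preferentially.
    by_app = defaultdict(list)
    for img in cache:
        by_app[img["app"]].append(img)
    kept = []
    for imgs in by_app.values():
        imgs.sort(key=lambda i: (i["in_use"], i["version"]), reverse=True)
        kept.extend(imgs[:N])
    cache[:] = kept
    if disk_usage() <= THRESHOLD:
        return "cleaned"
    # Round 3: delete every image not currently in use.
    cache[:] = [img for img in cache if img["in_use"]]
    if disk_usage() <= THRESHOLD:
        return "cleaned"
    return "alarm"   # still above threshold: manual intervention needed

cache = [{"app": "a", "version": v, "in_use": v == 5, "deployed": True}
         for v in range(1, 6)]
result = clean(cache, lambda: len(cache) / 5)  # toy usage probe
print(result, sorted(i["version"] for i in cache))  # → cleaned [3, 4, 5]
```

In the toy run, round 2 trims five versions down to the three newest, bringing simulated usage to 60% and ending the task before round 3.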
In one possible example, between step 101 and step 102, the following steps may be further included:
b1, acquiring target physiological state parameters of a user;
b2, determining a target emotion type corresponding to the target physiological state parameter;
and B3, when the target emotion type is a preset emotion type, executing the step of detecting whether the mirror image pulling request hits a preset cache.
In this embodiment, the physiological status parameter may be various parameters for reflecting the physiological function of the user, and the physiological status parameter may be at least one of the following parameters: heart rate, blood pressure, blood temperature, blood lipid content, blood glucose content, thyroxine content, epinephrine content, platelet content, blood oxygen content, and the like, without limitation. The preset emotion type may be set by the user or by the system default. The preset emotion type may be at least one of: oppression, crying, calmness, violence, excitement, depression, and the like, without limitation.
In specific implementation, the caching agent warehouse may obtain the target physiological state parameter of the user through a wearable device that is in communication connection with the caching agent warehouse, different physiological state parameters reflect the emotion type of the user, a mapping relationship between the physiological state parameter and the emotion type may be stored in the caching agent warehouse in advance, and then the target emotion type corresponding to the target physiological state parameter may be determined according to the mapping relationship, and then step 102 may be executed when the target emotion type is the preset emotion type, otherwise step 102 may not be executed.
In a possible example, when the target physiological state parameter is a heart rate variation curve in a specified time period, the step B2 of determining the target emotion type corresponding to the target physiological state parameter may be implemented as follows:
b11, sampling the heart rate change curve to obtain a plurality of heart rate values;
b12, performing mean value operation according to the plurality of heart rate values to obtain an average heart rate value;
b13, determining a target heart rate grade corresponding to the average heart rate value;
b14, determining a target first emotion value corresponding to the target heart rate grade according to a mapping relation between a preset heart rate grade and the first emotion value;
b15, performing mean square error operation according to the plurality of heart rate values to obtain a target mean square error;
b16, determining a target second emotion value corresponding to the target mean square error according to a mapping relation between a preset mean square error and the second emotion value;
b17, determining a target weight pair corresponding to the target heart rate level according to a preset mapping relation between the heart rate level and the weight pair, wherein the weight pair comprises a first weight and a second weight, the first weight is a weight corresponding to the first emotion value, and the second weight is a weight corresponding to the second emotion value;
b18, carrying out weighted operation according to the target first emotion value, the target second emotion value and the target weight pair to obtain a final emotion value;
and B19, determining the target emotion type corresponding to the target emotion value according to a preset mapping relation between the emotion value and the emotion type.
The designated time period may be set by the user or defaulted by the system. The mapping relationships between the preset heart rate level and the first emotion value, between the preset mean square error and the second emotion value, between the preset heart rate level and the weight pair, and between the preset emotion value and the emotion type may all be stored in the cache agent warehouse in advance. The weight pair may include a first weight and a second weight, where the first weight corresponds to the first emotion value and the second weight corresponds to the second emotion value; the sum of the two weights may be 1, and each takes a value in the range 0-1. In the embodiment of the application, emotion can thus be evaluated through the heart rate variation curve.
In a specific implementation, the cache agent warehouse may sample the heart rate variation curve, either uniformly or randomly, to obtain a plurality of heart rate values, and perform a mean operation on them to obtain an average heart rate value. A mapping relationship between heart rate values and heart rate levels may be stored in the cache agent warehouse in advance, from which the target heart rate level corresponding to the average heart rate value is determined, and the target first emotion value corresponding to that level is determined according to the preset mapping relationship between the heart rate level and the first emotion value. A mean square error operation is then performed on the plurality of heart rate values to obtain a target mean square error, and the target second emotion value corresponding to it is determined according to the preset mapping relationship between the mean square error and the second emotion value.
Further, the cache agent warehouse may also determine a target weight pair corresponding to the target heart rate level according to the mapping relationship between the preset heart rate level and the weight pair, where the target weight pair may include a target first weight and a target second weight, the target first weight being the weight corresponding to the target first emotion value and the target second weight being the weight corresponding to the target second emotion value. The cache agent warehouse may then perform a weighted operation on the target first emotion value, the target second emotion value, the target first weight and the target second weight to obtain a final emotion value, with the specific calculation formula as follows:
final emotion value = target first emotion value × target first weight + target second emotion value × target second weight
Then, the target emotion type corresponding to the final emotion value is determined according to the preset mapping relationship between the emotion value and the emotion type. The average heart rate reflects the user's overall heart rate level, while the mean square error of the heart rate reflects its stability; by characterizing the user's emotion through these two dimensions, the emotion type can be determined accurately.
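The weighted combination in steps B11-B18 can be illustrated as follows. The mapping tables are toy placeholders supplied as functions; only the structure of the calculation (v1·w1 + v2·w2 with the weight pair chosen by heart rate level) follows the description:

```python
import statistics

def emotion_value(heart_rates, level_of, first_value_of, second_value_of, weights_of):
    """final value = v1 * w1 + v2 * w2, where the weight pair (w1, w2)
    depends on the heart rate level and sums to 1."""
    avg = statistics.mean(heart_rates)          # B12: average heart rate
    mse = statistics.pvariance(heart_rates)     # B15: mean square error
    level = level_of(avg)                       # B13: target heart rate level
    v1 = first_value_of(level)                  # B14: first emotion value
    v2 = second_value_of(mse)                   # B16: second emotion value
    w1, w2 = weights_of(level)                  # B17: level-dependent weights
    return v1 * w1 + v2 * w2                    # B18: weighted combination

# Toy mappings: one level boundary at 80 bpm; all values are placeholders.
score = emotion_value(
    [70, 72, 74],
    level_of=lambda avg: 0 if avg < 80 else 1,
    first_value_of=lambda lvl: [10, 40][lvl],
    second_value_of=lambda mse: min(mse, 50),
    weights_of=lambda lvl: [(0.7, 0.3), (0.4, 0.6)][lvl],
)
print(round(score, 3))
```

A final mapping from the resulting score to an emotion type (step B19) would be another lookup table, omitted here.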
It can be seen that the mirror image pulling method described in the foregoing embodiment of the present application is applied to a cache agent warehouse. The cache agent warehouse receives a mirror image pulling request initiated by a target cluster; the cache agent warehouse is located in a mirror image distribution system that further includes a central mirror image warehouse and a message center, and is located in a target machine room that also contains the target cluster. The cache agent warehouse detects whether the mirror image pulling request hits a preset cache; when it hits, the content to be pulled is pulled from the preset cache, and when it misses, the content is pulled from the central mirror image warehouse and stored in the preset cache. In this way, the usage of application mirror images in the machine room can be sensed and mirror image pulling carried out accordingly, thereby saving network bandwidth and disk space.
In accordance with the above description, referring to fig. 2, fig. 2 is a schematic flowchart of another image pull method provided in this embodiment, where the image pull method described in this embodiment is applied to the server shown in fig. 1A or the system architecture shown in fig. 1B, and the method may include the following steps:
201. Receiving a mirror image pulling request initiated by a target cluster, wherein the cache agent warehouse is located in a mirror image distribution system, the mirror image distribution system further comprises a central mirror image warehouse and a message center, the cache agent warehouse is located in a target machine room, and the target machine room further comprises the target cluster.
202. Detecting whether the mirror image pulling request hits a preset cache.
203. When the mirror image pulling request hits the preset cache, pulling the content required by the mirror image pulling request from the preset cache.
204. When the mirror image pulling request misses the preset cache, pulling the content required by the mirror image pulling request from the central mirror image warehouse and storing it in the preset cache.
205. Acquiring a first cache disk usage rate.
206. Executing a mirror image cleaning task when the first cache disk usage rate is greater than a preset threshold.
207. Acquiring a second cache disk usage rate.
208. Ending the mirror image cleaning task when the second cache disk usage rate is less than or equal to the preset threshold.
209. Acquiring an application list when the second cache disk usage rate is greater than the preset threshold.
210. Retaining the mirror images being used in the application list, and deleting those unused mirror images whose version is lower than the preset version.
211. Acquiring a third cache disk usage rate.
212. Deleting all mirror images in the application list that are not currently in use when the third cache disk usage rate is greater than the preset threshold.
213. Ending execution of the mirror image cleaning task when the third cache disk usage rate is less than or equal to the preset threshold.
214. Acquiring a fourth cache disk usage rate.
215. Triggering an alarm operation when the fourth cache disk usage rate is greater than the preset threshold.
The detailed description of the steps 201 to 215 may refer to the mirror image pulling method shown in fig. 1C, and is not described herein again.
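Steps 201 to 204, the hit/miss core of the flow, can be sketched schematically as follows; the class name and the in-memory dictionary standing in for the preset cache are illustrative simplifications:

```python
class CacheProxy:
    """Schematic cache proxy: serve hits locally, fetch misses centrally."""
    def __init__(self, fetch_from_central):
        self.fetch_from_central = fetch_from_central  # image name -> content
        self.cache = {}                               # the "preset cache"

    def pull(self, image):
        if image in self.cache:                   # step 203: cache hit
            return self.cache[image]
        content = self.fetch_from_central(image)  # step 204: cache miss
        self.cache[image] = content               # store for later pulls
        return content

central_calls = []
def central_repo(image):
    central_calls.append(image)
    return f"layers-of-{image}"

proxy = CacheProxy(central_repo)
proxy.pull("app-a:v1")   # miss: fetched from the central repository
proxy.pull("app-a:v1")   # hit: served from the local cache
print(central_calls)     # → ['app-a:v1']  (central contacted only once)
```

The point of the design is visible in the toy run: repeated pulls of the same image reach the central repository only once, which is how the proxy conserves cross-machine-room bandwidth.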
It can be seen that the mirror image pulling method described in the embodiment of the present application is applied to a cache agent warehouse. On the one hand, the usage of application mirror images in the machine room can be sensed and mirror image pulling carried out accordingly, thereby saving network bandwidth and disk space. On the other hand, through the application-aware mirror image distribution system, a mirror image cache agent warehouse is deployed in each machine room; the mirror image client of each machine room pulls mirror images nearby from the agent warehouse of its own machine room, and on a cache miss the agent warehouse fetches the mirror image from the central warehouse on the client's behalf. Combined with the application-aware mirror image preheating mechanism, the problem of bandwidth-limited mirror image pulling across multiple machine rooms can be effectively alleviated, thereby effectively improving the application publishing efficiency.
The following is a device for implementing the mirror image pulling method, specifically as follows:
in accordance with the above, please refer to fig. 3, in which fig. 3 is a server according to an embodiment of the present application, including: a processor and a memory; and one or more programs stored in the memory and configured to be executed by the processor, the caching proxy repository being located at a server, the programs including instructions for:
receiving a mirror image pulling request, wherein the mirror image pulling request is initiated by a target cluster, the cache agent warehouse is positioned in a mirror image distribution system, the mirror image distribution system further comprises a central mirror image warehouse and a message center, the cache agent warehouse is positioned in a target machine room, and the target machine room further comprises the target cluster;
detecting whether the mirror image pulling request hits a preset cache or not;
when the mirror image pulling request hits a preset cache, pulling the content required to be pulled by the mirror image pulling request from the preset cache;
and when the mirror image pulling request does not hit the preset cache, pulling the content required to be pulled by the mirror image pulling request from the central mirror image warehouse, and storing the content in the preset cache.
It can be seen that the server described in the above embodiment of the present application includes a cache agent warehouse. The cache agent warehouse receives a mirror image pulling request initiated by a target cluster; the cache agent warehouse is located in a mirror image distribution system that further includes a central mirror image warehouse and a message center, and is located in a target machine room that also contains the target cluster. The cache agent warehouse detects whether the mirror image pulling request hits a preset cache; when it hits, the content to be pulled is pulled from the preset cache, and when it misses, the content is pulled from the central mirror image warehouse and stored in the preset cache. In this way, the usage of application mirror images in the machine room can be sensed and mirror image pulling carried out accordingly, thereby saving network bandwidth and disk space.
In one possible example, the caching agent repository includes a mirror warming module; in the detecting whether the mirror pull request hits in a preset cache, the program includes instructions for:
monitoring a mirror image pushing event of the message center through the mirror image preheating module, wherein the mirror image pushing event comprises a mirror image name;
analyzing the mirror image name to obtain a target application name;
detecting whether the application corresponding to the target application name is deployed in a target cluster of the target computer room;
when the application corresponding to the target application name is deployed in the cluster of the target machine room, confirming that the mirror image pulling request hits the preset cache;
and when the application corresponding to the target application name is not deployed in the cluster of the target computer room, confirming that the mirror image pulling request does not hit the preset cache.
In one possible example, the caching agent repository further comprises a mirror elimination module, the program further comprising instructions for:
acquiring the utilization rate of a first cache disk;
and executing a mirror image cleaning task when the utilization rate of the first cache disk is greater than a preset threshold value.
In one possible example, in said performing the image cleaning task, the program includes instructions for performing the steps of:
detecting whether a mirror image i in the preset cache deploys an application associated with the mirror image i in the target cluster, wherein the mirror image i is any mirror image in the preset cache;
if yes, the mirror image i is reserved, and if not, the mirror image i is deleted.
In one possible example, the program further includes instructions for performing the steps of:
acquiring the utilization rate of a second cache disk;
and when the utilization rate of the second cache disk is less than or equal to the preset threshold value, ending the mirror image cleaning task.
In one possible example, the program further comprises instructions for performing the steps of:
when the utilization rate of the second cache disk is greater than the preset threshold value, acquiring an application list;
reserving an image being used in the application list;
and deleting the images of which the versions are lower than the preset version in the images which are not used in the application list.
In one possible example, the program further comprises instructions for performing the steps of:
acquiring the utilization rate of a third cache disk;
and when the utilization rate of the third cache disk is greater than the preset threshold value, deleting all the mirror images which are not used currently in the application list.
In one possible example, the program further comprises instructions for performing the steps of:
and when the utilization rate of the third cache disk is less than or equal to the preset threshold value, ending the execution of the mirror image cleaning task.
In one possible example, the program further comprises instructions for performing the steps of:
acquiring the utilization rate of a fourth cache disk;
and triggering an alarm operation when the utilization rate of the fourth cache disk is greater than the preset threshold value.
Referring to fig. 4A, fig. 4A is a schematic structural diagram of a mirror image pulling apparatus provided in this embodiment. The mirror image pulling device is applied to a server shown in fig. 1A or a system architecture shown in fig. 1B, and is applied to a cache agent repository, and the mirror image pulling device comprises: a receiving unit 401, a detecting unit 402, and a mirror pulling unit 403, wherein,
the receiving unit 401 is configured to receive a mirror image pull request, where the mirror image pull request is initiated by a target cluster, the cache agent warehouse is located in a mirror image distribution system, the mirror image distribution system further includes a central mirror image warehouse and a message center, the cache agent warehouse is located in a target machine room, and the target machine room further includes the target cluster;
the detecting unit 402 is configured to detect whether the mirror image pull request hits a preset cache;
the mirror image pulling unit 403 is configured to pull, when the mirror image pulling request hits a preset cache, a content that needs to be pulled by the mirror image pulling request from the preset cache;
the mirror image pulling unit 403 is further configured to pull, from the central mirror image warehouse, the content that needs to be pulled by the mirror image pulling request when the mirror image pulling request misses the preset cache, and store the content in the preset cache.
It can be seen that the mirror image pulling apparatus described in the embodiment of the present application is applied to a cache agent warehouse. The apparatus receives a mirror image pulling request initiated by a target cluster; the cache agent warehouse is located in a mirror image distribution system that further includes a central mirror image warehouse and a message center, and is located in a target machine room that also contains the target cluster. The apparatus detects whether the mirror image pulling request hits a preset cache; when it hits, the content to be pulled is pulled from the preset cache, and when it misses, the content is pulled from the central mirror image warehouse and stored in the preset cache. In this way, the usage of application mirror images in the machine room can be sensed and mirror image pulling carried out accordingly, thereby saving network bandwidth and disk space.
In one possible example, the caching agent repository includes a mirror warming module; in the aspect of detecting whether the mirror image pull request hits in a preset cache, the detecting unit 402 is specifically configured to:
monitoring a mirror image pushing event of the message center through the mirror image preheating module, wherein the mirror image pushing event comprises a mirror image name;
analyzing the mirror image name to obtain a target application name;
detecting whether the application corresponding to the target application name is deployed in a target cluster of the target computer room;
when the application corresponding to the target application name is deployed in the cluster of the target machine room, confirming that the mirror image pulling request hits the preset cache;
and when the application corresponding to the target application name is not deployed in the cluster of the target computer room, confirming that the mirror image pulling request does not hit the preset cache.
In one possible example, the caching agent repository further includes a mirror elimination module. As shown in fig. 4B, fig. 4B is a further variant of the mirror pulling apparatus shown in fig. 4A; compared with fig. 4A, it may further include: an acquisition unit 404 and a cleaning unit 405, wherein,
the obtaining unit 404 is configured to obtain a first cache disk usage rate;
the cleaning unit 405 is configured to execute a mirror image cleaning task when the first cache disk usage rate is greater than a preset threshold.
In a possible example, in terms of performing the image cleaning task, the cleaning unit 405 is specifically configured to:
detecting whether a mirror image i in the preset cache deploys an application associated with the mirror image i in the target cluster, wherein the mirror image i is any mirror image in the preset cache;
if yes, the mirror image i is reserved, and if not, the mirror image i is deleted.
In one possible example, among others,
the obtaining unit 404 is further configured to obtain a second cache disk usage rate;
the cleaning unit 405 is further configured to end the mirror image cleaning task when the usage rate of the second cache disk is less than or equal to the preset threshold.
In one possible example, among others,
the obtaining unit 404 is further configured to obtain an application list when the usage rate of the second cache disk is greater than the preset threshold;
the cleaning unit 405 is further configured to retain the image in the application list that is being used; and deleting the images of which the versions are lower than the preset version in the images which are not used in the application list.
In one possible example, among others,
the obtaining unit 404 is further configured to obtain a third cache disk usage rate;
the cleaning unit 405 is further configured to delete all the images that are not being used in the application list when the usage rate of the third cache disk is greater than the preset threshold.
In one possible example, among others,
the cleaning unit 405 is further configured to end execution of the mirror image cleaning task when the usage rate of the third cache disk is less than or equal to the preset threshold.
In one possible example, among others,
the obtaining unit 404 is further configured to obtain a fourth cache disk usage rate;
the cleaning unit 405 is further configured to trigger an alarm operation when the usage rate of the fourth cache disk is greater than the preset threshold.
It can be understood that the functions of each program module of the mirror image pull device in this embodiment may be specifically implemented according to the method in the foregoing method embodiment, and the specific implementation process may refer to the relevant description of the foregoing method embodiment, which is not described herein again.
Embodiments of the present application also provide a computer storage medium, wherein the computer storage medium stores a computer program for electronic data exchange, and the computer program enables a computer to execute part or all of the steps of any one of the image pulling methods described in the above method embodiments.
Embodiments of the present application also provide a computer program product, which includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to execute part or all of the steps of any one of the image pull methods as described in the above method embodiments.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art will recognize that the embodiments described in this specification are preferred embodiments and that acts or modules referred to are not necessarily required for this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to the related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the division into units is only a division by logical function, and other divisions are possible in an actual implementation; for instance, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. Furthermore, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, devices, or units, and may be electrical or take other forms.
The units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiments.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software program module.
If the integrated unit is implemented in the form of a software program module and sold or used as a standalone product, it may be stored in a computer-readable memory. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a memory and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of the present application. The aforementioned memory includes any medium that can store program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disc.
Those skilled in the art will appreciate that all or some of the steps in the methods of the above embodiments may be implemented by a program instructing associated hardware. The program may be stored in a computer-readable memory, which may include a flash disk, a ROM, a RAM, a magnetic disk, an optical disc, and the like.
The embodiments of the present application have been described in detail above, and the principles and implementations of the present application are illustrated herein with specific examples; the above description of the embodiments is only provided to help understand the method and the core concept of the present application. Meanwhile, a person skilled in the art may, according to the idea of the present application, make changes to the specific implementations and the application scope. In summary, the content of this specification should not be construed as a limitation of the present application.

Claims (20)

  1. A mirror image pulling method, applied to a cache agent warehouse, the method comprising:
    receiving a mirror image pulling request, wherein the mirror image pulling request is initiated by a target cluster, the cache agent warehouse is located in a mirror image distribution system, the mirror image distribution system further comprises a central mirror image warehouse and a message center, the cache agent warehouse is located in a target machine room, and the target machine room further comprises the target cluster;
    detecting whether the mirror image pulling request hits a preset cache;
    when the mirror image pulling request hits the preset cache, pulling the content required by the mirror image pulling request from the preset cache;
    and when the mirror image pulling request does not hit the preset cache, pulling the content required by the mirror image pulling request from the central mirror image warehouse, and storing the content in the preset cache.
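The hit/miss flow recited in claim 1 can be sketched in a few lines of Python. This is only an illustrative sketch, not the patented implementation: the class name `CacheProxyRepository`, the dictionary-backed preset cache, and the `pull(name)` interface assumed for the central mirror image warehouse are all introduced here for clarity.

```python
class CacheProxyRepository:
    """Sketch of the cache agent warehouse in claim 1.

    `central_repo` stands in for the central mirror image warehouse;
    here it is any object exposing a `pull(name)` method (an assumption,
    not something the claim specifies).
    """

    def __init__(self, central_repo):
        self.central_repo = central_repo
        self.cache = {}  # preset cache: mirror image name -> content

    def handle_pull_request(self, image_name):
        # Detect whether the pulling request hits the preset cache.
        if image_name in self.cache:
            # Hit: serve the content directly from the preset cache.
            return self.cache[image_name]
        # Miss: pull from the central mirror image warehouse, then
        # store the content in the preset cache for later requests.
        content = self.central_repo.pull(image_name)
        self.cache[image_name] = content
        return content
```

On a miss the proxy both serves the request and populates the cache, so a given mirror image crosses the machine-room boundary at most once.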
  2. The method of claim 1, wherein the cache agent warehouse comprises a mirror image preheating module; the detecting whether the mirror image pulling request hits a preset cache comprises:
    monitoring a mirror image pushing event of the message center through the mirror image preheating module, wherein the mirror image pushing event comprises a mirror image name;
    analyzing the mirror image name to obtain a target application name;
    detecting whether the application corresponding to the target application name is deployed in the target cluster of the target machine room;
    when the application corresponding to the target application name is deployed in the target cluster of the target machine room, confirming that the mirror image pulling request hits the preset cache;
    and when the application corresponding to the target application name is not deployed in the target cluster of the target machine room, confirming that the mirror image pulling request does not hit the preset cache.
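The preheating path of claim 2 — monitor a push event from the message center, parse the application name out of the mirror image name, and warm the cache only when that application is deployed in the target cluster — might look like the sketch below. The `registry/app-name:tag` naming convention and every identifier here are assumptions; the claim does not fix a parsing rule.

```python
def parse_app_name(image_name):
    # Assumed convention: "registry/app-name:tag". The claim only says
    # the target application name is obtained by parsing the image name.
    return image_name.rsplit("/", 1)[-1].split(":")[0]

def on_push_event(event, deployed_apps, prefetch):
    """Handle one mirror image pushing event from the message center.

    `deployed_apps` is the set of applications deployed in the target
    cluster of the target machine room; `prefetch` pulls the image into
    the preset cache. Both are hypothetical interfaces.
    """
    app = parse_app_name(event["image_name"])
    if app in deployed_apps:
        # Application is deployed locally: preheat the preset cache so
        # the eventual pulling request from the cluster hits it.
        prefetch(event["image_name"])
        return True
    # Not deployed in this machine room: skip preheating.
    return False
```

The design point is that preheating is selective: only machine rooms that actually run the application spend bandwidth and cache disk on the newly pushed image.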
  3. The method of claim 1 or 2, wherein the cache agent warehouse further comprises a mirror image eviction module, and the method further comprises:
    acquiring the utilization rate of a first cache disk;
    and executing a mirror image cleaning task when the utilization rate of the first cache disk is greater than a preset threshold value.
  4. The method of claim 3, wherein the performing a mirror image cleaning task comprises:
    detecting, for a mirror image i in the preset cache, whether an application associated with the mirror image i is deployed in the target cluster, wherein the mirror image i is any mirror image in the preset cache;
    if so, retaining the mirror image i; otherwise, deleting the mirror image i.
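Claims 3-4 describe the basic eviction pass: it runs only when the cache disk utilization exceeds a preset threshold, and it keeps exactly those mirror images whose associated application is deployed in the target cluster. A minimal sketch, with all names assumed here rather than taken from the patent:

```python
def clean_images(cache, deployed_apps, disk_usage, threshold=0.8):
    """Eviction pass of claims 3-4 (illustrative identifiers throughout).

    `cache` maps mirror image name -> associated application name,
    `deployed_apps` is the set of applications deployed in the target
    cluster, and `disk_usage()` returns the cache disk utilization
    rate as a fraction.
    """
    if disk_usage() <= threshold:
        # Utilization not above the preset threshold: no cleaning task.
        return
    for image, app in list(cache.items()):
        if app not in deployed_apps:
            # The application associated with mirror image i is not
            # deployed in the target cluster: delete i; otherwise retain.
            del cache[image]
```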
  5. The method according to claim 3 or 4, characterized in that the method further comprises:
    acquiring the utilization rate of a second cache disk;
    and when the utilization rate of the second cache disk is less than or equal to the preset threshold value, ending the mirror image cleaning task.
  6. The method of claim 5, further comprising:
    when the utilization rate of the second cache disk is greater than the preset threshold value, acquiring an application list;
    retaining the mirror images in use in the application list;
    and deleting, from the mirror images not in use in the application list, the mirror images whose versions are lower than a preset version.
  7. The method of claim 6, further comprising:
    acquiring the utilization rate of a third cache disk;
    and when the utilization rate of the third cache disk is greater than the preset threshold value, deleting all the mirror images not in use in the application list.
  8. The method of claim 7, further comprising:
    and when the utilization rate of the third cache disk is less than or equal to the preset threshold value, ending execution of the mirror image cleaning task.
  9. The method of claim 7, further comprising:
    acquiring the utilization rate of a fourth cache disk;
    and triggering an alarm operation when the utilization rate of the fourth cache disk is greater than the preset threshold value.
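Claims 5-9 escalate the cleaning task in stages: after each stage the cache disk utilization rate is re-checked, the task ends as soon as it drops to the threshold or below, and if even deleting every unused mirror image is not enough, an alarm is triggered. A compact sketch under assumed names, with integer versions and a callable `disk_usage`:

```python
def escalating_clean(images, in_use, preset_version, disk_usage, alarm,
                     threshold=0.8):
    """Staged cleanup of claims 5-9 (all identifiers are illustrative).

    `images` maps mirror image name -> version, `in_use` is the set of
    mirror images currently used by the application list, `disk_usage()`
    returns the current utilization rate, and `alarm()` is the final
    alert operation.
    """
    if disk_usage() <= threshold:
        return  # claim 5: task ends at or below the preset threshold
    # Claim 6: among images not in use, delete those whose version is
    # lower than the preset version; images in use are retained.
    for name in list(images):
        if name not in in_use and images[name] < preset_version:
            del images[name]
    if disk_usage() <= threshold:
        return  # claim 8: task ends once utilization falls back
    # Claim 7: still over the threshold, delete every image not in use.
    for name in list(images):
        if name not in in_use:
            del images[name]
    if disk_usage() > threshold:
        alarm()  # claim 9: cleanup alone was not enough
```

Each stage is strictly more aggressive than the last, so the cheapest deletions (old, unused versions) happen first and the alarm fires only when no safe deletion remains.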
  10. A mirror image pulling apparatus, applied to a cache agent warehouse, the apparatus comprising: a receiving unit, a detecting unit, and a mirror image pulling unit, wherein,
    the receiving unit is configured to receive a mirror image pull request, where the mirror image pull request is initiated by a target cluster, the cache agent warehouse is located in a mirror image distribution system, the mirror image distribution system further includes a central mirror image warehouse and a message center, the cache agent warehouse is located in a target machine room, and the target machine room further includes the target cluster;
    the detecting unit is configured to detect whether the mirror image pulling request hits a preset cache;
    the mirror image pulling unit is configured to pull, from the preset cache, the content required by the mirror image pulling request when the mirror image pulling request hits the preset cache;
    the mirror image pulling unit is further configured to pull, from the central mirror image warehouse, the content required by the mirror image pulling request when the mirror image pulling request does not hit the preset cache, and store the content in the preset cache.
  11. The apparatus of claim 10, wherein the cache agent warehouse comprises a mirror image preheating module; in terms of detecting whether the mirror image pulling request hits the preset cache, the detecting unit is specifically configured to:
    monitoring a mirror image pushing event of the message center through the mirror image preheating module, wherein the mirror image pushing event comprises a mirror image name;
    analyzing the mirror image name to obtain a target application name;
    detecting whether the application corresponding to the target application name is deployed in the target cluster of the target machine room;
    when the application corresponding to the target application name is deployed in the target cluster of the target machine room, confirming that the mirror image pulling request hits the preset cache;
    and when the application corresponding to the target application name is not deployed in the target cluster of the target machine room, confirming that the mirror image pulling request does not hit the preset cache.
  12. The apparatus of claim 10 or 11, wherein the cache agent warehouse further comprises a mirror image eviction module, and the apparatus further comprises: an obtaining unit and a cleaning unit, wherein,
    the obtaining unit is configured to obtain the utilization rate of a first cache disk;
    the cleaning unit is configured to execute a mirror image cleaning task when the utilization rate of the first cache disk is greater than a preset threshold value.
  13. The apparatus according to claim 12, wherein, in terms of executing the mirror image cleaning task, the cleaning unit is specifically configured to:
    detecting, for a mirror image i in the preset cache, whether an application associated with the mirror image i is deployed in the target cluster, wherein the mirror image i is any mirror image in the preset cache;
    if so, retaining the mirror image i; otherwise, deleting the mirror image i.
  14. The apparatus of claim 12 or 13, wherein,
    the obtaining unit is further configured to obtain the utilization rate of a second cache disk;
    the cleaning unit is further configured to end the mirror image cleaning task when the utilization rate of the second cache disk is less than or equal to the preset threshold value.
  15. The apparatus of claim 14, wherein,
    the obtaining unit is further configured to obtain an application list when the utilization rate of the second cache disk is greater than the preset threshold value;
    the cleaning unit is further configured to retain the mirror images in use in the application list, and delete, from the mirror images not in use in the application list, the mirror images whose versions are lower than a preset version.
  16. The apparatus of claim 15, wherein,
    the obtaining unit is further configured to obtain the utilization rate of a third cache disk;
    the cleaning unit is further configured to delete all the mirror images not in use in the application list when the utilization rate of the third cache disk is greater than the preset threshold value.
  17. The apparatus of claim 16, wherein,
    the cleaning unit is further configured to end execution of the mirror image cleaning task when the utilization rate of the third cache disk is less than or equal to the preset threshold value.
  18. A server, comprising a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the one or more programs comprising instructions for performing the steps in the method of any one of claims 1-9.
  19. A computer-readable storage medium, wherein the computer-readable storage medium stores a computer program for electronic data exchange, and the computer program causes a computer to perform the method according to any one of claims 1-9.
  20. A computer program product, wherein the computer program product comprises a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to perform the method according to any one of claims 1-9.
CN202080099553.0A 2020-05-20 2020-05-20 Mirror image pulling method and related product Pending CN115380269A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/091316 WO2021232289A1 (en) 2020-05-20 2020-05-20 Image pulling method and related product

Publications (1)

Publication Number Publication Date
CN115380269A true CN115380269A (en) 2022-11-22

Family

ID=78709073

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080099553.0A Pending CN115380269A (en) 2020-05-20 2020-05-20 Mirror image pulling method and related product

Country Status (2)

Country Link
CN (1) CN115380269A (en)
WO (1) WO2021232289A1 (en)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114785770A (en) * 2022-04-01 2022-07-22 京东科技信息技术有限公司 Mirror layer file sending method and device, electronic equipment and computer readable medium
CN117033325B (en) * 2023-10-08 2023-12-26 恒生电子股份有限公司 Mirror image file preheating and pulling method and device
CN117369952B (en) * 2023-12-08 2024-03-15 中电云计算技术有限公司 Cluster processing method, device, equipment and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102033755A (en) * 2009-09-30 2011-04-27 国际商业机器公司 Method and system for running virtual machine mirror image
CN107733977B (en) * 2017-08-31 2020-11-03 北京百度网讯科技有限公司 Cluster management method and device based on Docker
CN110099076A (en) * 2018-01-29 2019-08-06 中兴通讯股份有限公司 A kind of method and its system that mirror image pulls
CN110908671A (en) * 2018-09-18 2020-03-24 北京京东尚科信息技术有限公司 Method and device for constructing docker mirror image and computer readable storage medium
CN110096333B (en) * 2019-04-18 2021-06-29 华中科技大学 Container performance acceleration method based on nonvolatile memory

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115794139A (en) * 2023-01-16 2023-03-14 腾讯科技(深圳)有限公司 Mirror image data processing method, device, equipment and medium
CN116614517A (en) * 2023-04-26 2023-08-18 江苏博云科技股份有限公司 Container mirror image preheating and distributing method for edge computing scene
CN116614517B (en) * 2023-04-26 2023-09-29 江苏博云科技股份有限公司 Container mirror image preheating and distributing method for edge computing scene

Also Published As

Publication number Publication date
WO2021232289A1 (en) 2021-11-25


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination