CN113792169A - Digital archive management method and system based on big data application - Google Patents


Info

Publication number
CN113792169A
CN113792169A
Authority
CN
China
Prior art keywords
image
recovery
pixel
restoration
archive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111086516.9A
Other languages
Chinese (zh)
Other versions
CN113792169B (en)
Inventor
Yu Shengtian (于胜田)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yantai Penglai District Archives
Original Assignee
Yantai Penglai District Archives
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yantai Penglai District Archives filed Critical Yantai Penglai District Archives
Priority to CN202111086516.9A priority Critical patent/CN113792169B/en
Publication of CN113792169A publication Critical patent/CN113792169A/en
Application granted granted Critical
Publication of CN113792169B publication Critical patent/CN113792169B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/51 Indexing; Data structures therefor; Storage structures
    • G06F16/53 Querying
    • G06F16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/5866 Retrieval characterised by using metadata using information manually generated, e.g. tags, keywords, comments, manually generated location and time information
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/08 Learning methods
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T5/77 Retouching; Inpainting; Scratch removal
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20081 Training; Learning
    • G06T2207/20084 Artificial neural networks [ANN]
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30176 Document

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • Biomedical Technology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Health & Medical Sciences (AREA)
  • Library & Information Science (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a digital archive management method and system based on big data applications, in which an original archive image is obtained by scanning. Image restoration is performed on the original archive image, improving the accuracy and integrity of the archive information in the image and thus the effectiveness of archive management. In addition, index information is generated from the restored archive image, and the image is stored under that index, so that when a user needs to retrieve the restored archive image or the archive information it contains, the image or information can be found and printed quickly and accurately, improving the accuracy, security, and efficiency of archive management. Storing the archive information in the big database makes it convenient for users to query and share, realizing intelligent archive-information management.

Description

Digital archive management method and system based on big data application
Technical Field
The invention relates to the technical field of computer archive management, in particular to a digital archive management method and system based on big data application.
Background
Big data refers to data sets whose scale exceeds the capability of traditional database software tools to acquire, store, manage, and analyze. It has four characteristics: massive data scale, rapid data circulation, diverse data types, and low value density. The strategic significance of big data technology lies not in holding huge amounts of data, but in the specialized processing of the data that carries meaning.
Archives are original records of all kinds, with preservation value, formed directly in people's social activities, and they carry a unique mission in the process of socialization. Traditionally, archive information has been stored on media such as paper and cloth; these media are easily damaged, so the information is easily lost. With the development of high-tech industries, electronic storage media were developed. Archive information stored on electronic media is stable, hard to damage or lose, and can be preserved for a long time. However, because of established practice and the limited spread of advanced technology, a large amount of archive information is still delivered to archive administration units on paper. Converting this paper archive information into electronic form, storing it on electronic storage media, and managing it effectively is an important and meaningful task for archive administration units.
At present, with the development of science and technology, and especially of big data technology, information in various technical fields is gradually being integrated through Internet technology and shared as a resource. Archive information is large in volume, complex, and diverse in type, so it is necessary to manage it in an integrated way with high-tech means, through which archive management can become intelligent, its information technology modern, and its intelligent management standardized.
An intelligent archive management platform concept has been proposed that covers the collection, management, service, protection, and supervision of archives, so as to realize intelligent archive management. However, how to effectively collect, manage, serve, protect, and supervise archives remains a technical problem in the field.
Disclosure of Invention
The invention aims to provide a digital archive management method and system based on big data applications to solve the problems in the prior art.
In a first aspect, an embodiment of the present invention provides a digital archive management method based on big data applications, the method including:
scanning to obtain an original archive image;
performing image restoration on the original archive image to obtain a restored archive image;
generating index information of the restored archive image from the restored archive image through a pre-trained neural network model;
constructing an archive big database based on the index information and storing the restored archive image in it; the restored archive image contains archive information;
managing the archive information in the archive big database: if an archive query request is received from a user, the archive information corresponding to the request is retrieved from the archive big database and sent to the user.
Optionally, performing image restoration on the original archive image to obtain a restored archive image includes:
reducing the original archive image to obtain a first restoration kernel and a second restoration kernel, where the first restoration kernel is larger than the second restoration kernel, and both are smaller than the original archive image;
performing transverse restoration on the original archive image based on the first and second restoration kernels to obtain a transverse fusion image;
performing longitudinal restoration on the original archive image based on the first and second restoration kernels to obtain a longitudinal fusion image;
fusing the transverse fusion image and the longitudinal fusion image to obtain a repaired image;
and fusing the repaired image with the original archive image to obtain the restored archive image.
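The restoration pipeline above can be sketched end to end. The kernel-based restoration operator itself is not fully specified in this excerpt, so `restore_h` and `restore_v` below are hypothetical stand-ins (a simple row/column mean fill), only two restored images are fused per direction rather than three, and the final two fusions are assumed to follow the same damaged-region averaging rule the patent describes for the transverse fusion:

```python
import numpy as np

def restore_h(img, mask, k):
    # Hypothetical stand-in for transverse (row-wise) restoration:
    # fill each damaged pixel with the mean of up to k undamaged
    # pixels in its row.
    out = img.astype(float)
    for r, c in zip(*np.nonzero(mask)):
        row = img[r][~mask[r]]
        if row.size:
            out[r, c] = row[:k].mean()
    return out

def restore_v(img, mask, k):
    # Longitudinal (column-wise) analogue of restore_h.
    return restore_h(img.T, mask.T, k).T

def fuse(candidates, mask, base):
    # Damaged pixels take the mean of the candidate images;
    # undamaged pixels keep the base image's values.
    out = base.astype(float)
    out[mask] = np.mean([c[mask] for c in candidates], axis=0)
    return out

def restore_archive_image(original, mask, k1=4, k2=3):
    # k1 > k2 mirrors the first restoration kernel being larger
    # than the second.
    h = fuse([restore_h(original, mask, k1),
              restore_h(original, mask, k2)], mask, original)  # transverse fusion image
    v = fuse([restore_v(original, mask, k1),
              restore_v(original, mask, k2)], mask, original)  # longitudinal fusion image
    repaired = fuse([h, v], mask, original)                    # fuse transverse + longitudinal
    return fuse([repaired], mask, original)                    # fuse with the original
```

With a uniform image and one damaged pixel, the pipeline reproduces the surrounding value exactly; real restoration kernels would of course behave differently.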
Optionally, performing transverse restoration on the original archive image based on the first and second restoration kernels to obtain a transverse fusion image includes:
performing transverse restoration on the original archive image based on the first restoration kernel to obtain a first transverse restored image;
performing transverse restoration on the original archive image based on the second restoration kernel to obtain a second transverse restored image;
performing transverse restoration on the first transverse restored image based on the second restoration kernel to obtain a third transverse restored image;
and fusing the first, second, and third transverse restored images to obtain the transverse fusion image.
Optionally, fusing the first, second, and third transverse restored images to obtain the transverse fusion image includes:
for each pixel in the damaged area of the original archive image, obtaining its corresponding pixels, namely the pixels with the same position information in the first, second, and third transverse restored images, so that each damaged-area pixel has three corresponding pixels;
taking the mean of the pixel values of those three corresponding pixels as the value of the pixel at the same position in the transverse fusion image;
and taking the pixel values of the non-damaged area of the original archive image as the pixel values of the non-damaged area of the transverse fusion image,
where the non-damaged area of the original archive image is the part of the archive image outside the damaged area, and pixels in the non-damaged area of the transverse fusion image have the same coordinates as the corresponding pixels of the archive image.
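The fusion rule above reduces to a masked average. Assuming the damaged-area mask is already known, a minimal sketch:

```python
import numpy as np

def fuse_transverse(original, r1, r2, r3, damaged_mask):
    # Damaged pixels: mean of the three transverse restored images'
    # values at the same position.
    # Undamaged pixels: the original archive image's own values.
    fused = original.astype(float)
    fused[damaged_mask] = (r1[damaged_mask].astype(float)
                           + r2[damaged_mask]
                           + r3[damaged_mask]) / 3.0
    return fused
```

The same function serves for the longitudinal fusion image, with the three longitudinal restored images as inputs.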
Optionally, performing longitudinal restoration on the archive image based on the first and second restoration kernels to obtain a longitudinal fusion image includes:
performing longitudinal restoration on the original archive image based on the first restoration kernel to obtain a first longitudinal restored image;
performing longitudinal restoration on the original archive image based on the second restoration kernel to obtain a second longitudinal restored image;
performing longitudinal restoration on the first longitudinal restored image based on the second restoration kernel to obtain a third longitudinal restored image;
and fusing the first, second, and third longitudinal restored images to obtain the longitudinal fusion image.
In a second aspect, an embodiment of the present invention provides a digital archive management system based on big data applications, the system including:
a scanning module for scanning to obtain an original archive image;
a restoration module for performing image restoration on the original archive image to obtain a restored archive image;
an index module for generating index information of the restored archive image from the restored archive image through a pre-trained neural network model;
a storage module for constructing an archive big database based on the index information and storing the restored archive image in it, the restored archive image containing archive information;
and a management module for managing the archive information in the archive big database: if an archive query request is received from a user, the archive information corresponding to the request is retrieved from the archive big database and sent to the user.
Compared with the prior art, the embodiment of the invention has the following beneficial effects:
the embodiment of the invention provides a digital archive management method and a system based on big data application, wherein the method comprises the following steps: scanning to obtain an original file image; performing image restoration on the original file image to obtain a restored file image; generating index information of the restored archive image based on the restored archive image through a pre-trained neural network model; constructing an archive big database based on the index information, and storing the repaired archive image into the big database; restoring the file image to contain file information; managing the archive information in the archive big database: and if a file inquiry request of a user is received, file information corresponding to the file inquiry request is called from the file big database, and the file information is sent to the user. The original file image is restored, so that the accuracy and the integrity of file information in the file image are improved, and the effectiveness of file management is improved. In addition, index information is generated based on the information of the restored archive image, the restored archive image is stored based on the index information, when a user needs to search the restored archive image or the archive information in the restored archive image, the restored archive image or the archive information can be quickly and accurately searched and printed based on the index information, and the accuracy, the safety and the efficiency of archive management are improved. The archive information is stored in the big database, so that the inquiry of a user is facilitated, meanwhile, the archive information is convenient to share, and the intellectualization of archive information management is realized.
Drawings
Fig. 1 shows a block diagram of an electronic device 100 according to an embodiment of the present invention.
FIG. 2 is a flow chart of a method for managing a digital archive based on a big data application according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of the correspondence between first image blocks in the original archive image and the first restoration kernel according to an embodiment of the present invention.
Fig. 4 is a schematic diagram of the correspondence between second image blocks in the original archive image and the second restoration kernel according to an embodiment of the present invention.
Fig. 5 is a schematic diagram in which pixel (0,0) of the original archive image serves as the restoration reference point and coincides with the center pixel of the first restoration kernel, according to an embodiment of the present invention.
Fig. 6 is a schematic diagram in which pixel (0,1) of the original archive image serves as the restoration reference point and coincides with the center pixel of the first restoration kernel, according to an embodiment of the present invention.
Fig. 7 is a schematic diagram in which pixel (1,0) of the original archive image serves as the restoration reference point and coincides with the center pixel of the first restoration kernel, according to an embodiment of the present invention.
Icon: 100-an electronic device; 101-a memory; 103-a processor; 104-peripheral interfaces; 105-an image pick-up device; 106-a display device; 200-digital archive management system based on big data applications.
Detailed Description
The present invention will be described in detail below with reference to the accompanying drawings.
The technical solutions in the embodiments of the present invention will be described clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. The components of the embodiments of the present invention, as generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the invention provided in the accompanying drawings is not intended to limit the scope of the claimed invention, but merely represents selected embodiments of the invention. All other embodiments obtained by a person skilled in the art from the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present invention, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
Referring to fig. 1, fig. 1 is a block diagram illustrating an electronic device 100 according to an embodiment of the invention. The electronic device 100 may be, but is not limited to, a smartphone, a tablet computer, a laptop computer, an in-vehicle computer, a Personal Digital Assistant (PDA), a wearable mobile terminal, a desktop computer, and the like. The electronic device 100 includes a memory 101, a memory controller, a processor 103, a peripheral interface 104, a camera device 105, a display device 106, and a digital archive management system 200 based on big data applications.
The memory 101, the memory controller, the processor 103, the peripheral interface 104, the camera device 105, and the display device 106 are electrically connected to one another, directly or indirectly, to realize data transmission or interaction; for example, these components may be electrically connected via one or more communication buses or signal lines. The digital archive management system 200 based on big data applications includes at least one software function module that may be stored in the memory 101 in the form of software or firmware, or solidified in the Operating System (OS) of the electronic device 100. The processor 103 is configured to execute executable modules or computer programs stored in the memory 101, such as the software function modules or computer programs included in the digital archive management system 200 based on big data applications.
The memory 101 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like. The memory 101 is used to store a program, and the processor 103 executes the program after receiving an execution instruction; the methods defined by the processes disclosed in any embodiment of the present invention may be applied to, or implemented by, the processor 103.
The processor 103 may be an integrated circuit chip having signal processing capability. The processor 103 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), a voice processor, a video processor, and the like; it may also be a digital signal processor, an application-specific integrated circuit, a field-programmable gate array or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. The various methods, steps, and logic blocks disclosed in the embodiments of the present invention may be implemented or performed by it. A general-purpose processor may be a microprocessor, or the processor 103 may be any conventional processor or the like.
The peripheral interface 104 is used to couple various input/output devices to the processor 103 as well as to the memory 101. In some embodiments, the peripheral interface 104, the processor 103, and the memory controller may be implemented in a single chip. In other examples, they may be implemented separately from each other.
The camera 105 is used to capture an image, and in the embodiment of the present invention, the camera 105 may be, but is not limited to, a scanner, and is used to scan an archival image. The scanned original archival image can be converted into a document in PDF format.
The display device 106 is used for interaction between the user and the electronic device 100, for example, but not limited to, the display device 106 can display the archive image and the archive information.
In the embodiment of the present invention, the digital archive management system 200 based on big data application is used to realize the management of archive information, and the digital archive management system 200 based on big data application can realize archive management by the following method.
Referring to fig. 2, fig. 2 is a flow chart illustrating a method for managing a digital archive based on a big data application according to an embodiment of the present invention. The following describes a digital archive management method based on big data applications in detail. In the embodiment of the invention, the digital archive management method based on big data application comprises the following steps:
s101: scanning to obtain an original file image. The original archive image is an image obtained by scanning a paper archive through a scanner, and the original archive image contains archive information of a user.
Alternatively, the scanned archive image may be converted into an archive document in the PDF format, and then the archive document in the PDF format may be stored in the archive large database.
S102: and carrying out image restoration on the original file image to obtain a restored file image.
S103: and generating index information of the repair archive image based on the repair archive image through a pre-trained neural network model.
The index information is used to index the restored archive image and the archive information it contains. Optionally, the neural network model may include a convolutional neural network model and a support vector machine model: the image features (texture features) of the restored archive image are extracted by the convolutional neural network, the support vector machine obtains character features from the texture features, text is recognized from the character features, keywords are identified in the text, and the keywords are used as the index information.
S104: and constructing an archive big database based on the index information, and storing the repaired archive image into the big database.
The large archive database stores archive information, and the archive information is contained in the restored archive image.
S105: managing the archive information in the archive big database: and if a file inquiry request of a user is received, the file information corresponding to the file inquiry request is called from the file big database, and the file information is sent to the user. It should be noted that the index information of the restored archive image corresponds to the archive information in the restored archive image.
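Steps S103–S105 amount to building an inverted index from keywords to stored archive images and answering user queries against it. The neural-network and support-vector-machine models are out of scope here, so the sketch below assumes the keywords have already been extracted; all names are illustrative:

```python
from collections import defaultdict

class ArchiveDatabase:
    # Minimal in-memory stand-in for the "archive big database":
    # restored archive images keyed by ID, plus an inverted keyword
    # index standing in for the index information of S103/S104.
    def __init__(self):
        self.archives = {}             # archive_id -> restored archive image / document
        self.index = defaultdict(set)  # keyword -> set of archive_ids

    def store(self, archive_id, image, keywords):
        # S104: store the restored image under its index information.
        self.archives[archive_id] = image
        for kw in keywords:
            self.index[kw].add(archive_id)

    def query(self, keyword):
        # S105: retrieve the archive information matching a user's
        # archive query request.
        return [self.archives[a] for a in sorted(self.index.get(keyword, ()))]
```

A production system would back this with a distributed store, but the index-then-retrieve flow is the same.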
With this scheme, restoring the original archive image improves the accuracy and integrity of the archive information in the image and thus the effectiveness of archive management. In addition, index information is generated from the restored archive image, and the image is stored under that index, so that when a user needs to retrieve the restored archive image or its archive information, it can be found and printed quickly and accurately, improving the accuracy, security, and efficiency of archive management. Storing the archive information in the big database makes it easy for users to query and share, realizing intelligent archive-information management.
Optionally, performing image restoration on the original archive image to obtain a restored archive image includes:
reducing the original archive image to obtain a first restoration kernel and a second restoration kernel, where the first restoration kernel is larger than the second restoration kernel, and both are smaller than the original archive image;
performing transverse restoration on the original archive image based on the first and second restoration kernels to obtain a transverse fusion image;
performing longitudinal restoration on the original archive image based on the first and second restoration kernels to obtain a longitudinal fusion image;
fusing the transverse fusion image and the longitudinal fusion image to obtain a repaired image;
and fusing the repaired image with the original archive image to obtain the restored archive image.
Optionally, the original archive image is reduced to obtain the first restoration kernel and the second restoration kernel as follows:
obtain the length and width of the first restoration kernel; divide the length of the original archive image by the length of the first restoration kernel to obtain a first reduction step, and divide the width of the original archive image by the width of the first restoration kernel to obtain a second reduction step. The length and width of the first restoration kernel may be set in real time.
Divide the original archive image into a plurality of first image blocks, each with length equal to the first reduction step and width equal to the second reduction step, so that the first image blocks correspond one-to-one with the pixels of the first restoration kernel.
Obtain the mean pixel value of each first image block and take it as the value of the corresponding pixel of the first restoration kernel. The mean pixel value of a first image block is the mean of the values of all pixels in that block.
The length and width of the second restoration kernel may likewise be set in real time, and the second restoration kernel is obtained in the same way: divide the original archive image into a plurality of second image blocks, each with length equal to a third reduction step and width equal to a fourth reduction step, so that the second image blocks correspond one-to-one with the pixels of the second restoration kernel;
then obtain the mean pixel value of each second image block and take it as the value of the corresponding pixel of the second restoration kernel. The mean pixel value of a second image block is the mean of the values of all pixels in that block.
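The kernel construction just described is block-average downsampling: partition the image into a kernel-sized grid of blocks and take each block's mean pixel value. A minimal sketch, assuming the kernel dimensions divide the image dimensions evenly:

```python
import numpy as np

def restoration_kernel(image, kh, kw):
    # Reduce the image to a kh x kw restoration kernel: each kernel
    # pixel takes the mean pixel value of its corresponding image block.
    H, W = image.shape
    assert H % kh == 0 and W % kw == 0, "kernel must divide the image size"
    sh, sw = H // kh, W // kw               # the reduction step lengths
    blocks = image.reshape(kh, sh, kw, sw)  # grid of sh x sw image blocks
    return blocks.mean(axis=(1, 3))         # per-block pixel-value mean
```

With the 12 × 12 image and 4 × 4 first kernel of Fig. 3, the step lengths are 12/4 = 3 pixels, and kernel pixel (a0, b0) is the mean of the 3 × 3 block spanning pixels (0,0) through (2,2).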
In the example shown in fig. 3, the original archive image is 12 × 12 pixels, the first restoration kernel is 4 × 4 pixels, and the second restoration kernel is 3 × 3 pixels.
As shown in fig. 3, the original archive image is divided into 16 first image blocks, each with the reduced first step length as its length and the reduced second step length as its width; the first image blocks correspond one-to-one to the pixel points in the first recovery kernel. Each image block is 3 × 3 pixels in size, i.e. the reduced first step length is 3 pixels and the reduced second step length is also 3 pixels. In fig. 3, each first image block is connected to its corresponding pixel point in the first recovery kernel. Specifically: the first image block formed by pixel points (0,0), (0,1), (0,2), (1,0), (1,1), (1,2), (2,0), (2,1) and (2,2) in the original archive image corresponds to pixel point (a0, b0) in the first recovery kernel, and the mean of the pixel values of these nine pixel points is used as the pixel value I(a0, b0), i.e. I(a0, b0) = [I(0,0) + I(0,1) + I(0,2) + I(1,0) + I(1,1) + I(1,2) + I(2,0) + I(2,1) + I(2,2)]/9. The first image block formed by pixel points (0,3), (0,4), (0,5), (1,3), (1,4), (1,5), (2,3), (2,4) and (2,5) corresponds to pixel point (a0, b1) in the first recovery kernel, and the mean of the pixel values of these pixel points is used as the pixel value of pixel point (a0, b1).
The first image block formed by pixel points (3,0), (3,1), (3,2), (4,0), (4,1), (4,2), (5,0), (5,1) and (5,2) in the original archive image corresponds to pixel point (a1, b0) in the first recovery kernel, and so on, until the first image block formed by pixel points (9,9), (9,10), (9,11), (10,9), (10,10), (10,11), (11,9), (11,10) and (11,11) corresponds to pixel point (a3, b3), whose pixel value is the mean of the pixel values of those nine pixel points. Thus the first recovery kernel is obtained.
The second recovery kernel is obtained in the same manner. As shown in fig. 4, the original archive image is divided into 9 second image blocks, each with the reduced third step length as its length and the reduced fourth step length as its width; the second image blocks correspond one-to-one to the pixel points in the second recovery kernel. Each image block is 4 × 4 pixels in size, i.e. the reduced third step length is 4 pixels and the reduced fourth step length is also 4 pixels. Specifically: the second image block composed of pixel points (0,0), (0,1), (0,2), (0,3), (1,0), (1,1), (1,2), (1,3), (2,0), (2,1), (2,2), (2,3), (3,0), (3,1), (3,2) and (3,3) in the original archive image corresponds to pixel point (x0, y0) in the second recovery kernel, and the mean of the pixel values of these sixteen pixel points is used as the pixel value I(x0, y0); the calculation is the same as for the first recovery kernel. And so on, until the second image block formed by pixel points (8,8), (8,9), (8,10), (8,11), (9,8), (9,9), (9,10), (9,11), (10,8), (10,9), (10,10), (10,11), (11,8), (11,9), (11,10) and (11,11) corresponds to pixel point (x2, y2), whose pixel value is the mean of the pixel values of those pixel points. Thus the second recovery kernel is obtained.
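The construction of the recovery kernels described above is, in effect, block-wise average pooling of the original archive image. A minimal sketch in Python/NumPy (the function name build_recovery_kernel and the use of NumPy are illustrative assumptions, not part of the patent):

```python
import numpy as np

def build_recovery_kernel(image, kernel_h, kernel_w):
    """Average-pool the original archive image down to a kernel_h x kernel_w
    recovery kernel: each kernel pixel is the mean pixel value of one image
    block whose height/width is the reduced step length (image size divided
    by kernel size), as in the fig. 3 / fig. 4 example."""
    n, m = image.shape
    step_h, step_w = n // kernel_h, m // kernel_w   # reduced step lengths
    # Group the image into kernel_h x kernel_w blocks of step_h x step_w pixels
    blocks = image[:kernel_h * step_h, :kernel_w * step_w].astype(float)
    blocks = blocks.reshape(kernel_h, step_h, kernel_w, step_w)
    return blocks.mean(axis=(1, 3))                 # mean of each block

# A 12 x 12 image, as in fig. 3: the first recovery kernel is 4 x 4
# (blocks of 3 x 3 pixels), the second is 3 x 3 (blocks of 4 x 4 pixels).
img = np.arange(144).reshape(12, 12)
first_kernel = build_recovery_kernel(img, 4, 4)
second_kernel = build_recovery_kernel(img, 3, 3)
```

Here first_kernel[0, 0] equals the mean of pixels (0,0) through (2,2) of img, mirroring the I(a0, b0) computation above.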
And performing transverse restoration on the original archive image based on the first recovery kernel and the second recovery kernel to obtain a transverse fusion image, which comprises:
performing transverse restoration on the original archive image based on the first recovery kernel to obtain a first transverse recovery image;
performing transverse restoration on the original archive image based on the second recovery kernel to obtain a second transverse recovery image;
performing transverse restoration on the first transverse recovery image based on the second recovery kernel to obtain a third transverse recovery image;
and fusing the first transverse recovery image, the second transverse recovery image and the third transverse recovery image to obtain the transverse fusion image.
Performing transverse restoration on the original archive image based on the first recovery kernel to obtain the first transverse recovery image, as shown in fig. 5:
firstly, the 0th pixel point (0,0) of the 0th row in the original archive image is taken as the recovery reference point, and the recovery reference point (0,0) is made to coincide with the central pixel point of the first recovery kernel; the average of the pixel values of the pixel points in the original archive image coinciding with the first recovery kernel at reference point (0,0) is taken as the pixel value of the recovery reference point (0,0).
As shown in fig. 5, taking a first recovery kernel of size 5 × 5 pixels as an example, the pixel value of the recovery reference point (0,0) is obtained as shown in formula (1):
I1(0,0) = [I(0,0) + I(0,1) + I(0,2) + I(1,0) + I(1,1) + I(1,2) + I(2,0) + I(2,1) + I(2,2)]/9    (1)
wherein I(i, j) represents the pixel value of the pixel point in the ith row and jth column of the original archive image, i indexing rows and j indexing columns, and I1(0,0) represents the updated pixel value of pixel point (0,0).
Then, the 1st pixel point (0,1) of the 0th row in the original archive image is taken as the recovery reference point, the recovery reference point (0,1) is made to coincide with the central pixel point of the first recovery kernel, and the average of the pixel values of the pixel points in the original archive image coinciding with the first recovery kernel at reference point (0,1) is taken as the pixel value of the recovery reference point (0,1).
As shown in fig. 6, the pixel value of the recovery reference point (0,1) is obtained as shown in formula (2):
I1(0,1) = [I(0,0) + I(0,1) + I(0,2) + I(0,3) + I(1,0) + I(1,1) + I(1,2) + I(1,3) + I(2,0) + I(2,1) + I(2,2) + I(2,3)]/12    (2)
wherein, I1(0,1) represents the pixel value after the pixel point (0,1) is updated.
The recovery reference point in the 0th row is updated in this manner until the (M-1)th pixel point (0, M-1) of the 0th row in the original archive image is taken as the recovery reference point, the recovery reference point (0, M-1) is made to coincide with the central pixel point of the first recovery kernel, and the average of the pixel values of the pixel points in the original archive image coinciding with the first recovery kernel at reference point (0, M-1) is taken as the pixel value of the recovery reference point (0, M-1). The pixel value of the recovery reference point (0, M-1) is obtained in the same way as the pixel values of the recovery reference points (0,0) and (0,1), and is not described again here.
The row is then updated: the 0th pixel point (1,0) of the 1st row in the original archive image is taken as the recovery reference point, the recovery reference point (1,0) is made to coincide with the central pixel point of the first recovery kernel, and the average of the pixel values of the pixel points in the original archive image coinciding with the first recovery kernel at reference point (1,0) is taken as the pixel value of the recovery reference point (1,0).
Then the 1st pixel point (1,1) of the 1st row is taken as the recovery reference point, made to coincide with the central pixel point of the first recovery kernel, and assigned the average of the pixel values of the coinciding pixel points.
The recovery reference point in the 1st row is updated in this manner until the (M-1)th pixel point (1, M-1) of the 1st row is taken as the recovery reference point and its pixel value is obtained in the same way as for the recovery reference points (0,0) and (0,1).
The recovery reference point continues to be updated in this manner until the (M-1)th pixel point (N-1, M-1) of the (N-1)th row in the original archive image is taken as the recovery reference point, the recovery reference point (N-1, M-1) is made to coincide with the central pixel point of the first recovery kernel, and the average of the pixel values of the pixel points in the original archive image coinciding with the first recovery kernel at reference point (N-1, M-1) is taken as the pixel value of the recovery reference point (N-1, M-1).
The pixel value of each pixel point is updated in this manner until all pixel points in the original archive image have been traversed, which completes the transverse restoration of the original archive image based on the first recovery kernel and yields the first transverse recovery image.
Optionally, for a pixel point whose pixel value has already been updated, the value taken in the subsequent updating of other pixel points is still its pixel value in the original archive image. In the example shown in fig. 5 and fig. 6, assume that in the original archive image I(0,0) = 128, I(0,1) = 100, I(0,2) = 0, I(0,3) = 255, I(1,0) = 120, I(1,1) = 100, I(1,2) = 25, I(1,3) = 128, I(2,0) = 128, I(2,1) = 95, I(2,2) = 10 and I(2,3) = 128. Then, according to formula (1), the updated pixel value of pixel point (0,0) is I1(0,0) = [I(0,0) + I(0,1) + I(0,2) + I(1,0) + I(1,1) + I(1,2) + I(2,0) + I(2,1) + I(2,2)]/9 = 78, and the updated pixel value of pixel point (0,1) is I1(0,1) = [I(0,0) + I(0,1) + I(0,2) + I(0,3) + I(1,0) + I(1,1) + I(1,2) + I(1,3) + I(2,0) + I(2,1) + I(2,2) + I(2,3)]/12 = 101. The above calculations are rounded to the nearest integer.
In summary, the specific calculation for obtaining the average pixel value of the pixel points coinciding with the first recovery kernel and assigning it as the pixel value of the recovery reference point (x, y) is given by formula (3):
I1(x, y) = ( Σ I(i, j) ) / ( (m+1) × (n+1) )    (3), the summation being taken over all pixel points (i, j) in the original archive image that coincide with the first recovery kernel when its central pixel point coincides with the recovery reference point (x, y).
wherein (m+1) is the number of columns and (n+1) the number of rows of pixel points in the original archive image coinciding with the first recovery kernel, so that (m+1) × (n+1) is the number of coinciding pixel points. I1(x, y) represents the pixel value of the recovery reference point (x, y). x is an integer between 0 and N-1, N being the number of rows of pixel points in the original archive image; y is an integer between 0 and M-1, M being the number of columns of pixel points in the original archive image.
Optionally, for a pixel point whose pixel value has already been updated, the value taken in the subsequent updating of other pixel points is the updated pixel value. In the example shown in fig. 5 and fig. 6, assume again that in the original archive image I(0,0) = 128, I(0,1) = 100, I(0,2) = 0, I(0,3) = 255, I(1,0) = 120, I(1,1) = 100, I(1,2) = 25, I(1,3) = 128, I(2,0) = 128, I(2,1) = 95, I(2,2) = 10 and I(2,3) = 128. Then, according to formula (1), the updated pixel value of pixel point (0,0) is I1(0,0) = 78, and the updated pixel value of pixel point (0,1) is I1(0,1) = [I1(0,0) + I(0,1) + I(0,2) + I(0,3) + I(1,0) + I(1,1) + I(1,2) + I(1,3) + I(2,0) + I(2,1) + I(2,2) + I(2,3)]/12 = 97. The above calculations are rounded to the nearest integer.
In summary, the specific calculation for obtaining the average pixel value of the pixel points coinciding with the first recovery kernel and assigning it as the pixel value of the recovery reference point (x, y) is, in this variant, given by formula (4):
I1(x, y) = ( Σ f(i, j) ) / ( (m+1) × (n+1) )    (4), the summation being taken over all pixel points (i, j) coinciding with the first recovery kernel.
wherein f(i, j) represents the value taken for pixel point (i, j): if pixel point (i, j) has not yet been traversed as a recovery reference point, f(i, j) is its pixel value I(i, j) in the original archive image; if pixel point (i, j) has already been traversed as a recovery reference point, f(i, j) is its updated pixel value I1(i, j).
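Formulas (3) and (4) differ only in whether already-updated pixel values feed into later averages. A hedged sketch of both variants (the function name restore and the use of NumPy are assumptions for illustration; the patent itself prescribes only the averaging rule):

```python
import numpy as np

def restore(image, kernel_h, kernel_w, in_place=False):
    """Slide the recovery kernel over the image row by row; each recovery
    reference point takes the rounded mean of the pixel values under the
    kernel (only the actually coinciding pixels count at the borders).
    in_place=False follows formula (3): averages use original values.
    in_place=True follows formula (4): updated values feed later averages."""
    n, m = image.shape
    rh, rw = kernel_h // 2, kernel_w // 2           # kernel half-sizes
    out = image if in_place else np.empty_like(image)
    for x in range(n):
        for y in range(m):
            src = out if in_place else image        # f(i, j) of formula (4)
            window = src[max(0, x - rh):min(n, x + rh + 1),
                         max(0, y - rw):min(m, y + rw + 1)]
            out[x, y] = int(round(float(window.mean())))
    return out

# The 3 x 4 patch of pixel values used in the fig. 5 / fig. 6 example,
# restored with a 5 x 5 first recovery kernel.
patch = np.array([[128, 100,  0, 255],
                  [120, 100, 25, 128],
                  [128,  95, 10, 128]])
copy_mode = restore(patch.copy(), 5, 5, in_place=False)    # I1(0,1) = 101
inplace_mode = restore(patch.copy(), 5, 5, in_place=True)  # I1(0,1) = 97
```

Both variants reproduce I1(0,0) = 78; they first diverge at (0,1), exactly as in the worked examples above.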
And performing transverse restoration on the original archive image based on the second recovery kernel to obtain the second transverse recovery image, specifically:
the jth pixel point (i, j) of the ith row in the original archive image is taken in turn as the recovery reference point, and the recovery reference point (i, j) is made to coincide with the central pixel point of the second recovery kernel, the central pixel point being the pixel point at which the center of gravity of the second recovery kernel is located. The average of the pixel values of the pixel points coinciding with the second recovery kernel is then obtained and assigned as the pixel value of the recovery reference point (i, j). When all pixel points in the original archive image have been traversed and the pixel value of each has been updated, the transverse restoration of the original archive image based on the second recovery kernel is complete and the second transverse recovery image is obtained. Firstly, the 0th pixel point (0,0) of the 0th row in the original archive image is taken as the recovery reference point and made to coincide with the central pixel point of the second recovery kernel; the average of the pixel values of the pixel points in the original archive image coinciding with the second recovery kernel at reference point (0,0) is taken as the pixel value of the recovery reference point (0,0).
Then the 1st pixel point (0,1) of the 0th row is taken as the recovery reference point, made to coincide with the central pixel point of the second recovery kernel, and assigned the average of the pixel values of the coinciding pixel points. The recovery reference point in the 0th row is updated in this manner until the (M-1)th pixel point (0, M-1) has been processed; the row is then updated and the pixel points (1,0), (1,1), …, (1, M-1) of the 1st row are processed in the same way, each pixel value being obtained as for the recovery reference points (0,0) and (0,1). The recovery reference point continues to be updated in this manner until the (M-1)th pixel point (N-1, M-1) of the (N-1)th row in the original archive image is taken as the recovery reference point, made to coincide with the central pixel point of the second recovery kernel, and assigned the average of the pixel values of the coinciding pixel points, which completes the transverse restoration of the original archive image based on the second recovery kernel and yields the second transverse recovery image.
In a specific embodiment, the procedure shown in fig. 5 and fig. 6 applies with the first recovery kernel replaced by the second recovery kernel; the pixel points involved in each averaging during the transverse restoration are determined by the pixel points actually coinciding between the second recovery kernel and the original archive image, in the manner described above, and are not described again here.
Performing transverse restoration on the first transverse recovery image based on the second recovery kernel to obtain the third transverse recovery image, specifically:
the jth pixel point (i, j) of the ith row in the first transverse recovery image is taken in turn as the recovery reference point and made to coincide with the central pixel point of the second recovery kernel, the central pixel point being the pixel point at which the center of gravity of the second recovery kernel is located. The average of the pixel values of the pixel points coinciding with the second recovery kernel is then obtained and assigned as the pixel value of the recovery reference point (i, j); when all pixel points have been traversed, the transverse restoration of the first transverse recovery image based on the second recovery kernel is complete and the third transverse recovery image is obtained. Firstly, the 0th pixel point (0,0) of the 0th row in the first transverse recovery image is taken as the recovery reference point and made to coincide with the central pixel point of the second recovery kernel; the average of the pixel values of the pixel points in the first transverse recovery image coinciding with the second recovery kernel at reference point (0,0) is taken as the pixel value of the recovery reference point (0,0).
Then the 1st pixel point (0,1) of the 0th row is taken as the recovery reference point, made to coincide with the central pixel point of the second recovery kernel, and assigned the average of the pixel values of the coinciding pixel points. The recovery reference point in the 0th row is updated in this manner until the (M-1)th pixel point (0, M-1) has been processed; the row is then updated and the pixel points of the 1st row are processed in the same way, each pixel value being obtained as for the recovery reference points (0,0) and (0,1). The recovery reference point continues to be updated in this manner until the (M-1)th pixel point (N-1, M-1) of the (N-1)th row in the first transverse recovery image is taken as the recovery reference point, made to coincide with the central pixel point of the second recovery kernel, and assigned the average of the pixel values of the coinciding pixel points, which completes the transverse restoration of the first transverse recovery image based on the second recovery kernel and yields the third transverse recovery image. In a specific embodiment, the procedure shown in fig. 5 and fig. 6 applies with the first recovery kernel replaced by the second recovery kernel and the original archive image replaced by the first transverse recovery image; the pixel points involved in each averaging are determined by the pixel points actually coinciding between the second recovery kernel and the first transverse recovery image, in the manner described above, and are not described again here.
Fusing the first transverse recovery image, the second transverse recovery image and the third transverse recovery image to obtain a transverse fused image, which specifically comprises the following steps:
obtaining, for each pixel point in the damaged area of the original archive image, its corresponding pixel points, namely the pixel points in the first transverse recovery image, the second transverse recovery image and the third transverse recovery image that have the same position information as the pixel point in the damaged area; each pixel point in the damaged area thus corresponds to three pixel points. For example, pixel point (0,0) in the damaged area corresponds to pixel point (0,0) of the first transverse recovery image, pixel point (0,0) of the second transverse recovery image and pixel point (0,0) of the third transverse recovery image.
The average of the pixel values of the three corresponding pixel points of a pixel point in the damaged area is taken as the pixel value of the corresponding pixel point in the transverse fusion image. For example, pixel point (0,0) in the transverse fusion image corresponds to pixel point (0,0) in the damaged area; if the pixel value of pixel point (0,0) is I1(0,0) in the first transverse recovery image, I2(0,0) in the second transverse recovery image and I3(0,0) in the third transverse recovery image, then the pixel value of pixel point (0,0) in the transverse fusion image is I4(0,0) = [I1(0,0) + I2(0,0) + I3(0,0)]/3.
And the pixel values of the pixel points in the non-damaged area of the original archive image are taken as the pixel values of the corresponding pixel points in the non-damaged area of the transverse fusion image.
The non-damaged area of the original archive image is the area of the original archive image other than the damaged area; the pixel points of the non-damaged area have the same coordinate values in the transverse fusion image as in the original archive image.
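The fusion step above amounts to averaging the three restorations inside the damaged area and passing the original pixel values through elsewhere. A minimal sketch (the damaged-area mask and all names are illustrative assumptions; the patent does not specify here how the damaged area is detected):

```python
import numpy as np

def fuse(original, restored1, restored2, restored3, damaged_mask):
    """Transverse fusion: a damaged pixel takes the mean of its three
    corresponding restored pixels, per I4 = (I1 + I2 + I3)/3; a
    non-damaged pixel keeps its value from the original archive image."""
    mean3 = (restored1.astype(float) + restored2 + restored3) / 3.0
    fused = original.copy()
    fused[damaged_mask] = np.rint(mean3[damaged_mask]).astype(original.dtype)
    return fused

# Toy example: only the first pixel is marked damaged.
original = np.array([[10, 20]])
r1, r2, r3 = np.array([[30, 0]]), np.array([[60, 0]]), np.array([[90, 0]])
mask = np.array([[True, False]])
fused = fuse(original, r1, r2, r3, mask)   # [[60, 20]]
```

The damaged pixel becomes (30 + 60 + 90)/3 = 60, while the non-damaged pixel keeps its original value of 20.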
And performing longitudinal restoration on the original archive image based on the first recovery kernel and the second recovery kernel to obtain a longitudinal fusion image, which comprises: performing longitudinal restoration on the original archive image based on the first recovery kernel to obtain a first longitudinal recovery image;
performing longitudinal restoration on the original archive image based on the second recovery kernel to obtain a second longitudinal recovery image;
performing longitudinal restoration on the first longitudinal recovery image based on the second recovery kernel to obtain a third longitudinal recovery image;
and fusing the first longitudinal recovery image, the second longitudinal recovery image and the third longitudinal recovery image to obtain the longitudinal fusion image.
Performing longitudinal restoration on the original archive image based on the first recovery kernel to obtain the first longitudinal recovery image specifically comprises:
the ith pixel point (i, j) of the jth column in the original archive image is taken in turn as the recovery reference point and made to coincide with the central pixel point of the first recovery kernel, the central pixel point being the pixel point at which the center of gravity of the first recovery kernel is located. The average of the pixel values of the pixel points coinciding with the first recovery kernel is then obtained and assigned as the pixel value of the recovery reference point (i, j); when all pixel points have been traversed, the longitudinal restoration of the original archive image based on the first recovery kernel is complete and the first longitudinal recovery image is obtained. Firstly, the 0th pixel point (0,0) of the 0th column in the original archive image is taken as the recovery reference point and made to coincide with the central pixel point of the first recovery kernel; the average of the pixel values of the pixel points in the original archive image coinciding with the first recovery kernel at reference point (0,0) is taken as the pixel value of the recovery reference point (0,0). As shown in fig. 2.
Then, the 1st pixel point (1,0) of the 0th column in the original archive image is taken as the recovery reference point, made to coincide with the central pixel point of the first recovery kernel, and the average of the pixel values of the pixel points in the original archive image coinciding with the first recovery kernel at reference point (1,0) is taken as the pixel value of the recovery reference point (1,0). As shown in fig. 7.
And updating the recovery reference point in the 0 th column according to the mode until the N-1 st pixel point (N-1,0) in the 0 th column in the original file image is used as the recovery reference point, and overlapping the recovery reference point (N-1,0) with the central pixel point of the first recovery kernel, so that the average pixel value of the pixel values of the pixel points overlapped with the first recovery kernel in the original file image under the reference point (N-1,0) is used as the pixel value of the recovery reference point (N-1, 0).
Then move to the next column: take the pixel point (0,1) in row 0 of column 1 of the original archive image as the restoration reference point, align it with the center pixel point of the first restoration kernel, and take the average of the pixel values of the pixel points coincident with the first restoration kernel at reference point (0,1) as the pixel value of the restoration reference point (0,1).
Next, take the pixel point (1,1) in row 1 of column 1 as the restoration reference point, align it with the center pixel point of the first restoration kernel, and take the average of the pixel values of the pixel points coincident with the first restoration kernel at reference point (1,1) as the pixel value of the restoration reference point (1,1).
Update the restoration reference point down column 1 in this manner until the pixel point (N-1,1) in row N-1 of column 1 is taken as the restoration reference point, aligned with the center pixel point of the first restoration kernel, and assigned the average of the pixel values of the pixel points coincident with the first restoration kernel at reference point (N-1,1).
The manner of obtaining the pixel value of the recovery reference point (N-1,1) refers to the manner of calculating the pixel values of the recovery reference point (0,0) and the recovery reference point (0,1), and is not described herein again.
Continue updating the restoration reference point in this manner until the pixel point (N-1, M-1) in row N-1 of column M-1 of the original archive image is taken as the restoration reference point, aligned with the center pixel point of the first restoration kernel, and assigned the average of the pixel values of the pixel points coincident with the first restoration kernel at reference point (N-1, M-1). This completes the longitudinal restoration of the original archive image based on the first restoration kernel and yields the first longitudinal restored image.
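The column-by-column averaging pass described above can be sketched as follows. This is a minimal sketch: the embodiment averages the pixel values coincident with the kernel, so here the kernel only supplies its footprint shape (an assumption), and border reference points average only the part of the footprint that overlaps the image:

```python
import numpy as np

def kernel_restore(image, kernel_shape):
    """Longitudinal restoration pass: for each restoration reference point
    (i, j), taken column by column, centre the restoration kernel on (i, j)
    and assign the mean of the covered pixel values back to (i, j).
    `kernel_shape` is (rows, cols) of the kernel footprint."""
    n, m = image.shape
    kh, kw = kernel_shape
    ci, cj = kh // 2, kw // 2          # centre ("centre of gravity") of the kernel
    out = np.empty_like(image, dtype=float)
    for j in range(m):                 # columns first: the longitudinal order above
        for i in range(n):
            i0, i1 = max(0, i - ci), min(n, i - ci + kh)
            j0, j1 = max(0, j - cj), min(m, j - cj + kw)
            out[i, j] = image[i0:i1, j0:j1].mean()
    return out
```

Running the same pass twice, once per kernel footprint, produces the first and second longitudinal restored images.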
Performing longitudinal restoration on the original archive image based on the second restoration kernel to obtain a second longitudinal restored image, comprising:
Sequentially take the pixel point (i, j) in row i of column j of the original archive image as the restoration reference point, and align it with the center pixel point of the second restoration kernel. The center pixel point of the second restoration kernel is the pixel point at which the kernel's center of gravity is located. Then obtain the average of the pixel values of the pixel points coincident with the second restoration kernel, and assign this average as the pixel value of the restoration reference point (i, j). Completing this longitudinal restoration of the original archive image based on the second restoration kernel yields the second longitudinal restored image. Concretely: first, take the pixel point (0,0) in row 0 of the original archive image as the restoration reference point, align it with the center pixel point of the second restoration kernel, and take the average of the pixel values of the pixel points of the original archive image coincident with the second restoration kernel at reference point (0,0) as the pixel value of the restoration reference point (0,0).
Then take the pixel point (1,0) in row 1 of column 0 as the restoration reference point, align it with the center pixel point of the second restoration kernel, and take the average of the pixel values of the pixel points coincident with the second restoration kernel at reference point (1,0) as the pixel value of the restoration reference point (1,0).
Update the restoration reference point down column 0 in this manner until the pixel point (N-1,0) in row N-1 of column 0 is taken as the restoration reference point, aligned with the center pixel point of the second restoration kernel, and assigned the average of the pixel values of the pixel points coincident with the second restoration kernel at reference point (N-1,0).
Then move to the next column: take the pixel point (0,1) in row 0 of column 1 as the restoration reference point, align it with the center pixel point of the second restoration kernel, and take the average of the pixel values of the pixel points coincident with the second restoration kernel at reference point (0,1) as the pixel value of the restoration reference point (0,1).
Next, take the pixel point (1,1) in row 1 of column 1 as the restoration reference point, align it with the center pixel point of the second restoration kernel, and take the average of the pixel values of the pixel points coincident with the second restoration kernel at reference point (1,1) as the pixel value of the restoration reference point (1,1).
Update the restoration reference point down column 1 in this manner until the pixel point (N-1,1) in row N-1 of column 1 is taken as the restoration reference point, aligned with the center pixel point of the second restoration kernel, and assigned the average of the pixel values of the pixel points coincident with the second restoration kernel at reference point (N-1,1).
The manner of obtaining the pixel value of the recovery reference point (N-1,1) refers to the manner of calculating the pixel values of the recovery reference point (0,0) and the recovery reference point (0,1), and is not described herein again.
Continue updating the restoration reference point in this manner until the pixel point (N-1, M-1) in row N-1 of column M-1 of the original archive image is taken as the restoration reference point, aligned with the center pixel point of the second restoration kernel, and assigned the average of the pixel values of the pixel points coincident with the second restoration kernel at reference point (N-1, M-1). This completes the longitudinal restoration of the original archive image based on the second restoration kernel and yields the second longitudinal restored image.
Performing longitudinal restoration on the first longitudinal restored image based on the second restoration kernel to obtain a third longitudinal restored image, including:
Sequentially take the pixel point (i, j) in row i, column j of the first longitudinal restored image as the restoration reference point, and align it with the center pixel point of the second restoration kernel. The center pixel point of the second restoration kernel is the pixel point at which the kernel's center of gravity is located. Then obtain the average of the pixel values of the pixel points coincident with the second restoration kernel, and assign this average as the pixel value of the restoration reference point (i, j). Completing this longitudinal restoration of the first longitudinal restored image based on the second restoration kernel yields the third longitudinal restored image. Concretely: first, take the pixel point (0,0) in row 0 of the first longitudinal restored image as the restoration reference point, align it with the center pixel point of the second restoration kernel, and take the average of the pixel values of the pixel points of the first longitudinal restored image coincident with the second restoration kernel at reference point (0,0) as the pixel value of the restoration reference point (0,0).
Then take the pixel point (0,1) in column 1 of row 0 of the first longitudinal restored image as the restoration reference point, align it with the center pixel point of the second restoration kernel, and take the average of the pixel values of the pixel points coincident with the second restoration kernel at reference point (0,1) as the pixel value of the restoration reference point (0,1).
Update the restoration reference point along row 0 in this manner until the pixel point (0, M-1) in column M-1 of row 0 is taken as the restoration reference point, aligned with the center pixel point of the second restoration kernel, and assigned the average of the pixel values of the pixel points coincident with the second restoration kernel at reference point (0, M-1).
Then move to the next row: take the pixel point (1,0) in column 0 of row 1 as the restoration reference point, align it with the center pixel point of the second restoration kernel, and take the average of the pixel values of the pixel points coincident with the second restoration kernel at reference point (1,0) as the pixel value of the restoration reference point (1,0).
Next, take the pixel point (1,1) in column 1 of row 1 as the restoration reference point, align it with the center pixel point of the second restoration kernel, and take the average of the pixel values of the pixel points coincident with the second restoration kernel at reference point (1,1) as the pixel value of the restoration reference point (1,1).
Update the restoration reference point along row 1 in this manner until the pixel point (1, M-1) in column M-1 of row 1 is taken as the restoration reference point, aligned with the center pixel point of the second restoration kernel, and assigned the average of the pixel values of the pixel points coincident with the second restoration kernel at reference point (1, M-1).
The manner of obtaining the pixel value of the recovery reference point (1, M-1) refers to the manner of calculating the pixel values of the recovery reference point (0,0) and the recovery reference point (0,1), and is not described herein again.
Continue updating the restoration reference point in this manner until the pixel point (N-1, M-1) in column M-1 of row N-1 of the first longitudinal restored image is taken as the restoration reference point, aligned with the center pixel point of the second restoration kernel, and assigned the average of the pixel values of the pixel points coincident with the second restoration kernel at reference point (N-1, M-1). This completes the longitudinal restoration of the first longitudinal restored image based on the second restoration kernel and yields the third longitudinal restored image. In a specific embodiment, this follows the modes shown in fig. 5 and fig. 6: the first restoration kernel mentioned above is simply replaced by the second restoration kernel and the original archive image by the first longitudinal restored image, and the pixel points involved in each averaging step are determined by the pixel points of the first longitudinal restored image actually overlapped by the second restoration kernel, in the manner described above, which is not repeated here.
The first, second, and third longitudinal restored images are fused to obtain the longitudinal fused image in the same way that the first, second, and third transverse restored images are fused to obtain the transverse fused image, specifically:
Obtain the corresponding pixel points, in the first, second, and third longitudinal restored images, of each pixel point in the damaged area of the original archive image; a corresponding pixel point has the same position information as the pixel point in the damaged area, so each pixel point in the damaged area has three corresponding pixel points. For example, the pixel point (0,0) in the damaged area corresponds to the pixel point (0,0) of the first longitudinal restored image, the pixel point (0,0) of the second longitudinal restored image, and the pixel point (0,0) of the third longitudinal restored image.
Take the average of the pixel values of the three corresponding pixel points as the pixel value of the same-position pixel point in the longitudinal fused image. For example, the pixel point (0,0) in the longitudinal fused image corresponds to the pixel point (0,0) in the damaged area; if the pixel value of (0,0) in the first longitudinal restored image is I1(0,0), in the second longitudinal restored image I2(0,0), and in the third longitudinal restored image I3(0,0), then the pixel value of the pixel point (0,0) in the longitudinal fused image is I4(0,0) = [I1(0,0) + I2(0,0) + I3(0,0)]/3.
And taking the pixel values of the pixel points in the non-damaged area in the original file image as the pixel values of the pixel points in the non-damaged area in the longitudinal fusion image.
The non-damaged area in the original file image is the other area of the original file image except the damaged area; and the coordinate values of the pixel points in the non-damaged area in the longitudinal fusion image and the pixel points in the non-damaged area in the original archival image are the same.
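A minimal sketch of this fusion step, assuming the damaged area is given as a boolean mask (the helper name is hypothetical):

```python
import numpy as np

def fuse_restored(original, restored_images, damaged_mask):
    """Inside the damaged area, each pixel of the fused image is the mean
    of the same-position pixels of the (three) restored images; outside
    it, the original archive image's pixel value is kept unchanged."""
    mean_restored = np.mean(np.stack(restored_images), axis=0)
    return np.where(damaged_mask, mean_restored, original)
```

The same helper serves for both the transverse and the longitudinal fusion, since the two differ only in which three restored images are passed in.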
Fusing the transverse fused image and the longitudinal fused image to obtain the repaired image comprises the following steps:
the method comprises the steps of obtaining a to-be-repaired area corresponding to a damaged area in a repaired image, wherein the damaged area and the to-be-repaired area are in one-to-one correspondence, two pixel points in the one-to-one correspondence pixel point pair are respectively from the damaged area and the to-be-repaired area, and the position coordinate values of the two pixel points in the one-to-one correspondence pixel point pair are the same.
Obtain the repair pixel point pair corresponding to each pixel point pair; the two pixel points in a repair pixel point pair come from the transverse fused image and the longitudinal fused image respectively, and have the same position coordinate values as the two pixel points in the corresponding pixel point pair.
And taking the average value of the pixel values of the two pixel points in the repairing pixel point pair as the pixel value of the pixel point with the same position coordinate value as the two pixel points in the repairing pixel point pair in the region to be repaired.
Set the pixel value of the pixel points in the non-repair area to 0. The non-repair area is the part of the repaired image outside the area to be repaired, and it corresponds to the non-damaged area.
Fusing the repaired image with the original archive image to obtain a restored original archive image, which specifically comprises the following steps:
Obtain the area to be repaired in the repaired image; the position coordinates of the pixel points in the area to be repaired are the same as those of the pixel points in the damaged area of the original archive image. Take the pixel value of each pixel point in the area to be repaired as the repair pixel value, and assign it to the pixel point with the same position coordinates in the damaged area of the original archive image.
And keeping the original pixel value of the pixel point of the undamaged area in the original file image.
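These last two fusion steps can be sketched together, again assuming a boolean mask for the damaged area (the helper name is hypothetical):

```python
import numpy as np

def final_restore(original, fused_h, fused_v, damaged_mask):
    """Repaired image: average of the transverse and longitudinal fused
    images inside the area to be repaired, 0 in the non-repair area.
    The restored archive image then takes the repaired values inside the
    damaged area and keeps the original pixel values everywhere else."""
    repaired = np.where(damaged_mask, (fused_h + fused_v) / 2.0, 0.0)
    return np.where(damaged_mask, repaired, original)
```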
By adopting the scheme, the accuracy and the reliability of image recovery can be improved.
As another optional implementation, the restoring the original archive image comprises:
obtaining a damaged area in an original file image;
For a pixel point (i, j) in the damaged area, chaotically map the pixel point (i, j) to the pixel point (i', j'), and assign the pixel value I(i', j') of the pixel point (i', j') in the original archive image to the pixel value I(i, j) of the pixel point (i, j), i.e. I(i, j) = I(i', j'), namely:

i' = | i − a·i²/2 + d mod N |
j' = | 1 − a·i² + j mod N |

wherein (i, j) denotes the position of the pixel point in row i, column j of the damaged area, and (i', j') denotes the position of the corresponding pixel point in the original archive image; a and d are constant parameters, a being an integer from 1 to 2^128 excluding multiples of N, and d an integer from 1 to 2^128. | i − a·i²/2 + d mod N | denotes the absolute value of (i − a·i²/2 + d mod N), and | 1 − a·i² + j mod N | denotes the absolute value of (1 − a·i² + j mod N). Optionally, i = 0, 1, 2, ..., N-1 and j = 0, 1, 2, ..., M-1, where N is the total number of rows of pixel points of the damaged region and M is the total number of columns. Repairing the pixel values in the damaged area by this scheme yields the restored original archive image.
For example, the damaged area includes pixel (0,0), pixel (0,1), pixel (0,2), pixel (1,0), pixel (1,1), pixel (2,0), pixel (2,1), and pixel (3, 0). Then the pixel values in the damaged area are repaired in the following way to obtain the restored original archive image:
Restore the pixel value of the pixel point (0,0): the chaotic mapping formula above yields (i', j') = (1, 8). Then the pixel value I(1,8) of the pixel point (1,8) in the original archive image is assigned to the pixel value I(0,0) of the pixel point (0,0), i.e. I(0,0) = I(1,8).
Restore the pixel value of the pixel point (0,1): the mapping yields (i', j') = (1, 8). Then the pixel value I(1,8) of the pixel point (1,8) in the original archive image is assigned to the pixel value I(0,1) of the pixel point (0,1), i.e. I(0,1) = I(1,8).
Restore the pixel value of the pixel point (2,1): the mapping yields (i', j') = (47, 14). Then the pixel value I(47,14) of the pixel point (47,14) in the original archive image is assigned to the pixel value I(2,1) of the pixel point (2,1), i.e. I(2,1) = I(47,14).
The recovery method of the pixel values of other pixel points in the damaged area is the same as the recovery method of the pixel values of the pixel points (0,1), the recovery method of the pixel values of the pixel points (0,0), and the recovery method of the pixel values of the pixel points (2,1), and specific reference is made to the above methods, which are not repeated herein.
It should be noted that, when the pixel point (I ', j') mapped by the pixel point (I, j) in the calculated damaged area already exceeds the range of the position coordinates of all the pixel points of the original archive image, that is, if the pixel point (I ', j') is not in the original archive image, the pixel value of the pixel point with the shortest euclidean distance from the original archive image to the pixel point (I ', j') is assigned to the pixel value I (I, j) of the pixel point (I, j), specifically:
if the pixel point (I ', j') mapped by the pixel point (I, j) is not in the original file image, obtaining the pixel point (x, y) with the shortest Euclidean distance from the pixel point (I ', j') in the original file image, and assigning the pixel value I (x, y) of the pixel point (x, y) to the pixel value I (I, j) of the pixel point (I, j), namely I (I, j) ═ I (x, y).
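A sketch of this recovery step under one reading of the garbled mapping formula (absolute value applied after the modulus, integer division for a·i²/2) and with illustrative constants a and d; since both mapped coordinates are non-negative, clamping each to the image bounds gives exactly the in-image pixel at the shortest Euclidean distance:

```python
import numpy as np

def chaotic_recover(image, damaged_coords, a=3, d=7):
    """Assign to each damaged pixel (i, j) the value of its chaotically
    mapped source pixel (i', j'); fall back to the nearest in-image pixel
    when (i', j') lies outside the image. `a` and `d` are illustrative."""
    rows, cols = image.shape
    n = max(i for i, _ in damaged_coords) + 1   # N: rows of the damaged region
    out = image.copy()
    for i, j in damaged_coords:
        ip = abs(i - a * i * i // 2 + d) % n
        jp = abs(1 - a * i * i + j) % n
        # clamping yields the Euclidean-nearest in-image point here
        out[i, j] = image[min(ip, rows - 1), min(jp, cols - 1)]
    return out
```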
By adopting this scheme, the damaged region can be restored quickly; and because the restoration of each pixel in the damaged region draws on the pixel information of other pixels across the whole original archive image, the restored original archive image obtained is lifelike, improving the accuracy of image restoration.
As another optional implementation, before performing the transverse restoration on the original archive image based on the first restoration kernel to obtain the first transverse restored image, the method further comprises: for each pixel point (i, j) in the damaged area, chaotically mapping the pixel point (i, j) to the pixel point (i', j') and assigning the pixel value I(i', j') of the pixel point (i', j') in the original archive image to the pixel value I(i, j) of the pixel point (i, j), i.e. I(i, j) = I(i', j'), namely:

i' = | i − a·i²/2 + d mod N |
j' = | 1 − a·i² + j mod N |
wherein (i, j) denotes the position of the pixel point in row i, column j of the damaged area, and (i', j') denotes the position of the corresponding pixel point in the original archive image; a and d are constant parameters, a being an integer from 1 to 2^128 excluding multiples of N, and d an integer from 1 to 2^128. | i − a·i²/2 + d mod N | denotes the absolute value of (i − a·i²/2 + d mod N), and | 1 − a·i² + j mod N | denotes the absolute value of (1 − a·i² + j mod N). Optionally, i = 0, 1, 2, ..., N-1 and j = 0, 1, 2, ..., M-1, where N is the total number of rows of pixel points of the damaged region and M is the total number of columns. Repairing the pixel values in the damaged area by this scheme yields an initial restored image. Then: perform transverse restoration on the initial restored image based on the first restoration kernel to obtain a first transverse restored image; perform transverse restoration on the initial restored image based on the second restoration kernel to obtain a second transverse restored image; perform transverse restoration on the first transverse restored image based on the second restoration kernel to obtain a third transverse restored image; perform longitudinal restoration on the initial restored image based on the first restoration kernel to obtain a first longitudinal restored image; perform longitudinal restoration on the initial restored image based on the second restoration kernel to obtain a second longitudinal restored image; perform longitudinal restoration on the first longitudinal restored image based on the second restoration kernel to obtain a third longitudinal restored image. Fuse the first, second, and third transverse restored images to obtain the transverse fused image; fuse the first, second, and third longitudinal restored images to obtain the longitudinal fused image; fuse the transverse fused image and the longitudinal fused image to obtain the repaired image; and fuse the repaired image with the original archive image to obtain the restored original archive image.
Optionally, the method further includes obtaining the damaged area in the original archive image. The damaged area can be obtained by performing edge detection on the original archive image with a Canny operator: if the edge of a closed region is detected and the enclosed region is smaller than the original archive image, the region enclosed by that edge is determined to be the damaged area.
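The embodiment names the Canny operator (e.g. OpenCV's cv2.Canny followed by contour extraction) for locating the damaged area. To keep this sketch dependency-free it substitutes a much simpler stand-in, treating zero-valued (blank) pixels as the damaged region; this is an assumption for illustration, not the patented detection method:

```python
import numpy as np

def damaged_mask(image, hole_value=0):
    """Stand-in for Canny-based closed-region detection: mark pixels equal
    to `hole_value` as damaged. Returns an all-False mask when the whole
    image (or none of it) matches, mirroring the requirement that the
    closed region be smaller than the original archive image."""
    mask = image == hole_value
    if mask.all() or not mask.any():
        return np.zeros_like(mask, dtype=bool)
    return mask
```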
By adopting this scheme, pixel-value recovery is first performed on the pixel points of the damaged area based on the chaotic mapping and the pixel points of the original archive image, yielding the initial restored image; then the first and second restoration kernels are applied: transverse restoration of the initial restored image based on the first restoration kernel gives the first transverse restored image; transverse restoration of the initial restored image based on the second restoration kernel gives the second transverse restored image; transverse restoration of the first transverse restored image based on the second restoration kernel gives the third transverse restored image; longitudinal restoration of the initial restored image based on the first restoration kernel gives the first longitudinal restored image; longitudinal restoration of the initial restored image based on the second restoration kernel gives the second longitudinal restored image; and longitudinal restoration of the first longitudinal restored image based on the second restoration kernel gives the third longitudinal restored image. The first, second, and third transverse restored images are fused into the transverse fused image; the first, second, and third longitudinal restored images are fused into the longitudinal fused image; the transverse fused image and the longitudinal fused image are fused into the repaired image; and the repaired image is fused with the original archive image to obtain the restored original archive image, which is lifelike, with high image restoration precision and good effect.
Optionally, the method includes generating index information of the restored archive image from the restored archive image through a pre-trained neural network model, specifically: archive features are extracted from the restored archive image through a support vector machine model and used as the index information. The archive features may be keywords in the archive; the pre-trained neural network model here is a pre-trained support vector machine model. Specifically, characters are recognized in the restored archive image through the support vector machine model, and keywords are then extracted from the recognized characters. These keywords are the index information.
Constructing the archive big database from the index information and storing the restored archive images in the big database comprises:
classifying the keywords to obtain keyword categories, and storing the restored archive images whose keywords belong to the same category in the same storage block of the big database. In the big database, restored archive images of different categories are stored in different storage blocks, and archive information of the same category is stored in the same storage block. The category of a restored archive image is the category of the keywords extracted from it, and the category of a piece of archive information is the category of the restored archive image in which it is contained.
The keywords are classified into categories as follows: keywords are grouped by synonymy, so that keywords belonging to the same category are near-synonyms; alternatively, keywords belonging to the same category serve the same function.
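A minimal sketch of this category-block storage; the names are hypothetical, and `records` is assumed to pair each restored archive image identifier with its keyword category:

```python
from collections import defaultdict

def build_archive_db(records):
    """Group restored archive image ids into storage blocks by keyword
    category: images whose keywords share a category share a block."""
    db = defaultdict(list)
    for category, image_id in records:
        db[category].append(image_id)
    return dict(db)
```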
It should be noted that, in the archive big database, each archive information is preset to correspond to one archive query request, and each restored archive image corresponds to one archive query request.
Therefore, when an archive query request is received from a user, the archive information corresponding to that request is retrieved from the archive big database and sent to the user. This makes it convenient for users to query archive information, facilitates sharing of archive information, and realizes intelligent archive information management.
The embodiment of the present application further provides an execution subject for executing the above steps, and the execution subject may be the digital archive management system 200 based on the big data application. The system comprises:
the scanning module is used for scanning to obtain an original file image;
the recovery module is used for carrying out image restoration on the original file image to obtain a restored file image;
the index module is used for generating index information of the repaired archive image based on the repaired archive image through a pre-trained neural network model;
the storage module is used for constructing the archive big database based on the index information and storing the restored archive image into the big database; the restored archive image contains archive information;
the management module is used for managing the archive information in the archive big database: and if a file inquiry request of a user is received, file information corresponding to the file inquiry request is called from the file big database, and the file information is sent to the user.
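The five modules can be sketched as a minimal pipeline class; all names are hypothetical, and the injected callables stand in for the scanning, restoration, and indexing steps described above:

```python
class DigitalArchiveSystem:
    """Sketch of system 200: scan -> restore -> index -> store, plus a
    management method answering archive query requests from the database."""

    def __init__(self, scan, restore, index):
        self.scan, self.restore, self.index = scan, restore, index
        self.db = {}                      # index key -> restored archive images

    def ingest(self, source):
        image = self.scan(source)         # scanning module
        restored = self.restore(image)    # restoration module
        key = self.index(restored)        # index module
        self.db.setdefault(key, []).append(restored)  # storage module
        return key

    def query(self, key):
        """Management module: return archive info for a query request."""
        return self.db.get(key, [])
```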
With regard to the system in the above embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, the functional modules in the embodiments of the present invention may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented as software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied as a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or any other medium capable of storing program code.

It is noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another without necessarily requiring or implying any actual such relationship or order between them. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to it. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.

Claims (5)

1. A digital archive management method based on big data application, the method comprising:
scanning to obtain an original archive image;
performing image restoration on the original archive image to obtain a restored archive image;
generating index information for the restored archive image through a pre-trained neural network model;
constructing a big archive database based on the index information, and storing the restored archive image, which contains the archive information, into the big archive database;
managing the archive information in the big archive database: if an archive query request from a user is received, retrieving the archive information corresponding to the request from the big archive database and sending it to the user.
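The index-store-query flow of this claim can be sketched as follows. Everything here is an illustrative assumption, not the patent's implementation: the dictionary stands in for the big archive database, and the index terms stand in for the output of the pre-trained neural network model.

```python
# Illustrative sketch of claim 1's storage and query steps.
# `archive_db` maps an index term to the archive records filed under it;
# a real system would use a database, and the index terms would come from
# the pre-trained neural network model.
archive_db = {}

def store_archive(index_terms, image_id, archive_info):
    """File one restored archive image under each of its index terms."""
    for term in index_terms:
        archive_db.setdefault(term, []).append((image_id, archive_info))

def query_archive(term):
    """Return the archive information matching a user's query term."""
    return [info for _, info in archive_db.get(term, [])]

store_archive(["land deed", "1952"], "img_001", {"title": "Land deed, 1952"})
print(query_archive("1952"))  # [{'title': 'Land deed, 1952'}]
```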
2. The method of claim 1, wherein performing image restoration on the original archive image to obtain a restored archive image comprises:
downscaling the original archive image to obtain a first restoration kernel and a second restoration kernel, wherein the first restoration kernel is larger in size than the second restoration kernel, and both are smaller in size than the original archive image;
performing transverse restoration on the original archive image based on the first restoration kernel and the second restoration kernel to obtain a transverse fusion image;
performing longitudinal restoration on the original archive image based on the first restoration kernel and the second restoration kernel to obtain a longitudinal fusion image;
fusing the transverse fusion image and the longitudinal fusion image to obtain a repaired image;
and fusing the repaired image with the original archive image to obtain the restored archive image.
3. The method of claim 1, wherein performing transverse restoration on the original archive image based on the first restoration kernel and the second restoration kernel to obtain a transverse fusion image comprises:
performing transverse restoration on the original archive image based on the first restoration kernel to obtain a first transverse restoration image;
performing transverse restoration on the original archive image based on the second restoration kernel to obtain a second transverse restoration image;
performing transverse restoration on the first transverse restoration image based on the second restoration kernel to obtain a third transverse restoration image;
and fusing the first, second, and third transverse restoration images to obtain the transverse fusion image.
4. The method of claim 3, wherein fusing the first, second, and third transverse restoration images to obtain the transverse fusion image comprises:
obtaining, for each pixel in the damaged area of the original archive image, the corresponding pixels in the first, second, and third transverse restoration images, namely the pixels having the same position information as that pixel, so that each pixel in the damaged area corresponds to three pixels;
taking the average of the pixel values of the three corresponding pixels as the pixel value of the corresponding pixel in the transverse fusion image;
taking the pixel values of the non-damaged area of the original archive image as the pixel values of the non-damaged area of the transverse fusion image;
wherein the non-damaged area of the original archive image is the area of the archive image other than the damaged area, and pixels in the non-damaged area of the transverse fusion image have the same coordinates as the corresponding pixels of the archive image.
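The fusion rule of this claim — average the three corresponding pixels inside the damaged area, copy the original pixel values everywhere else — can be sketched in a few lines (all names below are assumptions):

```python
import numpy as np

def fuse_restorations(restored_images, original, damaged):
    """Claim-4-style fusion: each damaged pixel takes the mean of its
    corresponding pixels (same coordinates) in the restored images; every
    non-damaged pixel keeps its value from the original archive image."""
    avg = np.mean(np.stack(restored_images, axis=0), axis=0)
    return np.where(damaged, avg, original)

# Toy example: one damaged pixel at (0, 1), three candidate restorations.
original = np.array([[1.0, 0.0], [3.0, 4.0]])
damaged = np.array([[False, True], [False, False]])
restored = [original + d for d in (1.0, 2.0, 3.0)]
print(fuse_restorations(restored, original, damaged))  # [[1. 2.] [3. 4.]]
```

The damaged pixel receives the mean of its three restored values (1, 2, 3 → 2.0), while the non-damaged pixels pass through unchanged.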
5. The method of claim 4, wherein performing longitudinal restoration on the original archive image based on the first restoration kernel and the second restoration kernel to obtain a longitudinal fusion image comprises:
performing longitudinal restoration on the original archive image based on the first restoration kernel to obtain a first longitudinal restoration image;
performing longitudinal restoration on the original archive image based on the second restoration kernel to obtain a second longitudinal restoration image;
performing longitudinal restoration on the first longitudinal restoration image based on the second restoration kernel to obtain a third longitudinal restoration image;
and fusing the first, second, and third longitudinal restoration images to obtain the longitudinal fusion image.
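This claim mirrors claim 3 along columns instead of rows. One convenient way to illustrate that symmetry (an implementation choice assumed here, not stated in the patent) is that any transverse pass can be reused longitudinally by transposing the image and the damage mask:

```python
import numpy as np

def transverse_fill(img, damaged):
    """Toy transverse pass: each damaged pixel takes the mean of the valid
    pixels in its own row (a stand-in for a kernel-based transverse restore)."""
    out = img.astype(float).copy()
    for r in range(img.shape[0]):
        bad = damaged[r]
        if bad.any() and (~bad).any():
            out[r, bad] = img[r, ~bad].mean()
    return out

def longitudinal_fill(img, damaged):
    """A longitudinal pass is the same operation applied to the transposed
    image: transpose, restore row-wise, transpose back."""
    return transverse_fill(img.T, damaged.T).T
```

For a damaged pixel, `longitudinal_fill` averages the valid pixels in the same column, exactly the column-wise analogue of the row-wise pass.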
CN202111086516.9A 2021-09-16 2021-09-16 Digital archive management method and system based on big data application Active CN113792169B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111086516.9A CN113792169B (en) 2021-09-16 2021-09-16 Digital archive management method and system based on big data application


Publications (2)

Publication Number Publication Date
CN113792169A true CN113792169A (en) 2021-12-14
CN113792169B CN113792169B (en) 2022-05-10

Family

ID=79183573

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111086516.9A Active CN113792169B (en) 2021-09-16 2021-09-16 Digital archive management method and system based on big data application

Country Status (1)

Country Link
CN (1) CN113792169B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060072830A1 (en) * 2004-02-26 2006-04-06 Xerox Corporation Method for automated image indexing and retrieval
US20100177978A1 (en) * 2009-01-14 2010-07-15 Samsung Electronics Co., Ltd. Image restoring apparatus and method thereof
US20140126833A1 (en) * 2012-11-02 2014-05-08 Cyberlink Corp. Systems and Methods for Performing Image Inpainting
CN107451277A (en) * 2017-08-04 2017-12-08 光典信息发展有限公司 Image file processing method and device
CN109712092A (en) * 2018-12-18 2019-05-03 上海中信信息发展股份有限公司 Archives scan image repair method, device and electronic equipment
CN110688348A (en) * 2019-10-09 2020-01-14 李智鹏 File management system
CN112541490A (en) * 2020-12-03 2021-03-23 广州城市规划技术开发服务部有限公司 Archive image information structured construction method and device based on deep learning
CN113190502A (en) * 2021-01-26 2021-07-30 云南电网有限责任公司信息中心 Archive management method based on deep learning


Also Published As

Publication number Publication date
CN113792169B (en) 2022-05-10


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant