CN114494750A - Computer-assisted oracle bone conjugation method based on orthogonal V-system transformation - Google Patents

Computer-assisted oracle bone conjugation method based on orthogonal V-system transformation

Info

Publication number
CN114494750A
CN114494750A (application CN202210127638.6A)
Authority
CN
China
Prior art keywords
vector
edge
image
rubbing
denotes
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210127638.6A
Other languages
Chinese (zh)
Other versions
CN114494750B (en)
Inventor
宋传鸣
武惠娟
张焱程
洪飏
王相海
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Liaoning Normal University
Original Assignee
Liaoning Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Liaoning Normal University filed Critical Liaoning Normal University
Priority to CN202210127638.6A priority Critical patent/CN114494750B/en
Publication of CN114494750A publication Critical patent/CN114494750A/en
Application granted granted Critical
Publication of CN114494750B publication Critical patent/CN114494750B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/40 Image enhancement or restoration by the use of histogram techniques
    • G06T5/90
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G06T7/136 Segmentation; Edge detection involving thresholding
    • G06T7/187 Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image

Abstract

The invention discloses a computer-assisted oracle bone conjugation (rejoining) method based on the orthogonal V-system transform. The outer edge contour of an oracle bone rubbing is extracted with morphological operations and an edge-tracking algorithm. Corners of large curvature on the bone edge are detected with a curvature scale space algorithm; taking these corners as end points, several edge segments carrying fracture-notch features are selected for matching, and candidate rubbings are retrieved from a database by edge-to-edge comparison. Each edge segment to be matched is subjected to an orthogonal V-system transform to construct a V-descriptor vector; the low- and intermediate-frequency coefficients of the transform are then used to build an approximation vector and a detail vector of the edge, which are further converted into direction histograms through Freeman chain codes. Using the Euclidean distance and the Pearson correlation coefficient as similarity measures, an initial retrieval with the V-descriptor vectors and a fine retrieval with the direction histograms together determine the set of rubbings available for conjugation.

Description

Computer-assisted oracle bone conjugation method based on orthogonal V-system transformation
Technical Field
The invention relates to the intersection of digital image processing and ancient-script information processing, and in particular to a computer-assisted oracle bone conjugation method based on the orthogonal V-system transform that can effectively resist boundary flaws and defects and offers good fault tolerance in matching edge fracture notches, high accuracy, good robustness and strong adaptive capacity.
Background
Oracle bone script comprises the characters carved on tortoise shells and animal bones to record divinations in the late Shang dynasty; it is the earliest systematic writing discovered in China so far and has extremely high preservation and research value. However, oracle bones are brittle and fragile, and having been drilled, chiseled and scorched during divination, most have broken into fragments whose inscriptions are incomplete and difficult to interpret. Hence the observation that "the fragmentary condition of the excavated and collected oracle bones determines that conjugation will always be the most urgent and most fundamental work in oracle bone studies". Only by rejoining the oracle bone fragments can the character forms, positions, grammar and contents of the divination inscriptions be understood more deeply, so the research significance and cultural value of conjugation are enormous.
According to incomplete statistics, more than 160,000 oracle bone pieces have been unearthed in China, and fragments of the same bone may be scattered across different collections. Sorting and rejoining the fragments by hand, relying solely on personal experience and memory, involves an easily imagined workload, difficulty and expertise threshold, and repeated handling may cause further damage to the bones. To address this, the fundamental purpose of computer-assisted oracle bone conjugation is to take rubbing images of the bones as its object and, guided by the professional experience of human experts, to model and extract quantitative features suitable for automatic rejoining, thereby achieving fully automatic or semi-automatic splicing of broken oracle bone pieces. Such methods promise to break through the limitations of expert dependence, empirical knowledge, spatial and temporal dispersion and working efficiency; they form one of the new directions of current oracle bone research and an important route toward digitizing oracle bones and advancing their study.
The existing computer-assisted oracle bone conjugation techniques fall roughly into four categories: splicing based on digital coding, splicing based on corner features, splicing based on edge features, and splicing based on shape features.
(1) Splicing based on digital coding. Zhou et al. determine splicing rules from elements of each oracle bone fragment such as its edges, texture and orientation, then digitally encode every fragment according to its catalogue number, its position on the complete tortoise shell, its size, the script style of the engraved characters, the engraving position on the fragment and similar information, and complete the splicing according to specific coding rules. Tong et al. refined this encoding and splicing scheme so that, besides adjacent bone plates, individual fragments can be digitally encoded once more than a quarter of the fragments of a plate have been rejoined. Similarly, Ge et al. performed computer-based conjugation on the basis of digital processing of the fragments. These methods are convenient for manual identification but comparatively labor-intensive and ill-suited to automatic processing by computer; in particular, when the encoded description of a fragment to be conjugated is inaccurate, the matching accuracy is directly affected.
(2) Splicing based on corner features. Lin et al. extract the edge features of fragment images through preprocessing operations such as edge extraction, edge vectorization and polygon adjustment, analyze the corner features under three cases (concave-convex corners, consecutive multiple corners and 360-degree corners), and further propose an edge-length-ratio method and a splicing method for concave and convex edge corners, searching all candidate fragments for corners satisfying the sum-of-squares constraint on the ratio to splice the fragments. In addition, Shi et al. obtain an acceptable fragment-splicing result through rough matching based on corner features, matching based on corner edge-length features, and a search-and-repair pass over the fragment matches. However, the computational cost of both methods is still high, and for severely worn fragments the boundary flaws and defects often introduce interfering features, which inevitably raises the probability of mismatches.
(3) Splicing based on edge features. Wang et al. compute, from the contour sequence points of a rubbing image, the angle formed by the lines connecting every three consecutive contour points and match fragment rubbings using this angle as the feature. Wang Aimin et al. first extract the contour of each fragment, compute contour features such as chain-code descriptors, Fourier descriptors and B-spline descriptors to obtain a contour feature vector, then match the feature vector of the fragment to be spliced against that of every image in the database under a chosen distance measure (such as the Euclidean distance or a similarity distance) to complete the conjugation. Zhang Sheng et al. add margin lines and tangent lines to the fragments through manual interaction, let the computer automatically generate the time sequence of the margin lines, and, assisted by the tangent lines, complete conjugation through incremental rotation of the fragments. Tian et al. extract boundary features of the rubbing image such as corners, edges and shape, and select candidate contour segments from the database for comparison using curve-fitting analysis and the Pearson correlation coefficient. However, threshold selection in these methods is difficult, their edge-feature distance measures are sensitive, and the concavity and convexity between fragments are ignored, so a satisfactory conjugation result is obtained only when the fracture notches match very closely; the fault tolerance of edge matching is low, and manual interactive intervention is needed in some cases.
(4) Splicing based on shape features. Zhang et al. obtain the original boundary of a fragment with methods such as boundary augmentation and boundary extraction, move it onto a target boundary with geometric image operations (such as translation and rotation), and take the ratio of the maximum matched boundary length to the area of the gap between matched boundaries as the similarity measure, thereby finding the candidate fragment with the highest similarity and completing automatic splicing. Zhouchui et al. use a triangulation-based contour-curve extraction and filtering algorithm to obtain the contour curve of a two-dimensional image and propose a two-dimensional contour matching algorithm with multi-scale adaptive thresholds. Although splicing based on outer-contour shape features achieves a certain effect, the boundaries of severely worn fragments often carry flaws and defects that introduce non-negligible errors into such methods and make correct rejoining difficult.
In view of the above, achieving computer-automated conjugation of oracle bone fragments remains a very challenging task. At present there is no computer-assisted oracle bone conjugation method that can effectively resist boundary flaws and defects while offering good fault tolerance in matching edge fracture notches, high accuracy, good robustness and strong adaptive capacity.
Disclosure of Invention
The invention aims to solve the above technical problems in the prior art and provides a computer-assisted oracle bone conjugation method based on the orthogonal V-system transform that can effectively resist boundary flaws and defects and offers good fault tolerance in matching edge fracture notches, high accuracy, good robustness and strong adaptive capacity.
The technical solution of the invention is as follows: a computer-assisted oracle bone conjugation method based on orthogonal V-system transformation is characterized by comprising the following steps:
Step 1. Input the oracle bone rubbing image I to be matched, whose height is h_I pixels and width is w_I pixels;
Step 2. Convert I from the RGB color space to HSV and extract its brightness component I_V;
Step 3. Apply histogram equalization to I_V to enhance its brightness contrast, obtaining I'_V;
Step 4. Compute an optimal global segmentation threshold T_V for I'_V by the Otsu method, and binarize I'_V with T_V to obtain the image I_bw;
Step 5. Invert the binary image I_bw so that the background of the rubbing becomes white and the character area becomes black, obtaining I'_bw;
Step 6. Perform 8-connected component analysis on I'_bw and fill the background holes inside the white foreground region with a seed-filling algorithm, thereby suppressing the point noise and patchy speckle contained in I'_bw and obtaining the image I''_bw;
Step 7. Label the 8-connected regions of I''_bw whose area is smaller than A_conn and fill them, thereby removing the discretely distributed fracture striations, shield striations and tooth gaps from I''_bw and obtaining the binary image I'''_bw, where A_conn is a preset constant;
Step 8. Apply a morphological dilation to I'''_bw with a disc-shaped structuring element of radius r, smoothing the burrs caused by uneven contact between the paper and the outer edge of the bone during rubbing, and obtain the image I_fix, where r is a preset constant;
Step 9. Trace the outer edge contour of the foreground region in I_fix by the bwperim method in 8-connected tracking mode, obtaining a single-pixel-wide edge e, and initialize the set Ω of rubbings available for conjugation as empty;
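For illustration only, the preprocessing chain of Steps 1 to 9 can be sketched in Python with OpenCV and NumPy as follows. The function name preprocess_rubbing, the lowercase parameter names a_conn and r, the flood-fill hole filling and the use of findContours in place of the bwperim tracking named above are assumptions of this sketch, not the claimed implementation.

```python
import cv2
import numpy as np

def preprocess_rubbing(path, a_conn=100, r=4):
    """Sketch of Steps 1-9: binarize a rubbing image and trace its outer contour."""
    img = cv2.imread(path)                                   # Step 1: rubbing image I
    v = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)[:, :, 2]        # Step 2: brightness component I_V
    v_eq = cv2.equalizeHist(v)                               # Step 3: histogram equalization
    # Steps 4-5: Otsu threshold; THRESH_BINARY_INV plays the role of the colour inversion,
    # assuming the inked bone area is darker than the paper background.
    _, bw = cv2.threshold(v_eq, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    # Step 6: fill holes inside the (white) foreground by flood-filling the background.
    filled = bw.copy()
    mask = np.zeros((bw.shape[0] + 2, bw.shape[1] + 2), np.uint8)
    cv2.floodFill(filled, mask, (0, 0), 255)                 # assumes (0, 0) is background
    bw = bw | cv2.bitwise_not(filled)
    # Step 7: remove 8-connected regions smaller than a_conn pixels.
    n, labels, stats, _ = cv2.connectedComponentsWithStats(bw, connectivity=8)
    for k in range(1, n):
        if stats[k, cv2.CC_STAT_AREA] < a_conn:
            bw[labels == k] = 0
    # Step 8: dilate with a disc of radius r to smooth rubbing burrs.
    disc = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (2 * r + 1, 2 * r + 1))
    fix = cv2.dilate(bw, disc)
    # Step 9: outer contour of the largest foreground region (stand-in for bwperim tracking).
    contours, _ = cv2.findContours(fix, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    edge = max(contours, key=cv2.contourArea).squeeze(1)     # (N, 2) array of (x, y) pixels
    return edge
```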
Step 10. Detect the corners of the edge e with a curvature scale space algorithm using scale factor σ to obtain the corner sequence {c_s(x_s, y_s) | s = 1, 2, …, N_c}, and take the corner with the smallest abscissa as the traversal starting point c_start, where σ is a preset constant, c_s(x_s, y_s) denotes the coordinates of the s-th corner, N_c denotes the number of corners, and 1 ≤ s ≤ N_c;
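The curvature scale space detector itself is not reproduced in the text, so the following sketch stands in for Step 10 by smoothing the closed contour with a Gaussian of scale sigma and keeping local curvature maxima above a threshold; the helper name detect_corners and the threshold value are illustrative assumptions, not the patented CSS procedure.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def detect_corners(edge, sigma=5.0, curv_thresh=0.05):
    """Approximate corner detection on a closed contour (simplified CSS stand-in)."""
    x = gaussian_filter1d(edge[:, 0].astype(float), sigma, mode="wrap")
    y = gaussian_filter1d(edge[:, 1].astype(float), sigma, mode="wrap")
    dx, dy = np.gradient(x), np.gradient(y)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    curvature = np.abs(dx * ddy - dy * ddx) / ((dx * dx + dy * dy) ** 1.5 + 1e-12)
    # keep local maxima of curvature above the threshold as corner candidates
    prev, nxt = np.roll(curvature, 1), np.roll(curvature, -1)
    idx = np.where((curvature > curv_thresh) & (curvature >= prev) & (curvature >= nxt))[0]
    corners = edge[idx]                         # corner coordinates c_s(x_s, y_s)
    start = corners[np.argmin(corners[:, 0])]   # traversal start: smallest abscissa
    return corners, start
```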
Step 11, take out and c along the clockwise directionstartAdjacent len angular points to obtain a point set
Figure BDA0003501093780000042
Then taking out the product along the counterclockwise direction and cstartAdjacent len angular points to obtain a point set
Figure BDA0003501093780000043
The len is a preset constant number which,
Figure BDA0003501093780000044
denotes in the clockwise direction and cstartThe adjacent 1 st corner point is,
Figure BDA0003501093780000045
denotes in the clockwise direction and cstartThe adjacent 2 nd corner point is,
Figure BDA0003501093780000046
denotes in the clockwise direction and cstartThe adjacent lenth corner point is,
Figure BDA0003501093780000047
denotes in the counterclockwise direction and cstartThe adjacent 1 st corner point is,
Figure BDA0003501093780000048
denotes in the counterclockwise direction and cstartThe adjacent 2 nd corner point is,
Figure BDA0003501093780000049
denotes in the counterclockwise direction and cstartThe adjacent len-th corner point;
step 12, extracting from the edge e
Figure BDA00035010937800000410
As a starting point, with cstartIs the midpoint
Figure BDA00035010937800000411
A section of the edge of the end point is used as the edge e to be processed currentlyc
Step 13, if the edge ecHas been already coveredIf the processed image I is the current oracle bone thesaurus rubbing image I to be matched, the step 36 is carried out, otherwise, the step 14 is carried out;
step 14. in
Figure BDA00035010937800000412
As a starting point, edge e is subtended in a clockwise directioncThe pixels are scanned and converted into a one-dimensional boundary pixel vector { p }i(xi,yi)|i=1,2,…,NeIs said pi(xi,yi) Coordinates representing the ith boundary pixel, NeDenotes ecThe number of pixels included, i is more than or equal to 1 and less than or equal to Ne
Step 15. calculate edge ecThe center of gravity is positioned at the origin of the coordinate system by translation operation, and the edge e is positioned by rotation operationcRotate about its center of gravity such that
Figure BDA00035010937800000413
Parallel to the X-axis of the coordinate system, to obtain an updated boundary pixel vector { p'i(xi,yi)|i=1,2,…,NeH, p'i(xi,yi) Indicating the updated coordinates of the ith boundary pixel point,
Figure BDA0003501093780000051
is represented by
Figure BDA0003501093780000052
Point of direction
Figure BDA0003501093780000053
The vector of (a);
step 16. to the boundary pixel vector { p'i(xi,yi)|i=1,2,…,NeCarrying out horizontal mirror image operation around the gravity center of the pixel to obtain a mirrored boundary pixel vector { p ″)i(xi,yi)|i=1,2,…,NeIs p ″)i(xi,yi) Is shown asCoordinates of the i boundary pixel points after horizontal mirroring;
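Steps 15 and 16 amount to a pose normalization of the extracted edge segment: translate the center of gravity to the origin, rotate the chord joining the two end points onto the X-axis, and build a horizontally mirrored copy. A minimal sketch, assuming the segment is given as an (N, 2) array of pixel coordinates ordered from the starting end point to the ending end point:

```python
import numpy as np

def normalize_segment(points):
    """Steps 15-16 (sketch): centre, rotate the end-to-end chord onto the X-axis, mirror."""
    p = np.asarray(points, dtype=float)
    p -= p.mean(axis=0)                      # translate the centre of gravity to the origin
    chord = p[-1] - p[0]                     # vector from the first to the last boundary pixel
    ang = np.arctan2(chord[1], chord[0])
    c, s = np.cos(-ang), np.sin(-ang)
    rot = np.array([[c, -s], [s, c]])        # rotation by -ang maps the chord onto the X-axis
    p_rot = p @ rot.T
    p_mir = p_rot * np.array([-1.0, 1.0])    # horizontal mirror about the centre of gravity
    return p_rot, p_mir
```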
Step 17. Determine the transform order L from N_e so that the 2^L-order transform covers the boundary pixel vector, then apply a 2^L-order orthogonal V-system transform to {p'_i(x_i, y_i) | i = 1, 2, …, N_e} to obtain the spectrum vector X' of its abscissas and the spectrum vector Y' of its ordinates;
Step 18. Apply a 2^L-order orthogonal V-system transform to the mirrored boundary pixel vector {p''_i(x_i, y_i) | i = 1, 2, …, N_e} to obtain the spectrum vector X'' of its abscissas and the spectrum vector Y'' of its ordinates;
Step 19. Construct the V-descriptor vector α = (α_1, α_2, …, α_{2^L}) of {p'_i(x_i, y_i) | i = 1, 2, …, N_e} according to formula (1), α_k = X'_k + j·Y'_k, where α_k denotes the k-th component of α, X'_k and Y'_k denote the k-th components of the spectrum vectors X' and Y' respectively, 1 ≤ k ≤ 2^L, and j denotes the imaginary unit;
Step 20. Construct the V-descriptor vector β = (β_1, β_2, …, β_{2^L}) of {p''_i(x_i, y_i) | i = 1, 2, …, N_e} according to formula (2), β_k = X''_k + j·Y''_k, where β_k denotes the k-th component of β, X''_k and Y''_k denote the k-th components of the spectrum vectors X'' and Y'' respectively, and 1 ≤ k ≤ 2^L;
Step 21. Normalize the V-descriptor vectors α and β according to formulas (3) and (4), obtaining the normalized V-descriptor vectors α̃ and β̃ respectively;
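Because the orthogonal V-system transform and formulas (1) to (4) are reproduced only as images in the source, the sketch below merely shows how a V-descriptor vector could be assembled around a placeholder v_transform function. The choice L = ceil(log2(N_e)), the resampling to 2^L samples, the complex combination X_k + jY_k and the normalization by the leading coefficient are all assumptions of the sketch, not the patented formulas.

```python
import numpy as np

def v_descriptor(points, v_transform):
    """Assemble a V-descriptor vector from an (N, 2) boundary pixel array (sketch).

    `v_transform(signal)` is a placeholder for a 2^L-order orthogonal V-system
    transform returning 2^L spectral coefficients; it is NOT implemented here.
    """
    n = len(points)
    L = int(np.ceil(np.log2(n)))                 # assumed choice of the transform order
    m = 2 ** L
    # resample the abscissas/ordinates to 2^L samples before transforming (assumption)
    t_old = np.linspace(0.0, 1.0, n)
    t_new = np.linspace(0.0, 1.0, m)
    xs = np.interp(t_new, t_old, points[:, 0])
    ys = np.interp(t_new, t_old, points[:, 1])
    X, Y = v_transform(xs), v_transform(ys)      # abscissa and ordinate spectrum vectors
    alpha = X + 1j * Y                           # assumed reading of formula (1)
    alpha_n = np.abs(alpha) / (np.abs(alpha[0]) + 1e-12)  # assumed normalization, cf. (3)
    return alpha_n, X, Y, L
```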
Step 22. Extract the first 2^L/4 components of X' and Y' to obtain the low-frequency coefficient set A_L, and then extract the (2^L/4+1)-th to the (2^L/2)-th components of X' and Y' to form the intermediate-frequency coefficient set A_M;
Step 23. Extract the first 2^L/4 components of X'' and Y'' to obtain the low-frequency coefficient set B_L, and then extract the (2^L/4+1)-th to the (2^L/2)-th components of X'' and Y'' to form the intermediate-frequency coefficient set B_M;
Step 24. Keep the intermediate-frequency and high-frequency coefficients at zero and perform a 2^L-order inverse orthogonal V-system transform using the low-frequency coefficient set A_L alone, obtaining the approximation vector {m'_i(x_i, y_i) | i = 1, 2, …, N_e} of the boundary pixel vector {p'_i(x_i, y_i) | i = 1, 2, …, N_e}, where m'_i(x_i, y_i) denotes the coordinates of its i-th pixel;
Step 25. Keep the intermediate-frequency and high-frequency coefficients at zero and perform a 2^L-order inverse orthogonal V-system transform using the low-frequency coefficient set B_L alone, obtaining the approximation vector {m''_i(x_i, y_i) | i = 1, 2, …, N_e} of the boundary pixel vector {p''_i(x_i, y_i) | i = 1, 2, …, N_e}, where m''_i(x_i, y_i) denotes the coordinates of its i-th pixel;
Step 26. Keep the low-frequency and high-frequency coefficients at zero and perform a 2^L-order inverse orthogonal V-system transform using the intermediate-frequency coefficient set A_M alone, obtaining the detail vector {n'_i(x_i, y_i) | i = 1, 2, …, N_e} of the boundary pixel vector {p'_i(x_i, y_i) | i = 1, 2, …, N_e}, where n'_i(x_i, y_i) denotes the coordinates of its i-th pixel;
Step 27. Keep the low-frequency and high-frequency coefficients at zero and perform a 2^L-order inverse orthogonal V-system transform using the intermediate-frequency coefficient set B_M alone, obtaining the detail vector {n''_i(x_i, y_i) | i = 1, 2, …, N_e} of the boundary pixel vector {p''_i(x_i, y_i) | i = 1, 2, …, N_e}, where n''_i(x_i, y_i) denotes the coordinates of its i-th pixel;
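Steps 22 to 27 split each spectrum into a low band (the first quarter of the coefficients) and an intermediate band (the second quarter), then reconstruct an approximation curve and a detail curve from each band alone. A sketch, again assuming a placeholder inverse_v_transform for the inverse orthogonal V-system transform:

```python
import numpy as np

def band_reconstructions(X, Y, inverse_v_transform):
    """Steps 22-27 (sketch): rebuild approximation and detail curves from frequency bands.

    `inverse_v_transform(coeffs)` is a placeholder for the 2^L-order inverse
    orthogonal V-system transform; it is NOT implemented here.
    """
    m = len(X)                        # m = 2^L spectral coefficients per coordinate
    low, mid = m // 4, m // 2         # low band: first m//4 coefficients; mid band: next m//4

    def keep(coeffs, lo, hi):
        out = np.zeros_like(coeffs)   # every coefficient outside the kept band stays zero
        out[lo:hi] = coeffs[lo:hi]
        return out

    approx = np.stack([inverse_v_transform(keep(X, 0, low)),
                       inverse_v_transform(keep(Y, 0, low))], axis=1)    # approximation vector
    detail = np.stack([inverse_v_transform(keep(X, low, mid)),
                       inverse_v_transform(keep(Y, low, mid))], axis=1)  # detail vector
    return approx, detail
```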
Step 28. Represent {m'_i(x_i, y_i) | i = 1, 2, …, N_e} with an 8-direction Freeman chain code and compute its direction histogram H'_L;
Step 29. Represent {m''_i(x_i, y_i) | i = 1, 2, …, N_e} with an 8-direction Freeman chain code and compute its direction histogram H''_L;
Step 30. Represent {n'_i(x_i, y_i) | i = 1, 2, …, N_e} with an 8-direction Freeman chain code and compute its direction histogram H'_H;
Step 31. Represent {n''_i(x_i, y_i) | i = 1, 2, …, N_e} with an 8-direction Freeman chain code and compute its direction histogram H''_H;
Step 32, according to the similarity measure function given by the formula (5), the normalized V-descriptor vector is utilized
Figure BDA0003501093780000071
And
Figure BDA0003501093780000072
retrieving sums e from a database of oracle bone rubscFirst N with most similar edgescandidateThe oracle bone rubbing image is formed into a candidate matching rubbing image subset Scandidate
Figure BDA0003501093780000073
Euc (·,) denotes the Euclidean distance, max denotes taking the maximum function,
Figure BDA0003501093780000074
represents a V-descriptor vector, N, corresponding to any section of edge of any image of the rubbing to be matched in the oracle bone rubbing databasecandidateIs a preset constant;
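Formula (5) appears only as an image, so the following sketch assumes one plausible reading of Step 32: score a database edge by the larger of its Euclidean distances to the forward and mirrored query descriptors, keep the best edge per rubbing, and return the N_candidate closest rubbings. The dictionary layout of database is an assumption; the default n_candidate = 10 mirrors the embodiment value.

```python
import numpy as np

def coarse_retrieval(alpha_q, beta_q, database, n_candidate=10):
    """Step 32 (sketch): rank database edges by Euclidean distance of V-descriptors.

    `database` maps a rubbing id to a list of normalized descriptor vectors,
    one per edge segment of that rubbing.
    """
    scores = {}
    for rid, descriptors in database.items():
        best = np.inf
        for gamma in descriptors:
            d = max(np.linalg.norm(alpha_q - gamma), np.linalg.norm(beta_q - gamma))
            best = min(best, d)                # best-matching edge segment of this rubbing
        scores[rid] = best
    ranked = sorted(scores, key=scores.get)    # smaller distance = more similar
    return ranked[:n_candidate]                # candidate subset S_candidate
```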
Step 33. According to the similarity measure function given by formula (6), use the direction histograms H'_L, H'_H, H''_L and H''_H to retrieve from S_candidate the N'_candidate rubbing images whose edges are most similar to e_c, arrange them in non-ascending order of similarity, and obtain the refined candidate matching subset S'_candidate, where ρ(·, ·) denotes the Pearson correlation coefficient, the comparison uses the direction histograms of the approximation vector and of the detail vector of each edge segment of each rubbing image in S_candidate, and a, b and N'_candidate are all preset constants;
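Formula (6) is likewise available only as an image. The sketch below assumes Step 33 combines the Pearson correlations of the approximation-vector and detail-vector histograms with weights a and b (the embodiment values 0.7 and 0.3), takes the better of the forward and mirrored query variants, and keeps the top n_refine rubbings; n_refine stands in for the preset count whose value is not given in the text.

```python
import numpy as np

def fine_retrieval(hists_q, candidates, a=0.7, b=0.3, n_refine=5):
    """Step 33 (sketch): re-rank S_candidate by Pearson correlation of direction histograms."""
    hl1, hh1, hl2, hh2 = hists_q            # query: approx/detail histograms, forward & mirrored
    scores = {}
    for rid, segments in candidates.items():
        best = -np.inf
        for h_l, h_h in segments:           # one (approx, detail) histogram pair per edge segment
            s_fwd = a * np.corrcoef(hl1, h_l)[0, 1] + b * np.corrcoef(hh1, h_h)[0, 1]
            s_mir = a * np.corrcoef(hl2, h_l)[0, 1] + b * np.corrcoef(hh2, h_h)[0, 1]
            best = max(best, s_fwd, s_mir)
        scores[rid] = best
    # non-ascending order of similarity, keep the top n_refine rubbings
    return sorted(scores, key=scores.get, reverse=True)[:n_refine]
```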
Step 34. Let Ω = Ω ∪ S'_candidate, i.e., add the refined candidate subset S'_candidate to the set Ω of rubbings available for conjugation;
Step 35. Update the traversal starting point c_start to the next corner along the clockwise direction and return to Step 11;
Step 36. Output the set Ω of rubbings available for conjugation.
Compared with the prior art, the invention has the following advantages. First, the discontinuous orthogonal V-system transform decomposes a contour curve into a linear combination of piecewise continuous functions of different scales and shapes, so geometric properties of the rubbing's edge contour at each scale, such as derivatives, extreme points and shape, are easy to express; the fractal information of the rubbing is extracted more fully and more accurate conjugation is achieved. Second, for oracle bone contour curves rich in singular points, the nonlinear approximation efficiency of the orthogonal V-system transform is superior to that of the traditional Fourier transform, discrete cosine transform and the like; the low-frequency components of the transform effectively suppress rubbing flaws, sawtooth burrs and erosion defects at the bone boundary while preserving the main concave-convex points and points of large curvature, which improves the fault tolerance and robustness to small disturbances of the fracture notch during conjugation, and the intermediate-frequency components preserve the fractal characteristics of the boundary contour while suppressing noise, which improves the reliability and accuracy of the contour's fractal information during conjugation. Third, the corners of the rubbing edge are detected with a curvature scale space algorithm and, taking the extracted corners as end points, edge segments are selected for edge-to-edge matching; the combination of concave-convex points and large-curvature points helps ensure that the fracture opening of a fragment supplies salient features to be matched during conjugation. Fourth, an initial frequency-domain retrieval is performed with the normalized V-descriptor vectors and the Euclidean distance, and a fine spatial-domain retrieval is performed with the direction histograms of the approximation and detail vectors and the Pearson correlation coefficient. Therefore, the computer-assisted oracle bone conjugation method based on the orthogonal V-system transform provides an effective tool for researchers of oracle bone script, especially for conjugation research based on computer vision; it effectively resists boundary flaws and defects and offers good fault tolerance in matching edge fracture notches, high accuracy, good robustness and strong adaptive capacity.
Drawings
FIG. 1 shows the computer-assisted conjugation results obtained by the present invention for the obverse rubbing image No. 6820 of the Collection of Oracle Bone Inscriptions.
Detailed Description
The invention provides a computer-assisted oracle bone conjugation method based on the orthogonal V-system transform, carried out according to Steps 1 to 36 described above. In this embodiment, the preset constants are set to A_conn = 100, r = 4, σ = 5, len = 2 and N_candidate = 10, and the weights of the fine-retrieval similarity measure are set to a = 0.7 and b = 0.3; the number N'_candidate of rubbings retained by the fine retrieval in Step 33 is likewise a preset constant.
An experiment was carried out with the obverse rubbing image No. 6820 of the Collection of Oracle Bone Inscriptions as the rubbing to be conjugated. Rubbing No. 17466 and 99 randomly selected rubbings from the Collection were used to form the retrieval database; rubbing No. 6820 (obverse) and rubbing No. 17466 have a genuine conjugation relationship. The results are shown in FIG. 1, where (a) is the obverse rubbing image No. 6820; (b) is rubbing image No. 17466, which is also the rubbing retrieved by the invention as having the highest conjugation similarity to the obverse rubbing No. 6820; (c) is the retrieved rubbing ranked second in conjugation similarity to the obverse rubbing No. 6820; (d) is the retrieved rubbing ranked third; and (e) is the retrieved rubbing ranked fourth.
As can be seen from FIG. 1, the invention effectively resists interference from point noise, patchy stripes and the inherent texture of the oracle bone, retrieves rubbings highly similar to the rubbing to be conjugated, and provides human experts with a candidate set for conjugation, which helps improve the efficiency of subsequent manual rejoining.

Claims (1)

1. A computer-assisted oracle bone conjugation method based on orthogonal V-system transformation, characterized by comprising the following steps:
Step 1. Input the oracle bone rubbing image I to be matched, whose height is h_I pixels and width is w_I pixels;
Step 2. Convert I from the RGB color space to HSV and extract its brightness component I_V;
Step 3. Apply histogram equalization to I_V to enhance its brightness contrast, obtaining I'_V;
Step 4. Compute an optimal global segmentation threshold T_V for I'_V by the Otsu method, and binarize I'_V with T_V to obtain the image I_bw;
Step 5. Invert the binary image I_bw so that the background of the rubbing becomes white and the character area becomes black, obtaining I'_bw;
Step 6. Perform 8-connected component analysis on I'_bw and fill the background holes inside the white foreground region with a seed-filling algorithm, thereby suppressing the point noise and patchy speckle contained in I'_bw and obtaining the image I''_bw;
Step 7. Label the 8-connected regions of I''_bw whose area is smaller than A_conn and fill them, thereby removing the discretely distributed fracture striations, shield striations and tooth gaps from I''_bw and obtaining the binary image I'''_bw, where A_conn is a preset constant;
Step 8. Apply a morphological dilation to I'''_bw with a disc-shaped structuring element of radius r, smoothing the burrs caused by uneven contact between the paper and the outer edge of the bone during rubbing, and obtain the image I_fix, where r is a preset constant;
Step 9. Trace the outer edge contour of the foreground region in I_fix by the bwperim method in 8-connected tracking mode, obtaining a single-pixel-wide edge e, and initialize the set Ω of rubbings available for conjugation as empty;
Step 10. Detect the corners of the edge e with a curvature scale space algorithm using scale factor σ to obtain the corner sequence {c_s(x_s, y_s) | s = 1, 2, …, N_c}, and take the corner with the smallest abscissa as the traversal starting point c_start, where σ is a preset constant, c_s(x_s, y_s) denotes the coordinates of the s-th corner, N_c denotes the number of corners, and 1 ≤ s ≤ N_c;
Step 11. Take the len corners adjacent to c_start in the clockwise direction to obtain the point set {c_1^cw, c_2^cw, …, c_len^cw}, and then take the len corners adjacent to c_start in the counterclockwise direction to obtain the point set {c_1^ccw, c_2^ccw, …, c_len^ccw}, where len is a preset constant and c_t^cw (respectively c_t^ccw) denotes the t-th corner adjacent to c_start in the clockwise (respectively counterclockwise) direction, 1 ≤ t ≤ len;
Step 12. Extract from the edge e the segment that starts at c_len^ccw, has c_start as its midpoint and ends at c_len^cw, and take it as the edge e_c currently to be processed;
Step 13. If the edge e_c of the rubbing image I to be matched has already been processed, go to Step 36; otherwise, go to Step 14;
Step 14. Starting from c_len^ccw, scan the pixels of the edge e_c in the clockwise direction and convert them into a one-dimensional boundary pixel vector {p_i(x_i, y_i) | i = 1, 2, …, N_e}, where p_i(x_i, y_i) denotes the coordinates of the i-th boundary pixel, N_e denotes the number of pixels contained in e_c, and 1 ≤ i ≤ N_e;
Step 15. Compute the center of gravity of the edge e_c, translate e_c so that its center of gravity lies at the origin of the coordinate system, and rotate e_c about its center of gravity so that the vector pointing from the starting end point c_len^ccw to the ending end point c_len^cw is parallel to the X-axis of the coordinate system, obtaining the updated boundary pixel vector {p'_i(x_i, y_i) | i = 1, 2, …, N_e}, where p'_i(x_i, y_i) denotes the updated coordinates of the i-th boundary pixel;
Step 16. Apply a horizontal mirroring operation about the center of gravity to the boundary pixel vector {p'_i(x_i, y_i) | i = 1, 2, …, N_e} to obtain the mirrored boundary pixel vector {p''_i(x_i, y_i) | i = 1, 2, …, N_e}, where p''_i(x_i, y_i) denotes the coordinates of the i-th boundary pixel after horizontal mirroring;
Step 17. Determine the transform order L from N_e so that the 2^L-order transform covers the boundary pixel vector, then apply a 2^L-order orthogonal V-system transform to {p'_i(x_i, y_i) | i = 1, 2, …, N_e} to obtain the spectrum vector X' of its abscissas and the spectrum vector Y' of its ordinates;
Step 18. Apply a 2^L-order orthogonal V-system transform to the mirrored boundary pixel vector {p''_i(x_i, y_i) | i = 1, 2, …, N_e} to obtain the spectrum vector X'' of its abscissas and the spectrum vector Y'' of its ordinates;
Step 19. Construct the V-descriptor vector α = (α_1, α_2, …, α_{2^L}) of {p'_i(x_i, y_i) | i = 1, 2, …, N_e} according to formula (1), α_k = X'_k + j·Y'_k, where α_k denotes the k-th component of α, X'_k and Y'_k denote the k-th components of the spectrum vectors X' and Y' respectively, 1 ≤ k ≤ 2^L, and j denotes the imaginary unit;
Step 20. Construct the V-descriptor vector β = (β_1, β_2, …, β_{2^L}) of {p''_i(x_i, y_i) | i = 1, 2, …, N_e} according to formula (2), β_k = X''_k + j·Y''_k, where β_k denotes the k-th component of β, X''_k and Y''_k denote the k-th components of the spectrum vectors X'' and Y'' respectively, and 1 ≤ k ≤ 2^L;
Step 21. Normalize the V-descriptor vectors α and β according to formulas (3) and (4), obtaining the normalized V-descriptor vectors α̃ and β̃ respectively;
Step 22. Extract the first 2^L/4 components of X' and Y' to obtain the low-frequency coefficient set A_L, and then extract the (2^L/4+1)-th to the (2^L/2)-th components of X' and Y' to form the intermediate-frequency coefficient set A_M;
Step 23. Extract the first 2^L/4 components of X'' and Y'' to obtain the low-frequency coefficient set B_L, and then extract the (2^L/4+1)-th to the (2^L/2)-th components of X'' and Y'' to form the intermediate-frequency coefficient set B_M;
Step 24. Keep the intermediate-frequency and high-frequency coefficients at zero and perform a 2^L-order inverse orthogonal V-system transform using the low-frequency coefficient set A_L alone, obtaining the approximation vector {m'_i(x_i, y_i) | i = 1, 2, …, N_e} of the boundary pixel vector {p'_i(x_i, y_i) | i = 1, 2, …, N_e}, where m'_i(x_i, y_i) denotes the coordinates of its i-th pixel;
Step 25. Keep the intermediate-frequency and high-frequency coefficients at zero and perform a 2^L-order inverse orthogonal V-system transform using the low-frequency coefficient set B_L alone, obtaining the approximation vector {m''_i(x_i, y_i) | i = 1, 2, …, N_e} of the boundary pixel vector {p''_i(x_i, y_i) | i = 1, 2, …, N_e}, where m''_i(x_i, y_i) denotes the coordinates of its i-th pixel;
Step 26. Keep the low-frequency and high-frequency coefficients at zero and perform a 2^L-order inverse orthogonal V-system transform using the intermediate-frequency coefficient set A_M alone, obtaining the detail vector {n'_i(x_i, y_i) | i = 1, 2, …, N_e} of the boundary pixel vector {p'_i(x_i, y_i) | i = 1, 2, …, N_e}, where n'_i(x_i, y_i) denotes the coordinates of its i-th pixel;
Step 27. Keep the low-frequency and high-frequency coefficients at zero and perform a 2^L-order inverse orthogonal V-system transform using the intermediate-frequency coefficient set B_M alone, obtaining the detail vector {n''_i(x_i, y_i) | i = 1, 2, …, N_e} of the boundary pixel vector {p''_i(x_i, y_i) | i = 1, 2, …, N_e}, where n''_i(x_i, y_i) denotes the coordinates of its i-th pixel;
Step 28. Represent {m'_i(x_i, y_i) | i = 1, 2, …, N_e} with an 8-direction Freeman chain code and compute its direction histogram H'_L;
Step 29. Represent {m''_i(x_i, y_i) | i = 1, 2, …, N_e} with an 8-direction Freeman chain code and compute its direction histogram H''_L;
Step 30. Represent {n'_i(x_i, y_i) | i = 1, 2, …, N_e} with an 8-direction Freeman chain code and compute its direction histogram H'_H;
Step 31. Represent {n''_i(x_i, y_i) | i = 1, 2, …, N_e} with an 8-direction Freeman chain code and compute its direction histogram H''_H;
Step 32. According to the similarity measure function given by formula (5), use the normalized V-descriptor vectors α̃ and β̃ to retrieve from the oracle bone rubbing database the N_candidate rubbing images whose edges are most similar to e_c, forming the candidate matching subset S_candidate, where Euc(·, ·) denotes the Euclidean distance, max denotes the maximum function, the comparison is made against the normalized V-descriptor vector of each edge segment of each rubbing image in the database, and N_candidate is a preset constant;
Step 33. According to the similarity measure function given by formula (6), use the direction histograms H'_L, H'_H, H''_L and H''_H to retrieve from S_candidate the N'_candidate rubbing images whose edges are most similar to e_c, arrange them in non-ascending order of similarity, and obtain the refined candidate matching subset S'_candidate, where ρ(·, ·) denotes the Pearson correlation coefficient, the comparison uses the direction histograms of the approximation vector and of the detail vector of each edge segment of each rubbing image in S_candidate, and a, b and N'_candidate are all preset constants;
Step 34. Let Ω = Ω ∪ S'_candidate, i.e., add the refined candidate subset S'_candidate to the set Ω of rubbings available for conjugation;
Step 35. Update the traversal starting point c_start to the next corner along the clockwise direction and return to Step 11;
Step 36. Output the set Ω of rubbings available for conjugation.
CN202210127638.6A 2022-02-11 2022-02-11 Computer-aided oracle bone conjugation method based on orthogonal V-system transformation Active CN114494750B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210127638.6A CN114494750B (en) 2022-02-11 2022-02-11 Computer-aided oracle bone conjugation method based on orthogonal V-system transformation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210127638.6A CN114494750B (en) 2022-02-11 2022-02-11 Computer-aided oracle bone conjugation method based on orthogonal V-system transformation

Publications (2)

Publication Number Publication Date
CN114494750A true CN114494750A (en) 2022-05-13
CN114494750B CN114494750B (en) 2024-04-02

Family

ID=81479637

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210127638.6A Active CN114494750B (en) 2022-02-11 2022-02-11 Computer-aided oracle bone conjugation method based on orthogonal V-system transformation

Country Status (1)

Country Link
CN (1) CN114494750B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060147094A1 (en) * 2003-09-08 2006-07-06 Woong-Tuk Yoo Pupil detection method and shape descriptor extraction method for a iris recognition, iris feature extraction apparatus and method, and iris recognition system and method using its
CN111563506A (en) * 2020-03-18 2020-08-21 西南大学 Oracle bone rubbing conjugation method based on curve contour matching
CN113362361A (en) * 2021-07-20 2021-09-07 辽宁师范大学 Image data set construction method for oracle character detection based on morphological prior constraint

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060147094A1 (en) * 2003-09-08 2006-07-06 Woong-Tuk Yoo Pupil detection method and shape descriptor extraction method for a iris recognition, iris feature extraction apparatus and method, and iris recognition system and method using its
CN111563506A (en) * 2020-03-18 2020-08-21 西南大学 Oracle bone rubbing conjugation method based on curve contour matching
CN113362361A (en) * 2021-07-20 2021-09-07 辽宁师范大学 Image data set construction method for oracle character detection based on morphological prior constraint

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHANG Changqing; WANG Aimin: "A computer-assisted method for rejoining oracle bone rubbings" (一种计算机辅助甲骨文拓片缀合方法), Electronic Design Engineering (电子设计工程), no. 17, 5 September 2012 (2012-09-05) *

Also Published As

Publication number Publication date
CN114494750B (en) 2024-04-02

Similar Documents

Publication Publication Date Title
Justino et al. Reconstructing shredded documents through feature matching
Yang Plant leaf recognition by integrating shape and texture features
Joshi et al. Latent fingerprint enhancement using generative adversarial networks
CN104680127A (en) Gesture identification method and gesture identification system
Dibeklioglu et al. 3D facial landmarking under expression, pose, and occlusion variations
CN111310760B (en) Method for detecting alpha bone inscription characters by combining local priori features and depth convolution features
CN103870808A (en) Finger vein identification method
Thalji et al. Iris Recognition using robust algorithm for eyelid, eyelash and shadow avoiding
CN114494306B (en) Edge gradient covariance guided method for repairing character outline of first bone and Doppler dictionary
CN109523484B (en) Fractal feature-based finger vein network repair method
Sari et al. Papaya fruit type classification using LBP features extraction and naive bayes classifier
Nayef et al. On the use of geometric matching for both: Isolated symbol recognition and symbol spotting
Soumya et al. Recognition of ancient Kannada Epigraphs using fuzzy-based approach
JP3819236B2 (en) Pattern recognition method and computer-readable storage medium storing program for performing pattern recognition
CN114494750A (en) Computer-assisted oracle bone conjugation method based on orthogonal V-system transformation
Lee et al. Backbone alignment and cascade tiny object detecting techniques for dolphin detection and classification
CN115731550A (en) Deep learning-based automatic drug specification identification method and system and storage medium
Hu et al. A multiple point boundary smoothing algorithm
Zanwar et al. A comprehensive survey on soft computing based optical character recognition techniques
RU2336655C1 (en) Method of object and background areas selection in digital images
Nguyen et al. Bags of strokes based approach for classification and indexing of drop caps
Mahadik et al. Access Control System using fingerprint recognition
Jaswal et al. Deformable multi-scale scheme for biometric personal identification
CN111311586A (en) Nonlinear health analysis system data-based multi-index dynamic integration algorithm and system
Zhang et al. Shape representation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant