KR20170095062A - A Method Of Providing For Searching Footprint And The System Practiced The Method - Google Patents


Info

Publication number
KR20170095062A
Authority
KR
South Korea
Prior art keywords
image
contrast
pattern
comparative
similarity
Prior art date
Application number
KR1020160016573A
Other languages
Korean (ko)
Other versions
KR101781359B1 (en)
Inventor
이중
변준석
심규선
Original Assignee
대한민국(관리부서: 행정자치부 국립과학수사연구원장) (Republic of Korea; managing agency: Director of the National Forensic Service, Ministry of the Interior)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 대한민국(관리부서: 행정자치부 국립과학수사연구원장)
Priority to KR1020160016573A priority Critical patent/KR101781359B1/en
Publication of KR20170095062A publication Critical patent/KR20170095062A/en
Application granted granted Critical
Publication of KR101781359B1 publication Critical patent/KR101781359B1/en

Links

Images

Classifications

    • G06F17/30271
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/16Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G06F17/30256
    • G06F17/3028

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Computational Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computing Systems (AREA)
  • Algebra (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to a method of providing footprint search data that supplies objective data on the similarity between a comparison footprint and contrast footprints stored in advance, so that contrast footprints similar to the comparison footprint can be searched.

Description

Technical Field [0001] The present invention relates to a method of providing footprint search data.

The present invention relates to a method of providing footprint search data, and more particularly, to a method that provides objective data for footprint searches so that similar footprints can be retrieved quickly and accurately.

Recently, as crimes have become more sophisticated, for example criminals taking care to leave no fingerprints, demand has grown for the analysis of footprints, which are difficult to erase completely and are likely to remain as traces. Footprints are the shoe marks a criminal leaves on the ground or on the floor of a building.

Promptly searching the footprints left at crime scenes, including scenes of violent crimes, and providing information about the type of shoes the criminal wore helps to identify suspects. As with DNA searches, footprints can also link crimes committed by the same offender.

FIG. 1 shows an example of a footprint, FIG. 2 shows a footprint DB, FIG. 3 schematically shows a comparison footprint and contrast footprints, FIG. 4 schematically shows a conventional footprint retrieval process, FIG. 5 schematically shows an example of a pattern DB, and FIG. 6 shows a system for a conventional footprint search.

In the following description, "comparison footprint" means a footprint left at the scene of an incident, and "contrast footprint" means a footprint stored in the storage unit and used to obtain information about a comparison footprint. Contrast footprints are obtained from shoes distributed on the market: for example, ink is applied to the sole of a marketed shoe, the sole is stamped on paper, photographed, and the resulting contrast footprint image is stored in the storage unit.

As shown in FIG. 6, an apparatus for searching for footprints similar to a comparison footprint includes a control unit; a storage unit connected to the control unit and storing a footprint database of a plurality of contrast footprints; an input unit (e.g., a keyboard and a mouse) serving as command input means; a display unit connected to the control unit and displaying contrast or comparison footprint images; and an output unit connected to the control unit and outputting search results. The storage unit also stores a program that is executed to compare the comparison footprint with the contrast footprints. The storage unit may be a server communicating with the control unit.

FIG. 1 shows an example of a comparison footprint image; a contrast footprint image takes the same form. Each footprint image is saved as a picture file (e.g., a bmp file) obtained by photography or a similar method.

The footprint images are binarized, and the binarized images may be displayed on the display unit. Since binarization is conventionally known, a description of the binarization process is omitted.

As shown in FIG. 1, the comparison footprint image 10 may have various shapes depending on the shape of the shoe sole. The comparison footprint image 10 contains a wavy pattern 15, a circular pattern 11, a horseshoe pattern 13, and the like. When the comparison footprint is displayed, it is divided into a first region A (upper), a second region B (middle), and a third region C (lower) so that the user can easily select a pattern position, and the dividing lines L1 and L2 may be displayed together with the comparison footprint image. To draw L1 and L2, the maximum and minimum vertical coordinates (y coordinates) of the pixels whose color is RGB(0, 0, 0) in the binarized image are derived; the y coordinates lying at one third and at two thirds of the span between the minimum and the maximum are then computed, and the pixel rows at those two y coordinates are displayed in a specific color (for example, green) to mark the region boundaries.
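The three-region division above can be sketched as follows. This is a minimal illustration (function and variable names are the author's own, not from the patent), assuming a binarized image in which True marks a pattern pixel:

```python
import numpy as np

def region_lines(binary):
    """Given a binarized footprint image (True = pattern pixel),
    return the y coordinates of the two horizontal lines L1 and L2
    that split the footprint's vertical extent into thirds
    (regions A, B, C)."""
    ys = np.where(binary.any(axis=1))[0]   # rows containing pattern pixels
    y_min, y_max = ys.min(), ys.max()
    span = y_max - y_min
    l1 = y_min + span // 3        # boundary between upper and middle region
    l2 = y_min + 2 * span // 3    # boundary between middle and lower region
    return int(l1), int(l2)

# toy image: pattern pixels span rows 3..9
img = np.zeros((12, 5), dtype=bool)
img[3:10, 2] = True
print(region_lines(img))  # rows at one third and two thirds of the span
```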

As shown in FIG. 2, the footprint database for the contrast footprints comprises a shoe DB and a pattern DB. The shoe DB may include the shoe model name, an image of the shoe sole (bottom), the pattern numbers, and the like; the pattern DB includes the pattern numbers and pattern names. The pattern information of each contrast footprint is stored in the storage unit.

As shown in FIG. 3, a comparison footprint image obtained by a method such as photographing at the scene is stored in the storage unit, and information such as the shoe model name is obtained by comparing it with a plurality of contrast footprints.

When the user executes the program to search for contrast footprint images similar to the comparison footprint image, the pattern numbers and pattern shapes are displayed on the display unit, and the user selects a pattern and its position through the input unit. The comparison footprint image is displayed on one side of the display, and a pop-up window for entering the pattern, as shown in FIG. 5, is displayed; the pattern-input area may instead be displayed beside the comparison footprint image.

When the user selects the shape and position of a pattern through the input unit, the selected information is stored in the storage unit and compared with the pattern DB (pattern shape and pattern position) of the contrast footprints stored there, and matching contrast footprint images are displayed side by side with the comparison footprint image. When several contrast footprints match, their images are displayed in turn beside the comparison footprint image according to the user's input commands.

In the existing footprint search method, an analyst (investigator) judges whether predefined patterns in the footprint, such as circles, rectangles, stripes, or horseshoes, lie in the top, middle, or bottom region, and enters the pattern and its position into a database of shoe patterns marketed in Korea.

However, the division into top, middle, and bottom regions is coarse, and accuracy is high only when sufficient data has actually been entered. When the input data are insufficient, many candidate shoes are returned and analysts must perform a secondary search by eye. Moreover, a footprint from a crime scene is blurrier than a clean photograph, and it is often difficult to identify three or more patterns. Conventional search techniques are therefore often ineffective and demand a great deal of manual secondary searching.

Korean Patent No. 10-1582142

SUMMARY OF THE INVENTION It is an object of the present invention to provide a method of providing footprint search data that allows similar footprints to be searched more quickly and accurately by providing objective data on the similarity between contrast footprints and a comparison footprint.

In accordance with an aspect of the present invention, the method is executed by a system comprising a control unit; a storage unit connected to the control unit and storing a footprint database of a plurality of contrast footprints; a display unit connected to the control unit; and an input unit serving as command input means. The footprint DB comprises a shoe DB and a pattern DB for at least one pattern included in each contrast footprint image. The shoe DB includes the contrast footprint image and the shoe model name; the pattern DB includes, for each pattern included in a contrast footprint image, a contour vector for the pattern's outline pixels, an autocorrelation function value (ACF) computed from the contour vectors, and a norm function value.

The method comprises: calculating a contour vector for the outline pixels of at least one pattern included in the comparison footprint image stored in the storage unit; calculating an autocorrelation function value for each pattern from its contour vector; calculating a norm function value for each pattern; calculating a normalized scalar value R by combining the autocorrelation function value and norm function value of each contrast pattern with those of the comparison pattern; and displaying the normalized scalar value of each contrast footprint on the display unit. The outline is a closed loop, and the contour vector of each pattern is the sequence of incremental complex vectors of the outline pixels forming the pattern.

In the above, the autocorrelation function value (ACF) for each pattern of the contrast or comparison footprint is calculated as a shift scalar sum of the incremental complex vectors of the outline pixels forming the pattern [Equation image pat00001 not reproduced].

The norm function value for each pattern of the contrast or comparison footprint is calculated as a scalar sum of the incremental complex vectors of the outline pixels forming the pattern [Equation image pat00002 not reproduced].

The normalized scalar value (R) is calculated from these quantities [Equation image pat00003 not reproduced].

Here ACF(n) is the shift scalar sum computed for a pattern included in the comparison footprint image, ACF'(n) is the shift scalar sum computed for a pattern included in a contrast footprint image, |ACF(n)| is the norm function value computed for the comparison pattern, and |ACF'(n)| is the norm function value computed for the contrast pattern.

In the above, a normalized scalar threshold value is stored in the storage unit together with the autocorrelation and norm function values of the contrast and comparison footprints, and the shoe DB entry for a contrast shoe is displayed on the display unit, together with its normalized scalar value R, only when R is larger than the threshold value stored in the storage unit.

In the above, an inter-class correlation function value (ICF) is calculated for each pattern included in the comparison footprint image against each pattern included in the contrast footprint images, stored in the storage unit, and displayed on the display unit together with the normalized scalar value.

The inter-class correlation function value (ICF) for a pattern of a contrast footprint and a pattern of the comparison footprint is calculated from their contour vectors [Equation image pat00004 not reproduced].

In the above equations, v is the contour vector for the outline pixels of a pattern included in the comparison footprint image, v' is the contour vector for the outline pixels of a pattern included in a contrast footprint image, and k is the number of outline pixels.
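The equation image for the ICF is not reproduced in the source, so the sketch below assumes one plausible reading: a normalized inner product of the two incremental complex contour-vector sequences. The function name and the normalization are the author's assumptions, not the patent's exact formula:

```python
import numpy as np

def icf(v, v_prime):
    """Hedged sketch of the inter-class correlation function (ICF),
    read as a normalized inner product of the contour vectors
    v (comparison pattern) and v' (contrast pattern), both of
    length k. A value near 1 indicates similar outline shapes."""
    v = np.asarray(v, dtype=complex)
    vp = np.asarray(v_prime, dtype=complex)
    assert len(v) == len(vp), "ICF is only computed for equal pixel counts"
    num = np.abs(np.sum(v * np.conj(vp)))
    den = np.linalg.norm(v) * np.linalg.norm(vp)
    return num / den

# identical outlines give the maximum value 1.0
square = [1, 1j, -1, -1j]
print(icf(square, square))
```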

Before the autocorrelation function value, the norm function value, and the inter-class correlation function value are calculated, the number of outline pixels of each pattern included in the comparison footprint image and in the contrast footprint images is counted and stored in the storage unit. The autocorrelation, norm, and inter-class correlation function values are then calculated only for those contrast patterns whose outline-pixel count equals that of the corresponding comparison pattern.

In the above, the shoe DB further includes origin-position data computed for each contrast footprint image, pattern-center data giving the center position of each pattern included in the contrast footprint images, and pattern-distance data giving the distance between the image origin and each pattern center.

The method of providing footprint search data further includes a step of computing the pattern distance of each pattern included in the comparison footprint image and a step of providing distance-similarity data.

The pattern-distance computation step includes calculating the origin position of the comparison footprint image (comparison-image origin-position data), calculating the center position of each pattern included in the comparison footprint image, and calculating the distance between the image origin and each pattern center (comparison-image pattern-distance data).

The distance-similarity step includes calculating, for each pattern included in the comparison footprint image, the distance similarity to each pattern included in the contrast footprint images, deriving the maximum distance-similarity value for each comparison pattern, and calculating the average of the derived maximum distance-similarity values.

The display unit displays the average distance similarity of each contrast footprint relative to the comparison footprint.

The distance similarity (tn, tn ') for each pattern included in the comparative footprint images for each pattern included in the comparative footprint image is

Figure pat00005
Lt; / RTI >

Where dn is the pattern distance of the angle included in the comparative footprint image and dn 'is the pattern distance of each pattern included in the contrast footprint images.
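Equation image pat00005 is not reproduced, so the sketch below assumes a common ratio form for the distance similarity; it equals 1.0 when the two pattern-to-origin distances match and falls toward 0 as they diverge. This is an assumption, not the patent's exact formula:

```python
def distance_similarity(d, d_prime):
    """Hedged sketch of the distance similarity: the min/max ratio of
    d  (distance from the comparison-image origin to a pattern center)
    and
    d' (distance from a contrast-image origin to a pattern center)."""
    if d == 0 and d_prime == 0:
        return 1.0
    return min(d, d_prime) / max(d, d_prime)

print(distance_similarity(30.0, 40.0))  # 0.75
```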

In the above, the shoe DB further includes origin-position data computed for each contrast footprint image, pattern-center data giving the center position of each pattern included in the contrast footprint images, and pattern-angle data.

The method of providing footprint search data further includes a step of computing the pattern angle of each pattern included in the comparison footprint image and a step of providing angle-similarity data.

The pattern-angle computation step includes calculating the origin position of the comparison footprint image (comparison-image origin-position data), calculating the center position of each pattern included in the comparison footprint image, and calculating the angle of each pattern center relative to the image origin (comparison-image pattern-angle data).

The angle-similarity step includes calculating, for each pattern included in the comparison footprint image, the angle similarity to each pattern included in the contrast footprint images, deriving the maximum angle-similarity value for each comparison pattern, and calculating the average of the derived maximum angle-similarity values.

The display unit displays the average angle similarity of each contrast footprint relative to the comparison footprint.

The angle similarity (θn, θn') between a pattern included in the comparison footprint image and a pattern included in the contrast footprint images is calculated from the pattern angles [Equation image pat00006 not reproduced], where rn is the pattern angle of a pattern included in the comparison footprint image and rn' is the pattern angle of a pattern included in the contrast footprint images.
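Equation image pat00006 is also absent, so the following is a rough sketch only: it assumes a similarity that is 1.0 for identical pattern angles and decays linearly to 0 at the maximum angular difference of 180 degrees, with wraparound. The functional form is the author's assumption:

```python
def angle_similarity(r, r_prime):
    """Hedged sketch of the angle similarity between a comparison-pattern
    angle r and a contrast-pattern angle r' (degrees): 1.0 for identical
    angles, 0.0 for diametrically opposite ones."""
    diff = abs(r - r_prime) % 360.0
    diff = min(diff, 360.0 - diff)       # shortest angular distance
    return 1.0 - diff / 180.0

# 10 and 350 degrees differ by only 20 degrees once wraparound is applied
print(angle_similarity(10.0, 350.0))
```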

In the above, the shoe DB further includes origin-position data computed for each contrast footprint image, pattern-center data giving the center position of each pattern included in the contrast footprint images, pattern-distance data giving the distance between the image origin and each pattern center, and pattern-angle data giving the angle of each pattern center.

The method of providing footprint search data further includes a step of computing the pattern angle of each pattern included in the comparison footprint image, a step of computing the pattern distance of each pattern included in the comparison footprint image, and a step of providing similarity data.

The pattern-angle computation step includes calculating the origin position of the comparison footprint image (comparison-image origin-position data), calculating the center position of each pattern included in the comparison footprint image, and calculating the angle of each pattern center relative to the image origin (comparison-image pattern-angle data).

The pattern-distance computation step includes calculating the distance between the image origin and each pattern center (comparison-image pattern-distance data).

The similarity-data step includes calculating, for each pattern included in the comparison footprint image, the distance similarity to each pattern included in the contrast footprint images; calculating, for each pattern included in the comparison footprint image, the angle similarity to each pattern included in the contrast footprint images; calculating, from the distance similarity and the angle similarity, the similarity of each contrast pattern to each comparison pattern; deriving the maximum similarity value for each comparison pattern; and calculating the average of the derived maximum similarity values.

The display unit displays the average similarity of each contrast footprint relative to the comparison footprint.

The angle similarity (θn, θn') between a pattern included in the comparison footprint image and a pattern included in the contrast footprint images is calculated from the pattern angles [Equation image pat00007 not reproduced], where rn is the pattern angle of a pattern included in the comparison footprint image and rn' is the pattern angle of a pattern included in the contrast footprint images.

The distance similarity (tn, tn') between a pattern included in the comparison footprint image and a pattern included in the contrast footprint images is calculated from the pattern distances [Equation image pat00008 not reproduced], where dn is the pattern distance of a pattern included in the comparison footprint image and dn' is the pattern distance of a pattern included in the contrast footprint images.

The similarity (Sn, Sn') of each pattern included in the contrast footprint images to each pattern included in the comparison footprint image is calculated from the distance similarity and the angle similarity [Equation image pat00009 not reproduced].
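Equation image pat00009 is not reproduced, so the combination rule below is an assumption (the mean of the two similarities); the max-then-average aggregation, however, follows the steps described in the text. All names are illustrative:

```python
def combined_similarity(t, theta):
    """Hedged sketch: the per-pattern similarity S is assumed to be the
    mean of the distance similarity t and the angle similarity theta,
    both in [0, 1]."""
    return (t + theta) / 2.0

def average_max_similarity(sim_matrix):
    """For each comparison pattern (row), take the best-matching contrast
    pattern (max over columns), then average those maxima -- mirroring the
    'maximum similarity values ... average similarity value' steps."""
    maxima = [max(row) for row in sim_matrix]
    return sum(maxima) / len(maxima)

# 2 comparison patterns scored against 3 contrast patterns
scores = [[0.9, 0.4, 0.5],
          [0.2, 0.8, 0.3]]
print(average_max_similarity(scores))  # (0.9 + 0.8) / 2
```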

According to the method of providing footprint search data of the present invention, objective numerical data for judging similarity are provided when searching for contrast footprints similar to a comparison footprint, so that the user can search more precisely and quickly and obtain information about the comparison footprint.

FIG. 1 shows an example of a footprint image,
FIG. 2 is a view for explaining a footprint DB stored in a storage unit,
FIG. 3 is a diagrammatic representation of the comparison of comparison and contrast footprints,
FIG. 4 is a schematic view of a conventional footprint retrieval process,
FIG. 5 schematically shows an example of a pattern DB,
FIG. 6 shows a footprint search system,
FIG. 7 is a flowchart of the method of providing footprint search data according to the present invention,
FIG. 8 is an enlarged view of the pixels of a pattern included in a footprint image,
FIG. 9 shows the pattern pixels in numerical form,
FIG. 10 shows the pixels of a pattern in which internal pixels and outline pixels have been distinguished,
FIG. 11 shows the pixels of a pattern from which the outline pixels have been derived,
FIGS. 12 and 13 are diagrams for explaining the derivation of contour vectors for the outline pixels of a pattern.

Hereinafter, the method of providing footprint search data of the present invention will be described in detail with reference to the drawings. Conventionally known matters are not described.

FIG. 7 is a flowchart of the method of providing footprint search data according to the present invention; FIG. 8 is an enlarged view of the pixels of a pattern included in a footprint image; FIG. 9 shows the pattern pixels in numerical form; FIG. 10 shows the pixels of a pattern in which internal and outline pixels have been distinguished; FIG. 11 shows the pixels of a pattern from which the outline pixels have been derived; and FIGS. 12 and 13 illustrate the derivation of contour vectors for the outline pixels of a pattern.

The program implementing the method of providing footprint search data according to the present invention is stored in the storage unit of a computer apparatus such as that shown in FIG. 6 and is executed by the user's execution command. The user selects a comparison footprint image stored in the storage unit and runs the program to carry out the method of providing footprint search data.

As shown in FIG. 7, the method of providing footprint search data of the present invention includes a step of selecting a comparison footprint image (ST-110), a step of deriving the outline pixels of each pattern (ST-120), a step of deriving contour vectors from the outline pixels of each pattern (ST-130), a step of calculating function values (ST-140), and a step of displaying the calculated function values on the display unit (ST-150). Because the calculated function values are displayed on the display unit, the user obtains objective numerical data and can search quickly and accurately for contrast footprint images similar to the comparison footprint image.

The comparison footprint image and the contrast footprint images are binarized, stored in the storage unit, and displayed on the display unit by the user's input command. As shown in FIG. 1, the displayed images contain a plurality of patterns.

The calculation process for the patterns included in the comparison footprint image and the contrast footprint images is now described. The process is explained for a pattern included in the comparison footprint image; the identical parts of the process for patterns included in the contrast footprint images are omitted.

FIG. 8 shows an example of a pattern included in the comparison footprint image. The comparison footprint image is binarized and stored, so each of its pixels has either the color information R = 0, G = 0, B = 0 (hereinafter a "pattern pixel") or R = 255, G = 255, B = 255 (hereinafter a "background pixel"). FIG. 9 represents each background pixel as 0 and each pattern pixel as 1.
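The pixel classification of FIGS. 8 and 9 can be illustrated as follows, assuming the RGB conventions stated above (the function name is illustrative):

```python
import numpy as np

def to_binary_map(rgb):
    """Convert a binarized footprint image (pattern pixels RGB(0,0,0),
    background pixels RGB(255,255,255)) into the 0/1 map of FIG. 9:
    1 for pattern pixels, 0 for background pixels."""
    rgb = np.asarray(rgb)
    return (rgb == 0).all(axis=-1).astype(np.uint8)

img = np.array([[[255, 255, 255], [0, 0, 0]],
                [[0, 0, 0], [255, 255, 255]]])
print(to_binary_map(img))
```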

The outline pixels constituting a pattern can be derived by the user clicking, through the input unit, the outline pixels among the pattern pixels of each pattern of the comparison footprint image, whereupon the outline-pixel information (abscissa, ordinate) of each pattern is stored in the storage unit.

A process in which the program derives the outline pixels constituting a pattern is described below as an example.

First, when the program implementing the method of providing footprint search data is executed, every pattern pixel whose four 4-neighbours are all pattern pixels has its color information (R, G, B) changed to R = 255, G = 255, B = 255, the same as a background pixel. A pattern consisting of pixels as in FIG. 8 is thus displayed, as shown in FIG. 10, with part of its pattern pixels changed to background-colored pixels, leaving only candidate outline pixels.

Next, the maximum horizontal coordinate value and the maximum and minimum vertical coordinate values are derived from the coordinates of the remaining pattern pixels, as in FIG. 10, and the pixel Pm1 with the minimum sum of coordinate values (hereinafter the "minimum pixel") is derived and stored in the storage unit. When several pixels share the minimum coordinate sum, the one with the minimum vertical coordinate (y coordinate) becomes the minimum pixel and is stored together with its coordinate information.

The distances between the remaining pattern pixels and the minimum pixel Pm1 of FIG. 10 are then calculated, and the pixel closest to Pm1 is derived and stored in the storage unit. When two pixels are equidistant from Pm1, the one with the larger horizontal coordinate (x coordinate) is taken as the pixel closest to Pm1 (Ps1, the first pixel) and stored.

In the same way, the pixel closest to the first pixel Ps1 is derived, and the procedure repeats until the x coordinate of the derived pixel reaches the maximum abscissa value. From there, ties among closest pixels are broken by the largest ordinate (y coordinate) until the maximum ordinate value is reached, then by the smallest abscissa value until the minimum abscissa value is reached, and finally by the smallest ordinate value as the walk returns toward the starting pixel, each derived pixel being stored in the storage unit in order. As shown in FIG. 11, when the color information (R, G, B) of the pattern pixels not stored in the storage unit is changed to R = 255, G = 255, B = 255, only the stored pixels remain: the outline pixels of the pattern have been derived.
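The interior-removal and outline-ordering procedure above can be sketched roughly as follows. The directional tie-breaking rules of the source are simplified here to a plain greedy nearest-neighbour walk, so this is an approximation of the described procedure, not an exact implementation:

```python
import numpy as np

def outline_pixels(mask):
    """A pattern pixel whose four 4-neighbours are all pattern pixels is
    interior and is discarded (FIG. 10 -> FIG. 11); the remaining pattern
    pixels form the outline."""
    mask = np.asarray(mask, dtype=bool)
    padded = np.pad(mask, 1, constant_values=False)
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    return mask & ~interior

def order_outline(outline):
    """Order the outline pixels into a closed loop by a nearest-neighbour
    walk, starting from the pixel with the minimum coordinate sum
    (the 'minimum pixel' of the source)."""
    pts = [tuple(p) for p in np.argwhere(outline)]     # (y, x) pairs
    start = min(pts, key=lambda p: (p[0] + p[1], p[0]))
    path, remaining = [start], set(pts) - {start}
    while remaining:
        cur = path[-1]
        nxt = min(remaining,
                  key=lambda p: (p[0] - cur[0]) ** 2 + (p[1] - cur[1]) ** 2)
        path.append(nxt)
        remaining.remove(nxt)
    return path

# 3x3 solid square: the centre pixel is interior, the 8 border pixels remain
sq = np.ones((3, 3), dtype=bool)
print(outline_pixels(sq).sum())  # 8 outline pixels
```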

In this manner, the outline-pixel information of one or more patterns included in the comparison footprint image is stored in the storage unit, and likewise the outline-pixel information of one or more patterns included in the contrast footprint images. The outline pixels form a closed circuit.

FIG. 12 exemplifies the outline pixels P1 to P10 constituting a pattern; v1 to v10 are the contour vectors for the outline of the pattern composed of those pixels. The contour vectors can be derived and stored in the storage unit by the user clicking the outline pixels in turn, or derived automatically following the outline-pixel derivation process described above. The contour vectors are stored in the storage unit as incremental complex vectors of the outline pixels.

For example, v2 = 1 - i, v3 = -i, v4 = -i, v5 = -1 - i, v6 = -1, v7 = 1 + i, v8 = i, v9 = i, v10 = 1 + i.

Since the contour vectors are incremental complex vectors of the outline pixels, the same contour vectors are generated and stored in the storage unit even if the user starts clicking from a different outline pixel.

For instance, when the user clicks pixel P5 and then pixel P6, a contour vector is generated whose real part is the abscissa increment and whose imaginary part is the ordinate increment between the two pixels, and it is stored in the storage unit. In this way the full set of contour vectors for the outline pixels constituting the closed circuit is generated and stored in the storage unit.

Contour vectors for the outline pixels forming each pattern are derived from both the comparison footprint image and the contrast footprint images and stored in the storage unit.
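The incremental complex contour vectors can be illustrated with a minimal sketch (the path representation as (y, x) pairs is the author's assumption):

```python
def contour_vectors(path):
    """Incremental complex vectors between consecutive outline pixels:
    the x increment becomes the real part and the y increment the
    imaginary part, with the loop closed from the last pixel back to
    the first. Because only increments are stored, the same vectors
    result whichever pixel the walk starts from (up to rotation of
    the sequence)."""
    k = len(path)
    return [complex(path[(j + 1) % k][1] - path[j][1],    # x increment
                    path[(j + 1) % k][0] - path[j][0])    # y increment
            for j in range(k)]

# unit square traced in image coordinates, points given as (y, x)
square = [(0, 0), (0, 1), (1, 1), (1, 0)]
print(contour_vectors(square))
```

Note that the increments of any closed outline sum to zero, which is a quick sanity check on a derived contour.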

Once the contour vectors have been derived for the patterns included in the comparison footprint image and the contrast footprint images, the autocorrelation function value (ACF), the norm function value, and the normalized scalar value (R) are calculated from them (ST-141, first calculation step).

The autocorrelation function value (ACF) is a shift scalar sum of the incremental complex vectors of the outline pixels forming each pattern, and is calculated by the following Equation 1 and stored in the storage unit.

[Equation 1: image pat00010 not reproduced]

In Equation 1, v is the incremental complex vector of the outline pixels, and k is the number of outline pixels constituting the pattern.

[Auxiliary equation images pat00011 and pat00012 not reproduced]

The autocorrelation function value (ACF) is calculated for each pattern included in the comparison footprint image and stored in the storage unit. Likewise, the autocorrelation function value (ACF) is calculated for the patterns included in the reference footprint images and stored in the storage unit as part of the pattern DB of each reference footprint.
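Equation (1) appears in this text only as an image, so the following is one plausible reading of a "shift scalar sum" of incremental complex vectors, offered as a sketch rather than the patent's verified formula:

```python
# One plausible reading of Equation (1): for each cyclic shift j, sum the
# scalar (Hermitian) products of each contour vector with the vector j steps
# ahead. The result is a length-k sequence of complex values per pattern.

def acf(vectors):
    """Autocorrelation of a closed contour's incremental complex vectors."""
    k = len(vectors)
    return [
        sum(vectors[i] * vectors[(i + j) % k].conjugate() for i in range(k))
        for j in range(k)
    ]

vs = [1 + 0j, 0 + 1j, -1 + 0j, 0 - 1j]  # a unit square's contour vectors
acf_vs = acf(vs)
```

Under this reading the zero-shift term is the sum of squared vector magnitudes, and the sequence as a whole characterizes the contour's shape independent of the starting pixel.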

The norm function value for each pattern is calculated from the contour vectors of the outline pixels constituting the patterns included in the comparison footprint image and the reference footprint images, by the following Equation (2), and stored in the storage unit.

Figure pat00013

In Equation (2), v is the incremental complex vector of the outline pixel, and k is the number of outline pixels forming the pattern.

The normal scalar value (R) is calculated from the autocorrelation function values (ACF) and the norm function values computed above and stored in the storage unit. For each pattern of the comparison footprint image, the normal scalar value (R) against each pattern of the plural reference footprint images is calculated by the following Equation (3) and stored in the storage unit.

Figure pat00014

In Equation (3), n is the number of outline pixels constituting the pattern included in the comparison footprint image, which equals the number of outline pixels forming the corresponding pattern included in the reference footprint image.

When the comparison footprint image contains three patterns whose outer pixels constitute closed circuits, and three reference footprint images stored in the storage unit each contain three such patterns, the normal scalar values R of the nine patterns included in the reference footprint images are calculated by Equation (3) for each of the three patterns included in the comparison footprint image. Thus 27 normal scalar values R are computed and stored in the storage unit.
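The norm function value and the normal scalar value R can be sketched in the same spirit. Equations (2) and (3) are images in this text, so the Euclidean norm of the ACF sequence and the normalized scalar product below are assumptions consistent with the surrounding description (R equals 1 for identical contours, and 3 comparison patterns against 9 reference patterns give 27 values):

```python
import math

# Assumed reading of Equations (2) and (3): the norm function value is the
# Euclidean norm of a pattern's ACF sequence, and R is the normalized scalar
# product of two ACF sequences, so R == 1.0 when the contours match exactly.

def norm_value(acf_seq):
    return math.sqrt(sum(abs(a) ** 2 for a in acf_seq))

def normal_scalar_value(acf_a, acf_b):
    """Normalized scalar product R of two equal-length ACF sequences."""
    dot = sum((a * b.conjugate()).real for a, b in zip(acf_a, acf_b))
    return dot / (norm_value(acf_a) * norm_value(acf_b))

# 3 comparison patterns against 9 reference patterns -> 27 R values, as in
# the example in the text (the sequences here are stand-ins, not real data).
comparison = [[4 + 0j, -4j, -4 + 0j, 4j]] * 3
reference = [[4 + 0j, -4j, -4 + 0j, 4j]] * 9
r_values = [normal_scalar_value(c, r) for c in comparison for r in reference]
```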

Before the autocorrelation function value, the norm function value, and the class correlation function value described below are calculated, the number of outline pixels of each pattern included in the comparison footprint image and in the reference footprint images is counted and stored in the storage unit. The autocorrelation function value, the norm function value, and the class correlation function value are then calculated only between patterns of the comparison footprint image and patterns of the reference footprint images that have the same number of outline pixels.

The normal scalar value R calculated by the above procedure is displayed on the display unit together with the reference footprint image. When the normal scalar value (R) is larger than 0.96, the patterns included in the comparison and reference footprint images can be regarded as identical or similar.

Thus, it is desirable that the shoe DB for a reference footprint be displayed on the display unit, together with its normal scalar value R, only when the normal scalar value R is greater than the normal scalar set value (e.g., 0.96) stored in the storage unit.

Next, the class correlation function value (ICF) is calculated from the contour vectors derived for the patterns included in the comparison footprint image and the reference footprint images (ST-143, second calculation step).

The class correlation function value (ICF) between each pattern included in the reference footprint images and each pattern included in the comparison footprint image is calculated by the following Equation (4).

Figure pat00015

In Equation (4), v is a contour vector of an outline pixel of a pattern included in the comparison footprint image, v' is a contour vector of an outline pixel of a pattern included in the reference footprint image, and k is the number of outline pixels.

The class correlation function value (ICF) is calculated, stored in the storage unit, and displayed on the display unit as part of the pattern DB of each reference footprint, together with the comparison footprint image. The larger the class correlation function value (ICF), the greater the similarity of the two images.

When the class correlation function values (ICF) are displayed on the display unit as the pattern DB of each reference footprint, it is preferable that they be displayed in descending order.
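A minimal sketch of computing ICF values and listing reference footprints in descending order follows. Equation (4) is an image in this text, so the scalar sum of paired contour vectors is an assumption, and the shoe names are invented for illustration:

```python
def icf(vectors_a, vectors_b):
    """Assumed reading of Equation (4): the scalar (Hermitian) sum of paired
    contour vectors from the comparison and reference patterns."""
    return sum((a * b.conjugate()).real for a, b in zip(vectors_a, vectors_b))

# Rank reference footprints by ICF, largest first, as the text recommends.
comparison_vs = [1 + 0j, 0 + 1j, -1 + 0j, 0 - 1j]
references = {
    "shoe_A": [1 + 0j, 0 + 1j, -1 + 0j, 0 - 1j],  # identical contour
    "shoe_B": [0 + 1j, -1 + 0j, 0 - 1j, 1 + 0j],  # different vector pairing
}
ranked = sorted(references.items(),
                key=lambda kv: icf(comparison_vs, kv[1]), reverse=True)
```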

As a third calculation step (ST-145), the relative pattern distances of the patterns included in the footprint images are calculated.

The footprint DB further includes reference footprint origin position data calculated from the reference footprint images, reference pattern center data giving the center position of each pattern included in the reference footprint images, and reference pattern distance data, which is the distance between the reference footprint origin position and each pattern center position.

The method for providing footprint search data further includes a comparative pattern distance computation step, in which the pattern distance of each pattern included in the comparison footprint image is calculated, and a distance similarity data providing step.

The comparative pattern distance computation step includes calculating the origin position of the comparison footprint image (comparison footprint image origin position data), calculating the center position of each pattern included in the comparison footprint image (comparison pattern center data), and calculating the distance between the origin position of the comparison footprint image and each pattern center position (comparison pattern distance data).

The distance similarity data providing step includes calculating the distance similarity of each pattern included in the reference footprint images for each pattern included in the comparison footprint image, extracting the maximum distance similarity value for each pattern included in the comparison footprint image, and calculating the average of the extracted maximum distance similarity values; the calculated results are stored in the storage unit. The display unit displays the average distance similarity of each reference footprint to the comparison footprint.
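The maximum-then-average aggregation described above can be sketched as follows; the similarity matrix here is illustrative only:

```python
def average_distance_similarity(sim_matrix):
    """sim_matrix[i][j] is the similarity of comparison pattern i against
    reference pattern j. Return the mean of the per-comparison-pattern
    maxima, i.e. max over each row, then average over rows."""
    maxima = [max(row) for row in sim_matrix]
    return sum(maxima) / len(maxima)

sims = [
    [0.2, 0.9, 0.4],  # comparison pattern 1 vs the reference patterns
    [0.7, 0.1, 0.5],  # comparison pattern 2
    [0.3, 0.3, 0.8],  # comparison pattern 3
]
# per-row maxima are 0.9, 0.7, 0.8, whose average is 0.8
avg = average_distance_similarity(sims)
```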

The distance similarity (tn, tn') of each pattern included in the reference footprint images for each pattern included in the comparison footprint image is calculated by the following Equation (5).

Figure pat00016

In Equation (5), dn is the pattern distance of each pattern included in the comparison footprint image, and dn' is the pattern distance of each pattern included in the reference footprint images.

The origin of a footprint image can be taken as the center of the pixels constituting the footprint. For example, the sum of the maximum and minimum abscissa values of the pixels forming the footprint image, divided by 2, is the abscissa of the origin, and the sum of the maximum and minimum ordinate values, divided by 2, is the ordinate of the origin.

The center of each pattern can likewise be taken as the center of the pixels forming the pattern: the sum of the maximum and minimum abscissa values of the pixels forming the pattern, divided by 2, is the abscissa of the pattern center, and the sum of the maximum and minimum ordinate values, divided by 2, is its ordinate.

The distance between a pattern center and the origin is calculated as the square root of the sum of the squared difference of the abscissa values and the squared difference of the ordinate values.
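The bounding-box origin, pattern center, and center-to-origin distance described in the preceding paragraphs can be sketched as follows (the coordinates are invented for illustration):

```python
import math

# The origin/center is the midpoint of the bounding box: (max + min) / 2 in
# each coordinate. The center-to-origin distance is the Euclidean distance.

def bounding_box_center(pixels):
    xs = [x for x, _ in pixels]
    ys = [y for _, y in pixels]
    return ((max(xs) + min(xs)) / 2, (max(ys) + min(ys)) / 2)

def center_distance(origin, center):
    return math.hypot(center[0] - origin[0], center[1] - origin[1])

footprint = [(0, 0), (10, 0), (10, 30), (0, 30)]  # pixels of the footprint
pattern = [(2, 2), (6, 2), (6, 8), (2, 8)]        # pixels of one pattern
origin = bounding_box_center(footprint)           # (5.0, 15.0)
center = bounding_box_center(pattern)             # (4.0, 5.0)
d = center_distance(origin, center)               # sqrt(1 + 100)
```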

The distance similarity of each pattern included in the plural reference footprint images is calculated with respect to the pattern distance of each pattern included in the comparison footprint image by the same operation as Equation (5).

When dn' > 2dn in Equation (5), the distance similarity is set to 0.
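Equation (5) itself is only an image in this text; the one stated constraint is that the similarity is 0 when dn' > 2dn. The formula below is a purely hypothetical reconstruction consistent with that cutoff (and equal to 1 when the distances match), not the patent's actual equation:

```python
def distance_similarity(dn, dn_prime):
    """Hypothetical reconstruction of Equation (5): 1.0 when dn' == dn,
    falling linearly to 0.0, and forced to 0.0 once dn' > 2 * dn as the
    text requires."""
    if dn_prime > 2 * dn:
        return 0.0
    return max(0.0, 1.0 - abs(dn - dn_prime) / dn)
```

Under this form a reference pattern at exactly twice the comparison pattern's distance scores 0, and one at the same distance scores 1.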

As a fourth calculation step (ST-147), the relative pattern angles of the patterns included in the footprint images are calculated.

The footprint DB further includes the origin position data of the reference footprint images calculated from the reference footprint images, the reference pattern center data giving the center position of each pattern included in the reference footprint images, and reference pattern angle data relative to the origin.

The method for providing footprint search data further includes a comparative pattern angle computation step, in which the pattern angle of each pattern included in the comparison footprint image is calculated, and an angle similarity data providing step.

The comparative pattern angle computation step includes calculating the origin position of the comparison footprint image (comparison footprint image origin position data), calculating the center position of each pattern included in the comparison footprint image (comparison pattern center data), and calculating the center angle of each pattern relative to the origin position of the comparison footprint image (comparison pattern angle data).

The angle similarity data providing step includes calculating the angle similarity of each pattern included in the reference footprint images for each pattern included in the comparison footprint image, extracting the maximum angle similarity value for each pattern included in the comparison footprint image, and calculating the average of the extracted maximum angle similarity values.

The display unit displays the average angle similarity of each reference footprint to the comparison footprint.

The angle similarity (θn, θn') of each pattern included in the reference footprint images for each pattern included in the comparison footprint image is calculated by the following Equation (6).

Figure pat00017

In Equation (6), rn is the pattern angle of each pattern included in the comparison footprint image, and rn' is the pattern angle of each pattern included in the reference footprint images. The angle can be calculated using, as a reference line, the line connecting the pixels with the maximum and minimum ordinate values.

When rn' > 2rn in Equation (6), the angle similarity is set to 0.

The method for providing footprint search data further includes a comparative pattern angle computation step and a comparative pattern distance computation step, in which the relative angle and the relative distance of each pattern included in the comparison footprint image are calculated, and a similarity data providing step.

The comparative pattern angle computation step includes calculating the origin position of the comparison footprint image (comparison footprint image origin position data), calculating the center position of each pattern included in the comparison footprint image (comparison pattern center data), and calculating the center angle of each pattern relative to the origin position of the comparison footprint image (comparison pattern angle data).

The comparative pattern distance computation step includes calculating the distance between the origin position of the comparison footprint image and the center position of each pattern (comparison pattern distance data).

The similarity data providing step includes calculating the distance similarity of each pattern included in the reference footprint images for each pattern included in the comparison footprint image,

calculating the angle similarity of each pattern included in the reference footprint images for each pattern included in the comparison footprint image,

calculating, from the distance similarity and the angle similarity, the similarity of each pattern included in the reference footprint images for each pattern included in the comparison footprint image, and

extracting the maximum similarity value for each pattern included in the comparison footprint image and calculating the average of the extracted maximum similarity values.

The display unit displays the average similarity of each reference footprint to the comparison footprint.

The angle similarity (θn, θn') of each pattern included in the reference footprint images for each pattern included in the comparison footprint image is calculated by Equation (6), and

the distance similarity (tn, tn') of each pattern included in the reference footprint images for each pattern included in the comparison footprint image is calculated by Equation (5).

The similarity (Sn, Sn') of each pattern included in the reference footprint images for each pattern included in the comparison footprint image is calculated by the following Equation (7).

Figure pat00018
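Equation (7), which combines the distance similarity tn and the angle similarity θn into the single pattern similarity Sn, is reproduced in the source only as an image. The arithmetic mean below is therefore only an assumption, chosen so that Sn stays in [0, 1] when both inputs do:

```python
def combined_similarity(tn, theta_n):
    """Hypothetical combination of distance similarity tn and angle
    similarity theta_n into one pattern similarity Sn; the mean is an
    assumed form, not the patent's Equation (7)."""
    return (tn + theta_n) / 2
```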

10: footprint image 11: first feature portion
13: second feature portion 15: third feature portion

Claims (8)

1. A method of providing footprint search data, performed in a system comprising a control unit; a storage unit connected to the control unit and storing a footprint DB of a plurality of reference footprints; a display unit connected to the control unit and displaying footprint images; and an input unit connected to the control unit. The footprint DB comprises a shoe DB and a pattern DB for at least one pattern included in each reference footprint image; the shoe DB includes the reference footprint image and a shoe model name, and the pattern DB includes, for each pattern included in the reference footprint image, a contour vector for the pattern's outer pixels, an autocorrelation function value (ACF) computed from the contour vectors, and a norm function value.
The method comprises: calculating contour vectors for the outline pixels of at least one of the patterns included in a comparison footprint image stored in the storage unit; calculating an autocorrelation function value and a norm function value for each pattern from the contour vectors; calculating a normal scalar value (R) comparing the autocorrelation function value and norm function value of each reference footprint with the autocorrelation function value and norm function value of the comparison footprint; and displaying the normal scalar value of each reference footprint on the display unit. The outline is a closed loop, and the contour vector of each pattern is an incremental complex vector of the outline pixels constituting the pattern.
2. The method of claim 1, wherein the autocorrelation function value (ACF) for each pattern of the reference footprints or the comparison footprint is the shift scalar sum of the incremental complex vectors for the outer pixels forming the pattern, calculated as
Figure pat00019
;
the norm function value for each pattern of the reference footprints or the comparison footprint is calculated from the incremental complex vectors for the outer pixels forming the pattern as
Figure pat00020
;
and the normal scalar value (R) is calculated as
Figure pat00021
;
where ACF(n) is the shift scalar sum computed for a pattern included in the comparison footprint image, ACF'(n) is the shift scalar sum computed for a pattern included in the reference footprint image, |ACF(n)| is the norm function value computed for the pattern included in the comparison footprint image, and |ACF'(n)| is the norm function value computed for the pattern included in the reference footprint image.
3. The method of claim 1, wherein a normal scalar set value is stored in the storage unit for the normal scalar value (R) computed from the autocorrelation function value and norm function value of each reference footprint and those of the comparison footprint, and the shoe DB for a reference footprint is displayed on the display unit, together with its normal scalar value (R), only when the normal scalar value (R) is greater than the normal scalar set value stored in the storage unit.
4. The method of claim 2, wherein a class correlation function value (ICF) is calculated for each pattern included in the comparison footprint image against each pattern included in the reference footprint images and stored in the storage unit, and the class correlation function value (ICF) is displayed on the display unit together with the normal scalar value;
the class correlation function value (ICF) between each pattern of the reference footprints and each pattern of the comparison footprint being calculated as
Figure pat00022
,
where v is a contour vector for the outline pixels of each pattern included in the comparison footprint image, v' is a contour vector for the outline pixels of each pattern included in the reference footprint images, and k is the number of outline pixels.
5. The method according to any one of claims 1 to 4, wherein, before the autocorrelation function value, the norm function value, and the class correlation function value are calculated, the number of outline pixels of each pattern included in the comparison footprint image and of each pattern included in the reference footprint images is counted and stored in the storage unit; and the autocorrelation function value, the norm function value, and the class correlation function value are calculated only between patterns of the comparison footprint image and patterns of the reference footprint images having the same number of outline pixels.
6. The method of claim 2, wherein the footprint DB further includes reference footprint origin position data calculated from the reference footprint images, reference pattern center data giving the center position of each pattern included in the reference footprint images, and reference pattern distance data, which is the distance between the reference footprint origin position and each pattern center position;
the method further includes a comparative pattern distance computation step, in which the pattern distance of each pattern included in the comparison footprint image is calculated, and a distance similarity data providing step;
the comparative pattern distance computation step includes calculating the origin position of the comparison footprint image (comparison footprint image origin position data), calculating the center position of each pattern included in the comparison footprint image (comparison pattern center data), and calculating the distance between the origin position of the comparison footprint image and each pattern center position (comparison pattern distance data);
the distance similarity data providing step includes calculating the distance similarity of each pattern included in the reference footprint images for each pattern included in the comparison footprint image, extracting the maximum distance similarity value for each pattern included in the comparison footprint image, and calculating the average of the extracted maximum distance similarity values;
the display unit displays the average distance similarity of each reference footprint to the comparison footprint; and
the distance similarity (tn, tn') of each pattern included in the reference footprint images for each pattern included in the comparison footprint image is calculated as
Figure pat00023
,
where dn is the pattern distance of each pattern included in the comparison footprint image, and dn' is the pattern distance of each pattern included in the reference footprint images.
7. The method of claim 2, wherein the footprint DB further includes the origin position data of the reference footprint images calculated from the reference footprint images, the reference pattern center data giving the center position of each pattern included in the reference footprint images, and reference pattern angle data relative to the origin;
the method further includes a comparative pattern angle computation step, in which the pattern angle of each pattern included in the comparison footprint image is calculated, and an angle similarity data providing step;
the comparative pattern angle computation step includes calculating the origin position of the comparison footprint image (comparison footprint image origin position data), calculating the center position of each pattern included in the comparison footprint image (comparison pattern center data), and calculating the center angle of each pattern relative to the origin position of the comparison footprint image (comparison pattern angle data);
the angle similarity data providing step includes calculating the angle similarity of each pattern included in the reference footprint images for each pattern included in the comparison footprint image, extracting the maximum angle similarity value for each pattern included in the comparison footprint image, and calculating the average of the extracted maximum angle similarity values;
the display unit displays the average angle similarity of each reference footprint to the comparison footprint; and
the angle similarity (θn, θn') of each pattern included in the reference footprint images for each pattern included in the comparison footprint image is calculated as
Figure pat00024
,
where rn is the pattern angle of each pattern included in the comparison footprint image, and rn' is the pattern angle of each pattern included in the reference footprint images.
8. The method of claim 2, wherein the footprint DB further includes the reference footprint origin position data of the reference footprint images calculated from the reference footprint images, reference pattern center data giving the center position of each pattern included in the reference footprint images, reference pattern angle data relative to the origin, and reference pattern distance data, which is the distance between the reference footprint origin position and each pattern center position;
the method includes a comparative pattern angle computation step, in which the relative angle of each pattern included in the comparison footprint image is calculated, a comparative pattern distance computation step, in which the relative distance of each pattern included in the comparison footprint image is calculated, and a similarity data providing step;
the comparative pattern angle computation step includes calculating the origin position of the comparison footprint image (comparison footprint image origin position data), calculating the center position of each pattern included in the comparison footprint image (comparison pattern center data), and calculating the center angle of each pattern relative to the origin position of the comparison footprint image (comparison pattern angle data);
the comparative pattern distance computation step includes calculating the distance between the origin position of the comparison footprint image and the center position of each pattern (comparison pattern distance data);
the similarity data providing step includes calculating the distance similarity of each pattern included in the reference footprint images for each pattern included in the comparison footprint image,
calculating the angle similarity of each pattern included in the reference footprint images for each pattern included in the comparison footprint image,
calculating, from the distance similarity and the angle similarity, the similarity of each pattern included in the reference footprint images for each pattern included in the comparison footprint image, and
extracting the maximum similarity value for each pattern included in the comparison footprint image and calculating the average of the extracted maximum similarity values;
the display unit displays the average similarity of each reference footprint to the comparison footprint;
the angle similarity (θn, θn') of each pattern included in the reference footprint images for each pattern included in the comparison footprint image is calculated as
Figure pat00025
,
where rn is the pattern angle of each pattern included in the comparison footprint image, and rn' is the pattern angle of each pattern included in the reference footprint images;
the distance similarity (tn, tn') of each pattern included in the reference footprint images for each pattern included in the comparison footprint image is calculated as
Figure pat00026
,
where dn is the pattern distance of each pattern included in the comparison footprint image, and dn' is the pattern distance of each pattern included in the reference footprint images; and
the similarity (Sn, Sn') of each pattern included in the reference footprint images for each pattern included in the comparison footprint image is calculated as
Figure pat00027
.
KR1020160016573A 2016-02-12 2016-02-12 A Method Of Providing For Searching Footprint And The System Practiced The Method KR101781359B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160016573A KR101781359B1 (en) 2016-02-12 2016-02-12 A Method Of Providing For Searching Footprint And The System Practiced The Method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160016573A KR101781359B1 (en) 2016-02-12 2016-02-12 A Method Of Providing For Searching Footprint And The System Practiced The Method

Publications (2)

Publication Number Publication Date
KR20170095062A true KR20170095062A (en) 2017-08-22
KR101781359B1 KR101781359B1 (en) 2017-09-26

Family

ID=59757922

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160016573A KR101781359B1 (en) 2016-02-12 2016-02-12 A Method Of Providing For Searching Footprint And The System Practiced The Method

Country Status (1)

Country Link
KR (1) KR101781359B1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112257662A (en) * 2020-11-12 2021-01-22 安徽大学 Pressure footprint image retrieval system based on deep learning
CN112800267A (en) * 2021-02-03 2021-05-14 大连海事大学 Fine-grained shoe print image retrieval method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005027951A (en) 2003-07-08 2005-02-03 Gen Tec:Kk Sole database and creating method thereof
JP2007226756A (en) 2006-02-22 2007-09-06 Univ Kinki Method and device for evaluating left image such as footprint using ridgelet transform

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112257662A (en) * 2020-11-12 2021-01-22 安徽大学 Pressure footprint image retrieval system based on deep learning
CN112800267A (en) * 2021-02-03 2021-05-14 大连海事大学 Fine-grained shoe print image retrieval method
CN112800267B (en) * 2021-02-03 2024-06-11 大连海事大学 Fine-granularity shoe print image retrieval method

Also Published As

Publication number Publication date
KR101781359B1 (en) 2017-09-26

Similar Documents

Publication Publication Date Title
JP6091560B2 (en) Image analysis method
CN110738101A (en) Behavior recognition method and device and computer readable storage medium
Lin et al. A multi-level morphological active contour algorithm for delineating tree crowns in mountainous forest
US20050053276A1 (en) Method of obtaining a depth map from a digital image
JP6648925B2 (en) Image processing method, image processing device, image processing system, production device, program, and recording medium
KR102073468B1 (en) System and method for scoring color candidate poses against a color image in a vision system
Goodbody et al. Digital aerial photogrammetry for assessing cumulative spruce budworm defoliation and enhancing forest inventories at a landscape-level
JP2006285310A (en) Evaluation method of canopy of forest, and its canopy evaluation program
US9292929B2 (en) Image region extraction device, image region extraction method, and image region extraction program
KR101510206B1 (en) Urban Change Detection Method Using the Aerial Hyper Spectral images for Digital Map modify Drawing
US11170215B1 (en) System and method for discriminating and demarcating targets of interest in a physical scene
US11049268B2 (en) Superimposing position correction device and superimposing position correction method
JP2013030183A (en) Environment recognition device, and program
KR101781359B1 (en) A Method Of Providing For Searching Footprint And The System Practiced The Method
EP1760636B1 (en) Ridge direction extraction device, ridge direction extraction method, ridge direction extraction program
JP2017003525A (en) Three-dimensional measuring device
JP4721829B2 (en) Image retrieval method and apparatus
JP6293505B2 (en) Tool inspection method and tool inspection apparatus
CN109657540B (en) Withered tree positioning method and system
US11562505B2 (en) System and method for representing and displaying color accuracy in pattern matching by a vision system
JP3919722B2 (en) Skin shape measuring method and skin shape measuring apparatus
JP2014146181A (en) Operation information recording device, display system, and program
CN110427961B (en) Building information extraction method and system based on rule and sample fusion
CN109034138B (en) Image processing method and device
KR100954137B1 (en) Edge-based text localization and segmentation algorithms for automatic slab information recognition

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant