CN116128850A - Perforation measurement method, device, equipment and readable storage medium

Perforation measurement method, device, equipment and readable storage medium

Info

Publication number
CN116128850A
Authority
CN
China
Prior art keywords
image
dimensional
tubular column
representing
column
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310132523.0A
Other languages
Chinese (zh)
Other versions
CN116128850B (en)
Inventor
严正国
严正娟
周超
王飞
吴银川
苏娟
吕源
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xi'an Zhengshi Intelligent Technology Co ltd
Original Assignee
Xi'an Zhengshi Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xi'an Zhengshi Intelligent Technology Co ltd
Priority to CN202310132523.0A
Publication of CN116128850A
Application granted
Publication of CN116128850B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/0002 - Inspection of images, e.g. flaw detection
    • E - FIXED CONSTRUCTIONS
    • E21 - EARTH OR ROCK DRILLING; MINING
    • E21B - EARTH OR ROCK DRILLING; OBTAINING OIL, GAS, WATER, SOLUBLE OR MELTABLE MATERIALS OR A SLURRY OF MINERALS FROM WELLS
    • E21B 47/00 - Survey of boreholes or wells
    • E21B 47/002 - Survey of boreholes or wells by visual inspection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 - Geometric image transformations in the plane of the image
    • G06T 3/06 - Topological mapping of higher dimensional structures onto lower dimensional surfaces
    • G06T 3/067 - Reshaping or unfolding 3D tree structures onto 2D planes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 - Geometric image transformations in the plane of the image
    • G06T 3/14 - Transformations for image registration, e.g. adjusting or mapping for alignment of images
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/74 - Image or video pattern matching; Proximity measures in feature spaces
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/77 - Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V 10/774 - Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10004 - Still image; Photographic image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 - Special algorithmic details
    • G06T 2207/20081 - Training; Learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Multimedia (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Mining & Mineral Resources (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Geology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Environmental & Geological Engineering (AREA)
  • Fluid Mechanics (AREA)
  • Geophysics (AREA)
  • Quality & Reliability (AREA)
  • General Life Sciences & Earth Sciences (AREA)
  • Geochemistry & Mineralogy (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

The application provides a perforation measurement method, a perforation measurement device, perforation measurement equipment and a readable storage medium. The method includes: mapping a downhole three-dimensional tubular column image captured by an ultra-wide angle camera through a preset camera imaging model to obtain a two-dimensional tubular column image; performing image processing on the two-dimensional tubular column image through the camera imaging model and a preset sleeve column model to obtain a tubular column surface image of the downhole three-dimensional tubular column image; and measuring a target detection point on the tubular column surface image. The method can improve the accuracy of perforation measurement by a downhole camera.

Description

Perforation measurement method, device, equipment and readable storage medium
Technical Field
The present application relates to the field of perforation measurement, and in particular, to a method, apparatus, device and readable storage medium for perforation measurement.
Background
At present, the technology for detecting perforation erosion based on a downhole array side-view camera is immature: the detection of mineral substances such as petroleum and natural gas is either completed with simple algorithms alone, or the images captured by the camera must be processed and the targets identified manually.
Because the technology is imperfect, safety cannot be ensured during detection, and the detection results for some substances are inaccurate.
Therefore, how to improve the accuracy of perforation measurement by the downhole camera is a technical problem to be solved.
Disclosure of Invention
The embodiment of the application aims to provide a perforation measurement method, and the technical scheme of the embodiment of the application can achieve the effect of improving the accuracy of measuring perforation by a downhole camera.
In a first aspect, an embodiment of the present application provides a method for perforation measurement, including mapping an underground three-dimensional tubular column image captured by an ultra-wide angle camera through a preset camera imaging model to obtain a two-dimensional tubular column image; performing image processing on the two-dimensional tubular column image through a camera imaging model and a preset sleeve column model to obtain a tubular column surface image of the underground three-dimensional tubular column image; and measuring a target detection point on the surface image of the tubular column.
In the above embodiment, the three-dimensional tubular column image can be converted into a two-dimensional tubular column surface image through the camera imaging model and the sleeve column model; by detecting the converted two-dimensional image, the surface of the three-dimensional tubular column is detected indirectly, which can improve the accuracy of perforation measurement by the downhole camera.
In some embodiments, mapping the underground three-dimensional tubular column image shot by the ultra-wide angle camera through a preset camera imaging model to obtain a two-dimensional tubular column image, including:
mapping the set of spatial three-dimensional points on the downhole three-dimensional tubular column image to a set of two-dimensional points in a plane through the camera imaging model to obtain the two-dimensional tubular column image, where the mapping is given by the following formulas:
I(x_i, y_i) = a*f(T(x, y, z), C);
(three intermediate formulas appear only as images in the original publication, BDA0004084763420000021 to BDA0004084763420000023, presumably defining d, α and θ in terms of the camera position and the point T);
r = a*f(α);
x_i = r*cosθ + n/2;
y_i = r*sinθ + m/2;
wherein I represents a point on the two-dimensional tubular column image, and x_i and y_i represent its x-axis and y-axis coordinates; a represents the camera parameter; f(α) represents the camera model function and f/f(α) the camera imaging model function; T represents a point on the downhole three-dimensional tubular column image, and x, y and z represent its x-axis, y-axis and z-axis coordinates; C is a constant; d represents the distance from the camera position to the point on the downhole three-dimensional tubular column image; x_c, y_c and z_c represent the x-axis, y-axis and z-axis coordinates of the camera position; α represents the pitch angle between the camera and the point on the downhole three-dimensional tubular column image; r represents the radius of the two-dimensional tubular column image; R represents the radius of the downhole three-dimensional tubular column image; θ represents the resolution; x_o represents the x-axis coordinate of the center of the downhole three-dimensional tubular column image; n represents the width of the two-dimensional tubular column image; and m represents the length of the two-dimensional tubular column image.
In this embodiment, the two-dimensional tubular column image obtained through this mapping is an accurate conversion of the original image.
In some embodiments, performing image processing on a two-dimensional string image through a camera imaging model and a preset sleeve string model to obtain a string surface image of a three-dimensional string image downhole, including:
performing model matching on the two-dimensional tubular column image through a sleeve column model to obtain a coupling image, wherein the coupling image represents a regular image corresponding to the two-dimensional tubular column image;
and performing image processing on the coupling image through a camera imaging model to obtain a tubular column surface image.
In this embodiment, matching the coupling image converts the two-dimensional tubular column image into a regular two-dimensional image, so the subsequent conversion of the two-dimensional image into the tubular column surface image is more accurate.
In some embodiments, the image processing of the coupling image through the camera imaging model is performed according to the following formulas:
(two intermediate formulas appear only as images in the original publication, BDA0004084763420000031 and BDA0004084763420000032, presumably relating α and θ to the pixel coordinates of the tubular column surface image);
r = a*f(α);
x_1 = r*cosθ + x_o;
y_1 = r*sinθ + y_o;
Im_2(x_2, y_2) = Im_1(x_1, y_1);
wherein α represents the pitch angle between the camera and the point on the downhole three-dimensional tubular column image; a represents the camera parameter; R represents the radius of the downhole three-dimensional tubular column image; r represents the radius of the two-dimensional tubular column image; d_0 represents the distance from the camera position to the point on the downhole three-dimensional tubular column image; θ represents the resolution; Im_1 represents a point on the two-dimensional tubular column image, and x_1 and y_1 represent its x-axis and y-axis coordinates; Im_2 represents a point on the tubular column surface image, and x_2 and y_2 represent its x-axis and y-axis coordinates; n represents the width of the two-dimensional tubular column image; n_2 represents the number of vertical pixels of the tubular column surface image; f(α) represents the camera model function; and x_o and y_o represent the x-axis and y-axis coordinates of the center of the downhole three-dimensional tubular column image.
In the embodiment, the two-dimensional pipe column image can be accurately converted into the pipe column surface image through the algorithm.
In some embodiments, performing model matching on the two-dimensional tubular column image through the sleeve column model to obtain a coupling image includes:
adjusting the radius and the center coordinates of a preset coupling image through the sleeve column model until the coupling image coincides with the two-dimensional tubular column image, thereby obtaining the coupling image.
In the embodiment, the radius and the center coordinates of the preset coupling image can be adjusted until the preset coupling image is overlapped with the two-dimensional pipe column image, so that a more accurate coupling image is obtained, and the standard two-dimensional coupling image corresponding to the two-dimensional pipe column image can be better restored.
In some embodiments, before mapping the downhole three-dimensional tubular string image captured by the ultra-wide angle camera through the preset camera imaging model, the method further comprises:
acquiring a standard three-dimensional column image set shot by an ultra-wide angle camera, wherein the standard three-dimensional column image set comprises a plurality of three-dimensional column images and a plurality of standard two-dimensional column images corresponding to the three-dimensional column images;
and training the basic mapping model through the standard three-dimensional tubular column image set to obtain a camera imaging model.
In the above embodiment, the camera imaging model obtained by training the basic mapping model through the standard three-dimensional column image set can accurately convert the three-dimensional column image into the two-dimensional column surface image.
In some embodiments, training the base mapping model with the standard three-dimensional string image set includes:
inputting a plurality of three-dimensional tubular column images in a standard three-dimensional tubular column image set into a basic mapping model to obtain a plurality of two-dimensional images;
and adjusting parameters of the model until the two-dimensional images are respectively overlapped with the standard two-dimensional tubular column images.
In the embodiment, the model can be more accurate by adjusting the parameters of the model, the obtained two-dimensional tubular column surface image can be more accurate, and the result obtained by finally carrying out perforation measurement can be more accurate.
In some embodiments, measuring a target detection point on a tubular surface image includes:
measuring parameters of a target detection point on the surface image of the pipe column, wherein the parameters comprise: at least one of a major axis length parameter, a minor axis length parameter, an average pore diameter parameter, a perimeter parameter, an area parameter, and a roundness parameter.
In the above embodiment, the detection method can directly and accurately measure a plurality of parameters of the target object on the surface of the cylinder.
In a second aspect, embodiments of the present application provide an apparatus for perforation measurement, comprising:
The mapping module is used for mapping the underground three-dimensional tubular column image shot by the ultra-wide angle camera through a preset camera imaging model to obtain a two-dimensional tubular column image;
the processing module is used for carrying out image processing on the two-dimensional tubular column image through the camera imaging model and the preset sleeve column model to obtain a tubular column surface image of the underground three-dimensional tubular column image;
and the measuring module is used for measuring the target detection point on the surface image of the tubular column.
Optionally, the mapping module is specifically configured to:
mapping the space three-dimensional point set on the underground three-dimensional pipe column image into a two-dimensional point set in a plane through a camera imaging model to obtain a two-dimensional pipe column image, wherein the mapping of the space three-dimensional point set on the underground three-dimensional pipe column image into the two-dimensional point set in the plane through the camera imaging model is obtained through the following formula:
I(x_i, y_i) = a*f(T(x, y, z), C);
(three intermediate formulas appear only as images in the original publication, BDA0004084763420000051 to BDA0004084763420000053, presumably defining d, α and θ in terms of the camera position and the point T);
r = a*f(α);
x_i = r*cosθ + n/2;
y_i = r*sinθ + m/2;
wherein I represents a point on the two-dimensional tubular column image, and x_i and y_i represent its x-axis and y-axis coordinates; a represents the camera parameter; f(α) represents the camera model function and f/f(α) the camera imaging model function; T represents a point on the downhole three-dimensional tubular column image, and x, y and z represent its x-axis, y-axis and z-axis coordinates; C is a constant; d represents the distance from the camera position to the point on the downhole three-dimensional tubular column image; x_c, y_c and z_c represent the x-axis, y-axis and z-axis coordinates of the camera position; α represents the pitch angle between the camera and the point on the downhole three-dimensional tubular column image; r represents the radius of the two-dimensional tubular column image; R represents the radius of the downhole three-dimensional tubular column image; θ represents the resolution; x_o represents the x-axis coordinate of the center of the downhole three-dimensional tubular column image; n represents the width of the two-dimensional tubular column image; and m represents the length of the two-dimensional tubular column image.
Optionally, the processing module is specifically configured to:
performing model matching on the two-dimensional tubular column image through a sleeve column model to obtain a coupling image, wherein the coupling image represents a regular image corresponding to the two-dimensional tubular column image;
and performing image processing on the coupling image through a camera imaging model to obtain a tubular column surface image.
Optionally, the processing module performs the image processing on the coupling image through the camera imaging model according to the following formulas:
(two intermediate formulas appear only as images in the original publication, BDA0004084763420000061 and BDA0004084763420000062, presumably relating α and θ to the pixel coordinates of the tubular column surface image);
r = a*f(α);
x_1 = r*cosθ + x_o;
y_1 = r*sinθ + y_o;
Im_2(x_2, y_2) = Im_1(x_1, y_1);
wherein α represents the pitch angle between the camera and the point on the downhole three-dimensional tubular column image; a represents the camera parameter; R represents the radius of the downhole three-dimensional tubular column image; r represents the radius of the two-dimensional tubular column image; d_0 represents the distance from the camera position to the point on the downhole three-dimensional tubular column image; θ represents the resolution; Im_1 represents a point on the two-dimensional tubular column image, and x_1 and y_1 represent its x-axis and y-axis coordinates; Im_2 represents a point on the tubular column surface image, and x_2 and y_2 represent its x-axis and y-axis coordinates; n represents the width of the two-dimensional tubular column image; n_2 represents the number of vertical pixels of the tubular column surface image; f(α) represents the camera model function; and x_o and y_o represent the x-axis and y-axis coordinates of the center of the downhole three-dimensional tubular column image.
Optionally, the processing module is specifically configured to:
and adjusting the radius and the center coordinates of a preset coupling image through the sleeve column model until the radius and the center coordinates of the coupling image coincide with the two-dimensional tubular column image, so as to obtain the coupling image.
Optionally, the apparatus further includes:
the system comprises a training module, a mapping module and a control module, wherein the training module is used for acquiring a standard three-dimensional column image set shot by the ultra-wide angle camera before mapping the underground three-dimensional column image shot by the ultra-wide angle camera through a preset camera imaging model, wherein the standard three-dimensional column image set comprises a plurality of three-dimensional column images and a plurality of standard two-dimensional column images corresponding to the three-dimensional column images;
And training the basic mapping model through the standard three-dimensional tubular column image set to obtain a camera imaging model.
Optionally, the training module is specifically configured to:
inputting a plurality of three-dimensional tubular column images in a standard three-dimensional tubular column image set into a basic mapping model to obtain a plurality of two-dimensional images;
and adjusting parameters of the model until the two-dimensional images are respectively overlapped with the standard two-dimensional tubular column images.
Optionally, the measurement module is specifically configured to:
measuring parameters of a target detection point on the surface image of the pipe column, wherein the parameters comprise: at least one of a major axis length parameter, a minor axis length parameter, an average pore diameter parameter, a perimeter parameter, an area parameter, and a roundness parameter.
In a third aspect, embodiments of the present application provide an electronic device comprising a processor and a memory storing computer readable instructions that, when executed by the processor, perform the steps of the method as provided in the first aspect above.
In a fourth aspect, embodiments of the present application provide a readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of the method as provided in the first aspect above.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be apparent from the description, or may be learned by practice of the embodiments of the application. The objectives and other advantages of the application will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and should not be considered as limiting the scope, and other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a method of perforation measurement provided in an embodiment of the present application;
fig. 2 is a schematic diagram of a three-dimensional column image and a two-dimensional column image according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a model-matching collar image according to an embodiment of the present disclosure;
fig. 4 is a schematic diagram of a surface image of a tubular column according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a perforation parameter measurement according to an embodiment of the present application;
FIG. 6 is a schematic block diagram of an apparatus for perforation measurement according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a perforation measurement device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. The components of the embodiments of the present application, as generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as provided in the accompanying drawings, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present application without making any inventive effort, are intended to be within the scope of the present application.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only to distinguish the description, and are not to be construed as indicating or implying relative importance.
The method and the device are applied to perforation measurement scenarios. Specifically, a downhole forward-looking ultra-wide angle camera captures three-dimensional images downhole, and these three-dimensional images are converted into two-dimensional images of the downhole inner wall so that perforation measurement can be performed directly on the two-dimensional images. For example, the method is applied to perforation detection and fracturing evaluation in shale oil and gas wells.
However, the current technology for detecting perforation erosion based on a downhole array side-view camera is immature: the detection of mineral substances such as petroleum and natural gas is either completed with simple algorithms alone, or the images captured by the camera must be processed and the targets identified manually. Because the technology is imperfect, safety cannot be ensured during detection, and the detection results for some substances are inaccurate.
Therefore, in the present application, the downhole three-dimensional tubular column image captured by the ultra-wide angle camera is mapped through a preset camera imaging model to obtain a two-dimensional tubular column image; the two-dimensional tubular column image is processed through the camera imaging model and a preset sleeve column model to obtain a tubular column surface image of the downhole three-dimensional tubular column image; and a target detection point is measured on the tubular column surface image. The three-dimensional tubular column image is thus converted into a two-dimensional tubular column surface image through the camera imaging model and the sleeve column model, the surface of the three-dimensional tubular column is detected indirectly through the converted two-dimensional image, and the accuracy of perforation measurement by the downhole camera is improved.
In the embodiment of the present application, the execution body may be a perforation measurement device in a perforation measurement system, and in practical application, the perforation measurement device may be electronic devices such as an ultra-wide angle camera terminal device and a server, which are not limited herein.
The method of perforation measurement according to an embodiment of the present application is described in detail below in conjunction with fig. 1.
Referring to fig. 1, fig. 1 is a flowchart of a method for perforation measurement according to an embodiment of the present application, where the method for perforation measurement shown in fig. 1 includes:
step 110: and mapping the underground three-dimensional tubular column image shot by the ultra-wide angle camera through a preset camera imaging model to obtain a two-dimensional tubular column image.
The camera imaging model can be integrated with the ultra-wide angle camera to jointly realize downhole perforation measurement: it can directly convert the downhole three-dimensional tubular column image captured by the ultra-wide angle camera into a two-dimensional image of the downhole inner wall, i.e., a tiled image of the inner wall. The two-dimensional tubular column image can be understood as a top view of the three-dimensional tubular column image and is obtained directly through the mapping.
In some embodiments of the present application, before mapping the downhole three-dimensional tubular string image captured by the ultra-wide angle camera through the preset camera imaging model, the method shown in fig. 1 further includes: acquiring a standard three-dimensional column image set shot by an ultra-wide angle camera, wherein the standard three-dimensional column image set comprises a plurality of three-dimensional column images and a plurality of standard two-dimensional column images corresponding to the three-dimensional column images; and training the basic mapping model through the standard three-dimensional tubular column image set to obtain a camera imaging model.
In the process, the camera imaging model obtained by training the basic mapping model through the standard three-dimensional tubular column image set can accurately convert the three-dimensional tubular column image into a two-dimensional tubular column surface image.
The standard three-dimensional string image set comprises a plurality of three-dimensional string images, and the three-dimensional string images can be string images of various mineral wells. The standard two-dimensional column diagram can be obtained by adjusting a two-dimensional column diagram obtained by mapping a three-dimensional column image.
In some embodiments of the present application, training the base mapping model with a standard three-dimensional set of tubular string images includes: inputting a plurality of three-dimensional tubular column images in a standard three-dimensional tubular column image set into a basic mapping model to obtain a plurality of two-dimensional images; and adjusting parameters of the model until the two-dimensional images are respectively overlapped with the standard two-dimensional tubular column images.
In this process, adjusting the model parameters makes the model more accurate, so the obtained two-dimensional tubular column surface image is more accurate and the final perforation measurement results are more accurate.
The parameters of the model may include the parameters of the camera. Once the parameters have been adjusted so that the output two-dimensional images respectively coincide with the standard two-dimensional tubular column images, the camera imaging model is a calibrated standard model and can accurately output a standard two-dimensional tubular column image when a three-dimensional tubular column image is input.
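As a concrete illustration of this parameter adjustment, the following minimal Python sketch treats the imaging model as a black box with a single scalar camera parameter a and picks the value whose projections best coincide with the standard two-dimensional coordinates. The grid search, the function names and the mean pixel-distance error are illustrative assumptions; the patent does not specify a fitting procedure.

```python
import numpy as np

def calibrate_camera_parameter(candidate_params, imaging_model, training_set):
    """Grid-search sketch of the parameter adjustment described above.

    imaging_model(points_3d, a) -> (N, 2) array of projected pixel coordinates.
    training_set: list of (points_3d, standard_points_2d) pairs taken from the
    standard three-dimensional tubular column image set.
    """
    best_a, best_err = None, float("inf")
    for a in candidate_params:
        err = 0.0
        for points_3d, standard_2d in training_set:
            projected = imaging_model(points_3d, a)
            # mean pixel distance between projected and standard coordinates
            err += float(np.mean(np.linalg.norm(projected - standard_2d, axis=1)))
        if err < best_err:
            best_a, best_err = a, err
    return best_a  # parameter whose output best coincides with the standard images
```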
In some embodiments of the present application, mapping the downhole three-dimensional tubular column image captured by the ultra-wide angle camera through the preset camera imaging model to obtain the two-dimensional tubular column image includes: mapping the set of spatial three-dimensional points on the downhole three-dimensional tubular column image to a set of two-dimensional points in a plane through the camera imaging model to obtain the two-dimensional tubular column image, where the mapping is given by the following formulas:
I(x_i, y_i) = a*f(T(x, y, z), C);
(three intermediate formulas appear only as images in the original publication, BDA0004084763420000111 to BDA0004084763420000113, presumably defining d, α and θ in terms of the camera position and the point T);
r = a*f(α);
x_i = r*cosθ + n/2;
y_i = r*sinθ + m/2;
wherein I represents a point on the two-dimensional tubular column image, and x_i and y_i represent its x-axis and y-axis coordinates; a represents the camera parameter; f(α) represents the camera model function and f/f(α) the camera imaging model function; T represents a point on the downhole three-dimensional tubular column image, and x, y and z represent its x-axis, y-axis and z-axis coordinates; C is a constant; d represents the distance from the camera position to the point on the downhole three-dimensional tubular column image; x_c, y_c and z_c represent the x-axis, y-axis and z-axis coordinates of the camera position; α represents the pitch angle between the camera and the point on the downhole three-dimensional tubular column image; r represents the radius of the two-dimensional tubular column image; R represents the radius of the downhole three-dimensional tubular column image; θ represents the resolution; x_o represents the x-axis coordinate of the center of the downhole three-dimensional tubular column image; n represents the width of the two-dimensional tubular column image; and m represents the length of the two-dimensional tubular column image.
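The three formulas that appear only as images in the source presumably compute d, α and θ from the camera position (x_c, y_c, z_c) and the point T(x, y, z). A plausible reconstruction, offered only as an assumption consistent with the variable definitions above (a standard spherical decomposition about the camera and the image center), is:

```latex
% Assumed forms only; the patent's exact expressions are not recoverable from the text.
\begin{aligned}
d      &= \sqrt{(x - x_c)^2 + (y - y_c)^2 + (z - z_c)^2},\\
\alpha &= \arccos\!\left(\frac{z - z_c}{d}\right),\\
\theta &= \operatorname{atan2}(y - y_o,\; x - x_o).
\end{aligned}
```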
In this process, the two-dimensional tubular column image obtained through this mapping is an accurate conversion of the original image.
Referring to fig. 2, fig. 2 is a schematic diagram of a three-dimensional column image and a two-dimensional column image provided in the present application.
The two-dimensional tubular column image (a) is obtained by performing image conversion on the three-dimensional tubular column image (b) through the algorithm.
In addition, before the image conversion is performed with the above formulas, a set of spatial points on the three-dimensional tubular column image needs to be determined; these spatial points can be obtained by the following formula:
(the formula appears only as an image in the original publication, BDA0004084763420000121; it is presumably the cylinder parameterization giving x, y and z in terms of R, the image center (x_o, y_o, z_o) and θ);
wherein x, y and z represent the x-axis, y-axis and z-axis coordinates of the spatial points on the three-dimensional tubular column image; R represents the radius of the three-dimensional tubular column image; x_o, y_o and z_o represent the x-axis, y-axis and z-axis coordinates of the center of the three-dimensional tubular column image; and θ represents the angle, about the center of the three-dimensional tubular column image, of the spatial points relative to the plane.
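To make the mapping of step 110 concrete, the following Python sketch samples spatial points on a cylindrical tubular column surface and projects them to 2D image coordinates with r = a*f(α), x_i = r*cosθ + n/2, y_i = r*sinθ + m/2. The equidistant fisheye model f(α) = α and the arccos/atan2 geometry are assumptions; the patent only gives the general form of the imaging model.

```python
import numpy as np

def cylinder_points(R, center, z_range, n_theta=360, n_z=200):
    """Sample spatial points on the tubular column (cylinder) surface:
    x = R*cos(theta) + x_o, y = R*sin(theta) + y_o, z along the axis."""
    x_o, y_o, _ = center
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    z = np.linspace(z_range[0], z_range[1], n_z)
    tt, zz = np.meshgrid(theta, z)
    pts = np.stack([R * np.cos(tt) + x_o, R * np.sin(tt) + y_o, zz], axis=-1)
    return pts.reshape(-1, 3)

def map_to_2d(points_3d, cam_pos, a, img_shape, f=lambda alpha: alpha):
    """Map the 3D point set to 2D image coordinates (x_i, y_i)."""
    m, n = img_shape                                      # image length and width in pixels
    rel = np.asarray(points_3d, dtype=float) - np.asarray(cam_pos, dtype=float)
    d = np.linalg.norm(rel, axis=1)                       # distance from camera to each point
    alpha = np.arccos(np.clip(rel[:, 2] / d, -1.0, 1.0))  # assumed pitch angle
    theta = np.arctan2(rel[:, 1], rel[:, 0])              # assumed azimuth angle
    r = a * f(alpha)
    x_i = r * np.cos(theta) + n / 2.0
    y_i = r * np.sin(theta) + m / 2.0
    return np.stack([x_i, y_i], axis=1)
```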
Step 120: and performing image processing on the two-dimensional tubular column image through the camera imaging model and a preset sleeve column model to obtain a tubular column surface image of the underground three-dimensional tubular column image.
The sleeve column model is used to match a standard two-dimensional tubular column image to the two-dimensional tubular column image. Because the two-dimensional tubular column image may not be a regular downhole image in practical applications, the corresponding standard two-dimensional tubular column image needs to be matched. The tubular column surface image may be a tiled image of the inner wall of the three-dimensional tubular column image.
In some embodiments of the present application, performing image processing on a two-dimensional string image through a camera imaging model and a preset sleeve model to obtain a string surface image of a three-dimensional string image downhole, including: performing model matching on the two-dimensional tubular column image through a sleeve column model to obtain a coupling image, wherein the coupling image represents a regular image corresponding to the two-dimensional tubular column image; and performing image processing on the coupling image through a camera imaging model to obtain a tubular column surface image.
In this process, the two-dimensional tubular column image is converted into a regular two-dimensional image by matching the coupling image, so the subsequent conversion of the two-dimensional image into the tubular column surface image is more accurate.
The coupling images can be meshed two-dimensional tubular column images corresponding to the three-dimensional tubular column images, and the meshed two-dimensional tubular column images are two-dimensional images corresponding to standard cylindrical images.
Specifically, referring to fig. 3, fig. 3 is a schematic diagram of a model-matched coupling image provided in the present application, in which the white point on the inner side is the center of the two-dimensional tubular column image and the mesh cylindrical image is the coupling image obtained by matching; it can be seen that the mesh cylindrical image coincides with the inner surface of the three-dimensional tubular column image.
Specifically, referring to fig. 4, fig. 4 is a schematic diagram of a surface image of a tubular string provided in the present application, and the image in fig. 4 may be understood as a surface image of a three-dimensional tubular string, for example, may be an inner wall tile image corresponding to a cylindrical image photographed downhole.
In some embodiments of the present application, the image processing of the coupling image through the camera imaging model is performed according to the following formulas:
(two intermediate formulas appear only as images in the original publication, BDA0004084763420000141 and BDA0004084763420000142, presumably relating α and θ to the pixel coordinates of the tubular column surface image);
r = a*f(α);
x_1 = r*cosθ + x_o;
y_1 = r*sinθ + y_o;
Im_2(x_2, y_2) = Im_1(x_1, y_1);
wherein α represents the pitch angle between the camera and the point on the downhole three-dimensional tubular column image; a represents the camera parameter; R represents the radius of the downhole three-dimensional tubular column image; r represents the radius of the two-dimensional tubular column image; d_0 represents the distance from the camera position to the point on the downhole three-dimensional tubular column image; θ represents the resolution; Im_1 represents a point on the two-dimensional tubular column image, and x_1 and y_1 represent its x-axis and y-axis coordinates; Im_2 represents a point on the tubular column surface image, and x_2 and y_2 represent its x-axis and y-axis coordinates; n represents the width of the two-dimensional tubular column image; n_2 represents the number of vertical pixels of the tubular column surface image; f(α) represents the camera model function; and x_o and y_o represent the x-axis and y-axis coordinates of the center of the downhole three-dimensional tubular column image.
In the process, the two-dimensional pipe column image can be accurately converted into the pipe column surface image through the algorithm.
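A minimal sketch of this unrolling step is shown below. It assumes the two-dimensional tubular column image is a circular (fisheye-style) view and lets r vary linearly with the surface row index, which is a simplification of the r = a*f(α) relation above; function and parameter names are illustrative.

```python
import numpy as np

def unroll_to_surface(img_2d, center, surface_shape, r_inner, r_outer):
    """Build the tubular column surface (tiled) image: each surface pixel
    (row, col) is looked up in the 2D image at
    x_1 = r*cos(theta) + x_o, y_1 = r*sin(theta) + y_o."""
    x_o, y_o = center
    n2, n_cols = surface_shape                         # vertical and horizontal pixel counts
    h, w = img_2d.shape[:2]
    rows = np.arange(n2)
    cols = np.arange(n_cols)
    r = r_inner + (r_outer - r_inner) * rows / max(n2 - 1, 1)   # radius per surface row
    theta = 2.0 * np.pi * cols / n_cols                          # angle per surface column
    rr, tt = np.meshgrid(r, theta, indexing="ij")
    x1 = np.clip(np.round(rr * np.cos(tt) + x_o).astype(int), 0, w - 1)
    y1 = np.clip(np.round(rr * np.sin(tt) + y_o).astype(int), 0, h - 1)
    return img_2d[y1, x1]                              # Im_2(x_2, y_2) = Im_1(x_1, y_1)
```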
In some embodiments of the present application, performing model matching on the two-dimensional tubular column image through the sleeve column model to obtain the coupling image includes: adjusting the radius and the center coordinates of a preset coupling image through the sleeve column model until the coupling image coincides with the two-dimensional tubular column image, thereby obtaining the coupling image.
In this process, the radius and the center coordinates of the preset coupling image are adjusted until it coincides with the two-dimensional tubular column image, so a more accurate coupling image is obtained and the standard two-dimensional coupling image corresponding to the two-dimensional tubular column image can be better restored.
Coupling images can also be stored in a gallery in advance, including coupling images of different specifications, so that a coupling image of the corresponding specification can be matched directly when the method is used; alternatively, the radius and the center coordinates of the coupling image are adjusted as described above until they coincide with the two-dimensional tubular column image, giving the standard coupling image.
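One way the adjust-until-coincident matching could be implemented is a circle fit on the two-dimensional tubular column image; the Hough-transform sketch below is an illustrative assumption, not the method prescribed by the patent, and the threshold values are placeholders.

```python
import cv2

def match_collar(img_2d_gray, r_min, r_max):
    """Estimate the coupling circle (center and radius) on the two-dimensional
    tubular column image so that it coincides with the pipe wall."""
    blurred = cv2.GaussianBlur(img_2d_gray, (9, 9), 2)
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.5,
                               minDist=img_2d_gray.shape[0] // 2,
                               param1=120, param2=40,
                               minRadius=r_min, maxRadius=r_max)
    if circles is None:
        return None
    x_o, y_o, r = circles[0][0]                        # strongest detected circle
    return (float(x_o), float(y_o)), float(r)
```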
Step 130: and measuring a target detection point on the surface image of the tubular column.
In some embodiments of the present application, measuring a target detection point on a tube post surface image includes: measuring parameters of a target detection point on the surface image of the pipe column, wherein the parameters comprise: at least one of a major axis length parameter, a minor axis length parameter, an average pore diameter parameter, a perimeter parameter, an area parameter, and a roundness parameter.
In the process, the detection method can be used for directly and accurately measuring a plurality of parameters of the target object on the surface of the cylinder.
The detection points may be targets of downhole mineral detection or perforations to be detected.
Specifically, referring to fig. 5, fig. 5 is a schematic diagram of perforation parameter measurement provided in the present application, where parameters shown in the drawings include: long axis length parameters, short axis length parameters, average pore diameter parameters, perimeter parameters, area parameters, and roundness parameters.
For example: long axis length: 19.6 mm, short axis length: 11.4 mm, average pore diameter: 15.1 mm, perimeter: 51.6 mm, area: 180.2 mm², and roundness: 0.85.
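As an illustration of how such parameters could be computed from a segmented perforation on the tubular column surface image, the sketch below uses OpenCV contour analysis. The segmentation itself, the pixel-to-millimetre scale, the averaging of the two axes and the 4πA/P² roundness definition are assumptions and are not taken from the patent.

```python
import cv2
import numpy as np

def measure_perforation(mask, mm_per_pixel):
    """Measure one perforation from a binary mask (nonzero = perforation)."""
    contours, _ = cv2.findContours(mask.astype(np.uint8),
                                   cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    c = max(contours, key=cv2.contourArea)             # largest blob = the perforation
    (_, _), (d1, d2), _ = cv2.fitEllipse(c)            # ellipse axes in pixels (needs >= 5 points)
    major = max(d1, d2) * mm_per_pixel
    minor = min(d1, d2) * mm_per_pixel
    perimeter = cv2.arcLength(c, True) * mm_per_pixel
    area = cv2.contourArea(c) * mm_per_pixel ** 2
    roundness = 4.0 * np.pi * area / perimeter ** 2 if perimeter > 0 else 0.0
    return {"long_axis_mm": major, "short_axis_mm": minor,
            "average_diameter_mm": (major + minor) / 2.0,
            "perimeter_mm": perimeter, "area_mm2": area, "roundness": roundness}
```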
In the process shown in fig. 1, the downhole three-dimensional tubular column image captured by the ultra-wide angle camera is mapped through a preset camera imaging model to obtain a two-dimensional tubular column image; the two-dimensional tubular column image is processed through the camera imaging model and a preset sleeve column model to obtain a tubular column surface image of the downhole three-dimensional tubular column image; and a target detection point is measured on the tubular column surface image. The three-dimensional tubular column image can thus be converted into a two-dimensional tubular column surface image through the camera imaging model and the sleeve column model, the surface of the three-dimensional tubular column can be detected indirectly through the converted two-dimensional image, and the accuracy of perforation measurement by the downhole camera can be improved.
The method of perforation measurement is described above by means of fig. 1 and the apparatus for perforation measurement is described below in connection with fig. 6-7.
Referring to fig. 6, a schematic block diagram of an apparatus 600 for perforation measurement according to an embodiment of the present application is provided; the apparatus 600 may be a module, a program segment, or code on an electronic device. The apparatus 600 corresponds to the above embodiment of the method of fig. 1 and is capable of performing the steps involved in that embodiment; the specific functions of the apparatus 600 can be found in the following description, and detailed descriptions are omitted where appropriate to avoid redundancy.
Optionally, the apparatus 600 includes:
the mapping module 610 is configured to map the underground three-dimensional tubular column image captured by the ultra-wide angle camera through a preset camera imaging model, so as to obtain a two-dimensional tubular column image;
the processing module 620 is configured to perform image processing on the two-dimensional string image through the camera imaging model and a preset sleeve model, so as to obtain a string surface image of the downhole three-dimensional string image;
and the measurement module 630 is used for measuring the target detection point on the surface image of the tubular column.
Optionally, the mapping module is specifically configured to:
mapping the space three-dimensional point set on the underground three-dimensional pipe column image into a two-dimensional point set in a plane through a camera imaging model to obtain a two-dimensional pipe column image, wherein the mapping of the space three-dimensional point set on the underground three-dimensional pipe column image into the two-dimensional point set in the plane through the camera imaging model is obtained through the following formula:
I(x_i, y_i) = a*f(T(x, y, z), C);
(three intermediate formulas appear only as images in the original publication, BDA0004084763420000161 to BDA0004084763420000163, presumably defining d, α and θ in terms of the camera position and the point T);
r = a*f(α);
x_i = r*cosθ + n/2;
y_i = r*sinθ + m/2;
wherein I represents a point on the two-dimensional tubular column image, and x_i and y_i represent its x-axis and y-axis coordinates; a represents the camera parameter; f(α) represents the camera model function and f/f(α) the camera imaging model function; T represents a point on the downhole three-dimensional tubular column image, and x, y and z represent its x-axis, y-axis and z-axis coordinates; C is a constant; d represents the distance from the camera position to the point on the downhole three-dimensional tubular column image; x_c, y_c and z_c represent the x-axis, y-axis and z-axis coordinates of the camera position; α represents the pitch angle between the camera and the point on the downhole three-dimensional tubular column image; r represents the radius of the two-dimensional tubular column image; R represents the radius of the downhole three-dimensional tubular column image; θ represents the resolution; x_o represents the x-axis coordinate of the center of the downhole three-dimensional tubular column image; n represents the width of the two-dimensional tubular column image; and m represents the length of the two-dimensional tubular column image.
Optionally, the processing module is specifically configured to:
performing model matching on the two-dimensional tubular column image through a sleeve column model to obtain a coupling image, wherein the coupling image represents a regular image corresponding to the two-dimensional tubular column image; and performing image processing on the coupling image through a camera imaging model to obtain a tubular column surface image.
Optionally, the processing module performs the image processing on the coupling image through the camera imaging model according to the following formulas:
(two intermediate formulas appear only as images in the original publication, BDA0004084763420000171 and BDA0004084763420000172, presumably relating α and θ to the pixel coordinates of the tubular column surface image);
r = a*f(α);
x_1 = r*cosθ + x_o;
y_1 = r*sinθ + y_o;
Im_2(x_2, y_2) = Im_1(x_1, y_1);
wherein α represents the pitch angle between the camera and the point on the downhole three-dimensional tubular column image; a represents the camera parameter; R represents the radius of the downhole three-dimensional tubular column image; r represents the radius of the two-dimensional tubular column image; d_0 represents the distance from the camera position to the point on the downhole three-dimensional tubular column image; θ represents the resolution; Im_1 represents a point on the two-dimensional tubular column image, and x_1 and y_1 represent its x-axis and y-axis coordinates; Im_2 represents a point on the tubular column surface image, and x_2 and y_2 represent its x-axis and y-axis coordinates; n represents the width of the two-dimensional tubular column image; n_2 represents the number of vertical pixels of the tubular column surface image; f(α) represents the camera model function; and x_o and y_o represent the x-axis and y-axis coordinates of the center of the downhole three-dimensional tubular column image.
Optionally, the processing module is specifically configured to:
and adjusting the radius and the center coordinates of a preset coupling image through the sleeve column model until the radius and the center coordinates of the coupling image coincide with the two-dimensional tubular column image, so as to obtain the coupling image.
Optionally, the apparatus further includes:
the system comprises a training module, a mapping module and a control module, wherein the training module is used for acquiring a standard three-dimensional column image set shot by the ultra-wide angle camera before mapping the underground three-dimensional column image shot by the ultra-wide angle camera through a preset camera imaging model, wherein the standard three-dimensional column image set comprises a plurality of three-dimensional column images and a plurality of standard two-dimensional column images corresponding to the three-dimensional column images; and training the basic mapping model through the standard three-dimensional tubular column image set to obtain a camera imaging model.
Optionally, the training module is specifically configured to:
inputting a plurality of three-dimensional tubular column images in a standard three-dimensional tubular column image set into a basic mapping model to obtain a plurality of two-dimensional images; and adjusting parameters of the model until the two-dimensional images are respectively overlapped with the standard two-dimensional tubular column images.
Optionally, the measurement module is specifically configured to:
measuring parameters of a target detection point on the surface image of the pipe column, wherein the parameters comprise: at least one of a major axis length parameter, a minor axis length parameter, an average pore diameter parameter, a perimeter parameter, an area parameter, and a roundness parameter.
Referring to fig. 7, which is a schematic structural diagram of an apparatus for perforation measurement according to an embodiment of the present application, the apparatus may include a memory 710 and a processor 720. Optionally, the apparatus may further include: a communication interface 730 and a communication bus 740. The apparatus corresponds to the embodiment of the method of fig. 1 described above and is capable of performing the steps involved in that embodiment; the specific functions of the apparatus can be found in the following description.
In particular, the memory 710 is used to store computer readable instructions.
Processor 720, which processes the memory-stored readable instructions, is capable of performing the various steps in the method of fig. 1.
Communication interface 730 for communicating signaling or data with other node devices. For example: for communication with a server or terminal, or with other device nodes, the embodiments of the application are not limited in this regard.
A communication bus 740 for implementing direct connection communication of the above-described components.
The communication interface 730 of the device in the embodiment of the present application is used to perform signaling or data communication with other node devices. The memory 710 may be a high-speed RAM memory or a non-volatile memory, such as at least one disk memory. Memory 710 may optionally also be at least one storage device located remotely from the aforementioned processor. The memory 710 has stored therein computer readable instructions which, when executed by the processor 720, perform the method process described above in fig. 1. Processor 720 may be used on apparatus 600 and to perform the functions herein. By way of example, the processor 720 may be a general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), an off-the-shelf programmable gate array (Field Programmable Gate Array, FPGA) or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, and the embodiments are not limited in this regard.
Embodiments of the present application also provide a readable storage medium storing a computer program which, when executed by a processor, performs the method process performed by the electronic device in the method embodiment shown in fig. 1.
It will be clear to those skilled in the art that, for convenience and brevity of description, reference may be made to the corresponding procedure in the foregoing method for the specific working procedure of the apparatus described above, and this will not be repeated here.
In summary, the embodiment of the application provides a perforation measurement method, a perforation measurement device, an electronic device and a readable storage medium, wherein the method comprises the steps of mapping an underground three-dimensional tubular column image shot by an ultra-wide angle camera through a preset camera imaging model to obtain a two-dimensional tubular column image; performing image processing on the two-dimensional tubular column image through a camera imaging model and a preset sleeve column model to obtain a tubular column surface image of the underground three-dimensional tubular column image; and measuring a target detection point on the surface image of the tubular column. The method can achieve the effect of improving the accuracy of measuring perforation by the underground camera.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other manners as well. The apparatus embodiments described above are merely illustrative, for example, flow diagrams and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, the functional modules in the embodiments of the present application may be integrated together to form a single part, or each module may exist alone, or two or more modules may be integrated to form a single part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The foregoing is merely exemplary embodiments of the present application and is not intended to limit the scope of the present application, and various modifications and variations may be suggested to one skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present application should be included in the protection scope of the present application. It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
The foregoing is merely specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes and substitutions are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.

Claims (11)

1. A method of perforation measurement, comprising:
mapping the underground three-dimensional tubular column image shot by the ultra-wide angle camera through a preset camera imaging model to obtain a two-dimensional tubular column image;
performing image processing on the two-dimensional tubular column image through the camera imaging model and a preset sleeve column model to obtain a tubular column surface image of the underground three-dimensional tubular column image;
and measuring a target detection point on the pipe column surface image.
2. The method of claim 1, wherein the mapping the underground three-dimensional tubular column image shot by the ultra-wide angle camera through the preset camera imaging model to obtain a two-dimensional tubular column image comprises:
mapping a spatial three-dimensional point set on the underground three-dimensional tubular column image into a two-dimensional point set in a plane through the camera imaging model to obtain the two-dimensional tubular column image, wherein the mapping is performed through the following formulas:
I(x_i, y_i) = a * f(T(x, y, z), C);
[Three further formulas appear only as images in the original (FDA0004084763410000011–FDA0004084763410000013); from the definitions below they presumably define the distance d from the camera position to the point, the pitch angle α, and the angle θ.]
r = a * f(α);
x_i = r * cosθ + n/2;
y_i = r * sinθ + m/2;
wherein I represents a point on the two-dimensional tubular column image, x_i represents the x-axis coordinate of a point on the two-dimensional tubular column image, y_i represents the y-axis coordinate of a point on the two-dimensional tubular column image, a represents a camera parameter, f(α) represents the camera model function, f(·) represents the camera imaging model function, T represents a point on the underground three-dimensional tubular column image, x represents the x-axis coordinate of a point on the underground three-dimensional tubular column image, y represents the y-axis coordinate of a point on the underground three-dimensional tubular column image, z represents the z-axis coordinate of a point on the underground three-dimensional tubular column image, C is a constant, d represents the distance from the camera position to a point on the underground three-dimensional tubular column image, x_c represents the x-axis coordinate of the camera position, y_c represents the y-axis coordinate of the camera position, z_c represents the z-axis coordinate of the camera position, α represents the pitch angle between the camera and a point on the underground three-dimensional tubular column image, r represents the radius on the two-dimensional tubular column image, R represents the radius of the underground three-dimensional tubular column image, θ represents the resolution, x_o represents the x-axis coordinate of the center of the underground three-dimensional tubular column image, n represents the width of the two-dimensional tubular column image, and m represents the length of the two-dimensional tubular column image.
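As a non-limiting numerical illustration of the forward mapping in this claim, the sketch below projects one point of the underground three-dimensional tubular column onto the two-dimensional tubular column image. The equidistant fisheye model f(α) = α, the interpretation of θ (called the resolution above) as the azimuth angle about the optical axis, and the example values of a, n and m are assumptions of this sketch, not details fixed by the claim.

```python
import numpy as np

def project_point(p, camera_pos, a=200.0, n=1024, m=1024, f=lambda alpha: alpha):
    """Sketch of the 3D-to-2D mapping: returns pixel coordinates (x_i, y_i).

    p          -- (x, y, z) point on the underground three-dimensional tubular column
    camera_pos -- (x_c, y_c, z_c) camera position
    a          -- camera parameter (illustrative value)
    n, m       -- width and length of the two-dimensional image in pixels
    f          -- camera model function; an equidistant fisheye f(alpha) = alpha is assumed
    """
    x, y, z = p
    xc, yc, zc = camera_pos
    dx, dy, dz = x - xc, y - yc, z - zc

    d = np.sqrt(dx * dx + dy * dy + dz * dz)   # distance from the camera to the point
    alpha = np.arccos(dz / d)                  # pitch angle from the optical axis (assumed +z)
    theta = np.arctan2(dy, dx)                 # azimuth about the optical axis

    r = a * f(alpha)                           # r = a*f(alpha)
    x_i = r * np.cos(theta) + n / 2            # x_i = r*cos(theta) + n/2
    y_i = r * np.sin(theta) + m / 2            # y_i = r*sin(theta) + m/2
    return x_i, y_i
```

For example, with the illustrative values above, project_point((0.08, 0.0, 0.5), (0.0, 0.0, 0.0)) maps a point on a casing wall of radius 0.08 m, half a metre ahead of the camera, to a pixel roughly 32 pixels from the image centre.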
3. The method according to claim 1 or 2, wherein the performing image processing on the two-dimensional string image by using the camera imaging model and a preset sleeve string model to obtain a string surface image of the downhole three-dimensional string image comprises:
Performing model matching on the two-dimensional tubular column image through the sleeve column model to obtain a coupling image, wherein the coupling image represents a regular image corresponding to the two-dimensional tubular column image;
and performing image processing on the coupling image through the camera imaging model to obtain the pipe column surface image.
4. The method according to claim 3, wherein the image processing of the coupling image through the camera imaging model is performed by the following formulas:
[Two further formulas appear only as images in the original (FDA0004084763410000021–FDA0004084763410000022); from the definitions below they presumably relate the pitch angle α to the casing radius R, the distance d_0 and the camera parameter a.]
r = a * f(α);
x_1 = r * cosθ + x_o;
y_1 = r * sinθ + y_o;
Im_2(x_2, y_2) = Im_1(x_1, y_1);
wherein α represents the pitch angle between the camera and a point on the underground three-dimensional tubular column image, a represents the camera parameter, R represents the radius of the underground three-dimensional tubular column image, r represents the radius on the two-dimensional tubular column image, d_0 represents the distance from the camera position to a point on the underground three-dimensional tubular column image, θ represents the resolution, Im_1 represents a point on the two-dimensional tubular column image, x_1 represents the x-axis coordinate of a point on the two-dimensional tubular column image, y_1 represents the y-axis coordinate of a point on the two-dimensional tubular column image, Im_2 represents a point on the tubular column surface image, x_2 represents the x-axis coordinate of a point on the tubular column surface image, y_2 represents the y-axis coordinate of a point on the tubular column surface image, n represents the width of the two-dimensional tubular column image, n_2 represents the vertical number of pixels of the tubular column surface image, f(α) represents the camera model function, x_o represents the x-axis coordinate of the center of the underground three-dimensional tubular column image, and y_o represents the y-axis coordinate of the center of the underground three-dimensional tubular column image.
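As a non-limiting sketch of the unrolling step in this claim, the code below builds the tubular column surface image Im_2 by sampling the coupling image Im_1 at (x_1, y_1) = (r·cosθ + x_o, r·sinθ + y_o) for every target pixel. Because the formulas relating the target row to the pitch angle α appear only as images in the original, the linear axial-offset relation used here is an assumption, as are the example values of R, d_0, a and the equidistant model f(α) = α.

```python
import numpy as np

def unroll_collar_image(im1, x_o, y_o, R=0.08, d0=0.5, a=200.0, n=720, n2=480,
                        f=lambda alpha: alpha):
    """Sketch: unroll the coupling image im1 into a tubular column surface image im2."""
    im2 = np.zeros((n2, n), dtype=im1.dtype)
    for i in range(n2):                       # rows: positions along the casing axis
        # Assumed geometry: the axial offset grows linearly with the row index.
        h = d0 + (i / n2) * d0
        alpha = np.arctan2(R, h)              # pitch angle to a wall point at offset h
        r = a * f(alpha)                      # radial pixel distance in im1
        for j in range(n):                    # columns: azimuth around the casing
            theta = 2.0 * np.pi * j / n
            x1 = int(round(r * np.cos(theta) + x_o))
            y1 = int(round(r * np.sin(theta) + y_o))
            if 0 <= y1 < im1.shape[0] and 0 <= x1 < im1.shape[1]:
                im2[i, j] = im1[y1, x1]       # Im_2(x_2, y_2) = Im_1(x_1, y_1)
    return im2
```

A production implementation would vectorise this double loop with NumPy index arrays and use interpolation rather than nearest-neighbour sampling.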
5. The method according to claim 3, wherein the model matching of the two-dimensional tubular column image through the sleeve column model to obtain a coupling image comprises:
and adjusting the radius and the center coordinates of a preset coupling image through the sleeve column model until the radius and the center coordinates of the coupling image coincide with the two-dimensional tubular column image, so as to obtain the coupling image.
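The claim leaves open how the radius and center coordinates are adjusted until they coincide with the two-dimensional tubular column image; the sketch below uses a Hough circle transform as one plausible way to obtain such a radius and center. The blur kernel, the Hough parameters, and the helper name fit_collar_circle are illustrative assumptions, not the patented procedure.

```python
import cv2

def fit_collar_circle(two_d_image):
    """Sketch: estimate the collar circle's centre (x_o, y_o) and radius.

    two_d_image is assumed to be an 8-bit single-channel image.
    """
    blurred = cv2.GaussianBlur(two_d_image, (9, 9), 2)
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.5,
                               minDist=blurred.shape[0] // 2,
                               param1=100, param2=40,
                               minRadius=blurred.shape[0] // 8,
                               maxRadius=blurred.shape[0] // 2)
    if circles is None:
        return None
    x_o, y_o, radius = circles[0][0]          # strongest circle: centre and radius
    return float(x_o), float(y_o), float(radius)
```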
6. The method according to claim 1 or 2, wherein before the mapping process of the downhole three-dimensional string image captured by the ultra-wide angle camera by the preset camera imaging model, the method further comprises:
acquiring a standard three-dimensional column image set shot by the ultra-wide angle camera, wherein the standard three-dimensional column image set comprises a plurality of three-dimensional column images and a plurality of standard two-dimensional column images corresponding to the three-dimensional column images;
and training the basic mapping model through the standard three-dimensional tubular column image set to obtain the camera imaging model.
7. The method of claim 6, wherein the training the base mapping model with the standard three-dimensional string image set comprises:
inputting the three-dimensional tubular column images in the standard three-dimensional tubular column image set into the basic mapping model to obtain a plurality of two-dimensional images;
and adjusting parameters of the basic mapping model until the plurality of two-dimensional images respectively coincide with the plurality of standard two-dimensional tubular column images.
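A minimal sketch of the training described in claims 6 and 7, assuming the basic mapping model is parameterised by a polynomial fisheye profile plus the camera position, and that "coincide" is realised as minimising the reprojection error between projected points and their positions in the standard two-dimensional tubular column images; none of these choices are fixed by the claims.

```python
import numpy as np
from scipy.optimize import least_squares

def calibrate_camera_model(points_3d, points_2d_standard, n=1024, m=1024):
    """Sketch: fit camera-model parameters so projections match the standard 2D images.

    points_3d          -- (N, 3) points sampled from the standard 3D column images
    points_2d_standard -- (N, 2) their known positions in the standard 2D images
    A polynomial fisheye r = k1*alpha + k3*alpha**3 is an assumed parameterisation.
    """
    def residuals(params):
        k1, k3, xc, yc, zc = params
        res = []
        for (x, y, z), (u_std, v_std) in zip(points_3d, points_2d_standard):
            dx, dy, dz = x - xc, y - yc, z - zc
            d = np.sqrt(dx * dx + dy * dy + dz * dz)   # assumed nonzero
            alpha = np.arccos(dz / d)
            theta = np.arctan2(dy, dx)
            r = k1 * alpha + k3 * alpha ** 3
            u = r * np.cos(theta) + n / 2
            v = r * np.sin(theta) + m / 2
            res.extend([u - u_std, v - v_std])         # reprojection error
        return np.asarray(res)

    initial = np.array([200.0, 0.0, 0.0, 0.0, 0.0])    # illustrative start values
    fit = least_squares(residuals, initial)
    return fit.x                                       # fitted model parameters
```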
8. The method according to claim 1 or 2, wherein said measuring a target detection point on said pipe string surface image comprises:
measuring parameters of a target detection point on the pipe column surface image, wherein the parameters comprise: at least one of a major axis length parameter, a minor axis length parameter, an average pore diameter parameter, a perimeter parameter, an area parameter, and a roundness parameter.
9. An apparatus for perforation measurement, comprising:
the mapping module is used for mapping the underground three-dimensional tubular column image shot by the ultra-wide angle camera through a preset camera imaging model to obtain a two-dimensional tubular column image;
the processing module is used for carrying out image processing on the two-dimensional tubular column image through the camera imaging model and a preset sleeve column model to obtain a tubular column surface image of the underground three-dimensional tubular column image;
And the measuring module is used for measuring the target detection point on the pipe column surface image.
10. An electronic device, comprising:
a memory and a processor, the memory storing computer readable instructions that, when executed by the processor, perform the steps in the method of any of claims 1-8.
11. A computer-readable storage medium, comprising:
computer program which, when run on a computer, causes the computer to perform the method according to any one of claims 1-8.
CN202310132523.0A 2023-02-18 2023-02-18 Perforation measurement method, device, equipment and readable storage medium Active CN116128850B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310132523.0A CN116128850B (en) 2023-02-18 2023-02-18 Perforation measurement method, device, equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310132523.0A CN116128850B (en) 2023-02-18 2023-02-18 Perforation measurement method, device, equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN116128850A true CN116128850A (en) 2023-05-16
CN116128850B CN116128850B (en) 2023-11-21

Family

ID=86295456

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310132523.0A Active CN116128850B (en) 2023-02-18 2023-02-18 Perforation measurement method, device, equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN116128850B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117005847A (en) * 2023-10-07 2023-11-07 西安正实智能科技有限公司 Oil-gas well perforation visualization quantitative detection device adopting ultra-wide angle lens

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108019202A (en) * 2016-10-31 2018-05-11 中国石油集团长城钻探工程有限公司 A kind of display methods of 3-D view
CN108805147A (en) * 2017-04-27 2018-11-13 中国石油集团长城钻探工程有限公司 A kind of tube or cased well jacket casing damage characteristics of image mode identification method
CN111879264A (en) * 2020-06-30 2020-11-03 武汉数字化设计与制造创新中心有限公司 Flatness measurement and evaluation system based on line structured light
US20220074299A1 (en) * 2020-09-10 2022-03-10 Baker Hughes Oilfield Operations Llc System and method for diagnosing borehole structure variances using independent component analysis
CN115131374A (en) * 2022-07-22 2022-09-30 上海大学 Petroleum drill pipe diameter-changing positioning method and system based on three-dimensional point cloud and electronic equipment
US20230003917A1 (en) * 2020-06-18 2023-01-05 Shandong University Three-dimensional imaging method and system for surface comprehensive geophysical prospecting



Also Published As

Publication number Publication date
CN116128850B (en) 2023-11-21


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant