CN110246157B - Oil-gas field equipment production state distinguishing system and method based on big data monitoring - Google Patents

Oil-gas field equipment production state distinguishing system and method based on big data monitoring

Info

Publication number
CN110246157B
Authority
CN
China
Prior art keywords
pumping unit
oil
unit
video
database
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910542449.3A
Other languages
Chinese (zh)
Other versions
CN110246157A (en)
Inventor
崔日华
邹本泉
赵叶辰
张健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing fujirui Optoelectronic Technology Co.,Ltd.
DAQING ANRUIDA TECHNOLOGY DEVELOPMENT Co.,Ltd.
Original Assignee
Beijing Fjr Optoelectronic Technology Co ltd
Daqing Anruida Technology Development Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Fjr Optoelectronic Technology Co ltd, Daqing Anruida Technology Development Co ltd filed Critical Beijing Fjr Optoelectronic Technology Co ltd
Priority to CN201910542449.3A priority Critical patent/CN110246157B/en
Publication of CN110246157A publication Critical patent/CN110246157A/en
Application granted granted Critical
Publication of CN110246157B publication Critical patent/CN110246157B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/22Indexing; Data structures therefor; Storage structures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/02Agriculture; Fishing; Mining
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/248Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30232Surveillance

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Animal Husbandry (AREA)
  • Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Agronomy & Crop Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Mining & Mineral Resources (AREA)
  • Multimedia (AREA)
  • Software Systems (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a system and method for discriminating the production state of oil and gas field equipment based on big data monitoring. The system comprises a storage unit, a video inspection control unit and a video analysis and identification unit. Each time the video inspection control unit issues an inspection instruction, it adds to a database an information frame containing attribute information of the preset point, such as information on the corresponding pumping unit. The video analysis and identification unit obtains the attribute information of the preset point from the database, determines the position of the pumping unit in the video picture through image analysis and comparison, judges the working state of the pumping unit according to the motion track of an observation point, and returns the processed recognition result to the database. The video inspection control unit then reads the returned result. The technique can identify the working state of the pumping unit without manual monitoring, so that idling can be detected in time and an idling pumping unit can be started or stopped promptly.

Description

Oil-gas field equipment production state distinguishing system and method based on big data monitoring
Technical Field
The invention relates to an information processing technology, in particular to an oil and gas field equipment production state distinguishing system and method based on big data monitoring.
Background
At present, due to long-term operation and extraction, the fluid volume in some areas can no longer sustain full-load operation of the pumping unit, so the pumping unit idles part of the time, and this working state wastes a large amount of resources. Therefore, in oil extraction operations in the petroleum industry, for wells with insufficient liquid production a scientific intermittent pumping schedule is established by comprehensively analyzing the well's liquid production, and the pumping equipment is started and stopped accordingly to achieve energy saving and consumption reduction. Such wells are commonly referred to as intermittent pumping wells. First, the working state of the pumping unit must be judged; second, before start-up it must be checked whether a potential safety hazard exists nearby.
However, the working state of an oil well is currently still judged by primitive means such as manual inspection, which suffers from long inspection times, high labor intensity and delayed discovery of problems.
Disclosure of Invention
The following presents a simplified summary of the invention in order to provide a basic understanding of some aspects of the invention. It should be understood that this summary is not an exhaustive overview of the invention. It is not intended to identify key or critical elements of the invention or to delineate the scope of the invention. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description that is discussed later.
In view of the above, the invention provides a system and a method for judging the production state of oil and gas field equipment based on big data monitoring, so as to at least solve the problems of long inspection time, high labor intensity and delayed discovery that exist when the working state of an oil well is judged by conventional manual inspection.
The invention provides an oil and gas field equipment production state discrimination system based on big data monitoring, which comprises a storage unit, a video inspection control unit and a video analysis and identification unit. The storage unit is used for storing a predetermined database; the video inspection control unit is used for setting video inspection preset points and, after each inspection instruction is sent, adding a corresponding information frame to the database, the information frame comprising attribute information of the preset point, and the attribute information including at least the pumping unit information corresponding to the preset point; the video analysis and identification unit is used for obtaining the attribute information of the preset point from the database, analyzing and comparing the image of the preset point to determine the position of the pumping unit in the video picture, judging the working state of the pumping unit according to the motion track of the observation point, and returning the processed recognition result to the database; the video inspection control unit is also used for reading, from the database, the result returned by the video analysis and identification unit.
Furthermore, the attribute information of each preset point comprises an inspection point data table and a pumping unit data table. The inspection point data table comprises one or more of: an inspection point identifier, pan/tilt head information, camera type, camera information, observation range and a pumping unit identifier array. The pumping unit data table comprises one or more of: a pumping unit identifier, the identifier of the inspection point where it is located, the pumping unit type and the circumscribed rectangle of the pumping unit.
Further, the system also comprises a model obtaining unit, wherein the model obtaining unit is used for constructing an identification model for identifying the working state of the pumping unit.
Further, the working state of the pumping unit comprises: idle running and normal running.
Furthermore, in the process of training the recognition model, the model obtaining unit first identifies the relative position of the pumping unit in the image picture by using a database sample model, performs displacement offset correction with a data algorithm, recognizes the corrected pumping unit state at 1 frame per second, and then judges from the frame-by-frame image comparison whether the moving part of the pumping unit is moving.
The invention also provides an oil and gas field equipment production state discrimination method based on big data monitoring, which comprises the following steps: step one, setting video inspection preset points and, after each inspection instruction is sent, adding a corresponding information frame to a database, the information frame comprising attribute information of the preset point, and the attribute information including at least the pumping unit information corresponding to the preset point; step two, obtaining the attribute information of the preset point from the database, analyzing and comparing images of the preset point to determine the position of the pumping unit in the video picture, and judging the working state of the pumping unit according to the motion track of the observation point; step three, processing the recognized result and returning it to the database; step four, reading from the database the result returned by the video analysis and identification unit; and step five, repeating steps one to four at the next video inspection.
Further, the attribute information of each preset point comprises an inspection point data table and a pumping unit data table: the inspection point data table comprises one or more of an inspection point identifier, pan/tilt head information, camera type, camera information, observation range and a pumping unit identifier array; the pumping unit data table comprises one or more of a pumping unit identifier, the identifier of the inspection point where it is located, the pumping unit type and the circumscribed rectangle of the pumping unit.
Further, the method further comprises: and constructing an identification model for identifying the working state of the pumping unit.
Further, the working state of the pumping unit comprises: idle running and normal running.
Further, in the process of training the recognition model, the relative position of the pumping unit in the image picture is recognized by using a database sample model, displacement offset correction is performed with a data algorithm, the corrected pumping unit state is recognized at 1 frame per second, and whether the moving part of the pumping unit is moving is judged from the frame-by-frame image comparison.
The invention provides an oil and gas field equipment production state discrimination system and method based on big data monitoring, which can identify the working state of a pumping unit without manual monitoring, detect idling in time, and start and stop an idling pumping unit promptly.
These and other advantages of the present invention will become more apparent from the following detailed description of the preferred embodiments of the present invention, taken in conjunction with the accompanying drawings.
Drawings
The invention may be better understood by referring to the following description in conjunction with the accompanying drawings, in which like reference numerals are used throughout the figures to indicate like or similar parts. The accompanying drawings, which are incorporated in and form a part of this specification, illustrate preferred embodiments of the present invention and, together with the detailed description, serve to further explain the principles and advantages of the invention. Wherein:
FIG. 1 is a block diagram showing the structure of one example of the big data monitoring-based oil and gas field equipment production state discrimination system of the invention;
FIG. 2 is a block diagram showing another example of the big data monitoring based oil and gas field equipment production status discrimination system of the present invention;
FIG. 3 is a schematic diagram illustrating an exemplary process of the big data monitoring based oil and gas field equipment production status discrimination method of the present invention;
FIG. 4 is a schematic diagram illustrating an identification flow in a preferred embodiment of the present invention;
FIG. 5 is a schematic diagram showing the algorithm flow of the human vehicle alarm determination algorithm in a preferred embodiment of the present invention.
Skilled artisans appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help improve the understanding of the embodiments of the present invention.
Detailed Description
Exemplary embodiments of the present invention will be described hereinafter with reference to the accompanying drawings. In the interest of clarity and conciseness, not all features of an actual implementation are described in the specification. It will of course be appreciated that in the development of any such actual embodiment, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which will vary from one implementation to another. Moreover, it will be appreciated that such a development effort might be complex and time-consuming, but would nevertheless be a routine undertaking for those of ordinary skill in the art having the benefit of this disclosure.
It should be noted that, in order to avoid obscuring the present invention with unnecessary details, only the device structures and/or processing steps closely related to the solution according to the present invention are shown in the drawings, and other details not so relevant to the present invention are omitted.
The embodiment of the invention provides an oil and gas field equipment production state discrimination system based on big data monitoring, which comprises a storage unit, a video inspection control unit and a video analysis and identification unit. The storage unit is used for storing a predetermined database; the video inspection control unit is used for setting video inspection preset points and, after each inspection instruction is sent, adding a corresponding information frame to the database, the information frame comprising attribute information of the preset point, and the attribute information including at least the pumping unit information corresponding to the preset point; the video analysis and identification unit is used for obtaining the attribute information of the preset point from the database, analyzing and comparing the image of the preset point to determine the position of the pumping unit in the video picture, judging the working state of the pumping unit according to the motion track of the observation point, and returning the processed recognition result to the database; the video inspection control unit is also used for reading, from the database, the result returned by the video analysis and identification unit.
Fig. 1 shows a block diagram of an example of the production state discrimination system of oil and gas field equipment based on big data monitoring according to the invention.
As shown in FIG. 1, the oil and gas field equipment production state discrimination system based on big data monitoring comprises a storage unit 1, a video inspection control unit 2 and a video analysis and identification unit 3.
The storage unit 1 is used for storing a predetermined database. The database is used for storing attribute information obtained by routing inspection of each preset point, and the attribute information at least includes pumping unit information corresponding to the preset point, such as the name of a pumping unit, the angle of the pumping unit, the number of the pumping units, and the like.
In addition, the video inspection control unit 2 is configured to set a video inspection preset point location, and may add a corresponding information frame in the database after sending an inspection instruction each time, where the information frame includes attribute information of the preset point, such as a location name, a location of a focus observation target, and the like.
As an example, the attribute information of each preset point includes an inspection point data table and a pumping unit data table. The same pumping unit has two records, corresponding to infrared and visible light respectively.
The inspection point data table comprises one or more of the following items: the inspection point identifier; pan/tilt head information; the camera type; camera information; the pumping unit identifier array; and the observation range.
For example, the inspection point identifier may be a number or character string representing a separate warning zone, and it differs under infrared and visible light.
The pan/tilt information includes, for example, a direction angle, a pitch angle, and the like.
The camera types include, for example, a visible light type camera (as denoted by 0) and an infrared type camera (as denoted by 1).
The camera information may, for example, include the focal length.
In addition, the observation range of each preset point refers to the effective observation area in the current field of view; it may be defined, for example, as a rectangular area recorded by the pixel coordinates of its upper-left and lower-right corners.
The pumping unit identifier array corresponds, for example, to the records in the pumping unit data table that describe the pumping units within the observation range.
In addition, the pumping unit data table comprises one or more of: the pumping unit identifier, the identifier of the inspection point where it is located, the pumping unit type and the circumscribed rectangle of the pumping unit.
The pumping unit types include, for example, a beam pumping unit (as indicated by 0) and a tower pumping unit (as indicated by 1).
In addition, the circumscribed rectangle of the pumping unit refers to the circumscribed rectangle of the pumping unit in the current field of view; the pixel coordinates of its upper-left and lower-right corners can be recorded.
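For concreteness, the two data tables described above can be sketched as plain records. The following Python dataclasses are only an illustration of the fields listed here; the field names and types are assumptions, not the patent's actual database schema.

```python
# Illustrative sketch of the inspection point and pumping unit records (assumed field names).
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class PumpingUnitRecord:
    unit_id: str                  # pumping unit identifier
    inspection_point_id: str      # identifier of the inspection point where it is located
    unit_type: int                # 0: beam pumping unit, 1: tower pumping unit
    bounding_rect: Tuple[int, int, int, int]  # (x1, y1, x2, y2) pixel coordinates of the
                                              # circumscribed rectangle in the current field of view

@dataclass
class InspectionPointRecord:
    point_id: str                         # inspection point identifier (differs for infrared / visible)
    pan_tilt: Tuple[float, float]         # pan/tilt head information: (direction angle, pitch angle)
    camera_type: int                      # 0: visible light, 1: infrared
    focal_length_mm: float                # camera information (focal length)
    observation_range: Tuple[int, int, int, int]  # (x1, y1, x2, y2) of the effective observation area
    pumping_unit_ids: List[str] = field(default_factory=list)  # units within the observation range
```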
The video analysis and identification unit 3 is configured to obtain the attribute information of the preset point from the database, perform image analysis and comparison on the preset point to determine the position of the pumping unit in the video picture (when the preset point is set, the target pumping unit may be placed in the middle area of the video picture), determine the working state of the pumping unit according to the motion trajectory of an observation point (such as the horsehead), and return the processed recognition result to the database.
As an example, when judging the working state of the pumping unit, the position of an observation point (e.g., the horsehead) is first found in the video picture, the position is found again at intervals, and the two positions are compared to see whether a displacement has occurred. After multiple determinations, if displacement does occur the machine is judged to be working; if no displacement occurs over several determinations, the machine is judged to have stopped.
Further, as an example, the processing of the recognition result by the video analysis and recognition unit 3 may be to convert the video analysis result into a data state: for example, if the pumping unit is working normally, record work: 1; if it is not working, record work: 0.
In this way, the video inspection control unit 2 can read from the database the results returned by the video analysis and recognition unit 3. For example, the video inspection control unit 2 may perform a check at fixed intervals (e.g., every 8 seconds) and issue status notifications and records on the basis of a large number of determination results.
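As an illustration of this judgment logic, the following is a minimal Python sketch that locates the observation point by template matching and converts the displacement check into the work: 1 / work: 0 state described above. The function names, the use of OpenCV template matching and the displacement threshold are assumptions for illustration, not the patent's prescribed algorithm.

```python
# Minimal sketch: locate the observation point (e.g. the horsehead) in frames sampled
# at intervals and judge working (1) / stopped (0) from the observed displacement.
import cv2
import numpy as np

def locate_observation_point(frame, template):
    """Return the (x, y) position of the best template match in the frame."""
    result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(result)
    return max_loc

def judge_working_state(frames, template, min_displacement_px=3):
    """Return 1 if the observation point moves between checks, otherwise 0."""
    positions = [np.array(locate_observation_point(f, template)) for f in frames]
    displacements = [np.linalg.norm(b - a) for a, b in zip(positions, positions[1:])]
    # after multiple determinations: any clear displacement -> working, none -> stopped
    return 1 if any(d > min_displacement_px for d in displacements) else 0
```

The 0/1 value mirrors the work: 0 / work: 1 data state that the control unit reads back from the database.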
As an example, the system further comprises a model obtaining unit 4, as shown in fig. 2, the model obtaining unit 4 is used for constructing a recognition model for recognizing the working state of the pumping unit.
As an example, the working state of the pumping unit includes: idle running and normal running.
As an example, in the process of training the recognition model, the model obtaining unit 4 first identifies the relative position of the pumping unit in the image picture by using the database sample model (the database sample model is itself a trained sample), performs displacement offset correction with a data algorithm, recognizes the corrected pumping unit state at 1 frame per second, and then determines from the frame-by-frame image comparison whether the moving part of the pumping unit is moving.
For example, a reference picture can be prestored for the video picture of each preset point. Because the positioning accuracy of the camera's pan/tilt turntable has a certain error, the captured picture is slightly offset, which can affect the recognition result when detecting the pumping unit. In this case, the relative positional relationship among several points in the prestored image can be compared with that among the same points in the actual image, so as to determine the displacement between the two images.
In this way, the calibration frame used for recognition is shifted by the offset calculated in the previous step, so that it matches the target of interest in the actual image.
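A minimal sketch of this offset correction step is given below, assuming a pure translation between the prestored reference picture and the actual frame and using OpenCV phase correlation as one possible estimator. The patent itself only requires comparing relative point positions, so this particular estimator and the helper names are illustrative.

```python
# Minimal sketch: estimate the translation between the prestored reference image and the
# live frame, then shift the recognition calibration rectangle by that offset.
import cv2
import numpy as np

def estimate_offset(reference_gray, actual_gray):
    """Estimate the (dx, dy) translation of the actual image relative to the reference image."""
    ref = np.float32(reference_gray)
    cur = np.float32(actual_gray)
    (dx, dy), _response = cv2.phaseCorrelate(ref, cur)
    return dx, dy

def shift_calibration_rect(rect, dx, dy):
    """Move the calibration rectangle (x1, y1, x2, y2) by the estimated offset."""
    x1, y1, x2, y2 = rect
    return (int(x1 + dx), int(y1 + dy), int(x2 + dx), int(y2 + dy))
```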
In another aspect of the invention, a method for discriminating the production state of oil and gas field equipment based on big data monitoring is also provided. As shown in fig. 3, the method comprises: step one, setting video inspection preset points and, after each inspection instruction is sent, adding a corresponding information frame to a database, the information frame comprising attribute information of the preset point, and the attribute information including at least the pumping unit information corresponding to the preset point; step two, obtaining the attribute information of the preset point from the database, analyzing and comparing images of the preset point to determine the position of the pumping unit in the video picture, and judging the working state of the pumping unit according to the motion track of the observation point; step three, processing the recognized result and returning it to the database; step four, reading from the database the result returned by the video analysis and identification unit; and step five, repeating steps one to four at the next video inspection.
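To make the information-frame round trip of steps one to four concrete, the sketch below uses SQLite purely for illustration (the patent does not name a database engine); the table and column names are assumptions.

```python
# Illustrative round trip: the inspection control unit inserts an information frame,
# the analysis unit writes back a work state, and the control unit reads the result.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE info_frame (
        frame_id   INTEGER PRIMARY KEY AUTOINCREMENT,
        point_id   TEXT,      -- preset inspection point identifier
        unit_id    TEXT,      -- pumping unit at that point
        work_state INTEGER    -- NULL until the analysis unit writes 0 or 1
    )""")

# Step one: the inspection control unit adds an information frame for the preset point.
conn.execute("INSERT INTO info_frame (point_id, unit_id) VALUES (?, ?)", ("P001", "U17"))

# Step three: the analysis unit returns the recognized state (1: working, 0: idle/stopped).
conn.execute("UPDATE info_frame SET work_state = 1 WHERE point_id = ? AND unit_id = ?",
             ("P001", "U17"))

# Step four: the control unit reads the returned result.
state = conn.execute("SELECT work_state FROM info_frame WHERE point_id = ?",
                     ("P001",)).fetchone()[0]
print(state)  # -> 1
```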
As an example, the attribute information of each preset point includes an inspection point data table and a pumping unit data table.
The inspection point data table comprises one or more of the following items: the inspection point identifier; pan/tilt head information; the camera type; camera information; the pumping unit identifier array; and the observation range.
In addition, the pumping unit data table comprises one or more of: the pumping unit identifier, the identifier of the inspection point where it is located, the pumping unit type and the circumscribed rectangle of the pumping unit.
As an example, the method further comprises: and constructing an identification model for identifying the working state of the pumping unit.
As an example, the working state of the pumping unit includes: idle running and normal running.
As an example, in the process of training the recognition model, a database sample model is used to identify the relative position of the pumping unit in the image picture, displacement offset correction is performed with a data algorithm, the corrected pumping unit state is recognized at 1 frame per second, and whether the moving part of the pumping unit is moving is judged from the frame-by-frame image comparison.
Referring to the recognition flowchart shown in fig. 4 and the processing principle shown in fig. 5, a preferred embodiment of the present invention is described below.
Conventional video monitoring cannot intelligently identify whether an intrusion has occurred: line-crossing and intrusion alarms cannot identify the target, so it is impossible to tell whether the intruder is a person, a vehicle or an animal. Moreover, conventional video is mostly deployed one-to-one, i.e., one camera faces one pumping well. The invention uses a 750 mm to 1000 mm telephoto lens to collect video images and can cover multiple oil wells within a radius of 3 km. Through deep learning, the data server can extract learned targets such as people and vehicles from pictures with complex backgrounds and accurately report the type of the intrusion target.
In this example, this can be realized by, for example, the following steps one to five.
In the first step, the video inspection control unit plans the video inspection preset points and inserts an information frame into the database after each inspection instruction is sent. The information frame contains the attribute information of the inspection point, such as the point name and the location of the key observation target.
In the second step, the video analysis and identification unit reads the attribute information from the database and performs image analysis and comparison on the preset point, identifying whether an intrusion target exists and, if so, what kind of object has entered the warning area.
In the third step, the video analysis and identification unit processes the recognized result and returns it to the database.
In the fourth step, the video inspection control unit reads from the database the result returned by the video analysis and identification unit, and raises an alarm if the result exceeds a preset alarm threshold.
In the fifth step, at the next video inspection, the first to fourth steps are repeated.
Thus, referring to the identification flow chart, the identification software can read the video file and select representative picture frames and target objects for calibration; human-computer interactive calibration is supported; the calibration result can be directly used for training the deep learning model.
In order to improve the judgment precision, a routing inspection parameter definition method is designed, and comprises the following data tables:
The inspection point data table includes:
Inspection point identifier (a number or character string representing a separate warning zone, different under infrared and visible light)
Pan/tilt head information (direction angle, pitch angle)
Camera type (0: visible light; 1: infrared)
Camera information (focal length)
Observation range (the effective observation area in the current field of view, defined as a rectangular area; the pixel coordinates of the upper-left and lower-right corners can be recorded)
Pumping unit identifier array (corresponding to the records in the pumping unit data table that describe the pumping units within the observation range)
In addition, the pumping unit data table (two records for the same pumping unit, corresponding to infrared and visible light) includes:
Pumping unit identifier
Identifier of the inspection point where it is located
Pumping unit type (0: beam pumping unit; 1: tower pumping unit)
Circumscribed rectangle of the pumping unit (the circumscribed rectangle of the pumping unit in the current field of view; the pixel coordinates of the upper-left and lower-right corners can be recorded)
In addition, to build the deep learning sample library, representative images are selected from 300,000 frames of monitoring images, the shape of the pumping unit under infrared and visible light is labeled with calibration software, and some person and vehicle image samples are imported from an open image library and labeled as well, generating the sample library.
For the deep learning model, the PASCAL VOC data set provides a standardized, high-quality data set format for image recognition and classification; a VOC-format data set tailored to the user's specific detection purpose is constructed and used to train the network to recognize the targets of interest.
The VOC data set mainly comprises three folders of Annotations/ImageSets/JPEGImages:
JPEGImages: contains all picture data, including training and test pictures. The images are named in "number.jpg" format, where the number is six digits. These images are the data used for training and test validation.
Annotations: stores label files in xml format; each xml records the label information of one picture and shares the picture's name.
The pumping unit head is boxed in each frame of the video, and information such as the box position, object class, picture name and size is written into an xml file corresponding to the picture name in JPEGImages. Taking 000001.jpg as an example, two pumping unit heads are boxed and the class is named machine, which gives the corresponding 000001.xml.
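The xml content itself is not reproduced here; the following Python sketch shows how such a Pascal VOC style annotation file could be generated for a frame with two boxed pumping unit heads of class machine. The element layout follows the standard VOC format, and the coordinate values are illustrative.

```python
# Minimal sketch: write one Pascal VOC style annotation file for a labeled frame.
import xml.etree.ElementTree as ET

def write_voc_annotation(filename, image_name, width, height, boxes):
    root = ET.Element("annotation")
    ET.SubElement(root, "folder").text = "JPEGImages"
    ET.SubElement(root, "filename").text = image_name   # e.g. "000001.jpg"
    size = ET.SubElement(root, "size")
    ET.SubElement(size, "width").text = str(width)
    ET.SubElement(size, "height").text = str(height)
    ET.SubElement(size, "depth").text = "3"
    for name, (xmin, ymin, xmax, ymax) in boxes:
        obj = ET.SubElement(root, "object")
        ET.SubElement(obj, "name").text = name           # e.g. "machine"
        bnd = ET.SubElement(obj, "bndbox")
        ET.SubElement(bnd, "xmin").text = str(xmin)
        ET.SubElement(bnd, "ymin").text = str(ymin)
        ET.SubElement(bnd, "xmax").text = str(xmax)
        ET.SubElement(bnd, "ymax").text = str(ymax)
    ET.ElementTree(root).write(filename)

# Illustrative call with assumed coordinates for two boxed pumping unit heads:
write_voc_annotation("000001.xml", "000001.jpg", 1920, 1080,
                     [("machine", (100, 200, 300, 400)),
                      ("machine", (900, 180, 1100, 380))])
```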
in order to prevent overfitting of the model and make the model more robust, data augmentation needs to be applied to perform data expansion. It is possible to use: 1) noise increase: denoising the image by using Gaussian noise; 2) fuzzy processing; 3) turning: including horizontal flipping and vertical flipping.
ImageSets: the Main subfolder contains four txt files, each of which records a list of picture numbers.
Faster R-CNN: training your own data set
Faster R-CNN is an optimized, accelerated version of R-CNN and Fast R-CNN. Target detection is divided into four steps: 1) candidate region generation; 2) feature extraction; 3) target classification; 4) refinement of the boxed regions. The figure compares the structures of the R-CNN, Fast R-CNN and Faster R-CNN frameworks:
under an Ubuntu16.04 system, a Caffe frame is built, a GPU is used for accelerating calculation of a deep neural network, and a fast R-CNN method is used for training a VOC data set of the user. For example, a ZF model can be selected as a pre-training model, and an alternative training (alt _ opt) can be selected as a training mode. Finally, a preliminary model (ZF _ false _ rcnn _ final. ca ffemodel) is obtained
According to different pumping unit types and routing inspection parameters, the pumping unit state judgment process integrating two methods of deep learning object identification and dynamic target detection is designed in the embodiment, as shown in fig. 5.
The process first identifies the relative position of the pumping unit in the image picture using the database sample model, performs displacement offset correction with a data algorithm, recognizes the corrected pumping unit state at 1 frame per second, then determines from the frame-by-frame image comparison whether the moving part of the pumping unit is moving, and makes a judgment once every 8 seconds.
Therefore, the training model, the algorithm and the inspection parameters are integrated, the real-time video streams and the database are accessed, and the alarm result is displayed and output by the video inspection unit on the existing big data platform.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; although the present invention and the advantageous effects thereof have been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions.

Claims (6)

1. The oil and gas field equipment production state discrimination system based on big data monitoring is characterized by comprising a storage unit, a video inspection control unit and a video analysis and identification unit;
the storage unit is used for storing a preset database;
the video inspection control unit is used for setting video inspection preset points, adding corresponding information frames in the database after an inspection instruction is sent each time, wherein the information frames comprise attribute information of the preset points, and the attribute information at least comprises pumping unit information corresponding to the preset points;
the video analysis and identification unit is used for acquiring the attribute information of the preset point in the database, analyzing and comparing the image of the preset point to determine the position of the pumping unit in a video picture, judging the working state of the pumping unit according to the motion track of the observation point, and returning the identified result to the database after processing;
the video inspection control unit is also used for reading the result returned by the video analysis and identification unit in the database;
the video analysis and identification unit uses a telephoto lens of 750 mm to 1000 mm and covers a plurality of oil wells within a radius of 3 km;
the oil pumping unit further comprises a model obtaining unit, wherein the model obtaining unit is used for constructing an identification model for identifying the working state of the oil pumping unit;
the attribute information of each preset point comprises an inspection point data table and a pumping unit data table, wherein the pumping unit data table comprises the circumscribed rectangle of the pumping unit; the circumscribed rectangle of the pumping unit is the circumscribed rectangle of the pumping unit in the current field of view, recorded by the pixel coordinates of its upper-left and lower-right corners;
in the process of training the identification model, the model obtaining unit first identifies the relative position of the pumping unit in the image picture by using a database sample model, performs displacement offset correction with a data algorithm, recognizes the corrected pumping unit state at 1 frame per second, and judges from the frame-by-frame image comparison whether the moving part of the pumping unit is moving;
a reference picture is prestored for the video picture of each preset point, and the offset between the original image and the actual image is determined by comparing the relative positional relationship of a plurality of points in the prestored picture with that of the corresponding points in the actual image.
2. The oil and gas field equipment production state discrimination system based on big data monitoring as claimed in claim 1,
the inspection point data table comprises one or more of: an inspection point identifier, pan/tilt head information, camera type, camera information, observation range and a pumping unit identifier array;
the pumping unit data table further comprises one or more of: a pumping unit identifier, the identifier of the inspection point where it is located, and the pumping unit type.
3. The oil and gas field equipment production state discrimination system based on big data monitoring as claimed in claim 2, wherein the working state of the pumping unit comprises: idle running and normal running.
4. The method for distinguishing the production state of the oil and gas field equipment based on big data monitoring is characterized by comprising the following steps of:
the method comprises the steps that firstly, video inspection preset point positions are set, corresponding information frames are added in a database after an inspection instruction is sent each time, the information frames comprise attribute information of the preset points, and the attribute information at least comprises pumping unit information corresponding to the preset points;
secondly, acquiring attribute information of the preset point in the database, analyzing and comparing images of the preset point to determine the position of the pumping unit in a video picture, and judging the working state of the pumping unit according to the motion track of the observation point;
step three, the recognized result is returned to the database after being processed;
reading the result returned by the video analysis and identification unit in the database;
step five, repeating the step one to the step four when next video inspection is carried out;
the method further comprises the following steps:
constructing an identification model for identifying the working state of the pumping unit;
acquiring video images with a telephoto lens of 750 mm to 1000 mm, and covering a plurality of oil wells within a radius of 3 km;
the attribute information of each preset point comprises an inspection point data table and a pumping unit data table, wherein the pumping unit data table comprises the circumscribed rectangle of the pumping unit; the circumscribed rectangle of the pumping unit is the circumscribed rectangle of the pumping unit in the current field of view, recorded by the pixel coordinates of its upper-left and lower-right corners;
in the process of training the recognition model, a database sample model is used to identify the relative position of the pumping unit in the image picture, displacement offset correction is performed with a data algorithm, the corrected pumping unit state is recognized at 1 frame per second, and whether the moving part of the pumping unit is moving is judged from the frame-by-frame image comparison;
a reference picture is prestored for the video picture of each preset point, and the offset between the original image and the actual image is determined by comparing the relative positional relationship of a plurality of points in the prestored picture with that of the corresponding points in the actual image.
5. The oil and gas field equipment production state discrimination method based on big data monitoring as claimed in claim 4, wherein the attribute information of each preset point comprises an inspection point data table and a pumping unit data table:
the inspection point data table comprises one or more of: an inspection point identifier, pan/tilt head information, camera type, camera information, observation range and a pumping unit identifier array;
the pumping unit data table comprises one or more of: a pumping unit identifier, the identifier of the inspection point where it is located, the pumping unit type and the circumscribed rectangle of the pumping unit.
6. The oil and gas field equipment production state discrimination method based on big data monitoring as claimed in claim 5, wherein the working state of the pumping unit comprises: idle running and normal running.
CN201910542449.3A 2019-06-21 2019-06-21 Oil-gas field equipment production state distinguishing system and method based on big data monitoring Active CN110246157B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910542449.3A CN110246157B (en) 2019-06-21 2019-06-21 Oil-gas field equipment production state distinguishing system and method based on big data monitoring

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910542449.3A CN110246157B (en) 2019-06-21 2019-06-21 Oil-gas field equipment production state distinguishing system and method based on big data monitoring

Publications (2)

Publication Number Publication Date
CN110246157A CN110246157A (en) 2019-09-17
CN110246157B true CN110246157B (en) 2020-09-04

Family

ID=67888728

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910542449.3A Active CN110246157B (en) 2019-06-21 2019-06-21 Oil-gas field equipment production state distinguishing system and method based on big data monitoring

Country Status (1)

Country Link
CN (1) CN110246157B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110889395B (en) * 2019-12-12 2023-08-15 广州中科永信科技有限公司 Machine learning-based mechanical motion recognition method and system
CN111611953B (en) * 2020-05-28 2021-01-29 北京富吉瑞光电科技股份有限公司 Target feature training-based oil pumping unit identification method and system
CN112528937A (en) * 2020-12-22 2021-03-19 嘉洋智慧安全生产科技发展(北京)有限公司 Method for detecting starting and stopping of video pumping unit
CN113807389A (en) * 2021-08-03 2021-12-17 嘉洋智慧安全生产科技发展(北京)有限公司 Method and device for determining target object dynamic state and electronic equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101667020A (en) * 2009-09-25 2010-03-10 山东汉和能源科技有限公司 Remote measurement and control terminal of oil pumping unit
CN104516335A (en) * 2013-09-26 2015-04-15 罗斯蒙特公司 Wireless industrial process field device with imaging
CN104899675A (en) * 2015-04-29 2015-09-09 高志亮 Design method based on oil field Internet of Things engineering
CN106121627A (en) * 2016-06-03 2016-11-16 山东天工石油装备有限公司 A kind of oil pumper displacement acquisition method based on image recognition
CN106247000A (en) * 2016-08-09 2016-12-21 曲阜师范大学 A kind of ball-and-seat analyzing oil well liquid-supplying situation and the analysis method of oil well liquid-supplying situation

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102121365A (en) * 2011-03-09 2011-07-13 黑龙江八一农垦大学 System for wirelessly acquiring data and monitoring working condition of pumping unit of oil field
CN106998447B (en) * 2017-03-31 2018-05-11 大庆安瑞达科技开发有限公司 Wide area, oil field infrared panorama imaging radar scout command and control system
CN108952673B (en) * 2018-06-22 2023-09-26 中国石油天然气股份有限公司 Method and device for checking working condition of oil pumping well
CN109403928B (en) * 2018-11-23 2021-05-04 徐州东方传动机械股份有限公司 Intelligent pumping unit monitoring system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101667020A (en) * 2009-09-25 2010-03-10 山东汉和能源科技有限公司 Remote measurement and control terminal of oil pumping unit
CN104516335A (en) * 2013-09-26 2015-04-15 罗斯蒙特公司 Wireless industrial process field device with imaging
CN104899675A (en) * 2015-04-29 2015-09-09 高志亮 Design method based on oil field Internet of Things engineering
CN106121627A (en) * 2016-06-03 2016-11-16 山东天工石油装备有限公司 A kind of oil pumper displacement acquisition method based on image recognition
CN106247000A (en) * 2016-08-09 2016-12-21 曲阜师范大学 A kind of ball-and-seat analyzing oil well liquid-supplying situation and the analysis method of oil well liquid-supplying situation

Also Published As

Publication number Publication date
CN110246157A (en) 2019-09-17

Similar Documents

Publication Publication Date Title
CN110246157B (en) Oil-gas field equipment production state distinguishing system and method based on big data monitoring
Zhou et al. Automatic detection method of tunnel lining multi‐defects via an enhanced You Only Look Once network
CN107977656A (en) A kind of pedestrian recognition methods and system again
CN111126122B (en) Face recognition algorithm evaluation method and device
US10235576B2 (en) Analysis method of lane stripe images, image analysis device, and non-transitory computer readable medium thereof
CN111310826B (en) Method and device for detecting labeling abnormality of sample set and electronic equipment
US20120281918A1 (en) Method for dynamically setting environmental boundary in image and method for instantly determining human activity
CN112070135A (en) Power equipment image detection method and device, power equipment and storage medium
CN116363125B (en) Deep learning-based battery module appearance defect detection method and system
CN110263719B (en) Artificial intelligence oil and gas field prevention and judgment system and method based on big data monitoring
CN111507332A (en) Vehicle VIN code detection method and equipment
Hakim et al. Implementation of an image processing based smart parking system using Haar-Cascade method
Yeum Computer vision-based structural assessment exploiting large volumes of images
CN103279904A (en) Mobile terminal and building information processing method thereof
CN113269081A (en) System and method for automatic personnel identification and video track query
CN113239854A (en) Ship identity recognition method and system based on deep learning
Kiew et al. Vehicle route tracking system based on vehicle registration number recognition using template matching algorithm
CN112364687A (en) Improved Faster R-CNN gas station electrostatic sign identification method and system
CN111047731A (en) AR technology-based telecommunication room inspection method and system
CN111126286A (en) Vehicle dynamic detection method and device, computer equipment and storage medium
CN110689028A (en) Site map evaluation method, site survey record evaluation method and site survey record evaluation device
CN115953744A (en) Vehicle identification tracking method based on deep learning
Vilgertshofer et al. Recognising railway infrastructure elements in videos and drawings using neural networks
CN112052824A (en) Gas pipeline specific object target detection alarm method, device and system based on YOLOv3 algorithm and storage medium
CN111046878B (en) Data processing method and device, computer storage medium and computer

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200327

Address after: 163316 21 Floor, Gate 3, Block A, No. 2, Xinxing Street, Daqing High-tech Zone, Heilongjiang Province

Applicant after: DAQING ANRUIDA TECHNOLOGY DEVELOPMENT Co.,Ltd.

Applicant after: BEIJING FJR OPTOELECTRONIC TECHNOLOGY Co.,Ltd.

Address before: 163316 21 Floor, Gate 3, Block A, No. 2, Xinxing Street, Daqing High-tech Zone, Heilongjiang Province

Applicant before: DAQING ANRUIDA TECHNOLOGY DEVELOPMENT Co.,Ltd.

GR01 Patent grant
CP03 Change of name, title or address

Address after: Room 205, building D-1, Service Outsourcing Industrial Park, No. 4-8 Xinfeng Road, high tech Zone, Daqing City, Heilongjiang Province

Patentee after: DAQING ANRUIDA TECHNOLOGY DEVELOPMENT Co.,Ltd.

Patentee after: Beijing fujirui Optoelectronic Technology Co.,Ltd.

Address before: 163316 21 Floor, Gate 3, Block A, No. 2, Xinxing Street, Daqing High-tech Zone, Heilongjiang Province

Patentee before: DAQING ANRUIDA TECHNOLOGY DEVELOPMENT Co.,Ltd.

Patentee before: BEIJING FJR OPTOELECTRONIC TECHNOLOGY Co.,Ltd.