CN115393700A - Water level height identification method and device, electronic equipment and storage medium - Google Patents

Water level height identification method and device, electronic equipment and storage medium

Info

Publication number
CN115393700A
CN115393700A (application CN202210895717.1A)
Authority
CN
China
Prior art keywords
image
scale
water level
height
information
Prior art date
Legal status
Pending
Application number
CN202210895717.1A
Other languages
Chinese (zh)
Inventor
祁奕霏
高小宏
陆江辉
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202210895717.1A priority Critical patent/CN115393700A/en
Publication of CN115393700A publication Critical patent/CN115393700A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/05 Underwater scenes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/62 Text, e.g. of license plates, overlay texts or captions on TV images
    • G06V20/63 Scene text, e.g. street names
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00 Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10 Character recognition
    • G06V30/14 Image acquisition
    • G06V30/148 Segmentation of character regions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00 Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10 Character recognition
    • G06V30/19 Recognition using electronic means
    • G06V30/191 Design or setup of recognition systems or techniques; Extraction of features in feature space; Clustering techniques; Blind source separation
    • G06V30/19147 Obtaining sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/02 Recognising information on displays, dials, clocks

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biophysics (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Measurement Of Levels Of Liquids Or Fluent Solid Materials (AREA)

Abstract

The embodiments of the present application disclose a water level height identification method and device, an electronic device, and a storage medium. The method includes: extracting an image to be identified from a collected water gauge video stream; performing target detection on the image to be identified to obtain the image region above the water surface, and taking that region as the water gauge image; identifying the scale information above the water surface in the water gauge image, and determining initial water level height information of the water surface based on the scale information; determining the pixel height between the adjacent scale and the water surface, taking the determined pixel height as the adjacent pixel height, and determining the measurement error corresponding to the initial water level height information based on the ratio of the adjacent pixel height to the average pixel height of the water gauge scales; and adjusting the initial water level height information according to the measurement error to obtain the water level height information. The method disclosed in the embodiments of the present application can improve the accuracy of water level height identification.

Description

Water level height identification method and device, electronic equipment and storage medium
Technical Field
The application relates to the technical field of data processing, in particular to a water level height identification method and device, electronic equipment and a storage medium.
Background
The water level is a basic hydrological factor of rivers, lakes and reservoirs. The real-time water level is an important reference for the decisions of flood-prevention departments in mountain flood disaster prevention, river basin flood control and urban flood control, and water level monitoring data are also an important index for the supervision, development and utilization of water resources. Besides manual observation and reading, the water levels of rivers, lakes and reservoirs are monitored by automatic water level gauges, mainly of the float, pressure, bubble, ultrasonic and radar types; water levels can also be monitored by traditional image recognition methods. These methods have many defects: manual monitoring raises safety problems, is labor-intensive and has a low degree of automation; the various automatic water level gauges are costly, easily affected by the environment, difficult to install and expensive to maintain; and traditional image identification methods have large errors due to inaccurate scale identification and the like.
Disclosure of Invention
In order to solve the above technical problem, embodiments of the present application provide a water level height identification method, a water level height identification device, an electronic apparatus, and a computer-readable storage medium, which can improve the accuracy of water level height identification.
Other features and advantages of the present application will be apparent from the following detailed description, or may be learned by practice of the application.
According to an aspect of the embodiments of the present application, a water level height identification method is provided, including: extracting an image to be identified from a collected water gauge video stream; performing target detection on the image to be identified to obtain the image region above the water surface, and taking that region as the water gauge image; identifying scale information above the water surface in the water gauge image, and determining initial water level height information of the water surface based on the scale information, wherein the scale information includes an adjacent scale, which is the scale nearest to the water surface; determining the pixel height between the adjacent scale and the water surface, taking the determined pixel height as the adjacent pixel height, and determining the measurement error corresponding to the initial water level height information based on the ratio of the adjacent pixel height to the average pixel height of the water gauge scales; and adjusting the initial water level height information according to the measurement error to obtain the water level height information.
According to an aspect of the embodiments of the present application, a water level height identification device is provided, including: an extraction module for extracting an image to be identified from the collected water gauge video stream; a target detection module for performing target detection on the image to be identified to obtain the image region above the water surface and taking that region as the water gauge image; an identification module for identifying scale information above the water surface in the water gauge image and determining initial water level height information of the water surface based on the scale information, wherein the scale information includes an adjacent scale, which is the scale nearest to the water surface; a first determining module for determining the pixel height between the adjacent scale and the water surface, taking the determined pixel height as the adjacent pixel height, and determining the measurement error corresponding to the initial water level height information based on the ratio of the adjacent pixel height to the average pixel height of the water gauge scales; and an adjusting module for adjusting the initial water level height information according to the measurement error to obtain the water level height information.
According to an aspect of the embodiments of the present application, there is provided an electronic device, including a processor and a memory, where the memory stores computer readable instructions, and the computer readable instructions, when executed by the processor, implement the above water level height identification method.
According to an aspect of embodiments of the present application, there is provided a computer-readable storage medium having stored thereon computer-readable instructions which, when executed by a processor of a computer, cause the computer to execute the water level height identifying method as previously provided.
According to an aspect of embodiments herein, there is provided a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the water level height identification method provided in the various alternative embodiments described above.
In the technical solution provided by the embodiments of the present application, the measurement error corresponding to the initial water level height information is determined based on the ratio of the adjacent pixel height (the pixel distance between the adjacent scale and the water surface) to the average pixel height of the water gauge scales, and the initial water level height information is adjusted according to this measurement error to obtain the water level height information. In this way, the measurement error in the water level height can be eliminated when the water surface falls between two scales of the water gauge, or when the scale actually nearest to the water surface is not the one recognized as the adjacent scale, thereby improving the accuracy of water level height identification.
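As an illustrative sketch (not part of the claimed implementation), the adjustment described above can be expressed in a few lines of Python; the function name, the 1 cm scale interval and the sample numbers are all hypothetical:

```python
def correct_water_level(initial_level_cm: float,
                        adjacent_px: float,
                        avg_scale_px: float,
                        scale_interval_cm: float = 1.0) -> float:
    """Adjust the initial reading (the value of the adjacent scale, i.e. the
    scale nearest above the water surface) by the measurement error derived
    from the ratio of the adjacent pixel height to the average pixel height
    of one scale interval on the water gauge."""
    # Fraction of one scale interval separating the adjacent scale from the waterline
    error_cm = (adjacent_px / avg_scale_px) * scale_interval_cm
    # The water surface lies below the adjacent scale by error_cm
    return initial_level_cm - error_cm

# Example: the nearest visible scale reads 120 cm, sits 4 px above the
# waterline, and one 1 cm interval averages 10 px in the image.
level = correct_water_level(120.0, 4.0, 10.0)  # 120 - 0.4 = 119.6 cm
```

This shows how a sub-interval correction finer than the gauge's 1 cm resolution can be recovered from pixel ratios alone.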
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application. It is obvious that the drawings in the following description are only some embodiments of the application, and that for a person skilled in the art, other drawings can be derived from them without inventive effort. In the drawings:
FIG. 1 is a schematic illustration of an implementation environment to which the present application relates, as shown in an exemplary embodiment of the present application;
FIG. 2 is a schematic illustration of an implementation environment to which the present invention relates, as shown in another exemplary embodiment of the present application;
FIG. 3 is a flow chart illustrating a water level identification method according to an exemplary embodiment of the present application;
FIG. 4 is a flow chart illustrating a water level height identification method according to an exemplary embodiment based on the embodiment shown in FIG. 3;
FIG. 5 is a flow chart illustrating a water level height identification method according to an exemplary embodiment based on the embodiment shown in FIG. 4;
FIG. 6 is a flow chart illustrating a water level height identification method according to an exemplary embodiment based on the embodiment shown in FIG. 5;
FIG. 7 is a flow chart illustrating a water level height identification method according to an exemplary embodiment based on the embodiment shown in FIG. 6;
FIG. 8 is a flowchart of step S103 in the embodiment shown in FIG. 3 in an exemplary embodiment;
FIG. 9 is a flowchart of step S103 in the embodiment shown in FIG. 3 in an exemplary embodiment;
FIG. 10 is a flowchart of step S103 in the embodiment shown in FIG. 3 in another exemplary embodiment;
FIG. 11 is a diagram illustrating a water level height identification method according to an exemplary embodiment of the present application;
FIG. 12 is a diagram illustrating a water level height identification method according to an exemplary embodiment of the present application;
FIG. 13 is a block diagram of a water level height identification apparatus shown in an exemplary embodiment of the present application;
FIG. 14 illustrates a schematic structural diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present application.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. I.e. these functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor means and/or microcontroller means.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the contents and operations/steps, nor do they necessarily have to be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
It should also be noted that: reference to "a plurality" in this application means two or more. "And/or" describes the association relationship of associated objects and means that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
It is understood that, in the specific implementations of the present application, data related to users, such as the result data of game play and the interaction data of the first type candidate account and the account to be recommended in game play, require the permission or consent of the users when the above embodiments of the present application are applied to specific products or technologies, and the collection, use and processing of the related data must comply with the relevant laws, regulations and standards of the relevant countries and regions.
It is understood that, in the specific implementations of the present application, data related to user information, such as the source video and the client decoding capability, require user permission or consent when the above embodiments of the present application are applied to specific products or technologies, and the collection, use and processing of the related data must comply with the relevant laws, regulations and standards of the relevant countries and regions.
It is understood that, in the specific implementations of the present application, related data such as water gauge images and images to be identified require the permission or consent of the relevant organizations when the above embodiments of the present application are applied to specific products or technologies, and the collection, use and processing of the related data must comply with the relevant laws, regulations and standards of the relevant countries and regions.
Some of the terms used in this application are explained first:
Water gauge: a graduated staff used for observing the rise and fall of the water surface. Water gauges are widely applied in projects such as soft soil foundation treatment and ports and wharfs, and are used for observing changes of the underground water level and the ocean tide level.
Water depth: the distance from the water surface to the river bottom.
Absolute base plane: a datum plane that takes the elevation of a characteristic sea level at a certain location as the zero-elevation plane.
Station base plane: a datum plane that takes the elevation of a specific point as the zero point for calculating the water level.
Water level: the elevation of the free water surface relative to a certain base plane. The base plane used for calculating the water level can be an absolute base plane, usually the Yellow Sea level; a station base plane may also be used.
It should be noted that Artificial Intelligence (AI) is a theory, method, technique, and application system that uses a digital computer, or a machine controlled by a digital computer, to simulate, extend, and expand human intelligence, perceive the environment, acquire knowledge, and use that knowledge to obtain the best results. In other words, artificial intelligence is a comprehensive branch of computer science that attempts to understand the essence of intelligence and to produce new intelligent machines that can react in a manner similar to human intelligence. Artificial intelligence studies the design principles and implementation methods of various intelligent machines, so that the machines have the functions of perception, reasoning, and decision-making.
Artificial intelligence is a comprehensive discipline involving a wide range of fields, at both the hardware level and the software level. Basic artificial intelligence technologies generally include sensors, dedicated artificial intelligence chips, cloud computing, distributed storage, big data processing, operation/interaction systems, mechatronics, and the like. Artificial intelligence software technologies mainly include computer vision, speech processing, natural language processing, and machine learning/deep learning.
Computer Vision (CV) is the science of how to make machines "see": replacing human eyes with cameras and computers to identify and measure targets, and further processing the images so that they become more suitable for human observation or for transmission to instruments for detection. As a scientific discipline, computer vision studies theories and techniques that attempt to build artificial intelligence systems capable of capturing information from images or multidimensional data. Computer vision technologies generally include image processing, image recognition, image semantic understanding, image retrieval, OCR, video processing, video semantic understanding, video content/behavior recognition, three-dimensional object reconstruction, 3D technologies, virtual reality, augmented reality, simultaneous localization and mapping, and other technologies, as well as common biometric technologies such as face recognition and fingerprint recognition.
Machine Learning (ML) is a multi-domain interdisciplinary subject involving probability theory, statistics, approximation theory, convex analysis, algorithm complexity theory, and other disciplines. It studies how computers can simulate or implement human learning behavior to acquire new knowledge or skills and reorganize existing knowledge structures so as to continuously improve their performance. Machine learning is the core of artificial intelligence and the fundamental way to make computers intelligent; it is applied in all fields of artificial intelligence. Machine learning and deep learning generally include techniques such as artificial neural networks, belief networks, reinforcement learning, transfer learning, inductive learning, and learning from instruction.
The water level height identification method and apparatus, the electronic device, and the computer-readable storage medium according to the embodiments of the present application relate to artificial intelligence technology and computer vision technology, and the embodiments will be described in detail below.
FIG. 1 is a schematic illustration of an implementation environment to which the present application relates, as shown in an exemplary embodiment. As shown in FIG. 1, the implementation environment at least includes a water gauge 13 installed in a water body, an image acquisition device 12, and a terminal device 11. The water gauge 13 is used for measuring the water level height of the water body; it may be an electronic water gauge (including water gauges using ultrasonic or radar measurement technology, which are costly), a stainless steel water gauge, a stainless steel combination water gauge, a polymer water gauge, an aluminum plate reflective water gauge, an enamel water gauge, and the like, which is not specifically limited here. The image acquisition device 12 is used for acquiring a water body image containing information of the water gauge 13 and transmitting the acquired image to the terminal device 11 through a data communication link, so that the terminal device 11 analyzes the water body image to obtain the water level height of the water body.
Illustratively, the terminal device 11 is configured to: identify the scale information above the water surface in the water gauge image and determine initial water level height information of the water surface based on the scale information, the scale information including the adjacent scale, i.e. the scale closest to the water surface; determine the adjacent pixel height between the adjacent scale and the water surface; determine the measurement error corresponding to the initial water level height information based on the ratio of the adjacent pixel height to the average pixel height of the water gauge scales; and adjust the initial water level height information according to the measurement error to obtain the water level height information.
The terminal device 11 may be a smart phone, a tablet computer, a PC (Personal Computer), an intelligent voice interaction device, a smart household appliance, a vehicle-mounted terminal, an aircraft, or other electronic devices, which is not limited here. The server 30 may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, web services, cloud communication, middleware services, domain name services, security services, CDN (Content Delivery Network), big data, and artificial intelligence platforms, which is likewise not limited here.
FIG. 2 is a schematic illustration of an implementation environment to which the present application relates, as shown in another exemplary embodiment. As shown in FIG. 2, this implementation environment at least includes a water gauge 23 installed in a water body, a central image acquisition device 22, an edge image acquisition device 23, and a terminal device 21. The central image acquisition device 22 is communicatively connected to the edge image acquisition device 23 and to the terminal device 21. The central image acquisition device 22 and the edge image acquisition device 23 photograph the water body containing information of the water gauge 23 from different angles or directions to obtain water body images from different angles; the edge image acquisition device 23 transmits its acquired water body images to the central image acquisition device 22 through a communication link, and the central image acquisition device 22 sends its own acquired water body images, together with those from the edge image acquisition device 23, to the terminal device 21. It can be understood that the number of image acquisition devices is not limited in this embodiment and can be changed flexibly according to the actual application scenario.
In this embodiment, the water gauge 23 is used for measuring the water level of the water body and may be an electronic water gauge, a stainless steel combination water gauge, a polymer water gauge, an aluminum plate reflective water gauge, an enamel water gauge, or the like, which is not specifically limited here. The central image acquisition device 22 and the edge image acquisition device 23 are used for acquiring water body images at different angles containing information of the water gauge 23 and transmitting the acquired images to the terminal device 21 through a data communication link, so that the terminal device 21 analyzes the water body images to obtain the water level height of the water body.
In this embodiment, by providing the central image acquisition device 22 and the edge image acquisition device 23 to acquire water body images containing information of the water gauge 23 from different angles, the water level measurement accuracy can be improved.
In recent years, with the development of technologies such as edge devices and data warehouses, the water conservancy department accumulates a large amount of video or picture data of places such as river channels and gates, and the opening and closing degree of the gates and the prevention and control of river channel flood discharge depend on the identification and monitoring of water levels.
Existing water level identification mainly comprises traditional methods and data-driven methods, and traditional water level identification further comprises manual identification and sensor identification. Manual identification relies on workers observing, reading and recording the water gauge and the water level on site, while sensor identification mainly relies on equipment such as water gauges with built-in water level sensing hardware for remote sensing and identification. The data-driven identification method mainly uses artificial intelligence technologies such as target detection, image segmentation, Optical Character Recognition (OCR) and image processing, combined with edge devices such as cameras, to process and identify water gauge or water level images at a certain frequency on the edge side, monitor them, and report them to the cloud for storage, assisting workers in executing strategies such as gate opening and closing, flood discharge and flood control in a timely manner. Target detection, also called target extraction, is an image segmentation method based on target geometry and statistical characteristics.
Through long-term research, the inventor of the application finds that the existing water level identification method has the following defects:
the manual method depends on the experience of workers, consumes a large amount of working time, and cannot guarantee safety in extreme weather;
the hardware method of sensor identification is difficult to operate and maintain; faults such as low identification precision and reduced timeliness often occur in severe environments, and long-term use depends on a large amount of manpower and material resources for maintenance;
general one-stage or two-stage artificial intelligence methods based on image segmentation, OCR and target detection cannot identify the small scales above the water surface; the error is generally about 5 cm, which cannot meet finer identification requirements such as 2 cm.
Based on at least the above defects, this application provides a three-stage water level identification method based on image segmentation, target detection and scale processing. A fine water level identification model is established through technologies such as target detection and scale sparsity processing, solving problems of the traditional methods such as the consumption of manpower and material resources and excessively large image-processing identification errors.
Referring to fig. 3, fig. 3 is a flowchart illustrating a water level height identifying method according to an exemplary embodiment of the present application, and as shown in fig. 3, the water level height identifying method according to the present embodiment includes steps S101 to S105, and reference is made to the following for detailed description:
step S101: and extracting an image to be identified from the acquired water gauge video stream.
The image acquisition device acquires a water gauge video stream containing water gauge information, records the acquisition time corresponding to each image, and selects from the images the target image to be identified corresponding to the acquisition time to be analyzed. In this embodiment, the video stream is a sequence of images, acquired by the image acquisition device, of a water body containing water gauge information, for example a river or lake containing a water gauge; this is not limited in this embodiment.
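As an illustrative sketch of this sampling step (the function name and parameters below are hypothetical, not from the patent), the helper picks which frame indices to extract so that one image to be identified is taken at a fixed period; in practice the frames themselves would be read with a library such as OpenCV via cv2.VideoCapture:

```python
def frames_to_sample(fps: float, total_frames: int, period_s: float) -> list[int]:
    """Return the frame indices to extract so that one image to be
    identified is taken every `period_s` seconds of the video stream."""
    step = max(1, round(fps * period_s))  # frames between two samples
    return list(range(0, total_frames, step))

# Example: a 25 fps stream of 250 frames sampled every 2 seconds
indices = frames_to_sample(25.0, 250, 2.0)  # [0, 50, 100, 150, 200]
```

Each selected index maps to an acquisition time (index / fps), matching the patent's pairing of images with their acquisition times.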
Step S102: and carrying out target detection on the image to be recognized to obtain an image area above the horizontal plane, and acquiring the image area as a water gauge image.
Because the image to be recognized includes information irrelevant to water level height recognition, which may interfere with the water level height recognition process, the embodiment performs target detection on the image to be recognized to obtain an image area above the horizontal plane, where the image area above the horizontal plane includes water gauge information.
For example, the present embodiment may perform target detection on the image to be recognized based on machine learning to obtain the image area above the horizontal plane. For example, a Deeplabv3+ image segmentation model is trained in advance, the image to be recognized is input into the trained model, and the image area above the horizontal plane is output. Exemplarily, the labelme annotation tool is used to draw the water gauge outline on a plurality of images to be recognized containing water gauge information, and these labeled images are stored as training samples for the segmentation model. The Deeplabv3+ algorithm adopts ResNet as its classification network and introduces richer scale information through a DCNN (Deep Convolutional Neural Network) with atrous (hole) convolution and pyramid pooling. Through training on the plurality of labeled images, an accurate Deeplabv3+ image segmentation model is constructed.
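As a sketch of the step that follows segmentation — turning a binary gauge mask (such as a Deeplabv3+ model might output) into the cropped water gauge image — assuming the helper name and toy data, which are illustrative and not part of the embodiment:

```python
import numpy as np

def crop_gauge_region(image, mask):
    """Crop the image to the bounding box of the segmented gauge mask.

    image: H x W (x C) array; mask: H x W boolean array from a
    segmentation model (True = gauge pixel above the water surface).
    Returns None if the mask is empty.
    """
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None  # nothing was segmented
    return image[ys.min():ys.max() + 1, xs.min():xs.max() + 1]

# Toy 6x6 image with a 3x2 "gauge" region marked in the mask.
img = np.arange(36).reshape(6, 6)
msk = np.zeros((6, 6), dtype=bool)
msk[1:4, 2:4] = True
gauge = crop_gauge_region(img, msk)
```

The cropped `gauge` is what subsequent scale recognition would operate on.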
Step S103: and identifying scale information above the water level in the water gauge image, and determining initial water level height information of the water level based on the scale information.
The water gauge is a graduated staff installed on the bank for observing the rise and fall of the water surface. Water gauges are widely applied in soft soil foundation treatment, port and wharf projects and other engineering, are used to observe changes in groundwater level and ocean tide level, and have characteristics such as stable readings, interference resistance and long service life. The surface of a water gauge is usually printed with a plurality of graduations, each carrying a character mark; the precision of the gauge is generally in centimeters (the minimum graduation is 1 cm), and the gauge is usually divided into 1-meter sections. The water gauge image is an image obtained by the image acquisition device capturing a river or similar water body containing water gauge information.
For example, since the scale information below the horizontal plane is generally hidden and is not needed for determining the initial water level height information, the present embodiment recognizes only the scale information above the horizontal plane. The present embodiment does not limit the manner of recognizing the scale information above the horizontal plane in the water gauge image; for example, it may be recognized by a human or by a sensor. Illustratively, the scale information recognized in this embodiment includes the position information of the scales and the character identification corresponding to each scale.
Before the water level is preliminarily identified, the scale information of the water gauge image needs to be recognized. In this embodiment, the scale information can be recognized based on machine learning; for example, a preset scale recognition model is trained using the YOLOv5 algorithm, the water gauge image is input into the trained model, and the scale information is output. YOLOv5 is a single-stage target detection algorithm that adds new improvements on top of YOLOv4, greatly increasing both speed and precision. At the input end it introduces improvements for the model training stage, mainly Mosaic data augmentation, adaptive anchor box calculation and adaptive image scaling; the reference network fuses new ideas from other detection algorithms, mainly the Focus structure and the CSP structure; and the target detection network often inserts some additional layers between the Backbone and the final Head output layer.
For example, if the adjacent pixel height is greater than the average pixel height of the water gauge scales of the water gauge, it is determined that the water gauge scale recognition model does not recognize the scale closest to the horizontal plane, and a large recognition error exists, and therefore notification information is sent to the service platform so that the service platform adjusts the model parameters of the water gauge scale recognition model based on the notification information.
In this embodiment, the scale information includes the adjacent scale, i.e. the scale nearest to the horizontal plane. Specifically, after the plurality of scales above the horizontal plane are recognized, the scale closest to the horizontal plane is screened out from them and taken as the adjacent scale. Illustratively, the adjacent scale is determined based on the character identification corresponding to each scale. For example, four scales are identified in total, namely a first, second, third and fourth scale, whose character identifications are 20, 25, 30 and 35 respectively; based on these four character identifications, the first scale is determined to be the adjacent scale.
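The adjacent-scale screening described above can be sketched as follows. The tuple layout and names are illustrative assumptions; since gauge labels grow upward, the graduation with the smallest label value is the one nearest the waterline (equivalently, in image coordinates, the one with the largest row index):

```python
def pick_adjacent_scale(scales):
    """Pick the detected graduation closest to the waterline.

    scales: list of (label_value, y_pixel_row) tuples for graduations
    detected above the water surface. The smallest label is nearest
    the waterline.
    """
    return min(scales, key=lambda s: s[0])

# Four detected graduations labelled 20, 25, 30, 35 (cm),
# as in the example of the embodiment.
detected = [(35, 40), (30, 90), (25, 140), (20, 190)]
adjacent = pick_adjacent_scale(detected)
```

Here `adjacent` is the graduation labelled 20, matching the "first scale" of the embodiment's example.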
In this embodiment, the initial water level height information determined from the scale information is likely to contain an error. On the one hand, in a real application scene the horizontal plane is rarely exactly flush with a graduation of the water gauge; in this case, the height between the horizontal plane and the first recognized scale above it causes a recognition error, and the coarser the gauge's graduations, the larger the error in the initial water level height information obtained from the scale information. On the other hand, the scale just above the horizontal plane is generally harder to recognize than the other scales, which can also introduce a recognition error.
Initial water level height information of the horizontal plane is determined based on the scale information, and the scale information exemplarily comprises character marks corresponding to adjacent scales, and the character marks corresponding to the adjacent scales are used as the initial water level height information. For example, if the character mark corresponding to the adjacent scale can be recognized, the character mark corresponding to the adjacent scale is used as the initial water level height information.
Step S104: and determining the pixel height of the adjacent scales from the horizontal plane, taking the determined pixel height as the adjacent pixel height, and determining the measurement error corresponding to the initial water level height information based on the proportion of the adjacent pixel height to the average pixel height of the water gauge scales.
In this embodiment, the adjacent pixel height represents the pixel distance between the adjacent scale and the horizontal plane in the water gauge image. It may be determined in various ways, for example measured manually or calculated based on a constructed coordinate system, which is not limited here. Exemplarily, the pixel row of the adjacent scale is first determined, then the pixel row where the horizontal plane intersects the water gauge in the image is determined; multiplying the number of pixel rows between the two by the height of a unit pixel gives the pixel height from the adjacent scale to the horizontal plane.
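A minimal sketch of this pixel-count calculation (the row indices and unit pixel height are illustrative assumptions):

```python
def pixel_height_to_waterline(scale_row, waterline_row, unit_pixel_height=1.0):
    """Adjacent pixel height: the number of pixel rows between the
    adjacent graduation and the row where the waterline meets the
    gauge, multiplied by the height of one pixel."""
    return abs(waterline_row - scale_row) * unit_pixel_height

# Graduation detected at image row 190, waterline at row 214.
adj_px = pixel_height_to_waterline(190, 214)
```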
For example, if the water gauge is marked with scales of several precisions, then in order to reduce the error in identifying the water level height, the average pixel height of the water gauge scale is taken with respect to the finest graduation marked on the gauge. For example, if the gauge carries graduations of two precisions, 1 dm and 1 m, the average pixel height is computed per 1 dm graduation.
In fact, when the initial water level height information is determined based on the scale information, the actual height corresponding to the adjacent pixel height is mistaken for a part of the water level height, and therefore, the measurement error corresponding to the initial water level height information is determined based on the adjacent pixel height in the embodiment, and the measurement error can be determined by calculating the actual height corresponding to the adjacent pixel height in the embodiment.
For example, in this embodiment the actual height corresponding to the adjacent pixel height may be determined by a proportional method: the image acquisition device and a marker are arranged at the measurement position of the water body to be measured; the device captures an image of the marker, obtaining a water gauge image containing it; parameters such as the marker's coordinate position and spatial geometric dimensions in the image are calculated; the marker's actual spatial size is divided by its pixel size in the image to obtain the image scale; and the measurement error is obtained by multiplying this scale by the adjacent pixel height.
For example, this embodiment first calculates the ratio of the adjacent pixel height to the average pixel height of the water gauge scale, and then determines the measurement error corresponding to the initial water level height information based on this ratio. Illustratively, the measurement error is taken as the product of this ratio and the actual height corresponding to one unit graduation.
Step S105: and adjusting the initial water level height information according to the measurement error to obtain the water level height information.
In this embodiment, the actual water level height information can be obtained by subtracting the measurement error from the initial water level height included in the initial water level height information.
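The error computation of step S104 and the correction of step S105 can be sketched together as follows (all numbers and the function name are illustrative assumptions):

```python
def corrected_water_level(initial_height_cm, adjacent_px, avg_scale_px,
                          unit_height_cm):
    """Correct the initial reading by the error implied by the pixel
    gap between the waterline and the adjacent graduation.

    The measurement error is (adjacent_px / avg_scale_px) * unit_height_cm,
    i.e. the real-world height of the pixel gap; it is subtracted from
    the initial water level reading.
    """
    error = (adjacent_px / avg_scale_px) * unit_height_cm
    return initial_height_cm - error

# Adjacent graduation labelled 20 cm sits 24 px above the waterline;
# one 1 cm graduation spans 12 px on average in this image.
level = corrected_water_level(20.0, adjacent_px=24, avg_scale_px=12,
                              unit_height_cm=1.0)
```

With the waterline 24 px (2 cm) below the 20 cm graduation, the corrected level is 18 cm; when the waterline is exactly flush with a graduation the gap is zero and no correction is applied.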
In the water level height identification method provided by this embodiment, the measurement error corresponding to the initial water level height information is determined based on the proportion of the adjacent pixel height (the pixel distance from the adjacent scale to the horizontal plane) to the average pixel height of the water gauge scale, and the initial water level height information is adjusted by this error to obtain the water level height information. In this way, the measurement error can be eliminated both when the horizontal plane lies between two graduations of the gauge and when the scale nearest the horizontal plane fails to be recognized as the adjacent scale, thereby improving the recognition accuracy of the water level height.
The image to be recognized is affected by noise during forming, transmission and reception; noise reduces image quality and produces abrupt changes in gray values that were originally uniform or continuously varying, forming false edges or false contours. In order to eliminate the noise of the image to be recognized, this embodiment provides, on the basis of the embodiment shown in fig. 3, a water level height recognition method shown in an exemplary embodiment.
Exemplarily, referring to fig. 4, fig. 4 is a schematic flow chart of a water level height identification method according to an exemplary embodiment proposed on the basis of the embodiment shown in fig. 3, and as shown in fig. 4, the water level height identification method provided by the present embodiment further includes steps S201 to S202, and the following reference is made for detailed description:
step S201: and determining adjacent pixel areas corresponding to all pixels in the image to be recognized.
In this embodiment, the adjacent pixel region includes a predetermined number of surrounding pixels of the corresponding pixel. The present embodiment determines the neighboring pixel region of the corresponding pixel by determining a preset number of pixels closest to the corresponding pixel.
The present embodiment may determine the value of the preset number according to the actual application scenario, and is not limited herein, for example, if the preset number is 8, the pixels around the preset number of pixels include eight pixels closest to the corresponding pixel.
In this embodiment, the number of pixels included in the adjacent pixel regions corresponding to each pixel may be the same or different, and is not limited herein.
Step S202: and calculating the average pixel value of the pixels contained in the adjacent pixel area, and replacing the pixel value of the corresponding pixel with the average pixel value so as to take the obtained processed image as an image to be identified for target detection.
The embodiment can weaken, inhibit or eliminate the noise in the image to be identified in the above mode.
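A minimal sketch of this neighborhood-averaging step, assuming a square (2k+1)x(2k+1) window that includes the center pixel (the embodiment's eight-neighbor variant would exclude it); edge pixels are handled by clipping the window:

```python
import numpy as np

def mean_filter(img, k=1):
    """Replace each pixel value with the average of the pixels in its
    neighborhood, as in steps S201-S202."""
    h, w = img.shape
    out = np.empty((h, w), dtype=float)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - k), min(h, y + k + 1)
            x0, x1 = max(0, x - k), min(w, x + k + 1)
            out[y, x] = img[y0:y1, x0:x1].mean()
    return out

noisy = np.array([[10., 10., 10.],
                  [10., 100., 10.],
                  [10., 10., 10.]])  # one impulse-noise pixel
smoothed = mean_filter(noisy)
```

The isolated 100-valued noise pixel is pulled down toward its surroundings, illustrating how averaging suppresses noise.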
Illustratively, the image to be recognized is subjected to Gaussian filtering, wherein the Gaussian filtering is a linear filter, and the weighted average value of pixels in the range of a filter window is taken as a transformation result, so that the image can be smoothed, and noise can be effectively suppressed. The gaussian filter needs to obtain a template coefficient by discretizing a gaussian function, so as to generate a filter template, and traverse an image window for processing.
For a template with window size 2k+1, the coefficient at position (i, j) is computed as:
H(i, j) = (1 / (2πσ²)) · exp(−((i − k − 1)² + (j − k − 1)²) / (2σ²)),  i, j = 1, 2, …, 2k+1
where σ is the standard deviation of the Gaussian distribution. Gaussian filtering removes white noise obeying a normal distribution from the image to be identified, so that the next recognition step can be performed more accurately.
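The discretization of the Gaussian function into a filter template can be sketched as follows; normalizing the coefficients so they sum to 1 is a common convention assumed here:

```python
import numpy as np

def gaussian_kernel(k, sigma):
    """Discretize the 2-D Gaussian on a (2k+1)x(2k+1) template and
    normalize the coefficients to sum to 1."""
    ax = np.arange(-k, k + 1)
    xx, yy = np.meshgrid(ax, ax)
    kern = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return kern / kern.sum()

g = gaussian_kernel(1, sigma=1.0)  # 3x3 template
```

The template is then slid over the image window by window, each output pixel being the weighted average of the pixels under the template.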
Exemplarily, referring to fig. 5, fig. 5 is a schematic flowchart of a water level height identification method according to an exemplary embodiment based on the embodiment shown in fig. 4, and as shown in fig. 5, the water level height identification method provided by this embodiment further includes steps S301 to S302, and the detailed description refers to the following:
step S301: and carrying out graying processing on the processed image to obtain a corresponding grayscale image.
In this embodiment, the processed image is the image to be recognized after the denoising process of the embodiment shown in fig. 4. The process of converting a color image into a grayscale image is referred to as a graying process of the image.
In this embodiment, a plurality of graying processing manners may be adopted to perform graying processing on the processed image to obtain a corresponding grayscale image, for example, a component method, a maximum value method, an average value method, a weighted average method, and the like, which is not limited herein. The component method is to use the brightness of three components of pixels in a color image as the gray values of three gray images, and one gray image can be selected as the gray processing result of the image according to the application requirement. The maximum value method is to use the maximum value of the three-component brightness in the color image as the gray value of the gray map. The average method is to average the three-component brightness in the color image to obtain a gray value. The weighted average method is to perform weighted average on the three components with different weights according to importance and other indexes.
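A sketch of the weighted average method, assuming the common luminance weights 0.299/0.587/0.114 for R, G and B (the embodiment only says the weights follow importance; these specific values are an assumption):

```python
import numpy as np

def to_gray_weighted(rgb):
    """Weighted-average graying: gray = 0.299 R + 0.587 G + 0.114 B."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b

# A single neutral-gray pixel maps to the same gray value.
px = np.array([[[100.0, 100.0, 100.0]]])
gray = to_gray_weighted(px)
```

The maximum value and average methods differ only in replacing the weighted sum with `rgb.max(axis=-1)` or `rgb.mean(axis=-1)`.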
Step S302: and carrying out gray linear transformation on the gray image based on a preset linear transformation formula to obtain a corresponding linear gray image so as to take the linear gray image as an image to be identified for target detection.
Since the gray image obtained by directly performing the graying processing may have a problem of insignificant features, it is necessary to perform gray enhancement (i.e., gray conversion) on the gray image, and this embodiment performs linear gray conversion on the gray image again to obtain an enhanced gray image.
The gray scale linear transformation is to use a linear transformation formula to establish a gray scale mapping relationship to adjust the gray scale of a gray scale image, so as to achieve the purpose of image enhancement.
The linear formula is:
f(c)=a*c+b
wherein f (c) is the gray value of a pixel point of the gray image after the linear transformation, c is the gray value of a corresponding pixel point of the original gray image before the linear transformation, a is the slope of the linear transformation, and b is the intercept of the linear transformation.
The linear transformation formula of the gray linear transformation changes with the slope and the intercept: different gray linear transformations are performed on the same gray image based on different linear formulas, and formulas with different parameters (the slope and the intercept) improve the image to different degrees. The implementation of the invention needs to find the image with the best enhancement effect after the gray linear transformation; therefore the intercept and the slope in the linear formula must be traversed continuously, different parameter values substituted into the formula, and a linear gray image obtained for each.
In this embodiment, the range of the intercept in the linear transformation formula is determined by the gray scale mean of the gray scale image.
The inventor of the application finds that, in the prior art, when the gray scale linear transformation is performed on an image, a linear transformation formula is formed by traversing a preset slope value interval and an intercept value interval to obtain corresponding slopes and intercepts, and then the image is linearly transformed based on the linear transformation formula.
Specifically, the gray mean is calculated as follows: first obtain the proportion of pixels at each gray level in the gray image, multiply each proportion by the corresponding gray level to obtain that level's weighted contribution, and sum the contributions over all gray levels to obtain the gray mean.
When the linear slope is set, the maximum range of the intercept is [-255, 255], because the gray values of the gray image before the transformation and of the linear gray image after it must stay within the valid range. This maximum range, however, contains many useless values: when adding or subtracting an intercept pushes most gray values outside [0, 255], that intercept value need not be considered. Therefore, this embodiment narrows the range of the intercept according to the gray mean of the gray image before the linear transformation.
In this embodiment, the range of the intercept in the linear formula of the gray scale linear transformation is [ -255+ μ,255- μ ], where μ is the gray scale mean of the gray scale image.
The embodiment of the invention uses the gray mean of the gray image to narrow the value range of the intercept in the linear formula of the gray linear transformation. With this relatively small range, a plurality of linear gray images are obtained by traversing the intercept within it, and the gray-enhanced image is selected from them according to the characteristics of their gray histograms, thereby reducing the number of traversals and improving the calculation speed.
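The narrowed intercept range and the per-parameter linear transformation can be sketched as follows; clipping the output to [0, 255] is an assumption made here to keep results in the valid gray range:

```python
import numpy as np

def linear_transform(gray, a, b):
    """f(c) = a*c + b, clipped to the valid gray range [0, 255]."""
    return np.clip(a * gray + b, 0, 255)

def intercept_range(gray):
    """Intercept search range narrowed by the gray mean mu:
    [-255 + mu, 255 - mu]."""
    mu = gray.mean()
    return -255 + mu, 255 - mu

img = np.array([[50., 100.], [150., 200.]])   # gray mean = 125
lo, hi = intercept_range(img)                  # (-130, 130) instead of (-255, 255)
enhanced = linear_transform(img, a=1.5, b=-30)
```

An outer loop over `a` and over `b` in `[lo, hi]` would generate the candidate linear gray images from which the best-enhanced one is selected.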
Exemplarily, referring to fig. 6, fig. 6 is a schematic flow chart of a water level height identification method according to an exemplary embodiment based on the embodiment shown in fig. 5, and as shown in fig. 6, the water level height identification method provided by this embodiment further includes steps S401 to S403, and the detailed description refers to the following:
Step S401: and counting the histogram distribution information of each gray level in the linear gray image.
The histogram distribution information of each gray level is obtained by counting the frequency of occurrence of all pixels in the digital image according to the size of the gray level.
Step S402: and determining a cumulative gray probability distribution function corresponding to the histogram distribution information.
In this embodiment, assuming that the linear gray image contains n pixels at l gray levels, and n_k denotes the number of pixels with gray level r_k, the probability of the k-th gray level is:
p(r_k) = n_k / n,  k = 0, 1, …, l − 1
Therefore, the cumulative gray probability distribution function corresponding to the histogram distribution information is determined as:
s_k = T(r_k) = ∑_{j=0}^{k} p(r_j) = ∑_{j=0}^{k} n_j / n
Step S403: and performing histogram equalization on the image to be recognized by taking the cumulative gray probability distribution function as a transformation formula, and taking the obtained corrected image as the image to be recognized for target detection.
In this embodiment, the corrected image is obtained by mapping the pixel value of each pixel of the linear gray image through the cumulative gray probability distribution function:
s_k = T(r_k) = ∑_{j=0}^{k} n_j / n
with the output gray level then scaled back to the gray range, i.e. taken as round((l − 1) · s_k).
After histogram equalization is applied to the image, the gray interval of the image is stretched and the gray distribution made more uniform, which increases contrast, makes image details clear and enhances the image. Therefore, performing histogram equalization on the gray-transformed image to be recognized provides a good basis for the subsequent water gauge analysis and scale capture.
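A sketch of histogram equalization via the cumulative distribution, assuming 8-bit gray levels (l = 256) and scaling the CDF back to [0, 255] as described above:

```python
import numpy as np

def equalize(gray, levels=256):
    """Histogram equalization: map each gray level r_k to
    round((levels-1) * s_k), where s_k = sum_{j<=k} n_j / n."""
    hist = np.bincount(gray.ravel(), minlength=levels)  # n_k for each level
    cdf = hist.cumsum() / gray.size                     # s_k
    lut = np.round((levels - 1) * cdf).astype(np.uint8)
    return lut[gray]

# Tiny example: three dark pixels and one bright pixel.
img = np.array([[0, 0], [0, 255]], dtype=np.uint8)
eq = equalize(img)
```

The dark level (75% of pixels) is pushed up to gray 191, spreading the distribution across the range and increasing contrast.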
Exemplarily, referring to fig. 7, fig. 7 is a schematic flowchart illustrating a water level height identifying method according to an exemplary embodiment based on the embodiment shown in fig. 6, and as shown in fig. 7, the water level height identifying method further includes steps S501 to S503, and the detailed description refers to the following:
Step S501: and acquiring the gradient amplitude of each pixel in the corrected image.
In this embodiment, the gradient amplitude of the pixel gray level of each pixel in the corrected image is calculated by a finite-difference approximation of the first-order partial derivatives. For example, the partial derivatives of the corrected image in the horizontal and vertical directions, denoted Gx and Gy respectively, may first be approximated by two-dimensional first-order finite differences over a 2×2 neighborhood:
Gx(x, y) = [f(x+1, y) − f(x, y) + f(x+1, y+1) − f(x, y+1)] / 2
Gy(x, y) = [f(x, y+1) − f(x, y) + f(x+1, y+1) − f(x+1, y)] / 2
The magnitude and direction angle of the pixel gradient can then be calculated as follows:
A(x, y) = √(Gx(x, y)² + Gy(x, y)²)
θ(x, y) = arctan(Gy(x, y) / Gx(x, y))
where A(x, y) is the gradient amplitude and reflects the edge strength of the corrected image, and θ(x, y) is the direction angle of the gradient and reflects the direction of the edge.
Step S502: and performing non-maximum suppression on each gradient amplitude, and determining a non-maximum suppression image based on the obtained gradient amplitudes.
In this embodiment, the magnitude relationship between the gradient magnitude of each pixel in the corrected image and the gradient magnitude of its corresponding adjacent pixel is compared along the gradient line, and if the comparison result is that the gradient magnitude of the pixel in the corrected image is greater than that of the adjacent pixel, the gradient magnitude of the pixel is retained; and if the comparison result is that the gradient amplitude of the pixel in the corrected image is not greater than the gradient amplitude of the adjacent pixel, setting the gradient amplitude of the pixel to be 0, and generating a non-maximum value suppression image according to the comparison result.
It should be understood that the gradient amplitudes of the non-maximum-value-suppressed image are in one-to-one correspondence with the pixels in the corrected image, and after the above processing, the gradient amplitudes of the pixels with gradient amplitudes smaller than the gradient amplitudes of the adjacent pixels are set to 0, and the gradient amplitudes of the pixels with gradient amplitudes larger than the gradient amplitudes of the adjacent pixels are retained in the non-maximum-value-suppressed image.
Step S503: and detecting edge information of the corrected image based on the non-maximum suppression image to perform target detection on the corrected image based on the edge information.
In this step, since the obtained non-maximum suppression image carries edge information of every subject in the image, the useful edge information needs to be screened. Exemplarily, the gradient amplitude of each pixel in the non-maximum suppression image is compared with a preset threshold: if it is smaller than the threshold, the gray level of the corresponding pixel in the corrected image is set to 0; if it is greater than or equal to the threshold, the gray level of the corresponding pixel is kept unchanged. This yields the edge information of the corrected image, on which target detection of the corrected image is then based.
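Steps S501 through S503 can be sketched end to end as follows. The axis-aligned non-maximum suppression is a simplification of comparing along the true gradient line, and the threshold value is illustrative:

```python
import numpy as np

def detect_edges(img, thresh=100.0):
    """Edge sketch for steps S501-S503: 2x2 first-difference gradients,
    gradient magnitude, simplified non-maximum suppression along the
    dominant gradient axis, then a single magnitude threshold."""
    f = img.astype(float)
    # first-order finite differences (x = columns, y = rows)
    gx = (f[:-1, 1:] - f[:-1, :-1] + f[1:, 1:] - f[1:, :-1]) / 2
    gy = (f[1:, :-1] - f[:-1, :-1] + f[1:, 1:] - f[:-1, 1:]) / 2
    mag = np.hypot(gx, gy)
    h, w = mag.shape
    out = np.zeros_like(mag)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # compare with the two neighbors along the dominant axis
            if abs(gx[y, x]) >= abs(gy[y, x]):
                n1, n2 = mag[y, x - 1], mag[y, x + 1]
            else:
                n1, n2 = mag[y - 1, x], mag[y + 1, x]
            if mag[y, x] > n1 and mag[y, x] > n2 and mag[y, x] >= thresh:
                out[y, x] = 255.0
    return out

step = np.zeros((5, 6))
step[:, 3:] = 255.0  # vertical step edge between columns 2 and 3
edge_map = detect_edges(step)
```

The vertical step produces a one-pixel-wide line of edge responses; everything else is suppressed to 0.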
Exemplarily, referring to fig. 8, fig. 8 is a flowchart of step S104 in the embodiment shown in fig. 3 in an exemplary embodiment, and as shown in fig. 8, step S104 includes steps S601 to S602, and the detailed description refers to the following:
step S601: calculate the ratio of the adjacent pixel height to the average pixel height, and take this ratio as the proportion of the adjacent pixel height to the average pixel height of the water gauge scale.
In an actual application scene, the ratio of the adjacent pixel height to the average pixel height of the water gauge scale may be greater than, equal to, or smaller than 1. If the ratio equals 1, there is no measurement error in the initial water level height information. If the ratio is greater than 1, the scale closest to the horizontal plane was not identified when recognizing the scale information of the water gauge image, and the error is relatively large. If the ratio is smaller than 1, the scale closest to the horizontal plane was identified and the horizontal plane lies between the adjacent scale and the first graduation below it, and the error is relatively small.
Step S602: and calculating the product between the proportion and the actual average height corresponding to the water scale, and determining the calculated product value as the measurement error corresponding to the initial water level height information.
In this embodiment, the product of the ratio of the height of the adjacent pixel to the average pixel height of the water scale and the actual average height corresponding to the water scale is the real height of the adjacent scale from the horizontal plane.
In the present embodiment, the unit scale in both the actual average height corresponding to the water gauge scale and the average pixel height of the water gauge scale refers to the smallest unit scale identified on the water gauge. The actual average height corresponding to the water gauge scale is an attribute parameter of the water gauge, and can be directly obtained or obtained based on the scale information of the water gauge, and is not specifically limited herein.
Exemplarily, referring to fig. 9, fig. 9 is a flowchart of step S103 in the embodiment shown in fig. 3 in an exemplary embodiment, and as shown in fig. 9, step S103 includes steps S701 to S703, and the detailed description refers to the following:
step S701: the number of divisions spaced between the non-adjacent division and the adjacent division is determined.
In this embodiment, the scale information further includes a character identification for each scale. Generally, the character identification of a scale is a number, indicating that if the water level reaches that scale, the water level height equals the scale's height. For example, the scale information above the water line in the water gauge image includes a first scale and a second scale, where the character identification corresponding to the first scale is 20 and that corresponding to the second scale is 25.
In an actual application scenario, during recognition of the water gauge image to obtain the scale information, the character identification corresponding to the adjacent scale may fail to be fully recognized for various reasons: for example, it may be covered by mud, or the scale recognition model may perform too poorly to recognize it; this is not specifically limited here. In this case, if the character identifications corresponding to scales other than the adjacent scale can be recognized, the initial water level height information of the horizontal plane may be inferred from the adjacent scale together with the character identifications of the other scales.
In this embodiment, the scale information further includes a character identifier corresponding to the non-adjacent scale, and a distance between the non-adjacent scale and the horizontal plane is greater than a distance between the adjacent scale and the horizontal plane. In this embodiment, if the character identifier corresponding to the adjacent scale is not recognized, and the character identifiers corresponding to the plurality of other scales are recognized, the scale closest to the adjacent scale in the scales where the corresponding character identifier is recognized is taken as the non-adjacent scale. And determining the number of scales spaced between the non-adjacent scales and the adjacent scales, wherein the number of scales comprises the non-adjacent scales.
Step S702: calculate the product of the number of scales and the actual height of a unit scale.
In this embodiment, the product of the number of scales and the actual height of a unit scale is the distance between the non-adjacent scale and the adjacent scale.
Step S703: and determining the difference between the character identification of the non-adjacent scale and the calculated product value, and taking the obtained difference as the initial water level height information.
In fact, the difference between the character identifier of the non-adjacent scale and the calculated product is equivalent to the scale value corresponding to the adjacent scale; by this means, the fault tolerance of the water level height identification method provided by the present application is improved.
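As a minimal sketch of steps S701 to S703, the computation reduces to one multiplication and one subtraction. The function and parameter names below are illustrative assumptions, not taken from the application:

```python
def initial_level_from_nonadjacent(char_value_cm, num_scales_between, unit_height_cm):
    """Steps S701-S703 as a sketch: the character identifier of the
    non-adjacent scale, minus (number of spaced scales x actual unit
    height), recovers the scale value of the adjacent scale, which is
    taken as the initial water level height information."""
    return char_value_cm - num_scales_between * unit_height_cm
```

Using the example from the text, if the second scale reads 25, one scale of 5 cm separates it from the adjacent scale, and the result is the adjacent scale's value of 20.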
Exemplarily, referring to fig. 10, fig. 10 is a flowchart of step S103 in the embodiment shown in fig. 3 according to another exemplary embodiment. As shown in fig. 10, step S103 includes steps S801 to S802, which are described in detail as follows:
step S801: and calculating the product of the actual average height corresponding to the unit scale of the water gauge and the total number of scales.
In an actual application scenario, if the water level height is measured by using a water gauge without a character identifier or the character identifier corresponding to the scale is not identified when the scale information is identified, the initial water level height information can be determined by the method provided by the embodiment.
In this embodiment, the scale information further includes the total number of scales above the horizontal plane, and the product of the actual average height corresponding to the unit scale of the water gauge and the total number of scales is the remaining actual height of the water gauge above the adjacent scale.
Step S802: and taking a third difference value between the total length of the water gauge and the calculated product value as initial water level height information.
The third difference between the total length of the water gauge and the calculated product value is the remaining actual height of the water gauge below the adjacent scale, and this part of the height is used as the initial water level height information in the present embodiment.
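Steps S801 to S802 likewise reduce to a single subtraction. The sketch below is a hypothetical illustration with assumed names and centimeter units:

```python
def initial_level_without_marks(total_length_cm, scales_above_water, unit_height_cm):
    # The product of the actual unit-scale height and the number of
    # scales above the horizontal plane is the gauge height remaining
    # above the adjacent scale (step S801); the third difference with
    # the total gauge length is the submerged portion, taken as the
    # initial water level height information (step S802).
    return total_length_cm - scales_above_water * unit_height_cm
```

For instance, a 100 cm gauge with six 5 cm scales counted above the water line yields an initial water level of 70 cm.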
Referring to fig. 11, fig. 11 is a schematic diagram illustrating a water level height identification method according to an exemplary embodiment of the present disclosure, and as shown in fig. 11, the water level height identification method provided in this embodiment includes steps S901 to S906, and reference is made to the following steps for detailed description:
step S901: and inputting a video stream.
In this embodiment, the video stream is an image sequence, acquired by an image acquisition device, of a water body containing a water gauge, for example a river or lake in which a water gauge is installed; this is not specifically limited herein.
Step S902: and (4) image preprocessing.
In this embodiment, the image preprocessing includes performing linear smoothing filtering processing, gray scale linear transformation, histogram modification, binarization processing, and the like on the image to be recognized, so as to accurately recognize scale information of the preprocessed water gauge image in the following step.
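The preprocessing chain can be sketched in NumPy roughly as follows. The 3x3 kernel, the slope `k`, the choice of mapping the gray mean to 128, and the fixed binarization threshold are all illustrative assumptions, since the application does not fix these parameters:

```python
import numpy as np

def preprocess(img, k=1.2, thresh=128):
    """Illustrative preprocessing for step S902: 3x3 linear smoothing,
    gray-scale linear transformation g = k*f + b with the intercept b
    derived from the gray mean, then binarization."""
    img = img.astype(np.float64)
    # 3x3 linear smoothing (mean) filter with edge padding.
    padded = np.pad(img, 1, mode="edge")
    smooth = sum(padded[i:i + img.shape[0], j:j + img.shape[1]]
                 for i in range(3) for j in range(3)) / 9.0
    # Gray-scale linear transformation; the intercept is chosen so the
    # mean gray level maps to mid-gray (one plausible reading of the text).
    b = 128.0 - k * smooth.mean()
    stretched = np.clip(k * smooth + b, 0, 255)
    # Binarization against a fixed threshold.
    return (stretched >= thresh).astype(np.uint8) * 255
```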
Step S903: and (5) identifying a water gauge.
In this embodiment, the target detection is performed on the image to be recognized to locate the water gauge, and since the image to be recognized includes information irrelevant to the water level height recognition, which may interfere with the water level height recognition process, the image to be recognized is subjected to the target detection in this embodiment, so as to obtain an image area above the horizontal plane, where the image area above the horizontal plane includes the water gauge information.
This embodiment may perform target detection on the image to be recognized based on machine learning to obtain the image area above the horizontal plane. For example, a Deeplabv3+ image segmentation model is trained in advance; the image to be recognized is input into the trained Deeplabv3+ model, and the image area above the horizontal plane is output. Exemplarily, the water gauge outline is drawn on a plurality of training images containing a water gauge using the labelme annotation tool, and the results are stored as training samples. The Deeplabv3+ algorithm adopts ResNet as its classification network and introduces more scale information through a DCNN with atrous (dilated) convolution and pyramid pooling. An accurate Deeplabv3+ image segmentation model is constructed by training on the plurality of labeled training images.
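Assuming a trained segmentation model has already produced a binary mask of the region above the horizontal plane, the water gauge image can be cropped from the mask as in this hypothetical sketch (the Deeplabv3+ model itself is not reproduced here):

```python
import numpy as np

def crop_gauge_region(image, mask):
    """Crop the bounding box of the mask's foreground region, i.e. the
    image area above the horizontal plane that contains the water
    gauge. `mask` is a binary array the same height/width as `image`."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None  # no gauge region detected
    top, bottom = ys.min(), ys.max()
    left, right = xs.min(), xs.max()
    return image[top:bottom + 1, left:right + 1]
```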
Step S904: and (5) primarily identifying the water level.
In this embodiment, preliminary water level identification means preliminarily measuring the water level in the water gauge image. Before this, the scale information of the water gauge image needs to be recognized, which may be done based on machine learning: for example, a scale recognition model is trained with the YOLOv5 algorithm, the water gauge image is input into the trained model, and the scale information is output. The YOLOv5 algorithm is a single-stage target detection algorithm that adds new improvements on top of YOLOv4, greatly improving both speed and precision. At the input end, improvements are applied in the model training stage, mainly Mosaic data enhancement, adaptive anchor frame calculation, and adaptive picture scaling; the backbone network fuses ideas from other detection algorithms, mainly the Focus structure and the CSP structure; and the target detection network often inserts additional layers between the Backbone and the final Head output layer.
In this embodiment, the method for preliminarily measuring the water level in the water gauge image can be determined according to the scale recognition condition. For example, if a character identifier corresponding to the adjacent scale is recognized, that character identifier is used directly as the initial water level height information. If the character identifier corresponding to the adjacent scale is not recognized but character identifiers corresponding to other scales are, the closest such scale is taken as the non-adjacent scale, whose distance from the horizontal plane is greater than that of the adjacent scale. In this case, the number of scales spaced between the non-adjacent scale and the adjacent scale is first determined; the product of that number and the actual height of a unit scale is calculated; and the difference between the character identifier of the non-adjacent scale and the calculated product is taken as the initial water level height information.
For example, if the water level height is measured with a water gauge that has no character identifiers, or no character identifier corresponding to any scale is recognized, the initial water level height information may be determined as follows: first, the product of the actual height corresponding to a unit scale of the water gauge and the total number of scales above the horizontal plane is calculated; then, the third difference between the total length of the water gauge and the calculated product is taken as the initial water level height information.
Step S905: and (5) performing water level refinement treatment.
The recognition result obtained by the preliminary water level recognition in step S904 may have a large error. For example, if the horizontal plane is located between the adjacent scale and the scale below it, the height between the horizontal plane and the adjacent scale is easily ignored, causing a recognition error; the scale information of the water gauge image may also be identified inaccurately.
In this embodiment, the initial water level height information is refined, for example, by calculating the ratio of the adjacent pixel height to the average pixel height of a water gauge scale and taking this ratio as the proportion of the adjacent pixel height to the average pixel height. The product of this proportion and the actual height corresponding to a unit scale of the water gauge is then calculated, the calculated product is determined as the measurement error corresponding to the initial water level height information, and the initial water level height information is adjusted based on the measurement error.
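The refinement of steps S905 and S906 reduces to a ratio and a product; the following is a hedged sketch with illustrative names, assuming the error is subtracted from the initial height as described below:

```python
def measurement_error(adjacent_px, average_px, unit_height_cm):
    # The ratio of the adjacent pixel height (horizontal plane to the
    # adjacent scale) to the average pixel height of one scale, scaled
    # by the actual unit-scale height, gives the measurement error.
    return (adjacent_px / average_px) * unit_height_cm

def refine(initial_cm, adjacent_px, average_px, unit_height_cm):
    # Step S906: subtract the measurement error from the initial
    # water level height information to obtain the final height.
    return initial_cm - measurement_error(adjacent_px, average_px, unit_height_cm)
```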
Step S906: and (5) identifying the water level.
The present embodiment subtracts the measurement error from the initial water level height information to obtain the final water level height.
Referring to fig. 12, fig. 12 is a schematic diagram illustrating a water level height identification method according to an exemplary embodiment of the present application, and as shown in fig. 12, the water level height identification method provided in this embodiment includes steps S1 to S6, and reference is made to the following in detail:
step S1: the average pixel height of the scale is calculated.
In this embodiment, the average pixel height of the scale can be obtained by dividing the total length of the water gauge by the total number of scales of the water gauge.
Step S2: the height of the first scaled pixel above the water surface is calculated.
And step S3: the magnitude relationship of the first scaled pixel height above the horizontal plane to the average pixel height is compared.
And step S4: if the height of the first scale pixel above the horizontal plane is greater than the average pixel height of the scales, subtract the average pixel height from the height of the first scale pixel above the horizontal plane, and then calculate the actual height between the first scale above the horizontal plane and the horizontal plane in proportion.
In this embodiment, if the height of the first scale pixel above the horizontal plane is greater than the average pixel height of the scales, this indicates that the scale closest to the horizontal plane was not identified during scale recognition on the water gauge. Therefore, when calculating the actual height between the first scale above the horizontal plane and the horizontal plane, the average pixel height is subtracted first, and the remaining height is then converted by proportion. Specifically, the ratio of the remaining pixel height to the average pixel height of a water gauge scale is calculated and taken as the proportion; the product of this proportion and the actual height corresponding to a unit scale of the water gauge is calculated, and the calculated product plus the height of one unit scale is determined as the measurement error corresponding to the initial water level height information.
Step S5: the initial water level height information is adjusted based on the actual height between the first scale above the horizontal plane and the horizontal plane.
Step S6: and if the height of the first scale pixel above the horizontal plane is less than the average pixel height of the scales, calculating the actual height between the first scale above the horizontal plane and the horizontal plane according to a proportion.
Referring to fig. 13, fig. 13 is a block diagram of a water level height recognition apparatus according to an exemplary embodiment of the present application, and as shown in fig. 13, the water level height recognition apparatus 2000 includes an extraction module 2001, an object detection module 2002, a recognition module 2003, a first determination module 2004, and an adjustment module 2005.
The extraction module 2001 is used for extracting an image to be identified from the acquired water gauge video stream; the target detection module 2002 is configured to perform target detection on the image to be recognized to obtain an image area above the horizontal plane, and acquire the image area as the water gauge image; the identification module 2003 is used for identifying scale information above the horizontal plane in the water gauge image and determining initial water level height information of the horizontal plane based on the scale information, where the scale information includes an adjacent scale that is nearest to the horizontal plane; the first determining module 2004 is configured to determine the pixel height from the horizontal plane to the adjacent scale, take the determined pixel height as the adjacent pixel height, and determine the measurement error corresponding to the initial water level height information based on the proportion of the adjacent pixel height to the average pixel height of a water gauge scale; the adjusting module 2005 is configured to adjust the initial water level height information according to the measurement error to obtain the water level height information.
In another exemplary embodiment, the water level height recognition apparatus 2000 further includes a second determination module, a calculation module, where the second determination module is configured to determine an adjacent pixel area corresponding to each pixel in the image to be recognized, where the adjacent pixel area includes a preset number of surrounding pixels of the corresponding pixel; the calculation module is used for calculating an average pixel value of pixels contained in the adjacent pixel area, and replacing the average pixel value with a pixel value of a corresponding pixel, so that the obtained processed image is used as an image to be identified for target detection.
In another exemplary embodiment, the water level height identifying apparatus 2000 further includes a graying processing module and a grayscale linear transformation module, wherein the graying processing module is configured to perform graying processing on the processed image to obtain a corresponding grayscale image; the gray scale linear transformation module is used for carrying out gray scale linear transformation on the gray scale image based on a preset linear transformation formula to obtain a corresponding linear gray scale image so as to take the linear gray scale image as an image to be identified for target detection, wherein the value range of the intercept in the linear transformation formula is determined by the gray scale mean value of the gray scale image.
In another exemplary embodiment, the water level height identifying apparatus 2000 further includes a statistics module, an equalization module, and a third determination module. The statistics module is configured to count histogram distribution information of each gray level in the linear gray image; the third determination module is configured to determine a cumulative gray probability distribution function corresponding to the histogram distribution information; and the equalization module is configured to perform histogram equalization on the image to be recognized by taking the cumulative gray probability distribution function as the transformation formula, so that the obtained corrected image is used as the image to be recognized for target detection. In another exemplary embodiment, the water level height recognition apparatus 2000 further includes a second calculation module, a suppression module, and a fourth determination module. The second calculation module is configured to calculate the gradient amplitude of the corrected image in a first-order partial-derivative finite-difference manner; the suppression module is configured to perform non-maximum suppression on the gradient amplitude to obtain a suppressed gradient amplitude; and the fourth determination module is configured to determine edge pixel points of the corrected image according to the suppressed gradient amplitude and the gradient amplitude before suppression, so as to perform target detection on the corrected image based on the edge pixel points.
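For illustration, the gradient-amplitude and non-maximum-suppression stages can be sketched as follows. Forward differences and four quantized gradient directions are simplifying assumptions, not the exact scheme of the application:

```python
import numpy as np

def edge_magnitude_nms(img):
    """Gradient amplitude from first-order partial-derivative finite
    differences, followed by non-maximum suppression along the
    gradient direction quantized to four orientations."""
    f = img.astype(np.float64)
    gx = np.zeros_like(f)
    gy = np.zeros_like(f)
    gx[:, :-1] = f[:, 1:] - f[:, :-1]   # forward difference in x
    gy[:-1, :] = f[1:, :] - f[:-1, :]   # forward difference in y
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180
    out = np.zeros_like(mag)
    for y in range(1, mag.shape[0] - 1):
        for x in range(1, mag.shape[1] - 1):
            a = ang[y, x]
            if a < 22.5 or a >= 157.5:        # horizontal gradient
                n1, n2 = mag[y, x - 1], mag[y, x + 1]
            elif a < 67.5:                     # 45-degree direction
                n1, n2 = mag[y - 1, x + 1], mag[y + 1, x - 1]
            elif a < 112.5:                    # vertical gradient
                n1, n2 = mag[y - 1, x], mag[y + 1, x]
            else:                              # 135-degree direction
                n1, n2 = mag[y - 1, x - 1], mag[y + 1, x + 1]
            # Keep only local maxima along the gradient direction.
            if mag[y, x] >= n1 and mag[y, x] >= n2:
                out[y, x] = mag[y, x]
    return out
```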
In another exemplary embodiment, the recognition module 2003 includes a first calculation unit and a second calculation unit. The first calculation unit is configured to calculate the ratio of the adjacent pixel height to the average pixel height and determine the ratio as the proportion of the adjacent pixel height to the average pixel height; the second calculation unit is configured to calculate the product of the proportion and the actual average height corresponding to a water gauge scale, and determine the calculated product as the measurement error corresponding to the initial water level height information.
In another exemplary embodiment, the recognition module 2003 includes a first determination unit, a third calculation unit, and a second determination unit. The first determination unit is configured to determine the number of scales spaced between a non-adjacent scale and an adjacent scale; the third calculation unit is configured to calculate the product of the number of scales and the actual height of a unit scale; and the second determination unit is configured to determine the difference between the character identifier of the non-adjacent scale and the calculated product, and take the obtained difference as the initial water level height information.
In another exemplary embodiment, the recognition module 2003 includes a third calculation unit and a height acquisition unit, wherein the third calculation unit is configured to calculate a product of an actual average height corresponding to the water gauge scale and a total number of scales; the height acquisition unit is used for taking a third difference value between the total length of the water gauge and the calculated product value as initial water level height information.
It should be noted that the apparatus provided in the foregoing embodiment and the method provided in the foregoing embodiment belong to the same concept, and the specific manner in which each module and unit execute operations has been described in detail in the method embodiment, and is not described again here.
In another exemplary embodiment, the present application provides an electronic device comprising a processor and a memory, wherein the memory has stored thereon computer readable instructions which, when executed by the processor, implement the foregoing water level height identification method. In this embodiment, the electronic device includes, but is not limited to, a mobile phone, a computer, an intelligent voice interaction device, an intelligent appliance, a vehicle-mounted terminal, and the like.
FIG. 14 illustrates a schematic structural diagram of a computer system suitable for implementing the electronic device of the embodiments of the present application.
It should be noted that the computer system 1000 of the electronic device shown in fig. 14 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 14, the computer system 1000 includes a Central Processing Unit (CPU) 1001 that can perform various appropriate actions and processes, such as performing the water level height identification method in the above-described embodiments, according to a program stored in a Read-Only Memory (ROM) 1002 or a program loaded from a storage portion 1008 into a Random Access Memory (RAM) 1003. In the RAM 1003, various programs and data necessary for system operation are also stored. The CPU 1001, ROM 1002, and RAM 1003 are connected to each other via a bus 1004. An Input/Output (I/O) interface 1005 is also connected to the bus 1004.
The following components are connected to the I/O interface 1005: an input section 1006 including a keyboard, a mouse, and the like; an output section 1007 including a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and a speaker; a storage portion 1008 including a hard disk and the like; and a communication section 1009 including a Network interface card such as a LAN (Local Area Network) card, a modem, or the like. The communication section 1009 performs communication processing via a network such as the internet. The driver 1010 is also connected to the I/O interface 1005 as necessary. A removable medium 1011 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 1010 as necessary, so that a computer program read out therefrom is mounted into the storage section 1008 as necessary.
In particular, according to embodiments of the application, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising a computer program for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication part 1009 and/or installed from the removable medium 1011. When the computer program is executed by a Central Processing Unit (CPU) 1001, various functions defined in the system of the present application are executed.
It should be noted that the computer readable medium shown in the embodiments of the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a flash Memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with a computer program embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. The computer program embodied on the computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
It is understood that in the specific implementation of the present application, the data related to the user information, etc. need to obtain user permission or consent when the above embodiments of the present application are applied to specific products or technologies, and the collection, use and processing of the related data need to comply with the related laws and regulations and standards of the related countries and regions.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. Each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software, or may be implemented by hardware, and the described units may also be disposed in a processor. Wherein the names of the elements do not in some way constitute a limitation on the elements themselves.
Another aspect of the present application also provides a computer readable storage medium, on which computer readable instructions are stored, and the computer readable instructions, when executed by a processor, implement the water level height identification method according to any one of the previous embodiments.
Another aspect of the application also provides a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and executes the computer instructions, so that the computer device executes the water level height identification method provided in the above embodiments.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the contents and operations/steps, nor do they necessarily have to be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
The above description is only a preferred exemplary embodiment of the present application, and is not intended to limit the embodiments of the present application, and those skilled in the art can easily make various changes and modifications according to the main concept and spirit of the present application, so that the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (12)

1. A water level height identification method is characterized by comprising the following steps:
extracting an image to be identified from the collected water gauge video stream;
carrying out target detection on the image to be recognized to obtain an image area above a horizontal plane, and acquiring the image area as the water gauge image;
identifying scale information above a horizontal plane in the water gauge image, and determining initial water level height information of the horizontal plane based on the scale information, wherein the scale information comprises adjacent scales which are nearest to the horizontal plane;
determining the pixel height of the adjacent scale from the horizontal plane, taking the determined pixel height as the adjacent pixel height, and determining the measurement error corresponding to the initial water level height information based on the proportion of the adjacent pixel height to the average pixel height of the water scale;
and adjusting the initial water level height information according to the measurement error to obtain water level height information.
2. The method according to claim 1, wherein before the target detection of the image to be recognized is performed to obtain an image area above a horizontal plane, and the image area is acquired as the water gauge image, the method further comprises:
determining an adjacent pixel area corresponding to each pixel in the image to be recognized, wherein the adjacent pixel area comprises a preset number of surrounding pixels of the corresponding pixel;
and calculating the average pixel value of the pixels contained in the adjacent pixel area, and replacing the pixel value of the corresponding pixel with the average pixel value, so that the obtained processed image is used as an image to be identified for target detection.
3. The method according to claim 2, further comprising:
performing graying processing on the processed image to obtain a corresponding grayscale image;
and performing a gray-level linear transformation on the grayscale image based on a preset linear transformation formula to obtain a corresponding linear grayscale image, the linear grayscale image being used as the image to be recognized for target detection, wherein the value range of the intercept in the linear transformation formula is determined by the mean gray level of the grayscale image.
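A gray-level linear transform has the form g' = a·g + b. The patent ties the intercept's admissible range to the image's gray mean but does not publish the exact formula, so the slope and the mean-based default intercept below are assumptions for illustration:

```python
import numpy as np

def gray_linear_transform(gray, slope=1.2, intercept=None):
    """Linear grey-level transform g' = slope * g + intercept, clipped to [0, 255].
    Defaulting the intercept to a fraction of the grey mean is our assumption;
    claim 3 only states that the intercept's range depends on that mean."""
    if intercept is None:
        intercept = -0.2 * gray.mean()
    out = slope * gray.astype(float) + intercept
    return np.clip(out, 0, 255).astype(np.uint8)

img = np.array([[0, 100], [200, 255]], dtype=np.uint8)
stretched = gray_linear_transform(img)
```

With slope > 1 and a negative intercept the transform stretches contrast: dark pixels are pushed toward 0 and bright ones toward 255, which makes the gauge markings stand out against the water.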
4. The method according to claim 3, further comprising:
counting histogram distribution information of each gray level in the linear grayscale image;
determining the cumulative gray-level probability distribution function corresponding to the histogram distribution information;
and performing histogram equalization on the image to be recognized using the cumulative gray-level probability distribution function as the transformation formula, the resulting corrected image being used as the image to be recognized for target detection.
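Claim 4 is textbook histogram equalization: the cumulative probability distribution of gray levels becomes the intensity mapping itself. A compact sketch:

```python
import numpy as np

def equalize(gray):
    """Histogram equalisation per claim 4: histogram -> cumulative
    probability distribution -> use the CDF as the grey-level mapping."""
    hist = np.bincount(gray.ravel(), minlength=256)   # per-level counts
    cdf = np.cumsum(hist) / gray.size                 # cumulative probability in [0, 1]
    lut = np.round(cdf * 255).astype(np.uint8)        # transformation table
    return lut[gray]                                  # apply mapping per pixel

img = np.array([[50, 50], [50, 200]], dtype=np.uint8)
eq = equalize(img)
```

Here 3 of 4 pixels sit at level 50 (CDF 0.75), so they map to 191, while the brightest level maps to 255 — the narrow input range is spread over the full output range.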
5. The method according to claim 4, further comprising:
obtaining the gradient magnitude of each pixel in the corrected image;
performing non-maximum suppression on the gradient magnitudes of the pixels, and determining a non-maximum-suppressed image based on the resulting gradient magnitudes;
and detecting edge information of the corrected image based on the non-maximum-suppressed image, so as to perform target detection on the corrected image based on the edge information.
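Claim 5 is the gradient/non-maximum-suppression stage of a Canny-style edge detector. The sketch below uses central differences instead of the full Sobel kernels and a simplified two-direction suppression (full Canny quantizes four directions); both are our simplifications, not the patent's:

```python
import numpy as np

def gradients(gray):
    """Central-difference gradients (a lightweight stand-in for Sobel;
    claim 5 only requires per-pixel gradient magnitudes)."""
    g = gray.astype(float)
    gx = np.zeros_like(g)
    gy = np.zeros_like(g)
    gx[:, 1:-1] = g[:, 2:] - g[:, :-2]
    gy[1:-1, :] = g[2:, :] - g[:-2, :]
    return gx, gy

def non_max_suppress(gx, gy):
    """Keep a pixel only if it is a local maximum along its dominant
    gradient axis, thinning edges to the strongest response."""
    mag = np.hypot(gx, gy)
    out = np.zeros_like(mag)
    for y in range(1, mag.shape[0] - 1):
        for x in range(1, mag.shape[1] - 1):
            if abs(gx[y, x]) >= abs(gy[y, x]):      # mostly horizontal change
                neighbours = (mag[y, x - 1], mag[y, x + 1])
            else:                                    # mostly vertical change
                neighbours = (mag[y - 1, x], mag[y + 1, x])
            if mag[y, x] >= max(neighbours):
                out[y, x] = mag[y, x]
    return out

img = np.zeros((5, 5), dtype=np.uint8)
img[:, 2:] = 255                     # vertical step edge between columns 1 and 2
edges = non_max_suppress(*gradients(img))
```

On the step image the response survives only at the edge columns; flat regions are suppressed to zero, which is what lets the subsequent stage trace the gauge's outline.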
6. The method according to claim 1, wherein determining the measurement error corresponding to the initial water level height information based on the ratio of the adjacent pixel height to the average pixel height of the water gauge scales comprises:
calculating the quotient of the adjacent pixel height and the average pixel height, and taking the quotient as the ratio of the adjacent pixel height to the average pixel height;
and calculating the product of the ratio and the actual average height corresponding to the water gauge scales, and taking the calculated product as the measurement error corresponding to the initial water level height information.
7. The method according to claim 1, wherein the scale information further comprises a character mark of a non-adjacent scale, the distance from the non-adjacent scale to the water surface being greater than the distance from the adjacent scale to the water surface; and determining the initial water level height information of the water surface based on the scale information comprises:
determining the number of scale intervals between the non-adjacent scale and the adjacent scale;
calculating the product of the number of scale intervals and the actual height of a unit scale;
and determining the difference between the value of the character mark of the non-adjacent scale and the calculated product, and taking the resulting difference as the initial water level height information.
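The arithmetic of claim 7 in one line — read the nearest legible character mark, then subtract the scale intervals spanned down to the adjacent scale. The numbers (a "1.6 m" mark, 2 cm intervals) are hypothetical:

```python
def initial_level_from_mark(mark_value_m, scales_between, unit_scale_m=0.02):
    """Claim-7 reading: character mark value minus the height of the
    intervals between that mark and the scale nearest the water surface."""
    return mark_value_m - scales_between * unit_scale_m

# Mark reads 1.6 m; five 2 cm intervals separate it from the adjacent scale.
level = initial_level_from_mark(1.6, 5)
```

This yields 1.6 − 5 × 0.02 = 1.5 m as the initial water level, before the claim-1 error correction is applied.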
8. The method according to claim 1, wherein the scale information further comprises the total number of scales above the water surface; and determining the initial water level height information of the water surface based on the scale information comprises:
calculating the product of the actual average height corresponding to the water gauge scales and the total number of scales;
and taking the difference between the total length of the water gauge and the calculated product as the initial water level height information.
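Claim 8 covers the case where only the count of visible scale intervals is known: the submerged length (the water level on the gauge) is the gauge's total length minus the exposed length. A sketch with hypothetical dimensions:

```python
def initial_level_from_count(total_length_m, scales_above, avg_scale_m=0.02):
    """Claim-8 reading: gauge total length minus (visible scale count
    times actual average scale height)."""
    return total_length_m - scales_above * avg_scale_m

# A 2 m gauge with 30 visible 2 cm intervals above the water line.
level = initial_level_from_count(2.0, 30)
```

Here 30 × 0.02 m = 0.6 m of gauge is exposed, so 1.4 m is submerged — that value becomes the initial water level height information.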
9. The method according to claim 1, wherein recognizing the scale information above the water surface in the water gauge image comprises:
inputting the water gauge image into a pre-trained water gauge scale recognition model to obtain the scale information output by the model;
and the method further comprises:
sending notification information to a service platform if the adjacent pixel height is greater than the average pixel height, so that the service platform adjusts the model parameters of the water gauge scale recognition model based on the notification information.
10. A water level height recognition apparatus, characterized by comprising:
an extraction module, configured to extract an image to be recognized from a captured water gauge video stream;
a target detection module, configured to perform target detection on the image to be recognized to obtain an image area above the water surface, and to take the image area as a water gauge image;
a recognition module, configured to recognize scale information above the water surface in the water gauge image and to determine initial water level height information of the water surface based on the scale information, wherein the scale information comprises an adjacent scale, the adjacent scale being the scale closest to the water surface;
a first determining module, configured to determine the pixel height from the adjacent scale to the water surface as an adjacent pixel height, and to determine a measurement error corresponding to the initial water level height information based on the ratio of the adjacent pixel height to the average pixel height of the water gauge scales;
and an adjusting module, configured to adjust the initial water level height information according to the measurement error to obtain the water level height information.
11. An electronic device, comprising:
a memory storing computer-readable instructions;
a processor configured to read the computer-readable instructions stored in the memory, so as to perform the method of any one of claims 1 to 9.
12. A computer-readable storage medium having computer-readable instructions stored thereon, which, when executed by a processor of a computer, cause the computer to perform the method of any one of claims 1-9.
CN202210895717.1A 2022-07-26 2022-07-26 Water level height identification method and device, electronic equipment and storage medium Pending CN115393700A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210895717.1A CN115393700A (en) 2022-07-26 2022-07-26 Water level height identification method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210895717.1A CN115393700A (en) 2022-07-26 2022-07-26 Water level height identification method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115393700A true CN115393700A (en) 2022-11-25

Family

ID=84117587

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210895717.1A Pending CN115393700A (en) 2022-07-26 2022-07-26 Water level height identification method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115393700A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116342965A (en) * 2023-05-26 2023-06-27 中国电建集团江西省电力设计院有限公司 Water level measurement error analysis and control method and system
CN116342965B (en) * 2023-05-26 2023-11-24 中国电建集团江西省电力设计院有限公司 Water level measurement error analysis and control method and system
CN117808815A (en) * 2024-03-01 2024-04-02 北京阿迈特医疗器械有限公司 Method and device for detecting quality of outer wall of implantation and intervention tubular instrument
CN117808815B (en) * 2024-03-01 2024-04-26 北京阿迈特医疗器械有限公司 Method and device for detecting quality of outer wall of implantation and intervention tubular instrument

Similar Documents

Publication Publication Date Title
CN109766878B (en) A kind of method and apparatus of lane detection
CN110287932B (en) Road blocking information extraction method based on deep learning image semantic segmentation
CN115393700A (en) Water level height identification method and device, electronic equipment and storage medium
CN109001780A (en) A kind of adaptive SAR satellite surface vessel target In-flight measurement method
CN110619258B (en) Road track checking method based on high-resolution remote sensing image
CN105894504B (en) Manhole cover loss detection method based on image
CN104361590A (en) High-resolution remote sensing image registration method with control points distributed in adaptive manner
CN106156758B (en) A kind of tidal saltmarsh method in SAR seashore image
CN107301649B (en) Regional merged SAR image coastline detection algorithm based on superpixels
CN108921165A (en) Water level recognition methods based on water gauge image
CN110490150A (en) A kind of automatic auditing system of picture violating the regulations and method based on vehicle retrieval
CN115099653A (en) Method and system for evaluating risk of oil spill accidents under extreme weather conditions of Bohai Bay
CN110889840A (en) Effectiveness detection method of high-resolution 6 # remote sensing satellite data for ground object target
CN109241867A (en) Using the method and device of intelligent algorithm identification digital cores image
Cai et al. Broken ice circumferential crack estimation via image techniques
CN114639064B (en) Water level identification method and device
CN115060343B (en) Point cloud-based river water level detection system and detection method
CN117233762B (en) Reservoir monitoring method based on GB-SAR
CN106940782A (en) High score SAR based on variogram increases construction land newly and extracts software
Wang et al. A dynamic marine oil spill prediction model based on deep learning
Li et al. Automatic detection of actual water depth of urban floods from social media images
Dong et al. Pixel-level intelligent segmentation and measurement method for pavement multiple damages based on mobile deep learning
CN112017213B (en) Target object position updating method and system
Utomo et al. Early warning flood detector adopting camera by Sobel Canny edge detection algorithm method
CN117152617A (en) Urban flood identification method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination