US20190078882A1 - Water level measurement system and water level measurement method - Google Patents


Info

Publication number
US20190078882A1
Authority
US
United States
Prior art keywords
water level
water
flow
processing unit
graph
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US16/080,419
Other versions
US10473463B2
Inventor
Yoshiki AGATA
Current Assignee
Hitachi Kokusai Electric Inc
Original Assignee
Hitachi Kokusai Electric Inc
Priority date
Filing date
Publication date
Application filed by Hitachi Kokusai Electric Inc filed Critical Hitachi Kokusai Electric Inc
Assigned to HITACHI KOKUSAI ELECTRIC INC. reassignment HITACHI KOKUSAI ELECTRIC INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: AGATA, Yoshiki
Publication of US20190078882A1
Application granted
Publication of US10473463B2
Legal status: Active
Anticipated expiration

Classifications

    • G01C13/00 Surveying specially adapted to open water, e.g. sea, lake, river or canal
    • G01C13/002 Measuring the movement of open water
    • G01F23/292 Indicating or measuring liquid level by electromagnetic waves; light, e.g. infrared or ultraviolet
    • G06T7/12 Edge-based segmentation
    • G06T7/162 Segmentation; edge detection involving graph-based methods
    • G06T7/20 Analysis of motion
    • G06T7/215 Motion-based segmentation
    • G06T7/60 Analysis of geometric attributes
    • G06T2207/10016 Video; image sequence
    • G06T2207/10024 Color image
    • G06T2207/30181 Earth observation
    • G06T2207/30241 Trajectory
    • Y02A90/30 Assessment of water resources

Abstract

In a technology for detecting the water level of a river by image processing, after an angle-of-view setting process, an area setting unit sets a certain range, arbitrarily definable, around the center of the set angle of view as a processing area. A flow processing unit then calculates motion information and a flow direction from the processing area, computes a flow density, determines a region having a high density and similar flow directions to be the flow of the water current, and deletes flows in the other directions. Thereafter, a graph-cut processing unit creates an object seed and a background seed for graph cutting, and detects the water surface by automatic graph cutting. After an edge extraction processing unit has performed edge extraction, a water level calculation processing unit determines an edge that satisfies a predetermined condition to be the water level line, and outputs a water level measurement result.

Description

    FIELD OF THE INVENTION
  • The present invention relates to a water level measurement system and a water level measurement method; and, particularly, to a water level measurement system and a water level measurement method using an image processing technique.
  • BACKGROUND OF THE INVENTION
  • As for generally used water level measurement methods, there are two: a method using a water gauge and a method using image processing. The method using a water gauge can measure a water level with high accuracy by transmitting a pressure measured by a detector installed in the water to a measuring board via a repeater. Various techniques have been proposed for the method using image processing, which is regarded as a replacement for the water gauge method (see, e.g., Patent Document 1). FIG. 1 shows an outline of water level measurement using image processing. A monitoring camera 500 installed beside a river RV images a predetermined area CA of the river RV which includes a structure OB. In the measurement method using image processing, a water level is measured by detecting a boundary WL with respect to a water surface W from an image B0 of the monitoring camera 500 through image processing.
  • In a disaster prevention system, when it is determined that the water level of a river, which is constantly measured by the above methods, has reached a dangerous level, an alarm is sent to a monitoring center, for example, and the monitoring center that received the notification gives an evacuation instruction to neighboring residents.
  • Patent Document 1: Japanese Patent Application Publication No. 2008-057994
  • However, the above-described two water level measurement methods have drawbacks. First, the method using a water gauge, which has been introduced for river monitoring, has two major drawbacks. The first drawback is that it is difficult to observe fine water level variation between the upstream side and the downstream side of a river, since only a small number of water gauges are installed. When a monitoring camera and a water gauge are distant from each other, a discrepancy arises between the water level appearing in the camera image and the measurement value of the water gauge. Although this drawback can be solved by increasing the number of water gauges, a high installation cost and a high maintenance cost would be required, which is not realistic. The second drawback is that, when an observer checks only a measurement value of the water gauge, it is difficult to grasp the state of the river corresponding to the current measurement value. Accordingly, it is difficult for the observer to make an accurate judgment, and an evacuation instruction may be delayed.
  • The first drawback can be solved by using existing monitoring cameras, because monitoring cameras are often installed at short intervals along major rivers managed by the Ministry of Land, Infrastructure and Transport of Japan. If the water level can be measured from these camera images, it is possible to observe fine water level variation of the river and minimize the equipment installation cost. The second drawback can be solved by displaying the current water level on a camera image. By drawing the water level line on the image, it is easier to visually judge a dangerous situation, and a prompt evacuation instruction can be made. Due to these advantages, the demand for water level measurement by image processing using existing monitoring cameras is increasing.
  • However, water level measurement using image processing has disadvantages in initial setting and measurement accuracy. First, in regard to the initial setting, an expert has to set the angle of view in order to perform image processing with high accuracy. This is disadvantageous in that the number of working processes at the time of introduction is increased and it is difficult to change the angle of view after the introduction. In regard to the measurement accuracy, the accuracy may deteriorate depending on the environment of the measurement target area: specifically, when the water surface cannot be detected accurately, and when motion other than the water flow has an adverse effect. Water level measurement using image processing is realized by finding the boundary line between the water surface and a pier or the like above the water surface; therefore, if the water surface cannot be detected accurately, the accuracy deteriorates. FIG. 2 shows an example of an image B1 having a reflection R of the structure OB on the water surface W. If the structure OB, such as a bridge pier, is reflected on the water surface W, the water surface W may be erroneously determined to be the structure OB. Therefore, a new technique for measuring a water level through image processing is required.
  • SUMMARY OF THE INVENTION
  • The present invention has been made in view of such conventional circumstances, and it is an object of the present invention to solve the above drawbacks.
  • In accordance with an aspect of the present invention, there is provided a water level measuring system including: a flow processing unit configured to acquire an image with a water surface and a structure and calculate motion information (motion vector); a graph-cut processing unit configured to specify a water surface region of the water surface by using a graph cut theory based on the motion information; and a water level calculating unit configured to calculate a water level based on a boundary between the water surface region and another region.
  • The flow processing unit may extract a water current flow from the motion information, and the graph-cut processing unit may extract the water surface region by performing labeling processing on the extracted water current flow and the other flows based on the graph cut theory.
  • The water level measuring system may further include an edge extraction processing unit configured to extract an edge near a boundary between the water surface region specified by the graph-cut processing unit and another region and set an edge satisfying a predetermined feature from the extracted edge to a boundary for calculating the water level.
  • The water level measurement system may further include a setting unit configured to set an angle of view of a camera that images the image with the water surface and the structure by manipulating a cross bar displayed on a screen.
  • In accordance with another aspect of the present invention, there is provided a water level measurement method for measuring a water level of a river by using an image processing technique, the method including: a flow processing step of acquiring an image with a water surface and a structure and calculating motion information (motion vector); a graph cut processing step of specifying a region of the water surface by using a graph cut theory based on the motion information; and a water level calculating step of calculating a water level based on a boundary between the water surface region and another region.
  • In the flow processing step, a water current flow may be extracted from the motion information, and in the graph cut processing step, the water surface region may be extracted by performing labeling processing on the extracted water current flow and the other flows based on the graph cut theory.
  • The water level measurement method may further include an edge extracting step of extracting an edge near the boundary between the water surface region specified in the graph cut processing step and another region, and setting an edge satisfying a specific feature, from among the extracted edges, as the boundary for calculating the water level.
  • Effect of the Invention
  • As described above, in accordance with the present invention, it is possible to realize high-accuracy water level measurement by detecting a water surface with high accuracy by an image processing technique using a graph cut theory based on motion information (motion vector).
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows an outline of water level measurement using image processing according to a background art.
  • FIG. 2 shows an example of reflection of an object on a water surface according to the background art.
  • FIGS. 3A to 3C show an example of motion information other than water flow according to an embodiment.
  • FIG. 4 schematically shows a water level measurement system 10 according to an embodiment which performs image processing using a river monitoring camera.
  • FIG. 5 is a block diagram showing a configuration of an image processing apparatus according to an embodiment.
  • FIG. 6 is a flowchart of processes of a water level measurement system according to an embodiment.
  • FIG. 7 is a flowchart of an image processing algorithm for measuring a water level according to an embodiment.
  • FIGS. 8A to 8D show a process of setting an angle of view and an example of an interface according to an embodiment.
  • FIGS. 9A to 9C show a process of calculating an optical flow and a process of deleting flows other than water flow according to the embodiment.
  • FIG. 10 is a coloring map showing relation between a direction and a color of a vector according to an embodiment.
  • FIGS. 11A to 11C show a procedure of calculating a graph cut based on motion information according to an embodiment.
  • FIGS. 12A to 12C show a procedure of calculating a graph cut based on motion information in the case where there is reflection according to an embodiment.
  • FIGS. 13A to 13D show processes from edge extraction to output of a water level measurement result according to an embodiment.
  • FIG. 14 shows an interface for displaying the water level measurement result according to an embodiment.
  • DETAILED DESCRIPTION OF THE EMBODIMENTS
  • Next, a mode for implementing the present invention (hereinafter, simply referred to as “embodiment”) will be described in detail with reference to the drawings. In an embodiment to be described below, high-accuracy water level measurement is realized by detecting a water surface with high accuracy by an image processing technique referred to as “graph-cut” based on motion information (motion vector), and detecting and deleting the motion information other than water flow. At the time of initial setting, an interface that allows an appropriate angle of view to be obtained simply by setting an angle of view such that a cross bar on a setting screen becomes the center of a bridge pier or the like above the water surface is realized.
  • In the technique using graph cut based on motion information, it is necessary to appropriately judge motion of water flow. FIGS. 3A to 3C show examples in which there is motion information other than water flow. The motion information is indicated by arrows. FIG. 3A shows an example in which only water flow (motion of the water surface W) is shown as the motion information. FIG. 3B shows an example in which there is motion information in a grass area G and a tree T. FIG. 3C shows an example in which there is motion information in a structure OB such as a bridge pier or the like. In this case, generally, the motion information occurs due to shake of the camera itself. In the case of using the motion information to find the water surface W, the accuracy may deteriorate due to the motion of plants (the grass area G and the tree T) or the motion caused by the shake of the camera itself other than water flow (motion of the water surface W). Therefore, it is required to properly distinguish and remove such unnecessary motion information. Hereinafter, a river monitoring technique will be described in detail.
  • FIG. 4 schematically shows the water level measurement system 10 that performs image processing using a river monitoring camera 30. The water level measurement system 10 includes the river monitoring camera 30 installed at a river and a monitoring center 20 connected thereto via a network 90. An image processing apparatus 40 is provided between the monitoring center 20 and the network 90. Here, an upstream camera 31 installed at an upstream side of the river and a downstream camera 32 installed at a downstream side of the river are used as examples of the river monitoring camera 30. When it is not necessary to distinguish the upstream camera 31 and the downstream camera 32, they will be simply described as the river monitoring camera 30.
  • An image from the river monitoring camera 30 installed at the river is sent to the monitoring center 20 via the network 90. The image processing apparatus 40 arranged therebetween performs water level measurement through image processing and displays a water level measurement result on a monitor 21 of the monitoring center 20. The water level measurement result and a position of the camera with respect to the entire river map are superimposed and displayed on the monitor 21.
  • In the monitoring center 20, the monitor image is observed. When a manager determines that the situation is dangerous, an evacuation instruction is issued. Then, evacuation broadcasting for the dangerous area is executed and evacuation instruction vehicles or the like are dispatched so that the evacuation of the neighborhood can be carried out.
  • FIG. 5 is a block diagram showing a configuration of the image processing apparatus 40. The image processing apparatus 40 includes a communication interface unit 42 for connecting with the monitoring center 20 and the river monitoring camera 30, a setting unit 50, and an image processing unit 60. Each component of the image processing apparatus 40 is implemented by a memory and an LSI such as an MPU, and its function is realized by executing a program stored in the memory.
  • The image processing unit 60 includes an area setting processing unit 61, a flow processing unit 62, a graph-cut processing unit 63, an edge extraction processing unit 64, and a water level calculation processing unit 65. Specific functions of the elements of the setting unit 50 and the image processing unit 60 will be described together with the processing flow.
  • In the present embodiment, the water level measurement system 10 that utilizes the existing river monitoring camera 30 is realized by providing the image processing apparatus 40 having the image processing function. However, when the river monitoring camera 30 itself has the function of the image processing apparatus 40, the image processing apparatus 40 becomes unnecessary. As the river monitoring camera 30, either a visible light camera or a far infrared camera can be used.
  • FIG. 6 is a flowchart of processes of the water level measurement system 10. First, as initial setting, a manager or the like manipulates the setting unit 50 to set an angle of view of the river monitoring camera 30 (S10). Next, the image processing unit 60 executes an image processing mode (S11). The image processing mode and the angle of view setting process will be described in detail later with reference to FIGS. 7 to 14.
  • The image processing unit 60 constantly measures the water level and outputs a result. When the result exceeds a preset dangerous water level (Y in S12), a danger determination unit 70 issues an alarm (S15). Then, a manager determines whether or not the situation is dangerous (S16); when the manager determines that it is dangerous and issues an evacuation instruction (Y in S16), the evacuation of the neighborhood is executed (S17). Even if the alarm is not issued (N in S12), if the manager determines from the monitoring image that the situation is dangerous (Y in S13), the evacuation instruction is issued (S14). If the manager determines that an evacuation instruction is unnecessary (N in S13 and N in S16), the image processing mode is continued (S11).
  • FIG. 7 is a flowchart of the operation of the image processing unit 60, i.e., the image processing algorithm (the image processing mode (S11) in FIG. 6) for measuring a water level. After the process of setting the angle of view (S10) of FIG. 6, the process of the image processing mode (S11) is performed.
  • First, the area setting processing unit 61 sets, as a processing area, a certain range around the center of the set angle of view (S21). At this time, the area setting processing unit 61 sets the processing area to be within the range of a background object such as a bridge pier or the like. Next, the flow processing unit 62 calculates motion information (flow) from the processing area (S22), and further calculates a flow direction (S23).
  • Then, the flow processing unit 62 calculates a flow density, determines a region having a high density and similar flow directions to be a flow of water current, and deletes flows in the other directions (S24). Accordingly, it is possible to delete the flows of plants and the flows due to the shake of the camera itself.
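The density-and-direction screening of step S24 can be sketched as follows. This is a minimal illustration rather than the patent's implementation; the number of direction bins, the neighborhood radius, and the neighbor-count threshold are assumed values.

```python
import numpy as np

def filter_water_flows(points, vectors, n_bins=8, radius=30.0, min_neighbors=5):
    """Keep flows that lie in a dense cluster of similarly directed motion.

    points  : (N, 2) array of feature-point positions (x, y)
    vectors : (N, 2) array of motion vectors at those points
    Returns a boolean mask of flows judged to belong to the water current;
    sparse flows (plants, camera shake) fail the density test and are dropped.
    """
    # Quantize each vector's direction into n_bins sectors of 360/n_bins degrees
    angles = np.degrees(np.arctan2(vectors[:, 1], vectors[:, 0])) % 360.0
    bins = ((angles + 22.5) // (360.0 / n_bins)).astype(int) % n_bins

    keep = np.zeros(len(points), dtype=bool)
    for i in range(len(points)):
        # Count neighbors within `radius` pixels moving in the same direction bin
        d = np.linalg.norm(points - points[i], axis=1)
        same = (d < radius) & (bins == bins[i])
        keep[i] = same.sum() >= min_neighbors  # count includes the point itself
    return keep
```

A dense group of same-direction flows survives, while isolated flows in other directions are deleted, mirroring the behavior described for S24.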
  • Next, the graph-cut processing unit 63 creates an object seed for graph cutting by using the narrowed-down water current flow (S25). Here, a seed corresponds to a label; this will be described in detail later.
  • In the graph cutting, as the initial setting, a user selects a certain pixel in an object and a certain pixel in a background and cuts out the object based on label information. Therefore, the graph-cut processing unit 63 sets a region including a water current flow to an object seed and sets a region with less flow and separated by a certain distance from the water current flow to a background seed (S26). With this processing, it is possible to make the graph cutting automatic. Then, the graph-cut processing unit 63 detects a water surface by automatic graph cutting (S27).
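The automatic seeding of steps S25 and S26 can be sketched as deriving two horizontal seed lines from the surviving water-current flow points: the object seed through the center of the flow distribution, and the background seed in a flow-free region some distance away. The margin value here is an illustrative assumption, not a figure from the patent.

```python
import numpy as np

def make_seeds(flow_points, margin=40):
    """Derive object/background seed rows for automatic graph cutting.

    flow_points : (N, 2) array of (x, y) positions of water-current flows
    Returns (object_row, background_row): the object seed is a horizontal
    line through the center of the flow distribution; the background seed
    is placed `margin` pixels above the topmost flow, where no flow exists.
    """
    ys = flow_points[:, 1]
    object_row = int(round(ys.mean()))               # center of the flow distribution
    background_row = max(0, int(ys.min()) - margin)  # flow-free region above the water
    return object_row, background_row
```

Because the water level moves up and down, these seed rows would be recomputed at regular intervals, as the embodiment notes for its seeds.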
  • Next, the edge extraction processing unit 64 extracts an edge from the vicinity of an upper end of the water surface detected by the graph-cut processing unit 63 (S28). The water level calculation processing unit 65 determines, as a water level line, an edge that has a certain length and that can be stably extracted for several frames among the extracted edges, and outputs a water level measurement result (S29). After the water level measurement result is outputted, the image processing mode is ended, and the processing proceeds to a dangerous level determination process of the danger determination unit 70 (process after S12 in FIG. 6).
  • FIGS. 8A to 8D show a process of setting the angle of view and an example of an interface. On setting screens A11 to A13, a cross bar CB is displayed at the center of the image. An appropriate angle of view can be set by manipulating the river monitoring camera 30 such that a background object OB suitable for edge extraction, such as a bridge pier, is positioned at the center of the cross bar CB (the center of the angle of view).
  • In the example shown in the drawings, when the initial angle of view is as shown in the setting screen A11 of FIG. 8A, a user such as a manager first moves the river monitoring camera 30 upward as shown in the setting screen A12 of FIG. 8B and then to the left as shown in the setting screen A13 of FIG. 8C so that the bridge pier (background object OB) is positioned at the center of the cross bar CB. At this time, the zoom is adjusted to a size in which the maximum level and the minimum level of the water surface W can be shown. When the setting of the angle of view is completed, the setting screen is closed and the cross bar CB is no longer displayed. Then, as shown in FIG. 8D, the setting unit 50 automatically sets an area indicated by dashed lines as a processing area PA. The processing area PA is defined by the range of α pixels to the right and the left of the center of the angle of view, and the value of α can be set arbitrarily. The setting unit 50 sets the processing area PA to be included within the background object OB such as a bridge pier or the like.
  • Next, the calculation of motion information and the calculation of the flow direction will be described. The motion information (flow) is calculated by computing an optical flow after the detection of feature points. The feature point detection can be performed by using SIFT (Scale-Invariant Feature Transform), SURF (Speeded Up Robust Features), or the like. The optical flow calculation can be performed by using block matching, a gradient-based method, the Horn-Schunck method, the Lucas-Kanade method, or the like. In the present invention, the KLT (Kanade-Lucas-Tomasi) tracker method, in which the feature point detection and the optical flow calculation are combined, is used.
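The patent adopts the KLT tracker; as a self-contained illustration of the simplest alternative listed above, block matching, the motion vector at a single feature point can be sketched as follows. The block size and search range are assumed parameters for illustration only.

```python
import numpy as np

def block_match_flow(prev, curr, point, block=5, search=6):
    """Motion vector at one feature point by exhaustive block matching.

    prev, curr : 2-D grayscale frames as numpy arrays
    point      : (x, y) integer position of the feature in `prev`
    Searches a (2*search+1)^2 window in `curr` for the block of `prev`
    around `point` that minimizes the sum of absolute differences (SAD).
    """
    x, y = point
    h = block // 2
    ref = prev[y - h:y + h + 1, x - h:x + h + 1].astype(np.int64)
    best, best_dxy = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            cand = curr[yy - h:yy + h + 1, xx - h:xx + h + 1].astype(np.int64)
            if cand.shape != ref.shape:
                continue  # candidate window fell outside the frame
            sad = np.abs(cand - ref).sum()
            if best is None or sad < best:
                best, best_dxy = sad, (dx, dy)
    return best_dxy  # (dx, dy) motion vector of the feature point
```

In practice the KLT combination of corner detection and pyramidal Lucas-Kanade tracking, as named in the text, would replace this exhaustive search.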
  • Since the optical flow gives the motion vector of each feature point, an angle can be obtained from the vector. The angle obtained from the vector is quantized at an arbitrary resolution and set as the flow direction.
  • FIGS. 9A to 9C show a process of coloring the calculated optical flow for each direction and deleting flows other than the water flow. FIG. 10 is a coloring map showing relation between a direction and a color of a vector. In the present embodiment, 360 degrees is classified into 8 ranges, and the classified ranges are colored with 8 types from COLOR 1 to COLOR 8. For example, in the case of a direction from −22.5 degrees to +22.5 degrees, a vector is displayed in a color (e.g., red) defined as COLOR 1.
  • The optical flow uses a current frame F1 and a past frame F0 as shown in FIG. 9A. By tracking the motion of the feature point between these two frames, the flow is calculated as shown in FIG. 9B and colored based on the coloring map of FIG. 10. Here, five types of directions, i.e., COLOR 1 (c1), COLOR 2 (c2), COLOR 3 (c3), COLOR 4 (c4), and COLOR 5 (c5), are shown.
  • Among these flows, flows having a high density and similar flow directions are set as the water flow directions, and the other flows are deleted. In the example of FIG. 9C, flows other than COLOR 1 (c1) are deleted.
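The direction quantization of the coloring map can be sketched as follows: 360 degrees is divided into eight 45-degree sectors, with COLOR 1 covering −22.5 to +22.5 degrees as in FIG. 10. The counter-clockwise ordering of the remaining colors is an assumption, since FIG. 10 itself is not reproduced here.

```python
import math

def direction_color(dx, dy, n_bins=8):
    """Map a motion vector to one of eight direction classes.

    COLOR 1 covers -22.5 to +22.5 degrees, COLOR 2 the next 45-degree
    sector, and so on around the circle. Returns the 1-based color index.
    """
    angle = math.degrees(math.atan2(dy, dx)) % 360.0        # angle in [0, 360)
    sector = int(((angle + 22.5) % 360.0) // (360.0 / n_bins))
    return sector + 1
```

Note that in image coordinates the y axis points downward, so whether a given index corresponds to "up" or "down" on screen depends on the convention chosen for the vectors.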
  • Next, the graph cut based on motion information will be explained. This process is performed by the graph-cut processing unit 63. The graph cut is a segmentation method for cutting out a target object from a single image. The graph cut can cut out a target object with high accuracy by solving an energy minimization problem. However, in the initial setting, a user is required to assign a label (seed) to an object region and a background region. Therefore, it is proposed here to automate the graph cut by performing automatic labeling, setting a water current flow region as the object area and an area with no flow as the background area. "Labeling" is also referred to as "assigning a seed".
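Graph cutting solves this energy minimization as an s-t minimum cut. As a compact, self-contained illustration (not the patent's 2-D formulation), the following segments a one-dimensional row of pixel intensities given one object seed and one background seed; the smoothness weight k is an assumed parameter.

```python
from collections import deque

def min_cut_segment(pixels, obj_seed, bg_seed, k=50.0, big=10**9):
    """Segment a 1-D row of intensities by s-t min cut (graph cut sketch).

    pixels   : list of pixel intensities
    obj_seed : index of a pixel known to be object ("water")
    bg_seed  : index of a pixel known to be background
    Neighboring pixels are linked with weight k/(1+|Ii-Ij|): similar
    neighbors are expensive to cut, so the cut falls on intensity edges.
    Seeded pixels are tied to the terminals with effectively infinite
    weight. Returns a list of booleans: True = object side.
    """
    n = len(pixels)
    s, t = n, n + 1
    cap = [[0.0] * (n + 2) for _ in range(n + 2)]
    for i in range(n - 1):
        w = k / (1.0 + abs(pixels[i] - pixels[i + 1]))
        cap[i][i + 1] = cap[i + 1][i] = w
    cap[s][obj_seed] = big   # hard link: object seed to source
    cap[bg_seed][t] = big    # hard link: background seed to sink

    def bfs_path():
        parent = {s: None}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in range(n + 2):
                if v not in parent and cap[u][v] > 1e-12:
                    parent[v] = u
                    if v == t:
                        return parent
                    q.append(v)
        return None

    while True:  # Edmonds-Karp max flow
        parent = bfs_path()
        if parent is None:
            break
        f, v = float("inf"), t   # bottleneck along the augmenting path
        while parent[v] is not None:
            u = parent[v]
            f = min(f, cap[u][v])
            v = u
        v = t
        while parent[v] is not None:
            u = parent[v]
            cap[u][v] -= f
            cap[v][u] += f
            v = u

    # Object side = vertices still reachable from s in the residual graph
    seen = {s}
    q = deque([s])
    while q:
        u = q.popleft()
        for v in range(n + 2):
            if v not in seen and cap[u][v] > 1e-12:
                seen.add(v)
                q.append(v)
    return [i in seen for i in range(n)]
```

In the embodiment the graph is two-dimensional and the data term comes from the flow-based seeds, but the principle is the same: the minimum cut severs the weakest links, which lie on the boundary between the water surface and the background.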
  • FIGS. 11A to 11C show a procedure of calculating the graph cut based on the motion information. First, with respect to the image of the calculated water current flow shown in FIG. 11A, a straight line is drawn at the center of the flow distribution as shown in FIG. 11B to create an object seed 98. Similarly, a straight line is drawn in a region with no flow, separated by a certain distance from the water current flow, to create a background seed 99. By performing the graph cut based on these seeds, it is possible to extract only the water surface W as shown in FIG. 11C. Since the water level moves up and down, the seeds need to be updated at regular intervals.
  • FIGS. 12A to 12C show the procedure of calculating the graph cut based on the motion information in the case where there is a reflection RO on the water surface W. With respect to the image of the calculated water current flow shown in FIG. 12A, a straight line is drawn at the center of the flow distribution to create an object seed 98 as shown in FIG. 12B. Here, the straight line of the object seed 98 crosses the reflection RO. Similarly to the case of FIGS. 11A to 11C, a straight line is drawn in a region with no flow, separated by a certain distance from the water current flow, to create a background seed 99. By executing the graph cut based on these seeds, it is possible to extract only the water surface W as shown in FIG. 12C. Even if there is an area where the flow cannot be calculated due to the reflection RO on the water surface W, it is possible to extract only the water surface W as long as the portion of the reflection RO is assigned to the object seed 98.
  • Next, the process of extracting and narrowing-down an edge which is performed by the edge extraction processing unit 64 will be described.
  • FIGS. 13A to 13D show the processes from the edge extraction to the output of the water level measurement result. After the water surface W has been detected by the processing described in FIGS. 11A to 11C or FIGS. 12A to 12C, i.e., after the water surface area WE has been specified as shown in FIG. 13A, the edge extraction processing unit 64 extracts an edge from the vicinity of the upper end of the water surface W. The edge extraction processing unit 64 extracts the edge EG as shown in FIG. 13B by using a Canny, Sobel, or Laplacian operator, or the like.
  • Thereafter, as shown in FIG. 13C, the edge extraction processing unit 64 selects an edge EG1 based on features such as the distance to the water surface, the length of the edge, and whether the edge is stably extracted over several frames, and outputs the measurement result with the selected edge position as the water level RWS, as shown in FIG. 13D. Here, the portion of the edge EG1 that falls within the processing area set by the area setting processing unit 61 is displayed as the water level RWS.
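The selection step can be sketched as a scoring function over the three features the text lists: distance to the water surface, edge length, and stability over several frames. The candidate representation, the weights, and the `min_frames` cutoff below are assumptions chosen for illustration, not values from the patent:

```python
def select_edge(candidates, water_row, min_frames=3):
    """Pick the best candidate edge using the features mentioned in the text:
    closeness to the water surface row, edge length, and the number of
    consecutive frames in which the edge was extracted (stability).

    Each candidate is a dict: {'row': int, 'length': int, 'frames': int}.
    Returns the highest-scoring candidate, or None if none qualifies.
    """
    best, best_score = None, float('-inf')
    for c in candidates:
        if c['frames'] < min_frames:        # unstable edges are rejected outright
            continue
        # Closer to the water surface, longer, and more stable all score higher.
        score = -abs(c['row'] - water_row) + 0.1 * c['length'] + c['frames']
        if score > best_score:
            best, best_score = c, score
    return best
```

The selected candidate's row would then be reported as the water level RWS within the configured processing area.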
  • FIG. 14 shows a display interface of the water level measurement result which is displayed on the monitor 21 of FIG. 4. The processing area TA and the current water level WLc are displayed on the camera image A31. If water level prediction information can be acquired from another device, a future water level WLf is also displayed, which is more useful for danger determination.
  • In a water level map A32, the measured water level is displayed based on the color map CM, and a water level map Mc is displayed so that the water level condition of the entire river can be monitored at a glance. For example, red indicates a high water level and blue indicates a low water level. In the example of the drawing, the water level is high at the downstream side and low at the upstream side.
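The blue-to-red rendering of the water level map can be illustrated with a simple linear interpolation between the two endpoint colors. The actual color map CM is not specified in the text; the function below is an assumed sketch of the described behavior (low level → blue, high level → red):

```python
def level_to_rgb(level, lo, hi):
    """Map a water level to a blue-to-red color, as in the water level map:
    levels at or below `lo` render pure blue, at or above `hi` pure red,
    with linear interpolation in between. Returns an (r, g, b) tuple, 0..255.
    """
    t = (level - lo) / (hi - lo) if hi > lo else 0.0
    t = min(max(t, 0.0), 1.0)          # clamp out-of-range levels
    return (int(255 * t), 0, int(255 * (1 - t)))
```

Each river segment's measured level would be passed through such a mapping so that the entire river's condition can be read at a glance.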
  • A camera icon CI is displayed at each camera installation position, and the camera image A31 is switched by a manipulation such as clicking. When it is determined from the measurement result that the water level is dangerous, the camera icon CI is displayed in a different color. A dangerous area DA that requires evacuation is also displayed on the map. Clicking the dangerous area displays information on the municipalities or the like that require an evacuation instruction, and the evacuation instruction can be issued with a single click in conjunction with another alarm system.
  • As described above, in accordance with the present embodiment, the water level can be measured accurately, and specific water level variations of a river can be observed by applying the present invention to an existing river monitoring camera. Further, the actual water level and the actual state of the river can be observed together by superimposing the water level on the monitoring camera image. Therefore, a system useful for deciding on an evacuation instruction can be constructed. In addition, since an interface that can be initialized by the user is provided, the angle of view can be easily changed after deployment while keeping the deployment cost low.
  • The present invention has been described based on the embodiments. It is to be understood by those skilled in the art that the embodiments are merely examples, that various modifications can be made to combinations of the respective components, and that such modifications are also within the scope of the present invention.
  • DESCRIPTION OF REFERENCE NUMERALS
    • 10: water level measurement system
    • 20: monitoring center
    • 21: monitor
    • 30: river monitoring camera
    • 31: upstream camera
    • 32: downstream camera
    • 40: image processing apparatus
    • 42: communication interface unit
    • 50: setting unit
    • 60: image processing unit
    • 61: area setting unit
    • 62: flow processing unit
    • 63: graph-cut processing unit
    • 64: edge extraction processing unit
    • 65: water level calculation processing unit
    • 70: danger determination unit
    • 90: network

Claims (7)

What is claimed is:
1. A water level measuring system comprising:
a flow processing unit configured to acquire an image with a water surface and a structure and calculate motion information (motion vector);
a graph-cut processing unit configured to specify a water surface region of the water surface by using a graph cut theory based on the motion information; and
a water level calculating unit configured to calculate a water level based on a boundary between the water surface region and another region.
2. The water level measuring system of claim 1, wherein the flow processing unit extracts a water current flow from the motion information, and
the graph-cut processing unit extracts the water surface region by performing labeling processing on the extracted water current flow and the other flows based on the graph cut theory.
3. The water level measuring system of claim 2, further comprising:
an edge extraction processing unit configured to extract an edge near a boundary between the water surface region specified by the graph-cut processing unit and another region and set an edge satisfying a predetermined feature from the extracted edge to a boundary for calculating the water level.
4. The water level measuring system of claim 1, further comprising:
a setting unit configured to set an angle of view of a camera that images the image with the water surface and the structure by manipulating a cross bar displayed on a screen.
5. A water level measurement method for measuring a water level of a river by using an image processing technique, the method comprising:
a flow processing step of acquiring an image with a water surface and a structure and calculating motion information (motion vector);
a graph cut processing step of specifying a region of the water surface by using a graph cut theory based on the motion information; and
a water level calculating step of calculating a water level based on a boundary between the water surface region and another region.
6. The water level measurement method of claim 5, wherein in the flow processing step, a water current flow is extracted from the motion information, and
in the graph cut processing step, the water surface region is extracted by performing labeling processing on the extracted water current flow and the other flows based on the graph cut theory.
7. The water level measurement method of claim 6, further comprising:
an edge extracting step of extracting an edge near the boundary between the water surface region specified in the graph cut processing step and another region, and setting an edge satisfying a specific feature from the extracted edge as a boundary for calculating the water level.
US16/080,419 2016-03-04 2016-03-04 Water level measurement system and water level measurement method Active US10473463B2 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/056734 WO2017149744A1 (en) 2016-03-04 2016-03-04 Water level measurement system and water level measurement method

Publications (2)

Publication Number Publication Date
US20190078882A1 true US20190078882A1 (en) 2019-03-14
US10473463B2 US10473463B2 (en) 2019-11-12

Family

ID=59743617

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/080,419 Active US10473463B2 (en) 2016-03-04 2016-03-04 Water level measurement system and water level measurement method

Country Status (3)

Country Link
US (1) US10473463B2 (en)
JP (1) JP6442641B2 (en)
WO (1) WO2017149744A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109508630A (en) * 2018-09-27 2019-03-22 杭州朗澈科技有限公司 A method of water gauge water level is identified based on artificial intelligence
US10366500B2 (en) * 2017-06-30 2019-07-30 The United States Of America, As Represented By The Secretary Of The Navy Autonomous characterization of water flow from surface water velocity
CN111292371A (en) * 2020-02-21 2020-06-16 廖赟 Intelligent water level detection equipment, intelligent water level monitoring management device and detection method
CN112784471A (en) * 2021-01-26 2021-05-11 郑州轻工业大学 Water environment visual simulation method, terminal equipment and computer readable storage medium
CN113819971A (en) * 2020-07-07 2021-12-21 湖北亿立能科技股份有限公司 Artificial intelligence water level monitoring system based on water, scale and floater semantic segmentation
US20210404856A1 (en) * 2018-12-03 2021-12-30 Bio-Rad Laboratories, Inc. Liquid Level Determination
CN117268498A (en) * 2023-11-20 2023-12-22 中国航空工业集团公司金城南京机电液压工程研究中心 Oil mass measurement method and system

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6911697B2 (en) * 2017-10-20 2021-07-28 富士通株式会社 Water level judgment program, water level judgment method, and water level judgment device
KR101975476B1 (en) * 2018-11-13 2019-05-08 주식회사 하이드로셈 Apparatus and Method for Measuring Real Time River Stage using Location Data Filtering
CN110728691B (en) * 2019-10-08 2021-03-23 中国石油大学(华东) Multi-temporal water sideline-based coastline automatic judgment method
CN113762618B (en) * 2021-09-07 2022-03-01 中国水利水电科学研究院 Lake water level forecasting method based on multi-factor similarity analysis

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6098029A (en) * 1994-06-14 2000-08-01 Hitachi, Ltd. Liquid-level position measuring method and system
US20090107234A1 (en) * 2005-09-16 2009-04-30 Won Kim System and method for measuring liquid level by image
US20100158386A1 (en) * 2007-08-02 2010-06-24 Emza Visual Sense Ltd. Universal counting and measurement system
US20100220914A1 (en) * 2009-03-02 2010-09-02 Canon Kabushiki Kaisha Image processing apparatus and method for controlling the same

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11122602A (en) * 1997-10-17 1999-04-30 Kensetsu Denki Gijutsu Kyokai Method and system for monitoring occurrence of avalanche of rocks and earth
JP4910139B2 (en) * 2006-02-24 2012-04-04 国立大学法人長岡技術科学大学 Running water area detection system, running water area detection method, and program
JP5321615B2 (en) * 2011-03-07 2013-10-23 三菱電機株式会社 Water level detection device, water level detection system, and water level detection method


Also Published As

Publication number Publication date
JPWO2017149744A1 (en) 2018-09-20
WO2017149744A1 (en) 2017-09-08
JP6442641B2 (en) 2018-12-19
US10473463B2 (en) 2019-11-12

Similar Documents

Publication Publication Date Title
US10473463B2 (en) Water level measurement system and water level measurement method
US10839211B2 (en) Systems, methods and computer program products for multi-resolution multi-spectral deep learning based change detection for satellite images
Kopsiaftis et al. Vehicle detection and traffic density monitoring from very high resolution satellite video data
EP3270134B1 (en) Gas leak location estimating device, gas leak location estimating system, gas leak location estimating method and gas leak location estimating program
RU2484531C2 (en) Apparatus for processing video information of security alarm system
EP2207138B1 (en) House movement determining method, house movement determining program, house movement determining image generating method, and house movement determining image
KR20100069655A (en) Runway surveillance system and method
US20110280478A1 (en) Object monitoring system and method
EP3668077B1 (en) Image processing system, server device, image processing method, and image processing program
EP3748283A1 (en) Repair length determination method and repair length determination device
KR20160115130A (en) Marine risk management system and marine risk management method using marine object distance measuring system with monocular camera
JP2018041406A (en) Water surface boundary detection device, water surface boundary detection method and computer program
CN105787870A (en) Graphic image splicing fusion system
US9478032B2 (en) Image monitoring apparatus for estimating size of singleton, and method therefor
JP7427615B2 (en) Information processing device, information processing method and program
KR20230036557A (en) A system for monitoring solar panel failures on a large scale using drones
KR101793264B1 (en) Analysis method for occurrence and growth progression of crack
CN112017243B (en) Medium visibility recognition method
US20190325587A1 (en) Gas detection-use image processing device, gas detection-use image processing method, and gas detection-use image processing program
CN116046790B (en) Defect detection method, device, system, electronic equipment and storage medium
US10062155B2 (en) Apparatus and method for detecting defect of image having periodic pattern
JP2011252746A (en) Device, method and program for detecting cable position
TWI676815B (en) Weather radar device and rain forecasting method
KR101111434B1 (en) Surveying System Using a measuring rule
JP6664078B2 (en) Three-dimensional intrusion detection system and three-dimensional intrusion detection method

Legal Events

Date Code Title Description
AS Assignment

Owner name: HITACHI KOKUSAI ELECTRIC INC., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AGATA, YOSHIKI;REEL/FRAME:046724/0895

Effective date: 20180802

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4