CN111435081B - Sea surface measuring system, sea surface measuring method and storage medium - Google Patents


Info

Publication number
CN111435081B
Authority
CN
China
Prior art keywords
sea surface
wave
image
camera
cameras
Prior art date
Legal status
Active
Application number
CN202010027415.3A
Other languages
Chinese (zh)
Other versions
CN111435081A (en)
Inventor
卢存伟
Current Assignee
School Juridical Person of Fukuoka Kogyo Daigaku
Original Assignee
School Juridical Person of Fukuoka Kogyo Daigaku
Priority date
Filing date
Publication date
Application filed by School Juridical Person of Fukuoka Kogyo Daigaku filed Critical School Juridical Person of Fukuoka Kogyo Daigaku
Priority to CN202210062109.2A (published as CN114526710A)
Priority to CN202210062123.2A (published as CN114485579A)
Publication of CN111435081A
Application granted
Publication of CN111435081B

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C13/00 Surveying specially adapted to open water, e.g. sea, lake, river or canal
    • G01C13/002 Measuring the movement of open water
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Abstract

The invention provides a sea surface measuring system, a sea surface measuring method, and a storage medium that apply image measurement technology to measure sea surface abnormalities such as tsunamis. The system includes: a photographing unit that simultaneously captures images including the same sea surface area with a plurality of cameras disposed at different locations; a wave extraction unit that extracts waves from the respective images simultaneously captured by the plurality of cameras; a wave correspondence unit that, for each wave extracted by the wave extraction unit, finds and associates the same wave across the images simultaneously captured by the plurality of cameras; a three-dimensional information calculation unit that calculates the height of the sea surface and the height of the waves by three-dimensionally analyzing each wave in the images associated by the wave correspondence unit; and a sea surface abnormality presence/absence determination unit that determines whether or not a sea surface abnormality has occurred based on the height of the sea surface and the height of the waves calculated by the three-dimensional information calculation unit.

Description

Sea surface measuring system, sea surface measuring method and storage medium
Technical Field
The present invention relates to a sea surface measurement system, a sea surface measurement method, and a sea surface measurement program for measuring sea surface abnormalities such as tsunamis, billows, full tides, and rapid tides based on images captured by a plurality of cameras.
Background
Existing tsunami measurement (observation) methods can be roughly classified into two types: methods for observing a tsunami that has already reached land, and methods for observing a tsunami before it reaches land.
(1) Method for observing tsunami arriving on land
In observing the height of a tsunami that has reached land, tsunami observation devices such as tsunami observers and giant tsunami observers are used in addition to the tidal stations installed at various locations by the meteorological agency. The meteorological agency constantly observes the tide level using tidal stations and tsunami observers, and the observed data are published as tide observation data (preliminary values) and tide observation data. However, since these observation systems are installed on the coast, they can observe a tsunami that has reached land, but they cannot observe a tsunami far from land (several tens of kilometers to over 50 km away), nor can they estimate its arrival time.
(2) Method for observing tsunami before land arrival
In the observation of a tsunami before it reaches land, an ocean bottom earthquake and tsunami observation network, GPS wavemeters, and a recently proposed tsunami radar are used.
The submarine earthquake and tsunami observation network is a large-scale real-time earthquake and tsunami observation network, operated as the Japan Trench seafloor earthquake and tsunami observation network by the National Research Institute for Earth Science and Disaster Prevention, a national research and development agency. In this network, observation devices in which a seismometer and a tsunami meter are integrated are connected to submarine optical cables and installed on the seafloor off eastern Japan, continuously acquiring observation data in real time 24 hours a day. As of January 2018, 150 observation devices had been installed off the coast of Japan, with a total cable length of about 5,700 km. The network is expected to directly detect trench-type earthquakes and the tsunamis that follow them immediately after they occur, and to contribute to disaster prevention measures such as disaster mitigation and evacuation through rapid and highly accurate information transmission.
However, installing the subsea system and the land base stations requires an enormous amount of work and approval from local governments and fishery-related parties. Maintenance and management of the installed facilities also require enormous costs. It is therefore difficult to extend such coverage to sea areas nationwide.
The GPS wavemeter system is a marine observation device that directly observes sea surface fluctuations such as waves and tides by measuring, with GPS satellites, the vertical movement of a buoy (GPS wavemeter) floating offshore. As of January 2017, 18 GPS wavemeters had been installed in the offshore areas of Japan.
The GPS wavemeter is installed to obtain the offshore wave information required for harbor development, but it can also observe the vertical fluctuation of the sea surface caused by a tsunami when an earthquake occurs, and is therefore highly expected to be used effectively for tsunami countermeasures. Since GPS wavemeters are generally installed about 20 km offshore, they can observe a tsunami several tens of kilometers away and estimate its arrival time.
However, GPS wavemeters are expensive devices: when first introduced in 2005, a single unit cost over 3 billion yen, and prices remain in the billions of yen today. In addition, a GPS wavemeter can be installed only at specific locations, which makes wide-area observation difficult.
Tsunami radar is a system for monitoring the occurrence of tsunami by measuring the sea surface at a distance using radar measurement technology. In these methods, the wave height and arrival time of a tsunami are estimated by extracting a tsunami component from the flow velocity of the sea surface observed by a radar.
However, in these methods, it is difficult to extract a tsunami component from the flow velocity of the sea surface, and particularly, it is difficult to measure a long-period tsunami in advance.
Documents of the prior art
Patent document
Patent document 1: Japanese Laid-Open Patent Publication No. 2013-40898
Patent document 2: Japanese Laid-Open Patent Publication No. 2018-4529
Patent document 3: Japanese Laid-Open Patent Publication No. 2014-160044
Patent document 4: Japanese Laid-Open Patent Publication No. 2004-191268
Patent document 5: Japanese Laid-Open Patent Publication No. 2017-150917
Patent document 6: Japanese Laid-Open Patent Publication No. 2016-85206
Disclosure of Invention
Problems to be solved by the invention
Image measurement techniques have been studied for more than 30 years, and their practical use has recently advanced with progress in hardware such as digital camera and computer manufacturing technology and in software such as algorithms and artificial intelligence. Image measurement techniques are now used practically in quality control fields such as control of printing quality, product quality control, and identification of jewelry and works of art. They are also widely used in the fields of face recognition and object recognition.
An object of the present invention is to provide a sea surface measuring system, a sea surface measuring method, and a sea surface measuring program that apply image measurement technology to measure sea surface abnormalities such as tsunamis.
Means for solving the problems
The sea surface measuring system of the present invention comprises: a photographing unit which simultaneously photographs images including the same sea surface area by a plurality of cameras disposed at different locations; a wave extraction unit that extracts waves from respective images simultaneously captured by the plurality of cameras; a wave correspondence unit that finds and corresponds the same wave from each image simultaneously captured by the plurality of cameras for each wave extracted by the wave extraction unit; a three-dimensional information calculation unit that calculates the height of the sea surface and the height of the waves by three-dimensionally analyzing each wave in the image corresponding to the wave correspondence unit; and a sea surface abnormality presence/absence determination unit that determines whether or not a sea surface abnormality has occurred, based on the height of the sea surface and the height of the waves calculated by the three-dimensional information calculation unit.
In addition, the sea surface measuring method of the present invention comprises the steps of: simultaneously capturing images including the same sea surface area by a plurality of cameras disposed at different locations; extracting waves from respective images simultaneously captured by a plurality of cameras; aiming at each extracted wave, finding out and corresponding the same wave from each image shot by a plurality of cameras simultaneously; calculating the height of the sea surface and the height of the waves by three-dimensional analysis of the waves in the corresponding images; and determining whether a sea surface abnormality occurs based on the calculated sea surface height and the wave height.
According to these inventions, images including the same sea surface area are simultaneously captured by a plurality of cameras provided at different locations, waves are extracted from the images simultaneously captured by the plurality of cameras, the same waves are found and correlated from the images simultaneously captured by the plurality of cameras for each extracted wave, the height of the sea surface and the height of the waves are calculated by three-dimensionally analyzing each wave in the correlated images, and it is determined whether or not a sea surface abnormality occurs based on the calculated height of the sea surface and the calculated height of the waves.
In addition, the sea surface measurement program of the present invention causes the computer to function as: a photographing unit which simultaneously photographs images including the same sea surface area by a plurality of cameras disposed at different locations; a wave extraction unit that extracts waves from respective images simultaneously captured by the plurality of cameras; a wave correspondence unit that finds and corresponds the same wave from each image simultaneously captured by the plurality of cameras for each wave extracted by the wave extraction unit; a three-dimensional information calculation unit that calculates the height of the sea surface and the height of the waves by three-dimensionally analyzing each wave in the image corresponding to the wave correspondence unit; and a sea surface abnormality presence/absence determination unit that determines whether or not a sea surface abnormality has occurred, based on the height of the sea surface and the height of the waves calculated by the three-dimensional information calculation unit. The computer executing the sea surface measuring program of the present invention functions in the same manner as the sea surface measuring apparatus of the present invention described above.
In addition, the storage medium of the present invention stores a sea surface measurement program that causes a computer to function as: a photographing unit which simultaneously photographs images including the same sea surface area by a plurality of cameras disposed at different locations; a wave extraction unit that extracts waves from respective images simultaneously captured by the plurality of cameras; a wave correspondence unit that finds and corresponds the same wave from each image simultaneously captured by the plurality of cameras for each wave extracted by the wave extraction unit; a three-dimensional information calculation unit that calculates the height of the sea surface and the height of the waves by three-dimensionally analyzing each wave in the image corresponding to the wave correspondence unit; and a sea surface abnormality presence/absence determination unit that determines whether or not a sea surface abnormality has occurred, based on the height of the sea surface and the height of the waves calculated by the three-dimensional information calculation unit.
Effects of the Invention
According to the present invention, it is possible to determine in real time whether or not a sea surface abnormality such as a tsunami, a billow, a flood tide, or a rush tide occurs based on images captured by a plurality of cameras, and when a tsunami occurs as a sea surface abnormality, it is possible to calculate the scale and arrival time of the tsunami in real time, or automatically send an alarm about the sea surface abnormality such as the tsunami to a necessary place or a necessary person, and it is possible to contribute to evacuation or disaster reduction for the sea surface abnormality.
Drawings
Fig. 1 is a hardware configuration diagram of a sea surface survey system in an embodiment of the present invention.
Fig. 2 is an explanatory diagram showing a setting state of a camera group of the sea surface measuring system of fig. 1.
Fig. 3 is an explanatory diagram showing a coordinate relationship of three-dimensional image measurement based on stereoscopic observation.
Fig. 4 is a block diagram illustrating the structure of the sea surface measuring system of fig. 1.
Fig. 5 is a flowchart illustrating the flow of sea surface measurement by the sea surface measuring system of fig. 1.
Fig. 6 is an image diagram of the atmospheric model.
Fig. 7 is a flowchart of processing performed by the rain influence mitigating unit.
Fig. 8 is an image diagram of wave extraction.
Fig. 9 is a flowchart showing the flow of the correspondence processing of waves using two images.
Fig. 10 is an image diagram of the feature vectors of a wave.
Fig. 11 is an image diagram of calculating the height of the sea surface using the photographic images of the left and right cameras.
Fig. 12 is a flowchart showing a procedure of calculating the height of the sea surface using the photographic images of the left and right cameras.
Fig. 13 is a diagram showing an example of a columnar target.
Fig. 14 is a flowchart showing a flow of calibration by the telecamera system calibration unit.
Fig. 15 is a flowchart showing the flow of determination as to whether tsunami has occurred.
Fig. 16 is an image diagram of three-dimensional image measurement of the sea surface height at each time.
Fig. 17 is an image diagram of mechanical control of the line of sight of the camera.
Fig. 18 is an image diagram of fine control by image measurement control of two cameras.
Fig. 19 is a flowchart showing a flow of fine control by image measurement control in the case of having two cameras.
Description of the reference numerals
1: a server; 2A, 2B: a client computer; 3A, 3B: a camera group; 4A, 4B: a camera; 5A, 5B: a camera fixing and adjusting unit; 10: a photographing unit; 11: a wave extraction unit; 12: a wave correspondence unit; 13: a three-dimensional information calculation unit; 14: a sea surface abnormality presence/absence determination unit; 15: a telecamera system calibration unit; 16: a tsunami travel speed calculation unit; 17: a tsunami scale estimation unit; 18: a tsunami arrival time estimation unit; 19: a tsunami information transmitting unit; 20: a storage unit; 21: a fog influence reducing unit; 22: a rain influence reducing unit; 23: a two-stage control unit.
Detailed Description
Next, a sea surface measuring system according to an embodiment of the present invention will be described with reference to the drawings. Fig. 1 is a hardware configuration diagram of a sea surface surveying system according to an embodiment of the present invention, fig. 2 is an explanatory diagram showing an installation state of a camera group of the sea surface surveying system of fig. 1, fig. 3 is an explanatory diagram showing a coordinate relationship of three-dimensional image surveying based on stereoscopic viewing, and fig. 4 is a block diagram showing a configuration of the sea surface surveying system of fig. 1.
As shown in fig. 1, the sea surface surveying system according to the present embodiment includes a server 1 as a computer, client computers 2A and 2B connected to the server 1, respectively, and a first camera group 3A and a second camera group 3B connected to the client computers 2A and 2B, respectively. The first camera group 3A and the second camera group 3B each include cameras 4A and 4B and camera fixing and adjusting units 5A and 5B such as a rotating table, the cameras 4A and 4B include a camera body and a lens, and the camera fixing and adjusting units 5A and 5B fix the cameras 4A and 4B so that the postures of the cameras can be adjusted.
As shown in fig. 2, the first camera group 3A and the second camera group 3B are installed at different points C1 and C2 on land near the ocean coast, separated by a baseline length B and at an installation height H. Strictly speaking, as shown in fig. 3, the distance between the centers of the camera lenses of the first camera group 3A and the second camera group 3B installed at points C1 and C2 is defined as the baseline length B. In the following description, the direction in which the baseline length B extends is defined as the X direction, the height direction as the Y direction, and the direction orthogonal to the X and Y directions as the Z direction.
As shown in fig. 4, the server 1 includes: a photographing unit 10 that simultaneously photographs images including the same sea surface area by a plurality of cameras 4A, 4B provided at different locations; a wave extraction unit 11 that extracts waves from the respective images simultaneously captured by the plurality of cameras 4A, 4B; a wave correspondence unit 12 that finds and corresponds the same wave from each image simultaneously captured by the plurality of cameras 4A and 4B for each wave extracted by the wave extraction unit 11; a three-dimensional information calculation unit 13 that calculates the height of the sea surface and the height of the waves by three-dimensionally analyzing each wave in the image corresponding to the wave correspondence unit 12; and a sea surface abnormality presence/absence determination unit 14 that determines whether or not a sea surface abnormality such as a tsunami has occurred, based on the height of the sea surface and the height of waves calculated by the three-dimensional information calculation unit 13.
In addition, the sea surface measuring system in the present embodiment further includes: a telecamera system calibration unit 15 for improving the calculation accuracy of the three-dimensional information of the sea surface; a tsunami travel speed calculation unit 16 that calculates a travel speed of a tsunami when the occurrence of the tsunami is measured as a sea surface abnormality; a tsunami scale estimating unit 17 that estimates the scale of the tsunami; a tsunami arrival time estimation unit 18 that estimates an arrival time of the tsunami; tsunami information transmitting section 19; and a storage unit 20. Further, although details will be described later, the sea surface measuring system according to the present embodiment further includes a fog effect reducing unit 21, a rain effect reducing unit 22, and a two-stage control unit 23. The server 1 functions as each of the units 10 to 23 by executing a sea surface measurement program.
Fig. 5 is a flowchart illustrating the flow of tsunami measurement by the sea surface measuring system of fig. 1.
In order to measure images of a tsunami as a sea surface abnormality, two or more cameras 4A and 4B are first installed at different points (S101).
Next, images including the same sea surface area are simultaneously captured by the plurality of cameras 4A, 4B (S102). The infrared camera, the fog influence reducing means 21, and the rain influence reducing means 22 are used for photographing in a severe environment such as night, rain, and fog, and the details are described later (S201).
Then, the waves required for tsunami measurement are extracted from the images captured by the cameras 4A and 4B by the wave extraction unit 11 (S103).
After waves are extracted from each image, the wave correspondence unit 12 associates the waves extracted from the images of the cameras 4A and 4B, and groups of identical waves are obtained (S104).
For each group of corresponded waves, the three-dimensional information of the sea surface is calculated using the three-dimensional information calculation unit 13. First, the three-dimensional world coordinates (X, Y, Z) of each detected wave are calculated, and from these coordinates the three-dimensional information of the sea surface, that is, the height of the sea surface and the height of the waves, is calculated (S105).
Further, in order to ensure the accuracy of the three-dimensional information calculation, calibration of the camera system is performed using the telecamera system calibration unit 15 (S202).
Based on the calculated sea surface height and wave height, the sea surface abnormality presence/absence determination unit 14 determines whether a tsunami has occurred (S106, S107).
When it is determined that a tsunami has occurred, the tsunami travel speed calculation unit 16, the tsunami scale estimation unit 17, and the tsunami arrival time estimation unit 18 are used to estimate the scale of the tsunami (S108), and estimate the arrival time (S109).
Finally, the tsunami information transmitting unit 19 transmits the measured tsunami information to a necessary place and a necessary person (S110).
Next, each unit will be described in detail.
[ photographing Unit 10]
The photographing unit 10 simultaneously photographs images including the same sea surface area by a plurality of cameras 4A, 4B provided at different locations. As described above, the first camera group 3A including the camera 4A and the second camera group 3B including the camera 4B are installed at different points C1 and C2 on land near the coast of the ocean as shown in fig. 2, and acquire images of the long-distance sea surface within a range of about 5km to 20km in radius. In order to measure a three-dimensional image of the remote sea surface having a radius of 5km to 20km, a three-dimensional image measurement technique based on stereoscopic observation is used in the present embodiment. The distance between the cameras 4A and 4B, that is, the baseline length B between the cameras 4A and 4B, is 20 meters or more, and the altitude of the installation place of the cameras 4A and 4B, that is, the installation height H of the cameras 4A and 4B, is 20 meters or more.
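As an illustrative numerical sketch of this stereoscopic measurement geometry (all numeric values below are assumptions chosen for illustration, not values from the embodiment), the distance to a wave can be estimated from the disparity of the same wave between the two images:

```python
# Depth from stereo disparity: Z = f * B / d (see the geometry of fig. 3).
# f: effective focal length in pixels, B: baseline length, d: disparity in pixels.
focal_px = 20_000.0   # assumed focal length of the telephoto lens, in pixels
baseline_m = 20.0     # baseline length B (the embodiment requires 20 m or more)
disparity_px = 40.0   # assumed disparity of the same wave between the two images

distance_m = focal_px * baseline_m / disparity_px
print(f"distance to the wave: {distance_m / 1000:.1f} km")  # -> 10.0 km
```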
In order to minimize damage to the first camera group 3A and the second camera group 3B due to tsunami or the like, suppress disturbance at the time of photographing as much as possible, and achieve photographing at a place as far as possible, it is desirable that the first camera group 3A and the second camera group 3B are provided at the coast and the first camera group 3A and the second camera group 3B are as close to the ocean as possible. In the present embodiment, two sets of the first camera group 3A and the second camera group 3B are used, and three or more sets can be used.
It is desirable that the cameras 4A, 4B use high-sensitivity cameras having a sensitivity of 8 bits or more so as to be able to improve the measurement accuracy and to be able to capture images required for measurement also in a severe environment. In addition, it is desirable to use a large-aperture lens having a telephoto function or a zoom lens having a telephoto function to photograph a distant sea surface.
The client computers 2A and 2B have a function of controlling photographing by adjusting parameters of the cameras 4A and 4B, and a function of acquiring images captured by the cameras 4A and 4B. The server 1 has a communication function of communicating with the client computers 2A, 2B, and has the following functions: the client computers 2A and 2B are controlled to send commands necessary for taking photographs to the client computers 2A and 2B, and to acquire photographed images from the client computers 2A and 2B.
In the sea surface surveying system according to the present embodiment, the lines of sight of the cameras 4A and 4B need to be adjusted in order to photograph different sea surfaces. Therefore, the camera fixing and adjusting sections 5A, 5B can rotate in the pan direction (horizontal direction) and the tilt direction (vertical direction). By adjusting the angles of the camera fixing and adjusting units 5A and 5B in the pan direction and the tilt direction, the angles of the cameras 4A and 4B provided in the camera fixing and adjusting units 5A and 5B in the shooting direction are adjusted up and down and left and right, and the plurality of cameras 4A and 4B can shoot the same sea surface.
The rotation angles in the pan direction and the tilt direction are controlled according to the commands of the client computers 2A, 2B. Specific values of the rotation angles in the pan direction and the tilt direction differ depending on the position of the measurement sea area, and the values are transmitted from the server 1 to the client computers 2A and 2B that control the camera groups 3A and 3B. In addition, the plurality of cameras 4A and 4B simultaneously capture images in response to an electronic external synchronization signal or a software synchronization command from the server 1, thereby realizing synchronous shooting by the cameras 4A and 4B. The client computers 2A and 2B perform parameter control such as exposure time of the camera bodies of the cameras 4A and 4B, zoom control of the lenses, focus control, and the like. First, images captured by the cameras 4A and 4B are acquired by the client computers 2A and 2B connected to the cameras 4A and 4B, and then the images are transmitted to the server 1 in accordance with a command from the server 1, and image processing is performed in the server 1.
Further, when measuring sea surface abnormalities such as tsunamis, the ocean must be photographed 24 hours a day. Therefore, in the sea surface measuring system of the present embodiment, a visible-light camera is mainly used as the cameras 4A and 4B in the daytime, and an infrared camera such as a far-infrared camera is mainly used at night.
[ fog Effect reducing Unit 21]
The server 1 further includes a fog influence reducing unit 21 for reducing the influence of fog and the like. The fog influence reducing unit 21 reduces the adverse effect of fog and the like on the images captured by the cameras 4A and 4B on the basis of an atmospheric model, using the intensity component of the atmospheric glow and an atmospheric transmittance distribution over the sea surface that is linearly approximated by the vertical coordinate of the captured ocean image, that is, the y coordinate of the image.
Fig. 6 shows an image diagram of the atmospheric model. In the atmosphere, there is an atmospheric glow whose intensity is denoted by A. The light entering the camera includes two components: an atmospheric glow component that enters directly, and a reflected light component from the object. The atmospheric model can be represented by equation (1).
[ formula 1]
I=J·t+A·(1-t) (1)
where I is the amount of light entering the camera, that is, an image affected by the environment such as fog, A is the intensity component of the atmospheric glow, J is the reflection component of the object, and t is the atmospheric transmittance from the object to the camera. J is the image obtained when there is no environmental influence such as fog, and is the target component required when the fog removal processing is performed by the fog influence reducing unit 21.
The transmittance t can be represented by formula (2).
[ formula 2]
t = e^(−βd) (2)
Where β is an atmospheric diffusion coefficient, and d is a distance from the object to the camera.
From the formula (1), the formula (3) can be obtained.
[ formula 3]
J = (I − A·(1 − t)) / t (3)
That is, if the intensity component a of the atmospheric glow and the transmittance t of the atmosphere can be estimated, the influence of the fog can be removed by the fog influence reducing means 21, and an image having only the simple reflection component J can be obtained.
In reference 1 (Robby T. Tan, Visibility in Bad Weather from a Single Image, IEEE Conference on Computer Vision and Pattern Recognition, pp. 1-8, 2008), Tan proposed a method based on local-region contrast maximization. The method estimates t for a local region by maximizing the contrast of that region, under the assumption that t is constant and J < A within the local region. Because this method uses only contrast, it sometimes produces oversaturation.
In reference 2 (R. Fattal, Single Image Dehazing, ACM SIGGRAPH '08, pp. 1-9, 2008), a haze removal method based on independent component analysis was proposed. Within a local region, the reflection component J of the object is statistically independent of the atmospheric transmittance, so the reflectance can be estimated by independent component analysis. Because the independence assumption of this method relies on the color information of the image, it adapts poorly to photographs of the ocean, in which color variation is small.
In the present embodiment, the atmospheric glow intensity component A is estimated by a conventional method, so a detailed description is omitted. For the atmospheric transmittance t, the following calculation method is proposed: without performing complicated calculations, the atmospheric transmittance t is approximated by a linear relationship with the vertical coordinate of the captured ocean image, that is, the y coordinate of the image, as shown in expression (4).
[ formula 4]
t=t0+ky (4)
where t0 is the initial value of the atmospheric transmittance, k is a scale factor, and y is the y coordinate of the image of the ocean captured by the camera. This significantly shortens the calculation time and allows tsunami information to be calculated quickly.
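A minimal sketch of the fog-removal calculation of equations (1) to (4) follows (Python with NumPy is assumed; the values of A, t0 and k are placeholders, and the estimation of the atmospheric glow A is left to the conventional method mentioned above):

```python
import numpy as np

def remove_fog(image, airglow_a, t0, k):
    """Recover the reflection component J of equation (3), using the linear
    transmittance model t = t0 + k*y of equation (4).
    image: float array (rows x cols) with intensities in [0, 1]."""
    rows = image.shape[0]
    y = np.arange(rows, dtype=np.float64)
    t = np.clip(t0 + k * y, 1e-3, 1.0)              # transmittance per image row
    t = t.reshape(rows, *([1] * (image.ndim - 1)))  # broadcast over columns (and channels)
    j = (image - airglow_a * (1.0 - t)) / t         # equation (3)
    return np.clip(j, 0.0, 1.0)

# Placeholder values for illustration only; A, t0 and k must be estimated in practice.
foggy = np.random.rand(480, 640)
clear = remove_fog(foggy, airglow_a=0.8, t0=0.3, k=0.001)
```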
[ rain influence reducing means 22]
The server 1 is provided with a rain influence reducing unit 22 for reducing the influence of rain. Fig. 7 is a flowchart of the processing performed by the rain influence mitigating unit 22. As shown in fig. 7, the rain influence reducing unit 22 obtains a maximum value image and a minimum value image of the color intensity of a pixel using a plurality of images captured in succession, and applies a guide filter (japanese: ガイドフィルター) using the minimum value image as a guide to process the maximum value image.
First, five images are continuously captured at fixed time intervals (S301).
Next, for the five input images In (n = 1, 2, 3, 4, 5), the maximum value of the color intensity of each pixel is obtained by equation (5), and the minimum value of the color intensity of each pixel is obtained by equation (6). A minimum value image Imin is generated from the minimum values of each pixel, and a maximum value image Imax is generated from the maximum values of each pixel (S302).
[ formula 5]
Imax(i,j)=max(I1(i,j),…,In(i,j)) (5)
[ formula 6]
Imin(i,j)=min(I1(i,j),...,In(i,j)) (6)
Where, (i, j) is the pixel coordinate, and n is 5.
Next, for a target pixel k(i, j), within a small region ωk centered on k(i, j), the minimum value image Imin is used as a guide image and an intermediate image If is obtained as follows (S303).
[ formula 7]
If(i,j)=akImin(i,j)+bk,(i,j)∈ωk (7)
[ formula 8]
ak = ( (1/|ω|) Σ(i,j)∈ωk Imin(i, j)·Imax(i, j) − μk·Īk ) / σk²  (8)
[ formula 9]
bk = Īk − ak·μk  (9)
where μk and σk² are the mean and variance of the color intensity of the guide image Imin within the small region ωk, |ω| is the number of pixels in ωk, and Īk is the mean color intensity of the input image within ωk.
Finally, based on the input image In and the intermediate image If, the influence of rain is removed by applying the guide filter as shown below, and an output image in which the appearance of the waves is enhanced is obtained (S304).
[ formula 10]
Iout = (In − If)·A + If  (10)
where A is an adjustment coefficient.
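The following is a minimal sketch of the processing of S301 to S304 (Python with NumPy and OpenCV is assumed; the window radius, the regularization constant eps, and the choice of the latest frame as the input image In are assumptions, and the guided-filter coefficients follow the standard formulation):

```python
import numpy as np
import cv2

def reduce_rain(frames, radius=8, eps=1e-3, a_coef=0.5):
    """frames: list of grayscale float images in [0, 1] captured at fixed intervals.
    Builds the minimum/maximum value images (equations (5), (6)), filters the
    maximum value image with a guided filter that uses the minimum value image
    as guide (equation (7)), and returns the output image of equation (10)."""
    stack = np.stack(frames, axis=0)
    i_min = stack.min(axis=0)          # minimum value image, equation (6)
    i_max = stack.max(axis=0)          # maximum value image, equation (5)
    i_n = frames[-1]                   # input image In (latest frame, an assumption)

    ksize = (2 * radius + 1, 2 * radius + 1)
    box = lambda img: cv2.boxFilter(img, -1, ksize)   # mean over the small region
    mu = box(i_min)                    # mean of the guide image in each window
    var = box(i_min * i_min) - mu * mu # variance of the guide image
    mean_p = box(i_max)                # mean of the filtered (maximum value) image
    cov = box(i_min * i_max) - mu * mean_p

    a = cov / (var + eps)              # guided-filter coefficients (standard form)
    b = mean_p - a * mu
    i_f = box(a) * i_min + box(b)      # intermediate image If, equation (7)

    return (i_n - i_f) * a_coef + i_f  # output image, equation (10)

# Example with five consecutive frames.
frames = [np.random.rand(240, 320) for _ in range(5)]
out = reduce_rain(frames)
```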
[ wave extraction unit 11]
The wave extraction unit 11 extracts waves from the respective images simultaneously captured by the plurality of cameras 4A, 4B. It is difficult to apply a general pattern extraction method to extract waves from a photographed image of a sea area, particularly from a long-distance photographed image of 5km to 20km which is a target of the sea surface measuring system in the present embodiment. The reasons for this include uncertainty of the wave shape, differences in the wave characteristics in a plurality of images captured by the plurality of cameras 4A and 4B, and changes in the imaging environment associated with all-weather imaging. In addition, high speed of detection is required.
Fig. 8 shows an image diagram of wave extraction. As shown in fig. 8, the captured image is first divided into several adjacent small rectangular or square regions. For each of the four corners of each small region (upper left, upper right, lower left, and lower right), a local region having the same area as the small region is selected around the corner, and a threshold value (small-region threshold) T for binarizing the image in that local region is obtained.
As shown in equation (11), the binarization threshold T(i, j) for the image coordinates (i, j), that is, the point of interest P(i, j), is calculated by multiplying the small-region thresholds at the four corners (upper left, upper right, lower left, and lower right) of the small region containing P(i, j) by weight coefficients k1 to k4 that depend on the distances from P(i, j) to the respective corners.
[ formula 11]
T(i, j) = k1·T(LUi, LUj) + k2·T(RUi, RUj) + k3·T(LDi, LDj) + k4·T(RDi, RDj)  (11)
[ formulas 12 to 16 ]
Formulas (12) to (16) define the weight coefficients k1 to k4 used in equation (11) as functions of the distances from the point of interest P(i, j) to the four corners of the small region; the equation images are not reproduced here.
That is, binarization is performed using different weight coefficients for each point of interest according to the distance between the point of interest and the four corners. This makes it possible to smoothly connect the binarization results of the pixels in the small region of interest to the binarization results of the pixels in the small regions adjacent to each other.
As described above, the component extracted by the binarization method (dynamic threshold value calculation algorithm) based on the dynamic threshold value in which the threshold value for binarizing the target point dynamically changes is defined as a wave in the image.
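A simplified sketch of the dynamic-threshold wave extraction is shown below (Python with OpenCV is assumed; the per-region threshold is taken as the local mean intensity and the corner weights are realized by bilinear interpolation, both of which are assumptions standing in for the exact formulas (11) to (16); the file name is hypothetical):

```python
import numpy as np
import cv2

def extract_waves(gray, block=64):
    """Divide the image into small regions, compute one threshold per region
    (the local mean intensity, an assumption), interpolate the thresholds back to
    full resolution so that the per-pixel threshold varies smoothly between
    neighbouring regions, and binarize. Connected bright components are returned
    as wave candidates."""
    h, w = gray.shape
    gh, gw = (h + block - 1) // block, (w + block - 1) // block
    small = cv2.resize(gray.astype(np.float32), (gw, gh),
                       interpolation=cv2.INTER_AREA)            # per-region threshold
    threshold_map = cv2.resize(small, (w, h),
                               interpolation=cv2.INTER_LINEAR)  # smooth threshold T(i, j)
    binary = (gray.astype(np.float32) > threshold_map).astype(np.uint8) * 255
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
    return binary, stats, centroids

gray = cv2.imread("sea_surface.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file name
if gray is not None:
    binary, stats, centroids = extract_waves(gray)
```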
[ wave corresponding unit 12]
The wave correspondence unit 12 finds and corresponds the same wave from each image simultaneously captured by the plurality of cameras 4A and 4B for each wave extracted by the wave extraction unit 11. Fig. 9 is a flowchart showing the flow of the correspondence processing of waves using two images. A similar method can be used in the case where three or more images are present.
First, by analyzing the epipolar geometry and the abscissa-ordinate relationship of the two-dimensional image, a correspondence candidate region, which is a region where a correspondence wave is likely to exist, is obtained from a plurality of images captured by a plurality of cameras provided at different locations (S401).
Next, a wave extraction means is applied to the corresponding candidate region of each image, m waves are extracted from the left image, and n waves are extracted from the right image (S402). Further, as shown in fig. 10, a two-dimensional feature vector is generated in which parameters such as barycentric coordinates, the number of pixels, a perimeter, a circularity, and a length in each direction of the extracted image of each wave are used as features of the wave. As an example, the two-dimensional feature vector V can be defined as follows.
[ formula 17]
V={G,S,L,C,T} (17)
where G is the barycentric coordinate of the wave, represented by G(x0, y0). S is a parameter indicating the size of the wave, namely the number of pixels contained in the extracted wave. L is the perimeter of the wave, and C is the circularity of the wave. T is a parameter representing the vertical and horizontal lengths of the wave, represented by T(h1, h2, w1, w2).
[ formula 18]
h1=y0-y1 (18)
[ formula 19]
h2=y2-y0 (19)
[ formula 20]
w1=x0-x1 (20)
[ formula 21]
w2=x2-x0 (21)
Here, P1(x1, y1) is the pixel coordinate of the upper left corner of the rectangle enclosing the wave, and P2(x2, y2) is the pixel coordinate of the lower right corner.
Next, a feature vector VL of the wave in the left image and a feature vector VR of the wave in the right image are created (S403).
For the m waves extracted from the left image, the feature parameter VLi of the i-th wave (i = 1, 2, ..., m) is obtained according to the following equation, and the vector VL of the waves in the left image is generated.
[ formula 22]
VL^T = {VL1, VL2, ..., VLm-1, VLm}  (22)
For the n waves extracted from the right image, the feature parameter VRj of the j-th wave (j = 1, 2, ..., n) is obtained according to the following equation, and the vector VR of the waves in the right image is generated.
[ formula 23]
VR^T = {VR1, VR2, ..., VRn-1, VRn}  (23)
Then, for all waves in the same area of each image, candidate waves are found based on the epipolar constraint, the similarity of the two-dimensional feature vectors is calculated, and a list of candidate waves for the corresponding wave of each wave is made in order of similarity from high to low. Specifically, for the i-th wave (i = 1, 2, ..., m) extracted from the left image, k waves near the epipolar line are extracted from the right image, and these k waves are set as the k candidate waves in the right image corresponding to the i-th wave of the left image (S404, S405).
For k candidate waves, the similarity of the two-dimensional feature vector is calculated, and a candidate wave list of the corresponding wave corresponding to the ith wave extracted from the left image is created in the order of the similarity from high to low (S406). The rank of the candidate wave is determined such that the wave with the highest similarity is set as a first candidate wave and the wave with the second highest similarity is set as a second candidate wave.
Based on the image coordinates of the centers of gravity of the wave of interest and of the higher-ranked waves in the candidate wave list (that is, the waves with high similarity), the three-dimensional world coordinates (Xinterest, Yinterest, Zinterest) of the wave of interest and the three-dimensional world coordinates (Xcandidate, Ycandidate, Zcandidate) of the candidate wave are obtained by the three-dimensional image measurement method (S407), and it is determined whether the difference between (Xinterest, Yinterest, Zinterest) and (Xcandidate, Ycandidate, Zcandidate) is smaller than a predetermined threshold value (S408).
If the difference between the three-dimensional world coordinates (Xinterest, Yinterest, Zinterest) of the wave of interest and the three-dimensional world coordinates (Xcandidate, Ycandidate, Zcandidate) of the candidate wave is smaller than the predetermined threshold value, the candidate wave is set as the corresponding wave of the wave of interest (S409); otherwise, the next candidate wave is verified (S410, S411).
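A minimal sketch of the feature-vector construction and the similarity-based candidate ranking follows (Python with NumPy is assumed; the bounding-box approximation of the perimeter and the Euclidean-distance similarity are assumptions made for illustration):

```python
import numpy as np

def feature_vector(pixels):
    """Build the two-dimensional feature vector V = {G, S, L, C, T} of equation (17)
    for one extracted wave, given its pixel coordinates as an (N x 2) array of (x, y)."""
    x, y = pixels[:, 0].astype(float), pixels[:, 1].astype(float)
    x0, y0 = x.mean(), y.mean()                 # barycentre G(x0, y0)
    s = float(len(pixels))                      # size S: number of pixels
    x1, y1, x2, y2 = x.min(), y.min(), x.max(), y.max()
    h1, h2, w1, w2 = y0 - y1, y2 - y0, x0 - x1, x2 - x0   # equations (18)-(21)
    perim = 2.0 * ((x2 - x1) + (y2 - y1))       # perimeter L approximated by the bounding box
    circ = 4.0 * np.pi * s / (perim * perim + 1e-9)       # circularity C
    return np.array([x0, y0, s, perim, circ, h1, h2, w1, w2])

def rank_candidates(v_left, candidate_vectors):
    """Rank the k candidate waves found near the epipolar line by feature-vector
    similarity (inverse Euclidean distance, an assumption) and return the indices
    ordered from most to least similar (S404-S406)."""
    d = np.array([np.linalg.norm(v_left - v) for v in candidate_vectors])
    return list(np.argsort(d))
```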
[ three-dimensional information calculation Unit 13]
The three-dimensional information calculation unit 13 calculates the height of the sea surface and the height of the waves by three-dimensionally analyzing each wave in the images associated by the wave correspondence unit 12.
If the sea surface has waves, the shape and the size of the waves change at any time. Therefore, when the height of the sea surface is obtained based on the three-dimensional measurement result of the waves at the sea surface, it is difficult to determine which part of which waves have the height of the sea surface.
On the other hand, in the tsunami measurement, it is not necessary to obtain the height of a specific point on the sea surface, and it is sufficient to calculate the height of the sea surface in a certain area to determine whether or not tsunami occurs.
Therefore, in the sea surface measuring system according to the present embodiment, the height of the sea surface in a certain small region is calculated from the photographed images. Fig. 11 is an image diagram of calculating the sea surface height using the photographic images of the left and right cameras, and fig. 12 is a flowchart showing the flow of calculating the sea surface height using the photographic images of the left and right cameras.
As shown in fig. 12, first, the same area is continuously captured by the left and right cameras 4A and 4B, and a plurality of continuous time-series images are acquired (S501). For example, the image is captured for 5 seconds at a capturing speed of 1/30 seconds for one image, and 150 images are captured by the left and right cameras, respectively, to obtain 300 images in total.
Next, waves are extracted from each image and are correlated (S502).
Then, three-dimensional world coordinates (X, Y, Z) of feature points such as the center of gravity, center, upper portion, lower portion, left end, and right end of all the waves are obtained (S503). For a camera, X denotes a horizontal direction, Y denotes a vertical direction, and Z denotes a depth direction.
Based on the calculated three-dimensional world coordinates (X, Y, Z) of each wave, the wave is classified into several small areas (S504). Waves are classified into three small regions as shown in fig. 11, for example.
For a specific small region, as shown in equation (24), the average value YA of the Y coordinate values (the vertical component of the three-dimensional world coordinates, that is, the direction perpendicular to the sea surface) of the centers of gravity and other feature points of all the corresponded waves in all the time-series images is obtained, and the average value YA is multiplied by a coefficient to calculate the height Hs of the sea surface of the small region (S505).
[ formula 24]
Hs = ks·YA + Hs0  (24)
where ks is a scale coefficient used in the calculation, and Hs0 is a coefficient for adjusting the three-dimensional world coordinates when calculating the height of the sea surface.
Finally, for the specific small region, as shown in equation (25), the variance V(Y) of the Y coordinate values of all the corresponded waves in all the time-series images is obtained, and the variance V(Y) is multiplied by a coefficient to calculate the height Hv of the waves in the small region (S506).
[ formula 25]
Hv = kv·V(Y) + Hv0  (25)
where kv is a scale coefficient used in the calculation, and Hv0 is a coefficient for adjusting the world coordinates when calculating the height of the waves.
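The calculation of S505 and S506 for one small region can be sketched as follows (Python with NumPy is assumed; the coefficient values and the sample Y coordinates are placeholders):

```python
import numpy as np

def sea_surface_and_wave_height(y_values, k_s=1.0, h_s0=0.0, k_v=1.0, h_v0=0.0):
    """y_values: Y (vertical) world coordinates of the feature points of all
    corresponded waves in one small region, over all time-series images.
    Returns the sea surface height Hs (equation (24)) and wave height Hv (equation (25))."""
    y = np.asarray(y_values, dtype=np.float64)
    h_s = k_s * y.mean() + h_s0   # equation (24): mean of Y times a coefficient
    h_v = k_v * y.var() + h_v0    # equation (25): variance of Y times a coefficient
    return h_s, h_v

# Hypothetical Y coordinates (in metres) for one small region.
h_s, h_v = sea_surface_and_wave_height([1.8, 2.4, 2.1, 2.6, 1.9])
```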
[ telecamera system calibration unit 15]
For calibration of the camera system, internal parameters such as the pixel arrangement of the image sensors of the cameras 4A and 4B and the focal length of the lenses, and external parameters such as the baseline length (the distance between the two cameras 4A and 4B) and the rotation angles (the positions and orientations of the cameras 4A and 4B), are obtained. Various calibration methods for camera systems have been proposed; in particular, the method of Zhang (Z. Zhang, A Flexible New Technique for Camera Calibration, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 22, No. 11, 2000, pp. 1330-1334) is highly evaluated and has been implemented in OpenCV.
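As a minimal sketch of the internal-parameter stage using Zhang's method as implemented in OpenCV (the checkerboard pattern size and the image file names below are assumptions; the external-parameter stage with the cylindrical target described later is not reproduced here):

```python
import numpy as np
import cv2

pattern = (9, 6)  # assumed inner-corner layout of a checkerboard on the plate-shaped target
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points, image_size = [], [], None
for name in ["board_01.png", "board_02.png", "board_03.png"]:  # hypothetical images
    img = cv2.imread(name, cv2.IMREAD_GRAYSCALE)
    if img is None:
        continue
    found, corners = cv2.findChessboardCorners(img, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)
        image_size = img.shape[::-1]  # (width, height)

if obj_points:
    rms, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, image_size, None, None)
```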
However, tsunami measurement requires photographing at long distances, and problems remain: calibration of a long-distance three-dimensional image measurement system requires a large calibration target, and the target must be photographed from different viewpoints.
Therefore, the telecamera system calibration unit 15 in the present embodiment is configured to perform two-stage calibration of inside and outside separation using different targets, in which an internal parameter calibration method using a plate-shaped target and an external parameter calibration method using a cylindrical target are integrated.
In the remote camera system calibration unit 15 of the present embodiment, two or more cylindrical targets or other elongated targets having a fixed width are used as targets for external parameter calibration. Fig. 13 shows an example of a columnar target.
The reason why the cylindrical object is selected as the object to be calibrated is because, as shown in fig. 13 (a), the distance from the measurement point used for image measurement to the center line of the cylindrical object does not change even when viewed from a different viewpoint. For example, when the target object is viewed from the line of sight 1, the measurement point 1 is observed on the image, and the distance from the measurement point 1 to the center line of the cylinder is R. Even when observed from the line of sight 2, the distance from the observation point 2 to the center line of the cylinder is R, as in the case of observation from the line of sight 1. That is, in the cylindrical target, no calibration error occurs due to a change in the observation point.
In consideration of long-distance calibration, the length of the cylindrical or other elongated target is set to 1 meter or more. To make the target easy to recognize, its surface is given two or more colors, for example white and red, or other colors that are easily and clearly distinguishable.
A point on the boundary line of the different colors such as the feature point candidate 1 and the feature point candidate 3 shown in fig. 13 (B), a point at the center of the color such as the feature point candidate 2, or the center of gravity is set as the feature point for calibration.
The sizes of the regions of different colors of the target, such as L1, L2, and L3 shown in fig. 13 (C), are used as known parameters for calibration.
Fig. 14 is a flowchart showing a flow of calibration by the telecamera system calibration unit 15.
First, a photograph of the calibration target is taken by the plurality of cameras 4A and 4B (S601).
Next, a plurality of feature points necessary for calibration are extracted from each image (S602).
Then, the extracted feature points are associated with each other (S603).
Finally, the parameters of the camera are calculated using the corresponding feature points and using the known parameters such as L1, L2, and L3 (S604).
[ determination unit 14 for the presence or absence of sea surface abnormality ]
The sea surface abnormality presence/absence determination unit 14 determines whether tsunami occurs based on the height of the sea surface and the height of waves calculated by the three-dimensional information calculation unit 13. Fig. 15 is a flowchart showing the flow of determination as to whether tsunami has occurred.
As shown in fig. 15, in order to determine whether a tsunami has occurred, the heights of the sea surface in several specific regions calculated by three-dimensional image measurement are first compared with the normal tidal level observed when no tsunami occurs (S701). Based on the result of the comparison, each specific region is classified into one of three categories: a candidate region in which a tsunami has occurred, a region in which a tsunami may have occurred, or a region in which no tsunami has occurred.
When the differences between the measured sea surface heights in the plurality of specific areas and the normal tidal level are larger than a predetermined threshold value, the specific areas are classified as candidate regions in which a tsunami has occurred. When areas in which the difference between the sea surface height and the normal tidal level is larger than the predetermined threshold value and areas in which the difference is smaller than the threshold value are mixed among the plurality of specific areas, the specific areas are classified as regions in which a tsunami may have occurred. When the differences between the sea surface heights in the plurality of specific areas and the normal tidal level are all smaller than the predetermined threshold value, the specific areas are classified as regions in which no tsunami has occurred.
In a candidate region in which a tsunami has occurred or a region in which a tsunami may have occurred, a plurality of small areas around it are further added as measurement areas (S702), and the measured sea surface heights in these measurement areas are compared with the normal tidal levels in the measurement areas (S703). Based on the comparison results between the sea surface heights in these measurement areas and the normal tidal level, whether a tsunami has occurred is determined using fuzzy inference theory (S704), and the specific region is determined to be a region where a tsunami has occurred or a region where no tsunami has occurred. Sea surface abnormalities other than tsunamis, such as billows, flood tides, and rush tides, can be determined by the same procedure.
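A simplified sketch of the first-stage classification of S701 is shown below (Python is assumed; the numeric values are placeholders and the fuzzy-inference step of S704 is not reproduced):

```python
def classify_region(measured_heights, normal_tide_level, threshold):
    """Compare the measured sea surface heights of the small areas in one specific
    region with the normal tidal level, and classify the region into one of the
    three categories described above."""
    above = [abs(h - normal_tide_level) > threshold for h in measured_heights]
    if all(above):
        return "candidate region in which a tsunami has occurred"
    if any(above):
        return "region in which a tsunami may have occurred"
    return "region in which no tsunami has occurred"

# Hypothetical measurements (metres): normal tidal level 1.0 m, threshold 0.5 m.
print(classify_region([1.9, 2.1, 1.8], normal_tide_level=1.0, threshold=0.5))
```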
Tsunami travel speed calculation section 16, tsunami scale estimation section 17, and tsunami arrival time estimation section 18
The tsunami travel speed calculation unit 16 corrects the tsunami travel speed calculated based on the three-dimensional image measurement, using the tsunami travel speed calculated based on the conventional tsunami travel speed calculation method, and calculates a more accurate tsunami travel speed.
Fig. 16 is an image diagram of three-dimensional image measurement of the sea surface height at each time.
When the occurrence of a tsunami is measured as a sea surface abnormality, the height of the sea surface in the tsunami occurrence region and its peripheral regions is measured continuously, and the region with the greatest sea surface height is detected at each time. Further, by attitude control of the cameras 4A and 4B, the tsunami front, that is, the region with the greatest sea surface height closest to the cameras 4A and 4B, is tracked, and the distance to the tsunami front, that is, the Z value of the three-dimensional measurement result, is acquired.
As shown in fig. 16, four-dimensional information (X, Y, Z, t (measurement time)) of the change in sea surface height is acquired by three-dimensional image measurement of the sea surface height at each time, such as time 1 and time 2, and the change in the sea surface height of the tsunami front, its traveling direction, and its traveling speed, that is, the tsunami travel speed Vm calculated from image measurement, are calculated.
Further, the tsunami travel speed VC is calculated by equation (26) based on the conventional calculation method.
[ formula 26]
VC = √(g·h)  (26)
Wherein h is the water depth of the ocean in the measurement area, and g is the acceleration of gravity.
The tsunami travel speed VC calculated by the conventional calculation method is used to correct the tsunami travel speed Vm calculated by image measurement, and a more accurate tsunami travel speed V is calculated.
[ formula 27]
V = km·Vm + (1 − km)·VC  (27)
where km is a coefficient for tsunami speed correction.
The tsunami scale estimating unit 17 evaluates the scale of the tsunami based on the height of the tsunami. For example, the tsunami scale estimation unit 17 evaluates the height of the tsunami to six levels of-1, 0, 1,2, 3, and 4.
The tsunami arrival time estimation unit 18 calculates the time at which the tsunami arrives at the coast, based on the position of the tsunami front, the height of the tsunami, and the tsunami travel speed measured by three-dimensional image measurement.
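A minimal sketch of equations (26) and (27) together with the arrival-time estimate follows (Python is assumed; the coefficient km and the numeric example are placeholders):

```python
import math

def tsunami_speed_and_arrival(v_m, depth_h, distance_to_coast, k_m=0.5):
    """v_m: travel speed from image measurement (m/s), depth_h: water depth (m),
    distance_to_coast: distance from the tsunami front to the coast (m).
    Returns the corrected travel speed V of equation (27) and the estimated
    arrival time in seconds."""
    g = 9.8
    v_c = math.sqrt(g * depth_h)          # equation (26): conventional travel speed
    v = k_m * v_m + (1.0 - k_m) * v_c     # equation (27): corrected travel speed
    return v, distance_to_coast / v

# Example: Vm = 28 m/s, water depth 100 m, tsunami front 10 km offshore.
v, t = tsunami_speed_and_arrival(v_m=28.0, depth_h=100.0, distance_to_coast=10_000.0)
print(f"corrected speed {v:.1f} m/s, estimated arrival in about {t / 60:.1f} minutes")
```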
[ storage unit 20]
First, the observer creates a measurement plan as to when and which sea area to measure. The server 1 determines the posture of each camera according to the measurement plan, calculates the rotation angle of each camera, and transmits the rotation angle to each client computer 2A, 2B. The storage unit 20 stores the captured images of the cameras 4A and 4B acquired from the client computers 2A and 2B. The server 1 processes these images stored in the storage unit 20 to calculate and output whether or not a sea surface abnormality such as a tsunami has occurred.
[ two-stage control Unit 23]
In the sea surface measuring system according to the present embodiment, the photographing unit 10 includes a two-stage control unit 23 using machine and image control, which integrates mechanical control based on angle feedback and image measurement control based on image capture and image processing.
In the two-stage control unit 23, first, mechanical control based on angle feedback is performed using a low-resolution angle sensor, and the line of sight of each of the cameras 4A and 4B is roughly controlled so that the plurality of cameras 4A and 4B are directed to substantially the same sea area. As a target of the rough control, the line of sight of each camera 4A, 4B is roughly adjusted so that each camera 4A, 4B can photograph a part of the waves at the sea surface of the measurement target.
After the rough control, the cameras 4A and 4B take photographs, and the captured images are fed back so that the line of sight of each camera 4A, 4B is precisely controlled by image measurement control. The goal of this fine control is that, even in the presence of mechanical control errors, camera vibration, and the like, the waves on the sea surface of the measurement target are captured as fully as possible by each camera 4A, 4B.
Fig. 17 is an image diagram of mechanical control of the line of sight of the cameras. The mechanical control is the first-stage rough control, which ensures that the lines of sight of the two cameras 4A and 4B are directed to the same sea area. High accuracy of line-of-sight control is not required at this stage; it is sufficient that the two or more cameras 4A and 4B can be oriented toward the same sea area, that is, that several identical waves can be detected in the images captured by the two cameras 4A and 4B. Therefore, a low-resolution angle sensor with a resolution of about 0.1° can be used for the rough control.
Fig. 18 is an image diagram of fine control by image measurement control of two cameras. The same control method can be applied to the case where the number of cameras is three or more. The image measurement control is a second-stage precision control, and fine adjustment of the angles and images of the cameras 4A and 4B is performed by image feedback, thereby ensuring that the same target is captured by the two cameras 4A and 4B.
In order for the left and right cameras 4A and 4B to capture the same target, the respective target line-of-sight angles must be transmitted to the left and right cameras 4A and 4B. One of the plurality of cameras serves as the main camera and the others as sub-cameras; in the present embodiment, the left camera 4A is the main camera and the right camera 4B is the sub-camera, and the line-of-sight angle of the right camera 4B (the sub-camera) is automatically calculated from the line-of-sight angle of the left camera 4A (the main camera).
The key point of image measurement control is as follows: the images captured by the left and right cameras 4A and 4B are fed back and compared with the intended photographing target, and, based on the difference between them, the pan and tilt angles of the camera fixing and adjusting units 5A and 5B are further adjusted. This makes it possible to correct errors caused by vibration of the cameras 4A and 4B, such as the disturbance shown in fig. 18.
Fig. 19 is a flowchart showing the flow of fine control by image measurement control in the case of two cameras. First, the server 1 calculates, from the photographing target area, the line-of-sight angles used to adjust the postures of the cameras 4A and 4B, that is, the rotation angles in the pan and tilt directions. The line-of-sight angle of the right camera 4B is calculated by triangulation from the spatial relationship between the cameras 4A, 4B and the photographing target area, and the angles are transmitted to the client computers 2A and 2B that control the respective cameras 4A and 4B (S801).
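The angle calculation in S801 can be pictured geometrically: given each camera's installation position and the photographing target area, the pan (azimuth) and tilt (elevation) of its line of sight follow directly. The Python sketch below assumes a simple world coordinate convention with the z axis vertical; the function name and example values are illustrative, not taken from the embodiment.

```python
import math

def pan_tilt_to_target(cam_pos, target_pos):
    """Pan (azimuth) and tilt (elevation) angles, in degrees, that point
    a camera at cam_pos toward target_pos. Positions are (x, y, z) world
    coordinates with z vertical; the angle convention is an assumption.
    """
    dx = target_pos[0] - cam_pos[0]
    dy = target_pos[1] - cam_pos[1]
    dz = target_pos[2] - cam_pos[2]
    pan = math.degrees(math.atan2(dy, dx))
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return pan, tilt

# Example: main and sub cameras 25 m apart on a 20 m-high coast,
# both aimed at the same sea-surface target point (values are examples).
target = (1500.0, 300.0, 0.0)
print(pan_tilt_to_target((0.0, 0.0, 20.0), target))    # main camera 4A
print(pan_tilt_to_target((25.0, 0.0, 20.0), target))   # sub camera 4B
```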
The client computer 2A connected to the left camera 4A sets the rotation angle of the pan and tilt of the left camera 4A received from the server 1 as a target value of the line-of-sight angle, compares the actual line-of-sight angle of the left camera 4A acquired from the angle sensor with the target value of the line-of-sight angle, and adjusts the angles of the pan and tilt of the camera fixing and adjusting unit 5A so that the difference therebetween is minimized, thereby performing mechanical control (S802). The right camera 4B is also mechanically controlled in the same manner (S803).
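The mechanical control of S802 and S803 amounts to a simple angle-feedback loop on each axis. In the sketch below, read_angle and rotate_by are hypothetical placeholders standing in for the angle sensor and the camera fixing and adjusting unit; the loop only illustrates the idea of minimizing the difference between the actual and target line-of-sight angles.

```python
def mechanical_control(read_angle, rotate_by, target_deg, resolution_deg=0.1):
    """Drive one axis (pan or tilt) toward target_deg using angle feedback.

    read_angle() returns the current angle from the low-resolution sensor;
    rotate_by(delta) commands the camera fixing and adjusting unit.
    Both callables are hypothetical placeholders for the real hardware.
    """
    while True:
        error = target_deg - read_angle()
        if abs(error) <= resolution_deg:   # within sensor resolution: done
            return
        rotate_by(error)                   # reduce the remaining difference
```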
Then, the left camera 4A and the right camera 4B take photographs, and the waves are extracted from the left and right photographed images (S804, S805).
Next, a common region, which is a region existing in both the left and right images, is obtained from the waves extracted from the left and right images, and one wave at the center of the common region is extracted and set as a target wave for camera pose adjustment (S806).
The target wave is expected to be located at the center of both the left and right images. The target wave is the target of attitude control of the left and right cameras 4A and 4B, that is, the photographic target. For the left camera 4A, it is determined whether the target wave falls within a fixed range around the center of the image (S807); if it does, the attitude adjustment of the left camera 4A is ended, and if it does not, the above-described mechanical control is performed to adjust the pan and tilt angles (S809).
Next, a photograph is taken after the mechanical control, the target wave is extracted from the captured image, and it is checked whether image adjustment is necessary (S811, S813). If the target wave falls within a fixed area at the center of the image, image adjustment is not required; otherwise, the image is translated within a fixed range in the horizontal and vertical directions to perform image adjustment (S815).
After the above image adjustment, it is checked whether the target wave has been brought to the center of the left image. If the target wave lies in the central area of the image, the adjustment is terminated; otherwise, control returns to the mechanical control of S809 and the process is repeated (S809 to S815). In other words, if the target wave cannot be brought near the center of the image no matter how the image is translated, the range over which image measurement control can act is too narrow to achieve the control target, and the portion that cannot be covered by image measurement control is handled by mechanical control. The posture of the right camera 4B is adjusted in the same manner.
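Steps S807 to S815 thus form a loop in which image feedback wraps mechanical feedback. The sketch below expresses that flow for one camera; the four callables are hypothetical placeholders for the photo shooting, wave extraction, mechanical adjustment, and image translation described above, and the pixel tolerances are assumed values.

```python
def fine_control(take_photo, extract_target_wave, adjust_pan_tilt, translate_image,
                 image_size=(1920, 1080), max_iterations=10,
                 center_tolerance_px=30, max_shift_px=100):
    """Image-measurement control for one camera (S807-S815).

    take_photo()                   -> image
    extract_target_wave(image)     -> (x, y) pixel position of the target wave
    adjust_pan_tilt(dx, dy)        -> mechanical control step (S809)
    translate_image(image, dx, dy) -> image adjustment step (S815)
    All four are placeholders for the processing described in the text.
    """
    width, height = image_size
    for _ in range(max_iterations):
        image = take_photo()
        wave_x, wave_y = extract_target_wave(image)
        dx, dy = width // 2 - wave_x, height // 2 - wave_y
        if abs(dx) <= center_tolerance_px and abs(dy) <= center_tolerance_px:
            return True                      # target wave is centred: done
        if abs(dx) <= max_shift_px and abs(dy) <= max_shift_px:
            translate_image(image, dx, dy)   # small offset: image adjustment
        else:
            adjust_pan_tilt(dx, dy)          # large offset: mechanical control
    return False
```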
Finally, it is checked whether the adjustment of both the left and right cameras 4A and 4B is complete; if both are complete, the control of the cameras ends, otherwise the system waits until both are complete (S819, S820).
In other words, the image measurement control comprises angle feedback from an angle sensor that acquires the pan and tilt angles of the camera fixing and adjusting units 5A and 5B, which hold the cameras 4A and 4B so that their postures can be adjusted, and image feedback that extracts waves from the photographs taken by the cameras 4A and 4B to determine the camera postures. The angle feedback serves as the inner feedback and the image feedback as the outer feedback, forming a dual feedback structure in which the image feedback surrounds the angle feedback.
Industrial applicability
The sea surface measurement system, the sea surface measurement method, and the sea surface measurement program according to the present invention can be applied to measuring whether a tsunami has occurred, measuring the height of the tsunami when one has occurred, measuring the tsunami travel speed, estimating the tsunami arrival time, and the like. They are not limited to tsunamis: the height, position, and arrival time of waves generated by typhoons can also be measured as sea surface abnormalities. The system can further be applied to the measurement of sea surface abnormalities such as high waves, storm surges, and abnormal tides, and to coastal monitoring.

Claims (19)

1. A sea surface measuring system comprising:
a photographing unit which simultaneously photographs images including the same sea surface area by a plurality of cameras disposed at different locations;
a wave extraction unit that extracts waves from the respective images simultaneously captured by the plurality of cameras;
a wave correspondence unit that finds and corresponds the same wave from each image simultaneously captured by the plurality of cameras for each wave extracted by the wave extraction unit;
a three-dimensional information calculation unit that calculates a height of a sea surface and a height of a wave by three-dimensionally analyzing each wave in the image corresponding to the wave correspondence unit; and
a sea surface abnormality presence/absence determination unit that determines whether or not a sea surface abnormality has occurred based on the height of the sea surface and the height of the waves calculated by the three-dimensional information calculation unit,
wherein, the sea surface abnormal condition judging unit carries out the following processing:
comparing the calculated sea surface heights in the specific areas with the sea level at a normal time when no sea surface abnormality occurs, and classifying the specific areas, according to the comparison result, into one of three categories: a candidate area where sea surface abnormality occurs, an area where sea surface abnormality is likely to occur, or an area where sea surface abnormality does not occur;
classifying a plurality of specific areas into candidate areas where sea surface abnormality occurs, when differences between the measured sea surface heights in the specific areas and the sea surface level at a normal state where sea surface abnormality does not occur are both greater than a predetermined threshold value;
classifying the plurality of specific regions into regions where sea surface abnormalities are likely to occur, when there are regions where the difference between the height of the sea surface in the plurality of specific regions and the sea level at a normal time when sea surface abnormalities do not occur is larger than a predetermined threshold value and regions where the difference is smaller than the predetermined threshold value;
classifying the plurality of specific regions into regions where sea surface abnormality does not occur, when differences between the heights of the sea surfaces in the plurality of specific regions and the sea level at a normal time when sea surface abnormality does not occur are both smaller than a predetermined threshold value; and
a plurality of small areas are further added around a candidate area where sea surface abnormality occurs or an area where sea surface abnormality is likely to occur as a measurement area, and whether sea surface abnormality occurs or not is determined by using a fuzzy inference theory based on a comparison result between the height of the sea surface in the measurement area and the sea level at a normal time, so that the specific area is determined as an area where sea surface abnormality occurs or an area where sea surface abnormality does not occur.
2. A sea surface measuring system as defined in claim 1,
the three-dimensional information calculation unit calculates three-dimensional coordinates of feature points of each wave in the image corresponding to the wave correspondence unit, and calculates the height of the sea surface based on an average value of the calculated three-dimensional coordinates of the feature points in the height direction.
3. A sea surface measuring system as defined in claim 2,
the three-dimensional information calculation unit calculates the height of the wave based on the variance in the height direction of the three-dimensional coordinates of the feature points.
4. A sea surface measuring system as defined in any one of claims 1 to 3,
the sea surface abnormality presence/absence determination unit determines whether or not a sea surface abnormality has occurred by comparing the level at which the sea surface abnormality has not occurred when flat with the height of the sea surface obtained in consideration of the height of the waves calculated by the three-dimensional information calculation unit.
5. A sea surface measuring system as defined in any one of claims 1 to 3,
the plurality of cameras are two or more sets of cameras including a camera body, a lens, and a camera fixing and adjusting unit that fixes the plurality of cameras so that the postures of the cameras can be adjusted,
the sea surface measuring system comprises:
a client computer having a function of controlling photographing by adjusting parameters of the camera body and the lens, and a function of acquiring images photographed by the plurality of cameras; and
a server having a communication function for communicating with the client computer, the server controlling the client computer, sending commands required for photo shooting to the client computer, and acquiring the captured images of the cameras from the client computer.
6. A sea surface measuring system as defined in any one of claims 1 to 3,
the plurality of cameras have a telephoto function,
the plurality of cameras are positioned at a coast,
the distance between the cameras is more than 20 meters,
the altitude of the installation place of each camera is more than 20 meters.
7. A sea surface measuring system as defined in any one of claims 1 to 3,
comprises a camera fixing and adjusting part which respectively fixes the plurality of cameras in a way of adjusting postures and is used for adjusting the angles of the shooting direction along the up-down direction and the left-right direction by adjusting the angles in the pan direction and the tilt direction so as to enable the plurality of cameras to shoot the same sea surface,
the plurality of cameras simultaneously perform photographing according to an electronic external synchronization signal or a software synchronization command from a server.
8. A sea surface measuring system as defined in any one of claims 1 to 3,
the camera is a far infrared camera and comprises a fog influence reducing unit and a rain influence reducing unit,
the fog influence reducing unit reduces the influence of fog on the image captured by the camera using an atmospheric model based on the intensity component of sunlight and a transmittance distribution of the atmosphere above the sea surface that is linearly approximated along the vertical extent of the ocean image captured by the camera,
the rain influence reducing unit obtains a maximum value image and a minimum value image of the color intensity of the pixel by using a plurality of images continuously captured by the camera, and processes the maximum value image by applying a guide filter that guides the minimum value image.
9. A sea surface measuring system as defined in any one of claims 1 to 3,
the wave extraction means divides the image captured by the plurality of cameras into several adjacent rectangular or square small regions, finds a threshold value for binarizing the image for a local region including four corners of the small regions, i.e., the upper left corner, the upper right corner, the lower left corner, and the lower right corner,
the threshold for binarizing the attention point is obtained by multiplying a small region threshold value at four corners of the upper left corner, the upper right corner, the lower left corner, and the lower right corner of the small region in which the attention point exists by a weight coefficient depending on the absolute distance to the four corners.
10. A sea surface measuring system as defined in any one of claims 1 to 3,
the wave correspondence unit performs the following:
determining a corresponding candidate region from a plurality of images captured by the plurality of cameras by analyzing a horizontal-vertical coordinate relationship between the epipolar geometry and the two-dimensional image;
generating a two-dimensional feature vector characterized by the barycentric coordinates, the number of pixels, the circumference, the circularity, and the lengths in each direction of the images of each wave extracted in the corresponding candidate region of each image;
aiming at all waves in the same area of each image, finding out candidate waves based on epipolar constraint, solving the similarity of two-dimensional characteristic vectors of the candidate waves, and making a candidate wave list of the corresponding waves of each wave according to the sequence of the similarity from high to low; and
and determining whether the candidate wave corresponding to the wave is the corresponding wave by judging whether the difference between the three-dimensional world coordinate of the concerned wave and the three-dimensional world coordinate of the candidate wave is smaller than a predetermined threshold value.
11. A sea surface measuring system as defined in any one of claims 1 to 3,
the three-dimensional information calculation unit performs the following processing:
continuously shooting the same area through the plurality of cameras to obtain a plurality of images;
calculating three-dimensional world coordinates of feature points of each corresponding wave for the waves extracted from the shot image, wherein the feature points comprise the gravity center, the upper part, the lower part, the left end and the right end of the wave;
classifying each wave into several small areas based on the solved three-dimensional world coordinates of the waves;
calculating the height of the sea surface in a specific small area by calculating the average value of the coordinates of the three-dimensional world coordinates of the feature points in the direction vertical to the sea surface for all the waves corresponding to the small area and multiplying the average value by a coefficient; and
for all the waves that are associated in a specific small area, the variance of the coordinates of the three-dimensional world coordinates of the feature points in the direction perpendicular to the sea surface is determined, and the variance is multiplied by a coefficient to calculate the height of the waves in the small area.
12. A sea surface measuring system as defined in any one of claims 1 to 3,
further comprising a remote camera system calibration unit that performs an internal-external separated two-stage calibration using different targets, combining internal parameter calibration using a plate-shaped target with external parameter calibration using a cylindrical target,
two or more cylindrical objects or an elongated object having a constant width are used as the object,
the length of the target object is more than 1 meter,
the surface of the object has two or more colors that are easily clearly distinguished,
setting a point on a boundary line of regions of different colors of the object, a point at the center of the color, or a center of gravity as a characteristic point for calibration,
the size of the different colored regions of the object is set as a known parameter for calibration.
13. A sea surface measuring system as defined in any one of claims 1 to 3,
the tsunami navigation device further comprises a tsunami travel speed calculation unit which performs the following processing:
under the condition that the occurrence of the tsunami is measured as sea surface abnormality, measuring the height of the sea surface in a tsunami occurrence area and a peripheral area thereof at any time, and detecting an area with the largest height of the sea surface at any time;
tracking an area which is closest to the camera and has the largest sea surface height through the posture control of the camera, and acquiring the distance of the tsunami frontal surface;
acquiring four-dimensional information of height change of the sea surface by three-dimensional image measurement of the height of the sea surface at each moment, and calculating the height change, the advancing direction and the advancing speed of the sea surface at the tsunami frontal surface, namely the tsunami advancing speed; and
the tsunami travel speed calculated based on the three-dimensional image measurement is corrected using the tsunami travel speed calculated based on a conventional method for calculating the tsunami travel speed.
14. A sea surface measuring system as defined in any one of claims 1 to 3,
the tsunami arrival time estimation device further includes a tsunami arrival time estimation unit configured to calculate a time at which the tsunami arrives at the coast, based on the position of the tsunami front measured by the three-dimensional image measurement, the height of the tsunami, and the tsunami travel speed, when the occurrence of the tsunami is measured as the sea surface abnormality.
15. A sea surface measuring system as defined in any one of claims 1 to 3,
the photographing unit includes a two-stage type control unit,
the two-stage control unit combines mechanical control based on angle feedback with image measurement control based on image photographing and image processing,
in the mechanical control, the line of sight of each camera is roughly adjusted so that each camera can photograph a part of waves at the sea surface of the measurement target by the mechanical control based on the angle feedback obtained using the angle sensor,
in the image measurement control, each camera takes a picture, and the visual line of each camera is precisely controlled by feeding back the image taken by each camera.
16. A sea surface measuring system as defined in claim 15,
in the image measurement control,
when one camera of the plurality of cameras is set as a main camera and the other cameras are set as sub-cameras, first, a view angle of the main camera is calculated from a photographing target area and the calculated view angle of the main camera is set as a target value of the view angle of the main camera,
based on the installation positions of the main camera and the sub camera and the photographing target, the view angle of the sub camera is calculated by using the principle of triangulation, and the calculated view angle of the sub camera is set as a target value of the view angle of the sub camera.
17. A sea surface measuring system as defined in claim 15,
the image measurement control has an angle feedback using an angle sensor that acquires angles of pan and tilt of a camera fixed adjustment section that fixes the camera in a manner that the posture can be adjusted, and an image feedback that extracts waves from a photo shot of the camera to determine the posture of the camera, the angle feedback being set as an inner feedback, the image feedback being set as an outer feedback, the image feedback having a dual feedback structure around the angle feedback.
18. A sea surface measuring method comprising the steps of:
simultaneously capturing images including the same sea surface area by a plurality of cameras disposed at different locations;
extracting waves from respective images simultaneously captured by the plurality of cameras;
for each extracted wave, finding out and corresponding the same wave from each image shot by the plurality of cameras at the same time;
calculating the height of the sea surface and the height of the waves by performing three-dimensional analysis on the corresponding waves in the image; and
determining whether a sea surface abnormality occurs based on the calculated sea surface height and the calculated wave height,
wherein, the following processing is carried out in the step of judging whether sea surface abnormity occurs:
comparing the calculated sea surface heights in the specific areas with the sea level at a normal time when no sea surface abnormality occurs, and classifying the specific areas, according to the comparison result, into one of three categories: a candidate area where sea surface abnormality occurs, an area where sea surface abnormality is likely to occur, or an area where sea surface abnormality does not occur;
classifying a plurality of specific areas into candidate areas where sea surface abnormality occurs, when differences between the measured sea surface heights in the specific areas and the sea surface level at a normal state where sea surface abnormality does not occur are both greater than a predetermined threshold value;
classifying the plurality of specific regions into regions where sea surface abnormalities are likely to occur, when there are regions where the difference between the height of the sea surface in the plurality of specific regions and the sea level at a normal time when sea surface abnormalities do not occur is larger than a predetermined threshold value and regions where the difference is smaller than the predetermined threshold value;
classifying the plurality of specific regions into regions where sea surface abnormality does not occur, when differences between the heights of the sea surfaces in the plurality of specific regions and the sea level at a normal time when sea surface abnormality does not occur are both smaller than a predetermined threshold value; and
a plurality of small areas are further added around a candidate area where sea surface abnormality occurs or an area where sea surface abnormality is likely to occur as a measurement area, and whether sea surface abnormality occurs or not is determined by using a fuzzy inference theory based on a comparison result between the height of the sea surface in the measurement area and the sea level at a normal time, so that the specific area is determined as an area where sea surface abnormality occurs or an area where sea surface abnormality does not occur.
19. A storage medium storing a sea surface measurement program that causes a computer to execute the sea surface measuring method according to claim 18.
CN202010027415.3A 2019-01-11 2020-01-10 Sea surface measuring system, sea surface measuring method and storage medium Active CN111435081B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210062109.2A CN114526710A (en) 2019-01-11 2020-01-10 Sea surface measuring system, sea surface measuring method, and storage medium
CN202210062123.2A CN114485579A (en) 2019-01-11 2020-01-10 Sea surface measuring system, sea surface measuring method and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-003426 2019-01-11
JP2019003426A JP6858415B2 (en) 2019-01-11 2019-01-11 Sea level measurement system, sea level measurement method and sea level measurement program

Related Child Applications (2)

Application Number Title Priority Date Filing Date
CN202210062123.2A Division CN114485579A (en) 2019-01-11 2020-01-10 Sea surface measuring system, sea surface measuring method and storage medium
CN202210062109.2A Division CN114526710A (en) 2019-01-11 2020-01-10 Sea surface measuring system, sea surface measuring method, and storage medium

Publications (2)

Publication Number Publication Date
CN111435081A CN111435081A (en) 2020-07-21
CN111435081B true CN111435081B (en) 2022-03-08

Family

ID=71580254

Family Applications (3)

Application Number Title Priority Date Filing Date
CN202210062109.2A Pending CN114526710A (en) 2019-01-11 2020-01-10 Sea surface measuring system, sea surface measuring method, and storage medium
CN202210062123.2A Pending CN114485579A (en) 2019-01-11 2020-01-10 Sea surface measuring system, sea surface measuring method and storage medium
CN202010027415.3A Active CN111435081B (en) 2019-01-11 2020-01-10 Sea surface measuring system, sea surface measuring method and storage medium

Family Applications Before (2)

Application Number Title Priority Date Filing Date
CN202210062109.2A Pending CN114526710A (en) 2019-01-11 2020-01-10 Sea surface measuring system, sea surface measuring method, and storage medium
CN202210062123.2A Pending CN114485579A (en) 2019-01-11 2020-01-10 Sea surface measuring system, sea surface measuring method and storage medium

Country Status (2)

Country Link
JP (1) JP6858415B2 (en)
CN (3) CN114526710A (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102648567B1 (en) * 2021-07-06 2024-03-18 한국해양과학기술원 Deep visual domain adaptation method and system for estimating water elevation for target area through domain adaptation between source and target models obtained by different modalities
CN113971679A (en) * 2021-11-08 2022-01-25 南京智慧水运科技有限公司 Ocean tide measuring method based on computer vision and image processing
CN115482248B (en) * 2022-09-22 2023-12-08 推想医疗科技股份有限公司 Image segmentation method, device, electronic equipment and storage medium
CN116124181B (en) * 2023-04-14 2023-07-14 国家海洋技术中心 On-site calibration method and system for tide observation equipment
CN116758123A (en) * 2023-04-25 2023-09-15 威海凯思信息科技有限公司 Ocean wave image processing method and device and server
CN116975504B (en) * 2023-09-22 2023-12-15 中科星图测控技术股份有限公司 Rapid calculation method for satellite reconnaissance coverage area target

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07229736A (en) * 1994-02-18 1995-08-29 Mitsubishi Heavy Ind Ltd Device for calibrating three-dimensional measuring instrument
WO2004076972A1 (en) * 2003-02-27 2004-09-10 Mitsubishi Denki Kabushiki Kaisha Water level measuring system
JP4807439B2 (en) * 2009-06-15 2011-11-02 株式会社デンソー Fog image restoration device and driving support system
US10852134B2 (en) * 2017-05-08 2020-12-01 John W. Tauriac Real-time wave monitoring and sensing methods and systems

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09325027A (en) * 1996-06-04 1997-12-16 Tech Res & Dev Inst Of Japan Def Agency Method for measuring wave peak
JP2009229424A (en) * 2008-03-25 2009-10-08 Mitsubishi Electric Corp Tsunami monitoring system
JP2011242315A (en) * 2010-05-20 2011-12-01 Topcon Corp Electronic level
CN101887589A (en) * 2010-06-13 2010-11-17 东南大学 Stereoscopic vision-based real low-texture image reconstruction method
CN103782147A (en) * 2011-08-02 2014-05-07 纳克斯公司 Underwater detection apparatus
CN105136126A (en) * 2015-08-27 2015-12-09 国家海洋技术中心 Method for performing tsunami wave detection through deep sea bottom pressure data
JP2018092546A (en) * 2016-12-07 2018-06-14 株式会社日立製作所 Tsunami monitoring system

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Hao Yi, Lei Yan, Kazuhiro Tsujino, Cunwei Lu. A Long-distance Sea Wave Height Measurement Based on 3D Image. Progress In Electromagnetic Research Symposium, 2016, pp. 1-6. *
Cunwei Lu et al. A Sea Wave Height Measurement Method Based On 3-D Image Measurement Technique. International Society of Offshore and Polar Engineers, 2015, pp. 330-335. *
Hao Yi, Lei Yan, Kazuhiro Tsujino, Cunwei Lu. A Long-distance Sea Wave Height Measurement Based on 3D Image. Progress In Electromagnetic Research Symposium, 2016. *

Also Published As

Publication number Publication date
JP2020112438A (en) 2020-07-27
JP6858415B2 (en) 2021-04-14
CN114526710A (en) 2022-05-24
CN111435081A (en) 2020-07-21
CN114485579A (en) 2022-05-13

Similar Documents

Publication Publication Date Title
CN111435081B (en) Sea surface measuring system, sea surface measuring method and storage medium
JP6484729B2 (en) Unmanned aircraft depth image acquisition method, acquisition device, and unmanned aircraft
CN110926474B (en) Satellite/vision/laser combined urban canyon environment UAV positioning and navigation method
EP1792282B1 (en) A method for automated 3d imaging
CN108692719B (en) Object detection device
CN106384382A (en) Three-dimensional reconstruction system and method based on binocular stereoscopic vision
CN104902246A (en) Video monitoring method and device
RU2626051C2 (en) Method for determining distances to objects using images from digital video cameras
CN111915678B (en) Underwater monocular vision target depth positioning fusion estimation method based on depth learning
CN102387374A (en) Device and method for acquiring high-precision depth map
CN107710091B (en) System and method for selecting an operating mode of a mobile platform
Koryttsev et al. Practical aspects of range determination and tracking of small drones by their video observation
CN112837207A (en) Panoramic depth measuring method, four-eye fisheye camera and binocular fisheye camera
Crispel et al. All-sky photogrammetry techniques to georeference a cloud field
Savoy et al. Cloud base height estimation using high-resolution whole sky imagers
CN210986289U (en) Four-eye fisheye camera and binocular fisheye camera
CN111486820B (en) Measurement system, measurement method, and storage medium
CN110989645A (en) Target space attitude processing method based on compound eye imaging principle
CN106303412A (en) Refuse dump displacement remote real time monitoring apparatus and method based on monitoring image
US20220103762A1 (en) Imaging apparatus, imaging system, and imaging method
CN111412898B (en) Large-area deformation photogrammetry method based on ground-air coupling
RU2685761C1 (en) Photogrammetric method of measuring distances by rotating digital camera
Wang Towards real-time 3d reconstruction using consumer uavs
JPH04264207A (en) Measurement of multi-view point stereoscopic image
CN111931638B (en) Pedestrian re-identification-based local complex area positioning system and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant