WO2021235155A1 - Cloud location calculation system, method, and program - Google Patents

Cloud location calculation system, method, and program

Info

Publication number
WO2021235155A1
WO2021235155A1 (PCT/JP2021/016017)
Authority
WO
WIPO (PCT)
Prior art keywords
cloud
region
position information
area
sky
Prior art date
Application number
PCT/JP2021/016017
Other languages
French (fr)
Japanese (ja)
Inventor
祐弥 髙島 (Yuya Takashima)
Original Assignee
Furuno Electric Co., Ltd. (古野電気株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Furuno Electric Co., Ltd.
Priority to JP2022524336A priority Critical patent/JPWO2021235155A1/ja
Publication of WO2021235155A1 publication Critical patent/WO2021235155A1/en

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01W: METEOROLOGY
    • G01W 1/00: Meteorology
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques

Definitions

  • This disclosure relates to cloud position calculation systems, methods, and programs.
  • Patent Document 2 describes that two spherical cameras are used to calculate the altitude of clouds vertically above equipment whose distance from the two spherical cameras is known.
  • The present disclosure provides a cloud position calculation system, method, and program capable of calculating cloud position information for each cloud mass.
  • The cloud position calculation system of the present disclosure includes: an image acquisition unit that acquires sky images including at least the sky taken by a plurality of cameras arranged at different positions; a first cloud position calculation unit that calculates cloud position information in a first region based on the comparison of sky images acquired from at least two cameras among the plurality of sky images; an extraction unit that extracts cloud masses from the cloud position information in the first region; a second region setting unit that sets a second region that includes the extracted cloud masses and is narrower than the first region; and a second cloud position calculation unit that calculates the position information of a cloud mass in the second region based on the comparison of the images corresponding to the second region in the sky images acquired from at least two cameras among the plurality of sky images.
  • According to this configuration, the cloud position information in the relatively wide first region is calculated based on the comparison of the sky images, and the position information of the cloud mass is then calculated by comparing the sky images within the second region, which is narrower than the first region and includes the cloud mass extracted from that position information. The position information can therefore be calculated more accurately for each cloud mass than when the comparison focuses on part of a cloud mass in the sky image from the beginning. Moreover, since the cloud position information in the relatively wide first region is calculated first, overall consistency can be ensured, after which the estimation error of the position information of the local cloud mass in the second region can be reduced.
  • A figure showing the process of calculating the cloud position information in the third region based on the comparison of the sky images at altitude h = 2575 m.
  • A figure showing the process of calculating the cloud position information in the third region based on the comparison of the sky images at altitude h = 3575 m.
  • A figure showing the process of calculating the cloud position information in the third region based on the comparison of the sky images at altitude h = 1800 m.
  • the cloud position calculation system 1 is used for the observation system.
  • the system 1 includes two or more cameras 10 that capture the sky and a computer that processes the sky images taken by the cameras 10.
  • the system 1 acquires sky images from at least two cameras 10 arranged at different positions from each other and calculates unknown cloud position information based on those images.
  • the cloud position information includes at least one of the latitude / longitude information and the altitude information of the cloud.
  • the camera 10 may be any camera as long as it can photograph the sky.
  • an omnidirectional camera using a fisheye lens is installed facing upward in order to capture a wide area of the sky with one camera.
  • Since the camera 10 is installed facing vertically upward and level, the center of the sky image obtained from the camera 10 corresponds to the zenith (elevation angle 90°), and the elevation angle decreases from the center toward the edge of the image. Further, if the orientation of the camera 10 is known, the azimuth at each point in the image is also known. That is, every pixel at an arbitrary position in the sky image has a known elevation angle and a known azimuth.
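The pixel-to-angle relationship described above can be sketched as follows. This is only an illustration under an assumed equidistant (equiangular) fisheye projection, which the text does not specify; the function name and parameters are hypothetical.

```python
import math

def pixel_to_elevation_azimuth(px, py, cx, cy, radius):
    """Convert a pixel in an upward-facing fisheye sky image to
    (elevation, azimuth) in degrees, assuming an equidistant fisheye
    projection: radial distance from the image center is proportional
    to zenith angle.

    (cx, cy): image center (the zenith); radius: pixel radius at which
    the horizon (elevation 0 deg) is imaged.
    """
    dx, dy = px - cx, py - cy
    r = math.hypot(dx, dy)
    if r > radius:
        raise ValueError("pixel lies outside the sky circle")
    zenith_deg = 90.0 * r / radius          # equidistant model
    elevation = 90.0 - zenith_deg
    # Azimuth clockwise from north; assumes image +y points south
    azimuth = math.degrees(math.atan2(dx, -dy)) % 360.0
    return elevation, azimuth
```

The center pixel maps to elevation 90° (directly overhead), and pixels on the bounding circle map to elevation 0°, matching the description above.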
  • In FIG. 4, an example will be described in which a plurality of cameras 10 are installed at different positions on Hachijojima, Tokyo, Japan, clouds are observed, and cloud position information is calculated.
  • Each of the plurality of cameras 10 is installed at the camera installation points C1, C2, C3, and C4 shown in FIG.
  • the system 1 includes an image acquisition unit 11, a first cloud position calculation unit 12, an extraction unit 13, a second area setting unit 14, and a second cloud position calculation unit 15.
  • Each unit is realized by cooperation of software and hardware, in which a computer equipped with a processor 1b such as a CPU, a storage 1a such as a memory, and various interfaces executes a program stored in the memory in advance by the processor 1b.
  • the image acquisition unit 11 acquires aerial images G1, G2, and G3 including at least the sky taken by a plurality of cameras 10 installed at different positions.
  • four sky images are acquired, one from each camera installation point C1, C2, C3, C4. These four sky images are taken at the same time, with the clocks of the cameras 10 synchronized, so that the images can be compared with each other.
  • It is preferable that the capture times are the same as described above, but they need not be exactly the same; for example, a time error within 1 second is acceptable.
  • a plurality of sky images may be taken at times corresponding to each other, for example, in the same time slot.
  • the same time slot refers to, for example, the same time in a state where the clocks of a plurality of cameras are synchronized with each other.
  • the time difference between the cameras in the same time slot is ideally zero or small, but the effect can still be obtained even with a time difference of, for example, about 10 minutes. The difference is preferably within 2 minutes, more preferably within 1 minute, and still more preferably within 10 seconds.
  • the first cloud position calculation unit 12 calculates the cloud position information in the first region Ar1 based on the comparison of the sky images acquired from at least two cameras among the plurality of sky images.
  • the first region Ar1 is a geographical region based on latitude and longitude, as shown in FIGS. 4 and 14, and corresponds to, for example, a certain region on a map.
  • the first area Ar1 is set to include Hachijojima.
  • the first area Ar1 can be changed as appropriate.
  • FIGS. 13 and 14 show cloud position information (latitude/longitude and its distribution) in the first area Ar1 calculated by the first cloud position calculation unit 12.
  • the cloud position information in the first region Ar1 indicates the latitude / longitude, its distribution, and the altitude of the cloud.
  • the altitudes of the clouds are all the same.
  • three aerial images acquired from the three cameras 10 are used, but the present invention is not limited thereto.
  • the number of cameras may be two or more.
  • the cloud position information is calculated by comparing the two sky images acquired from the two cameras. That is, if the number of cameras is two or more, the number can be changed to any number, and the combination of two or more aerial images can be changed arbitrarily.
  • the first cloud position calculation unit 12 shown in FIG. 1 includes a common area setting unit 12a, a subregion setting unit 12b, a subregion cloud position calculation unit 12c, and a synthesis unit 12d.
  • the common area setting unit 12a shown in FIG. 1 sets a common area Ar1 used to combine a plurality of areas specified by the sets of sky images.
  • the common area is used as the first area Ar1, but the common area and the first area Ar1 do not have to be the same.
  • the common area setting unit 12a sets the geographical area as the common area Ar1 in order to generate the position information of the clouds.
  • the common area Ar1 is set based on the maximum and minimum values of the latitude and longitude of the camera installation points C1 to C4. Specifically, as one example, the common area Ar1 is set wider than the latitude/longitude bounding box of the installation points by a predetermined distance D1. In the example of FIG. 4, D1 = 5000 m.
  • the common area Ar1 is divided into a mesh with 50 m cells.
  • Two matrices whose elements correspond to the cells of the mesh of the common area Ar1 are generated: a cloud distribution matrix showing the distribution of clouds and a cloud altitude matrix showing the altitude of the cloud base.
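As a concrete sketch, the two mesh matrices could be allocated as below. The cell-to-degree conversion uses a simple spherical-earth approximation that the text does not specify, and the function name is illustrative only.

```python
import math
import numpy as np

def make_mesh_matrices(lat_min, lat_max, lon_min, lon_max, cell_m=50.0):
    """Allocate the cloud-distribution and cloud-altitude matrices for
    a geographic common area divided into square cells of cell_m
    metres (50 m in the embodiment)."""
    m_per_deg_lat = 111_320.0                        # approx metres per degree latitude
    mid_lat = math.radians((lat_min + lat_max) / 2)
    m_per_deg_lon = m_per_deg_lat * math.cos(mid_lat)
    n_rows = int(math.ceil((lat_max - lat_min) * m_per_deg_lat / cell_m))
    n_cols = int(math.ceil((lon_max - lon_min) * m_per_deg_lon / cell_m))
    cloud_distribution = np.zeros((n_rows, n_cols))  # 0.0 (no cloud) .. 1.0 (thick cloud)
    cloud_altitude = np.zeros((n_rows, n_cols))      # cloud-base altitude [m], 0 where no cloud
    return cloud_distribution, cloud_altitude
```

Both matrices start at zero, consistent with the convention below that elements without cloud hold the value 0.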
  • the small area setting unit 12b shown in FIG. 1 generates all sets of two cameras 10 from all the cameras 10 (C1 to C4), and sets the small area Ar10 for each set.
  • the subregion Ar10 is used to represent the position information of the clouds obtained by contrasting the aerial images obtained from the two corresponding cameras 10.
  • the third area Ar13 and the fourth area Ar14 will be described as an example.
  • the third region Ar13 and the fourth region Ar14 are each included in the first region Ar1, as shown in FIG. 12.
  • each subregion Ar10 is set based on the center of the two camera installation points, but the setting method can be changed as appropriate. In order to facilitate the synthesis process described later, each subregion Ar10 is generated by cutting out from the common region Ar1.
  • the subregion cloud position calculation unit 12c shown in FIG. 1 calculates the cloud position information in each set subregion Ar10 (third region Ar13, fourth region Ar14) based on the comparison of the sky images acquired from the corresponding two cameras 10.
  • a method of calculating the position information of clouds in the third region Ar13 based on the aerial images G1 and G2 will be described.
  • FIG. 6 is a diagram showing the relationship between the aerial images G1 and G2 and the third region Ar13.
  • the points are plotted so as to form a cross.
  • Each camera 10 has a known latitude and longitude. Further, every pixel in the sky images G1 and G2 has a known elevation angle and azimuth angle. Therefore, any pixel in a sky image can be associated with a latitude and longitude in the third region Ar13. Two specific examples, (1) and (2), are given below.
  • (1) The elevation angle θ1 and the azimuth angle are known for the pixel P1' (in which a cloud appears) in the sky image G1 shown in FIG. 6. If the altitude h is tentatively determined, the distance L1 from the point P1 on the ground surface, onto which P1' projects directly below, to the camera installation point C1 can be calculated. The latitude/longitude of the point P1 can then be calculated from the calculated distance L1, the known azimuth, and the latitude/longitude of the camera installation point C1; this is the latitude/longitude of the cloud.
  • (2) The distance L2 and the azimuth of the point P2 with respect to the camera installation point C2 are known. Therefore, if the altitude h is tentatively determined, the elevation angle θ2 toward the point P2' at altitude h directly above the point P2 on the ground surface can be calculated. From the calculated elevation angle θ2 and azimuth angle, the pixel P2' in the sky image G2 can be identified.
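Example (1) can be sketched numerically as follows. The flat-earth geodesy near the camera and the function name are assumptions for illustration, not taken from the text.

```python
import math

EARTH_M_PER_DEG_LAT = 111_320.0  # rough metres per degree of latitude

def cloud_latlon_from_pixel(cam_lat, cam_lon, elevation_deg, azimuth_deg, h):
    """Given a pixel's known elevation/azimuth and a tentatively fixed
    cloud altitude h [m], compute the latitude/longitude of the ground
    point P1 directly below the cloud (flat-earth approximation)."""
    # Horizontal distance L1 from the camera to the point below the cloud
    L1 = h / math.tan(math.radians(elevation_deg))
    north = L1 * math.cos(math.radians(azimuth_deg))
    east = L1 * math.sin(math.radians(azimuth_deg))
    dlat = north / EARTH_M_PER_DEG_LAT
    dlon = east / (EARTH_M_PER_DEG_LAT * math.cos(math.radians(cam_lat)))
    return cam_lat + dlat, cam_lon + dlon
```

For instance, a cloud seen at elevation 45° with a tentative altitude h = 1000 m lies 1000 m horizontally from the camera in the pixel's azimuth direction.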
  • the subregion cloud position calculation unit 12c uses the correspondence between the sky image and the third region Ar13 to express the cloud distribution over the third region Ar13, based on latitude and longitude, as a cloud distribution matrix A3, shown in FIG., and a cloud altitude matrix (not shown).
  • In the cloud distribution matrix A3, 1 is input as the value of an element corresponding to a mesh cell in which a cloud clearly exists, and 0.5 is input as the value of an element corresponding to a mesh cell in which a thin cloud exists.
  • In the cloud altitude matrix, the tentatively determined altitude h is input to the elements corresponding to the elements of the cloud distribution matrix A3 in which clouds exist.
  • In both the cloud distribution matrix A3 and the cloud altitude matrix, the values of elements in which no cloud exists are zero.
  • the values of the elements of the cloud distribution matrix A3 range from 0.0 to 1.0.
  • the value of the element is also called cloud cover.
  • the subregion cloud position calculation unit 12c tentatively determines the altitude h, generates the cloud distribution matrix A3_G1 and the cloud altitude matrix of the third region Ar13 based on the sky image G1 as shown in FIG. 12, and generates the cloud distribution matrix A3_G2 and the cloud altitude matrix of the third region Ar13 based on the sky image G2.
  • the subregion cloud position calculation unit 12c calculates the degree of coincidence (reliability) between the cloud distribution matrices A3_G1 and A3_G2.
  • As the evaluation function, ZNCC (Zero-mean Normalized Cross-Correlation) is used, but the evaluation function is not limited to this; pattern matching by another method may be used.
  • the subregion cloud position calculation unit 12c calculates the cloud distribution matrices A3_G1 and A3_G2, the cloud altitude matrix, and the reliability ZNCC repeatedly while varying the altitude h in predetermined steps.
  • the cloud distribution matrix and the cloud altitude matrix corresponding to the highest reliability ZNCC are adopted, and, as shown in FIG. 12, the reliability ZNCC3_FIX, the cloud distribution matrix A3_FIX, and the cloud altitude matrix are stored in the storage unit. A similar process is performed for all subregions Ar10.
  • Similarly, for the fourth region Ar14, a plurality of sets of the cloud distribution matrices A4_G2 and A4_G3, the cloud altitude matrix, and the reliability ZNCC are calculated based on the comparison of the sky images G2 and G3, and the maximum reliability ZNCC4_FIX, the cloud distribution matrix A4_FIX, and the cloud altitude matrix are stored in the storage unit.
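The altitude search described above can be sketched as follows. The callback `make_matrices` is a hypothetical stand-in for projecting the two sky images onto the subregion at a given tentative altitude; the search bounds and step are illustrative (the embodiment searches 0 to 4000 m).

```python
import numpy as np

def zncc(a, b):
    """Zero-mean Normalized Cross-Correlation (see above)."""
    a = np.ravel(a) - np.mean(a)
    b = np.ravel(b) - np.mean(b)
    d = np.sqrt((a @ a) * (b @ b))
    return float(a @ b / d) if d else 0.0

def search_cloud_altitude(make_matrices, h_min=0.0, h_max=4000.0, step=25.0):
    """Try tentative altitudes h in [h_min, h_max] and keep the result
    with the highest ZNCC reliability. `make_matrices(h)` returns the
    pair (A_from_G1, A_from_G2) of cloud distribution matrices built
    for altitude h."""
    best = (-2.0, None, None)  # (reliability, altitude, matrices)
    h = h_min
    while h <= h_max:
        a1, a2 = make_matrices(h)
        r = zncc(a1, a2)
        if r > best[0]:
            best = (r, h, (a1, a2))
        h += step
    return best  # (ZNCC_FIX, h_FIX, (A_FIX_G1, A_FIX_G2))
```

The adopted matrices and reliability correspond to ZNCC3_FIX / A3_FIX in the text; narrowing [h_min, h_max] around a prior estimate, as the second cloud position calculation unit does later, reduces the number of iterations.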
  • the synthesis unit 12d shown in FIG. 1 obtains cloud position information of each subregion Ar10 (including the third region Ar13 and the fourth region Ar14) calculated by the subregion cloud position calculation unit 12c. Synthesize to the common area Ar1.
  • the cloud position information in the third region Ar13 and the cloud position information in the fourth region Ar14 are combined to calculate the cloud position information in the first region Ar1.
  • For the region where the third region Ar13 and the fourth region Ar14 do not overlap, the values of the corresponding elements of the respective cloud distribution matrices A3 and A4 may be adopted as they are.
  • For the area where the third area Ar13 and the fourth area Ar14 overlap, it is preferable to calculate the cloud position information at the overlapping point based on the corresponding cloud position information of the third area Ar13 and the fourth area Ar14, weighted by the respective reliabilities ZNCC3_FIX and ZNCC4_FIX.
  • the value A(i, j) of an element of the cloud distribution matrix A at the overlapping portion can be calculated as a reliability-weighted combination of the corresponding elements.
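One plausible form of this reliability-weighted combination is a weighted average, sketched below. The published text omits the equation itself, so the exact formula here is an assumption.

```python
def blend_overlap(a3, a4, zncc3, zncc4):
    """Reliability-weighted blend of overlapping cloud-distribution
    values: A(i, j) = (ZNCC3 * A3(i, j) + ZNCC4 * A4(i, j)) /
    (ZNCC3 + ZNCC4). Works elementwise on scalars or numpy arrays
    (assumed weighted-average form)."""
    return (zncc3 * a3 + zncc4 * a4) / (zncc3 + zncc4)
```

With this form, the subregion whose matching was more reliable dominates the blended value, and equal reliabilities reduce to a plain average.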
  • FIGS. 8 to 11 are diagrams showing the process of calculating the position information of clouds in the third region Ar13 based on the comparison of the sky images G1 and G2.
  • the extraction unit 13 shown in FIG. 1 extracts cloud masses (m1 to m5) from the position information of clouds in the first region Ar1.
  • each element of the cloud distribution matrix in the first region Ar1 has a numerical value indicating the existence of a cloud.
  • elements with a high probability that a cloud exists are shown in white, and elements with thin cloud are shown in gray.
  • Although the altitude of every cloud in the first region Ar1 is calculated as being the same, the altitude actually differs for each cloud mass, so the extraction unit 13 extracts the cloud masses.
  • the extraction unit 13 extracts the elements of the cloud distribution matrix having a value equal to or greater than a predetermined value and groups them by a labeling algorithm commonly used in image processing.
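The thresholding-plus-labeling step can be sketched with a simple 4-connected flood-fill labeler; the patent only says "a labeling algorithm used for general image processing", so the threshold value and connectivity here are illustrative.

```python
from collections import deque
import numpy as np

def extract_cloud_masses(cloud_distribution, threshold=0.5):
    """Threshold the cloud distribution matrix and group cloudy cells
    into 4-connected components. Returns (labels, n) where labels is
    a matrix with 0 = no cloud and 1..n = cloud mass id."""
    cloudy = np.asarray(cloud_distribution) >= threshold
    labels = np.zeros(cloudy.shape, dtype=int)
    next_label = 0
    for i in range(cloudy.shape[0]):
        for j in range(cloudy.shape[1]):
            if cloudy[i, j] and labels[i, j] == 0:
                next_label += 1               # start a new cloud mass
                queue = deque([(i, j)])
                labels[i, j] = next_label
                while queue:                  # breadth-first flood fill
                    y, x = queue.popleft()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if (0 <= ny < cloudy.shape[0] and 0 <= nx < cloudy.shape[1]
                                and cloudy[ny, nx] and labels[ny, nx] == 0):
                            labels[ny, nx] = next_label
                            queue.append((ny, nx))
    return labels, next_label
```

Each resulting label corresponds to one cloud mass (m1 to m5 in the embodiment), for which a second region is then set.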
  • the second area setting unit 14 shown in FIG. 1 sets the second areas Ar21 to Ar25 including the cloud mass extracted by the extraction unit 13.
  • the second region Ar2 may be narrower than the first region Ar1 and may include a part of at least one cloud mass. More preferably, the second region Ar2 includes at least the boundary between the extracted cloud mass and the sky. More preferably, the second region Ar2 includes all the boundaries of the extracted cloud mass with the sky.
  • the second region Ar21 is set for the cloud mass m1
  • the second region Ar22 is set for the cloud mass m2
  • the second region Ar23 is set for the cloud mass m3.
  • the second region Ar24 is set for the cloud mass m4, and the second region Ar25 is set for the cloud mass m5. These examples are set to include all the boundaries of one cloud mass with the sky.
  • the second region Ar21 includes all the boundaries of the cloud mass m1 with the sky.
  • the second cloud position calculation unit 15 shown in FIG. 1 calculates the position information of the cloud masses in the second regions Ar21 to Ar25 based on the comparison of the images corresponding to the second regions Ar21 to Ar25 in the sky images acquired from at least two cameras 10 among the plurality of sky images.
  • Take the second region Ar21 shown in FIG. 14 as an example. First, the two cameras 10 (C1, C2) closest to the second region Ar21 are selected, and the sky images G1 and G2 obtained from those two cameras 10 are acquired. Using these two sky images G1 and G2, the cloud position information [latitude/longitude (cloud distribution matrix A) and cloud altitude matrix] is calculated for the second region Ar21.
  • the method of calculating the cloud position information is the same as that of the first cloud position calculation unit 12. The cloud position information is calculated similarly for the second regions Ar22, Ar23, Ar24, and Ar25. In this embodiment, the first cloud position calculation unit 12 searched the altitude in the range of 0 to 4000 m, whereas the second cloud position calculation unit 15 searched in a range of ±500 m centered on 1800 m. Since the altitude of the cloud is estimated in advance in this way, the search range can be narrowed and the calculation cost reduced.
  • each calculation result is overwritten in the first region Ar1.
  • the results are shown in FIG.
  • the altitude of the cloud mass m1 was 1948 m
  • the altitude of the cloud mass m2 was 1934 m
  • the altitude of the cloud mass m3 was 1905 m
  • the altitude of the cloud mass m4 was 1965 m
  • the altitude of the cloud mass m5 was 1750 m. In this way, a different altitude can be calculated for each cloud mass.
  • In step ST100, the image acquisition unit 11 acquires sky images G1, G2, and G3 including at least the sky taken by a plurality of cameras 10 arranged at different positions from each other.
  • Next, the first cloud position calculation unit 12 calculates the cloud position information in the first region Ar1 based on the comparison of the sky images acquired from at least two cameras 10 among the plurality of sky images G1, G2, and G3. A detailed explanation is given later with reference to FIG.
  • the extraction unit 13 extracts cloud masses m1 to m5 from the cloud position information in the first region Ar1.
  • the second area setting unit 14 sets the second area Ar2 (Ar21, Ar22, Ar23, Ar24, Ar25) including the extracted cloud masses m1 to m5 and narrower than the first area Ar1.
  • the second cloud position calculation unit 15 determines the second region Ar2 (Ar21, Ar22, Ar23, in the aerial image acquired from at least two cameras 10 among the plurality of aerial images G1, G2, G3. Based on the comparison of the images corresponding to Ar24, Ar25), the position information of the cloud mass in the second region Ar2 (Ar21, Ar22, Ar23, Ar24, Ar25) is calculated.
  • the processes of steps ST103 and ST104 are executed for each cloud mass.
  • In step ST200, the common area setting unit 12a sets the common area Ar1, which is also the first area Ar1, based on the camera installation points C1 to C4, as shown in FIG.
  • In step ST201, the subregion setting unit 12b generates all sets of two cameras 10 from all the cameras 10 (C1 to C4) and sets a subregion Ar10 for each set.
  • In step ST202, the subregion cloud position calculation unit 12c calculates the cloud position information in each set subregion Ar10 (third region Ar13, fourth region Ar14) based on the comparison of the sky images acquired from the corresponding two cameras 10.
  • the first region Ar1 includes the third region Ar13 and the fourth region Ar14.
  • the subregion cloud position calculation unit 12c calculates the cloud position information in the third region Ar13 based on the first set of sky images (G1, G2) acquired from at least two cameras 10.
  • the subregion cloud position calculation unit 12c calculates the cloud position information in the fourth region Ar14 based on the second set of sky images (G2, G3), a combination different from the first set, acquired from at least two cameras 10.
  • the synthesis unit 12d synthesizes the calculated cloud position information of each subregion Ar10 (including the third region Ar13 and the fourth region Ar14) into the common region Ar1 and in the first region Ar1. Calculate cloud position information.
  • the cloud position information in the third region Ar13 and the cloud position information in the fourth region Ar14 are combined to calculate the cloud position information in the first region Ar1.
  • As described above, the cloud position calculation system 1 of the present embodiment includes: an image acquisition unit 11 that acquires sky images G1 to G3 including at least the sky taken by a plurality of cameras 10 arranged at different positions; a first cloud position calculation unit 12 that calculates the cloud position information in the first area Ar1 based on the comparison of the sky images acquired from at least two cameras 10 among the sky images G1 to G3; an extraction unit 13 that extracts cloud masses m1 to m5 from the cloud position information in the first area Ar1; a second area setting unit 14 that sets a second area Ar2 that includes the extracted cloud masses m1 to m5 and is narrower than the first area Ar1; and a second cloud position calculation unit 15 that calculates the position information of the cloud mass in the second area Ar2 based on the comparison of the images corresponding to the second area Ar2 in the sky images acquired from at least two cameras 10.
  • The cloud position calculation method of the present embodiment includes: acquiring sky images G1 to G3 including at least the sky taken by a plurality of cameras 10 arranged at different positions (ST100); calculating the cloud position information in the first region Ar1 based on the comparison of the sky images acquired from at least two cameras 10 among the plurality of sky images G1 to G3 (ST101); extracting cloud masses m1 to m5 from the cloud position information in the first region Ar1 (ST102); setting a second area Ar2 that includes the extracted cloud masses m1 to m5 and is narrower than the first area Ar1 (ST103); and calculating the position information of the cloud mass in the second region Ar2 based on the comparison of the images corresponding to the second region Ar2 in the sky images acquired from at least two cameras 10 (ST104).
  • According to this, the cloud position information in the relatively wide first region Ar1 is calculated based on the comparison of the sky images, and the position information of the cloud masses m1 to m5 is then calculated by comparing the sky images within the second region Ar2, which is narrower than the first region Ar1 and includes the cloud masses extracted from that position information. The position information can therefore be calculated more accurately for each cloud mass than when the comparison focuses on part of a cloud mass in the sky image from the beginning. Moreover, since the cloud position information in the relatively wide first region Ar1 is calculated first, overall consistency can be ensured, after which the estimation error of the position information of the local cloud masses in the second region can be reduced.
  • As in this embodiment, it is preferable that the first region Ar1 includes the third region Ar13 and the fourth region Ar14, and that the first cloud position calculation unit 12 calculates the cloud position information in the third region Ar13 based on the first set of sky images (G1, G2) acquired from at least two cameras 10, calculates the cloud position information in the fourth region Ar14 based on the second set of sky images (G2, G3), a combination different from the first set, acquired from at least two cameras 10, and combines the cloud position information in the third region Ar13 and the cloud position information in the fourth region Ar14 to calculate the cloud position information in the first region Ar1.
  • According to this, the cloud position information in the first region Ar1 is calculated by combining the cloud position information of a plurality of regions obtained from a plurality of sets of sky images, so that wide-area cloud position information can be calculated accurately.
  • As in this embodiment, it is preferable that the first cloud position calculation unit 12 calculates the reliability ZNCC3 of the third region Ar13 together with the cloud position information in the third region Ar13, calculates the reliability ZNCC4 of the fourth region Ar14 together with the cloud position information in the fourth region Ar14, and, where the third region Ar13 and the fourth region Ar14 overlap, calculates the cloud position information at the overlapping location based on the corresponding cloud position information of the third region Ar13 and the fourth region Ar14, weighted by the respective reliabilities ZNCC3 and ZNCC4.
  • According to this, the cloud position information at the overlapping points is calculated using the reliabilities ZNCC3 and ZNCC4 of the respective regions as weights, so that wide-area cloud position information can be acquired while improving the accuracy of the cloud position information.
  • Preferably, the second cloud position calculation unit 15 uses sky images acquired from at least two cameras 10 near the second area Ar2.
  • According to this, the influence of the height component of the clouds in the second region Ar2 reflected in the sky image can be reduced, and the accuracy can be improved.
  • the second region Ar2 includes at least the boundary between the extracted cloud masses m1 to m5 and the sky.
  • Since the second region Ar2 includes at least the boundary between the extracted cloud masses m1 to m5 and the sky, features such as the shape and arrangement of the cloud masses can be taken into consideration, and the accuracy can be improved.
  • the cloud position information includes at least one of cloud altitude information and latitude / longitude information. This is a preferred embodiment.
  • the first region Ar1 or the second region Ar2 is meshed along the latitude and longitude, and it is preferable that the cloud position information is represented for each cell of the mesh. This is a preferred embodiment.
  • the program according to this embodiment is a program that causes a computer to execute the above method. Further, the non-transitory computer-readable recording medium according to the present embodiment stores the above program.
  • Each process such as an operation, procedure, step, or stage in the devices, systems, programs, and methods shown in the claims, specification, and drawings can be realized in any order unless the output of an earlier process is used in a later process. Even if the flow in the claims, specification, and drawings is described using "first", "next", and the like for convenience, this does not mean that execution in that order is essential.
  • Each of the units 12 to 17 shown in FIG. 1 is realized by one or more processors executing a predetermined program, but each unit may instead be configured by a dedicated memory or a dedicated circuit.
  • In the system 1 of the present embodiment, the units 11 to 19 are implemented on the processor 1b of one computer, but the units 11 to 19 may be distributed across a plurality of computers or the cloud. That is, the above method may be executed by one or more processors.
  • In the above embodiment, all of the units 11 to 19 are implemented, but some of them may be omitted as appropriate.
  • For example, an embodiment in which only the units 11 to 17 are implemented is possible.

Abstract

In order to provide a cloud location calculation system, method, and program which can calculate location information about clouds for each mass of clouds, the system (1) comprises: an image acquisition unit (11) which acquires sky images (G1–G3) that include at least the sky and are captured by a plurality of cameras (10) disposed at different locations; a first cloud location calculation unit (12) which calculates location information about clouds in a first area (Ar1) on the basis of a comparison between sky images that are among the plurality of sky images (G1-G3) and are acquired by at least two cameras (10); an extraction unit (13) which extracts masses of clouds (m1-m5) from the location information about the clouds in the first area (Ar1); a second area setting unit (14) which sets a second area (Ar2) that includes the extracted masses of clouds (m1-m5) and is narrower than the first area (Ar1); and a second cloud location calculation unit (15) which calculates location information about a mass of clouds in the second area (Ar2) on the basis of a comparison between images corresponding to the second area (Ar2) in the sky images that are among the plurality of sky images (G1-G3) and are acquired from at least two cameras (10).

Description

Cloud position calculation system, method, and program
 This disclosure relates to cloud position calculation systems, methods, and programs.
 As a cloud observation method, it is known to use an all-sky camera installed on the ground. For example, Patent Document 1 describes capturing the sky with a single all-sky camera, acquiring all-sky images at different times, and tracking the movement of the same cloud across the images to determine the cloud's speed and direction of movement.
 Patent Document 2 describes using two all-sky cameras to calculate the altitude of a cloud vertically above equipment whose distance from the two cameras is known.
Japanese Utility Model Application Publication No. Sho 57-160681 (Patent Document 1); Japanese Unexamined Patent Publication No. 2012-242322 (Patent Document 2)
 However, in the method of Patent Document 1, the altitude of the clouds is unknown and is therefore tentatively assumed. The method of Patent Document 2 appears able to calculate only the altitude of a cloud directly above a location whose distance from the cameras is known. Thus, no method has yet been proposed for calculating position information (altitude, latitude, and longitude) for each of the multiple clouds appearing in an image.
 The present disclosure provides a cloud position calculation system, method, and program capable of calculating cloud position information for each cloud mass.
 The cloud position calculation system of the present disclosure comprises: an image acquisition unit that acquires sky images, each including at least the sky, captured by a plurality of cameras arranged at mutually different positions; a first cloud position calculation unit that calculates cloud position information in a first region based on a comparison of the sky images acquired from at least two of the cameras; an extraction unit that extracts cloud masses from the cloud position information in the first region; a second region setting unit that sets a second region that includes an extracted cloud mass and is narrower than the first region; and a second cloud position calculation unit that calculates position information of the cloud mass in the second region based on a comparison of the portions of the sky images, acquired from at least two of the cameras, that correspond to the second region.
 In this way, cloud position information in the relatively wide first region is calculated based on a comparison of the sky images, and the position information of each cloud mass is then calculated by comparing the sky images over a second region that contains a cloud mass extracted from that position information and is narrower than the first region. Position information can therefore be calculated accurately for each cloud mass, compared with the case where the comparison focuses on individual cloud masses in the sky images from the start. Moreover, because the cloud position information in the relatively wide first region is calculated first, overall consistency is ensured, and the estimation error in the local position information of each cloud mass in the second region can then be reduced.
A block diagram showing the configuration of the cloud position calculation system.
A flowchart showing processing executed by the cloud position calculation system.
A flowchart showing processing executed by the cloud position calculation system.
An explanatory diagram concerning the setting of the first region.
A diagram showing the relationship between the second region and the third region included in the first region.
A diagram showing the relationship between the sky images and the third region.
An explanatory diagram concerning the conversion of clouds in the third region into a cloud distribution matrix.
A diagram showing the process of calculating cloud position information in the third region based on a comparison of sky images at an altitude of h = 1325 m.
A diagram showing the process of calculating cloud position information in the third region based on a comparison of sky images at an altitude of h = 2575 m.
A diagram showing the process of calculating cloud position information in the third region based on a comparison of sky images at an altitude of h = 3575 m.
A diagram showing the process of calculating cloud position information in the third region based on a comparison of sky images at an altitude of h = 1800 m.
A diagram showing the process of calculating cloud distribution matrices for the third and fourth regions from the sky images obtained at the camera installation points and obtaining cloud position information in the first region.
A diagram showing the process of extracting cloud masses from the cloud position information in the first region.
A diagram showing the cloud masses in the first region and the second regions.
A diagram showing the calculated altitudes of the cloud masses included in the first region.
 Hereinafter, the cloud position calculation system of the present disclosure will be described with reference to the drawings.
 The cloud position calculation system 1 is used in an observation system. The system 1 includes two or more cameras 10 that image the sky and a computer that processes the sky images captured by the cameras 10. The system 1 acquires sky images from at least two cameras 10 arranged at mutually different positions and calculates unknown cloud position information based on the images. In this specification, cloud position information includes at least one of latitude/longitude information and altitude information of a cloud. The camera 10 may be any camera capable of photographing the sky. In this embodiment, an all-sky camera using a fisheye lens is installed facing upward so that a wide area of the sky can be captured with a single camera. If the camera 10 is installed horizontally and pointing vertically upward, the center of the sky image obtained from the camera 10 corresponds to the zenith (elevation angle 90°), and the elevation angle decreases from the center toward the edge of the image. Further, if the orientation of the camera 10 is known, the azimuth in the image is also known. That is, every pixel in the sky image has a known elevation angle and a known azimuth.
 In this embodiment, as shown in FIG. 4, an example will be described in which a plurality of cameras 10 are installed at mutually different positions on Hachijojima, Tokyo, Japan, clouds are observed, and cloud position information is calculated. The cameras 10 are installed at the camera installation points C1, C2, C3, and C4 shown in FIG. 4, one at each point.
 As shown in FIG. 1, the system 1 includes an image acquisition unit 11, a first cloud position calculation unit 12, an extraction unit 13, a second region setting unit 14, and a second cloud position calculation unit 15. Each of these units 11 to 15 is realized by cooperation of software and hardware: in a computer provided with a processor 1b such as a CPU, a storage 1a such as a memory, and various interfaces, the processor 1b executes a program stored in advance in the memory.
 <Image acquisition unit 11>
 As shown in FIGS. 4 and 12, the image acquisition unit 11 acquires sky images G1, G2, G3, each including at least the sky, captured by a plurality of cameras 10 installed at mutually different positions. In this embodiment, four sky images are acquired, one from each of the camera installation points C1, C2, C3, and C4 shown in FIG. 4. To allow them to be compared with one another, these four sky images are captured at the same time, with the clocks of the cameras 10 synchronized; the time error is within one second. Capture at the same time is preferable in this way, but strict simultaneity is not required: it suffices that the sky images are captured at mutually corresponding times, for example within the same time window. Here, the same time window means the same time with the clocks of the plurality of cameras synchronized with one another. However, because the field in which the present invention is practiced compares images covering a very wide area, the system can still be effective even with a time difference of, for example, about 10 minutes between cameras, although a time difference of zero or as small as possible is preferable. The difference is preferably within 2 minutes, more preferably within 1 minute, and still more preferably within 10 seconds.
 <First cloud position calculation unit 12>
 The first cloud position calculation unit 12 calculates cloud position information in the first region Ar1 based on a comparison of the sky images acquired from at least two of the cameras. As shown in FIGS. 4 and 14, the first region Ar1 is a geographical region based on latitude and longitude, corresponding, for example, to a certain area on a map. In the examples of FIGS. 4 and 14, the first region Ar1 is set so as to include Hachijojima; it can be changed as appropriate. FIGS. 13 and 14 show the cloud position information (latitude/longitude and its distribution) in the first region Ar1 calculated by the first cloud position calculation unit 12. The cloud position information in the first region Ar1 indicates the latitude/longitude and distribution of the clouds and their altitude; at this stage, all clouds are assigned the same altitude. The examples shown in FIGS. 4 to 11 use three sky images acquired from three cameras 10, but the number of cameras is not limited to this and may be any number of two or more. With two cameras, the cloud position information is calculated by comparing the two sky images acquired from them. That is, the number of cameras can be changed to any number of two or more, and the combination of sky images used for comparison can be changed arbitrarily.
 A specific description of the configuration and operation of the first cloud position calculation unit 12 is given below; the present embodiment is not limiting, as long as the cloud position information in the first region Ar1 can be calculated. The first cloud position calculation unit 12 shown in FIG. 1 includes a common region setting unit 12a, a sub-region setting unit 12b, a sub-region cloud position calculation unit 12c, and a synthesis unit 12d.
 The common region setting unit 12a shown in FIG. 1 sets the common region Ar1, which is used to combine the plurality of regions specified by the pairs of sky images. In this embodiment, the common region is used as the first region Ar1, but the common region and the first region Ar1 need not be the same. As shown in FIG. 4, the common region setting unit 12a sets a geographical area as the common region Ar1 for generating cloud position information. In the example of FIG. 4, the common region Ar1 is set based on the maximum and minimum latitudes and longitudes of the camera installation points C1 to C4; specifically, it extends a predetermined distance D1 beyond those maxima and minima, although this is only an example. In the example of FIG. 4, D1 = 5000 m, and the common region Ar1 is divided into a mesh of 50 m cells. Two matrices corresponding to the cells of the mesh of the common region Ar1 are generated: a cloud distribution matrix indicating the distribution of clouds, and a cloud altitude matrix indicating the altitude of the cloud base.
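As a concrete sketch of this mesh setup, the following illustrative Python snippet builds a zero-filled cloud distribution matrix over a padded bounding box. The function name, the flat local-coordinate assumption (meters), and the return layout are assumptions for illustration, not part of the disclosure.

```python
# Illustrative sketch: build the 50 m mesh over the common region Ar1.
# Coordinates are assumed to be local planar coordinates in meters;
# the pad of 5000 m mirrors the D1 value in the example above.
def make_mesh(min_xy, max_xy, pad=5000.0, cell=50.0):
    """Return the mesh size and a zero-filled cloud distribution matrix."""
    cols = int(((max_xy[0] - min_xy[0]) + 2 * pad) // cell) + 1
    rows = int(((max_xy[1] - min_xy[1]) + 2 * pad) // cell) + 1
    cloud_distribution = [[0.0] * cols for _ in range(rows)]
    return rows, cols, cloud_distribution
```

A cloud altitude matrix of the same shape would be generated in the same way.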
 The sub-region setting unit 12b shown in FIG. 1 generates every pair of two cameras 10 from all the cameras 10 (C1 to C4) and sets a sub-region Ar10 for each pair. A sub-region Ar10 is used to represent the cloud position information obtained by comparing the sky images from the corresponding two cameras 10. In this embodiment, as shown in FIG. 5, the third region Ar13 and the fourth region Ar14 will be described as examples of sub-regions Ar10 set by the sub-region setting unit 12b. The third region Ar13 and the fourth region Ar14 are each included in the first region Ar1. As shown in FIG. 12, the third region Ar13 shown in FIG. 5 is set in order to calculate cloud position information in the third region Ar13 based on a comparison of the sky images G1 and G2 acquired from the cameras 10 installed at the camera installation points C1 and C2. Likewise, as shown in FIG. 12, the fourth region Ar14 shown in FIG. 5 is set in order to calculate cloud position information in the fourth region Ar14 based on a comparison of the sky images G2 and G3 acquired from the cameras 10 installed at the camera installation points C2 and C3. In this embodiment, each sub-region Ar10 is set based on the midpoint between the two camera installation points, but the setting method can be changed as appropriate. To facilitate the synthesis process described later, each sub-region Ar10 is cut out from the common region Ar1.
 The sub-region cloud position calculation unit 12c shown in FIG. 1 calculates the cloud position information in each set sub-region Ar10 (the third region Ar13 and the fourth region Ar14) based on a comparison of the sky images acquired from the corresponding two cameras 10. A method of calculating the cloud position information in the third region Ar13 from the sky images G1 and G2 is described below.
 FIG. 6 is a diagram showing the relationship between the sky images G1, G2 and the third region Ar13. To clearly show the positional relationship between coordinates in the third region Ar13 and the sky images G1, G2, points are plotted in a cross pattern. The latitude and longitude of each camera 10 are known, and every pixel in the sky images G1 and G2 has a known elevation angle and azimuth angle. Therefore, any pixel in a sky image can be associated with a latitude and longitude in the third region Ar13. Two specific examples, (1) and (2), are given below.
 (1) For the pixel P1' (in which a cloud appears) in the sky image G1 shown in FIG. 6, the elevation angle α1 and the azimuth angle (for example, the angle measured with north as 0 degrees) are known. If the altitude h is tentatively fixed, the distance L1 from the point P1 on the ground directly below P1' to the camera installation point C1 can be calculated. From the calculated distance L1, the known azimuth angle, and the latitude/longitude of the camera installation point C1, the latitude/longitude of the point P1 can be calculated; this is the latitude/longitude of the cloud.
 (2) To find the pixel P2' in the sky image G2 corresponding to the point P2 in the third region Ar13 shown in FIG. 6, note that the distance L2 and azimuth angle of the point P2 with respect to the camera installation point C2 are known. Therefore, if the altitude h is tentatively fixed, the elevation angle α2 of the point P2' at altitude h directly above the ground point P2 can be calculated. From the calculated elevation angle α2 and the azimuth angle, the pixel P2' in the sky image G2 can be identified.
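The geometry of relations (1) and (2) reduces to right-triangle trigonometry once the altitude h is fixed. The following sketch assumes a locally flat ground plane in meters (a real implementation would further convert the result to latitude and longitude); the function names are illustrative.

```python
import math

# Illustrative sketch of relations (1) and (2) above.
def ground_distance(h, elevation_deg):
    """(1) Horizontal distance L from the camera to the ground point
    directly below a cloud pixel seen at the given elevation angle,
    assuming the cloud is at altitude h."""
    return h / math.tan(math.radians(elevation_deg))

def elevation_angle(h, distance):
    """(2) Elevation angle of a point at altitude h located `distance`
    meters away horizontally from the camera."""
    return math.degrees(math.atan2(h, distance))
```

The two functions are inverses of each other for a fixed h, which is what lets the same tentative altitude tie a pixel in one image to a pixel in the other.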
 Using the correspondence between the sky images and the third region Ar13 described above, the sub-region cloud position calculation unit 12c converts the clouds projected onto the third region Ar13 into a cloud distribution matrix A3, which indicates the distribution of clouds by latitude and longitude, and a cloud altitude matrix (not shown), as shown in FIG. 7. In the cloud distribution matrix A3, a value of 1 is entered in the element corresponding to a mesh cell where a cloud clearly exists, and a value of 0.5 is entered in the element corresponding to a mesh cell where a thin cloud exists. In the cloud altitude matrix, the tentatively fixed altitude h is entered in each element for which the cloud distribution matrix A3 indicates that a cloud exists. Elements of the cloud distribution matrix A3 and the cloud altitude matrix where no cloud exists have the value 0. The elements of the cloud distribution matrix A3 take values from 0.0 to 1.0; these values are also called cloud cover.
 The sub-region cloud position calculation unit 12c tentatively fixes the altitude h and, as shown in FIG. 12, generates a cloud distribution matrix A3_G1 and a cloud altitude matrix for the third region Ar13 based on the sky image G1, and a cloud distribution matrix A3_G2 and a cloud altitude matrix for the third region Ar13 based on the sky image G2. Next, the sub-region cloud position calculation unit 12c calculates the degree of coincidence (reliability) between the cloud distribution matrices A3_G1 and A3_G2. As the reliability, the evaluation function ZNCC (Zero-mean Normalized Cross-Correlation), which expresses the similarity of two matrices as a value from 0.0 to 1.0, was used, but the evaluation function is not limited to this; another pattern-matching method may of course be used.
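The ZNCC score itself can be sketched in a few lines. This plain-list version is illustrative only (note that mathematically ZNCC ranges from -1 to 1; two matching matrices score 1.0, which is the regime the text describes).

```python
import math

# Illustrative ZNCC (Zero-mean Normalized Cross-Correlation) between
# two equal-sized matrices given as nested lists.
def zncc(a, b):
    xs = [v for row in a for v in row]
    ys = [v for row in b for v in row]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (math.sqrt(sum((x - mx) ** 2 for x in xs))
           * math.sqrt(sum((y - my) ** 2 for y in ys)))
    return num / den if den else 0.0
```

Because the mean is subtracted and the result is normalized, the score is insensitive to uniform brightness offsets between the two cameras, which is one reason ZNCC is a common choice for this kind of matching.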
 The sub-region cloud position calculation unit 12c varies the altitude h in predetermined increments and calculates the cloud distribution matrices A3_G1 and A3_G2, the cloud altitude matrix, and the reliability ZNCC at each altitude. When the search over the predetermined altitude range is complete, the cloud distribution matrix and cloud altitude matrix corresponding to the largest reliability are adopted, and, as shown in FIG. 12, the reliability ZNCC3_FIX, the cloud distribution matrix A3_FIX, and the cloud altitude matrix are stored in the storage unit. The same processing is executed for all sub-regions Ar10. As shown in FIG. 12, for the fourth region Ar14, multiple sets of cloud distribution matrices A4_G2 and A4_G3, cloud altitude matrices, and reliabilities ZNCC are calculated based on a comparison of the sky images G2 and G3, and the largest reliability ZNCC4_FIX, the cloud distribution matrix A4_FIX, and the corresponding cloud altitude matrix are stored in the storage unit.
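The altitude sweep described above can be sketched as a simple loop. Here `project` (image to matrix at altitude h) and `score` (two matrices to a reliability) are assumed helper callables supplied by the caller; the default range and step are illustrative.

```python
# Illustrative altitude sweep: rebuild and score the two cloud
# distribution matrices at each candidate altitude and keep the best.
def search_altitude(project, score, image1, image2,
                    h_min=0.0, h_max=4000.0, step=25.0):
    best_score, best_h = float("-inf"), None
    h = h_min
    while h <= h_max:
        s = score(project(image1, h), project(image2, h))
        if s > best_score:
            best_score, best_h = s, h
        h += step
    return best_score, best_h
```

Only the altitude at which the two independently projected matrices agree best survives, which is how the unknown cloud altitude is recovered without any range sensor.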
 As shown in FIG. 12, the synthesis unit 12d shown in FIG. 1 combines the cloud position information of each sub-region Ar10 (including the third region Ar13 and the fourth region Ar14) calculated by the sub-region cloud position calculation unit 12c into the common region Ar1. In the example shown in FIG. 12, the cloud position information in the third region Ar13 and the cloud position information in the fourth region Ar14 are combined to calculate the cloud position information in the first region Ar1. For areas where the third region Ar13 and the fourth region Ar14 do not overlap, the values of the corresponding elements of the cloud distribution matrices A3 and A4 can be adopted as they are. For the area where the third region Ar13 and the fourth region Ar14 overlap, the cloud position information at the overlapping portion is preferably calculated from the cloud position information of the third region Ar13 and the fourth region Ar14, weighted by the respective reliabilities ZNCC3_FIX and ZNCC4_FIX. For example, the value A(i, j) of an element of the cloud distribution matrix A at an overlapping portion can be calculated by the following formula:
 A(i, j) = A3_FIX(i, j) × ZNCC3_FIX + A4_FIX(i, j) × ZNCC4_FIX
 where i and j are the row and column numbers of the matrix at the overlapping portion.
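The synthesis step can be sketched as follows: non-overlapping cells keep their own value, and overlapping cells use the reliability-weighted sum from the formula. The explicit `overlap` mask and the fallback rule for non-overlapping cells are assumptions for illustration.

```python
# Illustrative merge of two sub-region cloud distribution matrices,
# already aligned on the common grid of Ar1.
def merge(a3, a4, z3, z4, overlap):
    rows, cols = len(a3), len(a3[0])
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            if overlap[i][j]:
                # Weighted sum A(i,j) = A3*ZNCC3 + A4*ZNCC4
                out[i][j] = a3[i][j] * z3 + a4[i][j] * z4
            else:
                # Outside the overlap, adopt whichever matrix covers the cell.
                out[i][j] = a3[i][j] if a3[i][j] else a4[i][j]
    return out
```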
 FIGS. 8 to 11 are diagrams showing the process of calculating the cloud position information in the third region Ar13 based on a comparison of the sky images G1 and G2. FIG. 8 shows the sky images G1 and G2, the cloud distribution matrices A3_G1 and A3_G2, and a plot of the reliability ZNCC for an altitude of h = 1325 m. FIGS. 9, 10, and 11 correspond to FIG. 8 for altitudes of h = 2575 m, h = 3575 m, and h = 1800 m, respectively. Imaging point A in the figures is the camera installation point C2, and imaging point B is the camera installation point C1. The altitude h = 1800 m shown in FIG. 11 gives the highest reliability ZNCC, so it can be seen that the cloud altitude is h = 1800 m.
 <Extraction unit 13>
 As shown in FIG. 13, the extraction unit 13 shown in FIG. 1 extracts cloud masses (m1 to m5) from the cloud position information in the first region Ar1. In this embodiment, as shown in FIG. 13, each element of the cloud distribution matrix for the first region Ar1 holds a numerical value indicating the presence of cloud; elements with a high existence probability are shown in white and faint elements in gray. Although all clouds in the first region Ar1 have been assigned the same altitude, in reality the altitude differs for each cloud mass, so the extraction unit 13 extracts the individual cloud masses. As an example, the extraction unit 13 extracts the elements of the cloud distribution matrix whose values are equal to or greater than a predetermined value and groups them with a labeling algorithm of the kind used in general image processing. In FIG. 13, cells were judged to belong to the same cloud mass if even one of them was adjacent in an 8-neighborhood search; the search is not limited to 8 neighborhoods, and a 4-neighborhood search may be used. The example of FIG. 13 shows five cloud masses m1 to m5.
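A minimal sketch of the thresholding-plus-labeling step, using an 8-neighborhood flood fill; the threshold value and function name are assumptions, and a production system might instead use an off-the-shelf labeling routine.

```python
# Illustrative 8-neighborhood labeling: threshold the cloud distribution
# matrix and flood-fill connected cells into numbered cloud masses.
def label_cloud_masses(matrix, threshold=0.5):
    rows, cols = len(matrix), len(matrix[0])
    labels = [[0] * cols for _ in range(rows)]
    count = 0
    for i in range(rows):
        for j in range(cols):
            if matrix[i][j] >= threshold and labels[i][j] == 0:
                count += 1
                stack = [(i, j)]
                while stack:
                    y, x = stack.pop()
                    if (0 <= y < rows and 0 <= x < cols
                            and labels[y][x] == 0
                            and matrix[y][x] >= threshold):
                        labels[y][x] = count
                        stack += [(y + dy, x + dx)
                                  for dy in (-1, 0, 1)
                                  for dx in (-1, 0, 1) if dy or dx]
    return labels, count
```

Switching to a 4-neighborhood, as the text allows, only requires restricting the neighbor offsets to the four axis-aligned directions.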
 <Second region setting unit 14>
 The second region setting unit 14 shown in FIG. 1 sets second regions Ar21 to Ar25, each containing a cloud mass extracted by the extraction unit 13. A second region Ar2 need only be narrower than the first region Ar1 and include at least part of one cloud mass. More preferably, the second region Ar2 includes at least part of the boundary between the extracted cloud mass and the sky, and still more preferably, it includes the entire boundary between the extracted cloud mass and the sky. In this embodiment, as shown in FIG. 14, the second region Ar21 is set for the cloud mass m1, the second region Ar22 for the cloud mass m2, the second region Ar23 for the cloud mass m3, the second region Ar24 for the cloud mass m4, and the second region Ar25 for the cloud mass m5. In these examples, each second region is set so as to include the entire boundary between one cloud mass and the sky; for example, the second region Ar21 includes the entire boundary between the cloud mass m1 and the sky.
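One plausible way to realize such a second region is the padded bounding box of a labeled cloud mass, so that the cloud/sky boundary is included; the margin value and the absence of clamping to the grid edges are assumptions of this sketch.

```python
# Illustrative second-region construction: bounding box of one labeled
# cloud mass, padded by `margin` cells to include the cloud/sky boundary.
def second_region(labels, target, margin=1):
    cells = [(i, j) for i, row in enumerate(labels)
             for j, v in enumerate(row) if v == target]
    ys = [i for i, _ in cells]
    xs = [j for _, j in cells]
    return (min(ys) - margin, min(xs) - margin,
            max(ys) + margin, max(xs) + margin)
```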
 <Second cloud position calculation unit 15>
 The second cloud position calculation unit 15 shown in FIG. 1 calculates position information of the cloud masses in the second regions Ar21 to Ar25 based on a comparison of the portions of the sky images, acquired from at least two cameras 10, that correspond to the second regions Ar21 to Ar25. Take the second region Ar21 shown in FIG. 14 as an example. First, the two cameras 10 (C1, C2) closest to the second region Ar21 are selected, and the sky images G1, G2 obtained from those two cameras 10 are acquired. Using these two sky images G1 and G2, cloud position information [latitude/longitude (cloud distribution matrix A) and a cloud altitude matrix] is calculated for the second region Ar21. The calculation method is the same as that of the first cloud position calculation unit 12. Cloud position information is calculated similarly for the second regions Ar22, Ar23, Ar24, and Ar25. In this embodiment, the processing by the first cloud position calculation unit 12 searched altitudes in the range of 0 to 4000 m, whereas the second cloud position calculation unit 15 searched a range of ±500 m around 1800 m. Because the cloud altitude has been estimated in advance in this way, the search range can be narrowed and the computational cost reduced.
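The cost saving from the narrowed range is easy to quantify. The sketch below counts sweep iterations for the full range (0 to 4000 m) versus the narrowed range (1800 ± 500 m); the 25 m step size is an assumption.

```python
# Illustrative count of candidate altitudes evaluated by a sweep.
def sweep_count(h_min, h_max, step=25.0):
    n, h = 0, h_min
    while h <= h_max:
        n += 1
        h += step
    return n
```

With these numbers the narrowed search evaluates 41 candidate altitudes instead of 161, roughly a fourfold reduction per second region.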
When the calculation of cloud position information is completed for all the second regions, each result is written over the first region Ar1. The results are shown in FIG. 15: the altitude of the cloud mass m1 was calculated as 1948 m, that of m2 as 1934 m, that of m3 as 1905 m, that of m4 as 1965 m, and that of m5 as 1750 m. A different altitude can thus be calculated for each cloud mass.
[Method]
The information processing method executed by the system 1 will be described with reference to FIG. 2.
First, in step ST100, the image acquisition unit 11 acquires the sky images G1, G2, and G3, each including at least the sky, taken by the plurality of cameras 10 arranged at mutually different positions. In the next step ST101, the first cloud position calculation unit 12 calculates the cloud position information in the first region Ar1 based on a comparison of the sky images acquired from at least two of the cameras 10 among the plurality of sky images G1, G2, and G3; a detailed description is given later with reference to FIG. 3. In the next step ST102, the extraction unit 13 extracts the cloud masses m1 to m5 from the cloud position information in the first region Ar1. In the next step ST103, the second area setting unit 14 sets the second regions Ar2 (Ar21, Ar22, Ar23, Ar24, Ar25), which include the extracted cloud masses m1 to m5 and are narrower than the first region Ar1. In the next step ST104, the second cloud position calculation unit 15 calculates the position information of the cloud masses in the second regions Ar2 (Ar21 to Ar25) based on a comparison of the images corresponding to those regions in the sky images acquired from at least two of the cameras 10. The processes of steps ST103 and ST104 are executed for each cloud mass.
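The extraction of cloud masses in step ST102 can be sketched with connected-component labeling on a binary cloud-distribution grid. This is an illustrative assumption: the text does not specify the extraction algorithm, and the `extract_cloud_masses` helper and the 4-connectivity choice here are hypothetical.

```python
from collections import deque

def extract_cloud_masses(grid):
    """Group 4-connected cloudy cells (value 1) of a lat/lon mesh into
    cloud masses; returns one set of (row, col) cells per mass."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    masses = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1 and not seen[r][c]:
                mass, queue = set(), deque([(r, c)])
                seen[r][c] = True
                while queue:  # breadth-first flood fill of one mass
                    y, x = queue.popleft()
                    mass.add((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] == 1 and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                masses.append(mass)
    return masses

grid = [[1, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 1, 0, 1]]
masses = extract_cloud_masses(grid)
```

On this toy grid the two top-left cells form one mass, the two right-column cells another, and the lone cell a third, so three second regions would be set in step ST103.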
Next, the method of calculating the cloud position information in the first region Ar1 will be described with reference to FIG. 3. In step ST200, the common area setting unit 12a sets the common region Ar1, which is also the first region Ar1, based on the camera installation points C1 to C4, as shown in FIG. 4. Next, in step ST201, the small area setting unit 12b generates every pair of two cameras 10 from all the cameras 10 (C1 to C4) and sets a small region Ar10 for each pair. In the next step ST202, the small area cloud position calculation unit 12c calculates the cloud position information in each set small region Ar10 (the third region Ar13 and the fourth region Ar14) based on a comparison of the sky images acquired from the corresponding two cameras 10. Here, as shown in FIG. 12, the first region Ar1 includes the third region Ar13 and the fourth region Ar14. The unit 12c calculates the cloud position information in the third region Ar13 based on a first pair of sky images (G1, G2) acquired from at least two cameras 10, and calculates the cloud position information in the fourth region Ar14 based on a second pair of sky images (G2, G3), a combination different from the first pair, acquired from at least two cameras 10. In the next step ST203, the synthesis unit 12d combines the calculated cloud position information of each small region Ar10 (including the third region Ar13 and the fourth region Ar14) into the common region Ar1 to obtain the cloud position information in the first region Ar1. That is, the cloud position information in the third region Ar13 and that in the fourth region Ar14 are combined to calculate the cloud position information in the first region Ar1.
As described above, the cloud position calculation system 1 of the present embodiment includes: an image acquisition unit 11 that acquires the sky images G1 to G3, each including at least the sky, taken by the plurality of cameras 10 arranged at mutually different positions; a first cloud position calculation unit 12 that calculates the cloud position information in the first region Ar1 based on a comparison of the sky images acquired from at least two of the cameras 10 among the sky images G1 to G3; an extraction unit 13 that extracts the cloud masses m1 to m5 from the cloud position information in the first region Ar1; a second area setting unit 14 that sets the second regions Ar2, which include the extracted cloud masses m1 to m5 and are narrower than the first region Ar1; and a second cloud position calculation unit 15 that calculates the position information of the cloud masses in the second regions Ar2 based on a comparison of the images corresponding to the second regions Ar2 in the sky images acquired from at least two of the cameras 10.
The cloud position calculation method of the present embodiment includes: acquiring the sky images G1 to G3, each including at least the sky, taken by the plurality of cameras 10 arranged at mutually different positions (ST100); calculating the cloud position information in the first region Ar1 based on a comparison of the sky images acquired from at least two of the cameras 10 (ST101); extracting the cloud masses m1 to m5 from the cloud position information in the first region Ar1 (ST102); setting the second regions Ar2, which include the extracted cloud masses m1 to m5 and are narrower than the first region Ar1 (ST103); and calculating the position information of the cloud masses in the second regions Ar2 based on a comparison of the images corresponding to the second regions Ar2 in the sky images acquired from at least two of the cameras 10 (ST104).
In this way, the cloud position information in the relatively wide first region Ar1 is calculated from a comparison of sky images, and the position information of the cloud masses m1 to m5 is then calculated by comparing the sky images over the second regions Ar2, which contain the cloud masses extracted from that result and are narrower than the first region Ar1. Compared with focusing on part of the cloud masses in the sky images from the start, this allows the position information to be calculated accurately for each cloud mass. At the same time, because the cloud position information in the relatively wide first region Ar1 is calculated first, overall consistency is ensured, and the estimation error in the local position information of the cloud masses in the second regions can then be reduced.
As in the system of the present embodiment, it is preferable that the first region Ar1 include the third region Ar13 and the fourth region Ar14, and that the first cloud position calculation unit 12 calculate the cloud position information in the third region Ar13 based on a first pair of sky images (G1, G2) acquired from at least two cameras 10, calculate the cloud position information in the fourth region Ar14 based on a second pair of sky images (G2, G3), a combination different from the first pair, acquired from at least two cameras 10, and combine the cloud position information in the third region Ar13 with that in the fourth region Ar14 to calculate the cloud position information in the first region Ar1.
With this configuration, the cloud position information in the first region Ar1 is calculated by combining the cloud position information of a plurality of regions obtained from a plurality of pairs of sky images, so that cloud position information over a wide area can be calculated accurately.
As in the system of the present embodiment, it is preferable that the first cloud position calculation unit 12 calculate the reliability ZNCC3 of the third region Ar13 together with the cloud position information in the third region Ar13 and the reliability ZNCC4 of the fourth region Ar14 together with the cloud position information in the fourth region Ar14, and that, where the third region Ar13 and the fourth region Ar14 overlap, it calculate the cloud position information at the overlapping location from the corresponding cloud position information of the third region Ar13 and the fourth region Ar14, weighted by the respective reliabilities ZNCC3 and ZNCC4.
With this configuration, the cloud position information at overlapping locations is calculated with the reliabilities ZNCC3 and ZNCC4 of the respective regions as weights, so that cloud position information over a wide area can be obtained while improving its accuracy.
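The reliability-weighted merge at an overlapping cell can be sketched as a normalized weighted average. The exact weighting formula is an assumption here: the text states only that the two estimates are combined with the reliabilities ZNCC3 and ZNCC4 as weights.

```python
def merge_overlap(pos3, zncc3, pos4, zncc4):
    """Combine two estimates of the same cell, weighting each subregion's
    value by its ZNCC reliability (normalized linear combination)."""
    return (zncc3 * pos3 + zncc4 * pos4) / (zncc3 + zncc4)

# An overlap cell estimated at 1900 m with high reliability (ZNCC 0.9) from
# one camera pair and at 2000 m with low reliability (ZNCC 0.3) from another.
merged = merge_overlap(1900.0, 0.9, 2000.0, 0.3)
```

With these numbers the merged altitude is approximately 1925 m, pulled toward the high-reliability estimate, which is the behavior the paragraph above describes.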
As in the system of the present embodiment, it is preferable that the second cloud position calculation unit 15 use sky images acquired from at least two cameras 10 near the second region Ar2.
With a camera 10 close to the second region Ar2, the height component of the clouds of the second region Ar2 appearing in the sky image can be reduced, which improves accuracy.
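Selecting the cameras nearest a second region can be sketched as a simple distance sort. The camera-record layout and the planar `math.dist` metric are illustrative assumptions; the embodiment's camera data and distance measure are not specified here.

```python
import math

def nearest_cameras(region_center, cameras, k=2):
    """Pick the k cameras whose installation points lie closest to the
    center of a second region."""
    return sorted(cameras, key=lambda c: math.dist(c["pos"], region_center))[:k]

# Hypothetical installation points in a local x/y frame (e.g. km).
cams = [{"id": "C1", "pos": (0.0, 0.0)},
        {"id": "C2", "pos": (1.0, 0.0)},
        {"id": "C3", "pos": (5.0, 5.0)}]

chosen = nearest_cameras((0.4, 0.2), cams)  # picks the two nearby cameras
```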
As in the system of the present embodiment, it is preferable that the second region Ar2 include at least the boundary between the extracted cloud masses m1 to m5 and the sky.
Because the second region Ar2 includes at least the boundary between the extracted cloud masses m1 to m5 and the sky, features such as the shape and arrangement of each cloud mass can be taken into account, which improves accuracy.
As in the system of the present embodiment, it is preferable that the cloud position information include at least one of the altitude information and the latitude/longitude information of the clouds.
As in the system of the present embodiment, it is preferable that the cloud position information be calculated by pattern matching between pixels representing clouds in the sky images.
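Zero-mean normalized cross-correlation (ZNCC), the quantity the embodiment also reports as a reliability, is one common choice for this kind of pixel pattern matching; the standalone function below is a sketch of the idea, not the patent's code.

```python
import numpy as np

def zncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-size patches.
    Scores near 1.0 mean the patterns match up to brightness/contrast shifts."""
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

patch = np.array([[10, 200], [220, 30]])
same = zncc(patch, patch * 2 + 5)  # same pattern, different exposure
diff = zncc(patch, patch[::-1])    # row-flipped pattern scores far lower
```

The invariance to brightness and contrast is why ZNCC suits sky images taken by different cameras under different exposure conditions.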
As in the system of the present embodiment, it is preferable that the first region Ar1 or the second region Ar2 be meshed along latitude and longitude, and that the cloud position information be expressed for each cell of the mesh.
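The per-cell representation can be sketched as a NumPy array over a latitude/longitude mesh. The region bounds, the 0.1-degree cell size, and the coordinates below are hypothetical examples, not values from the embodiment.

```python
import numpy as np

# Mesh a region spanning lat 34.0-34.5, lon 135.0-135.5 into 0.1-degree
# cells; each cell holds one altitude value (NaN = no cloud in that cell).
lat0, lon0, step = 34.0, 135.0, 0.1
altitude = np.full((5, 5), np.nan)

def cell_of(lat, lon):
    """Index of the mesh cell containing a latitude/longitude point."""
    return int((lat - lat0) / step), int((lon - lon0) / step)

# Write a cloud mass's calculated altitude into the cell at its position.
i, j = cell_of(34.27, 135.12)
altitude[i, j] = 1948.0
```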
As in the system of the present embodiment, it is preferable that the system further include the plurality of cameras arranged at mutually different positions.
The program according to the present embodiment is a program that causes a computer to execute the above method. A computer-readable temporary recording medium according to the present embodiment stores the program.
Although the embodiments of the present disclosure have been described above with reference to the drawings, the specific configurations should not be regarded as limited to these embodiments. The scope of the present disclosure is defined not only by the above description of the embodiments but also by the claims, and includes all modifications within the meaning and scope equivalent to the claims.
For example, the operations, procedures, steps, and stages of the devices, systems, programs, and methods shown in the claims, the specification, and the drawings may be executed in any order unless the output of one process is used by a subsequent process. Even where the flows in the claims, the specification, and the drawings are described using "first", "next", and the like for convenience, this does not mean that execution in that order is essential.
Each of the units 12 to 17 shown in FIG. 1 is realized by executing a predetermined program on one or more processors, but each unit may instead be configured with dedicated memory or dedicated circuits.
In the system 1 of the above embodiment, the units 11 to 19 are implemented on the processor 1b of a single computer, but they may be distributed and implemented across a plurality of computers or in the cloud. That is, the above method may be executed by one or more processors.
The structures adopted in each of the above embodiments can be adopted in any other embodiment. In FIG. 1, the units 11 to 19 are all implemented for convenience of explanation, but some of them may be omitted as desired; for example, an embodiment implementing only the units 11 to 17 is possible.
The specific configuration of each unit is not limited to the above-described embodiments, and various modifications can be made without departing from the gist of the present disclosure.
1 Cloud position calculation system
11 Image acquisition unit
12 First cloud position calculation unit
13 Extraction unit
14 Second area setting unit
15 Second cloud position calculation unit

Claims (11)

  1.  A cloud position calculation system comprising:
      an image acquisition unit that acquires sky images, each including at least the sky, taken by a plurality of cameras arranged at mutually different positions;
      a first cloud position calculation unit that calculates cloud position information in a first region based on a comparison of the sky images acquired from at least two cameras among the plurality of sky images;
      an extraction unit that extracts cloud masses from the cloud position information in the first region;
      a second area setting unit that sets a second region that includes the extracted cloud masses and is narrower than the first region; and
      a second cloud position calculation unit that calculates position information of the cloud masses in the second region based on a comparison of images corresponding to the second region in the sky images acquired from at least two cameras among the plurality of sky images.
  2.  The system according to claim 1, wherein
      the first region includes a third region and a fourth region, and
      the first cloud position calculation unit calculates cloud position information in the third region based on a first set of sky images acquired from at least two cameras, calculates cloud position information in the fourth region based on a second set of sky images, being a combination different from the first set, acquired from at least two cameras, and combines the cloud position information in the third region with the cloud position information in the fourth region to calculate the cloud position information in the first region.
  3.  The system according to claim 2, wherein
      the first cloud position calculation unit calculates a reliability of the third region together with the cloud position information in the third region and a reliability of the fourth region together with the cloud position information in the fourth region, and
      where the third region and the fourth region overlap, calculates the cloud position information at the overlapping location based on the corresponding cloud position information of the third region and the fourth region and on weights that are the respective reliabilities.
  4.  The system according to any one of claims 1 to 3, wherein
      the second cloud position calculation unit uses sky images acquired from at least two cameras near the second region.
  5.  The system according to any one of claims 1 to 4, wherein
      the second region includes at least a boundary of the extracted cloud masses with the sky.
  6.  The system according to any one of claims 1 to 5, wherein
      the cloud position information includes at least one of altitude information and latitude/longitude information of the clouds.
  7.  The system according to any one of claims 1 to 6, wherein
      the cloud position information is calculated by pattern matching between pixels representing clouds in the sky images.
  8.  The system according to any one of claims 1 to 7, wherein
      the first region or the second region is meshed along latitude and longitude, and the cloud position information is expressed for each cell of the mesh.
  9.  The system according to any one of claims 1 to 8, further comprising
      the plurality of cameras arranged at mutually different positions.
  10.  A cloud position calculation method comprising:
      acquiring sky images, each including at least the sky, taken by a plurality of cameras arranged at mutually different positions;
      calculating cloud position information in a first region based on a comparison of the sky images acquired from at least two cameras among the plurality of sky images;
      extracting cloud masses from the cloud position information in the first region;
      setting a second region that includes the extracted cloud masses and is narrower than the first region; and
      calculating position information of the cloud masses in the second region based on a comparison of images corresponding to the second region in the sky images acquired from at least two cameras among the plurality of sky images.
  11.  A program that causes one or more processors to execute the method according to claim 10.
PCT/JP2021/016017 2020-05-20 2021-04-20 Cloud location calculation system, method, and program WO2021235155A1 (en)

Priority Applications (1)

JP2022524336A (priority date 2020-05-20, filed 2021-04-20) — JPWO2021235155A1

Applications Claiming Priority (2)

JP2020-088145, filed 2020-05-20
JP2020088145, filed 2020-05-20

Publication: WO2021235155A1
Family ID: 78708527
Family application: PCT/JP2021/016017 (priority date 2020-05-20, filed 2021-04-20) — WO2021235155A1 — Cloud location calculation system, method, and program

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090051785A (en) * 2007-11-19 2009-05-25 (주)뉴멀티테크 System for observing wether and thereof method
JP2012242322A (en) * 2011-05-23 2012-12-10 Kansai Electric Power Co Inc:The Aerial object position measuring device, aerial object position measuring system and aerial object position measuring method
JP2019060754A (en) * 2017-09-27 2019-04-18 国立研究開発法人情報通信研究機構 Cloud altitude and wind velocity measurement method using optical image
WO2019244510A1 (en) * 2018-06-19 2019-12-26 古野電気株式会社 Cloud observation device, cloud observation system, cloud observation method, and program



Legal Events

121 — Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21808412; Country: EP; Kind code: A1)
ENP — Entry into the national phase (Ref document number: 2022524336; Country: JP; Kind code: A)
NENP — Non-entry into the national phase (Ref country code: DE)
122 — Ep: pct application non-entry in european phase (Ref document number: 21808412; Country: EP; Kind code: A1)