WO2018155683A1 - Flying object and program
Flying object and program
- Publication number
- WO2018155683A1 (PCT/JP2018/006941)
- Authority: WIPO (PCT)
- Prior art keywords: unit, image, ship, detection target, area
- Prior art date: 2017-02-24
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/89—Radar or analogous systems specially adapted for specific applications for mapping or imaging
- G01S13/90—Radar or analogous systems specially adapted for specific applications for mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR] techniques
- G01S13/9021—SAR image post-processing techniques
- G01S13/9027—Pattern recognition for feature extraction
- G01S13/9029—SAR image post-processing techniques specially adapted for moving target detection within a single SAR image or within multiple SAR images taken at the same time
Description
- the present invention relates to a projectile and a program.
- Priority is claimed on Japanese Patent Application No. 2017-034122, filed Feb. 24, 2017, the content of which is incorporated herein by reference.
- Such a satellite generates, for example, observation data representing an observation target based on radio waves received from the observation target, and transmits the generated observation data to a ground receiving apparatus.
- the receiving device performs processing based on the received observation data.
- Conventional satellites can compress the amount of observation data, but they also transmit observation data unrelated to the observation target to the ground station. For this reason, the time required for data transmission (the time for downlink from the satellite, the waiting time in the ground station's buffer storage, the time to transfer data from the buffer storage to the data processing center, etc.) may be long. As a result, conventional satellites may not be able to find an observation target quickly.
- An object of the present invention is to provide a flying object and a program that can make the amount of data transmitted to the ground station side smaller than the amount of the observation data, and that can notify a user more quickly that a desired detection target has been detected.
- One aspect of the present invention is a flying object comprising: an observation data generation unit that generates observation data based on radio waves received by a radar; an image generation unit that generates an image representing a monitoring space based on the observation data generated by the observation data generation unit; and a detection unit that detects a detection target based on the image generated by the image generation unit.
- Another aspect of the present invention is a flying object comprising: an observation data generation unit that generates observation data based on radio waves received by a radar; a processing unit that performs range compression on the observation data generated by the observation data generation unit; and a detection unit that detects a detection target based on the signal range-compressed by the processing unit.
- the monitoring space may include a sea area
- the detection target may include a ship in the sea area
- in the flying object, the detection unit may be configured to detect, as the detection target, a candidate estimated to be the detection target from among detection target candidates in the monitoring space, based on a plurality of parameters stored in advance.
- in the flying object, the detection unit may detect the detection target by comparing a base map with the image generated by the image generation unit.
- in the flying object, the detection target may include crustal movement in the monitoring space.
- the flying object may further include a position calculation unit that calculates the position of the detection target detected by the detection unit and generates position information indicating the calculated position.
- the flying object may further include a feature extraction unit that extracts features of the detection target detected by the detection unit.
- Another aspect of the present invention is a program that causes a computer included in a flying object to generate observation data based on radio waves received by a radar, generate an image representing a monitoring space based on the generated observation data, and detect a detection target in the generated image.
- Another aspect of the present invention is a program that causes a computer included in a flying object to generate observation data based on radio waves received by a radar, range-compress the generated observation data, and detect a detection target based on the range-compressed signal.
- FIG. 1 is a diagram showing an example of the configuration of a satellite observation system 1;
- FIG. 2 is a diagram illustrating an example of a hardware configuration of a control device 3.
- FIG. 3 is a diagram showing an example of the functional configuration of the control device 3.
- FIG. 4 is a flowchart showing an example of the flow of processing in which the control device 3 detects a ship in the region D based on observation data.
- FIG. 5 is a diagram showing an image P1, which is an example of a binarized image.
- FIG. 6 is a flowchart showing another example of the flow of processing in which the control device 3 detects a ship based on observation data.
- FIG. 7 is a diagram showing an example of a compressed data image in the case where the observed region includes one ship.
- FIG. 8 is a flowchart showing still another example of the flow of processing in which the control device 3 detects a detection target based on observation data.
- FIG. 1 is a diagram showing an example of the configuration of the satellite observation system 1.
- the satellite observation system 1 includes a flying object 2 and a receiving device 4.
- the flying object 2 is an artificial satellite that orbits above the surface of the earth ET along a predetermined orbit.
- the flying object 2 may instead be an artificial satellite that orbits above the surface of another celestial body or object along a predetermined orbit.
- the celestial bodies include planets other than the earth ET, such as Mars and Venus; natural satellites, such as the Moon and Titan; asteroids, such as Itokawa; and the like.
- the object includes rocks and the like.
- the flying object 2 may be another flying object such as an airplane or a drone instead of the artificial satellite.
- the flying object 2 observes, as an observation target (a desired object to be observed), the region D, which is a partial region of the ground surface of the earth ET. That is, the flying object 2 irradiates (transmits) radio waves toward the region D.
- the observation target is, in other words, the monitoring space monitored by the flying object 2.
- the monitoring space (that is, the observation target) may be a two-dimensional plane as in the region D in this example, or may be a three-dimensional space instead of the two-dimensional plane.
- the flying object 2 may be configured to observe, as the observation target, another object existing on the earth ET, instead of a partial region of the surface.
- the attitude of the flying object 2 is controlled such that, when the flying object 2 passes through the part of the sky from which radio waves can reach the region D, the radio wave emitted from the flying object 2 irradiates the region D.
- as the attitude control method of the flying object 2, a known control method or a method to be developed in the future may be used, so its description is omitted.
- the flying object 2 observes the region D by receiving the radio waves reflected at the surface of the region D after irradiating the region D.
- the flying object 2 includes the synthetic aperture radar unit 21, the communication antenna unit 22, and the control device 3.
- the flying object 2 is an example of the claimed flying object.
- the synthetic aperture radar unit 21 is provided with a plurality of antenna elements arranged in the first direction as a phased array antenna A1.
- the phased array antenna A1 has both transmitting and receiving functions. However, in the phased array antenna A1, the transmitting antenna and the receiving antenna may have separate configurations.
- the phased array antenna A1 is provided at a predetermined position of the synthetic aperture radar unit 21. This position is a position from which the phased array antenna A1 can irradiate (transmit) a radio wave in the second direction.
- the radio wave is a radio wave according to the signal acquired by the synthetic aperture radar unit 21 from the control device 3.
- the second direction is a direction orthogonal to the first direction.
- the second direction coincides with the direction toward the region D.
- in this example, the first direction is the azimuth direction, that is, the traveling direction of the flying object 2, and the second direction is the range direction (the slant-range direction in the side-looking method).
- the phased array antenna A1 receives radio waves arriving toward the phased array antenna A1.
- the first direction may be another direction instead of the azimuth direction. That is, the second direction may be another direction instead of the range direction.
- the synthetic aperture radar unit 21 may be configured to include another antenna such as a slot antenna or a parabola antenna instead of the phased array antenna A1.
- the antenna provided in the synthetic aperture radar unit 21 may be a separate transmit antenna and a separate receive antenna.
- the flying object 2 may include only the receiving antenna and form, together with another satellite that includes the transmitting antenna, a tandem satellite configuration.
- the communication antenna unit 22 includes an antenna that transmits and receives radio waves according to various types of information to and from the receiving device 4.
- the antenna is not particularly limited, and may be a parabolic antenna or a phased array antenna.
- the control device 3 controls the entire flying object 2.
- the control device 3 is incorporated in the flying object 2 in the present embodiment.
- the control device 3 may instead be separate from the flying object 2.
- for example, the control device 3 may be provided in another satellite and control the flying object 2 from that satellite by wireless communication.
- the control device 3 outputs a transmission pulse signal to the synthetic aperture radar unit 21 at PRI (Pulse Repetition Interval) intervals within the synthetic aperture time (that is, one cycle). Thereby, the control device 3 causes the phased array antenna A1 to irradiate (transmit) the radio wave corresponding to the transmission pulse signal as the irradiation radio wave toward the region D.
- the transmission pulse signal is a chirp signal, which is a signal whose frequency changes with time.
- the frequency band of the transmission pulse signal is a frequency band of microwaves in the present embodiment.
- the frequency band of the transmission pulse signal may be a frequency band lower than the frequency band of microwaves instead of the frequency band of microwaves, or may be a frequency band higher than the frequency band of microwaves.
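- As a rough illustration of such a transmission pulse, the sketch below generates a complex baseband linear chirp in NumPy. The bandwidth, pulse width, and sampling rate are hypothetical values chosen for the example; the patent does not specify them.

```python
import numpy as np

# Hypothetical waveform parameters -- not taken from the patent.
fs = 100e6           # sampling rate [Hz]
pulse_width = 10e-6  # chirp duration [s]
bandwidth = 50e6     # swept bandwidth [Hz]

t = np.arange(0, pulse_width, 1 / fs)
chirp_rate = bandwidth / pulse_width  # frequency sweep rate [Hz/s]
# Linear chirp: instantaneous frequency increases linearly with time.
tx_pulse = np.exp(1j * np.pi * chirp_rate * t**2)
```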
- the control device 3 receives, with the phased array antenna A1, the radio waves reflected at the respective positions in the region D. In the following, for convenience of explanation, the point-like reflectors at the respective positions are referred to as backscatterers.
- the control device 3 generates observation data based on the intensity of the radio wave received by the phased array antenna A1 and the time when the radio wave is received.
- the observation data is two-dimensional data composed of cells, each of which indicates a time at which the phased array antenna A1 received a radio wave. Each cell of the observation data associates the intensity of the radio wave received by the phased array antenna A1 with the time represented by that cell.
- the method by which the control device 3 generates observation data may be a known method or a method to be developed. For this reason, in the following, the detailed description (for example, the description of the process of performing noise removal and the like) will not be repeated for the method.
- here, consider a control device X (for example, a conventional control device) different from the control device 3. The control device X outputs the generated observation data to the communication antenna unit 22 and causes the communication antenna unit 22 to transmit a radio wave corresponding to the observation data to a receiving device installed on the ground surface.
- in response to a request from a user, the receiving device can transmit the observation data received from the control device X to a device specified by the request, at a timing according to the request.
- however, that timing often cannot be determined at the convenience of a single user.
- moreover, the time required for transmission of the observation data from the control device X to the receiving device increases as the amount of the observation data increases.
- as a result, the satellite observation system provided with the control device X may not be able to detect the detection target in the region D at the timing desired by the user.
- the user is, for example, a person who operates the device on the ground.
- the user may be another person instead of the person.
- in contrast, the control device 3 detects the detection target in the region D (that is, the observation target) based on the generated observation data. That is, the flying object 2 itself detects the detection target in the region D based on the observation data. Thereby, the flying object 2 can make the amount of data transmitted to the transmission destination (the receiving device 4 in this example) smaller than the amount of the observation data. In addition, the flying object 2 can reduce the storage capacity required to store the data. Thus, the flying object 2 can notify the user more quickly that the desired detection target has been detected. Also, for observation of areas from which data can be transmitted directly to the ground station, the transmission data may be sent directly to the communication unit without being stored in the storage unit.
- after detecting the detection target, the control device 3 generates transmission data including information indicating the result of detecting the detection target.
- the control device 3 outputs the generated transmission data to the communication antenna unit 22, and causes the communication antenna unit 22 to transmit the transmission data toward the receiving device 4.
- the control device 3 receives radio waves according to various control signals from the receiving device 4 via the communication antenna unit 22.
- the control device 3 performs processing according to the received radio wave.
- the receiving device 4 includes an antenna that can transmit and receive various types of information to and from the flying object 2 by radio waves.
- the receiving device 4 is, for example, a dedicated or general-purpose computer to which the antenna is connected.
- the receiver 4 is installed at a position desired by the user on the surface of the earth ET.
- the receiving device 4 receives, as received data, the transmission data transmitted from the flying object 2 to the receiving device 4.
- the receiving device 4 stores the received data received. Thereby, the user can perform work according to the information indicating the detection target included in the reception data stored by the receiving device 4.
- hereinafter, the functional configuration of the control device 3 and the process by which the control device 3 detects a detection target in the region D based on observation data will be described in detail.
- FIG. 2 is a diagram illustrating an example of a hardware configuration of the control device 3.
- the control device 3 includes, for example, an FPGA (Field Programmable Gate Array) 31, a storage unit 32, and a communication unit 34. These components are communicably connected to one another via a bus Bus. Further, the control device 3 communicates with each of the synthetic aperture radar unit 21 and the communication antenna unit 22 through the communication unit 34.
- the control device 3 may be configured to include a central processing unit (CPU) instead of the FPGA 31.
- the FPGA 31 implements the functional configuration of the control device 3 described later by a hardware function unit.
- the storage unit 32 includes, for example, an EEPROM (Electrically Erasable Programmable Read-Only Memory), a ROM (Read-Only Memory), a RAM (Random Access Memory), a flash memory, and the like.
- the storage unit 32 stores various types of information processed by the control device 3, various types of images, and the like.
- the communication unit 34 is configured to include, for example, an analog or digital input / output port or the like according to various communication standards.
- FIG. 3 is a diagram showing an example of a functional configuration of the control device 3.
- the control device 3 includes a storage unit 32, a communication unit 34, and a control unit 36.
- the control unit 36 controls the entire control device 3.
- the control unit 36 includes a communication control unit 361, a radar control unit 363, an observation data generation unit 364, a processing unit 365, an image generation unit 367, a detection target detection unit 369, a position calculation unit 371, a feature extraction unit 373, and a transmission data generation unit 375.
- Some or all of the functional units included in the control unit 36 may be hardware functional units such as a large scale integration (LSI) or an application specific integrated circuit (ASIC).
- the communication control unit 361 transmits and receives radio waves corresponding to various information to and from the receiving device 4 via the communication antenna unit 22. Specifically, for example, the communication control unit 361 causes the communication antenna unit 22 to receive radio waves corresponding to various control signals from the receiving device 4. Further, the communication control unit 361 causes the communication antenna unit 22 to transmit the radio wave corresponding to the transmission data generated by the transmission data generation unit 375 toward the receiving device 4.
- the radar control unit 363 outputs the transmission pulse signal to the synthetic aperture radar unit 21 at PRI intervals within the synthetic aperture time (that is, one cycle). Then, the radar control unit 363 causes the phased array antenna A1 of the synthetic aperture radar unit 21 to irradiate (transmit) an irradiation radio wave corresponding to the transmission pulse signal toward the region D. The radar control unit 363 also causes the phased array antenna A1 to receive the part of the irradiation radio waves emitted from the phased array antenna A1 that is reflected by each backscatterer.
- the observation data generation unit 364 generates the above-mentioned observation data based on the radio wave received by the phased array antenna A1 of the synthetic aperture radar unit 21.
- the processing unit 365 performs various types of processing on the observation data generated by the observation data generation unit 364.
- the image generation unit 367 generates an image representing the region D (that is, the observation target) based on the observation data processed by the processing unit 365.
- the detection target detection unit 369 detects the detection target in the region D based on the observation data generated by the observation data generation unit 364.
- the position calculation unit 371 calculates the position of the detection target detected by the detection target detection unit 369 based on the observation data generated by the observation data generation unit 364.
- the feature extraction unit 373 extracts the feature of the detection target detected by the detection target detection unit 369 based on the observation data generated by the observation data generation unit 364.
- the transmission data generation unit 375 generates, as transmission data, information including part or all of the image generated by the image generation unit 367, position information indicating the position calculated by the position calculation unit 371, and feature information indicating the features extracted by the feature extraction unit 373.
- FIG. 4 is a flow chart showing an example of the flow of processing in which the control device 3 detects a ship in the area D (that is, a ship in the relevant sea area) based on observation data.
- in the following, as an example, a case where the control device 3 detects a ship in the region D (that is, a ship in the relevant sea area) based on observation data will be described.
- other objects or phenomena may be included in the detection target.
- when the position of the flying object 2 is a position from which the synthetic aperture radar unit 21 can irradiate at least a part of the region D with the irradiation radio wave, the radar control unit 363 controls the synthetic aperture radar unit 21 to observe the region D (step S110). Specifically, in this case, the radar control unit 363 outputs the transmission pulse signal to the synthetic aperture radar unit 21 at PRI intervals within the synthetic aperture time (that is, one cycle). Then, the radar control unit 363 causes the phased array antenna A1 of the synthetic aperture radar unit 21 to irradiate (transmit) an irradiation radio wave corresponding to the transmission pulse signal toward the region D.
- the radar control unit 363 stops outputting the transmission pulse signal to the synthetic aperture radar unit 21 when the flying object 2 has moved to a position from which the synthetic aperture radar unit 21 can no longer irradiate at least a part of the region D with the radio wave.
- the radar control unit 363 also receives the radio waves that are part of the radio waves emitted from the phased array antenna A1 and that are reflected by the respective back scatterers by the phased array antenna A1. Then, the radar control unit 363 performs processing such as A / D conversion on the received radio wave information indicating the received radio wave. In the present embodiment, the description of the process is omitted. Thus, the radar control unit 363 observes the area D by the synthetic aperture radar unit 21.
- the observation data generation unit 364 generates observation data based on the received radio wave information indicating the radio wave received in step S110 (step S120).
- the observation data generation unit 364 may store the generated observation data in the storage unit 32 after generating it in step S120, or may leave it unstored.
- the image generation unit 367 generates an image representing the area D based on the observation data generated in step S120 (step S150). Specifically, the image generation unit 367 performs compression in the range direction and compression in the azimuth direction on the observation data to generate an image representing the area D.
- the position of each pixel of the image generated by the image generation unit 367 represents the position of the above-described backscatterer in the image, and the pixel value of each pixel represents the luminance value corresponding to the intensity of the radio wave reflected by that backscatterer. Further, each pixel of the image is associated with phase information indicating the phase of the radio wave arriving from the backscatterer represented by that pixel toward the phased array antenna A1.
- the image generation unit 367 generates an image representing the region D by, for example, the range-Doppler method, the chirp scaling method, or the omega-K method. At this time, the image generation unit 367 may improve the calculation accuracy of the phase by performing high-accuracy orbit determination processing and ionospheric delay correction. In addition, the image generation unit 367 may be configured to remove the phase component caused by the height of the surface of the region D using a digital elevation model stored in advance in the storage unit 32. In this case, the image generation unit 367 reads the digital elevation model from the storage unit 32.
- the digital elevation model is a three-dimensional model that represents the shape of at least a part of the surface of the earth ET.
- the image generation unit 367 may be configured to generate the image from the observation data by a known method different from these methods, or by a method to be developed in the future. Therefore, the image generation method of the image generation unit 367 will not be described in more detail.
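- As a point of reference, the sketch below shows the skeleton shared by these methods: a matched-filter compression in the range direction followed by one in the azimuth direction. It is a drastic simplification under assumed inputs (`raw` observation data, a chirp replica `tx_pulse`, and an azimuth reference `az_ref`); real processors such as the range-Doppler method also perform range cell migration correction, which is omitted here.

```python
import numpy as np

def matched_filter(data, ref, axis):
    """Compress `data` along `axis` by correlating it with `ref`."""
    n = data.shape[axis]
    ref_f = np.conj(np.fft.fft(ref, n))
    shape = [1, 1]
    shape[axis] = n
    return np.fft.ifft(np.fft.fft(data, axis=axis) * ref_f.reshape(shape),
                       axis=axis)

def form_image(raw, tx_pulse, az_ref):
    """raw: 2-D observation data, azimuth (pulses) x range (samples)."""
    rc = matched_filter(raw, tx_pulse, axis=1)  # range compression
    img = matched_filter(rc, az_ref, axis=0)    # azimuth compression
    return np.abs(img)                          # luminance image
```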
- the detection target detection unit 369 detects an object considered (estimated) as a ship included in the image as a ship (step S160).
- objects considered as (estimated) ships included in the image may include sea structures, small islands and the like.
- the detection target detection unit 369 reads out, from the storage unit 32, land area information indicating a land area which is an area different from the sea area among the areas included in the area D.
- the land area is a land area among the areas included in the area D.
- the detection target detection unit 369 applies a land area filter to the image based on the read land area information.
- the detection target detection unit 369 changes the luminance value of the pixel included in the land area indicated by the land area information in the image to a predetermined first luminance value.
- the first luminance value is, for example, 0.
- the first luminance value may be any luminance value as long as the detection target detection unit 369 can distinguish the difference between the first luminance value and the third luminance value described later.
- hereinafter, the image after the land area filter is applied is referred to as the land-area-removed image.
- after generating the land-area-removed image, the detection target detection unit 369 applies a binarization filter to it. Specifically, among the plurality of pixels constituting the land-area-removed image, the detection target detection unit 369 sets the luminance value of each pixel whose luminance value is less than a first predetermined luminance value to a predetermined second luminance value, and sets the luminance value of each pixel whose luminance value is equal to or greater than the first predetermined luminance value to a predetermined third luminance value, thereby binarizing the land-area-removed image.
- the second luminance value is, for example, 0.
- the second luminance value may be any luminance value as long as the detection target detection unit 369 can distinguish the difference between the second luminance value and the third luminance value described later.
- the third luminance value is, for example, 255.
- the third luminance value may be any luminance value as long as the detection target detection unit 369 can distinguish between the luminance values of both the first luminance value and the second luminance value instead of 255.
- the first predetermined luminance value may be any luminance value as long as it is a luminance value which is larger than both the first luminance value and the second luminance value and smaller than the third luminance value.
- hereinafter, the image after the binarization filter is applied to the land-area-removed image is referred to as the binarized image.
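- A minimal NumPy sketch of these two filters follows, using the example luminance values from the text (first and second luminance values 0, third luminance value 255); the land mask and threshold are assumed inputs.

```python
import numpy as np

def land_filter_and_binarize(image, land_mask, threshold):
    """Apply the land area filter, then the binarization filter.

    image     -- 2-D array of luminance values
    land_mask -- boolean array, True where the pixel belongs to land
    threshold -- the 'first predetermined luminance value'
    """
    removed = image.astype(float).copy()
    removed[land_mask] = 0.0                       # land area filter
    # Pixels at/above the threshold get the third luminance value (255),
    # the rest get the second luminance value (0).
    return np.where(removed >= threshold, 255, 0).astype(np.uint8)
```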
- the image P1 shown in FIG. 5 is an example of a binarized image.
- the area SC1 represents an area constituted by pixels whose luminance value is the third luminance value.
- a hatched area SC2 represents an area constituted by pixels whose luminance value is the second luminance value.
- the detection target detection unit 369 may be configured to perform binarization of the land area removed image by using a standard deviation filter instead of the binarization filter.
- the detection target detection unit 369 detects, as a ship, each region in the binarized image that is formed by a predetermined number or more of pixels whose luminance value is the third luminance value, that is, each region corresponding to an object considered to be the above-mentioned ship.
- the predetermined number is, for example, ten.
- the predetermined number may be a number smaller than 10 or a number larger than 10.
- the area SC1 illustrated in FIG. 5 is an example of an area whose luminance value is the third luminance value and which is formed by a predetermined number or more of pixels. That is, in the example shown in FIG. 5, the detection target detection unit 369 detects the area SC1 as a ship.
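- A sketch of this region detection using SciPy's connected-component labelling is shown below; it also returns each region's centroid, which step S170 later uses as the ship position. The default connectivity and the pixel-coordinate centroid are assumptions for illustration.

```python
import numpy as np
from scipy import ndimage

def detect_ships(binary, min_pixels=10, ship_value=255):
    """Detect ships as connected regions of third-luminance-value
    pixels containing at least `min_pixels` pixels (10 in the text)."""
    labels, count = ndimage.label(binary == ship_value)
    ships = []
    for lab in range(1, count + 1):
        mask = labels == lab
        if mask.sum() >= min_pixels:
            # Centroid in (row, col) pixel coordinates; converting it
            # to latitude/longitude needs the flying object position
            # and attitude information used in step S170.
            ships.append(ndimage.center_of_mass(mask))
    return ships
```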
- in step S160, the detection target detection unit 369 may instead be configured to detect the ship from the image generated in step S150 by using a machine learning algorithm, without applying some or all of the land area filter, the binarization filter, and the standard deviation filter.
- in this case, the detection target detection unit 369 stores (learns) in advance, as a plurality of parameters, information in which a plurality of images including ships are associated with the position and shape of the ship in each of those images, together with a plurality of images not including ships and information indicating that no ship is included in each of those images.
- each of these images is a ship image cut out from the image of one scene.
- then, based on the plurality of parameters stored in advance, the detection target detection unit 369 detects, as a ship, the candidate most likely to be a ship from among the ship candidates included in the image generated in step S150.
- the detection target detection unit 369 may detect the likely candidate as a ship using the algorithm in step S160 and, at the same time, perform the process of step S180 described below (specifically, feature extraction for the ship detected in step S160).
- the algorithm may be any known algorithm (including deep learning) or an algorithm to be developed from now on. Therefore, the detailed description of the machine learning algorithm will be omitted.
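- Since the patent leaves the learning algorithm open (any known or future algorithm, including deep learning), the sketch below uses an arbitrary stand-in: a scikit-learn random forest over flattened image chips. The training arrays are random placeholders for the labelled ship/no-ship images the unit would store in advance.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Placeholder training data: flattened 32x32 chips, label 1 = ship.
train_chips = rng.random((200, 32 * 32))
train_labels = np.arange(200) % 2

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(train_chips, train_labels)  # the 'plurality of parameters'

def ship_probability(chip):
    """Probability that a 32x32 candidate chip contains a ship."""
    return clf.predict_proba(chip.reshape(1, -1))[0, 1]
```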
- the detection target detection unit 369 determines whether a ship was detected in step S160 (step S165). When the detection target detection unit 369 determines that no ship was detected in step S160 (step S165 - NO), the control unit 36 ends the process. On the other hand, when the detection target detection unit 369 determines that a ship was detected in step S160 (step S165 - YES), the image generation unit 367, the position calculation unit 371, the feature extraction unit 373, the communication control unit 361, and the transmission data generation unit 375 each repeatedly perform the processing of steps S170 to S210 for each of the one or more ships detected in step S160 (step S167).
- the vessel selected in step S167 will be described as a target vessel.
- the position calculation unit 371 calculates the position of the target ship (step S170). Specifically, the position calculation unit 371 calculates, as the position of the target ship, the latitude and longitude represented by a predetermined position of the area detected as the target ship in step S160 among the areas included in the binarized image. At this time, the position calculation unit 371 uses flying object position information indicating the position of the flying object 2 at each time (for example, GPS (Global Positioning System) information) and flying object attitude information indicating the attitude of the flying object 2 at each time.
- the predetermined position is, for example, the position of the centroid (or center of gravity) of the area.
- the predetermined position may be another position determined based on the area, instead of the position of the centroid of the area.
- the image generation unit 367 trims the image generated in step S150 based on the position of the target ship calculated in step S170 (step S175). Specifically, the image generation unit 367 trims (cuts out) a partial image representing a region of a predetermined shape centered on the position of the target ship calculated in step S170, from among the regions included in the region D.
- the predetermined shape is, for example, a rectangle measuring a predetermined distance on each side.
- the predetermined shape is a rectangular area having a side parallel to the latitude direction and a side parallel to the longitude direction in the image.
- the predetermined distance is, for example, 500 meters.
- the predetermined distance may be a distance shorter than 500 meters, or may be a distance longer than 500 meters. Further, the predetermined shape may be another shape such as a circle or an oval instead of a rectangle.
- the image generation unit 367 generates the trimmed partial image as a transmission image.
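- A minimal sketch of this trimming step follows; the ground pixel spacing is an assumed input, and the sketch assumes the image axes are aligned with the latitude and longitude directions, as the text describes for the rectangular window.

```python
def trim_transmission_image(image, center_rc, pixel_spacing_m,
                            window_m=500.0):
    """Cut the square transmission image around a detected ship.

    image           -- 2-D NumPy array (the image from step S150)
    center_rc       -- (row, col) of the ship position in the image
    pixel_spacing_m -- assumed ground pixel spacing [m/pixel]
    window_m        -- side length of the window (500 m in the text)
    """
    half = int(round(window_m / (2.0 * pixel_spacing_m)))
    r, c = (int(round(v)) for v in center_rc)
    r0, r1 = max(r - half, 0), min(r + half, image.shape[0])
    c0, c1 = max(c - half, 0), min(c + half, image.shape[1])
    return image[r0:r1, c0:c1]
```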
- the feature extraction unit 373 extracts the features of the target vessel (step S180).
- the features include the total length of the target vessel, the type of the target vessel, the course of the target vessel, the speed of the target vessel, and the navigation state of the target vessel.
- the navigation state of the target vessel is either the state in which the target vessel is stopped or the state in which the target vessel is moving.
- the features may include other information representing the target vessel instead of some or all of these features, or may include such other information in addition to some or all of them.
- the process of step S180 will be described.
- the feature extraction unit 373 extracts the feature of the target vessel based on the binarized image generated in step S160 and the transmission image generated by the image generation unit 367 using a machine learning algorithm.
- the feature extraction unit 373 stores (learns) information in which the features of the area representing the target vessel and the features of the target vessel are associated with each other as a plurality of parameters.
- in this example, the features of the area include the length of the area in the longitudinal direction, the length of the area in the transverse direction, the orientation of the longitudinal direction of the area, the shape of the area, and the area (extent) of the area.
- the features of the area may include other information representing the area instead of some or all of these, or may include such other information in addition to some or all of them.
- for example, the feature extraction unit 373 stores in advance, as a plurality of parameters, information in which the full length of a ship is associated with the combination of the length in the longitudinal direction of the area representing the ship, the length in the transverse direction of the area, the orientation of the longitudinal direction of the area, and the shape of the area. Then, based on the parameters stored in advance and the area representing the target ship included in the binarized image, the feature extraction unit 373 extracts, as the full length, the candidate most likely to be the full length from among the candidates for the full length of the target ship.
- likewise, the feature extraction unit 373 stores in advance, as a plurality of parameters, information in which the type of a ship is associated with the combination of the length in the longitudinal direction of the area representing the ship, the length in the transverse direction of the area, the orientation of the longitudinal direction of the area, and the shape of the area. Then, based on the parameters stored in advance and the area representing the target ship included in the binarized image, the feature extraction unit 373 extracts, as the type, the candidate most likely to be the type from among the candidates for the type (category) of the target ship.
- the feature extraction unit 373 also stores in advance, as a plurality of parameters, information in which the course of a ship is associated with the combination of the length in the longitudinal direction of the area representing the ship, the length in the transverse direction of the area, the orientation of the longitudinal direction of the area, the shape of the area, and the area (extent) of the area. Then, based on the parameters stored in advance and the area representing the target ship included in the binarized image, the feature extraction unit 373 extracts, as the course, the candidate most likely to be the course from among the candidates for the course of the target ship.
- further, the feature extraction unit 373 stores in advance, as a plurality of parameters, information in which the speed of a ship is associated with whether a wake exists outside the area and with features of that wake. Then, based on the parameters stored in advance and the area representing the target ship included in the binarized image, the feature extraction unit 373 extracts, as the speed, the candidate most likely to be the speed from among the candidates for the speed of the target ship.
- the algorithm of machine learning used by the feature extraction unit 373 in step S180 may be any known algorithm (including deep learning) or an algorithm to be developed from now on. Therefore, the detailed description of the machine learning algorithm will be omitted.
- the feature extraction unit 373 may instead be configured to extract the features of the target ship, using a machine learning algorithm, from the image generated in step S150 (that is, the image before the binarization filter or the standard deviation filter is applied in step S160).
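- The geometric area features named above (longitudinal and transverse lengths and the orientation of the long axis) can be illustrated with a small principal-axis computation over the area's pixels; the pixel spacing is an assumed input, and this is only one possible way to realize the features the text names.

```python
import numpy as np

def region_geometry(mask, pixel_spacing_m):
    """Estimate length, width and long-axis orientation of a region.

    mask            -- boolean array, True on the region's pixels
    pixel_spacing_m -- assumed ground pixel spacing [m/pixel]
    """
    pts = np.argwhere(mask).astype(float)   # (row, col) coordinates
    pts -= pts.mean(axis=0)
    # Principal axes of the pixel cloud: the eigenvector with the
    # largest eigenvalue points along the region's long direction.
    eigvals, eigvecs = np.linalg.eigh(np.cov(pts.T))
    long_axis, short_axis = eigvecs[:, 1], eigvecs[:, 0]
    length_m = np.ptp(pts @ long_axis) * pixel_spacing_m
    width_m = np.ptp(pts @ short_axis) * pixel_spacing_m
    heading_deg = np.degrees(np.arctan2(long_axis[1], long_axis[0]))
    return length_m, width_m, float(heading_deg)
```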
- the transmission data generation unit 375 generates transmission data (step S200). Specifically, when an AIS (Automatic Identification System) signal is received in step S190, the transmission data generation unit 375 generates, as transmission data, information including the ship identification information of the target ship, the ship position information of the target ship, the transmission image, the ship characteristic information of the target ship, and the AIS information.
- the ship identification information is information that identifies the target ship.
- the ship identification information may be any information that can identify each of the one or more ships detected in step S160.
- the ship position information is information indicating the position of the target ship, that is, the position calculated in step S170.
- the transmission image is the transmission image generated in step S175.
- the ship characteristic information is information indicating the features of the target ship, that is, each of the features extracted in step S180.
- the AIS information is the AIS information stored in the storage unit 32 in step S190.
- on the other hand, when the AIS signal is not received in step S190, the transmission data generation unit 375 generates, as transmission data, information including the ship identification information, the ship position information, the transmission image, the ship characteristic information, and information indicating that the AIS signal has not been received.
- the transmission data generation unit 375 may be configured to collate (match) the ship characteristic information of the target ship with the AIS information indicated by the AIS signal. In this case, the transmission data generation unit 375 identifies, of the plurality of pieces of information indicated by the ship characteristic information, information that matches any of the plurality of pieces of information indicated by the AIS information. In addition, the transmission data generation unit 375 identifies information that does not match any of the plurality of information indicated by the AIS information among the plurality of information indicated by the ship characteristic information.
- in this case, in step S200, the transmission data generation unit 375 generates, as transmission data, information including the information identified above, the ship identification information of the target ship, the ship position information of the target ship, the transmission image, the ship characteristic information of the target ship, and the AIS information.
- the transmission data generation unit 375 stores the transmission data generated in step S200 in the storage unit 32 (step S210).
- in this way, by repeatedly performing the processing of steps S167 to S210, the flying object 2 can generate transmission data for each of the one or more ships detected in step S160 and store the generated transmission data in the storage unit 32.
- after the repetition of steps S167 to S210 ends, the communication control unit 361 outputs each piece of the transmission data stored in the storage unit 32 in step S210 to the communication antenna unit 22, causes the communication antenna unit 22 to transmit the radio waves corresponding to the transmission data toward the receiving device 4 (step S220), and ends the process. Thereby, the flying object 2 can, for example, make the amount of data transmitted to the receiving device 4 smaller than the amount of the observation data, and can shorten the time until information indicating that a ship, which is an example of the detection target, has been detected is provided to the user.
- in step S220, the communication control unit 361 may be configured to output only part of the transmission data stored in the storage unit 32 to the communication antenna unit 22 and cause the communication antenna unit 22 to transmit the radio wave corresponding to that transmission data toward the receiving device 4.
- alternatively, in step S200, the communication control unit 361 may be configured to output the transmission data generated by the transmission data generation unit 375 to the communication antenna unit 22 as it is generated and cause the communication antenna unit 22 to transmit the radio wave corresponding to the transmission data toward the receiving device 4.
- the detection target detection unit 369, the position calculation unit 271, and the feature extraction unit 373 described above may be configured as an integral functional unit.
- in this case, for example, the detection target detection unit 369 integrally configured with the position calculation unit 371 detects one or more ships in step S160 and calculates (detects) the position of each detected ship.
- likewise, the detection target detection unit 369 integrally configured with the feature extraction unit 373 detects one or more ships in step S160 and extracts the features of each detected ship.
- when all three units are integrated, in step S160, one or more ships are detected, the position of each detected ship is calculated (detected), and the features of each detected ship are extracted. Further, in the flowchart shown in FIG. 4, the processes of steps S175 to S190 may be performed in parallel with each other, or in an order different from the order shown in FIG. 4.
- although FIG. 4 shows a process in which feature extraction (S180) is performed after ship detection (S160), the process is not limited to this aspect; for example, ship detection may be performed simultaneously with feature extraction (with ship detection treated as part of feature extraction). That is, after the image generation (S150) in FIG. 4, features such as whether an object is a ship, its length (full length of the ship), its traveling direction (navigation direction), and its type (ship type) may be extracted directly.
- as described above, the flying object 2 generates observation data based on radio waves received by a radar (in this example, the synthetic aperture radar unit 21), generates an image representing the monitoring space (in this example, the observation target) based on the generated observation data, and detects a detection target (in this example, a ship) based on the generated image.
- thereby, the flying object 2 can make the amount of data transmitted to the data transmission destination (in this example, the receiving device 4) smaller than the amount of the observation data, and can notify the user more quickly that a desired detection target has been detected.
- further, based on a plurality of parameters stored in advance, the flying object 2 detects, as the detection target, the candidate most likely to be the detection target from among the detection target candidates in the monitoring space.
- thereby, the flying object 2 can notify the user more quickly that the desired detection target has been detected, based on the plurality of parameters stored in advance.
- further, the flying object 2 calculates the position of the detected detection target and generates position information indicating the calculated position.
- thereby, the flying object 2 can notify the user more quickly not only that the desired detection target has been detected but also of the position of the detection target.
- further, the flying object 2 extracts the features of the detected detection target.
- thereby, the flying object 2 can notify the user more quickly that the desired detection target has been detected and of the features of the detection target.
- hereinafter, Modification 1 of the embodiment will be described.
- components similar to those of the embodiment are denoted by the same reference numerals, and their description is omitted.
- in Modification 1, the control device 3 detects one or more ships from the observation data without generating an image representing the region D2. Specifically, the control device 3 executes the processing of the flowchart shown in FIG. 6 instead of the processing of the flowchart shown in FIG. 4. Also in Modification 1, as in the embodiment, the case where the detection target is a ship is described as an example.
- the region D2 is a region of the surface of the earth ET that includes only sea area (that is, it does not include the above-mentioned land area).
- FIG. 6 is a flowchart showing another example of the process flow of the control device 3 detecting a ship in the area D2 based on observation data.
- the processing of steps S110 to S120, steps S165 to S167, steps S180 to S210, and step S220 shown in FIG. 6 is the same as the processing of the corresponding steps shown in FIG. 4, so its description is omitted.
- the detection target detection unit 369 detects one or more ships in the area D2 based on the observation data generated in the step S120 (step S310).
- here, the process of step S310 will be described.
- the detection target detection unit 369 performs pulse compression in the range direction, based on the transmitted chirp signal, on the observation data generated in step S120 shown in FIG. 6.
- hereinafter, the observation data subjected to this pulse compression (range compression) is referred to as compressed data.
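- A minimal sketch of this range compression as a frequency-domain matched filter follows; `raw` (azimuth x range samples) and the chirp replica `tx_pulse` are assumed inputs.

```python
import numpy as np

def range_compress(raw, tx_pulse):
    """Pulse-compress each received echo (row of `raw`) against the
    transmitted chirp replica, yielding the 'compressed data'."""
    n = raw.shape[1]
    ref = np.conj(np.fft.fft(tx_pulse, n))     # matched-filter spectrum
    return np.fft.ifft(np.fft.fft(raw, axis=1) * ref, axis=1)
```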
- in a compressed data image in which the intensity of the radio waves included in the compressed data is plotted, with the horizontal axis indicating the range cell number and the vertical axis indicating the azimuth cell number, one or more ship areas appear; each ship area is an arc-shaped area corresponding to a ship, as shown in FIG. 7.
- the arc-shaped area is an area representing a range curvature included in the compressed data.
- the number of range cells (denoted as a range cell in FIG. 7) is the number of cells on the horizontal axis of the compressed data image, and is a numerical value that can be converted into the range distance.
- the number of azimuth cells (described as azimuth cells in FIG. 7) is the number of cells on the vertical axis of the compressed data image, and is a numerical value that can be converted to time.
- FIG. 7 is a view showing an example of a compressed data image in the case where the area D2 includes one ship.
- the image P2 shown in FIG. 7 is an example of a compressed data image.
- the luminance values of the pixels constituting the image P2 represent the intensities. The luminance value increases as the intensity increases.
- the area F1 illustrated in FIG. 7 is an example of a ship area corresponding to the one ship. Further, in the example illustrated in FIG. 7, the luminance values of the plurality of pixels forming the area F1 are equal to or greater than the second predetermined luminance value.
- the ship area has a partial area substantially parallel to the vertical axis in the compressed data image.
- the partial area is a partial area of the area F1 and is a partial area indicated by the area W1 in the image P2.
- the detection target detection unit 369 can detect the ship area by detecting such a partial area from the compressed data.
- the detection target detection unit 369 selects, one by one, each of the one or more range curvatures contained in the compressed data.
- the detection target detection unit 369 calculates a total intensity, which is the sum of the intensities of the radio waves in the cells constituting the selected range curvature (for example, the intensities are integrated to calculate the total value).
- the detection target detection unit 369 then identifies, among the total intensities calculated for the respective range curvatures, the range curvatures whose total intensity is equal to or higher than a predetermined intensity.
- the detection target detection unit 369 detects each of the identified one or more range curvatures as a ship.
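- The sketch below illustrates this intensity integration along one candidate range curvature; the per-azimuth range offsets of the curvature (derived in practice from the platform geometry) and the detection threshold are assumed inputs.

```python
def curvature_energy(compressed, a0, r0, curve_offsets):
    """Sum radio-wave intensity along one candidate range curvature.

    compressed    -- range-compressed data, azimuth x range cells
    a0, r0        -- apex of the candidate curvature (cell indices)
    curve_offsets -- range-cell offset of the curvature at each
                     azimuth cell of the window (assumed input)
    """
    total = 0.0
    half = len(curve_offsets) // 2
    for i, dr in enumerate(curve_offsets):
        a, r = a0 + i - half, r0 + dr
        if 0 <= a < compressed.shape[0] and 0 <= r < compressed.shape[1]:
            total += abs(compressed[a, r]) ** 2  # integrate intensity
    return total

# A curvature is detected as a ship when its total intensity is at or
# above the predetermined intensity:
# ships = [c for c in candidates
#          if curvature_energy(data, *c, offsets) >= threshold]
```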
- the detection target detection unit 369 can perform the process of step S310 by using a method such as applying a land area filter to the compressed data.
- the detection target detection unit 369 may be configured to detect a ship area from compressed data using a machine learning algorithm.
- in this case, the detection target detection unit 369 stores (learns) in advance, as a plurality of parameters, information in which a plurality of pieces of compressed data including ships are associated with the position and shape of the ship area in each piece of compressed data, together with a plurality of pieces of compressed data not including ships and information indicating that no ship is included in each of them.
- then, based on the plurality of parameters stored in advance, the detection target detection unit 369 detects, as a ship area, the candidate most likely to be a ship area from among the ship area candidates included in the compressed data generated in step S310.
- the algorithm may be any known algorithm (including deep learning) or an algorithm to be developed from now on. Therefore, the detailed description of the machine learning algorithm will be omitted.
- the position calculation unit 371 calculates the position of the target ship (step S320). Specifically, the position calculation unit 371 identifies the cells with the smallest range distance among the cells constituting the range curvature identified as the target ship in step S310. The position calculation unit 371 then identifies, among the identified cells, one or more cells whose associated radio-wave intensity is equal to or higher than a predetermined threshold. Then, the position calculation unit 371 calculates, as the position of the target ship, the latitude and longitude corresponding to the cell at the midpoint between the cell with the oldest reception time and the cell with the latest reception time among the identified one or more cells.
- at this time, the position calculation unit 371 calculates the latitude and longitude corresponding to that cell as the position of the target ship, based on the acquired flying object position information indicating the position of the flying object 2 at each time and the flying object attitude information indicating the attitude of the flying object 2 at each time.
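- A small sketch of this midpoint selection, under the assumption that the curvature's minimum-range cells are given as a time-ordered list of (azimuth, range) indices:

```python
def position_cell(compressed, min_range_cells, threshold):
    """Pick the cell whose latitude/longitude becomes the ship position.

    min_range_cells -- (azimuth, range) cells at the curvature's
                       smallest range distance, ordered by reception
                       time (assumed input)
    """
    idx = [i for i, cell in enumerate(min_range_cells)
           if abs(compressed[cell]) >= threshold]
    if not idx:
        return None
    mid = (idx[0] + idx[-1]) // 2   # midpoint of oldest/latest cells
    return min_range_cells[mid]
```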
- the position calculation unit 371 may be configured to identify the position of the target ship from the compressed data using a machine learning algorithm.
- in this case, the position calculation unit 371 stores (learns) in advance, as a plurality of parameters, information in which a plurality of pieces of compressed data including ships are associated with the position, shape, and the like of the ship in each piece of compressed data, together with a plurality of pieces of compressed data not including ships and information indicating that no ship is included in each of them. Then, based on the plurality of parameters stored in advance, the position calculation unit 371 identifies, as the position of the ship, the candidate most likely to be the position of the ship from among the position candidates of the ship included in the compressed data generated in step S310.
- the algorithm may be any known algorithm (including deep learning) or an algorithm to be developed from now on. Therefore, the detailed description of the machine learning algorithm will be omitted.
- the image generation unit 367 generates, as the transmission image, an image representing an area of a predetermined shape centered on the position calculated in step S320, based on the observation data (step S330).
- the image generation unit 367 may be configured to generate the transmission image based on the compressed data generated in step S310 instead of the observation data in step S330.
- the processing method for generating the transmission image based on the observation data or the compressed data in step S330 may be a known method or a method to be developed in the future. Therefore, in the following, the detailed description of the processing method is omitted.
- as described above, the flying object 2 according to Modification 1 of the embodiment generates observation data based on radio waves received by the radar (in this example, the synthetic aperture radar unit 21), range-compresses the generated observation data, and detects the detection target (in this example, a ship) based on the range-compressed signal. Thereby, the flying object 2 can shorten the time required to generate an image representing the region D2, and can notify the user more quickly that the detection target has been detected.
- FIG. 8 is a flowchart showing still another example of the flow of processing in which the control device 3 detects a detection target in the region D2 based on observation data.
- the processing of steps S110 to S320, steps S330 to S210, and step S220 shown in FIG. 8 is the same as the processing of the corresponding steps shown in FIG. 6, so its description is omitted.
- the feature extraction unit 373 extracts the features of the target vessel based on the compressed data generated in step S310 (step S410). Specifically, the feature extraction unit 373 extracts the features of the target vessel from the compressed data using a machine learning algorithm. In this case, the feature extraction unit 373 stores (learns) in advance, as a plurality of parameters, information in which features of vessel areas are associated with features of the corresponding vessels.
- the characteristics of the ship area include, for example, the width in the longitudinal direction of the ship area included in the compressed data, the shape of the ship area, and the area of the ship area.
- the features of the ship area may include other information representing the ship area instead of, or in addition to, some or all of these.
- for example, the feature extraction unit 373 stores in advance, as a plurality of parameters, information in which combinations of the longitudinal width, shape, and area of a ship area included in compressed data are associated with the overall length of the ship. Then, based on the stored parameters and the ship area representing the target vessel included in the compressed data generated in step S310, the feature extraction unit 373 extracts, from among the candidates for the overall length of the target vessel, the most probable candidate as the overall length.
- similarly, the feature extraction unit 373 stores in advance, as a plurality of parameters, information in which such combinations are associated with the type of the ship, and extracts, from among the candidates for the type of the target vessel, the most probable candidate as the type.
- the feature extraction unit 373 likewise stores in advance, as a plurality of parameters, information in which such combinations are associated with the course of the ship, and extracts, from among the candidates for the course of the target vessel, the most probable candidate as the course.
- the feature extraction unit 373 likewise stores in advance, as a plurality of parameters, information in which such combinations are associated with the speed of the ship, and extracts, from among the candidates for the speed of the target vessel, the most probable candidate as the speed.
- the feature extraction unit 373 likewise stores in advance, as a plurality of parameters, information in which such combinations are associated with the navigation state of the ship, and extracts, from among the candidates for the navigation state of the target vessel, the most probable candidate as the navigation state.
- the machine learning algorithm used by the feature extraction unit 373 in step S410 may be any known algorithm (including deep learning) or an algorithm to be developed in the future. A detailed description of the algorithm is therefore omitted.
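One simple way to realize the "stored parameters" idea above is a nearest-neighbour lookup against a pre-stored table. The sketch below (Python with NumPy) and all of its numbers are hypothetical placeholders for learned parameters, not values from the source:

```python
import numpy as np

# Hypothetical stored parameters: rows of (longitudinal width [m],
# shape code, area [m^2]) of known ship areas in compressed data,
# each associated with the overall length of the corresponding ship.
stored_features = np.array([[120.0, 1, 2400.0],
                            [ 60.0, 2,  700.0],
                            [200.0, 1, 5200.0]])
stored_lengths = np.array([135.0, 70.0, 220.0])

def extract_overall_length(ship_region_feature):
    """Return the most probable overall-length candidate: here simply the
    length associated with the nearest stored feature combination."""
    d = np.linalg.norm(stored_features - ship_region_feature, axis=1)
    return float(stored_lengths[np.argmin(d)])

print(extract_overall_length(np.array([118.0, 1, 2300.0])))  # -> 135.0
```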
- As described above, the projectile 2 according to the second modification extracts the features of each of one or more ships from the observation data without generating an image representing the region D2. Thereby, the projectile 2 can save the time otherwise required to generate the image, and can notify the user more quickly of vessel characteristic information indicating the features of each detected vessel.
- Hereinafter, the third modification of the embodiment will be described with reference to FIG. 9.
- components similar to those of the embodiment are denoted by the same reference numerals, and description thereof is omitted.
- in the third modification, as in the first modification of the embodiment, a case where the projectile 2 observes the region D will be described.
- also, a case where the detection target is crustal deformation will be described.
- the control device 3 detects, as crustal deformation, an uplift or a subsidence occurring in at least a part of the land area included in the region D.
- the control device 3 executes the process of the flowchart shown in FIG. 9 instead of the process of the flowchart shown in FIG.
- the detection target may include another object or a phenomenon.
- FIG. 9 is a flow chart showing an example of a processing flow for detecting crustal deformation that occurs in at least a part of the land area included in the area D based on observation data.
- the processes of step S110 to step S150 shown in FIG. 9 are the same as the processes of step S110 to step S150 shown in FIG.
- the detection target detection unit 369 reads the base map stored in advance in the storage unit 32 from the storage unit 32 (step S520).
- the base map is, in this example, an image representing the area D generated in step S150 executed in the past by the control device 3.
- the detection target detection unit 369 detects crustal deformation occurring in at least a part of the land area included in the area D (step S530).
- hereinafter, the process of step S530 will be described. In the following description, each of the plurality of pixels constituting the image is referred to as a first pixel, and each of the plurality of pixels constituting the base map is referred to as a second pixel.
- the detection target detection unit 369 selects, for each of the plurality of first pixels constituting the image representing the area D generated in step S150 shown in FIG. 9, the second pixel corresponding to that first pixel. The corresponding second pixel is the pixel of the base map that represents a backscatterer at the same position as the backscatterer represented by the first pixel.
- the detection target detection unit 369 calculates the difference between the phase associated with a given first pixel and the phase associated with the corresponding second pixel, removes unnecessary phase components from the calculated difference, and extracts the remainder as the phase component due to crustal deformation.
- the detection target detection unit 369 specifies the first pixel as a third pixel when the extracted phase component is equal to or more than a predetermined value.
- when the extracted phase component is less than the predetermined value, the detection target detection unit 369 specifies the first pixel as a fourth pixel.
- instead of specifying the fourth pixel, the detection target detection unit 369 may be configured to do nothing in particular in this case.
- the detection target detection unit 369 repeatedly performs the processing from the selection of the second pixel to the identification of the third pixel or the fourth pixel for each of all the first pixels.
- the detection target detection unit 369 identifies, among the one or more specified third pixels, third pixels adjacent to each other as one cluster. From the identified one or more clusters, the detection target detection unit 369 excludes, as noise, any cluster in which the number of third pixels forming the cluster is less than a predetermined number. The detection target detection unit 369 then detects each of the one or more remaining clusters as crustal deformation.
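A compact sketch of step S530 as described above (phase differencing, thresholding into third pixels, clustering of adjacent pixels, and noise rejection) might look as follows in Python. It assumes co-registered complex images and that unnecessary phase components have already been removed; the names are illustrative:

```python
import numpy as np
from scipy import ndimage

def detect_crustal_deformation(image, base_map, phase_threshold, min_pixels):
    """image, base_map: co-registered complex SAR images (first/second pixels).
    Returns a label array whose non-zero clusters are detected deformations."""
    # Phase difference between each first pixel and its second pixel.
    phase = np.angle(image * np.conj(base_map))
    # Third pixels: phase component at or above the predetermined value.
    third = np.abs(phase) >= phase_threshold
    # Adjacent third pixels form one cluster.
    labels, n = ndimage.label(third)
    # Exclude clusters with fewer than min_pixels third pixels as noise.
    for k in range(1, n + 1):
        if np.count_nonzero(labels == k) < min_pixels:
            labels[labels == k] = 0
    return labels
```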
- the detection target detection unit 369 determines whether crustal deformation has been detected in step S530 (step S535). If the detection target detection unit 369 determines that no crustal deformation has been detected in step S530 (step S535-NO), the control unit 36 ends the process. On the other hand, if the detection target detection unit 369 determines that crustal deformation has been detected in step S530 (step S535-YES), the position calculation unit 371, the image generation unit 367, the transmission data generation unit 375, and the communication control unit 361 repeat the processes of steps S540 to S560 for each of the one or more crustal deformations detected in step S530 (step S357). In the following, for convenience of explanation, the crustal deformation selected in step S357 will be referred to as the target crustal deformation.
- the position calculation unit 371 calculates the position of the target crustal deformation (step S540). Specifically, the position calculation unit 371 calculates, as the position of the target crustal deformation, a predetermined position of the cluster of third pixels detected as crustal deformation in step S530.
- the predetermined position is, for example, the position of the center (or the center of gravity) of the area formed by the cluster in the image.
- the predetermined position may be another position determined based on the area, instead of the position of the center of the area.
- the image generation unit 367 trims the image generated in step S150 based on the position of the target crustal deformation calculated in step S540 (step S545). Specifically, the image generation unit 367 trims (cuts out) from the image a partial image representing, among the areas included in the area D, an area of a predetermined shape centered on the position of the target crustal deformation calculated in step S540.
- the predetermined shape is, for example, a rectangle whose sides each measure a predetermined distance.
- the predetermined shape is a rectangular area having a side parallel to the latitude direction and a side parallel to the longitude direction in the image.
- the predetermined distance is, for example, 500 meters.
- the predetermined distance may be a distance shorter than 500 meters, or may be a distance longer than 500 meters. Further, the predetermined shape may be another shape such as a circle or an oval instead of a rectangle.
- the image generation unit 367 generates the trimmed partial image as a transmission image.
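The trimming in step S545 amounts to a simple crop around the calculated position. A minimal sketch (Python with NumPy; the names are hypothetical) under the assumption that the position has already been converted to pixel coordinates:

```python
import numpy as np

def trim_transmission_image(image: np.ndarray, center_rc, half_size: int):
    """Cut out a square partial image roughly (2 * half_size) pixels on a side,
    centred on center_rc = (row, col); half_size would correspond to the
    predetermined distance (e.g. 500 m) divided by the pixel spacing."""
    r, c = center_rc
    r0, c0 = max(r - half_size, 0), max(c - half_size, 0)
    return image[r0:r + half_size, c0:c + half_size]
```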
- the transmission data generation unit 375 generates transmission data (step S550). Specifically, the transmission data generation unit 375 generates, as the transmission data, information including crustal deformation identification information, crustal deformation position information, and a transmission image.
- the crustal deformation identification information is information identifying the target crustal deformation.
- the crustal deformation identification information may be any information capable of identifying each of the one or more crustal deformations detected in step S530.
- the crustal deformation position information is information indicating the position of the target crustal deformation, that is, the position calculated in step S540.
- the transmission image is the transmission image generated in step S545.
- the transmission data generation unit 375 stores the transmission data generated in step S550 in the storage unit 32 (step S560).
- by repeating the processes of steps S357 to S560, the flying object 2 can generate transmission data for each of the one or more crustal deformations detected in step S530 and store the generated transmission data in the storage unit 32.
- after the repetition of steps S357 to S560, the communication control unit 361 outputs each piece of the transmission data stored in the storage unit 32 to the communication antenna unit 22, causes the communication antenna unit 22 to transmit radio waves corresponding to the transmission data toward the receiving device 4 (step S570), and ends the process.
- in this manner, the flying object 2 can make the amount of data transmitted to the transmission destination (in this example, the receiving device 4) smaller than the amount of the observation data, and can therefore, for example, shorten the time required to provide the user with information indicating that crustal deformation has been detected.
- in step S570, the communication control unit 361 may be configured to output only a part of the transmission data stored in the storage unit 32 to the communication antenna unit 22 and cause the communication antenna unit 22 to transmit the corresponding radio waves toward the receiving device 4.
- alternatively, in step S550, the communication control unit 361 may be configured to output the transmission data generated by the transmission data generation unit 375 directly to the communication antenna unit 22 and cause the communication antenna unit 22 to transmit the corresponding radio waves toward the receiving device 4.
- instead of being configured for the detection of crustal deformation as described above, the projectile 2 may be applied to local disaster detection, volcanic activity monitoring, infrastructure monitoring, and the like.
- the projectile 2 may be configured to detect features of the target crustal deformation at any timing within the period in which the processes of steps S357 to S545 shown in FIG. 9 are performed.
- in this case, the flying object 2 detects the features using a machine learning algorithm.
- the flying object 2 then generates transmission data including information indicating the detected features.
- as described above, the projectile 2 according to the third modification of the embodiment detects crustal deformation by comparing the base map with the generated image representing the area D. Thereby, the projectile 2 can notify the user more quickly that crustal deformation has been detected.
- the control device 3 described above may be configured to be mounted on another flying object such as an airplane instead of the flying object 2.
- in that case, functional units corresponding to the synthetic aperture radar unit 21 and the communication antenna unit 22 are mounted on that flying object.
- the detection target detection unit 369 may be configured to detect local changes of the ground surface by obtaining image similarity, a decrease in coherence, or the like through coherence analysis, instead of calculating the phase difference between the base map and the generated image representing the area D.
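For the coherence-analysis alternative, the sample coherence over a small moving window is the usual quantity. A minimal sketch (Python with NumPy/SciPy; window size and names are assumptions), where low values of the returned map indicate local change:

```python
import numpy as np
from scipy.signal import fftconvolve

def coherence_map(s1, s2, win=5):
    """Sample coherence of two co-registered complex SAR images, estimated
    over a win x win moving window; values lie in [0, 1]."""
    k = np.ones((win, win))
    num = fftconvolve(s1 * np.conj(s2), k, mode="same")   # <s1 s2*>
    p1 = fftconvolve(np.abs(s1) ** 2, k, mode="same")     # <|s1|^2>
    p2 = fftconvolve(np.abs(s2) ** 2, k, mode="same")     # <|s2|^2>
    return np.abs(num) / np.sqrt(p1 * p2 + 1e-12)
```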
- a program for realizing the functions of arbitrary components of the devices described above (for example, the flying object 2) may be recorded on a computer-readable recording medium, and the program may be read and executed by a computer system.
- the “computer system” referred to here includes hardware such as an operating system (OS) and peripheral devices.
- the "computer-readable recording medium" refers to a portable medium such as a flexible disk, a magneto-optical disk, a ROM, or a CD (Compact Disc)-ROM, or a storage device such as a hard disk incorporated in a computer system.
- the "computer-readable recording medium" also includes a medium that holds the program for a certain period of time, such as a volatile memory (RAM) inside a computer system serving as a server or a client when the program is transmitted via a network such as the Internet or a communication line such as a telephone line.
- the above program may be transmitted from a computer system in which the program is stored in a storage device or the like to another computer system via a transmission medium or by transmission waves in the transmission medium.
- the “transmission medium” for transmitting the program is a medium having a function of transmitting information, such as a network (communication network) such as the Internet or a communication line (communication line) such as a telephone line.
- the above program may be for realizing a part of the functions described above.
- the program described above may be a so-called difference file (difference program) that can realize the functions described above in combination with the program already recorded in the computer system.
Description
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
First, an overview of the satellite observation system 1 according to the embodiment will be described. FIG. 1 is a diagram showing an example of the configuration of the satellite observation system 1.
In this embodiment, the flying object 2 is an artificial satellite that orbits above the surface of the earth ET along a predetermined orbit. The flying object 2 may instead be an artificial satellite that orbits above the surface of another celestial body or object along such an orbit. Such celestial bodies include planets other than the earth ET, such as Mars and Venus, moons such as the Moon and Titan, and asteroids such as Itokawa. Such objects include rocks and the like. The flying object 2 may also be another type of flying object, such as an airplane or a drone, instead of an artificial satellite.
The flying object 2 may also be configured to observe another object existing on the earth ET as an observation target, instead of observing a part of the region. When the flying object 2 passes through the part of the sky from which radio waves can be emitted toward the region D, its attitude is controlled so that the radio waves emitted from the flying object 2 irradiate the region D. A known method or a method to be developed in the future may be used to control the attitude of the flying object 2, and a description thereof is therefore omitted. The flying object 2 observes the region D by irradiating the region D with radio waves and receiving the radio waves reflected at the surface of the region D.
The receiving device 4 receives, as reception data, the transmission data transmitted from the flying object 2 toward the receiving device 4, and stores the received reception data. This allows the user to act on the information indicating the detection target included in the reception data stored by the receiving device 4.
Hereinafter, the hardware configuration of the control device 3 will be described with reference to FIG. 2. FIG. 2 is a diagram showing an example of the hardware configuration of the control device 3.
The storage unit 32 includes, for example, an EEPROM (Electrically Erasable Programmable Read-Only Memory), a ROM (Read-Only Memory), a RAM (Random Access Memory), a flash memory, and the like. The storage unit 32 stores various kinds of information and images processed by the control device 3.
The communication unit 34 includes, for example, analog or digital input/output ports conforming to various communication standards.
Hereinafter, the functional configuration of the control device 3 will be described with reference to FIG. 3. FIG. 3 is a diagram showing an example of the functional configuration of the control device 3.
Hereinafter, the process in which the control device 3 detects a detection target in the region D based on observation data will be described with reference to FIG. 4. As an example, a process in which the control device 3 detects a ship in the region D as the detection target based on observation data will be described. In this case, the region D includes at least a sea area among the regions included in the surface of the earth ET. FIG. 4 is a flowchart showing an example of the flow of the process in which the control device 3 detects a ship in the region D (that is, a ship in that sea area) based on observation data. The detection target may also include other objects and phenomena in addition to ships.
In the following, for convenience of explanation, the image obtained by applying a binarization filter to the land-removed image is referred to as a binarized image. FIG. 5 is a diagram showing an example of the binarized image. The image P1 shown in FIG. 5 is an example of the binarized image. In the image P1, the region SC1 is a region composed of pixels whose luminance value is the third luminance value, and the hatched region SC2 is a region composed of pixels whose luminance value is the second luminance value. The detection target detection unit 369 may also be configured to binarize the land-removed image with a standard deviation filter instead of the binarization filter.
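The binarization filter described here reduces to a per-pixel threshold. A minimal sketch in Python with NumPy; the concrete values standing in for the second and third luminance values are assumptions:

```python
import numpy as np

def binarize(land_removed, threshold, third_value=255, second_value=0):
    """Set pixels at or above the threshold to the third luminance value and
    all other pixels to the second luminance value (cf. image P1 in FIG. 5)."""
    return np.where(land_removed >= threshold, third_value, second_value).astype(np.uint8)
```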
For this reason, a more detailed description of the machine learning algorithm is omitted.
Hereinafter, the first modification of the embodiment will be described with reference to FIGS. 6 and 7. In the first modification, components similar to those of the embodiment are denoted by the same reference numerals, and description thereof is omitted. In the first modification, the control device 3 detects one or more ships from the observation data without generating an image representing the region D. Specifically, the control device 3 executes the process of the flowchart shown in FIG. 6 instead of the process of the flowchart shown in FIG. 4. In the first modification, as in the embodiment, a case where the detection target is a ship will be described as an example. Also, a case where the flying object 2 observes a region D2 instead of the region D will be described. The region D2 is a region that includes only a sea area among the regions included in the surface of the earth ET (that is, it does not include the aforementioned land area).
This arc-shaped region is a region representing the range curvature included in the compressed data. Here, the number of range cells (denoted "range cell" in FIG. 7) is the number of cells on the horizontal axis of the compressed data image and is a value convertible into a range distance. The number of azimuth cells (denoted "azimuth cell" in FIG. 7) is the number of cells on the vertical axis of the compressed data image and is a value convertible into a time. FIG. 7 is a diagram showing an example of the compressed data image in a case where one ship is included in the region D2. The image P2 shown in FIG. 7 is an example of the compressed data image. The luminance value of each pixel constituting the image P2 represents the intensity, and becomes larger as the intensity becomes stronger. The region F1 shown in FIG. 7 is an example of the ship region corresponding to that one ship. In the example shown in FIG. 7, the luminance values of the plurality of pixels constituting the region F1 are equal to or higher than the second predetermined luminance value. The ship region has a partial region substantially parallel to the vertical axis in the compressed data image. In the example shown in FIG. 7, that partial region is the partial region of the region F1 indicated by the region W1 in the image P2.
The detection target detection unit 369 can detect the ship region by detecting such a partial region from the compressed data.
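A direct way to detect such a near-vertical partial region is to look for range cells (columns) containing a long run of consecutive bright pixels. A sketch in Python with NumPy; the thresholds are hypothetical:

```python
import numpy as np

def detect_ship_columns(compressed_img, lum_threshold, min_run):
    """Return the range-cell indices (columns) that contain at least min_run
    consecutive pixels whose luminance is at or above lum_threshold, i.e. a
    partial region roughly parallel to the azimuth (vertical) axis."""
    bright = compressed_img >= lum_threshold
    hits = []
    for col in range(bright.shape[1]):
        run = best = 0
        for v in bright[:, col]:
            run = run + 1 if v else 0
            best = max(best, run)
        if best >= min_run:
            hits.append(col)
    return hits
```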
Hereinafter, the second modification of the embodiment will be described with reference to FIG. 8. In the second modification, components similar to those of the embodiment are denoted by the same reference numerals, and description thereof is omitted. In the second modification, as in the embodiment, a case where the detection target is a ship will be described as an example. Also, as in the first modification, a case where the flying object 2 observes the region D2 will be described. In the second modification, the control device 3 detects the features of each of one or more ships in the region D2 from the observation data without generating an image representing the region D2. Specifically, the control device 3 executes the process of the flowchart shown in FIG. 8 instead of the process of the flowchart shown in FIG. 4.
2 flying object
3 control device
4 receiving device
21 synthetic aperture radar unit
22 communication antenna unit
31 FPGA
32 storage unit
34 communication unit
36 control unit
361 communication control unit
363 radar control unit
364 observation data generation unit
365 processing unit
367 image generation unit
369 detection target detection unit
371 position calculation unit
373 feature extraction unit
375 transmission data generation unit
Claims (11)
1. A flying object comprising: an observation data generation unit that generates observation data based on radio waves received by a radar; an image generation unit that generates an image representing a monitored space based on the observation data generated by the observation data generation unit; and a detection unit that detects a detection target based on the image generated by the image generation unit.
2. The flying object according to claim 1, wherein the detection unit detects the detection target by binarizing the image generated by the image generation unit.
3. A flying object comprising: an observation data generation unit that generates observation data based on radio waves received by a radar; a processing unit that range-compresses the observation data generated by the observation data generation unit; and a detection unit that detects a detection target based on the signal range-compressed by the processing unit.
4. The flying object according to any one of claims 1 to 3, wherein the monitored space includes a sea area, and the detection target includes a ship in the sea area.
5. The flying object according to any one of claims 1 to 4, wherein the detection unit detects, as the detection target, a target estimated to be the detection target from among candidates for the detection target in the monitored space, based on a plurality of parameters stored in advance.
6. The flying object according to claim 1, wherein the detection unit detects the detection target by comparing a base map with the image generated by the image generation unit.
7. The flying object according to claim 6, wherein the detection target includes at least one of crustal deformation in the monitored space and a local change of the ground surface.
8. The flying object according to any one of claims 1 to 7, further comprising a position calculation unit that calculates a position of the detection target detected by the detection unit and generates position information indicating the calculated position.
9. The flying object according to any one of claims 1 to 8, further comprising a feature extraction unit that extracts a feature of the detection target detected by the detection unit.
10. A program causing a computer included in a flying object to: generate observation data based on radio waves received by a radar; generate an image representing a monitored space based on the generated observation data; and detect a detection target in the generated image.
11. A program causing a computer included in a flying object to: generate observation data based on radio waves received by a radar; range-compress the generated observation data; and detect a detection target based on the range-compressed signal.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112018000997.1T DE112018000997T5 (de) | 2017-02-24 | 2018-02-26 | Flying body and program |
JP2019501862A JP7521768B2 (ja) | 2017-02-24 | 2018-02-26 | Flying body |
CA3054258A CA3054258C (en) | 2017-02-24 | 2018-02-26 | Flying body and program |
US16/487,314 US11262447B2 (en) | 2017-02-24 | 2018-02-26 | Flying body and program |
JP2023156368A JP2023165817A (ja) | 2017-02-24 | 2023-09-21 | Flying body and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017-034122 | 2017-02-24 | ||
JP2017034122 | 2017-02-24 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018155683A1 true WO2018155683A1 (ja) | 2018-08-30 |
Family
ID=63252800
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/006941 WO2018155683A1 (ja) | 2017-02-24 | 2018-02-26 | 飛翔体、及びプログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US11262447B2 (ja) |
JP (2) | JP7521768B2 (ja) |
CA (2) | CA3141030C (ja) |
DE (1) | DE112018000997T5 (ja) |
WO (1) | WO2018155683A1 (ja) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11030450B2 (en) * | 2018-05-31 | 2021-06-08 | Vatbox, Ltd. | System and method for determining originality of computer-generated images |
WO2020174566A1 (ja) * | 2019-02-26 | 2020-09-03 | NEC Corporation | Monitoring device, tracking method, and non-transitory computer-readable medium |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5248979A (en) * | 1991-11-29 | 1993-09-28 | Trw Inc. | Dual function satellite imaging and communication system using solid state mass data storage |
JP2001004398A (ja) * | 1999-06-25 | 2001-01-12 | Mitsubishi Space Software Kk | Method for detecting movement information of a moving body based on satellite SAR images |
JP2004170170A (ja) * | 2002-11-19 | 2004-06-17 | Mitsubishi Space Software Kk | Ship shape estimation method and device, and ship shape estimation program |
WO2008016153A1 (fr) * | 2006-08-03 | 2008-02-07 | Pasco Corporation | Disaster countermeasure support method |
JP2010197337A (ja) * | 2009-02-27 | 2010-09-09 | Mitsubishi Space Software Kk | Artificial object detection device, artificial object detection method, and artificial object detection program |
JP2011208961A (ja) * | 2010-03-29 | 2011-10-20 | Mitsubishi Space Software Kk | Image processing device, monitoring system, image processing method, and image processing program |
JP2012063196A (ja) * | 2010-09-15 | 2012-03-29 | Mitsubishi Space Software Kk | Ship detection device, ship detection program, and ship detection method for a ship detection device |
JP2012242216A (ja) * | 2011-05-18 | 2012-12-10 | Mitsubishi Electric Corp | Imaging radar signal processing device |
WO2014010000A1 (ja) * | 2012-07-12 | 2014-01-16 | Mitsubishi Electric Corporation | Radar system and data processing device |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4546355A (en) * | 1982-06-17 | 1985-10-08 | Grumman Aerospace Corporation | Range/azimuth/elevation ship imaging for ordnance control |
US4546354A (en) * | 1982-06-17 | 1985-10-08 | Grumman Aerospace Corporation | Range/azimuth ship imaging for ordnance control |
US4563686A (en) * | 1982-06-17 | 1986-01-07 | Grumman Aerospace Corporation | Range/doppler ship imaging for ordnance control |
US4723124A (en) * | 1986-03-21 | 1988-02-02 | Grumman Aerospace Corporation | Extended SAR imaging capability for ship classification |
JP3510140B2 (ja) | 1999-03-25 | 2004-03-22 | Mitsubishi Electric Corporation | Target identification device and target identification method |
ATE527557T1 (de) | 2005-11-09 | 2011-10-15 | Saab Ab | Multisensor system |
EP2036043A2 (en) * | 2006-06-26 | 2009-03-18 | Lockheed Martin Corporation | Method and system for providing a perspective view image by intelligent fusion of a plurality of sensor data |
WO2009080903A1 (fr) | 2007-12-21 | 2009-07-02 | V.Navy | System for detecting and positioning a maritime object |
JP6066934B2 (ja) | 2011-03-10 | 2017-01-25 | Airbus Defence and Space Limited | System for generating a plurality of SAR images on a satellite or aerial platform, satellite comprising such a system, and method for generating synthetic aperture radar (SAR) images on a satellite or aerial platform |
JP2013250122A (ja) | 2012-05-31 | 2013-12-12 | Mitsubishi Electric Corp | Radar device and radar signal processing device |
WO2016101279A1 (zh) | 2014-12-26 | 2016-06-30 | Ocean University of China | Rapid detection method for ship targets in synthetic aperture radar images |
US11263769B2 (en) | 2015-04-14 | 2022-03-01 | Sony Corporation | Image processing device, image processing method, and image processing system |
WO2017060543A1 (es) | 2015-10-07 | 2017-04-13 | Inteo Media Mobile, S.L. | Perimeter surveillance system for marine farms and the like |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020159946A (ja) | 2019-03-27 | 2020-10-01 | IHI Corporation | Ship detection device and method |
JP2022078754A (ja) | 2020-11-13 | 2022-05-25 | Toshiba Corporation | Image identification device, radar device, image identification method, and program |
JP7006875B1 (ja) | 2021-06-18 | 2022-01-24 | Space Shift, Inc. | Learning model, signal processing device, flying body, and program |
CN115485742A (zh) | 2021-06-18 | 2022-12-16 | Space Shift, Inc. | Learning model, signal processing device, flying object, and program |
WO2022264473A1 (ja) | 2021-06-18 | 2022-12-22 | Space Shift, Inc. | Learning model, signal processing device, flying body, and program |
JP2023000897A (ja) | 2021-06-18 | 2023-01-04 | Space Shift, Inc. | Learning model, signal processing device, flying body, and program |
US12044798B2 (en) | 2021-06-18 | 2024-07-23 | Space Shift, Inc. | Learning model, signal processor, flying object, and program |
Also Published As
Publication number | Publication date |
---|---|
JP2023165817A (ja) | 2023-11-17 |
JPWO2018155683A1 (ja) | 2019-12-26 |
CA3054258C (en) | 2023-01-10 |
CA3141030C (en) | 2023-05-23 |
CA3141030A1 (en) | 2018-08-30 |
CA3054258A1 (en) | 2018-08-30 |
US11262447B2 (en) | 2022-03-01 |
JP7521768B2 (ja) | 2024-07-24 |
DE112018000997T5 (de) | 2019-11-07 |
US20200025912A1 (en) | 2020-01-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018155683A1 (ja) | Flying body and program | |
Velotto et al. | First comparison of Sentinel-1 and TerraSAR-X data in the framework of maritime targets detection: South Italy case | |
JP5567805B2 (ja) | Flying object detection method, system, and program | |
JP2019067252A (ja) | Route selection device, unmanned aerial vehicle, data processing device, route selection processing method, and route selection processing program | |
CN105372626A (zh) | System and method for determining the position of a transmitter of a bistatic radar system | |
EP2339370A1 (en) | Distributed sensor SAR processing system | |
US20200191946A1 (en) | Methods and systems for controlling weather radar and electro-optical and imaging systems of search and rescue vehicles | |
CN116539913B (zh) | Method and device for on-board real-time inversion of sea surface wind speed | |
US20190146054A1 (en) | Wave source direction estimation apparatus, wave source direction estimation system, wave source direction estimation method, and wave source direction estimation program | |
Galloway et al. | Automated crater detection and counting using the Hough transform | |
KR102260239B1 (ko) | Terrain-following flight method | |
US9401093B2 (en) | Procedure for the detection and display of artificial obstacles for a rotary-wing aircraft | |
Luce et al. | On the performance of the range imaging technique estimated using unmanned aerial vehicles during the ShUREX 2015 campaign | |
KR101749231B1 (ko) | Celestial navigation of a satellite using star sensors and earth sensors | |
US9927527B2 (en) | Satellite signal acquisition using antennas with beamforming abilities | |
Van Uffelen | Global positioning systems: Over land and under sea | |
JP7187699B2 (ja) | Apparatus, method, and computer program for processing a voice radio signal | |
JP2022170543A (ja) | Signal processing device, signal processing method, and program | |
JP6774085B2 (ja) | Active sensor signal processing system, signal processing method, and signal processing program | |
US12044798B2 (en) | Learning model, signal processor, flying object, and program | |
Crawford | Ice island deterioration | |
EP4206732A1 (en) | Scanning a body of water with a fishing sonar apparatus | |
KR102669003B1 (ko) | GNSS receiver using a dual-polarization antenna and position determination method thereof | |
Hamad et al. | A Survey of Localization Systems in the Sea Based on New Categories | |
JP2005181059A (ja) | Target position locating method and target position locating device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18757832; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2019501862; Country of ref document: JP; Kind code of ref document: A |
| ENP | Entry into the national phase | Ref document number: 3054258; Country of ref document: CA |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 18757832; Country of ref document: EP; Kind code of ref document: A1 |