WO2019220639A1 - Moving body observation device and moving body observation method

Moving body observation device and moving body observation method

Info

Publication number
WO2019220639A1
Authority
WO
WIPO (PCT)
Prior art keywords
moving body
image
unit
wavefront
light beam
Prior art date
Application number
PCT/JP2018/019366
Other languages
English (en)
Japanese (ja)
Inventor
貴雄 遠藤
俊行 安藤
隆 高根澤
豊 江崎
Original Assignee
三菱電機株式会社
Priority date
Filing date
Publication date
Application filed by 三菱電機株式会社
Priority to JP2020518936A (JP6771697B2)
Priority to PCT/JP2018/019366 (WO2019220639A1)
Publication of WO2019220639A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01J - MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J9/00 - Measuring optical phase difference; Determining degree of coherence; Measuring optical wavelength
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder

Definitions

  • the present invention relates to a moving body observation apparatus and a moving body observation method for estimating a luminance distribution of a moving body.
  • A moving body observation apparatus that observes a moving body such as a celestial body or a flying object observes the moving body by receiving, on the ground, a light beam reflected by the moving body or a light beam transmitted from the moving body.
  • The light beam from the moving body may spread because the phase of the light is disturbed by fluctuations of the refractive index distribution in the atmosphere. Therefore, in order to increase the observation accuracy of the moving body, the moving body observation apparatus needs to acquire the wavefront, which is a surface of equal optical phase.
  • Patent Document 1 below discloses a wavefront sensor that measures a wavefront.
  • The wavefront sensor disclosed in Patent Document 1 below uses a telescope to track the light flux transmitted from a fast-moving object while controlling the tilt and focus of the wavefront, and thereby images the atmospheric layers.
  • the wavefront sensor includes a control mechanism that realizes high-speed movement of each of the high-speed steering mirror, the imaging lens, and the lenslet array as a mechanism for controlling the tilt and focus of the wavefront.
  • The measurement accuracy of the wavefront obtained by the wavefront sensor therefore depends on the control accuracy of the control mechanism that drives the high-speed steering mirror, the imaging lens, and the lenslet array. Accordingly, the conventional moving body observation apparatus has the problem that the wavefront measurement accuracy of the wavefront sensor may deteriorate depending on the control accuracy of the control mechanism.
  • The present invention has been made to solve the above problem, and an object thereof is to obtain a moving body observation apparatus and a moving body observation method capable of estimating the luminance distribution of a moving body without mounting a control mechanism that realizes high-speed movement of a lenslet array or the like.
  • A moving body observation apparatus according to the present invention includes: an imaging optical system that condenses a light beam reflected by a moving body or a light beam transmitted from the moving body; a space dividing unit that divides the light beam condensed by the imaging optical system into light beams in a plurality of spatial regions; a photodetector that detects a condensed spot image as an image of the moving body from each of the light beams divided by the space dividing unit; a wavefront estimation unit that estimates the wavefront of the light beam at the aperture of the imaging optical system from the positions of the plurality of condensed spot images detected by the photodetector; a radar device that receives a radar wave reflected by the moving body and generates a distance image indicating the distance to the moving body based on the radar wave; and a moving body restoration unit that detects a region where the moving body exists from the distance image generated by the radar device and estimates the luminance distribution of the moving body from, among the plurality of condensed spot images detected by the photodetector, the condensed spot image in the region where the moving body exists and the wavefront estimated by the wavefront estimation unit.
  • According to the present invention, the moving body restoration unit detects the region where the moving body exists from the distance image generated by the radar device and estimates the luminance distribution of the moving body from, among the plurality of condensed spot images detected by the photodetector, the condensed spot image in the region where the moving body exists and the wavefront estimated by the wavefront estimation unit. Therefore, the moving body observation apparatus according to the present invention can estimate the luminance distribution of the moving body without mounting a control mechanism that realizes high-speed movement of a lenslet array or the like.
  • FIG. 1 is a configuration diagram illustrating a moving body observation apparatus according to Embodiment 1.
  • FIG. 2 is a configuration diagram illustrating the wavefront measuring unit 15 of the moving body observation apparatus according to Embodiment 1.
  • FIG. 3 is a perspective view showing the appearance of the moving body observation apparatus according to Embodiment 1.
  • FIG. 4 is an explanatory diagram showing how each of the telescope device 10 and the radar device 30 observes the moving body 1.
  • FIG. 5 is a flowchart showing a moving body observation method, which is the processing procedure of the moving body observation apparatus shown in FIG. 1.
  • FIG. 6 is an explanatory diagram showing an image of the moving body 1 detected by the photodetector 25.
  • FIG. 7 is an explanatory diagram showing an image of the moving body 1 detected by the photodetector 25 when the wavefront varies depending on the propagation path.
  • FIGS. 8 and 9 are explanatory diagrams showing the relationship between the image of the moving body 1 detected by the photodetector 25 and the wavefront.
  • FIG. 10 is a flowchart showing the processing procedure of the wavefront estimation unit 26.
  • FIG. 11 is an explanatory diagram showing the relationship between the aperture of the imaging optical system 11, the apertures of the plurality of spatial regions, and the image 105 of the moving body 1 when the moving body 1 moves relative to the telescope device 10.
  • FIG. 12 is a flowchart showing the processing procedure of the moving body restoration unit 42.
  • FIG. 13 is an explanatory diagram showing the luminance distribution estimation process of the moving body 1 by the moving body restoration unit 42.
  • FIG. 14 is a configuration diagram illustrating a moving body observation apparatus according to Embodiment 2.
  • FIG. 15 is an explanatory diagram showing the trajectory of the moving body 1.
  • FIG. 16 is an explanatory diagram showing the luminance distribution estimation process of the moving body 1 by the moving body restoration unit 70.
  • FIG. 1 is a configuration diagram illustrating a moving body observation apparatus according to Embodiment 1. FIG. 2 is a configuration diagram illustrating the wavefront measuring unit 15 of the moving body observation apparatus according to Embodiment 1.
  • In FIGS. 1 and 2, the moving body 1 is an object such as a celestial body existing outside the atmosphere or a flying object existing in the atmosphere.
  • the light beam reflected by the moving body 1 or transmitted from the moving body 1 is a light beam 2 that has spread due to fluctuations in the refractive index distribution in the atmosphere.
  • The light beam 2 is incident on the telescope device 10.
  • the telescope device 10 is a device that measures the wavefront of the light beam 2 at the aperture of the imaging optical system 11 and acquires an intensity image indicating an image of the moving body 1.
  • the telescope device 10 includes an imaging optical system 11, a light beam splitting unit 12, an intensity image acquisition unit 13, a relay optical system 14, a wavefront measurement unit 15, and a directing device 16.
  • the imaging optical system 11 is an optical system that condenses the incident light beam 2.
  • the light beam splitting unit 12 is realized by, for example, a beam splitter.
  • the light beam splitting unit 12 splits the light beam 2 into two by dividing the light amount or wavelength of the light beam 2 collected by the imaging optical system 11 into two.
  • the beam splitting unit 12 outputs one split beam 2 to the intensity image acquisition unit 13 and outputs the other split beam 2 to the relay optical system 14.
  • the intensity image acquisition unit 13 is realized by, for example, an image sensor.
  • the intensity image acquisition unit 13 detects an image of the moving body 1 from the light beam 2 output from the light beam splitting unit 12 and outputs an intensity image indicating the image of the moving body 1 to the image deformation unit 41.
  • The relay optical system 14 is an optical system that makes the lens array 23 of the wavefront measuring unit 15 optically equivalent to the pupil of the imaging optical system 11, so that the light beam 2 output from the light beam splitting unit 12 is brought to a focus on the lens array 23.
  • the wavefront measuring unit 15 includes a space dividing unit 21, a shutter 24, a photodetector 25, and a wavefront estimating unit 26.
  • the space dividing unit 21 includes a light shielding unit 22 and a lens array 23.
  • The space dividing unit 21 divides the light beam 2 output from the relay optical system 14 into a plurality of light beams 2a in a plurality of spatial regions, and condenses the light beams 2a in the plurality of spatial regions on the light receiving surface 25a of the photodetector 25 (see FIG. 8 or FIG. 9).
  • the light blocking unit 22 partially blocks the light beam 2 output from the relay optical system 14, thereby dividing the light beam 2 into light beams 2a in a plurality of spatial regions.
  • the lens array 23 includes a plurality of lenses 23 a (see FIGS. 6 to 9), and each lens 23 a condenses the light beam 2 a in each spatial region on the light receiving surface 25 a of the photodetector 25.
  • the shutter 24 temporally limits the passage of the light beam 2a output from the lens array 23 in order to adjust the light amount of the light beam 2a received by the photodetector 25.
  • the photodetector 25 is realized by an image sensor, for example.
  • the photodetector 25 has a light receiving surface 25 a that receives each of the light beams 2 a in a plurality of spatial regions that have passed through the shutter 24.
  • The photodetector 25 detects a condensed spot image as an image of the moving body 1 from each light beam 2a received by the light receiving surface 25a, and outputs an intensity image indicating each condensed spot image to the wavefront estimation unit 26 and to the moving body restoration unit 42.
  • the wavefront estimation unit 26 is realized by a computer such as a personal computer or a wavefront estimation circuit.
  • the wavefront estimation unit 26 performs processing for estimating the wavefront of the light flux at the aperture of the imaging optical system 11 from the positions of a plurality of focused spot images indicated by the intensity image output from the photodetector 25. That is, the wavefront estimation unit 26 calculates an approximate value of the wavefront of the light beam 2 at the aperture of the imaging optical system 11 from the positions of the plurality of focused spot images.
  • the directing device 16 is a device that changes the directing direction of the imaging optical system 11 in accordance with a control signal output from the control device 49.
  • the radar apparatus 30 is realized by, for example, an inverse synthetic aperture radar (ISAR).
  • the radar apparatus 30 includes a transmission / reception unit 31, a distance image acquisition unit 32, and a directing device 33.
  • the transmission / reception unit 31 is realized by an antenna, a modulator, a demodulator, a wireless device, and the like.
  • the transmission / reception unit 31 transmits a radar wave such as a microwave or a millimeter wave toward the moving body 1, and receives the radar wave reflected by the moving body 1 as a reflected wave.
  • The distance image acquisition unit 32 performs a process of generating a distance image indicating the distance to the moving body 1 based on the time from when the radar wave is transmitted from the transmission/reception unit 31 until the reflected wave is received, and outputs the distance image to the image deformation unit 41.
  • the directivity device 33 is a device that changes the directivity direction of the transmission / reception unit 31 in accordance with a control signal output from the control device 49.
  • the operation device 40 includes an image transformation unit 41, a moving body restoration unit 42, a recording device 43, a time calibration unit 44, a counter 45, a trajectory information recording unit 46, a trajectory calculation unit 47, a planning device 48, and a control device 49.
  • the image deformation unit 41 is realized by a computer such as a personal computer or an image deformation circuit.
  • the image deforming unit 41 deforms the distance image so that the distance image output from the distance image acquiring unit 32 matches the intensity image output from the intensity image acquiring unit 13, and converts the deformed distance image to the moving object. A process of outputting to the restoration unit 42 is performed.
  • the moving body restoration unit 42 is realized by a computer such as a personal computer or a moving body restoration circuit.
  • the moving body restoration unit 42 performs processing for detecting an area where the moving body 1 exists from the deformed distance image output from the image deformation unit 41.
  • The moving body restoration unit 42 performs a process of estimating the luminance distribution of the moving body 1 from the image in the region where the moving body 1 exists among the intensity image output from the intensity image acquisition unit 13, the plurality of condensed spot images indicated by the intensity image output from the photodetector 25, and the wavefront estimated by the wavefront estimation unit 26.
  • the recording device 43 is realized by a recording processing circuit, for example.
  • the recording device 43 is a device that records the wavefront estimated by the wavefront estimating unit 26 and the luminance distribution of the moving body 1 estimated by the moving body restoring unit 42.
  • the time calibration unit 44 has a built-in clock, and calibrates the clock time using a GPS signal transmitted from a GPS (Global Positioning System) satellite or an NTP (Network Time Protocol).
  • the trajectory information recording unit 46 is realized by a recording processing circuit, for example.
  • the trajectory information recording unit 46 records trajectory information indicating the trajectory of the mobile body 1.
  • the trajectory calculation unit 47 is realized by a computer such as a personal computer or a trajectory calculation circuit.
  • the trajectory calculation unit 47 performs a process of predicting the position of the future time at which the mobile body 1 exists based on the trajectory information recorded in the trajectory information recording unit 46.
  • the planning device 48 is realized by a computer such as a personal computer.
  • the planning device 48 is a device that calculates the pointing direction of the imaging optical system 11 and the pointing direction of the transmission / reception unit 31 at a future time based on the position predicted by the trajectory calculation unit 47.
  • the control device 49 is realized by a computer such as a personal computer. Based on the elapsed time measured by the counter 45 and the calculation result of the planning device 48, the control device 49 outputs a control signal indicating the directivity direction of the imaging optical system 11 to the directivity device 16 and directs the transmission / reception unit 31. A control signal indicating the direction is output to the pointing device 33.
  • the control device 49 is a device that controls each of the intensity image acquisition unit 13, the wavefront measurement unit 15, the distance image acquisition unit 32, and the image deformation unit 41 based on the elapsed time measured by the counter 45.
  • Each of the wavefront estimation circuit, the image deformation circuit, the moving body restoration circuit, and the trajectory calculation circuit corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.
  • The recording processing circuit corresponds to, for example, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), a magnetic disk, a flexible disk, an optical disk, a compact disc, a mini disc, or a DVD (Digital Versatile Disc).
  • FIG. 3 is a perspective view showing an appearance of the moving object observation apparatus according to Embodiment 1.
  • the casing 51 is mounted with the imaging optical system 11, the light beam dividing unit 12, the intensity image acquiring unit 13, the relay optical system 14, the space dividing unit 21, the shutter 24, and the photodetector 25.
  • The wavefront estimation unit 26, the distance image acquisition unit 32, the image deformation unit 41, the moving body restoration unit 42, the recording device 43, the time calibration unit 44, the counter 45, the trajectory information recording unit 46, the trajectory calculation unit 47, the planning device 48, and the control device 49 are mounted in the casing 52.
  • Optical components such as lenses and the human pupil transmit light.
  • an optical component such as a mirror reflects light.
  • the atmosphere of the earth is composed of a medium such as oxygen, nitrogen, and water vapor, and light is transmitted in the same manner as an optical component such as a lens. Since the refractive index of a medium such as oxygen fluctuates with changes in temperature and atmospheric pressure, the phase distribution of light transmitted through the earth's atmosphere changes with changes in temperature and atmospheric pressure. Since light is an electromagnetic wave, the phase distribution of light can be grasped as a wavefront.
  • the telescope device 10 shown in FIG. 1 estimates the wavefront by receiving the light beam 2 reflected by the mobile body 1 existing outside or in the atmosphere or the light beam 2 transmitted from the mobile body 1.
  • the wavefront estimated by the telescope device 10 changes as the refractive index of a medium such as oxygen changes.
  • Although the change in the refractive index of the medium itself is small, when the optical path over which the light propagates is long, the accumulated change becomes non-negligible compared with the wavelength of the light, so the wavefront is strongly affected by atmospheric fluctuations.
  • the atmosphere on the ground is affected by the effects of radiation from the sun and heat transport, and is also affected by the rotation of the earth. Therefore, an atmosphere layer is formed between the ground and the sky.
  • the wavefront of light passing through the atmospheric layers is complexly disturbed.
  • radar waves transmitted and received by the radar device 30 are hardly affected by atmospheric fluctuations.
  • FIG. 4 is an explanatory diagram showing a state in which each of the telescope device 10 and the radar device 30 observes the moving body 1.
  • the same reference numerals as those in FIGS. 1 and 3 denote the same or corresponding parts.
  • the moving body 1 is irradiated with light from the sun, which is the illumination light source 60.
  • the reflected light of wavelength ⁇ 1 that is sunlight reflected by the moving body 1 passes through the atmospheric layer between the ground and the sky, and reaches the telescope device 10.
  • FIG. 4 shows an example in which there are two atmospheric layers between the ground and the sky, where 101 is a first atmospheric layer and 102 is a second atmospheric layer.
  • The imaging optical system 11 condenses the incident light beam 2 having the wavelength λ1 onto the light receiving surface 25a of the photodetector 25, so that an image of the moving body 1 is formed on the light receiving surface 25a of the photodetector 25. Since the wavefront is disturbed when the light beam 2 passes through the first atmospheric layer 101 and the second atmospheric layer 102, the image of the moving body 1 spreads even if the moving body 1 can be regarded as a point. Therefore, excluding the spread of the image of the moving body 1 caused by the aberration of the imaging optical system 11 and the spread caused by the resolution of the photodetector 25, the cause of the spread of the image of the moving body 1 is atmospheric fluctuation.
  • When the moving body 1 is an object having a finite extent, and the spread due to the aberration of the imaging optical system 11 and the resolution of the photodetector 25 is excluded, the spread of the image of the moving body 1 is expressed by the convolution of the spread of the object itself and the spread due to atmospheric fluctuations.
  • each of the plurality of focused spot images detected by the photodetector 25 is spread by atmospheric fluctuation, and the intensity image indicating the plurality of focused spot images is an intensity image 62 as shown in FIG. Become.
  • reference numeral 61 denotes the shape of the moving body 1 when the moving body 1 is viewed from the telescope device 10 and the radar device 30.
  • the radar apparatus 30 can calculate the distance to the moving body 1 from the time from when the radar wave having the wavelength ⁇ 2 is transmitted until the reflected wave that is the radar wave reflected by the moving body 1 is received. .
  • each of L 1 , L 2 , and L 3 indicates a distance to each part in the moving body 1.
  • the radar apparatus 30 can obtain a distance image 63 as shown in FIG. 4 by calculating the distance to each part in the moving body 1.
  • Since the radar wave having the wavelength λ2 is hardly affected by atmospheric fluctuations, the distance image 63 is an image without the spread caused by atmospheric fluctuations.
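  • As a rough illustration (a sketch, not from the patent; the echo times below are hypothetical), the range to each part of the moving body follows from the round-trip time of the radar wave:

```python
# Minimal sketch (not from the patent): converting measured round-trip times of the
# radar wave into ranges, one per resolved part of the moving body (L1, L2, L3).
C = 299_792_458.0  # speed of light [m/s]

def range_from_round_trip(tau_seconds: float) -> float:
    """Range to a reflecting part, assuming the wave travels out and back once."""
    return C * tau_seconds / 2.0

# Example: echoes from three parts of the moving body (hypothetical times).
round_trip_times = [2.668e-3, 2.670e-3, 2.672e-3]   # seconds
distances = [range_from_round_trip(t) for t in round_trip_times]
print(distances)  # ranges in metres
```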
  • FIG. 5 is a flowchart showing a moving body observation method which is a processing procedure of the moving body observation apparatus shown in FIG.
  • The directing device 16 changes the directing direction of the imaging optical system 11 so that the imaging optical system 11 can receive the light beam 2 reflected by the moving body 1 or the light beam 2 transmitted from the moving body 1 even when the moving body 1 is moving relative to the telescope device 10.
  • the directing device 33 is configured so that the transmission / reception unit 31 can receive the reflected wave from the mobile unit 1 even when the mobile unit 1 is moving relative to the radar device 30. Change the pointing direction.
  • The time calibration unit 44 calibrates its built-in clock using a GPS signal transmitted from a GPS satellite or using NTP so that each of the directing device 16 and the directing device 33 can control the directing direction with arcsecond-level angular accuracy. When the time of the clock calibrated by the time calibration unit 44 reaches a certain time, the counter 45 measures the elapsed time from that time.
  • the trajectory calculation unit 47 predicts the position of the future time at which the mobile body 1 exists based on the trajectory information recorded in the trajectory information recording unit 46. Since the process itself for predicting the position of the future time at which the mobile body 1 exists from the trajectory of the mobile body 1 indicated by the trajectory information is a known technique, detailed description thereof is omitted.
  • the planning device 48 calculates the directivity direction of the imaging optical system 11 and the directivity direction of the transmission / reception unit 31 at a future time based on the position predicted by the trajectory calculation unit 47. Since the process itself for calculating the directivity direction from the predicted position of the moving body 1 is a known technique, a detailed description thereof will be omitted.
  • Based on the elapsed time measured by the counter 45 and the directing direction of the imaging optical system 11 at a future time calculated by the planning device 48, the control device 49 generates a control signal indicating the directing direction of the imaging optical system 11 and outputs the control signal to the directing device 16.
  • Similarly, based on the elapsed time measured by the counter 45 and the directing direction of the transmission/reception unit 31 at a future time calculated by the planning device 48, the control device 49 generates a control signal indicating the directing direction of the transmission/reception unit 31 and outputs the control signal to the directing device 33.
  • the directing device 16 changes the directing direction of the imaging optical system 11 according to the control signal output from the control device 49.
  • the directing device 33 changes the directing direction of the transmission / reception unit 31 in accordance with the control signal output from the control device 49.
  • the imaging optical system 11 condenses the light beam 2 (step ST1 in FIG. 5).
  • the light beam splitting unit 12 splits the light beam 2 into two by splitting the light amount of the light beam 2 collected by the imaging optical system 11 into two.
  • the beam splitting unit 12 outputs one split beam 2 to the intensity image acquisition unit 13 and outputs the other split beam 2 to the relay optical system 14.
  • When receiving the light beam 2 from the light beam splitting unit 12, the intensity image acquisition unit 13 detects the image of the moving body 1 from the light beam 2 and outputs an intensity image indicating the image of the moving body 1 to the image deformation unit 41 (step ST9 in FIG. 5). Time information indicating the detection time of the image of the moving body 1 is added to the intensity image output from the intensity image acquisition unit 13 to the image deformation unit 41.
  • The relay optical system 14 is an optical system in which the lens array 23 of the space dividing unit 21 is made optically equivalent to the pupil of the imaging optical system 11. When receiving the light beam 2 from the light beam splitting unit 12, the relay optical system 14 outputs the light beam 2 to the light shielding unit 22 of the space dividing unit 21.
  • The space dividing unit 21 divides the light beam 2 into a plurality of light beams 2a in a plurality of spatial regions, and outputs the light beams 2a in the plurality of spatial regions to the shutter 24 (step ST2 in FIG. 5). That is, when the light shielding unit 22 receives the light beam 2 from the relay optical system 14, the light shielding unit 22 partially shields the light beam 2 to divide the light beam 2 into the light beams 2a in the plurality of spatial regions. The light shielding unit 22 outputs the light beams 2a in the plurality of spatial regions to the lens array 23. Each lens 23a included in the lens array 23 receives the light beam 2a in each spatial region from the light shielding unit 22 and condenses the light beam 2a in each spatial region on the light receiving surface 25a of the photodetector 25.
  • the shutter 24 temporally limits the passage of the light beam 2a output from the lens array 23 in accordance with a control signal output from the control device 49 in order to adjust the light amount of the light beam 2a received by the photodetector 25.
  • The coherence time of atmospheric fluctuations is generally about 1 to 10 ms.
  • When the control device 49 limits the passage time of the light beam 2a through the shutter 24, the light amount of the light beam 2a received by the photodetector 25 may become small.
  • In that case, the control device 49 controls the shutter 24 so that passage and shielding of the light beam 2a are repeated a plurality of times, so that the photodetector 25 can detect the image of the moving body 1 a plurality of times.
  • The photodetector 25 detects a condensed spot image as an image of the moving body 1 from each light beam 2a that has passed through the shutter 24, and outputs an intensity image indicating the plurality of condensed spot images to the wavefront estimation unit 26.
  • FIG. 6 is an explanatory diagram showing an image of the moving body 1 detected by the photodetector 25.
  • FIG. 6 shows an example in which there are three atmospheric layers between the ground and the sky. 101 is a first atmospheric layer, 102 is a second atmospheric layer, and 103 is a third atmospheric layer. In FIG. 6, the change of the light flux by the imaging optical system 11 and the relay optical system 14 is omitted.
  • Each lens 23a included in the lens array 23 condenses the light beam 2a in each spatial region on the light receiving surface 25a of the photodetector 25, so that as many images 104 of the moving body 1 as there are spatial regions are formed on the light receiving surface 25a of the photodetector 25.
  • the plurality of images 104 formed on the light receiving surface 25a of the photodetector 25 has a spread due to atmospheric fluctuations, and can be used for wavefront estimation.
  • When the moving body 1 is moving relative to the telescope device 10 and the direction in which the moving body 1 is viewed from the ground differs, the propagation paths of the light beam 2 are not equal, and the wavefront differs depending on the propagation path.
  • FIG. 7 is an explanatory diagram showing an image of the moving body 1 detected by the photodetector 25 when the wavefront varies depending on the propagation path.
  • the same reference numerals as those in FIG. 6 denote the same or corresponding parts.
  • Each of the light beam 4, the light beam 5, and the light beam 6 is a light beam reflected by the moving body 1 or a light beam transmitted from the moving body 1.
  • Since the propagation path of the light beam 4, the propagation path of the light beam 5, and the propagation path of the light beam 6 are different from one another, the contributions of the fluctuations in the atmospheric layers to the light beam 4, the light beam 5, and the light beam 6 are also different from one another.
  • The light beams 2a in the plurality of spatial regions condensed by the lens array 23 on the light receiving surface 25a of the photodetector 25 correspond to the light beam 4, the light beam 5, and the light beam 6, and each of the light beam 4, the light beam 5, and the light beam 6 forms an image 105 of the moving body 1.
  • the plurality of images 105 formed on the light receiving surface 25a of the photodetector 25 has a spread due to atmospheric fluctuations, and can be used for wavefront estimation.
  • FIG. 8 and 9 are explanatory diagrams showing the relationship between the image of the moving body 1 detected by the photodetector 25 and the wavefront.
  • FIG. 8 shows an example in which the light beam 2 is propagated without spreading in the traveling direction
  • FIG. 9 shows an example in which the light beam 2 is propagated while spreading in the traveling direction.
  • Reference numeral 105a denotes the image of the moving body 1 when the light beam 2 propagates without spreading in the traveling direction, and 105b denotes the image of the moving body 1 when the light beam 2 propagates while spreading in the traveling direction.
  • As shown in FIG. 8, the position of each image 105a of the moving body 1 condensed by each lens 23a of the lens array 23 coincides with the position of the corresponding spatial region divided by the light shielding unit 22.
  • As shown in FIG. 9, the position of each image 105b of the moving body 1 condensed by each lens 23a of the lens array 23 is shifted from the position of the corresponding spatial region divided by the light shielding unit 22.
  • the wavefront 106a is obtained from the positions of the plurality of images 105a of the moving body 1
  • the wavefront 106b is obtained from the positions of the plurality of images 105b of the moving body 1.
  • FIGS. 8 and 9 show an example in which the plurality of spatial regions divided by the light shielding unit 22 are arranged in a grid pattern.
  • the arrangement is not limited to this.
  • the arrangement of the plurality of space regions may be a honeycomb arrangement.
  • FIGS. 8 and 9 show an example in which the light-shielding region where the light beam 2 is blocked, that is, the region other than the transmission region through which the light beam 2 passes, is painted black.
  • However, the light-shielding region only needs to prevent unnecessary light from being transmitted, and may be painted in a color other than black.
  • the light shielding region may be colored or processed to absorb unnecessary light, or may be colored or processed to scatter unnecessary light.
  • FIG. 10 is a flowchart showing a processing procedure of the wavefront estimation unit 26.
  • the processing content of the wavefront estimation unit 26 will be specifically described with reference to FIG.
  • When the wavefront estimation unit 26 receives an intensity image indicating a plurality of condensed spot images from the photodetector 25, it calculates an approximate value of the wavefront of the light beam 2 at the aperture of the imaging optical system 11 from the positions of the plurality of condensed spot images indicated by the intensity image (step ST11 in FIG. 10). Since the processing itself for calculating the approximate value of the wavefront from the positions of the plurality of condensed spot images is a known technique, detailed description thereof is omitted.
  • a method for estimating a wavefront from the positions of a plurality of focused spot images is disclosed in Non-Patent Document 1 below, for example. [Non-patent document 1] National Astronomical Observatory of Japan vol.2 No.2
  • control device 49 controls the shutter 24 so that the passage of the light beam 2 a and the light shielding are repeated a plurality of times, so that N intensity images are output from the photodetector 25 to the wavefront estimation unit 26.
  • As the position of each condensed spot image, for example, the position of the center of gravity (centroid) of the point image can be used.
  • Since the wavefront can be obtained from the intervals between the plurality of condensed spot images or from their relative positions, the position of each condensed spot image may also be determined from the cross-correlation of the plurality of condensed spot images or from the intervals between characteristic positions in the condensed spot images.
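  • As a minimal illustration of the centroid-based spot position described above (a sketch, not from the patent; array names are hypothetical), the center of gravity of one condensed spot image can be computed as an intensity-weighted mean:

```python
import numpy as np

def spot_centroid(sub_image: np.ndarray) -> tuple[float, float]:
    """Intensity-weighted center of gravity of one condensed spot image.

    sub_image: 2-D array holding the pixel intensities of one subaperture.
    Returns (x, y) in pixel coordinates.
    """
    total = sub_image.sum()
    ys, xs = np.indices(sub_image.shape)
    return float((xs * sub_image).sum() / total), float((ys * sub_image).sum() / total)

# The local wavefront slope in each subaperture is proportional to the shift of
# the centroid from its reference (undisturbed) position.
```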
  • The wavefront estimation unit 26 sets the phase of the wavefront that is the approximate value calculated from the positions of the plurality of condensed spot images indicated by the n-th intensity image to φ_{0,n}.
  • The wavefront estimation unit 26 uses the phase φ_{0,n} as the initial value of the phase φ_n(u, v) of the wavefront of the light beam 2 at the aperture of the imaging optical system 11, as shown in the following equation (1).
  • (U, v) are the coordinates of the pupil space.
  • FIG. 11 shows the relationship between the aperture of the imaging optical system 11, the respective apertures in a plurality of spatial regions, and the image 105 of the movable body 1 when the movable body 1 moves relative to the telescope device 10.
  • M_0(u, v) is the aperture of the imaging optical system 11.
  • M_1(u, v), M_2(u, v), ..., M_M(u, v) are the apertures of the plurality of spatial regions.
  • As shown in the following equation (2), the pupil function G_{m,n}(u, v), which represents the wavefront aberration and the amplitude distribution on the pupil, is expressed by the phase φ_n(u, v) of the wavefront of the light beam 2 at the aperture of the imaging optical system 11 corresponding to the n-th intensity image and by the aperture M_m(u, v). Since the aperture M_m(u, v) is known and the initial value of the phase φ_n(u, v) is the approximate wavefront phase φ_{0,n}, the pupil function G_{m,n}(u, v) can be calculated from the phase φ_n(u, v) and the aperture M_m(u, v).
  • The amplitude spread function a_{m,n}(x, y) is obtained by inverse Fourier transforming the pupil function G_{m,n}(u, v), as shown in the following equation (4).
  • F^{-1} is a symbol representing the inverse Fourier transform.
  • The point spread function k_{m,n}(x, y), which indicates the point image intensity distribution, is the product of the amplitude spread function a_{m,n}(x, y) and its complex conjugate, as shown in the following equation (5).
  • (x, y) are real space coordinates.
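  • As a rough numerical sketch of equations (2), (4), and (5) as described above (not from the patent; the complex-exponential pupil form, the FFT convention, and the example grid are assumptions):

```python
import numpy as np

def point_spread_function(aperture_mask: np.ndarray, phase: np.ndarray) -> np.ndarray:
    """Sketch of Eqs. (2), (4), (5): point spread function of one subaperture.

    aperture_mask: M_m(u, v), 1 inside the m-th subaperture, 0 outside.
    phase:         phi_n(u, v), wavefront phase at the aperture [rad].
    """
    # Eq. (2): pupil function from the amplitude distribution and the wavefront phase
    # (assumed complex-exponential form).
    pupil = aperture_mask * np.exp(1j * phase)
    # Eq. (4): amplitude spread function = inverse Fourier transform of the pupil.
    amplitude_spread = np.fft.ifftshift(np.fft.ifft2(np.fft.fftshift(pupil)))
    # Eq. (5): point spread function = amplitude spread times its complex conjugate.
    return np.real(amplitude_spread * np.conj(amplitude_spread))

# Example with a hypothetical 64x64 grid and a small tilt aberration.
n = 64
u, v = np.meshgrid(np.arange(n) - n / 2, np.arange(n) - n / 2)
mask = (np.hypot(u, v) < n / 4).astype(float)
psf = point_spread_function(mask, 0.2 * u)
```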
  • The image i_{m,n}(x, y) of the moving body 1 corresponding to the aperture M_m(u, v) is expressed by the following equation (6).
  • (p, q) are real space coordinates indicating the position where the moving body 1 exists.
  • the luminance distribution o (p, q) of the moving body 1 is the intensity of the light beam 2 reflected by the moving body 1 or the intensity of the light beam 2 transmitted from the moving body 1.
  • The image i_{m,n}(x, y) of the moving body 1 is the convolution of the point spread function k_{m,n}(x, y) and the luminance distribution o(p, q) of the moving body 1.
  • In equations (6) and (7), the noise e_{m,n}(x, y) generated in the photodetector 25 is added to this convolution.
  • The following equation (8) gives the sum of squared differences e between the image i_{m,n}(x, y) detected by the photodetector 25 and the image of the moving body 1 obtained from the luminance distribution o(p, q) of the moving body 1 and the point spread function k_{m,n}(x, y) indicating the point image intensity distribution.
  • The point spread function k_{m,n}(x, y) is obtained from equations (2), (4), and (5); therefore, in equation (8), the only unknown is the luminance distribution o(p, q) of the moving body 1.
  • The luminance distribution o(p, q) of the moving body 1 is obtained by searching for the o(p, q) that minimizes the sum of squared differences e.
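  • The display equations are not reproduced in this text. A plausible reconstruction of equations (6) and (8), consistent with the surrounding description but stated here as an assumption rather than quoted from the patent, is:

```latex
% Eq. (6) (reconstructed): image model, convolution of the luminance distribution
% with the point spread function plus detector noise
i_{m,n}(x,y) = \sum_{p,q} k_{m,n}(x-p,\,y-q)\, o(p,q) + e_{m,n}(x,y)

% Eq. (8) (reconstructed): sum of squared differences between detected and modelled images
e = \sum_{m,n}\,\sum_{x,y} \Bigl( i_{m,n}(x,y) - \sum_{p,q} k_{m,n}(x-p,\,y-q)\, o(p,q) \Bigr)^{2}
```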
  • the moving body 1 moves relative to the telescope device 10, and even if the directing device 16 changes the directing direction of the imaging optical system 11, the relative movement between the moving body 1 and the telescope device 10 is complete. Cannot be canceled. Therefore, the relative position of the moving body 1 changes as the time t changes.
  • It is assumed that the luminance distribution o(p, q) of the moving body 1 does not depend on the frame and does not change, whereas the wavefront changes from frame to frame.
  • When equation (8) is Fourier transformed, the sum of squared differences e shown in equation (8) becomes the sum of squared differences E in the spatial frequency space.
  • I_{m,n}(u, v) is the spectrum of i_{m,n}(x, y) and is expressed by the following equation (10).
  • F is a symbol representing the Fourier transform.
  • A coefficient is introduced to stabilize the solution.
  • K_{m,n}(u, v) is the autocorrelation of the pupil function G_{m,n}(u, v) and is expressed by the following equation (12).
  • K_{m,n}(u, v) is, although not normalized, the optical transfer function.
  • The sum of squared differences E shown in equation (13) is expressed by the aperture M_m(u, v), the phase φ_n(u, v), and the spectrum I_{m,n}(u, v) of the image i_{m,n}(x, y) of the moving body 1, and does not depend on the spectrum O(u, v) of the unknown luminance distribution o(p, q) of the moving body 1.
  • If the phase φ_n(u, v) that minimizes the sum of squared differences Err shown in the following equation (14) is obtained, the wavefront W_n(u, v) can be estimated by equation (3).
  • By obtaining the phase φ_n(u, v) that minimizes the sum of squared differences Err and thereby estimating the wavefront W_n(u, v), the luminance distribution o(p, q) of the moving body 1 can also be obtained.
  • However, equation (14) does not depend on the luminance distribution o(p, q) of the moving body 1. Therefore, the strong computational constraint that the luminance distribution o(p, q) of the moving body 1 is a real number larger than 0 in real space is not imposed.
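  • As a hedged reconstruction (not quoted from the patent), a frequency-space metric of the kind described above, with the unknown object spectrum O(u, v) eliminated by substituting its least-squares optimum and with a stabilizing coefficient γ, takes the form:

```latex
% Eqs. (13)/(14) (reconstructed form): Err depends only on the phases, through K_{m,n}
\mathrm{Err} = \sum_{u,v} \left[ \sum_{m,n} \bigl| I_{m,n}(u,v) \bigr|^{2}
  - \frac{\Bigl| \sum_{m,n} I_{m,n}(u,v)\, K_{m,n}^{*}(u,v) \Bigr|^{2}}
         {\sum_{m,n} \bigl| K_{m,n}(u,v) \bigr|^{2} + \gamma} \right]
```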
  • The main noise sources are shot noise and readout noise.
  • The readout noise follows a normal distribution with a mean of 0 and a standard deviation of σ, and the shot noise is proportional to the luminance of the acquired frame. Therefore, when equation (15) is normalized by the noise, equation (16) is obtained. If the ratio of the real-space difference r_{m,n}(x, y) to the noise is larger than 1, the deviation is large; if it is 1, there is no deviation; and if it is smaller than 1, the deviation is small.
  • a likelihood function shown in the following equation (17) is introduced.
  • d_m(x, y) is a weight given to the difference r_{m,n}(x, y). For example, a frame with a large deviation is given a low weight because its reliability is low.
  • The amount of calculation can be reduced by setting the weight of a region to be calculated to 1 and the weight of a region whose calculation is to be omitted to 0. The above is the principle of the wavefront estimation process and of the luminance distribution estimation process of the moving body 1.
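  • A plausible reconstruction of the normalized difference and the weighted sum described above (an assumption consistent with the text, not a verbatim copy of equations (16) and (17); the per-pixel noise level σ_{m,n}(x, y) combining readout and shot noise is also an assumption):

```latex
% Eq. (16) (reconstructed): difference normalized by the noise level
r_{m,n}(x,y) = \frac{ i_{m,n}(x,y) - \sum_{p,q} k_{m,n}(x-p,\,y-q)\, o(p,q) }{ \sigma_{m,n}(x,y) }

% Eq. (17) (reconstructed): weighted sum of squared normalized differences
e = \sum_{m,n}\,\sum_{x,y} d_{m}(x,y)\, r_{m,n}(x,y)^{2}
```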
  • When the approximate value of the wavefront phase is φ_{0,n}, the wavefront estimation unit 26 sets φ_{0,n} as the initial value of the phase φ_n(u, v) shown in equation (2).
  • The wavefront estimation unit 26 calculates the point spread function k_{m,n}(x, y) indicating the point image intensity distribution by calculating equations (2), (4), and (5) (step ST12 in FIG. 10).
  • The wavefront estimation unit 26 Fourier transforms the point spread function k_{m,n}(x, y) and the image i_{m,n}(x, y) of the moving body 1, thereby obtaining the optical transfer function K_{m,n}(u, v) and the spectrum I_{m,n}(u, v) of the image i_{m,n}(x, y) of the moving body 1 (step ST13 in FIG. 10).
  • The wavefront estimation unit 26 substitutes the optical transfer function K_{m,n}(u, v) and the spectrum I_{m,n}(u, v) into equation (14) and calculates the sum of squared differences Err (step ST14 in FIG. 10).
  • the wavefront estimating unit 26 determines whether or not the phase search process has converged (step ST15 in FIG. 10). As a convergence determination in the phase search process, for example, there is a method of determining that the convergence is achieved if the calculated square sum Err of the difference is equal to or less than a first allowable error set in advance. The sum of squares Err of the difference calculated when it is determined that it has converged is the minimum square sum Err.
  • The first allowable error is assumed to be stored in, for example, the internal memory of the wavefront estimation unit 26 or the recording device 43. Alternatively, as the convergence determination of the phase search process, the sum of squared differences Err may be calculated a predetermined number of times while changing the phase φ_n(u, v), and convergence may be determined when the minimum Err among the calculated values has been identified.
  • If the phase search process has not converged (step ST15 in FIG. 10: NO), the wavefront estimation unit 26 changes the phase φ_n(u, v) shown in equation (2) (step ST16 in FIG. 10), sets the changed phase φ_n(u, v) in equation (2), and performs the processes of steps ST12 to ST15 again.
  • The changed phase φ_n(u, v) may be any phase that has not yet been set in equation (2), but is desirably a phase that makes the sum of squared differences Err small. If the phase search process has converged (step ST15 in FIG. 10: YES), the wavefront estimation unit 26 ends the phase search process.
  • The wavefront estimation unit 26 substitutes the phase φ_n(u, v) for which the minimum sum of squared differences Err was calculated into equation (3), thereby estimating the wavefront W_n(u, v) of the light beam 2 at the aperture of the imaging optical system 11 (step ST17 in FIG. 10).
  • the estimated wavefront W n (u, v) is a wavefront with higher accuracy than the wavefront as the approximate value calculated in step ST11.
  • the wavefront estimation unit 26 outputs the wavefront W n (u, v) to the recording device 43.
  • Time information indicating the calculation time of the wavefront W_n(u, v) is added to the wavefront W_n(u, v) output from the wavefront estimation unit 26 to the recording device 43. Further, the wavefront estimation unit 26 outputs the point spread function k_{m,n}(x, y) corresponding to the phase φ_n(u, v) for which the minimum sum of squared differences Err was calculated to the moving body restoration unit 42. Time information indicating the calculation time of the point spread function k_{m,n}(x, y) is added to the point spread function k_{m,n}(x, y) output from the wavefront estimation unit 26 to the moving body restoration unit 42. Here, the wavefront estimation unit 26 outputs the point spread function k_{m,n}(x, y) to the moving body restoration unit 42, but it may instead output the wavefront W_n(u, v) to the moving body restoration unit 42.
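  • A schematic sketch of the search loop in steps ST12 to ST17 (not the patent's implementation; the random-perturbation search, the FFT-based optical transfer function, and the reconstructed error metric given earlier are assumptions):

```python
import numpy as np

def otf(aperture_mask, phase):
    """Optical transfer function K_{m,n}: Fourier transform of the PSF (cf. Eqs. (2), (4), (5), (12))."""
    pupil = aperture_mask * np.exp(1j * phase)        # Eq. (2), assumed complex-exponential form
    amplitude_spread = np.fft.ifft2(pupil)            # Eq. (4)
    psf = np.abs(amplitude_spread) ** 2               # Eq. (5)
    return np.fft.fft2(psf)

def err_metric(phase, masks, image_spectra, gamma=1e-3):
    """Frequency-space error Err for one frame over all subapertures m,
    using the hedged reconstruction of Eq. (14) given above."""
    otfs = [otf(mask, phase) for mask in masks]
    numerator = np.abs(sum(I * np.conj(K) for I, K in zip(image_spectra, otfs))) ** 2
    denominator = sum(np.abs(K) ** 2 for K in otfs) + gamma
    total_power = sum(np.abs(I) ** 2 for I in image_spectra)
    return float(np.sum(total_power - numerator / denominator))

def search_phase(phase0, masks, image_spectra, tol=1e-6, max_iter=200, step=0.05, seed=0):
    """Outline of steps ST12-ST17: perturb the phase, keep changes that lower Err,
    and stop when Err falls below the allowable error or the iteration limit is hit."""
    rng = np.random.default_rng(seed)
    phase, best = phase0.copy(), err_metric(phase0, masks, image_spectra)
    for _ in range(max_iter):
        trial = phase + step * rng.standard_normal(phase.shape)   # step ST16: change the phase
        err = err_metric(trial, masks, image_spectra)             # steps ST12 to ST14
        if err < best:
            phase, best = trial, err
        if best <= tol:                                           # step ST15: convergence check
            break
    return phase, best                                            # step ST17 uses the found phase
```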
  • the radar apparatus 30 receives the radar wave reflected by the moving body 1 and generates a distance image indicating the distance to the moving body 1 based on the radar wave (step ST5 in FIG. 5).
  • the transmission / reception unit 31 transmits a radar wave such as a microwave or a millimeter wave toward the moving body 1 and outputs the radar wave to the distance image acquisition unit 32.
  • the transmission / reception unit 31 receives the radar wave reflected by the moving body 1 as a reflected wave, and outputs the reflected wave to the distance image acquisition unit 32.
  • the distance from the radar apparatus 30 to the moving body 1 is directly proportional to the time from when the radar apparatus 30 transmits a radar wave to when the reflected wave is received.
  • the distance image acquisition unit 32 measures the time from when the radar wave is transmitted from the transmission / reception unit 31 until the reflected wave is received.
  • the distance image acquisition unit 32 generates a distance image indicating the distance to the moving body 1 based on the measured time, and outputs the distance image to the image deformation unit 41. Since the process itself for generating the distance image based on the measurement time is a known technique, detailed description thereof is omitted. Time information indicating the generation time of the distance image is added to the distance image output from the distance image acquisition unit 32 to the image deformation unit 41.
  • The image deformation unit 41 deforms the distance image so that the distance image output from the distance image acquisition unit 32 matches the intensity image output from the intensity image acquisition unit 13, and outputs the deformed distance image to the moving body restoration unit 42 (step ST6 in FIG. 5).
  • the distance image deformation process by the image deformation unit 41 will be specifically described below.
  • the image transformation unit 41 sequentially acquires intensity images at different times from the intensity image acquisition unit 13 and sequentially acquires distance images at different times from the distance image acquisition unit 32. For example, when the deformation process of the distance image at the generation time t c is performed, the image deformation unit 41 selects an intensity image whose detection time t d is closest to the generation time t c from the plurality of acquired intensity images.
  • the image deforming unit 41 acquires the status information of the telescope device 10 and the status information of the radar device 30 from the control device 49.
  • the status information of the telescope device 10 is information indicating the orientation direction of the imaging optical system 11, the scale of the intensity image, and the like.
  • the status information of the radar apparatus 30 is information indicating the directivity direction of the transmission / reception unit 31 and the scale of the distance image.
  • the image deformation unit 41 refers to the directivity direction of the imaging optical system 11 to confirm the direction of the intensity image, and refers to the directivity direction of the transmission / reception unit 31 to confirm the direction of the distance image.
  • The image deformation unit 41 performs affine transformation processing, such as rotation, on the distance image at the generation time t_c so that the orientation of the distance image matches the orientation of the selected intensity image.
  • The image deformation unit 41 then performs enlargement or reduction processing on the affine-transformed distance image so that its scale matches the scale of the selected intensity image.
  • Here, the image deformation unit 41 performs affine transformation processing and enlargement or reduction processing as the processing for deforming the distance image, but any processing may be used as long as the image deformation unit 41 deforms the distance image so as to align it with the intensity image.
  • Here, the image deformation unit 41 deforms the distance image so that the distance image output from the distance image acquisition unit 32 matches the intensity image output from the intensity image acquisition unit 13, but the image deformation unit 41 may instead deform the distance image so that it matches the intensity image output from the photodetector 25.
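  • A minimal sketch of the deformation step described above (not the patent's implementation; the rotation-plus-zoom decomposition, the cropping, and the function names are assumptions):

```python
import numpy as np
from scipy import ndimage

def deform_distance_image(distance_img: np.ndarray,
                          rotation_deg: float,
                          scale: float,
                          out_shape: tuple[int, int]) -> np.ndarray:
    """Rotate and rescale a distance image so that its orientation and scale
    match those of the selected intensity image (affine transform + zoom)."""
    rotated = ndimage.rotate(distance_img, rotation_deg, reshape=False, order=1)
    zoomed = ndimage.zoom(rotated, scale, order=1)
    # Crop or pad to the intensity-image grid so the two images overlay pixel to pixel.
    out = np.zeros(out_shape, dtype=zoomed.dtype)
    h = min(out_shape[0], zoomed.shape[0])
    w = min(out_shape[1], zoomed.shape[1])
    out[:h, :w] = zoomed[:h, :w]
    return out
```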
  • The moving body restoration unit 42 estimates the luminance distribution o(p, q) of the moving body 1 from the distance image output from the image deformation unit 41, the intensity image output from the intensity image acquisition unit 13, the plurality of condensed spot images detected by the photodetector 25, and the point spread function k_{m,n}(x, y) output from the wavefront estimation unit 26 (step ST7 in FIG. 5).
  • Here, the moving body restoration unit 42 estimates the luminance distribution o(p, q) of the moving body 1 using the point spread function k_{m,n}(x, y), but the luminance distribution o(p, q) of the moving body 1 may instead be estimated using the wavefront W_n(u, v) output from the wavefront estimation unit 26.
  • When estimating the luminance distribution o(p, q) of the moving body 1 using the wavefront W_n(u, v), the moving body restoration unit 42 calculates the phase φ_n(u, v) from the wavefront W_n(u, v) by using equation (3).
  • The moving body restoration unit 42 then substitutes the phase φ_n(u, v) into equation (2) and calculates equations (2), (4), and (5), thereby calculating the point spread function k_{m,n}(x, y).
  • The moving body restoration unit 42 estimates the luminance distribution o(p, q) of the moving body 1 from the distance image, the intensity image, the plurality of condensed spot images, and the calculated point spread function k_{m,n}(x, y).
  • FIG. 12 is a flowchart showing the processing procedure of the moving object restoring unit 42.
  • FIG. 13 is an explanatory diagram showing luminance distribution estimation processing of the moving object 1 by the moving object restoring unit 42.
  • the distance image 111 is a deformed distance image output from the image deforming unit 41.
  • the intensity image 112 is an intensity image output from the intensity image acquisition unit 13.
  • the moving body restoring unit 42 detects an area where the moving body 1 exists from the distance image 111 (step ST21 in FIG. 12). That is, the moving body restoration unit 42 performs a contour extraction process for extracting the contour of the moving body 1 from the distance image 111. Since the contour extraction process itself is a known technique, a detailed description thereof will be omitted.
  • the moving body restoration unit 42 sets the area inside the extracted outline as the area where the moving body 1 exists, and sets the area outside the outline as the area where the moving body 1 does not exist.
  • The moving body restoration unit 42 generates a mask image 113 indicating that only the area including the area where the moving body 1 exists is the processing target area to be used for the luminance distribution estimation process of the moving body 1 (step ST22 in FIG. 12).
  • The processing target area is an area including the area where the moving body 1 exists; the processing target area may be an area that coincides with the area where the moving body 1 exists, or may be an area larger than the area where the moving body 1 exists.
  • As an area larger than the area where the moving body 1 exists, for example, an area larger than the extracted contour of the moving body 1 by a margin corresponding to the shadow of the moving body 1 can be considered.
  • As the margin, for example, a size of about 10% of the area where the moving body 1 exists can be considered.
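  • A minimal sketch of the mask generation in steps ST21 and ST22 (not the patent's code; the non-zero-range test and the dilation-based margin are assumptions):

```python
import numpy as np
from scipy import ndimage

def processing_mask(distance_img: np.ndarray, margin_ratio: float = 0.10) -> np.ndarray:
    """Mask image: 1 inside the processing target area, 0 outside.

    distance_img: deformed distance image; pixels with a valid (non-zero) range are
                  treated as belonging to the moving body (assumption).
    margin_ratio: enlarge the detected region by roughly this fraction of its area.
    """
    body = distance_img > 0                      # region where the moving body exists
    # Number of dilation steps chosen so the added border is roughly margin_ratio of the area.
    steps = max(1, int(round(margin_ratio * np.sqrt(body.sum()) / 2)))
    target = ndimage.binary_dilation(body, iterations=steps)
    return target.astype(np.uint8)
```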
  • the moving body restoring unit 42 extracts the image im , n (x, y) of the moving body 1 in the processing target area in the mask image 113 from the intensity image 112 (step ST23 in FIG. 12).
  • the intensity image 114 shown in FIG. 13 is an intensity image showing the image im , n (x, y) of the moving body 1 in the processing target area extracted from the intensity image 112.
  • The moving body restoration unit 42 selects one image i_{m,n}(x, y) of the moving body 1 from the one or more images i_{m,n}(x, y) of the moving body 1 contained in the processing target area.
  • The moving body restoration unit 42 substitutes the selected image i_{m,n}(x, y) of the moving body 1 and the point spread function k_{m,n}(x, y) corresponding to the selected image i_{m,n}(x, y) into equation (16), and calculates the difference r_{m,n}(x, y). The moving body restoration unit 42 repeats the above processing until all the images i_{m,n}(x, y) of the one or more moving bodies 1 contained in the processing target area have been selected and the differences r_{m,n}(x, y) have been calculated.
  • The moving body restoration unit 42 substitutes all the calculated differences r_{m,n}(x, y) into equation (17) to calculate the sum of squared differences e (step ST24 in FIG. 12).
  • In equation (17), the moving body restoration unit 42 sets the weight d_m(x, y) corresponding to a difference r_{m,n}(x, y) inside the processing target area to 1, and sets the weight d_m(x, y) corresponding to a difference r_{m,n}(x, y) outside the processing target area to 0.
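  • As a sketch of step ST24 (an illustration under assumptions, not the patent's code), the mask image supplies the weights d_m(x, y), so pixels outside the processing target area do not contribute to e:

```python
import numpy as np

def weighted_squared_sum(residuals: list[np.ndarray], mask: np.ndarray) -> float:
    """Sum of squared differences e with weights d_m(x, y) taken from the mask.

    residuals: the normalized differences r_{m,n}(x, y), one array per (m, n).
    mask:      1 inside the processing target area, 0 outside (weight image).
    """
    return float(sum(np.sum(mask * r ** 2) for r in residuals))
```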
  • the moving body restoration unit 42 determines whether or not the luminance distribution estimation process of the moving body 1 has converged (step ST25 in FIG. 12).
  • As the convergence determination of the luminance distribution estimation process of the moving body 1, for example, there is a method of determining that convergence is achieved if the calculated sum of squared differences e is equal to or less than a second allowable error set in advance.
  • the sum of squares e of the difference calculated when it is determined that it has converged is the minimum square sum e.
  • the second allowable error is assumed to be stored in the internal memory of the moving body restoration unit 42 or the recording device 43, for example.
  • the square sum e of the difference is calculated a predetermined number of times while changing the luminance distribution o (p, q) of the mobile object 1, There is a method of determining that convergence is achieved when the minimum square sum e is specified among the calculated sums of squares e.
  • the moving object restoring unit 42 determines the luminance distribution o (p, q of the moving object 1 shown in Expression (16). ) Is changed (step ST26 in FIG. 12), and the processes of steps ST24 to ST25 are performed again.
  • the luminance distribution o (p, q) after the change may be any luminance distribution as long as it is not yet set in the equation (16), but the luminance distribution is such that the square sum e of the differences becomes small. It is desirable that If the luminance distribution estimation process of the moving object 1 has converged (in the case of step ST25 in FIG.
  • the moving object restoration unit 42 obtains the minimum square sum e as the result of the luminance distribution estimation process of the moving object 1.
  • the calculated luminance distribution o (p, q) of the moving body 1 is output to the recording device 43 (step ST27 in FIG. 12).
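  • As a concrete illustration of steps ST23 to ST27, the sketch below assumes that equation (16) is the residual between each focused light spot image i_{m,n}(x, y) and the convolution of the point spread function k_{m,n}(x, y) with a candidate luminance distribution o(p, q), and that equation (17) is the weighted sum of squared residuals with weights d_m(x, y) taken from the mask image; the random-perturbation update and the fixed iteration budget are illustrative choices rather than the procedure of the present disclosure.

```python
import numpy as np
from scipy.signal import fftconvolve  # convolution of the candidate o(p, q) with k_{m,n}(x, y)

def residual(i_mn, k_mn, o):
    """Assumed form of equation (16): observed focused light spot image minus the
    point-spread-function-blurred candidate luminance distribution."""
    return i_mn - fftconvolve(o, k_mn, mode="same")

def weighted_squared_error(images, psfs, o, d):
    """Assumed form of equation (17): sum of d_m(x, y) * r_{m,n}(x, y)^2 over all spot images."""
    return float(sum(np.sum(d * residual(i_mn, k_mn, o) ** 2) for i_mn, k_mn in zip(images, psfs)))

def estimate_luminance(images, psfs, mask, tolerance, max_iters=200, step=0.05, seed=0):
    """Sketch of steps ST23 to ST27: change o(p, q) until the squared error e converges."""
    rng = np.random.default_rng(seed)
    d = mask.astype(float)                # weight 1 inside the processing target area, 0 outside
    best_o = np.mean(images, axis=0) * d  # initial guess restricted to the processing target area
    best_e = weighted_squared_error(images, psfs, best_o, d)
    for _ in range(max_iters):
        if best_e <= tolerance:           # convergence test against the second allowable error
            break
        trial = np.clip(best_o + step * rng.standard_normal(best_o.shape) * d, 0.0, None)
        e = weighted_squared_error(images, psfs, trial, d)
        if e < best_e:                    # keep only changes that reduce the sum of squares e
            best_o, best_e = trial, e
    return best_o, best_e                 # luminance distribution giving the minimum sum of squares
```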
  • In this way, the moving body observation apparatus is configured so that the moving body restoration unit 42 detects the region where the moving body 1 exists from the distance image generated by the radar device 30, and estimates the luminance distribution of the moving body 1 from the focused light spot image in the region where the moving body 1 exists, among the plurality of focused light spot images detected by the photodetector 25, and from the wavefront estimated by the wavefront estimating unit 26. Therefore, the moving body observation apparatus can estimate the luminance distribution of the moving body 1 without mounting a control mechanism that realizes high-speed movement of a lenslet array or the like.
  • Embodiment 2
  • In the moving body observation apparatus of the first embodiment, the moving body restoration unit 42 extracts the contour of the moving body 1 from the distance image 111 and sets the area inside the contour as the area where the moving body 1 exists.
  • In the second embodiment, a moving body observation apparatus will be described in which a moving body restoration unit 70 detects, as the region where the moving body 1 exists, a region in which pixels having a distance value larger than a first threshold value and smaller than a second threshold value are gathered, among the plurality of pixels included in the distance image 111.
  • FIG. 14 is a configuration diagram illustrating a moving object observation apparatus according to the second embodiment.
  • The moving body restoration unit 70 is realized by, for example, a computer such as a personal computer, or by a moving body restoration circuit.
  • The moving body restoration unit 70 performs a process of acquiring trajectory information from the trajectory information recording unit 46 and determining, based on the trajectory information, a first threshold value L_ref1 indicating the lower limit of the distance and a second threshold value L_ref2 indicating the upper limit of the distance.
  • The moving body restoration unit 70 performs a process of detecting, as the region where the moving body 1 exists, a region in which pixels having a distance value larger than the first threshold value L_ref1 and smaller than the second threshold value L_ref2 are gathered, among the plurality of pixels included in the distance image 111.
  • The moving body restoration unit 70 performs a process of estimating the luminance distribution of the moving body 1 from the focused light spot image in the region where the moving body 1 exists, among the plurality of focused light spot images detected by the photodetector 25, and from the wavefront estimated by the wavefront estimating unit 26.
  • The moving body restoration unit 70 acquires trajectory information from the trajectory information recording unit 46.
  • When the moving body 1 is a meteorite, space debris, an artificial satellite, or the like that orbits the earth, the gravity acting on the moving body 1 is mainly the gravity of the earth.
  • Therefore, the locus of the movement of the moving body 1 is represented by a quadratic curve, as shown in FIG. 15.
  • The locus of the movement of the moving body 1 corresponds to the orbit of the moving body 1.
  • The quadratic curve is, for example, a parabola, a hyperbola, or an ellipse.
  • FIG. 15 is an explanatory diagram showing the trajectory of the moving body 1.
  • Orbit information indicating the orbits of celestial bodies such as stars and comets is assumed to be known here.
  • Orbit information indicating the orbits of moving bodies such as artificial satellites and the International Space Station is also assumed to be known here.
  • FIG. 16 is an explanatory diagram showing the luminance distribution estimation processing of the moving body 1 by the moving body restoration unit 70.
  • In FIG. 16, the same reference numerals as those in FIG. 13 denote the same or corresponding parts.
  • As shown in FIG. 16, each of L_1, L_2, and L_3 indicates the distance from the radar device 30 to a respective part of the moving body 1, and L_1 < L_2 < L_3 is satisfied.
  • The moving body restoration unit 70 identifies the position of the moving body 1 with reference to the trajectory information.
  • The moving body restoration unit 70 calculates an approximate value of the distance from the radar device 30 to the moving body 1 from the position of the radar device 30 and the position of the moving body 1.
  • The moving body restoration unit 70 sets (approximate distance value - ΔL) as the first threshold value L_ref1 and (approximate distance value + ΔL) as the second threshold value L_ref2, where ΔL is, for example, 1.0 to 2.5 times the maximum outer dimension of the moving body 1.
  • This setting is only an example, and it goes without saying that the first threshold value L_ref1 and the second threshold value L_ref2 may be set by other methods.
  • The maximum outer dimension of the moving body 1 is, for example, the horizontal dimension of the moving body 1 if the moving body 1 is a horizontally long object, and the vertical dimension of the moving body 1 if the moving body 1 is a vertically long object.
  • Here, it is assumed that L_ref1 < L_1 < L_2 < L_3 < L_ref2.
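  • As a small illustration of how the first threshold value L_ref1 and the second threshold value L_ref2 could be derived from the trajectory information, the helper below widens the approximate distance by a margin ΔL; the factor of 1.5 is just one value inside the 1.0 to 2.5 range mentioned above, and the function and variable names are assumptions.

```python
import numpy as np

def distance_thresholds(radar_position, body_position, max_outer_dimension, factor=1.5):
    """Sketch: derive L_ref1 (lower limit) and L_ref2 (upper limit) of the distance from the
    predicted position of the moving body; factor is assumed to lie in the 1.0-2.5 range."""
    approx = float(np.linalg.norm(np.asarray(body_position) - np.asarray(radar_position)))
    delta_l = factor * max_outer_dimension   # margin added around the approximate distance
    return approx - delta_l, approx + delta_l

# Example: a radar at the origin and a 10 m body predicted at about 500 km range.
l_ref1, l_ref2 = distance_thresholds((0.0, 0.0, 0.0), (0.0, 0.0, 500e3), max_outer_dimension=10.0)
```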
  • The moving body restoration unit 70 identifies, among the plurality of pixels included in the distance image 111, an area in which pixels having distance values larger than the first threshold value L_ref1 and smaller than the second threshold value L_ref2 are gathered.
  • The moving body restoration unit 70 sets the inside of the identified area as the area where the moving body 1 exists, and sets the outside of the identified area as the area where the moving body 1 does not exist.
  • The moving body restoration unit 70 generates the mask image 113 shown in FIG. 16, which indicates that only the area including the area where the moving body 1 exists is the processing target area used for the luminance distribution estimation process of the moving body 1.
  • The processing target area is an area that includes the area where the moving body 1 exists; it may be an area that matches the area where the moving body 1 exists, or an area larger than that area.
  • As an area larger than the area where the moving body 1 exists, an area that is larger than the contour of the detected moving body 1 by a margin corresponding to the shadow of the moving body 1 can be considered.
  • Thereafter, the moving body restoration unit 70 estimates the luminance distribution o(p, q) of the moving body 1 from the focused light spot image in the region where the moving body 1 exists and from the point spread function k_{m,n}(x, y) output from the wavefront estimating unit 26.
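  • A minimal sketch of the region detection in the second embodiment follows, assuming the thresholds from the helper above and the same mask convention as in the first embodiment; the pixel-wise comparison and the small dilation used as a margin are illustrative, not the implementation of the present disclosure.

```python
import numpy as np
from scipy.ndimage import binary_dilation  # used only to add a small margin around the area

def detect_body_region(distance_image, l_ref1, l_ref2, margin_pixels=2):
    """Sketch of the region detection by the moving body restoration unit 70: pixels whose
    distance value lies strictly between L_ref1 and L_ref2 form the area where the moving
    body exists, and a small margin turns it into the processing target area (mask image)."""
    body_region = (distance_image > l_ref1) & (distance_image < l_ref2)
    mask = binary_dilation(body_region, iterations=margin_pixels)
    return body_region, mask.astype(np.uint8)

# The resulting mask can then be fed to the same luminance estimation sketched for the
# first embodiment, e.g. estimate_luminance(images, psfs, mask, tolerance).
```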
  • The present invention is suitable for a moving body observation apparatus and a moving body observation method for estimating the luminance distribution of a moving body.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The present invention relates to a moving body observation device configured such that a moving body restoration unit (42) detects, from a distance image generated by a radar device (30), a region in which a moving body (1) is present, and estimates a luminance distribution of the moving body (1) from a focused light spot image in the region in which the moving body (1) is present, among a plurality of focused light spot images detected by a photodetector (25), and from a wavefront estimated by a wavefront estimation unit (26).
PCT/JP2018/019366 2018-05-18 2018-05-18 Dispositif et procédé d'observation de corps mobile WO2019220639A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2020518936A JP6771697B2 (ja) 2018-05-18 2018-05-18 移動体観測装置及び移動体観測方法
PCT/JP2018/019366 WO2019220639A1 (fr) 2018-05-18 2018-05-18 Dispositif et procédé d'observation de corps mobile

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/019366 WO2019220639A1 (fr) 2018-05-18 2018-05-18 Dispositif et procédé d'observation de corps mobile

Publications (1)

Publication Number Publication Date
WO2019220639A1 true WO2019220639A1 (fr) 2019-11-21

Family

ID=68540275

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/019366 WO2019220639A1 (fr) 2018-05-18 2018-05-18 Dispositif et procédé d'observation de corps mobile

Country Status (2)

Country Link
JP (1) JP6771697B2 (fr)
WO (1) WO2019220639A1 (fr)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5610707A (en) * 1995-07-07 1997-03-11 Lockheed Missiles & Space Co., Inc. Wavefront sensor for a staring imager
US7551121B1 (en) * 2004-03-12 2009-06-23 Oceanit Laboratories, Inc. Multi-target-tracking optical sensor-array technology
US7405834B1 (en) * 2006-02-15 2008-07-29 Lockheed Martin Corporation Compensated coherent imaging for improved imaging and directed energy weapons applications
JP2012514796A (ja) * 2009-01-05 2012-06-28 アプライド クウォンタム テクノロジイズ インク マルチスケール光学システム
US20120261514A1 (en) * 2010-12-17 2012-10-18 The Johns Hopkins University System and Method of Solar Flux Concentration for Orbital Debris Remediation
JP2016118547A (ja) * 2014-12-17 2016-06-30 ザ・ボーイング・カンパニーThe Boeing Company 大口径望遠鏡における、非同一平面上の(anisoplanatic)画像形成のためのレンズレット、ビームウォーク(beamwalk)、及びチルトの多様化

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
DAYTON, D.C. ET AL.: "Performance simulations of a daylight low-order adaptive optics system with speckle postprocessing for observation of low-earth orbit satellites", OPTICAL ENGINEERING, vol. 36, no. 7, July 1997 (1997-07-01), pages 1910 - 1917, XP000658595, DOI: 10.1117/1.601397 *

Also Published As

Publication number Publication date
JPWO2019220639A1 (ja) 2020-12-10
JP6771697B2 (ja) 2020-10-21

Similar Documents

Publication Publication Date Title
US10641897B1 (en) Ladar system and method with adaptive pulse duration
US8355536B2 (en) Passive electro-optical tracker
US9759605B2 (en) Low-orbit satellite-borne image-spectrum associated detection method and payload
JP6570991B2 (ja) 大口径望遠鏡における、非同一平面上の(anisoplanatic)画像形成のためのレンズレット、ビームウォーク(beamwalk)、及びチルトの多様化
JP6509456B1 (ja) 波面計測装置、波面計測方法及び移動体観測装置、移動体観測方法
US5350911A (en) Wavefront error estimation derived from observation of arbitrary unknown extended scenes
JP6632468B2 (ja) 移動体検出装置、観測システム及び移動体検出方法
Sun et al. Precise measurement of the light curves for space debris with wide field of view telescope
WO2019220639A1 (fr) Dispositif et procédé d'observation de corps mobile
JP2009509125A (ja) 画像に関連する位置を決定するための方法および装置
WO2011059530A2 (fr) Dispositif de suivi électro-optique passif
US20230055616A1 (en) Rayleigh-raman polychromatic laser guide star
JP6494885B1 (ja) 波面計測装置、波面計測方法及び移動体観測装置、移動体観測方法
JP6906732B1 (ja) 移動体撮像システム、運用計画設定装置および移動体撮像方法
US20240020852A1 (en) Initial orbit determination using angular velocity and angular acceleration measurements
JP7038940B1 (ja) 形状姿勢推定装置、形状姿勢推定方法及び形状姿勢推定システム
US11448483B1 (en) Projectile tracking and 3D traceback method
Conran et al. A New Technique to Define the Spatial Resolution of Imaging Sensors
US10366506B2 (en) Hyperacuity system and methods for real time and analog detection and kinematic state tracking
Brewer et al. Infrared seeker/sensor dynamic performance prediction model
Delaite et al. Performance of an Optical COTS Station for the wide-field Detection of Resident Space Objects
Allured et al. Measurements of the isopistonic angle using masked aperture interferometry
CN117495933A (zh) 基于视差修正的光电望远镜外挂镜头图像实时配准方法
Ellis et al. Star sensing for an earth imaging sensor
Belen’kii et al. Cross-path LIDAR for turbulence profile determination

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18918914

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020518936

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18918914

Country of ref document: EP

Kind code of ref document: A1