WO2019220639A1 - Moving body observation device and moving body observation method - Google Patents

Moving body observation device and moving body observation method

Info

Publication number: WO2019220639A1
Authority: WO (WIPO PCT)
Prior art keywords: moving body, image, unit, wavefront, light beam
Application number: PCT/JP2018/019366
Other languages: French (fr), Japanese (ja)
Inventors: 貴雄 遠藤, 俊行 安藤, 隆 高根澤, 豊 江崎
Original Assignee: Mitsubishi Electric Corporation (三菱電機株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Mitsubishi Electric Corporation (三菱電機株式会社)
Priority to JP2020518936A (JP6771697B2)
Priority to PCT/JP2018/019366 (WO2019220639A1)
Publication of WO2019220639A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01J - MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J9/00 - Measuring optical phase difference; Determining degree of coherence; Measuring optical wavelength
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder

Definitions

  • The present invention relates to a moving body observation apparatus and a moving body observation method for estimating the luminance distribution of a moving body.
  • A moving body observation apparatus that observes a moving body such as a celestial body or a flying object does so by receiving, on the ground, a light beam reflected by the moving body or a light beam emitted from the moving body.
  • The light beam from the moving body may spread because the phase of the light is disturbed by fluctuations of the refractive index distribution in the atmosphere. Therefore, to increase the observation accuracy of the moving body, the moving body observation apparatus needs to acquire the wavefront, that is, a surface over which the light has the same phase.
  • Patent Document 1 below discloses a wavefront sensor that measures such a wavefront.
  • When observing an object moving at high speed, the wavefront sensor disclosed in Patent Document 1 uses a telescope to track the light flux from the object by controlling the tilt and focus of the wavefront, while imaging the atmospheric turbulence layers.
  • The wavefront sensor includes a control mechanism that realizes high-speed movement of each of a high-speed steering mirror, an imaging lens, and a lenslet array as the mechanism for controlling the tilt and focus of the wavefront.
  • The measurement accuracy of the wavefront therefore depends on the control accuracy of the control mechanism that moves the high-speed steering mirror, the imaging lens, and the lenslet array, so in the conventional moving body observation device the wavefront measurement accuracy of the wavefront sensor may deteriorate depending on the control accuracy of the control mechanism.
  • The present invention has been made to solve the above problem, and an object thereof is to obtain a moving body observation apparatus and a moving body observation method capable of estimating the luminance distribution of a moving body without mounting a control mechanism that realizes high-speed movement of components such as a lenslet array.
  • A moving body observation apparatus according to the present invention includes: an imaging optical system that condenses a light beam reflected by a moving body or a light beam emitted from the moving body; a space dividing unit that divides the light beam condensed by the imaging optical system into light beams in a plurality of spatial regions; a photodetector that detects a condensed spot image as an image of the moving body from each of the light beams divided by the space dividing unit; a wavefront estimation unit that estimates the wavefront of the light beam at the aperture of the imaging optical system from the positions of the plurality of condensed spot images detected by the photodetector; a radar device that receives the radar wave reflected by the moving body and generates, based on the radar wave, a distance image indicating the distance to the moving body; and a moving body restoration unit that detects the region where the moving body exists from the distance image generated by the radar device and estimates the luminance distribution of the moving body from the condensed spot image in that region, among the plurality of condensed spot images detected by the photodetector, and the wavefront estimated by the wavefront estimation unit.
  • Since the moving body restoration unit detects the region where the moving body exists from the distance image generated by the radar device and estimates the luminance distribution of the moving body from the condensed spot image in that region and the estimated wavefront, the moving body observation apparatus according to the present invention can estimate the luminance distribution of the moving body without mounting a control mechanism that realizes high-speed movement of components such as a lenslet array.
  • FIG. 1 is a configuration diagram illustrating a moving body observation apparatus according to Embodiment 1.
  • FIG. 2 is a configuration diagram illustrating the wavefront measuring unit 15 of the moving body observation apparatus according to Embodiment 1.
  • FIG. 3 is a perspective view showing the appearance of the moving body observation apparatus according to Embodiment 1.
  • FIG. 4 is an explanatory diagram showing how the telescope device 10 and the radar device 30 each observe the moving body 1.
  • FIG. 5 is a flowchart showing a moving body observation method, which is the processing procedure of the moving body observation apparatus shown in FIG. 1.
  • FIG. 6 is an explanatory diagram showing the image of the moving body 1 detected by the photodetector 25.
  • FIG. 7 is an explanatory diagram showing the image of the moving body 1 detected by the photodetector 25 when the wavefront differs depending on the propagation path.
  • FIGS. 8 and 9 are explanatory diagrams showing the relationship between the image of the moving body 1 detected by the photodetector 25 and the wavefront.
  • FIG. 10 is a flowchart showing the processing procedure of the wavefront estimation unit 26.
  • FIG. 11 is an explanatory diagram showing the relationship between the aperture of the imaging optical system 11, the apertures of the plurality of spatial regions, and the images 105 of the moving body 1.
  • FIG. 12 is a flowchart showing the processing procedure of the moving body restoration unit 42.
  • FIG. 13 is an explanatory diagram showing the luminance distribution estimation process of the moving body 1 by the moving body restoration unit 42.
  • FIG. 14 is a configuration diagram illustrating a moving body observation apparatus according to Embodiment 2.
  • FIG. 15 is an explanatory diagram showing the trajectory of the moving body 1.
  • FIG. 16 is an explanatory diagram showing the luminance distribution estimation process of the moving body 1 by the moving body restoration unit 70.
  • Embodiment 1.
  • FIG. 1 is a configuration diagram illustrating the moving body observation apparatus according to Embodiment 1, and FIG. 2 is a configuration diagram illustrating the wavefront measuring unit 15 of the moving body observation apparatus according to Embodiment 1.
  • In FIGS. 1 and 2, the moving body 1 is an object such as a celestial body outside the atmosphere or an object existing in the atmosphere.
  • The light beam reflected by the moving body 1 or emitted from the moving body 1 is a light beam 2 that has spread because of fluctuations of the refractive index distribution in the atmosphere.
  • The light beam 2 is incident on the telescope device 10.
  • The telescope device 10 is a device that measures the wavefront of the light beam 2 at the aperture of the imaging optical system 11 and acquires an intensity image showing the image of the moving body 1.
  • The telescope device 10 includes an imaging optical system 11, a light beam splitting unit 12, an intensity image acquisition unit 13, a relay optical system 14, a wavefront measuring unit 15, and a directing device 16.
  • The imaging optical system 11 is an optical system that condenses the incident light beam 2.
  • The light beam splitting unit 12 is realized by, for example, a beam splitter.
  • The light beam splitting unit 12 splits the light beam 2 condensed by the imaging optical system 11 into two by dividing its light amount or its wavelength.
  • The light beam splitting unit 12 outputs one split light beam 2 to the intensity image acquisition unit 13 and the other split light beam 2 to the relay optical system 14.
  • The intensity image acquisition unit 13 is realized by, for example, an image sensor.
  • The intensity image acquisition unit 13 detects the image of the moving body 1 from the light beam 2 output from the light beam splitting unit 12 and outputs an intensity image showing the image of the moving body 1 to the image deformation unit 41.
  • The relay optical system 14 is an optical system that makes the lens array 23 of the wavefront measuring unit 15 optically equivalent to the pupil of the imaging optical system 11, so that the light beam 2 output from the light beam splitting unit 12 is brought into focus at the lens array 23.
  • The wavefront measuring unit 15 includes a space dividing unit 21, a shutter 24, a photodetector 25, and a wavefront estimation unit 26.
  • The space dividing unit 21 includes a light shielding unit 22 and a lens array 23.
  • The space dividing unit 21 divides the light beam 2 output from the relay optical system 14 into light beams 2a in a plurality of spatial regions, so that the light beams 2a in the plurality of spatial regions are received by the light receiving surface 25a (see FIG. 8 or FIG. 9).
  • The light shielding unit 22 partially shields the light beam 2 output from the relay optical system 14, thereby dividing the light beam 2 into the light beams 2a in the plurality of spatial regions.
  • The lens array 23 includes a plurality of lenses 23a (see FIGS. 6 to 9), and each lens 23a condenses the light beam 2a of its spatial region on the light receiving surface 25a of the photodetector 25.
  • The shutter 24 temporally limits the passage of the light beams 2a output from the lens array 23 in order to adjust the light amount of the light beams 2a received by the photodetector 25.
  • The photodetector 25 is realized by, for example, an image sensor.
  • The photodetector 25 has a light receiving surface 25a that receives each of the light beams 2a in the plurality of spatial regions that have passed through the shutter 24.
  • The photodetector 25 detects a condensed spot image as an image of the moving body 1 from each light beam 2a received by the light receiving surface 25a, and outputs an intensity image showing the condensed spot images to the wavefront estimation unit 26 and to the moving body restoration unit 42.
  • The wavefront estimation unit 26 is realized by a computer such as a personal computer or by a wavefront estimation circuit.
  • The wavefront estimation unit 26 performs a process of estimating the wavefront of the light beam at the aperture of the imaging optical system 11 from the positions of the plurality of condensed spot images indicated by the intensity image output from the photodetector 25. That is, the wavefront estimation unit 26 calculates an approximate value of the wavefront of the light beam 2 at the aperture of the imaging optical system 11 from the positions of the plurality of condensed spot images.
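  • As an illustration of this approximation step, the following sketch (Python with hypothetical names; the patent does not specify any implementation) converts the displacement of each condensed spot image into a local wavefront slope and integrates the slopes into a coarse wavefront map:

    import numpy as np

    def approximate_wavefront(spot_xy, ref_xy, focal_length, pitch):
        # Spot displacement is proportional to the local wavefront tilt
        # (small-angle approximation): slope = displacement / focal_length.
        slopes = (spot_xy - ref_xy) / focal_length      # (rows, cols, 2)
        sx, sy = slopes[..., 0], slopes[..., 1]
        # Integrate the slopes along x and along y and average the two
        # paths (a crude zonal reconstruction of the wavefront, in metres).
        wx = np.cumsum(sx, axis=1) * pitch
        wy = np.cumsum(sy, axis=0) * pitch
        w = 0.5 * (wx + wy)
        return w - w.mean()                             # remove piston

    # Example: a pure x-tilt over a 4 x 4 grid of spatial regions,
    # lens focal length 0.1 m, spatial-region pitch 0.01 m.
    ref = np.zeros((4, 4, 2))
    spots = ref.copy()
    spots[..., 0] += 1e-7                               # 0.1 um displacement
    print(approximate_wavefront(spots, ref, 0.1, 0.01))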
  • The directing device 16 is a device that changes the pointing direction of the imaging optical system 11 in accordance with a control signal output from the control device 49.
  • The radar device 30 is realized by, for example, an inverse synthetic aperture radar (ISAR).
  • The radar device 30 includes a transmission/reception unit 31, a distance image acquisition unit 32, and a directing device 33.
  • The transmission/reception unit 31 is realized by an antenna, a modulator, a demodulator, a wireless device, and the like.
  • The transmission/reception unit 31 transmits a radar wave such as a microwave or a millimeter wave toward the moving body 1 and receives the radar wave reflected by the moving body 1 as a reflected wave.
  • The distance image acquisition unit 32 performs a process of generating a distance image indicating the distance to the moving body 1 based on the time from when the radar wave is transmitted by the transmission/reception unit 31 until the reflected wave is received, and of outputting the distance image to the image deformation unit 41.
  • The directing device 33 is a device that changes the pointing direction of the transmission/reception unit 31 in accordance with a control signal output from the control device 49.
  • The operation device 40 includes an image deformation unit 41, a moving body restoration unit 42, a recording device 43, a time calibration unit 44, a counter 45, a trajectory information recording unit 46, a trajectory calculation unit 47, a planning device 48, and a control device 49.
  • The image deformation unit 41 is realized by a computer such as a personal computer or by an image deformation circuit.
  • The image deformation unit 41 performs a process of deforming the distance image output from the distance image acquisition unit 32 so that it matches the intensity image output from the intensity image acquisition unit 13, and of outputting the deformed distance image to the moving body restoration unit 42.
  • The moving body restoration unit 42 is realized by a computer such as a personal computer or by a moving body restoration circuit.
  • The moving body restoration unit 42 performs a process of detecting the region where the moving body 1 exists from the deformed distance image output from the image deformation unit 41.
  • The moving body restoration unit 42 then performs a process of estimating the luminance distribution of the moving body 1 from the image in the region where the moving body 1 exists in the intensity image output from the intensity image acquisition unit 13, the plurality of condensed spot images indicated by the intensity image output from the photodetector 25, and the wavefront estimated by the wavefront estimation unit 26.
  • The recording device 43 is realized by, for example, a recording processing circuit.
  • The recording device 43 is a device that records the wavefront estimated by the wavefront estimation unit 26 and the luminance distribution of the moving body 1 estimated by the moving body restoration unit 42.
  • The time calibration unit 44 has a built-in clock and calibrates the clock time using a GPS signal transmitted from a GPS (Global Positioning System) satellite or using NTP (Network Time Protocol).
  • The trajectory information recording unit 46 is realized by, for example, a recording processing circuit.
  • The trajectory information recording unit 46 records trajectory information indicating the trajectory of the moving body 1.
  • The trajectory calculation unit 47 is realized by a computer such as a personal computer or by a trajectory calculation circuit.
  • The trajectory calculation unit 47 performs a process of predicting the position of the moving body 1 at a future time based on the trajectory information recorded in the trajectory information recording unit 46.
  • The planning device 48 is realized by a computer such as a personal computer.
  • The planning device 48 is a device that calculates the pointing direction of the imaging optical system 11 and the pointing direction of the transmission/reception unit 31 at the future time based on the position predicted by the trajectory calculation unit 47.
  • The control device 49 is realized by a computer such as a personal computer. Based on the elapsed time measured by the counter 45 and the calculation result of the planning device 48, the control device 49 outputs a control signal indicating the pointing direction of the imaging optical system 11 to the directing device 16 and outputs a control signal indicating the pointing direction of the transmission/reception unit 31 to the directing device 33.
  • The control device 49 is also a device that controls each of the intensity image acquisition unit 13, the wavefront measuring unit 15, the distance image acquisition unit 32, and the image deformation unit 41 based on the elapsed time measured by the counter 45.
  • Each of the wavefront estimation circuit, the image deformation circuit, the moving body restoration circuit, and the trajectory calculation circuit corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.
  • The recording processing circuit corresponds to, for example, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), or to a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, or a DVD (Digital Versatile Disc).
  • FIG. 3 is a perspective view showing an appearance of the moving object observation apparatus according to Embodiment 1.
  • In the casing 51, the imaging optical system 11, the light beam splitting unit 12, the intensity image acquisition unit 13, the relay optical system 14, the space dividing unit 21, the shutter 24, and the photodetector 25 are mounted.
  • In the casing 52, the wavefront estimation unit 26, the distance image acquisition unit 32, the image deformation unit 41, the moving body restoration unit 42, the recording device 43, the time calibration unit 44, the counter 45, the trajectory information recording unit 46, the trajectory calculation unit 47, the planning device 48, and the control device 49 are mounted.
  • Optical components such as lenses, and the human pupil, transmit light.
  • Optical components such as mirrors reflect light.
  • The atmosphere of the earth is composed of media such as oxygen, nitrogen, and water vapor, and transmits light in the same manner as an optical component such as a lens. Since the refractive index of a medium such as oxygen fluctuates with changes in temperature and atmospheric pressure, the phase distribution of light transmitted through the earth's atmosphere changes with changes in temperature and atmospheric pressure. Since light is an electromagnetic wave, the phase distribution of light can be treated as a wavefront.
  • The telescope device 10 shown in FIG. 1 estimates the wavefront by receiving the light beam 2 reflected by the moving body 1 existing outside or inside the atmosphere, or the light beam 2 emitted from the moving body 1.
  • The wavefront estimated by the telescope device 10 changes as the refractive index of a medium such as oxygen changes.
  • Although the change in the refractive index of the medium itself is small, when the optical path through which the light propagates becomes long, the accumulated phase change is no longer negligible compared to the wavelength of the light, so the light is strongly affected by the refractive index fluctuations.
  • The atmosphere near the ground is affected by solar radiation and heat transport, and also by the rotation of the earth, so that atmospheric layers are formed between the ground and the sky.
  • The wavefront of light passing through these atmospheric layers is disturbed in a complex manner.
  • In contrast, the radar waves transmitted and received by the radar device 30 are hardly affected by atmospheric fluctuations.
  • FIG. 4 is an explanatory diagram showing a state in which each of the telescope device 10 and the radar device 30 observes the moving body 1.
  • In FIG. 4, the same reference numerals as those in FIGS. 1 and 3 denote the same or corresponding parts.
  • The moving body 1 is irradiated with light from the sun, which serves as the illumination light source 60.
  • The reflected sunlight of wavelength λ1 reflected by the moving body 1 passes through the atmospheric layers between the ground and the sky and reaches the telescope device 10.
  • FIG. 4 shows an example in which there are two atmospheric layers between the ground and the sky, where 101 is a first atmospheric layer and 102 is a second atmospheric layer.
  • The imaging optical system 11 condenses the incident light beam 2 of wavelength λ1 on the light receiving surface 25a of the photodetector 25, so that an image of the moving body 1 is formed on the light receiving surface 25a. Since the wavefront is disturbed when the light beam 2 passes through the first atmospheric layer 101 and the second atmospheric layer 102, the image of the moving body 1 spreads even if the moving body 1 can be regarded as a point. Therefore, excluding the spread of the image caused by the aberration of the imaging optical system 11 and the spread caused by the resolution of the photodetector 25, the cause of the spread of the image of the moving body 1 is atmospheric fluctuation.
  • When the moving body 1 is an object having a spatial extent, and the spread due to the aberration of the imaging optical system 11 and the resolution of the photodetector 25 is excluded, the spread of the image of the moving body 1 is expressed by the convolution of the spread of the object itself and the spread due to atmospheric fluctuations.
  • Each of the plurality of condensed spot images detected by the photodetector 25 is spread by atmospheric fluctuation, so the intensity image showing the plurality of condensed spot images becomes an intensity image 62 as shown in FIG. 4.
  • Reference numeral 61 denotes the shape of the moving body 1 as seen from the telescope device 10 and the radar device 30.
  • The radar device 30 can calculate the distance to the moving body 1 from the time between transmitting the radar wave of wavelength λ2 and receiving the reflected wave, that is, the radar wave reflected by the moving body 1.
  • Each of L1, L2, and L3 indicates the distance to a part of the moving body 1.
  • The radar device 30 can obtain a distance image 63 as shown in FIG. 4 by calculating the distance to each part of the moving body 1.
  • Since the radar wave of wavelength λ2 is hardly affected by atmospheric fluctuations, the distance image 63 is an image with no spread due to atmospheric turbulence.
  • FIG. 5 is a flowchart showing a moving body observation method which is a processing procedure of the moving body observation apparatus shown in FIG.
  • The directing device 16 changes the pointing direction of the imaging optical system 11 so that the imaging optical system 11 can receive the light beam 2 reflected by the moving body 1 or the light beam 2 emitted from the moving body 1.
  • The directing device 33 changes the pointing direction of the transmission/reception unit 31 so that the transmission/reception unit 31 can receive the reflected wave from the moving body 1 even when the moving body 1 is moving relative to the radar device 30.
  • The time calibration unit 44 calibrates its clock using a GPS signal transmitted from a GPS satellite or using NTP so that each of the directing device 16 and the directing device 33 can control the pointing direction with arc-second angular accuracy. When the time of the clock calibrated by the time calibration unit 44 reaches a certain time, the counter 45 measures the elapsed time from that time.
  • The trajectory calculation unit 47 predicts the position of the moving body 1 at a future time based on the trajectory information recorded in the trajectory information recording unit 46. Since the process of predicting the future position of the moving body 1 from the trajectory indicated by the trajectory information is itself a known technique, a detailed description is omitted.
  • The planning device 48 calculates the pointing direction of the imaging optical system 11 and the pointing direction of the transmission/reception unit 31 at the future time based on the position predicted by the trajectory calculation unit 47. Since the process of calculating the pointing directions from the predicted position of the moving body 1 is itself a known technique, a detailed description is omitted.
  • Based on the elapsed time measured by the counter 45 and the pointing direction of the imaging optical system 11 at the future time calculated by the planning device 48, the control device 49 outputs a control signal indicating the pointing direction of the imaging optical system 11 to the directing device 16.
  • Similarly, based on the elapsed time measured by the counter 45 and the pointing direction of the transmission/reception unit 31 at the future time calculated by the planning device 48, the control device 49 outputs a control signal indicating the pointing direction of the transmission/reception unit 31 to the directing device 33.
  • The directing device 16 changes the pointing direction of the imaging optical system 11 in accordance with the control signal output from the control device 49.
  • The directing device 33 changes the pointing direction of the transmission/reception unit 31 in accordance with the control signal output from the control device 49.
  • The imaging optical system 11 condenses the light beam 2 (step ST1 in FIG. 5).
  • The light beam splitting unit 12 splits the light beam 2 into two by dividing the light amount of the light beam 2 condensed by the imaging optical system 11.
  • The light beam splitting unit 12 outputs one split light beam 2 to the intensity image acquisition unit 13 and the other split light beam 2 to the relay optical system 14.
  • Upon receiving the light beam 2, the intensity image acquisition unit 13 detects the image of the moving body 1 from the light beam 2 and outputs an intensity image showing the image of the moving body 1 to the image deformation unit 41 (step ST9 in FIG. 5). Time information indicating the detection time of the image of the moving body 1 is added to the intensity image output from the intensity image acquisition unit 13 to the image deformation unit 41.
  • The relay optical system 14 is an optical system in which the lens array 23 of the space dividing unit 21 is made optically equivalent to the pupil of the imaging optical system 11. Upon receiving the light beam 2 from the light beam splitting unit 12, the relay optical system 14 outputs the light beam 2 to the light shielding unit 22 of the space dividing unit 21.
  • The space dividing unit 21 divides the light beam 2 into light beams 2a in a plurality of spatial regions and outputs the light beams 2a in the plurality of spatial regions to the shutter 24 (step ST2 in FIG. 5). That is, upon receiving the light beam 2 from the relay optical system 14, the light shielding unit 22 partially shields the light beam 2 to divide it into the light beams 2a in the plurality of spatial regions, and outputs the light beams 2a to the lens array 23. Each lens 23a included in the lens array 23 receives the light beam 2a of its spatial region from the light shielding unit 22 and condenses it on the light receiving surface 25a of the photodetector 25.
  • The shutter 24 temporally limits the passage of the light beams 2a output from the lens array 23 in accordance with a control signal output from the control device 49, in order to adjust the light amount of the light beams 2a received by the photodetector 25.
  • The coherence time is generally about 1 to 10 ms.
  • When the control device 49 limits the passage time of the light beams 2a through the shutter 24 accordingly, the light amount of the light beams 2a received by the photodetector 25 may be small.
  • In that case, the control device 49 controls the shutter 24 so that passage and shielding of the light beams 2a are repeated a plurality of times, so that the photodetector 25 can detect the image a plurality of times.
  • The photodetector 25 detects a condensed spot image as an image of the moving body 1 from each light beam 2a that has passed through the shutter 24 and outputs an intensity image showing the plurality of condensed spot images to the wavefront estimation unit 26.
  • FIG. 6 is an explanatory diagram showing an image of the moving body 1 detected by the photodetector 25.
  • FIG. 6 shows an example in which there are three atmospheric layers between the ground and the sky: 101 is a first atmospheric layer, 102 is a second atmospheric layer, and 103 is a third atmospheric layer. In FIG. 6, the change of the light beam caused by the imaging optical system 11 and the relay optical system 14 is omitted.
  • Each lens 23a included in the lens array 23 condenses the light beam 2a of its spatial region on the light receiving surface 25a of the photodetector 25, so that as many images 104 of the moving body 1 as there are spatial regions are formed on the light receiving surface 25a.
  • Each of the plurality of images 104 formed on the light receiving surface 25a of the photodetector 25 has a spread due to atmospheric fluctuations and can be used for wavefront estimation.
  • When the moving body 1 is moving relative to the telescope device 10 and the direction in which the moving body 1 is viewed from the ground differs, the propagation paths of the light beam 2 are not equal, and the wavefront differs depending on the propagation path.
  • FIG. 7 is an explanatory diagram showing an image of the moving body 1 detected by the photodetector 25 when the wavefront varies depending on the propagation path.
  • In FIG. 7, the same reference numerals as those in FIG. 6 denote the same or corresponding parts.
  • Each of the light beams 4, 5, and 6 is a light beam reflected by the moving body 1 or a light beam emitted from the moving body 1.
  • The light beams 4, 5, and 6 propagate along mutually different paths and therefore receive mutually different contributions from the fluctuations of the atmospheric layers.
  • The light beams 2a in the plurality of spatial regions condensed by the lens array 23 on the light receiving surface 25a of the photodetector 25 correspond to the light beams 4, 5, and 6, and each of the light beams 4, 5, and 6 forms an image 105 of the moving body 1.
  • Each of the plurality of images 105 formed on the light receiving surface 25a of the photodetector 25 has a spread due to atmospheric fluctuations and can be used for wavefront estimation.
  • FIGS. 8 and 9 are explanatory diagrams showing the relationship between the image of the moving body 1 detected by the photodetector 25 and the wavefront.
  • FIG. 8 shows an example in which the light beam 2 propagates without spreading in the traveling direction, and FIG. 9 shows an example in which the light beam 2 propagates while spreading in the traveling direction.
  • 105a denotes the image of the moving body 1 when the light beam 2 propagates without spreading in the traveling direction, and 105b denotes the image of the moving body 1 when the light beam 2 propagates while spreading in the traveling direction.
  • As shown in FIG. 8, when the light beam 2 propagates without spreading, the position of each image 105a condensed by a lens 23a of the lens array 23 coincides with the position of the corresponding spatial region divided by the light shielding unit 22.
  • As shown in FIG. 9, when the light beam 2 propagates while spreading, the position of each image 105b condensed by a lens 23a of the lens array 23 is shifted from the position of the corresponding spatial region divided by the light shielding unit 22.
  • The wavefront 106a is obtained from the positions of the plurality of images 105a of the moving body 1, and the wavefront 106b is obtained from the positions of the plurality of images 105b of the moving body 1.
  • FIGS. 8 and 9 show an example in which the plurality of spatial regions divided by the light shielding unit 22 are arranged in a grid pattern, but the arrangement is not limited to this; for example, the plurality of spatial regions may be arranged in a honeycomb pattern.
  • FIGS. 8 and 9 also show an example in which the light shielding region where the light beam 2 is shielded, that is, the region other than the transmission region through which the light beam 2 passes, is painted black.
  • However, the light shielding region only needs not to transmit unnecessary light and may be a color other than black; it may be colored or processed so as to absorb unnecessary light, or colored or processed so as to scatter unnecessary light.
  • FIG. 10 is a flowchart showing a processing procedure of the wavefront estimation unit 26.
  • The processing content of the wavefront estimation unit 26 will be specifically described below with reference to FIG. 10.
  • Upon receiving from the photodetector 25 an intensity image showing a plurality of condensed spot images, the wavefront estimation unit 26 calculates an approximate value of the wavefront of the light beam 2 at the aperture of the imaging optical system 11 from the positions of the plurality of condensed spot images indicated by the intensity image (step ST11 in FIG. 10). Since the process of calculating the approximate wavefront from the positions of the condensed spot images is itself a known technique, a detailed description is omitted.
  • A method for estimating a wavefront from the positions of a plurality of condensed spot images is disclosed, for example, in Non-Patent Document 1 below. [Non-Patent Document 1] National Astronomical Observatory of Japan, vol. 2, No. 2
  • The control device 49 controls the shutter 24 so that passage and shielding of the light beams 2a are repeated a plurality of times, so that N intensity images are output from the photodetector 25 to the wavefront estimation unit 26.
  • As the position of a condensed spot image, for example, the position of the center of gravity of the point image can be used.
  • Since the wavefront can also be obtained from the intervals between the plurality of condensed spot images or from their relative positions, the cross-correlation of the plurality of condensed spot images or the interval between characteristic positions of the condensed spot images may likewise be used as the position of the condensed spot image.
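  • For illustration only, the following Python sketch (hypothetical names, not from the patent) measures a spot position both as a center of gravity and as a relative shift from the peak of an FFT-based cross-correlation:

    import numpy as np

    def centroid(img):
        # Center of gravity of a condensed spot image, returned as (x, y).
        y, x = np.indices(img.shape)
        s = img.sum()
        return np.array([(x * img).sum() / s, (y * img).sum() / s])

    def shift_by_xcorr(img, ref):
        # Relative spot shift from the peak of the cross-correlation,
        # computed with FFTs (integer-pixel accuracy for brevity).
        f = np.fft.fft2(img) * np.conj(np.fft.fft2(ref))
        cc = np.fft.ifft2(f).real
        dy, dx = np.unravel_index(np.argmax(cc), cc.shape)
        ny, nx = img.shape
        if dy > ny // 2: dy -= ny        # wrap large shifts to negative
        if dx > nx // 2: dx -= nx
        return np.array([dx, dy])

    # Example: a Gaussian spot shifted by (2, 1) pixels.
    y, x = np.indices((32, 32))
    ref = np.exp(-((x - 16)**2 + (y - 16)**2) / 8.0)
    img = np.exp(-((x - 18)**2 + (y - 17)**2) / 8.0)
    print(centroid(img) - centroid(ref))   # approximately [2, 1]
    print(shift_by_xcorr(img, ref))        # [2, 1]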
  • The wavefront estimation unit 26 denotes by φ_{0,n} the phase of the approximate wavefront calculated from the positions of the plurality of condensed spot images indicated by the n-th intensity image.
  • As shown in equation (1), the wavefront estimation unit 26 uses the phase φ_{0,n} as the initial value of the phase φ_n(u, v) of the wavefront of the light beam 2 at the aperture of the imaging optical system 11: φ_n(u, v) = φ_{0,n}(u, v) (1)
  • Here, (u, v) are coordinates in pupil space.
  • FIG. 11 shows the relationship between the aperture of the imaging optical system 11, the apertures of the plurality of spatial regions, and the images 105 of the moving body 1 when the moving body 1 moves relative to the telescope device 10.
  • M_0(u, v) is the aperture of the imaging optical system 11.
  • M_1(u, v), M_2(u, v), ..., M_M(u, v) are the apertures of the plurality of spatial regions.
  • As shown in equation (2), the pupil function G_{m,n}(u, v), which represents the wavefront aberration and the amplitude distribution on the pupil, is expressed by the phase φ_n(u, v) of the wavefront of the light beam 2 at the aperture of the imaging optical system 11 corresponding to the n-th intensity image and by the aperture M_m(u, v): G_{m,n}(u, v) = M_m(u, v) exp(i φ_n(u, v)) (2)
  • Since the aperture M_m(u, v) is known and the initial value of the phase φ_n(u, v) is the approximate wavefront phase φ_{0,n}, the pupil function G_{m,n}(u, v) is calculated from the phase φ_n(u, v) and the aperture M_m(u, v).
  • As shown in equation (4), the amplitude spread function a_{m,n}(x, y) is obtained by inverse Fourier transforming the pupil function G_{m,n}(u, v): a_{m,n}(x, y) = F^{-1}[G_{m,n}(u, v)] (4)
  • F^{-1} is the symbol representing the inverse Fourier transform.
  • As shown in equation (5), the point spread function k_{m,n}(x, y), which represents the point image intensity distribution, is the product of the amplitude spread function a_{m,n}(x, y) and its complex conjugate: k_{m,n}(x, y) = a_{m,n}(x, y) a*_{m,n}(x, y) (5)
  • (x, y) are real space coordinates.
  • The image i_{m,n}(x, y) of the moving body 1 corresponding to the aperture M_m(u, v) is expressed by the following equation (6): i_{m,n}(x, y) = Σ_{p,q} k_{m,n}(x - p, y - q) o(p, q) + e_{m,n}(x, y) (6)
  • (p, q) are real space coordinates indicating the position where the moving body 1 exists.
  • The luminance distribution o(p, q) of the moving body 1 is the intensity of the light beam 2 reflected by the moving body 1 or the intensity of the light beam 2 emitted from the moving body 1.
  • That is, the image i_{m,n}(x, y) of the moving body 1 is the convolution of the point spread function k_{m,n}(x, y) and the luminance distribution o(p, q) of the moving body 1.
  • In equations (6) and (7), the noise e_{m,n}(x, y) generated in the photodetector 25 is included.
  • The following equation (8) gives the sum of squared differences e between the detected image of the moving body 1 and the image reconstructed from the luminance distribution o(p, q) of the moving body 1 and the point spread function k_{m,n}(x, y) representing the point image intensity distribution: e = Σ_{m,n} Σ_{x,y} | i_{m,n}(x, y) - Σ_{p,q} k_{m,n}(x - p, y - q) o(p, q) |^2 (8)
  • The point spread function k_{m,n}(x, y) is obtained from equations (2), (4), and (5); therefore, in equation (8), the only unknown is the luminance distribution o(p, q) of the moving body 1.
  • The luminance distribution o(p, q) of the moving body 1 is obtained by searching for the o(p, q) that minimizes the sum of squared differences e.
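  • A minimal sketch of this forward model, covering equations (2), (4), (5), (6), and (8), is given below (Python, hypothetical names; FFT-based circular convolution is assumed for brevity):

    import numpy as np

    def psf_from_phase(aperture, phase):
        # Point spread function k from the pupil function G = M * exp(i*phi)
        # via an inverse Fourier transform (equations (2), (4), (5)).
        G = aperture * np.exp(1j * phase)
        a = np.fft.ifft2(G)                 # amplitude spread function
        k = (a * np.conj(a)).real           # point image intensity
        return k / k.sum()

    def residual_sse(images, psfs, o):
        # Sum of squared differences e between the detected images i_{m,n}
        # and the images reconstructed as k_{m,n} convolved with o (eq. (8)).
        O = np.fft.fft2(o)
        e = 0.0
        for img, k in zip(images, psfs):
            model = np.fft.ifft2(np.fft.fft2(k) * O).real
            e += np.sum((img - model) ** 2)
        return e

    # Example with one 64 x 64 frame and a circular aperture.
    n = 64
    u, v = np.indices((n, n)) - n // 2
    M = (u**2 + v**2 < 10**2).astype(float)
    k = psf_from_phase(np.fft.ifftshift(M), 0.0)
    o = np.zeros((n, n)); o[30:34, 28:36] = 1.0     # toy luminance map
    i_obs = np.fft.ifft2(np.fft.fft2(k) * np.fft.fft2(o)).real
    print(residual_sse([i_obs], [k], o))            # ~0 for the true o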
  • The moving body 1 moves relative to the telescope device 10, and even if the directing device 16 changes the pointing direction of the imaging optical system 11, the relative movement between the moving body 1 and the telescope device 10 cannot be completely canceled; therefore, the relative position of the moving body 1 changes as the time t changes.
  • It is assumed that the luminance distribution o(p, q) of the moving body 1 does not depend on the frame and does not change, whereas the wavefront changes from frame to frame.
  • When equation (8) is Fourier transformed, the sum of squared differences e shown in equation (8) becomes the sum of squared differences E in frequency space, shown in equation (9).
  • I_{m,n}(u, v) is the spectrum of i_{m,n}(x, y) and is expressed by the following equation (10): I_{m,n}(u, v) = F[i_{m,n}(x, y)] (10)
  • F is the symbol representing the Fourier transform.
  • In equation (11), a coefficient is introduced to stabilize the solution.
  • K_{m,n}(u, v) is the autocorrelation of the pupil function G_{m,n}(u, v) and is expressed by equation (12).
  • K_{m,n}(u, v) is an optical transfer function, although not normalized.
  • The sum of squared differences E shown in equation (13) is expressed by the aperture M_m(u, v), the phase φ_n(u, v), and the spectrum I_{m,n}(u, v) of the image i_{m,n}(x, y) of the moving body 1, and does not depend on the spectrum O(u, v) of the unknown luminance distribution o(p, q) of the moving body 1.
  • Therefore, if the phase φ_n(u, v) that minimizes the sum of squared differences Err shown in the following equation (14) is found, the wavefront W_n(u, v) can be estimated by equation (2).
  • That is, the wavefront W_n(u, v) can be estimated by finding the phase φ_n(u, v) that minimizes the sum of squared differences Err, without the luminance distribution o(p, q) of the moving body 1 being known.
  • However, since equation (14) does not depend on the luminance distribution o(p, q) of the moving body 1, the strong computational constraint that the luminance distribution o(p, q) is a real number greater than 0 in real space is not imposed.
  • In the photodetector 25, the main noise is shot noise and readout noise.
  • The readout noise follows a normal distribution with a median of 0 and a standard deviation of σ, while the shot noise is proportional to the luminance of the acquired frame. Therefore, normalizing equation (15) by the noise gives equation (16). If the ratio of the real-space difference r_{m,n}(x, y) to the noise is larger than 1, the deviation is large; if it is 1, there is no deviation; and if it is smaller than 1, the deviation is small.
  • Accordingly, the likelihood function shown in the following equation (17) is introduced.
  • d_m(x, y) is a weight given to the difference r_{m,n}(x, y); for example, a frame with a large deviation is given a low weight because its reliability is low.
  • The amount of calculation can also be reduced by setting the weight of a region to be calculated to 1 and the weight of a region whose calculation is to be omitted to 0. The above is the principle of the wavefront estimation process and the principle of the luminance distribution estimation process of the moving body 1.
  • When the approximate wavefront phase φ_{0,n} has been obtained, the wavefront estimation unit 26 sets φ_{0,n} as the initial value of the phase φ_n(u, v) shown in equation (2).
  • The wavefront estimation unit 26 calculates the point spread function k_{m,n}(x, y) representing the point image intensity distribution by computing equations (2), (4), and (5) (step ST12 in FIG. 10).
  • The wavefront estimation unit 26 Fourier transforms the point spread function k_{m,n}(x, y) and the image i_{m,n}(x, y) of the moving body 1 to obtain the optical transfer function K_{m,n}(u, v) and the spectrum I_{m,n}(u, v) of the image i_{m,n}(x, y) of the moving body 1 (step ST13 in FIG. 10).
  • The wavefront estimation unit 26 substitutes the optical transfer function K_{m,n}(u, v) and the spectrum I_{m,n}(u, v) into equation (14) and calculates the sum of squared differences Err (step ST14 in FIG. 10).
  • The wavefront estimation unit 26 determines whether or not the phase search process has converged (step ST15 in FIG. 10). As the convergence determination of the phase search process, for example, there is a method of determining that convergence has been achieved if the calculated sum of squared differences Err is equal to or less than a preset first allowable error. The sum of squared differences Err calculated when convergence is determined is the minimum sum Err.
  • The first allowable error is assumed to be stored, for example, in the internal memory of the wavefront estimation unit 26 or in the recording device 43. Alternatively, the sum of squared differences Err may be calculated a predetermined number of times while changing the phase φ_n(u, v), and convergence may be determined when the minimum sum Err among the calculated sums has been identified.
  • If the phase search process has not converged (step ST15 in FIG. 10: NO), the wavefront estimation unit 26 changes the phase φ_n(u, v) shown in equation (2) (step ST16 in FIG. 10), sets the changed phase φ_n(u, v) in equation (2), and performs the processes of steps ST12 to ST15 again.
  • The changed phase φ_n(u, v) may be any phase that has not yet been set in equation (2), but is desirably a phase that makes the sum of squared differences Err small. If the phase search process has converged (step ST15 in FIG. 10: YES), the wavefront estimation unit 26 ends the phase search process.
  • By substituting the phase φ_n(u, v) for which the minimum sum of squared differences Err was calculated into equation (3), the wavefront estimation unit 26 estimates the wavefront W_n(u, v) of the light beam 2 at the aperture of the imaging optical system 11 (step ST17 in FIG. 10).
  • The estimated wavefront W_n(u, v) is more accurate than the approximate wavefront calculated in step ST11.
  • The wavefront estimation unit 26 outputs the wavefront W_n(u, v) to the recording device 43.
  • Time information indicating the calculation time of the wavefront W_n(u, v) is added to the wavefront W_n(u, v) output from the wavefront estimation unit 26 to the recording device 43. The wavefront estimation unit 26 also outputs to the moving body restoration unit 42 the point spread function k_{m,n}(x, y) corresponding to the phase φ_n(u, v) for which the minimum sum Err was calculated; time information indicating the calculation time of the point spread function k_{m,n}(x, y) is added to it. Here, the wavefront estimation unit 26 outputs the point spread function k_{m,n}(x, y) to the moving body restoration unit 42, but it may instead output the wavefront W_n(u, v) to the moving body restoration unit 42.
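  • The phase search loop of steps ST12 to ST16 can be sketched as follows (Python, hypothetical names). The exact Err expression of equation (14) is not reproduced in this publication text, so a simplified spectral-difference criterion and a random coordinate search stand in for it, purely to illustrate the accept-only-improvements loop:

    import numpy as np

    def err_for_phase(phase, aperture, observed_spectra):
        # OTF implied by the trial phase (equations (2), (4), (5), (12)).
        G = aperture * np.exp(1j * phase)
        a = np.fft.ifft2(G)
        k = (a * np.conj(a)).real
        K = np.fft.fft2(k / k.sum())
        # Simplified spectral mismatch standing in for equation (14).
        return sum(np.sum(np.abs(I - K) ** 2) for I in observed_spectra)

    def search_phase(aperture, observed_spectra, modes, n_iter=400, step=0.1):
        # Steps ST12 to ST16: perturb one basis coefficient at a time and
        # keep the change only if the error Err becomes smaller.
        coeffs = np.zeros(len(modes))
        best = err_for_phase(np.zeros_like(aperture), aperture, observed_spectra)
        rng = np.random.default_rng(0)
        for _ in range(n_iter):
            trial = coeffs.copy()
            trial[rng.integers(len(modes))] += rng.normal(0.0, step)
            phase = sum(c * z for c, z in zip(trial, modes))
            e = err_for_phase(phase, aperture, observed_spectra)
            if e < best:
                best, coeffs = e, trial
        return coeffs, best

    # Example: recover defocus and astigmatism coefficients on a
    # 32 x 32 circular aperture from a single noiseless frame.
    n = 32
    u, v = np.indices((n, n)) - n // 2
    ap = (u**2 + v**2 < 12**2).astype(float)
    modes = [(u**2 + v**2) * ap / 144.0, (u**2 - v**2) * ap / 144.0]
    a = np.fft.ifft2(ap * np.exp(1j * (0.8 * modes[0] - 0.5 * modes[1])))
    k = (a * np.conj(a)).real
    I_obs = [np.fft.fft2(k / k.sum())]
    print(np.round(search_phase(ap, I_obs, modes)[0], 2))

  • In practice a gradient-based optimizer over a low-order basis such as Zernike polynomials would likely replace the random search; the loop structure above, however, mirrors the repeat-until-converged flow of steps ST12 to ST16.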
  • The radar device 30 receives the radar wave reflected by the moving body 1 and generates a distance image indicating the distance to the moving body 1 based on the radar wave (step ST5 in FIG. 5).
  • That is, the transmission/reception unit 31 transmits a radar wave such as a microwave or a millimeter wave toward the moving body 1 and outputs the radar wave to the distance image acquisition unit 32.
  • The transmission/reception unit 31 receives the radar wave reflected by the moving body 1 as a reflected wave and outputs the reflected wave to the distance image acquisition unit 32.
  • The distance from the radar device 30 to the moving body 1 is directly proportional to the time from when the radar device 30 transmits the radar wave until it receives the reflected wave.
  • The distance image acquisition unit 32 measures the time from when the radar wave is transmitted by the transmission/reception unit 31 until the reflected wave is received.
  • The distance image acquisition unit 32 generates the distance image indicating the distance to the moving body 1 based on the measured time and outputs the distance image to the image deformation unit 41. Since the process of generating a distance image from the measured time is itself a known technique, a detailed description is omitted. Time information indicating the generation time of the distance image is added to the distance image output from the distance image acquisition unit 32 to the image deformation unit 41.
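  • The time-of-flight relation underlying this step is simple enough to state as a one-line sketch (Python; names and the example round-trip time are illustrative):

    C = 299_792_458.0          # speed of light [m/s]

    def distance_from_round_trip(t_round_trip_s: float) -> float:
        # The radar wave travels out and back, hence the factor 1/2.
        return C * t_round_trip_s / 2.0

    # A 2 ms round trip corresponds to roughly 300 km of slant range.
    print(distance_from_round_trip(2e-3) / 1e3, "km")   # ~299.8 km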
  • The image deformation unit 41 deforms the distance image output from the distance image acquisition unit 32 so that it matches the intensity image output from the intensity image acquisition unit 13, and outputs the deformed distance image to the moving body restoration unit 42 (step ST6 in FIG. 5).
  • The distance image deformation process performed by the image deformation unit 41 is specifically described below.
  • The image deformation unit 41 sequentially acquires intensity images at different times from the intensity image acquisition unit 13 and sequentially acquires distance images at different times from the distance image acquisition unit 32. For example, when deforming the distance image generated at time t_c, the image deformation unit 41 selects, from the plurality of acquired intensity images, the intensity image whose detection time t_d is closest to the generation time t_c.
  • The image deformation unit 41 acquires the status information of the telescope device 10 and the status information of the radar device 30 from the control device 49.
  • The status information of the telescope device 10 is information indicating the pointing direction of the imaging optical system 11, the scale of the intensity image, and the like.
  • The status information of the radar device 30 is information indicating the pointing direction of the transmission/reception unit 31, the scale of the distance image, and the like.
  • The image deformation unit 41 refers to the pointing direction of the imaging optical system 11 to confirm the orientation of the intensity image, and refers to the pointing direction of the transmission/reception unit 31 to confirm the orientation of the distance image.
  • The image deformation unit 41 performs affine transformation processing, such as rotation, on the distance image so that the orientation of the distance image generated at time t_c matches the orientation of the selected intensity image.
  • The image deformation unit 41 then enlarges or reduces the affine-transformed distance image so that its scale matches the scale of the selected intensity image.
  • Here, the image deformation unit 41 performs affine transformation and enlargement or reduction as the deformation processing, but any processing may be used as long as it deforms the distance image so as to align it with the intensity image.
  • Although the image deformation unit 41 here deforms the distance image so that it matches the intensity image output from the intensity image acquisition unit 13, it may instead deform the distance image so that it matches the intensity image output from the photodetector 25.
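  • A minimal sketch of such an alignment step, assuming the rotation angle and scale ratio have already been derived from the status information, could use standard image-processing routines (Python with scipy.ndimage; names are illustrative):

    import numpy as np
    from scipy import ndimage

    def align_distance_image(dist_img, angle_deg, scale):
        # Rotate the distance image so its orientation matches the selected
        # intensity image, then rescale it to the same plate scale.
        rotated = ndimage.rotate(dist_img, angle_deg, reshape=False,
                                 order=1, mode="constant", cval=0.0)
        return ndimage.zoom(rotated, scale, order=1)

    # Example: rotate by 15 degrees and enlarge by 1.25x.
    dist = np.zeros((64, 64)); dist[20:44, 28:36] = 500.0   # metres
    aligned = align_distance_image(dist, angle_deg=15.0, scale=1.25)
    print(dist.shape, "->", aligned.shape)                  # (64, 64) -> (80, 80)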
  • The moving body restoration unit 42 estimates the luminance distribution o(p, q) of the moving body 1 from the distance image output from the image deformation unit 41, the intensity image output from the intensity image acquisition unit 13, and the plurality of condensed spot images detected by the photodetector 25 or the point spread function k_{m,n}(x, y) output from the wavefront estimation unit 26 (step ST7 in FIG. 5).
  • Here, the moving body restoration unit 42 estimates the luminance distribution o(p, q) of the moving body 1 using the point spread function k_{m,n}(x, y), but it may instead estimate the luminance distribution using the wavefront W_n(u, v) output from the wavefront estimation unit 26.
  • When estimating the luminance distribution o(p, q) of the moving body 1 using the wavefront W_n(u, v), the moving body restoration unit 42 calculates the phase φ_n(u, v) from the wavefront W_n(u, v) by equation (3).
  • The moving body restoration unit 42 then substitutes the phase φ_n(u, v) into equation (2) and calculates the point spread function k_{m,n}(x, y) by computing equations (2), (4), and (5).
  • The moving body restoration unit 42 estimates the luminance distribution o(p, q) of the moving body 1 from the distance image, the intensity image, and the plurality of condensed spot images or the calculated point spread function k_{m,n}(x, y).
  • FIG. 12 is a flowchart showing the processing procedure of the moving body restoration unit 42.
  • FIG. 13 is an explanatory diagram showing the luminance distribution estimation process of the moving body 1 by the moving body restoration unit 42.
  • In FIG. 13, the distance image 111 is the deformed distance image output from the image deformation unit 41.
  • The intensity image 112 is the intensity image output from the intensity image acquisition unit 13.
  • The moving body restoration unit 42 detects the region where the moving body 1 exists from the distance image 111 (step ST21 in FIG. 12). That is, the moving body restoration unit 42 performs contour extraction processing for extracting the contour of the moving body 1 from the distance image 111. Since contour extraction processing is itself a known technique, a detailed description is omitted.
  • The moving body restoration unit 42 sets the region inside the extracted contour as the region where the moving body 1 exists and the region outside the contour as the region where the moving body 1 does not exist.
  • The moving body restoration unit 42 generates a mask image 113 indicating that only the region including the region where the moving body 1 exists is the processing target region used for the luminance distribution estimation process of the moving body 1 (step ST22 in FIG. 12).
  • The processing target region is a region that includes the region where the moving body 1 exists; it may be a region that coincides with the region where the moving body 1 exists, or it may be a larger region.
  • As a larger region, a region that extends beyond the extracted contour of the moving body 1 by a margin corresponding to the shadow of the moving body 1 is conceivable.
  • As the margin, for example, a size of about 10% of the region where the moving body 1 exists is conceivable. A sketch of this mask generation step follows below.
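  • A minimal sketch of the mask generation step (Python, hypothetical names; a simple distance threshold stands in for the contour extraction, and a binary dilation adds the roughly 10% margin):

    import numpy as np
    from scipy import ndimage

    def make_mask(dist_img, threshold, margin_ratio=0.10):
        # Region where the moving body exists: pixels with a valid
        # (near) distance value; grown by a margin proportional to the
        # body size to allow for the shadow of the moving body.
        body = dist_img < threshold
        size = int(np.sqrt(body.sum()) * margin_ratio) + 1
        mask = ndimage.binary_dilation(body, iterations=size)
        return mask.astype(float)

    dist = np.full((64, 64), 1e9)            # background: no radar return
    dist[24:40, 20:36] = 500.0               # body at 500 m
    mask = make_mask(dist, threshold=1e6)
    print(int(mask.sum()), "pixels in the processing target region")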
  • The moving body restoration unit 42 extracts the image i_{m,n}(x, y) of the moving body 1 within the processing target region of the mask image 113 from the intensity image 112 (step ST23 in FIG. 12).
  • The intensity image 114 shown in FIG. 13 is an intensity image showing the image i_{m,n}(x, y) of the moving body 1 in the processing target region extracted from the intensity image 112.
  • The moving body restoration unit 42 selects one image i_{m,n}(x, y) of the moving body 1 from among the images i_{m,n}(x, y) of the one or more moving bodies 1 included in the processing target region.
  • The moving body restoration unit 42 substitutes the selected image i_{m,n}(x, y) of the moving body 1 and the point spread function k_{m,n}(x, y) corresponding to the selected image into equation (16) to calculate the difference r_{m,n}(x, y). The moving body restoration unit 42 repeats this process until all images i_{m,n}(x, y) of the one or more moving bodies 1 included in the processing target region have been selected and the differences r_{m,n}(x, y) have been calculated.
  • The moving body restoration unit 42 substitutes all the calculated differences r_{m,n}(x, y) into equation (17) to calculate the sum of squared differences e (step ST24 in FIG. 12).
  • In equation (17), the moving body restoration unit 42 sets the weight d_m(x, y) corresponding to the differences r_{m,n}(x, y) inside the processing target region to 1 and the weight d_m(x, y) corresponding to the differences r_{m,n}(x, y) outside the processing target region to 0.
The moving body restoration unit 42 determines whether or not the luminance distribution estimation process of the moving body 1 has converged (step ST25 in FIG. 12).
As a convergence determination for the luminance distribution estimation process of the moving body 1, for example, there is a method of determining that convergence has been achieved if the calculated sum of squares e of the differences is equal to or less than a second allowable error set in advance. In this case, the sum of squares e calculated when convergence is determined is the minimum sum of squares e.
The second allowable error is assumed to be stored in, for example, the internal memory of the moving body restoration unit 42 or the recording device 43.
Alternatively, there is a method of calculating the sum of squares e a predetermined number of times while changing the luminance distribution o(p, q) of the moving body 1, and determining that convergence has been achieved when the minimum sum of squares e among the calculated sums of squares e has been identified.
If the luminance distribution estimation process of the moving body 1 has not converged (NO in step ST25 of FIG. 12), the moving body restoration unit 42 changes the luminance distribution o(p, q) of the moving body 1 shown in equation (16) (step ST26 in FIG. 12), and performs the processes of steps ST24 to ST25 again.
The changed luminance distribution o(p, q) may be any luminance distribution that has not yet been set in equation (16), but a luminance distribution that makes the sum of squares e of the differences smaller is desirable.
If the luminance distribution estimation process of the moving body 1 has converged (YES in step ST25 of FIG. 12), the moving body restoration unit 42 outputs, to the recording device 43, the luminance distribution o(p, q) of the moving body 1 that yields the minimum sum of squares e, as the result of the luminance distribution estimation process (step ST27 in FIG. 12). A sketch of this iteration follows.
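The document only requires that successive luminance distributions tend to reduce e, so the candidate-search loop below, which reuses sum_of_squared_residuals from the sketch above, is one illustrative realization rather than the prescribed method; how the candidate distributions are generated is not specified by the document.

    import numpy as np

    def estimate_luminance(candidates, images, psfs, d, tol=None):
        # candidates: iterable of trial luminance distributions o(p, q)
        # tol: plays the role of the second allowable error (first criterion)
        best_o, best_e = None, np.inf
        for o in candidates:                                  # step ST26
            e = sum_of_squared_residuals(images, psfs, o, d)  # step ST24
            if e < best_e:
                best_o, best_e = o, e
            if tol is not None and e <= tol:                  # step ST25
                break
        return best_o, best_e                                 # minimum sum of squares e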
As described above, the moving body observation apparatus of the first embodiment is configured so that the moving body restoration unit 42 detects the region where the moving body 1 exists from the distance image generated by the radar device 30, and estimates the luminance distribution of the moving body 1 from the focused spot images, among the plurality of focused spot images detected by the photodetector 25, within the region where the moving body 1 exists, and from the wavefront estimated by the wavefront estimation unit 26. Therefore, the moving body observation apparatus can estimate the luminance distribution of the moving body 1 without mounting a control mechanism that realizes high-speed movement of a lenslet array or the like.
Embodiment 2.
In the moving body observation apparatus of the first embodiment, the moving body restoration unit 42 extracts the contour of the moving body 1 from the distance image 111 and sets the area inside the contour as the area where the moving body 1 exists.
In the second embodiment, a moving body observation apparatus will be described in which a moving body restoration unit 70 detects, as the area where the moving body 1 exists, an area in which pixels whose distance values are larger than a first threshold value and smaller than a second threshold value are gathered, among the plurality of pixels included in the distance image 111.
FIG. 14 is a configuration diagram illustrating the moving body observation apparatus according to the second embodiment.
The moving body restoration unit 70 is realized by a computer such as a personal computer or a moving body restoration circuit.
The moving body restoration unit 70 performs a process of acquiring the trajectory information from the trajectory information recording unit 46 and, based on the trajectory information, determining a first threshold value L_ref1 indicating the lower limit of the distance and a second threshold value L_ref2 indicating the upper limit of the distance.
The moving body restoration unit 70 performs a process of detecting, as the region where the moving body 1 exists, a region in which pixels having distance values larger than the first threshold value L_ref1 and smaller than the second threshold value L_ref2 are gathered, among the plurality of pixels included in the distance image 111.
The moving body restoration unit 70 performs a process of estimating the luminance distribution of the moving body 1 from the focused spot images, among the plurality of focused spot images detected by the photodetector 25, in the region where the moving body 1 exists, and from the wavefront estimated by the wavefront estimation unit 26.
First, the moving body restoration unit 70 acquires the trajectory information from the trajectory information recording unit 46.
When the moving body 1 is a meteorite, space debris, an artificial satellite, or the like that orbits the earth, the gravity acting on the moving body 1 is mainly the gravity of the earth. In this case, the movement locus of the moving body 1 is represented by a quadratic curve as shown in FIG. 15, and this movement locus corresponds to an orbit. The quadratic curve is, for example, a parabola, a hyperbola, or an ellipse.
FIG. 15 is an explanatory diagram showing the orbit of the moving body 1.
Here, the orbit information indicating the orbits of celestial bodies such as stars and comets is assumed to be known, and the orbit information indicating the orbits of moving objects such as artificial satellites and the International Space Station is also assumed to be known.
FIG. 16 is an explanatory diagram showing the luminance distribution estimation process of the moving body 1 by the moving body restoration unit 70.
In FIG. 16, the same reference numerals as those in FIG. 13 denote the same or corresponding parts.
Each of L_1, L_2, and L_3 indicates the distance from the radar device 30 to a corresponding part of the moving body 1, as shown in FIG. 16, and L_1 < L_2 < L_3 is satisfied.
The moving body restoration unit 70 identifies the position of the moving body 1 with reference to the trajectory information.
The moving body restoration unit 70 calculates an approximate value of the distance from the radar device 30 to the moving body 1 from the position of the radar device 30 and the position of the moving body 1.
The moving body restoration unit 70 sets (approximate distance value - Δ) as the first threshold value L_ref1 and (approximate distance value + Δ) as the second threshold value L_ref2, where Δ is, for example, 1.0 to 2.5 times the maximum external dimension of the moving body 1.
This setting is only an example, and it goes without saying that the first threshold value L_ref1 and the second threshold value L_ref2 may be set by other methods.
The maximum external dimension of the moving body 1 is, for example, the horizontal dimension of the moving body 1 if the moving body 1 is a horizontally long object, and the vertical dimension of the moving body 1 if the moving body 1 is a vertically long object.
Here, it is assumed that L_ref1 < L_1 < L_2 < L_3 < L_ref2 is satisfied.
The moving body restoration unit 70 identifies an area in which pixels having distance values larger than the first threshold value L_ref1 and smaller than the second threshold value L_ref2 are gathered, among the plurality of pixels included in the distance image 111; a sketch of this gating is shown after this passage.
The moving body restoration unit 70 sets the area inside the identified area as the area where the moving body 1 exists, and sets the area outside the identified area as the area where the moving body 1 does not exist.
The moving body restoration unit 70 generates the mask image 113 shown in FIG. 16, which indicates that only an area including the area where the moving body 1 exists is the processing target area used for the luminance distribution estimation process of the moving body 1.
As in the first embodiment, the processing target area is an area including the area where the moving body 1 exists; it may coincide with that area or may be a larger area, for example an area larger than the contour of the detected moving body 1 by a margin corresponding to the shadow of the moving body 1.
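A minimal Python sketch of this distance gating follows; the inputs approx_distance and max_dimension correspond to the approximate distance value and the maximum external dimension described above, and the function name is hypothetical.

    import numpy as np

    def detect_moving_body_region(distance_image, approx_distance, max_dimension, factor=2.0):
        delta = factor * max_dimension       # e.g. 1.0 to 2.5 times the maximum dimension
        l_ref1 = approx_distance - delta     # first threshold: lower limit of the distance
        l_ref2 = approx_distance + delta     # second threshold: upper limit of the distance
        return (distance_image > l_ref1) & (distance_image < l_ref2)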
Thereafter, as in the first embodiment, the moving body restoration unit 70 estimates the luminance distribution o(p, q) of the moving body 1 from the focused spot images in the region where the moving body 1 exists and from the point spread functions k_{m,n}(x, y) output from the wavefront estimation unit 26.
The present invention is suitable for a moving body observation apparatus and a moving body observation method for estimating the luminance distribution of a moving body.

Abstract

This moving body observation device has been configured so that a moving body recovery unit (42) detects a region, in which a moving body (1) is present, from a distance image generated by a radar device (30), and estimates a brightness distribution of the moving body (1) from: a condensed light spot image within the region in which the moving body (1) is present, from among a plurality of condensed light spot images detected by the optical detector (25); and a wavefront estimated by a wavefront estimation unit (26).

Description

Moving body observation apparatus and moving body observation method
The present invention relates to a moving body observation apparatus and a moving body observation method for estimating the luminance distribution of a moving body.
A moving body observation apparatus that observes a moving body such as a celestial body or a flying body observes the moving body by receiving a light beam reflected by the moving body or a light beam transmitted from the moving body on the ground.
The luminous flux from the moving body may spread because the phase of the light is disturbed due to the fluctuation of the refractive index distribution in the atmosphere.
Therefore, in order to increase the observation accuracy of the moving object, the moving object observation apparatus needs to acquire a wavefront that is a surface having the same phase of light.
Patent Document 1 below discloses a wavefront sensor that measures a wavefront.
The wavefront sensor disclosed in Patent Document 1 below, when observing an object moving at high speed with a telescope, controls the tilt and the focus of the wavefront so as to track the light beam transmitted from the object while imaging the atmospheric turbulence layers.
The wavefront sensor includes a control mechanism that realizes high-speed movement of each of the high-speed steering mirror, the imaging lens, and the lenslet array as a mechanism for controlling the tilt and focus of the wavefront.
[Patent Document 1] JP 2016-118547 A
The measurement accuracy of the wavefront by the wavefront sensor depends on the control accuracy of the control mechanism that realizes the respective high-speed movements of the high-speed steering mirror, the imaging lens, and the lenslet array.
Therefore, in the conventional moving body observation apparatus, the measurement accuracy of the wavefront in the wavefront sensor may deteriorate depending on the control accuracy of the control mechanism, so that the observation accuracy of the moving body may deteriorate even though the wavefront sensor is mounted.
The present invention has been made to solve the above-described problems, and an object of the present invention is to obtain a moving body observation apparatus and a moving body observation method capable of estimating the luminance distribution of a moving body without mounting a control mechanism that realizes high-speed movement of a lenslet array or the like.
A moving body observation apparatus according to the present invention includes: an imaging optical system that condenses a light beam reflected by a moving body or a light beam transmitted from the moving body; a space dividing unit that divides the light beam condensed by the imaging optical system into light beams of a plurality of spatial regions and condenses each of the light beams of the plurality of spatial regions; a photodetector that detects, from each of the light beams condensed by the space dividing unit, a focused spot image as an image of the moving body; a wavefront estimation unit that estimates the wavefront of the light beam at the aperture of the imaging optical system from the positions of the plurality of focused spot images detected by the photodetector; a radar device that receives a radar wave reflected by the moving body and generates, based on the radar wave, a distance image indicating the distance to the moving body; and a moving body restoration unit that detects the region where the moving body exists from the distance image generated by the radar device, and estimates the luminance distribution of the moving body from the focused spot images, among the plurality of focused spot images detected by the photodetector, within the region where the moving body exists, and from the wavefront estimated by the wavefront estimation unit.
According to the present invention, the moving body observation apparatus is configured so that the moving body restoration unit detects the region where the moving body exists from the distance image generated by the radar device, and estimates the luminance distribution of the moving body from the focused spot images, among the plurality of focused spot images detected by the photodetector, within the region where the moving body exists, and from the wavefront estimated by the wavefront estimation unit. Therefore, the moving body observation apparatus according to the present invention can estimate the luminance distribution of the moving body without mounting a control mechanism that realizes high-speed movement of a lenslet array or the like.
FIG. 1 is a configuration diagram illustrating the moving body observation apparatus according to Embodiment 1.
FIG. 2 is a configuration diagram illustrating the wavefront measuring unit 15 of the moving body observation apparatus according to Embodiment 1.
FIG. 3 is a perspective view showing the appearance of the moving body observation apparatus according to Embodiment 1.
FIG. 4 is an explanatory diagram showing how each of the telescope device 10 and the radar device 30 observes the moving body 1.
FIG. 5 is a flowchart showing a moving body observation method, which is the processing procedure of the moving body observation apparatus shown in FIG. 1.
FIG. 6 is an explanatory diagram showing images of the moving body 1 detected by the photodetector 25.
FIG. 7 is an explanatory diagram showing images of the moving body 1 detected by the photodetector 25 when the wavefront differs depending on the propagation path.
FIG. 8 is an explanatory diagram showing the relationship between the image of the moving body 1 detected by the photodetector 25 and the wavefront.
FIG. 9 is an explanatory diagram showing the relationship between the image of the moving body 1 detected by the photodetector 25 and the wavefront.
FIG. 10 is a flowchart showing the processing procedure of the wavefront estimation unit 26.
FIG. 11 is an explanatory diagram showing the relationship among the aperture of the imaging optical system 11, the apertures of a plurality of spatial regions, and the images 105 of the moving body 1 when the moving body 1 is moving relative to the telescope device 10.
FIG. 12 is a flowchart showing the processing procedure of the moving body restoration unit 42.
FIG. 13 is an explanatory diagram showing the luminance distribution estimation process of the moving body 1 by the moving body restoration unit 42.
FIG. 14 is a configuration diagram illustrating the moving body observation apparatus according to Embodiment 2.
FIG. 15 is an explanatory diagram showing the orbit of the moving body 1.
FIG. 16 is an explanatory diagram showing the luminance distribution estimation process of the moving body 1 by the moving body restoration unit 70.
Hereinafter, in order to explain the present invention in more detail, modes for carrying out the present invention will be described with reference to the accompanying drawings.
Embodiment 1.
FIG. 1 is a configuration diagram illustrating the moving body observation apparatus according to Embodiment 1.
FIG. 2 is a configuration diagram illustrating the wavefront measuring unit 15 of the moving body observation apparatus according to Embodiment 1.
In FIG. 1 and FIG. 2, the moving body 1 is an object, such as a celestial body, existing outside the atmosphere or in the atmosphere.
The light beam reflected by the moving body 1 or transmitted from the moving body 1 is a light beam 2 that has spread due to fluctuations in the refractive index distribution of the atmosphere, and the light beam 2 is incident on the telescope device 10.
The telescope device 10 is a device that measures the wavefront of the light beam 2 at the aperture of the imaging optical system 11 and acquires an intensity image indicating the image of the moving body 1.
The telescope device 10 includes an imaging optical system 11, a light beam splitting unit 12, an intensity image acquisition unit 13, a relay optical system 14, a wavefront measuring unit 15, and a pointing device 16.
The imaging optical system 11 is an optical system that condenses the incident light beam 2.
The light beam splitting unit 12 is realized by, for example, a beam splitter.
The light beam splitting unit 12 splits the light beam 2 into two by dividing the light amount or the wavelength of the light beam 2 condensed by the imaging optical system 11.
The light beam splitting unit 12 outputs one of the split light beams 2 to the intensity image acquisition unit 13 and outputs the other split light beam 2 to the relay optical system 14.
The intensity image acquisition unit 13 is realized by, for example, an image sensor.
The intensity image acquisition unit 13 detects the image of the moving body 1 from the light beam 2 output from the light beam splitting unit 12, and outputs an intensity image indicating the image of the moving body 1 to the image deformation unit 41.
The relay optical system 14 is an optical system that makes the lens array 23 of the wavefront measuring unit 15 optically equivalent to the pupil of the imaging optical system 11 so that the light beam 2 output from the light beam splitting unit 12 is focused on the lens array 23.
The wavefront measuring unit 15 includes a space dividing unit 21, a shutter 24, a photodetector 25, and a wavefront estimation unit 26.
The space dividing unit 21 includes a light shielding unit 22 and a lens array 23.
The space dividing unit 21 divides the light beam 2 output from the relay optical system 14 into light beams 2a of a plurality of spatial regions, and condenses the light beams 2a of the plurality of spatial regions on the light receiving surface 25a of the photodetector 25 (see FIG. 8 or FIG. 9).
The light shielding unit 22 partially blocks the light beam 2 output from the relay optical system 14, thereby dividing the light beam 2 into the light beams 2a of the plurality of spatial regions.
The lens array 23 includes a plurality of lenses 23a (see FIGS. 6 to 9), and each lens 23a condenses the light beam 2a of the corresponding spatial region on the light receiving surface 25a of the photodetector 25.
The shutter 24 temporally limits the passage of the light beams 2a output from the lens array 23 in order to adjust the amount of light of the light beams 2a received by the photodetector 25.
The photodetector 25 is realized by, for example, an image sensor.
The photodetector 25 has a light receiving surface 25a that receives each of the light beams 2a of the plurality of spatial regions that have passed through the shutter 24.
The photodetector 25 detects, from each light beam 2a received by the light receiving surface 25a, a focused spot image as an image of the moving body 1, and outputs an intensity image indicating each focused spot image to the wavefront estimation unit 26 and the moving body restoration unit 42.
The wavefront estimation unit 26 is realized by a computer such as a personal computer or a wavefront estimation circuit.
The wavefront estimation unit 26 performs a process of estimating the wavefront of the light beam at the aperture of the imaging optical system 11 from the positions of the plurality of focused spot images indicated by the intensity image output from the photodetector 25.
That is, the wavefront estimation unit 26 calculates an approximate value of the wavefront of the light beam 2 at the aperture of the imaging optical system 11 from the positions of the plurality of focused spot images.
The pointing device 16 is a device that changes the pointing direction of the imaging optical system 11 in accordance with a control signal output from the control device 49.
The radar device 30 is realized by, for example, an inverse synthetic aperture radar (ISAR).
The radar device 30 includes a transmission/reception unit 31, a distance image acquisition unit 32, and a pointing device 33.
The transmission/reception unit 31 is realized by an antenna, a modulator, a demodulator, a radio, and the like.
The transmission/reception unit 31 transmits a radar wave such as a microwave or a millimeter wave toward the moving body 1, and receives the radar wave reflected by the moving body 1 as a reflected wave.
The distance image acquisition unit 32 generates a distance image indicating the distance to the moving body 1 based on the time from when the radar wave is transmitted from the transmission/reception unit 31 to when the reflected wave is received, and performs a process of outputting the distance image to the image deformation unit 41.
The pointing device 33 is a device that changes the pointing direction of the transmission/reception unit 31 in accordance with a control signal output from the control device 49.
The operation device 40 includes an image deformation unit 41, a moving body restoration unit 42, a recording device 43, a time calibration unit 44, a counter 45, a trajectory information recording unit 46, a trajectory calculation unit 47, a planning device 48, and a control device 49.
The image deformation unit 41 is realized by a computer such as a personal computer or an image deformation circuit.
The image deformation unit 41 deforms the distance image output from the distance image acquisition unit 32 so that it matches the intensity image output from the intensity image acquisition unit 13, and performs a process of outputting the deformed distance image to the moving body restoration unit 42.
The moving body restoration unit 42 is realized by a computer such as a personal computer or a moving body restoration circuit.
The moving body restoration unit 42 performs a process of detecting the region where the moving body 1 exists from the deformed distance image output from the image deformation unit 41.
The moving body restoration unit 42 performs a process of estimating the luminance distribution of the moving body 1 from the image within the region where the moving body 1 exists in the intensity image output from the intensity image acquisition unit 13, and from the plurality of focused spot images indicated by the intensity image output from the photodetector 25 or the wavefront estimated by the wavefront estimation unit 26.
The recording device 43 is realized by, for example, a recording processing circuit.
The recording device 43 is a device that records the wavefront estimated by the wavefront estimation unit 26, the luminance distribution of the moving body 1 estimated by the moving body restoration unit 42, and the like.
The time calibration unit 44 has a built-in clock, and calibrates the time of the clock using a GPS signal transmitted from a GPS (Global Positioning System) satellite or NTP (Network Time Protocol).
When the time of the clock calibrated by the time calibration unit 44 reaches a certain time, the counter 45 measures the elapsed time from that time.
The trajectory information recording unit 46 is realized by a recording processing circuit, for example.
The trajectory information recording unit 46 records trajectory information indicating the trajectory of the mobile body 1.
The trajectory calculation unit 47 is realized by a computer such as a personal computer or a trajectory calculation circuit.
The trajectory calculation unit 47 performs a process of predicting the position of the future time at which the mobile body 1 exists based on the trajectory information recorded in the trajectory information recording unit 46.
The planning device 48 is realized by a computer such as a personal computer.
The planning device 48 is a device that calculates the pointing direction of the imaging optical system 11 and the pointing direction of the transmission/reception unit 31 at a future time based on the position predicted by the trajectory calculation unit 47.
The control device 49 is realized by a computer such as a personal computer.
Based on the elapsed time measured by the counter 45 and the calculation result of the planning device 48, the control device 49 outputs a control signal indicating the pointing direction of the imaging optical system 11 to the pointing device 16, and outputs a control signal indicating the pointing direction of the transmission/reception unit 31 to the pointing device 33.
The control device 49 is also a device that controls each of the intensity image acquisition unit 13, the wavefront measuring unit 15, the distance image acquisition unit 32, and the image deformation unit 41 based on the elapsed time measured by the counter 45.
Here, each of the wavefront estimation circuit, the image deformation circuit, the moving body restoration circuit, and the trajectory calculation circuit corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.
The recording processing circuit corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), or to a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, or a DVD (Digital Versatile Disc).
FIG. 3 is a perspective view showing the appearance of the moving body observation apparatus according to Embodiment 1.
In FIG. 3, the same reference numerals as those in FIG. 1 and FIG. 2 denote the same or corresponding parts, and the description thereof is omitted.
The housing 51 mounts the imaging optical system 11, the light beam splitting unit 12, the intensity image acquisition unit 13, the relay optical system 14, the space dividing unit 21, the shutter 24, and the photodetector 25.
The housing 52 mounts the wavefront estimation unit 26, the distance image acquisition unit 32, the image deformation unit 41, the moving body restoration unit 42, the recording device 43, the time calibration unit 44, the counter 45, the trajectory information recording unit 46, the trajectory calculation unit 47, the planning device 48, and the control device 49.
Optical components such as lenses and the human pupil transmit light. In addition, an optical component such as a mirror reflects light. When light passes through an optical component or the like, the phase distribution of light changes, and when light is reflected by the optical component, the phase distribution of light changes.
The atmosphere of the earth is composed of a medium such as oxygen, nitrogen, and water vapor, and light is transmitted in the same manner as an optical component such as a lens.
Since the refractive index of a medium such as oxygen fluctuates with changes in temperature and atmospheric pressure, the phase distribution of light transmitted through the earth's atmosphere changes with changes in temperature and atmospheric pressure. Since light is an electromagnetic wave, the phase distribution of light can be grasped as a wavefront.
The telescope device 10 shown in FIG. 1 is a device that estimates the wavefront by receiving the light beam 2 reflected by the moving body 1, which exists outside the atmosphere or in the atmosphere, or the light beam 2 transmitted from the moving body 1; the wavefront estimated by the telescope device 10 changes as the refractive index of a medium such as oxygen changes.
Although the change in the refractive index of the medium itself is small, when the optical path along which the light propagates becomes long, the change in the refractive index becomes non-negligible compared with the wavelength of the light; therefore, the wavefront estimation is strongly affected by atmospheric fluctuation.
In addition, the atmosphere on the ground is affected by radiation from the sun and by heat transport, as well as by the rotation of the earth, so that atmospheric layers are formed between the ground and the sky. The wavefront of light passing through the atmospheric layers is disturbed in a complex manner.
On the other hand, the radar waves transmitted and received by the radar device 30 are hardly affected by atmospheric fluctuation.
FIG. 4 is an explanatory diagram showing how each of the telescope device 10 and the radar device 30 observes the moving body 1.
In FIG. 4, the same reference numerals as those in FIG. 1 and FIG. 3 denote the same or corresponding parts.
In FIG. 4, the moving body 1 is irradiated with light from the sun, which is the illumination light source 60.
The reflected sunlight of wavelength λ1 reflected by the moving body 1 passes through the atmospheric layers between the ground and the sky and reaches the telescope device 10. FIG. 4 shows an example in which there are two atmospheric layers between the ground and the sky: 101 is a first atmospheric layer, and 102 is a second atmospheric layer.
The imaging optical system 11 condenses the incident light beam 2 of wavelength λ1 on the light receiving surface 25a of the photodetector 25, so that an image of the moving body 1 is formed on the light receiving surface 25a of the photodetector 25.
Since the wavefront is disturbed when the light beam 2 passes through the first atmospheric layer 101 and the second atmospheric layer 102, the image of the moving body 1 spreads even if the moving body 1 is an object that can be regarded as a point.
Therefore, excluding the spread of the image of the moving body 1 caused by the aberration of the imaging optical system 11 and the spread caused by the resolution of the photodetector 25, the cause of the spread of the image of the moving body 1 is atmospheric fluctuation. When the moving body 1 is an object having a spatial extent, excluding the spread caused by the aberration of the imaging optical system 11 and the resolution of the photodetector 25, the spread of the image of the moving body 1 is represented by the extent of the object itself and the spread due to atmospheric fluctuation. Mathematically, when the extent of the object itself falls within the angular range over which the wavefront can be regarded as equal and within the isoplanatic angle, the spread of the image is represented by the convolution of the extent of the object itself and the spread due to atmospheric fluctuation.
From the above, each of the plurality of focused spot images detected by the photodetector 25 is spread by atmospheric fluctuation, and the intensity image indicating the plurality of focused spot images becomes an intensity image 62 as shown in FIG. 4.
In FIG. 4, reference numeral 61 denotes the shape of the moving body 1 as viewed from the telescope device 10 and the radar device 30.
The radar device 30 can calculate the distance to the moving body 1 from the time from when a radar wave of wavelength λ2 is transmitted until the reflected wave, which is the radar wave reflected by the moving body 1, is received.
In FIG. 4, each of L_1, L_2, and L_3 indicates the distance to a corresponding part of the moving body 1.
The radar device 30 can obtain a distance image 63 as shown in FIG. 4 by calculating the distance to each part of the moving body 1.
Since the radar wave of wavelength λ2 is hardly affected by atmospheric fluctuation, the distance image 63 is an image in which no spread of the image due to atmospheric fluctuation occurs.
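The distance calculation itself follows the standard pulse-radar relation (the document states only the principle, not this code): the radar wave travels to the moving body and back, so the one-way distance is c·t/2.

    C = 299_792_458.0  # speed of light [m/s]

    def distance_from_round_trip_time(t_seconds):
        # One-way distance for a measured round-trip time t_seconds.
        return C * t_seconds / 2.0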
Next, the operation of the moving body observation apparatus shown in FIG. 1 will be described.
FIG. 5 is a flowchart showing a moving body observation method, which is the processing procedure of the moving body observation apparatus shown in FIG. 1.
Even when the moving body 1 is moving relative to the telescope device 10, the pointing device 16 changes the pointing direction of the imaging optical system 11 so that the imaging optical system 11 can condense the light beam 2 reflected by the moving body 1 or the light beam 2 transmitted from the moving body 1.
Even when the moving body 1 is moving relative to the radar device 30, the pointing device 33 changes the pointing direction of the transmission/reception unit 31 so that the transmission/reception unit 31 can receive the reflected wave from the moving body 1.
For example, when the moving body 1 is a star, the moving body 1 moves by about 15 arcseconds (= 15/3600 degrees) per second due to diurnal motion. Therefore, in order to enable tracking of the moving body 1, each of the pointing device 16 and the pointing device 33 needs to be able to control its pointing direction with arcsecond accuracy.
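As a toy illustration of this tracking requirement (not part of the document's method), the diurnal drift accumulated over a given time can be computed as follows.

    ARCSEC_PER_SECOND = 15.0                    # diurnal rate quoted above
    DEG_PER_SECOND = ARCSEC_PER_SECOND / 3600.0

    def diurnal_drift_deg(elapsed_seconds):
        # Angle, in degrees, by which a star drifts in elapsed_seconds seconds.
        return DEG_PER_SECOND * elapsed_seconds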
The time calibration unit 44 calibrates the time of its built-in clock using a GPS signal transmitted from a GPS satellite or NTP so that each of the pointing device 16 and the pointing device 33 can control its pointing direction with arcsecond accuracy.
When the time of the clock calibrated by the time calibration unit 44 reaches a certain time, the counter 45 measures the elapsed time from that time.
The trajectory calculation unit 47 predicts the position at which the moving body 1 will exist at a future time, based on the trajectory information recorded in the trajectory information recording unit 46. Since the process itself of predicting, from the trajectory of the moving body 1 indicated by the trajectory information, the position at which the moving body 1 will exist at a future time is a known technique, a detailed description thereof is omitted.
The planning device 48 calculates the pointing direction of the imaging optical system 11 and the pointing direction of the transmission/reception unit 31 at the future time, based on the position predicted by the trajectory calculation unit 47.
Since the process itself of calculating the pointing direction from the predicted position of the moving body 1 is a known technique, a detailed description thereof is omitted.
The control device 49 outputs, to the pointing device 16, a control signal indicating the pointing direction of the imaging optical system 11 based on the elapsed time measured by the counter 45 and the pointing direction of the imaging optical system 11 at a future time calculated by the planning device 48.
The control device 49 also outputs, to the pointing device 33, a control signal indicating the pointing direction of the transmission/reception unit 31 based on the elapsed time measured by the counter 45 and the pointing direction of the transmission/reception unit 31 at a future time calculated by the planning device 48.
The pointing device 16 changes the pointing direction of the imaging optical system 11 in accordance with the control signal output from the control device 49.
The pointing device 33 changes the pointing direction of the transmission/reception unit 31 in accordance with the control signal output from the control device 49.
When the light beam 2 reflected by the moving body 1 or the light beam 2 transmitted from the moving body 1 is incident, the imaging optical system 11 condenses the light beam 2 (step ST1 in FIG. 5).
The light beam splitting unit 12 splits the light beam 2 into two by splitting the light amount of the light beam 2 collected by the imaging optical system 11 into two.
The beam splitting unit 12 outputs one split beam 2 to the intensity image acquisition unit 13 and outputs the other split beam 2 to the relay optical system 14.
When receiving the light beam 2 from the light beam splitting unit 12, the intensity image acquisition unit 13 detects the image of the moving body 1 from the light beam 2 and outputs an intensity image indicating the image of the moving body 1 to the image deformation unit 41 (step ST9 in FIG. 5).
Time information indicating the detection time of the image of the moving body 1 is added to the intensity image output from the intensity image acquisition unit 13 to the image deformation unit 41.
The relay optical system 14 is an optical system that makes the lens array 23 of the space dividing unit 21 optically equivalent to the pupil of the imaging optical system 11; when receiving the light beam 2 from the light beam splitting unit 12, the relay optical system 14 outputs the light beam 2 to the light shielding unit 22 of the space dividing unit 21.
Upon receiving the light beam 2 from the relay optical system 14, the space dividing unit 21 divides the light beam 2 into light beams 2a of a plurality of spatial regions and outputs the light beams 2a of the plurality of spatial regions to the shutter 24 (step ST2 in FIG. 5).
That is, when the light shielding unit 22 receives the light beam 2 from the relay optical system 14, the light shielding unit 22 partially blocks the light beam 2, thereby dividing the light beam 2 into the light beams 2a of the plurality of spatial regions.
The light shielding unit 22 outputs the light beams 2a of the plurality of spatial regions to the lens array 23.
Each lens 23a included in the lens array 23 receives the light beam 2a of the corresponding spatial region from the light shielding unit 22 and condenses the light beam 2a of that spatial region on the light receiving surface 25a of the photodetector 25.
The shutter 24 temporally limits the passage of the light beams 2a output from the lens array 23 in accordance with a control signal output from the control device 49 in order to adjust the amount of light of the light beams 2a received by the photodetector 25.
When the exposure time of the light beam 2a at the photodetector 25 becomes longer than the coherence time, the state of the atmosphere changes, so the spread of the image of the moving body 1 detected by the photodetector 25 increases. The coherence time is generally about 1 to 10 ms.
When the moving body 1 is moving at high speed, the control device 49 controls the passage time of the light beam 2a through the shutter 24 so that the exposure time of the light beam 2a at the photodetector 25 is shorter than the coherence time.
When the control device 49 controls the passage time of the light beam 2a through the shutter 24 as described above, the amount of light of the light beam 2a received by the photodetector 25 may become small. In that case, the control device 49 controls the shutter 24 so that passage and blocking of the light beam 2a are repeated a plurality of times, so that the photodetector 25 can detect the image of the moving body 1 a plurality of times; a sketch of this exposure planning follows.
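A minimal sketch of this exposure planning follows, assuming a required total light-collection time and a coherence time in the 1 to 10 ms range mentioned above; the function and its parameters are hypothetical.

    import math

    def plan_exposures(total_exposure_s, coherence_time_s=0.005):
        # Split the required total exposure into repeats, each kept shorter
        # than the coherence time so the atmospheric state is effectively frozen.
        n = max(1, math.ceil(total_exposure_s / coherence_time_s))
        return n, total_exposure_s / n   # (number of exposures, per-exposure time)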
The photodetector 25 detects, from each light beam 2a that has passed through the shutter 24, a focused spot image as an image of the moving body 1, and outputs an intensity image indicating the plurality of focused spot images to the wavefront estimation unit 26 (step ST3 in FIG. 5).
Here, FIG. 6 is an explanatory diagram showing the images of the moving body 1 detected by the photodetector 25.
FIG. 6 shows an example in which there are three atmospheric layers between the ground and the sky: 101 is a first atmospheric layer, 102 is a second atmospheric layer, and 103 is a third atmospheric layer.
In FIG. 6, the change of the light beam by the imaging optical system 11 and the relay optical system 14 is omitted.
Each lens 23a included in the lens array 23 condenses the light beam 2a of the corresponding spatial region on the light receiving surface 25a of the photodetector 25, so that images 104 of the moving body 1 are formed on the light receiving surface 25a of the photodetector 25, one for each spatial region.
The plurality of images 104 formed on the light receiving surface 25a of the photodetector 25 have a spread caused by atmospheric fluctuation and can be used for wavefront estimation.
When the moving body 1 is moving relative to the telescope device 10, the propagation paths of the light beam 2 differ when the direction in which the moving body 1 is viewed from the ground differs, so that the wavefront differs depending on the propagation path.
FIG. 7 is an explanatory diagram showing images of the moving body 1 detected by the photodetector 25 when the wavefront differs depending on the propagation path. In FIG. 7, the same reference numerals as those in FIG. 6 denote the same or corresponding parts.
Each of the light beam 4, the light beam 5, and the light beam 6 is a light beam reflected by the moving body 1 or a light beam transmitted from the moving body 1. The contributions of the fluctuations of the intervening atmospheric layers differ among the light beam 4, the light beam 5, and the light beam 6, and their propagation paths differ from one another.
If the light beams 2a of the plurality of spatial regions condensed by the lens array 23 on the light receiving surface 25a of the photodetector 25 are regarded as the light beam 4, the light beam 5, and the light beam 6, respectively, an image 105 of the moving body 1 is formed on the light receiving surface 25a by each of the light beam 4, the light beam 5, and the light beam 6.
The plurality of images 105 formed on the light receiving surface 25a of the photodetector 25 have a spread caused by atmospheric fluctuation and can be used for wavefront estimation.
FIG. 8 and FIG. 9 are explanatory diagrams showing the relationship between the image of the moving body 1 detected by the photodetector 25 and the wavefront.
FIG. 8 shows an example in which the light beam 2 propagates without spreading with respect to the traveling direction, and FIG. 9 shows an example in which the light beam 2 propagates while spreading with respect to the traveling direction.
In FIG. 8 and FIG. 9, 105a is the image of the moving body 1 when the light beam 2 propagates without spreading with respect to the traveling direction, and 105b is the image of the moving body 1 when the light beam 2 propagates while spreading with respect to the traveling direction.
When the light beam 2 propagates without spreading with respect to the traveling direction, as shown in FIG. 8, the positions of the images 105a of the moving body 1 condensed by the respective lenses 23a of the lens array 23 coincide with the respective positions of the plurality of spatial regions divided by the light shielding unit 22.
When the light beam 2 propagates while spreading with respect to the traveling direction, as shown in FIG. 9, the positions of the images 105b of the moving body 1 condensed by the respective lenses 23a of the lens array 23 are shifted from the respective positions of the plurality of spatial regions divided by the light shielding unit 22.
The wavefront 106a is obtained from the positions of the plurality of images 105a of the moving body 1, and the wavefront 106b is obtained from the positions of the plurality of images 105b of the moving body 1. The standard relation between such spot displacements and the local wavefront slope is sketched below.
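For reference, the conventional small-angle relation for a lenslet-type sensor can be sketched as follows; the document derives the wavefront from the spot positions but does not state this formula, so it is given here only as the standard relation, not as the document's method.

    def wavefront_slope(spot_shift_m, focal_length_m):
        # Local wavefront slope [rad]: spot displacement on the detector
        # divided by the lenslet focal length (small-angle approximation).
        return spot_shift_m / focal_length_m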
FIGS. 8 and 9 show an example in which the plurality of spatial regions divided by the light shielding unit 22 are arranged in a grid pattern. However, the arrangement is not limited to this; for example, the plurality of spatial regions may be arranged in a honeycomb arrangement.
FIGS. 8 and 9 also show an example in which the light shielding region of the light shielding unit 22, that is, the region other than the transmission region through which the light beam 2 passes, is painted black. However, the light shielding region only needs to keep unnecessary light from passing, and may be painted a color other than black.
In the light shielding unit 22, the light shielding region may also be colored or processed so as to absorb unnecessary light, or colored or processed so as to scatter unnecessary light.
When the wavefront estimation unit 26 receives from the photodetector 25 an intensity image indicating a plurality of focused spot images, it estimates the wavefront of the light beam 2 at the aperture of the imaging optical system 11 from the positions of the plurality of focused spot images indicated by the intensity image (step ST4 in FIG. 5).
FIG. 10 is a flowchart showing the processing procedure of the wavefront estimation unit 26.
Hereinafter, the processing content of the wavefront estimation unit 26 will be described in detail with reference to FIG. 10.
First, when the wavefront estimation unit 26 receives from the photodetector 25 an intensity image indicating a plurality of focused spot images, it calculates an approximate value of the wavefront of the light beam 2 at the aperture of the imaging optical system 11 from the positions of the plurality of focused spot images indicated by the intensity image (step ST11 in FIG. 10).
Since the process itself of calculating the approximate value of the wavefront from the positions of the plurality of focused spot images is a known technique, a detailed description thereof is omitted.
A method of estimating the wavefront from the positions of a plurality of focused spot images is disclosed, for example, in Non-Patent Document 1 below.
[Non-Patent Document 1] Report of the National Astronomical Observatory of Japan, vol. 2, No. 2
Here, it is assumed that the control device 49 controls the shutter 24 so that passing and blocking of the light beam 2a are repeated a plurality of times, and that N intensity images are therefore output from the photodetector 25 to the wavefront estimation unit 26.
The wavefront estimation unit 26 then calculates the approximate wavefront from the positions of the plurality of focused spot images shown by the n-th intensity image (n = 1, 2, ..., N) of the N intensity images.
When the moving body 1 is a point image, or can be approximated by a point image, the position of each focused spot image can be taken as the position of the centroid of the point image.
When the moving body 1 is an extended object, the wavefront can be obtained from the spacing between the focused spot images or from their relative positions. In that case, the positions of the focused spot images can be obtained from the cross-correlation of the focused spot images or from the spacing between characteristic positions in the focused spot images; see the sketch below.
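As a rough illustration of the centroid calculation mentioned above, the following is a minimal numpy sketch; the subaperture grid geometry, the function names, and the grid size are assumptions of this sketch, not details taken from the patent.

```python
import numpy as np

def spot_centroid(subimage):
    """Intensity-weighted centroid (x, y) of one focused spot image."""
    y_idx, x_idx = np.indices(subimage.shape)
    total = subimage.sum()
    return (x_idx * subimage).sum() / total, (y_idx * subimage).sum() / total

def spot_positions(intensity_image, grid=10):
    """Split the intensity image into a hypothetical grid x grid array of
    subapertures and return the spot centroid inside each one."""
    h, w = intensity_image.shape
    sh, sw = h // grid, w // grid
    positions = np.zeros((grid, grid, 2))
    for r in range(grid):
        for c in range(grid):
            sub = intensity_image[r*sh:(r+1)*sh, c*sw:(c+1)*sw]
            positions[r, c] = spot_centroid(sub)
    return positions  # spot offsets correspond to local wavefront slopes
```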
The wavefront estimation unit 26 denotes by Φ_{0,n} the approximate wavefront phase calculated from the positions of the plurality of focused spot images shown by the n-th intensity image.
As shown in equation (1) below, the wavefront estimation unit 26 uses the phase Φ_{0,n} as the initial value of the phase Φ_n(u,v) of the wavefront of the light beam 2 at the aperture of the imaging optical system 11, and thereby estimates a wavefront that is more accurate than the approximate value. Here, (u,v) are coordinates in pupil space.
$$\Phi_n(u,v) = \Phi_{0,n}(u,v) \tag{1}$$
Before describing the high-accuracy wavefront estimation processing itself, the principle of that processing and the principle of the luminance distribution estimation processing for the moving body 1 are explained below.
FIG. 11 is an explanatory diagram showing the relationship among the aperture of the imaging optical system 11, the apertures of the plurality of spatial regions, and the image 105 of the moving body 1 when the moving body 1 is moving relative to the telescope device 10.
M_0(u,v) is the aperture of the imaging optical system 11.
M_1(u,v), M_2(u,v), ..., M_M(u,v) are the apertures of the plurality of spatial regions. The subscript M is an integer equal to or greater than 2, and m = 1, 2, ..., M.
The pupil function G_{m,n}(u,v), which represents the wavefront aberration and the amplitude distribution on the pupil, is expressed by the phase Φ_n(u,v) of the wavefront of the light beam 2 at the aperture of the imaging optical system 11 corresponding to the n-th intensity image and by the aperture M_m(u,v), as shown in equation (2) below.
$$G_{m,n}(u,v) = M_m(u,v)\,\exp\bigl\{i\,\Phi_n(u,v)\bigr\} \tag{2}$$
The aperture M_m(u,v) is known, and the initial value of the phase Φ_n(u,v) is the approximate wavefront phase Φ_{0,n}, so the pupil function G_{m,n}(u,v) can be calculated from the phase Φ_n(u,v) and the aperture M_m(u,v).
The relationship between the phase Φ_n(u,v) and the wavefront W_n(u,v) is expressed by the following equation (3).
$$\Phi_n(u,v) = \frac{2\pi}{\lambda}\,W_n(u,v) \tag{3}$$
In equation (3), λ is the wavelength.
The amplitude spread function a_{m,n}(x,y) is obtained by taking the inverse Fourier transform of the pupil function G_{m,n}(u,v), as shown in equation (4) below.
$$a_{m,n}(x,y) = \mathcal{F}^{-1}\bigl[\,G_{m,n}(u,v)\,\bigr] \tag{4}$$
In equation (4), F^{-1} denotes the inverse Fourier transform.
The point spread function k_{m,n}(x,y), which represents the point image intensity distribution, is the product of the amplitude spread function a_{m,n}(x,y) and its complex conjugate, as shown in equation (5) below. Here, (x,y) are coordinates in real space.
$$k_{m,n}(x,y) = a_{m,n}(x,y)\,a_{m,n}^{*}(x,y) = \bigl|a_{m,n}(x,y)\bigr|^{2} \tag{5}$$
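Equations (2), (4) and (5) translate directly into a few lines of numpy. The following is a minimal sketch under the assumption that the aperture mask and the phase map are given as same-sized 2-D arrays; the FFT shift convention used here is also an assumption of this sketch.

```python
import numpy as np

def point_spread_function(aperture_m, phase_n):
    """PSF k_{m,n}(x, y) from the aperture M_m(u, v) and the phase map
    Phi_n(u, v), following equations (2), (4) and (5)."""
    G = aperture_m * np.exp(1j * phase_n)      # pupil function, eq. (2)
    a = np.fft.ifft2(np.fft.ifftshift(G))      # amplitude spread function, eq. (4)
    return np.abs(a) ** 2                      # point spread function, eq. (5)
```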
If the luminance distribution of the moving body 1 is denoted by o(p,q) and the noise generated in the photodetector 25 by e_{m,n}(x,y), the image i_{m,n}(x,y) of the moving body 1 corresponding to the aperture M_m(u,v) of the m-th spatial region is expressed by equation (6) below. Here, (p,q) are real-space coordinates indicating the position where the moving body 1 exists.
The luminance distribution o(p,q) of the moving body 1 is the intensity of the light beam 2 reflected by the moving body 1 or the intensity of the light beam 2 transmitted from the moving body 1.
$$i_{m,n}(x,y) = \sum_{p}\sum_{q} k_{m,n}(x-p,\,y-q)\,o(p,q) + e_{m,n}(x,y) \tag{6}$$
Writing the convolution integral in equation (6) with the symbol "*", equation (6) becomes equation (7) below.
$$i_{m,n}(x,y) = o(p,q) * k_{m,n}(x,y) + e_{m,n}(x,y) \tag{7}$$
In general, the image i_{m,n}(x,y) of the moving body 1 is obtained as the convolution of the point spread function k_{m,n}(x,y) with the luminance distribution o(p,q) of the moving body 1; in equations (6) and (7), the noise e_{m,n}(x,y) generated in the photodetector 25 is added to this convolution.
Equation (8) below gives the sum e of the squared differences between o(p,q)*k_{m,n}(x,y), the image of the moving body 1 obtained from its luminance distribution o(p,q) and the point spread function k_{m,n}(x,y), and i_{m,n}(x,y), the actually measured image of the moving body 1.
$$e = \sum_{m}\sum_{n}\sum_{x,y} \bigl\{\,o(p,q) * k_{m,n}(x,y) - i_{m,n}(x,y)\,\bigr\}^{2} \tag{8}$$
In equation (8), the point spread function k_{m,n}(x,y) is obtained from equations (2), (4) and (5). Therefore, the only unknown in equation (8) is the luminance distribution o(p,q) of the moving body 1.
The luminance distribution o(p,q) of the moving body 1 is found by searching for the o(p,q) that minimizes the sum of squared differences e.
The moving body 1 moves relative to the telescope device 10, and even when the directing device 16 changes the pointing direction of the imaging optical system 11, the relative motion between the moving body 1 and the telescope device 10 cannot be canceled completely.
Therefore, the relative position of the moving body 1 changes as the time t changes.
The number of times t changes and the number of frames, that is, the number of intensity images obtained by the wavefront estimation unit 26, need not be the same; however, if the number of frames is, for example, 10, intensity images at 10 points in time are obtained, so the frame numbers correspond to the time numbers.
Here, it is assumed that the luminance distribution o(p,q) of the moving body 1 does not depend on the frame and does not change, whereas the wavefront changes from frame to frame.
When searching for the luminance distribution o(p,q) of the moving body 1, the sum of squared differences can be considered in the phase space.
Equation (9) below is the Fourier transform of equation (8); the sum e of squared differences in equation (8) becomes the sum E of squared differences in the phase space.
$$E = \sum_{m}\sum_{n}\sum_{u,v} \bigl|\,O(u,v)\,K_{m,n}(u,v) - I_{m,n}(u,v)\,\bigr|^{2} \tag{9}$$
In equation (9), I_{m,n}(u,v) is the spectrum of i_{m,n}(x,y) and is expressed by equation (10) below.
$$I_{m,n}(u,v) = \mathcal{F}\bigl[\,i_{m,n}(x,y)\,\bigr] \tag{10}$$
In equation (10), F denotes the Fourier transform.
In equation (9), O(u,v) is the spectrum of o(p,q) and is expressed by equation (11) below.
$$O(u,v) = \frac{\displaystyle\sum_{m}\sum_{n} K_{m,n}^{*}(u,v)\,I_{m,n}(u,v)}{\displaystyle\sum_{m}\sum_{n} \bigl|K_{m,n}(u,v)\bigr|^{2} + \gamma} \tag{11}$$
Because of the noise e_{m,n}(x,y) generated in the photodetector 25, O(u,v) cannot simply be expressed as O(u,v) = I_{m,n}(u,v)/K_{m,n}(u,v); it is therefore expressed as in equation (11).
In equation (11), γ is a coefficient introduced to stabilize the solution.
K_{m,n}(u,v) is the autocorrelation of the pupil function G_{m,n}(u,v) and is expressed by equation (12) below. K_{m,n}(u,v) is an optical transfer function, although it is not normalized.
$$K_{m,n}(u,v) = \iint G_{m,n}^{*}(u',v')\,G_{m,n}(u'+u,\,v'+v)\,du'\,dv' \tag{12}$$
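In discrete form, K_{m,n} can be computed as the Fourier transform of the PSF k_{m,n}, which is equivalent to the autocorrelation of the pupil function in equation (12). The sketch below builds the γ-stabilized estimate of O(u,v) from a list of frames and their PSFs; the exact regularized quotient is our reading of equation (11), and the value of gamma is an arbitrary placeholder.

```python
import numpy as np

def object_spectrum(images, psfs, gamma=1e-3):
    """Estimate O(u, v) from frames i_{m,n} and PSFs k_{m,n}, eqs. (10)-(12)."""
    num = np.zeros_like(np.fft.fft2(images[0]))   # complex accumulator
    den = np.zeros(images[0].shape)
    for i_mn, k_mn in zip(images, psfs):
        I = np.fft.fft2(i_mn)          # spectrum of the frame, eq. (10)
        K = np.fft.fft2(k_mn)          # unnormalized OTF, cf. eq. (12)
        num += np.conj(K) * I
        den += np.abs(K) ** 2
    return num / (den + gamma)         # gamma-stabilized form of eq. (11)
```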
Substituting equation (11) into equation (9) yields equation (13) below.
$$E = \sum_{u,v}\left[\;\sum_{m}\sum_{n}\bigl|I_{m,n}(u,v)\bigr|^{2} \;-\; \frac{\Bigl|\sum_{m}\sum_{n} K_{m,n}^{*}(u,v)\,I_{m,n}(u,v)\Bigr|^{2}}{\sum_{m}\sum_{n}\bigl|K_{m,n}(u,v)\bigr|^{2} + \gamma}\;\right] \tag{13}$$
The sum of squared differences E in equation (13) is expressed in terms of the apertures M_m(u,v), the phase Φ_n(u,v), and the spectra I_{m,n}(u,v) of the images i_{m,n}(x,y) of the moving body 1; it does not depend on the unknown spectrum O(u,v) of the luminance distribution o(p,q) of the moving body 1.
The wavefront W_n(u,v) can therefore be estimated through equation (3) by finding the phase Φ_n(u,v) that minimizes the sum of squared differences Err shown in equation (14) below.
$$\mathrm{Err}(\Phi_n) = \sum_{u,v}\left[\;\sum_{m}\bigl|I_{m,n}(u,v)\bigr|^{2} \;-\; \frac{\Bigl|\sum_{m} K_{m,n}^{*}(u,v)\,I_{m,n}(u,v)\Bigr|^{2}}{\sum_{m}\bigl|K_{m,n}(u,v)\bigr|^{2} + \gamma}\;\right] \tag{14}$$
Even when the wavefront W_n(u,v) is estimated by finding the phase Φ_n(u,v) that minimizes the sum of squared differences Err, the luminance distribution o(p,q) of the moving body 1 can still be obtained; however, equation (14) does not depend on the luminance distribution o(p,q). It therefore cannot impose the strong computational constraint that o(p,q) be a real number greater than 0 in real space.
To impose the constraint that the luminance distribution o(p,q) of the moving body 1 is a real number greater than 0 in real space, a further constraint may be added to equation (8), which gives the sum of squared differences e in real space.
Equation (15) below gives the difference r_{m,n}(x,y) in real space.
$$r_{m,n}(x,y) = o(p,q) * k_{m,n}(x,y) - i_{m,n}(x,y) \tag{15}$$
When the photodetector 25 is an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor, the main noise sources are shot noise and readout noise.
The readout noise is assumed to follow a normal distribution with a median of 0 and a standard deviation of σ. The shot noise is proportional to the luminance distribution of the acquired frame.
Therefore, normalizing equation (15) by the noise yields equation (16).
$$\tilde{r}_{m,n}(x,y) = \frac{o(p,q) * k_{m,n}(x,y) - i_{m,n}(x,y)}{\sqrt{\sigma^{2} + i_{m,n}(x,y)}} \tag{16}$$
If the ratio of the real-space difference r_{m,n}(x,y) to the noise is larger than 1, the deviation is large; if it is 1, there is no deviation; and if it is smaller than 1, the deviation is small.
Instead of equation (8), the likelihood function shown in equation (17) below is introduced.
$$e = \sum_{m}\sum_{n}\sum_{x,y} d_{m}(x,y)\,\tilde{r}_{m,n}(x,y)^{2} \tag{17}$$
In equation (17), d_m(x,y) is the weight given to the difference r_{m,n}(x,y); for example, a frame with a large deviation has low reliability and is therefore given a small weight.
In real space, the amount of computation can also be reduced by setting the weight of regions to be computed to 1 and the weight of regions whose computation is to be skipped to 0.
The above are the principles of the wavefront estimation processing and of the luminance distribution estimation processing for the moving body 1.
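The noise-normalized, weighted cost of equations (16) and (17) can be evaluated as in the sketch below. The use of FFT-based (circular) convolution for o*k_{m,n}, the default value of sigma, and the function name are assumptions of this sketch.

```python
import numpy as np

def weighted_residual_sum(o_est, images, psfs, weights, sigma=2.0):
    """Cost e of eq. (17) built from the normalized residuals of eq. (16).

    sigma is the readout-noise standard deviation; the shot-noise variance
    is taken proportional to the frame brightness, so sqrt(sigma**2 + i_mn)
    is used as the per-pixel noise scale (our reading of eq. (16)).
    """
    O = np.fft.fft2(o_est)
    e = 0.0
    for i_mn, k_mn, d_m in zip(images, psfs, weights):
        model = np.real(np.fft.ifft2(O * np.fft.fft2(k_mn)))  # o * k_{m,n}
        r = (model - i_mn) / np.sqrt(sigma**2 + np.maximum(i_mn, 0.0))  # eq. (16)
        e += np.sum(d_m * r**2)                                          # eq. (17)
    return e
```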
Denoting the approximate wavefront phase by Φ_{0,n}, the wavefront estimation unit 26 sets Φ_{0,n} as the initial value of Φ_n(u,v) in equation (2).
The wavefront estimation unit 26 then evaluates equations (2), (4) and (5) to calculate the point spread function k_{m,n}(x,y) representing the point image intensity distribution (step ST12 in FIG. 10).
The wavefront estimation unit 26 Fourier-transforms the point spread function k_{m,n}(x,y) and the image i_{m,n}(x,y) of the moving body 1 to obtain the optical transfer function K_{m,n}(u,v) and the spectrum I_{m,n}(u,v) of the image i_{m,n}(x,y) (step ST13 in FIG. 10).
The wavefront estimation unit 26 substitutes the optical transfer function K_{m,n}(u,v) and the spectrum I_{m,n}(u,v) into equation (14) and calculates the sum of squared differences Err (step ST14 in FIG. 10).
After calculating Err, the wavefront estimation unit 26 determines whether the phase search processing has converged (step ST15 in FIG. 10).
One convergence criterion for the phase search is, for example, to judge that the search has converged when the calculated Err is equal to or less than a first tolerance set in advance. The Err calculated when convergence is judged is the minimum Err. The first tolerance is assumed to be stored, for example, in the internal memory of the wavefront estimation unit 26 or in the recording device 43.
Another convergence criterion is, for example, to calculate Err a preset number of times while changing the phase Φ_n(u,v), and to judge that the search has converged once the minimum Err among the calculated values has been identified.
If the phase search processing has not converged (step ST15 in FIG. 10: NO), the wavefront estimation unit 26 changes the phase Φ_n(u,v) in equation (2) (step ST16 in FIG. 10) and sets the changed phase Φ_n(u,v) in equation (2).
The wavefront estimation unit 26 then performs the processing of steps ST12 to ST15 again.
The changed phase Φ_n(u,v) may be any phase that has not yet been set in equation (2), but a phase that reduces the sum of squared differences Err is desirable.
If the phase search processing has converged (step ST15 in FIG. 10: YES), the wavefront estimation unit 26 ends the phase search processing.
When the phase search processing ends, the wavefront estimation unit 26 substitutes the phase Φ_n(u,v) for which the minimum Err was calculated into equation (3), thereby estimating the wavefront W_n(u,v) of the light beam 2 at the aperture of the imaging optical system 11 (step ST17 in FIG. 10).
The estimated wavefront W_n(u,v) is more accurate than the approximate wavefront calculated in step ST11.
The wavefront estimation unit 26 outputs the wavefront W_n(u,v) to the recording device 43. Time information indicating the calculation time of W_n(u,v) is attached to the wavefront W_n(u,v) output to the recording device 43.
The wavefront estimation unit 26 also outputs the point spread function k_{m,n}(x,y) corresponding to the phase Φ_n(u,v) with the minimum Err to the moving body restoration unit 42. Time information indicating the calculation time of k_{m,n}(x,y) is attached to the point spread function output to the moving body restoration unit 42.
Here, the wavefront estimation unit 26 outputs the point spread function k_{m,n}(x,y) to the moving body restoration unit 42, but it may instead output the wavefront W_n(u,v).
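Steps ST11 to ST17 amount to a search over candidate phase maps that keeps the one with the smallest Err and converts it to a wavefront via equation (3). The following is a minimal sketch under the assumption that a callback err_of_phi evaluates equation (14) for a candidate phase map; the random-perturbation proposal at step ST16 is an assumption of this sketch, since the patent only requires that some not-yet-tried phase be set.

```python
import numpy as np

def estimate_wavefront_phase(phi0, err_of_phi, wavelength,
                             n_trials=100, tol=1e-6):
    """Keep the candidate phase with the smallest Err (steps ST12-ST16),
    then convert the phase to a wavefront via eq. (3)."""
    rng = np.random.default_rng(0)
    phi_best = phi0
    err_best = err_of_phi(phi_best)             # ST12-ST14 for the initial phase
    for _ in range(n_trials):
        if err_best < tol:                      # ST15: converged
            break
        phi_try = phi_best + 0.05 * rng.standard_normal(phi_best.shape)  # ST16
        err_try = err_of_phi(phi_try)           # ST12-ST14 for the new phase
        if err_try < err_best:
            phi_best, err_best = phi_try, err_try
    w_n = wavelength * phi_best / (2.0 * np.pi)  # eq. (3): W_n = lambda*Phi_n/(2*pi)
    return w_n, phi_best                         # ST17
```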
The radar apparatus 30 receives the radar wave reflected by the moving body 1 and generates, based on the radar wave, a distance image indicating the distance to the moving body 1 (step ST5 in FIG. 5).
The generation of the distance image by the radar apparatus 30 is described in detail below.
The transmission/reception unit 31 transmits a radar wave, such as a microwave or millimeter wave, toward the moving body 1, and also outputs the radar wave to the distance image acquisition unit 32.
The transmission/reception unit 31 receives the radar wave reflected by the moving body 1 as a reflected wave, and outputs the reflected wave to the distance image acquisition unit 32.
The distance from the radar apparatus 30 to the moving body 1 is directly proportional to the time from when the radar apparatus 30 transmits the radar wave to when it receives the reflected wave.
The distance image acquisition unit 32 measures the time from when the transmission/reception unit 31 transmits the radar wave until the reflected wave is received.
Based on the measured time, the distance image acquisition unit 32 generates a distance image indicating the distance to the moving body 1 and outputs the distance image to the image deformation unit 41.
The generation of a distance image from the measured time is itself a known technique, so a detailed description is omitted.
Time information indicating the generation time of the distance image is attached to the distance image output from the distance image acquisition unit 32 to the image deformation unit 41.
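For a single pixel, the range follows directly from the round-trip time: since the radar wave travels to the moving body 1 and back, d = c·Δt/2. A one-function sketch (the function name is ours):

```python
# Speed of light in vacuum, m/s.
C = 299_792_458.0

def range_from_round_trip(delta_t):
    """Range to the target (m) from the measured round-trip time (s)."""
    return C * delta_t / 2.0

# A target at about 500 km returns its echo after roughly 3.34 ms:
# range_from_round_trip(3.336e-3) -> about 5.00e5 m
```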
The image deformation unit 41 deforms the distance image output from the distance image acquisition unit 32 so that it matches the intensity image output from the intensity image acquisition unit 13, and outputs the deformed distance image to the moving body restoration unit 42 (step ST6 in FIG. 5).
The deformation of the distance image by the image deformation unit 41 is described in detail below.
First, the image deformation unit 41 sequentially acquires intensity images at different times from the intensity image acquisition unit 13 and distance images at different times from the distance image acquisition unit 32.
When deforming, for example, the distance image generated at time t_c, the image deformation unit 41 selects, from among the acquired intensity images, the intensity image whose detection time t_d is closest to the generation time t_c.
Next, the image deformation unit 41 acquires the status information of the telescope device 10 and the status information of the radar apparatus 30 from the control device 49.
The status information of the telescope device 10 indicates the pointing direction of the imaging optical system 11, the scale of the intensity image, and the like.
The status information of the radar apparatus 30 indicates the pointing direction of the transmission/reception unit 31, the scale of the distance image, and the like.
The image deformation unit 41 checks the orientation of the intensity image by referring to the pointing direction of the imaging optical system 11, and checks the orientation of the distance image by referring to the pointing direction of the transmission/reception unit 31.
The image deformation unit 41 applies an affine transformation, such as a rotation, to the distance image generated at time t_c so that its orientation matches that of the selected intensity image.
The image deformation unit 41 also enlarges or reduces the affine-transformed distance image so that its scale matches that of the selected intensity image.
Here, the image deformation unit 41 performs an affine transformation and an enlargement or reduction as the processing for deforming the distance image. However, this is only an example; any processing may be used as long as the image deformation unit 41 deforms the distance image so as to align it with the intensity image.
Also, the image deformation unit 41 here deforms the distance image output from the distance image acquisition unit 32 so that it matches the intensity image output from the intensity image acquisition unit 13. However, this too is only an example; the image deformation unit 41 may instead deform the distance image so that it matches the intensity image output from the photodetector 25. A sketch of such an alignment follows.
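The following is a minimal scipy sketch of the rotation-plus-rescaling described above. The rotation angle and scale factor would be derived from the status information of the telescope device 10 and the radar apparatus 30 (pointing directions and image scales); passing them in directly as parameters is an assumption of this sketch.

```python
import numpy as np
from scipy import ndimage

def align_distance_image(distance_image, angle_deg, scale):
    """Rotate and rescale the distance image so that its orientation and
    scale match those of the selected intensity image."""
    rotated = ndimage.rotate(distance_image, angle_deg, reshape=False,
                             order=1, mode='nearest')    # affine rotation
    return ndimage.zoom(rotated, scale, order=1)         # enlarge / reduce
```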
The moving body restoration unit 42 estimates the luminance distribution o(p,q) of the moving body 1 from the distance image output from the image deformation unit 41, the intensity image output from the intensity image acquisition unit 13, and either the plurality of focused spot images detected by the photodetector 25 or the point spread function k_{m,n}(x,y) output from the wavefront estimation unit 26 (step ST7 in FIG. 5).
Here, the moving body restoration unit 42 estimates the luminance distribution o(p,q) using the point spread function k_{m,n}(x,y), but it may instead use the wavefront W_n(u,v) output from the wavefront estimation unit 26.
When estimating the luminance distribution o(p,q) of the moving body 1 from the wavefront W_n(u,v), the moving body restoration unit 42 substitutes W_n(u,v) into equation (3) to calculate the phase Φ_n(u,v). The moving body restoration unit 42 then substitutes the phase Φ_n(u,v) into equation (2) and evaluates equations (2), (4) and (5) to calculate the point spread function k_{m,n}(x,y).
The moving body restoration unit 42 estimates the luminance distribution o(p,q) of the moving body 1 from the distance image, the intensity image, and either the plurality of focused spot images or the calculated point spread function k_{m,n}(x,y).
The estimation of the luminance distribution of the moving body 1 by the moving body restoration unit 42 is described in detail below.
FIG. 12 is a flowchart showing the processing procedure of the moving body restoration unit 42.
FIG. 13 is an explanatory diagram showing the luminance distribution estimation processing of the moving body 1 by the moving body restoration unit 42.
In FIG. 13, the distance image 111 is the deformed distance image output from the image deformation unit 41.
The intensity image 112 is the intensity image output from the intensity image acquisition unit 13.
First, on receiving the distance image 111 from the image deformation unit 41, the moving body restoration unit 42 detects the region where the moving body 1 exists from the distance image 111 (step ST21 in FIG. 12).
That is, the moving body restoration unit 42 performs contour extraction processing that extracts the contour of the moving body 1 from the distance image 111. Contour extraction itself is a known technique, so a detailed description is omitted.
The moving body restoration unit 42 treats the region inside the extracted contour as the region where the moving body 1 exists, and the region outside the contour as the region where the moving body 1 does not exist.
Next, as shown in FIG. 13, the moving body restoration unit 42 generates a mask image 113 indicating that only a region encompassing the region where the moving body 1 exists is the processing target region used in the luminance distribution estimation processing of the moving body 1 (step ST22 in FIG. 12).
The processing target region encompasses the region where the moving body 1 exists; it may coincide with that region, or it may be larger. A region larger than the region where the moving body 1 exists may be, for example, a region exceeding the extracted contour of the moving body 1 by a margin corresponding to the shadow of the moving body 1. A margin of, for example, about 10% of the size of the region where the moving body 1 exists is conceivable. A sketch of such a mask generation follows.
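Steps ST21 and ST22 can be approximated in a few lines of scipy. Treating a fixed `background` value as "no radar return" and using a small dilation in place of a proper contour extraction are assumptions of this sketch.

```python
import numpy as np
from scipy import ndimage

def mask_from_distance_image(distance_image, background=0.0, margin_px=3):
    """Mask image of step ST22: 1 inside the region occupied by the moving
    body (plus a margin for its shadow), 0 elsewhere."""
    body = distance_image != background                 # ST21: pixels with returns
    body = ndimage.binary_fill_holes(body)              # interior of the contour
    mask = ndimage.binary_dilation(body, iterations=margin_px)  # ~10% margin
    return mask.astype(float)                           # weight d_m: 1 in, 0 out
```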
The moving body restoration unit 42 extracts, from the intensity image 112, the image i_{m,n}(x,y) of the moving body 1 inside the processing target region of the mask image 113 (step ST23 in FIG. 12). The intensity image 114 shown in FIG. 13 is an intensity image showing the image i_{m,n}(x,y) of the moving body 1 extracted from the intensity image 112 within the processing target region.
The moving body restoration unit 42 selects one image i_{m,n}(x,y) of the moving body 1 from among the one or more images i_{m,n}(x,y) of the moving body 1 contained in the processing target region.
The moving body restoration unit 42 substitutes the selected image i_{m,n}(x,y) and the point spread function k_{m,n}(x,y) corresponding to it into equation (16) to calculate the difference r_{m,n}(x,y).
The moving body restoration unit 42 repeats this processing until all the images i_{m,n}(x,y) of the moving body 1 contained in the processing target region have been selected and their differences r_{m,n}(x,y) calculated.
The moving body restoration unit 42 then substitutes all the calculated differences r_{m,n}(x,y) into equation (17) to calculate the sum of squared differences e (step ST24 in FIG. 12).
In equation (17), the moving body restoration unit 42 sets the weight d_m(x,y) corresponding to differences r_{m,n}(x,y) inside the processing target region to 1, and the weight d_m(x,y) corresponding to differences outside the processing target region to 0.
After calculating the sum of squared differences e, the moving body restoration unit 42 determines whether the luminance distribution estimation processing of the moving body 1 has converged (step ST25 in FIG. 12).
One convergence criterion is, for example, to judge that the estimation has converged when the calculated e is equal to or less than a second tolerance set in advance. The e calculated when convergence is judged is the minimum e. The second tolerance is assumed to be stored, for example, in the internal memory of the moving body restoration unit 42 or in the recording device 43.
Another convergence criterion is, for example, to calculate e a preset number of times while changing the luminance distribution o(p,q) of the moving body 1, and to judge that the estimation has converged once the minimum e among the calculated values has been identified.
If the luminance distribution estimation processing of the moving body 1 has not converged (step ST25 in FIG. 12: NO), the moving body restoration unit 42 changes the luminance distribution o(p,q) of the moving body 1 in equation (16) (step ST26 in FIG. 12) and performs the processing of steps ST24 to ST25 again.
The changed luminance distribution o(p,q) may be any luminance distribution that has not yet been set in equation (16), but a luminance distribution that reduces the sum of squared differences e is desirable.
If the luminance distribution estimation processing of the moving body 1 has converged (step ST25 in FIG. 12: YES), the moving body restoration unit 42 outputs, as the estimation result, the luminance distribution o(p,q) of the moving body 1 for which the minimum e was calculated to the recording device 43 (step ST27 in FIG. 12).
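One concrete way to realize the search of steps ST24 to ST27 is projected gradient descent on the weighted cost e, with the positivity constraint o(p,q) > 0 enforced by clipping. The patent only requires some search that decreases e; this particular update rule, the initial estimate, and the step size are assumptions of this sketch.

```python
import numpy as np

def estimate_luminance(images, psfs, mask, n_iters=200, step=0.1, sigma=2.0):
    """Search for the o(p, q) that minimizes the weighted cost e (ST24-ST27)."""
    o = np.maximum(np.mean(images, axis=0), 0.0)    # crude initial o(p, q)
    for _ in range(n_iters):
        grad = np.zeros_like(o)
        for i_mn, k_mn in zip(images, psfs):
            K = np.fft.fft2(k_mn)
            model = np.real(np.fft.ifft2(np.fft.fft2(o) * K))   # o * k_{m,n}
            r = mask * (model - i_mn) / (sigma**2 + np.maximum(i_mn, 0.0))
            # gradient of the weighted squared residual: correlate r with the PSF
            grad += np.real(np.fft.ifft2(np.fft.fft2(r) * np.conj(K)))
        o = np.maximum(o - step * grad, 0.0)        # ST26 with o kept positive
    return o                                        # ST27: estimated o(p, q)
```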
In the first embodiment described above, the moving body observation apparatus is configured so that the moving body restoration unit 42 detects the region where the moving body 1 exists from the distance image generated by the radar apparatus 30, and estimates the luminance distribution of the moving body 1 from the focused spot images, among the plurality of focused spot images detected by the photodetector 25, within the region where the moving body 1 exists and from the wavefront estimated by the wavefront estimation unit 26. The moving body observation apparatus can therefore estimate the luminance distribution of the moving body 1 without implementing a control mechanism that realizes high-speed movement of a lenslet array or the like.
Embodiment 2.
In the moving body observation apparatus of Embodiment 1, the moving body restoration unit 42 extracts the contour of the moving body 1 from the distance image 111 and treats the region inside the contour as the region where the moving body 1 exists.
Embodiment 2 describes a moving body observation apparatus in which a moving body restoration unit 70 detects, as the region where the moving body 1 exists, a region in which pixels of the distance image 111 having distance values larger than a first threshold and smaller than a second threshold are gathered.
FIG. 14 is a configuration diagram showing the moving body observation apparatus according to Embodiment 2.
In FIG. 14, the same reference numerals as in FIG. 1 denote the same or corresponding parts, so their description is omitted.
The moving body restoration unit 70 is realized by a computer such as a personal computer, or by a moving body restoration circuit or the like.
The moving body restoration unit 70 acquires orbit information from the orbit information recording unit 46 and, based on the orbit information, determines a first threshold L_ref1 indicating the lower limit of the distance and a second threshold L_ref2 indicating the upper limit of the distance.
The moving body restoration unit 70 detects, as the region where the moving body 1 exists, a region in which pixels of the distance image 111 having distance values larger than the first threshold L_ref1 and smaller than the second threshold L_ref2 are gathered.
The moving body restoration unit 70 estimates the luminance distribution of the moving body 1 from the focused spot images, among the plurality of focused spot images detected by the photodetector 25, within the region where the moving body 1 exists and from the wavefront estimated by the wavefront estimation unit 26.
Next, the moving body observation apparatus shown in FIG. 14 is described. Apart from the moving body restoration unit 70, it is the same as the moving body observation apparatus shown in FIG. 1, so only the processing of the moving body restoration unit 70 is described here.
First, the moving body restoration unit 70 acquires orbit information from the orbit information recording unit 46.
When the moving body 1 is a meteoroid, space debris, an artificial satellite, or the like orbiting the earth, the gravity acting on the moving body 1 is mainly that of the earth.
Assuming the two-body problem of the moving body 1 and the earth, the trajectory of the motion of the moving body 1 is represented by a conic section, as shown in FIG. 15. The trajectory of the motion of the moving body 1 corresponds to its orbit. The conic section is, for example, a parabola, a hyperbola, or an ellipse.
FIG. 15 is an explanatory diagram showing the orbit of the moving body 1.
Here, orbit information indicating the orbits of celestial bodies such as stars and comets is assumed to be known. Orbit information indicating the orbits of moving bodies such as artificial satellites and the International Space Station is also assumed to be known.
FIG. 16 is an explanatory diagram showing the luminance distribution estimation processing of the moving body 1 by the moving body restoration unit 70.
In FIG. 16, the same reference numerals as in FIG. 13 denote the same or corresponding parts.
In the distance image 111, L_1, L_2, and L_3 each indicate the distance from the radar apparatus 30 to a corresponding part of the moving body 1, as shown in FIG. 4, with L_1 < L_2 < L_3.
The moving body restoration unit 70 identifies the position of the moving body 1 by referring to the orbit information.
The moving body restoration unit 70 calculates an approximate value of the distance from the radar apparatus 30 to the moving body 1 from the position of the radar apparatus 30 and the position of the moving body 1.
Taking α as, for example, 1.0 to 2.5 times the largest outer dimension of the moving body 1, the moving body restoration unit 70 sets (approximate distance − α) as the first threshold L_ref1 and (approximate distance + α) as the second threshold L_ref2. This setting is only an example, and the first threshold L_ref1 and the second threshold L_ref2 may of course be set by other methods.
The largest outer dimension of the moving body 1 is, for example, its horizontal dimension if the moving body 1 is a horizontally long object, or its vertical dimension if the moving body 1 is a vertically long object.
Here, for convenience of explanation, it is assumed that L_ref1 < L_1 < L_2 < L_3 < L_ref2.
As shown in FIG. 16, the moving body restoration unit 70 identifies, among the plurality of pixels contained in the distance image 111, the region in which pixels having distance values larger than the first threshold L_ref1 and smaller than the second threshold L_ref2 are gathered.
The moving body restoration unit 70 treats the region inside the identified region as the region where the moving body 1 exists, and the region outside it as the region where the moving body 1 does not exist.
Next, as shown in FIG. 16, the moving body restoration unit 70 generates a mask image 113 indicating that only a region encompassing the region where the moving body 1 exists is the processing target region used in the luminance distribution estimation processing of the moving body 1.
The processing target region encompasses the region where the moving body 1 exists; it may coincide with that region, or it may be larger. A region larger than the region where the moving body 1 exists may be, for example, a region exceeding the contour of the moving body 1 by a margin corresponding to the shadow of the moving body 1. A sketch of this thresholding follows.
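The Embodiment 2 detection reduces to a band-pass threshold on the distance values. In the sketch below, range_estimate is the orbit-based approximate distance and alpha would be set from the body's largest outer dimension (roughly 1.0 to 2.5 times it); filling the interior of the detected region is an assumption of this sketch.

```python
import numpy as np
from scipy import ndimage

def mask_from_thresholds(distance_image, range_estimate, alpha):
    """Keep pixels whose distance value lies between L_ref1 and L_ref2."""
    l_ref1 = range_estimate - alpha                 # first threshold L_ref1
    l_ref2 = range_estimate + alpha                 # second threshold L_ref2
    region = (distance_image > l_ref1) & (distance_image < l_ref2)
    region = ndimage.binary_fill_holes(region)      # inside of the detected region
    return region.astype(float)
```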
Thereafter, in the same way as the moving body restoration unit 42 shown in FIG. 1, the moving body restoration unit 70 estimates the luminance distribution o(p,q) of the moving body 1 from the focused spot images within the region where the moving body 1 exists and the point spread function k_{m,n}(x,y) output from the wavefront estimation unit 26.
Within the scope of the present invention, the embodiments may be freely combined, any constituent element of each embodiment may be modified, and any constituent element of each embodiment may be omitted.
The present invention is suitable for a moving body observation apparatus and a moving body observation method that estimate the luminance distribution of a moving body.
1 moving body, 2, 2a, 4, 5, 6 light beam, 10 telescope device, 11 imaging optical system, 12 light beam splitting unit, 13 intensity image acquisition unit, 14 relay optical system, 15 wavefront measurement unit, 16 directing device, 21 space dividing unit, 22 light shielding unit, 23 lens array, 23a lens, 24 shutter, 25 photodetector, 25a light-receiving surface of photodetector, 26 wavefront estimation unit, 30 radar apparatus, 31 transmission/reception unit, 32 distance image acquisition unit, 33 directing device, 40 operation device, 41 image deformation unit, 42 moving body restoration unit, 43 recording device, 44 time calibration unit, 45 counter, 46 orbit information recording unit, 47 orbit calculation unit, 48 planning device, 49 control device, 51, 52 housing, 60 illumination light source, 61 shape of moving body 1, 62 intensity image, 63 distance image, 70 moving body restoration unit, 101 first atmospheric layer, 102 second atmospheric layer, 103 third atmospheric layer, 104, 105, 105a, 105b image of moving body, 106a, 106b wavefront, 111 distance image, 112, 114 intensity image, 113 mask image.

Claims (7)

1. A moving body observation device comprising:
an imaging optical system to collect a light beam reflected by a moving body or a light beam transmitted from the moving body;
a space dividing unit to divide the light beam collected by the imaging optical system into light beams of a plurality of spatial regions and to collect each of the light beams of the plurality of spatial regions;
a photodetector to detect, from each light beam collected by the space dividing unit, a focused spot image as an image of the moving body;
a wavefront estimation unit to estimate a wavefront of the light beam at an aperture of the imaging optical system from positions of a plurality of focused spot images detected by the photodetector;
a radar apparatus to receive a radar wave reflected by the moving body and to generate, based on the radar wave, a distance image indicating a distance to the moving body; and
a moving body restoration unit to detect, from the distance image generated by the radar apparatus, a region where the moving body exists, and to estimate a luminance distribution of the moving body from focused spot images, among the plurality of focused spot images detected by the photodetector, within the region where the moving body exists and from the wavefront estimated by the wavefront estimation unit.
2. The moving body observation device according to claim 1, further comprising:
an intensity image acquisition unit to detect an image of the moving body from the light beam collected by the imaging optical system and to output an intensity image showing the image of the moving body; and
an image deformation unit to deform the distance image generated by the radar apparatus so that the distance image matches the intensity image, and to output the deformed distance image to the moving body restoration unit.
3. The moving body observation device according to claim 1, further comprising an image deformation unit to deform the distance image generated by the radar apparatus so that the distance image matches an intensity image showing the plurality of focused spot images detected by the photodetector, and to output the deformed distance image to the moving body restoration unit.
4. The moving body observation device according to claim 1, wherein the moving body restoration unit extracts a contour of the moving body from the distance image generated by the radar apparatus and detects a region inside the contour as the region where the moving body exists.
5. The moving body observation device according to claim 1, wherein the moving body restoration unit:
determines, based on orbit information indicating an orbit of the moving body, a first threshold indicating a lower limit of distance and a second threshold indicating an upper limit of distance; and
detects, as the region where the moving body exists, a region in which pixels of the distance image having distance values larger than the first threshold and smaller than the second threshold are gathered.
6. The moving body observation device according to claim 1, wherein the wavefront estimation unit calculates an approximate value of the wavefront of the light beam at the aperture of the imaging optical system from the positions of the plurality of focused spot images detected by the photodetector, calculates a point image intensity distribution of the plurality of focused spot images using the approximate value, and estimates the wavefront from the point image intensity distribution and the plurality of focused spot images.
7. A moving body observation method comprising:
collecting, by an imaging optical system, a light beam reflected by a moving body or a light beam transmitted from the moving body;
dividing, by a space dividing unit, the light beam collected by the imaging optical system into light beams of a plurality of spatial regions and collecting each of the light beams of the plurality of spatial regions;
detecting, by a photodetector, from each light beam collected by the space dividing unit, a focused spot image as an image of the moving body;
estimating, by a wavefront estimation unit, a wavefront of the light beam at an aperture of the imaging optical system from positions of a plurality of focused spot images detected by the photodetector;
receiving, by a radar apparatus, a radar wave reflected by the moving body and generating, based on the radar wave, a distance image indicating a distance to the moving body; and
detecting, by a moving body restoration unit, from the distance image generated by the radar apparatus, a region where the moving body exists, and estimating a luminance distribution of the moving body from focused spot images, among the plurality of focused spot images detected by the photodetector, within the region where the moving body exists and from the wavefront estimated by the wavefront estimation unit.
PCT/JP2018/019366 2018-05-18 2018-05-18 Moving body observation device and moving body observation method WO2019220639A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2020518936A JP6771697B2 (en) 2018-05-18 2018-05-18 Mobile observation device and mobile observation method
PCT/JP2018/019366 WO2019220639A1 (en) 2018-05-18 2018-05-18 Moving body observation device and moving body observation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/019366 WO2019220639A1 (en) 2018-05-18 2018-05-18 Moving body observation device and moving body observation method

Publications (1)

Publication Number Publication Date
WO2019220639A1 true WO2019220639A1 (en) 2019-11-21

Family

ID=68540275

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/019366 WO2019220639A1 (en) 2018-05-18 2018-05-18 Moving body observation device and moving body observation method

Country Status (2)

Country Link
JP (1) JP6771697B2 (en)
WO (1) WO2019220639A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5610707A (en) * 1995-07-07 1997-03-11 Lockheed Missiles & Space Co., Inc. Wavefront sensor for a staring imager
US7551121B1 (en) * 2004-03-12 2009-06-23 Oceanit Laboratories, Inc. Multi-target-tracking optical sensor-array technology
US7405834B1 (en) * 2006-02-15 2008-07-29 Lockheed Martin Corporation Compensated coherent imaging for improved imaging and directed energy weapons applications
JP2012514796A * 2009-01-05 2012-06-28 Applied Quantum Technologies Inc. Multi-scale optical system
US20120261514A1 (en) * 2010-12-17 2012-10-18 The Johns Hopkins University System and Method of Solar Flux Concentration for Orbital Debris Remediation
JP2016118547A * 2014-12-17 2016-06-30 The Boeing Company Diversification of lenslet, beamwalk, and tilt for anisoplanatic image formation in large diameter telescope

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
DAYTON, D.C. ET AL.: "Performance simulations of a daylight low-order adaptive optics system with speckle postprocessing for observation of low-earth orbit satellites", OPTICAL ENGINEERING, vol. 36, no. 7, July 1997 (1997-07-01), pages 1910 - 1917, XP000658595, DOI: 10.1117/1.601397 *

Also Published As

Publication number Publication date
JPWO2019220639A1 (en) 2020-12-10
JP6771697B2 (en) 2020-10-21

Similar Documents

Publication Publication Date Title
US10641897B1 (en) Ladar system and method with adaptive pulse duration
US8355536B2 (en) Passive electro-optical tracker
US8218013B1 (en) Star sensing for an earth imaging sensor
US9759605B2 (en) Low-orbit satellite-borne image-spectrum associated detection method and payload
JP6570991B2 Diversification of lenslet, beamwalk, and tilt for anisoplanatic image formation in large aperture telescopes
US20110261193A1 (en) Passive electro-optical tracker
JP6509456B1 (en) Wavefront measurement device, wavefront measurement method, mobile object observation device, mobile object observation method
US5350911A (en) Wavefront error estimation derived from observation of arbitrary unknown extended scenes
JP6632468B2 (en) Moving object detecting device, observation system and moving object detecting method
WO2019220639A1 (en) Moving body observation device and moving body observation method
JP2009509125A (en) Method and apparatus for determining a position associated with an image
WO2011059530A2 (en) Passive electro-optical tracker
US20230055616A1 (en) Rayleigh-raman polychromatic laser guide star
JP6494885B1 (en) Wavefront measuring apparatus, wavefront measuring method, moving object observing apparatus, moving object observing method
JP6906732B1 (en) Mobile imaging system, operation plan setting device and mobile imaging method
US20240020852A1 (en) Initial orbit determination using angular velocity and angular acceleration measurements
JP7038940B1 (en) Shape and attitude estimation device, shape and attitude estimation method and shape and attitude estimation system
Kudak et al. QHY-174M-GPS camera as the device for photometry of artificial satellites
US11448483B1 (en) Projectile tracking and 3D traceback method
Conran et al. A New Technique to Define the Spatial Resolution of Imaging Sensors
US10366506B2 (en) Hyperacuity system and methods for real time and analog detection and kinematic state tracking
Brewer et al. Infrared seeker/sensor dynamic performance prediction model
Delaite et al. Performance of an Optical COTS Station for the wide-field Detection of Resident Space Objects
Allured et al. Measurements of the isopistonic angle using masked aperture interferometry
Ellis et al. Star sensing for an earth imaging sensor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18918914

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020518936

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18918914

Country of ref document: EP

Kind code of ref document: A1