WO2017138049A1 - Flying body and control system therefor - Google Patents


Info

Publication number: WO2017138049A1
Authority: WIPO (PCT)
Prior art keywords: base station, light, image, flying object, distance measuring
Application number: PCT/JP2016/004754
Other languages: French (fr), Japanese (ja)
Inventor
Takashi Horinouchi (堀ノ内 貴志)
Toshinori Hirose (廣瀬 俊典)
Original Assignee
Panasonic IP Management Co., Ltd. (パナソニックIPマネジメント株式会社)
Application filed by Panasonic IP Management Co., Ltd.
Publication of WO2017138049A1


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00 Equipment not otherwise provided for
    • B64D47/08 Arrangements of cameras
    • B64C AEROPLANES; HELICOPTERS
    • B64C13/00 Control systems or transmitting systems for actuating flying-control surfaces, lift-increasing flaps, air brakes, or spoilers
    • B64C13/02 Initiating means
    • B64C13/16 Initiating means actuated automatically, e.g. responsive to gust detectors
    • B64C13/18 Initiating means actuated automatically, e.g. responsive to gust detectors, using automatic pilot
    • B64D43/00 Arrangements or adaptations of instruments
    • B64F GROUND OR AIRCRAFT-CARRIER-DECK INSTALLATIONS SPECIALLY ADAPTED FOR USE IN CONNECTION WITH AIRCRAFT; DESIGNING, MANUFACTURING, ASSEMBLING, CLEANING, MAINTAINING OR REPAIRING AIRCRAFT, NOT OTHERWISE PROVIDED FOR; HANDLING, TRANSPORTING, TESTING OR INSPECTING AIRCRAFT COMPONENTS, NOT OTHERWISE PROVIDED FOR
    • B64F1/00 Ground or aircraft-carrier-deck installations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C15/00 Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00

Definitions

  • The present disclosure relates to a flying object that flies in the air and a control system for it.
  • Patent Document 2 discloses a technique related to position and orientation control.
  • Patent Document 2 discloses an apparatus that obtains distance distribution information (a distance image) of the surface below by performing stereo processing on images captured by a stereo camera, and that calculates the attitude angle of the apparatus itself from the distance information of many measurement points.
  • Patent Document 3 discloses a method of tracking a flying object including a light reflector with a surveying device.
  • The present disclosure provides a flying object control system that can stably control the attitude of a flying object even when a satellite positioning signal such as a GPS signal cannot be received because of an obstacle.
  • a flying object control system including a flying object and a base station.
  • the base station includes a light source unit that emits guide light having a predetermined wavelength.
  • The flying object includes a propulsion unit that generates driving force for flying in the air, a distance measuring unit that measures the distance from the flying object to objects in a predetermined space around it, an imaging unit that captures images of subjects in that space and generates image data, and a first controller that controls the operation of the flying object.
  • The first controller obtains the relative positional relationship of the flying object with respect to the base station based on the guide light image contained in the image data generated by the imaging unit and on information from the distance measuring unit.
  • With this configuration, the flying object can recognize its position relative to the base station based on the guide light from the base station. Therefore, if the base station is placed where the flying object can receive its guide light, the attitude of the flying object can be controlled stably even when a satellite positioning signal such as a GPS signal cannot be received because of an obstacle.
  • FIG. 4 is a block diagram showing a functional configuration of a base station in the second embodiment.
  • Flowchart showing processing of the flying object control system in the second embodiment
  • Diagram showing an example of the image B
  • FIG. 1 is a diagram illustrating a configuration of an aircraft control system according to the present disclosure.
  • The flying object control system 100 includes an automatically piloted unmanned flying object (a so-called "drone") 10 and a base station 50 that gives instructions to the flying object.
  • the flying object 10 captures the state of the bridge floor slab 73 and the bridge girder 74 with a camera while moving under the bridge.
  • the base station 50 is attached and fixed to a part of the bridge. Below the bridge, the GPS signal from the GPS satellite is shielded by the floor slab 73 and the bridge girder 74 and does not reach the flying object 10. For this reason, the flying object 10 cannot receive GPS signals, and position control becomes difficult.
  • The flying object 10 of the present embodiment therefore grasps its position relative to the base station 50 based on the guide light (light indicating the position of the base station 50) emitted from the base station 50, and performs position and attitude control based on that relative position (details are described later).
  • FIG. 2 is a view showing the appearance of the flying object 10.
  • FIG. 3A is a block diagram showing a functional configuration of the flying object 10.
  • the flying object 10 includes a main body 11 and a propulsion device 15 that generates a propulsive force of the flying object 10.
  • The propulsion devices 15 are attached to the tips of support portions 13 extending from the four corners of the main body 11.
  • a first camera 21 is attached to the side surface of the main body 11.
  • An inertial measuring device 17, a GPS positioning device 18, an omnidirectional distance measuring device 19, and a second camera 21b are attached to the upper side of the main body 11.
  • a communication unit 23 and a battery 25 are attached to the lower side of the main body 11.
  • a controller 16 is accommodated in the main body 11.
  • the propulsion device 15 includes a propeller and a motor that rotates the propeller.
  • The flying object 10 includes four propulsion devices 15, but the number of propulsion devices is not limited to four and may be, for example, five or more. By appropriately controlling the rotation speed of each propulsion device 15, the moving direction and flight state of the flying object 10 can be controlled.
  • the first camera 21 captures a subject (inspection target) and generates high-definition (for example, 4K) image data for inspection.
  • the first camera 21 includes an optical system and an image sensor such as a CCD or CMOS image sensor, and shoots a subject to generate image data.
  • the first camera 21 captures visible light and generates image data.
  • An image generated by imaging such visible light is hereinafter referred to as an “RGB image”.
  • the first camera 21 is attached to the side surface of the flying object 10 toward the upper side of the flying object 10.
  • the flying object 10 can take an image of a predetermined angle of view vertically above the flying object 10 during the flight. That is, the state of the lower side of the building can be photographed by causing the flying object 10 to fly under the building to be inspected.
  • the inertial measurement device 17 includes an acceleration sensor and a gyro sensor, and measures the acceleration and angular velocity of the flying object 10. Based on the output from the inertial measurement device 17, the behavior and attitude of the flying vehicle 10 are controlled.
  • the GPS positioning device 18 receives a signal from a GPS (Global Positioning System) satellite and measures the current position of the flying object 10.
  • the battery 25 is a power source that supplies a power supply voltage to each element of the flying vehicle 10.
  • the omnidirectional distance measuring device 19 captures a space around the distance measuring device 19 at a wide angle, and generates a distance image indicating a distance to an object existing in the space. Specifically, as shown in FIG. 4A, a substantially hemispherical space (hereinafter referred to as “measurement target space”) surrounding a portion where the light of the distance measuring device 19 enters and exits is photographed.
  • the range in which the distance measuring device 19 can shoot covers all directions (360 °) in the horizontal direction and covers a range of 180 ° or more in the vertical direction.
  • The distance measuring device 19 can be constituted by, for example, a TOF (Time of Flight) sensor capable of wide-angle imaging.
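  • The distance image produced by such a sensor rests on the basic time-of-flight relation: the sensor measures how long the ranging light takes to return, and the distance is half the round trip at the speed of light. A minimal sketch (the 66.7 ns delay below is an illustrative value, not from the patent):

```python
# Sketch of the TOF principle behind the distance image: the sensor measures
# the round-trip time of the ranging light, and distance is half the trip
# at the speed of light. The sample delay below is illustrative.

C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_seconds):
    """Distance to the reflecting object for a measured round-trip time."""
    return C * round_trip_seconds / 2.0

# A reflection arriving 66.7 ns after emission is roughly 10 m away.
print(round(tof_distance(66.7e-9), 2))  # 10.0
```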
  • FIG. 4B is a diagram showing a functional configuration of the distance measuring device 19.
  • the distance measuring device 19 includes a light receiving lens 192, a filter 193, a light receiving unit 194, a control unit 195 (an example of a second controller), a light source unit 197, and an irradiation lens 199.
  • the light source unit 197 emits light for measuring the distance to the object in the measurement target space (hereinafter referred to as “ranging light”) at a predetermined timing in accordance with the control of the control unit 195.
  • the distance measuring light is infrared light and has a predetermined wavelength (wavelength A).
  • the light source unit 197 includes a plurality of (for example, eight or more) laser diodes that output laser light having a predetermined wavelength (wavelength A) and a diffusion plate. Laser light from a plurality of laser diodes is diffused by a diffusion plate to become uniform light, and is irradiated in all directions through the irradiation lens 199 as distance measurement light.
  • the light receiving lens 192 is a wide-angle lens that can capture an image in a wide range (hemispherical range) as shown in FIG. 4A.
  • the irradiation lens 199 is a lens for diffusing and irradiating the distance measuring light at a wide angle so that the distance measuring light is irradiated to an object in a wide range as shown in FIG. 4A.
  • the filter 193 is an optical filter configured to transmit light of a wavelength component having a predetermined width centered on the wavelength of the distance measuring light (wavelength A) in the light received through the light receiving lens 192. By filtering in this way, the light receiving unit 194 receives only the reflected light of the distance measuring light (wavelength A).
  • the light receiving unit 194 includes an image sensor such as a CMOS image sensor, and generates image data based on the received light.
  • the light receiving unit 194 receives light through the filter 193 to generate image data of an image (hereinafter referred to as “IR image”) by reflected light of the distance measuring light (wavelength A).
  • the light source unit 197 emits distance measuring light.
  • the distance measuring light is reflected by an object existing in the measurement object space.
  • the reflected light is received by the light receiving unit 194 through the filter 193, and an IR image is generated.
  • the control unit 195 generates a distance image using information on the IR image generated by the light receiving unit 194.
  • the second camera 21b includes an optical system and an image sensor such as a CCD or a CMOS image sensor.
  • the second camera 21b captures visible light and generates RGB image data.
  • the RGB image data is image data used for ranging (hereinafter referred to as “ranging image data”).
  • the distance measurement image data generated by the second camera 21b has a lower resolution than the image data generated by the first camera 21.
  • the second camera 21b has the same angle of view as the distance measuring device 19, images the same measurement target space as the distance measuring device 19, and generates distance measurement image data. That is, in the present embodiment, distance measurement image data (RGB image data) by the second camera 21b and distance image data by the distance measuring device 19 are generated for the same measurement target space.
  • FIG. 5 is a diagram illustrating an appearance of the base station 50.
  • FIG. 6 is a block diagram showing a functional configuration of the base station 50.
  • The base station 50 includes a light source unit 51 that emits guide light, a communication unit 53 that communicates with the flying object 10, a battery 55 that supplies power, a controller 56 that controls the operation of the light source unit 51, and a GPS positioning device 58 that measures the current position of the base station 50.
  • the base station 50 includes a reflecting plate 52 around the light emitting part of the light source part 51.
  • the base station 50 includes a support member 59 for fixing to an inspection object such as a bridge.
  • the reflector 52 reflects the distance measuring light from the flying object 10.
  • the communication unit 53 includes a communication module for performing wireless communication with the flying object 10.
  • the controller 56 is configured by a programmable microcomputer or the like.
  • the GPS positioning device 58 receives a signal from a GPS satellite and measures the current position (absolute position) of the base station 50.
  • the GPS positioning device 58 includes an antenna 58a for receiving a signal from a GPS satellite.
  • the antenna 58a is installed at a position protruding from the bridge 71 so that a signal from the GPS satellite is not shielded by the floor slab 73 or the bridge girder 74.
  • the light source unit 51 includes an LED (light emitting diode) that emits guide light, a lens for irradiating light emitted from the LED at a wide angle, and a drive circuit that drives the LED.
  • the color (wavelength) of the guide light emitted from the light source unit 51 is set to a specific color (wavelength) that can be photographed by the first camera 21.
  • the flying object 10 flies in a posture in which the side to which the distance measuring device 19 is attached faces upward.
  • the flying object 10 moves in a state in which it always faces a certain direction with respect to the inspection object. That is, as shown in FIG. 3B, when the flying object 10 moves in the direction of the arrow, the flying object 10 always moves while facing a certain direction.
  • The flying object 10 flies below the inspection object (for example, the bridge floor slab 73 or bridge girder 74) in accordance with instructions from the base station 50 or preprogrammed instructions, and captures images of the surface of the inspection object with the first camera 21.
  • the flying object 10 grasps the relative positional relationship (position and orientation) with respect to the base station 50 based on the guide light from the base station 50.
  • the base station 50 measures its own absolute position by the GPS positioning device 58 and notifies the flying object 10 of the absolute position of the base station 50. Based on the absolute position of the base station 50 and the relative positional relationship with respect to the base station 50, the flying object 10 recognizes the absolute position of its own device and determines the flight path.
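  • The absolute-position step described above reduces to adding the measured relative offset to the base station's GPS fix. A minimal sketch, assuming a shared local east/north/up frame in metres (the coordinate values are hypothetical):

```python
# Sketch: the flying object's absolute position is the base station's GPS
# fix plus the relative offset obtained from the guide light and the
# distance image. A shared local east/north/up frame in metres is assumed;
# the coordinate values are hypothetical.

def absolute_position(base_enu, relative_offset):
    """Add the measured relative offset to the base station's position."""
    return tuple(b + r for b, r in zip(base_enu, relative_offset))

base = (0.0, 0.0, 12.5)      # base station fixed under the bridge
offset = (3.0, -4.0, 1.5)    # flying object's offset from the base (S19)
print(absolute_position(base, offset))  # (3.0, -4.0, 14.0)
```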
  • FIG. 7 is a flowchart showing processing in the flying object control system 100 according to the first embodiment. The operation of recognizing the relative positional relationship between the flying object 10 and the base station 50 in the flying object control system 100 will be described with reference to the flowchart of FIG.
  • the base station 50 emits guide light of a desired color from the light source unit 51 (S11).
  • the second camera 21b of the flying object 10 takes a wide-angle image of the measurement target space and generates distance measurement image data (S12).
  • the distance measurement image data generated by the second camera 21b is image data represented by RGB signals.
  • the controller 16 of the flying object 10 converts the ranging image represented by RGB into an HSV image represented by hue, saturation, and value (S13).
  • the controller 16 extracts a light source having a color close to the guide light from the HSV image, and detects the position of the guide light (that is, the light source 51 or the reflection plate 52) in the HSV image (S14).
  • FIG. 8A is a diagram showing an example of an HSV image generated by performing color conversion from an RGB image for distance measurement generated by the second camera 21b.
  • the center P00 indicates the position of the flying object 10
  • the star mark P01 indicates the position on the image that emits a color close to the guide light.
  • the controller 16 detects the position of the guide light (that is, the light source 51 or the reflection plate 52) by extracting a pixel including a color close to the guide light from the HSV image 60.
  • the controller 16 detects the orientation of the base station 50 from the position of the guide light detected on the HSV image (S15).
  • the azimuth D of the base station 50 is obtained from the position P01 of the guide light specified on the HSV image 60 shown in FIG. 8A.
  • The flying object 10 moves while keeping a fixed orientation with respect to the inspection object (for example, attitude control that keeps one longitudinal direction of the bridge always to the front of the flying object).
  • Therefore, the azimuth indicated in the image does not change greatly with the flight state of the flying object 10.
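  • Steps S13 to S15 above can be sketched as follows: convert each RGB pixel to HSV, keep the pixels whose hue is close to the guide-light colour, and derive an azimuth from the centroid's offset from the image centre. The hue target, thresholds, and angle convention below are assumptions for illustration, not values from the patent:

```python
import colorsys
import math

# Assumed guide-light colour and thresholds (illustrative, not from the patent).
GUIDE_HUE = 0.33               # hue of the guide light (green), on a 0..1 scale
HUE_TOL, MIN_SAT, MIN_VAL = 0.05, 0.5, 0.5

def detect_guide_light(rgb_image):
    """S13-S15 sketch: convert pixels to HSV, keep those whose hue matches
    the guide light, and return the centroid plus an azimuth measured from
    the image centre (0 degrees = image 'up', clockwise positive)."""
    hits = []
    for y, row in enumerate(rgb_image):
        for x, (r, g, b) in enumerate(row):
            h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
            if abs(h - GUIDE_HUE) < HUE_TOL and s > MIN_SAT and v > MIN_VAL:
                hits.append((x, y))
    if not hits:
        return None
    cx = sum(p[0] for p in hits) / len(hits)
    cy = sum(p[1] for p in hits) / len(hits)
    cy0, cx0 = len(rgb_image) / 2, len(rgb_image[0]) / 2
    azimuth = math.degrees(math.atan2(cx - cx0, -(cy - cy0)))
    return (cx, cy), azimuth

# A 5x5 frame with one bright green pixel above and slightly left of centre.
img = [[(0, 0, 0)] * 5 for _ in range(5)]
img[1][2] = (10, 250, 20)
centroid, azimuth = detect_guide_light(img)
print(centroid)  # (2.0, 1.0)
```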
  • the laser light (ranging light) of wavelength A is emitted from the light source unit 197 of the distance measuring device 19 of the flying object 10 through the irradiation lens 199 at a wide angle (S16). Then, the distance measuring device 19 shoots the measurement target space and generates a distance image (S17). Specifically, the light receiving unit 194 of the distance measuring device 19 receives the reflected light of the distance measuring light through the filter 193, thereby photographing the measurement target space and generating image data. The control unit 195 generates image data indicating a distance image based on the image data generated by the light receiving unit 194.
  • The controller 16 of the flying object 10 measures the distance from the flying object 10 to the guide light of the base station 50 (that is, to the light source unit 51 or the reflecting plate 52) based on the previously determined direction of the base station 50 and the position of the guide light (S18). Specifically, in the distance image, the controller 16 finds the position corresponding to the position of the guide light (the light source unit 51 or the like) previously obtained from the HSV image, and obtains the distance to that position.
  • FIG. 8B is a diagram showing an example of a distance image corresponding to the HSV image 60 shown in FIG. 8A. In the distance image 62, the pixel position P11 corresponding to the position P01 obtained from the HSV image 60 is found, and the distance to the pixel at position P11 is obtained.
  • the controller 16 of the flying object 10 obtains a relative positional relationship (position and orientation) with respect to the base station 50 based on the azimuth of the base station 50 and the distance to the base station 50 obtained as described above (S19).
  • The flying object 10 acquires the absolute position of the base station 50 from the base station 50. Since the absolute position of the base station 50 is known, the flying object 10 can recognize its own absolute position by recognizing its relative positional relationship with respect to the base station 50. Therefore, even when GPS signals or the like cannot be received because of an obstacle, the controller 16 of the flying object 10 can grasp the absolute position of its own device from the relative positional relationship with the base station 50 and control the flight path.
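  • Steps S18 and S19 combine the direction found in the image with the distance read from the distance image at the matching pixel. A minimal sketch that converts the base station's direction and distance into a Cartesian offset (the axis convention and sample numbers are assumptions, not from the patent):

```python
import math

def relative_position(azimuth_deg, elevation_deg, distance_m):
    """S18-S19 sketch: turn the base station's direction (from the HSV image)
    and its distance (read from the distance image at the matching pixel)
    into a Cartesian offset from the flying object. Axis convention is an
    assumption: x = right, y = forward, z = up."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    x = distance_m * math.cos(el) * math.sin(az)
    y = distance_m * math.cos(el) * math.cos(az)
    z = distance_m * math.sin(el)
    return x, y, z

# Base station seen 30 degrees right of forward, 60 degrees up, 10 m away.
x, y, z = relative_position(30.0, 60.0, 10.0)
print(round(x, 2), round(y, 2), round(z, 2))  # 2.5 4.33 8.66
```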
  • In the above embodiment, the RGB image data for distance measurement is generated by the second camera 21b, and the distance image data is generated by the distance measuring device 19.
  • Alternatively, one distance measuring device 19 may generate the RGB image data for distance measurement in addition to the distance image data.
  • In that case, the filter 193 is configured to have the characteristics shown in FIG. 9. FIG. 9 shows the transmission characteristics for four pixels for convenience of explanation; the pattern shown in FIG. 9 is applied repeatedly over the entire pixel region of the light receiving unit 194.
  • the filter 193 separates R, G, and B colors and infrared light (IR) for each pixel.
  • the light receiving unit 194 receives light through a filter 193 having a pattern as shown in FIG.
  • control unit 195 generates the image data of the distance image based on the image data of the IR image.
  • The controller 16 of the flying object 10 acquires the RGB image data for distance measurement and the distance image data from the distance measuring device 19, and recognizes the relative positional relationship with respect to the base station 50 using these image data in the same manner as described above.
  • The configuration of the distance measuring device 19 is not limited to the one described above; any other configuration may be used as long as it can capture a distance image in the infrared wavelength region and an image in the visible wavelength region.
  • As the distance measuring device used in the flying object control system 100, a distance measuring device having the configuration disclosed in Japanese Patent Application Laid-Open No. 2008-70374 can also be applied.
  • the controller 56 of the base station 50 may transmit the guide light emitted from the light source unit 51 after modulating it with a modulation signal.
  • the modulation signal may include, for example, identification information for identifying the base station.
  • The flying object 10 can demodulate the signal extracted from the RGB image obtained by photographing the guide light, extract the identification information carried by the modulation signal, and thereby confirm that the guide light was emitted from the base station 50. This makes it possible to distinguish the guide light from disturbance light and prevents the position of the base station 50 from being misidentified because of disturbance light.
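  • The identification scheme suggested here can be sketched as simple on-off keying: the flying object samples the brightness of the guide-light pixel across frames, thresholds it into bits, and compares the result with the expected base-station ID. The threshold, frame values, and 4-bit ID below are illustrative assumptions:

```python
# Sketch of the modulated guide light idea: the base station on-off keys
# its LED, and the flying object thresholds the brightness of the
# guide-light pixel across frames to recover an identification code.
# The threshold, frame values, and 4-bit ID are illustrative assumptions.

THRESHOLD = 128  # brightness above this counts as "light on"

def demodulate(brightness_samples):
    """One bit per frame: bright frame -> 1, dark frame -> 0."""
    return [1 if b > THRESHOLD else 0 for b in brightness_samples]

def is_own_base_station(brightness_samples, expected_id):
    """Accept the light source only if the recovered bits match the ID."""
    return demodulate(brightness_samples) == expected_id

frames = [250, 10, 245, 240]                      # observed pixel brightness
print(is_own_base_station(frames, [1, 0, 1, 1]))  # True
print(is_own_base_station(frames, [0, 1, 0, 0]))  # False
```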
  • the control system 100 is a control system including the flying object 10 and the base station 50.
  • the base station 50 includes a light source unit 51 that emits guide light having a predetermined wavelength (specific color).
  • The flying object 10 includes a propulsion device 15 (an example of a propulsion unit) that generates propulsive force for flying in the air, an omnidirectional distance measuring device 19 (an example of a distance measuring unit) that measures the distance to objects in a predetermined space around the flying object 10, a second camera 21b (an example of an imaging unit), and a controller 16 (an example of a first controller) that controls the operation of the flying object.
  • The controller 16 obtains the relative positional relationship of the flying object with respect to the base station 50 based on the guide light image included in the image data generated by the second camera 21b and the information from the omnidirectional distance measuring device 19.
  • Since the flying object 10 can grasp its own relative positional relationship using the guide light from the base station 50, it can grasp its own position even when GPS signals or the like cannot be received because of an obstacle.
  • In addition, by using light rather than sound waves or radio waves as the medium, the influence of multipath can be eliminated.
  • In the present embodiment, the base station 50 includes an LED and radiates the light emitted from the LED as guide light.
  • Alternatively, the base station may emit laser light as the guide light.
  • FIG. 10A is a diagram showing a functional configuration of the flying object 10b in the second embodiment.
  • the flying object 10b according to the second embodiment is different from the structure of the flying object 10 according to the first embodiment in that the second camera 21b is not provided.
  • the omnidirectional distance measuring device 19b is different from the configuration of the first embodiment in that it receives infrared light of two types of wavelengths (wavelength A and wavelength B) and generates a captured image for each wavelength.
  • the filter 193 of the omnidirectional distance measuring device 19b of Embodiment 2 has a transmission characteristic as shown in FIG. 10B.
  • FIG. 10B shows the transmission characteristics for four pixels for convenience of explanation; the pattern shown in FIG. 10B is applied repeatedly over the entire pixel region of the light receiving unit 194.
  • FIG. 10B shows filter characteristics that transmit either wavelength A or wavelength B on a per-pixel basis.
  • the light receiving unit 194 of the omnidirectional distance measuring device 19b can receive light of wavelength A or light of wavelength B for each pixel.
  • the light receiving unit 194 of the omnidirectional distance measuring device 19b can generate image data of an image based on light received at a wavelength A and image data of an image based on light received at a wavelength B.
  • the control unit 195 generates distance image data based on a signal from a pixel that receives the wavelength A.
  • FIG. 11 is a diagram showing a configuration of the base station 50b in the second embodiment.
  • In the first embodiment, the light source unit 51 includes an LED. In contrast, the light source unit 51b includes a laser diode that emits laser light having a wavelength B different from the wavelength A of the ranging light. That is, in the second embodiment, the ranging light and the guide light are both laser light, but their wavelengths (wavelength A and wavelength B) differ.
  • FIG. 12 is a flowchart showing processing of the flying object control system 100 according to the second embodiment. The operation of the flying object control system 100 according to the second embodiment will be described with reference to the flowchart of FIG.
  • the base station 50b radiates guide light having a wavelength B from the light source unit 51b (S21).
  • the distance measuring device 19b of the flying object 10 irradiates laser light (ranging light) having a wavelength A from the light source unit 197 at a wide angle (S22).
  • The distance measuring device 19b takes a wide-angle image of the measurement target space, and generates image data of an image based on light received at wavelength A (hereinafter, "image A") and of an image based on light received at wavelength B (hereinafter, "image B") (S23).
  • the controller 16 detects the position of the guide light (that is, the light source unit 51 or the reflection plate 52) from the image B (S24).
  • FIG. 13A is a diagram illustrating an example of an image B.
  • the center P00 indicates the position of the flying object 10
  • the star P02 indicates the position of the guide light.
  • the controller 16 can detect the position of the guide light (that is, the light source unit 51 or the reflection plate 52) based on the luminance of each pixel of the image B (60b).
  • the controller 16 detects the azimuth of the base station 50b from the position of the guide light detected on the image B (S25).
  • the direction D1 of the base station 50 is obtained from the position P02 of the guide light detected on the image B (60b) shown in FIG. 13A.
  • the control unit 195 of the distance measuring device 19 generates image data indicating a distance image based on the image data of the image A (S26).
  • FIG. 13B is a diagram illustrating an example of a distance image obtained from the image A.
  • In the distance image 62b, the pixel position P22 corresponding to the position P02 obtained from the image B (60b) is found, and the distance to the pixel at position P22 is obtained.
  • The controller 16 obtains a relative positional relationship (position and orientation) with respect to the base station 50b based on the orientation of the base station 50b and the distance to the base station 50b obtained as described above (S28).
  • Even when laser light is used for both the ranging light and the guide light in this way, making their wavelengths different from each other allows the desired control based on each light to be performed. Thereby, the flying object 10 can obtain its relative positional relationship with respect to the base station 50b based on the guide light.
  • FIG. 14A is a diagram showing a functional configuration of the flying object 10c in the third embodiment.
  • the flying object 10c of the third embodiment is different from the structure of the flying object 10b of the second embodiment in the filter of the omnidirectional distance measuring device. That is, in the third embodiment, as in the first embodiment, the filter 193 has a characteristic of transmitting light of wavelength A.
  • FIG. 14B is a diagram illustrating the configuration of the base station 50c according to the third embodiment.
  • the light source unit 51b outputs guide light (laser light) having a wavelength different from the wavelength of the distance measuring light.
  • the light source unit 51c according to the present embodiment outputs guide light (laser light) having the same wavelength as the distance measuring light (that is, wavelength A).
  • FIG. 15 is a flowchart showing processing of the flying object control system 100 according to the third embodiment. The operation of the flying object control system 100 according to the third embodiment will be described with reference to the flowchart of FIG.
  • the base station 50c irradiates guide light having a wavelength A from the light source unit 51c (S31). While the guide light is emitted from the light source unit 51c, as shown in FIG. 16, the flying object 10 stops the irradiation of the distance measuring light.
  • the distance measuring device 19c takes a wide-angle image of the measurement target space, and generates image data of an image (image A) based on the received light of wavelength A (S32).
  • the controller 16 detects the position of the guide light (that is, the light source unit 51) from the image A (S33). Further, the controller 16 detects the azimuth of the base station 50c from the position of the guide light detected on the image A (S34).
  • Next, the ranging device 19c of the flying object 10 emits laser light of wavelength A (ranging light) at a wide angle (S35). While the laser light (ranging light) is being emitted from the distance measuring device 19c, the base station 50c stops emitting the guide light, as shown in FIG. 16. The distance measuring device 19c then photographs the measurement target space at a wide angle, generates image data of image A based on the received light of wavelength A, and generates image data indicating a distance image based on the image data of image A (S36). In this manner, the irradiation timing of the ranging light from the flying object 10 differs from the irradiation timing of the guide light from the base station 50c (see FIG. 16). Thereby, even when the same wavelength is used for the ranging light and the guide light, the desired control based on each light can be performed without the two lights interfering with each other.
  • The controller 16 of the flying object 10 measures the distance to the base station 50c (that is, to the guide light, the light source 51, or the reflector 52) based on the azimuth of the base station 50c obtained in step S34 and the position of the guide light obtained in step S33 (S37).
  • The controller 16 then obtains the relative positional relationship (position and orientation) with respect to the base station 50c based on the azimuth of the base station 50c and the distance to the base station 50c obtained as described above (S38).
  • To realize the exclusive irradiation described above, the flying object 10 and the base station 50 need to accurately synchronize, in advance, the time that each of them manages internally.
  • Each of the flying object 10 and the base station 50 then irradiates wavelength-A light (ranging light or guide light) exclusively at its predetermined timing (see FIG. 16) so that the irradiation timings do not overlap each other.
  • Alternatively, the base station 50 and the flying object 10 may communicate with each other to control their irradiation timings exclusively.
  • For example, when one of the flying object 10 and the base station 50 stops emitting light of wavelength A (ranging light or guide light), it may notify the other device to that effect.
  • When the other device receives the notification of the stop of irradiation, it may then irradiate light of wavelength A for a certain period of time.
  • As described above, by controlling the timings of the guide light irradiation and the ranging light irradiation so that the two lights are emitted exclusively, the guide light and the ranging light can be distinguished and detected even though they have the same wavelength. Therefore, the flying object 10 can obtain its relative positional relationship with respect to the base station 50c based on the guide light even when the wavelengths of the guide light and the ranging light are the same.
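The time-division scheme described above can be sketched as a simple slot scheme. This is a minimal simulation under assumptions: the slot length and the even/odd assignment are illustrative, as the disclosure only requires that the two irradiation periods never overlap and that the clocks of both devices are synchronized in advance.

```python
# Minimal sketch of time-division sharing of wavelength-A light between the
# base station (guide light) and the flying object (ranging light).
# Assumes both clocks were accurately synchronized beforehand, as the text
# requires; SLOT_MS is an illustrative value, not from the disclosure.

SLOT_MS = 100  # length of one irradiation slot in milliseconds (assumed)

def slot_owner(t_ms: int) -> str:
    """Return which device may irradiate wavelength-A light at time t_ms.

    Even-numbered slots are given to the base station (guide light),
    odd-numbered slots to the flying object (ranging light).
    """
    return "base_station" if (t_ms // SLOT_MS) % 2 == 0 else "flying_object"

def may_irradiate(device: str, t_ms: int) -> bool:
    """True if `device` owns the current slot and may emit wavelength A."""
    return slot_owner(t_ms) == device

# The two devices never irradiate simultaneously:
for t in range(0, 1000, 10):
    assert not (may_irradiate("base_station", t) and may_irradiate("flying_object", t))
print(slot_owner(0), slot_owner(150))  # base_station flying_object
```

The communication-based variant in the text (one device notifying the other when it stops irradiating) would replace the shared clock with an explicit handover message, at the cost of a communication delay.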
  • Embodiments 1 to 3 have been described as examples of the technology disclosed in the present application.
  • However, the technology in the present disclosure is not limited to these embodiments, and can also be applied to embodiments in which changes, replacements, additions, omissions, and the like are made as appropriate.
  • The configuration of the omnidirectional distance measuring device in the above embodiments is not limited to that described above. Any other configuration capable of measuring the distance to an object can be applied to the omnidirectional distance measuring device; for example, a configuration that measures the distance to an object using a stereo camera can be applied.
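As a concrete sketch of the stereo-camera alternative mentioned above: for a rectified stereo pair, the distance to a matched point follows from its disparity as Z = f·B/d. The focal length, baseline, and disparity values below are hypothetical.

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point from a rectified stereo pair: Z = f * B / d.

    focal_px: focal length in pixels; baseline_m: distance between the two
    cameras in meters; disparity_px: horizontal shift of the matched point
    between the left and right images.
    """
    if disparity_px <= 0:
        raise ValueError("zero or negative disparity: point at infinity or bad match")
    return focal_px * baseline_m / disparity_px

# Assumed values: f = 800 px, B = 0.25 m, d = 20 px
print(stereo_depth(800.0, 0.25, 20.0))  # 10.0 (meters)
```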
  • In the above embodiments, the case where the flying object 10 and the base station 50 perform wireless communication has been described. However, the flying object 10 and the base station 50 may be connected by wire and communicate via wired communication.
  • In the above embodiments, the first camera 21 is mounted facing upward on the flying object 10 so as to photograph a subject in the space above the flying object 10; however, the orientation of the first camera 21 is not limited to this.
  • the direction of the first camera 21 may be set as appropriate according to the part to be inspected and the structure of the inspection object.
  • the omnidirectional distance measuring device 19 is also attached to the flying object 10 so as to photograph the hemispheric space above the flying object 10 and generate a distance image.
  • the direction of the omnidirectional distance measuring device 19 is not limited to this.
  • the orientation of the omnidirectional distance measuring device 19 may be set as appropriate according to the part or structure to be inspected of the inspection object. For example, when inspecting the upper surface of the inspection object, the omnidirectional distance measuring device 19 may be attached to the flying object 10 so as to photograph a hemispheric space below the flying object 10 and generate a distance image.
  • The controller 16 and the control unit 195 of the flying object 10 and the controller 56 of the base station can be configured with an electronic circuit such as a CPU, MPU, DSP, microcomputer, FPGA, or ASIC.
  • The control system of the present disclosure enables stable attitude control of a flying object even in a situation where satellite positioning signals such as GPS signals cannot be captured due to obstacles, and is therefore useful as a flying object control system.


Abstract

The control system (100) includes a flying body (10) and a base station (50). The base station is equipped with a light source unit which irradiates guide light of a predetermined wavelength. The flying body is equipped with: a propulsion unit which generates propulsive force for flying in the air; a distance measuring unit which measures the distance between the flying body and an object within a predetermined space around the flying body; an imaging unit which captures an image of the object in the predetermined space, and generates image data; and a first controller which controls the movement of the flying body. The first controller determines the relative positional relationship of the flying body with respect to the base station on the basis of the image of the guide light included in the image data generated by the imaging unit and the information from the distance measuring unit.

Description

Flying body and control system therefor
The present disclosure relates to a flying body that flies in the air and to a control system therefor.
For automatic or remote operation of an outdoor work robot, a scheme that captures positioning signals from satellites such as GPS satellites to acquire the robot's own position and to control its position and attitude is useful (see Patent Document 1). However, underneath large structures such as bridges, indoors, or in narrow spaces, the work robot cannot capture positioning signals from the satellites, and position and attitude control becomes difficult.
As a technique related to position and attitude control, Patent Document 2, for example, discloses an apparatus that performs stereo processing on images captured by a stereo camera to obtain distance distribution information (a distance image) of the area below, and calculates the attitude angle of the apparatus itself from the distance information of a large number of measurement points. Patent Document 3 discloses a method of tracking a flying object equipped with a light reflector using a surveying instrument.
Patent Document 1: JP 2006-001486 A
Patent Document 2: JP 2002-188917 A
Patent Document 3: JP 2015-145784 A
The present disclosure provides a control system for a flying object that enables stable attitude control of the flying object even in a situation where a satellite positioning signal such as a GPS signal cannot be captured due to an obstacle.
In a first aspect of the present disclosure, a control system for a flying object including the flying object and a base station is provided. The base station includes a light source unit that emits guide light of a predetermined wavelength. The flying object includes: a propulsion unit that generates propulsive force for flying in the air; a distance measuring unit that measures the distance from the flying object to an object existing in a predetermined space around the flying object; an imaging unit that images a subject existing in the predetermined space and generates image data; and a first controller that controls operation of the flying object. The first controller obtains the relative positional relationship of the flying object with respect to the base station based on the image of the guide light included in the image data generated by the imaging unit and on information from the distance measuring unit.
According to the control system of the present disclosure, the flying object can recognize its relative position with respect to the base station based on the guide light from the base station. Therefore, if the base station is placed at a position where the flying object can receive the guide light from the base station, stable attitude control of the flying object can be performed even in a situation where satellite positioning signals such as GPS signals cannot be captured due to obstacles.
A diagram explaining the configuration of the flying object control system in the first embodiment
A view showing the appearance of the flying object
A block diagram showing the functional configuration of the flying object
A diagram for explaining the orientation of the flying object in flight
A diagram explaining the angle of view of the omnidirectional distance measuring device
A block diagram showing the functional configuration of the omnidirectional distance measuring device of the flying object
A view showing the appearance of the base station
A block diagram showing the functional configuration of the base station
A flowchart showing the processing of the flying object control system in the first embodiment
A diagram showing an example of an HSV image captured by the omnidirectional distance measuring device
A diagram showing an example of a distance image generated by the omnidirectional distance measuring device
A diagram showing an example of the filter in the omnidirectional distance measuring device
A diagram showing the functional configuration of the flying object in the second embodiment
A diagram showing an example of the filter in the omnidirectional distance measuring device in the second embodiment
A block diagram showing the functional configuration of the base station in the second embodiment
A flowchart showing the processing of the flying object control system in the second embodiment
A diagram showing an example of the image B captured by the omnidirectional distance measuring device
A diagram showing an example of the distance image generated based on the image A
A diagram showing the functional configuration of the flying object in the third embodiment
A diagram showing the functional configuration of the base station in the third embodiment
A flowchart showing the processing of the flying object control system in the third embodiment
A diagram showing the irradiation timing of the ranging light and the guide light
Hereinafter, embodiments will be described in detail with reference to the drawings as appropriate. However, descriptions in more detail than necessary may be omitted; for example, detailed descriptions of well-known matters and redundant descriptions of substantially identical configurations may be omitted. This is to avoid unnecessary redundancy in the following description and to facilitate understanding by those skilled in the art.
Note that the inventors provide the accompanying drawings and the following description so that those skilled in the art can fully understand the present disclosure, and do not intend thereby to limit the subject matter described in the claims.
(Embodiment 1)
The first embodiment will be described below with reference to the accompanying drawings.
[1. Configuration]
FIG. 1 is a diagram illustrating the configuration of the flying object control system according to the present disclosure. The flying object control system 100 includes an automatically piloted unmanned flying object (a so-called "drone") 10 and a base station 50 that gives instructions to the flying object. In the example illustrated in FIG. 1, the flying object 10 photographs the state of the bridge floor slab 73 and the bridge girder 74 with a camera while moving under the bridge. The base station 50 is attached and fixed to a part of the bridge. Below the bridge, GPS signals from GPS satellites are blocked by the floor slab 73 and the bridge girder 74 and do not reach the flying object 10. Therefore, the flying object 10 cannot receive GPS signals, and position control becomes difficult.
The following discloses a flying object control system that enables the flying object to grasp its own position even in such a situation where positioning signals from GPS satellites cannot be received. The flying object 10 of the present embodiment grasps its relative position with respect to the base station 50 based on guide light emitted from the base station 50 (light indicating the position of the base station 50), and performs position and attitude control based on that relative position (details will be described later).
[1.1 Configuration of the flying object]
FIG. 2 is a view showing the appearance of the flying object 10. FIG. 3A is a block diagram showing the functional configuration of the flying object 10. The flying object 10 includes a main body 11 and propulsion devices 15 that generate the propulsive force of the flying object 10. Each propulsion device 15 is attached to the tip of a support portion 13 extending from one of the four corners of the main body 11. Furthermore, a first camera 21 is attached to the side surface of the main body 11. An inertial measurement device 17, a GPS positioning device 18, an omnidirectional distance measuring device 19, and a second camera 21b are attached to the upper side of the main body 11. A communication unit 23 and a battery 25 are attached to the lower side of the main body 11. A controller 16 is housed inside the main body 11.
The propulsion device 15 includes a propeller and a motor that rotates the propeller. In the example of FIG. 2, the flying object 10 has four propulsion devices 15, but the number of propulsion devices is not limited to four and may be, for example, five or more. By appropriately controlling the rotational speed of each propulsion device 15, the moving direction and flight state of the flying object 10 can be controlled.
The first camera 21 photographs a subject (inspection target) and generates high-definition (for example, 4K) image data for inspection. The first camera 21 includes an optical system and an image sensor such as a CCD or CMOS image sensor, and photographs the subject to generate image data. The first camera 21 captures visible light to generate image data; an image generated by capturing visible light is hereinafter referred to as an "RGB image". In the present embodiment, the first camera 21 is attached to the side surface of the flying object 10 facing upward. During flight, the flying object 10 can thus capture an image with a predetermined angle of view vertically above itself. That is, by flying the flying object 10 under the structure to be inspected, the state of the underside of the structure can be photographed.
The inertial measurement device 17 includes an acceleration sensor and a gyro sensor, and measures the acceleration and angular velocity of the flying object 10. The behavior and attitude of the flying object 10 are controlled based on the output from the inertial measurement device 17.
The GPS positioning device 18 receives signals from GPS (Global Positioning System) satellites and measures the current position of the flying object 10. The battery 25 is a power source that supplies a power supply voltage to each element of the flying object 10.
The omnidirectional distance measuring device 19 (hereinafter simply referred to as the "distance measuring device") photographs the space around itself at a wide angle and generates a distance image indicating the distance to objects existing in that space. Specifically, as shown in FIG. 4A, it photographs a substantially hemispherical space (hereinafter referred to as the "measurement target space") surrounding the portion of the distance measuring device 19 through which light enters and exits. The range that the distance measuring device 19 can photograph covers all directions (360°) horizontally and a range of 180° or more vertically. The distance measuring device 19 can be configured with, for example, a TOF (Time of Flight) sensor capable of wide-angle imaging.
FIG. 4B is a diagram showing the functional configuration of the distance measuring device 19. The distance measuring device 19 includes a light receiving lens 192, a filter 193, a light receiving unit 194, a control unit 195 (an example of a second controller), a light source unit 197, and an irradiation lens 199. Under the control of the control unit 195, the light source unit 197 emits, at predetermined timings, light for measuring the distance to objects in the measurement target space (hereinafter referred to as "ranging light"). The ranging light is infrared light having a predetermined wavelength (wavelength A). The light source unit 197 includes a plurality of (for example, eight or more) laser diodes that output laser light of wavelength A, and a diffusion plate. The laser light from the laser diodes is diffused by the diffusion plate into uniform light, which is emitted in all directions through the irradiation lens 199 as the ranging light. The light receiving lens 192 is a wide-angle lens that enables imaging of the wide (hemispherical) range shown in FIG. 4A. The irradiation lens 199 is a lens for diffusing the ranging light over a wide angle so that objects in the wide range shown in FIG. 4A are irradiated.
The filter 193 is an optical filter configured to transmit, of the light received through the light receiving lens 192, wavelength components within a predetermined band centered on the wavelength of the ranging light (wavelength A). With this filtering, the light receiving unit 194 receives only the reflected ranging light (wavelength A).
The light receiving unit 194 includes an image sensor such as a CMOS image sensor, and generates image data based on the received light. By receiving light through the filter 193, the light receiving unit 194 generates image data of an image formed by the reflected ranging light (wavelength A) (hereinafter referred to as an "IR image").
In the distance measuring device 19 configured as described above, the light source unit 197 emits the ranging light, which is reflected by objects existing in the measurement target space. The reflected light is received by the light receiving unit 194 through the filter 193, and an IR image is generated. The control unit 195 generates a distance image using the information of the IR image generated by the light receiving unit 194.
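As an illustration of how a TOF sensor converts received light into per-pixel distances, a pulse-based conversion can be sketched as follows. This is a generic sketch: the disclosure does not specify the modulation scheme or the internal processing of the control unit 195.

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance_m(round_trip_s: float) -> float:
    """One-way distance to the reflecting object.

    The ranging light travels from the light source to the object and back
    to the light receiving unit, so the distance is c * t / 2.
    """
    return C * round_trip_s / 2.0

# A 40 ns round trip corresponds to roughly 6 m:
print(round(tof_distance_m(40e-9), 3))  # 5.996
```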
The second camera 21b includes an optical system and an image sensor such as a CCD or CMOS image sensor. The second camera 21b captures visible light and generates RGB image data. This RGB image data is image data used for distance measurement (hereinafter referred to as "ranging image data"). The ranging image data generated by the second camera 21b has a lower resolution than the image data generated by the first camera 21. The second camera 21b has the same angle of view as the distance measuring device 19, photographs the same measurement target space as the distance measuring device 19, and generates the ranging image data. That is, in the present embodiment, ranging image data (RGB image data) from the second camera 21b and distance image data from the distance measuring device 19 are generated for the same measurement target space.
[1.2 Configuration of the base station]
FIG. 5 is a diagram illustrating the appearance of the base station 50. FIG. 6 is a block diagram showing the functional configuration of the base station 50. As shown in FIGS. 5 and 6, the base station 50 includes a light source unit 51 that emits guide light, a communication unit 53 that communicates with the flying object 10, a battery 55 that supplies power, a controller 56 that controls the operation of the light source unit 51, and a GPS positioning device 58 that measures the current position of the base station 50.
Furthermore, as shown in FIG. 5, the base station 50 includes a reflecting plate 52 around the light emitting portion of the light source unit 51. The base station 50 also includes a support member 59 for fixing it to an inspection object such as a bridge. The reflecting plate 52 reflects the ranging light from the flying object 10.
The communication unit 53 includes a communication module for performing wireless communication with the flying object 10. The controller 56 is configured with a programmable microcomputer or the like. The GPS positioning device 58 receives signals from GPS satellites and measures the current position (absolute position) of the base station 50. The GPS positioning device 58 includes an antenna 58a for receiving the signals from the GPS satellites. The antenna 58a is installed at a position protruding from the bridge 71 so that the signals from the GPS satellites are not blocked by the floor slab 73 or the bridge girder 74.
The light source unit 51 includes an LED (light emitting diode) that emits the guide light, a lens for spreading the light emitted from the LED over a wide angle, and a drive circuit that drives the LED. The color (wavelength) of the guide light emitted from the light source unit 51 is set to a specific color (wavelength) that can be photographed by the first camera 21.
[2. Operation]
The operation of the flying object control system 100 configured as described above will now be described. The flying object 10 flies in a posture with the side on which the distance measuring device 19 is attached facing upward. The flying object 10 moves while always facing a fixed direction with respect to the inspection object. That is, as shown in FIG. 3B, when the flying object 10 moves in the direction of the arrow, it always moves while facing the same direction.
The flying object 10 flies below the inspection object (for example, the bridge floor slab 73 or the bridge girder 74) in accordance with instructions from the base station 50 or pre-programmed instructions, and photographs images of the surface of the inspection object with the first camera 21. The flying object 10 grasps its relative positional relationship (position and attitude) with respect to the base station 50 based on the guide light from the base station 50. The base station 50 measures its own absolute position with the GPS positioning device 58 and notifies the flying object 10 of that absolute position. Based on the absolute position of the base station 50 and the relative positional relationship with respect to the base station 50, the flying object 10 recognizes its own absolute position and determines its flight path.
FIG. 7 is a flowchart showing processing in the flying object control system 100 according to the first embodiment. The operation of recognizing the relative positional relationship between the flying object 10 and the base station 50 in the flying object control system 100 will be described with reference to the flowchart of FIG. 7.
The base station 50 emits guide light of the desired color from the light source unit 51 (S11). The second camera 21b of the flying object 10 photographs the measurement target space at a wide angle and generates ranging image data (S12). The ranging image data generated by the second camera 21b is image data represented by RGB signals. The controller 16 of the flying object 10 converts the ranging image represented in RGB into an HSV image represented by hue, saturation, and value (S13). The controller 16 extracts light sources whose color is close to that of the guide light from the HSV image, and detects the position of the guide light (that is, the light source 51 or the reflecting plate 52) in the HSV image (S14).
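Steps S12 to S14 (RGB capture, HSV conversion, detection of the guide-light color) can be sketched as below. The guide-light hue, the thresholds, and the tiny test image are hypothetical values chosen for illustration; they are not specified in the disclosure.

```python
import colorsys

GUIDE_HUE = 0.33  # assumed green-ish guide light; hue is in [0, 1)

def detect_guide_light(rgb_image, hue_tol=0.05, min_val=0.5, min_sat=0.3):
    """Return (row, col) of the first pixel matching the guide-light color.

    rgb_image: nested lists of (r, g, b) tuples with components in [0, 1].
    A pixel matches if its hue is within hue_tol of GUIDE_HUE and it is
    bright and saturated enough to be a light source. Hue wrap-around at 0
    is ignored for simplicity.
    """
    for row_idx, row in enumerate(rgb_image):
        for col_idx, (r, g, b) in enumerate(row):
            h, s, v = colorsys.rgb_to_hsv(r, g, b)
            if abs(h - GUIDE_HUE) <= hue_tol and v >= min_val and s >= min_sat:
                return (row_idx, col_idx)
    return None

image = [
    [(0.1, 0.1, 0.1), (0.1, 0.1, 0.1), (0.1, 0.1, 0.1)],
    [(0.1, 0.1, 0.1), (0.1, 0.9, 0.2), (0.1, 0.1, 0.1)],  # bright green pixel
    [(0.1, 0.1, 0.1), (0.1, 0.1, 0.1), (0.1, 0.1, 0.1)],
]
print(detect_guide_light(image))  # (1, 1)
```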
FIG. 8A is a diagram showing an example of an HSV image generated by color conversion from the ranging RGB image generated by the second camera 21b. In the figure, the center P00 indicates the position of the flying object 10, and the star P01 indicates the position on the image that emits a color close to that of the guide light. The controller 16 detects the position of the guide light (that is, the light source 51 or the reflecting plate 52) by extracting pixels containing a color close to that of the guide light from the HSV image 60.
 Then, the controller 16 detects the azimuth of the base station 50 from the position of the guide light detected in the HSV image (S15). For example, the azimuth D of the base station 50 is obtained from the guide-light position P01 identified in the HSV image 60 shown in FIG. 8A. Since, as described above, the flying object 10 moves while keeping a fixed orientation with respect to the inspection object (for example, under attitude control that always keeps one longitudinal direction of the bridge as the forward direction of the flying object), the azimuth indicated in the HSV image does not change greatly with the flight status of the flying object 10.
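Step S15 amounts to computing the bearing of the detected guide-light pixel P01 relative to the image center P00. A minimal 2-D sketch; the direct mapping of image bearing to azimuth (plausible only because the aircraft holds a fixed heading, as noted above), the function name, and the coordinate convention are illustrative assumptions.

```python
import math

def bearing_from_image(px, py, cx, cy):
    """Bearing (radians, measured counterclockwise from the image +x axis)
    of a detected light at (px, py) as seen from the image center (cx, cy)."""
    return math.atan2(py - cy, px - cx)
```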
 The light source unit 197 of the distance measuring device 19 of the flying object 10 emits laser light of wavelength A (ranging light) at a wide angle through the irradiation lens 199 (S16). The distance measuring device 19 then captures the measurement target space and generates a distance image (S17). Specifically, the light receiving unit 194 of the distance measuring device 19 receives the reflected ranging light through the filter 193, thereby capturing the measurement target space and generating image data. The control unit 195 generates image data representing a distance image based on the image data generated by the light receiving unit 194.
 The controller 16 of the flying object 10 measures the distance from the flying object 10 to the guide light of the base station 50 (that is, to the light source unit 51 or the reflection plate 52) based on the previously determined azimuth of the base station 50 and the position of the guide light (S18). Specifically, the controller 16 finds the position in the distance image corresponding to the guide-light position (light source unit 51, etc.) previously obtained from the HSV image, and obtains the distance at that position. FIG. 8B shows an example of a distance image corresponding to the HSV image 60 shown in FIG. 8A. In the example of FIGS. 8A and 8B, the pixel position P11 in the distance image 62 corresponding to the position P01 obtained from the HSV image 60 is found, and the distance for the pixel at position P11 is obtained.
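Step S18 then reduces to reading the range value of the distance image at the pixel corresponding to P01. The sketch below adds a small median window as a plausible noise guard; the windowing and function name are assumptions, not something the patent specifies.

```python
def distance_at(distance_image, pos, win=1):
    """Range value at the distance-image pixel corresponding to the
    guide-light position, taken as the median of a small window around
    it to suppress single-pixel noise."""
    x, y = pos
    vals = []
    for dy in range(-win, win + 1):
        for dx in range(-win, win + 1):
            yy, xx = y + dy, x + dx
            if 0 <= yy < len(distance_image) and 0 <= xx < len(distance_image[0]):
                vals.append(distance_image[yy][xx])
    vals.sort()
    return vals[len(vals) // 2]
```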
 The controller 16 of the flying object 10 obtains the relative positional relationship (position and orientation) with respect to the base station 50 based on the azimuth of the base station 50 and the distance to the base station 50 obtained as described above (S19).
 The flying object 10 acquires the absolute position of the base station 50 from the base station 50. Since the absolute position of the base station 50 is thus known, the flying object 10 can determine its own absolute position by recognizing its relative positional relationship with the base station 50. Therefore, even in a situation where GPS signals or the like cannot be received because of obstacles, the controller 16 of the flying object 10 can determine the absolute position of its own device from the relative positional relationship with the base station 50 and control the flight path.
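Combining the measured azimuth and distance with the base station's known absolute position gives the aircraft's absolute position. A 2-D sketch under assumed coordinate conventions (the patent does not fix a coordinate system, so the function and its frame are hypothetical):

```python
import math

def aircraft_absolute_position(base_abs, bearing_to_base, distance):
    """Aircraft's absolute (x, y): the base station lies along the bearing
    vector at the measured distance, so the aircraft sits at the base
    position minus that vector."""
    bx, by = base_abs
    return (bx - distance * math.cos(bearing_to_base),
            by - distance * math.sin(bearing_to_base))
```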
 In the above description, the second camera 21b generates the image data of the RGB image for distance measurement, and the distance measuring device 19 generates the image data of the distance image. However, a single distance measuring device 19 may generate the image data of the RGB image for distance measurement in addition to the distance image data. In this case, the filter 193 of the distance measuring device 19 is configured to have the characteristics shown in FIG. 9. FIG. 9 shows the transmission characteristics for four pixels for convenience of explanation; the pattern shown in FIG. 9 is applied repeatedly over the entire pixel region of the light receiving unit 194. The filter 193 separates the R, G, and B colors and infrared light (IR) pixel by pixel. By receiving light through a filter 193 having the pattern shown in FIG. 9, the light receiving unit 194 separately generates image data of a distance-measurement image (an RGB image) based on the R, G, and B light and image data of an image based on the ranging light (an IR image). The control unit 195 generates the image data of the distance image based on the image data of the IR image, as described above.
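The per-pixel R/G/B/IR separation through the filter 193 of FIG. 9 can be illustrated as follows. The 2x2 cell layout used here (R, G, B, IR) is an assumption made for illustration; the actual arrangement in FIG. 9 may differ.

```python
def split_rgb_ir(mosaic):
    """Split a single sensor frame captured through a repeating 2x2
    R/G/B/IR filter cell into four quarter-resolution planes.
    Assumed cell layout: R at (0,0), G at (0,1), B at (1,0), IR at (1,1)."""
    names = {(0, 0): "R", (0, 1): "G", (1, 0): "B", (1, 1): "IR"}
    planes = {n: [] for n in names.values()}
    for y in range(0, len(mosaic) - 1, 2):
        rows = {n: [] for n in names.values()}
        for x in range(0, len(mosaic[0]) - 1, 2):
            for (dy, dx), n in names.items():
                rows[n].append(mosaic[y + dy][x + dx])
        for n in planes:
            planes[n].append(rows[n])
    return planes
```

The IR plane would then feed the distance-image computation and the R/G/B planes the guide-light color detection, as in the single-device variant described above.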
 In this case, the controller 16 of the flying object 10 acquires the image data of the RGB image for distance measurement and the distance image data from the distance measuring device 19, and uses these image data to recognize the relative positional relationship with the base station 50 in the same manner as described above.
 The configuration of the distance measuring device 19 is not limited to the one described above; any other configuration may be used as long as it can capture a distance image in the infrared wavelength region and an image in the visible wavelength region. For example, a distance measuring device having the configuration disclosed in Japanese Patent Application Laid-Open No. 2008-70374 can also be applied as the distance measuring device used in the flying object control system 100.
 Note that the controller 56 of the base station 50 may modulate the guide light emitted from the light source unit 51 with a modulation signal before transmission. The modulation signal may include, for example, identification information for identifying the base station. The flying object 10 demodulates the signal extracted from the RGB images in which the guide light was captured, takes out the identification information contained in the modulation signal, and can confirm from the extracted identification information that the guide light was emitted from the base station 50. This makes it possible to distinguish the guide light from disturbance light and to prevent the position of the base station 50 from being misidentified because of disturbance light.
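The modulation scheme is left open in the text; as one hedged illustration, assume simple on-off keying of the guide light at one bit per camera frame. The function names, the thresholding, and the matching rule below are all hypothetical.

```python
def demodulate_ook(brightness, threshold):
    """Recover bits from per-frame brightness samples of the guide-light
    pixel, assuming on-off keying at one bit per frame."""
    return [1 if b >= threshold else 0 for b in brightness]

def is_own_base_station(bits, station_id_bits):
    """Accept the detected light as the base station's guide light only if
    the station's ID pattern appears in the demodulated bit stream."""
    n = len(station_id_bits)
    return any(bits[i:i + n] == station_id_bits for i in range(len(bits) - n + 1))
```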
 [3. Effects, etc.]
 As described above, the control system 100 of the present embodiment is a control system including the flying object 10 and the base station 50. The base station 50 includes a light source unit 51 that emits guide light of a predetermined wavelength (a specific color). The flying object 10 includes: a propulsion device 15 that generates propulsive force for flying in the air; an omnidirectional distance measuring device 19 (an example of a distance measuring unit) that measures the distance from the flying object 10 to objects existing in a predetermined space around the flying object; a second camera 21b (an example of an imaging unit) that captures an image of a subject existing in the predetermined space and generates image data; and a controller 16 (an example of a first controller) that controls the operation of the flying object. The controller 16 obtains the relative positional relationship of the flying object with respect to the base station 50 based on the guide-light image included in the image data generated by the second camera 21b and on the information from the omnidirectional distance measuring device 19.
 Since the flying object 10 can determine its relative positional relationship from the guide light emitted by the base station 50, it can determine its own position even when GPS signals or the like cannot be received because of obstacles. In addition, using light rather than sound waves or radio waves as the communication medium eliminates the influence of multipath.
 (Embodiment 2)
 In the first embodiment, the base station 50 includes an LED, and the light emitted from the LED is used as the guide light. In contrast, this embodiment describes a configuration example in which the base station emits laser light as the guide light.
 FIG. 10A is a diagram showing the functional configuration of the flying object 10b in the second embodiment. The flying object 10b of the second embodiment differs from the flying object 10 of the first embodiment in that it does not include the second camera 21b. It also differs from the configuration of the first embodiment in that the omnidirectional distance measuring device 19b receives infrared light of two wavelengths (wavelength A and wavelength B) and generates a captured image for each wavelength. For this purpose, the filter 193 of the omnidirectional distance measuring device 19b of the second embodiment has the transmission characteristics shown in FIG. 10B. FIG. 10B shows the transmission characteristics for four pixels for convenience of explanation; the pattern shown in FIG. 10B is applied repeatedly over the entire pixel region of the light receiving unit 194. FIG. 10B shows filter characteristics that transmit wavelength A or wavelength B on a per-pixel basis. By using a filter 193 as shown in FIG. 10B, the light receiving unit 194 of the omnidirectional distance measuring device 19b can receive light of wavelength A or light of wavelength B at each pixel. As a result, the light receiving unit 194 of the omnidirectional distance measuring device 19b can generate image data of an image based on the light received at wavelength A and image data of an image based on the light received at wavelength B. The control unit 195 generates the distance image data based on the signals from the pixels that receive wavelength A.
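The per-pixel wavelength-A/wavelength-B separation through the filter 193 of FIG. 10B can be sketched as follows, assuming a checkerboard arrangement (the actual pattern in FIG. 10B may differ, and the function is an illustration, not the patent's implementation):

```python
def split_wavelengths(mosaic):
    """Split a frame captured through a per-pixel wavelength-A/B filter
    into two sparse planes, keeping None where a pixel saw the other band.
    Assumed arrangement: A where (x + y) is even, B where it is odd."""
    h, w = len(mosaic), len(mosaic[0])
    plane_a = [[None] * w for _ in range(h)]
    plane_b = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if (x + y) % 2 == 0:
                plane_a[y][x] = mosaic[y][x]
            else:
                plane_b[y][x] = mosaic[y][x]
    return plane_a, plane_b
```

Plane A would then feed the distance-image generation and plane B the guide-light detection, matching the per-wavelength processing described above.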
 FIG. 11 is a diagram showing the configuration of the base station 50b in the second embodiment. Whereas in the first embodiment the light source unit 51 included an LED, in the second embodiment the light source unit 51b includes a laser diode that emits laser light of wavelength B, different from wavelength A of the ranging light. That is, in the second embodiment, the ranging light and the guide light are both laser light, but their wavelengths (wavelength A and wavelength B) are different.
 FIG. 12 is a flowchart showing the processing of the flying object control system 100 in the second embodiment. The operation of the control system 100 in the second embodiment will be described with reference to the flowchart of FIG. 12.
 The base station 50b emits guide light of wavelength B from the light source unit 51b (S21). The distance measuring device 19b of the flying object 10 emits laser light of wavelength A (ranging light) from the light source unit 197 at a wide angle (S22).
 The distance measuring device 19b captures a wide-angle image of the measurement target space and generates image data of an image based on the light received at wavelength A (hereinafter, "image A") and image data of an image based on the light received at wavelength B (hereinafter, "image B") (S23).
 The controller 16 detects the position of the guide light (that is, of the light source unit 51 or the reflection plate 52) from image B (S24). FIG. 13A is a diagram showing an example of image B. In the figure, the center P00 indicates the position of the flying object 10, and the star mark P02 indicates the position of the guide light. The controller 16 can detect the position of the guide light (that is, of the light source unit 51 or the reflection plate 52) based on the luminance of each pixel of image B (60b).
 Then, the controller 16 detects the azimuth of the base station 50b from the position of the guide light detected in image B (S25). For example, the azimuth D1 of the base station 50b is obtained from the guide-light position P02 detected in image B (60b) shown in FIG. 13A.
 The control unit 195 of the distance measuring device 19b generates image data representing a distance image based on the image data of image A (S26).
 The controller 16 of the flying object 10 measures the distance to the base station 50b (that is, to the guide light: the light source unit 51 or the reflection plate 52) based on the azimuth of the base station 50b obtained in step S25 and the position of the guide light obtained in step S24 (S27). FIG. 13B is a diagram showing an example of a distance image obtained from image A. In the example of FIGS. 13A and 13B, the pixel position P22 in the distance image 62b corresponding to the position P02 obtained from image B (60b) is found, and the distance for the pixel at position P22 is obtained.
 Then, the controller 16 obtains the relative positional relationship (position and orientation) with respect to the base station 50b based on the azimuth of the base station 50b and the distance to the base station 50b obtained as described above (S28).
 As described above, even when laser light is used both for the guide light from the base station 50b and for the ranging light from the flying object 10, making their wavelengths different from each other allows the desired control to be performed based on each light. The flying object 10 can thereby obtain its relative positional relationship with respect to the base station 50b based on the guide light.
 (Embodiment 3)
 In the second embodiment, the wavelength of the guide light from the base station and the wavelength of the ranging light from the distance measuring device were different. In contrast, this embodiment describes a configuration in which the wavelength of the guide light from the base station and the wavelength of the ranging light from the distance measuring device are the same.
 FIG. 14A is a diagram showing the functional configuration of the flying object 10c in the third embodiment. The flying object 10c of the third embodiment differs from the flying object 10b of the second embodiment in the filter of the omnidirectional distance measuring device. That is, in the third embodiment, as in the first embodiment, the filter 193 has the characteristic of transmitting light of wavelength A.
 FIG. 14B is a diagram showing the configuration of the base station 50c in the third embodiment. In the second embodiment, the light source unit 51b output guide light (laser light) of a wavelength different from that of the ranging light. In contrast, the light source unit 51c of this embodiment outputs guide light (laser light) of the same wavelength as the ranging light (that is, wavelength A).
 FIG. 15 is a flowchart showing the processing of the flying object control system 100 in the third embodiment. The operation of the control system 100 in the third embodiment will be described with reference to the flowchart of FIG. 15.
 The base station 50c emits guide light of wavelength A from the light source unit 51c (S31). While the guide light is being emitted from the light source unit 51c, the flying object 10 stops emitting the ranging light, as shown in FIG. 16.
 The distance measuring device 19c captures a wide-angle image of the measurement target space and generates image data of an image (image A) based on the received light of wavelength A (S32).
 The controller 16 detects the position of the guide light (that is, of the light source unit 51) from image A (S33). The controller 16 then detects the azimuth of the base station 50c from the position of the guide light detected in image A (S34).
 The distance measuring device 19c of the flying object 10 emits laser light of wavelength A (ranging light) at a wide angle (S35). While the laser light (ranging light) is being emitted from the distance measuring device 19c, the base station 50c stops emitting the guide light, as shown in FIG. 16. The distance measuring device 19c then captures a wide-angle image of the measurement target space, generates image data of image A based on the received light of wavelength A, and generates image data representing a distance image based on the image data of image A (S36). In this manner, the emission timing of the ranging light from the flying object 10 is made different from the emission timing of the guide light from the base station 50c (see FIG. 16). Thus, even when the same wavelength is used for the ranging light and the guide light, the desired control based on each light can be performed without mutual interference.
 The controller 16 of the flying object 10 measures the distance to the base station 50c (that is, to the guide light: the light source unit 51 or the reflection plate 52) based on the azimuth of the base station 50c obtained in step S34 and the position of the guide light obtained in step S33 (S37).
 Then, the controller 16 obtains the relative positional relationship (position and orientation) with respect to the base station 50c based on the azimuth of the base station 50c and the distance to the base station 50c obtained as described above (S38).
 In order to prevent the guide-light emission timing from the base station 50c in step S31 and the ranging-light emission timing from the flying object 10 in step S35 from interfering with each other, the flying object 10 and the base station 50c must accurately synchronize the times they manage in advance. In this case, the flying object 10 and the base station 50c each emit light of wavelength A (the ranging light or the guide light) exclusively at predetermined timings (see FIG. 16) so that their emission timings do not overlap. Alternatively, the base station 50c and the flying object 10 may communicate with each other to exclusively control their emission timings. In this case, when one of the two devices stops emitting light of wavelength A (the ranging light or the guide light), it notifies the other device to that effect. On receiving the notification of the stop of emission, the other device emits light of wavelength A for a fixed period. By such methods, the ranging light from the flying object 10 and the guide light from the base station 50c can be emitted exclusively.
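The pre-synchronized variant described above can be sketched as a fixed time-slot schedule that both devices evaluate against their shared clock. The slot length and the even/odd assignment are illustrative assumptions, not values from the patent.

```python
def active_emitter(t, slot=0.1):
    """Which device may emit wavelength-A light at time t (seconds),
    given pre-synchronized clocks and alternating fixed slots:
    even slots -> base station guide light, odd slots -> aircraft ranging light.
    The slot length of 0.1 s is an illustrative choice."""
    return "guide" if int(t / slot) % 2 == 0 else "ranging"
```

Each device emits only during its own slots, which guarantees the mutually exclusive emission shown in FIG. 16 without any runtime negotiation.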
 As described above, in this embodiment, by exclusively controlling the timing at which the guide light is emitted and the timing at which the ranging light is emitted, the guide light can be distinguished and detected even though the guide light and the ranging light have the same wavelength. Therefore, even when the wavelengths of the guide light and the ranging light are the same, the flying object 10 can obtain its relative positional relationship with respect to the base station 50c based on the guide light.
 (Other Embodiments)
 As described above, Embodiments 1 to 3 have been described as examples of the technology disclosed in the present application. However, the technology of the present disclosure is not limited to these embodiments and is also applicable to embodiments in which changes, replacements, additions, omissions, and the like are made as appropriate. It is also possible to combine the components described in Embodiments 1 to 3 to form a new embodiment. Other embodiments are therefore illustrated below.
 The configuration of the omnidirectional distance measuring device in the above embodiments is not limited to the one described above. Any other configuration can be applied to the omnidirectional distance measuring device as long as it can measure the distance to an object. For example, a configuration that measures the distance to an object using a stereo camera can be applied.
 In the above embodiments, an example in which the flying object 10 and the base station 50 communicate wirelessly has been described. However, the flying object 10 and the base station 50 may instead be connected by wire and communicate via wired communication.
 In the above embodiments, the first camera 21 is attached to the flying object 10 facing upward so as to photograph subjects in the space above the flying object 10, but the orientation of the first camera 21 is not limited to this. The orientation of the first camera 21 may be set as appropriate according to the part to be inspected, the structure of the inspection object, and the like.
 In the above embodiments, the omnidirectional distance measuring device 19 is likewise attached to the flying object 10 so as to capture the hemispherical space above the flying object 10 and generate a distance image. However, the orientation of the omnidirectional distance measuring device 19 is not limited to this; it may be set as appropriate according to the part to be inspected, the structure of the inspection object, and the like. For example, when inspecting the upper surface of an inspection object, the omnidirectional distance measuring device 19 may be attached to the flying object 10 so as to capture the hemispherical space below the flying object 10 and generate a distance image.
 In the above embodiments, the controller 16 and the control unit 195 of the flying object 10 and the controller 56 of the base station can each be constituted by an electronic circuit such as a CPU, MPU, DSP, microcomputer, FPGA, or ASIC.
 As described above, the embodiments have been described as examples of the technology of the present disclosure. The accompanying drawings and the detailed description have been provided for this purpose.
 Accordingly, the components described in the accompanying drawings and the detailed description may include not only components essential for solving the problem but also components that are not essential for solving the problem, included in order to illustrate the above technology. Therefore, the mere fact that such non-essential components are described in the accompanying drawings and the detailed description should not be taken as an immediate determination that they are essential.
 Since the above embodiments are intended to illustrate the technology of the present disclosure, various changes, replacements, additions, omissions, and the like can be made within the scope of the claims or their equivalents.
 The control system of the present disclosure enables stable attitude control of a flying object even in situations where satellite positioning signals such as GPS signals cannot be captured because of obstacles, and is therefore useful as a control system for flying objects.

Claims (9)

  1.  A control system including a flying object and a base station, wherein
     the base station includes a light source unit that emits guide light of a predetermined wavelength, and
     the flying object includes:
      a propulsion unit that generates propulsive force for flying in the air;
      a distance measuring unit that measures a distance from the flying object to an object existing in a predetermined space around the flying object;
      an imaging unit that captures an image of a subject existing in the predetermined space and generates image data; and
      a first controller that controls operation of the flying object,
     the first controller obtaining a relative positional relationship of the flying object with respect to the base station based on an image of the guide light included in the image data generated by the imaging unit and on information from the distance measuring unit.
  2.  The control system according to claim 1, wherein
     the distance measuring unit includes:
      a light emitting unit that emits distance measuring light;
      a light receiving unit that receives the distance measuring light reflected by the object; and
      a second controller that generates, based on the distance measuring light received by the light receiving unit, distance image data indicating the distance from the flying object to the object present in the predetermined space,
     and the first controller:
      detects, from the image data generated by the imaging unit, an azimuth of the base station with respect to the flying object;
      identifies a position of the base station within the image data based on the image of the guide light, and determines a distance to the base station from the distance image data using the identified position; and
      determines the relative positional relationship between the flying object and the base station from the azimuth of the base station and the distance to the base station.
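The position calculation recited in claim 2 — azimuth from the camera image, distance from the distance image — can be sketched as follows. This is a minimal illustration only: the pinhole-model pixel-to-angle conversion, the fields of view, and the camera-frame axis convention are assumptions made for the example and are not specified by the claim.

```python
import math

def relative_position(guide_px, image_width, image_height,
                      h_fov_deg, v_fov_deg, depth_image):
    """Estimate the base-station position relative to the flying object.

    guide_px:    (u, v) pixel of the detected guide-light blob.
    depth_image: 2-D list of distances (metres) aligned with the camera image,
                 standing in for the claim's "distance image data".
    Returns (x, y, z) in a camera frame with x right, y down, z forward.
    """
    u, v = guide_px
    # Convert pixel offsets from the optical centre into angles
    # (simple linear pinhole approximation -- an assumption of this sketch).
    az = math.radians((u - image_width / 2) / image_width * h_fov_deg)
    el = math.radians((v - image_height / 2) / image_height * v_fov_deg)
    # Distance to the base station read from the distance image at the
    # pixel identified via the guide-light image, as in claim 2.
    r = depth_image[v][u]
    x = r * math.sin(az) * math.cos(el)
    y = r * math.sin(el)
    z = r * math.cos(az) * math.cos(el)
    return (x, y, z)
```

A guide light imaged at the optical centre with a 5 m depth reading yields a position 5 m straight ahead; off-centre pixels rotate the vector by the corresponding azimuth and elevation.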
  3.  The control system according to claim 2, wherein a wavelength of the guide light from the base station differs from a wavelength of the distance measuring light emitted from the distance measuring unit of the flying object.
  4.  The control system according to claim 2, wherein a wavelength of the guide light from the base station is the same as a wavelength of the distance measuring light emitted from the distance measuring unit of the flying object, and the guide light and the distance measuring light are emitted at mutually exclusive timings.
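The mutually exclusive timing of claim 4 can be illustrated with a trivial frame-alternating schedule. The even/odd framing and one-slot-per-frame granularity are hypothetical choices for the sketch; the claim only requires that same-wavelength guide and ranging pulses never overlap.

```python
def light_schedule(frame_index):
    """Return which emitter may fire during a given camera frame.

    Even frames are reserved for the base station's guide light and odd
    frames for the vehicle's distance measuring light, so the two
    same-wavelength sources are never active at the same time.
    """
    return "guide" if frame_index % 2 == 0 else "ranging"
```

Because exactly one source is active per slot, the receiver can attribute every detected pulse unambiguously even though both lights share a wavelength.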
  5.  The control system according to any one of claims 1 to 4, wherein the predetermined space is a space including a hemispherical range above the flying object or a space including a hemispherical range below the flying object.
  6.  The control system according to any one of claims 1 to 4, wherein the imaging unit and the distance measuring unit are formed as a single element.
  7.  The control system according to any one of claims 1 to 4, further comprising a camera that captures an image of a subject and generates image data of higher quality than the image data generated by the imaging unit.
  8.  The control system according to any one of claims 1 to 4, wherein the light source unit of the base station transmits guide light modulated with a predetermined modulation signal, and the first controller demodulates a signal obtained from the image of the guide light to recover the modulation signal.
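The demodulation in claim 8 can be sketched with simple on-off keying recovered from per-frame brightness of the guide-light region. The one-bit-per-frame framing, the brightness threshold, and the OOK scheme itself are assumptions of this sketch; the claim does not fix a particular modulation.

```python
def demodulate_guide_light(brightness, threshold=128):
    """Recover an on-off-keyed bit sequence from guide-light brightness.

    brightness: mean pixel value of the guide-light region in each camera
    frame, one reading per transmitted bit (hypothetical framing).
    Returns the demodulated bit list.
    """
    return [1 if b > threshold else 0 for b in brightness]
```

With such a scheme the base station could, for instance, tag its guide light with an identifier so the flying object can distinguish it from other light sources in the scene.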
  9.  A flying object capable of flying through the air while recognizing its relative positional relationship with respect to a base station arranged at a predetermined position, based on guide light emitted from the base station, the flying object comprising:
      a propulsion unit that generates propulsive force for flying through the air;
      a distance measuring unit that measures a distance from the flying object to an object present in a predetermined space around the flying object;
      an imaging unit that captures an image of a subject present in the predetermined space and generates image data; and
      a first controller that controls operation of the flying object,
     wherein the first controller determines the relative positional relationship with respect to the base station based on an image of the guide light contained in an image indicated by the image data generated by the imaging unit and on information from the distance measuring unit.
PCT/JP2016/004754 2016-02-10 2016-10-28 Flying body and control system therefor WO2017138049A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-023443 2016-02-10
JP2016023443 2016-02-10

Publications (1)

Publication Number Publication Date
WO2017138049A1 true WO2017138049A1 (en) 2017-08-17

Family

ID=59562947

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/004754 WO2017138049A1 (en) 2016-02-10 2016-10-28 Flying body and control system therefor

Country Status (1)

Country Link
WO (1) WO2017138049A1 (en)



Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008070374A (en) * 2007-09-25 2008-03-27 Fujifilm Corp Imaging device and distance-measuring method
JP2011022062A (en) * 2009-07-17 2011-02-03 Topcon Corp Position measurement method and position measuring instrument
JP2016015628 A * 2014-07-02 2016-01-28 Mitsubishi Heavy Industries, Ltd. Indoor monitoring system and indoor monitoring method for a structure

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019193642A1 * 2018-04-03 2019-10-10 Autonomous Control Systems Laboratory Ltd. Localization device and localization method for unmanned aerial vehicle
JPWO2019193642A1 * 2018-04-03 2021-04-30 Autonomous Control Systems Laboratory Ltd. Self-position estimation device and self-position estimation method for unmanned aerial vehicles
WO2019229887A1 * 2018-05-30 2019-12-05 Maxell, Ltd. Camera apparatus
JPWO2019229887A1 * 2018-05-30 2021-07-26 Maxell, Ltd. Camera device
JP7025539B2 2018-05-30 2022-02-24 Maxell, Ltd. Camera device
JP2020019371A * 2018-08-01 2020-02-06 Mitsubishi Logisnext Co., Ltd. Automated guided vehicle system using unmanned aerial vehicle
JP2020056627A * 2018-09-28 2020-04-09 Ricoh Company, Ltd. Imaging device

Similar Documents

Publication Publication Date Title
EP3187895B1 (en) Variable resolution light radar system
JP6729561B2 (en) Light irradiation device and light irradiation system
JP6371988B2 (en) Flying object
CN109074101B (en) Imaging using multiple drones
WO2017138049A1 (en) Flying body and control system therefor
US11019322B2 (en) Estimation system and automobile
JP2023022237A (en) Augmenting Panoramic LIDAR Results with Color
US10429508B2 (en) Distance measuring device, moving system, and distance measurement method
JP4928203B2 (en) Laser energy and information supply system
US11237252B2 (en) Detection apparatus, detection system, detection method, and movable device
US11999480B2 (en) Flight control system for unmanned aerial vehicle and topography measuring system
US20190116309A1 (en) Overhead line image capturing system and overhead line image capturing method
EP3788451B1 (en) Controlling a vehicle using a remotely located laser and an on-board camera
JP2017201757A (en) Image acquisition system, image acquisition method, and image processing method
WO2021241534A1 (en) Aerial photography system and method
KR20180027847A (en) Apparatus of detecting charging position for unmanned air vehicle
JP2020185941A (en) Unmanned aircraft, inspection method, and inspection program
JP7504502B2 (en) Aircraft
JP2020191523A (en) Unmanned mobile
JP2010133802A (en) Maritime monitoring/searching method
US20190152598A1 (en) Circular light source for obstacle detection
KR102553469B1 (en) Apparatus and method for determining position of a drone
KR102610855B1 (en) Integrated fusion sensor apparatus including detachable light source with built-in intelligent camera
JP2019219874A (en) Autonomous moving and imaging control system and autonomous moving body
JP2017227516A (en) Device, mobile body device, and method for measuring distance

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16889753

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16889753

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP