WO2015107623A1 - Management system and position specification method - Google Patents

Management system and position specification method

Info

Publication number
WO2015107623A1
WO2015107623A1 (PCT/JP2014/050499)
Authority
WO
WIPO (PCT)
Prior art keywords
unit
vehicle
information
specifying
feature
Prior art date
Application number
PCT/JP2014/050499
Other languages
French (fr)
Japanese (ja)
Inventor
崇 古庄
好祥 小林
畑野 一良
Original Assignee
Pioneer Corporation (パイオニア株式会社)
Priority date
Filing date
Publication date
Application filed by Pioneer Corporation
Priority to PCT/JP2014/050499 priority Critical patent/WO2015107623A1/en
Publication of WO2015107623A1 publication Critical patent/WO2015107623A1/en

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/14: Traffic control systems for road vehicles indicating individual free spaces in parking areas
    • G08G1/145: Traffic control systems for road vehicles indicating individual free spaces in parking areas where the indication depends on the parking areas
    • G08G1/146: Traffic control systems for road vehicles indicating individual free spaces in parking areas where the indication depends on the parking areas where the parking area is a limited parking space, e.g. parking garage, restricted space
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10: Navigation by using measurements of speed or acceleration
    • G01C21/12: Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16: Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165: Inertial navigation combined with non-inertial navigation instruments
    • G01C21/1656: Inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/14: Traffic control systems for road vehicles indicating individual free spaces in parking areas
    • G08G1/141: Traffic control systems for road vehicles indicating individual free spaces in parking areas with means giving the indication of available parking spaces
    • G08G1/142: Traffic control systems for road vehicles indicating individual free spaces in parking areas with means giving the indication of available parking spaces external to the vehicles
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/01: Detecting movement of traffic to be counted or controlled
    • G08G1/04: Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors

Definitions

  • The present invention relates to a management system and a position specifying method.
  • Conventionally, vehicle positions have been determined by GPS (Global Positioning System) positioning, based on the reception of radio waves from GPS satellites, and by odometry positioning using acceleration sensors, gyro sensors, wheel rotation speed, wheel angle, and the like.
  • However, GPS positioning cannot be performed for a vehicle parked indoors, such as in a warehouse, a multistory parking lot, or a ship's hold (including not only completely closed spaces but also spaces whose upper part is closed or substantially closed so that radio waves from GPS satellites cannot be received; hereinafter also referred to as "in a warehouse or the like").
  • If odometry positioning using an acceleration sensor, a gyro sensor, or the like arranged in each vehicle is used, the indoor vehicle position can be specified; however, specifying the vehicle position with high accuracy requires a sensor with a very small offset. Such sensors are very expensive, and it is not practical to install one in every vehicle.
  • Furthermore, the result of odometry positioning using the wheel rotation speed, wheel angle, and the like of each vehicle cannot always be acquired as information from the vehicle's ECU (Electronic Control Unit).
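For reference, odometry positioning of this kind integrates the wheel rotation speed and wheel angle into a position estimate, which is why even a small sensor offset accumulates into an ever-growing position error. Below is a minimal sketch assuming a simple bicycle model; the patent does not prescribe any particular motion model, so the model and parameters here are illustrative only.

```python
import math

def dead_reckon(pose, wheel_speed, wheel_angle, wheelbase, dt):
    """One odometry update using a simple bicycle model.

    pose: (x, y, heading) in map coordinates. Any constant offset in
    wheel_speed is integrated into x and y at every step, so the
    position error grows without bound over time.
    """
    x, y, heading = pose
    x += wheel_speed * math.cos(heading) * dt
    y += wheel_speed * math.sin(heading) * dt
    heading += (wheel_speed / wheelbase) * math.tan(wheel_angle) * dt
    return (x, y, heading)

# A 1% speed offset alone causes ~36 m of drift over an hour at 1 m/s:
pose = (0.0, 0.0, 0.0)
for _ in range(36000):                              # one hour at 10 Hz
    pose = dead_reckon(pose, 1.01, 0.0, 2.5, 0.1)   # true speed is 1.0
print(pose[0])                    # ~3636 m instead of the true 3600 m
```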
  • In addition, the ceiling height is generally limited, so it is difficult to install a support column of a height from which the entire floor can be viewed from above.
  • Moreover, imaging the entire indoor area of a building with a low ceiling and a large floor area requires a large number of cameras, so the configuration for specifying the position of an object such as a vehicle is not simple, and large-scale facility construction is unavoidable.
  • The invention according to claim 1 is a management system comprising: an unmanned air vehicle capable of flying indoors; a position related information acquisition unit that is mounted on the unmanned air vehicle and acquires information related to the position of a predetermined object present indoors while the unmanned air vehicle is in flight; and a position specifying unit that specifies the indoor position of the object based on the information acquired by the position related information acquisition unit.
  • The invention according to claim 7 is a management system comprising: an unmanned traveling body capable of traveling on an indoor floor surface; a position related information acquisition unit that is mounted on the unmanned traveling body and acquires information related to the position of a predetermined object present indoors, from above the object, while the unmanned traveling body is traveling; and a position specifying unit that specifies the indoor position of the object based on the information acquired by the position related information acquisition unit.
  • The invention according to claim 8 is a position specifying method used in a management system comprising an unmanned air vehicle capable of flying indoors, a position related information acquisition unit that is mounted on the unmanned air vehicle and acquires information related to the position of a predetermined object present indoors while the unmanned air vehicle is in flight, and a position specifying unit that specifies the indoor position of the object. The method comprises: an acquisition step in which the position related information acquisition unit acquires information related to the position of the object during the flight of the unmanned air vehicle; and a position specifying step in which the position specifying unit specifies the indoor position of the object based on the information acquired in the acquisition step.
  • FIG. 1 is a block diagram illustrating a configuration of a management system 500 according to an embodiment.
  • This management system 500 manages the vehicles CR1, CR2, CR3, … parked indoors in the building BLD1.
  • In this embodiment, each vehicle CRj (j = 1, 2, 3, …) carries a feature information sheet QRj on which feature information indicating features unique to the vehicle CRj, such as its vehicle type, manufacturer, color, electrical component specifications, interior and exterior specifications, and the presence of optional parts, is represented as a QR code (registered trademark). The content of the feature information is determined in advance from the viewpoint of vehicle management inside the building BLD1.
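As an illustration of how such a feature information sheet might be produced, the sketch below encodes a vehicle's feature record as a QR code. It assumes the third-party `qrcode` Python package and a hypothetical JSON field layout; the patent only specifies that the feature information is represented as a QR code, not how it is serialized.

```python
import json
import qrcode  # third-party package: pip install "qrcode[pil]"

# Hypothetical feature record for vehicle CR_1. The actual fields are
# chosen in advance from the viewpoint of vehicle management; the patent
# mentions vehicle type, manufacturer, color, equipment specifications,
# and optional parts as examples.
feature_info = {
    "vehicle_id": "CR_1",
    "vehicle_type": "sedan",
    "manufacturer": "ExampleMotors",
    "color": "white",
    "options": ["sunroof", "navigation"],
}

img = qrcode.make(json.dumps(feature_info))
img.save("QR_1.png")  # printed and affixed to the vehicle as sheet QR_1
```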
  • As shown in FIG. 1, the management system 500 includes an unmanned air vehicle 100, a flying object mounting device 200, a relay device 300, and a processing control device 400.
  • The unmanned air vehicle 100, the flying object mounting device 200, and the relay device 300 are arranged in the building BLD1.
  • The processing control device 400 is arranged in a building BLD2 different from the building BLD1.
  • In this embodiment, the unmanned air vehicle 100 has a plurality of propellers.
  • The flight speed, flight path, and other parameters of the unmanned air vehicle 100 can be remotely controlled by the processing control device 400.
  • Inside the building BLD1, a flight base BS equipped with charging facilities for the unmanned air vehicle 100 is installed, and the unmanned air vehicle 100 can be charged at the flight base BS.
  • The flying object mounting device 200 described above is mounted on the unmanned air vehicle 100.
  • This flying object mounting device 200 can communicate wirelessly with the relay device 300. The details of its configuration will be described later.
  • In this embodiment, the main body of the relay device 300 is arranged at the flight base BS.
  • The relay device 300 includes an antenna 300A1 for performing wireless communication with the flying object mounting device 200 and an antenna 300A2 for performing wireless communication with the processing control device 400.
  • Of the components of the relay device 300, only the antenna 300A2 is installed outdoors of the building BLD1.
  • When the relay device 300 receives a radio signal transmitted from the flying object mounting device 200 with the antenna 300A1, it performs amplification processing and the like as appropriate and then transmits the signal from the antenna 300A2 toward the processing control device 400. Likewise, when the relay device 300 receives a radio signal transmitted from the processing control device 400 with the antenna 300A2, it performs amplification processing and the like as appropriate and then transmits the signal from the antenna 300A1 toward the flying object mounting device 200.
  • The processing control device 400 includes an antenna 400A for performing wireless communication with the relay device 300.
  • Of the components of the processing control device 400, only the antenna 400A is installed outdoors of the building BLD2.
  • As shown in FIG. 2, the flying object mounting device 200 includes an antenna 210, a wireless transmission / reception unit 220, and a rotor control unit 230.
  • The flying object mounting device 200 also includes a position related information acquisition unit 240, a displacement information acquisition unit 250, an ambient environment information acquisition unit 260, and a feature information acquisition unit 270.
  • The wireless transmission / reception unit 220 uses the antenna 210 to exchange information with the relay device 300 (and thus with the processing control device 400). When it receives information transmitted from the relay device 300 via the antenna 210, it forwards the information to the appropriate one of the elements 230 to 270 according to the content of the information. Conversely, when it receives information sent from any of the elements 230 to 270, it transmits the information to the relay device 300 via the antenna 210.
  • The rotor control unit 230 receives the flight control information sent from the processing control device 400 via the relay device 300, and controls the rotation of the propellers of the unmanned air vehicle 100 according to that flight control information. The unmanned air vehicle 100 can therefore be flown at the flight speed and along the flight path specified by the flight plan generated by the processing control device 400.
  • the position-related information acquisition unit 240 includes an imaging device such as an optical camera.
  • the imaging device captures an image below the unmanned air vehicle 100 while the unmanned air vehicle 100 is flying. This imaging result is sent to the processing control device 400 via the wireless transmission / reception unit 220 and the relay device 300.
  • the displacement information acquisition unit 250 includes a so-called internal sensor such as a three-dimensional acceleration sensor or a gyro sensor.
  • the detection results by these internal sensors are sent as displacement information to the processing control device 400 via the wireless transmission / reception unit 220 and the relay device 300.
  • The ambient environment information acquisition unit 260 includes a so-called external sensor such as a laser range finder (hereinafter, "LRF").
  • The LRF detects the distance from the current position of the unmanned air vehicle 100 to obstacles to flight (e.g., walls, columns, beams, the ceiling, and the floor) in all directions.
  • The detection results of this external sensor are sent as ambient environment information to the processing control device 400 via the wireless transmission / reception unit 220 and the relay device 300.
  • The feature information acquisition unit 270 shares the above-described imaging device with the position related information acquisition unit 240, and additionally includes a QR code decoding processing unit.
  • the imaging device captures an image below the unmanned air vehicle 100 while the unmanned air vehicle 100 is flying.
  • When the decoding processing unit detects a QR code in the imaging results from the imaging device, it decodes the QR code to acquire the above-described feature information of the corresponding vehicle CRj.
  • the feature information acquired in this way is sent to the processing control device 400 via the wireless transmission / reception unit 220 and the relay device 300.
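A minimal sketch of this decoding step is shown below, using OpenCV's built-in QR detector as a stand-in for the decoding processing unit; the patent does not name a decoding library, and the JSON payload layout is the hypothetical one from the encoding sketch above.

```python
import json
import cv2  # OpenCV: pip install opencv-python

def decode_feature_info(frame):
    """Detect and decode a QR code in one downward camera frame.

    Returns the feature record as a dict, or None if no readable
    QR code is present in the frame.
    """
    detector = cv2.QRCodeDetector()
    payload, points, _ = detector.detectAndDecode(frame)
    if not payload:  # empty string means nothing was detected/decoded
        return None
    return json.loads(payload)

frame = cv2.imread("below_uav.png")  # one frame from the downward camera
info = decode_feature_info(frame)
if info is not None:
    print("feature information for", info["vehicle_id"])
```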
  • the flying object mounting apparatus 200 further includes an illumination unit (not shown) that illuminates the imaging range.
  • the processing control apparatus 400 includes a wireless transmission / reception unit 420, a storage unit 430, and a flight control unit 440 in addition to the antenna 400A described above. Further, the processing control apparatus 400 includes a flying object position detection unit 451, a map generation unit 455, a vehicle position specification unit 461, and a vehicle feature specification unit 465. Furthermore, the processing control apparatus 400 includes a display unit 470 and an input unit 480.
  • the wireless transmission / reception unit 420 transmits / receives information to / from the relay device 300 (and thus the flying object mounting device 200) using the antenna 400A. Then, when receiving the information transmitted from the relay device 300 via the antenna 400A, the wireless transmission / reception unit 420 appropriately transmits the information to the elements 451 to 465 according to the content of the information. In addition, when receiving the information sent from the flight control unit 440, the wireless transmission / reception unit 420 transmits the information to the relay device 300 via the antenna 400A.
  • Information sent from the elements 240 to 270 of the flying object mounting device 200 passes in turn through the wireless transmission / reception unit 220 and the antenna 210, the relay device 300, and the antenna 400A and the wireless transmission / reception unit 420 of the processing control device 400,
  • and is then delivered to the appropriate one of the elements 451 to 465 of the processing control device 400.
  • Likewise, information sent from the flight control unit 440 of the processing control device 400 passes in turn through the wireless transmission / reception unit 420 and the antenna 400A, the relay device 300, and the antenna 210 and the wireless transmission / reception unit 220 of the flying object mounting device 200, and is delivered to the rotor control unit 230 of the flying object mounting device 200.
  • In the following description, mention of these sequentially interposed elements is omitted.
  • The storage unit 430 stores various information used by the processing control device 400. The stored information includes the map information generated by the map generation unit 455 and a vehicle information table in which the vehicle positions specified by the vehicle position specifying unit 461 are associated with the feature information specified by the vehicle feature specifying unit 465. Any of the elements 440 to 465 can access the storage unit 430.
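The vehicle information table can be pictured as a keyed store that first receives vehicle positions and is later completed with the feature information decoded near each position. The sketch below is one possible in-memory layout; the patent does not mandate any particular data structure.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class VehicleRecord:
    position: tuple                      # (x, y) indoor map coordinates
    feature_info: Optional[dict] = None  # filled in by the feature pass

class VehicleInfoTable:
    """Sketch of the table in the storage unit 430: vehicle positions
    associated with the feature information decoded at each position."""

    def __init__(self):
        self.records = []

    def register_position(self, position):
        # Written first by the vehicle position specifying unit 461.
        self.records.append(VehicleRecord(position))

    def register_features(self, uav_position, feature_info):
        # The vehicle feature specifying unit 465 attaches features to
        # the registered position nearest the pose at which the QR code
        # was decoded.
        nearest = min(self.records, key=lambda r:
                      (r.position[0] - uav_position[0]) ** 2 +
                      (r.position[1] - uav_position[1]) ** 2)
        nearest.feature_info = feature_info

    def all_features_registered(self):
        # Used to decide when the feature specifying pass is complete.
        return all(r.feature_info is not None for r in self.records)
```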
  • The flight control unit 440 receives flight control requests sent from the map generation unit 455, the vehicle position specifying unit 461, and the vehicle feature specifying unit 465. Based on the map information in the storage unit 430 and the current position of the unmanned air vehicle 100 sent from the flying object position detection unit 451 (hereinafter, the "flying object current position"), the flight control unit 440 sequentially generates flight control information for realizing the flight speed and flight path (including the flight altitude) of the flight plan specified in the flight control request. The flight control information generated in this way is transmitted to the rotor control unit 230 of the flying object mounting device 200.
  • The flying object position detection unit 451 mentioned above receives the detection results of the internal sensors sent from the displacement information acquisition unit 250 of the flying object mounting device 200 and the detection results of the external sensor sent from the ambient environment information acquisition unit 260. Based on these detection results and the map information in the storage unit 430, the flying object position detection unit 451 detects the flying object current position within the map represented by the map information. The flying object current position detected in this way is sequentially sent to the flight control unit 440, the map generation unit 455, the vehicle position specifying unit 461, and the vehicle feature specifying unit 465.
  • More specifically, the flying object position detection unit 451 calculates a provisional flying object current position from the detection results of the internal sensors (that is, the displacement information), such as a calculated moving distance and positioning result, and then corrects it based on the detection results of the external sensor (that is, the ambient environment information) and the map information in the storage unit 430. The flying object position detection unit 451 can therefore accurately detect the flying object current position in the map represented by the map information while preventing the offsets in the internal sensor detection results from accumulating.
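The predict-and-correct scheme of the flying object position detection unit 451 can be illustrated in one dimension: dead reckoning from the internal sensors drifts, while the external sensor anchors the estimate to a map feature. The blending below is a simple complementary filter chosen for illustration; the patent does not specify the correction method.

```python
def detect_position(prev_x, odom_dx, lrf_range_to_wall, wall_x, alpha=0.5):
    """One-dimensional illustration of the scheme of unit 451.

    prev_x:            previous corrected flying object position
    odom_dx:           displacement from the internal sensors (with offset)
    lrf_range_to_wall: external-sensor (LRF) distance to a known wall
    wall_x:            wall position taken from the map information
    """
    provisional = prev_x + odom_dx          # dead reckoning: drifts
    measured = wall_x - lrf_range_to_wall   # position implied by LRF + map
    return alpha * provisional + (1 - alpha) * measured

x_est, true_x = 0.0, 0.0
for _ in range(100):
    true_x += 1.0                           # true motion: 1 m per step
    odom_dx = 1.0 + 0.05                    # internal sensor offset of 5 cm
    x_est = detect_position(x_est, odom_dx, 100.0 - true_x, 100.0)
# The estimation error stays bounded near 0.05 m instead of accumulating
# to 5 m, because each LRF update cancels the dead-reckoning drift.
```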
  • The map generation unit 455 generates at least one of a two-dimensional map and a three-dimensional map of the indoor area of the building BLD1. Map generation starts when a map generation start command sent from the input unit 480 is received.
  • When generating a map, the map generation unit 455 receives the external sensor detection results sent from the ambient environment information acquisition unit 260, and generates map information for the area around the unmanned air vehicle 100 based on those detection results and the flying object current position sent from the flying object position detection unit 451.
  • While an unmapped area remains, the map generation unit 455 generates a flight plan for mapping that area and sends a flight control request specifying the generated flight plan to the flight control unit 440.
  • the unmanned air vehicle 100 flies according to the flight plan.
  • The map generation unit 455 then receives the detection results of the external sensor mounted on the unmanned air vehicle 100 flying according to the map generation flight plan, and generates a map of the area around the unmanned air vehicle 100 based on those detection results and the flying object current position sent from the flying object position detection unit 451.
  • the map generation unit 455 updates the map information in the storage unit 430 by adding the map information of the new area.
  • When the map generation unit 455 receives a map display command sent from the input unit 480, it refers to the map information in the storage unit 430, generates display data for displaying the indoor map of the building BLD1, and sends the generated display data to the display unit 470. As a result, the indoor map of the building BLD1 is displayed on the display unit 470.
  • In this embodiment, while the map is being generated, the flying object position detection unit 451 detects the position of the unmanned air vehicle 100 using the map generated up to each point in time. That is, the indoor map of the building BLD1 is generated with high accuracy using a so-called SLAM (Simultaneous Localization And Mapping) technique.
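A toy version of this SLAM alternation is sketched below: each step dead-reckons a provisional pose, corrects it by matching the LRF scan against the map built so far, and then extends the map from the corrected pose. The candidate-search "scan matching" here is a deliberately crude stand-in for a real method, which the patent leaves unspecified.

```python
import math

def slam_loop(odometry, scans):
    """Toy SLAM alternation over a point-set map.

    odometry: iterable of (dx, dy) pose increments from internal sensors
    scans:    iterable of LRF scans, each a list of (range, bearing)
    """
    pose = (0.0, 0.0)
    occupied = set()  # the "map": grid cells observed as occupied
    for (dx, dy), scan in zip(odometry, scans):
        # Predict: dead-reckoned provisional pose.
        px, py = pose[0] + dx, pose[1] + dy
        best = (px, py)
        if occupied:
            # Correct: pick the candidate pose whose scan endpoints
            # agree best with the map built so far (crude scan match).
            candidates = [(px + ox * 0.1, py + oy * 0.1)
                          for ox in (-1, 0, 1) for oy in (-1, 0, 1)]
            best = max(candidates, key=lambda c: sum(
                (round(c[0] + r * math.cos(b)),
                 round(c[1] + r * math.sin(b))) in occupied
                for r, b in scan))
        pose = best
        # Map update: insert scan endpoints seen from the corrected pose.
        for r, b in scan:
            occupied.add((round(pose[0] + r * math.cos(b)),
                          round(pose[1] + r * math.sin(b))))
    return pose, occupied
```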
  • Specification of the vehicle positions starts when a vehicle position specification start command sent from the input unit 480 is received.
  • When specifying the vehicle positions, the vehicle position specifying unit 461 refers to the map information in the storage unit 430 and generates a flight plan, starting from the current position of the unmanned air vehicle 100, for specifying the vehicle positions. A flight control request designating the flight plan is sent to the flight control unit 440, and as a result the unmanned air vehicle 100 flies according to the flight plan.
  • In addition, the vehicle position specifying unit 461 sends a downward imaging designation to the position related information acquisition unit 240 of the flying object mounting device 200.
  • As a result, the imaging results below the unmanned air vehicle 100 flying according to the above-described vehicle position specifying flight plan are sent from the position related information acquisition unit 240 to the vehicle position specifying unit 461.
  • The vehicle position specifying unit 461 analyzes the imaging results and specifies the indoor position in the building BLD1 of each vehicle CRj parked there. It then registers the newly specified vehicle positions in the vehicle position portion of the vehicle information table in the storage unit 430.
  • When the vehicle position specifying unit 461 receives a vehicle position display command sent from the input unit 480, it refers to the map information in the storage unit 430 and the vehicle positions in the vehicle information table, generates display data for displaying the vehicle positions inside the building BLD1, and sends the generated display data to the display unit 470. As a result, the positions of the vehicles parked inside the building BLD1 are displayed on the display unit 470.
  • Specification of the feature information starts when a vehicle feature specification start command sent from the input unit 480 is received.
  • When specifying the feature information, the vehicle feature specifying unit 465 refers to the map information in the storage unit 430, generates a flight plan for specifying the feature information, and sends a flight control request specifying the generated flight plan to the flight control unit 440. As a result, the unmanned air vehicle 100 flies according to the flight plan.
  • In addition, the vehicle feature specifying unit 465 sends a downward imaging designation and a QR code decoding designation to the feature information acquisition unit 270 of the flying object mounting device 200.
  • As a result, the feature information of each vehicle CRj, that is, the QR code decoding results obtained during the flight according to the feature specifying flight plan, is sent from the feature information acquisition unit 270 to the vehicle feature specifying unit 465.
  • The vehicle feature specifying unit 465 associates each newly received piece of feature information with a vehicle position based on the flying object current position at the time of reception. It then registers the feature information, associated with that vehicle position, in the feature information portion of the vehicle information table in the storage unit 430.
  • When the vehicle feature specifying unit 465 receives a vehicle feature display command sent from the input unit 480, it refers to the vehicle position designated by the command and to the vehicle information table, generates display data for displaying the feature information of the vehicle parked at that position, and sends the generated display data to the display unit 470. As a result, the feature information of the designated vehicle parked indoors in the building BLD1 is displayed on the display unit 470.
  • The display unit 470 includes a display device such as a liquid crystal panel, an organic EL (Electro Luminescence) panel, or a PDP (Plasma Display Panel).
  • the display unit 470 receives display data sent from the map generation unit 455, the vehicle position specifying unit 461, or the vehicle feature specifying unit 465, the display unit 470 displays an image corresponding to the display data.
  • the input unit 480 includes a stroke device such as a keyboard and a pointing device such as a mouse.
  • the input unit 480 sends an input result to the map generation unit 455 when a map generation start command or a map display command is input.
  • When a vehicle position specification start command or a vehicle position display command is input, the input unit 480 sends the input result to the vehicle position specifying unit 461.
  • When a vehicle feature specification start command or a vehicle feature display command is input, the input unit 480 sends the input result to the vehicle feature specifying unit 465.
  • Next, the map generation processing by the map generation unit 455, the vehicle position specifying processing by the vehicle position specifying unit 461, and the vehicle feature specifying processing by the vehicle feature specifying unit 465 are described.
  • In the following, it is assumed that no indoor map information of the building BLD1 is stored in the storage unit 430,
  • that no vehicle position information or feature information is registered in the vehicle information table in the storage unit 430,
  • that the unmanned air vehicle 100 is stationed at the flight base BS of the building BLD1,
  • and that the flying object mounting device 200 has already started operation.
  • The map generation processing, the vehicle position specifying processing, and the vehicle feature specifying processing are executed sequentially according to the user's command inputs to the input unit 480.
  • When the map generation unit 455 receives the map generation start command sent from the input unit 480, it starts the map generation processing.
  • In this map generation processing, as shown in FIG. 4, first, in step S11, the map generation unit 455 collects the external sensor detection results sent from the ambient environment information acquisition unit 260 while the unmanned air vehicle 100 is located at the initial position (the position of the flight base BS in the building BLD1). A map is then generated for the peripheral area within which the external sensor detection results can ensure a predetermined accuracy (hereinafter simply the "peripheral area"), and the map generation unit 455 stores the generated map information in the storage unit 430.
  • In step S12, in order to map the not-yet-generated region, the map generation unit 455 generates an initial flight plan for flying to one outer edge position of the region covered by the map information in the storage unit 430. The map generation unit 455 then sends a flight control request specifying the generated initial flight plan to the flight control unit 440. As a result, the unmanned air vehicle 100 flies according to the initial flight plan.
  • In step S13, the map generation unit 455 generates a three-dimensional map of the area surrounding the in-flight position of the unmanned air vehicle 100, based on the in-flight detection results of the external sensor mounted on the unmanned air vehicle 100 and the flying object current position sequentially sent from the flying object position detection unit 451, and adds the newly generated map information to the map information in the storage unit 430.
  • Here, the flying object position detection unit 451 accurately detects the flying object current position as described above and sequentially sends it to the map generation unit 455.
  • In step S14, the map generation unit 455 determines whether the map of the entire indoor area of the building BLD1 has been completed. In making this determination, the map generation unit 455 judges whether the only positions that remain unmapped are ones the unmanned air vehicle 100 cannot reach because of obstacles.
  • If the result of the determination in step S14 is negative (step S14: N), the process proceeds to step S15.
  • In step S15, in order to map the not-yet-generated region, the map generation unit 455 generates a next flight plan for flying to one outer edge position of the region covered by the map information in the storage unit 430 at that time. The map generation unit 455 then sends a flight control request specifying the generated next flight plan to the flight control unit 440. As a result, the unmanned air vehicle 100 flies according to the next flight plan.
  • When the processing of step S15 is completed, the process returns to step S13. Thereafter, the processing of steps S13 to S15 is repeated until the result of the determination in step S14 becomes affirmative.
  • When the result of the determination in step S14 becomes affirmative (step S14: Y), the process proceeds to step S16, in which the map generation unit 455 generates a flight plan for the unmanned air vehicle 100 to return to the flight base BS (hereinafter, the "return flight plan") and sends a flight control request designating the generated return flight plan to the flight control unit 440. As a result, the unmanned air vehicle 100 flies according to the return flight plan and returns to the flight base BS. The map generation processing then ends.
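Steps S11 to S16 amount to a frontier-style exploration loop: map the current surroundings, fly to an outer edge of the mapped region, and repeat until nothing reachable remains unmapped. The sketch below captures that flow over an abstract cell graph; the flight and sensing primitives are reduced to set operations for illustration.

```python
def generate_map(start, neighbors, blocked):
    """Map generation flow of steps S11 to S16 over an abstract cell graph.

    start:     initial cell (the flight base BS)
    neighbors: function mapping a cell to its adjacent cells
    blocked:   cells the unmanned air vehicle cannot enter (obstacles)
    """
    mapped = {start}        # S11: map the area around the initial position
    frontier = [start]
    while frontier:         # S14: complete once no frontier cells remain
        cell = frontier.pop()          # S12/S15: fly to an outer edge
        for nxt in neighbors(cell):    # S13: map the surrounding area
            if nxt not in mapped and nxt not in blocked:
                mapped.add(nxt)
                frontier.append(nxt)
    return mapped           # S16: return flight plan back to the base

# Usage on a 4 x 4 floor with one obstacle column at (2, 2):
grid_neighbors = lambda c: [(c[0] + dx, c[1] + dy)
                            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                            if 0 <= c[0] + dx < 4 and 0 <= c[1] + dy < 4]
print(len(generate_map((0, 0), grid_neighbors, blocked={(2, 2)})))  # 15 cells
```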
  • When the vehicle position specifying unit 461 receives the vehicle position specification start command sent from the input unit 480, it starts the vehicle position specifying processing.
  • First, as shown in FIG. 5, the vehicle position specifying unit 461 refers to the map information in the storage unit 430 and generates a flight plan for specifying the vehicle positions. It then sends a flight control request designating the generated flight plan to the flight control unit 440, and sends a downward imaging designation to the position related information acquisition unit 240 of the flying object mounting device 200. As a result, the imaging results below the unmanned air vehicle 100 flying according to the vehicle position specifying flight plan are sent from the position related information acquisition unit 240 to the vehicle position specifying unit 461.
  • In step S22, the vehicle position specifying unit 461 collects the imaging results below the in-flight unmanned air vehicle 100 sent from the position related information acquisition unit 240, in association with the flying object current position sequentially sent from the flying object position detection unit 451. When the collection of images for specifying the position of every vehicle parked indoors in the building BLD1 is completed, the process proceeds to step S23.
  • In step S23, the vehicle position specifying unit 461 analyzes the images collected in step S22 and specifies the position of each vehicle parked indoors in the building BLD1. It then registers the newly specified vehicle positions in the vehicle position portion of the vehicle information table in the storage unit 430, and the vehicle position specifying processing ends.
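The essential computation in step S23 is mapping each vehicle detected in a downward image into indoor map coordinates using the flying object current position recorded with that image. The sketch below assumes a downward-looking camera with a known ground footprint; the patent leaves the image analysis method open, so the detection step itself is taken as given.

```python
def vehicle_map_position(det_px, image_size, uav_pose, footprint_m):
    """Convert a vehicle detection in a downward image to map coordinates.

    det_px:      (u, v) pixel coordinates of the detected vehicle center
    image_size:  (width, height) of the image in pixels
    uav_pose:    (x, y) map position of the unmanned air vehicle when the
                 image was captured (from the position detection unit 451)
    footprint_m: (w, h) ground area covered by the image in meters,
                 assumed known from the flight altitude and camera optics
    """
    (u, v), (w_px, h_px), (w_m, h_m) = det_px, image_size, footprint_m
    # Offset of the detection from the image center, scaled to meters.
    dx = (u - w_px / 2) * (w_m / w_px)
    dy = (v - h_px / 2) * (h_m / h_px)
    return (uav_pose[0] + dx, uav_pose[1] + dy)

# A vehicle detected 160 px right of center in a 640 x 480 image that
# covers 8 m x 6 m of floor lies 2 m from the UAV's map position:
print(vehicle_map_position((480, 240), (640, 480), (10.0, 20.0), (8.0, 6.0)))
# -> (12.0, 20.0)
```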
  • When, after the vehicle position specifying processing has been completed as described above, the vehicle position specifying unit 461 receives a vehicle position display command sent from the input unit 480, it refers to the map information and the vehicle positions in the vehicle information table in the storage unit 430, generates display data for displaying the vehicle positions inside the building BLD1, and sends the generated display data to the display unit 470.
  • the position of the parked vehicle inside the building BLD1 is displayed on the display unit 470.
  • An example of the display image displayed on the display unit 470 in this way is shown in FIG.
  • When the vehicle feature specifying unit 465 receives the vehicle feature specification start command sent from the input unit 480, it starts the vehicle feature specifying processing.
  • In step S31, the vehicle feature specifying unit 465 first generates a flight plan from the flying object current position to the first vehicle whose feature information is to be specified (hereinafter, the "first vehicle"). It then sends a flight control request designating the generated flight plan to the first vehicle to the flight control unit 440, and sends a downward imaging designation and a QR code decoding designation to the feature information acquisition unit 270.
  • As a result, the unmanned air vehicle 100 flies according to the flight plan and reaches the vicinity of the first vehicle. The feature information acquisition unit 270 then analyzes the QR code image contained in the downward imaging results, decodes the feature information, and sends the decoded feature information to the vehicle feature specifying unit 465.
  • Here, the flight plan is generated so that, on reaching the target vehicle, the unmanned air vehicle 100 is at an altitude from which the QR code on the feature information sheet affixed to the first vehicle can be imaged.
  • In step S32, the vehicle feature specifying unit 465 collects the feature information sent from the feature information acquisition unit 270 in association with the flying object current position sequentially sent from the flying object position detection unit 451. It then registers the newly collected feature information in the feature information portion of the vehicle information table in the storage unit 430, associated with the vehicle position corresponding to the flying object current position at the time the feature information was received.
  • In step S33, the vehicle feature specifying unit 465 determines whether feature information has been specified for all of the vehicles parked indoors in the building BLD1. In making this determination, it checks whether feature information has been registered for every vehicle position registered in the vehicle position portion of the vehicle information table in the storage unit 430.
  • If the result of the determination in step S33 is negative (step S33: N), the process proceeds to step S34, in which a flight plan from the flying object current position to the next vehicle whose feature information is to be specified (hereinafter, the "next vehicle") is generated in the same manner as in step S31 described above. The vehicle feature specifying unit 465 then sends a flight control request designating the generated flight plan to the next vehicle to the flight control unit 440, and sends a downward imaging designation and a QR code decoding designation to the feature information acquisition unit 270.
  • When step S34 ends, the process returns to step S32. Thereafter, the processing of steps S32 to S34 is repeated until the result of the determination in step S33 becomes affirmative.
  • If the result of the determination in step S33 is affirmative (step S33: Y), the process proceeds to step S35.
  • In step S35, the vehicle feature specifying unit 465 generates a return flight plan and sends a flight control request designating the generated return flight plan to the flight control unit 440.
  • As a result, the unmanned air vehicle 100 flies according to the return flight plan and returns to the flight base BS. The vehicle feature specifying processing then ends.
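Taken together, steps S31 to S35 form a visit-decode-register loop over the vehicle positions already in the table. A compact sketch follows, with the flight and decoding steps passed in as stand-in callables, since they belong to other units of the system:

```python
def specify_vehicle_features(vehicle_positions, fly_to, decode_qr_nearby):
    """Vehicle feature specifying flow of steps S31 to S35.

    vehicle_positions: positions registered by the position specifying pass
    fly_to:            callable that brings the UAV over a position at a
                       QR-imaging altitude (flight plan / rotor control)
    decode_qr_nearby:  callable returning the decoded feature dict at the
                       current position (feature information acquisition 270)
    """
    table = {}
    for pos in vehicle_positions:        # S31/S34: fly to the next vehicle
        fly_to(pos)
        table[pos] = decode_qr_nearby()  # S32: collect and associate
    # S33 is now satisfied: every registered position has feature info.
    fly_to("BS")                         # S35: return flight plan to base
    return table

# Minimal usage with stand-in callables:
positions = [(12.0, 20.0), (15.5, 20.0)]
decoded = iter([{"vehicle_id": "CR_1"}, {"vehicle_id": "CR_2"}])
print(specify_vehicle_features(positions, fly_to=lambda p: None,
                               decode_qr_nearby=lambda: next(decoded)))
```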
  • When, after the vehicle feature specifying processing has been completed as described above, the vehicle feature specifying unit 465 receives a vehicle feature display command sent from the input unit 480, it refers to the vehicle position designated by the command and to the vehicle information table, generates display data for displaying the feature information of the vehicle parked at that position, and sends the generated display data to the display unit 470. As a result, the feature information of the designated vehicle parked indoors in the building BLD1 is displayed on the display unit 470.
  • Here, the vehicle feature display command is input by designating the display position of one vehicle, with the pointing device, in the display image on the display unit 470 shown in FIG. 6 described above.
  • In this case, the feature information of the designated vehicle is displayed superimposed on the display image on the display unit 470 shown in FIG. 6.
  • As described above, in this embodiment, the position related information acquisition unit 240 mounted on the unmanned air vehicle 100, which can fly indoors, acquires information related to the positions of the vehicles present indoors in the building BLD1 during the flight of the unmanned air vehicle 100. Based on the information acquired by the position related information acquisition unit 240, the vehicle position specifying unit 461 specifies the positions of the vehicles inside the building BLD1. Therefore, the vehicle positions can be specified easily, without new facility construction, even while the building BLD1 is in use.
  • In this embodiment, the map generation unit 455 generates the indoor map of the building BLD1 based on the information acquired by the displacement information acquisition unit 250 and the ambient environment information acquisition unit 260 mounted on the unmanned air vehicle 100. Simultaneously with the generation of the map, the flying object position detection unit 451 detects the position of the unmanned air vehicle 100 within the map being generated by the map generation unit 455. That is, the indoor map of the building BLD1 is generated using a so-called SLAM technique, so it can be generated with high accuracy.
  • Furthermore, the position of the unmanned air vehicle 100 in flight for the vehicle position specification described above is detected by the flying object position detection unit 451 within the generated map, based on the information acquired by the displacement information acquisition unit 250 and the ambient environment information acquisition unit 260. Therefore, the in-flight position of the unmanned air vehicle 100 can be detected accurately for the purpose of specifying the vehicle positions, while avoiding the loss of accuracy caused by the accumulating offsets of the internal sensors of the displacement information acquisition unit 250. As a result, the vehicle positions can be specified accurately.
  • In this embodiment, the feature information acquisition unit 270 mounted on the unmanned air vehicle 100 acquires the feature information of each vehicle, and the vehicle feature specifying unit 465 specifies the feature information of each vehicle by associating the feature information acquired by the feature information acquisition unit 270 with the vehicle positions specified by the vehicle position specifying unit 461. Therefore, in addition to the position of each vehicle, its feature information can be specified.
  • In this embodiment, a feature information sheet on which the feature information is represented as a QR code is affixed to each vehicle. The feature information sheet can therefore be made compact, and the feature information of each vehicle can be acquired easily.
  • In this embodiment, the position related information acquisition unit 240 includes an imaging device, and the vehicle position specifying unit 461 specifies the indoor positions of the vehicles in the building BLD1 by analyzing the imaging results of the imaging device. The positions of the vehicles inside the building BLD1 can therefore be specified with a simple configuration.
  • In the embodiment described above, the feature information sheet represents the feature information of the vehicle to which it is affixed as a QR code, but the feature information may be represented in another form.
  • For example, an IC tag may be used instead of the feature information sheet, and the feature information may be acquired by non-contact communication with the IC tag.
  • Also, in the embodiment described above, the feature information acquisition unit 270 includes the QR code decoding processing unit,
  • but the vehicle feature specifying unit may include the QR code decoding processing unit instead.
  • In the embodiment described above, the object whose position and feature information are specified is a vehicle, but objects other than vehicles may also be targeted for position and feature specification.
  • For example, a spectator holding a ticket with an IC tag in which the designated seat position is stored as feature information may be targeted for position specification, so that it can be confirmed whether the spectator holding the ticket is seated at the correct seat position.
  • In the embodiment described above, the position related information acquisition unit includes an imaging device,
  • but a laser radar, a thermal sensor, or the like may be included instead of the imaging device.
  • In the embodiment described above, the map generation unit generates the map using both the displacement information and the ambient environment information,
  • but the map generation unit may create the map using only one of the displacement information and the ambient environment information.
  • In the embodiment described above, the processing control device is a single device, but the functions of the processing control device may be achieved by a plurality of devices that can communicate with each other.
  • The plurality of devices can be, for example, a server device with the computing power needed for image analysis and the like, and a personal computer that can communicate with the server device.
  • In the embodiment described above, the position related information acquisition unit, the displacement information acquisition unit, the ambient environment information acquisition unit, and the feature information acquisition unit are mounted on the unmanned air vehicle, but these acquisition units may instead be mounted on an unmanned traveling body that can travel on the indoor floor surface.
  • the relay device and the processing control device are separate devices, but may be a single device.
  • the relay device and the processing control device are connected wirelessly, but may be connected by wire.
  • the relay device and the processing control device are arranged separately in different buildings, but may be arranged in the same building.
  • In addition, the display unit may display information on the flying object current position of the unmanned air vehicle detected by the flying object position detection unit and on the positions of the flight base, the relay device, and the like, either alone or together with the position specification results.
  • In the embodiment described above, one unmanned air vehicle 100 and one flight base BS are provided,
  • but a plurality of unmanned air vehicles 100 and/or a plurality of flight bases BS may be provided.

Abstract

A position-related information acquisition unit is provided to a flying-body-mounted device (200) mounted on an unmanned flying body (100) capable of flying indoors and acquires information related to the positions of vehicles (CR1, CR2, CR3, ...) inside a building (BLD1) while the unmanned flying body (100) is in flight. When a vehicle position specification unit that a processing control device (400) is provided with receives, through wireless transmission, information acquired by the position-related information acquisition unit, the vehicle position specification unit specifies the positions of the vehicles (CR1, CR2, CR3, ...) inside the building (BLD1). As a result, even if the building (BLD1) is in use, the locations of the vehicles (CR1, CR2, CR3, ...) inside the building (BLD1) can be specified easily without requiring new equipment installation.

Description

Management system and position specification method
 The present invention relates to a management system and a position specifying method.
 Conventionally, vehicle positions have been determined by GPS (Global Positioning System) positioning, based on the reception of radio waves from GPS satellites, and by odometry positioning using acceleration sensors, gyro sensors, wheel rotation speed, wheel angle, and the like. However, GPS positioning cannot be performed for a vehicle parked indoors, such as in a warehouse, a multistory parking lot, or a ship's hold (including not only completely closed spaces but also spaces whose upper part is closed or substantially closed so that radio waves from GPS satellites cannot be received; hereinafter also referred to as "in a warehouse or the like").
 If odometry positioning using an acceleration sensor, a gyro sensor, or the like arranged in each vehicle is used, the indoor vehicle position can be specified; however, specifying the vehicle position with high accuracy requires a sensor with a very small offset. Such sensors are very expensive, and it is not practical to install one in every vehicle.
 Furthermore, the result of odometry positioning using the wheel rotation speed, wheel angle, and the like of each vehicle cannot always be acquired as information from the vehicle's ECU (Electronic Control Unit). A completed (new) vehicle must not be modified, so it is not easy to obtain positioning information from the ECUs of all completed vehicles. Vehicle position information could also be obtained by retrofitting a sensor, but attaching one without damaging the finished vehicle is not easy and, even where possible, such sensors are very expensive, so arranging one per vehicle is not realistic.
 As a technique applicable to the high-accuracy specification of indoor vehicle positions under these difficulties, there is a technique that analyzes images captured from a high indoor vantage point to specify the vehicle positions (see Patent Documents 1 and 2; hereinafter, the "conventional example"). In this conventional technique, an imaging device and, if necessary, a lighting device are arranged at the tip of a support column or on the ceiling, and the vehicle position is specified by analyzing the images captured by the imaging device.
Japanese Patent Laid-Open No. 07-041118; Japanese Patent Laid-Open No. 11-185027
 In the conventional technique described above, an imaging device such as a camera must be placed at the tip of a newly erected support column, which requires construction work. For a warehouse or the like that is already in use, this means new facility construction, during which the use of the warehouse must be suspended or restricted.
 In addition, when the conventional technique is applied to specify the position of an object such as a vehicle indoors, the ceiling height is generally limited, so it is difficult to install a support column of a height from which the entire floor can be viewed. Moreover, imaging the entire indoor area of a building with a low ceiling and a large floor area requires a large number of cameras, so the configuration for specifying the position of an object such as a vehicle is not simple, and large-scale facility construction is unavoidable.
 For these reasons, a technique is desired that can easily and accurately specify the position of an object such as a vehicle without requiring new facility construction. Meeting this requirement is one of the problems to be solved by the present invention.
 The invention according to claim 1 is a management system comprising: an unmanned air vehicle capable of flying indoors; a position related information acquisition unit that is mounted on the unmanned air vehicle and acquires information related to the position of a predetermined object present indoors while the unmanned air vehicle is in flight; and a position specifying unit that specifies the indoor position of the object based on the information acquired by the position related information acquisition unit.
 The invention according to claim 7 is a management system comprising: an unmanned traveling body capable of traveling on an indoor floor surface; a position related information acquisition unit that is mounted on the unmanned traveling body and acquires information related to the position of a predetermined object present indoors, from above the object, while the unmanned traveling body is traveling; and a position specifying unit that specifies the indoor position of the object based on the information acquired by the position related information acquisition unit.
 The invention according to claim 8 is a position specifying method used in a management system comprising an unmanned air vehicle capable of flying indoors, a position related information acquisition unit that is mounted on the unmanned air vehicle and acquires information related to the position of a predetermined object present indoors while the unmanned air vehicle is in flight, and a position specifying unit that specifies the indoor position of the object. The method comprises: an acquisition step in which the position related information acquisition unit acquires information related to the position of the object during the flight of the unmanned air vehicle; and a position specifying step in which the position specifying unit specifies the indoor position of the object based on the information acquired in the acquisition step.
FIG. 1 schematically shows the configuration of a management system according to an embodiment of the present invention.
FIG. 2 is a block diagram for explaining the configuration of the flying object mounting device of FIG. 1.
FIG. 3 is a block diagram for explaining the configuration of the processing control device of FIG. 1.
FIG. 4 is a flowchart for explaining the map generation processing.
FIG. 5 is a flowchart for explaining the vehicle position specifying processing.
FIG. 6 shows an example of the vehicle position specification results.
FIG. 7 is a flowchart for explaining the processing for specifying the feature information of each vehicle.
DESCRIPTION OF SYMBOLS
 100 … unmanned air vehicle
 240 … position related information acquisition unit
 250 … displacement information acquisition unit
 260 … ambient environment information acquisition unit
 270 … feature information acquisition unit
 451 … flying object position detection unit (position detection unit)
 455 … map generation unit
 461 … vehicle position specifying unit (position specifying unit)
 465 … vehicle feature specifying unit (feature specifying unit)
 500 … management system
 Hereinafter, an embodiment of the present invention will be described with reference to FIGS. 1 to 7. In the following description and drawings, the same or equivalent elements are denoted by the same reference numerals, and redundant description is omitted.
 [構成]
 図1には、一実施形態に係る管理システム500の構成が、ブロック図にて示されている。この管理システム500は、建屋BLD1の屋内に駐車している車両CR1,CR2,CR3,…の管理を行うシステムとなっている。本実施形態では、車両CRj(j=1,2,3,…)には、車両CRjの車種、メーカー、色、電装品仕様、内装及び外装の仕様、オプションパーツの有無等の車両CRjに固有の特徴を示す特徴情報がQRコード(登録商標)化されて表された特徴情報シートQRjが貼付されている。なお、当該特徴情報の内容は、建屋BLD1の屋内における車両の管理の観点から予め定められる。
[Constitution]
FIG. 1 is a block diagram illustrating a configuration of a management system 500 according to an embodiment. This management system 500 is a system that manages the vehicles CR 1 , CR 2 , CR 3 ,... Parked indoors in the building BLD 1 . In the present embodiment, the vehicle CR j (j = 1, 2, 3,...) Includes the vehicle CR j such as the vehicle type, manufacturer, color, electrical component specifications, interior and exterior specifications, and optional parts. A feature information sheet QR j in which feature information indicating features unique to j is converted into a QR code (registered trademark) is attached. The content of the feature information is determined in advance from the viewpoint of vehicle management inside the building BLD1.
 As shown in FIG. 1, the management system 500 includes an unmanned air vehicle 100, a flying-object-mounted device 200, a relay device 300, and a processing control device 400. The unmanned air vehicle 100, the flying-object-mounted device 200, and the relay device 300 are arranged in the building BLD1. In the present embodiment, the processing control device 400 is arranged in a building BLD2 different from the building BLD1.
 In the present embodiment, the unmanned air vehicle 100 includes a plurality of propellers. Its flight speed, flight path, and the like can be remotely controlled by the processing control device 400. A flight base BS equipped with charging facilities for the unmanned air vehicle 100 is installed in the building BLD1, and the unmanned air vehicle 100 can be charged at the flight base BS.
 The flying-object-mounted device 200 is mounted on the unmanned air vehicle 100 and can communicate wirelessly with the relay device 300. Details of the configuration of the flying-object-mounted device 200 will be described later.
 In the present embodiment, the main body of the relay device 300 is arranged at the flight base BS. The relay device 300 includes an antenna 300A1 for wireless communication with the flying-object-mounted device 200 and an antenna 300A2 for wireless communication with the processing control device 400. Of the components of the relay device 300, only the antenna 300A2 is installed outside the building BLD1.
 When the relay device 300 receives a radio signal transmitted from the flying-object-mounted device 200 with the antenna 300A1, it performs amplification processing and the like as appropriate and then transmits the signal from the antenna 300A2 toward the processing control device 400. Likewise, when the relay device 300 receives a radio signal transmitted from the processing control device 400 with the antenna 300A2, it performs amplification processing and the like as appropriate and then transmits the signal from the antenna 300A1 toward the flying-object-mounted device 200.
 The processing control device 400 includes an antenna 400A for wireless communication with the relay device 300. Of the components of the processing control device 400, only the antenna 400A is installed outside the building BLD2.
 <Configuration of the flying-object-mounted device 200>
 Next, the configuration of the flying-object-mounted device 200 will be described.
 As shown in FIG. 2, the flying-object-mounted device 200 includes an antenna 210, a wireless transmission/reception unit 220, and a rotor control unit 230. It further includes a position-related information acquisition unit 240, a displacement information acquisition unit 250, an ambient environment information acquisition unit 260, and a feature information acquisition unit 270.
 The wireless transmission/reception unit 220 uses the antenna 210 to exchange information with the relay device 300 (and, by extension, the processing control device 400). When the wireless transmission/reception unit 220 receives information transmitted from the relay device 300 via the antenna 210, it forwards the information to one of the elements 230 to 270 according to its content. Conversely, when it receives information sent from any of the elements 230 to 270, it transmits that information to the relay device 300 via the antenna 210.
 The rotor control unit 230 receives flight control information sent from the processing control device 400 via the relay device 300 and controls the rotation of the plurality of propellers of the unmanned air vehicle 100 in accordance with that information. The unmanned air vehicle 100 can thus be made to fly at the flight speed and along the flight path of the flight plan generated by the processing control device 400.
 The position-related information acquisition unit 240 includes an imaging device such as an optical camera. When acquiring position-related information on the vehicles CRj, the imaging device captures images of the area below the unmanned air vehicle 100 during its flight. The imaging results are sent to the processing control device 400 via the wireless transmission/reception unit 220 and the relay device 300.
 The displacement information acquisition unit 250 includes so-called internal sensors such as a three-dimensional acceleration sensor and a gyro sensor. The detection results of these internal sensors are sent as displacement information to the processing control device 400 via the wireless transmission/reception unit 220 and the relay device 300.
 The ambient environment information acquisition unit 260 includes a so-called external sensor such as a laser range finder (hereinafter, "LRF"). The LRF detects the distance from the current position of the unmanned air vehicle 100 to flight obstacles (for example, walls, columns, beams, ceilings, and floors) in all directions around that position. The detection results of this external sensor are sent as ambient environment information to the processing control device 400 via the wireless transmission/reception unit 220 and the relay device 300.
 The feature information acquisition unit 270 shares the above-described imaging device with the position-related information acquisition unit 240 and additionally includes a QR code decoding processing unit. When acquiring feature information on the vehicles CRj, the imaging device captures images of the area below the unmanned air vehicle 100 during its flight. When the decoding processing unit detects a QR code in the imaging results, it decodes the QR code to obtain the feature information of the corresponding vehicle CRj. The feature information acquired in this way is sent to the processing control device 400 via the wireless transmission/reception unit 220 and the relay device 300.
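 For illustration only, the decoding step could be sketched as follows in Python. The use of OpenCV's built-in QR detector and the function name are assumptions; the embodiment does not specify a particular decoder.

```python
import cv2  # assumes OpenCV (opencv-python) is available

def decode_feature_info(frame):
    """Detect and decode a QR code in one downward camera frame.

    Returns the decoded feature information string, or None when no
    decodable QR code appears in the frame.
    """
    detector = cv2.QRCodeDetector()
    data, points, _ = detector.detectAndDecode(frame)
    return data if points is not None and data else None
```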
 To accommodate cases where illumination is required for imaging, the flying-object-mounted device 200 further includes an illumination unit (not shown) that illuminates the imaging range.
 <Configuration of the processing control device 400>
 Next, the configuration of the processing control device 400 will be described.
 As shown in FIG. 3, the processing control device 400 includes, in addition to the antenna 400A described above, a wireless transmission/reception unit 420, a storage unit 430, and a flight control unit 440. It also includes a flying object position detection unit 451, a map generation unit 455, a vehicle position specifying unit 461, and a vehicle feature specifying unit 465, as well as a display unit 470 and an input unit 480.
 The wireless transmission/reception unit 420 uses the antenna 400A to exchange information with the relay device 300 (and, by extension, the flying-object-mounted device 200). When the wireless transmission/reception unit 420 receives information transmitted from the relay device 300 via the antenna 400A, it forwards the information to the appropriate one of the elements 451 to 465 according to its content. When it receives information sent from the flight control unit 440, it transmits that information to the relay device 300 via the antenna 400A.
 Information sent from the elements 240 to 270 of the flying-object-mounted device 200 reaches the elements 451 to 465 of the processing control device 400 by passing, in order, through the wireless transmission/reception unit 220 and antenna 210, the relay device 300, and the antenna 400A and wireless transmission/reception unit 420 of the processing control device 400. Likewise, information sent from the flight control unit 440 of the processing control device 400 reaches the rotor control unit 230 of the flying-object-mounted device 200 by passing, in order, through the wireless transmission/reception unit 420 and antenna 400A, the relay device 300, and the antenna 210 and wireless transmission/reception unit 220 of the flying-object-mounted device 200. In the following description, these intermediate elements are omitted.
 The storage unit 430 stores various information used by the processing control device 400, including the map information generated by the map generation unit 455 and a vehicle information table in which the vehicle positions specified by the vehicle position specifying unit 461 are associated with the feature information specified by the vehicle feature specifying unit 465. All of the elements 440 to 465 can access the storage unit 430.
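 A minimal sketch of such a vehicle information table as a data structure is given below. The record layout and key choice are assumptions for illustration; the embodiment does not prescribe a storage format.

```python
from dataclasses import dataclass
from typing import Dict, Optional, Tuple

@dataclass
class VehicleRecord:
    """One entry of the hypothetical vehicle information table:
    a specified vehicle position linked, once known, to the decoded
    feature information of the vehicle parked there."""
    position: Tuple[float, float]        # (x, y) in indoor-map coordinates
    feature_info: Optional[str] = None   # decoded QR payload

# The table itself, keyed by an arbitrary vehicle index.
vehicle_info_table: Dict[int, VehicleRecord] = {}
```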
 The flight control unit 440 receives a flight control request sent from the map generation unit 455, the vehicle position specifying unit 461, or the vehicle feature specifying unit 465. Based on the map information in the storage unit 430 and the current position of the unmanned air vehicle 100 sent from the flying object position detection unit 451 (hereinafter, the "current flying object position"), the flight control unit 440 sequentially generates flight control information for realizing flight at the flight speed and along the flight path (including flight altitude) contained in the flight plan specified in the request. The flight control information generated in this way is transmitted to the rotor control unit 230 of the flying-object-mounted device 200.
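 One way such flight control information could be generated is sketched below: a velocity command toward the next waypoint, capped at the plan's flight speed. The proportional rule is an assumption; the embodiment does not disclose a specific control law.

```python
def flight_control_step(current_pos, waypoint, flight_speed, gain=1.0):
    """Generate one piece of flight control information: a commanded
    velocity vector toward the next waypoint of the flight plan,
    capped at the plan's flight speed (illustrative only)."""
    error = [w - c for w, c in zip(waypoint, current_pos)]
    distance = max(sum(e * e for e in error) ** 0.5, 1e-9)
    speed = min(flight_speed, gain * distance)
    return [e / distance * speed for e in error]
```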
 The flying object position detection unit 451 receives the detection results of the internal sensors sent from the displacement information acquisition unit 250 of the flying-object-mounted device 200 and the detection results of the external sensor sent from the ambient environment information acquisition unit 260. Based on these detection results and the map information in the storage unit 430, it detects the current flying object position within the map represented by that map information. The detected current flying object position is sequentially sent to the flight control unit 440, the map generation unit 455, the vehicle position specifying unit 461, and the vehicle feature specifying unit 465.
 Here, the flying object position detection unit 451 calculates a provisional current flying object position using the movement distance and positioning results derived from the detection results of the internal sensors (that is, the displacement information). It then corrects this provisional position based on the detection results of the external sensor (that is, the ambient environment information) and the map information in the storage unit 430. The flying object position detection unit 451 can therefore detect the current flying object position within the map accurately while preventing the accumulation of offsets in the internal sensor detection results.
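 The two-stage estimate can be summarized in a short sketch. The scan-matching routine is passed in as a parameter because the embodiment leaves the correction algorithm unspecified.

```python
def detect_position(prev_pos, displacement, scan, map_info, match_scan):
    """Compute the current flying object position: dead-reckon a
    provisional position from the internal-sensor displacement, then
    correct it against the stored map using the external-sensor scan.
    `match_scan` stands for an unspecified scan-matching routine."""
    provisional = tuple(p + d for p, d in zip(prev_pos, displacement))
    return match_scan(provisional, scan, map_info)
```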
 The map generation unit 455 generates at least one of a two-dimensional map and a three-dimensional map of the interior of the building BLD1. Map generation starts upon receipt of a map generation start command sent from the input unit 480.
 During map generation, the map generation unit 455 receives the detection results of the external sensor sent from the ambient environment information acquisition unit 260. Based on those detection results and the current flying object position sent from the flying object position detection unit 451, it generates map information for the surroundings of the unmanned air vehicle 100.
 If the map of the entire interior of the building BLD1 is not yet complete, the map generation unit 455 generates a flight plan for mapping the unfinished area and sends a flight control request specifying that flight plan to the flight control unit 440. As a result, the unmanned air vehicle 100 flies according to the flight plan.
 The map generation unit 455 then receives the detection results of the external sensor mounted on the unmanned air vehicle 100 while it flies along this map-generation flight plan. Based on those detection results and the current flying object position sent from the flying object position detection unit 451, it generates a map of the surroundings of the unmanned air vehicle 100.
 Each time a map of a new area is generated in this way, the map generation unit 455 updates the map information in the storage unit 430 by adding the map information of the new area.
 When the map generation unit 455 receives a map display command sent from the input unit 480, it refers to the map information in the storage unit 430, generates display data for displaying the indoor map of the building BLD1, and sends the generated display data to the display unit 470. As a result, the indoor map of the building BLD1 is displayed on the display unit 470.
 Details of the map generation processing by the map generation unit 455 will be described later.
 As described above, in the present embodiment, in parallel with the map generation processing by the map generation unit 455, the flying object position detection unit 451 detects the position of the unmanned air vehicle 100 using the map generated so far at each point in time. That is, the indoor map of the building BLD1 is generated with high accuracy using a so-called SLAM (Simultaneous Localization And Mapping) technique.
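 One SLAM iteration, reusing the correction step sketched above, could look as follows. The matching and map-integration routines are passed in as parameters because the embodiment does not specify them.

```python
def slam_step(pose, displacement, scan, grid, match_scan, integrate_scan):
    """One iteration of the SLAM loop: localize against the map built
    so far, then extend that map from the corrected pose (illustrative
    only; the routines are hypothetical)."""
    provisional = tuple(p + d for p, d in zip(pose, displacement))
    pose = match_scan(provisional, scan, grid)   # localization
    integrate_scan(grid, pose, scan)             # mapping
    return pose
```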
 The vehicle position specifying unit 461 specifies the position, inside the building BLD1, of each vehicle CRj (j = 1, 2, 3, ...) parked there. Vehicle position specification starts upon receipt of a vehicle position specification start command sent from the input unit 480.
 When specifying vehicle positions, the vehicle position specifying unit 461 refers to the map information in the storage unit 430, generates a flight plan starting from the current position of the unmanned air vehicle 100 for specifying the vehicle positions, and sends a flight control request specifying the generated flight plan to the flight control unit 440. As a result, the unmanned air vehicle 100 flies according to the flight plan.
 The vehicle position specifying unit 461 also transmits a downward imaging designation to the position-related information acquisition unit 240 of the flying-object-mounted device 200. As a result, the imaging results of the area below the unmanned air vehicle 100, captured while it flies along the flight plan for vehicle position specification, are transmitted from the position-related information acquisition unit 240 to the vehicle position specifying unit 461.
 On receiving the imaging results transmitted from the position-related information acquisition unit 240, the vehicle position specifying unit 461 analyzes them and specifies the position, inside the building BLD1, of each parked vehicle CRj. It then registers the newly specified vehicle positions in the vehicle position portion of the vehicle information table in the storage unit 430.
 When the vehicle position specifying unit 461 receives a vehicle position display command sent from the input unit 480, it refers to the map information in the storage unit 430 and the vehicle positions in the vehicle information table, generates display data for displaying the vehicle positions inside the building BLD1, and sends the generated display data to the display unit 470. As a result, the positions of the vehicles parked inside the building BLD1 are displayed on the display unit 470.
 The vehicle feature specifying unit 465 specifies the feature information of each vehicle CRj (j = 1, 2, 3, ...) parked inside the building BLD1 by associating the position of each vehicle with its feature information. Feature information specification starts upon receipt of a vehicle feature specification start command sent from the input unit 480.
 When specifying feature information, the vehicle feature specifying unit 465 refers to the map information in the storage unit 430, generates a flight plan for feature information specification, and sends a flight control request specifying the generated flight plan to the flight control unit 440. As a result, the unmanned air vehicle 100 flies according to the flight plan.
 The vehicle feature specifying unit 465 also transmits a downward imaging designation and a QR code decoding designation to the feature information acquisition unit 270 of the flying-object-mounted device 200. As a result, the feature information of each vehicle CRj, obtained by decoding the QR codes captured during flight along the flight plan for feature information specification, is transmitted from the feature information acquisition unit 270 to the vehicle feature specifying unit 465.
 On receiving the feature information transmitted from the feature information acquisition unit 270, the vehicle feature specifying unit 465 associates the newly received feature information with a vehicle position based on the current flying object position at that time. It then registers the feature information, newly associated with the vehicle position, in the feature information portion of the vehicle information table in the storage unit 430, linked to that vehicle position.
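 A sketch of this association step, using the vehicle_info_table structure sketched earlier, is given below. Nearest-neighbour matching is an assumption; the embodiment only states that the association is based on the current flying object position.

```python
def associate_feature(table, uav_pos, feature_info):
    """Register newly decoded feature information against the vehicle
    position nearest the current flying object position (illustrative
    only). `table` maps vehicle indices to VehicleRecord entries."""
    nearest = min(
        table,
        key=lambda vid: (table[vid].position[0] - uav_pos[0]) ** 2
                        + (table[vid].position[1] - uav_pos[1]) ** 2,
    )
    table[nearest].feature_info = feature_info
```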
 When the vehicle feature specifying unit 465 receives a vehicle feature display command sent from the input unit 480, it refers to the vehicle position specified in that command and to the vehicle information table, generates display data for displaying the feature information of the vehicle parked at that position, and sends the generated display data to the display unit 470. As a result, the feature information of the specified vehicle parked inside the building BLD1 is displayed on the display unit 470.
 The display unit 470 includes a display device such as a liquid crystal panel, an organic EL (Electro Luminescence) panel, or a PDP (Plasma Display Panel). On receiving display data sent from the map generation unit 455, the vehicle position specifying unit 461, or the vehicle feature specifying unit 465, the display unit 470 displays an image corresponding to that display data.
 In the present embodiment, the input unit 480 includes a stroke device such as a keyboard and a pointing device such as a mouse. By performing input operations on the input unit 480, the user can enter the map generation start command, map display command, vehicle position specification start command, vehicle position display command, vehicle feature specification start command, and vehicle feature display command described above.
 When a map generation start command or a map display command is input, the input unit 480 sends the input result to the map generation unit 455. When a vehicle position specification start command or a vehicle position display command is input, the input unit 480 sends the input result to the vehicle position specifying unit 461. When a vehicle feature specification start command or a vehicle feature display command is input, the input unit 480 sends the input result to the vehicle feature specifying unit 465.
 [Operation]
 Next, the operation of the management system 500 configured as described above will be described, focusing mainly on the map generation processing by the map generation unit 455, the vehicle position specifying processing by the vehicle position specifying unit 461, and the vehicle feature specifying processing by the vehicle feature specifying unit 465.
 It is initially assumed that no indoor map information of the building BLD1 is stored in the storage unit 430, and that no vehicle position information or feature information is registered in the vehicle information table in the storage unit 430.
 It is further assumed that the unmanned air vehicle 100 is stationed at the flight base BS of the building BLD1 and that the flying-object-mounted device 200 has already started operating.
 From this initial state, the management system 500 sequentially executes the map generation processing by the map generation unit 455, the vehicle position specifying processing by the vehicle position specifying unit 461, and the vehicle feature specifying processing by the vehicle feature specifying unit 465 in accordance with the user's command inputs to the input unit 480.
 <Map generation processing>
 First, the map generation processing by the map generation unit 455 will be described.
 On receiving the map generation start command sent from the input unit 480, the map generation unit 455 starts the map generation processing. As shown in FIG. 4, in step S11 the map generation unit 455 first collects the detection results of the external sensor sent from the ambient environment information acquisition unit 260 while the unmanned air vehicle 100 is at its initial position (the position of the flight base BS in the building BLD1). It then generates a map of the surrounding area for which the external sensor detection results secure a predetermined accuracy (hereinafter simply the "surrounding area"), and stores the information of the newly generated map in the storage unit 430 as map information.
 Next, in step S12, in order to map an ungenerated area, the map generation unit 455 generates an initial flight plan for flying to an outer edge position of the area corresponding to the map information in the storage unit 430, and sends a flight control request specifying the generated initial flight plan to the flight control unit 440. As a result, the unmanned air vehicle 100 flies according to the initial flight plan.
 Subsequently, in step S13, the map generation unit 455 generates a three-dimensional map of the area surrounding the in-flight position of the unmanned air vehicle 100, based on the in-flight detection results of the external sensor mounted on the unmanned air vehicle 100 and the current flying object position sequentially sent from the flying object position detection unit 451. The map generation unit 455 then updates the map information in the storage unit 430 by adding the information of the newly generated map.
 During the flight of the unmanned air vehicle 100, the flying object position detection unit 451 accurately detects the current flying object position as described above and sequentially sends it to the map generation unit 455.
 Next, in step S14, the map generation unit 455 determines whether the map of the entire interior of the building BLD1 is complete. In making this determination, the map generation unit 455 judges whether the only remaining unmapped positions are those the unmanned air vehicle 100 cannot reach because of obstacles.
 If the result of the determination in step S14 is negative (step S14: N), the processing proceeds to step S15. In step S15, in order to map an ungenerated area, the map generation unit 455 generates the next flight plan for flying to an outer edge position of the area corresponding to the map information in the storage unit 430 at that time, and sends a flight control request specifying the generated next flight plan to the flight control unit 440. As a result, the unmanned air vehicle 100 flies according to the next flight plan.
 When the processing of step S15 ends, the processing returns to step S13. Thereafter, the processing of steps S13 to S15 is repeated until the result of the determination in step S14 becomes affirmative.
 When the map of the entire interior of the building BLD1 is complete and the result of the determination in step S14 becomes affirmative (step S14: Y), the processing proceeds to step S16. In step S16, the map generation unit 455 generates a flight plan for returning the unmanned air vehicle 100 to the flight base BS (hereinafter, the "return flight plan") and sends a flight control request specifying the generated return flight plan to the flight control unit 440. As a result, the unmanned air vehicle 100 flies according to the return flight plan and returns to the flight base BS. The map generation processing then ends.
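 The control flow of FIG. 4 (steps S11 to S16) can be summarized as the following sketch, with hypothetical `uav` and `mapper` objects; all method names are illustrative, not part of the disclosed embodiment.

```python
def map_generation_process(uav, mapper):
    """Illustrative control flow of the map generation processing."""
    mapper.map_surroundings(uav.scan(), uav.position)       # S11
    uav.fly_to(mapper.next_outer_edge())                    # S12
    while True:
        mapper.map_surroundings(uav.scan(), uav.position)   # S13
        if mapper.whole_floor_mapped():                     # S14
            break
        uav.fly_to(mapper.next_outer_edge())                # S15
    uav.fly_to(uav.base_position)                           # S16: return
```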
 <Vehicle position specifying processing>
 Next, the vehicle position specifying processing by the vehicle position specifying unit 461 will be described.
 On receiving the vehicle position specification start command sent from the input unit 480, the vehicle position specifying unit 461 starts the vehicle position specifying processing. As shown in FIG. 5, in step S21 the vehicle position specifying unit 461 first refers to the map information in the storage unit 430, generates a flight plan for vehicle position specification, and sends a flight control request specifying the generated flight plan to the flight control unit 440. It also transmits a downward imaging designation to the position-related information acquisition unit 240 of the flying-object-mounted device 200. As a result, the imaging results of the area below the unmanned air vehicle 100 flying along the flight plan for vehicle position specification are transmitted from the position-related information acquisition unit 240 to the vehicle position specifying unit 461.
 Next, in step S22, the vehicle position specifying unit 461 collects the in-flight imaging results of the area below the unmanned air vehicle 100 sent from the position-related information acquisition unit 240, associating them with the current flying object position sequentially sent from the flying object position detection unit 451. When the collection of images for specifying the position of each vehicle parked inside the building BLD1 is complete, the processing proceeds to step S23.
 In step S23, the vehicle position specifying unit 461 analyzes the images collected in step S22 and specifies the position of each vehicle parked inside the building BLD1. It then registers the newly specified vehicle positions in the vehicle position portion of the vehicle information table in the storage unit 430. The vehicle position specifying processing then ends.
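 Steps S22 and S23 can be sketched as follows, reusing the VehicleRecord structure sketched earlier. `detect_vehicles` is a hypothetical image-analysis routine that returns vehicle positions in map coordinates for one image paired with the flying object position at capture time.

```python
def vehicle_position_process(images_with_poses, detect_vehicles, table):
    """Collect geo-referenced downward images (S22) and register the
    vehicle positions found in them (S23); illustrative only."""
    next_id = len(table)
    for image, uav_pose in images_with_poses:
        for position in detect_vehicles(image, uav_pose):
            table[next_id] = VehicleRecord(position=position)
            next_id += 1
```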
 After the vehicle position specifying processing ends as described above, on receiving a vehicle position display command sent from the input unit 480, the vehicle position specifying unit 461 refers to the map information in the storage unit 430 and the vehicle positions in the vehicle information table, generates display data for displaying the vehicle positions inside the building BLD1, and sends the generated display data to the display unit 470.
 As a result, the positions of the vehicles parked inside the building BLD1 are displayed on the display unit 470. An example of the display image shown on the display unit 470 in this way is given in FIG. 6.
 <Vehicle feature specifying processing>
 Next, the vehicle feature specifying processing by the vehicle feature specifying unit 465 will be described.
 On receiving the vehicle feature specification start command sent from the input unit 480, the vehicle feature specifying unit 465 starts the vehicle feature specifying processing. As shown in FIG. 7, in step S31 the vehicle feature specifying unit 465 first generates a flight plan from the current flying object position to the first vehicle whose feature information is to be specified (hereinafter, the "first vehicle"), and sends a flight control request specifying the generated flight plan to the flight control unit 440. It also transmits a downward imaging designation and a QR code decoding designation to the feature information acquisition unit 270.
 As a result, the unmanned air vehicle 100 flies according to the flight plan and reaches the vicinity of the first vehicle. The feature information acquisition unit 270 then analyzes the QR code image contained in the downward imaging results, decodes the feature information, and sends the decoded feature information to the vehicle feature specifying unit 465.
 In generating this flight plan, the vehicle feature specifying unit 465 produces a plan whose altitude, on arrival at the vehicle whose feature information is to be specified, allows the QR code on the feature information sheet affixed to that vehicle to be imaged in a decodable manner.
 Next, in step S32, the vehicle feature specifying unit 465 collects the feature information sent from the feature information acquisition unit 270, associating it with the current flying object position sequentially sent from the flying object position detection unit 451. Based on the current flying object position at the time the feature information is received, it then registers the newly collected feature information in the feature information portion of the vehicle information table in the storage unit 430, linked to the vehicle position corresponding to that current flying object position.
 Next, in step S33, the vehicle feature specifying unit 465 determines whether feature information has been specified for all vehicles parked inside the building BLD1. In making this determination, it judges whether feature information has been registered for all vehicle positions registered in the vehicle position portion of the vehicle information table in the storage unit 430.
 If the result of the determination in step S33 is negative (step S33: N), the processing proceeds to step S34. In step S34, in the same manner as in step S31 described above, the vehicle feature specifying unit 465 generates a flight plan from the current flying object position to the next vehicle whose feature information is to be specified (hereinafter, the "next vehicle"), and sends a flight control request specifying the generated flight plan to the flight control unit 440. It also transmits a downward imaging designation and a QR code decoding designation to the feature information acquisition unit 270.
 When the processing of step S34 ends, the processing returns to step S32. Thereafter, the processing of steps S32 to S34 is repeated until the result of the determination in step S33 becomes affirmative.
 When feature information has been specified for all vehicles inside the building BLD1 and the result of the determination in step S33 becomes affirmative (step S33: Y), the processing proceeds to step S35. In step S35, the vehicle feature specifying unit 465 generates a return flight plan and sends a flight control request specifying the generated return flight plan to the flight control unit 440. As a result, the unmanned air vehicle 100 flies according to the return flight plan and returns to the flight base BS. The vehicle feature specifying processing then ends.
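 The control flow of FIG. 7 (steps S31 to S35) can be summarized as the following sketch, which visits each registered vehicle position directly and so folds the position association into the flight plan. `decode_below` is a hypothetical routine that images the area below the UAV and decodes any QR code found (see the decoding sketch above); `uav` follows the interface assumed in the earlier flow sketch.

```python
def vehicle_feature_process(uav, table, decode_below):
    """Illustrative control flow of the vehicle feature specifying
    processing over the vehicle information table."""
    for record in table.values():                 # S31 / S34: each vehicle
        if record.feature_info is not None:
            continue                              # S33: already specified
        uav.fly_to(record.position)               # decodable-altitude flight
        record.feature_info = decode_below(uav)   # S32: register the result
    uav.fly_to(uav.base_position)                 # S35: return flight
```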
 After the vehicle feature specifying processing ends as described above, on receiving a vehicle feature display command sent from the input unit 480, the vehicle feature specifying unit 465 refers to the vehicle position specified in that command and to the vehicle information table, generates display data for displaying the feature information of the vehicle parked at that position, and sends the generated display data to the display unit 470. As a result, the feature information of the specified vehicle parked inside the building BLD1 is displayed on the display unit 470.
 In the present embodiment, a vehicle feature display command is input by designating, with the pointing device, the display position of one vehicle in the display image on the display unit 470 shown in FIG. 6. The feature information of the designated vehicle is then displayed superimposed on that display image.
 As described above, in the present embodiment, the position-related information acquisition unit 240 mounted on the unmanned air vehicle 100, which can fly indoors, acquires information related to the positions of the vehicles present inside the building BLD1 during the flight of the unmanned air vehicle 100. Based on the information acquired by the position-related information acquisition unit 240, the vehicle position specifying unit 461 specifies the positions of the vehicles inside the building BLD1.
 Therefore, according to the present embodiment, vehicle positions can be specified easily, without new equipment construction, even in a building BLD1 that is already in use.
 In the present embodiment, based on the information acquired by the displacement information acquisition unit 250 and the ambient environment information acquisition unit 260 mounted on the unmanned air vehicle 100, the map generation unit 455 generates an indoor map of the building BLD1 while, simultaneously with the map generation, the flying object position detection unit 451 detects the position of the unmanned air vehicle 100 within the map being generated by the map generation unit 455. That is, the indoor map of the building BLD1 is generated using a so-called SLAM technique, so it can be generated with high accuracy.
 For the vehicle position specification described above, the flying object position detection unit 451 detects the position of the in-flight unmanned air vehicle 100 within the generated map, based on the information acquired by the displacement information acquisition unit 250 and the ambient environment information acquisition unit 260. The position of the in-flight unmanned air vehicle 100 within the map can therefore be detected accurately while avoiding the loss of accuracy caused by the accumulation of offsets in the internal sensors of the displacement information acquisition unit 250. As a result, vehicle positions can be specified with high accuracy.
 In the present embodiment, the feature information acquisition unit 270 mounted on the unmanned air vehicle 100 acquires the feature information of individual vehicles, and the vehicle feature specifying unit 465 specifies the feature information of each vehicle by associating the feature information acquired by the feature information acquisition unit 270 with the vehicle positions specified by the vehicle position specifying unit 461. In addition to the position of each vehicle, the feature information of each vehicle can therefore be specified.
 In the present embodiment, a feature information sheet on which the feature information is represented as a QR code is affixed to each vehicle. The feature information sheet can therefore be kept compact, and the feature information of each vehicle can be acquired easily.
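 For illustration, such a feature information sheet could be produced as follows, assuming the third-party Python package "qrcode" (with Pillow) is available. The JSON payload format and all names are assumptions; the embodiment does not fix an encoding.

```python
import json
import qrcode  # assumes the third-party "qrcode" package

def make_feature_sheet(features, path):
    """Render a vehicle's feature information as a QR code image that
    could be printed as a feature information sheet (illustrative)."""
    payload = json.dumps(features, ensure_ascii=False)
    qrcode.make(payload).save(path)

# Example usage with hypothetical feature values.
make_feature_sheet(
    {"model": "X", "maker": "Y", "color": "white"}, "feature_sheet_CR1.png"
)
```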
 In the present embodiment, the position-related information acquisition unit 240 includes an imaging device, and the vehicle position specifying unit 461 analyzes the imaging results of that imaging device to specify the positions of the vehicles inside the building BLD1. The vehicle positions inside the building BLD1 can therefore be specified with a simple configuration.
 [Modifications of the Embodiment]
 The present invention is not limited to the above embodiment, and various modifications are possible.
 For example, in the above embodiment the feature information of the vehicle is represented on the feature information sheet as a QR code, but other encodings such as a barcode may be used. Furthermore, an IC tag may be used instead of the feature information sheet, with the feature information acquired through contactless communication with the IC tag.
 In the above embodiment, the feature information acquisition unit 270 includes the QR code decoding processing unit, but the vehicle feature specifying unit may include the QR code decoding processing unit instead.
 In the above embodiment, the objects of position specification and feature information specification are vehicles, but objects other than vehicles may be used. For example, spectators holding tickets fitted with IC tags storing their designated seating positions as feature information may be the objects of position specification, so as to confirm whether each ticket holder's seating position in a concert hall is the correct one.
 In the above embodiment, the position-related information acquisition unit includes an imaging device, but a laser radar, a heat sensor, or the like may be provided instead of the imaging device.
 In the above embodiment, the map generation unit generates the map using both the displacement information and the ambient environment information. Alternatively, the map generation unit may generate the map using only one of the displacement information and the ambient environment information.
 In the above embodiment, the processing control device is a single device, but its functions may be fulfilled by a plurality of devices that can communicate with one another. In that case, the plurality of devices can be, for example, a server device with computing power well suited to image analysis and the like, and a personal computer that can communicate with the server device.
 In the above embodiment, the position-related information acquisition unit, the displacement information acquisition unit, the ambient environment information acquisition unit, and the feature information acquisition unit are mounted on the unmanned air vehicle. These acquisition units may instead be mounted on an unmanned traveling body that can travel on the indoor floor.
 In the above embodiment, the relay device and the processing control device are separate devices, but they may be a single device.
 In the above embodiment, the relay device and the processing control device are connected wirelessly, but they may be connected by wire.
 In the above embodiment, the relay device and the processing control device are arranged in different buildings, but they may be arranged in the same building.
 The display unit may also display information on the current flying object position of the unmanned air vehicle detected by the flying object position detection unit, and the positions of the flight base, the relay device, and the like, either alone or together with the position specification results.
 In the above embodiment, there is one unmanned air vehicle 100 and one flight base BS. Alternatively, a plurality of unmanned air vehicles 100 may be provided, a plurality of flight bases BS may be provided, or both a plurality of unmanned air vehicles 100 and a plurality of flight bases BS may be provided.

Claims (8)

  1.  A management system comprising:
     an unmanned air vehicle capable of flying indoors;
     a position-related information acquisition unit that is mounted on the unmanned air vehicle and acquires, during the flight of the unmanned air vehicle, information related to the position of a predetermined object present indoors; and
     a position specifying unit that specifies the position of the object indoors based on the information acquired by the position-related information acquisition unit.
  2.  The management system according to claim 1, further comprising:
     a map generation unit that generates a map of the indoor space based on information acquired by at least one of a displacement information acquisition unit and an ambient environment information acquisition unit mounted on the unmanned air vehicle; and
     a position detection unit that detects the position of the unmanned air vehicle within the map generated by the map generation unit, based on information acquired by at least one of the displacement information acquisition unit and the ambient environment information acquisition unit.
  3.  The management system according to claim 1, further comprising:
     a feature information acquisition unit that is mounted on the unmanned air vehicle and acquires feature information of the object; and
     a feature specifying unit that associates the position of the object specified by the position specifying unit with the feature information of the object acquired by the feature information acquisition unit.
  4.  The management system according to claim 3, wherein the feature information of the object represents individual features of the object in coded form.
  5.  The management system according to claim 1, wherein the position-related information acquisition unit includes an imaging device, and the position specifying unit analyzes the imaging results of the imaging device to specify the indoor position of the object.
  6.  The management system according to claim 1, wherein the object is a parked vehicle.
  7.  A management system comprising:
     an unmanned ground vehicle capable of traveling on an indoor floor surface;
     a position-related information acquisition unit that is mounted on the unmanned ground vehicle and that acquires, from above a predetermined object existing indoors and while the unmanned ground vehicle is traveling, information related to the position of the predetermined object; and
     a position specifying unit that specifies the indoor position of the object on the basis of the information acquired by the position-related information acquisition unit.
  8.  A position specification method used in a management system comprising: an unmanned aerial vehicle capable of flying indoors; a position-related information acquisition unit that is mounted on the unmanned aerial vehicle and that acquires, during flight of the unmanned aerial vehicle, information related to the position of a predetermined object existing indoors; and a position specifying unit that specifies the indoor position of the object; the method comprising:
     an acquisition step in which the position-related information acquisition unit acquires, during flight of the unmanned aerial vehicle, information related to the position of the object; and
     a position specifying step in which the position specifying unit specifies the indoor position of the object on the basis of the information acquired in the acquisition step.
PCT/JP2014/050499 2014-01-15 2014-01-15 Management system and position specification method WO2015107623A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2014/050499 WO2015107623A1 (en) 2014-01-15 2014-01-15 Management system and position specification method

Publications (1)

Publication Number Publication Date
WO2015107623A1

Family

ID=53542547

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/050499 WO2015107623A1 (en) 2014-01-15 2014-01-15 Management system and position specification method

Country Status (1)

Country Link
WO (1) WO2015107623A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004338889A (en) * 2003-05-16 2004-12-02 Hitachi Ltd Image recognition device
JP2005083984A (en) * 2003-09-10 2005-03-31 Neomax Co Ltd Article position checking system
JP2010055444A (en) * 2008-08-29 2010-03-11 Hitachi Industrial Equipment Systems Co Ltd Robot system
JP2013086912A (en) * 2011-10-17 2013-05-13 Fujitsu Advanced Engineering Ltd Article management system, article management method, and article management program

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016015628A (en) * 2014-07-02 2016-01-28 三菱重工業株式会社 Indoor monitoring system and mode of structure
JP2017059955A (en) * 2015-09-15 2017-03-23 ツネイシホールディングス株式会社 Imaging system and computer program
WO2018131165A1 (en) * 2017-01-16 2018-07-19 富士通株式会社 Information processing program, information processing method, and information processing device
JPWO2018131165A1 (en) * 2017-01-16 2019-11-07 富士通株式会社 Information processing program, information processing method, and information processing apparatus
US10885357B2 (en) 2017-01-16 2021-01-05 Fujitsu Limited Recording medium recording information processing program, information processing method, and information processing apparatus

Similar Documents

Publication Publication Date Title
JP2015131713A (en) Management system, flight control method, flight control program, and recording medium
US20210065400A1 (en) Selective processing of sensor data
US11604479B2 (en) Methods and system for vision-based landing
US10599149B2 (en) Salient feature based vehicle positioning
CN105492985B (en) A kind of system and method for the control loose impediment in environment
AU2021202509B2 (en) Image based localization for unmanned aerial vehicles, and associated systems and methods
CN110174093A (en) Localization method, device, equipment and computer readable storage medium
CN110062919A (en) Drop-off location planning for delivery vehicles
US20150237481A1 (en) Navigation method and device
EP3734394A1 (en) Sensor fusion using inertial and image sensors
US20220074744A1 (en) Unmanned Aerial Vehicle Control Point Selection System
EP3579215A1 (en) Electronic device for generating map data and operating method therefor
US20190003840A1 (en) Map registration point collection with mobile drone
JP2020170213A (en) Drone-work support system and drone-work support method
WO2015107623A1 (en) Management system and position specification method
WO2021166845A1 (en) Information processing device, information processing method, and program
JP2016085613A (en) Aircraft operation status display system and aircraft operation status display method
KR102105105B1 (en) Method of aiding driving and apparatuses performing the same
WO2021064982A1 (en) Information processing device and information processing method
CN110892353A (en) Control method, control device and control terminal of unmanned aerial vehicle
WO2023228283A1 (en) Information processing system, movable body, information processing method, and program
US20240118703A1 (en) Display apparatus, communication system, display control method, and recording medium
CN113227818B (en) System for object tracking in physical space using alignment reference system
WO2021033422A1 (en) Position estimation system, position estimation device, flying object, position estimation program, and position estimation method
EP3792719A1 (en) Deduction system, deduction device, deduction method, and computer program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 14879213
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
NENP Non-entry into the national phase
    Ref country code: JP
122 Ep: pct application non-entry in european phase
    Ref document number: 14879213
    Country of ref document: EP
    Kind code of ref document: A1