WO2021166845A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2021166845A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
distance
drone
marker
predetermined position
Application number
PCT/JP2021/005506
Other languages
French (fr)
Japanese (ja)
Inventor
賢哉 金田
Original Assignee
本郷飛行機株式会社
Application filed by 本郷飛行機株式会社
Publication of WO2021166845A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U70/00 Launching, take-off or landing arrangements
    • B64U70/90 Launching from or landing on platforms
    • B64U70/97 Means for guiding the UAV to a specific location on the platform, e.g. platform structures preventing landing off-centre
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/02 Automatic approach or landing aids, i.e. systems in which flight data of incoming planes are processed to provide landing data

Definitions

  • the present invention relates to an information processing device, an information processing method, and a program.
  • An object of the present invention is to accurately specify the position even indoors.
  • The information processing device according to one aspect of the present invention is an information processing device that estimates a predetermined position based on an image of a first object in real space captured from the predetermined position as the subject to be imaged, and on a distance from the predetermined position to a second object in the real space.
  • The device includes image data acquisition means for acquiring first data of the image, and relative position estimation means for estimating, based on the image of the first object included in the image corresponding to the first data, a relative position including at least one of a distance, a direction, and an angle from the predetermined position with respect to the first object.
  • Each of the information processing methods and programs of one aspect of the present invention is a method and program corresponding to the information processing device of one aspect of the present invention.
  • the position can be accurately specified even indoors.
  • FIG. 1 is a diagram showing the outline of a maneuvering control system including a drone provided with a position specifying device according to one embodiment of the present invention.
  • FIG. 2 is a diagram showing an example of the hardware configuration of the position specifying device provided in the drone of FIG. 1.
  • FIG. 3 is a diagram showing an example of the functional configuration of the position specifying device of FIG. 2.
  • FIG. 4 is a diagram showing an example of a situation in which various information is acquired by the drone provided with the position specifying device of FIG. 3.
  • FIG. 5 is a diagram showing an example, different from FIG. 4, of a situation in which various information is acquired by the drone provided with the position specifying device of FIG. 3.
  • FIG. 6 is a diagram showing examples of images acquired by the position specifying device of FIG. 3 in the situations shown in FIGS. 4 and 5, respectively.
  • FIG. 7 is a diagram showing an example of a situation in which the position specifying device of FIG. 3 controls the region to be imaged in the situation of FIG. 5.
  • FIG. 8 is a diagram showing an example of the marker that is the object to be imaged of the image acquired by the position specifying device of FIG. 3.
  • FIG. 9 is a flowchart illustrating an example of the flow of the position identification process executed by the position identification device having the functional configuration of FIG. 3.
  • FIG. 10 is a diagram showing an example of a situation different from FIGS. 4 and 5.
  • FIG. 11 is a diagram showing an example of a situation different from FIGS. 4, 5, and 10.
  • Hereinafter, an information processing system including a small unmanned aerial vehicle (hereinafter referred to as a "drone") D capable of moving in three-dimensional space will be described with reference to the drawings.
  • In the drawings, the same reference numerals are given to the same elements, and duplicate description is omitted.
  • Note that the term "drone" in the present invention is not limited to a small unmanned aerial vehicle that can move in three-dimensional space. That is, for example, a drone that is a traveling vehicle capable of moving in two-dimensional space is also included.
  • In the following description, directions defined as follows are used. That is, assuming a propeller-type drone D flying on its own as in Patent Document 1 described above, consider the direction in which the main propeller rotates to generate thrust against gravity.
  • An axis in this direction passing through the center of gravity of the drone (excluding cargo and the like) is referred to as the axis DZ.
  • When the rotation speed of the main propeller is increased from a state in which the drone D is hovering at a certain position, the drone D gains altitude and rises.
  • This direction is appropriately referred to as "the direction in which the axis DZ is positive", and the opposite direction as "the direction in which the axis DZ is negative". Therefore, according to the above definition, the axis DZ is fixed to the drone D regardless of the attitude of the drone D.
  • a three-dimensional Cartesian coordinate system including the following axes X, Y, and Z is used. That is, the axis Z of the three-dimensional Cartesian coordinate system is taken in the direction of the axis DZ of the drone D arranged in the attitude of the normal flight state. Further, the direction opposite to the direction in which gravity acts is defined as the direction of the axis Z. Also, take an axis X and an axis Y that are orthogonal to each other so as to be orthogonal to the axis Z. At this time, the directions of the axes X and Y are defined so that the three-dimensional coordinate system using the axes X, Y, and Z is a right-handed system.
  • the drone D when the drone D moves without changing its height, the drone D flies in the XY plane including the axis X and the axis Y.
  • the direction in which the height increases is referred to as “the direction in which the axis Z is positive”
  • the opposite is referred to as “the direction in which the axis Z is negative”.
  • the directions of the axis X are referred to as “the direction in which the axis X is positive” and “the direction in which the axis X is negative”
  • Similarly, the directions of the axis Y are referred to as "the direction in which the axis Y is positive" and "the direction in which the axis Y is negative", respectively.
  • FIG. 1 is a diagram showing an outline of a maneuvering control system including a drone including a position specifying device according to an embodiment of the present invention.
  • In the maneuvering control system shown in FIG. 1, a drone D equipped (mounted) with a position specifying device 1 and an operator terminal 2 operated by an operator U are connected via wireless communication.
  • the drone D and the operator terminal 2 are connected to each other via a predetermined network N such as the Internet (not shown).
  • the network N includes not only the Internet and mobile carrier networks, but also short-range wireless communications such as NFC (registered trademark) and Bluetooth (registered trademark).
  • Normally, the drone D provided with the position specifying device 1 acquires position information by receiving GPS signals transmitted from GPS (Global Positioning System) satellites GS. That is, normally, the drone D receives the GPS signals transmitted from the GPS satellites GS and uses its own position information thus acquired for autonomous control, or transmits the position information to the operator terminal 2.
  • the drone D is not limited to GPS, and can use positioning systems using various satellites such as the quasi-zenith satellite system in Japan.
  • The operator terminal 2 is a terminal composed of a smartphone or the like and used by the operator U to operate the drone D. By operating the operator terminal 2, the operator U can directly control the drone D, as well as issue a takeoff/landing command, a preset predetermined work command, and the like.
  • Here, the accuracy of position information specified using GPS may deteriorate indoors or in other places where GPS radio waves cannot be received directly. That is, for example, the accuracy of the drone D's own position information based on GPS signals may be insufficient indoors, where GPS radio waves hardly reach, or even outdoors in places near a large building (wall). Furthermore, the accuracy of position information based on GPS signals has an upper limit determined by the specification of the satellite positioning system such as GPS.
  • In contrast, since the position specifying device 1 of the present embodiment has the configuration described later, it can specify the position of the drone D itself indoors, in places near a large building (wall), and with an accuracy not limited by the GPS specification.
  • FIG. 2 is a block diagram showing an example of the hardware configuration of the position specifying device provided in the drone of FIG.
  • The position specifying device 1 includes a CPU (Central Processing Unit) 11, a ROM (Read Only Memory) 12, a RAM (Random Access Memory) 13, a bus 14, an input/output interface 15, an output unit 16, an input unit 17, a storage unit 18, a communication unit 19, and a drive 20.
  • the CPU 11 executes various processes according to the program recorded in the ROM 12 or the program loaded from the storage unit 18 into the RAM 13. Data and the like necessary for the CPU 11 to execute various processes are also appropriately stored in the RAM 13.
  • the CPU 11, ROM 12 and RAM 13 are connected to each other via the bus 14.
  • An input / output interface 15 is also connected to the bus 14.
  • An output unit 16, an input unit 17, a storage unit 18, a communication unit 19, and a drive 20 are connected to the input / output interface 15.
  • the output unit 16 is composed of a display, a speaker, and the like, and outputs various information as images and sounds.
  • the input unit 17 is composed of a keyboard, a mouse, etc., and inputs various information.
  • the storage unit 18 is composed of a hard disk, a DRAM (Dynamic Random Access Memory), or the like, and stores various data.
  • the communication unit 19 communicates with another device (such as the operator terminal 2 in the example of FIG. 1) via the network N including the Internet.
  • A removable medium 31 composed of a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 20 as appropriate.
  • The program read from the removable medium 31 by the drive 20 is installed in the storage unit 18 as needed. Further, the removable medium 31 can also store various data stored in the storage unit 18, in the same manner as the storage unit 18.
  • the operator terminal 2 in FIG. 1 has basically the same configuration as the hardware configuration shown in FIG. 2 except for the following points. Therefore, the description of the hardware configuration of the operator terminal 2 will be omitted.
  • The position specifying device 1 can execute a series of processes for specifying the position of the drone D itself (hereinafter referred to as the "position identification process").
  • In order to execute the position identification process, the maneuvering control system of FIG. 1 has the functional configuration shown in FIG. 3.
  • FIG. 3 is a functional block diagram showing an example of the functional configuration of the position specifying device of FIG. 2.
  • FIG. 4 is a diagram showing an example of a situation in which various information is acquired by a drone equipped with the position identification device of FIG.
  • When the position identification process is executed, the position specifying device 1 and the imaging unit 41, the distance measuring unit 42, the drive unit 43, and the imaging unit drive unit 44 provided in the drone D function.
  • the position specifying device 1 provided in the drone D identifies the position of the drone D based on various information. That is, the position specifying device 1 in the present embodiment is an information processing device provided in the drone D and executing information processing related to control. Therefore, in FIG. 3, the drone D is illustrated as a configuration including (mounting) the position specifying device 1.
  • First, the imaging unit 41, the distance measuring unit 42, the drive unit 43, and the imaging unit drive unit 44 included in the drone D, which exchange various data with the position specifying device 1 and are controlled by it, will be described.
  • the image pickup unit 41 includes a lens, an optical element such as a CCD or CMOS, and a control unit thereof, and captures an image or video. Further, although details will be described later with reference to FIG. 5, the image pickup unit 41 includes a set of a plurality of lenses and optical elements capable of capturing images of a plurality of angles of view.
  • the distance measuring unit 42 includes various signal transmitting devices, various sensors, a control unit thereof, and the like, and measures the distance from the distance measuring unit 42 to the point to be measured.
  • Specifically, for example, an optical rangefinder using light waves or a radio rangefinder using radio waves such as a radar is adopted as the distance measuring unit 42.
  • Further, a device capable of acquiring the two-dimensional or three-dimensional structure of the object to be measured can also be adopted as the distance measuring unit 42. That is, for example, a radar or LiDAR (Light Detection and Ranging) can be adopted as the distance measuring unit 42.
  • the drive unit 43 is driven by using the supplied energy.
  • the drone D can move in space by driving the drive unit 43 or the like.
  • a motor driven by using electric energy and an engine driven by using chemical energy such as gasoline are both examples of the driving unit 43.
  • The imaging unit drive unit 44 can drive the above-mentioned imaging unit 41 with a gimbal or the like. That is, although details will be described later with reference to FIG. 7, the imaging unit drive unit 44 drives a gimbal having a plurality of rotation axes to change the direction of the imaging unit 41, whereby the area imaged by the imaging unit 41 can be changed.
  • Before explaining each functional block that functions in the position specifying device 1, the arrangement of the drone D in the example shown in FIG. 4, the imaging unit 41 and the distance measuring unit 42 included in the drone D, the marker to be imaged, and the wall or the like whose distance is to be measured will be described.
  • In FIG. 4, the drone D flying in a space adopting the above-mentioned three-dimensional Cartesian coordinate system is shown as viewed from the direction in which the axis Y is positive.
  • the wall W is arranged in the direction in which the axis X is negative with respect to the drone D.
  • the ground G is arranged in the direction in which the axis Z is negative with respect to the drone D.
  • the marker 3 is installed on the ground G. An example of a specific shape of the marker 3 will be described later with reference to FIGS. 6 and 8.
  • the drone D includes an imaging unit 41 that images a direction in which the axis Z is negative.
  • the two dotted lines V1-a and V1-b indicate the angle of view of the imaging unit 41 provided in the drone D. That is, the imaging unit 41 images the region between the intersection of the dotted line V1-a and the ground G and the intersection of the dotted line V1-b and the ground G. That is, in FIG. 4, the imaging unit 41 captures an image including the image of the marker 3.
  • the angle of view of the imaging unit 41 also extends in the direction of the axis Y.
  • the drone D includes a distance measuring unit 42 that measures a distance in a direction in which the axis X is negative.
  • the distance D1 is the distance between the point WP on the wall W and the point S1 on the distance measuring unit 42. That is, the distance measuring unit 42 provided in the drone D measures the distance D1 between the point WP and the point S1 as the distance in the direction in which the axis X is negative.
  • In the CPU 11 of the position specifying device 1, an image data acquisition unit 111, a relative position estimation unit 112, a distance acquisition unit 113, a spatial information acquisition unit 114, a position specifying unit 115, an attitude control unit 116, and an imaging area control unit 117 function.
  • The image data acquisition unit 111 acquires data of an image captured from the position of the drone D with the marker 3 in the real space as the object to be imaged. Specifically, the image data acquisition unit 111 acquires the image data from the imaging unit 41 included in the drone D via the communication unit 19. As described above, this image data includes an image of the marker 3 installed on the ground in the direction in which the axis Z of the drone D is negative.
  • The relative position estimation unit 112 estimates a relative position, including at least one of a distance, a direction, and an angle, from the position of the drone D with respect to the marker 3, based on the image of the marker 3 included in the image corresponding to the image data. That is, the relative position estimation unit 112 estimates the position relative to the position of the drone D with respect to the marker 3 based on the image of the marker 3 included in the image corresponding to the image data acquired by the image data acquisition unit 111.
  • Here, the relative position is the second position grasped as a set of coordinate values in a coordinate system whose origin is at the first position.
  • In contrast, the absolute position is a position grasped as a set of coordinate values in one common coordinate system. That is, for example, when a three-dimensional Cartesian coordinate system is used as the coordinate system, the relative position from the first position to the second position can be taken as the value obtained by subtracting the set of coordinate values of the absolute position of the first position from the set of coordinate values of the absolute position of the second position.
  • the relative position from the first position to the second position will be described as being determined by a distance, a direction, and an angle.
  • a three-dimensional polar coordinate system is used as the coordinate system representing the relative position. That is, the relative position from the first position to the second position includes the distance, the azimuth angle, and the elevation / depression angle.
  • the azimuth is an angle that determines the relative position of the XY plane (horizontal plane). That is, the azimuth corresponds to the azimuth (direction) seen from the drone D.
  • the elevation / depression angle is an angle in the vertical direction with respect to the horizontal.
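  • As a concrete illustration of this representation, the following Python sketch converts two absolute Cartesian positions into the (distance, azimuth, elevation/depression) triple described above; the function name and the choice of Python are illustrative only and do not appear in the publication.

```python
import math

def relative_polar(origin_xyz, target_xyz):
    """Express the relative position from origin_xyz (first position) to
    target_xyz (second position) as (distance, azimuth, elevation),
    matching the three-dimensional polar representation described above."""
    dx = target_xyz[0] - origin_xyz[0]  # subtraction of absolute coordinates
    dy = target_xyz[1] - origin_xyz[1]
    dz = target_xyz[2] - origin_xyz[2]
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    azimuth = math.atan2(dy, dx)                    # angle within the XY (horizontal) plane
    elevation = math.atan2(dz, math.hypot(dx, dy))  # elevation (+) / depression (-) angle
    return distance, azimuth, elevation

# Example: first position at the origin, second position at x=1 m, y=1 m, z=2 m.
print(relative_polar((0.0, 0.0, 0.0), (1.0, 1.0, 2.0)))
```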
  • In the relative position represented by the three-dimensional polar coordinate system estimated by the relative position estimation unit 112, the accuracy of each of the distance, the azimuth angle, and the elevation/depression angle changes depending on the arrangement of the image of the marker 3 in the captured image. Under certain conditions, it is conceivable that only one of the distance, the azimuth angle, and the elevation/depression angle is obtained with high accuracy. Therefore, in the following description, the state in which the relative position has been estimated refers not only to the state in which the three-dimensional relative position has been obtained accurately, but also to the state in which any one of the distance, the azimuth angle, and the elevation/depression angle has been estimated with a predetermined accuracy.
  • Hereinafter, how the relative position estimation unit 112 estimates the relative position of the drone D with respect to the marker 3 based on the image of the marker 3 included in the image corresponding to the data acquired by the image data acquisition unit 111 will be described with reference to FIG. 4 as appropriate.
  • Specifically, for example, the relative position estimation unit 112 estimates the distance D2 (see FIG. 4) from the position of the drone D with respect to the marker 3 based on the size of the marker 3 included in the image corresponding to the data acquired by the image data acquisition unit 111.
  • In the example of FIG. 4, the distance D2 as a relative position corresponds to the so-called altitude of the drone D.
  • Here, the marker 3 is located directly below the drone D.
  • Therefore, the relative position estimation unit 112 estimates, as the distance D2 that is the relative position, the distance between the point MP on the marker 3 and the point S2 on the imaging unit 41.
  • the drone D usually includes an altimeter including a rangefinder in a direction in which the axis Z is negative. Therefore, the relative position estimation unit 112 may use the altitude information measured by the altimeter provided in the drone D as the distance D2 from the marker 3.
  • Further, the relative position estimation unit 112 estimates the azimuth angle and the elevation/depression angle from the position of the drone D with respect to the marker 3 based on the position at which the marker 3 appears in the image corresponding to the data acquired by the image data acquisition unit 111 and on the attitude of the drone D.
  • That is, although not shown, the drone D usually includes a gyro sensor or the like for acquiring the angle of its own attitude. Further, the area of the image captured by the imaging unit 41 changes as the attitude of the drone D changes. That is, the relative position estimation unit 112 can estimate the azimuth angle and the elevation/depression angle after grasping, based on the attitude of the drone D, which area the imaging unit is capturing. Even when the marker 3 is not imaged with a sufficient size, the relative position estimation unit 112 can estimate the azimuth angle and the elevation/depression angle with a predetermined accuracy as long as the imaged object can be recognized as the marker 3.
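  • A minimal sketch of how such an estimate could be computed from a single downward image is shown below. It assumes a pinhole camera model with a known focal length in pixels and a known physical marker size, and it ignores the attitude correction described above; all names and parameter values are illustrative assumptions, not part of the publication.

```python
import math

def estimate_relative_position(marker_px_width, marker_center_px, image_size_px,
                               focal_px, marker_width_m):
    """Rough estimate of the relative position of the drone with respect to a
    marker from a single downward-facing image using a pinhole camera model.
    focal_px and marker_width_m are assumed calibration data."""
    # Distance D2: the marker appears smaller the farther away it is.
    distance = focal_px * marker_width_m / marker_px_width
    # Offset of the marker from the image centre, in pixels.
    cx, cy = image_size_px[0] / 2.0, image_size_px[1] / 2.0
    dx_px, dy_px = marker_center_px[0] - cx, marker_center_px[1] - cy
    # Bearing of the marker within the image plane and its angle off the optical axis.
    azimuth = math.atan2(dy_px, dx_px)
    off_axis = math.atan2(math.hypot(dx_px, dy_px), focal_px)
    # With the camera pointing straight down, the marker sits close to -90 degrees elevation.
    elevation = -(math.pi / 2.0 - off_axis)
    return distance, azimuth, elevation

# Example: a 0.5 m marker spanning 100 px, detected 30 px to the right of the image centre.
print(estimate_relative_position(100, (350, 240), (640, 480), 800, 0.5))
```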
  • the distance acquisition unit 113 acquires the distance from the position of the drone D with respect to the wall W. That is, the distance acquisition unit 113 acquires the distance D1 measured by the distance measurement unit 42. As described above, in the example of FIG. 4, the distance measuring unit 42 provided in the drone D measures the distance D1 between the point WP and the point S1 as the distance in the direction in which the axis X is negative. The distance acquisition unit 113 can acquire the distance D1 from the distance measuring unit 42 provided in the drone D via the communication unit 19.
  • the spatial information acquisition unit 114 acquires information on the arrangement of a plurality of objects including the marker 3 and the wall W in the real space as spatial information. That is, the spatial information acquisition unit 114 acquires spatial information from the operator terminal 2.
  • the spatial information is information on the arrangement of a plurality of objects including the marker 3 and the wall W in the real space.
  • The spatial information is information including the arrangement of various objects, such as objects with which the drone D may come into contact, the ground G on which the marker 3 is arranged, and objects whose distances can be measured by the distance measuring unit 42.
  • Specifically, the spatial information includes information on the arrangement of the wall W, information on the arrangement of the ground G, and information on the arrangement of the marker 3. That is, for example, information such as at which coordinate on the axis X the wall W exists is included in the spatial information. Further, for example, information such as at which coordinate (altitude) on the axis Z the ground G exists is included in the spatial information. Further, for example, information such as at which absolute coordinates the marker 3 exists is included in the spatial information.
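  • The following is one illustrative way the spatial information described above could be organized in code; the field names and values are assumptions made for the sake of the example, not a format defined in the publication.

```python
# Illustrative structure for the spatial information described above
# (field names and values are assumptions, not a format from the publication).
spatial_information = {
    "walls": [
        {"id": "W", "axis": "X", "coordinate": -5.0},       # wall W at x = -5 m
    ],
    "ground": {"axis": "Z", "coordinate": 0.0},              # ground G at z = 0 m
    "markers": [
        {"marker_id": 1, "absolute_xyz": (2.0, 3.0, 0.0)},   # marker 3 on the ground G
    ],
}
print(spatial_information["markers"][0]["absolute_xyz"])
```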
  • The position specifying unit 115 specifies the position of the drone D based on the relative position of the drone D with respect to the marker 3, the distance from the position of the drone D to the wall W, and the spatial information. That is, the position specifying unit 115 specifies the position of the drone D based on the relative position, including the distance D2, with respect to the marker 3 estimated by the relative position estimation unit 112, the distance D1 to the wall W acquired by the distance acquisition unit 113, and the spatial information acquired by the spatial information acquisition unit 114.
  • the spatial information includes the absolute coordinates of the marker 3. Therefore, the position specifying unit 115 can specify the absolute coordinates of the position of the drone D based on the relative position estimated by the relative position estimating unit 112 and the spatial information.
  • Further, the position specifying unit 115 can improve the accuracy of the specified position based on the distance D1 to the wall W acquired by the distance acquisition unit 113 and the spatial information including the position of the wall W. That is, for example, as described above, the accuracy of the relative coordinates changes depending on the arrangement of the image of the marker 3 in the captured image.
  • For example, in the example of FIG. 4, the position specifying unit 115 can improve the accuracy of the specified position based on the distance D1 to the wall W and the spatial information including the position of the wall W. Furthermore, if the coordinates of the drone D on the XY plane in absolute coordinates contain a large error, the drone D may come into contact with the wall W arranged in the direction in which the axis X is negative. However, the drone D of the present embodiment can directly measure the distance D1 to the wall W. As a result, the autonomously controlled drone D can be prevented from coming into contact with the wall W.
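  • A minimal sketch of this combination step is shown below, assuming the marker's absolute coordinates and the wall's X coordinate come from the spatial information and that the measured wall distance D1 is used to refine the X coordinate, as in the example of FIG. 4; the function and variable names are illustrative.

```python
def specify_position(marker_abs_xyz, rel_drone_from_marker_xyz,
                     wall_x, measured_wall_distance):
    """Sketch of the position specifying step: start from the absolute
    coordinates of the marker plus the estimated relative offset of the drone,
    then refine the X coordinate with the directly measured distance D1 to the
    wall W (which lies in the negative X direction in the example of FIG. 4)."""
    # Marker-based estimate in absolute coordinates.
    x = marker_abs_xyz[0] + rel_drone_from_marker_xyz[0]
    y = marker_abs_xyz[1] + rel_drone_from_marker_xyz[1]
    z = marker_abs_xyz[2] + rel_drone_from_marker_xyz[2]
    if measured_wall_distance is not None:
        # Refine X using the wall position from the spatial information and D1.
        x = wall_x + measured_wall_distance
    return (x, y, z)

# Example: marker at (2, 3, 0), drone estimated 0.1 m off in X and 4 m above it,
# wall W at x = -5 m, distance D1 = 7.2 m measured by the distance measuring unit.
print(specify_position((2.0, 3.0, 0.0), (0.1, 0.0, 4.0), -5.0, 7.2))
```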
  • the attitude control unit 116 controls the attitude and position of the drone D based on the position of the drone D specified by the position specifying unit 115 and the distance D1 from the wall W. That is, the attitude control unit 116 can control the drive unit 43 via the communication unit 19. Thereby, as described above, it is possible to control the distance D1 with respect to the wall W so as not to be shorter than a predetermined distance. That is, it is possible to prevent the drone D from coming into contact with the wall W. Further, for example, as will be described later, when the marker 3 protrudes from the region imaged by the imaging unit 41 and is imaged, the posture can be controlled so that the entire marker 3 is imaged. As a result, the relative position estimation unit 112 can estimate the relative position with the marker 3. An example in which the marker 3 protrudes from the region imaged by the imaging unit 41 and is imaged will be described later with reference to FIGS. 5 and 6.
  • the imaging area control unit 117 controls to change the area imaged by the imaging unit 41. That is, for example, the imaging unit 41 can be turned by a gimbal including an imaging unit driving unit 44.
  • the image pickup area control unit 117 can control to change the image pickup area by controlling the image pickup unit drive unit 44 via the communication unit 19. An example in which the imaging region control unit 117 changes the imaging region will be described later with reference to FIG. 7.
  • The functional configuration of the position specifying device 1 shown in FIG. 3 has been described above with reference to an example in which the position specifying device 1 provided in the drone D shown in FIG. 4 specifies the position of the drone D itself.
  • an example in which the marker 3 protrudes from the region imaged by the imaging unit 41 and is imaged will be described with reference to FIGS. 5 and 6.
  • FIG. 5 is a diagram showing an example different from that of FIG. 4 among an example of a situation in which various information is acquired by a drone equipped with the position specifying device of FIG.
  • FIG. 6 is a diagram showing each of the examples of images acquired by the position identifying device of FIG. 3 in the situations shown in FIGS. 4 and 5, respectively.
  • In FIG. 5, as in FIG. 4, the drone D flying in a space adopting the above-mentioned three-dimensional Cartesian coordinate system is shown as viewed from the direction in which the axis Y is positive.
  • In the example of FIG. 5, the axis DZ is not parallel to the axis Z. That is, the drone D is in a situation where its body is tilted because it is being blown by the wind or is moving horizontally in the direction of the axis X.
  • the drone D in the example of FIG. 5 includes two imaging units 41-1 and 41-2.
  • the two dotted lines V2-a and V2-b indicate the angle of view of the imaging unit 41-1 provided in the drone D.
  • That is, the imaging unit 41-1 captures an angle of view basically similar to that of the imaging unit 41 in FIG. 4.
  • the two dotted lines V3-a and V3-b indicate the angle of view of the imaging unit 41-2 provided in the drone D. That is, the imaging unit 41-2 captures an image having a wider angle of view than the angle of view of the imaging unit 41-1.
  • The figures shown in situations A and B of FIG. 6 are examples of images captured from the drone D flying in the situations of FIGS. 4 and 5, respectively, viewed from the direction in which the axis Z is positive.
  • The figure shown in situation A of FIG. 6 is an example of an image captured by the imaging unit 41 of FIG. 4. That is, in the situation of FIG. 4, the image data acquisition unit 111 of FIG. 3 acquires the image shown in situation A of FIG. 6.
  • The figure shown in situation B of FIG. 6 is an example of an image captured by the imaging unit 41-1 of FIG. 5. That is, in the situation of FIG. 5, the image data acquisition unit 111 of FIG. 3 acquires the image shown in situation B of FIG. 6.
  • the image data acquisition unit 111 can further acquire data of another image having a different angle of view from the image captured from the position of the drone D.
  • The relative position estimation unit 112 can estimate the relative position between the position of the drone D and the marker 3 based on the image of the marker 3 included in at least one of the images corresponding to the data of the plurality of images.
  • In the situation of FIG. 4, the drone D is not tilted, and the marker 3 is included in the angle of view indicated by the dotted lines V1-a and V1-b. Therefore, the example of the image captured by the imaging unit 41 shown in situation A of FIG. 6 includes the ground G and the entire marker 3 arranged on the ground G. In the situation of FIG. 5, the body of the drone D is tilted, and the marker 3 protrudes from the angle of view indicated by the dotted lines V2-a and V2-b. Therefore, the example of the image captured by the imaging unit 41-1 shown in situation B of FIG. 6 includes the ground G and only a part of the marker 3 arranged on the ground G.
  • the marker 3 may protrude from the image captured by the imaging unit 41 or the imaging unit 41-1.
  • An image captured by the imaging unit 41-2 is used in such a case.
  • the image pickup unit 41-2 provided in the drone D captures an image having an angle of view indicated by the two dotted lines V3-a and V3-b.
  • the entire marker 3 is included in the angle of view indicated by the two dotted lines V3-a and V3-b. That is, the image pickup unit 41-2 provided in the drone D captures an image including the entire marker 3.
  • That is, by estimating the relative position based on the data of the image captured by the imaging unit 41-2, the relative position estimation unit 112 can estimate the relative position even in a case where it cannot be estimated based on the data of the image captured by the imaging unit 41-1.
  • When the drone D is provided with a plurality of imaging units having different angles of view in this way, the following uses are possible.
  • When the attitude of the drone D changes rapidly due to a gust, for example, the attitude of the drone D may not be corrected in time.
  • Even in such a case, by using the imaging unit 41-2 having a wide angle of view, it is possible to capture an image including the entire marker 3.
  • As a result, the position specifying device 1 can specify the position even when the attitude changes, and the drone D can fly safely.
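  • The following sketch illustrates this fallback between the narrow and wide angle-of-view images; the dictionary-based detection results and the function name are assumptions made purely for illustration.

```python
def choose_marker_detection(narrow_fov_detection, wide_fov_detection):
    """Sketch of the fallback described above: prefer the narrow angle-of-view
    image (higher accuracy), fall back to the wide angle-of-view image when the
    marker protrudes from, or is missing in, the narrow image.  Each argument
    is None or a dict with a 'fully_visible' flag."""
    if narrow_fov_detection and narrow_fov_detection["fully_visible"]:
        return narrow_fov_detection   # imaging unit 41-1 (narrow, more accurate)
    if wide_fov_detection and wide_fov_detection["fully_visible"]:
        return wide_fov_detection     # imaging unit 41-2 (wide, more robust)
    return None                       # marker unusable in both images

# Example: the marker protrudes from the narrow image but fits in the wide one.
print(choose_marker_detection({"fully_visible": False},
                              {"fully_visible": True, "source": "41-2"}))
```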
  • FIG. 7 is a diagram showing an example of a situation in which the position identifying device of FIG. 3 controls a region to be imaged in the situation of FIG.
  • FIG. 7 shows the drone D, provided with the position specifying device 1 that executes the position identification process, flying in a space adopting the above-mentioned three-dimensional Cartesian coordinate system, as viewed from the direction in which the axis Y is positive.
  • the example of FIG. 7 is an example in which the imaging unit 41-1 is driven by the imaging unit driving unit 44 of the drone D of the example of FIG.
  • The two dotted lines V2-a and V2-b indicate the angle of view of the imaging unit 41-1 provided in the drone D, but in the example of FIG. 7 they are shown at positions different from those in the example of FIG. 5. That is, the example of FIG. 7 shows that the imaging direction of the imaging unit 41-1 has been changed by driving the imaging unit 41-1 with the imaging unit drive unit 44 of the drone D of the example of FIG. 5.
  • In this way, the imaging area control unit 117 can control and change the area of real space captured as an image. That is, the imaging area control unit 117 can control the area imaged by the imaging unit 41-1 by driving the gimbal or the like of the imaging unit 41-1 via the imaging unit drive unit 44. Although it depends on the number of pixels of the captured image and the like, when estimating the relative position, the accuracy of the relative position is usually higher when the estimate is based on an image captured by the imaging unit 41-1 having a narrow angle of view. That is, the accuracy of the relative position improves when it is based on an image of the marker 3 captured by a camera with a narrow angle of view. Therefore, the accuracy of relative position estimation can be improved by having the imaging area control unit 117 drive the gimbal or the like of the imaging unit 41-1 via the imaging unit drive unit 44.
  • Further, the relative position estimation unit 112 can also estimate the relative position based on the image captured by the imaging unit 41-2 having a wide angle of view. In this way, by providing the imaging units 41-1, 41-2, and the like, which capture images with a plurality of angles of view, the area in which the relative position can be estimated while the drone D flies becomes wider. Furthermore, by having the imaging area control unit 117 control the areas imaged by the imaging units 41-1, 41-2, and the like, the area in which the relative position can be estimated becomes wider regardless of the attitude and position of the drone D.
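  • As an illustration of this kind of imaging area control, the sketch below computes the pan and tilt angles by which a gimbal would have to turn the narrow angle-of-view camera so that a detected marker moves to the image center; the pinhole model and the focal length parameter are assumptions, not values given in the publication.

```python
import math

def gimbal_correction(marker_center_px, image_size_px, focal_px):
    """Sketch of the imaging area control described above: pan/tilt angles (in
    radians) by which a gimbal would turn the narrow angle-of-view camera so
    that the detected marker moves to the image centre.  focal_px (focal length
    in pixels) is an assumed calibration parameter."""
    cx, cy = image_size_px[0] / 2.0, image_size_px[1] / 2.0
    pan = math.atan2(marker_center_px[0] - cx, focal_px)   # horizontal correction
    tilt = math.atan2(marker_center_px[1] - cy, focal_px)  # vertical correction
    return pan, tilt

# Example: marker detected 120 px right of and 40 px below the image centre.
print(gimbal_correction((440, 280), (640, 480), 800))
```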
  • FIG. 8 is a diagram showing an example of a marker that is an object to be imaged of an image acquired by the position specifying device of FIG.
  • the marker 3 in the example of the situation A in FIG. 6 is provided with a two-dimensional bar code-like pattern consisting of an arrangement of white-painted squares and black-painted squares inside a square frame with a black border.
  • the pattern provided by the marker 3 has the following features.
  • the shape of the marker 3 shown in the situation A in FIG. 6 is not rotationally symmetric with respect to the axis parallel to the axis Z.
  • Therefore, the relative position estimation unit 112 can estimate the azimuth angle with respect to the marker 3 from the captured image of the marker 3.
  • the marker 3 can have the following patterns.
  • the marker 3 can include meanings such as high-precision control and instructions.
  • the marker 3 can include a pattern including information on which of the plurality of markers 3 is the marker 3 (hereinafter, referred to as “marker ID”).
  • the position specifying unit 115 can specify the position based on the arrangement of the plurality of markers 3, the marker ID of the marker 3 being imaged, and the relative position of the marker 3 being imaged.
  • the marker 3 is provided with a pattern for compensating for the lack of information included in the marker 3 by using a predetermined algorithm so that information such as the azimuth angle and the marker ID can be acquired even when the entire marker 3 cannot be imaged.
  • the marker 3 may include a pattern including information on GPS coordinates in which the marker is installed. This makes it possible to authenticate whether the marker is installed at the correct position.
  • Further, by acquiring in advance sets of marker IDs and the GPS coordinates at which the corresponding markers are installed, the position specifying device 1 may collate those GPS coordinates against the marker ID of the marker 3 being imaged.
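  • A minimal sketch of such a collation is shown below, assuming a pre-registered table that maps marker IDs to installation coordinates; the data source, the tolerance, and the function name are illustrative assumptions.

```python
def verify_marker(marker_id, gps_from_marker, registered_markers, tol_deg=1e-4):
    """Sketch of the collation described above: compare the GPS coordinates
    decoded from the imaged marker with the registered installation position
    for that marker ID.  registered_markers maps marker IDs to (lat, lon);
    the data source and the tolerance are assumptions for illustration."""
    expected = registered_markers.get(marker_id)
    if expected is None:
        return False  # unknown marker ID
    return (abs(expected[0] - gps_from_marker[0]) <= tol_deg and
            abs(expected[1] - gps_from_marker[1]) <= tol_deg)

# Example: marker 7 reports coordinates matching its registered position.
print(verify_marker(7, (35.7128, 139.7621), {7: (35.7128, 139.7621)}))
```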
  • The marker 3 in the example of FIG. 8 is composed of a marker 3-a having the same structure as the marker 3 in the example of FIG. 6, and a square marker 3-b having a larger black border than the marker 3-a.
  • the pattern included in the marker 3 in the example of FIG. 8 has the following features.
  • the imaging region control unit 117 can control the region to be imaged by the imaging unit 41 so that the marker 3-a is imaged.
  • the drone D flying at a high altitude may not be able to grasp the detailed pattern of the relatively small marker 3-a from the captured image.
  • However, the drone D can approach the marker 3 by targeting the square marker 3-b, whose black border is large compared with that of the marker 3-a, and can thereby grasp the detailed pattern of the marker 3-a.
  • the marker ID and the like can be obtained from the marker 3-a, and it can be confirmed that the landing point is correct.
  • In this way, the relative position estimation unit 112 can analyze the relative position, including at least one of the distance, the direction, and the angle, from the position of the drone D with respect to the marker 3 based on at least one of the large structure of the marker 3 and a structure smaller than the large structure.
  • FIG. 9 is a flowchart illustrating an example of the flow of the position identification process executed by the position identification device having the functional configuration of FIG.
  • When the position identification device 1 provided in the drone D executes the position identification process, the position identification process is started and the following steps S11 to S15 are executed.
  • In step S11, the image data acquisition unit 111 acquires data of an image captured from the position of the drone D with the marker 3 in the real space as the object to be imaged.
  • In step S12, the relative position estimation unit 112 estimates a relative position, including at least one of a distance, a direction, and an angle, from the position of the drone D with respect to the marker 3, based on the image of the marker 3 included in the image corresponding to the image data acquired in step S11.
  • In step S13, the distance acquisition unit 113 acquires the distance from the position of the drone D to the wall W.
  • In step S14, the spatial information acquisition unit 114 acquires, as spatial information, information on the arrangement of a plurality of objects, including the marker 3 and the wall W, in the real space.
  • In step S15, the position specifying unit 115 specifies the position of the drone D based on the relative position from the drone D with respect to the marker 3 estimated in step S12, the distance from the position of the drone D to the wall W acquired in step S13, and the spatial information acquired in step S14. With this, the position identification process ends.
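  • The following sketch strings steps S11 to S15 together in simplified form, assuming the relative position has already been estimated from the acquired image and that the spatial information supplies the marker's absolute coordinates and the wall position; all names and values are illustrative, not an implementation from the publication.

```python
def position_identification_process(marker_detection, wall_distance, spatial_info):
    """Minimal sketch of steps S11 to S15 of FIG. 9, assuming the relative
    offset of the drone from the marker has already been estimated from the
    image acquired in step S11.  All names are illustrative placeholders."""
    # S11/S12: relative offset of the drone from the marker, estimated from the image.
    rel = marker_detection["relative_xyz"]
    # S14: marker and wall placement taken from the spatial information.
    marker_abs = spatial_info["marker_abs_xyz"]
    wall_x = spatial_info["wall_x"]
    # S15: combine the marker-based estimate (S12), the wall distance (S13),
    # and the spatial information (S14) into the drone position.
    x = wall_x + wall_distance          # refined with the measured distance D1
    y = marker_abs[1] + rel[1]
    z = marker_abs[2] + rel[2]
    return (x, y, z)

# Example call with illustrative values.
print(position_identification_process(
    {"relative_xyz": (0.1, 0.0, 4.0)},                       # S11/S12
    7.2,                                                      # S13: distance to wall W
    {"marker_abs_xyz": (2.0, 3.0, 0.0), "wall_x": -5.0}))     # S14: spatial information
```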
  • the marker 3 has been described as a two-dimensional bar code-shaped marker in the above-described embodiment, but the marker 3 is not particularly limited thereto.
  • the marker 3 having a different form will be described with reference to FIGS. 10 and 11.
  • That is, the position specifying device 1 can also be used in the example shown in FIG. 10.
  • FIG. 10 is a diagram showing an example different from FIGS. 4 and 5 in an example of a situation in which various information is acquired by a drone equipped with the position specifying device of FIG.
  • In FIG. 10, the drone D flying in a space adopting the above-mentioned three-dimensional Cartesian coordinate system is shown as viewed from the direction in which the axis Y is positive.
  • In addition, this is an example in which a string-shaped marker 3-R is adopted as the marker 3.
  • the wall W1 is arranged in the direction in which the axis X is positive with respect to the drone D.
  • the wall W2 is arranged in the direction in which the axis X is negative with respect to the drone D.
  • the ceiling C is arranged in the direction in which the axis Z is positive with respect to the drone D.
  • the ground G is arranged in the direction in which the axis Z is negative with respect to the drone D.
  • a string-shaped marker 3-R is hung from the ceiling C and installed. That is, the space in the example of FIG. 10 is a space closed by the ceiling C, the walls W1 and W2, and the ground G.
  • The drone D in the example of FIG. 10 is drawn with the position specifying device 1, the imaging unit 41, and the distance measuring unit 42, as in the example of FIG. 4.
  • However, the imaging unit 41 captures the string-shaped marker 3-R hanging from the ceiling C, instead of a marker 3 arranged on the ground.
  • the relative position estimation unit 112 can estimate the relative position based on the image of the marker 3-R imaged by the image pickup unit 41.
  • As the marker 3, not only a marker installed on the floor but also the string-shaped marker 3-R hung from the ceiling C can be adopted.
  • The marker 3-R may be, for example, a prominently colored rope, or a rod standing up from the ground G.
  • Since such markers 3-R are easy to install, they are effective even in a cylindrical building such as a chimney or in a building with high symmetry. Furthermore, although not shown, by preparing a plurality of markers 3-R and hanging string-shaped markers 3-R of different colors for each of the walls W1 and W2 and the walls (not shown) in the directions in which the axis Y is positive and negative, the walls can be distinguished from one another. The position of the drone D can then be specified by identifying which wall is being observed using the markers 3-R.
  • Further, the position specifying device 1 can also be used in the example shown in FIG. 11.
  • FIG. 11 is a diagram showing an example, different from FIGS. 4, 5, and 10, of a situation in which various information is acquired by the drone provided with the position specifying device of FIG. 3.
  • In FIG. 11, the drone D flying in a space adopting the above-mentioned three-dimensional Cartesian coordinate system is shown as viewed from the direction in which the axis Y is positive.
  • In addition, this is an example in which a string-shaped marker 3-R is adopted as the marker 3.
  • the wall W3-a is arranged in the direction in which the axis X is positive with respect to the drone D.
  • the wall W3-b is arranged in the direction in which the axis X is negative with respect to the drone D.
  • the ground G is arranged in the direction in which the axis Z is negative with respect to the drone D.
  • a string-shaped marker 3-R is hung from the ceiling C and installed.
  • The wall W3-a and the wall W3-b are drawn as being connected to each other by an arc. That is, the space in the example of FIG. 11 is a chimney-shaped space.
  • a string-shaped marker 3-R is hung and arranged along the wall W3-a.
  • the imaging unit 41 included in the drone D images the string-shaped markers 3-R.
  • Here, the image captured by the imaging unit 41 includes only the region corresponding to the angle shown in FIG. 11.
  • In the example of FIG. 11, this angle is about 90 degrees, and from the drone D the marker 3-R appears to exist almost directly above. That is, the image captured by the imaging unit 41 includes the image of the marker 3-R as a large image, owing to the length of the string-shaped marker 3-R.
  • In other words, the marker 3-R is imaged as a relatively large image.
  • Therefore, the relative position estimation unit 112 can use the marker 3-R to specify the relative position in the rotational direction within the cylindrically symmetric chimney.
  • the markers 3-R may have a rod-like shape protruding from the wall surface.
  • the markers 3-R may be a light such as an LED.
  • As described above, not only the flat marker having the two-dimensional bar code-like pattern shown in FIGS. 6 and 8 but also the string-shaped or rod-shaped marker shown in FIGS. 10 and 11 can be adopted as the marker 3.
  • the marker 3 may be not only a square but also a rectangular parallelepiped or a straight line (which can be said to be an elongated rectangle). Further, the marker 3 may include two or more lights such as LEDs, and may be drawn with a luminous paint or the like. This makes it easy to image the marker 3 even in a dark place, and the position specifying device 1 can specify the position. Further, when flying to a high altitude, a rectangular frame is suitable because the accuracy of relative position estimation by image recognition can be improved.
  • In the above-described embodiment, the marker 3 is arranged on the ground G and the distance measuring unit 42 measures the distance to the wall W, but the present invention is not particularly limited to this.
  • the marker 3 may be installed on the wall W, and the distance measuring unit 42 may measure the distance to the ground G. Even in this case, the position specifying device 1 can specify the position.
  • the drone D is a small unmanned aerial vehicle that can move in three-dimensional space, but the drone D is not particularly limited to this. That is, for example, the drone D may be a vehicle or the like that moves on the ground.
  • When the drone D is a vehicle or the like that moves on the ground, the distance between the imaging unit 41 and the marker 3 becomes short when the vehicle recognizes the marker 3 installed on the floor.
  • In such a case, a double marker such as that shown in FIG. 8 is suitable.
  • Further, if the marker 3 is drawn with luminous paint, it emits light when the vehicle passes over it, so it is easy to image and the relative position can easily be estimated by image recognition.
  • the drone D is assumed to include the imaging unit 41.
  • the image pickup unit 41 includes a lens, an optical element such as a CCD or CMOS, and a control unit thereof, and is intended to capture an image or video.
  • the drone D may have a depth sensor.
  • the drone D may have various depth sensors such as a depth camera, a radar, and a LiDAR that can acquire three-dimensional depth information.
  • the position specifying device 1 acquires the measurement result of the depth sensor as depth information.
  • the position specifying device 1 can specify the position based on the depth information acquired through the depth sensor instead of the image data acquired by the imaging unit 41.
  • In this case, the marker has a shape with predetermined unevenness. That is, the position specifying device 1 can specify the position in the same manner as in the example of the marker 3 of the above-described embodiment by recognizing the marker with the predetermined unevenness included in the depth information. Further, as the shape of the marker, the shapes shown in FIGS. 6 and 8 can also be used when the depth sensor is used.
  • the drone D may have both an imaging unit 41 and a depth sensor.
  • the position specifying device 1 can specify the position based on the image data captured by the imaging unit 41 and the depth information acquired via the depth sensor.
  • the marker for using the imaging unit 41 and the marker for using the depth sensor can be used properly depending on the installation position of the marker.
  • a marker having both features such as color and fluorescence for using the imaging unit 41 and predetermined unevenness for using the depth sensor can be adopted. This makes it a marker that can be used by both the image pickup unit 41 and the drone D having either the depth sensor.
  • In the above-described embodiment, the distance acquisition unit 113 acquires the distance measured by the distance measuring unit 42, but the present invention is not particularly limited to this. That is, for example, it is sufficient that the distance from the drone D to an object other than the marker or the like in the real space can be acquired. Specifically, for example, the distance acquisition unit 113 can acquire the distance based on the position of the drone D and the angle to another object. That is, for example, at a certain time t1, it is assumed that the position specifying device 1 has been able to specify the position of the drone D by an arbitrary method including GPS.
  • In this case, the distance acquisition unit 113 analyzes the data of the image captured by an imaging unit such as a camera to obtain the angle (for example, the azimuth angle and the elevation/depression angle) at which another object exists as seen from the drone D. Further, the distance acquisition unit 113 estimates the distance between the drone D and the other object based on this angle and the spatial information. As a result, the distance acquisition unit 113 can acquire the distance between the drone D and the other object. Needless to say, the above-mentioned angle may be corrected based on the attitude information (tilt and direction of the aircraft) of the drone D before being used. Furthermore, in the above case, the distance acquisition unit 113 can acquire the variation (amount of change) of the position of the other object by performing image analysis on the image data. This also makes it possible to estimate and acquire the distance between the drone D and the other object.
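  • A minimal sketch of this angle-based alternative is shown below, assuming the other object is a wall whose X coordinate is known from the spatial information, so the distance can be obtained by intersecting the viewing ray with that plane; the geometry and all names are illustrative assumptions.

```python
import math

def distance_from_bearing(drone_xyz, azimuth, elevation, wall_x):
    """Sketch of the angle-based alternative described above: given the angle
    at which another object (here a wall plane at a known X coordinate taken
    from the spatial information) is seen from the drone, estimate the distance
    by intersecting the viewing ray with that plane."""
    # Unit direction vector of the viewing ray in the XYZ frame.
    dx = math.cos(elevation) * math.cos(azimuth)
    dy = math.cos(elevation) * math.sin(azimuth)
    dz = math.sin(elevation)
    if abs(dx) < 1e-9:
        return None   # viewing ray is parallel to the wall plane
    t = (wall_x - drone_xyz[0]) / dx   # ray parameter at the plane x = wall_x
    if t <= 0:
        return None   # the wall plane is behind the viewing direction
    return t          # direction is unit length, so t is the distance in metres

# Example: wall at x = -5 m seen horizontally behind a drone hovering at x = 2 m.
print(distance_from_bearing((2.0, 3.0, 4.0), math.pi, 0.0, -5.0))
```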
  • The present embodiment can also exert the following effects. Specifically, for example, when both the relative position with respect to the marker and the distance to another object such as a wall can be acquired, the accuracy of the position specified by the position specifying device 1 improves compared with simply specifying the position based on the marker alone. Further, for example, the relative position with respect to the floor (the floor on which the marker is installed) can be acquired from the marker, and the position of another object can be estimated from the relative position with respect to the marker and the spatial information.
  • A case where the other object is a steel tower will now be specifically described.
  • the technology related to the flight of the drone D around the steel tower is an important technology in applications such as inspection of the steel tower using the drone D.
  • For example, the position specifying device 1 can improve the accuracy of position identification based on information obtained as a result of movement such as orbiting the steel tower while measuring the distance to the tower. However, it may not be possible to obtain the relative position with respect to the marker while orbiting the tower.
  • Even in such a case, the position specifying device 1 can prevent the drone D from colliding with the steel tower based on the measured distance, and can specify the position based on the information on the distance to the tower obtained during the laps and on the relative position information with respect to the marker acquired a plurality of times during the laps.
  • The position specifying device 1 can reduce the error by, for example, processing such as averaging the results of a plurality of measurements.
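  • As a small illustration of this error-reduction idea, the sketch below simply averages repeated distance readings; the sample values are invented for the example.

```python
def averaged_distance(measurements):
    """Sketch of the error-reduction idea mentioned above: average the distance
    readings collected over several laps around the tower.  For independent
    noise, the error of the mean shrinks roughly with the square root of the
    number of samples."""
    return sum(measurements) / len(measurements)

# Example: five noisy distance readings (in metres) taken on successive laps.
print(averaged_distance([10.2, 9.8, 10.1, 9.9, 10.0]))
```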
  • Next, consider a case in which grass grows around the steel tower. In this case, when the altitude above ground level is measured using a simple distance sensor, it is not possible to specify at which height of the tower the drone D is located.
  • Even in such a case, the position specifying device 1 can specify the position (altitude above ground level) based on the relative position with respect to the marker. Furthermore, although the accuracy of the position relative to the marker deteriorates as the distance from the marker increases, by using the distance to the steel tower the position specifying device 1 can improve the accuracy of the specified position and can also prevent contact with the steel tower.
  • the imaging unit 41 is configured to include a set of a plurality of lenses and optical elements capable of capturing images of a plurality of angles of view.
  • the drone D has a plurality of imaging units and may be capable of capturing images having a plurality of angles of view.
  • Here, the effect of being able to capture images with a plurality of angles of view will be described. Specifically, for example, when images having different angles of view (degrees of zoom) are available, as described with reference to FIG. 7, the marker is more likely to be included and imaged in the image with the wide angle of view even when the attitude of the drone D is disturbed.
  • As a result, the position specifying device 1 can improve the accuracy of position identification. Further, by correcting the attitude, the marker can be captured in the image with the narrow angle of view, which improves the accuracy of the position specified by the position specifying device 1.
  • the region imaged by the imaging unit 41 may be variable. That is, for example, the lens and the set of the image sensor of the image pickup unit 41 may be movable so as to be able to take an image in various directions. As a result, it is possible to take an image for specifying the position, or to take an image of a steel tower as an inspection target in the above example. Furthermore, even when another object in the vicinity moves (for example, a power transmission line swaying by the wind), the movable imaging unit makes it possible to always include the other object within the angle of view.
  • the position specifying device 1 is assumed to be mounted on the drone D, but the present invention is not particularly limited to this. That is, for example, the position identification device 1 may be provided on the ground so that the image captured from the drone D and the distance information may be transmitted to the position identification device 1 on the ground. As a result, for example, it is not necessary to equip the drone D with hardware for executing high-load data processing, and the cost can be reduced.
  • the above-mentioned series of processes can be executed by hardware or software.
  • the functional configuration of FIG. 3 is merely an example and is not particularly limited. That is, it suffices if the steering control system as a whole is provided with a function capable of executing the above-mentioned series of processes, and what kind of functional blocks are used to realize this function is not particularly limited to the example of FIG. 3.
  • the location of the functional block is not particularly limited to FIG. 3, and may be arbitrary.
  • the functional block of the position specifying device 1 may be transferred to the operator terminal 2 or the like.
  • one functional block may be configured by a single piece of hardware, a single piece of software, or a combination thereof.
  • a program constituting the software is installed on a computer or the like from a network or a recording medium.
  • the computer may be a computer embedded in dedicated hardware.
  • the computer may be a computer capable of executing various functions by installing various programs, for example, a general-purpose smartphone or a personal computer in addition to a server.
  • a recording medium containing such a program is composed not only of a removable medium (not shown) distributed separately from the device main body in order to provide the program to the operator U, but also of a recording medium or the like provided to the operator U in a state of being preliminarily incorporated in the device main body.
  • the steps describing the program recorded on the recording medium include not only processing performed in chronological order along that order, but also processing executed in parallel or individually without necessarily being processed in chronological order.
  • the term "system" means an overall device composed of a plurality of devices, a plurality of means, and the like.
  • the information processing apparatus to which the present invention is applied can take various embodiments having the following configurations.
  • the information processing device to which the present invention is applied is an information processing device that estimates a predetermined position (for example, the position of the drone D) based on an image of a first object (for example, the marker 3) in real space captured from the predetermined position as the subject to be imaged, and on the distance from the predetermined position with respect to a second object (for example, the wall W) in the real space, and it suffices that the information processing device includes:
  • an image data acquisition means (for example, the image data acquisition unit 111 in FIG. 3) for acquiring first data of the image;
  • a relative position estimation means (for example, the relative position estimation unit 112 in FIG. 3) for estimating, based on the image of the first object included in the image corresponding to the first data, a relative position including at least one of a distance, a direction, and an angle from the predetermined position with respect to the first object;
  • a distance acquisition means (for example, the distance acquisition unit 113 in FIG. 3) for acquiring the distance from the predetermined position with respect to the second object;
  • a spatial information acquisition means (for example, the spatial information acquisition unit 114 in FIG. 3) for acquiring, as spatial information, information on the arrangement of a plurality of objects including the first object and the second object in the real space; and
  • a position specifying means (for example, the position specifying unit 115 in FIG. 3) that estimates the predetermined position based on the relative position from the predetermined position with respect to the first object, the distance from the predetermined position with respect to the second object, and the spatial information.
  • the information processing device can improve the accuracy of the specified position based not only on the relative position estimated from the image of the first object but also on the distance to the second object. Furthermore, by setting an object that a moving body or the like risks contacting as the second object, the risk of direct contact with the second object can be reduced.
  • the image data acquisition means further acquires second data of another image having an angle of view different from that of the image captured from the predetermined position.
  • the relative position estimation means can obtain the relative position between the predetermined position and the first object based on the image of the first object contained in at least one of the image corresponding to the first data and the other image corresponding to the second data.
  • an imaging region control means may be provided that performs control to change the region of the real space imaged as at least one of the image and the other image.
  • the relative position estimation means can analyze the relative position, including at least one of the distance, direction, and angle from the predetermined position with respect to the first object, based on at least one of a large structure of the first object and a structure smaller than the large structure.
  • D ... Drone, 1 ... Position identification device, 2 ... Operator terminal, 11 ... CPU, 111 ... Image data acquisition unit, 112 ... Relative position estimation unit, 113 ... Distance acquisition unit, 114 ... Spatial information acquisition unit, 115 ... Position identification unit, 116 ... Attitude control unit, 117 ... Imaging area control unit, 19 ... Communication unit, 41 ... Imaging Unit, 42 ... Distance measuring unit, 43 ... Drive unit, 44 ... Imaging unit drive unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention addresses the problem of accurately identifying a position even indoors. An image data acquisition unit 111 of a position identifying device 1, which estimates a position of a drone D on the basis of an image in which a marker in real space was captured as a subject from the position of the drone D and a distance from the position of the drone D to a wall, acquires data on the image. A relative position estimation unit 112 estimates, on the basis of the image of the marker included in the image corresponding to the data, a relative position including at least one of a distance, a direction, and an angle with respect to the marker. A distance acquisition unit 113 acquires the distance from the position of the drone D to the wall. A spatial information acquisition unit 114 acquires information about the arrangement of a plurality of objects including the marker and the wall as spatial information. A position identifying unit 115 estimates the position of the drone D on the basis of the relative position with respect to the marker, the distance to the wall, and the spatial information. Accordingly, the problem is solved.

Description

Information processing device, information processing method, and program
The present invention relates to an information processing device, an information processing method, and a program.
Conventionally, in a guidance system that guides an unmanned aerial vehicle to land at a predetermined point, a technique has been proposed in which a display unit displays an index at a predetermined position relative to a designated landing position, and the unmanned aerial vehicle is landed based on the index imaged by the imaging unit of the unmanned aerial vehicle (see, for example, Patent Document 1).
Japanese Unexamined Patent Publication No. 2018-206089
However, in the prior art including the technique described in Patent Document 1, the index is displayed on the display unit in association with the position of the unmanned aerial vehicle, and the display unit itself that performs the display control is costly.
In addition, there is a technique for performing takeoff and landing control based on position information specified using GPS (Global Positioning System). However, at positions where GPS radio waves cannot be directly received, such as indoors, the accuracy of the position information specified using GPS deteriorates.
That is, indoors where GPS cannot be used, it was possible to cope by, for example, using indices at a plurality of positions, but in that case there was a trade-off between cost and accuracy.
An object of the present invention is to accurately specify the position even indoors.
In order to achieve the above object, an information processing device of one aspect of the present invention is
an information processing device that estimates a predetermined position based on an image in which a first object in real space is captured, as the subject to be imaged, from the predetermined position, and on a distance from the predetermined position with respect to a second object in the real space, the information processing device including:
an image data acquisition means for acquiring first data of the image;
a relative position estimation means for estimating, based on the image of the first object included in the image corresponding to the first data, a relative position including at least one of a distance, a direction, and an angle from the predetermined position with respect to the first object;
a distance acquisition means for acquiring the distance from the predetermined position with respect to the second object;
a spatial information acquisition means for acquiring, as spatial information, information on the arrangement of a plurality of objects including the first object and the second object in the real space; and
a position specifying means for specifying the predetermined position based on the relative position from the predetermined position with respect to the first object, the distance from the predetermined position with respect to the second object, and the spatial information.
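The means recited above are described only functionally; as a rough illustration of how they could hand data to one another, a minimal Python sketch follows. The class name `PositionIdentifier` and every method signature are hypothetical assumptions introduced here for explanation and are not taken from the specification.

```python
# Hypothetical outline of the claimed means; all names and signatures are
# illustrative assumptions, not the actual implementation of the device.
class PositionIdentifier:
    def acquire_image(self):
        """Image data acquisition means: return raw image data from the camera."""
        raise NotImplementedError

    def estimate_relative_position(self, image):
        """Relative position estimation means: distance/azimuth/elevation to the
        first object (marker) derived from its appearance in the image."""
        raise NotImplementedError

    def acquire_distance(self):
        """Distance acquisition means: rangefinder distance to the second object (wall)."""
        raise NotImplementedError

    def acquire_spatial_info(self):
        """Spatial information acquisition means: known arrangement of marker, wall, ground."""
        raise NotImplementedError

    def specify_position(self):
        """Position specifying means: fuse the three inputs into one position estimate."""
        image = self.acquire_image()
        relative = self.estimate_relative_position(image)
        wall_distance = self.acquire_distance()
        spatial = self.acquire_spatial_info()
        return self._fuse(relative, wall_distance, spatial)

    def _fuse(self, relative, wall_distance, spatial):
        raise NotImplementedError  # see the later sketches for one possible fusion
```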
Each of an information processing method and a program of one aspect of the present invention is a method and a program corresponding to the information processing device of one aspect of the present invention.
According to the present invention, a position can be accurately specified even indoors.
FIG. 1 is a diagram showing an outline of a maneuvering control system including a drone provided with a position specifying device according to one embodiment of the present invention.
FIG. 2 is a block diagram showing an example of the hardware configuration of the position specifying device provided in the drone of FIG. 1.
FIG. 3 is a functional block diagram showing an example of the functional configuration of the position specifying device of FIG. 2.
FIG. 4 is a diagram showing an example of a situation in which various information is acquired by the drone provided with the position specifying device of FIG. 3.
FIG. 5 is a diagram showing an example, different from FIG. 4, of a situation in which various information is acquired by the drone provided with the position specifying device of FIG. 3.
FIG. 6 is a diagram showing examples of the images acquired by the position specifying device of FIG. 3 in the situations shown in FIG. 4 and FIG. 5, respectively.
FIG. 7 is a diagram showing an example of a situation in which the position specifying device of FIG. 3 controls the region imaged in the situation of FIG. 5.
FIG. 8 is a diagram showing an example of a marker that is the subject of an image acquired by the position specifying device of FIG. 3.
FIG. 9 is a flowchart illustrating an example of the flow of the position specifying process executed by the position specifying device having the functional configuration of FIG. 3.
FIG. 10 is a diagram showing an example, different from FIG. 4 and FIG. 5, of a situation in which various information is acquired by the drone provided with the position specifying device of FIG. 3.
FIG. 11 is a diagram showing an example, different from FIG. 4, FIG. 5, and FIG. 9, of a situation in which various information is acquired by the drone provided with the position specifying device of FIG. 3.
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
Hereinafter, regarding an embodiment of the information processing system according to the present invention, an information processing system including a small unmanned aerial vehicle (hereinafter referred to as a "drone") D capable of moving in three-dimensional space will be described with reference to the drawings. In the figures, the same elements are given the same reference numerals, and duplicate descriptions are omitted. However, the term "drone" in the present invention is not limited to a small unmanned aerial vehicle that can move in three-dimensional space. That is, for example, a drone that is a traveling vehicle moving in two-dimensional space is also included.
In the following description, unless otherwise specified, the directions defined as follows are used.
That is, assuming a propeller-type drone D flying alone as in Patent Document 1 described above, the axis passing through the center of gravity of the drone (excluding cargo and the like), in the direction in which the main propellers rotate to generate thrust against gravity, is called the axis DZ. Here, when the rotation speed of the main propellers is increased from a state in which the drone D is hovering at a fixed position, the drone gains altitude and rises. This direction is referred to as the "positive direction of the axis DZ", and the opposite as the "negative direction of the axis DZ". Therefore, according to the above definition, the axis DZ is fixed to the drone D regardless of the attitude of the drone D.
Further, in the following description, a three-dimensional orthogonal coordinate system consisting of the following axes X, Y, and Z is used. That is, the axis Z of the three-dimensional orthogonal coordinate system is taken in the direction of the axis DZ of the drone D placed in the attitude of normal flight. The direction opposite to the direction in which gravity acts is defined as the direction of the axis Z. Further, the axes X and Y are taken orthogonal to each other and orthogonal to the axis Z. At this time, the directions of the axes X and Y are defined so that the three-dimensional coordinate system using the axes X, Y, and Z is right-handed. That is, when the drone D moves without changing its height, the drone D flies in the X-Y plane formed by the axes X and Y.
Hereinafter, based on the above definitions, the direction in which the height increases is referred to as the "positive direction of the axis Z", and the opposite as the "negative direction of the axis Z". Likewise, the directions along the axis X are referred to as the "positive direction of the axis X" and the "negative direction of the axis X", and the directions along the axis Y are referred to as the "positive direction of the axis Y" and the "negative direction of the axis Y".
FIG. 1 is a diagram showing an outline of a maneuvering control system including a drone provided with a position specifying device according to an embodiment of the present invention.
In the maneuvering control system shown in FIG. 1, a drone D equipped with the position specifying device 1 and an operator terminal 2 operated by an operator U are connected via wireless communication. Here, the drone D and the operator terminal 2 are connected to each other via a predetermined network N, such as the Internet, not shown. The network N includes not only the Internet and mobile carrier networks but also short-range wireless communication such as NFC (registered trademark) and Bluetooth (registered trademark).
Further, as shown in FIG. 1, the drone D provided with the position specifying device 1 acquires position information by receiving GPS signals transmitted from GPS (Global Positioning System) satellites GS.
That is, normally, the drone D uses the position information of the drone D itself, acquired by receiving the GPS signals transmitted from the GPS satellites GS, for autonomous control, or transmits the position information to the operator terminal 2. Note that the drone D is not limited to GPS, and can use positioning systems based on various satellites, such as the quasi-zenith satellite system in Japan.
The operator terminal 2 is a terminal composed of a smartphone or the like and used by the operator U to operate the drone D. By operating the operator terminal 2, the operator U can pilot the drone D directly, and can also issue takeoff/landing commands, commands for preset predetermined work, and the like.
However, as described above, the accuracy of the position information specified using GPS can deteriorate indoors or elsewhere where GPS radio waves cannot be directly received. That is, for example, in indoor locations where GPS radio waves hardly reach, or even outdoors near large structures (walls) and the like, the accuracy of the position information of the drone D itself based on GPS signals can be insufficient. Furthermore, the accuracy of the position information of the drone D itself based on GPS signals has an upper limit determined by the standards of satellite positioning systems such as GPS.
By providing the configuration described below, the position specifying device 1 of the present embodiment can specify the position of the drone D itself indoors, in places near large structures (walls) and the like, and with an accuracy not limited by the GPS standard.
FIG. 2 is a block diagram showing an example of the hardware configuration of the position specifying device provided in the drone of FIG. 1.
The position specifying device 1 includes a CPU (Central Processing Unit) 11, a ROM (Read Only Memory) 12, a RAM (Random Access Memory) 13, a bus 14, an input/output interface 15, an output unit 16, an input unit 17, a storage unit 18, a communication unit 19, and a drive 20.
The CPU 11 executes various processes according to a program recorded in the ROM 12 or a program loaded from the storage unit 18 into the RAM 13.
The RAM 13 also appropriately stores data and the like necessary for the CPU 11 to execute the various processes.
The CPU 11, the ROM 12, and the RAM 13 are connected to each other via the bus 14. The input/output interface 15 is also connected to the bus 14. The output unit 16, the input unit 17, the storage unit 18, the communication unit 19, and the drive 20 are connected to the input/output interface 15.
The output unit 16 is composed of a display, a speaker, and the like, and outputs various information as images and sounds.
The input unit 17 is composed of a keyboard, a mouse, and the like, and inputs various information.
The storage unit 18 is composed of a hard disk, a DRAM (Dynamic Random Access Memory), or the like, and stores various data.
The communication unit 19 communicates with other devices (the operator terminal 2 and the like in the example of FIG. 1) via the network N, including the Internet.
A removable medium 31 made of a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 20 as appropriate. A program read from the removable medium 31 by the drive 20 is installed in the storage unit 18 as needed.
The removable medium 31 can also store the various data stored in the storage unit 18, in the same manner as the storage unit 18.
Although not shown, the operator terminal 2 of FIG. 1 has basically the same configuration as the hardware configuration shown in FIG. 2, except for the following points. Therefore, a description of the hardware configuration of the operator terminal 2 is omitted.
Hereinafter, the functional configuration of the position specifying device 1 shown in FIG. 3 will be described using an example in which the position specifying device 1 provided in the drone D shown in FIG. 4 specifies the position of the drone D itself.
As described above, the position specifying device 1 can execute a series of processes for specifying the position of the drone D itself (hereinafter referred to as the "position specifying process").
In order to realize such a position specifying process, the maneuvering control system of FIG. 1 has a functional configuration as shown in FIG. 3.
FIG. 3 is a functional block diagram showing an example of the functional configuration of the position specifying device of FIG. 2.
FIG. 4 is a diagram showing an example of a situation in which various information is acquired by the drone provided with the position specifying device of FIG. 3.
When the position specifying process is executed, the position specifying device 1, the imaging unit 41, the distance measuring unit 42, the drive unit 43, and the imaging unit drive unit 44 provided in the drone D function.
As described above, the position specifying device 1 provided in the drone D specifies the position of the drone D based on various information. That is, the position specifying device 1 in the present embodiment is an information processing device that is provided in the drone D and executes information processing related to control. Therefore, in FIG. 3, the drone D is illustrated as a configuration provided with (carrying) the position specifying device 1.
First, the imaging unit 41, the distance measuring unit 42, the drive unit 43, and the imaging unit drive unit 44 provided in the drone D, which exchange various data with the position specifying device 1 and control each part, will be described.
The imaging unit 41 includes a lens, an optical element such as a CCD or CMOS sensor, a control unit thereof, and the like, and captures images and video. Further, although details will be described later with reference to FIG. 5, the imaging unit 41 includes sets of lenses and optical elements capable of capturing images at a plurality of angles of view.
The distance measuring unit 42 includes various signal transmitting devices, various sensors, a control unit thereof, and the like, and measures the distance from the distance measuring unit 42 to a point to be measured. Specifically, for example, an optical rangefinder using light waves, such as a laser rangefinder, or a radio rangefinder using radio waves can be adopted as the distance measuring unit 42. Furthermore, a device capable of acquiring the two-dimensional or three-dimensional structure of the measurement target can also be adopted as the distance measuring unit 42. That is, for example, a radar or LiDAR (Light Detection and Ranging) can be adopted as the distance measuring unit 42.
The drive unit 43 is driven using supplied energy. By driving the drive unit 43 and the like, the drone D can move through space. A motor driven by electric energy and an engine driven by chemical energy such as gasoline are both examples of the drive unit 43.
The imaging unit drive unit 44 can drive the above-described imaging unit 41 with a gimbal or the like. That is, although details will be described later with reference to FIG. 7, the imaging unit drive unit 44 drives a gimbal having a plurality of rotation axes to rotate the orientation of each of the sets of lenses and optical elements included in the imaging unit 41, thereby changing the region imaged by the imaging unit 41.
Before describing the functional blocks that function in the position specifying device 1, the arrangement of the drone D in the example shown in FIG. 4, the imaging unit 41 and the distance measuring unit 42 provided in the drone D, the marker to be imaged, and the wall or the like whose distance is to be measured will first be described.
The example of FIG. 4 shows, from the positive direction of the axis Y, the drone D flying in a space in which the above-described three-dimensional orthogonal coordinate system is adopted, in a case where the position specifying device 1 provided in the drone D executes the position specifying process.
In the example of FIG. 4, the wall W is arranged in the negative direction of the axis X with respect to the drone D. Further, the ground G is arranged in the negative direction of the axis Z with respect to the drone D. Here, the marker 3 is installed on the ground G. Examples of specific shapes of the marker 3 will be described later with reference to FIGS. 6 and 8.
The drone D includes the imaging unit 41, which images the negative direction of the axis Z. The two dotted lines V1-a and V1-b indicate the angle of view of the imaging unit 41 provided in the drone D. That is, the imaging unit 41 images the region between the intersection of the dotted line V1-a with the ground G and the intersection of the dotted line V1-b with the ground G. In other words, in FIG. 4 the imaging unit 41 captures an image including the image of the marker 3. Although not shown, it goes without saying that the angle of view of the imaging unit 41 also extends in the direction of the axis Y.
Further, the drone D includes the distance measuring unit 42, which measures the distance in the negative direction of the axis X. Specifically, for example, the distance D1 is the distance between a point WP on the wall W and a point S1 on the distance measuring unit 42. That is, the distance measuring unit 42 provided in the drone D measures the distance D1 between the point WP and the point S1 as the distance in the negative direction of the axis X.
Hereinafter, each functional block that functions in the position specifying device 1 of FIG. 3 will be described with reference to FIG. 4.
In the CPU 11 of the position specifying device 1, an image data acquisition unit 111, a relative position estimation unit 112, a distance acquisition unit 113, a spatial information acquisition unit 114, a position specifying unit 115, an attitude control unit 116, and an imaging region control unit 117 function.
The image data acquisition unit 111 acquires data of an image in which the marker 3 in real space is captured, as the subject to be imaged, from the position of the drone D. Specifically, the image data acquisition unit 111 acquires the image data from the imaging unit 41 provided in the drone D via the communication unit 19. As described above, this image data includes an image of the marker 3 installed on the ground in the negative direction of the axis Z of the drone D.
The relative position estimation unit 112 estimates, based on the image of the marker 3 included in the image corresponding to the image data, a relative position including at least one of a distance, a direction, and an angle from the position of the drone D with respect to the marker 3. That is, the relative position estimation unit 112 estimates the relative position from the position of the drone D with respect to the marker 3 based on the image of the marker 3 included in the image corresponding to the image data acquired by the image data acquisition unit 111.
Here, a relative position is a second position grasped as a set of coordinate values in a coordinate system having its origin at a first position. By contrast, an absolute position is a position grasped as a set of coordinate values with respect to one common coordinate system. That is, for example, when a three-dimensional orthogonal coordinate system is used as the coordinate system, the relative position of a second position from a first position can be taken as the set of coordinate values of the absolute position of the second position minus the set of coordinate values of the absolute position of the first position.
However, in this specification, the relative position of the second position from the first position will be described as being determined by a distance, a direction, and an angle. That is, a three-dimensional polar coordinate system is used as the coordinate system representing the relative position. In other words, the relative position of the second position from the first position includes a distance, an azimuth angle, and an elevation/depression angle. The azimuth angle is an angle that determines the relative position in the X-Y plane (horizontal plane); that is, the azimuth angle corresponds to the bearing (direction) seen from the drone D. The elevation/depression angle is an angle in the vertical direction with respect to the horizontal.
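To make the polar convention above concrete, a relative position given as (distance, azimuth angle, elevation/depression angle) can be converted into an offset in the X-Y-Z frame. The following is a minimal sketch of that conversion under the stated convention (azimuth measured in the X-Y plane, a depression expressed as a negative elevation); the function name and the example values are assumptions for illustration.

```python
import math

def polar_to_offset(distance, azimuth_rad, elevation_rad):
    """Convert a (distance, azimuth, elevation/depression) relative position into
    an (x, y, z) offset in the three-dimensional orthogonal coordinate system.
    A depression angle (looking down) is expressed as a negative elevation."""
    horizontal = distance * math.cos(elevation_rad)
    dx = horizontal * math.cos(azimuth_rad)
    dy = horizontal * math.sin(azimuth_rad)
    dz = distance * math.sin(elevation_rad)
    return dx, dy, dz

# Example: a marker 10 m away, straight down (depression of 90 degrees),
# gives an offset of roughly (0, 0, -10).
print(polar_to_offset(10.0, 0.0, math.radians(-90)))
```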
As will be described in detail later, for the relative position represented in the three-dimensional polar coordinate system estimated by the relative position estimation unit 112, the accuracy of each of the distance, the azimuth angle, and the elevation/depression angle changes depending on how the image of the marker 3 is positioned within the captured image. Under certain conditions, only one of the distance, direction, and angle may be highly accurate. Therefore, in the following description, the state in which the relative position has been estimated refers not only to the state in which the three-dimensional relative position has been obtained with high accuracy, but also to the state in which any one of the distance, azimuth angle, and elevation/depression angle has been estimated with a predetermined accuracy.
Hereinafter, the method by which the relative position estimation unit 112 estimates the relative position from the position of the drone D with respect to the marker 3, based on the image of the marker 3 included in the image corresponding to the data acquired by the image data acquisition unit 111, will be described with reference to FIG. 4 as appropriate.
The relative position estimation unit 112 estimates the distance D2 (see FIG. 4) from the position of the drone D with respect to the marker 3 based on the size at which the marker 3 appears in the image corresponding to the data acquired by the image data acquisition unit 111. When the marker 3 is placed on the ground and the distance between the marker 3 and the drone D is sufficiently large, the distance D2 as a relative position corresponds to the so-called altitude of the drone D.
In the example of FIG. 4, the marker 3 is located directly below the drone D. The relative position estimation unit 112 estimates, as the distance D2 of the relative position, the distance between a point MP on the marker 3 and a point S2 on the imaging unit 41, based on the image corresponding to the data acquired by the image data acquisition unit 111.
Although not shown, the drone D usually includes an altimeter, including a rangefinder directed in the negative direction of the axis Z. Therefore, the relative position estimation unit 112 may use altitude information measured by the altimeter provided in the drone D as the distance D2 to the marker 3.
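One common way to estimate such a distance from the apparent size of the marker is the pinhole camera model: the distance is proportional to the ratio of the marker's physical size to its imaged size, scaled by the focal length in pixels. The specification does not prescribe this particular formula; the sketch below, with assumed numbers, is only an illustration.

```python
def distance_from_marker_size(marker_width_m, marker_width_px, focal_length_px):
    """Pinhole-model estimate of the camera-to-marker distance (e.g. D2 in FIG. 4)."""
    return focal_length_px * marker_width_m / marker_width_px

# Example (assumed numbers): a 0.5 m wide marker imaged 100 px wide by a camera
# whose focal length is 1000 px corresponds to a distance of about 5 m.
print(distance_from_marker_size(0.5, 100, 1000))  # -> 5.0
```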
Further, the relative position estimation unit 112 estimates the azimuth angle and the elevation/depression angle from the position of the drone D with respect to the marker 3 based on where the marker 3 appears in the image corresponding to the data acquired by the image data acquisition unit 111 and on the attitude of the drone D. That is, although not shown, the drone D usually includes a gyro sensor or the like for acquiring the angle of the drone D's own attitude. Further, the region of the image captured by the imaging unit 41 changes as the attitude of the drone D changes. That is, the relative position estimation unit 112 can estimate the azimuth angle and the elevation/depression angle after grasping, based on the attitude of the drone D, which region the imaging unit is imaging.
Even when the marker 3 is not imaged at a sufficient size, the relative position estimation unit 112 can estimate the azimuth angle and the elevation/depression angle with a predetermined accuracy as long as the marker can be recognized as the marker 3.
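As a rough illustration of how the pixel position of the marker and the attitude reported by a gyro sensor could be combined into an azimuth and a depression angle, a small-angle sketch follows. The function, its parameters, and the tilt-compensation step are simplifying assumptions introduced here, not the method of the specification.

```python
import math

def bearing_to_marker(px, py, cx, cy, focal_length_px, yaw_rad, tilt_rad=0.0):
    """Illustrative azimuth / depression estimate toward a marker detected at
    pixel (px, py) by a downward-looking camera.  (cx, cy) is the image centre,
    yaw_rad is the drone heading from the gyro, and tilt_rad is a body tilt
    about the image y axis that is subtracted out (small-angle assumption)."""
    angle_x = math.atan2(px - cx, focal_length_px)             # offset right of centre
    angle_y = math.atan2(py - cy, focal_length_px) - tilt_rad  # offset forward, tilt removed
    azimuth = yaw_rad + math.atan2(angle_x, angle_y)            # bearing in the X-Y plane
    depression = -(math.pi / 2) + math.hypot(angle_x, angle_y)  # straight down = -90 deg
    return azimuth, depression

# A marker at the image centre of an untilted drone is straight below it:
print(bearing_to_marker(320, 240, 320, 240, 1000, yaw_rad=0.0))  # ~ (0.0, -1.5708)
```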
The distance acquisition unit 113 acquires the distance from the position of the drone D with respect to the wall W. That is, the distance acquisition unit 113 acquires the distance D1 measured by the distance measuring unit 42.
As described above, in the example of FIG. 4, the distance measuring unit 42 provided in the drone D measures the distance D1 between the point WP and the point S1 as the distance in the negative direction of the axis X. The distance acquisition unit 113 can acquire the distance D1 from the distance measuring unit 42 provided in the drone D via the communication unit 19.
The spatial information acquisition unit 114 acquires, as spatial information, information on the arrangement of a plurality of objects including the marker 3 and the wall W in real space. That is, the spatial information acquisition unit 114 acquires the spatial information from the operator terminal 2.
Here, the spatial information is information on the arrangement of a plurality of objects including the marker 3 and the wall W in real space. Specifically, for example, the spatial information includes arrangement information for various objects, including objects the drone D may come into contact with, the ground G on which the marker 3 is placed, and objects that may become targets of distance measurement by the distance measuring unit 42.
In the example of FIG. 4, the spatial information includes arrangement information for the wall W, arrangement information for the ground G, and arrangement information for the marker 3. That is, for example, information such as "the wall W exists at such-and-such a coordinate on the axis X" is included in the spatial information. Also, for example, information such as "the ground G exists at such-and-such a coordinate (altitude) on the axis Z" is included. Also, for example, information such as "the marker 3 exists at such-and-such absolute coordinates" is included.
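The spatial information described above could be represented, for example, as a small lookup structure. The keys and values below are assumptions chosen to mirror the arrangement in FIG. 4 (wall W, ground G, marker 3); they are illustrative only.

```python
# Illustrative representation of the spatial information; keys and numbers are assumed.
spatial_info = {
    "wall":   {"plane": "x", "coordinate": -5.0},   # wall W at X = -5 m
    "ground": {"plane": "z", "coordinate": 0.0},    # ground G at altitude 0 m
    "marker": {"absolute_xyz": (0.0, 0.0, 0.0),     # marker 3 at the origin
               "size_m": 0.5},
}

# The position specifying unit would look up entries such as:
print(spatial_info["marker"]["absolute_xyz"])
print(spatial_info["wall"]["coordinate"])
```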
The position specifying unit 115 specifies the position of the drone D based on the relative position of the drone D with respect to the marker 3, the distance from the position of the drone D with respect to the wall W, and the spatial information.
That is, the position specifying unit 115 specifies the position of the drone D based on the relative position, including the distance D2, with respect to the marker 3 estimated by the relative position estimation unit 112, the distance D1 to the wall W acquired by the distance acquisition unit 113, and the spatial information acquired by the spatial information acquisition unit 114.
Specifically, for example, the spatial information includes the absolute coordinates of the marker 3. Therefore, the position specifying unit 115 can specify the absolute coordinates of the position of the drone D based on the relative position estimated by the relative position estimation unit 112 and the spatial information. The position specifying unit 115 can improve the accuracy of the specified position based on the distance D1 to the wall W acquired by the distance acquisition unit 113 and the spatial information including the position of the wall W.
That is, for example, as described above, the accuracy of the relative coordinates changes depending on how the image of the marker 3 is positioned within the captured image. For example, in the example of FIG. 4, when the distance D2 is large and the accuracy of the azimuth angle and the elevation/depression angle is poor, the coordinates of the drone D in the X-Y plane in absolute coordinates have a large error. Even in such a case, the position specifying unit 115 can improve the accuracy of the specified position based on the distance D1 to the wall W and the spatial information including the position of the wall W. Furthermore, when the coordinates of the drone D in the X-Y plane in absolute coordinates have a large error, the drone D may come into contact with the wall W located in the negative direction of the axis X. However, the drone D of the present embodiment can directly measure the distance D1 to the wall W. This makes it possible to prevent the autonomously controlled drone D from coming into contact with the wall W.
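One possible way to carry out such a refinement is sketched below: the marker-relative estimate fixes all three coordinates, and the directly measured wall distance D1, combined with the known wall plane from the spatial information, then replaces the X coordinate, which is the least reliable one in this geometry. The function and the example numbers are assumptions for illustration, not the prescribed method.

```python
def specify_position(marker_xyz, marker_offset_xyz, wall_x, wall_distance):
    """Fuse a marker-relative estimate with a wall-distance measurement.

    marker_xyz        : absolute coordinates of the marker (from the spatial information)
    marker_offset_xyz : offset of the drone from the marker, estimated from the image
    wall_x            : absolute X coordinate of the wall plane (spatial information)
    wall_distance     : distance D1 to the wall measured by the rangefinder
    """
    # Absolute position from the image-based relative position alone.
    x = marker_xyz[0] + marker_offset_xyz[0]
    y = marker_xyz[1] + marker_offset_xyz[1]
    z = marker_xyz[2] + marker_offset_xyz[2]
    # The wall lies in the negative-X direction, so the rangefinder pins X down
    # more tightly than the image does; replace the X estimate with it.
    x_from_wall = wall_x + wall_distance
    return (x_from_wall, y, z)

# Example with assumed values: marker at the origin, drone estimated 10 m above it
# with an uncertain 0.8 m X offset, wall plane at X = -5 m, measured distance 4.5 m.
print(specify_position((0, 0, 0), (0.8, 0.0, 10.0), -5.0, 4.5))  # -> (-0.5, 0.0, 10.0)
```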
The attitude control unit 116 controls the attitude and position of the drone D based on the position of the drone D specified by the position specifying unit 115 and the distance D1 to the wall W. That is, the attitude control unit 116 can control the drive unit 43 via the communication unit 19.
Thereby, as described above, the distance D1 to the wall W can be controlled so as not to become shorter than a predetermined distance. In other words, the drone D can be prevented from coming into contact with the wall W.
Also, for example, as will be described later, when the marker 3 protrudes beyond the region imaged by the imaging unit 41, the attitude can be controlled so that the entire marker 3 is imaged. This enables the relative position estimation unit 112 to estimate the relative position with respect to the marker 3. An example of the case where the marker 3 protrudes beyond the region imaged by the imaging unit 41 will be described later with reference to FIGS. 5 and 6.
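A minimal sketch of such a guard is shown below, assuming a simple velocity command along the axis X and a flag requesting attitude correction; the thresholds and the control interface are hypothetical and only illustrate the behaviour described above.

```python
def velocity_command(wall_distance, min_distance, marker_fully_visible,
                     retreat_speed=0.5):
    """Illustrative guard used by an attitude/position controller.

    Returns a commanded velocity along the X axis (positive = away from the wall W)
    and a flag asking for an attitude correction when the marker is cut off."""
    vx = 0.0
    if wall_distance < min_distance:
        # Too close to the wall W: move in the positive-X direction.
        vx = retreat_speed
    request_level_attitude = not marker_fully_visible
    return vx, request_level_attitude

# Example: 0.8 m from the wall with a 1.0 m safety margin, marker partly out of frame.
print(velocity_command(0.8, 1.0, marker_fully_visible=False))  # -> (0.5, True)
```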
The imaging region control unit 117 performs control to change the region imaged by the imaging unit 41. That is, for example, the orientation of the imaging unit 41 can be changed by the gimbal provided with the imaging unit drive unit 44. By controlling the imaging unit drive unit 44 via the communication unit 19, the imaging region control unit 117 can perform control to change the imaged region. An example in which the imaging region control unit 117 changes the imaging region will be described later with reference to FIG. 7.
The functional configuration of the position specifying device 1 shown in FIG. 3 has been described above using the example in which the position specifying device 1 provided in the drone D shown in FIG. 4 specifies the position of the drone D itself.
Hereinafter, an example in which the marker 3 protrudes beyond the region imaged by the imaging unit 41 will be described with reference to FIGS. 5 and 6.
FIG. 5 is a diagram showing an example, different from FIG. 4, of a situation in which various information is acquired by the drone provided with the position specifying device of FIG. 3.
FIG. 6 is a diagram showing examples of the images acquired by the position specifying device of FIG. 3 in the situations shown in FIG. 4 and FIG. 5, respectively.
As in FIG. 4, the example of FIG. 5 shows, from the positive direction of the axis Y, the drone D flying in a space in which the above-described three-dimensional orthogonal coordinate system is adopted, in a case where the position specifying device 1 provided in the drone D executes the position specifying process.
In the example of FIG. 5, the drone D is in a situation where the axis DZ is not parallel to the axis Z. That is, the drone D is tilted as a result of being buffeted by wind or moving horizontally in the direction of the axis X.
Here, the drone D in the example of FIG. 5 includes two imaging units 41-1 and 41-2. The two dotted lines V2-a and V2-b indicate the angle of view of the imaging unit 41-1 provided in the drone D. The first imaging unit 41-1 images basically the same angle of view as the imaging unit 41 in FIG. 4. The two dotted lines V3-a and V3-b indicate the angle of view of the imaging unit 41-2 provided in the drone D. That is, the imaging unit 41-2 captures an image with a wider angle of view than that of the imaging unit 41-1.
Here, examples of the image captured by the imaging unit 41 of FIG. 4 and the image captured by the imaging unit 41-1 of FIG. 5 will be described.
The figures shown in situation A and situation B of FIG. 6 are examples of the images captured from the drone D flying in the situations of FIG. 4 and FIG. 5, respectively, viewed from the positive direction of the axis Z.
The figure shown in situation A of FIG. 6 is an example of the image captured by the imaging unit 41 of FIG. 4. That is, in the situation of FIG. 4, the image data acquisition unit of FIG. 3 acquires the image shown in situation A of FIG. 6.
The figure shown in situation B of FIG. 6 is an example of the image captured by the imaging unit 41-1 of FIG. 5. That is, in the situation of FIG. 5, the image data acquisition unit of FIG. 3 acquires the image shown in situation B of FIG. 6.
That is, the image data acquisition unit 111 can further acquire data of another image with an angle of view different from that of the image captured from the position of the drone D.
Further, the relative position estimation unit 112 can obtain the relative position between the position of the drone D and the marker 3 based on the image of the marker 3 included in at least one of the images corresponding to the respective image data.
In the situation of FIG. 4, the drone D is not tilted, and the marker 3 is included in the angle of view indicated by the dotted lines V1-a and V1-b. Therefore, the example of the image captured by the imaging unit 41, shown in situation A of FIG. 6, includes the ground G and the entire marker 3 placed on the ground G.
In the situation of FIG. 5, the drone D is tilted, and the marker 3 protrudes beyond the angle of view indicated by the dotted lines V2-a and V2-b. Therefore, the example of the image captured by the imaging unit 41-1, shown in situation B of FIG. 6, includes the ground G and only part of the marker 3 placed on the ground G.
In this way, when the body of the drone D tilts, the marker 3 may protrude beyond the image captured by the imaging unit 41 or the imaging unit 41-1. The image captured by the imaging unit 41-2 is used in such a case.
As described above, the imaging unit 41-2 provided in the drone D captures an image with the angle of view indicated by the two dotted lines V3-a and V3-b. Looking at FIG. 5, the entire marker 3 is included in the angle of view indicated by the two dotted lines V3-a and V3-b. That is, the imaging unit 41-2 provided in the drone D captures an image including the entire marker 3. By estimating the relative position based on the data of the image captured by the imaging unit 41-2, the relative position estimation unit 112 can estimate the relative position even in cases where it cannot be estimated based on the data of the image captured by the imaging unit 41-1.
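The selection between the two imaging units can be illustrated with a simple fallback rule, sketched below under the assumption that each camera's detector reports either a detection result or nothing; the function and data shapes are hypothetical.

```python
def choose_image_for_estimation(narrow_detection, wide_detection):
    """Pick which camera's detection to use for relative-position estimation.

    Each argument is None when the marker was not fully captured, otherwise a
    detection result from that camera.  The narrow-angle camera (41-1) is
    preferred because it resolves the marker more finely; the wide-angle
    camera (41-2) is the fallback when the marker leaves the narrow view."""
    if narrow_detection is not None:
        return "narrow", narrow_detection
    if wide_detection is not None:
        return "wide", wide_detection
    return "none", None

# Example: the tilted drone of FIG. 5 loses the marker in the narrow view only.
print(choose_image_for_estimation(None, {"marker_px": (410, 260)}))  # -> ('wide', ...)
```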
Furthermore, providing the drone D with a plurality of imaging units having different angles of view enables the following use, as shown in the sketch above.
As described above, when the attitude of the drone D changes rapidly due to a gust of wind, for example, correction of the attitude of the drone D may not be completed in time. In such a case, by providing the imaging unit 41-2 with a wide angle of view, an image including the entire marker 3 can be captured even when the attitude changes. Thereby, even when the attitude changes, the position specifying device 1 can specify the position, and the drone D can fly safely.
Next, an example in which the imaging unit drive unit 44 drives the imaging unit 41-1 with a gimbal or the like will be described with reference to FIG. 7.
FIG. 7 is a diagram showing an example of a situation in which the position specifying device of FIG. 3 controls the region imaged in the situation of FIG. 5.
As in FIGS. 4 and 5, the example of FIG. 7 shows, from the positive direction of the axis Y, the drone D flying in a space in which the above-described three-dimensional orthogonal coordinate system is adopted, in a case where the position specifying device 1 provided in the drone D executes the position specifying process.
The example of FIG. 7 shows the drone D of FIG. 5 after the imaging unit driving unit 44 has driven the imaging unit 41-1. The two dotted lines V2-a and V2-b again indicate the angle of view of the imaging unit 41-1, but in FIG. 7 they appear at positions different from those in FIG. 5. In other words, FIG. 7 shows that the direction imaged by the imaging unit 41-1 has changed because the imaging unit driving unit 44 has driven it.
The imaging area control unit 117 can control changes to the real-space region captured as an image. That is, by driving the gimbal or the like of the imaging unit 41-1 via the imaging unit driving unit 44, the imaging area control unit 117 can change the region imaged by the imaging unit 41-1. Although it depends on the pixel count of the captured images, the relative position is usually estimated more accurately from an image captured by the narrow-angle imaging unit 41-1, that is, from an image of the marker 3 captured by a camera with a narrow angle of view. Having the imaging area control unit 117 drive the gimbal or the like of the imaging unit 41-1 via the imaging unit driving unit 44 therefore improves the accuracy of relative position estimation.
However, when landing, that is, when the drone D moves along the axis Z, it may pass through a region in which the narrow-angle imaging unit 41-1 cannot capture the entire marker 3. In this case, the relative position estimation unit 112 can estimate the relative position based on the image captured by the wide-angle imaging unit 41-2.
In this way, providing the imaging units 41-1, 41-2, and the like, which capture images with a plurality of angles of view, widens the area over which the relative position can be estimated while the drone D flies.
Furthermore, having the imaging area control unit 117 control the regions imaged by the imaging units 41-1, 41-2, and the like widens that area still further, regardless of the attitude and position of the drone D.
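The following is a minimal sketch, in Python, of how the choice between the narrow-angle and wide-angle images might be made: the narrow-angle image is preferred when the marker is fully visible in it, and the wide-angle image is used as a fallback. The function detect_marker, the MarkerDetection fields, and the visibility check are illustrative assumptions and are not part of the embodiment.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class MarkerDetection:
        corners_px: list         # marker corner coordinates in the image (pixels)
        fully_visible: bool      # True if no corner touches the image border
        pixels_per_meter: float  # apparent scale of the marker in this image

    def detect_marker(image) -> Optional[MarkerDetection]:
        """Placeholder for an actual marker detector (e.g. a 2D-barcode decoder)."""
        raise NotImplementedError

    def choose_detection(narrow_img, wide_img) -> Optional[MarkerDetection]:
        """Prefer the narrow-angle image, which resolves the marker with more pixels
        and therefore usually yields a more precise relative position; fall back to
        the wide-angle image when the marker is cut off or not found at all."""
        narrow = detect_marker(narrow_img)
        if narrow is not None and narrow.fully_visible:
            return narrow
        wide = detect_marker(wide_img)
        if wide is not None:
            return wide
        return narrow  # a partial narrow-angle detection is still better than nothing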
Here, the structure of the marker 3 will be described using the example of the captured image of the marker 3 shown in situation A of FIG. 6 and the example of another marker 3 shown in FIG. 8.
FIG. 8 is a diagram showing an example of a marker serving as the imaging target of an image acquired by the position identification device of FIG. 3.
First, the example of the captured image of the marker 3 shown in situation A of FIG. 6 will be described.
The marker 3 in situation A of FIG. 6 has a two-dimensional-barcode-like pattern, composed of white and black squares, inside a square frame with a black border.
The pattern of this marker 3 has the following features.
The pattern of the marker 3 shown in situation A of FIG. 6 is not rotationally symmetric about an axis parallel to the axis Z. As a result, the relative position estimation unit 112 can estimate the azimuth angle with respect to the marker 3 from the image in which the marker 3 is captured.
Although not illustrated, the marker 3 can have patterns such as the following.
By having various patterns, the marker 3 can carry meanings such as high-precision control and instructions. Specifically, for example, the marker 3 can have a pattern that encodes information identifying which of a plurality of markers 3 it is (hereinafter referred to as the "marker ID"). This allows the position identification unit 115 to identify the position based on the arrangement of the plurality of markers 3, the marker ID of the marker 3 being imaged, and the relative position to the marker 3 being imaged.
The marker 3 can also have a pattern that compensates, using a predetermined algorithm, for missing information so that information such as the azimuth angle and the marker ID can be obtained even when the entire marker 3 cannot be imaged.
The marker 3 may also have a pattern that encodes the GPS coordinates at which the marker is installed. This makes it possible to verify that the marker is installed at the correct position. Alternatively, the position identification device 1 may acquire pairs of marker IDs and the GPS coordinates at which the markers 3 are installed, and look up the GPS coordinates from the marker ID of the marker 3 being imaged.
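As an illustration of the marker-ID-to-coordinate check described above, the following Python sketch looks up the registered installation coordinates for a marker ID and compares them with the coordinates decoded from the marker pattern. The registry contents, tolerance, and function name are assumptions made only for this example.

    # Hypothetical registry pairing marker IDs with the GPS coordinates
    # (latitude, longitude in degrees) at which each marker is installed.
    MARKER_REGISTRY = {
        17: (35.6895, 139.6917),
        18: (35.6890, 139.6920),
    }

    def verify_marker(marker_id: int, decoded_lat: float, decoded_lon: float,
                      tol_deg: float = 1e-4) -> bool:
        """Return True when the coordinates decoded from the marker pattern agree
        with the registered installation position of that marker ID."""
        registered = MARKER_REGISTRY.get(marker_id)
        if registered is None:
            return False  # unknown marker: do not trust it
        lat, lon = registered
        return abs(lat - decoded_lat) <= tol_deg and abs(lon - decoded_lon) <= tol_deg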
Next, the example of another marker 3 shown in FIG. 8 will be described.
The marker 3 in FIG. 8 consists of a marker 3-a, which has the same structure as the marker 3 in FIG. 6, and a marker 3-b, a square with a black border that is larger than the marker 3-a.
The pattern of the marker 3 in FIG. 8 has the following features.
By including the marker 3-b, a black-bordered square larger than the marker 3-a, the marker 3 becomes more tolerant of cases in which it extends beyond the image captured by the imaging unit 41. For example, even if a gust disturbs the attitude of the drone D and part of the marker 3 falls outside the image captured by the imaging unit 41, the large black-bordered square marker 3-b is still likely to be included in that image. The imaging area control unit 117 can then control the region imaged by the imaging unit 41 so that the marker 3-a is captured.
Also, for example, a drone D flying at high altitude may not be able to make out the detailed pattern of the relatively small marker 3-a from the captured image. In such a case, the drone D can approach the marker 3 using the large black-bordered square marker 3-b as a target, and once the detailed pattern of the marker 3-a can be resolved, acquire the marker ID and the like from the marker 3-a to confirm that this is the correct landing point.
In this way, the relative position estimation unit 112 can analyze the relative position, including at least one of the distance, direction, and angle from the position of the drone D with respect to the marker 3, based on at least one of the large structure of the marker 3 and the structure smaller than that large structure.
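The two-level marker of FIG. 8 suggests a coarse-to-fine procedure: steer toward the large outer square while it is the only structure that can be resolved, and decode the small inner marker once it is large enough in the image. The Python sketch below only illustrates that idea; the detector callables and the pixel-size threshold are hypothetical.

    def approach_and_identify(capture_image, detect_outer_square, decode_inner_marker,
                              min_inner_size_px: int = 64):
        """Coarse-to-fine use of a two-level marker such as the one in FIG. 8.

        capture_image       -- callable returning the current camera frame
        detect_outer_square -- callable returning (x, y, w, h) of marker 3-b, or None
        decode_inner_marker -- callable returning (marker_id, pose) for marker 3-a, or None
        """
        frame = capture_image()
        outer = detect_outer_square(frame)
        if outer is None:
            return {"state": "search"}  # neither level visible: keep searching
        x, y, w, h = outer
        if min(w, h) < min_inner_size_px:
            # Too far away to read the inner pattern: aim at the big square.
            return {"state": "approach", "target_bbox": outer}
        inner = decode_inner_marker(frame)
        if inner is None:
            return {"state": "approach", "target_bbox": outer}
        marker_id, pose = inner
        return {"state": "confirmed", "marker_id": marker_id, "pose": pose}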
FIG. 9 is a flowchart illustrating an example of the flow of the position identification process executed by the position identification device having the functional configuration of FIG. 3.
When the position identification device 1 provided in the drone D executes the position identification process, the process is started and steps S11 to S15 below are executed.
In step S11, the image data acquisition unit 111 acquires the data of an image captured from the position of the drone D with the marker 3 in the real space as the imaging target.
In step S12, the relative position estimation unit 112 estimates the relative position, including at least one of the distance, direction, and angle from the position of the drone D with respect to the marker 3, based on the image of the marker 3 included in the image corresponding to the data acquired in step S11.
In step S13, the distance acquisition unit 113 acquires the distance from the position of the drone D to the wall W.
In step S14, the spatial information acquisition unit 114 acquires, as spatial information, information on the arrangement of a plurality of objects in the real space including the marker 3 and the wall W.
In step S15, the position identification unit 115 identifies the position of the drone D based on the relative position of the drone D with respect to the marker 3 estimated in step S12, the distance from the position of the drone D to the wall W acquired in step S13, and the spatial information acquired in step S14.
The position identification process then ends.
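Read as a procedure, steps S11 to S15 could be outlined as in the following Python sketch. It assumes the individual units are available as functions or objects; none of the names below come from the embodiment itself.

    def position_identification_process(camera, rangefinder, space_map,
                                        estimate_relative_position, fuse_position):
        """One pass of the position identification process of FIG. 9 (steps S11-S15)."""
        # S11: acquire the image in which the marker 3 is the imaging target.
        image = camera.capture()

        # S12: estimate the relative position (distance, direction, angle) to the marker.
        relative_to_marker = estimate_relative_position(image)

        # S13: acquire the distance from the drone's position to the wall W.
        distance_to_wall = rangefinder.measure()

        # S14: acquire the spatial information (arrangement of the marker, the wall, etc.).
        spatial_info = space_map.load()

        # S15: identify the position of the drone from the three pieces of information.
        return fuse_position(relative_to_marker, distance_to_wall, spatial_info)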
An embodiment of a maneuvering control system including a drone equipped with the position identification device to which the present invention is applied has been described above. However, embodiments to which the present invention is applied may also be, for example, as follows.
For example, the marker 3 was described in the above embodiment as a two-dimensional-barcode-like marker, but it is not limited to this. Examples of markers 3 taking different forms are described below with reference to FIGS. 10 and 11.
For example, the position identification device 1 can also be used in the example shown in FIG. 10.
FIG. 10 is a diagram showing an example, different from FIGS. 4 and 5, of a situation in which various kinds of information are acquired by a drone equipped with the position identification device of FIG. 3.
As in the preceding figures, the example of FIG. 10 shows the drone D, equipped with the position identification device 1 executing the position identification process, flying in the space that uses the three-dimensional orthogonal coordinate system described above, viewed from the positive direction of the axis Y. As described in detail later, FIG. 10 is an example in which a string-shaped marker 3-R is adopted as the marker 3.
In the example of FIG. 10, the wall W1 is arranged in the positive direction of the axis X with respect to the drone D, and the wall W2 in the negative direction of the axis X. The ceiling C is arranged in the positive direction of the axis Z, and the ground G in the negative direction of the axis Z. The string-shaped marker 3-R hangs from the ceiling C.
That is, the space in the example of FIG. 10 is a space closed by the ceiling C, the walls W1 and W2, and the ground G.
As in FIG. 4, the drone D in FIG. 10 is drawn with the position identification device 1, the imaging unit 41, and the distance measuring unit 42. Here, unlike the example of FIG. 4, the imaging unit 41 captures not a marker 3 arranged on the ground but the string-shaped marker 3-R hanging from the ceiling C.
Even in this case, the relative position estimation unit 112 can estimate the relative position based on the image of the marker 3-R captured by the imaging unit 41. In this way, not only a marker 3 installed on the floor but also a string-shaped marker 3-R hung from the ceiling C can be adopted as the marker 3.
The marker 3-R may be a rope of a conspicuous color or a pole erected from the ground G. Such a marker 3-R is easy to install, yet effective even in a cylindrical structure such as a chimney or in a building with high symmetry. Furthermore, although not illustrated, a plurality of markers 3-R may be prepared, with strings of different colors hung on each of the walls W1 and W2 and on the walls in the positive and negative directions of the axis Y (not shown), so that the walls can be distinguished. By using the markers 3-R to identify which wall it is facing, the drone D can identify its position.
Also, for example, the position identification device 1 can be used in the example shown in FIG. 11.
FIG. 11 is a diagram showing an example, different from FIGS. 4, 5, and 10, of a situation in which various kinds of information are acquired by a drone equipped with the position identification device of FIG. 3.
As before, the example of FIG. 11 shows the drone D, equipped with the position identification device 1 executing the position identification process, flying in the space that uses the three-dimensional orthogonal coordinate system described above, viewed from the positive direction of the axis Y. As described in detail later, FIG. 11 is also an example in which the string-shaped marker 3-R is adopted as the marker 3.
In the example of FIG. 11, the wall W3-a is arranged in the positive direction of the axis X with respect to the drone D, the wall W3-b in the negative direction of the axis X, and the ground G in the negative direction of the axis Z. The string-shaped marker 3-R hangs from the ceiling C. In FIG. 11, the walls W3-a and W3-b are drawn as connected by an arc; that is, the space in the example of FIG. 11 is a chimney-shaped space.
In the example of FIG. 11, the string-shaped marker 3-R hangs along the wall W3-a.
Here, the imaging unit 41 of the drone D captures the string-shaped marker 3-R.
At this time, the image captured by the imaging unit 41 contains only the region corresponding to the angle θ shown in FIG. 11. Moreover, when the chimney is long, the angle θ approaches 90 degrees, so that from the drone D the marker 3-R appears to extend almost directly overhead. That is, the image captured by the imaging unit 41 contains the marker 3-R as an image that is large in proportion to the length of the string-shaped marker 3-R.
In this way, in the example of FIG. 11, the marker 3-R is captured as a relatively large image. Assuming the imaging unit 41 is imaging in the positive direction of the axis Z, the drone D perceives the marker as a straight line indicating a radius across the circular cross section of the chimney. The relative position estimation unit 112 can therefore use the marker 3-R to identify the relative position in the rotational direction within a chimney that is cylindrically symmetric. In the example of FIG. 11, the marker 3-R may have a rod-like shape protruding from the wall surface, or may be a light such as an LED.
In this way, not only a flat marker with the two-dimensional-barcode-like pattern shown in FIGS. 6 and 8 but also the string-shaped or rod-shaped marker shown in FIGS. 10 and 11 can be adopted as the marker 3.
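A simplified illustration of using the string marker to fix the rotational position in a cylindrically symmetric chimney: with an upward-facing camera, the in-image direction from the principal point toward the string gives the drone's yaw relative to the wall on which the string hangs. The pinhole-camera assumption and the function below are illustrative only.

    import math

    def yaw_from_string_marker(marker_px: float, marker_py: float,
                               cx: float, cy: float) -> float:
        """Yaw of the drone relative to the wall carrying the string marker 3-R.

        marker_px, marker_py -- pixel coordinates of a point on the string marker
        cx, cy               -- principal point of the upward-facing camera
        Returns the in-image angle (radians) of the marker direction; for a camera
        aligned with the body frame this corresponds to the body yaw relative to
        the direction of the wall along which the string hangs."""
        return math.atan2(marker_py - cy, marker_px - cx)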
As described above, the marker 3 is not limited to a square; it may be a marker such as a rectangular parallelepiped or a straight line (which can also be regarded as an elongated rectangle). The marker 3 may also include two or more lights such as LEDs, or be drawn with luminous paint or the like. This makes the marker 3 easy to image even in the dark, so the position identification device 1 can identify the position.
A rectangular frame is also preferable when flying to high altitude, because it can improve the accuracy of relative position estimation by image recognition.
Also, for example, in the description of FIG. 4 and the like, the marker 3 is arranged on the ground G and the distance measuring unit 42 measures the distance to the wall W, but the arrangement is not limited to this. Specifically, for example, the marker 3 may be installed on the wall W and the distance measuring unit 42 may measure the distance to the ground G. Even in this case, the position identification device 1 can identify the position.
The drone D has been described as a small unmanned aircraft that can move in three-dimensional space, but it is not limited to this. For example, the drone D may be a vehicle or the like that moves on the ground.
For example, when the position identification device 1 is mounted on a vehicle moving on the ground, the distance between the imaging unit 41 and a marker 3 installed on the floor is short when the vehicle recognizes it, so a two-level marker such as the example of FIG. 8 is suitable. Furthermore, in this case, if the marker 3 is drawn with luminous paint, it glows when the vehicle passes over it, which makes it easier to image and makes the relative position easier to estimate by image recognition.
Also, for example, in the description of the above embodiment, the drone D is provided with the imaging unit 41, which consists of a lens, an optical element such as a CCD or CMOS sensor, and their control unit, and which captures images or video. However, the configuration is not limited to this.
For example, the drone D may have a depth sensor. Specifically, the drone D may have any of various depth sensors capable of acquiring three-dimensional depth information, such as a depth camera, radar, or LiDAR. In this case, the position identification device 1 acquires the measurement result of the depth sensor as depth information and can identify the position based on that depth information instead of the image data acquired by the imaging unit 41.
When a depth sensor is used, the marker preferably has a shape with predetermined unevenness. That is, by recognizing a marker with predetermined unevenness contained in the depth information, the position identification device 1 can recognize the position in the same way as with the marker 3 of the above embodiment. The shapes shown in FIGS. 6 and 8 can also be used for the marker when a depth sensor is used.
Furthermore, the drone D may have both the imaging unit 41 and a depth sensor. In this case, the position identification device 1 can identify the position based on both the image data captured by the imaging unit 41 and the depth information acquired via the depth sensor. With such a configuration, markers intended for the imaging unit 41 and markers intended for the depth sensor can also be used selectively depending on where they are installed.
Furthermore, a marker can be adopted that has both features for the imaging unit 41, such as color or fluorescence, and features for the depth sensor, such as predetermined unevenness. Such a marker can be used by drones D that have either the imaging unit 41 or a depth sensor.
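For the depth-sensor variant, one conceivable way to recognize a marker with predetermined unevenness is to correlate a small depth template of the marker relief against the measured depth map; the Python sketch below is purely illustrative and is not how the embodiment specifies the recognition.

    import numpy as np

    def find_relief_marker(depth_map: np.ndarray, template: np.ndarray,
                           max_rms_error: float = 0.01):
        """Slide a depth template of the marker relief (in meters) over the depth map
        and return the top-left pixel of the best match, or None if nothing fits.

        The local mean is removed from both patches so that only the shape of the
        unevenness matters, not the absolute distance to the sensor."""
        th, tw = template.shape
        zero_mean_tpl = template - template.mean()
        best_pos, best_err = None, max_rms_error
        h, w = depth_map.shape
        for y in range(h - th + 1):
            for x in range(w - tw + 1):
                patch = depth_map[y:y + th, x:x + tw]
                err = np.sqrt(np.mean((patch - patch.mean() - zero_mean_tpl) ** 2))
                if err < best_err:
                    best_pos, best_err = (x, y), err
        return best_pos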
Also, for example, in the description of the above embodiment, the distance acquisition unit 113 acquires the distance measured by the distance measuring unit 42, but it is not limited to this. It suffices that the distance from the drone D to an object other than the marker or the like in the real space can be acquired.
Specifically, for example, the distance acquisition unit 113 can acquire the distance based on the position of the drone D and the angle to the other object. Suppose that at a certain time t1 the position identification device 1 has been able to identify the position of the drone D by some method, including GPS. Then, at a time t2 after t1, the distance acquisition unit 113 analyzes the data of an image captured by an imaging unit such as a camera to obtain the angles (for example, azimuth and elevation/depression angles) at which the other object lies as seen from the drone D. The distance acquisition unit 113 then estimates the distance between the drone D and the other object based on those angles and the spatial information. In this way, the distance acquisition unit 113 can acquire the distance between the drone D and the other object. Needless to say, the angles may be corrected based on the attitude information of the drone D (the tilt and heading of the aircraft) before being used.
Furthermore, in the above case, the distance acquisition unit 113 can also obtain the variation of the position of the other object (the amount by which its position has changed) by analyzing the image data, and can estimate and acquire the distance between the drone D and the other object from that variation.
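As a sketch of the angle-based distance acquisition described above: if the drone's position is known from time t1 and the spatial information gives the wall as a plane, the bearing observed in the image defines a ray whose intersection with that plane yields the distance. The simplified 2-D geometry and the names below are assumptions made for illustration.

    import math

    def distance_to_wall_from_bearing(drone_x: float, wall_x: float,
                                      azimuth_rad: float) -> float:
        """Estimate the horizontal distance from the drone to a vertical wall.

        drone_x     -- x coordinate of the drone at the last time it was known (e.g. t1)
        wall_x      -- the wall plane x = wall_x, taken from the spatial information
        azimuth_rad -- bearing of the observed wall point, measured from the +x axis
                       and already corrected for the drone's attitude
        """
        dx = wall_x - drone_x
        cos_az = math.cos(azimuth_rad)
        if dx * cos_az <= 0:
            raise ValueError("the observed bearing does not point toward the wall")
        # Length of the ray from the drone to its intersection with the plane x = wall_x.
        return dx / cos_az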
Also, for example, the present embodiment can produce the following effects.
Specifically, when both the relative position to the marker and the distance to another object such as a wall can be acquired, the position identified by the position identification device 1 is more accurate than one identified from the marker alone.
Also, for example, when the relative position to a marker on the floor (the floor on which the marker is installed) can be acquired, the position of another object can be estimated from that relative position and the spatial information.
A concrete case in which the other object is a steel tower is described below. Technology for flying the drone D around a steel tower is important for applications such as tower inspection using the drone D.
First, with a ranging sensor that can measure the distance to only one point, it is difficult to keep measuring the distance between the drone D and the tower at all times. Also, when the structure repeats similar elements, accurate distance measurement can be difficult even with a depth sensor capable of three-dimensional ranging. Even in such cases, the position identification device 1 can improve the accuracy of position identification based on information obtained from movement, for example by circling the tower while measuring the distance to it. However, the relative position to the marker may not be obtainable while circling the tower. The position identification device 1 therefore prevents a collision between the tower and the drone D based on the measured distance, while identifying the position from the distance information accumulated over the circuit and from the relative positions to the marker acquired several times during that circuit. As a result, the position identification device 1 can reduce the error, for example by averaging the results of multiple measurements.
Furthermore, grass may grow thickly around the tower. In that case, if the altitude above ground is measured with a simple distance sensor, it cannot be determined at which height of the tower the drone D is located. However, by arranging a marker on the ground near the tower as described above, the position identification device 1 can identify the position (altitude above ground) based on the relative position to the marker. Moreover, the accuracy of the relative position to the marker degrades as the distance from the marker increases, but by also using the distance to the tower, the position identification device 1 can improve the accuracy of position identification and prevent contact with the tower.
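The error compression by averaging mentioned above could, in its simplest form, look like the following; the observation format is an assumption for illustration.

    def average_position(observations):
        """Average several (x, y, z) position fixes obtained from marker observations
        made at different points of the circuit around the tower; averaging N
        independent fixes reduces the random error roughly by a factor of sqrt(N)."""
        observations = list(observations)
        if not observations:
            raise ValueError("no observations to average")
        n = len(observations)
        return tuple(sum(obs[i] for obs in observations) / n for i in range(3))

    # Example: three noisy fixes of the same hover point.
    fixes = [(10.2, 4.9, 30.1), (9.8, 5.1, 29.8), (10.0, 5.0, 30.2)]
    print(average_position(fixes))  # approximately (10.0, 5.0, 30.03)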
Also, for example, in the description of the above embodiment, the imaging unit 41 is configured to include a plurality of lens-and-sensor pairs so that it can capture images with a plurality of angles of view, but the configuration is not limited to this. Needless to say, the drone D may instead have a plurality of imaging units and thereby capture images with a plurality of angles of view.
The effects of being able to capture images with a plurality of angles of view are described below.
Specifically, when images with different angles of view (degrees of zoom) are available, the wide-angle image is more likely to contain the marker even when the attitude of the drone D is disturbed, as described with reference to FIG. 7. This allows the position identification device 1 to improve the accuracy of position identification. Furthermore, by correcting the attitude, the marker can be captured in the narrow-angle image, which further improves the accuracy of the position identified by the position identification device 1.
Also, for example, the region imaged by the imaging unit 41 may be variable. For example, the lens-and-sensor pairs of the imaging unit 41 may be movable so that various directions can be imaged. This allows the imaging unit to be used both to capture images for identifying the position and to image, for example, the steel tower to be inspected in the example above. Moreover, even when a nearby object moves (for example, a power transmission line swaying in the wind), a movable imaging unit makes it possible to keep that object within the angle of view at all times.
Also, for example, in the description of the above embodiment, the position identification device 1 is mounted on the drone D, but it is not limited to this. For example, the position identification device 1 may be provided on the ground, and the images and distance information captured by the drone D may be transmitted to the position identification device 1 on the ground. This eliminates the need to mount hardware for heavy data processing on the drone D, which can reduce costs, among other benefits.
Also, for example, the series of processes described above can be executed by hardware or by software.
In other words, the functional configuration of FIG. 3 is merely an example and is not particularly limiting.
That is, it suffices that the maneuvering control system has a function capable of executing the series of processes described above as a whole; which functional blocks are used to realize this function is not limited to the example of FIG. 3. The locations of the functional blocks are also not limited to FIG. 3 and may be arbitrary. For example, the functional blocks of the position identification device 1 may be transferred to the operator terminal 2 or the like.
Each functional block may be configured by hardware alone, by software alone, or by a combination of them.
Also, for example, when the series of processes is executed by software, a program constituting the software is installed on a computer or the like from a network or a recording medium.
The computer may be a computer embedded in dedicated hardware.
The computer may also be a computer capable of executing various functions by installing various programs, for example a server, or a general-purpose smartphone or personal computer.
Also, for example, a recording medium containing such a program is constituted not only by removable media (not shown) distributed separately from the device body in order to provide the program to the operator U, but also by a recording medium or the like provided to the operator U in a state pre-installed in the device body.
In this specification, the steps describing the program recorded on the recording medium include not only processes performed chronologically in the stated order but also processes that are executed in parallel or individually and not necessarily chronologically.
In this specification, the term "system" means an overall apparatus composed of a plurality of devices, a plurality of means, and the like.
In other words, the information processing device to which the present invention is applied can take various embodiments having the following configurations.
That is, it suffices that the information processing device to which the present invention is applied (for example, the position identification device 1 of FIG. 3) is
an information processing device that estimates a predetermined position based on an image of a first object (for example, the marker 3) in real space captured from the predetermined position (for example, the position of the drone D) as the imaging target, and on the distance from the predetermined position to a second object (for example, the wall W) in the real space, the device comprising:
image data acquisition means (for example, the image data acquisition unit 111 of FIG. 3) for acquiring first data of the image;
relative position estimation means (for example, the relative position estimation unit 112 of FIG. 3) for estimating, based on the image of the first object included in the image corresponding to the first data, a relative position including at least one of a distance, a direction, and an angle from the predetermined position with respect to the first object;
distance acquisition means (for example, the distance acquisition unit 113 of FIG. 3) for acquiring the distance from the predetermined position to the second object;
spatial information acquisition means (for example, the spatial information acquisition unit 114 of FIG. 3) for acquiring, as spatial information, information on the arrangement of a plurality of objects in the real space including the first object and the second object; and
position identification means (for example, the position identification unit 115 of FIG. 3) for estimating the predetermined position based on the relative position from the predetermined position with respect to the first object, the distance from the predetermined position to the second object, and the spatial information.
As a result, the information processing device can improve the accuracy of the identified position by relying not only on the relative position estimated from the image of the first object but also on the distance to the second object. Moreover, by taking as the second object an object that a moving body or the like is at risk of hitting, the risk of direct contact with the second object can be reduced.
Furthermore, the image data acquisition means may further acquire second data of another image captured from the predetermined position with an angle of view different from that of the image, and
the relative position estimation means may acquire the relative position between the predetermined position and the first object based on the image of the first object included in at least one of the image and the other image, which correspond to the first data and the second data, respectively.
Furthermore, the device may further include imaging area control means for controlling changes to the region of the real space captured as at least one of the image and the other image.
Furthermore, the relative position estimation means may analyze the relative position, including at least one of the distance, direction, and angle from the predetermined position with respect to the first object, based on at least one of a large structure of the first object and a structure smaller than that large structure.
D ... drone, 1 ... position identification device, 2 ... operator terminal, 11 ... CPU, 111 ... image data acquisition unit, 112 ... relative position estimation unit, 113 ... distance acquisition unit, 114 ... spatial information acquisition unit, 115 ... position identification unit, 116 ... attitude control unit, 117 ... imaging area control unit, 19 ... communication unit, 41 ... imaging unit, 42 ... distance measuring unit, 43 ... drive unit, 44 ... imaging unit driving unit

Claims (6)

1. An information processing device that estimates a predetermined position based on an image of a first object in real space captured from the predetermined position as the imaging target, and on the distance from the predetermined position to a second object in the real space, the device comprising:
image data acquisition means for acquiring first data of the image;
relative position estimation means for estimating, based on the image of the first object included in the image corresponding to the first data, a relative position including at least one of a distance, a direction, and an angle from the predetermined position with respect to the first object;
distance acquisition means for acquiring the distance from the predetermined position to the second object;
spatial information acquisition means for acquiring, as spatial information, information on the arrangement of a plurality of objects in the real space including the first object and the second object; and
position identification means for identifying the predetermined position based on the relative position from the predetermined position with respect to the first object, the distance from the predetermined position to the second object, and the spatial information.
2. The information processing device according to claim 1, wherein
the image data acquisition means further acquires second data of another image captured from the predetermined position with an angle of view different from that of the image, and
the relative position estimation means acquires the relative position between the predetermined position and the first object based on the image of the first object included in at least one of the image and the other image, which correspond to the first data and the second data, respectively.
3. The information processing device according to claim 2, further comprising
imaging area control means for controlling changes to the region of the real space captured as at least one of the image and the other image.
4. The information processing device according to any one of claims 1 to 3, wherein
the relative position estimation means analyzes the relative position, including at least one of the distance, direction, and angle from the predetermined position with respect to the first object, based on at least one of a large structure of the first object and a structure smaller than that large structure.
5. An information processing method executed by an information processing device that estimates a predetermined position based on an image of a first object in real space captured from the predetermined position as the imaging target, and on the distance from the predetermined position to a second object in the real space, the method comprising:
an image data acquisition step of acquiring first data of the image;
a relative position estimation step of estimating, based on the image of the first object included in the image corresponding to the first data, a relative position including at least one of a distance, a direction, and an angle from the predetermined position with respect to the first object;
a distance acquisition step of acquiring the distance from the predetermined position to the second object;
a spatial information acquisition step of acquiring, as spatial information, information on the arrangement of a plurality of objects in the real space including the first object and the second object; and
a position estimation step of estimating the predetermined position based on the relative position from the predetermined position with respect to the first object, the distance from the predetermined position to the second object, and the spatial information.
6. A program that causes a computer that estimates a predetermined position based on an image of a first object in real space captured from the predetermined position as the imaging target, and on the distance from the predetermined position to a second object in the real space, to execute control processing comprising:
an image data acquisition step of acquiring first data of the image;
a relative position estimation step of estimating, based on the image of the first object included in the image corresponding to the first data, a relative position including at least one of a distance, a direction, and an angle from the predetermined position with respect to the first object;
a distance acquisition step of acquiring the distance from the predetermined position to the second object;
a spatial information acquisition step of acquiring, as spatial information, information on the arrangement of a plurality of objects in the real space including the first object and the second object; and
a position estimation step of estimating the predetermined position based on the relative position from the predetermined position with respect to the first object, the distance from the predetermined position to the second object, and the spatial information.
PCT/JP2021/005506 2020-02-20 2021-02-15 Information processing device, information processing method, and program WO2021166845A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-027161 2020-02-20
JP2020027161A JP2021131762A (en) 2020-02-20 2020-02-20 Information processing apparatus, information processing method, and program

Publications (1)

Publication Number Publication Date
WO2021166845A1 true WO2021166845A1 (en) 2021-08-26

Family

ID=77391165

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/005506 WO2021166845A1 (en) 2020-02-20 2021-02-15 Information processing device, information processing method, and program

Country Status (2)

Country Link
JP (1) JP2021131762A (en)
WO (1) WO2021166845A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023037458A1 (en) * 2021-09-08 2023-03-16 日本電信電話株式会社 Inspection system, inspection method, and flight device

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7360983B2 (en) 2020-03-31 2023-10-13 関西電力株式会社 Data acquisition device and method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019117584A (en) * 2017-12-27 2019-07-18 株式会社ダイヘン Mobile body
WO2019139172A1 (en) * 2018-01-15 2019-07-18 本郷飛行機株式会社 Information processing system

Also Published As

Publication number Publication date
JP2021131762A (en) 2021-09-09

Similar Documents

Publication Publication Date Title
US20210065400A1 (en) Selective processing of sensor data
US10599149B2 (en) Salient feature based vehicle positioning
EP3540464B1 (en) Ranging method based on laser radar system, device and readable storage medium
US10198634B2 (en) Systems and methods for detecting and tracking movable objects
US20190346562A1 (en) Systems and methods for radar control on unmanned movable platforms
JP6029446B2 (en) Autonomous flying robot
US9073637B2 (en) Flying vehicle guiding system and flying vehicle guiding method
US10409293B1 (en) Gimbal stabilized components for remotely operated aerial vehicles
CN104298248A (en) Accurate visual positioning and orienting method for rotor wing unmanned aerial vehicle
CN105182992A (en) Unmanned aerial vehicle control method and device
CN106291535A (en) A kind of obstacle detector, robot and obstacle avoidance system
WO2021166845A1 (en) Information processing device, information processing method, and program
CN111192318B (en) Method and device for determining position and flight direction of unmanned aerial vehicle and unmanned aerial vehicle
JP6138326B1 (en) MOBILE BODY, MOBILE BODY CONTROL METHOD, PROGRAM FOR CONTROLLING MOBILE BODY, CONTROL SYSTEM, AND INFORMATION PROCESSING DEVICE
US11490005B2 (en) Overhead line image capturing system and overhead line image capturing method
JP2014149621A (en) Autonomous flying robot
CN112335190A (en) Radio link coverage map and impairment system and method
GB2571711A (en) Drone control system
US20210208606A1 (en) Information processing system, information processing method, and program
CN110162081A (en) Mobile device control method and device, mobile terminal and mobile device
US20240124137A1 (en) Obstacle avoidance for aircraft from shadow analysis
JP2020071580A (en) Information processing apparatus, flight control method and flight control system
WO2020244467A1 (en) Method and device for motion state estimation
CN110515086A (en) A kind of naval target search simulation system and method applied to unmanned boat
CN110892353A (en) Control method, control device and control terminal of unmanned aerial vehicle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21757465

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21757465

Country of ref document: EP

Kind code of ref document: A1