WO2021049508A1 - Dimension display system, and dimension display method - Google Patents

Dimension display system, and dimension display method

Info

Publication number
WO2021049508A1
Authority
WO
WIPO (PCT)
Prior art keywords
dimension
dimension display
display system
image
arithmetic processing
Prior art date
Application number
PCT/JP2020/034039
Other languages
French (fr)
Japanese (ja)
Inventor
幸佑 野平
Original Assignee
株式会社Liberaware
Application filed by 株式会社Liberaware filed Critical 株式会社Liberaware
Priority to KR1020227011101A priority Critical patent/KR20220058596A/en
Publication of WO2021049508A1 publication Critical patent/WO2021049508A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01D MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D7/00 Indicating measured values
    • G01D7/002 Indicating measured values giving both analog and numerical indication
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00 Special procedures for taking photographs; Apparatus therefor
    • G03B15/006 Apparatus mounted on flying objects
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • This disclosure relates to a dimension display system and a dimension display method.
  • Conventionally, as disclosed for example in Patent Document 1, in places that are difficult for workers to inspect, a flying object flies through and photographs the inside of a structure to perform the inspection on the workers' behalf.
  • However, some structures have complicated internal layouts, and the operator must fly while identifying places the flying object can pass through, judging from nothing more than the image obtained from the image pickup device on site.
  • The present disclosure was made against this background, and its object is to provide a dimension display system and a dimension display method for captured images that make it possible to identify, on site, places through which a flying object can pass.
  • The main invention of the present disclosure for solving the above problems is a dimension display system comprising: an arithmetic processing unit that processes an image acquired from an image pickup device of a flying object; and a dimension storage unit that stores a dimension value, in at least either the width direction or the height direction, at the position of an object located in front of the aircraft body of the flying object, the value being indicated by a dimension display superimposed on the image.
  • The arithmetic processing unit further comprises a dimension display position calculation unit that calculates a position at which to display the dimension display, based on at least the front distance from the aircraft to the object acquired by a sensor of the flying object, the angle of view of the image pickup device, and the dimension value.
  • Based on the calculated position, the arithmetic processing unit superimposes the dimension display on the image acquired from the image pickup device to generate the processed image.
  • FIG. 1 is a diagram showing the configuration of the system according to the present embodiment. FIG. 2 is a block diagram showing the hardware configuration of the user terminals of FIG. 1. FIG. 3 is a block diagram showing the hardware configuration of the flying object of FIG. 1. FIG. 4 is a block diagram showing the hardware configuration related to the dimension display of the flying object of FIG. 1. FIG. 5 is an example of a bird's-eye view of the surroundings of the flying object 3 according to the present embodiment. FIGS. 6 to 8 are diagrams showing display examples of the system according to the present embodiment in the situation of FIG. 5. FIG. 9 is an example of a side view of the surroundings of the flying object 3 according to the present embodiment. FIGS. 10 and 11 are diagrams showing display examples of the system according to the present embodiment in the situation of FIG. 9.
  • The contents of the embodiments of the present disclosure are listed and described below. The flying object according to an embodiment of the present disclosure has the following configuration.
  • [Item 1] A dimension display system comprising: an arithmetic processing unit that processes the image acquired from the image pickup device of a flying object; and a dimension storage unit that stores in advance a dimension value, in either the width direction or the height direction, at the position of an object located in front of the aircraft body, the value being indicated by a dimension display superimposed on the image and shown on a user terminal.
  • The arithmetic processing unit further comprises a dimension display position calculation unit that calculates a position at which to display the dimension display, based on at least the front distance from the aircraft to the object acquired by a sensor of the flying object, the angle of view of the image pickup device, and the dimension value.
  • Based on the calculated position, the arithmetic processing unit superimposes the dimension display on the image acquired from the image pickup device to generate the processed image.
  • [Item 2] The dimension display system according to Item 1, wherein the arithmetic processing unit superimposes and displays the front distance on the image acquired from the image pickup device.
  • [Item 3] The dimension display system according to Item 1 or 2, wherein the arithmetic processing unit superimposes and displays at least one of information about the flying object and environmental information on the image acquired from the image pickup device.
  • [Item 4] A flying object comprising the dimension display system according to any one of Items 1 to 3.
  • [Item 5] A user terminal comprising the dimension display system according to any one of Items 1 to 3.
  • [Item 6] A server comprising the dimension display system according to any one of Items 1 to 3.
  • [Item 7] A dimension display method comprising: a step of processing an image acquired from an image pickup device of a flying object; a step of displaying the processed image; a step of storing in advance a dimension value, in either the width direction or the height direction, at the position of an object located in front of the aircraft body, the value being indicated by a dimension display superimposed on the image; a step of calculating a position at which to display the dimension display, based on at least the front distance from the aircraft to the object acquired by a sensor of the flying object, the angle of view of the image pickup device, and the dimension value; and a step of superimposing, based on the calculated position, the dimension display on the image acquired from the image pickup device to generate the processed image.
  • FIG. 1 shows the configuration of this system, a display system for displaying images captured by a flying object according to an embodiment of the present disclosure.
  • This system has user terminals 1 and 2 and a flying object 3.
  • The user terminals 1 and 2 and the flying object 3 are communicably connected to each other via a network NW.
  • The illustrated configuration is an example, and the number and types of user terminals and flying objects are arbitrary.
  • FIG. 2 is a block diagram showing a hardware configuration of the user terminals 1 and 2 of FIG.
  • the user terminals 1 and 2 shown in FIG. 1 include an arithmetic processing unit 10, a memory 11, a storage 12, a transmission / reception unit 13, an input / output unit 14, and the like, and these are electrically connected to each other through a bus 15.
  • the arithmetic processing unit 10 is an arithmetic unit that controls the operation of the entire user terminals 1 and 2, controls the transmission and reception of data between each element, and performs information processing and the like necessary for application execution and authentication processing.
  • The arithmetic processing unit 10 is a CPU (Central Processing Unit), and executes each information process by running programs for the system that are stored in the storage 12 and loaded into the memory 11.
  • The memory 11 includes a main memory composed of a volatile storage device such as a DRAM (Dynamic Random Access Memory), and an auxiliary memory composed of a non-volatile storage device such as a flash memory or an HDD (Hard Disk Drive).
  • the memory 11 is used as a work area or the like of the arithmetic processing unit 10, and also stores a BIOS (Basic Input / Output System) executed when the user terminals 1 and 2 are started, various setting information, and the like.
  • the storage 12 stores various programs such as application programs.
  • a database storing data used for each process may be built in the storage 12.
  • the transmission / reception unit 13 connects the user terminals 1 and 2 to the network NW.
  • The transmission / reception unit 13 may be provided with a short-range communication interface such as Bluetooth (registered trademark) or BLE (Bluetooth Low Energy).
  • The input / output unit 14 includes input devices such as a keyboard and a mouse, and output devices such as a display unit 141 (for example, a display).
  • The display unit in the present embodiment is not limited to the display unit 141 provided in the user terminals 1 and 2; it may be the display unit of an external display device connected, directly or indirectly, to the user terminals 1 and 2.
  • the bus 15 is commonly connected to each of the above elements and transmits, for example, an address signal, a data signal, and various control signals.
  • FIG. 1 illustrates general-purpose computers such as workstations and personal computers, and mobile terminals such as smartphones and tablet PCs, as the user terminals 1 and 2, but the present invention is not limited to these; for example, a device that can be operated from a first-person viewpoint, such as a head-mounted display (HMD), may be used, in which case the effect of the display system of the present embodiment is further enhanced.
  • FIG. 3 is a block diagram showing a hardware configuration of the flying object 3.
  • the flight controller 31 can have one or more arithmetic processing units such as a programmable processor (for example, a central processing unit (CPU)).
  • the flight controller 31 has a memory 311 and can access the memory.
  • Memory 311 stores logic, code, and / or program instructions that the flight controller can execute to perform one or more steps.
  • The flight controller 31 may include sensors 312 such as an inertial sensor (for example, an acceleration sensor and a gyro sensor), a GPS sensor, and a proximity sensor (for example, LiDAR (Light Detection and Ranging)).
  • The memory 311 may include, for example, a removable medium such as an SD card, random access memory (RAM), or an external storage device.
  • Data acquired from the image pickup device 32 and the sensors 33 may be transmitted directly to and stored in the memory 311.
  • Still image and moving image data taken by the imaging device or the like are recorded in internal memory or external memory.
  • the flight controller 31 includes a control module (not shown) configured to control the state of the flying object.
  • The control module adjusts the spatial placement, velocity, and/or acceleration of the flying object with six degrees of freedom (translational motion x, y, and z, and rotational motion θx, θy, and θz).
  • The control module controls the propulsion mechanism (motor 35, etc.) of the flying object via an ESC 34 (Electric Speed Controller).
  • The propeller 36 is rotated by the motor 35, powered by the battery 38, to generate lift for the flying object.
  • the control module can control one or more of the states of the mounting unit and the sensors.
  • The flight controller 31 can communicate with a receiving unit 37 configured to receive data from one or more external devices (for example, a transmitter (propo) 39, a terminal, a display device, or another remote controller).
  • the transmitter 39 can use any suitable communication means such as wired communication or wireless communication.
  • The receiving unit 37 can use, for example, one or more of a local area network (LAN), a wide area network (WAN), infrared, wireless, WiFi, a point-to-point (P2P) network, a telecommunications network, and cloud communication.
  • the receiving unit 37 can receive one or more of predetermined control data, a user command from a terminal or a remote control, and the like.
  • The sensors 33 may include an inertial sensor (acceleration sensor, gyro sensor), a GPS sensor, a proximity sensor (for example, LiDAR (Light Detection and Ranging)), or a vision / image sensor (for example, the image pickup device 32).
  • The present embodiment particularly includes a sensor (for example, a laser sensor) for measuring the front distance from the body of the flying object 3 to the object in front of it.
  • FIG. 4 is a block diagram showing a hardware configuration related to dimension display according to the present embodiment.
  • The dimension display system 4 is provided in the flying object 3 separately from the configuration shown in FIG. 3, for example, and has a storage unit 40, an arithmetic processing unit 42, a transmission unit 44, and a bus 46; the image pickup device 32 and the sensors 33 are electrically connected to one another via the bus 46.
  • The dimension display system 4 is not limited to a form provided in the flying object 3; for example, it may be provided separately in the user terminals 1 and 2, or the processing for dimension display may be performed by a configuration corresponding to the arithmetic processing unit and the storage unit of the user terminals 1 and 2, which receives data from the image pickup device 32 and the sensors 33 of the flying object 3. Further, the storage unit 40 and the arithmetic processing unit 42 may be provided in different devices (for example, the flying object 3, the user terminals 1 and 2, and a server).
  • the storage unit 40 includes a front distance storage unit 401 and a dimension storage unit 402.
  • The front distance storage unit 401 stores the front distance, acquired from the sensors 33, from the body of the flying object to the object in front of it.
  • The dimension storage unit 402 stores at least the dimension value in the width direction or the height direction at the position of the object indicated by the dimension display superimposed on the image acquired from the image pickup device 32.
  • In the present embodiment, the front distance is stored in the front distance storage unit 401, but the present invention is not limited to this; the sensors 33 may instead be configured to transmit the front distance directly to the arithmetic processing unit 42.
  • The arithmetic processing unit 42 includes a dimension display position calculation unit 421.
  • The dimension display position calculation unit 421 calculates the display position of the dimension display indicating the dimension value in the width direction or the height direction at the position of the front object, based on at least the front distance, the angle of view of the image pickup device, and the dimension value (all described later).
  • the calculation method by the dimension display position calculation unit 421 will be described later.
  • the arithmetic processing unit 42 superimposes the dimension display on the image acquired from the image pickup apparatus 32 based on the position calculated by the dimension display position calculation unit 421.
  • The transmission unit 44 can use any suitable communication means such as wired or wireless communication, for example, a local area network (LAN), a wide area network (WAN), infrared, wireless, WiFi, a point-to-point (P2P) network, a telecommunications network, cloud communication, and the like.
  • FIG. 5 exemplifies a bird's-eye view of the vicinity of the flying object 3 according to the present embodiment. It should be noted that this bird's-eye view is used in the present specification for explaining the present embodiment, and may not be displayed on the system according to the present embodiment.
  • The flight situation may be, for example, under GPS coverage or outdoors, and the flight purpose may be surveying, aerial photography, or the like; the system may be applied to any situation or purpose in which a flying object is generally used.
  • It is assumed that the flying object 3 finds an obstacle while flying along the aisle, first flies horizontally to the left, and then flies forward and passes through a narrow space.
  • 6 to 8 are display examples by the display unit 141 (for example, a display or the like) of the input / output unit 14 of the display system according to the present embodiment.
  • In FIG. 6, it can be confirmed that a columnar obstacle exists in the passage.
  • Two vertical dotted lines are displayed on the display indicating the widthwise dimension (for example, 1.0 m) at the position of the obstacle in front.
  • the width between the two vertical dotted lines indicates the width direction dimension at the position of the obstacle in front.
  • In addition, the front distance at the center of the screen (for example, 2.8 m) and the remaining battery level (for example, 7.0 V) are displayed.
  • Either or both of these indications may be omitted; conversely, other information about the flying object 3 (for example, battery consumption, flight altitude, flight time, etc.) or environmental information (for example, carbon monoxide concentration, oxygen concentration, etc.) may be further displayed.
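As a sketch of how such optional overlays might be assembled before being drawn onto the camera image, the following hypothetical helper collects the indications described above (front distance, battery voltage, and optional aircraft or environmental readings) into a list of text labels. The function name, keys, and units are illustrative assumptions, not from the patent.

```python
def compose_overlay(front_distance_m=None, battery_v=None, extra=None):
    """Build a list of text labels to superimpose on the camera image.

    Any indication may be omitted (None); `extra` is a dict of additional
    aircraft or environmental readings, e.g. {"CO (ppm)": 3}.
    """
    labels = []
    if front_distance_m is not None:
        # Front distance at the center of the screen, e.g. "front: 2.8 m"
        labels.append(f"front: {front_distance_m:.1f} m")
    if battery_v is not None:
        # Remaining battery level, e.g. "battery: 7.0 V"
        labels.append(f"battery: {battery_v:.1f} V")
    for name, value in (extra or {}).items():
        labels.append(f"{name}: {value}")
    return labels

labels = compose_overlay(front_distance_m=2.8, battery_v=7.0)
```

Each label would then be rendered at a fixed screen position by whatever drawing facility the display unit uses; keeping the composition separate from the drawing makes it easy to omit or extend indications, as the embodiment allows.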
  • FIG. 7 is a display example when flying forward after FIG.
  • In FIG. 7, the front distance at the center of the screen is updated (for example, 2.5 m), and the width-direction dimension (for example, 1.0 m) at the position of the front obstacle is shown. Since the flying object 3 is closer to the obstacle than in FIG. 6, the separation between the two vertical lines is wider than in FIG. 6.
  • FIG. 8 is a display example when a horizontal flight is made to the left after FIG. 7.
  • In FIG. 8, the space between the obstacle on the left side and the obstacle in front is shown; since the front distance at the center of the screen is not updated (for example, 2.5 m), the width-direction dimension at the position of the front obstacle (for example, 1.0 m) is shown with the same width as in FIG. 7.
  • In this way, in addition to the front distance, the width-direction dimension display at the position of the narrow space serves as an aid, so that places through which the flying object 3 can pass can easily be identified on site.
  • As a specific calculation method of the dimension display position calculation unit 421 of the arithmetic processing unit 42, an example of calculating the positions of the two vertical dotted lines indicating a dimension value of 1.0 m will be described below.
  • First, the front distance a from the flying object 3 is acquired by a front distance sensor, such as the laser sensor included in the sensors 33, and stored in the front distance storage unit 401.
  • Next, the length b at the front object position corresponding to half of the display image width of the display unit 141 is found as b = a × tan θ, where θ is half the angle of view of the image pickup device.
  • Then, the length b' at the front object position per pixel is found by dividing b by half the number of pixels of the display image width.
  • The required number of pixels m is calculated by dividing half the dimension value (0.5 m) by the obtained length b', and vertical dotted lines are superimposed on the image acquired from the image pickup device 32 of the flying object 3 at positions m pixels away from the center line of the display unit 141.
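The calculation steps above can be sketched in Python. This is a minimal illustration under stated assumptions, not the patent's implementation: the function name is ours, θ is taken to be half of the camera's horizontal angle of view, and the 40° half angle and 1280 px image width are illustrative values; only the front distances (2.8 m, 2.5 m) and the 1.0 m dimension value come from the figures.

```python
import math

def dotted_line_positions(front_distance_m, half_fov_deg, image_width_px, dimension_m=1.0):
    """Compute the x-coordinates (in pixels) of the two vertical dotted
    lines marking a `dimension_m`-wide span at the front-object position."""
    # Length b at the object position covered by half the image width: b = a * tan(theta)
    b = front_distance_m * math.tan(math.radians(half_fov_deg))
    # Length b' at the object position per pixel (half the width maps to half the pixels)
    b_per_px = b / (image_width_px / 2)
    # Number of pixels m for half the dimension value, on each side of the center line
    m = (dimension_m / 2) / b_per_px
    center = image_width_px / 2
    return center - m, center + m

# Values of FIG. 6: a = 2.8 m; assumed 40 deg half angle of view, 1280 px image width
left, right = dotted_line_positions(2.8, 40.0, 1280)
# Moving closer to the obstacle (a = 2.5 m) widens the separation, as in FIG. 7
left2, right2 = dotted_line_positions(2.5, 40.0, 1280)
assert (right2 - left2) > (right - left)
```

With these assumed values, the two dotted lines straddle the center line symmetrically, and the separation widens as the front distance decreases, matching the behavior described for FIG. 7.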
  • This calculation method is only an example; the method is not limited to this, as long as the above-mentioned vertical dotted line positions can be calculated.
  • For example, a table for the above calculation may be prepared in advance, holding the number of pixels corresponding to each range of front distances (for example, m pixels when the front distance is 2.1 to 2.5 m, m' pixels when it is 2.6 to 3.0 m, and so on).
  • a known correction method may be used when calculating the vertical dotted line position.
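The table-based alternative just mentioned can be sketched as follows. The distance ranges come from the example in the text, but the pixel counts are illustrative placeholders, not values from the patent.

```python
# Hypothetical lookup table: (min distance m, max distance m) -> pixels from the
# center line for a 1.0 m dimension value. Pixel counts are illustrative only;
# closer distances map to more pixels, since the same 1.0 m appears larger.
PIXEL_TABLE = [
    (2.1, 2.5, 153),  # "m" pixels when the front distance is 2.1 to 2.5 m
    (2.6, 3.0, 128),  # "m'" pixels when the front distance is 2.6 to 3.0 m
]

def pixels_for_distance(front_distance_m):
    """Return the half-separation in pixels for the measured front distance,
    or None if the distance falls outside the table's ranges."""
    for lo, hi, px in PIXEL_TABLE:
        if lo <= front_distance_m <= hi:
            return px
    return None
```

Such a table trades a per-frame trigonometric calculation for a simple range lookup, at the cost of quantizing the dotted-line positions to one value per distance band.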
  • In the present embodiment, the dimension value in the width direction is displayed, but the present invention is not limited to this; for example, the dimension display position may be calculated based on the pixels in the height direction, and a height-direction dimension display may be superimposed instead.
  • FIG. 9 illustrates a side view of the periphery of the flying object 3 according to the present embodiment in a situation different from that of FIG. It should be noted that this side view is used in the present specification for explaining the present embodiment, and may not be displayed on the system according to the present embodiment.
  • The flight situation may be, for example, under GPS coverage or outdoors, and the flight purpose may be surveying, aerial photography, or the like; the system may be applied to any situation or purpose in which a flying object is generally used.
  • It is assumed that the flying object 3 reaches a dead end while flying along the aisle, climbs along the passage, and then flies forward and passes through a narrow space.
  • 10 to 11 are other display examples by the display unit 141 (for example, a display or the like) of the input / output unit 14 of the display system according to the present embodiment.
  • In FIG. 10, the passage rises along the wall surface.
  • Two vertical dotted lines are displayed on the display indicating the widthwise dimension (for example, 1.0 m) at the position of the front wall.
  • the distance to the front wall is 2.0 m.
  • In FIG. 11, a narrow space appears while the flying object climbs the passage along the wall surface.
  • Two vertical dotted lines indicating the width-direction dimension (for example, 1.0 m) at the position of the front wall are displayed, and the width of the narrow space can be visually compared with the dimension display; it can therefore be determined that the flying object can pass through the narrow space by moving horizontally to the right.
  • Here too, in addition to the front distance, the width-direction dimension display at the position of the narrow space serves as an aid, so that places through which the flying object 3 can pass can easily be identified on site.
  • In the present embodiment, the dimension in the width direction or the height direction at the position of the front object is calculated by acquiring the front distance with the sensors 33 of the flying object 3, but the present invention is not limited to this.
  • For example, it is also possible to calculate and display the dimension in the width direction or the height direction at the position of the object closest to the aircraft in the captured image.

Abstract

[Problem] The objective of the present invention is to provide a display system for captured images with which it is possible to identify, on-site, a place through which an aircraft can pass. [Solution] The display system according to the present disclosure is a dimension display system provided with an arithmetic processing unit which subjects a video acquired from an image capturing device of an aircraft to arithmetic processing, and a dimension storing unit which stores, in advance, at least a dimensional value in either the width direction or the height direction of the aircraft at the position of an object positioned in front of the fuselage, as indicated by a dimension display which is displayed on a user terminal, superimposed on the video, wherein the arithmetic processing unit is additionally provided with a dimension display position calculating unit which calculates the position in which the dimension display is to be displayed, on the basis, at least, of a front surface distance from the fuselage to the object, and the angle of view of the image capturing device, acquired by means of a sensor of the aircraft, and the dimensional value, and the arithmetic processing unit generates the video, subjected to arithmetic processing, by superimposing the dimension display on the video acquired from the image capturing device, on the basis of the calculated position.

Description

Dimension display system and dimension display method
 The present disclosure relates to a dimension display system and a dimension display method.
 Conventionally, as disclosed for example in Patent Document 1, in places that are difficult for workers to inspect, a flying object flies through and photographs the inside of a structure to perform the inspection on the workers' behalf.
Japanese Unexamined Patent Publication No. 2016-15628
 However, depending on the structure, the internal layout is complicated, and the operator must fly while identifying places the flying object can pass through; on site, the operator has to judge whether the flying object can pass while checking only the image obtained from the image pickup device, and it was sometimes difficult to respond adequately. One possible countermeasure is to confirm narrow spaces and obstacles in advance using building drawings and the like, but the burden of such pre-flight confirmation is heavy, and the internal structure may have changed from what the drawings show, so this countermeasure was not sufficient. The same problem can arise not only inside structures but also outdoors, in any situation where the operator must judge whether the flying object can pass while checking only the image obtained from the image pickup device on site.
 The present disclosure was made against this background, and its object is to provide a dimension display system and a dimension display method for captured images that make it possible to identify, on site, places through which a flying object can pass.
 The main invention of the present disclosure for solving the above problems is a dimension display system comprising: an arithmetic processing unit that processes an image acquired from an image pickup device of a flying object; and a dimension storage unit that stores a dimension value, in at least either the width direction or the height direction, at the position of an object located in front of the aircraft body of the flying object, the value being indicated by a dimension display superimposed on the image. The arithmetic processing unit further comprises a dimension display position calculation unit that calculates a position at which to display the dimension display, based on at least the front distance from the aircraft to the object acquired by a sensor of the flying object, the angle of view of the image pickup device, and the dimension value. Based on the calculated position, the arithmetic processing unit superimposes the dimension display on the image acquired from the image pickup device to generate the processed image.
 According to the present disclosure, it is possible to provide a dimension display system and a dimension display method for captured images that make it possible to identify, on site, places through which a flying object can pass.
FIG. 1 is a diagram showing the configuration of the system according to the present embodiment. FIG. 2 is a block diagram showing the hardware configuration of the user terminals of FIG. 1. FIG. 3 is a block diagram showing the hardware configuration of the flying object of FIG. 1. FIG. 4 is a block diagram showing the hardware configuration related to the dimension display of the flying object of FIG. 1. FIG. 5 is an example of a bird's-eye view of the surroundings of the flying object 3 according to the present embodiment. FIGS. 6 to 8 are diagrams showing display examples of the system according to the present embodiment in the situation of FIG. 5. FIG. 9 is an example of a side view of the surroundings of the flying object 3 according to the present embodiment. FIGS. 10 and 11 are diagrams showing display examples of the system according to the present embodiment in the situation of FIG. 9.
 The contents of the embodiments of the present disclosure are listed and described below. The flying object according to an embodiment of the present disclosure has the following configuration.
[項目1]
 寸法表示システムであって、
 前記寸法表示システムは、
 飛行体の撮像装置から取得した映像を演算処理する演算処理部と、
 前記映像に重畳してユーザ端末に表示される寸法表示が示す、少なくとも前記飛行体の機体正面に位置する物体の位置における幅方向または高さ方向のいずれかの寸法値を予め記憶する寸法記憶部と、を備え、
 前記演算処理部は、少なくとも前記飛行体のセンサにより取得した、前記機体から前記物体までの正面距離及び前記撮像装置の画角、前記寸法値に基づき、前記寸法表示を表示する位置を算出する寸法表示位置算出部をさらに備え、
 前記演算処理部は、前記算出された位置を基に、前記撮像装置から取得した映像上に前記寸法表示を重畳させて、前記演算処理された映像を生成する、
 ことを特徴とする寸法表示システム。
[項目2]
 項目1に記載の寸法表示システムであって、
 前記演算処理部は、前記正面距離を前記撮像装置から取得した映像に重畳して表示する、
 ことを特徴とする寸法表示システム。
[項目3]
 項目1または2に記載の寸法表示システムであって、
 前記演算処理部は、少なくとも前記飛行体に関する情報または環境情報のいずれか1つを前記撮像装置から取得した映像に重畳して表示する、
 ことを特徴とする寸法表示システム。
[項目4]
 項目1乃至3のいずれかに記載の寸法表示システムを備える、
 ことを特徴とする飛行体。
[項目5]
 項目1乃至3のいずれかに記載の寸法表示システムを備える、
 ことを特徴とするユーザ端末。
[項目6]
 項目1乃至3のいずれかに記載の寸法表示システムを備える、
 ことを特徴とするサーバ。
[項目7]
 飛行体の撮像装置から取得した映像を演算処理するステップと、
 前記演算処理された映像を表示するステップと、
 前記映像に重畳して表示される寸法表示が示す、少なくとも前記飛行体の機体正面に位置する物体の位置における幅方向または高さ方向のいずれかの寸法値を予め記憶するステップと、
 少なくとも前記飛行体のセンサにより取得した、前記機体から前記物体までの正面距離及び前記撮像装置の画角、前記寸法値に基づき、前記寸法表示を表示する位置を算出するステップと、
 前記算出された位置を基に、前記撮像装置から取得した映像上に前記寸法表示を重畳させて、前記演算処理された映像を生成するステップと、
 を備えることを特徴とする寸法表示方法。
[Item 1]
A dimension display system comprising:
an arithmetic processing unit that performs arithmetic processing on an image acquired from an imaging device of a flying object; and
a dimension storage unit that stores in advance a dimension value, in at least either the width direction or the height direction, at the position of an object located in front of the body of the flying object, the dimension value being indicated by a dimension display superimposed on the image and displayed on a user terminal,
wherein the arithmetic processing unit further comprises a dimension display position calculation unit that calculates a position at which to display the dimension display, based on at least a front distance from the body to the object acquired by a sensor of the flying object, an angle of view of the imaging device, and the dimension value, and
the arithmetic processing unit superimposes, based on the calculated position, the dimension display on the image acquired from the imaging device to generate the arithmetically processed image.
[Item 2]
The dimension display system according to item 1, wherein
the arithmetic processing unit superimposes the front distance on the image acquired from the imaging device and displays it.
[Item 3]
The dimension display system according to item 1 or 2, wherein
the arithmetic processing unit superimposes at least one of information relating to the flying object or environmental information on the image acquired from the imaging device and displays it.
[Item 4]
A flying object comprising the dimension display system according to any one of items 1 to 3.
[Item 5]
A user terminal comprising the dimension display system according to any one of items 1 to 3.
[Item 6]
A server comprising the dimension display system according to any one of items 1 to 3.
[Item 7]
A dimension display method comprising:
a step of performing arithmetic processing on an image acquired from an imaging device of a flying object;
a step of displaying the arithmetically processed image;
a step of storing in advance a dimension value, in at least either the width direction or the height direction, at the position of an object located in front of the body of the flying object, the dimension value being indicated by a dimension display superimposed on the image;
a step of calculating a position at which to display the dimension display, based on at least a front distance from the body to the object acquired by a sensor of the flying object, an angle of view of the imaging device, and the dimension value; and
a step of superimposing, based on the calculated position, the dimension display on the image acquired from the imaging device to generate the arithmetically processed image.
<実施の形態の詳細>
 以下、本開示の実施の形態による飛行体の撮影画像を表示する表示システム(以下「本システム」という)について説明する。添付図面において、同一または類似の要素には同一または類似の参照符号及び名称が付され、各実施形態の説明において同一または類似の要素に関する重複する説明は省略することがある。また、各実施形態で示される特徴は、互いに矛盾しない限り他の実施形態にも適用可能である。
<Details of the embodiment>
Hereinafter, a display system (hereinafter referred to as "this system") for displaying images captured by an air vehicle according to the embodiment of the present disclosure will be described. In the accompanying drawings, the same or similar elements are given the same or similar reference numerals and names, and duplicate descriptions of the same or similar elements may be omitted in the description of each embodiment. In addition, the features shown in each embodiment are applicable to other embodiments as long as they do not contradict each other.
<構成>
 図1に示されるように、本システムは、ユーザ端末1、2と、飛行体3とを有している。ユーザ端末1、2と、飛行体3とは、ネットワークNWを介して互いに通信可能に接続されている。なお、図示された構成は一例であり、ユーザ端末や飛行体の数や種類は任意である。
<Structure>
As shown in FIG. 1, this system has user terminals 1 and 2 and an air vehicle 3. The user terminals 1 and 2 and the aircraft 3 are connected to each other so as to be able to communicate with each other via the network NW. The illustrated configuration is an example, and the number and types of user terminals and flying objects are arbitrary.
<ユーザ端末1、2>
 図2は、図1のユーザ端末1、2のハードウェア構成を示すブロック図である。図1に示されるユーザ端末1、2は、演算処理部10、メモリ11、ストレージ12、送受信部13、入出力部14等を備え、これらはバス15を通じて相互に電気的に接続される。
< User terminals 1 and 2>
FIG. 2 is a block diagram showing a hardware configuration of the user terminals 1 and 2 of FIG. The user terminals 1 and 2 shown in FIG. 1 include an arithmetic processing unit 10, a memory 11, a storage 12, a transmission / reception unit 13, an input / output unit 14, and the like, and these are electrically connected to each other through a bus 15.
 演算処理部10は、ユーザ端末1、2全体の動作を制御し、各要素間におけるデータの送受信の制御、及びアプリケーションの実行及び認証処理に必要な情報処理等を行う演算装置である。例えば演算処理部10はCPU(Central Processing Unit)であり、ストレージ12に格納されメモリ11に展開された本システムのためのプログラム等を実行して各情報処理を実施する。 The arithmetic processing unit 10 is an arithmetic unit that controls the operation of the entire user terminals 1 and 2, controls the transmission and reception of data between each element, and performs information processing and the like necessary for application execution and authentication processing. For example, the arithmetic processing unit 10 is a CPU (Central Processing Unit), and executes each information processing by executing a program or the like for the system stored in the storage 12 and expanded in the memory 11.
 メモリ11は、DRAM(Dynamic Random Access Memory)等の揮発性記憶装置で構成される主記憶と、フラッシュメモリやHDD(Hard Disc Drive)等の不揮発性記憶装置で構成される補助記憶と、を含む。メモリ11は、演算処理部10のワークエリア等として使用され、また、ユーザ端末1、2の起動時に実行されるBIOS(Basic Input / Output System)、及び各種設定情報等を格納する。 The memory 11 includes a main memory composed of a volatile storage device such as a DRAM (Dynamic Random Access Memory) and an auxiliary memory composed of a non-volatile storage device such as a flash memory or an HDD (Hard Disc Drive). The memory 11 is used as a work area or the like of the arithmetic processing unit 10, and also stores a BIOS (Basic Input / Output System) executed when the user terminals 1 and 2 are started, various setting information, and the like.
 ストレージ12は、アプリケーション・プログラム等の各種プログラムを格納する。各処理に用いられるデータを格納したデータベースがストレージ12に構築されていてもよい。 The storage 12 stores various programs such as application programs. A database storing data used for each process may be built in the storage 12.
 送受信部13は、ユーザ端末1、2をネットワークNWに接続する。なお、送受信部13は、Bluetooth(登録商標)及びBLE(Bluetooth Low Energy)の近距離通信インターフェースを備えていてもよい。 The transmission / reception unit 13 connects the user terminals 1 and 2 to the network NW. The transmission / reception unit 13 may be provided with a short-range communication interface of Bluetooth (registered trademark) and BLE (Bluetooth Low Energy).
 入出力部14は、キーボード・マウス類等の入力機器、及び表示部141(例えば、ディスプレイ)等の出力機器である。なお、本実施形態における表示部は、ユーザ端末1、2が備えている表示部141に限らず、ユーザ端末1、2に直接または間接的に接続される外部の表示装置の表示部であってもよい。 The input / output unit 14 includes input devices such as a keyboard and a mouse, and output devices such as a display unit 141 (for example, a display). The display unit in the present embodiment is not limited to the display unit 141 provided in the user terminals 1 and 2, and may be the display unit of an external display device connected directly or indirectly to the user terminals 1 and 2.
 バス15は、上記各要素に共通に接続され、例えば、アドレス信号、データ信号及び各種制御信号を伝達する。 The bus 15 is commonly connected to each of the above elements and transmits, for example, an address signal, a data signal, and various control signals.
 なお、図1では、ユーザ端末1、2として、例えばワークステーションやパーソナルコンピュータのような汎用コンピュータや、スマートフォンやタブレットPC等の携帯端末を図示しているが、これに限らず、例えばヘッドマウントディスプレイ(HMD)のような、一人称視点で操作可能な装置であってもよく、この場合、本実施形態の表示システムの効果を一層奏する。 Note that FIG. 1 illustrates, as the user terminals 1 and 2, general-purpose computers such as workstations and personal computers, and mobile terminals such as smartphones and tablet PCs; however, the user terminals are not limited to these, and a device operable from a first-person viewpoint, such as a head-mounted display (HMD), may also be used. In this case, the effect of the display system of the present embodiment is further enhanced.
<飛行体3>
 図3は、飛行体3のハードウェア構成を示すブロック図である。フライトコントローラ31は、プログラマブルプロセッサ(例えば、中央演算処理装置(CPU))などの1つ以上の演算処理部を有することができる。
<Aircraft 3>
FIG. 3 is a block diagram showing a hardware configuration of the flying object 3. The flight controller 31 can have one or more arithmetic processing units such as a programmable processor (for example, a central processing unit (CPU)).
 フライトコントローラ31は、メモリ311を有しており、当該メモリにアクセス可能である。メモリ311は、1つ以上のステップを行うためにフライトコントローラが実行可能であるロジック、コード、および/またはプログラム命令を記憶している。また、フライトコントローラ31は、慣性センサ(例えば、加速度センサ、ジャイロセンサ)、GPSセンサ、近接センサ(例えば、LiDAR(Light Detection And Ranging))等のセンサ類312を含みうる。 The flight controller 31 has a memory 311 and can access the memory. The memory 311 stores logic, code, and/or program instructions that the flight controller can execute to perform one or more steps. The flight controller 31 may also include sensors 312 such as an inertial sensor (for example, an acceleration sensor or a gyro sensor), a GPS sensor, and a proximity sensor (for example, LiDAR (Light Detection And Ranging)).
 メモリ311は、例えば、SDカードやランダムアクセスメモリ(RAM)などの分離可能な媒体または外部の記憶装置を含んでいてもよい。撮像装置32/センサ類33から取得したデータは、メモリ311に直接に伝達されかつ記憶されてもよい。例えば、撮像装置等で撮影した静止画・動画データが内蔵メモリ又は外部メモリに記録される。 Memory 311 may include, for example, a separable medium such as an SD card or random access memory (RAM) or an external storage device. The data acquired from the image pickup apparatus 32 / sensors 33 may be directly transmitted and stored in the memory 311. For example, still image / moving image data taken by an imaging device or the like is recorded in an internal memory or an external memory.
 フライトコントローラ31は、飛行体の状態を制御するように構成された図示しない制御モジュールを含んでいる。例えば、制御モジュールは、6自由度(並進運動x、y及びz、並びに回転運動θx、θy及びθz)を有する飛行体の空間的配置、速度、および/または加速度を調整するために、ESC34(Electric Speed Controller)を経由して飛行体の推進機構(モータ35等)を制御する。バッテリー38から給電されるモータ35によりプロペラ36が回転することで飛行体の揚力を生じさせる。制御モジュールは、搭載部、センサ類の状態のうちの1つ以上を制御することができる。 The flight controller 31 includes a control module (not shown) configured to control the state of the flying object. For example, the control module controls the propulsion mechanism (motor 35 and the like) of the flying object via the ESC 34 (Electric Speed Controller) in order to adjust the spatial arrangement, velocity, and/or acceleration of the flying object having six degrees of freedom (translational motion x, y, and z, and rotational motion θx, θy, and θz). The propeller 36 is rotated by the motor 35 powered by the battery 38 to generate lift for the flying object. The control module can control one or more of the states of the mounting unit and the sensors.
 フライトコントローラ31は、1つ以上の外部のデバイス(例えば、送信機(プロポ)39、端末、表示装置、または他の遠隔の制御器)からのデータを受け取るように構成された受信部37と通信可能である。送信機39は、有線通信または無線通信などの任意の適当な通信手段を使用することができる。 The flight controller 31 can communicate with a receiving unit 37 configured to receive data from one or more external devices (for example, a transmitter ("propo") 39, a terminal, a display device, or another remote controller). The transmitter 39 can use any suitable communication means such as wired communication or wireless communication.
 受信部37は、例えば、ローカルエリアネットワーク(LAN)、ワイドエリアネットワーク(WAN)、赤外線、無線、WiFi、ポイントツーポイント(P2P)ネットワーク、電気通信ネットワーク、クラウド通信などのうちの1つ以上を利用することができる。 The receiving unit 37 can use one or more of, for example, a local area network (LAN), a wide area network (WAN), infrared, wireless, WiFi, a point-to-point (P2P) network, a telecommunications network, cloud communication, and the like.
 受信部37は、所定の制御データ、端末または遠隔の制御器からのユーザコマンドなどのうちの1つ以上を受け取ることができる。 The receiving unit 37 can receive one or more of predetermined control data, a user command from a terminal or a remote control, and the like.
 本実施の形態によるセンサ類33は、慣性センサ(加速度センサ、ジャイロセンサ)、GPSセンサ、近接センサ(例えば、LiDAR(Light Detection And Ranging))、またはビジョン/イメージセンサ(例えば、撮像装置32)を含み得るが、本実施形態では、特に飛行体3の機体から飛行体3の正面の物体までの正面距離を測定するセンサ(例えば、レーザセンサなど)を含んでいる。 The sensors 33 according to the present embodiment may include an inertial sensor (an acceleration sensor and a gyro sensor), a GPS sensor, a proximity sensor (for example, LiDAR (Light Detection And Ranging)), or a vision/image sensor (for example, the imaging device 32); in the present embodiment, they particularly include a sensor (for example, a laser sensor) that measures the front distance from the body of the flying object 3 to an object in front of the flying object 3.
 図4は、本実施の形態の寸法表示にかかるハードウェア構成を示すブロック図である。寸法表示システム4は、例えば飛行体3において図3に示した構成とは別に備えられており、記憶部40、演算処理部42、送信部44、バス46を有しており、バス46を介して撮像装置32/センサ類33と相互に電気的に接続されている。なお、寸法表示システム4は、飛行体3に備えられる形態に限らず、例えば、ユーザ端末1、2に別途設けられる、または、ユーザ端末1、2の演算処理部及び記憶部に相当する構成の一部または全部を利用して構成される形態や、ネットワークNWを介して接続されるサーバ(不図示)に備える形態であってもよく、その場合、飛行体3の撮像装置32/センサ類33からのデータを受信することで寸法表示のための処理を行う。また、記憶部40と演算処理部42がそれぞれ別の装置(例えば、飛行体3やユーザ端末1、2、サーバ)に備えられていてもよい。 FIG. 4 is a block diagram showing a hardware configuration related to the dimension display of the present embodiment. The dimension display system 4 is provided, for example, in the flying object 3 separately from the configuration shown in FIG. 3, and has a storage unit 40, an arithmetic processing unit 42, a transmission unit 44, and a bus 46, and is electrically connected to the imaging device 32 / sensors 33 via the bus 46. The dimension display system 4 is not limited to the form provided in the flying object 3; for example, it may be provided separately in the user terminals 1 and 2, may be configured using part or all of the components corresponding to the arithmetic processing unit and storage unit of the user terminals 1 and 2, or may be provided in a server (not shown) connected via the network NW. In those cases, the processing for the dimension display is performed by receiving data from the imaging device 32 / sensors 33 of the flying object 3. Further, the storage unit 40 and the arithmetic processing unit 42 may be provided in separate devices (for example, the flying object 3, the user terminals 1 and 2, or the server).
 記憶部40は、正面距離記憶部401と、寸法記憶部402を備えている。正面距離記憶部401は、センサ類33から取得した、飛行体の機体から飛行体の正面の物体までの正面距離を記憶する。寸法記憶部402は、撮像装置32から取得した映像に重畳して表示される寸法表示が示す、少なくとも物体の位置における幅方向または高さ方向の寸法値を記憶する。なお、本実施の形態の具体例では、正面距離記憶部401に正面距離を記憶しているが、これに限らず、センサ類33から演算処理部42に正面距離を送信するように構成してもよい。 The storage unit 40 includes a front distance storage unit 401 and a dimension storage unit 402. The front distance storage unit 401 stores the front distance, acquired from the sensors 33, from the body of the flying object to the object in front of the flying object. The dimension storage unit 402 stores at least the dimension value in the width direction or the height direction at the position of the object, which is indicated by the dimension display superimposed on the image acquired from the imaging device 32. In the specific example of the present embodiment, the front distance is stored in the front distance storage unit 401; however, the configuration is not limited to this, and the sensors 33 may be configured to transmit the front distance to the arithmetic processing unit 42.
 演算処理部42は、寸法表示位置算出部421を備えている。寸法表示位置算出部421は、少なくとも正面距離及び撮像装置の画角、寸法値(いずれも後述する)に基づき、上述の正面の物体位置における幅方向または高さ方向の寸法値を示す寸法表示を表示する位置を算出する。この寸法表示位置算出部421による算出方法については、後述する。そして、演算処理部42は、寸法表示位置算出部421により算出された位置を基に、撮像装置32から取得した映像上に寸法表示を重畳する。 The arithmetic processing unit 42 includes a dimension display position calculation unit 421. The dimension display position calculation unit 421 calculates a position at which to display the dimension display indicating the dimension value in the width direction or the height direction at the above-mentioned front object position, based on at least the front distance, the angle of view of the imaging device, and the dimension value (each described later). The calculation method used by the dimension display position calculation unit 421 will be described later. Then, based on the position calculated by the dimension display position calculation unit 421, the arithmetic processing unit 42 superimposes the dimension display on the image acquired from the imaging device 32.
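 演算処理部42の流れ(位置算出の後に重畳)は、例えば次のように素描できる(フレームの表現や点線の描画間隔は、本開示に規定のない説明用の仮定である)。 As an illustration only, the flow of the arithmetic processing unit 42 (position calculation followed by superimposition) can be sketched as follows; the frame representation and the drawing details (dotted-line spacing, pixel marking) are assumptions for explanation, not specifics from the present disclosure.

```python
def render_dimension_overlay(frame, front_distance_m, offset_px_fn):
    """位置を算出し、映像に2本の縦の点線を重畳する最小スケッチ。
    Minimal sketch: compute the display position, then superimpose two
    vertical dotted lines on the frame. `frame` is a height x width grid
    of pixel values; `offset_px_fn` stands in for the dimension display
    position calculation unit 421."""
    height = len(frame)
    width = len(frame[0])
    center_x = width // 2
    m = int(offset_px_fn(front_distance_m))   # pixels from the centerline
    for x in (center_x - m, center_x + m):    # centerline +/- m
        for y in range(0, height, 4):         # every 4th row -> dotted line
            frame[y][x] = 1                   # mark an overlay pixel
    return frame
```

ユーザ端末1、2へは、この重畳後のフレームが送信部44を介して送られる想定である。 The superimposed frame is what would then be sent to the user terminals 1 and 2 via the transmission unit 44.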
 送信部44は、有線通信または無線通信などの任意の適当な通信手段を使用することができ、例えば、ローカルエリアネットワーク(LAN)、ワイドエリアネットワーク(WAN)、赤外線、無線、WiFi、ポイントツーポイント(P2P)ネットワーク、電気通信ネットワーク、クラウド通信などのうちの1つ以上を利用することができる。上述の寸法表示が重畳された映像は、送信部44を介してユーザ端末1、2に送信される。 The transmission unit 44 can use any suitable communication means such as wired communication or wireless communication, and can utilize one or more of, for example, a local area network (LAN), a wide area network (WAN), infrared, wireless, WiFi, a point-to-point (P2P) network, a telecommunications network, cloud communication, and the like. The video on which the above-mentioned dimension display is superimposed is transmitted to the user terminals 1 and 2 via the transmission unit 44.
 図5は、本実施形態にかかる飛行体3周辺の俯瞰図を例示する。なお、この俯瞰図は、本実施形態を説明するために本明細書において用いられるものであり、本実施形態にかかるシステム上で表示されなくてもよい。 FIG. 5 exemplifies a bird's-eye view of the vicinity of the flying object 3 according to the present embodiment. It should be noted that this bird's-eye view is used in the present specification for explaining the present embodiment, and may not be displayed on the system according to the present embodiment.
 図5では、例示的な実施形態として非GPS下での屋内点検について説明するが、実施形態はこれに限らず、飛行状況は、例えばGPS下や屋外等であってもよく、飛行目的は、例えば測量や空撮等であってもよく、飛行体が一般的に用いられる如何なる状況や目的であっても適用され得る。この例では、通路に円柱状の障害物(例えば、配管等)が存在し、それらにより狭所が形成されている。より具体的な例として、点線矢印で示されるように、飛行体3が通路を飛行している際に障害物を発見し、一旦左側に水平飛行した後、前進飛行して狭所を通過することを想定している。 In FIG. 5, an indoor inspection in a non-GPS environment will be described as an exemplary embodiment; however, the embodiment is not limited to this. The flight situation may be, for example, under GPS or outdoors, and the flight purpose may be, for example, surveying, aerial photography, or the like; the embodiment can be applied to any situation or purpose in which a flying object is generally used. In this example, columnar obstacles (for example, pipes) exist in the passage, and a narrow space is formed by them. As a more specific example, as shown by the dotted arrow, it is assumed that the flying object 3 finds an obstacle while flying through the passage, once flies horizontally to the left, and then flies forward to pass through the narrow space.
 図6から図8は、本実施形態にかかる表示システムの入出力部14の表示部141(例えば、ディスプレイ等)による表示例である。 6 to 8 are display examples by the display unit 141 (for example, a display or the like) of the input / output unit 14 of the display system according to the present embodiment.
 図6では、通路上に円柱状の障害物が存在することが確認できる。ディスプレイ上には、正面の障害物の位置における幅方向の寸法(例えば、1.0m)を示す、2本の縦の点線が表示されている。この2つの縦の点線間の幅が、正面の障害物の位置における幅方向の寸法を示している。 In FIG. 6, it can be confirmed that a columnar obstacle exists on the passage. Two vertical dotted lines are displayed on the display indicating the widthwise dimension (for example, 1.0 m) at the position of the obstacle in front. The width between the two vertical dotted lines indicates the width direction dimension at the position of the obstacle in front.
 また、図6には、表示の一例として、画面中心の正面距離(例えば、2.8m)を表示したり、バッテリー残量(例えば、7.0V)を表示したりしているが、この限りではなく、どちらかの表示または両方の表示が示されていなくてもよく、反対に、その他の飛行体3に関する情報(例えば、バッテリー消費量、飛行高度、飛行可能時間など)、環境情報(例えば、一酸化炭素濃度や酸素濃度など)がさらに表示されていてもよい。 Further, in FIG. 6, as examples of the display, the front distance at the center of the screen (for example, 2.8 m) and the remaining battery level (for example, 7.0 V) are displayed; however, the display is not limited to these, and either or both of them may be omitted. Conversely, other information about the flying object 3 (for example, battery consumption, flight altitude, remaining flight time, etc.) or environmental information (for example, carbon monoxide concentration, oxygen concentration, etc.) may be further displayed.
 図7は、図6の後に前方に飛行した際の表示例である。画面中心の正面距離が更新される(例えば、2.5m)とともに、正面の障害物の位置における幅方向の寸法(例えば、1.0m)が示されている。図6よりもさらに飛行体3が障害物に近づいたため、図6よりも2つの縦線間の幅が広くなっている。 FIG. 7 is a display example when the flying object flies forward after FIG. 6. The front distance at the center of the screen is updated (for example, 2.5 m), and the width direction dimension (for example, 1.0 m) at the position of the front obstacle is shown. Since the flying object 3 is closer to the obstacle than in FIG. 6, the width between the two vertical lines is wider than in FIG. 6.
 図8は、図7の後に左に水平飛行した際の表示例である。左側の障害物と正面の障害物との間(狭所)が映されているものの、画面中心の正面距離が更新されないため(例えば、2.5m)、正面の障害物の位置における幅方向の寸法(例えば、1.0m)が図7と同じ幅で示されている。このように、飛行体3により撮像された映像内に狭所が映し出されている際に、狭所の位置における幅方向の寸法表示も補助的に表示されていることにより、飛行体3が通過可能な個所の特定を現場で容易に行うことが可能である。 FIG. 8 is a display example when the flying object flies horizontally to the left after FIG. 7. Although the space (narrow space) between the obstacle on the left and the obstacle in front is shown, the front distance at the center of the screen is not updated (for example, 2.5 m), so the width direction dimension (for example, 1.0 m) at the position of the front obstacle is shown with the same width as in FIG. 7. In this way, when a narrow space appears in the image captured by the flying object 3, the width direction dimension display at the position of the narrow space is also shown as an aid, so that a location through which the flying object 3 can pass can be easily identified on site.
 ここで、具体的な寸法表示位置算出部421の算出方法として、寸法値が1.0mを示す、2つの縦の点線の位置を演算処理部42の寸法表示位置算出部421により算出する方法の一例を、以下で説明する。 Here, as a specific calculation method of the dimension display position calculation unit 421, an example of a method of calculating, by the dimension display position calculation unit 421 of the arithmetic processing unit 42, the positions of the two vertical dotted lines indicating a dimension value of 1.0 m will be described below.
 まず、センサ類33に含まれるレーザセンサ等の正面距離センサにより、飛行体3の正面距離aを取得し、正面距離記憶部401に記憶する。 First, the front distance a of the flying object 3 is acquired by a front distance sensor such as a laser sensor included in the sensors 33, and is stored in the front distance storage unit 401.
 次に、寸法表示位置算出部421において、取得した正面距離aと、撮像装置32の画角θとに基づき、表示部141の表示画像幅の半分が対応する正面の物体位置における長さb=a×tanθを求める。 Next, in the dimension display position calculation unit 421, the length b = a × tanθ at the front object position corresponding to half of the display image width of the display unit 141 is obtained based on the acquired front distance a and the angle of view θ of the imaging device 32.
 そして、寸法表示位置算出部421において、求めた長さbと、表示部141の画像幅の半分の画素数nとに基づき、単位画素数当たりの正面の物体位置における長さb’=b/nを求める。 Then, in the dimension display position calculation unit 421, the length b' = b / n at the front object position per unit pixel is obtained based on the obtained length b and the number of pixels n corresponding to half of the image width of the display unit 141.
 最後に、寸法表示位置算出部421において、寸法値の半分の値(0.5m)を求めた長さb’で除算して必要な画素数mを算出し、表示部141の中心線から画素数m離れた位置に縦の点線を、飛行体3の撮像装置32から取得した映像に重畳して表示する。 Finally, in the dimension display position calculation unit 421, the required number of pixels m is calculated by dividing half of the dimension value (0.5 m) by the obtained length b', and a vertical dotted line is superimposed on the image acquired from the imaging device 32 of the flying object 3 at a position m pixels away from the center line of the display unit 141.
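 上記の各ステップ(a、b=a×tanθ、b'=b/n、m=(寸法値/2)/b')は、例えば次のように計算できる(画角や画素数の値は説明用の仮定であり、本開示が定めるものではない)。 As a minimal sketch, the steps above (front distance a, b = a × tanθ, b' = b / n, m = (dimension value / 2) / b') can be computed as follows; the half angle of view and the pixel counts are illustrative assumptions, not values taken from the present disclosure.

```python
import math

def dotted_line_offset_px(front_distance_m, half_view_angle_rad,
                          half_width_px, dimension_m=1.0):
    """寸法表示(縦の点線)の中心線からの画素オフセット m を求める。
    Returns the pixel offset m of each vertical dotted line from the
    image centerline, following the steps described above."""
    # b = a * tan(theta): length at the object plane covered by half
    # of the displayed image width
    b = front_distance_m * math.tan(half_view_angle_rad)
    # b' = b / n: length at the object plane covered by one pixel
    b_per_px = b / half_width_px
    # m = (dimension value / 2) / b': pixels from the centerline
    return (dimension_m / 2.0) / b_per_px

# 仮定の値 / assumed values: a = 2.5 m, half angle of view 30 deg, n = 640 px
m = dotted_line_offset_px(2.5, math.radians(30.0), 640)
```

各点線は画像中心線から±m画素の位置に描かれる。正面距離aが小さくなるほどmは大きくなり、図6から図7にかけて2本の縦線の間隔が広がることと整合する。 Each dotted line is drawn at the image centerline ± m pixels; as the front distance a decreases, m grows, which is consistent with the two vertical lines widening between FIG. 6 and FIG. 7.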
 なお、この計算方法は一例にすぎず、上述の縦の点線位置が算出できるのであれば、これに限らないし、例えば、上記計算方法に関するテーブルを予め設け、ある範囲の正面距離に対応した画素数mを設定しておく(例えば、正面距離が2.1~2.5mの場合は、画素数m、正面距離が2.6~3.0の場合は、画素数m’など)ことで、計算速度を向上させるとともに、逐次更新による表示処理の負荷を減らすことも可能である。また、より正確な正面の物体位置における幅方向の寸法値を示すために、縦の点線位置を算出時に既知の補正方法を利用してもよい。また、本実施形態においては、幅方向の寸法値を表示したが、これに限らず、例えば、高さ方向の画素を基に寸法表示位置を算出して、高さ方向の寸法表示を重畳してもよい。 Note that this calculation method is merely an example, and the method is not limited to this as long as the above-mentioned vertical dotted line positions can be calculated. For example, by preparing a table for the above calculation method in advance and setting the number of pixels m corresponding to each range of front distances (for example, the number of pixels m when the front distance is 2.1 to 2.5 m, the number of pixels m' when the front distance is 2.6 to 3.0 m, and so on), it is possible to improve the calculation speed and reduce the load of display processing due to sequential updates. Further, in order to show a more accurate width direction dimension value at the front object position, a known correction method may be used when calculating the vertical dotted line positions. In addition, although the dimension value in the width direction is displayed in the present embodiment, the display is not limited to this; for example, the dimension display position may be calculated based on the pixels in the height direction, and a dimension display in the height direction may be superimposed.
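 上記の対応表による方式は、例えば次のように素描できる(距離範囲と画素数の具体的な値は説明用の仮定であり、本文は具体値を定めていない)。 The lookup-table variant described above can be sketched as follows; the concrete distance ranges and pixel counts are illustrative assumptions, since the text leaves the actual values open.

```python
import bisect

# 正面距離の各範囲の上限と、範囲ごとに予め計算した画素数 m の対応表
# (値は説明用の仮定 / values are illustrative assumptions)
RANGE_UPPER_BOUNDS_M = [2.0, 2.5, 3.0, 3.5]
PIXELS_FOR_RANGE = [280, 222, 185, 158]

def lookup_offset_px(front_distance_m):
    """Return the precomputed pixel offset for the range that contains
    the front distance, avoiding a tan()/division on every frame."""
    i = bisect.bisect_left(RANGE_UPPER_BOUNDS_M, front_distance_m)
    i = min(i, len(PIXELS_FOR_RANGE) - 1)  # clamp distances past the table
    return PIXELS_FOR_RANGE[i]
```

範囲単位の離散化により寸法表示の位置は近似になるが、本文の述べるとおり、逐次更新に伴う表示処理の負荷は下がる。 Discretizing by range makes the displayed position approximate, but, as the text notes, it reduces the display-processing load of sequential updates.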
 図9は、図5とは異なる状況における、本実施形態にかかる飛行体3周辺の側面図を例示する。なお、この側面図は、本実施形態を説明するために本明細書において用いられるものであり、本実施形態にかかるシステム上で表示されなくてもよい。 FIG. 9 illustrates a side view of the periphery of the flying object 3 according to the present embodiment in a situation different from that of FIG. It should be noted that this side view is used in the present specification for explaining the present embodiment, and may not be displayed on the system according to the present embodiment.
 図9では、例示的な実施形態として非GPS下での屋内点検について説明するが、実施形態はこれに限らず、飛行状況は、例えばGPS下や屋外等であってもよく、飛行目的は、例えば測量や空撮等であってもよく、飛行体が一般的に用いられる如何なる状況や目的であっても適用され得る。この例では、具体的な例として、点線矢印で示されるように、飛行体3が通路を飛行している際に行き止まりとなり、通路に沿って上昇した後、前進飛行して狭所を通過することを想定している。 In FIG. 9, an indoor inspection in a non-GPS environment will be described as an exemplary embodiment; however, the embodiment is not limited to this. The flight situation may be, for example, under GPS or outdoors, and the flight purpose may be, for example, surveying, aerial photography, or the like; the embodiment can be applied to any situation or purpose in which a flying object is generally used. In this example, as a specific case, as shown by the dotted arrow, it is assumed that the flying object 3 reaches a dead end while flying through the passage, ascends along the passage, and then flies forward to pass through a narrow space.
 図10から図11は、本実施形態にかかる表示システムの入出力部14の表示部141(例えば、ディスプレイ等)による他の表示例である。 10 to 11 are other display examples by the display unit 141 (for example, a display or the like) of the input / output unit 14 of the display system according to the present embodiment.
 図10では、通路を壁面に沿って上昇している状況である。ディスプレイ上には、正面の壁の位置における幅方向の寸法(例えば、1.0m)を示す、2本の縦の点線が表示されている。例えば、正面の壁までの距離は、2.0mである。 FIG. 10 shows a situation in which the flying object is ascending through the passage along the wall surface. Two vertical dotted lines indicating the width direction dimension (for example, 1.0 m) at the position of the front wall are displayed on the display. For example, the distance to the front wall is 2.0 m.
 図11では、通路を壁面に沿って上昇している途中で狭所が現れた状況である。ディスプレイ上には、正面の壁の位置における幅方向の寸法(例えば、1.0m)を示す、2本の縦の点線が表示されており、狭所の幅と寸法表示を目視で比較することにより、右に水平移動することで、狭所を通過することができると判断できる。このように、飛行体3により撮像された映像内に狭所が映し出されている際に、狭所の位置における幅方向の寸法表示も補助的に表示されていることにより、飛行体3が通過可能な個所の特定を現場で容易に行うことが可能である。 FIG. 11 shows a situation in which a narrow space appears while the flying object is ascending through the passage along the wall surface. Two vertical dotted lines indicating the width direction dimension (for example, 1.0 m) at the position of the front wall are displayed on the display, and by visually comparing the width of the narrow space with the dimension display, it can be determined that the flying object can pass through the narrow space by moving horizontally to the right. In this way, when a narrow space appears in the image captured by the flying object 3, the width direction dimension display at the position of the narrow space is also shown as an aid, so that a location through which the flying object 3 can pass can be easily identified on site.
 なお、本実施形態においては、飛行体3のセンサ類33により正面距離を取得することで、正面の物体の位置における幅方向または高さ方向の寸法を算出しているが、これに限らず、例えば、同種または異種の複数のセンサ(例えば、レーザセンサ、超音波センサ、画像センサなど)を備えたり、放射状に広がるxy平面で位置を感知する測域センサを備えたりすることにより、飛行体3が撮像する映像内で機体に最も近い物体の位置における幅方向または高さ方向の寸法を算出、表示するように構成をなすことも可能である。 In the present embodiment, the width direction or height direction dimension at the position of the front object is calculated by acquiring the front distance with the sensors 33 of the flying object 3; however, the configuration is not limited to this. For example, by providing a plurality of sensors of the same type or different types (for example, a laser sensor, an ultrasonic sensor, an image sensor, etc.), or by providing a range sensor that senses positions in a radially extending xy plane, the system can also be configured to calculate and display the width direction or height direction dimension at the position of the object closest to the body within the image captured by the flying object 3.
 上述した実施の形態は、本開示の理解を容易にするための例示に過ぎず、本開示を限定して解釈するためのものではない。本開示は、その趣旨を逸脱することなく、変更、改良することができると共に、本開示にはその均等物が含まれることは言うまでもない。 The above-described embodiment is merely an example for facilitating the understanding of the present disclosure, and is not intended to limit the interpretation of the present disclosure. It goes without saying that the present disclosure may be modified or improved without departing from its spirit, and the present disclosure includes its equivalents.
 1    ユーザ端末
 2    ユーザ端末
 3    飛行体
NW    ネットワーク
1 User terminal
2 User terminal
3 Aircraft
NW Network

Claims (7)

  1.  寸法表示システムであって、
     前記寸法表示システムは、
     飛行体の撮像装置から取得した映像を演算処理する演算処理部と、
     前記映像に重畳して表示される寸法表示が示す、少なくとも前記飛行体の機体正面に位置する物体の位置における幅方向または高さ方向のいずれかの寸法値を予め記憶する寸法記憶部と、を備え、
     前記演算処理部は、少なくとも前記飛行体のセンサにより取得した、前記機体から前記物体までの正面距離及び前記撮像装置の画角、前記寸法値に基づき、前記寸法表示を表示する位置を算出する寸法表示位置算出部を含み、
     前記演算処理部は、前記算出された位置を基に、前記撮像装置から取得した映像上に前記寸法表示を重畳させて、前記演算処理された映像を生成する、
     ことを特徴とする寸法表示システム。
    A dimension display system comprising:
    an arithmetic processing unit that performs arithmetic processing on an image acquired from an imaging device of a flying object; and
    a dimension storage unit that stores in advance a dimension value, in at least either the width direction or the height direction, at the position of an object located in front of the body of the flying object, the dimension value being indicated by a dimension display superimposed on the image,
    wherein the arithmetic processing unit includes a dimension display position calculation unit that calculates a position at which to display the dimension display, based on at least a front distance from the body to the object acquired by a sensor of the flying object, an angle of view of the imaging device, and the dimension value, and
    the arithmetic processing unit superimposes, based on the calculated position, the dimension display on the image acquired from the imaging device to generate the arithmetically processed image.
  2.  請求項1に記載の寸法表示システムであって、
     前記演算処理部は、前記正面距離を前記撮像装置から取得した映像に重畳して表示する、
     ことを特徴とする寸法表示システム。
    The dimension display system according to claim 1, wherein
    the arithmetic processing unit superimposes the front distance on the image acquired from the imaging device and displays it.
  3.  請求項1または2に記載の寸法表示システムであって、
     前記演算処理部は、少なくとも前記飛行体に関する情報または環境情報のいずれか1つを前記撮像装置から取得した映像に重畳して表示する、
     ことを特徴とする寸法表示システム。
    The dimension display system according to claim 1 or 2, wherein
    the arithmetic processing unit superimposes at least one of information relating to the flying object or environmental information on the image acquired from the imaging device and displays it.
  4.  A flying object comprising the dimension display system according to any one of claims 1 to 3.
    An air vehicle comprising the dimension display system according to any one of claims 1 to 3.
  5.  A user terminal comprising the dimension display system according to any one of claims 1 to 3.
    A user terminal comprising the dimension display system according to any one of claims 1 to 3.
  6.  A server comprising the dimension display system according to any one of claims 1 to 3.
    A server comprising the dimension display system according to any one of claims 1 to 3.
  7.  A dimension display method comprising:
     a step of processing an image acquired from an imaging device of a flying object;
     a step of displaying the processed image;
     a step of storing in advance at least a dimension value, in either the width direction or the height direction, at the position of an object located in front of the airframe of the flying object, the dimension value being indicated by a dimension display superimposed on the image;
     a step of calculating a position at which to display the dimension display, based on at least the frontal distance from the airframe to the object acquired by a sensor of the flying object, the angle of view of the imaging device, and the dimension value; and
     a step of superimposing the dimension display on the image acquired from the imaging device, based on the calculated position, to generate the processed image.
    A dimension display method comprising:
    a step of processing an image acquired from an imaging device of a flying object;
    a step of displaying the processed image;
    a step of storing in advance at least a dimension value, in either the width direction or the height direction, at the position of an object located in front of the airframe of the flying object, the dimension value being indicated by a dimension display superimposed on the image;
    a step of calculating a position at which to display the dimension display, based on at least the frontal distance from the airframe to the object acquired by a sensor of the flying object, the angle of view of the imaging device, and the dimension value; and
    a step of superimposing the dimension display on the image acquired from the imaging device, based on the calculated position, to generate the processed image.
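The method steps above (store a dimension value, sense the frontal distance, compute a display position from distance and angle of view, superimpose the marker) can be sketched end-to-end. The names here are hypothetical, a pinhole camera is assumed, and the actual pixel drawing is left to a UI layer; this is not the patented implementation.

```python
import math

def render_dimension_overlay(frame_w: int, frame_h: int,
                             distance_m: float, fov_deg: float,
                             dimension_m: float) -> dict:
    """One pass of the claimed method: turn a stored dimension value plus
    the sensed frontal distance and camera angle of view into overlay
    primitives (a dimension line and labels) in pixel coordinates."""
    # Pixel span of the stored dimension at this distance (pinhole model):
    # visible strip width = 2 * d * tan(fov / 2) metres across frame_w px.
    span_px = dimension_m / (
        2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)) * frame_w
    cx, cy = frame_w / 2.0, frame_h / 2.0
    return {
        "dimension_line": ((cx - span_px / 2.0, cy), (cx + span_px / 2.0, cy)),
        "dimension_label": f"{dimension_m:.2f} m",
        # Claim 2: the frontal distance is also superimposed on the image.
        "distance_label": f"d = {distance_m:.2f} m",
    }

overlay = render_dimension_overlay(1280, 720, 2.0, 90.0, 1.0)
print(overlay["dimension_line"])  # endpoints of a 320 px line centred at x = 640
```

A caller would repeat this per video frame, feeding in the latest distance reading, and hand the returned primitives to whatever drawing API renders the superimposed image.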
PCT/JP2020/034039 2019-09-11 2020-09-09 Dimension display system, and dimension display method WO2021049508A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020227011101A KR20220058596A (en) 2019-09-11 2020-09-09 Dimension Display System and Dimension Display Method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-165084 2019-09-11
JP2019165084A JP6604681B1 (en) 2019-09-11 2019-09-11 Dimension display system and dimension display method

Publications (1)

Publication Number Publication Date
WO2021049508A1 true WO2021049508A1 (en) 2021-03-18

Family

ID=68532179

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/034039 WO2021049508A1 (en) 2019-09-11 2020-09-09 Dimension display system, and dimension display method

Country Status (3)

Country Link
JP (1) JP6604681B1 (en)
KR (1) KR20220058596A (en)
WO (1) WO2021049508A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003244501A (en) * 2002-02-13 2003-08-29 Fuji Photo Film Co Ltd Electronic camera
JP2005142938A (en) * 2003-11-07 2005-06-02 Casio Comput Co Ltd Electronic camera, control program
JP2009015730A * 2007-07-06 2009-01-22 Location View:Kk Image display system with stereoscopic measure display function and program of image display with stereoscopic measure display function
JP2017175517A (en) * 2016-03-25 2017-09-28 オリンパス株式会社 Imaging device and imaging method
JP2018007051A (en) * 2016-07-04 2018-01-11 オリンパス株式会社 Photographing apparatus, movable photographing device, photographing mobile body, and mobile body photographing control device
JP2018189536A (en) * 2017-05-09 2018-11-29 浄真 清水 Image processing device, actual dimension display method, and actual dimension display processing program
JP2019142714A (en) * 2018-02-23 2019-08-29 コニカミノルタ株式会社 Image processing device for fork lift

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6486024B2 (en) 2014-07-02 2019-03-20 三菱重工業株式会社 Indoor monitoring system and method for structure


Also Published As

Publication number Publication date
KR20220058596A (en) 2022-05-09
JP6604681B1 (en) 2019-11-13
JP2021044689A (en) 2021-03-18

Similar Documents

Publication Publication Date Title
US11017558B2 (en) Camera registration in a multi-camera system
WO2021199449A1 (en) Position calculation method and information processing system
JP7118490B1 (en) Information processing system, information processing method, program, mobile object, management server
US20210240185A1 (en) Shooting control method and unmanned aerial vehicle
JP6730763B1 (en) Flight body flight path creation method and management server
JP2024009938A (en) Flight management server and flight management system for unmanned flying body
WO2021049508A1 (en) Dimension display system, and dimension display method
JP7004374B1 (en) Movement route generation method and program of moving object, management server, management system
JP6991525B1 (en) Waypoint height coordinate setting method and management server, information processing system, program
JP7370045B2 (en) Dimension display system and method
JP6818379B1 (en) Flight route creation method and management server for aircraft
JP2020012774A (en) Method for measuring building
WO2021130980A1 (en) Aircraft flight path display method and information processing device
JP6601810B1 (en) Aircraft guidance method, guidance device, and guidance system
JP2021100234A (en) Aircraft imaging method and information processing device
JP7418727B1 (en) Information processing method, information processing system and program
JPWO2021064982A1 (en) Information processing device and information processing method
WO2022180975A1 (en) Position determination device, information processing device, position determination method, information processing method, and program
JP6810498B1 (en) Flight route creation method and management server for aircraft
JP6810497B1 (en) Flight route creation method and management server for aircraft
JP7401068B1 (en) Information processing system, information processing method and program
JP7228298B1 (en) Information processing system, information processing method, program, mobile object, management server
JP6715541B1 (en) Aircraft management server and management system
JP6934646B1 (en) Flight restriction area setting method, waypoint setting method and management server, information processing system, program
WO2022113482A1 (en) Information processing device, method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20863311

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 20227011101

Country of ref document: KR

Kind code of ref document: A

122 Ep: pct application non-entry in european phase

Ref document number: 20863311

Country of ref document: EP

Kind code of ref document: A1