WO2022113482A1 - Information processing device, method, and program - Google Patents

Information processing device, method, and program

Info

Publication number
WO2022113482A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
information
world coordinates
coordinate information
wall surface
Prior art date
Application number
PCT/JP2021/033765
Other languages
English (en)
Japanese (ja)
Inventor
剛史 中村
Original Assignee
株式会社Clue
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Clue
Priority to JP2022565078A (JPWO2022113482A1)
Publication of WO2022113482A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B 11/02 - Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B 11/28 - Measuring arrangements characterised by the use of optical techniques for measuring areas
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/60 - Analysis of geometric attributes
    • G06T 7/62 - Analysis of geometric attributes of area, perimeter, diameter or volume
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/70 - Determining position or orientation of objects or cameras

Definitions

  • This disclosure relates to information processing devices, methods and programs.
  • Patent Document 1 discloses a technique for measuring the shape and dimensions of a roof, which is the object, from an image of the object captured by a camera mounted on a flying object, and calculating the roof area from the measured shape and dimensions.
  • The present disclosure has been made in view of this background, and its object is to provide an information processing device, a method, and a program capable of easily surveying the wall surface of an object.
  • According to the present disclosure, there is provided an information processing device including: a first acquisition unit that acquires, for at least two reference points displayed on a screen and each having coordinate information in world coordinates, coordinate information on an image displayed on the screen; a first estimation unit that estimates the position in world coordinates of a virtual wall surface passing through the at least two reference points, based on the coordinate information of the at least two reference points in world coordinates, the coordinate information on the image, and information related to the imaging state when the image was captured; a second acquisition unit that acquires coordinate information on the image of an input point input to the screen; and a second estimation unit that estimates the world coordinates of the input point when the input point is located on the virtual wall surface.
  • According to the present disclosure, there is also provided a method in which a processor acquires, for at least two reference points displayed on a screen and each having coordinate information in world coordinates, coordinate information on an image displayed on the screen; estimates the position in world coordinates of a virtual wall surface passing through the at least two reference points, based on the coordinate information of the at least two reference points in world coordinates, the coordinate information on the image, and information related to the imaging state when the image was captured; acquires coordinate information on the image of an input point input to the screen; and estimates the world coordinates of the input point assuming that the input point is located on the virtual wall surface.
  • Further, according to the present disclosure, there is provided a program for causing a computer to function as: a first acquisition unit that acquires, for at least two reference points displayed on a screen and each having coordinate information in world coordinates, coordinate information on an image displayed on the screen; a first estimation unit that estimates the position in world coordinates of a virtual wall surface passing through the at least two reference points, based on the coordinate information of the at least two reference points in world coordinates, the coordinate information on the image, and information related to the imaging state when the image was captured; a second acquisition unit that acquires coordinate information on the image of an input point input to the screen; and a second estimation unit that estimates the world coordinates of the input point when the input point is located on the virtual wall surface.
  • According to the present disclosure, the wall surface of an object can be easily surveyed.
  • FIG. 1 is a diagram showing an outline of a system 1 according to an embodiment of the present disclosure.
  • The system 1 includes an information processing terminal 10 (an example of an information processing device) and an unmanned flying object 20.
  • The system 1 according to the present embodiment can be used, for example, for surveying or inspecting a building S1, which is the object photographed by the unmanned flying object 20.
  • The user U operates the touch panel of the information processing terminal 10 and uses the unmanned flying object 20 to capture an image including the wall surface W1 of the building S1.
  • The information processing terminal 10 defines a region of the wall surface W1 included in the image displayed on the touch panel of the information processing terminal 10, and acquires position information of the wall surface W1 in world coordinates (real space) based on that region and various information obtained when the unmanned flying object 20 captured the image. Based on such position information, for example, the length between arbitrary points on the wall surface W1 and the area of an arbitrary region can be estimated.
  • The information processing terminal 10 is implemented as a so-called tablet-type small computer.
  • The information processing terminal 10 may be realized by a portable information processing terminal such as a smartphone or a game machine, or by a stationary information processing terminal such as a personal computer.
  • The information processing terminal 10 may also be realized by a plurality of hardware devices, with its functions distributed among them.
  • FIG. 2 is a block diagram showing the configuration of the information processing terminal 10 according to the present embodiment.
  • The information processing terminal 10 includes a control unit 11 and a touch panel 12, which is an example of a display unit.
  • The processor 11a is an arithmetic unit that controls the operation of the control unit 11, controls the transmission and reception of data between the elements, and performs processing necessary for program execution.
  • The processor 11a is, for example, a CPU (Central Processing Unit), and executes each process by running a program stored in the storage 11c and loaded into the memory 11b, which will be described later.
  • The memory 11b includes a main storage device composed of a volatile storage device such as a DRAM (Dynamic Random Access Memory) and an auxiliary storage device composed of a non-volatile storage device such as flash memory or an HDD (Hard Disk Drive). The memory 11b is used as a work area of the processor 11a, and also stores a boot loader executed when the control unit 11 is started, various setting information, and the like.
  • The storage 11c stores programs and information used for various processes. For example, when the user operates the flying object via the information processing terminal 10 to capture image information of the wall surface W1, the storage 11c may store a program for controlling the flight of the flying object.
  • The transmission/reception unit 11d connects the control unit 11 to a network such as the Internet. It may communicate via a local area network (LAN), a wide area network (WAN), infrared, wireless, WiFi, a point-to-point (P2P) network, a telecommunications network, cloud communication, or the like, and may be equipped with a short-range communication interface such as Bluetooth (registered trademark) or BLE (Bluetooth Low Energy).
  • The input/output unit 11e is an interface to which input/output devices are connected; in the present embodiment, the touch panel 12 is connected to it.
  • The bus 11f transmits, for example, address signals, data signals, and various control signals between the connected processor 11a, memory 11b, storage 11c, transmission/reception unit 11d, and input/output unit 11e.
  • The touch panel 12 is an example of a display unit and includes a display surface on which acquired images and video are displayed.
  • The display surface receives information input by contact with it, and can be implemented by various techniques such as a resistive film method or a capacitance method.
  • An image captured by the unmanned flying object 20 can be displayed on the display surface of the touch panel 12.
  • Buttons, objects, and the like for flight control of the unmanned flying object 20 and control of the image pickup device may also be displayed on the display surface.
  • The user can input information to the image, buttons, and the like displayed on the display surface via the touch panel 12.
  • The operation for inputting such information is, for example, a touch (tap) operation, a slide operation, or a swipe operation on a button, an object, or the like.
  • FIG. 3 is a block diagram showing an example of the functional configuration of the unmanned flying object 20 according to the present embodiment.
  • The unmanned flying object 20 includes a transmission/reception unit 22, a flight controller 23, a battery 24, an ESC 25, a motor 26, a propeller 27, and a camera 28 in a main body 21.
  • The unmanned flying object 20 is an example of a flying object.
  • The type of the flying object is not particularly limited, and may be, for example, a so-called multi-rotor drone as illustrated.
  • The flight controller 23 can have one or more processors 23A such as a programmable processor (e.g., a central processing unit (CPU)).
  • The flight controller 23 has a memory 23B and can access the memory 23B.
  • The memory 23B stores logic, code, and/or program instructions that the flight controller can execute to perform one or more steps.
  • The memory 23B may include, for example, a removable medium such as an SD card, a random access memory (RAM), or an external storage device.
  • The data acquired from the sensors 23C may be transmitted directly to and stored in the memory 23B.
  • Still image and moving image data captured by the camera 28 are recorded in the built-in memory or an external memory.
  • The flight controller 23 includes a control module configured to control the state of the flying object.
  • The control module may adjust the spatial placement, velocity, and/or acceleration of the flying object, which has six degrees of freedom (translation along x, y, and z, and rotation θx, θy, and θz).
  • The propulsion mechanism (motor 26 and the like) of the flying object is controlled via the ESC (Electronic Speed Controller) 25.
  • The control module can control one or more of the camera 28, the sensors 23C, and the like.
  • The flight controller 23 can generate and retain information about the state of the flying object.
  • The information regarding the state of the flying object includes, for example, data acquired by the camera 28 and the sensors 23C.
  • The data acquired by the camera 28 includes, for example, image information generated by the camera 28 taking an image and information regarding the imaging state of the camera 28 (for example, the imaging position of the camera 28 and the imaging direction of the camera 28).
  • The imaging direction may include angles in the pan direction and tilt direction of the camera 28, and the like.
  • The flight controller 23 can communicate with the transmission/reception unit 22, which is configured to transmit and/or receive data from one or more external devices (for example, a terminal such as the information processing terminal 10, a display device, or a radio or other remote controller that remotely controls the unmanned flying object 20).
  • The transmission/reception unit 22 can use one or more of a local area network (LAN), a wide area network (WAN), infrared, wireless, WiFi, a point-to-point (P2P) network, a telecommunications network, cloud communication, and the like.
  • The transmission/reception unit 22 can transmit and/or receive one or more of the data acquired by the camera 28 and the sensors 23C, processing results generated by the flight controller 23, predetermined control data, user commands from the information processing terminal 10 or a remote controller, and the like.
  • The transmission/reception unit 22 may receive, for example, input made to the information processing terminal 10, and may receive flight or imaging control via a radio controller (not shown). The transmission/reception unit 22 may also transmit, for example, the data acquired by the camera 28 and the like to the information processing terminal via the radio controller (not shown).
  • The sensors 23C may include an inertial sensor (acceleration sensor, gyro sensor), a GPS sensor, a proximity sensor (for example, lidar), or a vision/image sensor (for example, a camera).
  • The battery 24 can be a known battery such as a lithium polymer battery.
  • The power for driving the unmanned flying object 20 is not limited to electric power supplied from the battery 24 or the like, and may be, for example, the power of an internal combustion engine.
  • The camera 28 is an example of an image pickup device.
  • The type of the camera 28 is not particularly limited, and may be, for example, an ordinary digital camera, an omnidirectional camera, an infrared camera, an image sensor such as a thermographic camera, or the like.
  • The camera 28 may be connected to the main body 21 via a gimbal or the like (not shown) so as to be independently displaceable.
  • FIG. 4 is a block diagram showing a functional configuration of the control unit 11 according to the present embodiment.
  • The control unit 11 includes an input information acquisition unit 111, a display control unit 112, an acquisition unit 113, an estimation unit 114, a calculation unit 115, and an image information DB (database) 116.
  • Each of these functional units can be realized by the processor 11a reading a program stored in the storage 11c into the memory 11b and executing it.
  • The input information acquisition unit 111 has a function of acquiring input information generated based on an operation on an image displayed on the touch panel 12.
  • The input information referred to here includes, for example, information regarding a position on the image displayed on the touch panel 12.
  • The position on the image is, for example, the position of the pixels constituting the image. That is, the input information includes information indicating at which position on the image the user performed an operation. More specifically, the input information may include information related to the designation of points on the image displayed on the touch panel 12 and information related to a touch operation on a button or other object displayed on the touch panel 12.
  • The display control unit 112 has a function of displaying the acquired image on the touch panel 12. The display control unit 112 also has a function of displaying, in the image, buttons, objects, text, and other information for providing information to the user of the system 1 and for acquiring input information based on the user's operations. It may further have a function of displaying, on the touch panel 12, content based on the input information obtained by the input information acquisition unit 111.
  • The acquisition unit 113 has a function of acquiring input information and image information.
  • The acquisition unit 113 may acquire image information obtained by imaging an object from the image information DB 116. The acquisition unit 113 may also acquire image information in real time from the unmanned flying object 20 while it is performing image pickup processing.
  • The acquisition unit 113 may acquire other information based on various information.
  • The acquisition unit 113 includes a first acquisition unit 1131 and a second acquisition unit 1132.
  • The first acquisition unit 1131 acquires, for at least two reference points that are displayed on the screen of the touch panel 12 and each have coordinate information in world coordinates, their coordinate information on the image displayed on the screen.
  • The reference point referred to here may be associated with coordinate information in world coordinates (coordinates in real space), for example, latitude information, longitude information, and altitude information.
  • The reference point can be set on the screen by, for example, an operation of the user U on the touch panel 12.
  • The coordinate information in world coordinates may be attached when the reference point is set on the screen as described later, or may be attached to the reference point in advance.
  • The coordinate information on the image may be, for example, coordinate information in an XY coordinate system determined based on the pixels of the image. The specific behavior will be described later.
  • The reference point may have, for example, coordinate information in a predetermined height direction in world coordinates.
  • The coordinate information in the predetermined height direction may be coordinate information in the height direction of the ground.
  • The coordinate information of the reference point in the plane direction in world coordinates may be obtained based on the coordinate information in the predetermined height direction, the coordinate information of the reference point on the image, and the information related to the imaging state when the image was captured. That is, it can be obtained by a coordinate conversion process based on a conversion formula derived from the coordinate information of the reference point on the image, the information related to the imaging state, and the coordinate information in the predetermined height direction.
  • The second acquisition unit 1132 has a function of acquiring coordinate information on the image of at least one input point input to the screen.
  • The input point is different from the reference point.
  • The input point may be set on the screen by, for example, an operation of the user U on the touch panel 12.
  • The estimation unit 114 has a function of estimating a position or the like in world coordinates based on the coordinate information acquired by the acquisition unit 113. Specifically, the estimation unit 114 includes a first estimation unit 1141 and a second estimation unit 1142.
  • The first estimation unit 1141 has a function of estimating the position in world coordinates of a virtual wall surface passing through the at least two reference points, based on the coordinate information of the at least two reference points in world coordinates, their coordinate information on the image, and the information related to the imaging state when the image was captured. For example, if the (actual) wall surface of the object is assumed to be at a predetermined angle (for example, perpendicular) to the ground, the virtual wall surface can be considered to be at the same predetermined angle. The first estimation unit 1141 can then estimate the position of the virtual wall surface in world coordinates by performing, for example, a coordinate conversion process as described later.
  • The position of the virtual wall surface in world coordinates means, for example, the group of coordinates in real space of the region corresponding to the virtual wall surface.
  • The second estimation unit 1142 has a function of estimating the world coordinates of an input point when the input point is located on the virtual wall surface.
  • The second estimation unit 1142 can estimate the world coordinates of the input point from its coordinate information on the image by performing a coordinate conversion process under the assumption that the input point is located on the virtual wall surface.
  • The calculation unit 115 has a function of calculating, in world coordinates, the distance between a reference point and an input point on the virtual wall surface. The calculation unit 115 may also calculate the distance between an input point and another input point on the virtual wall surface, and may calculate the area of a region on the virtual wall surface whose vertices are points on the virtual wall surface, in world coordinates, corresponding to at least one of the reference points and the input points. Such a region may have vertices corresponding to both reference points and input points, or vertices corresponding to a plurality of input points.
  • A region whose vertices correspond to a plurality of input points may be, for example, a region formed only by vertices corresponding to input points. Such a region may be bounded by straight lines, curved lines, or both; that is, the region may be polygonal or drawn freehand. The setting of such a region is performed by, for example, the calculation unit 115.
  • The image information DB 116 is a database that stores information (image information) of images captured by the unmanned flying object 20 or the like.
  • Such an image may be an image captured by the unmanned flying object 20, but is not limited to this example.
  • Such an image may also be an image captured from a high place or the like using a digital camera, a smartphone, a tablet, or the like.
  • If the imaging position, imaging direction, and the like can be acquired by any method, such information can be used as information related to the imaging state.
  • FIG. 5 is a flowchart of a series of processes in the information processing terminal 10 according to the present embodiment.
  • First, the acquisition unit 113 acquires image information and information related to the imaging state (step SQ101).
  • The unmanned flying object 20 captures an image of the wall surface W1 of the building S1.
  • FIG. 6 is a diagram showing an example of image pickup processing by the unmanned flying object 20.
  • The unmanned flying object 20 is flying in the vicinity of the building S1, and the camera 28 faces in the direction PV1 toward the wall surface W1.
  • The camera 28 performs image pickup processing so that the wall surface W1 is captured, and obtains a captured image.
  • The unmanned flying object 20 acquires, as information related to the imaging state, information on the height H1 of the camera 28, the horizontal imaging direction Dir1 of the camera 28, and the imaging angle Ang1 of the camera 28.
  • The obtained image information and the information related to the imaging state are transmitted from the unmanned flying object 20 to the information processing terminal 10.
  • The acquisition unit 113 acquires this information.
  • The image information may be stored once in the image information DB 116, and the information related to the imaging state may be associated with the image information.
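  • As a rough illustration (not part of the original disclosure), the information related to the imaging state acquired here, namely the height H1, the horizontal imaging direction Dir1, and the imaging angle Ang1, could be bundled and converted into a camera pose roughly as in the following Python sketch; the field names, angle conventions, and axis choices are assumptions made only for this example, not the actual implementation.

    # Minimal sketch (assumed names and conventions, not the actual implementation):
    # bundle the imaging-state information (height H1, horizontal direction Dir1,
    # tilt angle Ang1) and convert it into a camera position and rotation matrix.
    from dataclasses import dataclass
    from typing import Tuple
    import numpy as np

    @dataclass
    class ImagingState:
        height_m: float      # H1: height of the camera 28 above the ground
        yaw_deg: float       # Dir1: horizontal imaging direction (0 = +X axis, counter-clockwise positive)
        tilt_deg: float      # Ang1: downward tilt of the optical axis from horizontal
        x_m: float = 0.0     # horizontal camera position, if available
        y_m: float = 0.0

    def camera_pose(state: ImagingState) -> Tuple[np.ndarray, np.ndarray]:
        """Return (camera_position, R), where R maps camera-frame vectors to the world frame.
        Assumed camera frame: +Z along the optical axis, +X to the right, +Y toward the image bottom."""
        yaw, tilt = np.radians(state.yaw_deg), np.radians(state.tilt_deg)
        # Optical axis as a unit vector in world coordinates (world z axis points up).
        forward = np.array([np.cos(yaw) * np.cos(tilt),
                            np.sin(yaw) * np.cos(tilt),
                            -np.sin(tilt)])
        right = np.array([np.sin(yaw), -np.cos(yaw), 0.0])   # horizontal, to the camera's right
        down = np.cross(forward, right)                       # completes the right-handed basis
        R = np.column_stack([right, down, forward])           # columns = camera axes in world frame
        position = np.array([state.x_m, state.y_m, state.height_m])
        return position, R
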
  • FIG. 7 is a display example on the screen V1 of the touch panel 12.
  • The image D11 captured by the unmanned flying object 20 is displayed on the screen V1.
  • The image D11 includes an image of the building S1.
  • The building S1 includes a wall surface W1, a window W2, and a window W3.
  • The screen V1 may be provided with a button 101 for creating an object on the virtual wall surface, a button 102 for creating a region object corresponding to a window, and a button 103 for deleting a region that has been created.
  • Next, the input information acquisition unit 111 acquires an operation of the user U, and reference points are set (step SQ105).
  • The reference points are points used to acquire the coordinates of the wall surface W1 in world coordinates, and at least two of them are provided.
  • The reference points may be set on the image D11, for example, by an operation of the user U.
  • FIG. 8 is a diagram showing an example of setting reference points on the image D11. As shown in FIG. 8, two reference points 51 and 52 may be set on the image D11. For example, the two reference points 51 and 52 are set so as to correspond to the positions of the lower vertices of the wall surface W1 to be surveyed.
  • Next, the first acquisition unit 1131 associates the coordinates of the reference points on the image with coordinates in world coordinates, and acquires the coordinate information of the reference points in world coordinates (step SQ107).
  • Here, the coordinates of the reference points 51 and 52 in the height direction in world coordinates (that is, the height from the ground) are treated as known. The coordinates in the height direction of the lower vertices of the wall surface W1 may be, for example, 0, or an actual altitude value other than 0.
  • The horizontal coordinates of the reference points 51 and 52 in world coordinates can be obtained from the coordinates of the reference points 51 and 52 on the image, the coordinates in the height direction in world coordinates, and the information related to the imaging state of the camera 28, using a known coordinate conversion method. Specifically, the coordinates of the reference points 51 and 52 on the image are converted into film coordinates of the camera 28, the film coordinates are converted into camera coordinates of the camera 28, and the camera coordinates are converted into world coordinates, whereby the world coordinates of the reference points 51 and 52 can be obtained.
  • The conversion formulas between the respective coordinate systems can be determined from the specifications of the camera 28 and the information related to the imaging state of the camera 28. Such coordinate conversion may also use a method that takes into account camera specifications such as the F-number and distortion of the lens.
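  • The chain of conversions described above (coordinates on the image, film coordinates, camera coordinates, world coordinates, constrained by the known height) corresponds to a standard pinhole back-projection. A minimal sketch follows; the intrinsic parameters, the pose values, and the omission of lens distortion are simplifying assumptions for illustration only, not the actual implementation.

    # Minimal sketch of the coordinate conversion for a point whose height above
    # the ground is known (assumed pinhole model; lens distortion is ignored).
    import numpy as np

    def pixel_to_world_on_height(u, v, fx, fy, cx, cy, cam_pos, R, plane_z=0.0):
        """Back-project pixel (u, v) onto the horizontal plane z = plane_z.

        fx, fy, cx, cy : assumed intrinsics (focal lengths and principal point, in pixels)
        cam_pos        : camera position in world coordinates, shape (3,)
        R              : 3x3 rotation mapping camera-frame vectors to the world frame
        """
        # Image (pixel) coordinates -> normalized film/camera coordinates.
        ray_cam = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
        # Camera coordinates -> a viewing ray expressed in world coordinates.
        ray_world = R @ ray_cam
        if abs(ray_world[2]) < 1e-9:
            raise ValueError("viewing ray is parallel to the ground plane")
        # Intersect the ray cam_pos + t * ray_world with the plane z = plane_z.
        t = (plane_z - cam_pos[2]) / ray_world[2]
        if t <= 0:
            raise ValueError("the plane is behind the camera")
        return cam_pos + t * ray_world   # world coordinates (x, y, plane_z)

    # Placeholder example: a camera 12 m above the ground, facing +X and tilted 30 degrees down.
    tilt = np.radians(30.0)
    forward = np.array([np.cos(tilt), 0.0, -np.sin(tilt)])
    right = np.array([0.0, -1.0, 0.0])
    R = np.column_stack([right, np.cross(forward, right), forward])
    print(pixel_to_world_on_height(960, 540, 1000, 1000, 960, 540,
                                   np.array([0.0, 0.0, 12.0]), R))
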
  • FIG. 9 is a diagram for explaining the process of estimating the position of the virtual wall surface.
  • A virtual wall surface 60 passing through the reference points 51 and 52 is virtually provided on the image D11.
  • The virtual wall surface 60 may or may not actually be displayed on the screen V1.
  • The virtual wall surface 60 is obtained, for example, as follows. Here, it is assumed that the virtual wall surface 60 is perpendicular to the ground, but the present technique is not limited to this example.
  • First, the normal direction (yaw direction) of the virtual wall surface is calculated on the assumption that the virtual wall surface passes through the reference points 51 and 52.
  • Then, the coordinate system formed by the line segment connecting the reference points 51 and 52 and the calculated normal direction is defined as the coordinate system of the virtual wall surface.
  • Thereby, the coordinates of the virtual wall surface 60 on the image are associated with world coordinates, and the position of the virtual wall surface 60 in world coordinates corresponds to the position of the actual wall surface W1 in real space.
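  • Under the same assumption that the wall is perpendicular to the ground, the coordinate system of the virtual wall surface described above can be sketched as follows. The data layout and axis names are assumptions for illustration, not the actual implementation.

    # Minimal sketch: a vertical virtual wall surface passing through two reference
    # points with known world coordinates (assumed convention: the z axis points up).
    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class WallPlane:
        origin: np.ndarray   # world coordinates of reference point 51
        u_axis: np.ndarray   # unit vector along the segment from point 51 to point 52 (horizontal)
        v_axis: np.ndarray   # unit vector pointing straight up
        normal: np.ndarray   # horizontal unit normal (the yaw direction of the wall)

    def make_wall_plane(ref_a, ref_b) -> WallPlane:
        """Build the wall coordinate system from reference points 51 (ref_a) and 52 (ref_b)."""
        a = np.asarray(ref_a, dtype=float)
        b = np.asarray(ref_b, dtype=float)
        u = b - a
        u[2] = 0.0                       # the base of a vertical wall is horizontal
        u /= np.linalg.norm(u)
        v = np.array([0.0, 0.0, 1.0])    # "up", because the wall is assumed to be vertical
        n = np.cross(u, v)               # horizontal normal direction of the wall
        return WallPlane(origin=a, u_axis=u, v_axis=v, normal=n)

    def to_wall_uv(plane: WallPlane, world_point) -> np.ndarray:
        """Express a world point lying on the wall in the wall's 2-D (u, v) coordinates."""
        d = np.asarray(world_point, dtype=float) - plane.origin
        return np.array([d @ plane.u_axis, d @ plane.v_axis])
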
  • Next, the input information acquisition unit 111 acquires an operation of the user U, and input points are set (step SQ111).
  • The second acquisition unit 1132 acquires the coordinate information of the input points on the image.
  • The second estimation unit 1142 then estimates the world coordinates of each input point under the assumption that the input point is located on the virtual wall surface 60 (step SQ113).
  • Then, a region having the reference points and the input points as vertices is set (step SQ115).
  • FIG. 10 is a diagram for explaining the processes related to the setting of the input points and the setting of the region.
  • Input points 53 and 54 are set in addition to the reference points 51 and 52.
  • The input points 53 and 54 are treated as being located on the virtual wall surface 60.
  • The world coordinates of the input points 53 and 54 are obtained by the second estimation unit 1142 converting their coordinates on the image D11 via the coordinate system of the virtual wall surface 60.
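  • The conversion performed here by the second estimation unit 1142 can be pictured as intersecting the viewing ray through each input point with the plane of the virtual wall surface 60, instead of with the ground as for the reference points. The sketch below reuses the conventions of the earlier sketches and is an illustrative assumption, not the actual implementation.

    # Minimal sketch: world coordinates of an input point assumed to lie on the
    # virtual wall surface (ray/plane intersection; assumed pinhole model).
    import numpy as np

    def pixel_to_world_on_wall(u, v, fx, fy, cx, cy, cam_pos, R, wall_origin, wall_normal):
        """Intersect the viewing ray through pixel (u, v) with the plane of the virtual wall."""
        ray_cam = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])   # image -> camera coordinates
        ray_world = R @ ray_cam                                   # camera -> world direction
        normal = np.asarray(wall_normal, dtype=float)
        denom = ray_world @ normal
        if abs(denom) < 1e-9:
            raise ValueError("viewing ray is parallel to the virtual wall surface")
        t = ((np.asarray(wall_origin, dtype=float) - cam_pos) @ normal) / denom
        if t <= 0:
            raise ValueError("the virtual wall surface is behind the camera")
        return cam_pos + t * ray_world   # world coordinates of the input point
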
  • The region 61, whose vertices are the reference points 51 and 52 and the input points 53 and 54, corresponds to the virtual wall surface 60, that is, to the wall surface W1 shown in the image D11.
  • The calculation unit 115 then calculates the area enclosed by the region 61 (step SQ117).
  • The calculation unit 115 calculates the real-space area of the region enclosed by the reference points 51 and 52 and the input points 53 and 54 using their coordinate information in world coordinates. The calculation unit 115 may also calculate the distance between two points in real space. Information on such an area and distance may be output to the screen V1 or the like in any manner by the display control unit 112, for example.
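  • Once all the reference points and input points have world coordinates on the virtual wall surface 60, the distance and area computations of the calculation unit 115 reduce to ordinary plane geometry in the wall's two-dimensional coordinate system. The following sketch (shoelace formula applied to vertices given in order around the region, with placeholder values) is an illustration under those assumptions, not the actual implementation.

    # Minimal sketch of the calculation unit 115: distances between points in world
    # coordinates and the area of a polygonal region on the virtual wall surface.
    import numpy as np

    def distance(p, q) -> float:
        """Euclidean distance between two points given in world coordinates (metres)."""
        return float(np.linalg.norm(np.asarray(p, dtype=float) - np.asarray(q, dtype=float)))

    def polygon_area(uv_points) -> float:
        """Area of a polygon whose vertices are given, in order, in the wall's 2-D
        (u, v) coordinates (shoelace formula)."""
        pts = np.asarray(uv_points, dtype=float)
        x, y = pts[:, 0], pts[:, 1]
        return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

    # Placeholder example: a 10 m x 3 m region 61 with the reference points 51 and 52
    # as its lower vertices and the input points 53 and 54 as its upper vertices.
    region_61 = [(0.0, 0.0), (10.0, 0.0), (10.0, 3.0), (0.0, 3.0)]
    print(polygon_area(region_61))   # -> 30.0 (square metres)
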
  • FIG. 11 is a diagram for explaining a process related to estimation of a region according to a modification of the present embodiment.
  • Within the region 61 (corresponding to the wall surface W1) surrounded by the reference points 51 and 52 and the input points 53 and 54, a region 62 surrounded by four additional input points 55, 56, 57, and 58 is set.
  • Such a region 62 corresponds to the window W2 provided on the wall surface W1 of the building S1.
  • The calculation unit 115 calculates the area of the region 61 and the area of the region 62; by subtracting the area of the region 62 from the area of the region 61, the area of the wall surface W1 excluding the window W2 can be calculated.
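  • Continuing the same illustrative sketch (with placeholder coordinates), the modification described above amounts to subtracting the window's polygon area from the wall's polygon area:

    # Placeholder (u, v) coordinates on the virtual wall surface; polygon_area is the
    # shoelace helper from the previous sketch.
    region_61 = [(0.0, 0.0), (10.0, 0.0), (10.0, 3.0), (0.0, 3.0)]   # wall surface W1
    region_62 = [(2.0, 1.0), (3.5, 1.0), (3.5, 2.2), (2.0, 2.2)]     # window W2
    net_area = polygon_area(region_61) - polygon_area(region_62)      # 30.0 - 1.8 = 28.2 m^2
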
  • As described above, for an image of the wall surface W1 of the building S1 captured from the air by the unmanned flying object 20, at least two reference points whose world coordinates are known can be used to estimate the position in world coordinates of the corresponding virtual wall surface, to set input points on the virtual wall surface, and to estimate the position coordinates of the reference points and the input points in world coordinates.
  • If the world coordinates of the at least two reference points in the height direction (for example, the height from the ground) are known, the coordinate information of the reference points in world coordinates can be acquired by coordinate conversion from the information related to the imaging state.
  • Furthermore, by setting each reference point at a position corresponding to the portion where the wall surface W1 of the building S1 meets the ground, the coordinate information of the reference point in world coordinates can be determined uniquely.
  • Likewise, the coordinate information in world coordinates of an input point set on the image, under the assumption that the input point is located on the virtual wall surface, can be determined uniquely.
  • Thereby, the area of the wall surface in real space and the distance between two points on it can be calculated.
  • As a result, even for a wall surface imaged from an oblique direction, the actual position, length, and area in real space can be calculated more accurately.
  • The device described in the present specification may be realized as a single device, or may be realized by a plurality of devices or the like that are partially or wholly connected by a network.
  • For example, the control unit and the storage of the information processing terminal 10 may be realized by different servers connected to each other by a network.
  • The series of processes performed by the apparatus described in the present specification may be realized using software, hardware, or a combination of software and hardware. A computer program for realizing each function of the information processing terminal 10 according to the present embodiment can be created and implemented on a PC or the like. A computer-readable recording medium storing such a computer program can also be provided.
  • The recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like. The above computer program may also be distributed, for example, via a network without using a recording medium.
  • (Item 1) An information processing device comprising: a first acquisition unit that acquires, for at least two reference points displayed on a screen and each having coordinate information in world coordinates, coordinate information on an image displayed on the screen; a first estimation unit that estimates the position in world coordinates of a virtual wall surface passing through the at least two reference points, based on the coordinate information of the at least two reference points in world coordinates, the coordinate information on the image, and information related to the imaging state when the image was captured; a second acquisition unit that acquires coordinate information on the image of an input point input to the screen; and a second estimation unit that estimates the world coordinates of the input point when the input point is located on the virtual wall surface.
  • (Item 2) The information processing device according to item 1, wherein the reference point has coordinate information in a predetermined height direction in the world coordinates, and the coordinate information in the plane direction in the world coordinates of the reference point is acquired based on the coordinate information in the predetermined height direction, the coordinate information of the reference point on the image, and the information related to the imaging state when the image was captured.
  • (Item 3) The information processing device according to item 2, wherein the coordinate information in the predetermined height direction in the world coordinates is coordinate information in the height direction of the ground.
  • (Item 4) The information processing device according to any one of the above items, further comprising a calculation unit that calculates the distance between the reference point and the input point on the virtual wall surface in the world coordinates.
  • (Item 5) The information processing device according to any one of the above items, further comprising a calculation unit that calculates the area of a region on the virtual wall surface whose vertices are points on the virtual wall surface, in the world coordinates, corresponding to at least one of the reference point and the input point.
  • (Item 6) The information processing device according to item 5, wherein the region includes a region whose vertices are points corresponding to the reference point and the input point.
  • (Item 7) The information processing device according to item 5, wherein the region includes a region whose vertices are points corresponding to a plurality of the input points.
  • (Item 8) The information processing device according to any one of the above items, wherein the information related to the imaging state includes information on the imaging position of the imaging device and the imaging direction of the imaging device.
  • (Item 9) The information processing device according to any one of items 1 to 8, wherein the image includes an image captured by an unmanned flying object.
  • (Item 10) A method in which a processor acquires, for at least two reference points displayed on a screen and each having coordinate information in world coordinates, coordinate information on an image displayed on the screen; estimates the position in world coordinates of a virtual wall surface passing through the at least two reference points, based on the coordinate information of the at least two reference points in world coordinates, the coordinate information on the image, and the information related to the imaging state when the image was captured; acquires coordinate information on the image of an input point input to the screen; and estimates the world coordinates of the input point when the input point is located on the virtual wall surface.
  • (Item 11) A program for causing a computer to function as: a first acquisition unit that acquires, for at least two reference points displayed on a screen and each having coordinate information in world coordinates, coordinate information on an image displayed on the screen; a first estimation unit that estimates the position in world coordinates of a virtual wall surface passing through the at least two reference points, based on the coordinate information of the at least two reference points in world coordinates, the coordinate information on the image, and the information related to the imaging state when the image was captured; a second acquisition unit that acquires coordinate information on the image of an input point input to the screen; and a second estimation unit that estimates the world coordinates of the input point when the input point is located on the virtual wall surface.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The problem to be solved by the present invention is to easily measure the wall surface of an object. The solution of the present disclosure is an information processing device (10) comprising: a first acquisition unit (1131) for acquiring coordinate information, on an image displayed on a screen, of at least two reference points, each displayed on the screen and having coordinate information in world coordinates; a first estimation unit (1141) for estimating the position in world coordinates of a virtual wall surface passing through the at least two reference points, based on the coordinate information of the at least two reference points in world coordinates, the coordinate information of the at least two reference points on the image, and information related to the imaging state in which the image was captured; a second acquisition unit (1132) for acquiring coordinate information on the image of at least one input point input with respect to the screen; and a second estimation unit (1142) for estimating the world coordinates of the input point for a case where the input point is assumed to be located on the virtual wall surface.
PCT/JP2021/033765 2020-11-30 2021-09-14 Information processing device, method, and program WO2022113482A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022565078A JPWO2022113482A1 (fr) 2020-11-30 2021-09-14

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-199019 2020-11-30
JP2020199019 2020-11-30

Publications (1)

Publication Number Publication Date
WO2022113482A1 true WO2022113482A1 (fr) 2022-06-02

Family

ID=81755528

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/033765 WO2022113482A1 (fr) 2020-11-30 2021-09-14 Information processing device, method, and program

Country Status (2)

Country Link
JP (1) JPWO2022113482A1 (fr)
WO (1) WO2022113482A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000221037A (ja) * 1999-01-29 2000-08-11 Topcon Corp 自動測量機と3次元測定方法
JP2001033245A (ja) * 1999-07-19 2001-02-09 Maeda Science:Kk 平面上の点の位置測定方法
JP2004163292A (ja) * 2002-11-13 2004-06-10 Topcon Corp 測量装置と電子的記憶媒体

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000221037A (ja) * 1999-01-29 2000-08-11 Topcon Corp 自動測量機と3次元測定方法
JP2001033245A (ja) * 1999-07-19 2001-02-09 Maeda Science:Kk 平面上の点の位置測定方法
JP2004163292A (ja) * 2002-11-13 2004-06-10 Topcon Corp 測量装置と電子的記憶媒体

Also Published As

Publication number Publication date
JPWO2022113482A1 (fr) 2022-06-02

Similar Documents

Publication Publication Date Title
JP6765512B2 (ja) Flight path generation method, information processing device, flight path generation system, program, and recording medium
US11556681B2 (en) Method and system for simulating movable object states
US20200012756A1 (en) Vision simulation system for simulating operations of a movable platform
CN111344644A (zh) 用于基于运动的自动图像捕获的技术
JP6829513B1 (ja) Position calculation method and information processing system
WO2019230604A1 (fr) Inspection system
WO2021251441A1 (fr) Method, system, and program
JP2023100642A (ja) Inspection system
US20210404840A1 (en) Techniques for mapping using a compact payload in a movable object environment
US20230177707A1 (en) Post-processing of mapping data for improved accuracy and noise-reduction
WO2020019175A1 (fr) Image processing method and device, photographing device, and unmanned aerial vehicle
US20210185235A1 (en) Information processing device, imaging control method, program and recording medium
US20220187828A1 (en) Information processing device, information processing method, and program
JP6681101B2 (ja) Inspection system
JP7004374B1 (ja) Method and program for generating a movement route of a moving object, management server, and management system
WO2022113482A1 (fr) Information processing device, method, and program
US20220113421A1 (en) Online point cloud processing of lidar and camera data
JP2020012774A (ja) Building measurement method
JP6684012B1 (ja) Information processing device and information processing method
WO2021124579A1 (fr) Flying vehicle image capturing method and information processing device
WO2021130980A1 (fr) Method for displaying the flight path of an aircraft and information processing device
WO2022070851A1 (fr) Method, system, and program
JP2023083072A (ja) Method, system, and program
US20240013460A1 (en) Information processing apparatus, information processing method, program, and information processing system
JP6681102B2 (ja) Inspection system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21897460

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022565078

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21897460

Country of ref document: EP

Kind code of ref document: A1