WO2013030929A1 - Monitoring device, monitoring system and monitoring method - Google Patents
Monitoring device, monitoring system and monitoring method
- Publication number
- WO2013030929A1 (PCT/JP2011/069462)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- equipment
- self
- shape
- facility
- map
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/02—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
- G01B11/03—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring coordinates of points
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/30—Services specially adapted for particular environments, situations or purposes
- H04W4/33—Services specially adapted for particular environments, situations or purposes for indoor environments, e.g. buildings
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/30—Services specially adapted for particular environments, situations or purposes
- H04W4/40—Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
- H04W4/44—Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]
Definitions
- the present invention relates to a technique for monitoring equipment elements in a building.
- For self-position estimation, self-localization methods for robots, such as the one described in Non-Patent Document 1, can be used.
- However, although the method of Non-Patent Document 1 can estimate the self-position, it cannot identify which equipment element is being monitored. For useful monitoring, in addition to self-position estimation, it is desirable to identify which equipment elements are being monitored.
- An object of the present invention is to enable a monitoring device that monitors equipment elements in a building to identify which equipment elements are being monitored.
- A typical example of the invention disclosed in the present application is as follows: a monitoring device that monitors equipment elements in a building, comprising a three-dimensional measurement unit that measures a three-dimensional shape around the monitoring device; a self-position estimation unit that estimates the self-position of the monitoring device by collating the measured three-dimensional shape with a self-position estimation map including the shapes and positions of structures in the building other than the equipment elements; a peripheral equipment element extraction unit that extracts equipment elements around the estimated self-position from a facility element matching map including the shapes and positions of equipment elements in the building; a facility element candidate extraction unit that extracts the shapes and positions of equipment element candidates from the measured three-dimensional shape; and a degree-of-coincidence evaluation unit that calculates a degree of coincidence based on the error distribution of the shapes and positions of the equipment elements around the self-position and the error distribution of the shapes and positions of the equipment element candidates extracted from the measured three-dimensional shape.
- FIG. 1 is an external configuration diagram of the monitoring device in the first embodiment. FIG. 2 is a view in the direction of arrow A in FIG. 1. FIG. 3 is a block diagram showing the hardware configuration of the monitoring system constituted by the monitoring device and a server in the first embodiment. FIG. 4 is a functional block diagram showing the logical structure of the monitoring device in the first embodiment. FIG. 5 is a functional block diagram showing the details of the equipment element specifying unit in the first embodiment.
- In buildings such as power transmission/distribution facilities and chemical plants, the operational status of equipment elements is often monitored by a monitoring system such as SCADA (Supervisory Control And Data Acquisition), and the monitoring work requires that the correspondence between the equipment elements being monitored and the equipment elements managed by the monitoring system be established with high reliability.
- This is because, to determine whether an equipment element is operating normally or abnormally, it is necessary to identify the equipment element and obtain the normal range of its measured physical quantity from a database.
- Therefore, in this embodiment, the monitoring apparatus not only estimates its own position but also specifies which equipment element is being monitored. Further, an abnormality of the equipment element is determined based on the measured physical quantity of the equipment element being monitored.
- FIG. 1 shows an external configuration of the monitoring apparatus 100 according to the first embodiment of the present invention.
- The monitoring device 100 includes a main body 101, sensors 102 to 105, and a wireless communication device 106. During the monitoring work, the operator grips the main body 101 and points the sensors 102 to 105 forward, and the sensors 102 to 105 measure the surroundings of the monitoring device 100.
- the “building” is a building such as a power / distribution facility or a chemical plant
- the “facility element” is an element such as a pipe, a pump, or a valve installed inside the building.
- the sensors 102 to 105 are attached to the front surface of the main body 101.
- the sensors 102 to 105 include a laser range finder (also referred to as “scanner-type range sensor”) 102, a thermal infrared camera 103, a visible light camera 104, and an acceleration sensor 105.
- the laser range finder 102 has a laser irradiation part and a light receiving part.
- The laser range finder 102 emits the laser beam from the laser irradiation unit while sweeping it with a rotary reflecting mirror or the like, and measures the time until the laser beam is reflected from the surface of the nearest object and returns to the light receiving unit, thereby measuring the distance to that object. By measuring in various directions, the laser range finder 102 measures the three-dimensional shape around the monitoring device 100.
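As a rough illustration of this time-of-flight principle, the distance follows from half the round-trip time, and scanning many beam directions yields a point cloud. A minimal sketch (the function names and the spherical-to-Cartesian conversion are illustrative, not from the patent):

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to the nearest object from the laser round-trip time.
    The beam travels out and back, so the one-way distance is half."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

def to_point(distance_m: float, azimuth_rad: float, elevation_rad: float):
    """Convert one range measurement plus beam direction into an (x, y, z)
    point in the sensor frame; scanning many directions yields the
    surrounding 3D shape as a point cloud."""
    x = distance_m * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = distance_m * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = distance_m * math.sin(elevation_rad)
    return (x, y, z)
```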
- “periphery” means a region having a predetermined radius centered on the monitoring device 100 (for example, a range in which the distance can be measured by the laser range finder 102).
- the thermal infrared camera 103 is composed of a CCD (Charge Coupled Device) image sensor having a sensitivity distribution in the infrared region.
- the thermal infrared camera 103 measures the intensity (or wavelength) of radiant heat from the facility element being monitored, that is, the temperature of the facility element being monitored.
- the visible light camera 104 is composed of a CCD image sensor having a sensitivity distribution in the visible light region.
- the visible light camera 104 acquires a visible light image of the facility element being monitored. Note that the thermal infrared camera 103 and the visible light camera 104 can be configured by a single CCD image sensor.
- the acceleration sensor 105 is composed of a semiconductor accelerometer and a gyroscope, and measures the acceleration in the three-axis direction and the rotational acceleration around the three axes of the monitoring device 100.
- The wireless communication device 106 is a device for connecting the monitoring device 100 to an external server 120 and exchanging data with the server 120 (for example, the equipment ID, position, temperature, and abnormality detection result of the equipment element being monitored).
- the wireless communication device 106 is, for example, a wireless LAN transmission / reception module.
- the monitoring device 100 and the server 120 may be connected by wire.
- the monitoring device 100 and the server 120 are connected by a wired LAN interface, a serial interface such as USB, or the like.
- The monitoring apparatus 100 and the server 120 may communicate in real time, or may communicate only as needed.
- the main body 101 has a processor for data processing, a memory for data storage, and the like. Further, as shown in FIG. 2, the main body 101 has a display 201 and an operation unit 202 including buttons on the back surface thereof, for example.
- the server 120 includes a wireless communication device 121 for exchanging data with the monitoring device 100.
- the wireless communication device 121 is, for example, a wireless LAN module.
- FIG. 3 is a block diagram illustrating a hardware configuration of the monitoring system 1 including the monitoring device 100 and the server 120.
- the monitoring device 100 is a computer having a processor 301, a memory 302, a storage device 303, an input interface 304, an output interface 305, and a wireless communication device 106. These configurations 301 to 305 and 106 are connected to each other by a bus 306.
- the processor 301 executes a program stored in the memory 302.
- The memory 302 is a volatile storage device such as a DRAM and stores the programs executed by the processor 301. Specifically, the memory 302 stores programs for the self-position estimating unit 401, the equipment element specifying unit 402, the coordinate system correcting unit 403, the 3D mapping units 404 and 405, the abnormality detecting unit 406, and the image superimposing unit 407 shown in FIG. 4. The memory 302 also stores an operating system (OS). The processor 301 executes the operating system, thereby realizing the basic functions of the computer.
- The storage device 303 is a non-volatile storage device such as a magnetic disk drive or a flash memory, and stores data used when the processor 301 executes a program. Specifically, the storage device 303 stores the self-position estimation map 411, the equipment element matching map 412, the installation error statistical model 413, the equipment CAD coordinate system 414, the standard temperature data 415, and the system diagram data 416 shown in FIG. 4.
- the memory 302 stores a program for mounting the functional units 401 to 407 of the monitoring apparatus 100
- the storage device 303 stores data 411 to 416 used by the functional units 401 to 407.
- the programs corresponding to the functional units 401 to 407 are stored in the storage device 303, and are read from the storage device 303 and loaded into the memory 302 when the program is executed.
- Data 411 to 416 are also read from the storage device 303 and loaded into the memory 302 when the program needs them.
- the sensors 102 to 105 are connected to the input interface 304.
- a display 201 is connected to the output interface 305.
- the server 120 is a computer having a processor 311, a memory 312, a storage device 313, an input interface 314, an output interface 315, and a wireless communication device 121. These components 311 to 315 and 121 are connected to each other by a bus 316.
- the processor 311 executes a program (such as software for managing the facility elements and the monitoring device 100) stored in the memory 312.
- the memory 312 is a volatile storage device such as a DRAM and stores a program executed by the processor 311.
- the memory 312 stores an operating system (OS).
- the processor 311 executes the operating system, thereby realizing the basic functions of the computer.
- the storage device 313 is a nonvolatile storage device such as a magnetic disk drive or a flash memory, and stores data used when the processor 311 executes a program.
- Programs are stored in the storage device 313, and are read from the storage device 313 and loaded into the memory 312 when executed. Data are likewise read from the storage device 313 and loaded into the memory 312 when a program requires them.
- the input interface 314 is connected to an input device 317 such as a keyboard and a mouse.
- a display 318 is connected to the output interface 315.
- FIG. 4 is a functional block diagram showing the logical configuration of the monitoring apparatus 100.
- the monitoring device 100 corresponds to the portion surrounded by a broken line in the figure, and includes sensors 102 to 105, a display 201, functional units 401 to 407, and data 411 to 416.
- The data 411 to 416 are created in advance, prior to the monitoring work, and are stored in the storage device 303 of the monitoring device 100. Note that the data configurations described below are examples and may be configured in other formats.
- the self-position estimation map 411 is a map including the shape and position of a structure in a building such as a wall or a pillar (a fixed object with a small installation error and a fixed position and orientation).
- the map 411 for self-location estimation is obtained by converting data such as walls and columns managed by a structure CAD (Computer Aided Design) 421 which is design data of a building into a format suitable for self-location estimation by a conversion process 422. Created.
- the format suitable for self-position estimation is a format suitable for collation with the three-dimensional shape measured by the laser range finder 102.
- the self-position estimation map 411 includes only data relating to the surface shape of a structure in a building.
- FIGS. 6A and 6B show specific examples of the structure CAD 421 and the self-position estimation map 411. As shown in FIG. 6A, the structure CAD 421 has data on structures such as a wall 601 and a pillar 602, whereas the self-position estimation map 411 has data only on the surfaces 603 of the structures, as shown in FIG. 6B.
- Since the self-position estimation map 411 is generated from the structure CAD 421, it does not include data on equipment elements such as pipes, pumps, and valves.
- The facility element matching map 412 is created by converting the shapes, positions, etc. of facility elements such as pipes, pumps, and valves managed by the facility CAD 423 through the conversion process 424.
- The conversion process 424 includes a coordinate conversion that aligns the self-position estimation map 411 and the equipment element matching map 412, based on the numerical values stored in the equipment CAD coordinate system 414 (the positional deviation, direction deviation, scale deviation, etc. necessary for the alignment between the two maps).
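For concreteness, such an alignment amounts to a similarity transform (translation, rotation, scale). A minimal 2D sketch of applying the offsets held by the equipment CAD coordinate system 414 (the parameter names are assumptions, not from the patent):

```python
import numpy as np

def align_to_structure_frame(pts, dx, dy, dtheta, scale):
    """Apply the deviations stored in the equipment CAD coordinate system 414
    (position, direction and scale deviation) to bring equipment CAD
    coordinates into the frame of the self-position estimation map."""
    c, s = np.cos(dtheta), np.sin(dtheta)
    R = np.array([[c, -s], [s, c]])  # rotation by the direction deviation
    return scale * (np.asarray(pts) @ R.T) + np.array([dx, dy])
```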
- FIG. 7 is a diagram for explaining the configuration of the facility element matching map 412.
- the facility element matching map 412 is a table including a facility ID 701, a shape type 702, and a position / size 703.
- the facility ID 701 stores an identifier uniquely assigned to each facility element such as a pipe and a pump.
- The equipment ID 701 is common to the installation error statistical model 413, the standard temperature data 415, and the system diagram data 416, which will be described later.
- the shape type 702 stores an identifier assigned according to the shape type of the equipment element. For example, 1 is given for a cylinder, and 2 is given for a rectangular parallelepiped.
- Position / size 703 stores a numerical value group for defining the position, size, etc. according to the shape type of the equipment element.
- the position / size 703 is, for example, a center coordinate x, y and a radius r in the case of a cylinder.
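As an illustration, one row of this table could be modeled as follows (a hypothetical sketch; the field names mirror FIG. 7, but the types and layout are assumptions):

```python
from dataclasses import dataclass

@dataclass
class FacilityElement:
    """One row of the facility element matching map (FIG. 7)."""
    facility_id: int   # identifier shared across all tables (701)
    shape_type: int    # 1 = cylinder, 2 = rectangular parallelepiped (702)
    params: tuple      # shape-dependent numbers (703), e.g. (x, y, r) for a cylinder

# Example: a pipe modeled as a cylinder centered at (2.0, 3.5) with radius 0.1 m
pipe = FacilityElement(facility_id=1203, shape_type=1, params=(2.0, 3.5, 0.1))
```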
- FIG. 8 is a diagram for explaining the configuration of the installation error statistical model 413.
- the installation error statistical model 413 is a table including equipment ID 801, variance 802, and correction coefficient 803.
- the facility ID 801 stores an identifier uniquely assigned to each facility element such as a pipe and a pump.
- the equipment ID 801 is common with the equipment element matching map 412 and the like.
- the variance 802 stores a parameter representing an error distribution of installation positions of equipment elements.
- the error distribution is represented by a two-dimensional probability density distribution.
- For a pipe, for example, the variance 802 stores the variance values σ₁² and σ₂² in a plane orthogonal to the central axis of the pipe.
- the correction coefficient 803 stores a correction coefficient when the shape of the error distribution changes due to the installation location of the equipment element, the installation method, and the like.
- For example, when a pipe is installed along a wall, and the distance from the wall to the pipe is strictly observed while installation error in the direction along the wall is tolerated, the correction coefficient 803 stores a coefficient that corrects the error distribution into an ellipse elongated along the wall.
- The variance 802 and the correction coefficient 803 are set according to the type of equipment element, the installation location, the installation method, etc., based on past results, but may also be created for each individual equipment element.
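Under the natural reading that the error distribution is a two-dimensional Gaussian in the plane orthogonal to the pipe axis, the variance and correction coefficients combine into a covariance matrix. A minimal sketch (the diagonal covariance and the helper names are assumptions, not from the patent):

```python
import numpy as np

def installation_error_cov(var1: float, var2: float,
                           corr1: float = 1.0, corr2: float = 1.0) -> np.ndarray:
    """Covariance of the 2D installation error in the plane orthogonal to a
    pipe's axis (FIG. 8). The correction coefficients stretch the
    distribution, e.g. into an ellipse elongated along a wall."""
    return np.diag([var1 * corr1, var2 * corr2])

def gaussian_pdf(p: np.ndarray, mean: np.ndarray, cov: np.ndarray) -> float:
    """Probability density of the 2D Gaussian error distribution at point p."""
    d = p - mean
    norm = 1.0 / (2.0 * np.pi * np.sqrt(np.linalg.det(cov)))
    return float(norm * np.exp(-0.5 * d @ np.linalg.inv(cov) @ d))
```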
- Although not shown, the equipment CAD coordinate system 414 holds numerical values such as the positional deviation, direction deviation, and scale deviation necessary for the alignment between the self-position estimation map 411 and the equipment element matching map 412.
- FIG. 9 is a diagram for explaining the configuration of the standard temperature data 415.
- Standard temperature data 415 is a table including equipment ID 901, lower limit temperature 902, and upper limit temperature 903.
- the facility ID 901 stores an identifier uniquely assigned to each facility element such as a pipe and a pump.
- the equipment ID 901 is common with the equipment element matching map 412 and the like.
- the lower limit temperature 902 stores the lower limit temperature of each facility element
- the upper limit temperature 903 stores the upper limit temperature of each facility element.
- the lower limit temperature 902 and the upper limit temperature 903 define a temperature range in which each facility element can be determined as normal operation.
- FIG. 10 is a diagram for explaining the configuration of the system diagram data 416.
- the system diagram data 416 is a table including an equipment ID 1001, a connection source 1002, and a connection destination 1003.
- the facility ID 1001 stores an identifier uniquely assigned to each facility element such as a pipe and a pump.
- the equipment ID 1001 is common to the equipment element matching map 412 and the like.
- the connection source 1002 stores the identifier of the facility element connected to the upstream side of the facility element. When a plurality of equipment elements are connected on the upstream side of the equipment element, the connection source 1002 stores a plurality of identifiers.
- the connection destination 1003 stores the identifier of the facility element connected to the downstream side of the facility element. When a plurality of facility elements are connected to the downstream side of the facility element, the connection destination 1003 stores a plurality of identifiers.
- The self-position estimation unit 401 estimates the self-position of the monitoring device 100 (“position” hereinafter includes the “direction” of the monitoring device 100) based on a plurality of measurement results. Specifically, the acceleration measured by the acceleration sensor 105 is integrated twice to calculate the first predicted self-position, and the surrounding three-dimensional shape measured by the laser range finder 102 is collated with the self-position estimation map 411 to calculate the second predicted self-position. Then, the self-position estimation unit 401 integrates these two predicted self-positions, each including variations based on statistical errors, using a Kalman filter, and estimates the most likely self-position.
- the equipment element specifying unit 402 specifies the equipment ID and the position of the equipment element being monitored using the self position estimated by the self position estimating unit 401 and the result of the three-dimensional measurement by the laser range finder 102.
- FIG. 5 is a functional block diagram showing details of the equipment element specifying unit 402.
- the facility element specifying unit 402 includes a peripheral facility element extracting unit 501, an installation error distribution calculating unit 502, a difference extracting unit 503, a facility element candidate extracting unit 504, a coincidence degree evaluating unit 505, and a facility element specifying unit 506.
- The peripheral equipment element extraction unit 501 searches the equipment element matching map 412 for the shape types and positions/sizes, that is, the shapes and positions, of the equipment elements around the self-position, and extracts them as peripheral equipment elements.
- The installation error distribution calculation unit 502 refers to the installation error statistical model 413 for each of the peripheral equipment elements extracted by the peripheral equipment element extraction unit 501, retrieves the variance and correction coefficient for the corresponding equipment ID, and calculates the installation error distribution based on them.
- The difference extraction unit 503 calculates the difference between the three-dimensional measurement result obtained by the laser range finder 102 and the self-position estimation map 411. As a result, measurement points belonging to building structures such as walls and pillars are excluded from the three-dimensional measurement result, and only the measurement points belonging to objects other than building structures, that is, to equipment elements, are extracted as the difference.
- The facility element candidate extraction unit 504 detects, from the difference extracted by the difference extraction unit 503, a shape corresponding to a facility element, such as a plane or a cylinder, and its position by the least square method, the Hough transform, or the like, and extracts it as a facility element candidate.
- In the case of the least square method, the shapes existing around the self-position are predicted to some extent based on the shapes and positions of the peripheral facility elements extracted by the peripheral facility element extraction unit 501; the predicted shape and position are then varied to search for the shape and position that minimize the square error, and these are extracted as an equipment element candidate.
- In the case of the Hough transform, voting is performed using parameters according to the shape to be searched for, and the shape is found from the voting result. For example, in the case of a cylinder, each measurement point votes with the three parameters of an arbitrary radius r and center coordinates x and y; a cylinder is assumed to exist at the parameters that receive the most votes, and the shape and position of that cylinder are extracted as an equipment element candidate.
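A coarse sketch of this cylinder voting, in a 2D cross-section with a quantized accumulator (the grid size, angle sampling, and function name are illustrative choices, not from the patent):

```python
import numpy as np
from collections import Counter

def hough_cylinder(points, r_values, grid=0.05):
    """Hough voting for a cylinder in a cross-sectional plane: each measured
    point (x, y) votes, for every candidate radius r, for all circle centers
    at distance r from it; the (x, y, r) cell with the most votes is taken
    as the equipment element candidate."""
    votes = Counter()
    angles = np.linspace(0.0, 2.0 * np.pi, 36, endpoint=False)
    for px, py in points:
        for r in r_values:
            for a in angles:
                cx = round((px + r * np.cos(a)) / grid)
                cy = round((py + r * np.sin(a)) / grid)
                votes[(cx, cy, r)] += 1
    (cx, cy, r), _ = votes.most_common(1)[0]
    return cx * grid, cy * grid, r  # center x, y and radius of the best cell
```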
- the facility element candidate extraction unit 504 also calculates a prediction error distribution that is a statistical error included in the extracted facility element candidates. For example, when an equipment element candidate is extracted using the least square method, a prediction error distribution is calculated by adding a known self-position estimation error to the sum of square errors.
- The degree-of-coincidence evaluation unit 505 calculates a degree of coincidence for every combination of an installation error distribution of a peripheral equipment element obtained by the installation error distribution calculation unit 502 and a prediction error distribution of an equipment element candidate extracted by the equipment element candidate extraction unit 504. The degree of coincidence is the maximum value of the integrated error distribution obtained by integrating the installation error distribution and the prediction error distribution.
- The facility element specifying unit 506 extracts the combination that maximizes the degree of coincidence from all the combinations. Since the combination that maximizes the degree of coincidence is the most probable, it is possible to specify which of the peripheral equipment elements the equipment element candidate extracted by the equipment element candidate extraction unit 504 corresponds to; that is, the facility ID of the facility element being monitored can be specified. Further, since the position where the integrated error distribution takes its maximum value is the most likely position of the equipment element being monitored, the position of the equipment element being monitored can also be specified.
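Assuming both error distributions are Gaussian, the product of the two densities is an unnormalized Gaussian, so the peak of the integrated distribution (the degree of coincidence) and its location (the specified position) have closed forms. A minimal sketch of units 505 and 506 under that assumption (the dict layout of `peripherals` and `candidates` is hypothetical):

```python
import numpy as np

def coincidence(mean_a, cov_a, mean_b, cov_b):
    """Degree of coincidence between an installation error distribution
    (mean_a, cov_a) and a prediction error distribution (mean_b, cov_b),
    modeled as the peak value of the product of the two Gaussians; the
    peak location is the most likely equipment position."""
    mean_a, mean_b = np.asarray(mean_a, float), np.asarray(mean_b, float)
    inv_a, inv_b = np.linalg.inv(cov_a), np.linalg.inv(cov_b)
    cov_c = np.linalg.inv(inv_a + inv_b)                # fused covariance
    mean_c = cov_c @ (inv_a @ mean_a + inv_b @ mean_b)  # fused peak position
    d, cov_s, k = mean_a - mean_b, cov_a + cov_b, len(mean_a)
    scale = np.exp(-0.5 * d @ np.linalg.inv(cov_s) @ d) / \
            np.sqrt((2 * np.pi) ** k * np.linalg.det(cov_s))
    peak = scale / np.sqrt((2 * np.pi) ** k * np.linalg.det(cov_c))
    return peak, mean_c

def identify(peripherals, candidates):
    """Pick the (peripheral element, candidate) pair with the highest degree
    of coincidence (unit 506). Each entry is a dict with 'mean' and 'cov'."""
    best = max(((i, j, *coincidence(p["mean"], p["cov"], c["mean"], c["cov"]))
                for i, p in enumerate(peripherals)
                for j, c in enumerate(candidates)),
               key=lambda t: t[2])
    return best  # (i, j, degree_of_coincidence, fused_position)
```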
- The coordinate system correction unit 403 calculates the deviation between the self-position estimation map 411 and the equipment element matching map 412 based on the specified position of the equipment element, and corrects the numerical values stored in the equipment CAD coordinate system 414 (the positional deviation, direction deviation, scale deviation, etc. necessary for the alignment between the self-position estimation map 411 and the equipment element matching map 412). Accordingly, the shift between the self-position estimation map 411 and the facility element matching map 412 is reduced the next time the facility element matching map 412 is generated by the conversion process 424.
- The 3D mapping unit 404 integrates the temperature of the monitored facility element measured by the thermal infrared camera 103 with the distance and direction to the monitored facility element measured by the laser range finder 102, and calculates a three-dimensional temperature distribution.
- The 3D mapping unit 405 integrates the visible light image of the monitored facility element acquired by the visible light camera 104 with the distance and direction to the monitored facility element measured by the laser range finder 102, and converts the visible light image into a three-dimensional image.
- The anomaly detection unit 406 refers to the standard temperature data 415 to obtain the lower limit temperature and the upper limit temperature corresponding to the equipment ID of the equipment element being monitored, and obtains the temperature of the equipment element being monitored from the 3D mapping unit 404. The abnormality detection unit 406 then compares the two, and determines that the equipment element is abnormal when its temperature is lower than the lower limit temperature or higher than the upper limit temperature.
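A minimal sketch of this threshold check against the standard temperature data 415 (the dict-based table layout is an assumption):

```python
def detect_anomaly(equipment_id: int, measured_temp: float,
                   standard_temp: dict) -> bool:
    """Return True if the measured temperature of the identified equipment
    element falls outside its normal range (FIG. 9). `standard_temp` maps
    equipment ID to (lower_limit, upper_limit)."""
    lower, upper = standard_temp[equipment_id]
    return measured_temp < lower or measured_temp > upper

# Example: valve 1204 with a normal range of 10-80 degC, measured at 95 degC
assert detect_anomaly(1204, 95.0, {1204: (10.0, 80.0)})
```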
- the image superimposing unit 407 generates a superimposed image in which the three-dimensional temperature distribution calculated by the 3D mapping unit 404 is superimposed on the visible light image converted into the three-dimensional image by the 3D mapping unit 405. Then, the image superimposing unit 407 projects the generated superimposed image onto a plane, converts it into a two-dimensional image, and outputs it to the display 201.
- The superimposed temperature distribution is color-coded according to temperature; for example, parts whose temperature exceeds the upper limit temperature may be shown in red and the other parts left colorless, or the color may be made to change continuously according to the temperature.
- an appropriate transmittance is set for the temperature distribution so that the operator can confirm the visible light image even when the temperature distribution is superimposed on the visible light image.
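As one way to realize the color-coding and transmittance just described, hot pixels can be alpha-blended with red (a sketch; the array shapes, threshold rule, and `alpha` parameter are illustrative assumptions):

```python
import numpy as np

def overlay(visible_rgb, temp, upper_limit, alpha=0.4):
    """Superimpose the temperature distribution on the visible light image:
    pixels whose temperature exceeds the upper limit are tinted red with
    transmittance (1 - alpha) so the underlying image stays visible.
    Shapes: visible_rgb (H, W, 3) in [0, 1], temp (H, W) in degC."""
    out = visible_rgb.astype(float)
    hot = temp > upper_limit
    red = np.array([1.0, 0.0, 0.0])
    out[hot] = (1.0 - alpha) * out[hot] + alpha * red
    return out
```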
- the image superimposing unit 407 generates a system diagram that visually represents the connection relationship of each facility element based on the connection information between the facility elements stored in the system diagram data 416. Then, a mark (for example, a colored circle) is added to a part corresponding to the equipment element in which the abnormality is detected by the abnormality detection unit 406, and is output to the display 201.
- the display 201 displays a superimposed image and a system diagram input from the image superimposing unit 407.
- a specific example of display is shown in FIG. 22, which will be described later.
- FIG. 11 is a flowchart of the self-position estimation process. This self-position estimation process is performed by the self-position estimation unit 401 of the monitoring apparatus 100, that is, by the processor 301 executing a program stored in the memory 302.
- the processor 301 integrates the acceleration detected by the acceleration sensor 105 twice, and adds this to the previous value of the self position to estimate the current self position (1101).
- Hereinafter, the self-position estimated by the processing 1101 is referred to as the “first estimated self-position”.
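Processing 1101 is plain dead reckoning. A minimal sketch, assuming constant acceleration over each time step (the function and parameter names are illustrative):

```python
import numpy as np

def dead_reckon(prev_pos, prev_vel, accel, dt):
    """Processing 1101: integrate the measured acceleration twice over one
    time step to update velocity and position (the first estimated
    self-position). Drift accumulates, which is why scan matching and the
    Kalman integration described below are also needed."""
    accel = np.asarray(accel, float)
    vel = prev_vel + accel * dt
    pos = prev_pos + prev_vel * dt + 0.5 * accel * dt ** 2
    return pos, vel
```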
- Next, the processor 301 assumes various self-positions and, for each assumed position, the corresponding arrangement of building structures such as walls and pillars, and matches the assumed structures against the three-dimensional measurement result obtained by the laser range finder 102 (scan matching). Then, the position having the highest degree of coincidence is estimated as the self-position (1102).
- Hereinafter, the self-position estimated by the processing 1102 is referred to as the “second estimated self-position”. Further, a statistical error distribution of the second estimated self-position is calculated from how the degree of coincidence falls off as the position is shifted from the position where the degree of coincidence is highest.
- the self-position estimation map 411 includes wall data 1201a and 1202a corresponding to the walls 1201 and 1202 as shown in FIG.
- FIG. 14 shows the result 1401 of the three-dimensional measurement by the laser range finder 102 with a thick line.
- a result 1401 of the three-dimensional measurement is a set of a large number of measurement points corresponding to a part of the walls 1201 and 1202 irradiated with the laser and a part of the pipe 1203.
- 1402 is the position of the laser range finder 102, that is, the self position.
- the processor 301 translates and rotates the wall data 1201a and 1202a read from the self-position estimation map 411 by affine transformation, and assumes various arrangements as indicated by broken lines in the figure.
- Next, the processor 301 compares each assumed arrangement of the wall data 1201a and 1202a with the result 1401 of the three-dimensional measurement, and calculates the degree of coincidence. The processor 301 then obtains the arrangement 1501 having the highest degree of coincidence. As shown in FIG. 14, since the relationship between the three-dimensional measurement result 1401 and the self-position 1402 is known, the self-position 1402 can be estimated from the three-dimensional measurement result 1401 and the arrangement 1501 that best matches it.
- the processor 301 calculates an error distribution of the estimated self position (second estimated self position) from the distribution of the degree of coincidence when the position is shifted from the arrangement 1501 having the highest degree of coincidence.
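A brute-force version of this scan matching can be sketched as follows: each candidate pose transforms the scan, and the score counts points that land near the map (a toy nearest-neighbor sketch; a practical implementation would use a KD-tree and finer pose sampling):

```python
import numpy as np

def match_score(points, map_pts, pose, tol=0.05):
    """Score one assumed self-position: transform the measured 2D scan by the
    candidate pose (x, y, theta) and count points that land within `tol`
    meters of some map point."""
    x, y, th = pose
    R = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
    world = points @ R.T + np.array([x, y])
    dists = np.linalg.norm(world[:, None, :] - map_pts[None, :, :], axis=2)
    return int(np.sum(dists.min(axis=1) < tol))

def scan_match(points, map_pts, pose_grid):
    """Brute-force search over candidate poses (processing 1102); returns the
    pose with the highest degree of coincidence and the full score list,
    from which an error distribution of the estimate can be derived."""
    scores = [match_score(points, map_pts, p) for p in pose_grid]
    return pose_grid[int(np.argmax(scores))], scores
```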
- the processor 301 integrates the first estimated self-position and the second estimated self-position by the Kalman filter, and estimates the most likely position as the self-position (1103).
- FIG. 16 is a diagram for explaining an example of the self-position estimation process. Although this figure is two-dimensional, the actual processing is performed in three dimensions.
- The first estimated self-position is obtained by adding the value obtained by integrating twice the acceleration measured by the acceleration sensor 105 to the self-position 1601 estimated last time.
- the value measured by the acceleration sensor 105 includes a statistical error. For this reason, the first estimated self-position also varies.
- An ellipse 1602 indicates the error distribution of the first estimated self-position.
- The second estimated self-position is estimated by collating the result of the three-dimensional measurement by the laser range finder 102 with the self-position estimation map 411 (scan matching); like the first estimated self-position, it also has variations based on statistical errors. An ellipse 1603 indicates the error distribution of the second estimated self-position.
- The first estimated self-position and the second estimated self-position, each having variations, are integrated by the Kalman filter, and the most probable position, that is, the position having the largest value in the integrated error distribution, is estimated as the self-position. In this example, the position 1604 having the largest value in the integrated error distribution is estimated as the self-position.
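Under linear-Gaussian assumptions, this integration step reduces to a single Kalman measurement update, which can be written in information form. A minimal sketch (the names are illustrative; the fused mean corresponds to position 1604 in FIG. 16):

```python
import numpy as np

def fuse_estimates(mean_imu, cov_imu, mean_scan, cov_scan):
    """Fuse the first estimated self-position (from double-integrated
    acceleration) with the second (from scan matching), treating both as
    Gaussians. The fused mean is the peak of the integrated error
    distribution, i.e. the most probable self-position."""
    inv1, inv2 = np.linalg.inv(cov_imu), np.linalg.inv(cov_scan)
    cov = np.linalg.inv(inv1 + inv2)  # fused (smaller) covariance
    mean = cov @ (inv1 @ np.asarray(mean_imu) + inv2 @ np.asarray(mean_scan))
    return mean, cov
```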
- the self-position estimation process estimates the self-position every moment by repeating the above process.
- FIG. 17 is a flowchart of the equipment element specifying process. This equipment element specifying process is performed by the equipment element specifying unit 402 of the monitoring apparatus 100, that is, by the processor 301 executing a program stored in the memory 302.
- the processor 301 extracts equipment IDs, shape types, positions and sizes of peripheral equipment elements with reference to the equipment element matching map 412 based on the self-position estimated by the self-position estimation processing (1701).
- the pipe 1203 and the valve 1204 are extracted as peripheral equipment elements.
- the processor 301 extracts a difference between the result (measurement point group) of the three-dimensional measurement by the laser range finder 102 and the self-position estimation map 411 (1702). As a result, the measurement point group corresponding to the structure of the building such as a wall or a pillar is removed from the measurement point group obtained by the three-dimensional measurement, and only the measurement point group corresponding to the facility element is extracted.
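A minimal point-cloud sketch of processing 1702: drop every measured point that lies close to a structure surface of the self-position estimation map (the distance threshold and brute-force distance computation are illustrative):

```python
import numpy as np

def extract_difference(scan_pts, structure_pts, tol=0.10):
    """Remove measurement points that lie within `tol` meters of the
    structure surfaces in the self-position estimation map; what remains
    should correspond to equipment elements such as pipes."""
    d = np.linalg.norm(scan_pts[:, None, :] - structure_pts[None, :, :], axis=2)
    return scan_pts[d.min(axis=1) >= tol]
```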
- FIG. 18 shows the difference extracted by the processing 1702 in the example shown in FIG.
- the difference includes only the curved surface 1801 corresponding to a part of the pipe 1203.
- a portion corresponding to the valve 1204 is originally included in the difference, but is omitted here for simplicity.
- The example shown in FIG. 18 is the difference extracted from the three-dimensional measurement result of a single scan by the laser range finder 102, but the difference 1901 may instead be extracted from the superimposed results of scans at a plurality of positions, as shown in FIG. 19. Using the superimposed result improves the accuracy of equipment element candidate extraction in the next processing 1703.
- the processor 301 detects and extracts the shape and position of candidate equipment elements such as a plane and a cylinder from the difference extracted in the processing 1702 by the least square method, Hough transform, or the like. Further, a prediction error distribution, which is a statistical error included in the extracted equipment element candidates, is also calculated (1703). For example, when an equipment element candidate is extracted using the least square method, the processor 301 calculates a prediction error distribution by adding a known self-position estimation error to the sum of square errors.
- FIG. 20 shows the result of detecting and extracting the shape and position of the facility element candidate from the difference shown in FIG.
- the cylinder 2001 corresponding to the pipe 1203 and its position are detected and extracted from the difference.
- Next, the processor 301 refers to the installation error statistical model 413 for each of the peripheral equipment elements extracted in the processing 1701, acquires the parameters (variance and correction coefficient) regarding the installation error, and calculates a predicted installation error distribution (1706). Then, the processor 301 integrates the installation error distribution and the prediction error distribution for every combination of the peripheral equipment elements extracted in the processing 1701 and the equipment element candidates extracted in the processing 1703, and calculates the degree of coincidence between them (1709). The degree of coincidence is the maximum value of the integrated error distribution.
- The processor 301 uses the counters i and j to calculate the degree of coincidence for all combinations of the peripheral equipment elements extracted in the processing 1701 and the equipment element candidates extracted in the processing 1703 (processings 1704 to 1711).
- Then, the processor 301 finds the combination of peripheral equipment element and equipment element candidate having the highest degree of coincidence, and identifies which of the peripheral equipment elements extracted in the processing 1701 the equipment element candidate extracted in the processing 1703 corresponds to. That is, the facility ID of the facility element being monitored is specified. Further, the position having the maximum value in the integrated error distribution of that combination is specified as the position of the equipment element being monitored (1712).
- FIG. 21 shows a state in which the equipment ID and position of the equipment element are specified in the situation shown in FIG. 12 by the equipment element specifying process.
- In this example, a part corresponding to the pipe 1203 is extracted as a facility element candidate (a cylinder) from the result of the three-dimensional measurement, and its prediction error distribution is represented by concentric contour lines 2101 centered on the central axis of the cylinder.
- Meanwhile, the pipe 1203 is extracted from the equipment element matching map 412 as a peripheral equipment element, and its installation error distribution, based on the parameters relating to the installation error acquired from the installation error statistical model 413, is represented by elliptic contour lines 2102 centered on the central axis of the pipe 1203.
- The shape of the installation error distribution is an ellipse because of, for example, restrictions in the construction method used when installing the pipe 1203: the distance from the wall is strictly observed, while installation error in the direction parallel to the wall is tolerated.
- The processor 301 integrates the two error distributions by multiplying them at each position, and calculates the maximum value of the integrated error distribution as the degree of coincidence between the facility element candidate and the pipe 1203. If this degree of coincidence is higher than that of the other combinations, it is determined that the equipment element being monitored is the pipe 1203, and the position 2103 where the integrated error distribution takes its maximum value is identified as the position of the pipe 1203.
- FIG. 22 is a display example on the display 201 in the situation shown in FIG.
- a superimposed image of the visible light image and the temperature distribution generated by the image superimposing unit 407 is displayed in the left area of the display 201.
- a portion 2201 exceeding the upper limit temperature of each equipment element is displayed in color.
- In this example, the valve 1204 has exceeded its upper limit temperature and is displayed in red.
- a system diagram 2202 generated by the image superimposing unit 407 is displayed in the right area of the display 201.
- the portion corresponding to the valve 1204 in the system diagram 2202 is surrounded by a red circle 2203.
- By viewing the display 201, the operator can confirm that the equipment elements being monitored are the pipe 1203 and the valve 1204, and that an abnormality has occurred in the valve 1204.
- As described above, the monitoring device 100 of this embodiment can not only estimate its own position but also identify which equipment element is being monitored, down to its position. That is, the correspondence between the facility element being monitored and the facility elements managed by the monitoring system 1 can be obtained with high reliability.
- Further, since a plurality of self-positions estimated from the measurement results of a plurality of sensors are statistically integrated to obtain the final self-position, and since the self-position estimation map used for the estimation includes only building structures with little installation error, the self-position can be estimated with high accuracy. As a result, the accuracy of identifying facility elements, which relies on the self-position estimation result, also increases.
- the position where the equipment element is actually arranged may be shifted from the position stored in the equipment CAD due to the installation error.
- the equipment elements are specified in consideration of the installation error distribution of the equipment elements, so that the equipment elements can be specified even if there is an installation error.
- FIG. 23 is a functional block diagram showing a logical configuration of the monitoring apparatus 100 according to the second embodiment of the present invention. A portion surrounded by a broken line is a portion mounted on the monitoring device 100.
- Whereas in the first embodiment the data 411 to 415 are stored in the storage device 303 of the monitoring device 100, in the second embodiment the data 411 to 415 are stored in the storage device 313 of the server 120.
- the monitoring apparatus 100 requests the server 120 to transmit a portion of the data 411 to 415 necessary for processing by the function units 401 to 407, and the server 120 transmits the requested data to the monitoring apparatus 100.
- the monitoring apparatus 100 executes processing in the function units 401 to 407 using the received data.
- In the second embodiment, all of the data 411 to 415 are held on the server 120 side, but the monitoring apparatus 100 may hold part of the data 411 to 415. The system diagram data 416 may also be held on the server 120 side.
- FIG. 24 is a functional block diagram showing a logical configuration of the monitoring device 100 according to the third embodiment of the present invention. A portion surrounded by a broken line is a portion mounted on the monitoring device 100.
- In the third embodiment, all of the data 411 to 416 and the programs implementing the self-position estimating unit 401, the equipment element specifying unit 402, the coordinate system correcting unit 403, and the abnormality detecting unit 406 are stored in the storage device 313 of the server 120. These programs are read into the memory 312 of the server 120 and executed by the processor 311, so that the functional units 401 to 403 and 406 are implemented on the server 120.
- The monitoring device 100 transmits the measurement results of the sensors 102 to 105 to the server 120, and the server 120 executes the various processes, such as the self-position estimation process and the equipment element identification process, based on the measurement results received from the monitoring device 100. The server 120 then transmits the processing results (the self-position estimation result, the equipment element identification result, the abnormality detection result, etc.) to the monitoring device 100.
- the monitoring apparatus 100 processes the processing result received from the server 120 in the image superimposing unit 407 and displays the processed result on the display 201.
- In the third embodiment, the self-position estimating unit 401, the equipment element specifying unit 402, the coordinate system correcting unit 403, and the abnormality detecting unit 406 are implemented on the server 120, but some of them may be implemented on the monitoring device 100.
- Conversely, although the 3D mapping units 404 and 405 and the image superimposing unit 407 are implemented on the monitoring device 100, some or all of them may be implemented on the server 120.
- In the above embodiments, the laser range finder 102 is used to measure the surrounding three-dimensional shape, but other three-dimensional sensors may be used to measure the surrounding three-dimensional shape.
- For example, a method using a parallax image generated from images acquired by two cameras, or a method of analyzing changes in feature points across a plurality of images acquired while moving a single camera, may be used.
- The self-position estimation map 411 may also be created by measuring the interior of the building three-dimensionally with the laser range finder 102 while an operator moves through the building before monitoring, and extracting the portions corresponding to structures such as walls and pillars from the measurement result.
- In the above embodiments, the monitoring apparatus 100 measures the temperature of a facility element and determines its abnormality by comparing the measured temperature with the lower limit temperature and the upper limit temperature, but other physical quantities (sound, vibration, color, etc.) may be measured, and the abnormality of the facility element may be determined based on those measured quantities.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Length Measuring Devices With Unspecified Measuring Means (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
Description
In the first embodiment of the present invention, a typical example of the present invention is described.
Next, the second embodiment of the present invention is described.
Next, the third embodiment of the present invention is described.
Claims (15)
- A monitoring device for monitoring equipment elements in a building, comprising:
a three-dimensional measurement unit that measures a three-dimensional shape around the monitoring device;
a self-position estimation unit that estimates a self-position of the monitoring device by collating the measured three-dimensional shape with a self-position estimation map including shapes and positions of structures in the building other than the equipment elements;
a peripheral equipment element extraction unit that extracts equipment elements around the estimated self-position from an equipment element matching map including shapes and positions of the equipment elements in the building;
an equipment element candidate extraction unit that extracts a shape and a position of an equipment element candidate from the measured three-dimensional shape;
a degree-of-coincidence evaluation unit that calculates a degree of coincidence between the shapes and positions of the equipment elements around the self-position extracted from the equipment element matching map and the shape and position of the equipment element candidate extracted from the measured three-dimensional shape, based on an error distribution of the shapes and positions of the equipment elements around the self-position and an error distribution of the shape and position of the equipment element candidate extracted from the measured three-dimensional shape; and
an equipment element specifying unit that specifies, based on the calculated degree of coincidence, which of the equipment elements around the self-position extracted from the equipment element matching map the shape of the equipment element candidate extracted from the measured three-dimensional shape corresponds to.
- The monitoring device according to claim 1, further comprising:
a visible light camera that acquires a visible light image;
a physical quantity measurement unit that measures a physical quantity of the specified equipment element;
an abnormality detection unit that detects an abnormality of the specified equipment element by comparing the measured physical quantity with a standard physical quantity of the specified equipment element; and
an image display unit that displays the detected abnormality superimposed on the visible light image acquired by the visible light camera, based on the position of the specified equipment element.
- The monitoring device according to claim 1, wherein the degree-of-coincidence evaluation unit integrates, by multiplication, the error distribution of the shapes and positions of the equipment elements around the self-position and the error distribution of the shape and position of the equipment element candidate extracted from the measured three-dimensional shape, and calculates the maximum value of the integrated error distribution as the degree of coincidence.
- The monitoring device according to claim 1, wherein the equipment element specifying unit specifies the position of the specified equipment element based on the degree of coincidence, and the monitoring device further comprises a coordinate system correction unit that calculates a deviation between the self-position estimation map and the equipment element matching map based on the position of the equipment element specified by the equipment element specifying unit, and corrects the equipment element matching map so that the deviation decreases.
- The monitoring device according to claim 1, wherein at least one of the self-position estimation map and the equipment element matching map is held by a server, and the monitoring device acquires the at least one of the self-position estimation map and the equipment element matching map from the server.
- A monitoring system comprising a monitoring device that monitors equipment elements in a building and a server connected to the monitoring device, the monitoring system comprising:
a three-dimensional measurement unit that measures a three-dimensional shape around the monitoring device;
a self-position estimation unit that estimates a self-position of the monitoring device by collating the measured three-dimensional shape with a self-position estimation map including shapes and positions of structures in the building other than the equipment elements;
a peripheral equipment element extraction unit that extracts equipment elements around the estimated self-position from an equipment element matching map including shapes and positions of the equipment elements in the building;
an equipment element candidate extraction unit that extracts a shape and a position of an equipment element candidate from the measured three-dimensional shape;
a degree-of-coincidence evaluation unit that calculates a degree of coincidence between the shapes and positions of the equipment elements around the self-position extracted from the equipment element matching map and the shape and position of the equipment element candidate extracted from the measured three-dimensional shape, based on an error distribution of the shapes and positions of the equipment elements around the self-position and an error distribution of the shape and position of the equipment element candidate extracted from the measured three-dimensional shape; and
an equipment element specifying unit that specifies, based on the calculated degree of coincidence, which of the equipment elements around the self-position extracted from the equipment element matching map the shape of the equipment element candidate extracted from the measured three-dimensional shape corresponds to.
- The monitoring system according to claim 6, wherein the monitoring device includes:
a visible light camera that acquires a visible light image; and
a physical quantity measurement unit that measures a physical quantity of the specified equipment element,
and the monitoring system further comprises:
an abnormality detection unit that detects an abnormality of the specified equipment element by comparing the measured physical quantity with a standard physical quantity of the specified equipment element; and
an image display unit that displays the detected abnormality superimposed on the visible light image acquired by the visible light camera, based on the position of the specified equipment element.
- The monitoring system according to claim 6, wherein the degree-of-coincidence evaluation unit integrates, by multiplication, the error distribution of the shapes and positions of the equipment elements around the self-position and the error distribution of the shape and position of the equipment element candidate extracted from the measured three-dimensional shape, and calculates the maximum value of the integrated error distribution as the degree of coincidence.
- The monitoring system according to claim 6, wherein the equipment element specifying unit specifies the position of the specified equipment element based on the degree of coincidence, and the monitoring system further comprises a coordinate system correction unit that calculates a deviation between the self-position estimation map and the equipment element matching map based on the position of the equipment element specified by the equipment element specifying unit, and corrects the equipment element matching map so that the deviation decreases.
- The monitoring system according to claim 6, wherein the monitoring device includes the three-dimensional measurement unit, the self-position estimation unit, the peripheral equipment element extraction unit, the degree-of-coincidence evaluation unit, and the equipment element specifying unit, and the server holds at least one of the self-position estimation map and the equipment element matching map.
- The monitoring system according to claim 6, wherein the server includes at least one of the three-dimensional measurement unit, the self-position estimation unit, the peripheral equipment element extraction unit, the degree-of-coincidence evaluation unit, and the equipment element specifying unit, and holds at least one of the self-position estimation map and the equipment element matching map.
- A monitoring method for monitoring equipment elements in a building with a monitoring device, comprising:
a three-dimensional measurement step of measuring a three-dimensional shape around the monitoring device;
a self-position estimation step of estimating a self-position of the monitoring device by collating the measured three-dimensional shape with a self-position estimation map including shapes and positions of structures in the building other than the equipment elements;
a peripheral equipment element extraction step of extracting equipment elements around the estimated self-position from an equipment element matching map including shapes and positions of the equipment elements in the building;
an equipment element candidate extraction step of extracting a shape and a position of an equipment element candidate from the measured three-dimensional shape;
a degree-of-coincidence evaluation step of calculating a degree of coincidence between the shapes and positions of the equipment elements around the self-position extracted from the equipment element matching map and the shape and position of the equipment element candidate extracted from the measured three-dimensional shape, based on an error distribution of the shapes and positions of the equipment elements around the self-position and an error distribution of the shape and position of the equipment element candidate extracted from the measured three-dimensional shape; and
an equipment element specifying step of specifying, based on the calculated degree of coincidence, which of the equipment elements around the self-position extracted from the equipment element matching map the shape of the equipment element candidate extracted from the measured three-dimensional shape corresponds to.
- The monitoring method according to claim 12, further comprising:
a visible light image acquisition step of acquiring a visible light image;
a physical quantity measurement step of measuring a physical quantity of the specified equipment element;
an abnormality detection step of detecting an abnormality of the specified equipment element by comparing the measured physical quantity with a standard physical quantity of the specified equipment element; and
an image display step of displaying the detected abnormality superimposed on the visible light image acquired by the visible light camera, based on the position of the specified equipment element.
- The monitoring method according to claim 12, wherein, in the degree-of-coincidence evaluation step, the error distribution of the shapes and positions of the equipment elements around the self-position and the error distribution of the shape and position of the equipment element candidate extracted from the measured three-dimensional shape are integrated by multiplication, and the maximum value of the integrated error distribution is calculated as the degree of coincidence.
- The monitoring method according to claim 12, wherein, in the equipment element specifying step, the position of the specified equipment element is specified based on the degree of coincidence, and the monitoring method further comprises a coordinate system correction step of calculating a deviation between the self-position estimation map and the equipment element matching map based on the position of the equipment element specified in the equipment element specifying step, and correcting the equipment element matching map so that the deviation decreases.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP11871559.8A EP2765386A4 (en) | 2011-08-29 | 2011-08-29 | MONITORING DEVICE, MONITORING SYSTEM AND MONITORING PROCEDURE |
CN201180072688.9A CN103717995B (zh) | 2011-08-29 | 2011-08-29 | 监视装置、监视系统及监视方法 |
JP2013530912A JP5658372B2 (ja) | 2011-08-29 | 2011-08-29 | Monitoring device, monitoring system and monitoring method |
US14/238,087 US9911041B2 (en) | 2011-08-29 | 2011-08-29 | Monitoring device, monitoring system and monitoring method |
PCT/JP2011/069462 WO2013030929A1 (ja) | 2011-08-29 | 2011-08-29 | Monitoring device, monitoring system and monitoring method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2011/069462 WO2013030929A1 (ja) | 2011-08-29 | 2011-08-29 | Monitoring device, monitoring system and monitoring method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013030929A1 true WO2013030929A1 (ja) | 2013-03-07 |
Family
ID=47755483
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/069462 WO2013030929A1 (ja) | 2011-08-29 | 2011-08-29 | Monitoring device, monitoring system and monitoring method |
Country Status (5)
Country | Link |
---|---|
US (1) | US9911041B2 (ja) |
EP (1) | EP2765386A4 (ja) |
JP (1) | JP5658372B2 (ja) |
CN (1) | CN103717995B (ja) |
WO (1) | WO2013030929A1 (ja) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017505444A (ja) * | 2013-10-30 | 2017-02-16 | テクタス・ドリームラブ・プライベート・リミテッドTectus Dreamlab Pte Ltd | 対象物、特に建造物を検査するための構成および方法 |
JP2017182612A (ja) * | 2016-03-31 | 2017-10-05 | 倉敷紡績株式会社 | 画像配置方法及び画像配置用コンピュータプログラム |
US10445160B2 (en) | 2014-12-31 | 2019-10-15 | Nuscale Power, Llc | Remote monitoring of critical reactor parameters |
JP2019194605A (ja) * | 2019-06-11 | 2019-11-07 | テクタス・ドリームラブ・プライベート・リミテッドTectus Dreamlab Pte Ltd | 対象物、特に建造物を検査するための構成および方法 |
JP2020004231A (ja) * | 2018-06-29 | 2020-01-09 | 大和ハウス工業株式会社 | 自律移動装置、自律移動プログラム及び位置推定システム |
JP2020519987A (ja) * | 2017-04-21 | 2020-07-02 | エックス デベロップメント エルエルシー | 環境地図生成および位置整合のための方法ならびにシステム |
JP2021504715A (ja) * | 2017-11-28 | 2021-02-15 | 深▲せん▼市杉川机器人有限公司Shenzhen 3Irobotix Co., Ltd. | 領域輪郭作成の方法、装置及びコンピュータ読取可能な記憶媒体 |
JP2021103149A (ja) * | 2019-12-25 | 2021-07-15 | 株式会社デンソー | 推定装置、推定方法、推定プログラム |
JP2021103148A (ja) * | 2019-12-25 | 2021-07-15 | 株式会社デンソー | 推定装置、推定方法、推定プログラム |
WO2022014443A1 (ja) * | 2020-07-16 | 2022-01-20 | コニカミノルタ株式会社 | プラント管理方法、プラント管理装置およびプラント管理プログラム |
JP2022058620A (ja) * | 2019-06-11 | 2022-04-12 | スクリーニング・イーグル・ドリームラボ・プライベート・リミテッド | 対象物、特に建造物を検査するための構成および方法 |
JPWO2022219780A1 (ja) * | 2021-04-15 | 2022-10-20 | ||
JP7520585B2 (ja) | 2020-06-12 | 2024-07-23 | 株式会社竹中工務店 | データ変換システム、及びデータ変換プログラム |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6432494B2 (ja) | 2015-11-30 | 2018-12-05 | オムロン株式会社 | 監視装置、監視システム、監視プログラムおよび記録媒体 |
CN111009036B (zh) * | 2019-12-10 | 2023-11-21 | 北京歌尔泰克科技有限公司 | 同步定位与地图构建中栅格地图的修正方法、装置 |
US11335072B2 (en) * | 2020-06-03 | 2022-05-17 | UrsaLeo Inc. | System for three dimensional visualization of a monitored item, sensors, and reciprocal rendering for a monitored item incorporating extended reality |
CN112729181A (zh) * | 2020-12-25 | 2021-04-30 | 上海广川科技有限公司 | 一种进行晶圆定位检测的装置及方法 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07281753A (ja) * | 1994-04-15 | 1995-10-27 | Toshiba Corp | 移動ロボット |
JPH08304581A (ja) * | 1995-04-28 | 1996-11-22 | Toshiba Corp | プラント点検支援装置および方法 |
JP2004347488A (ja) * | 2003-05-23 | 2004-12-09 | Mitsubishi Electric Corp | 現場作業支援装置 |
JP2007322138A (ja) * | 2006-05-30 | 2007-12-13 | Toyota Motor Corp | 移動装置及び移動装置の自己位置推定方法 |
JP2009236774A (ja) | 2008-03-27 | 2009-10-15 | Hokuyo Automatic Co | 三次元測距装置 |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2815045B2 (ja) * | 1996-12-16 | 1998-10-27 | 日本電気株式会社 | 画像特徴抽出装置,画像特徴解析装置,および画像照合システム |
TW518882B (en) * | 2000-03-27 | 2003-01-21 | Hitachi Ltd | Liquid crystal display device for displaying video data |
JP2002132341A (ja) * | 2000-10-26 | 2002-05-10 | Toshiba Corp | 現場点検装置 |
KR100493159B1 (ko) * | 2002-10-01 | 2005-06-02 | 삼성전자주식회사 | 이동체의 효율적 자기 위치 인식을 위한 랜드마크 및 이를이용한 자기 위치 인식 장치 및 방법 |
US20050285941A1 (en) * | 2004-06-28 | 2005-12-29 | Haigh Karen Z | Monitoring devices |
US8121399B2 (en) * | 2005-12-16 | 2012-02-21 | Ihi Corporation | Self-position identifying method and device, and three-dimensional shape measuring method and device |
CN101331380B (zh) * | 2005-12-16 | 2011-08-03 | 株式会社Ihi | 三维形状数据的存储/显示方法和装置以及三维形状的计测方法和装置 |
JP4974217B2 (ja) * | 2006-11-27 | 2012-07-11 | アルパイン株式会社 | ナビゲーション装置 |
US8050458B2 (en) * | 2007-06-18 | 2011-11-01 | Honda Elesys Co., Ltd. | Frontal view imaging and control device installed on movable object |
JP5120926B2 (ja) * | 2007-07-27 | 2013-01-16 | 有限会社テクノドリーム二十一 | 画像処理装置、画像処理方法およびプログラム |
US8515257B2 (en) * | 2007-10-17 | 2013-08-20 | International Business Machines Corporation | Automatic announcer voice attenuation in a presentation of a televised sporting event |
KR100926783B1 (ko) * | 2008-02-15 | 2009-11-13 | 한국과학기술연구원 | 물체인식 및 인식된 물체를 포함하는 주변 환경 정보를바탕으로 한 로봇의 자기 위치 추정 방법 |
WO2010025539A1 (en) * | 2008-09-05 | 2010-03-11 | Optosecurity Inc. | Method and system for performing x-ray inspection of a liquid product at a security checkpoint |
DE102009016230B4 (de) * | 2008-09-12 | 2013-12-19 | Siemens Aktiengesellschaft | Verfahren und Vorrichtung zur merkmalsbasierten Ortung eines mobilen Objekts mittels einer Referenzkarte oder zum unüberwachten Lernen einer Referenzkarte zur merkmalsbasierten Ortung |
KR101503903B1 (ko) * | 2008-09-16 | 2015-03-19 | 삼성전자 주식회사 | 이동 로봇의 지도 구성 장치 및 방법 |
JP5111616B2 (ja) * | 2008-11-26 | 2013-01-09 | 三菱電機株式会社 | 施設検索装置 |
JP4866951B2 (ja) * | 2009-09-16 | 2012-02-01 | 株式会社日立製作所 | 測位組み合わせ決定システム |
TWI403690B (zh) * | 2009-10-26 | 2013-08-01 | Ind Tech Res Inst | 自我定位裝置及其方法 |
US8635015B2 (en) * | 2009-12-17 | 2014-01-21 | Deere & Company | Enhanced visual landmark for localization |
US8908923B2 (en) * | 2011-05-13 | 2014-12-09 | International Business Machines Corporation | Interior location identification |
2011
- 2011-08-29 CN CN201180072688.9A patent/CN103717995B/zh not_active Expired - Fee Related
- 2011-08-29 EP EP11871559.8A patent/EP2765386A4/en not_active Withdrawn
- 2011-08-29 WO PCT/JP2011/069462 patent/WO2013030929A1/ja active Application Filing
- 2011-08-29 US US14/238,087 patent/US9911041B2/en not_active Expired - Fee Related
- 2011-08-29 JP JP2013530912A patent/JP5658372B2/ja not_active Expired - Fee Related
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07281753A (ja) * | 1994-04-15 | 1995-10-27 | Toshiba Corp | 移動ロボット |
JPH08304581A (ja) * | 1995-04-28 | 1996-11-22 | Toshiba Corp | プラント点検支援装置および方法 |
JP2004347488A (ja) * | 2003-05-23 | 2004-12-09 | Mitsubishi Electric Corp | 現場作業支援装置 |
JP2007322138A (ja) * | 2006-05-30 | 2007-12-13 | Toyota Motor Corp | 移動装置及び移動装置の自己位置推定方法 |
JP2009236774A (ja) | 2008-03-27 | 2009-10-15 | Hokuyo Automatic Co | 三次元測距装置 |
Non-Patent Citations (2)
Title |
---|
SEBASTIAN THRUN; WOLFRAM BURGARD; DIETER FOX: "Probabilistic Robotics", 2005, THE MIT PRESS |
See also references of EP2765386A4 |
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10551280B2 (en) | 2013-10-30 | 2020-02-04 | Tectus Dreamlab Pte Ltd | Arrangement and method for inspecting an object, in particular a building |
JP2017505444A (ja) * | 2013-10-30 | 2017-02-16 | テクタス・ドリームラブ・プライベート・リミテッドTectus Dreamlab Pte Ltd | 対象物、特に建造物を検査するための構成および方法 |
US10445160B2 (en) | 2014-12-31 | 2019-10-15 | Nuscale Power, Llc | Remote monitoring of critical reactor parameters |
JP2017182612A (ja) * | 2016-03-31 | 2017-10-05 | 倉敷紡績株式会社 | 画像配置方法及び画像配置用コンピュータプログラム |
JP6986188B6 (ja) | 2017-04-21 | 2022-02-08 | イントリンジック イノベーション エルエルシー | 環境地図生成および位置整合のための方法ならびにシステム |
JP2020519987A (ja) * | 2017-04-21 | 2020-07-02 | エックス デベロップメント エルエルシー | 環境地図生成および位置整合のための方法ならびにシステム |
US11275178B2 (en) | 2017-11-28 | 2022-03-15 | Shenzhen 3Irobotix Co., Ltd. | Method and device for drawing region outline and computer readable storage medium |
JP2021504715A (ja) * | 2017-11-28 | 2021-02-15 | 深▲せん▼市杉川机器人有限公司Shenzhen 3Irobotix Co., Ltd. | 領域輪郭作成の方法、装置及びコンピュータ読取可能な記憶媒体 |
JP2020004231A (ja) * | 2018-06-29 | 2020-01-09 | 大和ハウス工業株式会社 | 自律移動装置、自律移動プログラム及び位置推定システム |
JP7144991B2 (ja) | 2018-06-29 | 2022-09-30 | 大和ハウス工業株式会社 | 自律移動装置、自律移動プログラム及び位置推定システム |
JP7312864B2 (ja) | 2019-06-11 | 2023-07-21 | スクリーニング・イーグル・ドリームラボ・プライベート・リミテッド | 対象物、特に建造物を検査するための構成および方法 |
JP7042238B2 (ja) | 2019-06-11 | 2022-03-25 | スクリーニング・イーグル・ドリームラボ・プライベート・リミテッド | 対象物、特に建造物を検査するための構成および方法 |
JP2019194605A (ja) * | 2019-06-11 | 2019-11-07 | テクタス・ドリームラブ・プライベート・リミテッドTectus Dreamlab Pte Ltd | 対象物、特に建造物を検査するための構成および方法 |
JP2022058620A (ja) * | 2019-06-11 | 2022-04-12 | スクリーニング・イーグル・ドリームラボ・プライベート・リミテッド | 対象物、特に建造物を検査するための構成および方法 |
JP2021103149A (ja) * | 2019-12-25 | 2021-07-15 | 株式会社デンソー | 推定装置、推定方法、推定プログラム |
JP2021103148A (ja) * | 2019-12-25 | 2021-07-15 | 株式会社デンソー | 推定装置、推定方法、推定プログラム |
JP7318521B2 (ja) | 2019-12-25 | 2023-08-01 | 株式会社デンソー | 推定装置、推定方法、推定プログラム |
JP7318522B2 (ja) | 2019-12-25 | 2023-08-01 | 株式会社デンソー | 推定装置、推定方法、推定プログラム |
JP7520585B2 (ja) | 2020-06-12 | 2024-07-23 | 株式会社竹中工務店 | データ変換システム、及びデータ変換プログラム |
WO2022014443A1 (ja) * | 2020-07-16 | 2022-01-20 | コニカミノルタ株式会社 | プラント管理方法、プラント管理装置およびプラント管理プログラム |
JP7260839B2 (ja) | 2020-07-16 | 2023-04-20 | コニカミノルタ株式会社 | プラント管理方法、プラント管理装置およびプラント管理プログラム |
JPWO2022014443A1 (ja) * | 2020-07-16 | 2022-01-20 | ||
JPWO2022219780A1 (ja) * | 2021-04-15 | 2022-10-20 | ||
WO2022219780A1 (ja) * | 2021-04-15 | 2022-10-20 | 三菱電機株式会社 | 点検支援装置、点検支援システム、点検支援方法、及び点検支援プログラム |
JP7361992B2 (ja) | 2021-04-15 | 2023-10-16 | 三菱電機株式会社 | 点検支援装置、点検支援システム、点検支援方法、及び点検支援プログラム |
Also Published As
Publication number | Publication date |
---|---|
CN103717995B (zh) | 2016-05-11 |
JPWO2013030929A1 (ja) | 2015-03-23 |
CN103717995A (zh) | 2014-04-09 |
US9911041B2 (en) | 2018-03-06 |
EP2765386A1 (en) | 2014-08-13 |
JP5658372B2 (ja) | 2015-01-21 |
US20140168423A1 (en) | 2014-06-19 |
EP2765386A4 (en) | 2015-04-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5658372B2 (ja) | Monitoring device, monitoring system and monitoring method | |
US9250073B2 (en) | Method and system for position rail trolley using RFID devices | |
US7627447B2 (en) | Method and apparatus for localizing and mapping the position of a set of points on a digital model | |
EP4060288A2 (en) | Apparatus and method for providing vehicular positioning | |
JP6545279B2 (ja) | 衝突は発生しないかについて、車両が辿るべき目標軌跡を監視するための方法及び装置 | |
Bonnabel et al. | On the covariance of ICP-based scan-matching techniques | |
CN110470333B (zh) | 传感器参数的标定方法及装置、存储介质和电子装置 | |
CN110501036A (zh) | 传感器参数的标定检查方法及装置 | |
CN105874350A (zh) | 校准装置、校准方法及校准程序 | |
US9727978B2 (en) | Method for extracting outer space feature information from spatial geometric data | |
JP2019101694A (ja) | 位置特定装置、位置特定プログラム及び位置特定方法、並びに、撮影画像登録装置、撮影画像登録プログラム及び撮影画像登録方法 | |
CN112990151B (zh) | 障碍物检测模块的精度检测方法和电子设备 | |
Yu et al. | Displacement measurement of large structures using nonoverlapping field of view multi‐camera systems under six degrees of freedom ego‐motion | |
KR102050995B1 (ko) | 공간좌표의 신뢰성 평가 장치 및 방법 | |
US11288554B2 (en) | Determination method and determination device | |
CN105741260A (zh) | 行动定位装置及其定位方法 | |
Nocerino et al. | Introduction to mobile mapping with portable systems | |
KR101502071B1 (ko) | 랜드마크 기반 비전항법시스템을 위한 카메라 데이터 생성기와 그것을 실행시키기 위한 프로그램을 기록한 컴퓨터로 읽을 수 있는 매체 | |
JP2019168226A (ja) | プラント設備ナビゲーションシステムおよびプラント設備ナビゲーション方法 | |
CN114509778A (zh) | 自升造楼系统、三维矢量地图数据获取方法、装置和介质 | |
CN112344966A (zh) | 一种定位失效检测方法、装置、存储介质及电子设备 | |
CN112099509A (zh) | 地图优化方法、装置及机器人 | |
Barczyk et al. | Observability, covariance and uncertainty of ICP scan matching | |
JP7441579B1 (ja) | 情報処理システム及び情報処理方法 | |
US20230050389A1 (en) | System representation and method of use |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | WWE | Wipo information: entry into national phase | Ref document number: 201180072688.9; Country of ref document: CN |
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 11871559; Country of ref document: EP; Kind code of ref document: A1 |
 | ENP | Entry into the national phase | Ref document number: 2013530912; Country of ref document: JP; Kind code of ref document: A |
 | WWE | Wipo information: entry into national phase | Ref document number: 14238087; Country of ref document: US |
 | WWE | Wipo information: entry into national phase | Ref document number: 2011871559; Country of ref document: EP |
 | NENP | Non-entry into the national phase | Ref country code: DE |