WO2020157973A1 - Image processing device - Google Patents

Image processing device

Info

Publication number
WO2020157973A1
Authority
WO
WIPO (PCT)
Prior art keywords
time
deflection
image
amount
partial
Prior art date
Application number
PCT/JP2019/003696
Other languages
English (en)
Japanese (ja)
Inventor
Yuya Ishii (石井 遊哉)
Original Assignee
NEC Corporation (日本電気株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corporation
Priority to PCT/JP2019/003696 priority Critical patent/WO2020157973A1/fr
Priority to JP2020569324A priority patent/JP6989036B2/ja
Priority to US17/426,838 priority patent/US20220113260A1/en
Publication of WO2020157973A1 publication Critical patent/WO2020157973A1/fr

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/16Measuring arrangements characterised by the use of optical techniques for measuring the deformation in a solid, e.g. optical strain gauge
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/26Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/30Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/24Aligning, centring, orientation detection or correction of the image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/62Extraction of image or video features relating to a temporal dimension, e.g. time-based feature extraction; Pattern tracking
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8854Grading and classifying of flaws
    • G01N2021/8861Determining coordinates of flaws
    • G01N2021/8864Mapping zones of defects
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/8851Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8887Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques
    • G01N2021/8893Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques providing a video image and a processed signal for helping visual decision

Definitions

  • The present invention relates to an image processing device, an image processing method, and a recording medium for determining the orientation of a captured image.
  • An object of the present invention is to provide an image processing device that solves the above-mentioned problem, namely that it is difficult to determine the orientation of an image from the image itself when no subject, such as a shape or pattern that could identify the orientation, appears in the image.
  • An image processing apparatus according to the present invention comprises: a dividing means for spatially dividing a time-series image, captured of the surface of a structure while a traffic load passes over it, into a plurality of partial regions to generate a plurality of partial time-series images; a measuring means for measuring, from the plurality of partial time-series images, a temporal change in the amount of deflection of the surface of the structure for each of the partial regions; a comparing means for comparing the temporal changes in the amount of deflection of the surface of the structure between the partial regions; and a determination means for determining, based on the result of the comparison, the orientation of the time-series image with respect to the passing direction of the traffic load.
  • An image processing method according to the present invention spatially divides a time-series image, taken of the surface of a structure while a traffic load passes over it, into a plurality of partial regions to generate a plurality of partial time-series images; measures, from the plurality of partial time-series images, the temporal change in the amount of deflection of the surface of the structure for each of the partial regions; compares the temporal changes in the amount of deflection of the surface of the structure between the partial regions; and determines, based on the result of the comparison, the orientation of the time-series image with respect to the passing direction of the traffic load.
  • A computer-readable recording medium according to the present invention records a program for causing a computer to perform: a process of spatially dividing a time-series image, captured of the surface of a structure during passage of a traffic load, into a plurality of partial regions to generate a plurality of partial time-series images; a process of measuring, from the plurality of partial time-series images, the temporal change in the amount of deflection of the surface of the structure for each of the partial regions; a process of comparing the temporal changes in the amount of deflection of the surface of the structure between the partial regions; and a process of determining, based on the result of the comparison, the orientation of the time-series image with respect to the passing direction of the traffic load.
  • According to the present invention having the above-described configuration, the orientation of an image can be determined from the image itself even if no subject, such as a shape or pattern that could identify the orientation, appears in the image.
  • FIG. 1 is a diagram showing a configuration example of the diagnostic device according to the first embodiment of the present invention.
  • It is a diagram showing a configuration example of the computer in the diagnostic device according to the first embodiment of the present invention.
  • It is a flowchart showing an example of the operation of the diagnostic device according to the first embodiment of the present invention.
  • It is a flowchart showing details of the operation of the diagnostic device according to the first embodiment of the present invention for creating and displaying the shooting position/shooting direction guide screen.
  • It is a diagram showing a configuration example of the image orientation determination unit in the diagnostic device according to the first embodiment of the present invention.
  • It is a diagram showing the arrangement of the upper left block, the lower left block, the upper right block, and the lower right block when the lateral direction of the captured image is inclined 45 degrees counterclockwise with respect to the bridge axis direction.
  • It is a block diagram of the image processing apparatus according to the second embodiment of the present invention.
  • FIG. 1 is a diagram showing a configuration example of a diagnostic device 100 according to the first embodiment of the present invention.
  • The diagnostic device 100 includes a computer 110 and a camera 130 connected to the computer 110 via a cable 120.
  • The camera 130 is an imaging device that captures, at a predetermined frame rate, an image of the region 141 on the surface of the structure 140 to be diagnosed.
  • The structure 140 is a bridge in this embodiment.
  • The region 141 is a part of the floor slab that serves as the diagnostic site of the bridge.
  • However, the structure 140 is not limited to a bridge.
  • For example, the structure 140 may be an elevated structure of a highway or a railway.
  • The size of the region 141 is, for example, several tens of centimeters square.
  • The camera 130 is attached to a platform 151 on a tripod 150 so that the shooting direction of the camera can be fixed in any direction.
  • The camera 130 may be, for example, a high-speed camera including a CCD (Charge-Coupled Device) image sensor or a CMOS (Complementary MOS) image sensor with several million pixels.
  • The camera 130 may be a visible-light monochrome camera, an infrared camera, or a color camera.
  • The camera 130 also includes a GPS receiver that measures the position of the camera. Further, the camera 130 includes an azimuth sensor and an acceleration sensor that measure the shooting direction of the camera.
  • The computer 110 has a diagnostic function of capturing the image of the structure 140 taken by the camera 130, performing predetermined image processing to determine the soundness of the structure 140, and outputting the determination result.
  • The computer 110 also has a guide function that assists the operator so that the same region 141 of the structure 140 can be imaged from the same shooting position and in the same shooting direction each time, for diagnoses performed at a predetermined cycle such as once every several months.
  • FIG. 2 is a block diagram showing an example of the configuration of the computer 110.
  • The computer 110 includes a camera I/F (interface) unit 111, a communication I/F unit 112, an operation input unit 113, a screen display unit 114, a voice output unit 115, a storage unit 116, and an arithmetic processing unit 117.
  • The camera I/F unit 111 is connected to the camera 130 via the cable 120, and is configured to transmit and receive data between the camera 130 and the arithmetic processing unit 117.
  • The camera 130 includes a main unit 131, comprising an image sensor and an optical system, as well as the GPS receiver 132, the azimuth sensor 133, and the acceleration sensor 134 described above.
  • The camera I/F unit 111 is configured to perform data transmission and reception between the arithmetic processing unit 117 and the main unit 131, the GPS receiver 132, the azimuth sensor 133, and the acceleration sensor 134.
  • The communication I/F unit 112 includes a data communication circuit, and is configured to perform data communication with an external device (not shown) connected by wire or wirelessly.
  • The operation input unit 113 is composed of operation input devices such as a keyboard and a mouse, and is configured to detect operator operations and output them to the arithmetic processing unit 117.
  • The screen display unit 114 is composed of a screen display device such as an LCD (Liquid Crystal Display), and is configured to display various information such as a menu screen according to instructions from the arithmetic processing unit 117.
  • The voice output unit 115 is composed of a sound output device such as a speaker, and is configured to output various voices such as guide messages according to instructions from the arithmetic processing unit 117.
  • The storage unit 116 is composed of storage devices such as a hard disk and a memory, and is configured to store the processing information and the program 1161 necessary for various processes in the arithmetic processing unit 117.
  • The program 1161 realizes various processing units when read and executed by the arithmetic processing unit 117; it is read in advance from an external device (not shown) or from a recording medium via a data input/output function such as the communication I/F unit 112, and stored in the storage unit 116.
  • The main processing information stored in the storage unit 116 includes a shooting position 1162, a shooting direction 1163, a time-series image 1164, an image orientation determination result 1165, a diagnostic location database 1166, and a diagnosis result database 1167.
  • The shooting position 1162 is data including the latitude, longitude, and altitude representing the position of the camera measured by the GPS receiver 132, and the current time.
  • The shooting direction 1163 is data representing the shooting direction of the camera 130, calculated based on the data measured by the azimuth sensor 133 and the acceleration sensor 134 provided in the camera 130.
  • The shooting direction 1163 is composed of three angles representing the attitude of the camera 130: pitch, roll, and yaw.
  • The time-series image 1164 is a time-series image captured by the camera 130.
  • The time-series image 1164 may be the plurality of frame images forming a moving image of the region 141 of the structure 140 captured by the camera 130.
  • The image orientation determination result 1165 is data representing the orientation of the captured image determined based on the time-series image 1164.
  • The orientation of the captured image represents, for example, the relationship between the bridge axis direction of the bridge that is the structure 140 and the lateral direction of the captured image.
  • However, the orientation of the captured image is not limited to the above.
  • For example, the orientation of the captured image may represent the relationship between the bridge axis direction and the vertical direction of the captured image.
  • The diagnostic location database 1166 is a storage unit that stores information on past diagnostic locations.
  • FIG. 3 shows an example of the format of the diagnostic location information 11661 stored in the diagnostic location database 1166.
  • The diagnostic location information 11661 in this example includes a diagnostic location ID 11662, a registration date and time 11663, a registered shooting position 11664, a registered shooting direction 11665, and a registered time-series image 11666.
  • The diagnostic location ID 11662 is identification information that uniquely identifies the diagnostic location.
  • The registration date and time 11663 is the date and time when the diagnostic location information 11661 was registered.
  • The registered shooting position 11664 consists of the latitude, longitude, and altitude representing the position of the camera 130 when the diagnosis was performed.
  • The registered shooting direction 11665 consists of the three angles of pitch, roll, and yaw representing the attitude of the camera 130 at the time of diagnosis.
  • The registered time-series image 11666 is a time-series image obtained by shooting the region 141 of the structure 140 with the camera 130 from the registered shooting position 11664 in the registered shooting direction 11665.
  • The diagnosis result database 1167 is a storage unit that stores information on diagnosis results.
  • FIG. 4 shows an example of the format of the diagnosis result information 11671 stored in the diagnosis result database 1167.
  • The diagnosis result information 11671 in this example includes a diagnostic location ID 11672, a diagnosis date and time 11673, and a diagnosis result 11674.
  • The diagnostic location ID 11672 is identification information that uniquely identifies the diagnostic location.
  • The diagnosis date and time 11673 is the date and time when the diagnosis was performed.
  • The diagnosis result 11674 is information indicating the result of the diagnosis.
  • The arithmetic processing unit 117 has a processor such as an MPU and its peripheral circuits, and is configured to read the program 1161 from the storage unit 116 and execute it, realizing various processing units through the cooperation of the hardware and the program 1161.
  • The main processing units realized by the arithmetic processing unit 117 are a shooting position acquisition unit 1171, a shooting direction acquisition unit 1172, a time-series image acquisition unit 1173, an image orientation determination unit 1174, a diagnosis unit 1175, and a control unit 1176.
  • The shooting position acquisition unit 1171 is configured to periodically acquire, via the camera I/F unit 111, the position of the camera 130 and the current time measured by the GPS receiver 132, and to update the shooting position 1162 in the storage unit 116 with the acquired data.
  • The shooting direction acquisition unit 1172 is configured to periodically acquire, through the camera I/F unit 111, the azimuth angle measured by the azimuth sensor 133 and the accelerations in three directions measured by the acceleration sensor 134.
  • The shooting direction acquisition unit 1172 is also configured to calculate, from the acquired azimuth angle and accelerations, the attitude of the camera 130, that is, the three angles of pitch, roll, and yaw representing the shooting direction, and to update the shooting direction 1163 in the storage unit 116 with the calculated values.
  • Here, the roll representing the shooting direction is calculated based on the azimuth angle and the three-direction accelerations acquired by the azimuth sensor 133 and the acceleration sensor 134.
  • Alternatively, the roll representing the shooting direction may be the image orientation determined by the image orientation determination unit 1174.
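  • As a rough illustration (not from the patent) of how pitch and roll can be derived from the three-direction accelerations while yaw is taken from the azimuth sensor, the following sketch applies the standard gravity-vector formulas; the function name and axis conventions are assumptions.

```python
import math

def camera_attitude(ax, ay, az, azimuth_deg):
    """Estimate the camera attitude (pitch, roll, yaw) in degrees.

    ax, ay, az  -- accelerations along the assumed camera x/y/z axes,
                   taken to measure only gravity while the camera is still
    azimuth_deg -- heading from the azimuth sensor, used directly as yaw
    """
    # Standard tilt formulas for a static three-axis accelerometer.
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll, azimuth_deg
```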
  • The time-series image acquisition unit 1173 is configured to acquire the time-series image captured by the camera 130 from the main unit 131 through the camera I/F unit 111, and to update the time-series image 1164 in the storage unit 116 with the acquired image.
  • The time-series image acquisition unit 1173 is configured to acquire time-series images from before to after at least one vehicle passes over the region 141 of the structure 140. For example, the time-series image acquisition unit 1173 starts acquiring the time-series images before a vehicle passes the bridge portion around the region 141, according to an operator's instruction or the output of a sensor (not shown) that mechanically detects an approaching vehicle.
  • The acquisition of the time-series images may be terminated by an operator's instruction or by the output of a sensor (not shown) that mechanically detects the passing vehicle.
  • Alternatively, the time-series image acquisition unit 1173 may acquire a time-series image only for a time frame during which at least one vehicle can be considered to pass over the bridge, without synchronizing with the exact timing at which a vehicle passes.
  • The image orientation determination unit 1174 is configured to determine the orientation of the captured image based on the time-series image 1164. Details of the image orientation determination unit 1174 will be described later.
  • The diagnosis unit 1175 is configured to perform a deterioration diagnosis of the structure 140 based on the image of the structure 140 taken by the camera 130.
  • The deterioration diagnosis method is not particularly limited.
  • For example, the diagnosis unit 1175 measures surface vibration by analyzing a moving image of the region 141 of the structure 140, such as a bridge, excited by vehicle traffic and captured at high speed by the camera 130, and is configured to estimate from the vibration pattern internal deterioration such as cracks, peeling, and cavities.
  • The diagnosis unit 1175 is configured to store information on the estimated diagnosis result in the diagnosis result database 1167.
  • The control unit 1176 is configured to perform the main control of the diagnostic device 100.
  • FIG. 5 is a flowchart showing an example of the operation of the diagnostic device 100.
  • The operation of the diagnostic device 100 when performing the deterioration diagnosis of the structure 140 will be described below with reference to the drawings.
  • First, control by the control unit 1176 is started.
  • The control unit 1176 displays an initial screen as shown in FIG. 6 on the screen display unit 114 (step S1).
  • A new diagnosis button and a continuous diagnosis button are displayed on the initial screen.
  • The new diagnosis button is selected when a first diagnosis is performed on a structure 140 that is a new diagnosis target.
  • The continuous diagnosis button is selected when the second and subsequent diagnoses are performed on the same portion of the same structure 140. The operation during a new diagnosis is described first below, followed by the operation during a continuous diagnosis.
  • FIG. 7 is a flowchart showing an example of the new diagnosis process.
  • The control unit 1176 first displays the new diagnosis screen on the screen display unit 114 (step S11).
  • Fig. 8 shows an example of the new diagnosis screen.
  • The new diagnosis screen in this example includes a monitor screen that displays the image captured by the camera 130, image orientation guide information, a registration button, a diagnosis button, and an end button.
  • The latest time-series image captured by the camera 130 is acquired from the camera 130 by the time-series image acquisition unit 1173 and stored in the storage unit 116 as the time-series image 1164.
  • The control unit 1176 acquires the time-series image 1164 from the storage unit 116 and displays it on the monitor screen of the new diagnosis screen.
  • The operator determines the diagnostic location of the structure 140 and sets the determined location as the region 141. The operator then adjusts the installation location of the tripod 150 so that the image of the region 141 is displayed at an appropriate size on the monitor screen of the new diagnosis screen.
  • The angle of view and the magnification of the camera 130 are fixed. Therefore, when the image size of the region 141 is small, the operator moves the tripod 150 closer to the structure 140 to increase the image size of the region 141. Conversely, when the image size of the region 141 is large, the operator reduces it by moving the tripod 150 farther from the structure 140.
  • The control unit 1176 displays, in the display field of the image orientation guide information, text or an illustration indicating how much the lateral direction of the captured image is inclined with respect to the bridge axis direction of the bridge that is the structure 140. Further, the control unit 1176 converts the information displayed in this field into voice and outputs it from the voice output unit 115. Examples are: "The horizontal direction of the captured image is parallel to the bridge axis direction," "The horizontal direction of the captured image is perpendicular to the bridge axis direction," and "The horizontal direction of the captured image is tilted 45 degrees clockwise with respect to the bridge axis direction."
  • Based on the displayed and voiced image orientation guide information, the operator can recognize whether the position and shooting direction of the camera 130 are appropriate, and can judge the difference between the lateral direction of the image captured by the camera 130 and the predetermined direction (the bridge axis direction), for example, about 45 degrees clockwise. Finally, referring to the image orientation guide information, the operator adjusts the shooting direction of the camera 130 using the platform 151 so that the orientation of the captured image becomes the predetermined direction, for example, so that the lateral direction of the captured image is parallel to the bridge axis.
  • When the control unit 1176 detects that the end button is turned on (step S14), the process of FIG. 7 ends.
  • When the control unit 1176 detects that the registration button is turned on (step S12), it creates new diagnostic location information 11661 and registers it in the diagnostic location database 1166 (step S15).
  • The current position of the camera 130 and the current time are acquired from the GPS receiver 132 by the shooting position acquisition unit 1171 and stored in the storage unit 116 as the shooting position 1162.
  • The shooting direction of the camera 130 is calculated by the shooting direction acquisition unit 1172 based on the azimuth and accelerations acquired from the azimuth sensor 133 and the acceleration sensor 134, and is stored in the storage unit 116 as the shooting direction 1163.
  • The image captured by the camera 130 is acquired from the camera 130 by the time-series image acquisition unit 1173 and stored in the storage unit 116 as the time-series image 1164.
  • The control unit 1176 acquires the shooting position 1162, the shooting direction 1163, and the time-series image 1164 from the storage unit 116, creates the diagnostic location information 11661 based on this information, and registers it in the diagnostic location database 1166. At that time, the control unit 1176 sets a newly assigned ID as the diagnostic location ID 11662 of the diagnostic location information 11661, and sets the current time as the registration date and time 11663.
  • When the control unit 1176 detects that the diagnosis button is turned on (step S13), it activates the diagnosis unit 1175 and executes the diagnosis (step S16).
  • As described above, the diagnosis unit 1175 measures, for example, surface vibration by analyzing a moving image of the region 141 of the structure 140 taken at high speed by the camera 130, and estimates from the vibration pattern the internal deterioration state, such as cracks, peeling, or cavities. The diagnosis unit 1175 then stores information on the estimated diagnosis result in the diagnosis result database 1167.
  • The control unit 1176 reads the diagnosis result of the diagnosis unit 1175 from the diagnosis result database 1167, displays it on the screen display unit 114, and/or outputs it to an external terminal through the communication I/F unit 112 (step S17).
  • FIG. 9 is a flowchart showing an example of the continuous diagnosis process.
  • The control unit 1176 first displays the diagnostic location selection screen on the screen display unit 114 (step S21).
  • Fig. 10 shows an example of the diagnostic location selection screen.
  • The diagnostic location selection screen in this example has a map, a selection button, and an end button.
  • On the map, a current position icon (the circle in the figure) indicating the current position of the camera 130 and past position icons (the x marks in the figure) indicating past shooting positions are displayed.
  • Specifically, the control unit 1176 searches the diagnostic location database 1166 using the current position of the camera 130 as a key, and acquires from it the diagnostic location information 11661 whose registered shooting position 11664 is within a predetermined distance of the current position. The control unit 1176 then displays a past position icon at the position indicated by the registered shooting position 11664 of each acquired piece of diagnostic location information 11661.
  • When the control unit 1176 detects that the selection button is turned on (step S23), it creates a shooting position/shooting direction guide screen based on the diagnostic location information 11661 corresponding to the selected past position icon, and displays it on the screen display unit 114 (step S24).
  • Fig. 11 shows an example of the shooting position/shooting direction guide screen.
  • The shooting position/shooting direction guide screen in this example includes a monitor screen, a diagnostic location ID display field, a display field for shooting position/shooting direction guide information, a diagnosis button, and an end button.
  • The monitor screen displays the image captured by the camera 130.
  • The display field of the shooting position/shooting direction guide information shows information on the difference between the current position of the camera 130 and the position of the camera 130 when the registered image was shot, and on the difference between the current shooting direction of the camera 130 and the shooting direction of the camera 130 when the registered image was shot.
  • The control unit 1176 also converts the information displayed in the display field of the shooting position/shooting direction guide information into voice and outputs it from the voice output unit 115 as a guide message.
  • Based on the displayed and voiced shooting position/shooting direction guide information, the operator can recognize whether the position and shooting direction of the camera 130 are appropriate, and can determine how to change them.
  • After displaying the shooting position/shooting direction guide screen, the control unit 1176 detects whether the end button is turned on (step S25) and whether the diagnosis button is turned on (step S26); when neither button is turned on, it returns to step S24. Therefore, when the operator changes the position or shooting direction of the camera 130, the shooting position/shooting direction guide screen is recreated and redrawn according to the change. When the diagnosis button is turned on, the control unit 1176 activates the diagnosis unit 1175 and executes the diagnosis (step S27).
  • The control unit 1176 reads the diagnosis result of the diagnosis unit 1175 from the diagnosis result database 1167, displays it on the screen display unit 114, and/or outputs it to an external terminal through the communication I/F unit 112 (step S28). Then, the control unit 1176 returns to step S24. On the other hand, when the end button is turned on, the control unit 1176 ends the processing of FIG. 9.
  • FIG. 12 is a flowchart showing details of step S24 in FIG.
  • The control unit 1176 first acquires the diagnostic location ID, the registered shooting position, and the registered shooting direction from the diagnostic location information 11661 corresponding to the selected past position icon (step S31). Next, the control unit 1176 acquires the shooting position 1162, the shooting direction 1163, and the time-series image 1164 from the storage unit 116 (step S32). Next, the control unit 1176 compares the shooting position 1162 and the shooting direction 1163 with the registered shooting position and the registered shooting direction, and detects the difference in position and the difference in direction (step S33).
  • Next, the control unit 1176 creates a monitor screen based on the time-series image 1164, and creates the shooting position/shooting direction guide information based on the position difference and the direction difference detected in step S33 (step S34). The control unit 1176 then assembles the shooting position/shooting direction guide screen from the created monitor screen, the shooting position/shooting direction guide information, and the other screen elements, and displays it on the screen display unit 114 (step S35).
  • In the shooting position/shooting direction guide information, information on the difference between the position of the camera 130 detected by the GPS receiver 132 and the registered shooting position is displayed. Further, information on the difference between the shooting direction of the camera 130, calculated from the azimuth and accelerations detected by the azimuth sensor 133 and the acceleration sensor 134, and the registered shooting direction is displayed. With such information, the operator can adjust the position and shooting direction of the camera 130 to the same camera position and shooting direction as in the initial diagnosis. Therefore, if the horizontal direction of the camera image was adjusted to be parallel to the bridge axis direction at the time of the first diagnosis, the orientation of the camera image can be adjusted to that same orientation during the second and subsequent diagnoses.
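  • The differences shown in the guide are simple element-wise deltas between the current and registered values; a minimal sketch (illustrative names, assuming positions as (latitude, longitude, altitude) tuples and directions as (pitch, roll, yaw) tuples):

```python
def guide_differences(current_pos, registered_pos, current_dir, registered_dir):
    """Position and direction differences for the shooting position/
    shooting direction guide information (step S33)."""
    d_pos = tuple(c - r for c, r in zip(current_pos, registered_pos))
    d_dir = tuple(c - r for c, r in zip(current_dir, registered_dir))
    return d_pos, d_dir
```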
  • FIG. 13 is a block diagram showing an example of the image orientation determination unit 1174.
  • The image orientation determination unit 1174 includes a division unit 11741, a measurement unit 11742, a comparison unit 11743, and a determination unit 11744.
  • The division unit 11741 is configured to spatially divide the time-series image 1164, obtained by capturing the region 141 of the structure 140, into a plurality of partial regions and to generate a plurality of partial time-series images corresponding to those partial regions.
  • FIG. 14 is a diagram showing an example of the relationship between the time-series image 1164 before division and the partial time-series images after division.
  • The time-series image 1164 before division is composed of n frame images arranged in time order.
  • Each frame image is divided into four equal parts: an upper left block, a lower left block, an upper right block, and a lower right block. Each of these blocks constitutes one partial region.
  • The block combining the upper left block and the lower left block is called the left block, the block combining the upper right block and the lower right block is called the right block, the block combining the upper left block and the upper right block is called the upper block, and the block combining the lower left block and the lower right block is called the lower block.
  • Each of the left block, the right block, the upper block, and the lower block also constitutes one partial region.
  • One partial time-series image is composed of the set of n partial frame images of the upper left block arranged in time order (this partial time-series image is referred to as the upper left partial time-series image BG1).
  • Another partial time-series image is composed of the set of n partial frame images of the lower left block arranged in time order (the lower left partial time-series image BG2).
  • Another is composed of the set of n partial frame images of the upper right block arranged in time order (the upper right partial time-series image BG3).
  • Another is composed of the set of n partial frame images of the lower right block arranged in time order (the lower right partial time-series image BG4).
  • Another is composed of the set of n partial frame images of the left block arranged in time order (the left partial time-series image BG5).
  • Another is composed of the set of n partial frame images of the right block arranged in time order (the right partial time-series image BG6).
  • Another is composed of the set of n partial frame images of the upper block arranged in time order (the upper partial time-series image BG7).
  • The last partial time-series image is composed of the set of n partial frame images of the lower block arranged in time order (the lower partial time-series image BG8).
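  • As a minimal sketch of this dividing step (illustrative, not code from the patent), the following NumPy snippet splits a time-series image of shape (n, H, W) into the eight partial time-series images BG1 to BG8:

```python
import numpy as np

def divide_time_series(frames: np.ndarray) -> dict:
    """Split a (n, H, W) time-series image into partial time-series
    images: the four quadrant blocks and their four pairwise unions."""
    n, h, w = frames.shape
    return {
        "upper_left":  frames[:, : h // 2, : w // 2],  # BG1
        "lower_left":  frames[:, h // 2 :, : w // 2],  # BG2
        "upper_right": frames[:, : h // 2, w // 2 :],  # BG3
        "lower_right": frames[:, h // 2 :, w // 2 :],  # BG4
        "left":        frames[:, :, : w // 2],         # BG5
        "right":       frames[:, :, w // 2 :],         # BG6
        "upper":       frames[:, : h // 2, :],         # BG7
        "lower":       frames[:, h // 2 :, :],         # BG8
    }
```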
  • The measurement unit 11742 is configured to measure, from each partial time-series image generated by the division unit 11741, the temporal change in the amount of deflection of the surface of the structure 140 in the corresponding partial region.
  • When a deflection δ is generated in the floor slab of the bridge by a traffic load, the shooting distance L between the camera and the floor slab becomes shorter. The captured image is therefore enlarged around the optical axis of the camera, and an apparent displacement δi due to the deflection occurs.
  • The shooting distance L can be measured in advance by, for example, a laser rangefinder; the distance x can be obtained from the displacement-calculation position in the image before division and the camera optical axis; and F is known for each imaging device.
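  • The text leaves the relation implicit, but under a pinhole-camera model (assuming F is the focal length, which cancels when x and δi are both expressed in pixels), a point at distance x from the optical axis shifts by δi = xδ/(L − δ) when the slab moves closer by δ, giving δ = Lδi/(x + δi). A minimal sketch under that assumption:

```python
def deflection_from_apparent_displacement(delta_i: float, L: float, x: float) -> float:
    """Deflection of the floor slab from the apparent image displacement.

    delta_i -- apparent displacement (pixels) at the calculation position
    L       -- shooting distance to the slab, measured in advance
               (e.g. with a laser rangefinder)
    x       -- distance (pixels) of the calculation position from the
               camera optical axis in the image before division

    Solves delta_i = x * delta / (L - delta) for delta.
    """
    return L * delta_i / (x + delta_i)
```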
  • FIG. 15 is a schematic diagram showing an example of a temporal change in the amount of deflection of a partial area of one partial time series image.
  • The vertical axis represents the amount of deflection, and the horizontal axis represents time.
  • The amount of deflection is almost zero at first, then gradually increases, reaches its maximum at time t, and then gradually decreases and returns to zero.
  • Such a characteristic is obtained when one vehicle passes directly above or near the partial region within the time from the first to the last image frame of the time-series image.
  • The comparison unit 11743 is configured to compare, between different partial regions, the temporal changes in the amount of deflection of the surface of the structure 140 measured by the measurement unit 11742 for each partial region.
  • As the method of comparing the temporal changes in the deflection amounts of two partial regions, a method of obtaining the difference between the deflection amounts of the two partial regions at the same times is used. That is, if the deflection amounts of one partial region at times t1, t2, ..., tn are δ11, δ12, ..., δ1n, and the deflection amounts of the other partial region at the same times are δ21, δ22, ..., δ2n, the differences at times t1, t2, ..., tn are calculated as δ11−δ21, δ12−δ22, ..., δ1n−δ2n.
  • The partial regions compared with each other form the following four combinations: combination A (the left block and the right block), combination B (the upper block and the lower block), combination C (the upper left block and the lower right block), and combination D (the lower left block and the upper right block).
  • FIG. 16 is a graph showing an example of the temporal changes in the deflection amounts of two partial regions.
  • FIG. 17 is a graph showing an example of the temporal change in the difference between the deflection amounts of the two partial regions.
  • The temporal change in the deflection amount of one partial region is shown by the solid line in the graph of FIG. 16, and that of the other partial region is shown by the broken line.
  • The temporal change in the difference between the deflection amounts of the two partial regions, obtained by subtracting the deflection amount of the other partial region from that of the one partial region, is shown by the solid line in the graph of FIG. 17.
  • The difference in the deflection amounts is zero at first, gradually increases in the positive direction, reaches its maximum at time t1, then gradually decreases, becomes minimum at time t2, and thereafter gradually approaches zero again.
  • Here, the value obtained by adding the absolute values of the maximum value and the minimum value of the difference in the deflection amounts is defined as the maximum value of the difference in the deflection amounts.
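  • A small sketch of this comparison metric (illustrative only): given two deflection series sampled at the same times, it forms the per-time differences and returns the sum of the absolute values of their maximum and minimum.

```python
import numpy as np

def max_deflection_difference(d1: np.ndarray, d2: np.ndarray) -> float:
    """'Maximum value of the difference in the deflection amounts' of two
    partial regions: |max| + |min| of the per-time difference series."""
    diff = d1 - d2
    return abs(diff.max()) + abs(diff.min())
```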
  • The determination unit 11744 is configured to determine the orientation of the captured image with respect to the passing direction of the traffic load based on the comparison results for the combinations produced by the comparison unit 11743.
  • Specifically, the determination unit 11744 determines whether the lateral direction of the captured image is parallel or perpendicular to the bridge axis direction, and, when it is neither, determines how much the lateral direction of the captured image is inclined with respect to the bridge axis direction.
  • First, when the maximum value of the difference in the deflection amounts of combination A is equal to or greater than a predetermined threshold THmax and the maximum value of the difference in the deflection amounts of combination B is equal to or less than a predetermined threshold THmin (< THmax) (this condition is hereinafter referred to as Condition 1), the determination unit 11744 determines that the lateral direction of the captured image is parallel to the bridge axis direction.
  • The threshold THmin is set to 0 or a positive value close to 0.
  • The threshold THmax is set, for example, to the maximum deflection of the floor slab observed when a single vehicle (for example, a private car) passes over the bridge.
  • However, the thresholds THmin and THmax are not limited to the above examples; they may be fixed values or variable values that change according to the situation of each structure. The reason for the above determination is described below.
  • When the lateral direction of the captured image is parallel to the bridge axis direction, the left block (G11, G12) and the right block (G13, G14) of combination A are arranged side by side along the bridge axis direction, as shown in FIG. 18, for example.
  • The upper blocks (G11, G13) and the lower blocks (G12, G14) of combination B are then arranged side by side perpendicular to the bridge axis direction. Since a vehicle passing over the bridge moves parallel to the bridge axis direction, the temporal changes in the deflection amounts of the left block and the right block are shifted in time from each other, as shown by the solid line and the broken line in the graph of FIG. 19, for example.
  • Therefore, the temporal change in the difference between the two deflection amounts has the characteristic shown by the solid line in the graph of FIG. 20, and the maximum value of the difference in the deflection amounts is equal to or greater than the threshold THmax.
  • On the other hand, the temporal changes in the deflection amounts of the upper block and the lower block become substantially equal to each other, as indicated by the alternate long and short dash lines in the graph of FIG. 19. Therefore, the temporal change in the difference between the two deflection amounts has the characteristic shown by the solid line in the graph of FIG. 21.
  • That is, the maximum value of the difference in the deflection amounts is equal to or less than the threshold THmin.
  • Next, when the maximum value of the difference in the deflection amounts of combination B is equal to or greater than the threshold THmax and the maximum value of the difference in the deflection amounts of combination A is equal to or less than the threshold THmin (this condition is hereinafter referred to as Condition 2), the determination unit 11744 determines that the lateral direction of the captured image is perpendicular to the bridge axis direction. The reason for this determination is as follows.
  • In this case, the upper blocks (G11, G13) and the lower blocks (G12, G14) of combination B are arranged side by side along the bridge axis direction, for example, as shown in FIG. 22.
  • The left blocks (G11, G12) and the right blocks (G13, G14) of combination A are then arranged side by side perpendicular to the bridge axis direction. Since a vehicle passing over the bridge moves parallel to the bridge axis direction, the temporal changes in the deflection amounts of the upper block and the lower block are shifted in time from each other, as shown by the solid line and the broken line in the graph of FIG. 19.
  • Therefore, the temporal change in the difference between the two deflection amounts is as shown by the solid line in the graph of FIG. 20, and the maximum value of the difference in the deflection amounts is equal to or greater than the threshold THmax.
  • On the other hand, the temporal changes in the deflection amounts of the left block and the right block are substantially equal to each other, as indicated by the alternate long and short dash lines in the graph of FIG. 19. Therefore, the temporal change in the difference between the deflection amounts is as shown by the solid line in the graph of FIG. 21, and the maximum value of the difference in the deflection amounts is equal to or less than the threshold THmin.
  • When neither Condition 1 nor Condition 2 is satisfied, the determination unit 11744 determines that the lateral direction of the captured image is neither parallel nor perpendicular to the bridge axis direction. In that case, the determination unit 11744 further determines how much the lateral direction of the captured image is inclined with respect to the bridge axis direction.
  • Specifically, when the maximum value of the difference in the deflection amounts of combination D is equal to or greater than the threshold THmax and the maximum value of the difference in the deflection amounts of combination C is equal to or less than the threshold THmin (this condition is hereinafter referred to as Condition 3), the determination unit 11744 determines that the lateral direction of the captured image is inclined 45 degrees clockwise with respect to the bridge axis direction. The reason for this determination is as follows.
  • In this case, the lower left block G12 and the upper right block G13 of combination D are aligned parallel to the bridge axis direction, as shown in the example of FIG. 23.
  • The upper left block G11 and the lower right block G14 of combination C are then arranged side by side perpendicular to the bridge axis direction. Since a vehicle passing over the bridge moves parallel to the bridge axis direction, the temporal changes in the deflection amounts of the lower left block G12 and the upper right block G13 are shifted in time from each other, as shown by the solid line and the broken line in the graph of FIG. 19, for example. Therefore, the temporal change in the difference between the two deflection amounts is as shown by the solid line in the graph of FIG. 20.
  • That is, the maximum value of the difference in the deflection amounts is equal to or greater than the threshold THmax.
  • On the other hand, the temporal changes in the deflection amounts of the upper left block G11 and the lower right block G14 become substantially equal to each other, as indicated by the alternate long and short dash lines in the graph of FIG. 19. Therefore, the temporal change in the difference between the deflection amounts is as shown by the solid line in the graph of FIG. 21, and the maximum value of the difference in the deflection amounts is equal to or less than the threshold THmin.
  • Next, when the maximum value of the difference in the deflection amounts of combination C is equal to or greater than the threshold THmax and the maximum value of the difference in the deflection amounts of combination D is equal to or less than the threshold THmin (this condition is hereinafter referred to as Condition 4), the determination unit 11744 determines that the lateral direction of the captured image is inclined 45 degrees counterclockwise with respect to the bridge axis direction. The reason for this determination is as follows.
  • In this case, the upper left block G11 and the lower right block G14 of combination C are aligned parallel to the bridge axis direction.
  • The lower left block G12 and the upper right block G13 of combination D are then arranged side by side perpendicular to the bridge axis direction. Since a vehicle passing over the bridge moves parallel to the bridge axis direction, the temporal changes in the deflection amounts of the upper left block G11 and the lower right block G14 are shifted in time from each other, as shown by the solid line and the broken line in the graph of FIG. 19, for example.
  • Therefore, the temporal change in the difference between the two deflection amounts is as shown by the solid line in the graph of FIG. 20, and the maximum value of the difference in the deflection amounts is equal to or greater than the threshold THmax.
  • On the other hand, the temporal changes in the deflection amounts of the lower left block G12 and the upper right block G13 are substantially equal to each other, as indicated by the alternate long and short dash lines in the graph of FIG. 19. Therefore, the temporal change in the difference between the deflection amounts is as shown by the solid line in the graph of FIG. 21, and the maximum value of the difference in the deflection amounts is equal to or less than the threshold THmin.
  • When none of Conditions 1 to 4 is satisfied, the determination unit 11744 compares the maximum value of the difference in the deflection amounts of combination A with that of combination B to judge which is greater. If the maximum value for combination A is larger than that for combination B (this condition is hereinafter referred to as Condition 5), the determination unit 11744 judges that the lateral direction of the captured image is tilted within ±45 degrees with respect to the bridge axis direction.
  • This is because the maximum value of the difference in the deflection amounts of combination A becomes larger than that of combination B when the blocks G11 to G14 are in any state between the state shown in FIG. 23, which is rotated 45 degrees clockwise from the state shown in FIG. 18, and the state shown in FIG. 24, which is rotated 45 degrees counterclockwise.
  • Conversely, if the maximum value of the difference in the deflection amounts of combination B is larger than that of combination A (this condition is hereinafter referred to as Condition 6), the determination unit 11744 judges that the lateral direction of the captured image is tilted within ±45 degrees with respect to the direction perpendicular to the bridge axis direction.
  • Further, when Condition 5 is satisfied, the determination unit 11744 compares the maximum value of the difference in the deflection amounts of combination C with that of combination D. If the maximum value for combination C is larger than that for combination D, the determination unit 11744 judges that the lateral direction of the captured image is tilted counterclockwise within 45 degrees with respect to the bridge axis direction.
  • Otherwise, the determination unit 11744 judges that the lateral direction of the captured image is tilted clockwise within 45 degrees with respect to the bridge axis direction.
  • Similarly, when Condition 6 is satisfied, the determination unit 11744 compares the maximum value of the difference in the deflection amounts of combination C with that of combination D. If the maximum value for combination C is larger than that for combination D, the determination unit 11744 judges that the lateral direction of the captured image is tilted clockwise within 45 degrees with respect to the direction perpendicular to the bridge axis direction.
  • Otherwise, the determination unit 11744 judges that the lateral direction of the captured image is tilted counterclockwise within 45 degrees with respect to the direction perpendicular to the bridge axis direction.
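  • Pulling Conditions 1 to 6 together, the decision procedure can be sketched as follows (illustrative only; mA to mD are the maximum difference values of combinations A to D, and the returned labels are assumptions):

```python
def determine_orientation(mA, mB, mC, mD, th_max, th_min):
    """Classify the lateral direction of the captured image relative to
    the bridge axis from the maximum deflection-difference values of
    combinations A (left/right), B (upper/lower), C (upper-left/
    lower-right), and D (lower-left/upper-right)."""
    if mA >= th_max and mB <= th_min:              # Condition 1
        return "parallel to the bridge axis"
    if mB >= th_max and mA <= th_min:              # Condition 2
        return "perpendicular to the bridge axis"
    if mD >= th_max and mC <= th_min:              # Condition 3
        return "tilted 45 degrees clockwise"
    if mC >= th_max and mD <= th_min:              # Condition 4
        return "tilted 45 degrees counterclockwise"
    if mA > mB:                                    # Condition 5
        side = "counterclockwise" if mC > mD else "clockwise"
        return f"tilted {side} within 45 degrees of the bridge axis"
    side = "clockwise" if mC > mD else "counterclockwise"  # Condition 6
    return f"tilted {side} within 45 degrees of the perpendicular"
```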
  • In the above embodiment, the image orientation determination unit 1174 divides the entire frame image into four equal parts.
  • However, the method of dividing the frame image into a plurality of blocks is not limited to the above.
  • For example, only a part of the image, such as the central portion excluding the periphery, may be divided into a plurality of blocks.
  • Each divided block need not be in contact with the other blocks.
  • The number of divisions is not limited to 4; it may be less than 4, that is, 2 or 3, or greater than 4.
  • The combination of partial regions whose temporal changes in deflection are compared is not limited to the above example.
  • For example, the combination of the upper left block and the upper right block, or the combination of the lower left block and the lower right block, may be used instead of the combination of the upper block and the lower block.
  • Similarly, the combination of the upper left block and the lower left block, or the combination of the upper right block and the lower right block, may be used instead of the combination of the left block and the right block.
  • In the above embodiment, in order to compare the temporal changes in the deflection amounts of two partial regions, the comparison unit 11743 obtains the difference between the deflection amount of one partial region and that of the other partial region at the same times.
  • However, the comparison method is not limited to the above.
  • For example, the comparison unit 11743 may take the temporal variation pattern of the deflection amount of one partial region as a first variation pattern and that of the other partial region as a second variation pattern, and calculate, as a substitute for the maximum value of the difference in the deflection amounts described above, the minimum shift time by which the first variation pattern must be displaced to best match the second variation pattern.
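  • One way to realize this variant (an illustrative sketch, not the patent's code) is to take the lag maximizing the cross-correlation of the two mean-removed series as the best-match shift time:

```python
import numpy as np

def best_match_shift(d1: np.ndarray, d2: np.ndarray, dt: float) -> float:
    """Shift time by which the first deflection series must be displaced
    to best match the second, via full cross-correlation; dt is the
    sampling interval of the series."""
    a = d1 - d1.mean()
    b = d2 - d2.mean()
    corr = np.correlate(a, b, mode="full")  # lags from -(n-1) to n-1
    lag = int(np.argmax(corr)) - (len(d2) - 1)
    return lag * dt
```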
  • the orientation of an image can be determined from the image itself even if there is no subject whose orientation can be specified in the image taken from the bottom of the bridge floor.
  • You can The reason is that a time-series image taken while a bridge is passing a traffic load is spatially divided into a plurality of partial areas to generate a plurality of partial time-series images. This is to determine the orientation of the image with respect to the passage direction of the traffic load by measuring the temporal change in the amount of flexure of the bridge and comparing the temporal changes in the amount of flexure of the bridge for each partial area.
  • further, since the operator is notified visually or by voice of the determined orientation of the image, the operator can correctly adjust the orientation of the camera, which captures the area to be the diagnosis point of the bridge, to a predetermined direction, that is, parallel or perpendicular to the bridge axis.
  • the control unit 1176 may also use the image orientation determination unit 1174 to determine the image orientation of each registration time-series image 11666 recorded in the diagnostic location database 1166 of FIG. 2, and convert the orientations of the images so that the registration time-series images 11666 of all the diagnostic location information 11661 are aligned to a predetermined orientation (for example, with the horizontal direction of the image parallel to the bridge axis).
  • in this case, the control unit 1176 functions as a positioning unit; a minimal alignment sketch follows below.
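The alignment step can be sketched as follows, assuming the determined orientation can be expressed as a whole number of 90-degree turns; the helper name and this representation are illustrative assumptions, not part of the embodiment.

    import numpy as np

    def align_time_series(frames: list[np.ndarray], quarter_turns: int) -> list[np.ndarray]:
        """Rotate every frame by the given number of counterclockwise quarter
        turns so the series matches the predetermined orientation (for example,
        image horizontal parallel to the bridge axis)."""
        return [np.rot90(f, k=quarter_turns) for f in frames]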
  • in this embodiment, an outline of the image processing apparatus of the present invention will be described. FIG. 25 is a block diagram of the image processing apparatus according to this embodiment.
  • the image processing apparatus 200 is configured to include a dividing unit 201, a measuring unit 202, a comparing unit 203, and a determining unit 204.
  • the dividing unit 201 is configured to spatially divide a time-series image captured of the surface of a structure during passage of a traffic load into a plurality of partial regions to generate a plurality of partial time-series images.
  • the dividing unit 201 can be configured in the same manner as the dividing unit 11741 in FIG. 13, for example, but is not limited thereto.
  • the measuring unit 202 is configured to measure the temporal change in the amount of deflection of the structure for each partial region from the plurality of partial time-series images generated by the dividing unit 201.
  • the measuring unit 202 can be configured in the same manner as the measuring unit 11742 of FIG. 13, for example, but is not limited thereto.
  • the comparing unit 203 is configured to compare the temporal changes in the deflection amount of the structure for each of the plurality of partial areas measured by the measuring unit 202.
  • the comparison unit 203 can be configured in the same manner as, for example, the comparison unit 11743 in FIG. 13, but is not limited thereto.
  • the determination unit 204 is configured to determine the direction of the time-series image with respect to the passing direction of the traffic load based on the result of the comparison by the comparison unit 203.
  • the determination unit 204 can be configured in the same manner as the determination unit 11744 of FIG. 13, for example, but is not limited to this.
  • the image processing device 200 configured in this way operates as follows. That is, the dividing unit 201 spatially divides the time-series image captured on the surface of the structure during passage of the traffic load into a plurality of partial regions to generate a plurality of partial time-series images. Next, the measuring unit 202 measures the temporal change in the amount of deflection of the structure for each partial region from the plurality of partial time-series images generated by the dividing unit 201. Next, the comparison unit 203 compares the temporal changes in the amount of deflection of the structure for each of the plurality of partial regions measured by the measurement unit 202. Next, the determination unit 204 determines the orientation of the time-series image with respect to the passage direction of the traffic load based on the result of the comparison by the comparison unit 203.
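For orientation, the four-stage flow just described can be condensed into a short pipeline sketch. It reuses the hypothetical helpers sketched earlier (`split_into_quadrants`, `max_same_time_difference`); the deflection measurement is a crude placeholder, since the actual measuring technique follows FIG. 13, and the final decision rule is one plausible reading of the comparison, not the claimed one.

    import numpy as np

    def measure_deflection(partial_series: list[np.ndarray]) -> np.ndarray:
        """Placeholder for the measuring unit 202: one deflection value per
        frame. Mean intensity is a stand-in only, not the actual technique."""
        return np.array([float(block.mean()) for block in partial_series])

    def determine_orientation(frames: list[np.ndarray]) -> str:
        # Dividing unit 201: spatial division into partial time-series.
        parts = split_into_quadrants(frames)
        # Measuring unit 202: temporal change in deflection per partial region.
        deflection = {name: measure_deflection(series) for name, series in parts.items()}
        # Comparing unit 203: compare temporal changes between region pairs.
        top_bottom = max_same_time_difference(deflection["upper_left"], deflection["lower_left"])
        left_right = max_same_time_difference(deflection["upper_left"], deflection["upper_right"])
        # Determining unit 204: the pair whose deflection histories differ more
        # is assumed to lie along the passage direction of the traffic load.
        if left_right > top_bottom:
            return "image horizontal runs along the traffic-load passage direction"
        return "image vertical runs along the traffic-load passage direction"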
  • the present embodiment, configured and operated as described above, can determine the image orientation from the image itself even when a subject whose orientation can be identified is not included in the image of the surface of the structure.
  • the reason is that the orientation of the image with respect to the passage direction of the traffic load is determined based on the difference in the temporal change in the amount of deflection for each partial area when the traffic load passes through the surface of the structure.
  • the present invention can be used when determining the orientation of an image of the surface of a structure such as a bridge.
  • Appendix 1 An image processing apparatus comprising: dividing means for spatially dividing a time-series image captured of the surface of a structure while a traffic load passes into a plurality of partial regions to generate a plurality of partial time-series images; measuring means for measuring, from the plurality of partial time-series images, a temporal change in the amount of deflection of the surface of the structure for each of the partial regions; comparing means for comparing the temporal changes in the amount of deflection of the surface of the structure for the partial regions; and determination means for determining, based on a result of the comparison, the direction of the time-series image with respect to the passage direction of the traffic load.
  • Appendix 3 The image processing apparatus further comprising an output unit that detects a difference between the direction of the time-series image and a predetermined direction and outputs information indicating the detected difference.
  • Appendix 4 The image processing apparatus further comprising positioning means for positioning the time-series images based on the result of the determination.
  • Appendix 5 The image processing device according to any one of appendices 1 to 4, wherein the comparing means is configured to calculate the maximum value of the difference between the amount of deflection of the surface of the structure of one of the partial regions and the amount of deflection of the surface of the structure of another one of the partial regions, the difference being taken at each time.
  • Appendix 6 The image processing device wherein the comparing means is configured to calculate the minimum shift time that must be applied to a first pattern, indicating the temporal change in the amount of deflection of the surface of the structure of one of the partial regions, for it to best match a second pattern, indicating the temporal change in the amount of deflection of the surface of the structure of another one of the partial regions.
  • a time-series image captured of the surface of a structure while a traffic load passes is spatially divided into a plurality of partial areas to generate a plurality of partial time-series images.
  • Appendix 13 A computer-readable recording medium recording a program for causing a computer to perform: a process of spatially dividing a time-series image captured of the surface of a structure during passage of a traffic load into a plurality of partial regions to generate a plurality of partial time-series images; a process of measuring, from the plurality of partial time-series images, a temporal change in the amount of deflection of the surface of the structure for each of the partial regions; a process of comparing the temporal changes in the amount of deflection of the surface of the structure for the partial regions; and a process of determining, based on a result of the comparison, the direction of the time-series image with respect to the passage direction of the traffic load.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Signal Processing (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Traffic Control Systems (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

An image processing device includes dividing means, measuring means, comparing means, and determining means. The dividing means spatially divides a time-series image, captured during passage of a traffic load over the surface of a structure, into a plurality of partial regions, thereby generating a plurality of partial time-series images. From the plurality of partial time-series images, the measuring means measures the temporal change in the deflection of the surface of the structure in each partial region. The comparing means compares the temporal change in the deflection of the surface of the structure across the partial regions. Based on the results of the comparison, the determining means determines the orientation of the time-series image with respect to the passage direction of the traffic load.
PCT/JP2019/003696 2019-02-01 2019-02-01 Dispositif de traitement d'images WO2020157973A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/JP2019/003696 WO2020157973A1 (fr) 2019-02-01 2019-02-01 Dispositif de traitement d'images
JP2020569324A JP6989036B2 (ja) 2019-02-01 2019-02-01 画像処理装置
US17/426,838 US20220113260A1 (en) 2019-02-01 2019-02-01 Image processing apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/003696 WO2020157973A1 (fr) 2019-02-01 2019-02-01 Dispositif de traitement d'images

Publications (1)

Publication Number Publication Date
WO2020157973A1 true WO2020157973A1 (fr) 2020-08-06

Family

ID=71841983

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/003696 WO2020157973A1 (fr) 2019-02-01 2019-02-01 Dispositif de traitement d'images

Country Status (3)

Country Link
US (1) US20220113260A1 (fr)
JP (1) JP6989036B2 (fr)
WO (1) WO2020157973A1 (fr)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2461300B1 (fr) * 2008-10-14 2014-11-26 Nohmi Bosai Ltd. Appareil de détection de fumée
JP7001060B2 (ja) * 2016-11-02 2022-01-19 ソニーグループ株式会社 情報処理装置、情報処理方法及び情報処理システム

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0618223A (ja) * 1992-02-04 1994-01-25 Inter Detsuku:Kk 遠隔物体の光学的測定方法
JPH11258188A (ja) * 1998-03-10 1999-09-24 Kokusai Kogyo Co Ltd 熱映像による構造物変状診断システム及び診断方法
JP2011257389A (ja) * 2010-05-14 2011-12-22 West Japan Railway Co 構造物変位量測定方法
JP2015036620A (ja) * 2013-08-12 2015-02-23 復建調査設計株式会社 橋梁における活荷重無載荷状態時の標高計測方法
WO2016027874A1 (fr) * 2014-08-21 2016-02-25 公立大学法人大阪市立大学 Dispositif de visualisation de contrainte et dispositif de visualisation de valeur de propriété mécanique
WO2016047093A1 (fr) * 2014-09-25 2016-03-31 日本電気株式会社 Dispositif de détermination d'état et procédé de détermination d'état
JP2016084579A (ja) * 2014-10-23 2016-05-19 国立研究開発法人産業技術総合研究所 構造物のたわみ量分布監視方法及び監視装置
WO2016152076A1 (fr) * 2015-03-20 2016-09-29 日本電気株式会社 Dispositif d'évaluation de l'état d'une structure, système et procédé d'évaluation associés
JP2017027468A (ja) * 2015-07-24 2017-02-02 株式会社Ttes データ生成装置、データ生成方法、プログラム及び記録媒体
JP2017031579A (ja) * 2015-07-29 2017-02-09 日本電気株式会社 抽出システム、抽出サーバ、抽出方法、および抽出プログラム
JP2017068781A (ja) * 2015-10-02 2017-04-06 セイコーエプソン株式会社 計測装置、計測方法、計測システム、およびプログラム
WO2017179535A1 (fr) * 2016-04-15 2017-10-19 日本電気株式会社 Dispositif d'évaluation d'état d'une structure, système d'évaluation d'état et procédé d'évaluation d'état
JP2018109558A (ja) * 2017-01-04 2018-07-12 株式会社東芝 回転ずれ量検出装置、物体検知センサ、回転ずれ量検出システム、回転ずれ量検出方法及び回転ずれ量検出プログラム

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"New Energy and Industrial Technology Development Organization", NEDO. DOCUMENT 6-2-1, 7 March 2019 (2019-03-07), XP055727986, Retrieved from the Internet <URL:https://www.nedo.go.jp/content/100803391.pdf.> *

Also Published As

Publication number Publication date
US20220113260A1 (en) 2022-04-14
JPWO2020157973A1 (ja) 2021-11-25
JP6989036B2 (ja) 2022-01-05

Similar Documents

Publication Publication Date Title
JP5615441B2 (ja) 画像処理装置及び画像処理方法
US10198796B2 (en) Information processing apparatus, control method of information processing apparatus, and non-transitory storage medium storing information processing program
JPWO2020179188A1 (ja) 構造物の変位量計測装置
US11895396B2 (en) Imaging plan presentation apparatus and method for updating and re-generating an imaging plan
JP2008310446A (ja) 画像検索システム
WO2020194539A1 (fr) Dispositif de mesure de déplacement de structure
JPWO2017130700A1 (ja) 撮影支援装置及び撮影支援方法
JP5108392B2 (ja) 軌道変位測定システム
WO2020145004A1 (fr) Dispositif de guidage pour la photographie
US10432843B2 (en) Imaging apparatus, control method of imaging apparatus, and non-transitory recording medium for judging an interval between judgement targets
KR100458290B1 (ko) 이미지 프로세싱을 이용한 구조물의 변위량 측정방법
JP3570198B2 (ja) 画像処理方法およびその装置
WO2020157973A1 (fr) Dispositif de traitement d'images
JP7173281B2 (ja) 構造物のたわみ計測装置
JP2016220024A (ja) パンニング表示制御装置および撮像装置
JP4479396B2 (ja) 画像処理装置、画像処理方法、および画像処理プログラム
JP7067668B2 (ja) 変位・重量対応付け装置
JP2010230423A (ja) 変位量測定装置及び同測定方法
JPWO2020174833A1 (ja) 変位‐重量対応付け装置
JP2005354461A (ja) 監視カメラシステム、ビデオ処理装置およびその文字表示方法
WO2021039088A1 (fr) Procédé de sortie de paramètre d&#39;imagerie et dispositif de sortie de paramètre d&#39;imagerie
US20220148208A1 (en) Image processing apparatus, image processing method, program, and storage medium
JP5057134B2 (ja) 距離画像生成装置、距離画像生成方法及びプログラム
JP2017225067A (ja) 撮像装置及びその制御方法
WO2019093062A1 (fr) Dispositif de mesure, procédé de commande de dispositif de mesure, programme de mesure et support d'enregistrement

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19913314

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020569324

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19913314

Country of ref document: EP

Kind code of ref document: A1