WO2023084644A1 - Information processing device, information processing method, and information processing program - Google Patents

Information processing device, information processing method, and information processing program

Info

Publication number
WO2023084644A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
parameter
image
images
luminance
Prior art date
Application number
PCT/JP2021/041367
Other languages
French (fr)
Japanese (ja)
Inventor
Keita Yoshida
Teppei Fujiwara
Takayuki Amami
Original Assignee
Mitsubishi Electric Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to JP2022510220A (JP7066082B1)
Priority to PCT/JP2021/041367
Publication of WO2023084644A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038 Image mosaicing, e.g. composing plane images from plane sub-images

Definitions

  • the present disclosure relates to an information processing device, an information processing method, and an information processing program for determining a parameter used for correcting the luminance of a plurality of images obtained by imaging an object with a plurality of imaging devices.
  • there is a known measurement vehicle in which a plurality of imaging devices with different imaging directions are installed to inspect an object such as a tunnel; the object is imaged by these imaging devices while the vehicle is traveling.
  • in such a vehicle, imaging conditions for the imaging target, such as focus or iris, are switched for each imaging device.
  • as a technique for automatically adjusting imaging conditions, Japanese Patent Laid-Open No. 2002-200000 proposes a technique for adjusting the exposure conditions of an imaging device installed in a vehicle based on driving environment information including the presence or absence of tunnels, the presence or absence of overpasses, road gradients, date and time, weather, and the direction of the vehicle.
  • however, when a deformed portion of the object is detected from images whose luminance has been corrected, the detection accuracy of the deformed portion may decrease depending on the correction.
  • the present disclosure has been made in view of the above, and aims to obtain an information processing apparatus capable of appropriately determining a parameter that can correct luminance while suppressing deterioration in the detection accuracy of a deformed portion of an object.
  • the information processing device of the present disclosure includes a data acquisition unit, a luminance correction unit, an image stitching unit, a deformation detection unit, and a parameter determination unit.
  • the data acquisition unit acquires data of a plurality of images obtained by imaging an object with a plurality of imaging devices.
  • the luminance correction unit corrects the luminance of each of the plurality of images.
  • the image stitching unit generates a stitched image, which is an image obtained by stitching together the plurality of images whose luminance has been corrected by the luminance correction unit.
  • the deformation detection unit detects a deformed portion of the object using a learning model for detecting a deformed portion of the object from the stitched image.
  • the parameter determination unit determines a parameter used for the luminance correction based on the accuracy of detection of the deformed portion by the deformation detection unit.
  • FIG. 1 is a diagram showing an example of the configuration of a measurement processing system according to the first embodiment;
  • FIG. 2 is a diagram showing an example of the configuration of a measurement device included in the measurement vehicle according to the first embodiment;
  • FIG. 3 is a diagram showing an example of the arrangement of a plurality of imaging devices in the measurement device according to the first embodiment;
  • FIG. 4 is a diagram showing an example of the configuration of the information processing apparatus according to the first embodiment;
  • FIG. 5 is a diagram showing an example of a captured image data table stored in the captured image data storage unit according to the first embodiment;
  • FIG. 6 is a diagram showing an example of a parameter candidate table stored in the parameter candidate storage unit according to the first embodiment;
  • FIG. 7 is a diagram showing an example of a parameter table stored in the parameter storage unit according to the first embodiment;
  • FIG. 8 is a diagram showing an example of parameter determination processing executed by the processing unit of the information processing apparatus according to the first embodiment;
  • FIG. 9 is a diagram for explaining gamma correction executed by the luminance correction unit of the information processing apparatus according to the first embodiment;
  • FIG. 10 is a diagram for explaining the local histogram equalization method executed by the luminance correction unit of the information processing apparatus according to the first embodiment;
  • FIG. 11 is a diagram for explaining high-pass filtering performed by the luminance correction unit of the information processing apparatus according to the first embodiment;
  • FIG. 12 is a diagram for explaining an example of captured image stitching processing by the image stitching unit of the information processing apparatus according to the first embodiment;
  • FIG. 13 is a diagram for explaining a method of calculating a recall rate, a precision rate, and an F value by the evaluation unit of the information processing apparatus according to the first embodiment.
  • Embodiment 1. FIG. 1 is a diagram illustrating an example of the configuration of a measurement processing system according to the first embodiment. As shown in FIG. 1, the measurement processing system 100 according to the first embodiment includes an information processing device 1 and a measurement vehicle 2.
  • the measurement vehicle 2 includes a vehicle body 60 and a measurement device 70 mounted on the vehicle body 60 .
  • the measurement device 70 repeatedly captures images of objects existing around the vehicle body 60 as objects to be measured while the vehicle body 60 is running.
  • the objects to be measured by the measuring device 70 are the tunnel 4 and the guardrail 5 in this example, but are not limited to these; the object may be another structure such as a road or a bridge.
  • the measurement vehicle 2 is an automobile that travels on roads, but may be a railway vehicle that travels on rails.
  • the measuring device 70 generates captured image data including data of an image obtained by capturing an object.
  • the measuring device 70 can transmit and receive data to and from the information processing device 1 via the network 3 , and transmits measurement data including captured image data to the information processing device 1 .
  • the network 3 is, for example, a WAN (Wide Area Network) such as the Internet, but may be a LAN (Local Area Network) or other networks.
  • FIG. 2 is a diagram showing an example of the configuration of the measurement device provided in the measurement vehicle according to the first embodiment.
  • the measurement device 70 included in the measurement vehicle 2 includes imaging devices 71 a , 71 b , and 71 c , a position/attitude velocity detection unit 72 , a processing unit 73 , and a communication unit 74 .
  • the imaging devices 71a, 71b, and 71c have imaging directions that are orthogonal to the traveling direction of the vehicle body 60 and different from one another.
  • FIG. 3 is a diagram illustrating an example of the arrangement of the plurality of imaging devices in the measuring device according to the first embodiment. In the example shown in FIG. 3, the imaging direction of the imaging device 71a is the upper left direction, the imaging direction of the imaging device 71b is the upward direction, and the imaging direction of the imaging device 71c is the upper right direction.
  • images of the inner wall surface of the tunnel 4 are captured by a plurality of imaging devices 71a, 71b, and 71c.
  • the imaging devices 71a, 71b, and 71c are, for example, color line sensors, and repeatedly capture images of mutually different regions of the inner wall surface 4a that partially overlap each other while the vehicle body 60 is running.
  • the imaging devices 71a, 71b, and 71c may be referred to as the imaging device 71 when they are not individually distinguished.
  • the imaging device 71 switches imaging conditions for the object by changing imaging parameter settings.
  • the content of the setting change of the imaging parameter is set in advance according to the state of the imaging environment such as the state of the tunnel 4, the time of day, the weather, and the season.
  • the setting may be changed automatically.
  • the imaging parameters are, for example, at least one of focus, iris, gain control, dynamic range, and the like.
  • the imaging device 71 may be a color area sensor or a monochrome sensor instead of the color line sensor. Also, the number of imaging devices 71 provided in the measuring device 70 is not limited to three, and may be two or less, or may be four or more, for example.
  • the position/attitude/speed detection unit 72 shown in FIG. 2 includes a GPS (Global Positioning System) receiver, an inertial sensor, and a speed sensor, and detects the position, attitude, and speed of the vehicle body 60 .
  • the processing unit 73 controls the plurality of imaging devices 71 based on, for example, the position, orientation, and speed of the vehicle body 60, and causes the plurality of imaging devices 71 to repeatedly perform imaging of the target object.
  • the processing unit 73 generates captured image data for each imaging device 71 by connecting, in the traveling direction of the vehicle body 60, the plurality of images repeatedly captured by that imaging device 71, based on, for example, the position, attitude, and speed of the vehicle body 60.
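  • a minimal sketch of this connecting step (the function name is hypothetical; NumPy is used here purely for illustration, not taken from the disclosure):

        import numpy as np

        def connect_line_images(lines):
            # `lines` is the list of images repeatedly captured by one
            # imaging device; connecting them along the traveling
            # direction yields that device's captured image data
            return np.concatenate(lines, axis=0)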
  • the processing unit 73 outputs measurement data including captured image data for each imaging device 71 to the communication unit 74 .
  • the communication unit 74 is wirelessly connected to the network 3 and transmits measurement data acquired from the processing unit 73 to the information processing device 1 via the network 3 .
  • the information processing device 1 generates a stitched image by stitching together a plurality of captured images in the direction orthogonal to the traveling direction of the vehicle body 60 based on the measurement data acquired from the measurement vehicle 2, and detects a deformed portion of the object based on the stitched image.
  • a deformed portion is a portion where deformation occurs. For example, when the object is the tunnel 4, the deformation is cracking, peeling, staining, water leakage, or the like of the inner wall surface 4a of the tunnel 4.
  • the information processing device 1 generates an inspection image by performing a process of arranging information indicating the detected deformed portion on the composite image, and displays or prints the generated inspection image.
  • the inspection image allows the user of the information processing apparatus 1, as an inspection operator, to easily grasp the deformed portion of the object, and can be pasted onto a report form or the like.
  • a plurality of captured images obtained from the imaging devices 71 under mutually different imaging conditions may have luminance differences, and the resulting luminance unevenness in the stitched image may deteriorate its visibility. In addition, when the contrast of the captured images is poor or their luminance is low, the visibility of the stitched image may also deteriorate.
  • the information processing apparatus 1 corrects the luminance of a plurality of captured images before generating a composite image, and generates a composite image by combining the plurality of images whose luminance has been corrected.
  • as a result, the visibility of the stitched image in the inspection image can be improved compared with a stitched image obtained by stitching the images together without luminance correction.
  • however, if the user arbitrarily sets the parameter P for correcting the luminance of the plurality of captured images to be stitched together, or if the luminance is automatically corrected from the histogram shape of each captured image to be stitched together, the accuracy of detecting a deformed portion from the stitched image may become lower than when no luminance correction processing is performed.
  • for example, the luminance correction may cause overexposure or underexposure at the deformed portion of the stitched image, making it impossible to detect the deformed portion.
  • therefore, the information processing device 1 determines the parameter P used for luminance correction based on the accuracy of detecting a deformed portion from a stitched image obtained by stitching together a plurality of captured images whose luminance has been corrected. As a result, the information processing apparatus 1 can appropriately determine a parameter P that corrects the luminance while suppressing deterioration in the detection accuracy of the deformed portion of the object.
  • the configuration of the information processing apparatus 1 will be specifically described below.
  • FIG. 4 is a diagram showing an example of the configuration of the information processing device according to the first embodiment.
  • the information processing device 1 includes a communication section 10 , an input section 11 , a display section 12 , a storage section 13 and a processing section 14 .
  • the communication unit 10 is communicably connected to the network 3 by wire or wirelessly, and transmits and receives information via the network 3 to and from an external device such as the measuring device 70, a printer (not shown), or a terminal device.
  • the input unit 11 includes, for example, a mouse and keyboard, but may also be a touch pad.
  • the display unit 12 is, for example, a liquid crystal display, an organic EL (ElectroLuminescence) display, or a projector. In the example shown in FIG. 4, the input unit 11 and the display unit 12 are included in the information processing device 1, but they may instead be connected to the information processing device 1 as external devices.
  • the storage unit 13 includes a captured image data storage unit 20, a parameter candidate storage unit 21, and a parameter storage unit 22.
  • the captured image data storage unit 20 stores a captured image data table including captured image data for each imaging device 71 included in the measurement data transmitted from the measuring device 70 .
  • FIG. 5 is a diagram showing an example of a captured image data table stored in a captured image data storage unit according to the first embodiment.
  • the captured image data table shown in FIG. 5 includes “target object ID (IDentifier)”, “imaging device ID”, and “captured image data” for each captured image data.
  • Object ID is unique identification information for each object.
  • Imaging device ID is unique identification information for each imaging device 71 .
  • Captured image data is captured image data or information indicating the storage location of captured image data.
  • FIG. 5 shows that, for the object with the object ID "M1", the captured image data generated based on the images captured by the imaging devices 71a, 71b, and 71c with the imaging device IDs "DE1", "DE2", and "DE3" are the captured image data IMD1, IMD2, and IMD3, respectively.
  • the parameter candidate storage unit 21 shown in FIG. 4 stores a parameter candidate table containing a plurality of parameter candidates that are candidates for the parameter P used in the brightness correction process.
  • FIG. 6 is a diagram showing an example of the parameter candidate table stored in the parameter candidate storage unit according to the first embodiment.
  • the parameter candidate table shown in FIG. 6 includes "object ID” and "parameter candidate".
  • the "object ID” is the same as the "object ID” shown in FIG.
  • the “parameter candidate” is information of a parameter candidate that is a candidate of the parameter P used for the luminance correction process.
  • in the example shown in FIG. 6, parameter candidates Pc1, Pc2, ..., Pcn are included as the parameter candidates for the object with the object ID "M1", where n is, for example, an integer of 4 or more. Parameter candidates Pc2 and the like are included as the parameter candidates for the object with the object ID "M2". In the following description, the parameter candidates Pc1, Pc2, ..., Pcn may each be referred to as a parameter candidate Pc when they are not individually distinguished. Note that the parameter candidate table may include identification information unique to the type of object instead of, or in addition to, the object ID.
  • the parameter storage unit 22 shown in FIG. 4 stores a parameter table including parameters P used for luminance correction processing.
  • FIG. 7 is a diagram showing an example of the parameter table stored in the parameter storage unit according to the first embodiment.
  • the parameter table shown in FIG. 7 includes "object ID” and "parameter".
  • the "object ID” is the same as the "object ID” shown in FIG. “Parameter” is information of a parameter P used for luminance correction processing.
  • in the example shown in FIG. 7, the parameter candidate Pc2 is set as the parameter P for the object with the object ID "M1", and the parameter candidate Pcn is set as the parameter P for the object with the object ID "M2".
  • the parameter table may include identification information unique to the type of object instead of or in addition to the object ID.
  • the processing unit 14 shown in FIG. 4 performs parameter determination processing for determining a parameter P for luminance correction when a parameter determination request is received through an input operation to the input unit 11 by the user of the information processing apparatus 1 .
  • FIG. 8 is a diagram illustrating an example of the parameter determination processing performed by the processing unit of the information processing apparatus according to the first embodiment.
  • FIG. 8 shows an example in which captured images A, B, and C are pasted together as a plurality of captured images.
  • the captured image A is, for example, the captured image represented by the above-described captured image data IMD1, the captured image B is, for example, the captured image represented by the captured image data IMD2, and the captured image C is, for example, the captured image represented by the captured image data IMD3.
  • the processing unit 14 performs luminance correction processing for correcting the luminance of each of the captured images A, B, and C for each of the parameter candidates Pc1, Pc2, ..., Pcn.
  • as a result, the captured images A, B, and C whose luminance has been corrected using the parameter candidate Pc1, those corrected using the parameter candidate Pc2, ..., and those corrected using the parameter candidate Pcn are generated.
  • next, the processing unit 14 performs, for each parameter candidate Pc, a stitching process of generating a stitched image by stitching together the luminance-corrected captured images A, B, and C. As a result, the stitched images IC_Pc1, IC_Pc2, ..., IC_Pcn are generated.
  • the stitched image IC_Pc1 is an image obtained by stitching together the captured images A, B, and C whose luminance has been corrected using the parameter candidate Pc1, and the stitched image IC_Pc2 is an image obtained by stitching together the captured images A, B, and C whose luminance has been corrected using the parameter candidate Pc2.
  • likewise, the stitched image IC_Pcn is an image obtained by stitching together the captured images A, B, and C whose luminance has been corrected using the parameter candidate Pcn. In the following, the stitched images IC_Pc1, IC_Pc2, ..., IC_Pcn may each be referred to as a stitched image IC_Pc when they are not individually distinguished.
  • the processing unit 14 then performs deformation detection processing for detecting a deformed portion from each of the stitched images IC_Pc1, IC_Pc2, ..., IC_Pcn. Thereby, the processing unit 14 generates deformation detection results RS_Pc1, RS_Pc2, ..., RS_Pcn.
  • the deformation detection result RS_Pc1 includes deformation data, which is the data of the deformed portion detected from the stitched image IC_Pc1 obtained using the parameter candidate Pc1.
  • the deformation detection result RS_Pc2 includes deformation data, which is the data of the deformed portion detected from the stitched image IC_Pc2 obtained using the parameter candidate Pc2.
  • likewise, the deformation detection result RS_Pcn includes deformation data, which is the data of the deformed portion detected from the stitched image IC_Pcn obtained using the parameter candidate Pcn.
  • next, the processing unit 14 performs detection result evaluation processing for calculating evaluation values VR_Pc1, VR_Pc2, ..., VR_Pcn of the deformation detection accuracy for the stitched images IC_Pc1, IC_Pc2, ..., IC_Pcn, based on the deformation detection results RS_Pc1, RS_Pc2, ..., RS_Pcn.
  • specifically, the processing unit 14 calculates the evaluation value VR_Pc1 of the deformation detection accuracy for the stitched image IC_Pc1 based on the deformation detection result RS_Pc1, and calculates the evaluation value VR_Pc2 of the deformation detection accuracy for the stitched image IC_Pc2 based on the deformation detection result RS_Pc2.
  • the processing unit 14 likewise calculates the evaluation value VR_Pcn of the deformation detection accuracy for the stitched image IC_Pcn based on the deformation detection result RS_Pcn. The evaluation values VR_Pc1, VR_Pc2, ..., VR_Pcn may each be referred to as an evaluation value VR_Pc hereinafter when they are not individually distinguished.
  • the evaluation value VR_Pc becomes larger as the deformation detection accuracy becomes higher.
  • the processing unit 14 determines the parameter P based on the evaluation values VR_Pc1, VR_Pc2, ..., VR_Pcn. For example, the processing unit 14 determines, as the parameter P, the parameter candidate Pc corresponding to the largest evaluation value VR_Pc among the evaluation values VR_Pc1, VR_Pc2, ..., VR_Pcn.
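  • a minimal sketch of this determination loop in Python, assuming hypothetical helpers correct_luminance, stitch, detect_deformation, and evaluate_accuracy standing in for the luminance correction, stitching, deformation detection, and evaluation processing described above:

        def determine_parameter(images, candidates, truth):
            # pick the parameter candidate Pc whose stitched image IC_Pc
            # yields the highest deformation detection accuracy VR_Pc
            best_pc, best_vr = None, float("-inf")
            for pc in candidates:
                corrected = [correct_luminance(img, pc) for img in images]
                ic_pc = stitch(corrected)                # stitched image IC_Pc
                rs_pc = detect_deformation(ic_pc)        # detection result RS_Pc
                vr_pc = evaluate_accuracy(rs_pc, truth)  # evaluation value VR_Pc
                if vr_pc > best_vr:
                    best_pc, best_vr = pc, vr_pc
            return best_pc                               # parameter P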
  • the processing unit 14 can also determine the parameter P for correcting the luminance based on the visibility of the stitched image IC_Pc in addition to the detection accuracy of the deformed portion.
  • in this case, the processing unit 14 calculates visibility evaluation values VV_Pc1, VV_Pc2, ..., VV_Pcn for the stitched images IC_Pc1, IC_Pc2, ..., IC_Pcn. These may each be referred to as an evaluation value VV_Pc hereinafter when they are not individually distinguished.
  • the evaluation value VV_Pc is calculated based on the contrast and edge strength of the stitched image IC_Pc.
  • the processing unit 14 increases the evaluation value VV_Pc as the contrast increases, and increases the evaluation value VV_Pc as the edge strength increases.
  • the processing unit 14 determines, as the parameter P, the parameter candidate Pc corresponding to the largest result among the weighted addition results of the evaluation values VR_Pc and VV_Pc. For example, the processing unit 14 calculates the value obtained by weighting and adding the evaluation value VR_Pc1 and the evaluation value VV_Pc1 as a total evaluation value VT_Pc1, and the value obtained by weighting and adding the evaluation value VR_Pc2 and the evaluation value VV_Pc2 as a total evaluation value VT_Pc2.
  • likewise, the processing unit 14 calculates the total evaluation value VT_Pcn by weighting and adding the evaluation value VR_Pcn and the evaluation value VV_Pcn.
  • the processing unit 14 then determines, as the parameter P, the parameter candidate Pc corresponding to the largest value among the total evaluation values VT_Pc1, VT_Pc2, ..., VT_Pcn. These may each be referred to as a total evaluation value VT_Pc hereinafter when they are not individually distinguished.
  • in this way, the processing unit 14 can determine a parameter P that accounts for the visibility of the stitched image IC_Pc in addition to the detection accuracy of the deformed portion.
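  • the weighted addition itself is simple; a sketch, where the weights w_r and w_v are illustrative assumptions not specified in the disclosure:

        def total_evaluation(vr_pc, vv_pc, w_r=0.7, w_v=0.3):
            # VT_Pc = w_r * VR_Pc + w_v * VV_Pc; the candidate with the
            # largest VT_Pc is then chosen as the parameter P
            return w_r * vr_pc + w_v * vv_pc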
  • after the parameter P is determined, the processing unit 14 corrects the luminance of each of the captured images A, B, and C using the determined parameter P, and stitches together the luminance-corrected captured images A, B, and C to generate a stitched image IC_P.
  • the processing unit 14 detects a deformed portion from the stitched image IC_P, and generates an inspection image in which information on the detected deformed portion is superimposed on the stitched image IC_P.
  • thereby, the processing unit 14 can improve the visibility of the stitched image IC_P while suppressing deterioration in the accuracy of detecting the deformed portion from the stitched image IC_P.
  • the configuration of the processing unit 14 will be specifically described below.
  • as shown in FIG. 4, the processing unit 14 of the information processing apparatus 1 includes an input reception unit 30, a data acquisition unit 31, a luminance correction unit 32, an image stitching unit 33, a deformation detection unit 34, a parameter determination unit 35, a display processing unit 36, an inspection image generation unit 37, and a data output unit 38.
  • the input reception unit 30 receives an input operation to the input unit 11 by the user.
  • the input reception unit 30 receives a parameter determination request from the user via the input unit 11 .
  • the parameter determination request includes, for example, information such as the object ID of the object whose parameters are to be determined.
  • the input reception unit 30 receives an inspection image generation request from the user via the input unit 11 .
  • the inspection image generation request includes, for example, information such as the object ID of the object for which an inspection image is to be generated. Also, the input reception unit 30 receives an output request from the user via the input unit 11.
  • the data acquisition unit 31 acquires data from an external device via the communication unit 10 and stores the acquired data in the storage unit 13, and also acquires the data stored in the storage unit 13 when the processing unit 14 performs processing using that data.
  • the data acquisition unit 31 acquires, from the communication unit 10, the measurement data received by the communication unit 10 from the measuring device 70 via the network 3, and stores the captured image data in the acquired measurement data in the captured image data storage unit 20.
  • when the input reception unit 30 receives a parameter determination request, the operation mode of the processing unit 14 becomes the parameter determination mode, and the data acquisition unit 31 acquires, from the storage unit 13, the plurality of captured image data and the plurality of parameter candidates Pc associated with the object ID included in the parameter determination request.
  • the data acquisition unit 31 can also acquire, from the storage unit 13, a plurality of parameter candidates Pc associated with the object type corresponding to the object ID included in the parameter determination request.
  • when the input reception unit 30 receives an inspection image generation request, the operation mode of the processing unit 14 becomes the inspection image generation mode, and the data acquisition unit 31 acquires, from the storage unit 13, the plurality of captured image data and the parameter P associated with the object ID included in the inspection image generation request.
  • the data acquisition unit 31 acquires the parameter P determined by the parameter determination unit 35 from the parameter determination unit 35 and stores it in the parameter storage unit 22 .
  • the brightness correction unit 32 has a parameter determination mode and an inspection image generation mode as operation modes. First, the operation of the brightness correction unit 32 when the operation mode is the parameter determination mode will be described.
  • when the operation mode is the parameter determination mode, the luminance correction unit 32 corrects the luminance of each of the plurality of captured images represented by the plurality of captured image data acquired by the data acquisition unit 31, for each parameter candidate Pc, using the plurality of parameter candidates Pc.
  • the luminance correction unit 32 corrects the luminance of each of the multiple captured images by gamma correction, for example.
  • the luminance correction unit 32 performs gamma correction, for example, by calculating the following formula (1), where "x" is the input luminance value, "y" is the output luminance value, and "γ" is the correction parameter:
  • y = f(x) = 255 × (x/255)^(1/γ) ... (1)
  • the parameter candidates Pc1, Pc2, ..., Pcn are values to be substituted for the parameter γ.
  • the luminance correction unit 32 substitutes a parameter candidate Pc for "γ" in the above formula (1) and uses the luminance value of each pixel of a captured image as the input x, thereby correcting the luminance of each of the plurality of captured images for each parameter candidate Pc.
  • FIG. 9 is a diagram for explaining the gamma correction performed by the luminance correction unit of the information processing apparatus according to the first embodiment. As shown in FIG. 9, when the parameter γ is greater than 1, the rate at which the output y increases becomes smaller as the input x becomes larger, and when the parameter γ is less than 1, the rate at which the output y increases becomes larger as the input x becomes larger.
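  • a minimal implementation of formula (1), assuming 8-bit grayscale input (the disclosure does not prescribe a particular library; NumPy is used here for illustration):

        import numpy as np

        def gamma_correct(image, gamma):
            # y = 255 * (x / 255) ** (1 / gamma), applied per pixel
            x = image.astype(np.float64)
            y = 255.0 * (x / 255.0) ** (1.0 / gamma)
            return np.clip(y, 0, 255).astype(np.uint8)

        # e.g. gamma = 2.0 lifts dark regions; gamma = 0.5 darkens them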
  • the luminance correction unit 32 can also correct the luminance of each of a plurality of captured images for each parameter candidate Pc by the local histogram equalization method.
  • the local histogram equalization method is a method of improving contrast by dividing a captured image into regions and correcting the luminance value distribution within each divided region.
  • FIG. 10 is a diagram for explaining the local histogram equalization method executed by the luminance correction unit of the information processing apparatus according to the first embodiment, illustrating how the captured image is divided and the luminance value distribution of each divided region is modified.
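  • OpenCV's CLAHE is one common realization of such tile-wise histogram equalization; a sketch, where the clip limit and tile grid size are assumed example values, not values from the disclosure:

        import cv2

        def local_hist_equalize(gray_image, clip_limit=2.0, tiles=(8, 8)):
            # divide the image into tiles and equalize each local histogram
            clahe = cv2.createCLAHE(clipLimit=clip_limit, tileGridSize=tiles)
            return clahe.apply(gray_image)  # single-channel 8-bit input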
  • the luminance correction unit 32 can also correct the luminance of each of the plurality of captured images for each parameter candidate Pc using a high-pass filter. For example, the luminance correction unit 32 extracts the frequency components of a captured image by applying a discrete Fourier transform to it, removes only the low-frequency components from the extracted frequency components, and generates a luminance-corrected captured image by applying an inverse Fourier transform to the remaining frequency components. By removing the low-frequency components, the luminance correction unit 32 can homogenize the image or extract edges.
  • FIG. 11 is a diagram for explaining high-pass filtering performed by the luminance correction unit of the information processing apparatus according to the first embodiment
  • in this case, the parameter candidate Pc1 is a parameter specifying the removal frequency band fg1, the parameter candidate Pc2 specifies the removal frequency band fg2, the parameter candidate Pc3 specifies the removal frequency band fg3, and the parameter candidate Pc4 specifies the removal frequency band fg4.
  • the removal frequency band fg2 is higher than the removal frequency band fg1, the removal frequency band fg3 is higher than the removal frequency band fg2, and the removal frequency band fg4 is higher than the removal frequency band fg3.
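  • a sketch of the described DFT-based high-pass filter, where the cutoff radius plays the role of the removal frequency bands fg1 to fg4 (the circular mask is an assumed formulation, not taken from the disclosure):

        import numpy as np

        def highpass_filter(gray_image, cutoff):
            # remove frequency components within `cutoff` of the spectrum
            # center, then inverse-transform back to an image
            f = np.fft.fftshift(np.fft.fft2(gray_image.astype(np.float64)))
            rows, cols = gray_image.shape
            cy, cx = rows // 2, cols // 2
            yy, xx = np.ogrid[:rows, :cols]
            mask = (yy - cy) ** 2 + (xx - cx) ** 2 > cutoff ** 2
            filtered = np.fft.ifft2(np.fft.ifftshift(f * mask))
            return np.real(filtered)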
  • when the operation mode is the inspection image generation mode, the luminance correction unit 32 corrects the luminance of each of the plurality of captured images represented by the plurality of captured image data acquired by the data acquisition unit 31, using the parameter P.
  • the brightness correction unit 32 uses the parameter P to correct the brightness of each of the multiple captured images by gamma correction, local histogram equalization, or high-pass filtering.
  • the luminance correction method used by the luminance correction unit 32 is not limited to gamma correction, the local histogram equalization method, and high-pass filtering; various other luminance correction methods can be applied.
  • the luminance correction unit 32 can change the correction method according to the user's input operation to the input unit 11, or change the correction method according to the type of the target object.
  • for example, the luminance correction unit 32 holds correction information indicating the correction method corresponding to each type of object and, based on this correction information, can correct the luminance of each of the plurality of captured images with the correction method for each object or the correction method corresponding to the type of object.
  • the type of object is, for example, the type of structure such as tunnel 4, guardrail 5, road, bridge, and the like.
  • the luminance correction unit 32 can also change the correction method based on the time zone, season, weather, etc. when the captured image data was obtained.
  • the image stitching unit 33 generates the stitched images IC_Pc and IC_P, which are images obtained by stitching together a plurality of captured images whose luminance has been corrected by the luminance correction unit 32.
  • the image stitching unit 33 has a parameter determination mode and an inspection image generation mode as operation modes. First, the operation of the image stitching unit 33 when the operation mode is the parameter determination mode will be described.
  • when the operation mode is the parameter determination mode, the image stitching unit 33 generates, for each parameter candidate Pc, a stitched image IC_Pc by stitching together the plurality of captured images whose luminance has been corrected using that parameter candidate Pc.
  • that is, the image stitching unit 33 stitches together the captured images A, B, and C whose luminance has been corrected with the parameter candidates Pc1, Pc2, ..., Pcn, and thereby generates the stitched images IC_Pc1, IC_Pc2, ..., IC_Pcn.
  • FIG. 12 is a diagram for explaining an example of captured image stitching processing by the image stitching unit of the information processing apparatus according to the first embodiment;
  • the image stitching unit 33 stitches together the captured images A, B, and C whose luminance has been corrected with a parameter candidate Pc to generate a stitched image IC_Pc.
  • Captured images A, B, and C shown in FIG. 12 are images of the inner wall surface 4a of the tunnel 4.
  • when the operation mode is the inspection image generation mode, the image stitching unit 33 generates a stitched image IC_P by stitching together the plurality of captured images whose luminance has been corrected by the luminance correction unit 32 using the parameter P.
  • in the process of generating the stitched image IC_Pc or the stitched image IC_P, the image stitching unit 33 deletes the overlapping portion between the captured images A and B and the overlapping portion between the captured images B and C.
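  • a simplified stitching sketch, under the assumption that adjacent captured images overlap by a known, fixed number of pixel columns (real alignment is more involved; the overlap value is hypothetical):

        import numpy as np

        def stitch(images, overlap=32):
            # drop the overlapping columns of each subsequent image,
            # then concatenate the images side by side
            parts = [images[0]] + [img[:, overlap:] for img in images[1:]]
            return np.concatenate(parts, axis=1)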
  • the deformation detection unit 34 detects a deformed portion of the object using a learning model for detecting the deformed portion of the object from the stitched images IC_Pc and IC_P.
  • a learning model is a neural network such as a convolutional neural network or a recurrent neural network, and is generated by deep learning.
  • the learning model of the deformation detection unit 34 is, for example, a learning model that performs semantic segmentation.
  • the deformation detection unit 34 has a parameter determination mode and an inspection image generation mode as operation modes. First, the operation of the deformation detector 34 when the operation mode is the parameter determination mode will be described.
  • when the operation mode is the parameter determination mode, the deformation detection unit 34 inputs the stitched image IC_Pc generated by the image stitching unit 33 to the learning model, and detects a deformed portion based on the score of the class label of each pixel output from the learning model.
  • the class label is, for example, a class label indicating cracks, a class label indicating delamination, a class label indicating dirt, or a class label indicating water leakage.
  • the class label score of a pixel is, for example, a value between 0 and 1. If a pixel has a class label with a score of 0.5 or more, the deformation detection unit 34 determines that the pixel is a deformation pixel and that its type of deformation is the deformation corresponding to the class label whose score is 0.5 or more.
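  • a sketch of this per-pixel decision, assuming the learning model outputs an array of class-label scores of shape (height, width, classes), one channel per deformation type:

        import numpy as np

        def label_deformation_pixels(scores, threshold=0.5):
            # returns a (height, width) map: -1 for background, otherwise
            # the index of the class label whose score is highest and at
            # least `threshold`
            best = scores.argmax(axis=2)
            best_score = scores.max(axis=2)
            return np.where(best_score >= threshold, best, -1)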
  • the deformation detection unit 34 detects a deformed portion from each of the stitched images IC_Pc1, IC_Pc2, ..., IC_Pcn, and thereby generates the deformation detection results RS_Pc1, RS_Pc2, ..., RS_Pcn.
  • when the operation mode is the inspection image generation mode, the deformation detection unit 34 inputs the stitched image IC_P generated by the image stitching unit 33 to the learning model, and detects the deformed portion based on the score of the class label of each pixel output from the learning model.
  • the learning model may also be a learning model generated by labeling attributes of the entire stitched images IC_Pc and IC_P.
  • the learning model may be a network model other than a neural network, or a computational model generated by machine learning other than deep learning, such as linear regression or logistic regression.
  • the parameter determination unit 35 determines the parameter P based on the detection accuracy of the deformed portion detected by the deformation detection unit 34 in the parameter determination mode.
  • specifically, the parameter determination unit 35 calculates, for each parameter candidate Pc, the evaluation value VR_Pc of the deformation detection accuracy for the stitched image IC_Pc based on the deformation detection result RS_Pc. Then, the parameter determination unit 35 determines the parameter P based on the evaluation value VR_Pc for each parameter candidate Pc. Thereby, the parameter determination unit 35 can determine a parameter P that corrects the luminance while suppressing deterioration in the detection accuracy of the deformed portion of the object.
  • the parameter determination unit 35 can also determine the parameter P for correcting the luminance based on the visibility of the stitched image IC_Pc in addition to the detection accuracy of the deformed portion.
  • in this case, the parameter determination unit 35 calculates the visibility evaluation value VV_Pc of the stitched image IC_Pc for each parameter candidate Pc.
  • the parameter determination unit 35 then calculates a total evaluation value VT_Pc for each parameter candidate Pc based on the evaluation value VR_Pc of the deformation detection accuracy and the evaluation value VV_Pc of the visibility, and determines the parameter P based on the total evaluation value VT_Pc for each parameter candidate Pc.
  • thereby, the parameter determination unit 35 can accurately determine the parameter P as a parameter capable of correcting the luminance while suppressing deterioration in the detection accuracy of the deformed portion of the object.
  • the parameter determination unit 35 includes a true value information acquisition unit 40, an evaluation unit 41, and a determination unit 42, as shown in FIG. 4.
  • the true value information acquisition unit 40 acquires true value information including a true value indicating the presence or absence of deformation for each pixel of the stitched image IC_Pc.
  • the true value information includes a true value indicating the presence or absence of deformation for each type of deformation.
  • the true value information acquisition unit 40 causes the display processing unit 36 to display the stitched image IC_Pc on the display unit 12, for example.
  • then, when the true value information input by the user is accepted by the input reception unit 30, the true value information acquisition unit 40 acquires the accepted true value information via the data acquisition unit 31.
  • when the automatic mode is set, the true value information acquisition unit 40 causes the image stitching unit 33 to generate a stitched image by stitching together the plurality of captured images without luminance correction by the luminance correction unit 32. The true value information acquisition unit 40 then causes the deformation detection unit 34 to detect a deformed portion from this stitched image, and acquires the true value information from the information on the deformed portion detected by the deformation detection unit 34.
  • the evaluation unit 41 calculates the evaluation value VR_Pc of the accuracy of detecting the deformed portion from the stitched image IC_Pc, based on the information on the deformed portion detected by the deformation detection unit 34 in the parameter determination mode and the true value information acquired by the true value information acquisition unit 40.
  • the evaluation unit 41 calculates at least one of the recall rate, the precision rate, and the F value as the evaluation value VR_Pc, or can calculate, as the evaluation value VR_Pc, a value obtained by weighting and adding at least two of the recall rate, the precision rate, and the F value.
  • FIGS. 13A and 13B are diagrams for explaining the method of calculating the recall rate, the precision rate, and the F value by the evaluation unit of the information processing apparatus according to the first embodiment.
  • the number of pixels that are actually deformed among the pixels detected as deformed by the deformation detection unit 34 is TP (True Positive).
  • the number of pixels that are actually background among the pixels detected as background by the deformation detection unit 34 is TN (True Negative).
  • the number of pixels that are actually deformed among the pixels detected as background by the deformation detection unit 34 is FN (False Negative).
  • the number of pixels that are actually background among the pixels detected as deformed by the deformation detection unit 34 is FP (False Positive).
  • a background pixel is a pixel that indicates non-deformation.
  • when there are a plurality of types of deformation, the evaluation unit 41 treats deformation pixels of other types as background pixels, calculates TP, TN, FN, and FP for each type of deformation, and totals the per-type results to obtain the overall TP, TN, FN, and FP.
  • the evaluation unit 41 can calculate at least one of the recall rate, the precision rate, and the F value as the evaluation value VR_Pc by calculating the formulas shown in FIG. 13.
  • the evaluation unit 41 can also calculate, as the evaluation value VR_Pc, a value obtained by weighting and adding at least two of the recall rate, the precision rate, and the F value.
  • alternatively, the evaluation unit 41 can calculate, for each pixel, the difference between the score output from the learning model of the deformation detection unit 34 and the value corresponding to the actual type of the pixel, and use the sum of the calculated differences as the evaluation value VR_Pc.
  • the value corresponding to the type of a pixel is "1" when the pixel is a deformation pixel, and "0" when the pixel is a background pixel.
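  • with TP, FP, and FN totaled as above, the recall rate, precision rate, and F value follow their standard definitions (the exact formulas of FIG. 13 are not reproduced here; these are the usual ones):

        def recall_precision_f(tp, fp, fn):
            recall = tp / (tp + fn) if tp + fn else 0.0     # found / actually deformed
            precision = tp / (tp + fp) if tp + fp else 0.0  # found / reported as deformed
            f_value = (2 * precision * recall / (precision + recall)
                       if precision + recall else 0.0)      # harmonic mean
            return recall, precision, f_value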
  • the evaluation unit 41 can also calculate the visibility evaluation value VV_Pc of the stitched image IC_Pc in addition to the evaluation value VR_Pc of the detection accuracy of the deformed portion.
  • the evaluation value VV_Pc is calculated based on the contrast and edge strength of the stitched image IC_Pc. For example, the evaluation unit 41 increases the evaluation value VV_Pc as the contrast increases, and increases the evaluation value VV_Pc as the edge strength increases.
  • the evaluation unit 41 calculates the evaluation values VV_Pc1, VV_Pc2, ..., VV_Pcn for the stitched images IC_Pc1, IC_Pc2, ..., IC_Pcn.
  • the evaluation unit 41 then calculates, for example, the value obtained by weighting and adding the evaluation value VR_Pc1 and the evaluation value VV_Pc1 as the total evaluation value VT_Pc1, and the value obtained by weighting and adding the evaluation value VR_Pc2 and the evaluation value VV_Pc2 as the total evaluation value VT_Pc2.
  • the evaluation unit 41 likewise calculates the value obtained by weighting and adding the evaluation value VR_Pcn and the evaluation value VV_Pcn as the total evaluation value VT_Pcn.
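  • one plausible way to score visibility from contrast and edge strength, as the evaluation unit is described as doing; the specific measures (standard deviation for contrast, mean Sobel magnitude for edge strength) and the weights are assumptions, not values from the disclosure:

        import numpy as np
        import cv2

        def visibility_score(gray_image, w_contrast=0.5, w_edge=0.5):
            contrast = gray_image.std()              # larger -> higher VV_Pc
            gx = cv2.Sobel(gray_image, cv2.CV_64F, 1, 0)
            gy = cv2.Sobel(gray_image, cv2.CV_64F, 0, 1)
            edge_strength = np.hypot(gx, gy).mean()  # larger -> higher VV_Pc
            return w_contrast * contrast + w_edge * edge_strength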
  • the determination unit 42 shown in FIG. 4 determines the parameter P based on the plurality of evaluation values VR_Pc or the plurality of total evaluation values VT_Pc calculated by the evaluation unit 41. Thereby, the determination unit 42 can determine a parameter P that corrects the luminance while suppressing deterioration in the detection accuracy of the deformed portion of the object.
  • for example, the determination unit 42 determines, as the parameter P, the parameter candidate Pc corresponding to the largest evaluation value VR_Pc among the plurality of evaluation values VR_Pc calculated by the evaluation unit 41.
  • assume, for example, that the evaluation value VR_Pc2 is the largest among the evaluation values VR_Pc1, VR_Pc2, ..., VR_Pcn calculated by the evaluation unit 41. In this case, the determination unit 42 determines the parameter candidate Pc2 corresponding to the evaluation value VR_Pc2 as the parameter P.
  • alternatively, the determination unit 42 determines, as the parameter P, the parameter candidate Pc corresponding to the largest total evaluation value VT_Pc among the plurality of total evaluation values VT_Pc calculated by the evaluation unit 41.
  • assume, for example, that the total evaluation value VT_Pcn is the largest among the total evaluation values VT_Pc1, VT_Pc2, ..., VT_Pcn calculated by the evaluation unit 41. In this case, the determination unit 42 determines the parameter candidate Pcn corresponding to the total evaluation value VT_Pcn as the parameter P.
  • the parameter P determined by the determination unit 42 is stored in the parameter storage unit 22 by the data acquisition unit 31 as described above.
  • the display processing unit 36 causes the display unit 12 to display the information stored in the storage unit 13 or the information processed by the processing unit 14, based on the reception result of the input reception unit 30. For example, the display processing unit 36 causes the display unit 12 to display the stitched image IC_Pc generated by the image stitching unit 33, or causes the display unit 12 to display the inspection image generated by the inspection image generation unit 37.
  • the inspection image generation unit 37 generates an inspection image by superimposing information on the deformed portion detected by the deformation detection unit 34 from the stitched image IC_P onto the stitched image IC_P generated by the image stitching unit 33.
  • the information on the deformed portion is, for example, a diagram showing the shape of the deformed portion, or a frame surrounding the deformed portion.
  • FIG. 14 is a diagram showing an example of an inspection image generated by the inspection image generation unit of the information processing apparatus according to the first embodiment. As shown in FIG. 14, in the inspection image generated by the inspection image generation unit 37, information highlighting the deformation detected by the deformation detection unit 34 is superimposed on the stitched image IC_P. Although FIG. 14 shows the stitched image IC_P in solid color, the stitched image IC_P includes an image of the object.
  • the deformation is emphasized by setting the thickness or color of the line indicating the deformation to a thickness or color distinguishable from the stitched image IC_P.
  • the inspection image generation unit 37 can change the highlighting method according to, for example, the type of deformation and the type of object. Note that the inspection image generation unit 37 can also output, as an inspection image, a stitched image IC_P on which the information on the deformed portion is not superimposed.
  • the data output unit 38 shown in FIG. 4 outputs the parameter P determined by the parameter determination unit 35, the inspection image data generated by the inspection image generation unit 37, and the like to an external device via the communication unit 10.
  • an external device to which the parameter P determined by the parameter determination unit 35 is transmitted is, for example, an information processing device that does not have the parameter determination unit 35, the parameter determination mode, and the parameter candidate storage unit 21.
  • such an external device acquires a plurality of captured image data from the measurement device 70 of the measurement vehicle 2 or the like, corrects the luminance of each of the plurality of captured images represented by the acquired captured image data using the parameter P, and then generates a stitched image IC_P by stitching together the luminance-corrected captured images. The external device then detects a deformed portion from the stitched image IC_P and generates an inspection image in which information on the detected deformed portion is superimposed on the stitched image IC_P.
  • FIG. 15 is a flowchart illustrating an example of the processing by the processing unit of the information processing apparatus according to the first embodiment.
  • as shown in FIG. 15, the processing unit 14 of the information processing device 1 determines whether measurement data output from the measurement device 70 of the measurement vehicle 2 or the like has been acquired via the communication unit 10 (step S10). When the processing unit 14 determines that the measurement data has been acquired (step S10: Yes), it stores the captured image data included in the measurement data in the storage unit 13 (step S11).
  • when the processing of step S11 is completed, or when it is determined that the measurement data has not been acquired (step S10: No), the processing unit 14 determines whether or not the parameter determination timing has come (step S12). The processing unit 14 determines that the parameter determination timing has come, for example, when the user requests parameter determination or when the captured image data is stored in the storage unit 13.
  • when the processing unit 14 determines that the parameter determination timing has come (step S12: Yes), it performs the parameter determination processing (step S13).
  • the processing of step S13 is the processing of steps S20 to S26 shown in FIG. 16 and will be described in detail later.
  • when the processing of step S13 is completed, or when it is determined that the parameter determination timing has not come (step S12: No), the processing unit 14 determines whether or not the inspection image generation timing has come (step S14). For example, when there is an inspection image generation request from the user, or when, after the parameter P has been determined by the processing of step S13, the captured image data of the object corresponding to the determined parameter P is stored in the storage unit 13, the processing unit 14 determines that the inspection image generation timing has come.
  • when the processing unit 14 determines that it is time to generate an inspection image (step S14: Yes), it executes the inspection image generation processing (step S15).
  • the processing of step S15 is the processing of steps S60 to S65 shown in FIG. 20 and will be described in detail later.
  • when the processing of step S15 is completed, or when it is determined that the inspection image generation timing has not come (step S14: No), the processing unit 14 determines whether or not there is an output request (step S16).
  • when the processing unit 14 determines that there is an output request (step S16: Yes), it performs the output processing (step S17). In the output processing of step S17, the processing unit 14 outputs, for example, the parameter P determined by the parameter determination processing of step S13 or the data of the inspection image generated by the inspection image generation processing of step S15 to an external device via the communication unit 10.
  • when the processing of step S17 is finished, or when it is determined that there is no output request (step S16: No), the processing unit 14 determines whether or not it is time to end the operation (step S18). For example, when the processing unit 14 determines that the power supply (not shown) of the information processing device 1 is turned off, or determines that an operation-end operation has been performed on the input unit 11, it determines that it is time to end the operation.
  • when the processing unit 14 determines that it is not time to end the operation (step S18: No), the process proceeds to step S10; when it determines that it is time to end the operation (step S18: Yes), the processing shown in FIG. 15 ends.
  • FIG. 16 is a flowchart illustrating an example of the parameter determination processing by the processing unit of the information processing apparatus according to the first embodiment.
  • as shown in FIG. 16, the processing unit 14 acquires, from the storage unit 13, the plurality of captured image data of the object subject to the parameter determination processing (step S20).
  • next, the processing unit 14 acquires, from the storage unit 13, the plurality of parameter candidates Pc corresponding to the object subject to the parameter determination processing or to its object type (step S21). Then, the processing unit 14 corrects the luminance of the plurality of captured images represented by the plurality of captured image data using each parameter candidate Pc (step S22).
  • next, the processing unit 14 generates a stitched image IC_Pc for each parameter candidate Pc by stitching together, for each parameter candidate Pc, the plurality of captured images whose luminance has been corrected in step S22 (step S23).
  • the processing unit 14 then detects a deformed portion from the stitched image IC_Pc for each parameter candidate Pc (step S24), and executes the deformation detection accuracy evaluation processing (step S25).
  • the processing of step S25 is the processing of steps S30 to S32 shown in FIG. 17 and will be described in detail later.
  • the processing unit 14 determines the parameter P used in the luminance correction processing based on the evaluation value VR_Pc or the total evaluation value VT_Pc for each parameter candidate Pc calculated in the deformation detection accuracy evaluation processing (step S26), and ends the processing shown in FIG. 16.
  • FIG. 17 is a flowchart showing an example of the deformation detection accuracy evaluation processing by the processing unit of the information processing apparatus according to the first embodiment. As shown in FIG. 17, the processing unit 14 determines whether or not the luminance correction processing is set to the manual mode (step S30).
  • when the processing unit 14 determines that the luminance correction processing is set to the manual mode (step S30: Yes), it executes the manual mode processing (step S31).
  • the processing of step S31 is the processing of steps S40 to S43 shown in FIG. 18 and will be described in detail later.
  • when the processing unit 14 determines that the luminance correction processing is set to the automatic mode instead of the manual mode (step S30: No), it executes the automatic mode processing (step S32).
  • the processing of step S32 is the processing of steps S50 to S56 shown in FIG. 19 and will be described in detail later.
  • the processing unit 14 ends the processing shown in FIG. 17 when the processing of step S31 or step S32 is completed.
  • FIG. 18 is a flowchart showing an example of manual mode processing by the processing unit of the information processing apparatus according to Embodiment 1.
  • As shown in FIG. 18, the processing unit 14 receives the true values of the deformed portions from the user (step S40).
  • Next, the processing unit 14 compares, pixel by pixel and for each parameter candidate Pc, the detection result of step S24 with the true values received in step S40, and calculates the detection accuracy evaluation value VRPc for each parameter candidate Pc (step S41).
  • The processing unit 14 then calculates the visibility evaluation value VVPc of the composite image ICPc for each parameter candidate Pc (step S42). Finally, the processing unit 14 calculates the total evaluation value VTPc for each parameter candidate Pc based on the detection accuracy evaluation value VRPc and the visibility evaluation value VVPc (step S43), and ends the processing shown in FIG. 18.
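  • As one concrete reading of steps S41 to S43, the pixel-wise comparison can be scored with the precision, recall, and F value named in connection with FIG. 13, and the visibility can be approximated by a contrast measure. The sketch below is an assumption-laden illustration: the weights w_r and w_v and the use of the standard deviation as a contrast proxy are not values or choices fixed by this disclosure.

```python
import numpy as np

def evaluate_candidate(pred_mask, true_mask, composite, w_r=0.5, w_v=0.5):
    # pred_mask / true_mask: boolean arrays marking deformed pixels (step S41 input)
    tp = np.logical_and(pred_mask, true_mask).sum()
    fp = np.logical_and(pred_mask, ~true_mask).sum()
    fn = np.logical_and(~pred_mask, true_mask).sum()
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f_value = (2 * precision * recall / (precision + recall)
               if precision + recall else 0.0)         # VRPc (step S41)
    # Assumed contrast proxy for visibility, roughly scaled to [0, 1] (step S42).
    vv = float(np.std(composite)) / 128.0              # VVPc
    return w_r * f_value + w_v * vv                    # weighted sum: VTPc (step S43)
```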
  • FIG. 19 is a flowchart showing an example of automatic mode processing by the processing unit of the information processing apparatus according to Embodiment 1.
  • As shown in FIG. 19, the processing unit 14 acquires from the storage unit 13 the same plurality of pieces of captured image data as was acquired in step S20 (step S50).
  • Next, the processing unit 14 generates a composite image by stitching together, without luminance correction, the plurality of captured images represented by the captured image data acquired in step S50 (step S51). The processing unit 14 then detects deformed portions from the composite image generated in step S51 (step S52).
  • The processing unit 14 acquires true value information from the deformation detection result of step S52 (step S53).
  • Next, the processing unit 14 compares, pixel by pixel, the detection result obtained in step S24 with the true values obtained in step S53, and calculates the detection accuracy evaluation value VRPc for each parameter candidate Pc (step S54).
  • The processing unit 14 then calculates the visibility evaluation value VVPc of the composite image ICPc for each parameter candidate Pc (step S55). Finally, the processing unit 14 calculates the total evaluation value VTPc for each parameter candidate Pc based on the detection accuracy evaluation value VRPc and the visibility evaluation value VVPc (step S56), and ends the processing shown in FIG. 19.
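  • In other words, the automatic mode reuses the detector itself to produce the reference values: the detection result on the uncorrected composite image serves as the truth against which each luminance-corrected candidate is scored. A minimal sketch, using the same hypothetical helpers as in the earlier sketches:

```python
# Automatic mode truth generation (steps S50 to S53), sketched with the
# hypothetical helpers used above.
def make_auto_truth(captured_images):
    composite = stitch(captured_images)    # step S51: stitched without luminance correction
    return detect_deformation(composite)   # steps S52 to S53: result used as true values
```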
  • FIG. 20 is a flowchart showing an example of inspection image generation processing by the processing unit of the information processing apparatus according to Embodiment 1.
  • As shown in FIG. 20, the processing unit 14 acquires from the storage unit 13 a plurality of pieces of captured image data of the object subject to inspection image generation processing (step S60).
  • Next, the processing unit 14 acquires from the storage unit 13 the parameter P associated with the object, or with the type of the object, subject to inspection image generation processing (step S61).
  • The processing unit 14 then corrects the luminance of the plurality of captured images represented by the captured image data acquired in step S60, using the parameter P acquired in step S61 (step S62).
  • Next, the processing unit 14 generates a composite image ICP by stitching together the plurality of captured images whose luminance has been corrected (step S63). The processing unit 14 then detects deformed portions from the composite image ICP generated in step S63 (step S64).
  • Finally, the processing unit 14 generates an inspection image by superimposing the information on the deformed portions detected in step S64 onto the composite image ICP generated in step S63 (step S65), and ends the processing shown in FIG. 20.
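  • The superimposition of step S65 can be as simple as drawing the detected regions over the composite image. The OpenCV-based sketch below is one plausible rendering and not the method prescribed by this disclosure; the contour drawing and the red outline color are assumptions.

```python
import cv2
import numpy as np

def make_inspection_image(composite_bgr, deformation_mask):
    # composite_bgr: stitched image (H x W x 3, uint8)
    # deformation_mask: boolean array (H x W) marking detected deformed pixels
    inspection = composite_bgr.copy()
    contours, _ = cv2.findContours(deformation_mask.astype(np.uint8),
                                   cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    cv2.drawContours(inspection, contours, -1, (0, 0, 255), 2)  # outline deformations in red
    return inspection
```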
  • FIG. 21 is a diagram illustrating an example of the hardware configuration of the information processing apparatus according to the first embodiment.
  • As shown in FIG. 21, the information processing apparatus 1 includes a computer having a processor 101, a memory 102, a communication device 103, an input device 104, a display device 105, and a bus 106.
  • The processor 101, the memory 102, the communication device 103, the input device 104, and the display device 105 can transmit and receive information to and from one another via the bus 106, for example.
  • The storage unit 13 is implemented by the memory 102, the communication unit 10 by the communication device 103, the input unit 11 by the input device 104, and the display unit 12 by the display device 105.
  • The processor 101 reads the information processing program from a recording medium set in a recording medium drive and installs the read program in the memory 102.
  • The recording medium drive is, for example, a CD (Compact Disc)-ROM drive, a DVD (Digital Versatile Disc)-ROM drive, or a USB drive, and the recording medium is, for example, a CD-ROM, a DVD-ROM, or a non-volatile semiconductor memory.
  • The processor 101 executes the functions of the processing unit 14 by reading and executing the programs stored in the memory 102.
  • The processor 101 is an example of a processing circuit and includes one or more of a CPU (Central Processing Unit), a DSP (Digital Signal Processor), and a system LSI (Large Scale Integration).
  • The memory 102 includes one or more of a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), and an EEPROM (registered trademark) (Electrically Erasable Programmable Read Only Memory).
  • The information processing device 1 may include an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
  • The information processing device 1 may be configured by a client device, by a server device, or by a client device and a server device.
  • When the information processing device 1 is configured by two or more devices, each of the devices has the hardware configuration shown in FIG. 21, for example, and communication between the devices is performed via the communication devices 103.
  • The information processing device 1 may also be composed of two or more server devices; for example, it may be configured with a processing server and a data server.
  • The parameter determination unit 35 can also determine the parameter P individually for each of the plurality of captured images to be stitched together.
  • In this case, the luminance correction unit 32 corrects the luminance of each of the plurality of captured images to be stitched together using combinations of different parameter candidates Pc.
  • As described above, the information processing device 1 includes the data acquisition unit 31, the luminance correction unit 32, the image stitching unit 33, the deformation detection unit 34, and the parameter determination unit 35.
  • The data acquisition unit 31 acquires data of a plurality of images obtained by imaging an object with the plurality of imaging devices 71a, 71b, and 71c.
  • The luminance correction unit 32 corrects the luminance of each of the plurality of images.
  • The image stitching unit 33 generates a composite image ICPc, which is an image obtained by stitching together the plurality of images whose luminance has been corrected by the luminance correction unit 32.
  • The deformation detection unit 34 detects deformed portions of the object using a learning model for detecting deformed portions of the object from the composite image ICPc.
  • The parameter determination unit 35 determines the parameter P used for the luminance correction based on the accuracy of detection of the deformed portions by the deformation detection unit 34. As a result, the information processing device 1 can appropriately determine a parameter P that can correct the luminance while suppressing deterioration in the detection accuracy of deformed portions of the object.
  • The luminance correction unit 32 corrects the luminance of each of the plurality of images using each of the plurality of parameter candidates Pc.
  • The image stitching unit 33 generates, for each parameter candidate Pc, a composite image ICPc by stitching together the plurality of images whose luminance was corrected with that parameter candidate Pc by the luminance correction unit 32.
  • The deformation detection unit 34 detects deformed portions of the object for each parameter candidate Pc from the corresponding composite image ICPc.
  • The parameter determination unit 35 determines, as the parameter P, a parameter candidate Pc selected from the plurality of parameter candidates Pc based on the detection accuracy of the deformed portions for each parameter candidate Pc by the deformation detection unit 34. As a result, the information processing device 1 can more appropriately determine a parameter P that can correct the luminance while suppressing deterioration in the detection accuracy of deformed portions of the object.
  • The parameter determination unit 35 includes the evaluation unit 41 and the determination unit 42.
  • The evaluation unit 41 calculates the evaluation value VRPc of the detection accuracy of the deformed portions by the deformation detection unit 34 for each parameter candidate Pc.
  • The determination unit 42 determines the parameter candidate Pc to be used as the parameter P from among the plurality of parameter candidates Pc based on the evaluation values VRPc calculated by the evaluation unit 41.
  • As a result, the information processing device 1 can more appropriately determine a parameter P that can correct the luminance while suppressing deterioration in the detection accuracy of deformed portions of the object.
  • The evaluation unit 41 further calculates the visibility evaluation value VVPc of the composite image ICPc for each parameter candidate Pc, and calculates a total evaluation value VTPc for each parameter candidate Pc based on the detection accuracy evaluation value VRPc and the visibility evaluation value VVPc.
  • The determination unit 42 determines the parameter candidate Pc to be used as the parameter P from among the plurality of parameter candidates Pc based on the total evaluation values VTPc calculated by the evaluation unit 41.
  • As a result, the information processing device 1 can more appropriately determine a parameter P that can correct the luminance while suppressing deterioration in the detection accuracy of deformed portions of the object.
  • The luminance correction unit 32 corrects the luminance of each of the plurality of images obtained by imaging the object with the plurality of imaging devices 71a, 71b, and 71c, using the parameter P determined by the parameter determination unit 35.
  • The image stitching unit 33 generates a composite image ICP, which is an image obtained by stitching together the plurality of images whose luminance has been corrected with the parameter P by the luminance correction unit 32.
  • The deformation detection unit 34 detects deformed portions of the object from the composite image ICP whose luminance has been corrected with the parameter P. As a result, the information processing device 1 can correct the luminance while suppressing deterioration in the detection accuracy of deformed portions of the object.
  • Reference signs: 1 information processing device, 2 measurement vehicle, 3 network, 4 tunnel, 4a inner wall surface, 5 guardrail, 10, 74 communication unit, 11 input unit, 12 display unit, 13 storage unit, 14, 73 processing unit, 20 captured image data storage unit, 21 parameter candidate storage unit, 22 parameter storage unit, 30 input reception unit, 31 data acquisition unit, 32 luminance correction unit, 33 image stitching unit, 34 deformation detection unit, 35 parameter determination unit, 36 display processing unit, 37 inspection image generation unit, 38 data output unit, 40 true value information acquisition unit, 41 evaluation unit, 42 determination unit, 60 vehicle body, 70 measurement device, 71, 71a, 71b, 71c imaging device, 72 position/attitude/speed detection unit, 100 measurement processing system, 101 processor, 102 memory, 103 communication device, 104 input device, 105 display device, 106 bus.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

An information processing device according to the present invention is provided with a data acquisition unit, a brightness correction unit, an image superimposing unit, a deformation detection unit, and a parameter determination unit. The data acquisition unit acquires data of a plurality of images acquired by capturing images of a target object by means of a plurality of image capturing devices. The brightness correction unit corrects brightness of each of the plurality of images. The image superimposing unit generates a superimposed image, which is an image acquired by superimposing the plurality of images on which the brightness correction by the brightness correction unit has been performed. The deformation detection unit detects a deformed part of the target object by using a learning model for detecting a deformed part of the target object from the superimposed image. The parameter determination unit determines a parameter to be used for the brightness correction on the basis of accuracy of the deformed part detection by the deformation detection unit.

Description

Information processing device, information processing method, and information processing program
 The present disclosure relates to an information processing device, an information processing method, and an information processing program for determining parameters used for correcting the luminance of a plurality of images obtained by imaging an object with a plurality of imaging devices.
 Conventionally, there is known a measurement vehicle in which a plurality of imaging devices with mutually different imaging directions are mounted in order to inspect an object such as a tunnel, and which images the object with these imaging devices while traveling. In such a measurement vehicle, the imaging conditions for the imaging target, such as focus or iris, are switched for each imaging device.
 Patent Literature 1 proposes, as a technique for automatically adjusting imaging conditions, a technique for adjusting the exposure conditions of an imaging device mounted on a vehicle based on driving environment information including information such as the presence or absence of a tunnel, the presence or absence of an overpass, the road gradient, the date and time, the weather, and the direction of the vehicle.
Patent Literature 1: JP 2006-222844 A
 However, when deformed portions of an object are detected from a composite image, which is an image obtained by correcting the luminance of a plurality of images captured by a plurality of imaging devices and then stitching the corrected images together, the luminance correction may reduce the accuracy of detecting the deformed portions.
 The present disclosure has been made in view of the above, and an object thereof is to obtain an information processing device capable of appropriately determining a parameter that can correct luminance while suppressing deterioration in the detection accuracy of deformed portions of an object.
 In order to solve the above-described problems and achieve the object, the information processing device of the present disclosure includes a data acquisition unit, a luminance correction unit, an image stitching unit, a deformation detection unit, and a parameter determination unit. The data acquisition unit acquires data of a plurality of images obtained by imaging an object with a plurality of imaging devices. The luminance correction unit corrects the luminance of each of the plurality of images. The image stitching unit generates a composite image, which is an image obtained by stitching together the plurality of images whose luminance has been corrected by the luminance correction unit. The deformation detection unit detects deformed portions of the object using a learning model for detecting deformed portions of the object from the composite image. The parameter determination unit determines a parameter used for the luminance correction based on the accuracy of detection of the deformed portions by the deformation detection unit.
 According to the present disclosure, it is possible to appropriately determine a parameter that can correct luminance while suppressing deterioration in the detection accuracy of deformed portions of an object.
FIG. 1 is a diagram showing an example of the configuration of the measurement processing system according to a first embodiment.
FIG. 2 is a diagram showing an example of the configuration of a measurement device included in the measurement vehicle according to the first embodiment.
FIG. 3 is a diagram showing an example of the arrangement of a plurality of imaging devices in the measuring device according to the first embodiment.
FIG. 4 is a diagram showing an example of the configuration of the information processing device according to the first embodiment.
FIG. 5 is a diagram showing an example of the captured image data table stored in the captured image data storage unit according to the first embodiment.
FIG. 6 is a diagram showing an example of the parameter candidate table stored in the parameter candidate storage unit according to the first embodiment.
FIG. 7 is a diagram showing an example of the parameter table stored in the parameter storage unit according to the first embodiment.
FIG. 8 is a diagram showing an example of the parameter determination processing executed by the processing unit of the information processing device according to the first embodiment.
FIG. 9 is a diagram for explaining the gamma correction executed by the luminance correction unit of the information processing device according to the first embodiment.
FIG. 10 is a diagram for explaining the local histogram equalization method executed by the luminance correction unit of the information processing device according to the first embodiment.
FIG. 11 is a diagram for explaining the high-pass filtering executed by the luminance correction unit of the information processing device according to the first embodiment.
FIG. 12 is a diagram for explaining an example of the captured image stitching processing by the image stitching unit of the information processing device according to the first embodiment.
FIG. 13 is a diagram for explaining the method of calculating the recall, precision, and F value by the evaluation unit of the information processing device according to the first embodiment.
FIG. 14 is a diagram showing an example of the inspection image generated by the inspection image generation unit of the information processing device according to the first embodiment.
FIG. 15 is a flowchart showing an example of processing by the processing unit of the information processing device according to the first embodiment.
FIG. 16 is a flowchart showing an example of the parameter determination processing by the processing unit of the information processing device according to the first embodiment.
FIG. 17 is a flowchart showing an example of the deformation detection accuracy evaluation processing by the processing unit of the information processing device according to the first embodiment.
FIG. 18 is a flowchart showing an example of the manual mode processing by the processing unit of the information processing device according to the first embodiment.
FIG. 19 is a flowchart showing an example of the automatic mode processing by the processing unit of the information processing device according to the first embodiment.
FIG. 20 is a flowchart showing an example of the inspection image generation processing by the processing unit of the information processing device according to the first embodiment.
FIG. 21 is a diagram showing an example of the hardware configuration of the information processing device according to the first embodiment.
 The information processing device, information processing method, and information processing program according to the embodiment will be described in detail below with reference to the drawings.
Embodiment 1.
 FIG. 1 is a diagram showing an example of the configuration of the measurement processing system according to the first embodiment. As shown in FIG. 1, the measurement processing system 100 according to the first embodiment includes an information processing device 1 and a measurement vehicle 2. The measurement vehicle 2 includes a vehicle body 60 and a measurement device 70 mounted on the vehicle body 60. The measurement device 70 repeatedly images objects existing around the vehicle body 60 as measurement targets while the vehicle body 60 is traveling.
 In the example shown in FIG. 1, the objects to be measured by the measuring device 70 are the tunnel 4 and the guardrail 5, but the objects are not limited to these; they may be, for example, a road, a signboard, a traffic light, a bridge, or another structure. In the example shown in FIG. 1, the measurement vehicle 2 is an automobile that travels on a road, but it may instead be a railway vehicle that travels on rails.
 The measuring device 70 generates captured image data including data of images obtained by imaging the objects. The measuring device 70 can transmit and receive data to and from the information processing device 1 via the network 3, and transmits measurement data including the captured image data to the information processing device 1. The network 3 is, for example, a WAN (Wide Area Network) such as the Internet, but may be a LAN (Local Area Network) or another network.
 FIG. 2 is a diagram showing an example of the configuration of the measurement device provided in the measurement vehicle according to the first embodiment. As shown in FIG. 2, the measurement device 70 included in the measurement vehicle 2 includes the imaging devices 71a, 71b, and 71c, a position/attitude/speed detection unit 72, a processing unit 73, and a communication unit 74.
 The imaging devices 71a, 71b, and 71c have imaging directions that are orthogonal to the traveling direction of the vehicle body 60 and differ from one another. FIG. 3 is a diagram showing an example of the arrangement of the plurality of imaging devices in the measuring device according to the first embodiment. In the example shown in FIG. 3, the imaging direction of the imaging device 71a is diagonally upward to the left, the imaging direction of the imaging device 71b is upward, and the imaging direction of the imaging device 71c is diagonally upward to the right.
 In the example shown in FIG. 3, the inner wall surface of the tunnel 4 is imaged by the plurality of imaging devices 71a, 71b, and 71c. The imaging devices 71a, 71b, and 71c are, for example, color line sensors, and repeatedly image mutually different, partially overlapping regions of the inner wall surface 4a while the vehicle body 60 is traveling. In the following, the imaging devices 71a, 71b, and 71c may be collectively referred to as the imaging devices 71 when they need not be individually distinguished.
 The imaging device 71 switches the imaging conditions for the object by changing its imaging parameter settings. The content of the imaging parameter setting change is set in advance according to the state of the imaging environment, such as the state of the tunnel 4, the time of day, the weather, and the season, but the imaging parameters may also be changed automatically according to the state of the imaging environment. The imaging parameters are, for example, at least one of focus, iris, gain control, dynamic range, and the like.
 Note that the imaging device 71 may be a color area sensor or a monochrome sensor instead of a color line sensor. The number of imaging devices 71 provided in the measuring device 70 is not limited to three, and may be, for example, two or fewer or four or more.
 The position/attitude/speed detection unit 72 shown in FIG. 2 includes a GPS (Global Positioning System) receiver, an inertial sensor, and a speed sensor, and detects the position, attitude, and speed of the vehicle body 60. The processing unit 73 controls the plurality of imaging devices 71 based on, for example, the position, attitude, and speed of the vehicle body 60, and causes them to repeatedly image the object.
 Based on, for example, the position, attitude, and speed of the vehicle body 60, the processing unit 73 also generates captured image data for each imaging device 71, that is, data of a captured image in which the images repeatedly captured by that imaging device 71 are joined together along the traveling direction of the vehicle body 60.
 The processing unit 73 outputs measurement data including the captured image data for each imaging device 71 to the communication unit 74. The communication unit 74 is wirelessly connected to the network 3 and transmits the measurement data acquired from the processing unit 73 to the information processing device 1 via the network 3.
 Based on the measurement data acquired from the measurement vehicle 2, the information processing device 1 generates a composite image by stitching together a plurality of captured images in the direction orthogonal to the traveling direction of the vehicle body 60, and detects deformed portions of the object based on the generated composite image. A deformed portion is a portion where a deformation has occurred. When the object is the tunnel 4, for example, a deformation is a crack, peeling, a stain, or leaking water on the inner wall surface 4a of the tunnel 4.
 The information processing device 1 generates an inspection image by arranging information indicating the detected deformed portions on the composite image, and displays or prints the generated inspection image. Such an inspection image allows the user of the information processing device 1, acting as an inspection operator, to easily grasp the deformed portions of the object, and the inspection image can be pasted onto a report form or the like.
 A plurality of captured images obtained from the corresponding imaging devices 71 under mutually different imaging conditions may differ in luminance from one another. If these captured images are stitched together as they are to generate a composite image, the visibility of the composite image may deteriorate due to luminance unevenness within it. The visibility of the composite image may also deteriorate when the captured images have poor contrast or low luminance.
 Therefore, the information processing device 1 corrects the luminance of the plurality of captured images before generating the composite image, and stitches the luminance-corrected images together to generate the composite image. This improves the visibility of the composite image in the inspection image compared with a composite image obtained by stitching the images together without luminance correction.
 However, if the user arbitrarily sets the parameter P for correcting the luminance of the captured images to be stitched together, or if the luminance is corrected automatically from the histogram shape of each captured image to be stitched together, the accuracy of detecting deformed portions from the composite image may be lower than when no luminance correction is performed. For example, the luminance correction may cause blown-out highlights or crushed shadows at a deformed portion of the composite image, making the deformed portion undetectable and lowering the detection accuracy.
 Therefore, the information processing device 1 determines the parameter P for correcting the luminance based on the accuracy of detecting deformed portions from the composite image obtained by stitching together the plurality of luminance-corrected captured images. In this way, the information processing device 1 appropriately determines a parameter P that can correct the luminance while suppressing deterioration in the detection accuracy of deformed portions of the object. The configuration of the information processing device 1 will be specifically described below.
 FIG. 4 is a diagram showing an example of the configuration of the information processing device according to the first embodiment. As shown in FIG. 4, the information processing device 1 includes a communication unit 10, an input unit 11, a display unit 12, a storage unit 13, and a processing unit 14.
 The communication unit 10 is communicably connected to the network 3 by wire or wirelessly, and transmits and receives information via the network 3 to and from external devices such as the measuring device 70, a printer (not shown), or a terminal device.
 The input unit 11 includes, for example, a mouse and a keyboard, but may also be a touch pad. The display unit 12 is, for example, a liquid crystal display, an organic EL (ElectroLuminescence) display, or a projector. In the example shown in FIG. 4, the input unit 11 and the display unit 12 are included in the information processing device 1, but they may instead be connected to the information processing device 1 as external devices.
 The storage unit 13 includes a captured image data storage unit 20, a parameter candidate storage unit 21, and a parameter storage unit 22. The captured image data storage unit 20 stores a captured image data table including the captured image data for each imaging device 71 included in the measurement data transmitted from the measuring device 70.
 FIG. 5 is a diagram showing an example of the captured image data table stored in the captured image data storage unit according to the first embodiment. The captured image data table shown in FIG. 5 includes an "object ID (IDentifier)", an "imaging device ID", and "captured image data" for each piece of captured image data.
 The "object ID" is identification information unique to each object. The "imaging device ID" is identification information unique to each imaging device 71. The "captured image data" is the captured image data itself or information indicating its storage location.
 The captured image data table shown in FIG. 5 indicates that, as the captured image data of the object with object ID "M1", the data generated from the images captured by the imaging devices 71a, 71b, and 71c with imaging device IDs "DE1", "DE2", and "DE3" are the captured image data IMD1, IMD2, and IMD3, respectively.
 The parameter candidate storage unit 21 shown in FIG. 4 stores a parameter candidate table including a plurality of parameter candidates, which are candidates for the parameter P used in the luminance correction processing. FIG. 6 is a diagram showing an example of the parameter candidate table stored in the parameter candidate storage unit according to the first embodiment.
 The parameter candidate table shown in FIG. 6 includes an "object ID" and "parameter candidates". The "object ID" is the same as the "object ID" shown in FIG. 5. The "parameter candidates" are information on the parameter candidates for the parameter P used in the luminance correction processing.
 In the parameter candidate table shown in FIG. 6, parameter candidates Pc1, Pc2, ..., Pcn are included as the parameter candidates for the object with object ID "M1", where n is, for example, an integer of 4 or more. Parameter candidate Pc2 and others are included as the parameter candidates for the object with object ID "M2". In the following, the parameter candidates Pc1, Pc2, ..., Pcn may be collectively referred to as parameter candidates Pc when they need not be individually distinguished. Note that the parameter candidate table may include identification information unique to the type of object instead of or in addition to the object ID.
 The parameter storage unit 22 shown in FIG. 4 stores a parameter table including the parameter P used in the luminance correction processing. FIG. 7 is a diagram showing an example of the parameter table stored in the parameter storage unit according to the first embodiment.
 The parameter table shown in FIG. 7 includes an "object ID" and a "parameter". The "object ID" is the same as the "object ID" shown in FIG. 5. The "parameter" is information on the parameter P used in the luminance correction processing. In the parameter table shown in FIG. 7, the parameter candidate Pc2 is set as the parameter P for the object with object ID "M1", and the parameter candidate Pcn is set as the parameter P for the object with object ID "M2". Note that the parameter table may include identification information unique to the type of object instead of or in addition to the object ID.
 Returning to FIG. 4, the description of the information processing device 1 will be continued. When a parameter determination request is received through an input operation on the input unit 11 by the user of the information processing device 1, the processing unit 14 shown in FIG. 4 performs parameter determination processing for determining the parameter P for luminance correction. FIG. 8 is a diagram showing an example of the parameter determination processing executed by the processing unit of the information processing device according to the first embodiment.
 FIG. 8 shows an example in which captured images A, B, and C are stitched together as the plurality of captured images. The captured image A is, for example, the captured image represented by the above-described captured image data IMD1, the captured image B is the captured image represented by the captured image data IMD2, and the captured image C is the captured image represented by the captured image data IMD3.
 The processing unit 14 performs, for each parameter candidate Pc, luminance correction processing that corrects the luminance of each of the captured images A, B, and C based on the parameter candidates Pc1, Pc2, ..., Pcn. As a result, for example, captured images A, B, and C whose luminance has been corrected with the parameter candidate Pc1, captured images A, B, and C whose luminance has been corrected with the parameter candidate Pc2, and captured images A, B, and C whose luminance has been corrected with the parameter candidate Pcn are generated.
 Next, the processing unit 14 performs stitching processing that generates, for each parameter candidate Pc, a composite image obtained by stitching together the luminance-corrected captured images A, B, and C. As a result, composite images ICPc1, ICPc2, ..., ICPcn are generated.
 The composite image ICPc1 is an image obtained by stitching together the captured images A, B, and C whose luminance has been corrected with the parameter candidate Pc1, and the composite image ICPc2 is an image obtained by stitching together the captured images A, B, and C whose luminance has been corrected with the parameter candidate Pc2. Likewise, the composite image ICPcn is an image obtained by stitching together the captured images A, B, and C whose luminance has been corrected with the parameter candidate Pcn. In the following, the composite images ICPc1, ICPc2, ..., ICPcn may be collectively referred to as composite images ICPc when they need not be individually distinguished.
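 As a minimal picture of this stitching processing, suppose the corrected images are already geometrically aligned and simply adjoin one another in the direction orthogonal to the traveling direction; the composite image is then a plain concatenation. Overlap blending and alignment, which a real implementation would need, are deliberately omitted in this sketch.

```python
import numpy as np

def stitch(corrected_images):
    # corrected_images: list of aligned arrays sharing the same width.
    # Simplifying assumption: the images adjoin without overlap, so the
    # composite image ICPc is a plain concatenation across the tunnel section.
    return np.concatenate(corrected_images, axis=0)
```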
 Next, the processing unit 14 performs deformation detection processing that detects deformed portions from each of the composite images ICPc1, ICPc2, ..., ICPcn, thereby generating deformation detection results RSPc1, RSPc2, ..., RSPcn.
 The deformation detection result RSPc1 includes deformation data, which is data on the deformed portions detected from the composite image ICPc1 obtained using the parameter candidate Pc1. The deformation detection result RSPc2 includes deformation data on the deformed portions detected from the composite image ICPc2 obtained using the parameter candidate Pc2, and the deformation detection result RSPcn includes deformation data on the deformed portions detected from the composite image ICPcn obtained using the parameter candidate Pcn. In the following, the deformation detection results RSPc1, RSPc2, ..., RSPcn may be collectively referred to as deformation detection results RSPc when they need not be individually distinguished.
 Next, the processing unit 14 performs detection result evaluation processing that calculates evaluation values VRPc1, VRPc2, ..., VRPcn of the deformation detection accuracy for the composite images ICPc1, ICPc2, ..., ICPcn based on the deformation detection results RSPc1, RSPc2, ..., RSPcn.
 For example, the processing unit 14 calculates the evaluation value VRPc1 of the deformation detection accuracy for the composite image ICPc1 based on the deformation detection result RSPc1, and calculates the evaluation value VRPc2 for the composite image ICPc2 based on the deformation detection result RSPc2. Likewise, the processing unit 14 calculates the evaluation value VRPcn for the composite image ICPcn based on the deformation detection result RSPcn. In the following, the evaluation values VRPc1, VRPc2, ..., VRPcn may be collectively referred to as evaluation values VRPc when they need not be individually distinguished. The evaluation value VRPc becomes larger as the deformation detection accuracy becomes higher.
 Next, the processing unit 14 determines, based on the evaluation values VRPc1, VRPc2, ..., VRPcn, the parameter P for correcting the luminance while suppressing deterioration in the detection accuracy of deformed portions of the object. For example, the processing unit 14 performs parameter determination processing that determines, as the parameter P used for correcting the luminance, the parameter candidate Pc corresponding to the largest of the evaluation values VRPc1, VRPc2, ..., VRPcn.
 The processing unit 14 can also determine the parameter P based on the visibility of the composite images ICPc in addition to the detection accuracy of the deformed portions. In this case, the processing unit 14 calculates visibility evaluation values VVPc1, VVPc2, ..., VVPcn for the composite images ICPc1, ICPc2, ..., ICPcn, respectively. In the following, the evaluation values VVPc1, VVPc2, ..., VVPcn may be collectively referred to as evaluation values VVPc when they need not be individually distinguished.
 The evaluation value VVPc is calculated based on, for example, the contrast and edge strength of the composite image ICPc. For example, the processing unit 14 increases the evaluation value VVPc as the contrast becomes higher and as the edge strength becomes larger.
 The processing unit 14 determines, as the parameter P, the parameter candidate Pc corresponding to the largest of the results obtained by weighting and adding the evaluation values VRPc and VVPc. For example, the processing unit 14 calculates a weighted sum of the evaluation value VRPc1 and the evaluation value VVPc1 as a total evaluation value VTPc1, and calculates a weighted sum of the evaluation value VRPc2 and the evaluation value VVPc2 as a total evaluation value VTPc2.
 Likewise, the processing unit 14 calculates a weighted sum of the evaluation value VRPcn and the evaluation value VVPcn as a total evaluation value VTPcn. The processing unit 14 then determines, as the parameter P, the parameter candidate Pc corresponding to the largest of the total evaluation values VTPc1, VTPc2, ..., VTPcn. In the following, the total evaluation values VTPc1, VTPc2, ..., VTPcn may be collectively referred to as total evaluation values VTPc when they need not be individually distinguished.
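 Written out, the total evaluation value is a weighted sum VTPc = w1 × VRPc + w2 × VVPc, and the parameter P is the candidate that maximizes it. A minimal sketch follows; the weight values w1 and w2 are assumptions, since the disclosure states only that a weighted sum is used:

```python
def select_parameter(vr, vv, w1=0.7, w2=0.3):
    # vr, vv: dicts mapping each parameter candidate Pc to VRPc and VVPc.
    # w1, w2: assumed weights; only the weighted-sum form is given by the text.
    vt = {pc: w1 * vr[pc] + w2 * vv[pc] for pc in vr}  # total evaluation values VTPc
    return max(vt, key=vt.get)                         # candidate with the largest VTPc
```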
 By determining the parameter P based on the total evaluation values VTPc, the processing unit 14 can determine the parameter P used for correcting the luminance based on the visibility of the composite images ICPc in addition to the detection accuracy of the deformed portions. The processing unit 14 corrects the luminance of each of the captured images A, B, and C using the determined parameter P, and stitches the luminance-corrected captured images A, B, and C together to generate a composite image ICP.
 The processing unit 14 then detects deformed portions from the composite image ICP and generates an inspection image in which information on the detected deformed portions is superimposed on the composite image ICP. In this way, the processing unit 14 can improve the visibility of the composite image ICP while suppressing deterioration in the accuracy of detecting deformed portions from the composite image ICP. The configuration of the processing unit 14 will be specifically described below.
 As shown in FIG. 4, the processing unit 14 of the information processing device 1 includes an input reception unit 30, a data acquisition unit 31, a luminance correction unit 32, an image stitching unit 33, a deformation detection unit 34, a parameter determination unit 35, a display processing unit 36, an inspection image generation unit 37, and a data output unit 38.
 The input reception unit 30 receives input operations on the input unit 11 by the user. For example, the input reception unit 30 receives a parameter determination request from the user via the input unit 11. The parameter determination request includes, for example, information such as the object ID of the object subject to parameter determination.
 The input reception unit 30 also receives an inspection image generation request from the user via the input unit 11. The inspection image generation request includes, for example, information such as the object ID of the target object. The input reception unit 30 also receives an output request from the user via the input unit 11.
 The data acquisition unit 31 acquires data from external devices via the communication unit 10 and stores the acquired data in the storage unit 13, and also retrieves data stored in the storage unit 13 when the processing unit 14 performs processing using that data.
 For example, the data acquisition unit 31 acquires from the communication unit 10 the measurement data received by the communication unit 10 from the measuring device 70 via the network 3, and stores the captured image data included in the acquired measurement data in the captured image data storage unit 20.
 When a parameter determination request is received by the input reception unit 30, the data acquisition unit 31 acquires from the storage unit 13 the plurality of pieces of captured image data and the plurality of parameter candidates Pc associated with the object ID included in the parameter determination request. When the input reception unit 30 receives a parameter determination request, the operation mode of the processing unit 14 becomes the parameter determination mode.
 Note that, when a parameter determination request is received by the input reception unit 30, the data acquisition unit 31 can also acquire from the storage unit 13 a plurality of parameter candidates Pc associated with the type of the object identified by the object ID included in the parameter determination request.
 When an inspection image generation request is received by the input reception unit 30, the data acquisition unit 31 acquires from the storage unit 13 the plurality of pieces of captured image data and the parameter P associated with the object ID included in the inspection image generation request. When the input reception unit 30 receives an inspection image generation request, the operation mode of the processing unit 14 becomes the inspection image generation mode. The data acquisition unit 31 also acquires the parameter P determined by the parameter determination unit 35 from the parameter determination unit 35 and stores it in the parameter storage unit 22.
 The luminance correction unit 32 has the parameter determination mode and the inspection image generation mode as its operation modes. First, the operation of the luminance correction unit 32 in the parameter determination mode will be described.
 When the operation mode is the parameter determination mode, the luminance correction unit 32 corrects the luminance of each of the plurality of captured images represented by the plurality of pieces of captured image data acquired by the data acquisition unit 31, for each parameter candidate Pc, using the plurality of parameter candidates Pc acquired by the data acquisition unit 31.
 The luminance correction unit 32 corrects the luminance of each of the captured images by, for example, gamma correction. The luminance correction unit 32 performs the gamma correction by, for example, computing the following formula (1), in which x is the input, y is the output, and γ is the correction parameter.

 y = 255 × (x / 255)^(1/γ) ... (1)
 In the case of gamma correction, the parameter candidates Pc1, Pc2, ..., Pcn are values to be substituted for the parameter γ. The luminance correction unit 32 substitutes each parameter candidate Pc for γ in formula (1) and takes the luminance value of each pixel of a captured image as the input x, thereby correcting the luminance of each of the plurality of captured images for each parameter candidate Pc.
 FIG. 9 is a diagram for explaining the gamma correction executed by the luminance correction unit of the information processing device according to the first embodiment. As shown in FIG. 9, when the parameter γ is smaller than 1, the rate of increase of the output y becomes larger as the input x becomes larger; when the parameter γ is larger than 1, the rate of increase of the output y becomes smaller as the input x becomes larger.
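 A minimal NumPy sketch of formula (1), applying one candidate value of γ to an 8-bit image; this is an illustrative reading of the formula, not code from the disclosure:

```python
import numpy as np

def gamma_correct(image_u8, gamma):
    # Formula (1): y = 255 * (x / 255) ** (1 / gamma), applied per pixel.
    x = image_u8.astype(np.float64)
    y = 255.0 * (x / 255.0) ** (1.0 / gamma)
    return np.clip(y, 0, 255).astype(np.uint8)

# e.g. one corrected image per parameter candidate Pc:
# corrected = [gamma_correct(img, pc) for pc in (0.5, 1.0, 2.2)]
```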
The luminance correction unit 32 can also correct the luminance of each of the plurality of captured images for each parameter candidate Pc by a local histogram equalization method. The local histogram equalization method improves contrast by dividing a captured image into regions and modifying the distribution of luminance values in each divided region.
In the case of the local histogram equalization method, the parameter candidates Pc1, Pc2, ..., Pcn are parameters that specify at least one of the method of dividing the captured image and the method of combining the divided regions.
FIG. 10 is a diagram for explaining the local histogram equalization method executed by the luminance correction unit of the information processing apparatus according to Embodiment 1. As shown in FIG. 10, in the local histogram equalization method, contrast is improved by dividing the captured image and modifying the distribution of luminance values in each divided region.
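One widely used realization of local histogram equalization is CLAHE (contrast-limited adaptive histogram equalization). The sketch below uses OpenCV and treats the tile grid size as the parameter candidate specifying the division of the image; that concrete mapping of Pc to the tile grid is an assumption made for illustration.

```python
import cv2

# gray_image: assumed 8-bit single-channel input.
for pc in [(4, 4), (8, 8), (16, 16)]:  # candidate tile grids (illustrative Pc values)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=pc)
    equalized = clahe.apply(gray_image)  # luminance distribution modified per tile
```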
The luminance correction unit 32 can also correct the luminance of each of the plurality of captured images for each parameter candidate Pc using a high-pass filter. For example, the luminance correction unit 32 extracts the frequency components of a captured image by applying a discrete Fourier transform to it, removes only the low-frequency components from the extracted frequency components, and applies an inverse Fourier transform to the remaining frequency components to generate a captured image with corrected luminance. By removing the low-frequency components, the luminance correction unit 32 can, for example, homogenize the image or extract edges.
In the case of the high-pass filter, the parameter candidates Pc1, Pc2, ..., Pcn are parameters that specify the frequency band to be removed from the frequency components of the captured image. FIG. 11 is a diagram for explaining the high-pass filtering performed by the luminance correction unit of the information processing apparatus according to Embodiment 1.
In the example shown in FIG. 11, the parameter candidate Pc1 specifies the removal frequency band fg1, and the parameter candidate Pc2 specifies the removal frequency band fg2. Likewise, the parameter candidate Pc3 specifies the removal frequency band fg3, and the parameter candidate Pc4 specifies the removal frequency band fg4. The removal frequency band fg2 is higher than fg1, fg3 is higher than fg2, and fg4 is higher than fg3.
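A minimal FFT-based high-pass filter along these lines might look as follows, assuming each candidate Pc is expressed as a cutoff radius in the shifted frequency plane; that concrete parameterization is an assumption, not part of the embodiment.

```python
import numpy as np

def highpass_filter(gray: np.ndarray, cutoff: int) -> np.ndarray:
    """Remove frequency components within `cutoff` of DC, then invert the DFT."""
    f = np.fft.fftshift(np.fft.fft2(gray.astype(np.float64)))
    rows, cols = gray.shape
    cy, cx = rows // 2, cols // 2
    yy, xx = np.ogrid[:rows, :cols]
    mask = (yy - cy) ** 2 + (xx - cx) ** 2 > cutoff ** 2  # keep only high frequencies
    filtered = np.fft.ifft2(np.fft.ifftshift(f * mask)).real
    return np.clip(filtered, 0, 255).astype(np.uint8)

# Larger cutoff values correspond to higher removal frequency bands fg1..fg4.
for pc in (5, 15, 30, 60):
    corrected = highpass_filter(gray_image, pc)  # gray_image: assumed input
```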
Next, the operation of the luminance correction unit 32 when the operation mode is the inspection image generation mode will be described. When the operation mode is the inspection image generation mode, the luminance correction unit 32 corrects the luminance of each of the plurality of captured images represented by the plurality of pieces of captured image data acquired by the data acquisition unit 31, using the parameter P acquired by the data acquisition unit 31.
For example, the luminance correction unit 32 uses the parameter P to correct the luminance of each of the plurality of captured images by gamma correction, the local histogram equalization method, or a high-pass filter.
Note that the luminance correction method used by the luminance correction unit 32 is not limited to gamma correction, the local histogram equalization method, and high-pass filtering; various luminance correction methods, including independently developed ones, can be applied.
The luminance correction unit 32 can change the correction method in response to the user's input operation on the input unit 11, or according to the type of the object. For example, the luminance correction unit 32 holds correction information indicating the correction method corresponding to each type of object and, based on this correction information, corrects the luminance of each of the plurality of captured images with a correction method set for each object or for each object type. The type of the object is, for example, the type of structure, such as the tunnel 4, the guardrail 5, a road, or a bridge. The luminance correction unit 32 can also change the correction method based on the time of day, season, weather, and the like under which the captured image data was obtained.
Next, the image stitching unit 33 shown in FIG. 4 will be described. The image stitching unit 33 generates stitched images IC_Pc and IC_P, each of which is an image obtained by stitching together a plurality of captured images whose luminance has been corrected by the luminance correction unit 32. The image stitching unit 33 has a parameter determination mode and an inspection image generation mode as operation modes. First, the operation of the image stitching unit 33 when the operation mode is the parameter determination mode will be described.
When the operation mode is the parameter determination mode, the image stitching unit 33 generates, for each parameter candidate Pc, a stitched image IC_Pc obtained by stitching together the plurality of captured images whose luminance has been corrected using that parameter candidate Pc.
For example, in the example shown in FIG. 8, the image stitching unit 33 generates stitched images IC_Pc1, IC_Pc2, ..., IC_Pcn by stitching together, for each parameter candidate, the plurality of captured images A, B, and C whose luminance has been corrected with the parameter candidates Pc1, Pc2, ..., Pcn.
FIG. 12 is a diagram for explaining an example of the captured-image stitching process performed by the image stitching unit of the information processing apparatus according to Embodiment 1. In the example shown in FIG. 12, the image stitching unit 33 stitches together a plurality of captured images A, B, and C whose luminance has been corrected with a parameter candidate Pc to generate a stitched image IC_Pc. The captured images A, B, and C shown in FIG. 12 are images of the inner wall surface 4a of the tunnel 4.
When the operation mode is the inspection image generation mode, the image stitching unit 33 generates a stitched image IC_P, which is an image obtained by stitching together the plurality of captured images whose luminance has been corrected by the luminance correction unit 32 using the parameter P.
In the process of generating the stitched image IC_Pc or IC_P, the image stitching unit 33 deletes the overlapping portion between the captured images A and B and the overlapping portion between the captured images B and C.
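For intuition, stitching with overlap removal can be sketched as below. The fixed, known overlap widths are an assumption made for illustration; in practice the overlap would be derived from the camera geometry or image registration.

```python
import numpy as np

def stitch(images, overlaps):
    """Concatenate images side by side, dropping `overlaps[i]` columns
    shared between images[i] and images[i + 1]."""
    parts = [images[0]]
    for img, ov in zip(images[1:], overlaps):
        parts.append(img[:, ov:])  # drop the columns duplicated in the previous image
    return np.hstack(parts)

stitched = stitch([img_a, img_b, img_c], overlaps=[32, 32])  # hypothetical widths
```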
Next, the deformation detection unit 34 shown in FIG. 4 will be described. The deformation detection unit 34 detects deformed portions of the object using a learning model that detects deformed portions of the object from the stitched images IC_Pc and IC_P. The learning model is a neural network such as a convolutional neural network or a recurrent neural network, and is generated by deep learning. The learning model of the deformation detection unit 34 is, for example, a learning model that performs semantic segmentation.
The deformation detection unit 34 has a parameter determination mode and an inspection image generation mode as operation modes. First, the operation of the deformation detection unit 34 when the operation mode is the parameter determination mode will be described.
When the operation mode is the parameter determination mode, the deformation detection unit 34 inputs the stitched image IC_Pc generated by the image stitching unit 33 to the learning model and detects deformed portions based on the per-class-label score of each pixel output by the learning model.
When the object is the tunnel 4, the class labels are, for example, a class label indicating a crack, a class label indicating delamination, a class label indicating dirt, or a class label indicating water leakage. The score of a class label at a pixel is, for example, a value from 0 to 1. If a pixel has a class label with a score of 0.5 or more, the deformation detection unit 34 determines that the pixel is a deformation pixel and that the type of deformation is the one corresponding to the class label whose score is 0.5 or more.
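The per-pixel decision described here amounts to thresholding the score map of each class label. A minimal sketch, assuming the model outputs an array of shape (num_classes, H, W) with scores in [0, 1]; the label names and the model output are illustrative assumptions:

```python
import numpy as np

CLASS_LABELS = ["crack", "delamination", "dirt", "water_leakage"]  # example labels

def detect_deformations(scores: np.ndarray, threshold: float = 0.5):
    """Return, per class label, the boolean mask of pixels judged as that deformation."""
    return {label: scores[i] >= threshold for i, label in enumerate(CLASS_LABELS)}

masks = detect_deformations(model_scores)   # model_scores: assumed model output
crack_pixels = np.argwhere(masks["crack"])  # coordinates of pixels judged as cracks
```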
For example, in the example shown in FIG. 8, the deformation detection unit 34 detects deformed portions from each of the stitched images IC_Pc1, IC_Pc2, ..., IC_Pcn and generates deformation detection results RS_Pc1, RS_Pc2, ..., RS_Pcn as the results of the detection.
When the operation mode is the inspection image generation mode, the deformation detection unit 34 inputs the stitched image IC_P generated by the image stitching unit 33 to the learning model and detects deformed portions based on the per-class-label score of each pixel output by the learning model.
Instead of a learning model that performs semantic segmentation, the learning model may be one generated by labeling the attributes of the stitched images IC_Pc and IC_P as a whole. The learning model may also be a network model other than a neural network, or a computational model generated by machine learning other than deep learning, such as linear regression or logistic regression.
Next, the parameter determination unit 35 shown in FIG. 4 will be described. The parameter determination unit 35 determines the parameter P based on the detection accuracy of the deformed portions detected by the deformation detection unit 34 operating in the parameter determination mode.
For example, in the example shown in FIG. 8, the parameter determination unit 35 calculates, for each parameter candidate Pc, an evaluation value VR_Pc of the deformation detection accuracy for the stitched image IC_Pc based on the deformation detection result RS_Pc. The parameter determination unit 35 then determines the parameter P based on the per-candidate evaluation values VR_Pc. In this way, the parameter determination unit 35 can determine a parameter P capable of correcting the luminance while suppressing a decrease in the detection accuracy of deformed portions of the object.
The parameter determination unit 35 can also determine the parameter P for correcting the luminance based on the visibility of the stitched image IC_Pc in addition to the detection accuracy of the deformed portions. In that case, the parameter determination unit 35 calculates an evaluation value VV_Pc of the visibility of the stitched image IC_Pc for each parameter candidate Pc.
The parameter determination unit 35 then calculates, for each parameter candidate Pc, a total evaluation value VT_Pc based on the deformation detection accuracy evaluation value VR_Pc and the visibility evaluation value VV_Pc, and determines the parameter P based on the calculated per-candidate total evaluation values VT_Pc. This allows the parameter determination unit 35 to accurately determine the parameter P as a parameter capable of correcting the luminance while suppressing a decrease in the detection accuracy of deformed portions of the object.
As shown in FIG. 4, the parameter determination unit 35 includes a true value information acquisition unit 40, an evaluation unit 41, and a determination unit 42. The true value information acquisition unit 40 acquires true value information including, for each pixel of the stitched image IC_Pc, a true value indicating the presence or absence of deformation. When there are multiple types of deformation, the true value information includes a true value indicating the presence or absence of deformation for each type.
When the manual mode is set, the true value information acquisition unit 40 causes the display processing unit 36 to display, for example, the stitched image IC_Pc on the display unit 12. When true value information entered by the user is accepted by the input receiving unit 30, the true value information acquisition unit 40 acquires the accepted true value information via the data acquisition unit 31.
When the automatic mode is set, the true value information acquisition unit 40 causes the image stitching unit 33 to generate a stitched image obtained by stitching together a plurality of captured images that have not undergone luminance correction by the luminance correction unit 32. The true value information acquisition unit 40 then causes the deformation detection unit 34 to detect deformed portions from this uncorrected stitched image, and acquires the true value information from the information on the deformed portions detected by the deformation detection unit 34.
The evaluation unit 41 calculates the evaluation value VR_Pc of the accuracy of detecting deformed portions from the stitched image IC_Pc, based on the information on the deformed portions detected by the deformation detection unit 34 operating in the parameter determination mode and on the true value information acquired by the true value information acquisition unit 40.
The evaluation unit 41 can calculate at least one of the recall, the precision, and the F value as the evaluation value VR_Pc, or can calculate, as the evaluation value VR_Pc, a value obtained by weighting and adding at least two of the recall, the precision, and the F value.
Here, the recall, the precision, and the F value will be explained. FIG. 13 is a diagram for explaining how the evaluation unit of the information processing apparatus according to Embodiment 1 calculates the recall, the precision, and the F value.
In the example shown in FIG. 13, the number of pixels detected as deformation by the deformation detection unit 34 that are actually deformation is denoted TP (True Positive), and the number of pixels detected as background by the deformation detection unit 34 that are actually background is denoted TN (True Negative).
Also, in the example shown in FIG. 13, the number of pixels detected as background by the deformation detection unit 34 that are actually deformation is denoted FN (False Negative), and the number of pixels detected as deformation by the deformation detection unit 34 that are actually background is denoted FP (False Positive). A background pixel is a pixel that indicates something other than deformation.
When there are multiple types of deformation, the evaluation unit 41, for example, treats deformation pixels of a different type as background pixels, calculates TP, TN, FN, and FP for each deformation type, and sums the per-type values to obtain the overall TP, TN, FN, and FP.
The evaluation unit 41 can calculate at least one of the recall, the precision, and the F value as the evaluation value VR_Pc by computing the formulas shown in FIG. 13 (in the usual definitions, recall = TP / (TP + FN), precision = TP / (TP + FP), and the F value is the harmonic mean 2 × precision × recall / (precision + recall)). The evaluation unit 41 can also calculate, as the evaluation value VR_Pc, a value obtained by weighting and adding at least two of the recall, the precision, and the F value.
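Under the standard definitions above, the evaluation value can be computed from the predicted and true masks as follows. The weight vector is shown with example values, which are an assumption for illustration.

```python
import numpy as np

def detection_accuracy(pred: np.ndarray, truth: np.ndarray,
                       weights=(0.0, 0.0, 1.0)) -> float:
    """pred, truth: boolean masks (True = deformation). Returns a weighted
    sum of (recall, precision, F value) as the evaluation value VR_Pc."""
    tp = np.sum(pred & truth)
    fp = np.sum(pred & ~truth)
    fn = np.sum(~pred & truth)
    recall = tp / (tp + fn) if tp + fn else 0.0
    precision = tp / (tp + fp) if tp + fp else 0.0
    f_value = (2 * precision * recall / (precision + recall)
               if precision + recall else 0.0)
    w_r, w_p, w_f = weights
    return w_r * recall + w_p * precision + w_f * f_value
```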
Alternatively, the evaluation unit 41 can calculate, for each pixel, the difference between the score output by the learning model of the deformation detection unit 34 and the value corresponding to the actual type of the pixel, and use the sum of the calculated differences as the evaluation value VR_Pc. The value corresponding to the type of a pixel is 1 when the pixel type is deformation, and 0 when the pixel type is background.
The evaluation unit 41 can also calculate an evaluation value VV_Pc of the visibility of the stitched image IC_Pc in addition to evaluating the detection accuracy of the deformed portions. The evaluation value VV_Pc is calculated based on, for example, the contrast and edge strength of the stitched image IC_Pc: the evaluation unit 41 increases the evaluation value VV_Pc as the contrast increases and as the edge strength increases.
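One simple concrete choice, assumed here for illustration since the embodiment does not fix the formula, is to measure contrast as the standard deviation of the luminance values and edge strength as the mean Sobel gradient magnitude:

```python
import cv2
import numpy as np

def visibility_score(gray: np.ndarray, w_contrast=1.0, w_edge=1.0) -> float:
    """Higher contrast and stronger edges both raise the visibility value VV_Pc."""
    contrast = float(np.std(gray))
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    edge_strength = float(np.mean(np.hypot(gx, gy)))
    return w_contrast * contrast + w_edge * edge_strength
```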
In the example shown in FIG. 8, the evaluation unit 41 calculates the visibility evaluation values VV_Pc1, VV_Pc2, ..., VV_Pcn of the stitched images IC_Pc1, IC_Pc2, ..., IC_Pcn.
Then, for example, the evaluation unit 41 calculates a value obtained by weighting and adding the evaluation value VR_Pc1 and the evaluation value VV_Pc1 as the total evaluation value VT_Pc1, and calculates a value obtained by weighting and adding the evaluation value VR_Pc2 and the evaluation value VV_Pc2 as the total evaluation value VT_Pc2. Likewise, the evaluation unit 41 calculates a value obtained by weighting and adding the evaluation value VR_Pcn and the evaluation value VV_Pcn as the total evaluation value VT_Pcn.
The determination unit 42 shown in FIG. 4 determines the parameter P based on the plurality of evaluation values VR_Pc or the plurality of total evaluation values VT_Pc calculated by the evaluation unit 41. In this way, the determination unit 42 can determine a parameter P capable of correcting the luminance while suppressing a decrease in the detection accuracy of deformed portions of the object.
For example, the determination unit 42 determines, as the parameter P, the parameter candidate Pc corresponding to the largest of the plurality of evaluation values VR_Pc calculated by the evaluation unit 41. Suppose, for example, that the evaluation unit 41 calculates the evaluation values VR_Pc1, VR_Pc2, ..., VR_Pcn and that VR_Pc2 is the largest among them. In this case, the determination unit 42 determines the parameter candidate Pc2 corresponding to the evaluation value VR_Pc2 as the parameter P.
Similarly, the determination unit 42 can determine, as the parameter P, the parameter candidate Pc corresponding to the largest of the plurality of total evaluation values VT_Pc calculated by the evaluation unit 41. Suppose, for example, that the evaluation unit 41 calculates the total evaluation values VT_Pc1, VT_Pc2, ..., VT_Pcn and that VT_Pcn is the largest among them. In this case, the determination unit 42 determines the parameter candidate Pcn corresponding to the total evaluation value VT_Pcn as the parameter P. The parameter P determined by the determination unit 42 is stored in the parameter storage unit 22 by the data acquisition unit 31 as described above.
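The selection itself is an argmax over the candidates. A minimal sketch, assuming the per-candidate evaluation values are held in a dict keyed by candidate:

```python
def choose_parameter(evaluations: dict):
    """evaluations maps each candidate Pc to its VR_Pc (or VT_Pc) value;
    the candidate with the largest value becomes the parameter P."""
    return max(evaluations, key=evaluations.get)

P = choose_parameter({0.5: 0.71, 1.0: 0.83, 1.5: 0.79})  # example values only
```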
The display processing unit 36 causes the display unit 12 to display information stored in the storage unit 13 or information processed by the processing unit 14, based on the result of acceptance by the input receiving unit 30. For example, the display processing unit 36 causes the display unit 12 to display the stitched image IC_Pc generated by the image stitching unit 33 or the inspection image generated by the inspection image generation unit 37.
The inspection image generation unit 37 generates an inspection image in which the information on the deformed portions detected by the deformation detection unit 34 from the stitched image IC_P is superimposed on the stitched image IC_P generated by the image stitching unit 33. The information on a deformed portion is, for example, a line drawing indicating the shape of the deformation or a frame surrounding the deformed portion.
FIG. 14 is a diagram showing an example of the inspection image generated by the inspection image generation unit of the information processing apparatus according to Embodiment 1. As shown in FIG. 14, in the inspection image generated by the inspection image generation unit 37, information highlighting the deformations detected by the deformation detection unit 34 is superimposed on the stitched image IC_P. Although FIG. 14 shows the stitched image IC_P as a blank area, the stitched image IC_P contains an image of the object.
Deformations are highlighted by giving the lines indicating them a thickness or color distinguishable from the stitched image IC_P. The inspection image generation unit 37 can also, for example, change the highlighting method according to the type of deformation and the type of object. Note that the inspection image generation unit 37 can also output, as the inspection image, the stitched image IC_P without the deformed-portion information superimposed.
When the input receiving unit 30 accepts an output request from the user, the data output unit 38 shown in FIG. 4 outputs information corresponding to the user's input operation on the input unit 11, or information processed by the processing unit 14, to an external device via the communication unit 10.
For example, the data output unit 38 outputs the parameter P determined by the parameter determination unit 35, the data of the inspection image generated by the inspection image generation unit 37, and the like to an external device via the communication unit 10.
The external device to which the parameter P determined by the parameter determination unit 35 is transmitted is, for example, an information processing device that does not have the parameter determination unit 35, the parameter determination mode, or the parameter candidate storage unit 21. Such an external device acquires a plurality of pieces of captured image data from, for example, the measurement device 70 of the measurement vehicle 2, corrects the luminance of each of the plurality of captured images represented by the acquired data, and then generates a stitched image IC_P by stitching these captured images together. The external device then detects deformed portions from the stitched image IC_P and generates an inspection image in which the information on the detected deformed portions is superimposed on the stitched image IC_P.
Next, the processing performed by the processing unit 14 of the information processing apparatus 1 will be described with reference to flowcharts. FIG. 15 is a flowchart showing an example of the processing performed by the processing unit of the information processing apparatus according to Embodiment 1.
As shown in FIG. 15, the processing unit 14 of the information processing apparatus 1 determines whether measurement data output from, for example, the measurement device 70 of the measurement vehicle 2 has been acquired via the communication unit 10 (step S10). When the processing unit 14 determines that measurement data has been acquired (step S10: Yes), it stores the captured image data included in the measurement data in the storage unit 13 (step S11).
When the processing of step S11 has finished, or when the processing unit 14 determines that no measurement data has been acquired (step S10: No), it determines whether the parameter determination timing has arrived (step S12). The processing unit 14 determines that the parameter determination timing has arrived, for example, when there is a parameter determination request from the user or when captured image data has been stored in the storage unit 13.
When the processing unit 14 determines that the parameter determination timing has arrived (step S12: Yes), it performs the parameter determination processing (step S13). The processing of step S13 corresponds to steps S20 to S26 shown in FIG. 16 and will be described in detail later.
When the processing of step S13 has finished, or when the processing unit 14 determines that the parameter determination timing has not arrived (step S12: No), it determines whether the inspection image generation timing has arrived (step S14). The processing unit 14 determines that the inspection image generation timing has arrived, for example, when there is an inspection image generation request from the user, or when, after the parameter P has been determined by the processing of step S13, captured image data of the object corresponding to the determined parameter P has been stored in the storage unit 13.
When the processing unit 14 determines that the inspection image generation timing has arrived (step S14: Yes), it executes the inspection image generation processing (step S15). The processing of step S15 corresponds to steps S60 to S65 shown in FIG. 20 and will be described in detail later.
When the processing of step S15 has finished, or when the processing unit 14 determines that the inspection image generation timing has not arrived (step S14: No), it determines whether there is an output request (step S16).
When the processing unit 14 determines that there is an output request (step S16: Yes), it performs output processing (step S17). In the output processing of step S17, the processing unit 14 outputs, for example, the parameter P determined in the parameter determination processing of step S13, or the data of the inspection image generated in the inspection image generation processing of step S15, to an external device via the communication unit 10.
When the processing of step S17 has finished, or when the processing unit 14 determines that there is no output request (step S16: No), it determines whether the operation end timing has arrived (step S18). The processing unit 14 determines that the operation end timing has arrived, for example, when it determines that the power supply (not shown) of the information processing apparatus 1 has been turned off or that an operation-ending input has been made on the input unit 11.
When the processing unit 14 determines that the operation end timing has not arrived (step S18: No), the processing returns to step S10; when it determines that the operation end timing has arrived (step S18: Yes), the processing shown in FIG. 15 ends.
FIG. 16 is a flowchart showing an example of the parameter determination processing performed by the processing unit of the information processing apparatus according to Embodiment 1. As shown in FIG. 16, the processing unit 14 acquires, from the storage unit 13, a plurality of pieces of captured image data of the object subject to the parameter determination processing (step S20).
The processing unit 14 also acquires, from the storage unit 13, a plurality of parameter candidates Pc corresponding to the type of the object subject to the parameter determination processing or to the object itself (step S21). The processing unit 14 then corrects the luminance of the plurality of captured images represented by the plurality of pieces of captured image data using each parameter candidate Pc (step S22).
Next, the processing unit 14 stitches together, for each parameter candidate Pc, the plurality of captured images whose luminance was corrected in step S22, thereby generating a stitched image IC_Pc for each parameter candidate Pc (step S23).
Next, the processing unit 14 detects deformed portions from the stitched image IC_Pc for each parameter candidate Pc (step S24). The processing unit 14 then executes the deformation detection accuracy evaluation processing (step S25). The processing of step S25 corresponds to steps S30 to S32 shown in FIG. 17 and will be described in detail later.
Next, the processing unit 14 determines the parameter P to be used in the luminance correction processing, based on the per-candidate evaluation values VR_Pc or total evaluation values VT_Pc calculated in the deformation detection accuracy evaluation processing (step S26), and ends the processing shown in FIG. 16.
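Putting steps S20 to S26 together, the parameter determination loop can be sketched end to end as follows. The helper functions reuse the illustrative sketches above and are assumptions, not the embodiment's actual interfaces; the weights mixing VR and VV are example values (normalization of VV is omitted for brevity).

```python
def determine_parameter(images, candidates, detect, truth_mask):
    """Steps S20-S26: for each candidate Pc, correct, stitch, detect,
    evaluate, and finally pick the best candidate as the parameter P."""
    evaluations = {}
    for pc in candidates:                                  # S21-S22
        corrected = [gamma_correct(img, pc) for img in images]
        stitched = stitch(corrected, overlaps=[32, 32])    # S23
        pred_mask = detect(stitched)                       # S24 (learning model)
        vr = detection_accuracy(pred_mask, truth_mask)     # S25
        vv = visibility_score(stitched)
        evaluations[pc] = 0.8 * vr + 0.2 * vv              # example weights
    return choose_parameter(evaluations)                   # S26
```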
FIG. 17 is a flowchart showing an example of the deformation detection accuracy evaluation processing performed by the processing unit of the information processing apparatus according to Embodiment 1. As shown in FIG. 17, the processing unit 14 determines whether the luminance correction processing is set to the manual mode (step S30).
When the processing unit 14 determines that the luminance correction processing is set to the manual mode (step S30: Yes), it executes the manual mode processing (step S31). The processing of step S31 corresponds to steps S40 to S43 shown in FIG. 18 and will be described in detail later.
When the processing unit 14 determines that the luminance correction processing is set to the automatic mode rather than the manual mode (step S30: No), it executes the automatic mode processing (step S32). The processing of step S32 corresponds to steps S50 to S56 shown in FIG. 19 and will be described in detail later.
When the processing of step S31 or step S32 has finished, the processing unit 14 ends the processing shown in FIG. 17.
FIG. 18 is a flowchart showing an example of the manual mode processing performed by the processing unit of the information processing apparatus according to Embodiment 1. As shown in FIG. 18, the processing unit 14 accepts the true values of deformed portions from the user (step S40).
Next, the processing unit 14 compares, pixel by pixel and for each parameter candidate Pc, the detection results of step S24 with the true values accepted in step S40, and calculates the detection accuracy evaluation value VR_Pc for each parameter candidate Pc (step S41).
Next, the processing unit 14 calculates the visibility evaluation value VV_Pc of the stitched image IC_Pc for each parameter candidate Pc (step S42). The processing unit 14 then calculates the total evaluation value VT_Pc for each parameter candidate Pc based on the detection accuracy evaluation value VR_Pc and the visibility evaluation value VV_Pc (step S43), and ends the processing shown in FIG. 18.
FIG. 19 is a flowchart showing an example of the automatic mode processing performed by the processing unit of the information processing apparatus according to Embodiment 1. As shown in FIG. 19, the processing unit 14 acquires from the storage unit 13 the same plurality of pieces of captured image data as those acquired in step S20 (step S50).
Next, the processing unit 14 generates a stitched image by stitching together, without luminance correction, the plurality of captured images represented by the plurality of pieces of captured image data acquired in step S50 (step S51). The processing unit 14 then detects deformed portions from the stitched image generated in step S51 (step S52).
Next, the processing unit 14 acquires the true value information from the detection results for the deformed portions obtained in step S52 (step S53). The processing unit 14 compares, pixel by pixel and for each parameter candidate Pc, the detection results of step S24 with the true values acquired in step S53, and calculates the detection accuracy evaluation value VR_Pc for each parameter candidate Pc (step S54).
Next, the processing unit 14 calculates the visibility evaluation value VV_Pc of the stitched image IC_Pc for each parameter candidate Pc (step S55). The processing unit 14 then calculates the total evaluation value VT_Pc for each parameter candidate Pc based on the detection accuracy evaluation value VR_Pc and the visibility evaluation value VV_Pc (step S56), and ends the processing shown in FIG. 19.
FIG. 20 is a flowchart showing an example of the inspection image generation processing performed by the processing unit of the information processing apparatus according to Embodiment 1. As shown in FIG. 20, the processing unit 14 acquires, from the storage unit 13, a plurality of pieces of captured image data of the object subject to the inspection image generation processing (step S60).
Next, the processing unit 14 acquires, from the storage unit 13, the parameter P corresponding to the type of the object subject to the inspection image generation processing or to the object itself (step S61). The processing unit 14 corrects, using the parameter P acquired in step S61, the luminance of the plurality of captured images represented by the plurality of pieces of captured image data acquired in step S60 (step S62).
Next, the processing unit 14 generates a stitched image IC_P by stitching together the plurality of captured images whose luminance has been corrected (step S63). The processing unit 14 then detects deformed portions from the stitched image IC_P generated in step S63 (step S64).
Next, the processing unit 14 generates an inspection image in which the information on the deformed portions detected in step S64 is superimposed on the stitched image IC_P generated in step S63 (step S65), and ends the processing shown in FIG. 20.
FIG. 21 is a diagram showing an example of the hardware configuration of the information processing apparatus according to Embodiment 1. As shown in FIG. 21, the information processing apparatus 1 includes a computer having a processor 101, a memory 102, a communication device 103, an input device 104, a display device 105, and a bus 106.
The processor 101, the memory 102, the communication device 103, the input device 104, and the display device 105 can exchange information with one another via, for example, the bus 106. The storage unit 13 is realized by the memory 102. The communication unit 10 is realized by the communication device 103. The input unit 11 is realized by the input device 104. The display unit 12 is realized by the display device 105.
The processor 101 reads an information processing program from a recording medium set in a recording medium drive and installs the read information processing program in the memory 102. The recording medium drive is, for example, a CD (Compact Disc)-ROM drive, a DVD (Digital Versatile Disc)-ROM drive, or a USB drive, and the recording medium is, for example, a CD-ROM, a DVD-ROM, or a nonvolatile semiconductor memory.
The processor 101 executes the functions of the processing unit 14 by reading and executing the program stored in the memory 102. The processor 101 is an example of a processing circuit and includes one or more of a CPU (Central Processing Unit), a DSP (Digital Signal Processor), and a system LSI (Large Scale Integration).
The memory 102 includes one or more of a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), and an EEPROM (registered trademark) (Electrically Erasable Programmable Read Only Memory). The information processing apparatus 1 may also include integrated circuits such as an ASIC (Application Specific Integrated Circuit) and an FPGA (Field Programmable Gate Array).
The information processing apparatus 1 may be configured as a client device, as a server device, or as a combination of a client device and a server device. When the information processing apparatus 1 is composed of two or more devices, each of the devices has, for example, the hardware configuration shown in FIG. 21, and communication between the devices is performed via the communication device 103. The information processing apparatus 1 may also be composed of two or more server devices; for example, it may be composed of a processing server and a data server.
The parameter determination unit 35 can also determine the parameter P individually for each of the plurality of captured images to be stitched together. In this case, the luminance correction unit 32 corrects the luminance of the captured images to be stitched together using a different combination of parameter candidates Pc for each image.
As described above, the information processing apparatus 1 according to Embodiment 1 includes the data acquisition unit 31, the luminance correction unit 32, the image stitching unit 33, the deformation detection unit 34, and the parameter determination unit 35. The data acquisition unit 31 acquires data of a plurality of images obtained by imaging an object with a plurality of imaging devices 71a, 71b, and 71c. The luminance correction unit 32 corrects the luminance of each of the plurality of images. The image stitching unit 33 generates a stitched image IC_Pc, which is an image obtained by stitching together the plurality of images whose luminance has been corrected by the luminance correction unit 32. The deformation detection unit 34 detects deformed portions of the object using a learning model that detects deformed portions of the object from the stitched image IC_Pc. The parameter determination unit 35 determines the parameter P used for luminance correction based on the accuracy with which the deformation detection unit 34 detects the deformed portions. The information processing apparatus 1 can thereby appropriately determine a parameter P capable of correcting the luminance while suppressing a decrease in the detection accuracy of deformed portions of the object.
The luminance correction unit 32 also corrects the luminance of each of the plurality of images using each of a plurality of parameter candidates Pc. The image stitching unit 33 generates, for each parameter candidate Pc, a stitched image IC_Pc obtained by stitching together the plurality of images whose luminance was corrected with that parameter candidate Pc. The deformation detection unit 34 detects deformed portions of the object for each parameter candidate Pc from the per-candidate stitched image IC_Pc. The parameter determination unit 35 determines, as the parameter P, a parameter candidate Pc selected from the plurality of parameter candidates Pc based on the per-candidate detection accuracy of the deformation detection unit 34. The information processing apparatus 1 can thereby more appropriately determine a parameter P capable of correcting the luminance while suppressing a decrease in the detection accuracy of deformed portions of the object.
The parameter determination unit 35 also includes the evaluation unit 41 and the determination unit 42. The evaluation unit 41 calculates, for each parameter candidate Pc, the evaluation value VR_Pc of the accuracy with which the deformation detection unit 34 detects the deformed portions. The determination unit 42 determines the parameter candidate Pc to be used as the parameter P from among the plurality of parameter candidates Pc, based on the evaluation values VR_Pc calculated by the evaluation unit 41. The information processing apparatus 1 can thereby more appropriately determine a parameter P capable of correcting the luminance while suppressing a decrease in the detection accuracy of deformed portions of the object.
The evaluation unit 41 further calculates, for each parameter candidate Pc, the visibility evaluation value VV_Pc of the stitched image IC_Pc, and calculates, for each parameter candidate Pc, a total evaluation value VT_Pc based on the detection accuracy evaluation value VR_Pc and the visibility evaluation value VV_Pc. The determination unit 42 determines the parameter candidate Pc to be used as the parameter P from among the plurality of parameter candidates Pc, based on the total evaluation values VT_Pc calculated by the evaluation unit 41. The information processing apparatus 1 can thereby more appropriately determine a parameter P capable of correcting the luminance while suppressing a decrease in the detection accuracy of deformed portions of the object.
The luminance correction unit 32 also corrects the luminance of each of the plurality of images obtained by imaging the object with the plurality of imaging devices 71a, 71b, and 71c, using the parameter P determined by the parameter determination unit 35. The image stitching unit 33 generates a stitched image IC_P by stitching together the plurality of images whose luminance has been corrected with the parameter P by the luminance correction unit 32. The deformation detection unit 34 detects deformed portions of the object from the stitched image IC_P whose luminance has been corrected with the parameter P. The information processing apparatus 1 can thereby correct the luminance while suppressing a decrease in the detection accuracy of deformed portions of the object.
The configurations described in the above embodiment are examples; they can be combined with other known techniques, and parts of the configurations can be omitted or modified without departing from the gist of the disclosure.
1 information processing apparatus; 2 measurement vehicle; 3 network; 4 tunnel; 4a inner wall surface; 5 guardrail; 10, 74 communication unit; 11 input unit; 12 display unit; 13 storage unit; 14, 73 processing unit; 20 captured image data storage unit; 21 parameter candidate storage unit; 22 parameter storage unit; 30 input receiving unit; 31 data acquisition unit; 32 luminance correction unit; 33 image stitching unit; 34 deformation detection unit; 35 parameter determination unit; 36 display processing unit; 37 inspection image generation unit; 38 data output unit; 40 true value information acquisition unit; 41 evaluation unit; 42 determination unit; 60 vehicle body; 70 measurement device; 71, 71a, 71b, 71c imaging device; 72 position/attitude/speed detection unit; 100 measurement processing system; 101 processor; 102 memory; 103 communication device; 104 input device; 105 display device; 106 bus.

Claims (7)

1.  An information processing apparatus comprising:
    a data acquisition unit that acquires data of a plurality of images obtained by imaging an object with a plurality of imaging devices;
    a luminance correction unit that corrects the luminance of each of the plurality of images;
    an image stitching unit that generates a stitched image that is an image obtained by stitching together the plurality of images whose luminance has been corrected by the luminance correction unit;
    a deformation detection unit that detects a deformed portion of the object using a learning model that detects the deformed portion of the object from the stitched image; and
    a parameter determination unit that determines a parameter used for correcting the luminance based on the accuracy of detection of the deformed portion by the deformation detection unit.
2.  The information processing apparatus according to claim 1, wherein
    the luminance correction unit corrects the luminance of each of the plurality of images using each of a plurality of parameter candidates,
    the image stitching unit generates, for each of the parameter candidates, a stitched image obtained by stitching together the plurality of images whose luminance has been corrected with that parameter candidate by the luminance correction unit,
    the deformation detection unit detects a deformed portion of the object for each of the parameter candidates from the stitched image for that parameter candidate, and
    the parameter determination unit determines, as the parameter, a parameter candidate selected from among the plurality of parameter candidates based on the accuracy of detection of the deformed portion for each of the parameter candidates by the deformation detection unit.
  3.  The information processing device according to claim 2, wherein the parameter determination unit comprises:
     an evaluation unit that calculates, for each of the parameter candidates, an evaluation value of the accuracy with which the deformation detection unit detects the deformed portion; and
     a determination unit that determines, from among the plurality of parameter candidates, the parameter candidate to be used as the parameter based on the evaluation values calculated by the evaluation unit.
  4.  The information processing device according to claim 3, wherein
     the evaluation unit further calculates, for each of the parameter candidates, an evaluation value of the visibility of the stitched image, and calculates, for each of the parameter candidates, a comprehensive evaluation value based on the evaluation value of the detection accuracy and the evaluation value of the visibility, and
     the determination unit determines, from among the plurality of parameter candidates, the parameter candidate to be used as the parameter based on the comprehensive evaluation values calculated by the evaluation unit.
  5.  The information processing device according to any one of claims 1 to 4, wherein
     the luminance correction unit corrects the luminance of each of a plurality of images obtained by imaging the object with the plurality of imaging devices using the parameter determined by the parameter determination unit,
     the image stitching unit generates a stitched image obtained by stitching together the plurality of images whose luminance has been corrected with the parameter by the luminance correction unit, and
     the deformation detection unit detects a deformed portion of the object from the stitched image whose luminance has been corrected with the parameter.
  6.  An information processing method executed by a computer, the method comprising:
     a data acquisition step of acquiring data of a plurality of images obtained by imaging an object with a plurality of imaging devices;
     a luminance correction step of correcting the luminance of each of the plurality of images;
     an image stitching step of generating a stitched image, which is an image obtained by stitching together the plurality of images whose luminance has been corrected in the luminance correction step;
     a deformation detection step of detecting a deformed portion of the object using a learning model that detects the deformed portion of the object from the stitched image; and
     a parameter determination step of determining a parameter used for the correction of the luminance based on the accuracy with which the deformed portion is detected in the deformation detection step.
  7.  An information processing program that causes a computer to execute:
     a data acquisition step of acquiring data of a plurality of images obtained by imaging an object with a plurality of imaging devices;
     a luminance correction step of correcting the luminance of each of the plurality of images;
     an image stitching step of generating a stitched image, which is an image obtained by stitching together the plurality of images whose luminance has been corrected in the luminance correction step;
     a deformation detection step of detecting a deformed portion of the object using a learning model that detects the deformed portion of the object from the stitched image; and
     a parameter determination step of determining a parameter used for the correction of the luminance based on the accuracy with which the deformed portion is detected in the deformation detection step.
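For illustration of claims 2 and 3 only, the sketch below loops over the parameter candidates, scores the detections from each candidate's stitched image against ground-truth ("true value") deformation locations, and keeps the best-scoring candidate. The choice of an F1 score as the evaluation value, and every function name here, are assumptions made for the example; it reuses the inspect() sketch shown earlier.

    def f1_score(detected: set, truth: set) -> float:
        # Assumed evaluation value: F1 of detected vs. ground-truth locations.
        if not detected or not truth:
            return 1.0 if detected == truth else 0.0
        tp = len(detected & truth)
        precision = tp / len(detected)
        recall = tp / len(truth)
        return 0.0 if tp == 0 else 2 * precision * recall / (precision + recall)

    def select_parameter(images: list, candidates: list, truth: set):
        # Correct/stitch/detect once per candidate, then keep the candidate
        # whose stitched image yields the most accurate detections.
        best_p, best_score = None, -1.0
        for p in candidates:
            score = f1_score(inspect(images, p), truth)
            if score > best_score:
                best_p, best_score = p, score
        return best_p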
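Similarly, the comprehensive evaluation value of claim 4 could be read as a weighted combination of the detection-accuracy score and a visibility score of the stitched image. Both the contrast-based visibility metric and the weight below are assumptions, not values given in the claims; this continues the earlier sketch (numpy is imported there).

    def visibility_score(stitched: np.ndarray) -> float:
        # Assumed visibility metric: global contrast, normalized to roughly [0, 1].
        return min(float(np.std(stitched)) / 128.0, 1.0)

    def comprehensive_value(accuracy: float, visibility: float, w: float = 0.7) -> float:
        # Assumed aggregation: weighted sum of the two evaluation values.
        return w * accuracy + (1.0 - w) * visibility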
PCT/JP2021/041367 2021-11-10 2021-11-10 Information processing device, information processing method, and information processing program WO2023084644A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2022510220A JP7066082B1 (en) 2021-11-10 2021-11-10 Information processing equipment, information processing methods, and information processing programs
PCT/JP2021/041367 WO2023084644A1 (en) 2021-11-10 2021-11-10 Information processing device, information processing method, and information processing program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/041367 WO2023084644A1 (en) 2021-11-10 2021-11-10 Information processing device, information processing method, and information processing program

Publications (1)

Publication Number Publication Date
WO2023084644A1 true WO2023084644A1 (en) 2023-05-19

Family

ID=81584938

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/041367 WO2023084644A1 (en) 2021-11-10 2021-11-10 Information processing device, information processing method, and information processing program

Country Status (2)

Country Link
JP (1) JP7066082B1 (en)
WO (1) WO2023084644A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11242737A (en) * 1998-02-26 1999-09-07 Ricoh Co Ltd Method for processing picture and device therefor and information recording medium
JP2002310920A (en) * 2001-04-19 2002-10-23 Keisoku Kensa Kk Method for detecting crack in concrete wall and device thereof
JP2010038881A (en) * 2008-08-08 2010-02-18 Nikon Corp Image processing apparatus, image processing method, and program
JP2010081090A (en) * 2008-09-24 2010-04-08 Olympus Corp Image processing device and image processing program
JP2017049311A (en) * 2015-08-31 2017-03-09 沖電気工業株式会社 Information processing apparatus, information processing method, and program


Also Published As

Publication number Publication date
JPWO2023084644A1 (en) 2023-05-19
JP7066082B1 (en) 2022-05-12

Similar Documents

Publication Publication Date Title
KR101214806B1 (en) Apparatus and method for defect inspection of wafer
JP5549364B2 (en) Wafer defect inspection apparatus and wafer defect inspection method
JP5556346B2 (en) Wafer defect inspection apparatus and wafer defect inspection method
CN106920245B (en) Boundary detection method and device
CN103868471B (en) 3 d shape measuring apparatus and its control method
CN103916599B (en) A kind of quick focus adjustment method for the imaging of remote sensing camera outdoor scene
RU2012145612A (en) IMAGE PROCESSING DEVICE AND METHOD FOR MANAGING THE IMAGE PROCESSING DEVICE
JP2012002531A (en) Crack detection method
JP6875836B2 (en) Wire rope measuring device and method
KR100888235B1 (en) Image processing device and method
WO2023084644A1 (en) Information processing device, information processing method, and information processing program
JP2016015536A5 (en)
JP2020109924A5 (en)
JP6199799B2 (en) Self-luminous material image processing apparatus and self-luminous material image processing method
JP2009250937A (en) Pattern inspection device and method
KR101775272B1 (en) Depth refinement method and system of sparse depth images in multi aperture camera
JP5157575B2 (en) Defect detection method
KR101793091B1 (en) Method and apparatus for detecting defective pixels
JP5087165B1 (en) Adjustment device for outputting data for adjusting surface inspection device, adjustment data output method and program
JP4380371B2 (en) Method for extracting bump image of mounting substrate such as semiconductor device, method for inspecting bump image, program for the method, recording medium recording the program, and device for inspecting bump
JP4484041B2 (en) Edge position detection device
JP2010206244A (en) Image degradation evaluation system, image degradation evaluation method, and image degradation evaluation program
JP4889018B2 (en) Appearance inspection method
KR101710110B1 (en) Image interpretation method for non-destructive inspection apparatus
TWI592544B (en) Pavement marking degradation judgment method

Legal Events

Date Code Title Description
WWE  WIPO information: entry into national phase; Ref document number: 2022510220; Country of ref document: JP
121  EP: the EPO has been informed by WIPO that EP was designated in this application; Ref document number: 21964003; Country of ref document: EP; Kind code of ref document: A1