WO2024084873A1 - Photographing method and image processing method - Google Patents

Photographing method and image processing method

Info

Publication number
WO2024084873A1
Authority
WO
WIPO (PCT)
Prior art keywords
photographing
slope
image
target area
distance
Prior art date
Application number
PCT/JP2023/033795
Other languages
French (fr)
Japanese (ja)
Inventor
英臣 佐久間
響 辰野
泰宏 秋田
公平 牛尾
Original Assignee
株式会社リコー
英臣 佐久間
響 辰野
泰宏 秋田
公平 牛尾
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社リコー, 英臣 佐久間, 響 辰野, 泰宏 秋田, 公平 牛尾 filed Critical 株式会社リコー
Publication of WO2024084873A1 publication Critical patent/WO2024084873A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 Systems specially adapted for particular applications
    • G01N 21/88 Investigating the presence of flaws or contamination
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 15/00 Special procedures for taking photographs; Apparatus therefor
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 37/00 Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • G03B 37/04 Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe with cameras or projectors providing touching or overlapping fields of view
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 5/00 Adjustment of optical system relative to image or object surface other than for focusing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/695 Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/222 Studio circuitry; Studio devices; Studio equipment

Definitions

  • the present invention relates to a photographing method and an information processing method.
  • Patent Document 1 describes a slope inspection method in which the condition of the slope is checked by visual observation, and the condition and location of any abnormalities found are recorded.
  • a vehicle traveling on the flat surface at the top or to the side of the slope uses multiple cameras with different focal lengths to divide the slope in the height direction and continuously photograph it with an almost uniform resolution, while a position measurement means mounted on the vehicle records the photographing positions.
  • This slope inspection method also creates three-dimensional measurement data of the slope through photogrammetry using the continuously photographed images and the photographing positions and orientations.
  • This slope inspection method also uses the three-dimensional measurement data to externally observe the condition of the slope based on a direct projection image that has been projected in a direction directly facing the slope, and if any abnormality is found, identifies its condition and position.
  • However, there is room for improvement in Patent Document 1 in terms of obtaining an image that is in focus over a wide area of the slope.
  • the present invention has been made in consideration of the above points, and aims to provide an imaging method and information processing method that can obtain an image that is in focus over a wide area of an object (e.g., a slope).
  • the photographing method of this embodiment is a photographing method in which an object is photographed while moving by a photographing device installed on a moving body, and the photographing device photographs a photographing target area of the object located in a direction intersecting with the moving direction, and includes an installation position setting step for setting the installation position of the photographing device on the moving body, and using an XYZ Cartesian coordinate system in which the moving direction of the moving body is the X axis, the direction intersecting the moving direction is the Z axis, and the direction perpendicular to the X axis and the Z axis is the Y axis, the length from the position where the photographing reference position of the photographing device installed on the moving body is projected onto the XZ plane on which the moving body moves to the position on the XZ plane where the XZ plane and the object intersect is defined as the distance from the photographing reference position to the object, and the length on the Y axis from the XZ plane to the photographing target area of the object is defined as the height of the photographing target area.
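The geometric definitions in the claim above can be illustrated with a short sketch. This is a minimal illustration only: the function name and the use of (x, y, z) tuples are assumptions for the example, not part of the patent. It shows how the "distance to the object" is measured within the XZ plane on which the moving body travels, while the "height" is taken along the Y axis.

```python
import math

def distance_and_height(ref_pos, target_pos):
    """Compute the claim's 'distance to the object' and 'height of the
    photographing target area'. Positions are (x, y, z) tuples in the
    coordinate system above (X: travel direction, Y: height, Z: depth
    toward the slope). Names are illustrative, not from the patent."""
    rx, ry, rz = ref_pos
    tx, ty, tz = target_pos
    # Project the photographing reference position onto the XZ plane
    # (y = 0) and measure the in-plane length to the foot of the target.
    distance = math.hypot(tx - rx, tz - rz)
    # Height of the photographing target area above the XZ plane.
    height = ty
    return distance, height

# e.g. camera reference 1.5 m above the road, target point 10 m up the
# slope and 5 m away in the depth direction:
d, h = distance_and_height((0.0, 1.5, 0.0), (0.0, 10.0, 5.0))
```

Note that the camera's own Y coordinate does not enter either quantity: both are defined with respect to the XZ plane, matching the claim language.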
  • the present invention provides an imaging method and information processing method that can obtain an image that is in focus over a wide area of an object (e.g., a slope).
  • FIG. 1 is a diagram showing an example of the overall configuration of a state inspection system according to the present embodiment.
  • FIG. 2 is a diagram showing an example of how a slope condition is inspected using the condition inspection system.
  • FIG. 3 is a diagram showing an example of a state of a slope.
  • FIG. 4 is a diagram illustrating an example of a hardware configuration of the data acquisition device.
  • FIG. 5 is a diagram for explaining a photographed image acquired by the mobile system.
  • FIG. 6 is an explanatory diagram of a photographed image and a distance measurement image.
  • FIG. 7 is an explanatory diagram of a plurality of shooting regions.
  • FIG. 8 is a diagram showing an example of the configuration of a moving body equipped with the imaging device of this embodiment.
  • FIG. 9 is a diagram illustrating an example of the configuration of the variable installation position photographing unit.
  • FIG. 10 is a diagram showing an example of adjustment of the installation position of the installation position variable imaging unit.
  • FIG. 11 is a first conceptual diagram showing an example of the allowable deviation from the theoretical value of the Scheimpflug angle.
  • FIG. 12 is a second conceptual diagram showing an example of the allowable deviation from the theoretical value of the Scheimpflug angle.
  • FIG. 13 is a third conceptual diagram showing an example of the allowable deviation from the theoretical value of the Scheimpflug angle.
  • FIG. 14 is a sequence diagram showing an example of a display process in the state inspection system.
  • FIG. 15 is an explanatory diagram of operations on the display screen of the status inspection system.
  • FIG. 16 is a flowchart showing an example of the process of the information processing method.
  • FIG. 17 is a diagram showing an example of a display screen after the processing shown in FIG. 16.
  • the imaging method and information processing method of this embodiment will be described in detail below with reference to the drawings.
  • the X-axis, Y-axis, and Z-axis directions are based on the arrow directions shown in the drawings.
  • the X-axis, Y-axis, and Z-axis directions define a three-dimensional space that is orthogonal to each other.
  • the X-axis direction may be interpreted as the moving direction of the moving body (direction of progress, direction of travel).
  • the Y-axis direction may be interpreted as the height direction of the slope of the object (vertical direction, perpendicular direction).
  • the Z-axis direction may be interpreted as the direction intersecting with the moving direction of the moving body (depth direction from the moving body toward the slope, imaging direction).
  • the imaging method and information processing device of this embodiment use an XYZ Cartesian coordinate system in which the moving direction of the moving body is the X-axis, the direction intersecting the moving direction is the Z-axis, and the direction perpendicular to the X-axis and Z-axis is the Y-axis.
  • FIG. 1 is a diagram showing an example of the overall configuration of a condition inspection system 1 for implementing the imaging method and information processing method of this embodiment.
  • the condition inspection system 1 is an example of an information processing system, and is a system for inspecting the condition of road earthwork structures using various data acquired by a mobile system 60.
  • Road earthwork structures are a general term for structures constructed to build roads using ground materials such as soil and rocks as their main materials, and structures associated with them, and refer to cut slopes, slope stabilization facilities, embankments, culverts, and the like.
  • road earthwork structures may be referred to as "objects," "slopes," and "objects including slopes."
  • the condition inspection system 1 is composed of a mobile system 60, an evaluation system 4, a terminal device 1100 of the national or local government, and a terminal device 1200 of a contractor.
  • the mobile system 60 is an example of an imaging system, and is composed of a data acquisition device 9 and a mobile body 6 such as a vehicle equipped with the data acquisition device 9.
  • the vehicle may be a vehicle that runs on a road or a vehicle that runs on a railroad.
  • the data acquisition device 9 has an imaging device 7, which is an example of a measuring device that measures a structure, as well as a distance sensor 8a and a GNSS (Global Navigation Satellite System) sensor 8b.
  • the imaging device 7 is a line camera equipped with a line sensor in which photoelectric conversion elements are arranged in one or more rows.
  • the imaging device 7 captures images of positions along a predetermined imaging range on an imaging surface aligned in the traveling direction (X-axis direction) of the moving body 6.
  • the imaging device 7 is not limited to a line camera, and may be a camera equipped with an area sensor in which photoelectric conversion elements are arranged in a planar manner.
  • the imaging device 7 may also be composed of multiple cameras. Details of the imaging device 7 composed of multiple cameras will be described later.
  • the distance sensor 8a is a ToF (Time of Flight) sensor that measures the distance to the subject photographed by the photographing device 7.
  • the GNSS sensor 8b is a positioning means that receives signals transmitted at each time by multiple GNSS satellites and calculates the distance to the satellite from the difference in the time at which each signal was received, thereby measuring a position on the earth.
  • the positioning means may be a device dedicated to positioning, or may be an application dedicated to positioning installed on a PC (Personal Computer), smartphone, etc.
  • the distance sensor 8a and the GNSS sensor 8b are examples of sensor devices.
  • the distance sensor 8a is also an example of a three-dimensional sensor.
  • the ToF sensor used as distance sensor 8a measures the distance from a light source to an object by irradiating the object with laser light from the light source and measuring the scattered and reflected light.
  • the distance sensor 8a is a LiDAR (Light Detection and Ranging) sensor.
  • LiDAR measures the time of flight of light using pulses; as another ToF approach, the distance may be measured using a phase difference detection method.
  • In the phase difference detection method, laser light amplitude-modulated at a fundamental frequency is irradiated onto the measurement range, the reflected light is received, and the phase difference between the irradiated light and the reflected light is measured to obtain the time of flight; the distance is then calculated by multiplying this time by the speed of light.
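The phase-difference calculation described above can be sketched as follows. This is a minimal sketch: the modulation frequency is an arbitrary example value, and the division by two to account for the round trip of the light is standard ToF practice rather than a detail stated in the text.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def phase_to_distance(phase_rad, mod_freq_hz):
    """Convert the measured phase difference between the emitted and
    reflected amplitude-modulated light into a one-way distance.
    time = phase / (2*pi*f); the light travels out and back, so the
    one-way distance is half of (speed of light * time)."""
    round_trip_time = phase_rad / (2.0 * math.pi * mod_freq_hz)
    return SPEED_OF_LIGHT * round_trip_time / 2.0

# e.g. a half-cycle phase shift at an assumed 10 MHz modulation
d = phase_to_distance(math.pi, 10e6)  # ~7.49 m
```

One consequence visible in the formula: the unambiguous range is limited to half the modulation wavelength, which is why practical ToF sensors choose the modulation frequency to match the intended measurement range.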
  • the distance sensor 8a may also be configured with a stereo camera, etc.
  • the mobile system 60 can obtain three-dimensional information that is difficult to obtain from two-dimensional images, such as the height, inclination angle, or overhang of a slope.
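As an illustration of deriving such three-dimensional quantities from range data, the inclination angle of a slope can be estimated from two measured points on its surface. The helper below is hypothetical and assumes the coordinate system described above (Y vertical, Z the depth direction toward the slope):

```python
import math

def slope_inclination_deg(base_point, top_point):
    """Estimate the inclination angle of a slope, in degrees from
    horizontal, from two 3D points (x, y, z) on its surface. Y is the
    height direction and Z the depth direction toward the slope; an
    angle over 90 degrees would indicate an overhang."""
    dy = top_point[1] - base_point[1]   # rise (height direction)
    dz = base_point[2] - top_point[2]   # run back toward the road
    return math.degrees(math.atan2(dy, dz))

# slope rising 10 m while leaning 2 m back from the road
angle = slope_inclination_deg((0.0, 0.0, 8.0), (0.0, 10.0, 6.0))
```

A vertical face gives exactly 90 degrees, and a top point farther from the road than the base (negative run) gives an angle above 90 degrees, i.e. an overhang, which is information a two-dimensional photograph alone cannot provide.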
  • the mobile system 60 may further include an angle sensor 8c.
  • the angle sensor 8c is a gyro sensor or the like for detecting the angle (attitude) or angular velocity (or angular acceleration) of the shooting direction of the image capture device 7.
  • the evaluation system 4 is constructed by an evaluation device 3 and a data management device 5.
  • the evaluation device 3 and data management device 5 constituting the evaluation system 4 can communicate with a mobile system 60, a terminal device 1100 and a terminal device 1200 via a communication network 100.
  • the communication network 100 is constructed by the Internet, a mobile communication network, a LAN (Local Area Network), etc.
  • the communication network 100 may include not only wired communication but also networks using wireless communication such as 3G (3rd Generation), 4G (4th Generation), 5G (5th Generation), Wi-Fi (Wireless Fidelity) (registered trademark), WiMAX (Worldwide Interoperability for Microwave Access), or LTE (Long Term Evolution).
  • the evaluation device 3 and the data management device 5 may have a communication function using a short-range communication technology such as NFC (Near Field Communication) (registered trademark).
  • the data management device 5 is an example of an information processing device, and is a computer such as a PC that manages various data acquired by the data acquisition device 9.
  • the data management device 5 receives various acquired data from the data acquisition device 9, and transfers the received various acquired data to the evaluation device 3 that performs data analysis.
  • the method of transferring the various acquired data from the data management device 5 to the evaluation device 3 may be manual transfer using a USB (Universal Serial Bus) memory or the like.
  • the evaluation device 3 is a computer such as a PC that evaluates the condition of the slope based on various acquired data transferred from the data management device 5.
  • a dedicated application program for evaluating the condition of the slope is installed in the evaluation device 3.
  • the evaluation device 3 detects the type or structure of the slope from the captured image data and sensor data, extracts shape data, and performs a detailed analysis by detecting the presence or absence of deformation and the degree of deformation.
  • the evaluation device 3 also generates a report to be submitted to a road administrator such as the country, local government, or a commissioned business operator, using the captured image data, sensor data, evaluation target data, and the detailed analysis results.
  • the data of the report generated by the evaluation device 3 is submitted to the country or local government via the commissioned business operator in the form of electronic data or printed paper.
  • the report generated by the evaluation device 3 is called an investigation record sheet, inspection sheet, investigation ledger, or report.
  • the evaluation device 3 is not limited to a PC, and may be a smartphone or tablet terminal.
  • the evaluation system 4 may be configured to construct the evaluation device 3 and the data management device 5 as a single device or terminal.
  • the terminal device 1200 is provided to the commissioned business operator, and the terminal device 1100 is provided to the national or local government.
  • the evaluation device 3, the terminal device 1100, and the terminal device 1200 are examples of communication terminals capable of communicating with the data management device 5, and various data managed by the data management device 5 can be viewed.
  • FIG. 2 is a diagram showing an example of how the condition of a slope is inspected using a condition inspection system 1 for implementing the imaging method and information processing method of this embodiment.
  • the mobile system 60 drives a mobile body 6 equipped with a data acquisition device 9 along a road while capturing images of a predetermined area of the slope (the slope of the target object) using an imaging device 7.
  • a cut slope is a slope formed by cutting away (excavating) the ground
  • a bank slope is a slope where the soil has been piled up (embanked).
  • the slope on the side of a road that runs along the side of a mountain is called a natural slope.
  • Cut slopes and bank slopes can be made more durable by planting vegetation on the slope surface, and may remain stable for decades; however, this is not always the case.
  • cut slopes, bank slopes, and natural slopes deteriorate due to wind and rain, surface collapses occur, causing rocks and soil to fall, or the mountain collapses, causing road closures.
  • Earthwork structures include retaining walls that are installed between natural slopes and roads, and rockfall protection fences that prevent rocks from falling onto the road, but both are intended to prevent road closures or human injury caused by soil or falling rocks flowing onto the road.
  • the condition inspection system 1 acquires photographed image data of the slope of an earthwork structure using the photographing device 7, and acquires sensor data including three-dimensional information using a three-dimensional sensor such as a distance sensor 8a.
  • the evaluation system 4 then combines the acquired photographed image data and sensor data to evaluate the condition of the slope, thereby detecting shape data indicating the three-dimensional shape of the slope and detecting abnormalities such as cracks and peeling. This allows the condition inspection system 1 to efficiently perform evaluations that are difficult to inspect with the human eye.
  • Figure 3 shows an example of the condition of a slope.
  • Figure 3(a) is an image showing the surface of the slope five years before the collapse
  • Figure 3(b) is an explanatory diagram of the image shown in Figure 3(a).
  • cracks in the surface layer of the slope are noticeable, and image analysis using an unfolded (developed) view or the like is effective in detecting changes or signs of change in the surface layer, such as cracks, peeling, and water seepage.
  • Figure 3(c) is an image showing the surface of the slope two years before the collapse
  • Figure 3(d) is an explanatory diagram of the image shown in Figure 3(c).
  • the interior of the slope has weathered into soil, the soil has pushed against the surface of the slope, and the slope has bulged.
  • in this state, three-dimensional analysis using images such as a developed view combined with cross-sections is effective.
  • Figure 3(e) is an image showing the surface of the collapsed slope
  • Figure 3(f) is an explanatory diagram of the image shown in Figure 3(e). In this state, the surface layer of the slope is unable to contain the soil and has collapsed.
  • FIG. 4 is a diagram showing an example of the hardware configuration of the data acquisition device 9.
  • the data acquisition device 9 includes the image capture device 7 and the sensor device 8 as shown in FIG. 1, as well as a controller 900 that controls the processing or operation of the data acquisition device 9.
  • the controller 900 includes an imaging device I/F (Interface) 901, a sensor device I/F 902, a bus line 910, a CPU (Central Processing Unit) 911, a ROM (Read Only Memory) 912, a RAM (Random Access Memory) 913, a HD (Hard Disk) 914, a HDD (Hard Disk Drive) controller 915, a network I/F 916, a DVD-RW (Digital Versatile Disk Rewritable) drive 918, a media I/F 922, an external device connection I/F 923, and a timer 924.
  • the imaging device I/F 901 is an interface for transmitting and receiving various data or information to and from the imaging device 7.
  • the sensor device I/F 902 is an interface for transmitting and receiving various data or information to and from the sensor device 8.
  • the bus line 910 is an address bus, data bus, etc. for electrically connecting each component such as the CPU 911 shown in FIG. 4.
  • the CPU 911 also controls the operation of the entire data acquisition device 9.
  • the ROM 912 stores programs used to drive the CPU 911, such as an IPL (Initial Program Loader).
  • the RAM 913 is used as a work area for the CPU 911.
  • the HD 914 stores various data such as programs.
  • the HDD controller 915 controls the reading and writing of various data from the HD 914 under the control of the CPU 911.
  • the network I/F 916 is an interface for data communication using the communication network 100.
  • the DVD-RW drive 918 controls the reading and writing of various data from and to a DVD-RW 917, which is an example of a removable recording medium.
  • the medium is not limited to a DVD-RW, and may be a DVD-R or a Blu-ray (registered trademark) Disc, etc.
  • the media I/F 922 controls the reading and writing (storing) of data from and to a recording medium 921 such as a flash memory.
  • the external device connection I/F 923 is an interface for connecting an external device such as an external PC 930 having a display, a reception unit, and a display control unit.
  • the timer 924 is a measurement device with a time measurement function.
  • the timer 924 may be a computer-based software timer. It is preferable that the timer 924 be synchronized with the time of the GNSS sensor 8b. This makes it easy to synchronize the time and associate the positions of each sensor data and captured image data.
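The time-based association described here, where each captured image is linked to the sensor sample nearest in time, might look like the following sketch. The data layout (timestamp, payload) tuples is a hypothetical simplification, not a format from the patent.

```python
def associate_by_time(images, gnss_fixes):
    """Pair each captured image with the GNSS fix whose timestamp is
    closest. Both arguments are lists of (timestamp, payload) tuples;
    a real system would also bound the allowed time gap and may
    interpolate between fixes."""
    pairs = []
    for t_img, img in images:
        t_fix, fix = min(gnss_fixes, key=lambda f: abs(f[0] - t_img))
        pairs.append((img, fix))
    return pairs

images = [(10.02, "image1"), (10.52, "image2")]
fixes = [(10.0, (35.68, 139.76)), (10.5, (35.69, 139.77))]
pairs = associate_by_time(images, fixes)
```

This nearest-neighbour pairing only works well when, as the text recommends, the timer is synchronized with the GNSS time so that the two timestamp streams share a common clock.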
  • Figure 5 is a diagram explaining the captured images acquired by the mobile system.
  • the mobile system 60 uses the imaging device 7 provided in the data acquisition device 9 to capture images of slopes along the road while the mobile body 6 is traveling.
  • the X-axis direction shown in FIG. 5 indicates the direction of movement of the mobile body 6, the Y-axis direction is the vertical direction, and the Z-axis direction is perpendicular to the X-axis and Y-axis directions and indicates the depth direction from the mobile body 6 toward the slope.
  • the data acquisition device 9 acquires photographed image 1, distance-measured image 1, photographed image 2, and distance-measured image 2 in chronological order, as shown in FIG. 5.
  • Distance-measured image 1 and distance-measured image 2 are images acquired by distance sensor 8a.
  • the photographing device 7 and sensor device 8 are time-synchronized, so photographed image 1 and distance-measured image 1, and photographed image 2 and distance-measured image 2 are images of the same area of the slope.
  • tilt correction (image correction) of the photographed image is performed based on the attitude of the vehicle at the time of shooting, and the image data and positioning data (north latitude and east longitude) are linked based on the time of the photographed image.
  • the mobile system 60 acquires photographed image data of the slope and sensor data acquired in response to photographing by the photographing device 7 while driving the vehicle as the mobile body 6, and uploads them to the data management device 5.
  • the data acquisition device 9 may acquire the ranging images and the photographed images on separate runs, but considering changes in the slope shape due to collapses and the like, it is preferable to acquire the ranging images and photographed images of the same slope shape during the same run.
  • Figure 6 is an explanatory diagram of the captured image and the distance measurement image.
  • FIG. 6(a) shows captured image data 7A of captured images 1, 2, etc. shown in FIG. 5.
  • Each pixel 7A1 of the captured image data 7A acquired by the photographing device 7 is arranged at coordinates corresponding to the X-axis and Y-axis directions shown in FIG. 5, and has brightness information corresponding to the amount of stored charge.
  • the captured image data 7A is an example of a brightness image.
  • the brightness information of each pixel 7A1 of the captured image data 7A is stored in the storage unit as captured image data in association with coordinates corresponding to the X-axis and Y-axis directions shown in FIG. 5.
  • FIG. 6(b) shows distance measurement image data 8A such as distance measurement images 1 and 2 shown in FIG. 5.
  • Each pixel 8A1 of distance measurement image data 8A acquired by distance sensor 8a is arranged at coordinates corresponding to the X-axis and Y-axis directions shown in FIG. 5, and has distance information in the Z-axis direction shown in FIG. 5 corresponding to the amount of stored charge.
  • distance measurement image data 8A is three-dimensional point cloud data, but is generally referred to as distance measurement image data because it is visually displayed with luminance information added when viewed by a user.
  • the captured image data 7A and distance measurement image data 8A are collectively referred to as image data.
  • the distance information for each pixel 8A1 in the distance measurement image data 8A is then stored in the storage unit as three-dimensional data included in the sensor data, in association with coordinates corresponding to the X-axis and Y-axis directions shown in FIG. 5.
  • the captured image data 7A shown in FIG. 6(a) and the distance measurement image data 8A shown in FIG. 6(b) are images of the same area of the slope, so the brightness information and distance information are stored in the memory unit in association with the coordinates corresponding to the X-axis and Y-axis directions shown in the figure.
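The per-pixel association of brightness and distance information described above can be sketched as follows. The record layout is an assumption for illustration; the patent only states that both kinds of information are stored against shared (X, Y) coordinates.

```python
def fuse_pixel_data(brightness, distance):
    """Combine a brightness image and a distance-measurement image of
    the same slope area into per-pixel records keyed by (x, y)
    coordinates. Both inputs are 2D lists of equal dimensions."""
    fused = {}
    for y, (b_row, d_row) in enumerate(zip(brightness, distance)):
        for x, (b, d) in enumerate(zip(b_row, d_row)):
            fused[(x, y)] = {"brightness": b, "distance": d}
    return fused

# one image row: two pixels with brightness values and distances in m
fused = fuse_pixel_data([[120, 130]], [[5.2, 5.3]])
```

Because the photographing device and the distance sensor are time-synchronized and image the same area, this coordinate-keyed fusion is what lets the evaluation system reason about brightness anomalies (cracks, peeling) together with shape anomalies (bulging) at the same location.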
  • FIG. 7 is an explanatory diagram of multiple shooting areas. As shown in FIG. 7(a), the shooting device 7 moves together with the moving body 6 and captures images of the target area 70, which is the shooting range of the slope 80, divided into multiple shooting areas d11, d12, etc., at a constant shooting interval t along the X-axis direction, which is the moving direction of the moving body 6.
  • the captured images of multiple shooting areas d11, d12, etc. are long slit-shaped images in the Y-axis direction, and by stitching together the images of multiple shooting areas d11, d12, etc., it is possible to obtain a captured image of the target area 70 that is continuous in the X-axis direction.
  • FIG. 7(c) shows a case where the entire target area 70 is divided into a plurality of target areas for imaging.
  • here, the entire target area 70 is imaged by dividing it into four target areas, namely, target areas 701A, 702A, 701B, and 702B.
  • each of the multiple target areas 701A, 702A, 701B, and 702B is divided into multiple shooting areas d11, d12, etc., and images of the multiple shooting areas d11, d12, etc. are stitched together to obtain images of each of the multiple target areas 701A, 702A, 701B, and 702B. Then, by stitching together images of the multiple target areas 701A, 702A, 701B, and 702B, an image of the entire target area 70 can be obtained.
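The stitching of slit-shaped images along the X-axis direction can be sketched as follows. Equal slit heights and zero overlap are simplifying assumptions for the example; a real system would align and blend the touching or overlapping regions.

```python
def stitch_slits(slit_images):
    """Stitch slit-shaped images side by side along the X axis, as the
    shooting areas d11, d12, ... are stitched above. Each slit is a 2D
    list: rows run in the Y direction, and each row is one or a few
    pixels wide in the X direction. All slits share the same height."""
    stitched = []
    for rows in zip(*slit_images):      # take the same Y row of every slit
        combined = []
        for row in rows:
            combined.extend(row)        # append that slit's columns along X
        stitched.append(combined)
    return stitched

# two slits, each 2 rows high and 1 pixel wide, become a 2x2 image
img = stitch_slits([[[1], [2]], [[3], [4]]])
```

The same concatenation idea applies one level up: the stitched images of target areas 701A, 702A, 701B, and 702B are themselves joined to reconstruct the whole target area 70.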
  • the image capture device 7 is equipped with multiple image capture devices, and the target areas 702A and 702B are captured by an image capture device different from the image capture device that captures the target areas 701A and 701B.
  • the target area 701B may be photographed by the same imaging device that photographed the target area 701A, and the target area 702B may also be photographed by the same imaging device that photographed the target area 702A.
  • the distance sensor 8a when the image capturing device 7 captures the target area of the slope 80 divided into a plurality of image capturing areas d11, d12, etc., it is desirable that the distance sensor 8a also acquires distance information indicating the distance from the distance sensor 8a to each of the plurality of image capturing areas d11, d12, etc.
  • the condition inspection system 1 of this embodiment photographs the slope while a mobile body (vehicle) equipped with an imaging device (camera) travels along a road beside the slope.
  • multiple imaging devices (cameras) are prepared, and the slope is divided and photographed in the traveling direction (X-axis direction) of the mobile body (vehicle) by each imaging device (camera).
  • the imaging timing of each imaging device (camera) is synchronized to divide and photograph the area in the height direction (Y-axis direction) of the slope.
  • the slope is sliced and photographed in the traveling direction (X-axis direction) of the mobile body (vehicle), and each slice image is stitched together after shooting to obtain a surface image of the entire slope at a specified resolution (e.g. 4K).
  • FIG. 8 is a diagram showing an example of the configuration of a moving body equipped with an image capture device of this embodiment.
  • a moving body (vehicle) 6 moving in the X-axis direction is equipped with multiple image capture devices (cameras) 7A, 7B, 7C, 7D, and 7E at different positions in the X-axis direction.
  • the image capture devices 7A, 7B, 7C, 7D, and 7E are arranged in this order from the front to the rear of the moving body 6 in the direction of travel.
  • the image capture devices 7A, 7B, 7C, 7D, and 7E may also be called cameras 1, 2, 3, 4, and 5.
  • a distance sensor 8a (e.g., LiDAR) is mounted on the rear side of the image capture device 7E. Note that the arrangement of the image capture devices 7A to 7E and the distance sensor 8a is not limited to that shown in FIG. 8, and various design changes are possible.
  • because the slope (an object including a slope) does not fit within the angle of view of a single camera in the height direction, multiple camera devices, here five camera devices 7A to 7E, are used to divide the slope into areas in the height direction and photograph them.
  • the height direction of the slope (an object including a slope) is divided from bottom to top into a first area, a second area, a third area, a fourth area, and a fifth area
  • the first area is photographed by camera device 7A
  • the second area is photographed by camera device 7B
  • the third area is photographed by camera device 7C
  • the fourth area is photographed by camera device 7D
  • the fifth area is photographed by camera device 7E.
  • camera device 7A is the camera that photographs the lowest part of the slope
  • camera device 7E is the camera that photographs the highest part of the slope.
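As a sketch, the five-way division above can be expressed as a small lookup. The even 3 m split of the roughly 15 m slope is an illustrative assumption, not a boundary given in the text:

```python
SLOPE_HEIGHT_M = 15.0
CAMERAS = ["7A", "7B", "7C", "7D", "7E"]  # cameras 1..5, bottom to top

def area_for_height(y_m, n=len(CAMERAS), total=SLOPE_HEIGHT_M):
    """Return the camera responsible for slope height y_m, assuming the
    height is split evenly into n stacked areas (an illustrative split)."""
    idx = min(int(y_m / (total / n)), n - 1)
    return CAMERAS[idx]
```

For example, heights near the toe map to camera 7A and heights near the top map to camera 7E, matching the first-to-fifth area assignment above.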
  • the height of the slope (object including a slope) is approximately 15 m.
  • the photographing method of this embodiment involves photographing an object (slope) while moving using photographing devices (cameras) 7A, 7B, 7C, 7D, and 7E installed on a moving body (vehicle) 6. Furthermore, the photographing devices (cameras) 7A, 7B, 7C, 7D, and 7E photograph a "photographed target area" of the object (slope) located in the Z-axis direction that intersects with the moving direction of the moving body (vehicle) 6.
  • This "photographed target area” includes a "first target area” and a "second target area” located above the first target area in the Y-axis direction.
  • the "photographing devices” include a “first photographing device” that photographs the "first target area” and a “second photographing device” that photographs the "second target area”.
  • all or part of the photographing devices 7A-7D correspond to the "first photographing device” that photographs the "first target area”
  • the photographing device 7E corresponds to the "second photographing device” that photographs the "second target area”.
  • the photographing device 7E (camera 5) that photographs the uppermost part of the slope (around 15 m) is an "installation position variable photographing unit" whose installation position in the direction (Z axis direction) intersecting with the moving direction (X axis direction) relative to the moving body 6 is adjusted.
  • the installation position in the Z axis direction of the photographing device 7E (second photographing device) on the moving body (vehicle) 6 is set (adjusted).
  • Figures 9(a), (b), and (c) are diagrams showing an example of the configuration of the variable installation position imaging unit.
  • the direction perpendicular to the paper surface in the figure is the X-axis direction, i.e., the direction of movement of the moving body 6, and the rightward direction in the figure is the Z-axis direction, i.e., the direction in which the slope of the target object (target object including a slope) exists.
  • the configuration of the variable installation position imaging unit (configuration of the installation position adjustment mechanism) is not limited to the examples shown in Figures 9(a) to (c), and various design modifications are possible.
  • two camera mounting parts 71 and 72 are provided on the ceiling of the moving body 6, spaced apart in the Z-axis direction, and the installation position in the direction (Z-axis direction) intersecting the moving direction (X-axis direction) relative to the moving body 6 is adjusted depending on which of the two camera mounting parts 71 and 72 the imaging device 7E (camera 5) is attached to.
  • when the imaging device 7E (camera 5) is attached to the camera mounting part 71, the distance to the target slope (target including a slope) can be reduced, and when the imaging device 7E (camera 5) is attached to the camera mounting part 72, the distance to the target slope (target including a slope) can be increased (two-stage setting is possible).
  • FIG. 9(a) illustrates an example in which the imaging device 7E (camera 5) is attached to the camera mounting part 71.
  • in FIG. 9(b), three camera mounting parts 71, 72, 73 spaced apart in the Z-axis direction are provided on the ceiling of the moving body 6, and the installation position in the direction (Z-axis direction) intersecting the direction of movement (X-axis direction) relative to the moving body 6 is adjusted depending on which of the three camera mounting parts 71, 72, 73 the imaging device 7E (camera 5) is attached to.
  • depending on which of the three camera mounting parts 71, 72, and 73 the imaging device 7E (camera 5) is attached to, the distance to the target slope (target including a slope) can be made the closest, made the furthest, or set to an intermediate position (three-stage setting is possible).
  • the photographing devices (camera mounting parts 71, 72, 73) may be moved not only in the Z-axis direction but also in the Y-axis direction (their positions may be adjustable).
  • Figure 9(b) shows an example in which the photographing device 7E (camera 5) is mounted on the camera mounting part 73.
  • a camera mounting part 74 with a sliding mechanism capable of sliding in the Z-axis direction is provided on the ceiling of the moving body 6, and the installation position in the direction (Z-axis direction) intersecting the moving direction (X-axis direction) relative to the moving body 6 is adjusted (stepless setting is possible) by attaching the imaging device 7E (camera 5) to the camera mounting part 74 with a sliding mechanism and sliding it.
  • FIG. 9(c) illustrates an example in which the imaging device 7E (camera 5) is attached to the camera mounting part 74 with a sliding mechanism and slid to be farthest from the target slope (target including a slope).
  • the installation position may be adjusted not only in the direction (Z-axis direction) intersecting the moving direction (X-axis direction) relative to the moving body 6, but also in the Y-axis direction by tilting the imaging device and attaching it to the camera mounting part 74 with a sliding mechanism and sliding it.
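The mounting options in Figures 9(a) to (c) amount to choosing a Z offset either from a discrete set of mounting parts or from a continuous slide range. A minimal sketch follows; the offset values in meters and the helper names are illustrative assumptions, not values from the source:

```python
# Hypothetical Z offsets (m) for the discrete mounting parts.
TWO_STAGE = [0.0, 1.0]          # Fig. 9(a): mounting parts 71, 72
THREE_STAGE = [0.0, 0.5, 1.0]   # Fig. 9(b): mounting parts 71, 72, 73

def slide_offset(requested_m, lo=0.0, hi=1.0):
    """Fig. 9(c): stepless slide mechanism; clamp the requested Z offset
    to the physical rail range."""
    return max(lo, min(hi, requested_m))

def nearest_stage(requested_m, stages):
    """Figs. 9(a)/(b): pick the mounting part closest to the requested offset."""
    return min(stages, key=lambda s: abs(s - requested_m))
```

The design difference is simply granularity: two-stage and three-stage mounts quantize the offset, while the slide mechanism allows any value within the rail.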
  • the imaging devices 7A, 7B, 7C, and 7D (cameras 1, 2, 3, and 4) other than the imaging device 7E (camera 5) are "fixed installation position imaging units" whose installation positions are fixed in the direction (Z-axis direction) that intersects with the moving direction (X-axis direction) relative to the moving body 6.
  • variable installation position photographing unit photographs a relatively high position on the slope of the object
  • fixed installation position photographing unit photographs a relatively low position on the slope of the object
  • the imaging devices 7A and 7B (cameras 1 and 2) are so-called "normal cameras" in which no Scheimpflug angle is set
  • the imaging devices 7C to 7E (cameras 3 to 5) are so-called "Scheimpflug cameras" in which a Scheimpflug angle is set.
  • a Scheimpflug camera is an example of a special camera in which an entire subject plane that faces the camera obliquely appears in focus, and may also be called a "tilt camera" or a "tilt-mount camera."
  • the photographing devices 7A to 7E have an image sensor and a photographing lens
  • the Scheimpflug angle θ' indicates the angle between the perpendicular to the sensor surface of the image sensor and the central axis (optical axis) of the photographing lens
  • in a normal camera, the perpendicular to the sensor surface of the image sensor and the central axis (optical axis) of the photographing lens are coaxial (parallel), so the Scheimpflug angle is 0 degrees.
  • the imaging devices 7C to 7E (cameras 3 to 5), which are Scheimpflug cameras, have Scheimpflug angles that correspond to the imaging position in the height direction of the slope of the object.
  • the Scheimpflug angle of the imaging devices 7C and 7D (cameras 3 and 4) (imaging units with fixed installation positions) is set to 0.7 degrees
  • the Scheimpflug angle of the imaging device 7E (camera 5) (imaging unit with variable installation position) is set to 1.1 degrees.
  • the Scheimpflug angles of the imaging devices 7C to 7E (cameras 3 to 5) are merely examples and are set appropriately depending on the range of inclination angles of the slope of the object.
  • the imaging device used in this embodiment may be equipped with a mechanism that dynamically changes the Scheimpflug angle by adjusting the positional relationship between the image sensor and the imaging lens (the relationship of the optical axis of the imaging lens to the imaging surface of the image sensor).
  • the Scheimpflug angles of the imaging devices 7C to 7E (cameras 3 to 5) are uniquely determined at the time of manufacture, and no mechanism is equipped to dynamically change the Scheimpflug angle (the Scheimpflug angle is invariable when focusing on the camera alone).
  • the imaging device 7E (second imaging device) has a preset Scheimpflug angle in which the optical axis of the imaging lens in the imaging device 7E (second imaging device) is tilted with respect to an axis perpendicular to the imaging surface of the imaging device 7E (second imaging device).
  • FIG. 10 is a diagram showing an example of adjusting the installation position of the variable installation position photographing unit, i.e., the photographing device 7E (camera 5).
  • FIG. 10 corresponds to an "installation position setting step" for setting (adjusting) the installation position in the Z-axis direction of the photographing device 7E (second photographing device) on the moving body (vehicle) 6.
  • FIG. 10 illustrates an example in which the height of the slope, which is the subject, is 15 m, and the inclination angle of the slope, which is the subject, is approximately 45 degrees to 73.3 degrees (the inclination angle of the slope, which is the subject, is drawn as if it were constant, but this is for convenience of drawing).
  • the slope of the target object has a variety of angles, from gentle to steep, so even when using a Scheimpflug camera, it can be difficult to obtain an image that is in focus on the slope surface. For example, if the overall image to be included in the report and the image showing the deformation are not in focus, they will not be able to serve as an inspection report. Also, because the crack width is estimated from an image of the crack, the focus affects the error in the crack width estimation results.
  • when the slope angle is about 45 degrees to 73.3 degrees and the height of the slope is approximately 15 m
  • if the horizontal distance from the shooting reference position of the moving body 6 (for example, the installation position (lens object-side principal point position) of the shooting device 7E (camera 5) at the right end in Figure 10) to the end of the slope (the "distance from the shooting reference position to the subject" described below) is in the range of 1.75 m to 3.75 m, it may be difficult to capture an image in focus on the surface of the slope, even with the shooting device 7E (camera 5) with the Scheimpflug angle set to 1.1 degrees.
  • an "installation position setting step” is provided for setting (adjusting) the installation position in the Z-axis direction of the imaging device 7E (second imaging device) on the moving body (vehicle) 6.
  • an XYZ Cartesian coordinate system is used, in which the movement direction of the moving body (vehicle) 6 is the X-axis, the direction intersecting the movement direction of the moving body (vehicle) 6 is the Z-axis, and the direction perpendicular to the X-axis and Z-axis is the Y-axis.
  • the length from the position where the shooting reference position of the shooting device 7E (second shooting device) installed on the moving body (vehicle) 6 is projected onto the XZ plane on which the moving body (vehicle) 6 moves, to the position where the object intersects the XZ plane, is defined as the "distance from the shooting reference position to the object."
  • the length on the Y axis from the XZ plane to the shooting target area on the object is defined as the "height of the shooting target area.”
  • the shooting reference position may be interpreted as the object-side principal point position of the lens.
  • the object is a slope
  • the moving body is a vehicle
  • the XZ plane on which the moving body moves is the road surface on which the vehicle moves.
  • the "distance from the shooting reference position to the object" can be interpreted as the “distance from the position where the shooting reference position of the shooting device installed on the moving body (vehicle) is projected onto the road surface to the end of the slope where the slope meets the road surface.”
  • the installation position of the imaging device 7E (second imaging device) is set (adjusted) in the Z-axis direction that intersects with the direction of movement of the moving body (vehicle) 6, depending on at least one of the "distance from the imaging reference position to the target object" and the "height of the imaging target area".
  • the "distance from the shooting reference position to the object” and the "height of the object to be shot” are parameters for determining the theoretical Scheimpflug angle for focusing on the object to be shot, and the installation position of the shooting device 7E (second shooting device) is set (adjusted) according to the difference between the theoretical Scheimpflug angle and the set Scheimpflug angle that is set in advance in the shooting device 7E (second shooting device).
  • the installation position of the shooting device 7E (second shooting device) is changed in the Z-axis direction (for example, away from the slope of the object) to change the theoretical Scheimpflug angle, thereby bringing the difference between the theoretical Scheimpflug angle and the set Scheimpflug angle within a specified range.
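The setting step above (move the camera away in the Z-axis direction until the difference between the theoretical and set Scheimpflug angles falls within a specified range) can be sketched as a search over slide positions. Here `theoretical_angle` is a caller-supplied placeholder for the formula-(E) style calculation described later in the text, and the step size, rail length, and ±0.2° tolerance default are assumptions or example values:

```python
def adjust_z(set_angle_deg, theoretical_angle, z_max=1.0, step=0.05, tol=0.2):
    """Slide camera 5 away from the slope (increasing Z offset z) until the
    theoretical Scheimpflug angle at z is within +/-tol degrees of the
    camera's fixed set angle. theoretical_angle(z) is caller-supplied."""
    z = 0.0
    while z <= z_max + 1e-9:
        if abs(theoretical_angle(z) - set_angle_deg) <= tol:
            return z
        z += step
    raise ValueError("no slide position brings the angle within tolerance")
```

With a toy model in which the theoretical angle falls linearly as the camera moves away, the search returns the first offset that satisfies the tolerance.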
  • the installation position of the imaging device 7E (camera 5) in the direction (Z-axis direction) intersecting with the moving direction (X-axis direction) relative to the moving object 6 is adjusted according to the "distance W from the imaging reference position to the object".
  • the installation position of the imaging device 7E (camera 5) is adjusted to the right end in FIG. 10.
  • the installation position of the imaging device 7E (camera 5) is adjusted to the left end in FIG. 10.
  • the installation position of the imaging device 7E (camera 5) as an installation position variable imaging unit is adjusted in the direction (Z-axis direction) intersecting with the moving direction (X-axis direction) relative to the moving object 6 according to the "distance W from the imaging reference position to the object".
  • the photographing device 7E (camera 5) is for photographing the highest area of the slope
  • the photographing device 7E (camera 5) as a position-variable photographing unit has its position adjusted in the direction (Z-axis direction) intersecting with the moving direction (X-axis direction) relative to the moving body 6 depending on the "height of the photographed area (for example, whether the photographed area is the first or second target area)". If the photographing device (camera) is for photographing a low position on the slope, there is little need to make the photographing device (camera) a position-variable photographing unit in the first place, and a fixed-position photographing unit is sufficient.
  • the "height of the photographed area” is marked with the symbol D (the height D of the photographed area is set within a maximum range of 15 m).
  • the height D of the photographed area in Figure 11 may be read as the "height of the intersection of the optical axis and the slope".
  • the installation position of the imaging device 7E (camera 5) as a position-adjustable imaging unit is set (adjusted) in the Z-axis direction that intersects with the direction of movement of the moving body (vehicle) 6, depending on at least one of the "distance W from the imaging reference position to the target object" and the "height D of the imaging target area.”
  • the installation position of the imaging device 7E (camera 5) as a position-adjustable imaging unit is adjusted in the direction (Z-axis direction) that intersects with the moving direction (X-axis direction) relative to the moving body 6 based on the relationship of the "distance W from the imaging reference position to the target object" to a predetermined distance (whether the distance W is in the range of 1.75m to 3.75m, or whether it is less than 2.75m or 2.75m or more) and the relationship of the "height D of the imaging target area" to a predetermined height (whether it is 15m or close to that, or whether it is the highest slope height, or whether the imaging target area is the first or second target area).
  • the installation position of the imaging device 7E (camera 5) in the direction (Z axis direction) intersecting the moving direction (X axis direction) relative to the moving body 6 is adjusted based on whether or not the slope angle, etc., according to the "distance W from the imaging reference position to the object" and the "height D of the imaging target area” exceeds the allowable range for the theoretical value of the Scheimpflug angle calculated from the Scheimpflug angle formula (details will be described later).
  • keeping the Scheimpflug angle constant refers to each individual camera, and does not mean that a single Scheimpflug angle is made common across the cameras.
  • "each camera" here means the imaging devices 7C to 7E (cameras 3 to 5)
  • the optimal Scheimpflug angle is set for each camera within the allowable range for the theoretical value of the Scheimpflug angle calculated from the Scheimpflug angle formula described later.
  • the Scheimpflug angle set for each camera is not changed.
  • one conceivable mechanism for changing the Scheimpflug angle is to prepare a lens adapter for each Scheimpflug angle and replace the lens adapter to obtain the desired Scheimpflug angle.
  • a mechanism for moving the imaging device (second imaging device) 7E in the Z-axis direction is provided, eliminating the need to replace the lens adapter and improving work efficiency.
  • in-focus photography is possible even with a camera that does not have a mechanism for dynamically changing the Scheimpflug angle.
  • the imaging devices 7C to 7E (cameras 3 to 5) have a structure in which the Scheimpflug angle is fixed, a high level of rigidity can be ensured (guaranteed) for the cameras and lenses.
  • the "photographing reference position” is the installation position of the camera on the moving body 6 (in this embodiment, the object-side principal point position of the lens), and may be interpreted as a "predetermined photography reference position", for example, uniquely defined by the driving line of the moving body 6 at the time of shooting.
  • the "distance W from the photography reference position to the object" that is, the “distance W from the position where the photography reference position of the camera installed on the moving body (vehicle) is projected onto the road surface to the toe of the slope where the slope meets the road surface”
  • the “distance W from the photography reference position to the object” may differ depending on, for example, differences in the environment between the slope and the road along which the vehicle is traveling, that is, the presence or absence of a sidewalk, a gutter, or a guardrail.
  • the "distance W from the photographing reference position to the object” that is, the “distance W from the position where the photographing reference position of the photographing device installed on the moving body (vehicle) is projected onto the road surface to the end of the slope where the slope meets the road surface,” can be grasped in advance by using the output of the GNSS sensor 8b, checking in advance on a map, etc., driving once before photographing, using the results of the previous photographing, or any other means. Then, once the installation position of the photographing device 7E (camera 5) as the position-variable photographing unit is adjusted, it is not moved during photographing, and photographing is performed with the photographing conditions related to focus fixed (constant).
  • the Scheimpflug angle is set according to the inclination angle of the subject.
  • the inventors have noticed that when photographing slopes, there is a demand to artificially change the set Scheimpflug angle (to cover a Scheimpflug angle with a certain range) rather than simply changing the Scheimpflug angle setting as appropriate according to the inclination angle of the slope.
  • the “distance W from the shooting reference position to the object,” i.e., the “distance W from the position where the shooting reference position of the shooting device installed on the moving body (vehicle) is projected onto the road surface to the end of the slope where the slope meets the road surface,” may vary depending on the environment of the shooting site (presence or absence of sidewalks, gutters, and guardrails).
  • the slope angle is approximately 45 degrees to 73.3 degrees
  • the "distance W from the shooting reference position to the object” i.e., the “distance W from the position where the shooting reference position of the shooting device installed on the moving body (vehicle) is projected onto the road surface to the end of the slope where the slope meets the road surface," is 1.75 m to 3.75 m. It is assumed that shooting will be performed without changing the Scheimpflug angle set for each of the shooting devices 7C to 7E (cameras 3 to 5) (0.7 degrees for shooting devices 7C and 7D (cameras 3 and 4), and 1.1 degrees for shooting device 7E (camera 5)).
  • the imaging device 7E (camera 5) is used as a "position-variable imaging unit" and its installation position on the moving body 6 is moved in the direction in which the distance W increases (the direction moving away from the slope (end of the slope) on the Z-axis), so that in-focus imaging can be achieved even with the imaging device 7E (camera 5) with a Scheimpflug angle of 1.1 degrees.
  • the theoretical Scheimpflug angle of the photographing device 7E (camera 5) is changed in a pseudo manner by adjusting the installation position of the photographing device 7E (camera 5) as a position-variable photographing unit in the direction (Z-axis direction) intersecting with the moving direction (X-axis direction) relative to the moving body 6. That is, the set Scheimpflug angle of the photographing device 7E (camera 5) itself is the same (although it does not have an angle change unit), but the theoretical Scheimpflug angle is "pseudo" changed by adjusting the slide position of the photographing device 7E (camera 5).
  • the "focus plane angle" of the photographing device 7E (camera 5) as a position-variable photographing unit can be changed (adjusted) without changing the set Scheimpflug angle of the photographing device 7E (camera 5) itself.
  • the "focus plane” of the imaging device means a plane that includes the focus position (point) on the optical axis of the lens of the imaging device.
  • when the set Scheimpflug angle is 0°, the imaging surface of the imaging device (image sensor) and the focus plane are parallel; when the set Scheimpflug angle is other than 0°, the focus plane is angled, according to the set Scheimpflug angle, with respect to a plane parallel to the object-side principal plane.
  • the angle of the focus plane with respect to this plane parallel to the object-side principal plane is defined as the "focus plane angle.”
  • the angle of the focus plane changes even when the Scheimpflug angle set in the imaging device itself is fixed.
  • in FIG. 11, the camera and the photographing lens (lens barrel) with the Scheimpflug angle set are depicted schematically, and the "lens object-side principal point position" is depicted there.
  • the “lens object side principal point position” is the intersection point of the optical axis and the object side principal plane.
  • the plane that passes through the lens image-side principal point position and is perpendicular to the optical axis of the photographing lens is the "image-side principal plane”
  • the plane that passes through the lens object-side principal point position and is perpendicular to the optical axis of the photographing lens is the "object-side principal plane”.
  • the "image-side principal plane” and the "object-side principal plane” are perpendicular to the optical axis of the photographing lens.
  • when the Scheimpflug angle is set, the optical axis of the photographing lens is no longer perpendicular to the "imaging surface of the image sensor", so the "imaging surface of the image sensor" and the "object-side principal plane" are no longer parallel.
  • when the Scheimpflug angle is not set (when the Scheimpflug angle is 0°), the "focus plane" depicted in FIG. 11 is parallel to the "imaging surface of the image sensor"
  • when the Scheimpflug angle is set, the focus plane is significantly tilted with respect to the "imaging surface of the image sensor". In this way, because the focus plane is closer to the angle of the slope than when the Scheimpflug angle is not set, the in-focus range on the slope is wider.
  • the installation position of the camera that photographs the photographing target area located at a high position (e.g., the second target area) is adjusted.
  • Figures 11, 12, and 13 are first, second, and third conceptual diagrams showing an example of the allowable deviation from the theoretical Scheimpflug angle value.
  • the lens direction β [°] is determined based on the lens position and the inclination angle of the slope, in accordance with the criteria in Table 1 below.
  • the direction is determined taking into consideration the ability to observe slopes with a height of 15 m or more, and the shooting range of the lower camera.
  • for the lens for photographing near 15 m (camera 5), the object distance is calculated for a total of six conditions: two levels of slope inclination angle (73.3°, 45°) combined with three levels of distance W from the shooting reference position to the object (1.75 m, 2.75 m, 3.75 m).
  • the object distance Z0 is calculated based on the following formulas (A), (B), (C), and (D), with the parameters θ, A, β, C, W, H, and φ in FIG. 12 defined as follows:
    θ: Angle between the optical axis and the slope [°]
    A: Inclination angle of the slope [°]
    β: Lens orientation [°]
    C: Distance between the end of the lens and the edge of the lens [m]
    W: Distance from the shooting reference position to the object [m]
    H: Lens height [m]
    φ: Angle between the line connecting the lens object-side principal point and the toe of the slope and the line perpendicular to the ground [°]
    D: Height of the photographing target area [m]
    Z0: Object distance [m]
  • the Scheimpflug angle θ' of the camera is calculated based on the following formula (E). In FIG. 12, the inclination angle A of the slope is set to about 45° to 73.3°, the distance W from the shooting reference position to the target object is set to 1.75 m to 3.75 m, the lens height H is set to 2.3 m, and the movement amount (slide amount in the Z-axis direction) of the lens for photographing near 15 m (camera 5) is set to 1 m.
    θ': Scheimpflug angle [°]
    f: Focal length of the lens for photographing near 15 m (lens focal length of camera 5) [m]
    Z0: Object distance [m]
    θ: Angle between the optical axis and the slope [°]
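Formulas (A) to (E) are reproduced only as drawings in the source, so the sketch below is a plausible reconstruction from the stated geometry (camera at lens height H, horizontal distance W to the slope toe, slope inclination A, optical-axis elevation β), not the patent's actual equations; the numeric inputs at the bottom are illustrative assumptions:

```python
import math

def object_distance(W, H, A_deg, beta_deg):
    """Z0: distance along the optical axis from the lens object-side
    principal point to the slope surface. Reconstructed geometry: the
    slope face is z = W + y/tan(A); the optical axis leaves (z=0, y=H)
    with elevation beta. Assumes beta < A so the axis meets the slope."""
    A = math.radians(A_deg)
    b = math.radians(beta_deg)
    return (W + H / math.tan(A)) / (math.cos(b) - math.sin(b) / math.tan(A))

def scheimpflug_angle_deg(f, Z0, A_deg, beta_deg):
    """Theoretical lens tilt (a formula-(E) analogue): with the subject
    plane meeting the optical axis at theta = A - beta, the Scheimpflug
    condition gives approximately tan(theta') = f / (Z0 * tan(theta))."""
    theta = math.radians(A_deg - beta_deg)
    return math.degrees(math.atan(f / (Z0 * math.tan(theta))))

# Illustrative values: W = 2.75 m, H = 2.3 m, A = 73.3 deg, beta = 60 deg,
# f = 85 mm (assumptions, not values fixed by the source).
Z0 = object_distance(2.75, 2.3, 73.3, 60.0)
tilt = scheimpflug_angle_deg(0.085, Z0, 73.3, 60.0)
```

With these assumed numbers, Z0 comes out near 14 m and the theoretical tilt on the order of 1°, consistent in magnitude with the 1.1° set angle discussed in the text; increasing W (sliding the camera away from the slope) lowers the theoretical tilt, which is the direction of adjustment the embodiment relies on.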
  • the lens height H [m] above means the height from the road surface (XZ plane) on which the moving body (vehicle) moves to the "lens object-side principal point position," that is, to the "shooting reference position."
  • if the actual Scheimpflug angle that is set is within a predetermined range (±0.2°) of the theoretical value, it can be determined that the depth of field is sufficient for inspection; if the actual Scheimpflug angle that is set falls outside the predetermined range from the theoretical value (exceeding ±0.2°), the possibility increases that out-of-focus portions will appear in the captured image.
  • the predetermined range of " ⁇ 0.2°” is merely an example, and various design changes are possible.
  • the predetermined range that is the criterion for determining whether or not the depth of field required for inspection is within the range can be set appropriately according to the image resolution and inspection accuracy (required specifications) at the time of inspection.
  • the photographing device 7E (camera 5) as the position-adjustable photographing unit is moved in the Z-axis direction (horizontal direction) (away from the slope) to adjust the actual Scheimpflug angle that is set so that it falls within the specified range from the theoretical value ( ⁇ 0.2°).
  • Figure 13(a) shows the theoretical value of the Scheimpflug angle when the Z-axis position of the image capture device 7E (camera 5) as a position-adjustable image capture unit is fixed
  • Figure 13(b) shows the theoretical value (improvement) of the Scheimpflug angle when the Z-axis position of the image capture device 7E (camera 5) as a position-adjustable image capture unit is adjusted.
  • the horizontal axis represents the slope angle and the vertical axis represents the theoretical value of the Scheimpflug angle
  • the theoretical value of the Scheimpflug angle is plotted against the slope angle.
  • Figures 13(a) and (b) also plot the theoretical values of the Scheimpflug angle for cameras 2, 4, and 5 (CAM2, CAM4, CAM5) under two conditions where the distance W from the image capture reference position to the object is 1.75 m and 3.75 m.
  • the theoretical range of the Scheimpflug angle for CAM2, which photographs the middle of the slope, is within a range of 0.37°, from -0.27° to 0.1°, and since the difference from the theoretical value is small even when the Scheimpflug angle is set to 0°, focus shifts are unlikely to occur at the top and bottom of the shooting range even if the shooting conditions (slope angle or distance W from the shooting reference position to the subject) change.
  • the theoretical range of the Scheimpflug angle for CAM4 is within a range of 0.39° from -0.69° to -0.3°, and since the difference from the theoretical value is small even when the Scheimpflug angle is set to -0.7°, there are no focus problems.
  • if the deviation exceeds the allowable range, a warning may be issued indicating that further photographing will not capture images suitable for inspection, photographing may be stopped, or an instruction may be given to retake the photographs for which the error was detected.
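The pass/fail criterion and the fallback actions above can be sketched as follows; the ±0.2° tolerance is the example value from the text, while the function and action names are illustrative:

```python
def focus_ok(theoretical_deg, set_deg, tol=0.2):
    """True if the camera's fixed set Scheimpflug angle is within the
    allowed deviation from the theoretical value (depth of field judged
    sufficient for inspection)."""
    return abs(theoretical_deg - set_deg) <= tol

def handle_capture(theoretical_deg, set_deg):
    # Illustrative policy: warn, stop, or request a retake on deviation.
    if focus_ok(theoretical_deg, set_deg):
        return "continue"
    return "warn-stop-or-retake"
```

The same predicate could drive either an in-vehicle warning during shooting or a post-hoc retake instruction, as the text allows both.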
  • the imaging device captures an area to be imaged of an object by dividing the area to be imaged into a plurality of imaging areas along the direction of movement of the moving object.
  • the information processing method of this embodiment includes a generating step of generating an image of the area to be imaged by stitching together the images of the plurality of imaging areas captured by the imaging method described above.
  • the information processing method of this embodiment also includes an evaluation step of evaluating the image of the area to be imaged obtained by stitching together the images of the plurality of imaging areas.
  • the information processing method of this embodiment also includes a display step of displaying the image of the area to be imaged obtained by stitching together the images of the plurality of imaging areas.
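The generating step (stitching the slice images captured along the travel direction into one surface image) can be sketched minimally. This assumes already-rectified slices of a common height represented as row-major lists of pixel rows and simple concatenation; a real implementation would also register and blend the overlaps:

```python
def stitch_slices(slices):
    """Concatenate slice images captured along the X (travel) direction
    into one image of the photographing target area. Each slice is a
    list of pixel rows; all slices must share the same height."""
    if not slices:
        raise ValueError("no slices to stitch")
    height = len(slices[0])
    if any(len(s) != height for s in slices):
        raise ValueError("slice heights differ; rectify before stitching")
    # Join corresponding rows of every slice, left to right.
    return [sum((s[r] for s in slices), []) for r in range(height)]
```

The evaluation and display steps would then operate on the stitched result rather than on individual slices.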
  • Figure 14 is a sequence diagram showing an example of display processing in the condition inspection system 1.
  • the reception unit of the evaluation device 3 accepts the selection of the target data (step S91).
  • the user of the evaluation device 3 may select an arbitrary position in the map information managed by the map data management unit of the evaluation device 3, and the reception unit of the evaluation device 3 may accept the selection of position information in the map information.
  • the communication unit of the evaluation device 3 transmits a request for an input/output screen related to the target data selected in step S91 to the data management device 5, and the communication unit of the data management device 5 receives the request transmitted from the evaluation device 3 (step S92).
  • This request includes the folder name selected in step S91. Alternatively, this request may include location information in the map information.
  • the storage/reading unit of the data management device 5 searches the processing data management DB 5003 using the folder name included in the request received in step S92 as a search key, thereby reading out image data associated with the folder name included in the request.
  • the storage/reading unit searches the acquisition data management DB using the location information included in the request received in step S92 as a search key, thereby reading out image data associated with the location information included in the request.
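The read-out in the two steps above uses the folder name (or the location information) as a search key into a data-management DB. The sketch below models the processing data management DB 5003 as a plain key-value store; all record names and data are illustrative assumptions, not from the patent.

```python
# Hedged sketch: the storage/reading unit searches the DB with the folder
# name (or location information) as a search key and reads out the image
# data associated with that key.  Names and contents are illustrative.

processing_data_db = {
    "slope_0001": {"image_data": "luminance_0001.bin"},
    "slope_0002": {"image_data": "luminance_0002.bin"},
}

def read_image_data(db, search_key):
    """Return the image data associated with the key, or None if absent."""
    record = db.get(search_key)
    return record["image_data"] if record else None

found = read_image_data(processing_data_db, "slope_0001")
```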
  • the generating unit of the data management device 5 generates an input/output screen including the image data based on the image data read by the storing/reading unit (step S93).
  • This input/output screen is a screen that accepts an instruction operation to generate an image showing a specific position in a luminance image showing a slope.
  • the communication unit of the data management device 5 transmits the input/output screen generated in step S93 to the evaluation device 3, and the communication unit of the evaluation device 3 receives the input/output screen (step S94).
  • Step S94 is an example of a generation reception screen transmission step.
  • the display control unit of the evaluation device 3 displays the input/output screen received in step S94 on the display (step S95).
  • the reception unit of the evaluation device 3 receives a predetermined input operation by the user on the displayed input/output screen.
  • This input operation includes an instruction operation to generate an image showing a specific position in a luminance image showing a slope.
  • Step S95 is an example of a reception step.
  • the communication unit of the evaluation device 3 transmits input information related to the input operation received by the reception unit to the data management device 5, and the communication unit of the data management device 5 receives the input information transmitted from the evaluation device 3 (step S96).
  • This input information includes instruction information that instructs the generation of an image showing a specific position in the luminance image showing the slope.
  • the generating section of the data management device 5 generates a display image using the image data read by the storing and reading section in step S93 based on the received input information (step S97).
  • This display image includes a surface display image including a surface image showing the surface of the slope and a surface position image showing a specific position in the surface image, and a cross-section display image including a cross-section image showing the cross-section of the slope and a cross-section position image showing a specific position in the cross-section image.
  • Step S97 is an example of an image generating step.
  • the communication unit of the data management device 5 transmits the display image generated in step S97 to the evaluation device 3, and the communication unit of the evaluation device 3 receives the display image (step S98).
  • Step S98 is an example of a display image transmission step.
  • the display control unit of the evaluation device 3 displays the display image received in step S98 on the display (step S99).
  • Step S99 is an example of a display step.
  • FIG. 14 shows the sequence of the display process between the evaluation device 3 and the data management device 5, but the evaluation device 3 may execute the display process independently.
  • steps S92, S94, S96, and S98 relating to data transmission and reception are omitted, and the evaluation device 3 can perform the same display processing as in FIG. 14 by independently executing steps S91, S93, S95, S97, and S99.
  • the data acquisition device 9, terminal device 1100, and terminal device 1200 can also independently execute display processing like the evaluation device 3.
  • FIG. 15 is an explanatory diagram of an operation on a display screen of a state inspection system.
  • Fig. 15 shows an input/output screen 2000 displayed on the display of the evaluation device 3 in step S95 of the sequence diagram shown in Fig. 14, but the same is true for the input/output screen 2000 displayed on the respective displays of the data acquisition device 9, the terminal device 1100, and the terminal device 1200.
  • the display control unit of the evaluation device 3 displays an input/output screen 2000 including a designation reception screen 2010 that receives a designation operation for designating a specific position in a luminance image showing a slope, and a generation reception screen 2020 that receives an instruction operation for instructing the generation of an image showing a specific position on the slope.
  • the display control unit displays a surface image 2100 showing the surface of the slope on the specification reception screen 2010, and also displays a pointer 2300 operated by a pointing device on the surface image 2100.
  • the surface image 2100 is a luminance image read out from the captured image data in step S92 of FIG. 14, and the display control unit displays the surface image 2100 in association with the captured images 1 and 2 shown in FIG. 5 and the X-axis direction and Y-axis direction shown in the captured image data 7A shown in FIG. 6(a).
  • the display control unit displays the generation reception screen 2020, which includes a specified position confirmation button 2400, a deformation confirmation button 2410, a deformation sign confirmation button 2420, a front view analysis button 2430, a front view comparison button 2440, a cross-sectional view analysis button 2450, and a cross-sectional view comparison button 2460.
  • the deformation confirmation button 2410, the deformation sign confirmation button 2420, the front view analysis button 2430, the front view comparison button 2440, the cross-sectional view analysis button 2450, and the cross-sectional view comparison button 2460 are buttons that instruct the generation of an image showing a specific position on the slope, with the position of a part in the surface image 2100 or the cross-sectional image 2200 that satisfies a specified condition as the specific position.
  • the designated position confirmation button 2400 is a button for confirming a specific position on the slope specified on the designation acceptance screen 2010 and instructing the generation of an image showing that specific position on the slope.
  • the designated position confirmation button 2400 may determine not only a specific position specified on the designation acceptance screen 2010, but also a specific position that has been specified by a judgment unit or the like and displayed on the designation acceptance screen 2010.
  • the Deformation Confirmation button 2410 is a button that instructs the system to generate an image showing a specific position on the slope, with a position indicating a deformation of the slope set as the specific position.
  • the Deformation Sign Confirmation button 2420 is a button that instructs the system to generate an image showing a specific position on the slope, with a position indicating a sign of deformation of the slope set as the specific position.
  • the front view analysis button 2430 is a button that instructs the system to generate an image showing a specific position on the slope by specifying a portion obtained by analyzing the surface image 2100 as the specific position.
  • the front view comparison button 2440 is a button that instructs the system to generate an image showing a specific position on the slope by specifying a portion obtained by comparing the surface image 2100 with another image as the specific position.
  • the cross-sectional view analysis button 2450 is a button that instructs the system to generate an image showing a specific position on the slope by specifying a portion obtained by analyzing the cross-sectional image (described later) as the specific position.
  • the cross-sectional view comparison button 2460 is a button that instructs the system to generate an image showing a specific position on the slope by specifying a portion obtained by comparing the cross-sectional image with another image as the specific position.
  • FIG. 16 is a flowchart showing an example of the information processing method.
  • FIG. 16(a) shows the processing in the evaluation device 3
  • FIG. 16(b) shows the processing in the data management device 5.
  • the reception unit of the evaluation device 3 receives the pointing operation (step S101), and when the designated position confirmation button 2400 is operated, the reception unit receives the operation (step S102).
  • the judgment unit of the evaluation device 3 detects the XY coordinates of the pointed position in the surface image 2100 as a specific position (step S103).
  • This specific position may indicate a point in the XY coordinates, or may indicate an area.
  • the communication unit of the evaluation device 3 transmits input information related to the input operation received by the reception unit 32 to the data management device 5 (step S104).
  • This input information includes designation information for designating a specific position in XY coordinates based on a pointing operation using the pointer 2300, and instruction information for instructing the generation of an image showing the specific position on the slope based on the operation of the designated position confirmation button 2400.
  • the communication unit of the data management device 5 receives the input information sent from the evaluation device 3, and based on the instruction information and designation information contained in the received input information, the generation unit uses the image data shown in FIG. 6(a) to generate a surface display image by superimposing, on the surface image, a surface position image that overlaps the XY coordinates of the specific position (step S105).
  • the surface position image does not necessarily have to completely match the XY coordinates of the specific position, as long as it overlaps with the XY coordinates of the specific position.
  • the generating unit generates a cross-sectional image corresponding to the X-coordinate of the specific position using the image data shown in FIG. 6(a) and the distance measurement data shown in FIG. 6(b) (step S106). If the distance measurement data shown in FIG. 6(b) does not include the X-coordinate of the specific position, the generating unit generates a cross-sectional image based on data in the vicinity of the X-coordinate of the specific position included in the distance measurement data shown in FIG. 6(b).
  • In step S106, the generating unit generates a cross-sectional image of a cross-section including the Z-axis direction and the vertical direction shown in FIG. 5, but it may also generate a cross-sectional image of a cross-section including the Z-axis direction and a direction inclined from the vertical direction, or a cross-sectional image of a cross-section including a direction inclined from the Z-axis direction.
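The fallback for step S106 described above (when the distance measurement data contains no entry at the exact X coordinate of the specific position, data in the vicinity is used instead) can be sketched as follows. The data layout and function name are illustrative assumptions, not from the patent; "vicinity" is interpreted here as the nearest available X coordinate.

```python
# Sketch of the nearest-data fallback: distance_data maps an X coordinate to
# a list of cross-section points.  If the requested X is absent, fall back to
# the nearest X that is present.  Structure and values are illustrative.

def nearest_measurement(distance_data, x):
    """Return the cross-section points for x, or for the nearest available X."""
    if x in distance_data:
        return distance_data[x]
    nearest_x = min(distance_data, key=lambda k: abs(k - x))
    return distance_data[nearest_x]

data = {100: [(0, 5.0), (10, 4.2)], 200: [(0, 5.1), (10, 4.3)]}
profile = nearest_measurement(data, 120)   # no entry at X=120 -> use X=100
```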
  • the generation unit generates a cross-section display image by superimposing, on the edge line of the cross-sectional image, a cross-section position image that overlaps the Y coordinate of the specific position (step S107).
  • the communication unit transmits the surface display image generated in step S105 and the cross-sectional display image generated in step S107 to the evaluation device 3 (step S108).
  • the communication unit of the evaluation device 3 receives the surface display image and cross-sectional display image transmitted from the data management device 5, and the display control unit of the evaluation device 3 displays the received surface display image and cross-sectional display image on the display.
  • FIG. 17 is a diagram showing an example of a display screen after the processing shown in FIG. 14.
  • FIG. 17 shows an input/output screen 2000 that is displayed on the display of the evaluation device 3 in step S99 of the sequence diagram shown in FIG. 14.
  • the display contents of the generation reception screen 2020 are the same as those in FIG. 15, but the display contents of the specification reception screen 2010 are different from those in FIG. 15.
  • the display control unit of the evaluation device 3 causes the specification reception screen 2010 to display a surface display image 2150 including a surface image 2100 showing the surface of the slope and a surface position image 2110 showing a specific position in the surface image 2100, and a cross-section display image 2250 including a cross-section image 2200 showing the cross-section of the slope and a cross-section position image 2210 showing a specific position in the cross-section image 2200.
  • the display control unit displays the cross-sectional image 2200 in association with the Y-axis direction and Z-axis direction shown in FIG. 5.
  • the length from the position where the imaging reference position of the imaging device installed on the moving body is projected onto the XZ plane along which the moving body moves to the position on the XZ plane where the XZ plane intersects with the object is defined as the distance from the imaging reference position to the object.
  • the length on the Y axis from the XZ plane to the imaging target area of the object is defined as the height of the imaging target area.
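The two definitions above can be expressed as a small worked example under the XYZ system of this embodiment (X: movement direction, Z: toward the object, Y: height). The function names and coordinate values below are illustrative, not from the patent.

```python
# Worked sketch of the definitions: the distance is measured on the XZ plane
# (y = 0) from the projected imaging reference position to the point where
# that plane intersects the object; the height is a length on the Y axis.
import math

def distance_to_object(reference_pos, intersection_on_xz):
    """reference_pos is (x, y, z); project it onto the XZ plane and measure
    to the (x, z) point where the XZ plane intersects the object."""
    rx, _, rz = reference_pos
    ix, iz = intersection_on_xz
    return math.hypot(ix - rx, iz - rz)

def target_area_height(target_y):
    """Height of the imaging target area: its length on the Y axis from XZ."""
    return target_y

d = distance_to_object((0.0, 1.5, 0.0), (0.0, 4.0))  # camera 1.5 m up, slope toe 4 m away
h = target_area_height(6.0)                          # target area 6 m up the slope
```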
  • five photographing devices are prepared as the multiple photographing devices (cameras); only the one photographing device (camera) that photographs the uppermost part of the target slope (object including the slope) is designated as a "variable installation position photographing unit (variable installation position camera)", and the remaining four photographing devices (cameras) are designated as "fixed installation position photographing units (fixed installation position cameras)".
  • there is a degree of freedom in the number of photographing devices (cameras), and there is also a degree of freedom in how to arrange the "variable installation position photographing units (variable installation position cameras)" and the "fixed installation position photographing units (fixed installation position cameras)".
  • for example, six or more photographing devices (cameras) may be used, and two or more of them may be variable installation position photographing units (variable installation position cameras).

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Biochemistry (AREA)
  • Analytical Chemistry (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Chemical & Material Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A photographing method for photographing, while moving, an object by a photographing device installed in a mobile unit is characterized in that the photographing device photographs a region to be photographed in the object located in a direction intersecting a moving direction, the method comprises an installation position setting step for setting the installation position of the photographing device in the mobile unit, and when, using an XYZ orthogonal coordinate system with the moving direction of the mobile unit as an X-axis, the direction intersecting the moving direction as a Z-axis, and a direction orthogonal to the X-axis and the Z-axis as a Y-axis, the length from a position at which a photographing reference position of the photographing device installed in the mobile unit is projected onto an X-Z plane on which the mobile unit moves to a position on the X-Z plane at which the X-Z plane and the object intersect is defined as the distance from the photographing reference position to the object, and the length on the Y-axis from the X-Z plane to the region to be photographed in the object is defined as the height of the region to be photographed, in the installation position setting step, the installation position is adjusted in the Z-axis direction intersecting the moving direction according to the distance from the photographing reference position to the object and/or the height of the region to be photographed.

Description

Photographing method and information processing method
The present invention relates to a photographing method and an information processing method.
For example, Patent Document 1 describes a slope inspection method in which the condition of a slope is checked by visual observation, and the condition and position of any deformation found are recorded.
In this slope inspection method, a vehicle traveling on the flat surface at the top or to the side of the slope uses multiple cameras with different focal lengths to divide the slope in the height direction and continuously photograph it with an almost uniform resolution, while a position measurement means mounted on the vehicle records the photographing positions. This slope inspection method also creates three-dimensional measurement data of the slope through photogrammetry using the continuously photographed images and the photographing positions and orientations. This slope inspection method also uses the three-dimensional measurement data to externally observe the condition of the slope based on a direct projection image that has been projected in a direction directly facing the slope, and if any abnormality is found, identifies its condition and position.
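The prior-art arrangement just described keeps the resolution on the slope nearly uniform by giving the cameras different focal lengths for near and far parts of the slope. Under a simple pinhole model (an assumption made here for illustration, not stated in the document), the ground sampling distance is GSD = pixel pitch x distance / focal length, so holding GSD roughly constant means choosing focal length in proportion to distance:

```python
# Illustrative pinhole-camera arithmetic (assumed model, not from the patent):
# GSD = pixel_pitch * distance / focal_length.  To keep GSD constant across
# cameras aimed at near and far parts of the slope, focal length must scale
# with distance.  All numeric values below are illustrative.

def focal_length_for_gsd(pixel_pitch_m, distance_m, target_gsd_m):
    """Focal length [m] that yields the target ground sampling distance."""
    return pixel_pitch_m * distance_m / target_gsd_m

pitch = 3.45e-6          # 3.45 um pixels (illustrative)
target = 1e-3            # aim for 1 mm per pixel on the slope
f_near = focal_length_for_gsd(pitch, 5.0, target)    # ~17 mm for a 5 m part
f_far = focal_length_for_gsd(pitch, 15.0, target)    # ~52 mm for a 15 m part
```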
However, there is room for improvement in Patent Document 1 in terms of obtaining an image that is in focus over a wide area of the slope.
The present invention has been made in consideration of the above points, and aims to provide an imaging method and information processing method that can obtain an image that is in focus over a wide area of an object (e.g., a slope).
The photographing method of this embodiment is a photographing method in which an object is photographed while moving by a photographing device installed on a moving body, and the photographing device photographs a photographing target area of the object located in a direction intersecting with the moving direction, and includes an installation position setting step for setting the installation position of the photographing device on the moving body, and using an XYZ Cartesian coordinate system in which the moving direction of the moving body is the X axis, the direction intersecting the moving direction is the Z axis, and the direction perpendicular to the X axis and the Z axis is the Y axis, the length from the position where the photographing reference position of the photographing device installed on the moving body is projected onto the XZ plane on which the moving body moves to the position on the XZ plane where the XZ plane and the object intersect is defined as the distance from the photographing reference position to the object, and the length on the Y axis from the XZ plane to the photographing target area of the object is defined as the height of the photographing target area, and the installation position setting step is characterized in that the installation position is adjusted in the Z axis direction intersecting with the moving direction in accordance with at least one of the distance from the photographing reference position to the object and the height of the photographing target area.
The present invention provides an imaging method and information processing method that can obtain an image that is in focus over a wide area of an object (e.g., a slope).
FIG. 1 is a diagram showing an example of the overall configuration of a state inspection system according to the present embodiment.
FIG. 2 is a diagram showing an example of how a slope condition is inspected using the condition inspection system.
FIG. 3 is a diagram showing an example of a state of a slope.
FIG. 4 is a diagram illustrating an example of a hardware configuration of the data acquisition device.
FIG. 5 is a diagram for explaining a photographed image acquired by the mobile system.
FIG. 6 is an explanatory diagram of a photographed image and a distance measurement image.
FIG. 7 is an explanatory diagram of a plurality of photographing areas.
FIG. 8 is a diagram showing an example of the configuration of a moving body equipped with the imaging device of this embodiment.
FIG. 9 is a diagram illustrating an example of the configuration of the variable installation position photographing unit.
FIG. 10 is a diagram showing an example of adjustment of the installation position of the variable installation position photographing unit.
FIG. 11 is a first conceptual diagram showing an example of the allowable deviation from the theoretical value of the Scheimpflug angle.
FIG. 12 is a second conceptual diagram showing an example of the allowable deviation from the theoretical value of the Scheimpflug angle.
FIG. 13 is a third conceptual diagram showing an example of the allowable deviation from the theoretical value of the Scheimpflug angle.
FIG. 14 is a sequence diagram showing an example of a display process in the state inspection system.
FIG. 15 is an explanatory diagram of operations on the display screen of the status inspection system.
FIG. 16 is a flowchart showing an example of the process of the information processing method.
FIG. 17 is a diagram showing an example of a display screen after the processing shown in FIG. 14.
The imaging method and information processing method of this embodiment will be described in detail below with reference to the drawings. In the following description, the X-axis, Y-axis, and Z-axis directions are based on the arrow directions shown in the drawings. The X-axis, Y-axis, and Z-axis directions are mutually orthogonal and define a three-dimensional space. The X-axis direction may be interpreted as the moving direction of the moving body (direction of progress, direction of travel). The Y-axis direction may be interpreted as the height direction of the slope of the object (vertical direction, perpendicular direction). The Z-axis direction may be interpreted as the direction intersecting with the moving direction of the moving body (depth direction from the moving body toward the slope, imaging direction). In this way, the imaging method and information processing device of this embodiment use an XYZ Cartesian coordinate system in which the moving direction of the moving body is the X-axis, the direction intersecting the moving direction is the Z-axis, and the direction perpendicular to the X-axis and Z-axis is the Y-axis.
FIG. 1 is a diagram showing an example of the overall configuration of a condition inspection system 1 for implementing the imaging method and information processing method of this embodiment. The condition inspection system 1 is an example of an information processing system, and is a system for inspecting the condition of road earthwork structures using various data acquired by a mobile system 60. Road earthwork structures are a general term for structures constructed to build roads using ground materials such as soil and rocks as their main materials, and structures associated with them, and refer to cut and slope stabilization facilities, embankments, culverts, and the like. Hereinafter, road earthwork structures may be referred to as "objects," "slopes," and "objects including slopes."
The condition inspection system 1 is composed of a mobile system 60, an evaluation system 4, a terminal device 1100 of the national or local government, and a terminal device 1200 of a contractor. The mobile system 60 is an example of an imaging system, and is composed of a data acquisition device 9 and a mobile body 6 such as a vehicle equipped with the data acquisition device 9. The vehicle may be a vehicle that runs on a road or a vehicle that runs on a railroad. The data acquisition device 9 has an imaging device 7, which is an example of a measuring device that measures a structure, as well as a distance sensor 8a and a GNSS (Global Navigation Satellite System) sensor 8b. GNSS is a general term for satellite positioning systems such as the Global Positioning System (GPS) or the Quasi-Zenith Satellite System (QZSS).
The imaging device 7 is a line camera equipped with a line sensor in which photoelectric conversion elements are arranged in one or more rows. The imaging device 7 captures images of positions along a predetermined imaging range on an imaging surface aligned in the traveling direction (X-axis direction) of the moving body 6. Note that the imaging device 7 is not limited to a line camera, and may be a camera equipped with an area sensor in which photoelectric conversion elements are arranged in a planar manner. The imaging device 7 may also be composed of multiple cameras. Details of the imaging device 7 composed of multiple cameras will be described later.
The distance sensor 8a is a ToF (Time of Flight) sensor that measures the distance to the subject photographed by the photographing device 7. The GNSS sensor 8b is a positioning means that receives signals transmitted at each time by multiple GNSS satellites and calculates the distance to the satellite from the difference in the time at which each signal was received, thereby measuring a position on the earth. The positioning means may be a device dedicated to positioning, or may be an application dedicated to positioning installed on a PC (Personal Computer), smartphone, etc. The distance sensor 8a and the GNSS sensor 8b are examples of sensor devices. The distance sensor 8a is also an example of a three-dimensional sensor.
The ToF sensor used as distance sensor 8a measures the distance from a light source to an object by irradiating the object with laser light from the light source and measuring the scattered and reflected light.
In this embodiment, the distance sensor 8a is a LiDAR (Light Detection and Ranging) sensor. LiDAR is a method of measuring the time of flight of light using pulses, but as another method of the ToF sensor, the distance may be measured using a phase difference detection method. In the phase difference detection method, a laser light amplitude modulated at a fundamental frequency is irradiated onto the measurement range, the reflected light is received, and the phase difference between the irradiated light and the reflected light is measured to obtain time, and the distance is calculated by multiplying this time by the speed of light. The distance sensor 8a may also be configured with a stereo camera, etc.
By using a three-dimensional sensor, the mobile system 60 can obtain three-dimensional information that is difficult to obtain from two-dimensional images, such as the height, inclination angle, or overhang of a slope.
The mobile system 60 may further include an angle sensor 8c. The angle sensor 8c is a gyro sensor or the like for detecting the angle (attitude), angular velocity, or acceleration of the shooting direction of the imaging device 7.
The evaluation system 4 is constructed by an evaluation device 3 and a data management device 5. The evaluation device 3 and data management device 5 constituting the evaluation system 4 can communicate with a mobile system 60, a terminal device 1100 and a terminal device 1200 via a communication network 100. The communication network 100 is constructed by the Internet, a mobile communication network, a LAN (Local Area Network), etc. In addition, the communication network 100 may include not only wired communication but also networks using wireless communication such as 3G (3rd Generation), 4G (4th Generation), 5G (5th Generation), Wi-Fi (Wireless Fidelity) (registered trademark), WiMAX (Worldwide Interoperability for Microwave Access), or LTE (Long Term Evolution). In addition, the evaluation device 3 and the data management device 5 may have a communication function using a short-range communication technology such as NFC (Near Field Communication) (registered trademark).
The data management device 5 is an example of an information processing device, and is a computer such as a PC that manages various data acquired by the data acquisition device 9. The data management device 5 receives various acquired data from the data acquisition device 9, and transfers the received various acquired data to the evaluation device 3 that performs data analysis. The method of transferring the various acquired data from the data management device 5 to the evaluation device 3 may be manual transfer using a USB (Universal Serial Bus) memory or the like.
The evaluation device 3 is a computer such as a PC that evaluates the condition of the slope based on various acquired data transferred from the data management device 5. A dedicated application program for evaluating the condition of the slope is installed in the evaluation device 3. The evaluation device 3 detects the type or structure of the slope from the captured image data and sensor data, extracts shape data, and performs a detailed analysis by detecting the presence or absence of deformation and the degree of deformation. The evaluation device 3 also generates a report to be submitted to a road administrator such as the country, local government, or a commissioned business operator, using the captured image data, sensor data, evaluation target data, and the detailed analysis results. The data of the report generated by the evaluation device 3 is submitted to the country or local government via the commissioned business operator in the form of electronic data or printed paper. The report generated by the evaluation device 3 is called an investigation record sheet, inspection sheet, investigation ledger, or report. The evaluation device 3 is not limited to a PC, and may be a smartphone or tablet terminal. The evaluation system 4 may be configured to construct the evaluation device 3 and the data management device 5 as a single device or terminal.
 端末装置1200は、委託事業者に備えられており、端末装置1100は、国、自治体に備えられている。評価装置3、端末装置1100および端末装置1200は、データ管理装置5と通信可能な通信端末の例であり、データ管理装置5で管理される各種データが閲覧可能である。 The terminal device 1200 is provided to the commissioned business operator, and the terminal device 1100 is provided to the national or local government. The evaluation device 3, the terminal device 1100, and the terminal device 1200 are examples of communication terminals capable of communicating with the data management device 5, and various data managed by the data management device 5 can be viewed.
 図2は、本実施形態の撮影方法及び情報処理方法を実施するための状態検査システム1を用いて法面状態を検査する様子の一例を示す図である。移動体システム60は、データ取得装置9を搭載した移動体6を道路上に走行させながら、撮影装置7で法面(対象物の法面)の所定範囲を撮影していく。 FIG. 2 is a diagram showing an example of how the condition of a slope is inspected using a condition inspection system 1 for implementing the imaging method and information processing method of this embodiment. The mobile system 60 drives a mobile body 6 equipped with a data acquisition device 9 along a road while capturing images of a predetermined area of the slope (the slope of the target object) using an imaging device 7.
　ここで、図2に示されているように、法面のうち、削った斜面を切土法面、土を盛った斜面を盛土法面という。また、山の脇に通した道路において、側面にある斜面のことを自然斜面という。切土法面や盛土法面は、法面の表面に植物を植えることで耐久性が増し、そのまま数十年変化させないで済ませられることがある。しかしながら、このようなケースばかりではなく、風雨等によって切土法面、盛土法面および自然斜面の劣化が進むと、表面の岩や土が落ちてくる表層崩壊や山が崩れて道路封鎖に至る崩壊が起こる。このような事態を避けるため、斜面の表面には、モルタルを吹き付けたり(モルタル吹付)、コンクリートの構造物を設置し固めることで斜面が風雨にさらされて劣化する速度を遅くしたりする手法が取られている。このような手法によって構築された構造物を土工構造物という。土工構造物には、自然斜面と道路の間に設置する擁壁や落石が道路に落下することを防ぐ落石防護柵等が存在するが、いずれも道路への土砂や落石などの流出による道路封鎖または人的被害を未然に防ぐためのものである。 Here, as shown in Figure 2, among slopes, a slope formed by cutting into the ground is called a cut slope, and a slope formed by piling up soil is called an embankment slope. In addition, on a road running along the side of a mountain, the slope at its side is called a natural slope. Cut slopes and embankment slopes can be made more durable by planting vegetation on their surface, and can sometimes be left unchanged for decades. However, this is not always the case: when cut slopes, embankment slopes, and natural slopes deteriorate due to wind and rain, surface-layer collapses occur in which rocks and soil fall from the surface, or the mountain itself collapses, leading to road closures. To prevent such situations, methods are used such as spraying mortar onto the surface of the slope (mortar spraying) or installing and hardening concrete structures, thereby slowing the rate at which the slope deteriorates when exposed to wind and rain. Structures constructed by such methods are called earthwork structures. Earthwork structures include retaining walls installed between natural slopes and roads, and rockfall protection fences that prevent rocks from falling onto the road; all of them are intended to prevent road closures or human injury caused by soil or rocks flowing onto the road.
 近年、施工から数十年経過した土工構造物の劣化が著しく、社会インフラの整備が大きな課題となっている。そのため、土工構造物の劣化を早期に発見し、土工構造物を長持ちさせるための点検および老朽化保全が重要となる。従来の自然斜面および土工構造物の点検は、斜面の落石、崩壊、地滑りまたは土石流を調査して修繕計画を作成するもので、専門家による目視点検によって行われていた。 In recent years, the deterioration of earthwork structures that were constructed decades ago has been significant, making the development of social infrastructure a major issue. For this reason, it is important to detect deterioration of earthwork structures early and to inspect and protect them from deterioration in order to prolong their life. Conventionally, inspections of natural slopes and earthwork structures have been carried out by visual inspections by experts, investigating rock falls, collapses, landslides, or debris flows on the slopes and creating repair plans.
　しかしながら、専門家による目視点検では、日本中に大量にある土工構造物を一定期間に点検しきれないことや高所や川沿いの盛土等を点検できないといった効率面の問題に加えて、目視点検では、土工構造物表層に発生するひびまたは剥離といった変状の劣化の進行度合いを定量的に把握できなかった。 However, visual inspection by experts has efficiency problems, such as the inability to inspect the vast number of earthwork structures throughout Japan within a given period and the inability to inspect embankments in high places or along rivers. In addition, visual inspection cannot quantitatively grasp how far deformations such as cracks or peeling occurring on the surface of earthwork structures have progressed.
　そこで、実施形態に係る状態検査システム1は、撮影装置7によって土工構造物斜面の撮影画像データを取得し、距離センサ8a等の三次元センサによって三次元情報を含むセンサデータを取得する。そして、評価システム4は、取得された撮影画像データとセンサデータを組み合わせて法面状態の評価を行うことで、法面の三次元形状を示す形状データを検出するとともに、ひびや剥離といった変状を検出する。これによって、状態検査システム1は、人間の目視では点検が困難な評価を効率的に行うことができる。 Therefore, the condition inspection system 1 according to the embodiment acquires captured image data of the slope of an earthwork structure using the photographing device 7, and acquires sensor data including three-dimensional information using a three-dimensional sensor such as the distance sensor 8a. The evaluation system 4 then evaluates the slope condition by combining the acquired captured image data and sensor data, thereby detecting shape data indicating the three-dimensional shape of the slope and detecting deformations such as cracks and peeling. This allows the condition inspection system 1 to efficiently perform evaluations that are difficult to carry out by human visual inspection.
 図3は、法面の状態の一例を示す図である。図3(a)は、崩壊5年前の法面の表面を示す画像であり、図3(b)は、図3(a)に示す画像の説明図である。この状態は、法面表層のひび割れが目立つ段階であり、ひび割れ、剥離、湧水等の表層の変状または変状の予兆を検出するために、展開図等に示される画像解析が有効である。 Figure 3 shows an example of the condition of a slope. Figure 3(a) is an image showing the surface of the slope five years before the collapse, and Figure 3(b) is an explanatory diagram of the image shown in Figure 3(a). At this stage, cracks in the surface layer of the slope are noticeable, and image analysis shown in an unfolded view, etc. is effective in detecting changes or signs of changes in the surface layer, such as cracks, peeling, and seepage.
　図3(c)は、崩壊2年前の法面の表面を示す画像であり、図3(d)は、図3(c)に示す画像の説明図である。この状態は、法面内部が土砂化し、土砂が法面表層を押し、斜面が膨らんだ段階であり、ひび割れを伴う段差、はらみだし等の三次元的な変状を検出するために、展開図等の画像+断面図等の三次元解析が有効である。 Figure 3(c) is an image showing the surface of the slope two years before the collapse, and Figure 3(d) is an explanatory diagram of the image shown in Figure 3(c). At this stage, the interior of the slope has turned into loose soil, the soil has pushed against the slope surface, and the slope has bulged. To detect three-dimensional deformations such as steps accompanied by cracks and bulging, three-dimensional analysis that combines images such as unfolded views with cross-sections is effective.
 図3(e)は、崩壊した法面の表面を示す画像であり、図3(f)は、図3(e)に示す画像の説明図である。この状態は、法面表層が土砂を抑えきれず崩壊している。 Figure 3(e) is an image showing the surface of the collapsed slope, and Figure 3(f) is an explanatory diagram of the image shown in Figure 3(e). In this state, the surface layer of the slope is unable to contain the soil and has collapsed.
 図4は、データ取得装置9のハードウエア構成の一例を示す図である。データ取得装置9は、図1に示されているような撮影装置7およびセンサ装置8とともに、データ取得装置9の処理または動作を制御するコントローラ900を備える。 FIG. 4 is a diagram showing an example of the hardware configuration of the data acquisition device 9. The data acquisition device 9 includes the image capture device 7 and the sensor device 8 as shown in FIG. 1, as well as a controller 900 that controls the processing or operation of the data acquisition device 9.
 コントローラ900は、撮影装置I/F(Interface)901、センサ装置I/F902、バスライン910、CPU(Central Processing Unit)911、ROM(Read Only Memory)912、RAM(Random Access Memory)913、HD(Hard Disk)914、HDD(Hard Disk Drive)コントローラ915、ネットワークI/F916、DVD-RW(Digital Versatile Disk Rewritable)ドライブ918、メディアI/F922、外部機器接続I/F923およびタイマ924を備えている。 The controller 900 includes an imaging device I/F (Interface) 901, a sensor device I/F 902, a bus line 910, a CPU (Central Processing Unit) 911, a ROM (Read Only Memory) 912, a RAM (Random Access Memory) 913, a HD (Hard Disk) 914, a HDD (Hard Disk Drive) controller 915, a network I/F 916, a DVD-RW (Digital Versatile Disk Rewritable) drive 918, a media I/F 922, an external device connection I/F 923, and a timer 924.
 これらのうち、撮影装置I/F901は、撮影装置7との間で各種データまたは情報の送受信を行うためのインターフェースである。センサ装置I/F902は、センサ装置8との間で各種データまたは情報の送受信を行うためのインターフェースである。バスライン910は、図4に示されているCPU911等の各構成要素を電気的に接続するためのアドレスバスやデータバス等である。 Of these, the imaging device I/F 901 is an interface for transmitting and receiving various data or information to and from the imaging device 7. The sensor device I/F 902 is an interface for transmitting and receiving various data or information to and from the sensor device 8. The bus line 910 is an address bus, data bus, etc. for electrically connecting each component such as the CPU 911 shown in FIG. 4.
 また、CPU911は、データ取得装置9全体の動作を制御する。ROM912は、IPL等のCPU911の駆動に用いられるプログラムを記憶する。RAM913は、CPU911のワークエリアとして使用される。HD914は、プログラム等の各種データを記憶する。HDDコントローラ915は、CPU911の制御にしたがってHD914に対する各種データの読み出しまたは書き込みを制御する。ネットワークI/F916は、通信ネットワーク100を利用してデータ通信をするためのインターフェースである。 The CPU 911 also controls the operation of the entire data acquisition device 9. The ROM 912 stores programs used to drive the CPU 911, such as IPL. The RAM 913 is used as a work area for the CPU 911. The HD 914 stores various data such as programs. The HDD controller 915 controls the reading and writing of various data from the HD 914 under the control of the CPU 911. The network I/F 916 is an interface for data communication using the communication network 100.
 DVD-RWドライブ918は、着脱可能な記録媒体の一例としてのDVD-RW917に対する各種データの読み出しまたは書き込みを制御する。なお、DVD-RWに限らず、DVD-RやBlu-ray(登録商標) Disc(ブルーレイディスク)等であってもよい。 The DVD-RW drive 918 controls the reading and writing of various data from and to a DVD-RW 917, which is an example of a removable recording medium. Note that the medium is not limited to a DVD-RW, and may be a DVD-R or a Blu-ray (registered trademark) Disc, etc.
 メディアI/F922は、フラッシュメモリ等の記録メディア921に対するデータの読み出しまたは書き込み(記憶)を制御する。外部機器接続I/F923は、ディスプレイ、受付部および表示制御部を有する外部PC930等の外部機器を接続するためのインターフェースである。タイマ924は、時間計測機能を有する計測装置である。タイマ924は、コンピュータによるソフトタイマでもよい。タイマ924は、GNSSのセンサ8bの時刻と同期することが好ましい。これにより、各センサデータおよび撮影画像データは、時刻の同期、位置の対応付けが容易になる。 The media I/F 922 controls the reading and writing (storing) of data from and to a recording medium 921 such as a flash memory. The external device connection I/F 923 is an interface for connecting an external device such as an external PC 930 having a display, a reception unit, and a display control unit. The timer 924 is a measurement device with a time measurement function. The timer 924 may be a computer-based software timer. It is preferable that the timer 924 be synchronized with the time of the GNSS sensor 8b. This makes it easy to synchronize the time and associate the positions of each sensor data and captured image data.
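As an illustration of the time-based association mentioned above (not part of the embodiment itself), the following Python sketch pairs each captured-image timestamp with the closest sensor (e.g. GNSS) timestamp. The function name and the 0.05-second tolerance are assumptions chosen for the example.

```python
from bisect import bisect_left

def match_by_timestamp(image_times, sensor_times, tolerance=0.05):
    """For each image timestamp, return the index of the closest sensor
    timestamp (both in seconds), or None if nothing lies within tolerance."""
    matches = []
    for t in image_times:
        i = bisect_left(sensor_times, t)
        # Candidates: the sensor sample just before and just after t.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(sensor_times)]
        best = min(candidates, key=lambda j: abs(sensor_times[j] - t))
        matches.append(best if abs(sensor_times[best] - t) <= tolerance else None)
    return matches

# Images every 0.1 s; GNSS fixes every 0.2 s (illustrative numbers).
image_times = [0.0, 0.1, 0.2, 0.3]
sensor_times = [0.0, 0.2, 0.4]
print(match_by_timestamp(image_times, sensor_times))  # [0, None, 1, None]
```

Synchronizing the timer 924 with the GNSS sensor 8b keeps both timestamp streams on a common clock, which is what makes this kind of nearest-neighbor matching meaningful.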
 図5は、移動体システムによって取得される撮影画像について説明するための図である。 Figure 5 is a diagram explaining the captured images acquired by the mobile system.
 移動体システム60は、移動体6を走行させながら、データ取得装置9に備えられた撮影装置7を用いて、道路上に設けられた法面を撮影する。図5に示すX軸方向は、移動体6の移動方向を示し、Y軸方向は鉛直方向、Z軸方向は、X軸方向およびY軸方向に直交し、移動体6から法面に向かう奥行き方向を示す。 The mobile body system 60 uses the imaging device 7 provided in the data acquisition device 9 to capture images of slopes on the road while the mobile body 6 is traveling. The X-axis direction shown in FIG. 5 indicates the direction of movement of the mobile body 6, the Y-axis direction is the vertical direction, and the Z-axis direction is perpendicular to the X-axis and Y-axis directions and indicates the depth direction from the mobile body 6 toward the slope.
 データ取得装置9は、移動体6の走行に伴い、図5に示されているように、撮影画像1、測距画像1および撮影画像2、測距画像2を時系列に取得していく。測距画像1および測距画像2は、距離センサ8aにより取得された画像である。このとき、撮影装置7およびセンサ装置8は、時刻同期が取られており、撮影画像1および測距画像1と、撮影画像2および測距画像2とは、夫々法面の同じ領域に対する画像となる。また、撮影時の車両の姿勢から撮影画像の傾き補正(画像補正)が行われ、撮影画像の時刻から画像データと測位データ(北緯東経)が紐づけられる。 As the mobile unit 6 travels, the data acquisition device 9 acquires photographed image 1, distance-measured image 1, photographed image 2, and distance-measured image 2 in chronological order, as shown in FIG. 5. Distance-measured image 1 and distance-measured image 2 are images acquired by distance sensor 8a. At this time, the photographing device 7 and sensor device 8 are time-synchronized, so photographed image 1 and distance-measured image 1, and photographed image 2 and distance-measured image 2 are images of the same area of the slope. In addition, tilt correction (image correction) of the photographed image is performed based on the attitude of the vehicle at the time of shooting, and the image data and positioning data (north latitude and east longitude) are linked based on the time of the photographed image.
 このように、移動体システム60は、移動体6としての車両を走行させながら、法面が撮影された撮影画像データおよび撮影装置7の撮影に応じて取得されたセンサデータを取得し、データ管理装置5に対してアップロードする。なお、データ取得装置9は、測距画像と撮影画像をそれぞれ別々の走行時に取得してもよいが、崩落等による法面形状の変化を考慮すると、同一の法面形状に対して同一の走行時に測距画像と撮影画像を取得することが好ましい。 In this way, the mobile system 60 acquires photographed image data of the slope and sensor data acquired in response to photographing by the photographing device 7 while driving the vehicle as the mobile body 6, and uploads them to the data management device 5. Note that the data acquisition device 9 may acquire the ranging images and the photographed images while driving separately, but considering changes in the slope shape due to collapses, etc., it is preferable to acquire the ranging images and the photographed images for the same slope shape while driving at the same time.
 図6は、撮影画像と測距画像の説明図である。 Figure 6 is an explanatory diagram of the captured image and the distance measurement image.
　図6(a)は、図5に示した撮影画像1、2等の撮影画像データ7Aを示す。撮影装置7により取得された撮影画像データ7Aの各画素7A1は、図5に示したX軸方向およびY軸方向に対応する座標で配置されており、蓄電量に対応する輝度情報を有する。すなわち、撮影画像データ7Aは、輝度画像の一例である。 FIG. 6(a) shows captured image data 7A of captured images 1, 2, etc. shown in FIG. 5. Each pixel 7A1 of the captured image data 7A acquired by the photographing device 7 is arranged at coordinates corresponding to the X-axis and Y-axis directions shown in FIG. 5, and has brightness information corresponding to the amount of accumulated charge. In other words, the captured image data 7A is an example of a brightness image.
 そして、撮影画像データ7Aの各画素7A1の輝度情報は、図5に示したX軸方向およびY軸方向に対応する座標に対応付けて、撮影画像データとして、記憶部に記憶される。 Then, the brightness information of each pixel 7A1 of the captured image data 7A is stored in the storage unit as captured image data in association with coordinates corresponding to the X-axis and Y-axis directions shown in FIG. 5.
　図6(b)は、図5に示した測距画像1、2等の測距画像データ8Aを示す。距離センサ8aにより取得された測距画像データ8Aの各画素8A1は、図5に示したX軸方向およびY軸方向に対応する座標で配置されており、蓄電量に対応する図5に示したZ軸方向における距離情報を有する。なお、測距画像データ8Aは、三次元点群データであるが、一般的にユーザに視認させる際には、輝度情報を付与して可視表示するため、測距画像データと称する。そして、撮影画像データ7Aおよび測距画像データ8Aを総称して、画像データと称する。 FIG. 6(b) shows distance measurement image data 8A such as distance measurement images 1 and 2 shown in FIG. 5. Each pixel 8A1 of the distance measurement image data 8A acquired by the distance sensor 8a is arranged at coordinates corresponding to the X-axis and Y-axis directions shown in FIG. 5, and has distance information in the Z-axis direction shown in FIG. 5 corresponding to the amount of accumulated charge. Note that the distance measurement image data 8A is three-dimensional point cloud data, but it is called distance measurement image data because, when presented to a user, it is generally displayed visibly with luminance information added. The captured image data 7A and the distance measurement image data 8A are collectively referred to as image data.
 そして、測距画像データ8Aの各画素8A1の距離情報は、図5に示したX軸方向およびY軸方向に対応する座標に対応付けて、センサデータに含まれる三次元データとして、記憶部に記憶される。 The distance information for each pixel 8A1 in the distance measurement image data 8A is then stored in the storage unit as three-dimensional data included in the sensor data, in association with coordinates corresponding to the X-axis and Y-axis directions shown in FIG. 5.
 ここで、図6(a)に示す撮影画像データ7Aと、図6(b)に示す測距画像データ8Aは、夫々法面の同じ領域に対する画像であるから、図に示したX軸方向およびY軸方向に対応する座標に対応付けて、輝度情報および距離情報が、記憶部に記憶されることになる。 Here, the captured image data 7A shown in FIG. 6(a) and the distance measurement image data 8A shown in FIG. 6(b) are images of the same area of the slope, so the brightness information and distance information are stored in the memory unit in association with the coordinates corresponding to the X-axis and Y-axis directions shown in the figure.
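Because the two images cover the same area, the per-pixel association described above amounts to keying luminance and distance to the same (X, Y) coordinates. A minimal Python sketch (illustrative only; the record layout is an assumption, not the embodiment's storage format):

```python
def fuse_pixels(luminance, distance):
    """Combine a luminance image and a range image of identical shape into
    one record per pixel: (x, y, luminance, distance)."""
    assert len(luminance) == len(distance)
    fused = []
    for y, (lrow, drow) in enumerate(zip(luminance, distance)):
        assert len(lrow) == len(drow)
        for x, (lum, dist) in enumerate(zip(lrow, drow)):
            fused.append((x, y, lum, dist))
    return fused

# 2x2 toy images: luminance in arbitrary units, distance in metres.
lum = [[120, 130],
       [125, 128]]
dist = [[5.0, 5.1],
        [5.0, 5.2]]
print(fuse_pixels(lum, dist)[0])  # (0, 0, 120, 5.0)
```

Each fused record carries both the surface appearance (for detecting cracks and peeling) and the depth (for detecting bulging), which is what enables the combined two- and three-dimensional analysis.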
 図7は、複数の撮影領域の説明図である。図7(a)に示すように、撮影装置7は、移動体6とともに移動しながら、移動体6の移動方向であるX軸方向に沿って一定の撮影間隔tで、斜面80の撮影範囲である対象領域70を複数の撮影領域d11、d12・・・に分けて撮影する。 FIG. 7 is an explanatory diagram of multiple shooting areas. As shown in FIG. 7(a), the shooting device 7 moves together with the moving body 6 and captures images of the target area 70, which is the shooting range of the slope 80, divided into multiple shooting areas d11, d12, etc., at a constant shooting interval t along the X-axis direction, which is the moving direction of the moving body 6.
 図7(b)に示すように、複数の撮影領域d11、d12・・・を撮影した撮影画像はY軸方向に長いスリット状の撮影画像であり、これら複数の撮影領域d11、d12・・・を撮像した画像をつなぎ合わせることにより、X軸方向に連続する対象領域70の撮影画像を得ることができる。 As shown in FIG. 7(b), the captured images of multiple shooting areas d11, d12, etc. are long slit-shaped images in the Y-axis direction, and by stitching together the images of multiple shooting areas d11, d12, etc., it is possible to obtain a captured image of the target area 70 that is continuous in the X-axis direction.
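For the slit images to tile the target area without gaps, the distance the moving body travels between exposures must not exceed the width each slit covers on the slope. A hedged sketch of this constraint (the numbers are illustrative, not taken from the embodiment):

```python
def max_shooting_interval(slit_width_m, speed_kmh):
    """Longest interval between exposures (seconds) such that consecutive
    slit images still tile the slope without gaps in the X direction."""
    speed_ms = speed_kmh / 3.6          # km/h -> m/s
    return slit_width_m / speed_ms

# A 0.5 m-wide slit photographed from a vehicle at 36 km/h:
print(round(max_shooting_interval(0.5, 36.0), 3))  # 0.05, i.e. 20 frames/s
```

In practice a shorter interval would normally be chosen so that adjacent slits overlap slightly, giving the stitching step some margin.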
　図7(c)は、対象領域70全体を撮像する際に、対象領域70全体を複数の対象領域に分けて撮像する場合を示した図である。図7(c)では、対象領域70全体は、撮影画像を複数の対象領域701A、702A、701Bおよび702Bという4つの対象領域に分けて撮像される。 FIG. 7(c) is a diagram showing a case where, when the entire target area 70 is imaged, it is imaged by being divided into a plurality of target areas. In FIG. 7(c), the entire target area 70 is imaged by being divided into four target areas: 701A, 702A, 701B, and 702B.
 図7(b)に示す場合と同様に、複数の対象領域701A、702A、701Bおよび702Bのそれぞれは、複数の撮影領域d11、d12・・・に分けて撮像され、これら、複数の撮影領域d11、d12・・・を撮像した画像をつなぎ合わせることにより、複数の対象領域701A、702A、701Bおよび702Bそれぞれの撮影画像を得ることができる。そして、複数の対象領域701A、702A、701Bおよび702Bのそれぞれを撮影した撮影画像をつなぎ合わせることにより、対象領域70の全体の撮影画像を得ることができる。 As in the case shown in FIG. 7(b), each of the multiple target areas 701A, 702A, 701B, and 702B is divided into multiple shooting areas d11, d12, etc., and images of the multiple shooting areas d11, d12, etc. are stitched together to obtain images of each of the multiple target areas 701A, 702A, 701B, and 702B. Then, by stitching together images of the multiple target areas 701A, 702A, 701B, and 702B, an image of the entire target area 70 can be obtained.
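The stitching described above can be sketched as concatenating equal-height slit images side by side along the X direction. This is a simplified illustration assuming perfectly aligned, non-overlapping slits (real stitching would also handle overlap and registration):

```python
def stitch_slits(slits):
    """Concatenate slit images (each a list of rows of equal height) side by
    side along the X direction to form one continuous image."""
    height = len(slits[0])
    assert all(len(s) == height for s in slits)
    return [sum((s[y] for s in slits), []) for y in range(height)]

# Three 2-pixel-tall, 1-pixel-wide slits standing in for d11, d12, d13:
d11 = [[1], [4]]
d12 = [[2], [5]]
d13 = [[3], [6]]
print(stitch_slits([d11, d12, d13]))  # [[1, 2, 3], [4, 5, 6]]
```

Stitching the target areas 701A, 702A, 701B, and 702B into the full target area 70 is the same operation applied a second time at a coarser level.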
 この場合、撮影装置7は、複数の撮影装置を備えており、対象領域702Aおよび702Bは、対象領域701Aおよび701Bを撮影する撮影装置とは異なる撮影装置により撮影される。 In this case, the image capture device 7 is equipped with multiple image capture devices, and the target areas 702A and 702B are captured by an image capture device different from the image capture device that captures the target areas 701A and 701B.
 また、対象領域701Bは、対象領域701Aを撮影したのと同じ撮影装置により撮影され、対象領域702Bも、対象領域702Aを撮影したのと同じ撮影装置により撮影されてもよい。 In addition, the target area 701B may be photographed by the same imaging device that photographed the target area 701A, and the target area 702B may also be photographed by the same imaging device that photographed the target area 702A.
 なお、図7(a)に示したように、撮影装置7が、斜面80の対象領域を複数の撮影領域d11、d12・・・に分けて撮影するタイミングで、距離センサ8aも、距離センサ8aから複数の撮影領域d11、d12・・・のそれぞれへの距離を示す距離情報を取得することが望ましい。 As shown in FIG. 7(a), when the image capturing device 7 captures the target area of the slope 80 divided into a plurality of image capturing areas d11, d12, etc., it is desirable that the distance sensor 8a also acquires distance information indicating the distance from the distance sensor 8a to each of the plurality of image capturing areas d11, d12, etc.
 これにより、図6で説明したように、撮影装置7により取得された撮影画像データ7Aの各画素7A1の輝度情報と、距離センサ8aにより取得された測距画像データ8Aの各画素8A1の距離情報を容易に対応付けることができる。そして、斜面80の対象領域を撮影した撮像画像の各画像の輝度情報と斜面80の対象領域を測距した測距画像の各画素の距離情報とを対応付けすることで、斜面80の対象領域について、高精度な検査を行うことが出来る。 As a result, as explained in FIG. 6, it is possible to easily associate the luminance information of each pixel 7A1 of the photographed image data 7A acquired by the photographing device 7 with the distance information of each pixel 8A1 of the distance measurement image data 8A acquired by the distance sensor 8a. Then, by associating the luminance information of each image of the captured image capturing the target area of the slope 80 with the distance information of each pixel of the distance measurement image capturing the target area of the slope 80, it is possible to perform a highly accurate inspection of the target area of the slope 80.
 このように、本実施形態の状態検査システム1は、撮影装置(カメラ)を搭載した移動体(車両)で法面脇の道路を走行しながら、法面の撮影を行う。その際、撮影装置(カメラ)を複数用意して、各々の撮影装置(カメラ)によって法面を移動体(車両)の進行方向(X軸方向)に分割して撮影する。また、各々の撮影装置(カメラ)の撮影タイミングを同期させるようにして、法面の高さ方向(Y軸方向)の領域を分割して撮影する。このように、法面を移動体(車両)の進行方向(X軸方向)に輪切りにしながら撮影して、各輪切り画像を撮影後に繋ぎ合わせることで法面全体の表面画像を所定の解像度(例えば4K)で得ることができる。これにより法面の変状の点検が可能となり、繋ぎ合わせた画像を縮小することで法面の広い範囲の状態を確認することが可能になる。 In this way, the condition inspection system 1 of this embodiment photographs the slope while a mobile body (vehicle) equipped with an imaging device (camera) travels along a road beside the slope. In this case, multiple imaging devices (cameras) are prepared, and the slope is divided and photographed in the traveling direction (X-axis direction) of the mobile body (vehicle) by each imaging device (camera). In addition, the imaging timing of each imaging device (camera) is synchronized to divide and photograph the area in the height direction (Y-axis direction) of the slope. In this way, the slope is sliced and photographed in the traveling direction (X-axis direction) of the mobile body (vehicle), and each slice image is stitched together after shooting to obtain a surface image of the entire slope at a specified resolution (e.g. 4K). This makes it possible to inspect the slope for deformation, and by reducing the stitched image, it becomes possible to check the condition of a wide area of the slope.
 図8は、本実施形態の撮影装置を搭載した移動体の構成の一例を示す図である。図8に示すように、X軸方向に移動する移動体(車両)6には、X軸方向の位置を異ならせて、複数の撮影装置(カメラ)7A、7B、7C、7D、7Eが搭載されている。撮影装置7A、7B、7C、7D、7Eは、この順に、移動体6の進行方向の前側から後側に配置されている。撮影装置7A、7B、7C、7D、7Eは、カメラ1、2、3、4、5と呼んでもよい。撮影装置7Eの後側には、距離センサ8a(例えばLiDAR)が搭載されている。なお、撮影装置7A~7E及び距離センサ8aの配置は図8に例示したものに限定されず、種々の設計変更が可能である。 FIG. 8 is a diagram showing an example of the configuration of a moving body equipped with an image capture device of this embodiment. As shown in FIG. 8, a moving body (vehicle) 6 moving in the X-axis direction is equipped with multiple image capture devices (cameras) 7A, 7B, 7C, 7D, and 7E at different positions in the X-axis direction. The image capture devices 7A, 7B, 7C, 7D, and 7E are arranged in this order from the front to the rear of the moving body 6 in the direction of travel. The image capture devices 7A, 7B, 7C, 7D, and 7E may also be called cameras 1, 2, 3, 4, and 5. A distance sensor 8a (e.g., LiDAR) is mounted on the rear side of the image capture device 7E. Note that the arrangement of the image capture devices 7A to 7E and the distance sensor 8a is not limited to that shown in FIG. 8, and various design changes are possible.
 対象物である法面(法面を含む対象物)の高さ方向について、1台の撮影装置の画角内に収まらない場合があるので、複数台の撮影装置、ここでは5台の撮影装置7A~7Eを用いて、対象物である法面(法面を含む対象物)の高さ方向の領域を分割して撮影する。対象物である法面(法面を含む対象物)の高さ方向を下から上に向かって第1領域、第2領域、第3領域、第4領域、第5領域に分割した場合において、第1領域を撮影装置7Aが撮影し、第2領域を撮影装置7Bが撮影し、第3領域を撮影装置7Cが撮影し、第4領域を撮影装置7Dが撮影し、第5領域を撮影装置7Eが撮影する。つまり、撮影装置7Aが法面の最も下側を撮影するカメラであり、撮影装置7Eが法面の最も上側を撮影するカメラである。以下では、対象物である法面(法面を含む対象物)の高さが15m又はその近傍であるものとして説明する。 There are cases where the height direction of the slope (object including a slope) of the object does not fit within the angle of view of a single camera, so multiple camera devices, here five camera devices 7A to 7E, are used to divide and photograph the area of the slope (object including a slope) of the object in the height direction. When the height direction of the slope (object including a slope) of the object is divided from bottom to top into a first area, a second area, a third area, a fourth area, and a fifth area, the first area is photographed by camera device 7A, the second area is photographed by camera device 7B, the third area is photographed by camera device 7C, the fourth area is photographed by camera device 7D, and the fifth area is photographed by camera device 7E. In other words, camera device 7A is the camera that photographs the lowest part of the slope, and camera device 7E is the camera that photographs the highest part of the slope. In the following explanation, it is assumed that the height of the slope (object including a slope) of the object is 15m or close to that.
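The division of the slope height into five regions can be illustrated as follows. Note that the embodiment does not state that the regions are of equal height; the even split below is an assumption made purely for the example.

```python
def height_bands(total_height_m, n_cameras):
    """Split the slope height evenly into n_cameras bands, bottom to top.
    Returns a (low, high) pair in metres for each camera."""
    band = total_height_m / n_cameras
    return [(i * band, (i + 1) * band) for i in range(n_cameras)]

# Five cameras 7A..7E covering a 15 m slope:
for cam, (lo, hi) in zip("ABCDE", height_bands(15.0, 5)):
    print(f"camera 7{cam}: {lo:.0f}-{hi:.0f} m")
# camera 7A: 0-3 m, ..., camera 7E: 12-15 m
```

Under this assumed split, camera 7A covers the lowest band and camera 7E the highest, matching the first-to-fifth region assignment above.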
 このように、本実施形態の撮影方法は、移動体(車両)6に設置された撮影装置(カメラ)7A、7B、7C、7D、7Eにより、移動しながら対象物(法面)を撮影するものである。また、撮影装置(カメラ)7A、7B、7C、7D、7Eは、移動体(車両)6の移動方向と交差するZ軸方向に位置する対象物(法面)における「撮影対象領域」を撮影するものである。この「撮影対象領域」は、「第1の対象領域」と、Y軸方向において第1の対象領域よりも上側に位置する「第2の対象領域」とを含んでいる。また、「撮影装置」は、「第1の対象領域」を撮影する「第1の撮影装置」と、「第2の対象領域」を撮影する「第2の撮影装置」とを含んでいる。本実施形態の例に当て込むと、撮影装置7A-7Dの全部又は一部が「第1の対象領域」を撮影する「第1の撮影装置」に相当し、撮影装置7Eが「第2の対象領域」を撮影する「第2の撮影装置」に相当する。 In this way, the photographing method of this embodiment involves photographing an object (slope) while moving using photographing devices (cameras) 7A, 7B, 7C, 7D, and 7E installed on a moving body (vehicle) 6. Furthermore, the photographing devices (cameras) 7A, 7B, 7C, 7D, and 7E photograph a "photographed target area" of the object (slope) located in the Z-axis direction that intersects with the moving direction of the moving body (vehicle) 6. This "photographed target area" includes a "first target area" and a "second target area" located above the first target area in the Y-axis direction. Furthermore, the "photographing devices" include a "first photographing device" that photographs the "first target area" and a "second photographing device" that photographs the "second target area". In the example of this embodiment, all or part of the photographing devices 7A-7D correspond to the "first photographing device" that photographs the "first target area", and the photographing device 7E corresponds to the "second photographing device" that photographs the "second target area".
 ここで、法面の最も上側(15m付近)を撮影する撮影装置7E(カメラ5)は、移動体6に対する移動方向(X軸方向)と交差する方向(Z軸方向)の設置位置が調整される「設置位置可変撮影部」である。後述する「設置位置設定ステップ」では、移動体(車両)6における撮影装置7E(第2の撮影装置)のZ軸方向の設置位置が設定(調整)される。 Here, the photographing device 7E (camera 5) that photographs the uppermost part of the slope (around 15 m) is an "installation position variable photographing unit" whose installation position in the direction (Z axis direction) intersecting with the moving direction (X axis direction) relative to the moving body 6 is adjusted. In the "installation position setting step" described later, the installation position in the Z axis direction of the photographing device 7E (second photographing device) on the moving body (vehicle) 6 is set (adjusted).
 図9(a)、(b)、(c)は、設置位置可変撮影部の構成の一例を示す図である。図中の紙面直交方向がX軸方向、すなわち移動体6の移動方向であり、図中の右方向がZ軸方向、すなわち対象物である法面(法面を含む対象物)が存在する方向である。なお、設置位置可変撮影部の構成(設置位置調整機構の構成)は、図9(a)~(c)で例示したものに限定されず、種々の設計変更が可能である。 Figures 9(a), (b), and (c) are diagrams showing an example of the configuration of the variable installation position imaging unit. The direction perpendicular to the paper surface in the figure is the X-axis direction, i.e., the direction of movement of the moving body 6, and the rightward direction in the figure is the Z-axis direction, i.e., the direction in which the slope of the target object (target object including a slope) exists. Note that the configuration of the variable installation position imaging unit (configuration of the installation position adjustment mechanism) is not limited to the examples shown in Figures 9(a) to (c), and various design modifications are possible.
 図9(a)では、移動体6の天井部にZ軸方向に離間した2つのカメラ取付部71、72が設けられており、撮影装置7E(カメラ5)を2つのカメラ取付部71、72のいずれに取り付けるかによって、移動体6に対する移動方向(X軸方向)と交差する方向(Z軸方向)の設置位置が調整される。撮影装置7E(カメラ5)をカメラ取付部71に取り付けた場合には、対象物である法面(法面を含む対象物)への距離を近付けることができ、撮影装置7E(カメラ5)をカメラ取付部72に取り付けた場合には、対象物である法面(法面を含む対象物)への距離を遠ざけることができる(二段階設定が可能である)。また、撮影装置(カメラ取付部71、72)は、Z軸方向のみならずY軸方向へも移動した位置でもよい(位置調整可能としてもよい)。図9(a)では、撮影装置7E(カメラ5)をカメラ取付部71に取り付けた場合を例示的に描いている。 In FIG. 9(a), two camera mounting parts 71 and 72 are provided on the ceiling of the moving body 6, spaced apart in the Z-axis direction, and the installation position in the direction (Z-axis direction) intersecting the moving direction (X-axis direction) relative to the moving body 6 is adjusted depending on which of the two camera mounting parts 71 and 72 the imaging device 7E (camera 5) is attached to. When the imaging device 7E (camera 5) is attached to the camera mounting part 71, the distance to the target slope (target including a slope) can be reduced, and when the imaging device 7E (camera 5) is attached to the camera mounting part 72, the distance to the target slope (target including a slope) can be increased (two-stage setting is possible). In addition, the imaging devices (camera mounting parts 71 and 72) may be moved not only in the Z-axis direction but also in the Y-axis direction (position adjustment may be possible). FIG. 9(a) illustrates an example in which the imaging device 7E (camera 5) is attached to the camera mounting part 71.
 図9(b)では、移動体6の天井部にZ軸方向に離間した3つのカメラ取付部71、72、73が設けられており、撮影装置7E(カメラ5)を3つのカメラ取付部71、72、73のいずれに取り付けるかによって、移動体6に対する移動方向(X軸方向)と交差する方向(Z軸方向)の設置位置が調整される。撮影装置7E(カメラ5)をカメラ取付部71に取り付けた場合には、対象物である法面(法面を含む対象物)への距離を最も近付けることができ、撮影装置7E(カメラ5)をカメラ取付部72に取り付けた場合には、対象物である法面(法面を含む対象物)への距離を最も遠ざけることができ、撮影装置7E(カメラ5)をカメラ取付部73に取り付けた場合には、対象物である法面(法面を含む対象物)への距離を中間に設定することができる(三段階設定が可能である)。また、撮影装置(カメラ取付部71、72、73)は、Z軸方向のみならずY軸方向へも移動した位置でもよい(位置調整可能としてもよい)。図9(b)では、撮影装置7E(カメラ5)をカメラ取付部73に取り付けた場合を例示的に描いている。 In FIG. 9(b), three camera mounting parts 71, 72, 73 spaced apart in the Z-axis direction are provided on the ceiling of the moving body 6, and the installation position in the direction (Z-axis direction) intersecting the direction of movement (X-axis direction) relative to the moving body 6 is adjusted depending on which of the three camera mounting parts 71, 72, 73 the imaging device 7E (camera 5) is attached to. When the imaging device 7E (camera 5) is attached to the camera mounting part 71, the distance to the target slope (target including a slope) can be made the closest, when the imaging device 7E (camera 5) is attached to the camera mounting part 72, the distance to the target slope (target including a slope) can be made the furthest, and when the imaging device 7E (camera 5) is attached to the camera mounting part 73, the distance to the target slope (target including a slope) can be set to an intermediate position (three-stage setting is possible). In addition, the photographing devices ( camera mounting parts 71, 72, 73) may be moved not only in the Z-axis direction but also in the Y-axis direction (their positions may be adjustable). Figure 9(b) shows an example in which the photographing device 7E (camera 5) is mounted on the camera mounting part 73.
 図9(c)では、移動体6の天井部にZ軸方向へのスライド移動が可能なスライド機構付きカメラ取付部74が設けられており、撮影装置7E(カメラ5)をスライド機構付きカメラ取付部74に取り付けてスライドさせることによって、移動体6に対する移動方向(X軸方向)と交差する方向(Z軸方向)の設置位置が調整される(無段階設定が可能である)。図9(c)では、撮影装置7E(カメラ5)をスライド機構付きカメラ取付部74に取り付けてスライドさせることによって、対象物である法面(法面を含む対象物)から最も遠ざけた場合を例示的に描いている。また、撮影装置をスライド機構付きカメラ取付部74に傾けて取り付けてスライドさせることによって、移動体6に対する移動方向(X軸方向)と交差する方向(Z軸方向)のみならず、Y軸方向へも移動した設置位置が調整されてもよい。 In FIG. 9(c), a camera mounting part 74 with a sliding mechanism capable of sliding in the Z-axis direction is provided on the ceiling of the moving body 6, and the installation position in the direction (Z-axis direction) intersecting the moving direction (X-axis direction) relative to the moving body 6 is adjusted (stepless setting is possible) by attaching the imaging device 7E (camera 5) to the camera mounting part 74 with a sliding mechanism and sliding it. FIG. 9(c) illustrates an example in which the imaging device 7E (camera 5) is attached to the camera mounting part 74 with a sliding mechanism and slid to be farthest from the target slope (target including a slope). In addition, the installation position may be adjusted not only in the direction (Z-axis direction) intersecting the moving direction (X-axis direction) relative to the moving body 6, but also in the Y-axis direction by tilting the imaging device and attaching it to the camera mounting part 74 with a sliding mechanism and sliding it.
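For illustration, the two-step, three-step, and stepless mounting schemes of FIGS. 9(a) to 9(c) can be summarized as a small sketch. This is a minimal sketch: the function name, the travel limits `z_min`/`z_max`, and the use of the mount numbers 71-74 as selectors are illustrative assumptions, not part of the publication.

```python
def z_offset(mode, selection, z_min=0.0, z_max=1.0):
    """Return the Z-axis offset of the imaging device 7E (camera 5).

    mode: 'two_step' (FIG. 9(a)), 'three_step' (FIG. 9(b)), or
          'stepless' (FIG. 9(c), slide mechanism 74).
    selection: a mount number (int) or a slide fraction 0..1 (float).
    z_min/z_max are illustrative travel limits, not from the publication.
    """
    if mode == "two_step":          # mount 71 (near the slope), 72 (far)
        return {71: z_min, 72: z_max}[selection]
    if mode == "three_step":        # 71 near, 73 intermediate, 72 far
        return {71: z_min, 73: (z_min + z_max) / 2, 72: z_max}[selection]
    if mode == "stepless":          # slide: any fraction of the travel
        return z_min + max(0.0, min(1.0, selection)) * (z_max - z_min)
    raise ValueError(mode)
```

With the three-step scheme, mount 73 lands midway between mounts 71 and 72, mirroring the "intermediate" distance described for FIG. 9(b).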
 撮影装置7E(カメラ5)以外の撮影装置7A、7B、7C、7D(カメラ1、2、3、4)は、移動体6に対する移動方向(X軸方向)と交差する方向(Z軸方向)の設置位置が固定された「設置位置固定撮影部」である。 The imaging devices 7A, 7B, 7C, and 7D (cameras 1, 2, 3, and 4) other than the imaging device 7E (camera 5) are "fixed installation position imaging units" whose installation positions are fixed in the direction (Z-axis direction) that intersects with the moving direction (X-axis direction) relative to the moving body 6.
 従って、設置位置可変撮影部(撮影装置7E(カメラ5))は、対象物の法面の相対的に高い撮影位置を撮影し、設置位置固定撮影部(撮影装置7A~7D(カメラ1~4))は、対象物の法面の相対的に低い撮影位置を撮影することになる。 Therefore, the variable installation position photographing unit (photographing device 7E (camera 5)) photographs a relatively high position on the slope of the object, and the fixed installation position photographing unit (photographing devices 7A to 7D (cameras 1 to 4)) photographs a relatively low position on the slope of the object.
　また、5台の撮影装置7A~7E(カメラ1~5)のうち、撮影装置7A、7B(カメラ1、2)は、シャインプルーフ角が設定されない所謂「通常カメラ」であり、撮影装置7C~7E(カメラ3~5)は、シャインプルーフ角が設定される所謂「シャインプルーフカメラ」である。シャインプルーフカメラは、斜めに向かい合った平面全体が合焦して見えるような特殊なカメラの一例であって、「あおりカメラ」、「チルトマウントカメラ」と読み替えてもよい。 Furthermore, of the five imaging devices 7A-7E (cameras 1-5), imaging devices 7A and 7B (cameras 1 and 2) are so-called "normal cameras" for which no Scheimpflug angle is set, while imaging devices 7C-7E (cameras 3-5) are so-called "Scheimpflug cameras" for which a Scheimpflug angle is set. A Scheimpflug camera is an example of a special camera in which an entire plane that faces the camera obliquely appears in focus, and may also be read as a "tilt camera" or a "tilt-mount camera."
 撮影装置7A~7E(カメラ1~5)は、イメージセンサと撮影レンズを有しており、シャインプルーフカメラは、イメージセンサのセンサ面に対する垂線と撮影レンズの中心軸(光軸)とのなす角度を示すシャインプルーフ角θ’が0度以外に設定されている。逆に、通常カメラは、イメージセンサのセンサ面に対する垂線と撮影レンズの中心軸(光軸)とが同軸(平行)であって角度0度となっている。 The photographing devices 7A to 7E (cameras 1 to 5) have an image sensor and a photographing lens, and in a Scheimpflug camera, the Scheimpflug angle θ', which indicates the angle between the perpendicular to the sensor surface of the image sensor and the central axis (optical axis) of the photographing lens, is set to a value other than 0 degrees. Conversely, in a normal camera, the perpendicular to the sensor surface of the image sensor and the central axis (optical axis) of the photographing lens are coaxial (parallel) and have an angle of 0 degrees.
 シャインプルーフカメラである撮影装置7C~7E(カメラ3~5)は、対象物の法面の高さ方向の撮影位置に応じたシャインプルーフ角を有している。例えば、本実施形態では、撮影装置7C、7D(カメラ3、4)(設置位置固定撮影部)のシャインプルーフ角が0.7度に設定されており、撮影装置7E(カメラ5)(設置位置可変撮影部)のシャインプルーフ角が1.1度に設定されている。なお、撮影装置7C~7E(カメラ3~5)のシャインプルーフ角はあくまで一例にすぎず、対象物の法面の傾斜角度の範囲等に応じて適宜設定される。なお、本実施形態に用いる撮影装置は、イメージセンサと撮影レンズとの位置関係(イメージセンサの撮影面に対する撮影レンズの光軸の関係)を調整してシャインプルーフ角を動的に変更するような機構を搭載していてもよいが、本実施形態では、撮影装置7C~7E(カメラ3~5)のシャインプルーフ角は、製造時に一義的に決定されており、シャインプルーフ角を動的に変更するような機構は搭載していない(カメラ単体に注目したときにシャインプルーフ角は不変である)。例えば、撮影装置7E(第2の撮影装置)は、撮影装置7E(第2の撮影装置)における撮影レンズの光軸が撮影装置7E(第2の撮影装置)における撮像面に直交する軸に対して傾いた設定シャインプルーフ角が予め設定されている。 The imaging devices 7C to 7E (cameras 3 to 5), which are Scheimpflug cameras, have Scheimpflug angles that correspond to the imaging position in the height direction of the slope of the object. For example, in this embodiment, the Scheimpflug angle of the imaging devices 7C and 7D (cameras 3 and 4) (imaging units with fixed installation positions) is set to 0.7 degrees, and the Scheimpflug angle of the imaging device 7E (camera 5) (imaging unit with variable installation position) is set to 1.1 degrees. Note that the Scheimpflug angles of the imaging devices 7C to 7E (cameras 3 to 5) are merely examples and are set appropriately depending on the range of inclination angles of the slope of the object. The imaging device used in this embodiment may be equipped with a mechanism that dynamically changes the Scheimpflug angle by adjusting the positional relationship between the image sensor and the imaging lens (the relationship of the optical axis of the imaging lens to the imaging surface of the image sensor). However, in this embodiment, the Scheimpflug angles of the imaging devices 7C to 7E (cameras 3 to 5) are uniquely determined at the time of manufacture, and no mechanism is equipped to dynamically change the Scheimpflug angle (the Scheimpflug angle is invariable when focusing on the camera alone). 
For example, the imaging device 7E (second imaging device) has a preset Scheimpflug angle in which the optical axis of the imaging lens in the imaging device 7E (second imaging device) is tilted with respect to an axis perpendicular to the imaging surface of the imaging device 7E (second imaging device).
 図10は、設置位置可変撮影部、すなわち撮影装置7E(カメラ5)の設置位置の調整の一例を示す図である。図10は、移動体(車両)6における撮影装置7E(第2の撮影装置)のZ軸方向の設置位置を設定(調整)する「設置位置設定ステップ」に相当する。図10では、対象物である法面の高さが15mであり、対象物である法面の傾斜角が45度~73.3度程度の場合を例示している(対象物である法面の傾斜角が一定であるように描いているがこれは作図の便宜上の理由によるものである)。 FIG. 10 is a diagram showing an example of adjusting the installation position of the variable installation position photographing unit, i.e., the photographing device 7E (camera 5). FIG. 10 corresponds to an "installation position setting step" for setting (adjusting) the installation position in the Z-axis direction of the photographing device 7E (second photographing device) on the moving body (vehicle) 6. FIG. 10 illustrates an example in which the height of the slope, which is the subject, is 15 m, and the inclination angle of the slope, which is the subject, is approximately 45 degrees to 73.3 degrees (the inclination angle of the slope, which is the subject, is drawn as if it were constant, but this is for convenience of drawing).
 対象物である法面にはなだらかな傾斜から急な傾斜まで様々な角度が存在しているため、たとえシャインプルーフカメラを使用した場合でも、法面の表面にピントが合った画像を採取することが難しい場合が存在する。例えば、調書に記載する全景画像、変状を示す画像はピントが合っていなければ、点検調書として役目を果たすことができない。またヒビの画像からヒビ幅推定をするため、ピントはヒビ幅推定結果の誤差に影響を及ぼす。 The slope of the target object has a variety of angles, from gentle to steep, so even when using a Scheimpflug camera, it can be difficult to obtain an image that is in focus on the slope surface. For example, if the overall image to be included in the report and the image showing the deformation are not in focus, they will not be able to serve as an inspection report. Also, because the crack width is estimated from an image of the crack, the focus affects the error in the crack width estimation results.
 図10の例では、法面の傾斜角が45度~73.3度程度で、それぞれの高さが15m又はその近傍を撮影する場合において、移動体6の撮影基準位置(例えば図10中の右端の撮影装置7E(カメラ5)の設置位置(レンズ物体側主点位置))から法尻までの水平距離(後述する「撮影基準位置から対象物までの距離」)が1.75m~3.75mの範囲であるとき、シャインプルーフ角を1.1度に設定した撮影装置7E(カメラ5)を以ってしても、法面の表面にピントが合った画像を採取することが難しい場合が存在する。 In the example of Figure 10, when the slope angle is about 45 degrees to 73.3 degrees and the height of the slope is 15 m or thereabouts, if the horizontal distance from the shooting reference position of the moving body 6 (for example, the installation position (lens object side principal point position) of the shooting device 7E (camera 5) at the right end in Figure 10) to the end of the slope (the "distance from the shooting reference position to the subject" described below) is in the range of 1.75 m to 3.75 m, it may be difficult to capture an image in focus on the surface of the slope, even with the shooting device 7E (camera 5) with the Scheimpflug angle set to 1.1 degrees.
 そこで、本実施形態では、移動体(車両)6における撮影装置7E(第2の撮影装置)のZ軸方向の設置位置を設定(調整)する「設置位置設定ステップ」を設ける。具体的に、本実施形態では、移動体(車両)6の移動方向をX軸、移動体(車両)6の移動方向と交差する方向をZ軸、X軸とZ軸と直交する方向をY軸、とするXYZ直交座標系を用いる。 Therefore, in this embodiment, an "installation position setting step" is provided for setting (adjusting) the installation position in the Z-axis direction of the imaging device 7E (second imaging device) on the moving body (vehicle) 6. Specifically, in this embodiment, an XYZ Cartesian coordinate system is used, in which the movement direction of the moving body (vehicle) 6 is the X-axis, the direction intersecting the movement direction of the moving body (vehicle) 6 is the Z-axis, and the direction perpendicular to the X-axis and Z-axis is the Y-axis.
 ここで、移動体(車両)6が移動するXZ平面上に移動体(車両)6に設置された撮影装置7E(第2の撮影装置)の撮影基準位置を投影した位置からXZ平面上でXZ平面と対象物(例えば法面)とが交わる位置までの長さを「撮影基準位置から対象物までの距離」と定義する。また、XZ平面から対象物(例えば法面)における撮影対象領域までのY軸上の長さを「撮影対象領域の高さ」と定義する。撮影基準位置は、レンズ物体側主点位置と読み替えてもよい。 Here, the length from the position where the shooting reference position of the shooting device 7E (second shooting device) installed on the moving body (vehicle) 6 is projected onto the XZ plane on which the moving body (vehicle) 6 moves to the position on the XZ plane where the XZ plane intersects with the object (e.g. a slope) is defined as the "distance from the shooting reference position to the object." Also, the length on the Y axis from the XZ plane to the shooting target area on the object (e.g. a slope) is defined as the "height of the shooting target area." The shooting reference position may be interpreted as the object-side principal point position of the lens.
 本実施形態において、対象物は法面であり、移動体は車両であり、移動体が移動するXZ平面は、車両が移動する路面である。この場合、「撮影基準位置から対象物までの距離」は、「路面上に移動体(車両)に設置された撮影装置の撮影基準位置を投影した位置から法面が路面と接する法尻までの距離」と読み替えることができる。 In this embodiment, the object is a slope, the moving body is a vehicle, and the XZ plane on which the moving body moves is the road surface on which the vehicle moves. In this case, the "distance from the shooting reference position to the object" can be interpreted as the "distance from the position where the shooting reference position of the shooting device installed on the moving body (vehicle) is projected onto the road surface to the end of the slope where the slope meets the road surface."
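The two quantities defined above can be written down directly. A minimal sketch assuming the patent's coordinate frame (X = direction of travel, Z = lateral direction, Y = height above the road surface); the function names are hypothetical.

```python
import math

def distance_w(camera_ref_xyz, toe_xz):
    """Distance W: project the shooting reference position (the lens
    object-side principal point) onto the XZ road plane by dropping Y,
    then measure in that plane to the slope toe."""
    cx, _, cz = camera_ref_xyz
    tx, tz = toe_xz
    return math.hypot(tx - cx, tz - cz)

def height_d(target_xyz):
    """Height D of the photographing target area: its Y coordinate
    above the XZ plane on which the vehicle moves."""
    return target_xyz[1]
```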
 そして、「設置位置設定ステップ」では、「撮影基準位置から対象物までの距離」、及び「撮影対象領域の高さ」の少なくとも一方に応じて、移動体(車両)6の移動方向と交差するZ軸方向について、撮影装置7E(第2の撮影装置)の設置位置を設定(調整)する。 Then, in the "installation position setting step", the installation position of the imaging device 7E (second imaging device) is set (adjusted) in the Z-axis direction that intersects with the direction of movement of the moving body (vehicle) 6, depending on at least one of the "distance from the imaging reference position to the target object" and the "height of the imaging target area".
 「撮影基準位置から対象物までの距離」、及び「撮影対象領域の高さ」は、撮影対象領域に合焦するための理論シャインプルーフ角を求めるためのパラメータであり、理論シャインプルーフ角と撮影装置7E(第2の撮影装置)に予め設定されている設定シャインプルーフ角との差分に応じて、撮影装置7E(第2の撮影装置)の設置位置を設定(調整)する。具体的に、理論シャインプルーフ角と設定シャインプルーフ角との差が所定範囲以上の場合に、撮影装置7E(第2の撮影装置)の設置位置をZ軸方向に変更する(例えば対象物である法面から遠ざける)ことで理論シャインプルーフ角を変化させることにより、理論シャインプルーフ角と設定シャインプルーフ角との差を所定範囲内とする。 The "distance from the shooting reference position to the object" and the "height of the object to be shot" are parameters for determining the theoretical Scheimpflug angle for focusing on the object to be shot, and the installation position of the shooting device 7E (second shooting device) is set (adjusted) according to the difference between the theoretical Scheimpflug angle and the set Scheimpflug angle that is set in advance in the shooting device 7E (second shooting device). Specifically, when the difference between the theoretical Scheimpflug angle and the set Scheimpflug angle is equal to or greater than a specified range, the installation position of the shooting device 7E (second shooting device) is changed in the Z-axis direction (for example, away from the slope of the object) to change the theoretical Scheimpflug angle, thereby bringing the difference between the theoretical Scheimpflug angle and the set Scheimpflug angle within a specified range.
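The comparison in this step can be sketched as below. The publication does not spell out the formula for the theoretical Scheimpflug angle; the sketch assumes Merklinger's hinge rule J = f / sin(theta), where J is the distance from the lens to the hinge line of the desired focus plane (J itself would be derived from the "distance W", the "height D", and the camera pose, a mapping omitted here). The focal length and the tolerance value are illustrative assumptions.

```python
import math

def theoretical_scheimpflug_deg(f_m, hinge_dist_m):
    """Hinge rule (Merklinger): J = f / sin(theta) => theta = asin(f / J).
    f_m: focal length [m]; hinge_dist_m: lens-to-hinge-line distance J [m]."""
    return math.degrees(math.asin(f_m / hinge_dist_m))

def needs_reposition(theta_theory_deg, theta_set_deg, tol_deg=0.2):
    """True when the fixed (set) Scheimpflug angle misses the theoretical
    value by at least the tolerance, i.e. the device should be moved in Z
    (e.g. away from the slope) to change the theoretical angle instead."""
    return abs(theta_theory_deg - theta_set_deg) >= tol_deg
```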
 図10において、「撮影基準位置から対象物までの距離W」が2.75m以上かつ3.75m以下であるときは、撮影装置7E(カメラ5)が図10中の右端に設置されていても法面の表面にピントが合った画像を採取することができる。しかし、「撮影基準位置から対象物までの距離W」が1.75m以上かつ2.75m未満であるときは、撮影装置7E(カメラ5)が図10中の右端に設置されていると法面の表面にピントが合った画像を採取することが難しくなってしまう。 In Figure 10, when the "distance W from the shooting reference position to the target object" is 2.75 m or more and 3.75 m or less, an image focused on the slope surface can be captured even if the shooting device 7E (camera 5) is installed at the right end of Figure 10. However, when the "distance W from the shooting reference position to the target object" is 1.75 m or more and less than 2.75 m, it becomes difficult to capture an image focused on the slope surface if the shooting device 7E (camera 5) is installed at the right end of Figure 10.
 そこで、本実施形態では、「撮影基準位置から対象物までの距離W」に応じて、移動体6に対する移動方向(X軸方向)と交差する方向(Z軸方向)の撮影装置7E(カメラ5)の設置位置を調整する。「撮影基準位置から対象物までの距離W」が2.75m以上かつ3.75m以下であるときは、撮影装置7E(カメラ5)の設置位置を図10中の右端に調整する。一方、「撮影基準位置から対象物までの距離W」が1.75m以上かつ2.75m未満であるときは、撮影装置7E(カメラ5)の設置位置を図10中の左端に調整する。このように、設置位置可変撮影部としての撮影装置7E(カメラ5)は、「撮影基準位置から対象物までの距離W」に応じて、移動体6に対する移動方向(X軸方向)と交差する方向(Z軸方向)の設置位置が調整される。 Therefore, in this embodiment, the installation position of the imaging device 7E (camera 5) in the direction (Z-axis direction) intersecting with the moving direction (X-axis direction) relative to the moving object 6 is adjusted according to the "distance W from the imaging reference position to the object". When the "distance W from the imaging reference position to the object" is 2.75 m or more and 3.75 m or less, the installation position of the imaging device 7E (camera 5) is adjusted to the right end in FIG. 10. On the other hand, when the "distance W from the imaging reference position to the object" is 1.75 m or more and less than 2.75 m, the installation position of the imaging device 7E (camera 5) is adjusted to the left end in FIG. 10. In this way, the installation position of the imaging device 7E (camera 5) as an installation position variable imaging unit is adjusted in the direction (Z-axis direction) intersecting with the moving direction (X-axis direction) relative to the moving object 6 according to the "distance W from the imaging reference position to the object".
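The two-way choice described above reduces to a threshold on the distance W. The sketch below is a direct transcription of the FIG. 10 ranges; the function name and the 'left'/'right' labels (for the left-end and right-end mounting positions in FIG. 10) are illustrative.

```python
def select_mount_position(w_m):
    """Choose the Z-axis mounting position of camera 5 from distance W [m]."""
    if 2.75 <= w_m <= 3.75:
        return "right"   # right-end position still yields in-focus images
    if 1.75 <= w_m < 2.75:
        return "left"    # too close to the toe: retreat to the left end
    raise ValueError("W outside the 1.75 m to 3.75 m envelope of FIG. 10")
```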
　また、撮影装置7E(カメラ5)が法面の最も高い領域を撮影するためのものであることを踏まえると、設置位置可変撮影部としての撮影装置7E(カメラ5)は、「撮影対象領域の高さ(例えば撮影対象領域が第1、第2の対象領域のいずれであるか)」に応じて、移動体6に対する移動方向(X軸方向)と交差する方向(Z軸方向)の設置位置が調整されることになる。撮影装置(カメラ)が法面の低い位置を撮影するものである場合、そもそも、撮影装置(カメラ)を設置位置可変撮影部とする必要性が低く、設置位置固定撮影部で十分である。図10、図11では、「撮影対象領域の高さ」に符号Dを付している(撮影対象領域の高さDは最大で15mの範囲内で設定される)。撮影対象領域の高さDは、図11では、「光軸と法面の交点の高さ」と読み替えてもよい。 Also, considering that the photographing device 7E (camera 5) is for photographing the highest area of the slope, the photographing device 7E (camera 5) as a position-variable photographing unit has its position adjusted in the direction (Z-axis direction) intersecting with the moving direction (X-axis direction) relative to the moving body 6 depending on the "height of the photographed area (for example, whether the photographed area is the first or second target area)". If the photographing device (camera) is for photographing a low position on the slope, there is little need to make the photographing device (camera) a position-variable photographing unit in the first place, and a fixed-position photographing unit is sufficient. In Figures 10 and 11, the "height of the photographed area" is marked with the symbol D (the height D of the photographed area is set within a maximum range of 15 m). In Figure 11, the height D of the photographed area may be read as the "height of the intersection of the optical axis and the slope".
 このように、設置位置可変撮影部としての撮影装置7E(カメラ5)は、「撮影基準位置から対象物までの距離W」、及び「撮影対象領域の高さD」の少なくとも一方に応じて、移動体(車両)6の移動方向と交差するZ軸方向の設置位置が設定(調整)される。 In this way, the installation position of the imaging device 7E (camera 5) as a position-adjustable imaging unit is set (adjusted) in the Z-axis direction that intersects with the direction of movement of the moving body (vehicle) 6, depending on at least one of the "distance W from the imaging reference position to the target object" and the "height D of the imaging target area."
 設置位置可変撮影部としての撮影装置7E(カメラ5)は、「撮影基準位置から対象物までの距離W」の所定距離との関係(距離Wが1.75m~3.75mの範囲であるか否か、2.75m未満と2.75m以上のいずれであるか)、及び、「撮影対象領域の高さD」の所定高さとの関係(15m又はその近傍であるか否か、又は、最も高い法面高さであるか否か、撮影対象領域が第1、第2の対象領域のいずれであるか)に基づいて、移動体6に対する移動方向(X軸方向)と交差する方向(Z軸方向)の設置位置が調整される。 The installation position of the imaging device 7E (camera 5) as a position-adjustable imaging unit is adjusted in the direction (Z-axis direction) that intersects with the moving direction (X-axis direction) relative to the moving body 6 based on the relationship of the "distance W from the imaging reference position to the target object" to a predetermined distance (whether the distance W is in the range of 1.75m to 3.75m, or whether it is less than 2.75m or 2.75m or more) and the relationship of the "height D of the imaging target area" to a predetermined height (whether it is 15m or close to that, or whether it is the highest slope height, or whether the imaging target area is the first or second target area).
 別言すると、「撮影基準位置から対象物までの距離W」及び「撮影対象領域の高さD」に応じた法面の角度等のシャインプルーフ角の式から求めたシャインプルーフ角理論値に対する許容範囲を超えているか否か(詳細は後述)に基づいて、移動体6に対する移動方向(X軸方向)と交差する方向(Z軸方向)の撮影装置7E(カメラ5)の設置位置が調整される。シャインプルーフ角を一定にするとは、1つのカメラに関することであり、他のカメラとシャインプルーフ角を共通にするものではない。各カメラ(ここでは撮影装置7C~7E(カメラ3~5))には、それぞれシャインプルーフ角が設定されている。つまり、後述するシャインプルーフ角の式から求めたシャインプルーフ角理論値に対する許容範囲で最適なシャインプルーフ角がそれぞれのカメラに対して設定されている。また各カメラに設定されているシャインプルーフ角は変更しない。 In other words, the installation position of the imaging device 7E (camera 5) in the direction (Z axis direction) intersecting the moving direction (X axis direction) relative to the moving body 6 is adjusted based on whether or not the slope angle, etc., according to the "distance W from the imaging reference position to the object" and the "height D of the imaging target area" exceeds the allowable range for the theoretical value of the Scheimpflug angle calculated from the Scheimpflug angle formula (details will be described later). Keeping the Scheimpflug angle constant refers to one camera, and does not mean that the Scheimpflug angle is made common to the other cameras. Each camera (here, imaging devices 7C to 7E (cameras 3 to 5)) has its own Scheimpflug angle set. In other words, the optimal Scheimpflug angle is set for each camera within the allowable range for the theoretical value of the Scheimpflug angle calculated from the Scheimpflug angle formula described later. In addition, the Scheimpflug angle set for each camera is not changed.
 一般的に、シャインプルーフ角を変更する機構として、シャインプルーフ角ごとにレンズアダプタを用意し、所望のシャインプルーフ角とするためにレンズアダプタを交換する方法がある。これに対し、本実施形態では、撮影装置(第2の撮影装置)7EをZ軸方向に移動する機構を備えることで、レンズアダプタを交換する作業が不要となり、作業効率を上げることができる。また、シャインプルーフ角を動的に変更するような機構を搭載していないカメラでもピントが合った撮影が可能となる。さらに、撮影装置7C~7E(カメラ3~5)はシャインプルーフ角が固定された構造であるので、カメラとレンズの剛性を高いレベルで確保(保証)することができる。 Generally, a mechanism for changing the Scheimpflug angle involves preparing a lens adapter for each Scheimpflug angle and replacing the lens adapter to obtain the desired Scheimpflug angle. In contrast, in this embodiment, a mechanism for moving the imaging device (second imaging device) 7E in the Z-axis direction is provided, eliminating the need to replace the lens adapter and improving work efficiency. In addition, in-focus photography is possible even with a camera that does not have a mechanism for dynamically changing the Scheimpflug angle. Furthermore, since the imaging devices 7C to 7E (cameras 3 to 5) have a structure in which the Scheimpflug angle is fixed, a high level of rigidity can be ensured (guaranteed) for the cameras and lenses.
 ここで、「撮影基準位置」は、既述したように、移動体6における撮影装置の設置位置(本実施形態ではレンズ物体側主点位置)であるので、例えば、撮影時における移動体6の走行ラインで一義的に規定され、「所定の撮影基準位置」と読み替えてもよい。そうすると、「撮影基準位置から対象物までの距離W」、つまり「路面上に移動体(車両)に設置された撮影装置の撮影基準位置を投影した位置から法面が路面と接する法尻までの距離W」は、例えば、法面と走行する道路との間の環境の違い、すなわち歩道、側溝、ガードレールの有無により異なり得る。本実施形態では、法面撮影の前に、「撮影基準位置から対象物までの距離W」、つまり「路面上に移動体(車両)に設置された撮影装置の撮影基準位置を投影した位置から法面が路面と接する法尻までの距離W」を事前に把握するべく、撮影現場の環境(歩道、側溝、ガードレールの有無)について、例えば、GNSSセンサ8bの出力を利用する、地図等で事前確認する、撮影前に一度走行する、前回の撮影結果を利用する等、何れかの手段を採用することができる。そして、設置位置可変撮影部としての撮影装置7E(カメラ5)の設置位置を一旦調整した後は、撮影時にこれを動かすことはなく、合焦に関する撮影条件は固定(一定)にして撮影を実行する。 As already mentioned, the "photographing reference position" is the installation position of the camera on the moving body 6 (in this embodiment, the object-side principal point position of the lens), and may be interpreted as a "predetermined photography reference position", for example, uniquely defined by the driving line of the moving body 6 at the time of shooting. In that case, the "distance W from the photography reference position to the object", that is, the "distance W from the position where the photography reference position of the camera installed on the moving body (vehicle) is projected onto the road surface to the toe of the slope where the slope meets the road surface", may differ depending on, for example, differences in the environment between the slope and the road along which the vehicle is traveling, that is, the presence or absence of a sidewalk, a gutter, or a guardrail. 
In this embodiment, before photographing the slope, the "distance W from the photographing reference position to the object," that is, the "distance W from the position where the photographing reference position of the photographing device installed on the moving body (vehicle) is projected onto the road surface to the end of the slope where the slope meets the road surface," can be grasped in advance by using the output of the GNSS sensor 8b, checking in advance on a map, etc., driving once before photographing, using the results of the previous photographing, or any other means. Then, once the installation position of the photographing device 7E (camera 5) as the position-variable photographing unit is adjusted, it is not moved during photographing, and photographing is performed with the photographing conditions related to focus fixed (constant).
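The four ways of obtaining the distance W before photographing (the GNSS sensor 8b output, a map check, a preliminary drive, the result of the previous shoot) can be sketched as a fallback chain. The priority order and the function shape are assumptions for illustration, since the publication lists the means only as alternatives.

```python
def obtain_distance_w(gnss_fix=None, map_value=None,
                      survey_run=None, previous_result=None):
    """Return distance W [m] from whichever source is available, trying
    the sources in the order the publication lists them (order assumed)."""
    for w in (gnss_fix, map_value, survey_run, previous_result):
        if w is not None:
            return w
    raise LookupError("distance W must be known before photographing")
```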
 一般に、通常の撮影においてシャインプルーフ角を設定する場合には、被写体の傾斜角に応じてシャインプルーフ角を設定する。しかしながら、撮影時の手間を考えると、法面撮影を行う撮影現場において、法面の傾斜角に応じて、適宜シャインプルーフ角の設定を変更することはせず、設定されたシャインプルーフ角を疑似的に変更したい(一定の幅を持つシャインプルーフ角をカバーしたい)という要求があることに本発明者は着目した。 Generally, when setting the Scheimpflug angle in normal photography, the Scheimpflug angle is set according to the inclination angle of the subject. However, considering the effort required for photography, the inventors have noticed that when photographing slopes, there is a demand to artificially change the set Scheimpflug angle (to cover a Scheimpflug angle with a certain range) rather than simply changing the Scheimpflug angle setting as appropriate according to the inclination angle of the slope.
 上述した通り、「撮影基準位置から対象物までの距離W」、つまり「路面上に移動体(車両)に設置された撮影装置の撮影基準位置を投影した位置から法面が路面と接する法尻までの距離W」は、撮影現場の環境(歩道、側溝、ガードレールの有無)によって異なり得る。本実施形態では、法面の傾斜角が45度~73.3度程度であり、「撮影基準位置から対象物までの距離W」、つまり「路面上に移動体(車両)に設置された撮影装置の撮影基準位置を投影した位置から法面が路面と接する法尻までの距離W」が1.75m~3.75mである場合について、撮影装置7C~7E(カメラ3~5)のそれぞれに設定されたシャインプルーフ角(撮影装置7C、7D(カメラ3、4)は0.7度、撮影装置7E(カメラ5)は1.1度)を変更せずに撮影することを前提としている。 As mentioned above, the "distance W from the shooting reference position to the object," i.e., the "distance W from the position where the shooting reference position of the shooting device installed on the moving body (vehicle) is projected onto the road surface to the end of the slope where the slope meets the road surface," may vary depending on the environment of the shooting site (presence or absence of sidewalks, gutters, and guardrails). In this embodiment, the slope angle is approximately 45 degrees to 73.3 degrees, and the "distance W from the shooting reference position to the object," i.e., the "distance W from the position where the shooting reference position of the shooting device installed on the moving body (vehicle) is projected onto the road surface to the end of the slope where the slope meets the road surface," is 1.75 m to 3.75 m. It is assumed that shooting will be performed without changing the Scheimpflug angle set for each of the shooting devices 7C to 7E (cameras 3 to 5) (0.7 degrees for shooting devices 7C and 7D (cameras 3 and 4), and 1.1 degrees for shooting device 7E (camera 5)).
 但し、本実施形態では、撮影装置7C、7D(カメラ3、4)については、シャインプルーフ角0.7度で撮影しても合焦した撮影が可能であるが、シャインプルーフ角1.1度の撮影装置7E(カメラ5)については、「撮影基準位置から対象物までの距離W」、つまり「路面上に移動体(車両)に設置された撮影装置の撮影基準位置を投影した位置から法面が路面と接する法尻までの距離W」が1.75m近傍になると(法面に近付きすぎると)、合焦した撮影が難しくなってしまうおそれがある。この問題を解決するために、撮影装置7E(カメラ5)を「設置位置可変撮影部」として、その移動体6上の設置位置を距離Wが大きくなる方向(Z方向軸において法面(法尻)から遠ざかる方向)に移動させることで、シャインプルーフ角1.1度の撮影装置7E(カメラ5)でも合焦した撮影を実現することができる。 However, in this embodiment, for the imaging devices 7C and 7D (cameras 3 and 4), in-focus imaging is possible even when the Scheimpflug angle is 0.7 degrees. However, for the imaging device 7E (camera 5) with a Scheimpflug angle of 1.1 degrees, when the "distance W from the imaging reference position to the object," that is, the "distance W from the position where the imaging reference position of the imaging device installed on the moving body (vehicle) is projected onto the road surface to the end of the slope where the slope meets the road surface," approaches 1.75 m (when getting too close to the slope), it may become difficult to capture in focus. To solve this problem, the imaging device 7E (camera 5) is used as a "position-variable imaging unit" and its installation position on the moving body 6 is moved in the direction in which the distance W increases (the direction moving away from the slope (end of the slope) on the Z-axis), so that in-focus imaging can be achieved even with the imaging device 7E (camera 5) with a Scheimpflug angle of 1.1 degrees.
 その意味で、設置位置可変撮影部としての撮影装置7E(カメラ5)の移動体6に対する移動方向(X軸方向)と交差する方向(Z軸方向)の設置位置を調整することで、撮影装置7E(カメラ5)の理論シャインプルーフ角が疑似的に変更される。すなわち、撮影装置7E(カメラ5)自体が持つ設定シャインプルーフ角は同じであるが(角度変更部は備えないが)、撮影装置7E(カメラ5)のスライド位置を調整することにより理論シャインプルーフ角を「疑似的」に変更している。別言すると、設置位置可変撮影部としての撮影装置7E(カメラ5)の移動体6に対する移動方向(X軸方向)と交差する方向(Z軸方向)の設置位置を調整し、撮影装置7E(カメラ5)の仰角を下げることで、撮影装置7E(カメラ5)自体が持つ設定シャインプルーフ角を変更することなく、設置位置可変撮影部としての撮影装置7E(カメラ5)の「合焦面角度」を変更(調整)することができる。 In this sense, the theoretical Scheimpflug angle of the photographing device 7E (camera 5) is changed in a pseudo manner by adjusting the installation position of the photographing device 7E (camera 5) as a position-variable photographing unit in the direction (Z-axis direction) intersecting with the moving direction (X-axis direction) relative to the moving body 6. That is, the set Scheimpflug angle of the photographing device 7E (camera 5) itself is the same (although it does not have an angle change unit), but the theoretical Scheimpflug angle is "pseudo" changed by adjusting the slide position of the photographing device 7E (camera 5). In other words, by adjusting the installation position of the photographing device 7E (camera 5) as a position-variable photographing unit in the direction (Z-axis direction) intersecting with the moving direction (X-axis direction) relative to the moving body 6 and lowering the elevation angle of the photographing device 7E (camera 5), the "focus plane angle" of the photographing device 7E (camera 5) as a position-variable photographing unit can be changed (adjusted) without changing the set Scheimpflug angle of the photographing device 7E (camera 5) itself.
 ここで、撮影装置の「合焦面」とは、撮影装置のレンズの光軸上の合焦位置(点)を含む平面を意味している。設定シャインプルーフ角が0°の場合は、撮影装置(撮像素子)の撮像面(物体側主平面)と合焦面は平行であり、設定シャインプルーフ角が0°以外の場合、合焦面は、設定シャインプルーフ角に応じて、物体側主平面と平行な平面に対して角度が付く。本実施形態では、この物体側主平面と平行な平面に対する合焦面の角度を「合焦面角度」と定義する。そして、合焦距離が変化すれば、撮影装置自体に設定されているシャインプルーフ角を固定した状態でも、合焦面の角度が変化する。従って、上述したように、設置位置可変撮影部としての撮影装置7E(カメラ5)の移動体6に対する移動方向(X軸方向)と交差する方向(Z軸方向)の設置位置を調整し、撮影装置7E(カメラ5)の仰角を下げることで、撮影装置7E(カメラ5)自体が持つシャインプルーフ角を固定したままで、設置位置可変撮影部としての撮影装置7E(カメラ5)の「合焦面角度」を変更(調整)することができる。なお、図11では、設置位置可変撮影部としての撮影装置7E(カメラ5)の「合焦面角度」に符号βを付して描いている。また、図11、図12では、シャインプルーフ角が設定されたカメラと撮影レンズ(鏡筒)を模式的に描くとともに、そこに「レンズ物体側主点位置」を描いている。「レンズ物体側主点位置」は、光軸と物体側主平面の交点である。 Here, the "focus plane" of the imaging device means a plane that includes the focus position (point) on the optical axis of the lens of the imaging device. When the set Scheimpflug angle is 0°, the imaging surface (object-side principal plane) of the imaging device (image sensor) and the focus plane are parallel, and when the set Scheimpflug angle is other than 0°, the focus plane is angled with respect to a plane parallel to the object-side principal plane according to the set Scheimpflug angle. In this embodiment, the angle of the focus plane with respect to this plane parallel to the object-side principal plane is defined as the "focus plane angle." And if the focusing distance changes, the angle of the focus plane changes even when the Scheimpflug angle set in the imaging device itself is fixed. 
Therefore, as described above, by adjusting the installation position of the image capturing device 7E (camera 5) as the installation position variable image capturing unit in the direction (Z axis direction) intersecting with the moving direction (X axis direction) relative to the moving body 6 and lowering the elevation angle of the image capturing device 7E (camera 5), it is possible to change (adjust) the "focal plane angle" of the image capturing device 7E (camera 5) as the installation position variable image capturing unit while keeping the Scheimpflug angle of the image capturing device 7E (camera 5) itself fixed. Note that in FIG. 11, the "focal plane angle" of the image capturing device 7E (camera 5) as the installation position variable image capturing unit is depicted with the symbol β. Also, in FIG. 11 and FIG. 12, the camera and the image capturing lens (lens barrel) with the Scheimpflug angle set are depicted as schematics, and the "lens object side principal point position" is depicted there. The "lens object side principal point position" is the intersection point of the optical axis and the object side principal plane.
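The statement that the focus-plane angle β changes with the focusing distance even at a fixed Scheimpflug angle can be made concrete with the hinge rule. This is a sketch under thin-lens assumptions; the focal length used in the example is illustrative, since the publication does not state f.

```python
import math

def focus_plane_angle_deg(f_m, theta_set_deg, focus_dist_m):
    """Angle beta between the plane of sharp focus and the plane parallel
    to the object-side principal plane (Merklinger's hinge rule):
        J = f / sin(theta)    # hinge distance
        tan(beta) = u / J     # u: axial focusing distance
    theta is fixed at manufacture, yet beta still varies with u, which is
    what sliding camera 5 in the Z-axis direction exploits."""
    j = f_m / math.sin(math.radians(theta_set_deg))
    return math.degrees(math.atan(focus_dist_m / j))
```

For example, with an assumed f = 35 mm and the fixed 1.1-degree angle of camera 5, J is roughly 1.8 m, and focusing at about 5 m tilts the focus plane to roughly 70 degrees, i.e. toward the steep end of the 45 to 73.3 degree slope range.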
　図11において、レンズ像側主点位置を通り、撮影レンズの光軸と直交する平面が「像側主平面」となっており、レンズ物体側主点位置を通り、撮影レンズの光軸と直交する平面が「物体側主平面」となっている。「像側主平面」と「物体側主平面」は、撮影レンズの光軸と直交している。シャインプルーフ角が設定されていない場合(シャインプルーフ角が0°の場合)には、撮影レンズの光軸は、「撮像素子の撮像面」と直交するので、「撮像素子の撮像面」と「物体側主平面」とは平行となる。一方、シャインプルーフ角を設定することにより、撮影レンズの光軸は、「撮像素子の撮像面」とは直交しなくなるので、「撮像素子の撮像面」と「物体側主平面」とは平行ではなくなる。図11に描いた「合焦面」について、シャインプルーフ角が設定されていない場合(シャインプルーフ角が0°の場合)には、「撮像素子の撮像面」と平行な面となるが、シャインプルーフ角が設定されることにより、「撮像素子の撮像面」に対して大きく傾く。このように、シャインプルーフ角が設定されていない場合よりも大きく傾き、法面の角度に近くなることで、法面における合焦範囲が広くなる。 In FIG. 11, the plane that passes through the lens image-side principal point position and is perpendicular to the optical axis of the photographing lens is the "image-side principal plane", and the plane that passes through the lens object-side principal point position and is perpendicular to the optical axis of the photographing lens is the "object-side principal plane". The "image-side principal plane" and the "object-side principal plane" are perpendicular to the optical axis of the photographing lens. When the Scheimpflug angle is not set (when the Scheimpflug angle is 0°), the optical axis of the photographing lens is perpendicular to the "imaging surface of the image sensor", so the "imaging surface of the image sensor" and the "object-side principal plane" are parallel. On the other hand, when the Scheimpflug angle is set, the optical axis of the photographing lens is no longer perpendicular to the "imaging surface of the image sensor", so the "imaging surface of the image sensor" and the "object-side principal plane" are no longer parallel. When the Scheimpflug angle is not set (when the Scheimpflug angle is 0°), the "focus plane" depicted in FIG. 11 is parallel to the "imaging surface of the image sensor", but when the Scheimpflug angle is set, it is significantly tilted with respect to the "imaging surface of the image sensor". In this way, the focus plane tilts more steeply than when no Scheimpflug angle is set and approaches the angle of the slope, so the in-focus range on the slope becomes wider.
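The parallel and tilt relationships described above are instances of the standard Scheimpflug construction; a compact statement (textbook optics, with angles measured from the object-side principal plane rather than from the sensor normal used for θ' in this publication, so the symbols below are not the publication's):

$$\tan\theta_{\mathrm{img}} = \frac{v}{u}\,\tan\theta_{\mathrm{obj}}, \qquad J = \frac{f}{\sin\theta'},$$

where $u$ and $v$ are the axial object and image distances, $\theta_{\mathrm{obj}}$ and $\theta_{\mathrm{img}}$ are the tilts of the plane of sharp focus and of the image plane measured from the object-side principal plane, $f$ is the focal length, and $J$ is the distance from the lens to the hinge line about which the plane of sharp focus rotates as the focusing distance changes.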
In this way, according to this embodiment, when photographing a slope (the photographing area of the object) within a predetermined inclination angle range (e.g., 45 degrees to 73.3 degrees) and within the expected distance range (e.g., 1.75 m to 3.75 m) from the toe of the slope (the reference position of the object) to the photographing reference position, the slope can be photographed without changing the Scheimpflug angle set on the camera. More specifically, when the "distance from the photographing reference position to the object" is at the near end of the predetermined range, the installation position of the camera that photographs the photographing target area located at a high position (e.g., the second target area) is changed so that this distance becomes larger; a focused image of that area can then be obtained without changing the Scheimpflug angle of that camera.
Next, using the concept (keyword) of "allowable deviation from the theoretical Scheimpflug angle", it will be explained that the photographing devices 7C and 7D (cameras 3 and 4) can capture in-focus images even when photographing with a Scheimpflug angle of 0.7 degrees, whereas for the photographing device 7E (camera 5) with a Scheimpflug angle of 1.1 degrees, in-focus photographing may become difficult when the "distance W from the photographing reference position to the object", that is, "the distance W from the position where the photographing reference position of the photographing device installed on the moving body (vehicle) is projected onto the road surface to the toe where the slope meets the road surface", approaches 1.75 m (i.e., when the device comes too close to the slope).
FIGS. 11, 12, and 13 are first, second, and third conceptual diagrams showing an example of the allowable deviation from the theoretical Scheimpflug angle.
In FIG. 11, the lens direction α (°) is determined from the lens position and the inclination angle of the slope in accordance with the criteria in Table 1 below. The direction is determined so that a slope with a height of 15 m or more can be observed, taking into consideration the photographing range of the lower camera.
Figure JPOXMLDOC01-appb-T000001
Next, the object distance is calculated for each simulation condition. With the lens direction set toward the vicinity of 15 m, the object distance is calculated for a total of six conditions: two slope inclination angles (73.3° and 45°) and three distances W from the photographing reference position to the object (1.75 m, 2.75 m, and 3.75 m). The object distance Z0 is obtained from the following formulas (A), (B), (C), and (D), where the parameters θ, A, α, C, W, H, and γ in FIG. 11 are defined as follows.
Figure JPOXMLDOC01-appb-M000002
θ: angle between the optical axis and the slope [°]
A: inclination angle of the slope [°]
α: lens direction [°]
C: distance between the toe of the slope and the lens [m]
W: distance from the photographing reference position to the object [m]
H: lens height [m]
γ: angle between the line connecting the object-side principal point to the toe of the slope and a line perpendicular to the ground [°]
D: height of the photographing target area [m]
Z0: object distance [m]
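Formulas (A) through (D) appear only as embedded images in this publication, so the following sketch reconstructs the geometry of FIG. 11 from the parameter definitions above under stated assumptions (in particular, that α is the elevation of the optical axis above the horizontal and that the slope rises at angle A from the toe); it is an illustration, not the patent's own formulas.

```python
import math

def object_distance(W, H, A_deg, alpha_deg):
    """Object distance Z0 from the lens to the point where the optical
    axis meets the slope (geometry of FIG. 11, reconstructed).

    W:         horizontal distance from the photographing reference
               position to the toe of the slope [m]
    H:         lens height above the road surface [m]
    A_deg:     slope inclination angle A [deg]
    alpha_deg: assumed lens elevation angle alpha above horizontal [deg]
    Returns (C, gamma, Z0).
    """
    A = math.radians(A_deg)
    alpha = math.radians(alpha_deg)
    C = math.hypot(W, H)                     # toe-to-lens distance C [m]
    gamma = math.degrees(math.atan2(W, H))   # angle from the vertical [deg]
    # Slope surface: a point at height y above the road lies at
    # horizontal distance W + y / tan(A) from the point below the lens.
    # Optical axis: (t*cos(alpha), H + t*sin(alpha)); solve for t = Z0.
    Z0 = (W + H / math.tan(A)) / (math.cos(alpha) - math.sin(alpha) / math.tan(A))
    return C, gamma, Z0
```

With A = 73.3°, W = 2.75 m, H = 2.3 m and the axis aimed at about the 15 m height (α ≈ 60.2°), this gives Z0 of roughly 14.6 m, consistent with camera 5 being the lens for photographing the vicinity of 15 m.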
Also, in FIG. 11 and FIG. 12, the Scheimpflug angle θ' of the camera is obtained from the following formula (E). In FIG. 12, the inclination angle A of the slope is set to about 45° to 73.3°, the distance W from the photographing reference position to the object is set to 1.75 m to 3.75 m, the lens height H is set to 2.3 m, and the movement amount (slide amount in the Z-axis direction) of the lens for photographing the vicinity of 15 m (camera 5) is set to 1 m.
Figure JPOXMLDOC01-appb-M000003
θ': Scheimpflug angle [°]
f: focal length of the lens for photographing the vicinity of 15 m (lens focal length of camera 5) [m]
Z0: object distance [m]
θ: angle between the optical axis and the slope [°]
In this embodiment, the "lens object-side principal point position" is used as the photographing reference position of the photographing device. Therefore, the lens height H [m] above means the height from the road surface (XZ plane) on which the moving body (vehicle) moves to the "lens object-side principal point position". In other words, the lens height H [m] is the height from that road surface to the "photographing reference position".
In this manner, the theoretical value of the Scheimpflug angle θ' for photographing the slope to be inspected is obtained. That is, by substituting the object distance Z0 obtained from formulas (A) to (D) above into formula (E) for the theoretical Scheimpflug angle, the theoretical value of the Scheimpflug angle θ' for photographing the slope to be inspected can be obtained.
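Formula (E) itself is likewise reproduced only as an image. A standard thin-lens Scheimpflug (hinge-rule) approximation, tan θ' ≈ f / (Z0 · tan θ), yields theoretical angles of the same order as those discussed below; the focal length value used in the example is an assumption, not a figure from the publication.

```python
import math

def scheimpflug_theoretical(f, Z0, theta_deg):
    """Approximate theoretical Scheimpflug angle theta' [deg].

    f:         lens focal length [m] (image distance ~ f for far objects)
    Z0:        object distance along the optical axis [m]
    theta_deg: angle between the optical axis and the slope [deg]

    Hinge rule: the object plane, lens plane and image plane meet in one
    line; for an image distance close to f this reduces to
    tan(theta') = f / (Z0 * tan(theta)).
    """
    theta = math.radians(theta_deg)
    return math.degrees(math.atan(f / (Z0 * math.tan(theta))))
```

For an assumed f = 0.1 m, Z0 = 14.6 m and θ = 13.1°, this gives θ' of about 1.7°, the same order of magnitude as the CAM5 values of 0.95° to 1.82° discussed for FIG. 13 (the sign merely encodes the tilt direction convention).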
Here, the more the actually set Scheimpflug angle deviates from the theoretical value, the narrower the depth of field defined by the "meander allowance (near side −)" and the "meander allowance (far side +)" becomes (see FIG. 12), and the more likely it is that out-of-focus portions will appear in the captured image. As an example, if the actually set Scheimpflug angle is within a predetermined range of the theoretical value (±0.2°), it can be determined that the depth of field is sufficient for inspection; if it falls outside that range (exceeds ±0.2°), out-of-focus portions are more likely to appear in the captured image. The predetermined range of "±0.2°" is merely an example, and various design changes are possible. For example, the predetermined range used as the criterion for determining whether the depth of field required for inspection is secured can be set as appropriate according to the image resolution and the inspection accuracy (required specifications).
If the actually set Scheimpflug angle falls outside the predetermined range of the theoretical value (would exceed ±0.2° as is), the photographing device 7E (camera 5) as the installation-position-variable photographing unit is moved in the Z-axis direction (horizontally, away from the slope) so that the actually set Scheimpflug angle comes within the predetermined range (±0.2°) of the theoretical value.
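The adjustment rule above can be sketched as a small search: if the set angle deviates from the theoretical value by more than the tolerance, slide the camera away from the slope (within the 1 m stroke) until the deviation falls within ±0.2°. The theoretical-angle function is passed in as a parameter because its exact form (formulas (A) to (E)) is not reproduced in this text; the linear function in the usage example is purely hypothetical.

```python
def required_slide(set_angle, theory_at_w, W, tol=0.2, max_slide=1.0, step=0.05):
    """Smallest Z-axis slide [m] away from the slope that brings the set
    Scheimpflug angle within +/-tol deg of the theoretical value.

    set_angle:   Scheimpflug angle actually set on the camera [deg]
    theory_at_w: callable mapping a distance W [m] to the theoretical
                 Scheimpflug angle [deg] at that distance
    W:           current distance from the photographing reference
                 position to the object [m]
    Returns the slide amount, or None if even the full stroke is not
    enough (the case where a warning / stop / retake would be issued).
    """
    slide = 0.0
    while slide <= max_slide + 1e-9:
        if abs(set_angle - theory_at_w(W + slide)) <= tol:
            return slide
        slide += step
    return None
```

For example, with a hypothetical theory(w) = −2.5 + 0.5·w and a set angle of −1.1° (the value given above for camera 5), the deviation at W = 1.75 m exceeds 0.2°, and a slide of about 0.65 m restores it to within tolerance.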
FIG. 13(a) shows the theoretical Scheimpflug angle when the Z-axis position of the photographing device 7E (camera 5) as the installation-position-variable photographing unit is fixed, and FIG. 13(b) shows the (improved) theoretical Scheimpflug angle when that Z-axis position is adjusted. In FIGS. 13(a) and 13(b), the theoretical Scheimpflug angle is plotted against the inclination angle of the slope, with the inclination angle on the horizontal axis and the theoretical Scheimpflug angle on the vertical axis. FIGS. 13(a) and 13(b) plot the theoretical Scheimpflug angles of cameras 2, 4, and 5 (CAM2, CAM4, CAM5) under two conditions, in which the distance W from the photographing reference position to the object is 1.75 m and 3.75 m.
The theoretical Scheimpflug angle of CAM2, which photographs around the middle of the slope, stays within the 0.37° range of −0.27° to 0.1°. Since the difference from the theoretical value is small even when the Scheimpflug angle is set to 0°, focus shifts at the top and bottom of the photographing range are unlikely even if the photographing conditions (the inclination or the distance W from the photographing reference position to the object) change. Similarly, the theoretical Scheimpflug angle of CAM4 stays within the 0.39° range of −0.69° to −0.3°, and since the difference from the theoretical value is small even when the Scheimpflug angle is set to −0.7°, no focus problems occur. Likewise, no focus problems occur with CAM1 and CAM3.
However, because CAM5 looks at a high location, the distance varies greatly with the photographing conditions (the inclination and the distance W from the photographing reference position to the object), and its lens has a long focal length. As a result, the theoretical Scheimpflug angle spans the wide 0.87° range of −1.82° to −0.95°, and failures to focus occur.
Therefore, as shown in FIG. 13(b), a 1 m slide mechanism is attached in the horizontal direction (Z-axis direction), and when the distance to the toe of the slope is short (when the distance W from the photographing reference position to the object is the small value of 1.75 m), CAM5 is moved away with the slide mechanism. The effective distance to the toe (the distance W from the photographing reference position to the object) is then limited to between 2.75 m and 3.75 m, which narrows the range of the theoretical Scheimpflug angle (see the bold frame in FIG. 13(b)). As a result, over the 1.75 m to 3.75 m range of distances to the toe, the failure to focus over the 45° to 73.3° range of slope inclinations is eliminated. That is, an image that is in focus over a wide area of the slope can be obtained.
If it is detected during slope photographing that, at the installed camera position, the deviation from the theoretical Scheimpflug angle exceeds the allowable range (for example, when the travel line needs to be changed, or when the photographing distance changes due to the shape of the slope), a warning may be issued to the effect that images suitable for inspection can no longer be captured, photographing may be stopped, or re-photographing may be instructed in response to the detected error.
In the photographing method of this embodiment, the photographing device photographs the photographing target area of the object by dividing it into a plurality of photographing areas along the moving direction of the moving body. The information processing method of this embodiment includes a generation step of generating a captured image of the photographing target area by stitching together the captured images of the plurality of photographing areas photographed by the above photographing method. The information processing method of this embodiment also includes an evaluation step of evaluating the stitched captured image of the photographing target area, and a display step of displaying the stitched captured image of the photographing target area. One aspect of these information processing methods will be described below with reference to FIGS. 14 to 17.
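As a minimal illustration of the generation step, the per-area images (assumed here to be equal-height pixel grids captured in order along the moving direction) can be joined side by side; a real implementation would first register the images and blend any overlap, which this sketch only approximates with an assumed fixed overlap trim.

```python
def stitch_along_motion(strips, overlap=0):
    """Stitch captured images of adjacent photographing areas into one
    captured image of the photographing target area.

    strips:  list of images, each a list of pixel rows of equal height,
             ordered along the moving direction (X axis)
    overlap: assumed fixed horizontal overlap [pixels] between
             neighbouring strips, trimmed from every strip but the first
    """
    height = len(strips[0])
    stitched = []
    for r in range(height):
        row = list(strips[0][r])
        for strip in strips[1:]:
            row.extend(strip[r][overlap:])  # drop the overlapped columns
        stitched.append(row)
    return stitched
```

For instance, two 2×3 strips whose adjoining edges overlap by one pixel column stitch into a single 2×5 image.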
FIG. 14 is a sequence diagram showing an example of display processing in the state inspection system.
The sequence between the evaluation device 3 and the data management device 5 will be described below, but the same applies to the sequences between the data management device 5 and each of the data acquisition device 9, the terminal device 1100, and the terminal device 1200.
When the user of the evaluation device 3 specifies a folder, the reception unit of the evaluation device 3 accepts the selection of the target data (step S91). Alternatively, the user of the evaluation device 3 may select an arbitrary position in the map information managed by the map data management unit of the evaluation device 3, and the reception unit of the evaluation device 3 may accept the selection of position information in the map information.
Next, the communication unit of the evaluation device 3 transmits, to the data management device 5, a request for the input/output screen related to the target data selected in step S91, and the communication unit of the data management device 5 receives the request transmitted from the evaluation device 3 (step S92). This request includes the folder name selected in step S91. Alternatively, the request may include position information in the map information.
Next, the storage/reading unit of the data management device 5 reads out the image data associated with the folder name included in the request received in step S92 by searching the processing data management DB 5003 using that folder name as a search key. Alternatively, the storage/reading unit reads out the image data associated with the position information included in the request by searching the acquired data management DB using that position information as a search key.
The generation unit of the data management device 5 generates, based on the image data read by the storage/reading unit, an input/output screen including that image data (step S93). This input/output screen is a screen that accepts an instruction operation for generating an image showing a specific position in a luminance image showing the slope.
The communication unit of the data management device 5 transmits input/output screen information related to the input/output screen generated in step S93 to the evaluation device 3, and the communication unit of the evaluation device 3 receives the input/output screen information transmitted from the data management device 5 (step S94). Step S94 is an example of a generation reception screen transmission step.
Next, the display control unit of the evaluation device 3 displays the input/output screen received in step S94 on the display (step S95). The reception unit of the evaluation device 3 accepts a predetermined input operation by the user on the displayed input/output screen. This input operation includes an instruction operation for generating an image showing a specific position in the luminance image showing the slope. Step S95 is an example of a reception step.
The communication unit of the evaluation device 3 transmits input information related to the input operation accepted by the reception unit to the data management device 5, and the communication unit of the data management device 5 receives the input information transmitted from the evaluation device 3 (step S96). This input information includes instruction information for generating an image showing a specific position in the luminance image showing the slope.
Based on the received input information, the generation unit of the data management device 5 generates a display image using the image data read by the storage/reading unit in step S93 (step S97). This display image includes a surface display image, which includes a surface image showing the surface of the slope and a surface position image showing a specific position in the surface image, and a cross-section display image, which includes a cross-section image showing a cross section of the slope and a cross-section position image showing a specific position in the cross-section image. Step S97 is an example of an image generation step.
The communication unit of the data management device 5 transmits the display image generated in step S97 to the evaluation device 3, and the communication unit of the evaluation device 3 receives the display image transmitted from the data management device 5 (step S98). Step S98 is an example of a display image transmission step.
The display control unit of the evaluation device 3 displays the display image received in step S98 on the display (step S99). Step S99 is an example of a display step.
Although FIG. 14 shows the sequence of display processing between the evaluation device 3 and the data management device 5, the evaluation device 3 may execute the display processing by itself.
In this case, steps S92, S94, S96, and S98 relating to data transmission and reception are omitted, and the evaluation device 3 can perform the same display processing as in FIG. 14 by executing steps S91, S93, S95, S97, and S99 by itself. The data acquisition device 9, the terminal device 1100, and the terminal device 1200 can each likewise execute the display processing by themselves in the same manner as the evaluation device 3.
〇 Generation of a surface display image based on an operation of specifying a specific position
FIG. 15 is an explanatory diagram of operations on the display screen of the state inspection system. FIG. 15 shows the input/output screen 2000 displayed on the display of the evaluation device 3 in step S95 of the sequence diagram shown in FIG. 14, but the same applies to the input/output screens 2000 displayed on the respective displays of the data acquisition device 9, the terminal device 1100, and the terminal device 1200.
The display control unit of the evaluation device 3 displays an input/output screen 2000 including a designation reception screen 2010, which accepts a designation operation for designating a specific position in the luminance image showing the slope, and a generation reception screen 2020, which accepts an instruction operation for generating an image showing the specific position on the slope.
The display control unit displays a surface image 2100 showing the surface of the slope on the designation reception screen 2010, and also displays a pointer 2300 operated by a pointing device on the surface image 2100.
The surface image 2100 is the luminance image read out from the captured image data in step S92 of FIG. 14, and the display control unit displays the surface image 2100 in association with the X-axis and Y-axis directions shown in the captured images 1 and 2 of FIG. 5 and in the captured image data 7A of FIG. 6(a).
The display control unit displays the generation reception screen 2020 including a designated position confirmation button 2400, a deformation confirmation button 2410, a deformation sign confirmation button 2420, a front view analysis button 2430, a front view comparison button 2440, a cross-sectional view analysis button 2450, and a cross-sectional view comparison button 2460. The deformation confirmation button 2410, the deformation sign confirmation button 2420, the front view analysis button 2430, the front view comparison button 2440, the cross-sectional view analysis button 2450, and the cross-sectional view comparison button 2460 are buttons that instruct generation of an image showing a specific position on the slope, taking as the specific position the position of a portion of the surface image 2100 or the cross-section image 2200 that satisfies a predetermined condition.
The designated position confirmation button 2400 is a button that instructs confirmation of the specific position on the slope designated on the designation reception screen 2010 and generation of an image showing that specific position on the slope. The designated position confirmation button 2400 may confirm not only a specific position designated on the designation reception screen 2010 but also a specific position identified by the judgment unit or the like and displayed on the designation reception screen 2010.
The deformation confirmation button 2410 is a button that instructs generation of an image showing a specific position on the slope, taking as the specific position a position indicating a deformation of the slope, and the deformation sign confirmation button 2420 is a button that instructs generation of an image showing a specific position on the slope, taking as the specific position a position indicating a sign of deformation of the slope.
The front view analysis button 2430 is a button that instructs generation of an image showing a specific position on the slope, taking as the specific position a portion obtained by analyzing the surface image 2100, and the front view comparison button 2440 is a button that instructs generation of an image showing a specific position on the slope, taking as the specific position a portion obtained by comparing the surface image 2100 with another image.
The cross-sectional view analysis button 2450 is a button that instructs generation of an image showing a specific position on the slope, taking as the specific position a portion obtained by analyzing a cross-section image described later, and the cross-sectional view comparison button 2460 is a button that instructs generation of an image showing a specific position on the slope, taking as the specific position a portion obtained by comparing the cross-section image with another image.
FIG. 16 is a flowchart showing an example of processing of the information processing method. FIG. 16(a) shows the processing in the evaluation device 3, and FIG. 16(b) shows the processing in the data management device 5.
When a predetermined position on the surface image 2100 is pointed to with the pointer 2300, the reception unit of the evaluation device 3 accepts the pointing operation (step S101), and when the designated position confirmation button 2400 is operated, the reception unit accepts that operation (step S102).
Next, the judgment unit of the evaluation device 3 detects the XY coordinates of the pointed position in the surface image 2100 as the specific position (step S103). This specific position may indicate a point in XY coordinates or may indicate an area.
Next, the communication unit of the evaluation device 3 transmits input information related to the input operations accepted by the reception unit 32 to the data management device 5 (step S104). This input information includes designation information designating the specific position in XY coordinates based on the pointing operation with the pointer 2300, and instruction information, based on the operation of the designated position confirmation button 2400, for generating an image showing the specific position on the slope.
The communication unit of the data management device 5 receives the input information transmitted from the evaluation device 3, and based on the instruction information and designation information included in the received input information, the generation unit generates, using the image data shown in FIG. 6(a), a surface position image overlapping the XY coordinates of the specific position superimposed on the surface image, thereby generating a surface display image (step S105). The surface position image does not necessarily have to coincide exactly with the XY coordinates of the specific position; it suffices that it overlaps those XY coordinates.
Next, the generation unit generates a cross-section image corresponding to the X coordinate of the specific position using the image data shown in FIG. 6(a) and the distance measurement data shown in FIG. 6(b) (step S106). If the distance measurement data shown in FIG. 6(b) does not include the X coordinate of the specific position, the cross-section image is generated based on data in the distance measurement data near the X coordinate of the specific position.
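The fallback in step S106, where the distance measurement data does not contain the exact X coordinate of the specific position, can be sketched as a nearest-neighbour lookup; the data layout used here (a mapping from measured X coordinate to a cross-section profile) is an assumption for illustration, not the format of the patent's distance measurement data.

```python
def profile_for_cross_section(distance_data, x_target):
    """Return the range profile used to draw the cross-section image at
    x_target (as in step S106).

    distance_data: dict mapping a measured X coordinate [m] to a profile,
                   e.g. a list of (Y, Z) range samples for that X
    x_target:      X coordinate of the specific position [m]

    If x_target itself was not measured, the profile with the nearest
    measured X coordinate is used instead.
    """
    if x_target in distance_data:
        return distance_data[x_target]
    nearest_x = min(distance_data, key=lambda x: abs(x - x_target))
    return distance_data[nearest_x]
```

A smoother variant could interpolate between the two neighbouring profiles instead of picking the nearest one.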
Although the generation unit generates in step S106 a cross-section image of a cross section including the Z-axis direction and the vertical direction shown in FIG. 5, it may instead generate a cross-section image of a cross section including the Z-axis direction and a direction inclined from the vertical direction, or of a cross section including a direction inclined from the Z-axis direction.
 生成部は、特定位置のY座標と重なる断面位置画像を、断面画像の稜線と重畳して生成し、断面表示画像を生成する(ステップS107)。 The generation unit generates a cross-sectional position image that overlaps with the Y coordinate of the specific position by superimposing it on the edge line of the cross-sectional image, and generates a cross-sectional display image (step S107).
 The communication unit transmits the surface display image generated in step S105 and the cross-section display image generated in step S107 to the evaluation device 3 (step S108).
 Then, as shown in steps S98 and S99 of FIG. 14, the communication unit of the evaluation device 3 receives the surface display image and the cross-section display image transmitted from the data management device 5, and the display control unit of the evaluation device 3 displays the received surface display image and cross-section display image on the display.
 FIG. 17 is a diagram showing an example of the display screen after the processing shown in FIG. 14. FIG. 17 shows the input/output screen 2000 displayed on the display of the evaluation device 3 in step S99 of the sequence diagram shown in FIG. 14.
 The display content of the generation reception screen 2020 is the same as in FIG. 15, but the display content of the designation reception screen 2010 differs from that in FIG. 15.
 The display control unit of the evaluation device 3 causes the designation reception screen 2010 to display a surface display image 2150, which includes a surface image 2100 showing the surface of the slope and a surface position image 2110 showing the specific position in the surface image 2100, and a cross-section display image 2250, which includes a cross-sectional image 2200 showing a cross section of the slope and a cross-section position image 2210 showing the specific position in the cross-sectional image 2200.
 The display control unit displays the cross-sectional image 2200 in association with the Y-axis and Z-axis directions shown in FIG. 5.
 In this way, in the photographing method and information processing method of this embodiment, the length from the position at which the photographing reference position of the photographing device installed on the moving body is projected onto the XZ plane along which the moving body moves to the position on the XZ plane at which the XZ plane intersects the object is defined as the distance from the photographing reference position to the object, and the length along the Y axis from the XZ plane to the photographing target area of the object is defined as the height of the photographing target area. In the installation position setting step, the installation position is adjusted in the Z-axis direction, which intersects the movement direction, according to at least one of the distance from the photographing reference position to the object and the height of the photographing target area. This makes it possible to obtain an image that is in focus over a wide area of the object (for example, a slope).
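 The quantities defined above can be illustrated numerically. The sketch below is a toy geometric model only, not the patented procedure: it assumes the theoretical Scheimpflug angle needed to focus on the target area can be approximated by atan2(height, distance) as seen from the photographing reference position, and that the Z-direction adjustment is proportional to the difference from a preset (set) Scheimpflug angle, in the spirit of claim 2. The proportionality gain is a hypothetical parameter.

```python
import math

def theoretical_angle_deg(distance_to_object, target_height):
    """Toy model (assumption): tilt of the line from the photographing
    reference position to the top of the target area, in degrees."""
    return math.degrees(math.atan2(target_height, distance_to_object))

def z_adjustment(distance_to_object, target_height, set_angle_deg, gain=0.1):
    """Hypothetical Z-direction shift proportional to the angle mismatch."""
    diff = theoretical_angle_deg(distance_to_object, target_height) - set_angle_deg
    return gain * diff

# Example: slope toe 4 m from the projected reference position,
# target area 3 m above the road surface (the XZ plane).
theta = theoretical_angle_deg(4.0, 3.0)  # atan2(3, 4) ≈ 36.87 degrees
print(round(theta, 2))
print(round(z_adjustment(4.0, 3.0, set_angle_deg=30.0), 3))
```

 The point of the model is only that both inputs — distance and height — enter the angle, so either one alone can trigger an adjustment, matching the "at least one of" wording of claim 1.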
 The invention according to the present disclosure has been described in detail above, but it is clear to those skilled in the art that the invention is not limited to the embodiments described herein. The invention according to the present disclosure can be implemented in modified and altered forms without departing from the spirit and scope of the invention as defined by the claims. The description in the present disclosure is therefore intended as an illustrative example and does not limit the invention in any way.
 For example, in the above embodiment, five photographing devices (cameras) are prepared, only the one photographing device that photographs the uppermost part of the slope (an object including a slope) is designated as a "variable-installation-position photographing unit (variable-installation-position camera)", and the remaining four photographing devices are designated as "fixed-installation-position photographing units (fixed-installation-position cameras)". However, the number of photographing devices can be chosen freely, as can the arrangement of variable-installation-position and fixed-installation-position photographing units among them. For example, it is also possible to provide only a single photographing device, used as a variable-installation-position photographing unit, or to provide six or more photographing devices of which two or more are variable-installation-position photographing units.
1 Condition inspection system
6 Moving body (vehicle)
7 Photographing device (camera)
7A Photographing device (camera 1, fixed position photographing unit, first photographing device)
7B Imaging device (camera 2, fixed position imaging unit, first imaging device)
7C Photographing device (camera 3, fixed position photographing unit, first photographing device)
7D Photographing device (camera 4, fixed position photographing unit, first photographing device)
7E Photographing device (camera 5, variable installation position photographing unit, second photographing device)
60 Mobile system (photography system)
Japanese Patent No. 6498488

Claims (9)

  1.  A photographing method for photographing an object while moving, using a photographing device installed on a moving body,
     wherein the photographing device photographs a photographing target area of the object located in a direction intersecting a movement direction,
     the method comprising an installation position setting step of setting an installation position of the photographing device on the moving body,
     wherein, using an XYZ orthogonal coordinate system in which the movement direction of the moving body is the X axis, the direction intersecting the movement direction is the Z axis, and the direction orthogonal to the X axis and the Z axis is the Y axis, the length from the position at which a photographing reference position of the photographing device installed on the moving body is projected onto the XZ plane along which the moving body moves to the position on the XZ plane at which the XZ plane intersects the object is defined as the distance from the photographing reference position to the object, and the length along the Y axis from the XZ plane to the photographing target area of the object is defined as the height of the photographing target area, and
     wherein the installation position setting step adjusts the installation position in the Z-axis direction, which intersects the movement direction, according to at least one of the distance from the photographing reference position to the object and the height of the photographing target area.
  2.  The photographing method according to claim 1, wherein the distance from the photographing reference position to the object and the height of the photographing target area are parameters for obtaining a theoretical Scheimpflug angle for focusing on the photographing target area, and the installation position is adjusted according to a difference between the theoretical Scheimpflug angle and a set Scheimpflug angle preset in the photographing device.
  3.  The photographing method according to claim 1 or claim 2, wherein the photographing target area includes a first target area and a second target area located above the first target area in the Y-axis direction,
     the photographing device includes a first photographing device that photographs the first target area and a second photographing device that photographs the second target area, and
     the installation position setting step adjusts the installation position of the second photographing device.
  4.  The photographing method according to claim 3, wherein a set Scheimpflug angle, at which the optical axis of a photographing lens of the second photographing device is tilted with respect to an axis orthogonal to an imaging surface of the second photographing device, is preset in the second photographing device.
  5.  The photographing method according to claim 1 or claim 2, wherein the object is a slope,
     the moving body is a vehicle,
     the XZ plane along which the moving body moves is a road surface along which the vehicle moves, and
     the distance from the photographing reference position to the object is the distance from the position at which the photographing reference position of the photographing device installed on the moving body is projected onto the road surface to the toe of the slope, where the slope meets the road surface.
  6.  The photographing method according to claim 1 or claim 2, wherein the photographing device photographs the photographing target area of the object by dividing it into a plurality of photographing areas along the movement direction of the moving body.
  7.  An information processing method comprising a generation step of generating a photographed image of the photographing target area by stitching together the photographed images of the plurality of photographing areas photographed by the photographing method according to claim 6.
  8.  The information processing method according to claim 7, further comprising an evaluation step of evaluating the photographed image of the photographing target area obtained by stitching together the photographed images of the plurality of photographing areas.
  9.  The information processing method according to claim 7, further comprising a display step of displaying the photographed image of the photographing target area obtained by stitching together the photographed images of the plurality of photographing areas.
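 The generation step of claim 7 — joining the per-area images captured along the movement direction into a single image of the photographing target area — can be sketched minimally. This assumes already-aligned, equally sized image tiles represented as nested lists of pixel values; real stitching would additionally need registration and blending, which the claim leaves open.

```python
def stitch_along_x(tiles):
    """Concatenate image tiles side by side along the movement direction.
    Tiles are row-major lists of pixel rows, assumed pre-aligned (assumption)."""
    if not tiles:
        return []
    height = len(tiles[0])
    if any(len(t) != height for t in tiles):
        raise ValueError("all tiles must share the same height")
    # For each pixel row, append the corresponding row of every tile in order.
    return [sum((tile[r] for tile in tiles), []) for r in range(height)]

# Two 2x2 tiles captured at consecutive positions along X.
a = [[1, 2], [3, 4]]
b = [[5, 6], [7, 8]]
print(stitch_along_x([a, b]))  # [[1, 2, 5, 6], [3, 4, 7, 8]]
```

 The evaluation and display steps of claims 8 and 9 would then operate on the single stitched image rather than on the individual tiles.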
PCT/JP2023/033795 2022-10-19 2023-09-15 Photographing method and image processing method WO2024084873A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022167435 2022-10-19
JP2022-167435 2022-10-19

Publications (1)

Publication Number Publication Date
WO2024084873A1 true WO2024084873A1 (en) 2024-04-25

Family

ID=90737512

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/033795 WO2024084873A1 (en) 2022-10-19 2023-09-15 Photographing method and image processing method

Country Status (1)

Country Link
WO (1) WO2024084873A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019033478A (en) * 2017-08-09 2019-02-28 株式会社リコー Structure wall surface photographing apparatus, vehicle, structure wall surface photographing method, and tunnel wall surface photographing method
JP6498488B2 (en) * 2015-03-26 2019-04-10 株式会社パスコ Slope check method, slope photography device, and vehicle
JP2022111555A (en) * 2021-01-20 2022-08-01 株式会社トプコン Photographing device and structure inspection system
JP2023139399A (en) * 2022-03-22 2023-10-04 株式会社リコー Imaging apparatus, imaging system, and imaging method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6498488B2 (en) * 2015-03-26 2019-04-10 株式会社パスコ Slope check method, slope photography device, and vehicle
JP2019033478A (en) * 2017-08-09 2019-02-28 株式会社リコー Structure wall surface photographing apparatus, vehicle, structure wall surface photographing method, and tunnel wall surface photographing method
JP2022111555A (en) * 2021-01-20 2022-08-01 株式会社トプコン Photographing device and structure inspection system
JP2023139399A (en) * 2022-03-22 2023-10-04 株式会社リコー Imaging apparatus, imaging system, and imaging method

Similar Documents

Publication Publication Date Title
EP3392612B1 (en) Defect detection apparatus and program
EP2141451B1 (en) Multiple-point measuring method and survey instrument
EP2588882B1 (en) Method for producing a digital photo wherein at least some of the pixels comprise position information, and such a digital photo
US20200218289A1 (en) Information processing apparatus, aerial photography path generation method, program and recording medium
JP6251142B2 (en) Non-contact detection method and apparatus for measurement object
CN101652628A (en) Optical instrument and method for obtaining distance and image information
CN101636632A (en) Optical instrument and method for obtaining distance and image information
JP2008527360A (en) At least one target surveying method and geodetic apparatus
JP6878194B2 (en) Mobile platforms, information output methods, programs, and recording media
US20070098251A1 (en) Non-contact three-dimensional measuring methods and apparatuses
JP2008058264A (en) Device, method and program for observing flow velocity at actual river as object of observation
WO2022078440A1 (en) Device and method for acquiring and determining space occupancy comprising moving object
JP2015010911A (en) Airborne survey method and device
JP2011095112A (en) Three-dimensional position measuring apparatus, mapping system of flying object, and computer program
JP2022023592A (en) Survey system, survey method and program for survey
JP2023029441A (en) Measuring device, measuring system, and vehicle
US20240040247A1 (en) Method for capturing image, method for processing image, image capturing system, and information processing system
JPH1019562A (en) Surveying equipment and surveying method
US20230308777A1 (en) Image capturing device, data acquisition unit, image capturing system, and image capturing method
WO2024084873A1 (en) Photographing method and image processing method
JP2016176751A (en) Target information acquisition device and target information acquisition method
CN105592294A (en) VSP excited cannon group monitoring system
JP4403546B2 (en) Automatic survey system
JP2004085551A (en) Surveying system
JP2024018910A (en) Photographing method, information processing method, photographing system, and information processing system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23879523

Country of ref document: EP

Kind code of ref document: A1