WO2018230480A1 - Line distinguishing device and line distinguishing method - Google Patents

Line distinguishing device and line distinguishing method

Info

Publication number
WO2018230480A1
Authority
WO
WIPO (PCT)
Prior art keywords
line
sliding surface
parts
sensor camera
filament
Prior art date
Application number
PCT/JP2018/022139
Other languages
French (fr)
Japanese (ja)
Inventor
大樹 山本
匠朗 川畑
Original Assignee
株式会社 明電舎
Priority date
Filing date
Publication date
Application filed by 株式会社 明電舎
Priority to JP2019525396A (JP6844697B2)
Publication of WO2018230480A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60M: POWER SUPPLY LINES, AND DEVICES ALONG RAILS, FOR ELECTRICALLY-PROPELLED VEHICLES
    • B60M1/00: Power supply lines for contact with collector on vehicle
    • B60M1/12: Trolley lines; Accessories therefor
    • B60M1/28: Manufacturing or repairing trolley lines
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/245: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures using a plurality of fixed, simultaneously operating transducers

Definitions

  • The present invention relates to a line discriminating apparatus and a line discriminating method for managing overhead lines in railway facilities.
  • Conventionally, a trolley line deviation measuring apparatus is known that synthesizes the images obtained from two cameras arranged facing vertically upward on both sides, orthogonal to the traveling direction, of the vehicle roof, and measures the deviation of the trolley line by a stereo method (see, for example, Patent Document 1 below).
  • A line measuring device is also known in which three line sensor cameras are installed at both ends and the center of the vehicle roof, and the overhead line images captured by each line sensor camera are processed by an image processing apparatus using stereo measurement to measure the height and deviation of the overhead line (see, for example, Patent Document 2 below).
  • However, the line measuring apparatus of Patent Document 2 discriminates a crossover line by limiting the height information relative to the main line, so measurement may not be possible when the crossover line takes an abnormal value exceeding the height limit.
  • The line measuring device described in Patent Document 3 discriminates the main line from the sliding surface information and uses time information to distinguish overhead lines that do not have a sliding surface (crossover lines and their suspension lines) from overhead lines that do (the main line). However, when a crossover line and its suspension line cross the main line at the same time, the association may become difficult.
  • In view of the above, an object of the present invention is to provide a line discriminating apparatus and a line discriminating method capable of reliably discriminating lines over a wide range of heights and deviations using only line sensor cameras.
  • The line discriminating apparatus according to a first invention for solving the above problem comprises: a first line sensor camera and a second line sensor camera that are arranged at both ends of the vehicle roof in the sleeper direction, each inclined toward the center of the vehicle in the sleeper direction, and that image the lines to be measured; and an image processing unit that, based on the captured images acquired from the first and second line sensor cameras, extracts line parts corresponding to the lines and sliding surface portions corresponding to the sliding surfaces of the lines, associates the corresponding line parts between the captured images, and calculates the height and deviation of each line based on the result of the association.
  • In this apparatus, the image processing unit: when a line part includes a sliding surface portion, associates the corresponding line parts, and the line parts similar to them, between the captured images; after the line parts including sliding surface portions have been associated, when only one line part not including a sliding surface portion remains, associates those line parts between the captured images; and when two or more line parts not including a sliding surface portion remain, selects two similar line parts, performs stereo measurement between the captured images on the two line parts, and associates the corresponding line parts between the captured images based on the result.
  • The line discriminating apparatus according to a second invention for solving the above problem is characterized in that the image processing unit comprises:
  • a sliding surface extraction unit that detects the sliding surface portions from the captured image acquired by the first line sensor camera and from the captured image acquired by the second line sensor camera;
  • a line extraction unit that detects the line point groups corresponding to the line parts from the captured image acquired by the first line sensor camera and from the captured image acquired by the second line sensor camera;
  • a coupling unit that combines the line point groups so as to correspond to each line and creates the line parts;
  • an associating unit that, after the line parts including sliding surface portions have been associated, associates the line parts between the captured images when only one line part not including a sliding surface portion remains, and, when two or more such line parts remain, selects two similar line parts, performs stereo measurement between the captured images on them, and associates the corresponding line parts based on the result; and
  • a stereo measurement unit that performs stereo measurement on the line parts obtained from the captured images acquired by the first and second line sensor cameras and calculates the height and deviation of each line.
  • The line discriminating method according to a third invention for solving the above problem includes: extracting, from the captured images acquired by the first line sensor camera and the second line sensor camera, which are arranged at both ends of the vehicle roof in the sleeper direction and each inclined toward the center of the vehicle in the sleeper direction so as to image the lines to be measured, the line parts corresponding to the lines and the sliding surface portions corresponding to the sliding surfaces of the lines; and an associating step of: associating, when a line part includes a sliding surface portion, the corresponding line parts and the line parts similar to them between the captured images; associating the line parts between the captured images when only one line part not including a sliding surface portion remains after the line parts including sliding surface portions have been associated; and, when two or more line parts not including a sliding surface portion remain, selecting two similar line parts and associating the corresponding line parts between the captured images based on the result of performing stereo measurement between the captured images on the two line parts.
  • The line discriminating method according to a fourth invention for solving the above problem includes: a sliding surface extraction step of detecting the sliding surface information from the captured images acquired by the first and second line sensor cameras; a line extraction step of detecting the line point groups corresponding to the lines from those captured images; a coupling step of creating the line parts by combining the line point groups so as to correspond to each line; the associating step; and a stereo measurement step of performing stereo measurement on the line parts obtained from the captured images acquired by the first and second line sensor cameras and calculating the height and deviation of each line.
  • In the associating step, when a line part includes a sliding surface portion, the corresponding line parts and the line parts similar to them are associated between the captured images; after the line parts including sliding surface portions have been associated, when only one line part not including a sliding surface portion remains, those line parts are associated between the captured images; and when two or more such line parts remain, two similar line parts are selected and the corresponding line parts are associated between the captured images based on the result of performing stereo measurement on them.
  • According to the line discriminating apparatus and the line discriminating method of the present invention, it is possible to reliably discriminate lines over a wide range of heights and deviations using only line sensor cameras.
  • FIG. 1 is a schematic diagram showing an example of an image of a crossover line installation in an embodiment of the present invention. The remaining drawings are explanatory views showing example images in which the sliding surface, the line point groups, and the line blocks are respectively extracted or created from the images shown in the preceding figures.
  • In the present invention, two line sensor cameras are installed on the roof of a vehicle, the images acquired by each line sensor camera are subjected to image processing, and the height and deviation of each line are obtained from the captured images.
  • A feature of the present invention is that the position of a line can be measured over a deviation range of ±900 mm from the vehicle center using only line sensor cameras, the lines being discriminated by means of stereo measurement.
  • The range of ±900 mm deviation from the vehicle center is the range over which management of the position information of the overhead lines is required.
  • Because the overhead line supplies current to the vehicle through the pantograph, it is normally located within ±250 mm left and right of the pantograph center (±300 mm on the Shinkansen).
  • In the following, an overhead line other than the main line is referred to as a second overhead line.
  • In this description, “deviation”, “line”, “overlap”, “air section”, “crossover line”, and “sliding surface” have the following meanings.
  • “Deviation”: a railway technical term; the horizontal distance of a line from the center of the pantograph.
  • “Line”: a wire installed in the air in railway facilities, such as an overhead line, a suspension line of an overhead line, or a feeder.
  • “Overlap”: equipment that electrically and mechanically divides overhead lines; refers to air section / air joint equipment. Examples of implementations include the third-overhead-line method and the new-station method.
  • “Air section”: an overlap in which the front and rear overhead lines are not electrically connected but separated (see FIG. 26).
  • “Crossover line”: equipment in which two sets of overhead lines at a branching device are crossed so as not to interfere with the passage of the pantograph (see FIG. 27).
  • “Sliding surface”: a surface formed on the overhead line by wear from contact with the pantograph. The overhead line is normally always in contact with the pantograph and therefore has a sliding surface; however, in equipment sections with second overhead lines, such as air sections and crossover lines, there are places within the managed range of ±900 mm from the vehicle center where the pantograph does not contact the overhead line, so parts without a sliding surface exist.
  • The line discriminating device includes first and second line sensor cameras 11 and 12 and a lighting device 13 installed on the roof 10a of a railway vehicle (hereinafter simply referred to as a vehicle) 10, and an image processing unit 14 installed inside the vehicle 10.
  • FIG. 1 shows an air section facility and FIG. 2 shows a crossover facility.
  • 1 is a main line
  • 1a is a main line suspension line
  • 2 is a sub main line
  • 2a is a sub main line suspension line
  • 3 is a rail
  • 4 is a crossover wire
  • 4a is a suspension wire of a crossover wire
  • 5 is a feeder
  • 6 is a utility pole
  • 7 is a bent metal fitting.
  • The first and second line sensor cameras 11 and 12 are installed on both sides, in the vehicle width direction, of the roof 10a of the vehicle 10 in order to measure the deviation and height of the lines in the measurement target region. More specifically, as shown in FIG. 3, in order to secure an imaging range covering the measurement target region of ±900 mm from the vehicle 10, such as a crossover facility or an air section facility, the direction and elevation angle of each camera are set so that its optical axis faces obliquely upward toward the center in the vehicle width direction and its scanning line (imaging line) is orthogonal to the traveling direction of the vehicle 10.
  • To measure the entire measurement target region in stereo, a wide stereo imaging area (“measurement target region” in the figure) is required; however, if the cameras were directed vertically upward, as shown in FIG. , stereo-imaging-impossible areas a and b, in which the imaging ranges do not overlap each other within the measurement target region, would be generated, and stereo measurement could not be performed in those portions.
  • In contrast, in the line discriminating device of this embodiment, as shown in FIG. 3, there is no area within the measurement target region where the imaging ranges fail to overlap, that is, no area where stereo measurement is impossible, and stereo measurement can be performed over the entire measurement target region.
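The overlap condition described above, that the two inclined cameras' imaging ranges must jointly cover the full ±900 mm region, can be sketched numerically. This is an illustrative check only; the camera ranges below are invented, since the patent specifies no numbers beyond the ±900 mm target.

```python
def stereo_coverage_ok(range1, range2, target=(-900.0, 900.0)):
    """Return True if the overlap of two camera imaging ranges (lateral
    deviation from the vehicle centre in mm, at some wire height)
    contains the whole measurement target region."""
    lo = max(range1[0], range2[0])  # left edge of the overlapped strip
    hi = min(range1[1], range2[1])  # right edge of the overlapped strip
    return lo <= target[0] and hi >= target[1]

# Cameras facing straight up from the roof edges: the overlapped strip
# misses the outer parts of the target region.
upward = stereo_coverage_ok((-1500.0, 300.0), (-300.0, 1500.0))
# Cameras inclined toward the vehicle centre: full stereo coverage.
inclined = stereo_coverage_ok((-1200.0, 1000.0), (-1000.0, 1200.0))
```

With these illustrative ranges, `upward` is False and `inclined` is True, matching the text's claim that inclining the cameras removes the stereo-imaging-impossible areas a and b.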
  • the image data (captured images) acquired by the first and second line sensor cameras 11 and 12 are input to the image processing unit 14.
  • The illuminating device 13 illuminates the lines in the area imaged by the first and second line sensor cameras 11 and 12.
  • The image processing unit 14 discriminates the lines based on the image data acquired by the first and second line sensor cameras 11 and 12. As shown in FIG. 5, the image processing unit 14 includes an image input unit 14a, a sliding surface extraction unit 14b, a line extraction unit 14c, a coupling unit 14d, an associating unit 14e, a deviation/height calculation unit 14f, and a storage unit 14g.
  • the image input unit 14 a acquires image data from the first and second line sensor cameras 11 and 12.
  • the image data acquired by the image input unit 14a is stored in the storage unit 14g.
  • the sliding surface extraction unit 14b detects a portion (sliding surface portion) corresponding to the sliding surface 8 (see FIGS. 7 and 8) by image processing from the image data acquired by the image input unit 14a.
  • the detected sliding surface data (sliding surface data) is stored in the storage unit 14g.
  • The line extraction unit 14c detects a line point group by image processing from the image data acquired by the image input unit 14a.
  • the detected line point group data (line point group data) is stored in the storage unit 14g.
  • the connecting unit 14d creates a line part (line part) formed by combining the line point group corresponding to each line from the data of the line point group detected by the line extracting unit 14c.
  • the created line part data (line part data) is stored in the storage unit 14g.
  • the associating unit 14e associates the line parts extracted from the image data of the first and second line sensor cameras 11 and 12 based on the line part data and the sliding surface data. Data that has been associated (line association data) is stored in the storage unit 14g.
  • The deviation/height calculation unit 14f calculates the deviation and height of each line based on the line association data produced by the associating unit 14e.
  • the calculated deviation and height data (deviation / height data) is output to a display unit (not shown).
  • In step S1, image data is acquired from the first and second line sensor cameras 11 and 12 by the image input unit 14a.
  • FIG. 7 shows an example image in which image data corresponding to the air section equipment shown in FIG. 1 is arranged in time series, and FIG. 8 shows an example image in which image data corresponding to the crossover equipment shown in FIG. 2 is arranged in time series. In FIGS. 7 and 8, the image I1 obtained from the first line sensor camera 11 is shown on the left side and the image I2 obtained from the second line sensor camera 12 on the right side.
  • In step S2, the sliding surface extraction unit 14b extracts the sliding surface portions from the images I1 and I2 described above.
  • a known method may be used to extract the sliding surface portion, and a detailed description thereof is omitted here.
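The patent defers sliding-surface extraction to known methods. As a hypothetical stand-in (not the patent's method), one simple approach exploits the fact that the worn sliding surface reflects the illumination strongly, so sufficiently wide bright runs on each scan line can be marked while isolated bright pixels are rejected. All thresholds here are invented.

```python
def sliding_surface_mask(row, thresh=200, min_run=3):
    """Mark bright runs of at least min_run pixels in one scan line.

    row: list of pixel intensities (0-255) along the imaging line.
    Returns a list of booleans, True where a sliding-surface candidate
    (a sufficiently wide bright run) was found.
    """
    mask = [False] * len(row)
    run_start = None
    for i, v in enumerate(row + [0]):  # trailing sentinel closes a final run
        if i < len(row) and v >= thresh:
            if run_start is None:
                run_start = i
        else:
            if run_start is not None and i - run_start >= min_run:
                for j in range(run_start, i):
                    mask[j] = True
            run_start = None
    return mask

# A wide bright run (a worn surface) is kept; an isolated bright pixel
# (noise or a thin specular glint) is rejected.
scan = [0, 0, 255, 255, 255, 255, 0, 0, 255, 0]
mask = sliding_surface_mask(scan)
```

In the example, only pixels 2 through 5 are marked; the lone bright pixel at index 8 is discarded as too narrow.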
  • FIG. 9 shows the result of extracting the sliding surface portion from the images I 1 and I 2 shown in FIG. 7, and
  • FIG. 10 shows the result of extracting the sliding surface portion from the images I 1 and I 2 shown in FIG.
  • In step S3, a line point group is extracted from the images I1 and I2 by the line extraction unit 14c described above.
  • The term “line point group” as used here refers to a set of points carrying only positional information on the image; at this stage no relationship between the line points has been established.
  • a known method may be used to extract the line point group, and a detailed description thereof is omitted here.
  • FIG. 11 shows the result of detecting the line point group from the images I 1 and I 2 shown in FIG. 7, and
  • FIG. 12 shows the result of detecting the line point group from the images I 1 and I 2 shown in FIG.
  • the portions shown in white in FIGS. 11 and 12 correspond to the line point group.
  • In step S4, the coupling unit 14d creates a line part corresponding to each line by combining points from the line point group data.
  • Specifically, line points whose positions are continuous on each of the images I1 and I2 are first combined to create line blocks.
  • FIG. 13 shows the result of creating a line lump from the line point group shown in FIG. 11, and
  • FIG. 14 shows the result of creating a line lump from the line point group shown in FIG.
  • In FIGS. 13 and 14, different line blocks are shown in different patterns.
  • In the coupling unit 14d, even a line point group that actually corresponds to a single line is separated into different line blocks when a missing portion intervenes.
  • FIG. 15 shows the result of creating the line parts A to E and F to J from the line blocks shown in FIG. 13, and FIG. 16 shows the result of creating the line parts K to O and P to T from the line blocks shown in FIG. 14.
  • Next, using the length, angle, approximate quadratic-curve coefficients, and end-position information of the line blocks, a coupling determination is performed between line blocks to decide which blocks belong to the same line.
  • At this time, any missing portion is complemented using the approximate quadratic-curve coefficients between the line blocks.
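The joining step can be sketched as follows: points continuous in the scan direction are chained into blocks, and a quadratic fitted through one block decides whether a later block continues the same line across a gap. This is a simplified illustration; the thresholds and the three-point fit are our stand-ins for the coupling determination described above.

```python
def chain_blocks(points, max_step=3.0):
    """Chain (row, col) points, sorted by row, into blocks of points
    that are continuous in both row and column."""
    blocks, cur = [], [points[0]]
    for p in points[1:]:
        if p[0] - cur[-1][0] <= 1 and abs(p[1] - cur[-1][1]) <= max_step:
            cur.append(p)
        else:
            blocks.append(cur)
            cur = [p]
    blocks.append(cur)
    return blocks

def quad_through(p0, p1, p2):
    """Coefficients (a, b, c) of y = a*x*x + b*x + c through three points."""
    (x0, y0), (x1, y1), (x2, y2) = p0, p1, p2
    denom = (x0 - x1) * (x0 - x2) * (x1 - x2)
    a = (x2 * (y1 - y0) + x1 * (y0 - y2) + x0 * (y2 - y1)) / denom
    b = (x2 * x2 * (y0 - y1) + x1 * x1 * (y2 - y0) + x0 * x0 * (y1 - y2)) / denom
    c = (x1 * x2 * (x1 - x2) * y0 + x2 * x0 * (x2 - x0) * y1
         + x0 * x1 * (x0 - x1) * y2) / denom
    return a, b, c

def should_merge(block_a, block_b, tol=2.0):
    """Merge when the quadratic through block_a predicts block_b's start."""
    a, b, c = quad_through(block_a[0], block_a[len(block_a) // 2], block_a[-1])
    r0, c0 = block_b[0]
    return abs(a * r0 * r0 + b * r0 + c - c0) <= tol

# A gently curved line with a gap splits into two blocks.
pts = [(r, 0.01 * r * r) for r in list(range(0, 21)) + list(range(30, 41))]
blocks = chain_blocks(pts)
```

For the gapped parabola above, `chain_blocks` yields two blocks and `should_merge` accepts the pair, mirroring how a missing portion is complemented via the approximate quadratic-curve coefficients.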
  • The processing from step S1 to step S4 is performed for each camera (two in this example).
  • In step S5, the associating unit 14e associates corresponding line parts between the images I1 and I2.
  • The flow of processing in the associating unit 14e will be described in detail with reference to FIG.
  • In step S5-1, the line part data created by the coupling unit 14d for each of the first and second line sensor cameras 11 and 12 (that is, for each of the images I1 and I2) is input.
  • In step S5-2, line parts that include the sliding surface portions extracted by the sliding surface extraction unit 14b are detected, and the detected line parts are preferentially associated as the main line or the sub main line.
  • FIGS. 18A and 18B show the result of association based on the sliding surface extraction result shown in FIG. 9 and the line parts shown in FIG. 15, and FIG. 19 shows the result of association based on the sliding surface extraction result shown in FIG. 10 and the line parts shown in FIG. 16.
  • In the air section facility, line part C and line part G are associated as the same line, and line part E and line part J are associated as the same line.
  • In the crossover line installation, line part M and line part Q are associated as the same line.
  • In step S5-3, a line part having the following features (1) to (3) with respect to a line part including a sliding surface portion associated in step S5-2 is determined to be the suspension line of that line, and is associated accordingly.
  • As a result, line part B and line part H, and line part D and line part I, are each associated as the same line.
  • In step S5-4, it is determined whether two or more uncorrelated line parts (lines not including a sliding surface portion) remain. For the air section equipment, once the line parts associated with a line or its suspension line in step S5-3 are excluded from the five line parts A to E and F to J shown in FIG. 15, only one uncorrelated part remains in each image, corresponding to the feeder 5 (line part A and line part F), as shown in FIG. 20. When one or fewer uncorrelated line parts remain (NO), the process proceeds to step S5-8 described later.
  • For the crossover equipment, among the five line parts K to O and P to T created from the corresponding image data, three uncorrelated line parts remain in each image (line parts K, N, O and line parts P, S, T), as shown in FIG. 21.
  • In that case (YES), the process proceeds to step S5-5, where two line parts with similar shapes among the uncorrelated line parts are detected as candidates for the crossover line 4 and the crossover suspension line 4a.
  • In the example shown in FIG. 21, the line parts N and O and the line parts S and T, which have similar shapes, become the candidates for the crossover line 4 and the crossover suspension line 4a.
  • In step S5-6, stereo measurement is performed for all combinations of the two similar uncorrelated line parts detected in step S5-5 from the image I1 (the line parts N and O in the example shown in FIG. 21) with the two uncorrelated line parts extracted from the image I2 (the line parts S and T in the example shown in FIG. 21). As a result, as indicated by the solid lines and two-dot chain lines in FIG. 22, candidates for four overhead lines at different positions are obtained.
  • In step S5-7, from among the four overhead line candidates, the two line parts at the same deviation are identified as the crossover line 4 and the crossover suspension line 4a and are associated with each other. That is, since the crossover line 4 and its suspension line 4a lie at the same deviation position, the two overhead line candidates at the same deviation position, indicated by the solid lines in FIG. 22, are taken as the crossover line 4 and its suspension line 4a, while the overhead line candidates at the other two deviation positions, indicated by the two-dot chain lines in FIG. 22, are imaginary overhead lines that do not actually exist. Of the two overhead lines at the same deviation position, the one whose vertical position is lower is identified as the crossover line 4 and the one whose vertical position is higher as the suspension line 4a.
  • In step S5-8, the remaining line parts are associated. That is, for an air section facility, when the processing of step S5-3 is completed, only one uncorrelated line part remains in each image among the five line parts A to E and F to J shown in FIG. 15 (line part A and line part F), as shown in FIG. 20. For crossover equipment, when the processing of step S5-7 is completed, only one uncorrelated line part remains in each image among the five line parts K to O and P to T shown in FIG. 16 (line part K and line part P). The remaining uncorrelated line parts are therefore identified as the feeder 5.
  • In step S5-9, the result of associating the line parts is output as the line association data.
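The branching of steps S5-2 to S5-8 can be sketched as follows. This is a heavily simplified, hypothetical illustration: the suspension-line similarity test of step S5-3 is omitted, parts with sliding surfaces are paired in image order rather than by position, and `deviation_of` stands in for a real stereo measurement; all names are ours.

```python
from itertools import product

def associate(parts1, parts2, deviation_of, tol=10.0):
    """Associate line parts between the two images.

    parts1, parts2: dict name -> {"surface": bool} for images I1, I2.
    deviation_of(a, b): lateral deviation (mm) implied by pairing part a
    of I1 with part b of I2 (a stand-in for stereo measurement).
    """
    pairs = []
    surf1 = [n for n, p in parts1.items() if p["surface"]]
    surf2 = [n for n, p in parts2.items() if p["surface"]]
    pairs += list(zip(surf1, surf2))          # S5-2: main / sub main lines
    rest1 = [n for n in parts1 if n not in surf1]
    rest2 = [n for n in parts2 if n not in surf2]
    if len(rest1) <= 1 and len(rest2) <= 1:   # S5-4 NO branch
        pairs += list(zip(rest1, rest2))      # S5-8: lone leftover = feeder
        return pairs
    # S5-5..S5-7: stereo-measure every cross pairing of the leftovers and
    # keep the two pairings that land at (nearly) the same deviation,
    # i.e. the crossover wire and its suspension wire.
    cand = [(a, b, deviation_of(a, b)) for a, b in product(rest1, rest2)]
    for i, (a, b, d) in enumerate(cand):
        for a2, b2, d2 in cand[i + 1:]:
            if a != a2 and b != b2 and abs(d - d2) <= tol:
                pairs += [(a, b), (a2, b2)]   # S5-7: crossover + hanger
                left1 = [n for n in rest1 if n not in (a, a2)]
                left2 = [n for n in rest2 if n not in (b, b2)]
                pairs += list(zip(left1, left2))  # S5-8: remainder = feeder
                return pairs
    return pairs
```

With a crossover-style input (one sliding-surface part and three uncorrelated parts per image), the pairing whose two deviations coincide is kept as the crossover wire and its hanger, and the final leftover is paired as the feeder.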
  • In step S6, the deviation and height of each line are calculated by the deviation/height calculation unit 14f based on the line association data produced by the associating unit 14e. That is, the line parts associated between the first and second line sensor cameras 11 and 12 are measured in stereo, and the deviation and height of each line are calculated.
  • FIG. 24 shows the result of calculating the deviation and height of the lines from the images I1 and I2 shown in FIG. 7, and FIG. 25 shows the result of calculating the deviation and height of the lines from the images I1 and I2 shown in FIG. 8.
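The stereo calculation itself reduces, in the vehicle cross-section plane, to intersecting two rays, one from each camera through its matched pixel. A minimal sketch with invented camera positions; the pixel-to-angle calibration is assumed to have been done already.

```python
import math

def intersect_rays(cam1, ang1, cam2, ang2):
    """Intersect two rays in the cross-section plane.

    cam1, cam2: (deviation, height) camera positions in mm.
    ang1, ang2: ray angles from the +deviation axis, in radians
    (each angle comes from a matched pixel via camera calibration).
    Returns the (deviation, height) of the wire.
    """
    d1 = (math.cos(ang1), math.sin(ang1))
    d2 = (math.cos(ang2), math.sin(ang2))
    # Solve cam1 + t*d1 = cam2 + s*d2 for t using 2-D cross products.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    dx, dy = cam2[0] - cam1[0], cam2[1] - cam1[1]
    t = (dx * d2[1] - dy * d2[0]) / denom
    return (cam1[0] + t * d1[0], cam1[1] + t * d1[1])

# A wire at deviation 0 mm, height 5000 mm, seen from cameras mounted
# near the roof edges (positions are illustrative only).
cam1, cam2 = (-1300.0, 4000.0), (1300.0, 4000.0)
ang1 = math.atan2(5000.0 - 4000.0, 0.0 - (-1300.0))  # ray from camera 1
ang2 = math.atan2(5000.0 - 4000.0, 0.0 - 1300.0)     # ray from camera 2
deviation, height = intersect_rays(cam1, ang1, cam2, ang2)
```

Here `deviation` recovers 0 mm and `height` 5000 mm; in real code, near-parallel rays (denominator close to zero) would need guarding.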
  • As described above, with the line discriminating device of this embodiment, the following effects (1) to (4) are obtained.
  • (1) Lines over a wide range of heights and deviations can be measured with only the two (first and second) line sensor cameras 11 and 12.
  • (2) Since stereo measurement is performed with the two (first and second) line sensor cameras 11 and 12, the measurement interval can be kept short even on a vehicle traveling at high speed, such as a commercial vehicle, by imaging at a high cycle.
  • (3) As long as the lines are within the performance range of the first and second line sensor cameras 11 and 12, measurement can be performed without restriction.
  • (4) Since the positional relationship between the stereo measurement candidates of the two overhead lines other than the main line is used, the second overhead line can be measured even without intersection information.
  • The present invention is not limited to the embodiment described above. Needless to say, various modifications can be made without departing from the scope of the present invention; for example, the sliding surface portions may be extracted from the images I1 and I2 by the sliding surface extraction unit 14b after the lines are extracted.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manufacturing & Machinery (AREA)
  • Mechanical Engineering (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

From images (I1, I2) of each line (1, 1a, 2, 2a, 4, 4a, 5) captured by line sensor cameras (11, 12) installed on the roof top (10a) of a vehicle (10), line parts (A-T) and a sliding surface portion (8) are detected. If a line part contains a sliding surface portion (8), the relevant line parts and similar line parts are associated together between each captured image (I1, I2). If only one line part does not contain a sliding surface portion (8), the relevant line parts are associated together between each captured image (I1, I2). If two or more line parts do not contain a sliding surface portion (8), the relevant line parts are associated together between each captured image (I1, I2) on the basis of the result from carrying out stereo measurement on two similar line parts between the captured images (I1, I2). On the basis of the result from this association, the height and deviation of the lines (1, 1a, 2, 2a, 4, 4a, 5) are calculated.

Description

Line discriminating device and line discriminating method
 The present invention relates to a line discriminating apparatus and a line discriminating method for managing overhead lines in railway facilities.
 Conventionally, a trolley line deviation measuring apparatus is known that synthesizes the images obtained from two cameras arranged facing vertically upward on both sides, orthogonal to the traveling direction, of the vehicle roof, and measures the deviation of the trolley line by a stereo method (see, for example, Patent Document 1 below).
 A line measuring device is also known in which three line sensor cameras are installed at both ends and the center of the vehicle roof, and the overhead line images captured by each line sensor camera are processed by an image processing apparatus using stereo measurement to measure the height and deviation of the overhead line (see, for example, Patent Document 2 below).
 A line measuring device is further known in which two line cameras are arranged at both ends of the vehicle roof in the sleeper direction, the line information and sliding surface information of the lines to be measured are detected from the images captured by each camera, and the height and displacement of each line are calculated by associating the lines between the captured images using this information (see, for example, Patent Document 3 below).
Patent Document 1: JP 2009-236574 A; Patent Document 2: JP 2016-065838 A; Patent Document 3: JP 2017-009446 A
 しかしながら、上記特許文献1に記載のトロリ線偏位測定装置は、測定対象が本線のみであるため、わたり線が存在する場合、架線の判別が困難になる可能性があった。 However, since the trolley wire deviation measuring apparatus described in Patent Document 1 is only a main line, there is a possibility that it is difficult to distinguish an overhead line when a crossover line exists.
 また、上記特許文献2に記載の線条計測装置は、本線からの高さ情報を制限することでわたり線を判別するものであるため、わたり線が高さ制限値を超えるような異常値となっている場合は測定できないおそれがあった。 Moreover, since the line measuring apparatus of the said patent document 2 discriminate | determines a crossover line by restrict | limiting the height information from a main line, it is an abnormal value that a crossover line exceeds a height restriction value, and If it is, there was a possibility that it could not be measured.
 また、上記特許文献3に記載の線条計測装置は、摺動面の情報から本線を判別し、時間情報を利用して摺動面を持たない架線(わたり線およびわたり線の吊架線)と摺動面を持つ架線(本線)とを判別するように構成されているが、わたり線およびわたり線の吊架線が同時刻に本線と交差する場合には、対応付けが困難になるおそれがあった。 Further, the linear measuring device described in Patent Document 3 discriminates the main line from the information on the sliding surface, and uses the time information to make an overhead line (crossing line and hanging line suspension line) that does not have a sliding surface. Although it is configured to discriminate between overhead lines having a sliding surface (main line), if crossover lines and suspension lines of crossover lines intersect with the main line at the same time, there is a risk that association becomes difficult. It was.
 Accordingly, an object of the present invention is to provide a line distinguishing device and a line distinguishing method capable of reliably distinguishing lines over a wide range of heights and displacements using only line sensor cameras.
 A line distinguishing device according to a first aspect of the invention for solving the above problem comprises:
 a first line sensor camera and a second line sensor camera which are arranged at both ends, in the sleeper direction, of the roof of a vehicle, each inclined toward the center of the vehicle in the sleeper direction, and which capture images of the lines to be measured; and
 an image processing unit which, based on the captured images acquired from the first line sensor camera and the second line sensor camera, extracts line portions corresponding to the lines and sliding surface portions corresponding to the sliding surfaces of the lines, associates the corresponding line portions between the captured images, and calculates the height and deviation of each line based on the result of the association,
 wherein the image processing unit:
 when a line portion includes a sliding surface portion, associates the corresponding line portions between the captured images with each other, and likewise associates line portions similar to that line portion with each other; and
 after the line portions including a sliding surface portion have been associated, when only one line portion not including a sliding surface portion remains, associates that line portion between the captured images, and when two or more line portions not including a sliding surface portion remain, selects two similar line portions and associates the corresponding line portions between the captured images based on the result of performing stereo measurement on those two line portions between the captured images.
 A line distinguishing device according to a second aspect of the invention for solving the above problem is the device according to the first aspect, wherein the image processing unit comprises:
 a sliding surface extraction unit which detects the sliding surface portions from the captured image acquired by the first line sensor camera and from the captured image acquired by the second line sensor camera;
 a line extraction unit which detects line point groups corresponding to the line portions from the captured image acquired by the first line sensor camera and from the captured image acquired by the second line sensor camera;
 a combining unit which combines the line point groups so as to correspond to the individual lines and thereby creates line parts;
 an association unit which, when a line part includes a sliding surface portion, associates the corresponding line parts between the captured images with each other and likewise associates line parts similar to that line part with each other, and which, after the line parts including a sliding surface portion have been associated, associates a line part between the captured images when only one line part not including a sliding surface portion remains, and, when two or more line parts not including a sliding surface portion remain, selects two similar line parts and associates the corresponding line parts between the captured images based on the result of performing stereo measurement on those two line parts between the captured images; and
 a stereo measurement unit which performs stereo measurement on the line parts respectively obtained from the captured image acquired by the first line sensor camera and the captured image acquired by the second line sensor camera, and calculates the height and deviation of each line.
 A line distinguishing method according to a third aspect of the invention for solving the above problem is a method in which sliding surface portions corresponding to the sliding surfaces of lines and line portions corresponding to the lines are extracted from captured images acquired from a first line sensor camera and a second line sensor camera which are arranged at both ends, in the sleeper direction, of the roof of a vehicle, each inclined toward the center of the vehicle in the sleeper direction, and which capture images of the lines to be measured; the corresponding line portions are associated between the captured images respectively captured by the first line sensor camera and the second line sensor camera; and the height and deviation of each line are calculated based on the result of the association,
 the method comprising an association step of: when a line portion includes a sliding surface portion, associating the corresponding line portions between the captured images with each other and likewise associating line portions similar to that line portion with each other; and, after the line portions including a sliding surface portion have been associated, associating a line portion between the captured images when only one line portion not including a sliding surface portion remains, and, when two or more line portions not including a sliding surface portion remain, selecting two similar line portions and associating the corresponding line portions between the captured images based on the result of performing stereo measurement on those two line portions between the captured images.
 A line distinguishing method according to a fourth aspect of the invention for solving the above problem is the method according to the third aspect, comprising:
 a sliding surface extraction step of detecting the sliding surface portions from the captured image acquired by the first line sensor camera and from the captured image acquired by the second line sensor camera;
 a line extraction step of detecting line point groups corresponding to the lines from the captured image acquired by the first line sensor camera and from the captured image acquired by the second line sensor camera;
 a combining step of combining the line point groups so as to correspond to the individual lines and thereby creating line parts;
 the association step; and
 a stereo measurement step of performing stereo measurement on the line parts respectively obtained from the captured image acquired by the first line sensor camera and the captured image acquired by the second line sensor camera, and calculating the height and deviation of each line,
 wherein, in the association step:
 when a line part includes a sliding surface portion, the corresponding line parts between the captured images are associated with each other, and line parts similar to that line part are likewise associated with each other; and
 after the line parts including a sliding surface portion have been associated, when only one line part not including a sliding surface portion remains, that line part is associated between the captured images, and when two or more line parts not including a sliding surface portion remain, two similar line parts are selected and the corresponding line parts are associated between the captured images based on the result of performing stereo measurement on those two line parts between the captured images.
 According to the line distinguishing device and the line distinguishing method of the present invention, lines over a wide range of heights and displacements can be reliably distinguished using only line sensor cameras.
A schematic diagram showing an installation example of the line distinguishing device according to an embodiment of the present invention.
Another schematic diagram showing an installation example of the line distinguishing device according to the embodiment of the present invention.
A schematic diagram showing an installation example of the line sensor cameras in the embodiment of the present invention.
A schematic diagram showing a comparative example of line sensor camera installation.
A block diagram showing the configuration of the image processing unit in the embodiment of the present invention.
A flowchart showing the flow of processing by the image processing unit in the embodiment of the present invention.
A schematic diagram showing an example image of air section equipment captured in the embodiment of the present invention.
A schematic diagram showing an example image of crossover line equipment captured in the embodiment of the present invention.
An explanatory diagram showing an example image in which the sliding surfaces are extracted from the image shown in FIG. 7.
An explanatory diagram showing an example image in which the sliding surfaces are extracted from the image shown in FIG. 8.
An explanatory diagram showing an example image in which line point groups are extracted from the image shown in FIG. 7.
An explanatory diagram showing an example image in which line point groups are extracted from the image shown in FIG. 8.
An explanatory diagram showing an example image in which line blocks are created from the image shown in FIG. 11.
An explanatory diagram showing an example image in which line blocks are created from the image shown in FIG. 12.
An explanatory diagram showing an example image in which line parts are created from the image shown in FIG. 13.
An explanatory diagram showing an example image in which line parts are created from the image shown in FIG. 14.
A flowchart showing the flow of processing by the association unit in the embodiment of the present invention.
An explanatory diagram showing an example image in which an overhead line having a sliding surface is identified from the image shown in FIG. 15.
An explanatory diagram showing an example image in which another overhead line having a sliding surface is identified from the image shown in FIG. 15.
An explanatory diagram showing an example image in which an overhead line having a sliding surface is identified from the image shown in FIG. 16.
An explanatory diagram showing an example image in which unassociated lines are extracted from the image shown in FIG. 15.
An explanatory diagram showing an example image in which unassociated lines are extracted from the image shown in FIG. 16.
An explanatory diagram showing an example in which association is performed on the image shown in FIG. 21 by stereo measurement.
An explanatory diagram showing an example image in which unassociated lines are extracted from the image shown in FIG. 21.
An explanatory diagram showing the deviation and height of the lines obtained from the image shown in FIG. 7.
An explanatory diagram showing the deviation and height of the lines obtained from the image shown in FIG. 8.
A schematic diagram showing an example of air section equipment.
A schematic diagram showing an example of crossover line equipment.
 In the present invention, two line sensor cameras are installed on the roof of a vehicle, the images acquired by each line sensor camera are subjected to image processing, and the height and deviation of each line are obtained from the images captured by the line sensor cameras.
 The present invention is characterized in particular in that the positions of the lines are measured using line sensor cameras alone, and in that, by distinguishing the lines using stereo measurement, line positions can be measured up to a deviation of ±900 mm from the vehicle center.
 The range of ±900 mm deviation from the vehicle center is the range over which the positional information of overhead lines must be managed. Since the overhead line normally feeds current to the vehicle through the pantograph, it lies within ±250 mm (±300 mm on the Shinkansen) of the pantograph center. However, near the air section equipment around an overlap, or near crossover line equipment where two tracks intersect, the end of the pantograph may contact a second overhead line (an overhead line other than the main line). Measuring overhead line positions within ±900 mm of the vehicle center is therefore important.
 In the present invention, the terms "deviation", "line", "overlap", "air section", "crossover line", and "sliding surface" have the following meanings.
"Deviation": a railway term; the horizontal position of a line, expressed as the distance from the pantograph center.
"Line": a wire installed in the air in railway facilities, such as an overhead contact line, a suspension line, or a feeder.
"Overlap": an arrangement that divides the overhead line electrically and/or mechanically; it refers to air section / air joint equipment, and includes schemes such as the third-overhead-line scheme and the new-station scheme.
"Air section": an overlap in which the preceding and following overhead lines are sectioned without being electrically connected (see FIG. 26).
"Crossover line": an arrangement in which two sets of overhead lines above a turnout are crossed so as not to obstruct the passage of the pantograph (see FIG. 27).
"Sliding surface": a surface formed on an overhead line by wear from contact with the pantograph. A sliding surface normally exists because the overhead line is always in contact with the pantograph. However, in equipment sections with a second overhead line, such as air sections and crossover lines, there are places within the ±900 mm management range from the vehicle center where the pantograph does not touch the overhead line, so some portions have no sliding surface.
 Hereinafter, an embodiment of the line distinguishing device and the line distinguishing method according to the present invention will be described with reference to FIGS. 1 to 25.
 As shown in FIGS. 1 and 2, the line distinguishing device in this embodiment includes first and second line sensor cameras 11 and 12 and an illumination device 13 installed on the roof 10a of a train vehicle (hereinafter simply referred to as a vehicle) 10, and an image processing unit 14 installed inside the vehicle 10.
 FIG. 1 shows air section equipment and FIG. 2 shows crossover line equipment. In FIGS. 1 and 2, 1 is a main line, 1a is the suspension line of the main line, 2 is a sub main line, 2a is the suspension line of the sub main line, 3 is a rail, 4 is a crossover line, 4a is the suspension line of the crossover line, 5 is a feeder, 6 is a utility pole, and 7 is a pull-off fitting.
 The first and second line sensor cameras 11 and 12 are installed on both sides, in the vehicle width direction, of the roof 10a of the vehicle 10 in order to measure the deviation and height of the lines in the measurement target area. More specifically, as shown in FIG. 3, their orientation and elevation angle are set so that each optical axis points obliquely upward toward the center in the vehicle width direction and the scanning line direction (imaging line) is orthogonal to the traveling direction of the vehicle 10, thereby securing as the imaging range a measurement target area of ±900 mm from the vehicle 10 that covers crossover line equipment, air section equipment, and the like.
 That is, air section equipment, crossover line equipment, and the like require a wide stereo imaging area (the "measurement target area" in the figure) as already described; however, as shown in FIG. 4, if the first and second line sensor cameras 11 and 12 are installed pointing vertically in the limited installation environment of the roof 10a of the vehicle 10, regions a and b arise within the measurement target area where the two imaging ranges do not overlap, and stereo measurement cannot be performed there.
 In contrast, in the line distinguishing device according to this embodiment, as shown in FIG. 3, no region arises within the measurement target area where the imaging ranges do not overlap, that is, no region where stereo measurement is impossible, so stereo measurement can be performed over the entire measurement target area.
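 The overlap condition can be checked with simple plane geometry. The following sketch is illustrative only: the camera offsets, mounting height, tilt, and fan angle are assumed values, not dimensions from this specification.

```python
import math

def coverage(cam_x, tilt_deg, half_fov_deg, cam_h, wire_h):
    """Horizontal interval [mm] imaged by one line sensor camera at wire height.

    cam_x        : camera offset from the vehicle center [mm]
    tilt_deg     : optical-axis tilt from vertical, leaning toward the center
    half_fov_deg : half of the fan angle across the scanning line
    cam_h, wire_h: camera and wire heights [mm]
    """
    d = wire_h - cam_h
    lean = -1.0 if cam_x > 0 else 1.0          # the axis leans toward the center
    a1 = math.radians(lean * tilt_deg - half_fov_deg)
    a2 = math.radians(lean * tilt_deg + half_fov_deg)
    xs = (cam_x + d * math.tan(a1), cam_x + d * math.tan(a2))
    return min(xs), max(xs)

# illustrative dimensions (assumed, not taken from this specification)
left = coverage(-1300.0, 30.0, 30.0, 3800.0, 5100.0)
right = coverage(+1300.0, 30.0, 30.0, 3800.0, 5100.0)
# the stereo-measurable interval is the overlap of the two fans
stereo = (max(left[0], right[0]), min(left[1], right[1]))
covers_target = stereo[0] <= -900.0 and stereo[1] >= 900.0
```

 With the assumed tilt, the two fans overlap over the whole ±900 mm target; with vertical axes (tilt 0°) the same computation reproduces the uncovered regions a and b of FIG. 4.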
 The image data (captured images) acquired by the first and second line sensor cameras 11 and 12 are input to the image processing unit 14.
 The illumination device 13 illuminates the lines in the area imaged by the first and second line sensor cameras 11 and 12.
 The image processing unit 14 distinguishes the lines based on the image data acquired by the first and second line sensor cameras 11 and 12. As shown in FIG. 5, the image processing unit 14 includes an image input unit 14a, a sliding surface extraction unit 14b, a line extraction unit 14c, a combining unit 14d, an association unit 14e, a deviation/height calculation unit 14f, and a storage unit 14g.
 The image input unit 14a acquires image data from the first and second line sensor cameras 11 and 12. The image data acquired by the image input unit 14a is stored in the storage unit 14g.
 The sliding surface extraction unit 14b detects, by image processing, the portions (sliding surface portions) corresponding to the sliding surfaces 8 (see FIGS. 7, 8, etc.) from the image data acquired by the image input unit 14a. The detected sliding surface portion data (sliding surface data) is stored in the storage unit 14g.
 The line extraction unit 14c detects line point groups, by image processing, from the image data acquired by the image input unit 14a. The detected line point group data is stored in the storage unit 14g.
 The combining unit 14d creates, from the line point group data detected by the line extraction unit 14c, line parts (line portions) each formed by combining the line points corresponding to one line. The created line part data is stored in the storage unit 14g.
 The association unit 14e associates the line parts extracted from the image data of the first and second line sensor cameras 11 and 12 based on the line part data and the sliding surface data. The associated data (line association data) is stored in the storage unit 14g.
 The deviation/height calculation unit 14f calculates the deviation and height of each line based on the line association data produced by the association unit 14e. The calculated deviation and height data is output to a display unit or the like (not shown).
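 The stereo computation performed on an associated pair can be illustrated as follows. This is a minimal sketch under a simplified pinhole model in the transverse (deviation, height) plane; all calibration values are assumptions, not values from this specification.

```python
import numpy as np

def pixel_to_ray(axis_deg, fov_deg, n_px, px):
    """Ray direction in the (deviation, height) plane for one line-camera pixel.

    axis_deg: optical-axis angle from vertical (positive = toward vehicle center)
    fov_deg : full fan angle across the n_px-pixel scanning line
    """
    ang = np.radians(axis_deg + fov_deg * (px / (n_px - 1) - 0.5))
    return np.array([np.sin(ang), np.cos(ang)])   # (horizontal, vertical)

def triangulate(p1, d1, p2, d2):
    """Intersect the rays p1 + t*d1 and p2 + s*d2 (2x2 linear solve)."""
    A = np.column_stack([d1, -d2])
    t, _ = np.linalg.solve(A, p2 - p1)
    return p1 + t * d1

# illustrative camera calibration (assumed, not from this specification)
c1, c2 = np.array([-1300.0, 3800.0]), np.array([1300.0, 3800.0])
r1 = pixel_to_ray(+30.0, 60.0, 2048, 1200)   # pixel of one line part in image I1
r2 = pixel_to_ray(-30.0, 60.0, 2048, 700)    # associated pixel in image I2
dev, height = triangulate(c1, r1, c2, r2)    # deviation and height of the line
```

 The association step is what makes this solve meaningful: triangulating pixels that belong to different physical lines would yield a spurious intersection.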
 Hereinafter, the flow of the line distinguishing processing by the image processing unit 14 will be described with reference to FIGS. 6 to 17.
 As shown in FIG. 6, in this embodiment, image data is first acquired from the first and second line sensor cameras 11 and 12 by the image input unit 14a in step S1. FIG. 7 shows example images in which the image data corresponding to the air section equipment shown in FIG. 1 is arranged in time series, and FIG. 8 shows example images in which the image data corresponding to the crossover line equipment shown in FIG. 2 is arranged in time series. In FIGS. 7 and 8, the image I1 obtained from the first line sensor camera 11 is shown on the left, and the image I2 obtained from the second line sensor camera 12 is shown on the right. In the images I1 and I2 shown in FIG. 7, the sub main line 2 and its suspension line 2a are captured in addition to the main line 1 and its suspension line 1a, the feeder 5, the utility pole 6, and the pull-off fitting 7. In the images I1 and I2 shown in FIG. 8, the crossover line 4 and its suspension line 4a are captured in addition to the main line 1 and its suspension line 1a, the feeder 5, the utility pole 6, and the pull-off fitting 7. In FIGS. 7 and 8, 8 denotes a sliding surface.
 Subsequently, in step S2, the sliding surface extraction unit 14b extracts the sliding surface portions from the images I1 and I2 as described above. A known method may be used to extract the sliding surface portions, and a detailed description is omitted here. FIG. 9 shows the result of extracting the sliding surface portions from the images I1 and I2 shown in FIG. 7, and FIG. 10 shows the result for the images I1 and I2 shown in FIG. 8.
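 The specification leaves the extraction to known methods. As one illustrative possibility (an assumption, not the method of this specification), the worn surface can be sought as a narrow bright band in each scan line, since it reflects the illumination more strongly than the unworn wire:

```python
import numpy as np

def extract_sliding_surface(img, thresh=200, max_width=12):
    """Mark candidate sliding-surface pixels in a line-scan image.

    img: 2D uint8 array, rows = successive scans.  For every scan line (row),
    bright runs no wider than max_width pixels are kept; wider bright regions
    are rejected as glare.  Both thresholds are illustrative.
    """
    mask = np.zeros(img.shape, dtype=bool)
    for y, row in enumerate(img >= thresh):
        x = 0
        while x < row.size:
            if row[x]:
                end = x
                while end < row.size and row[end]:
                    end += 1
                if end - x <= max_width:       # keep only the thin bright band
                    mask[y, x:end] = True
                x = end
            else:
                x += 1
    return mask
```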
 Subsequently, in step S3, the line extraction unit 14c extracts line point groups from the images I1 and I2 as described above. The line point group referred to here is a set of points carrying only positional information on the image, with no relationship between the points. A known method may be used to extract the line point groups, and a detailed description is omitted here. Where a disturbance such as the utility pole 6 is present, that part is treated as missing. FIG. 11 shows the result of detecting the line point groups from the images I1 and I2 shown in FIG. 7, and FIG. 12 shows the result for the images I1 and I2 shown in FIG. 8. The portions shown in white in FIGS. 11 and 12 correspond to the line point groups.
 Subsequently, in step S4, the combining unit 14d combines the line points in the line point group data to create the line parts corresponding to the individual lines. Specifically, line points whose positions are continuous on each of the images I1 and I2 are first combined to create line blocks. FIG. 13 shows the result of creating line blocks from the line point groups shown in FIG. 11, and FIG. 14 shows the result for the line point groups shown in FIG. 12. In FIGS. 13 and 14, different line blocks are drawn in different patterns. When a plurality of continuous line blocks intersect, the combining unit 14d distinguishes the blocks that cross over one block as separate line blocks. Also, even for a point group that actually corresponds to a single line, if it contains a missing part, the combining unit 14d treats the portions on either side of the missing part as separate line blocks.
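 The creation of line blocks from a line point group can be sketched as a row-to-row continuity grouping. The tolerance `max_dx` is an assumed parameter; the specification does not state one.

```python
def group_into_blocks(points, max_dx=2):
    """Group line points into 'line blocks' by row-to-row continuity.

    points: iterable of (row, col) pixel coordinates.  Two points in
    consecutive rows whose columns differ by at most max_dx belong to the
    same block; any break in continuity (e.g. a missing part behind a
    utility pole) starts a new block.
    """
    blocks = []        # each block: list of (row, col)
    open_blocks = []   # blocks that reached the previous row
    for row in sorted({r for r, _ in points}):
        cols = sorted(c for r, c in points if r == row)
        next_open = []
        for c in cols:
            for b in open_blocks:
                last_r, last_c = b[-1]
                if (last_r == row - 1 and abs(last_c - c) <= max_dx
                        and all(b is not x for x in next_open)):
                    b.append((row, c))
                    next_open.append(b)
                    break
            else:
                b = [(row, c)]       # no continuation found: start a new block
                blocks.append(b)
                next_open.append(b)
        open_blocks = next_open
    return blocks
```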
 Further, line blocks corresponding to the same line are combined to create the line part corresponding to each individual line. FIG. 15 shows the result of creating line parts A to E and F to J from the line blocks shown in FIG. 13, and FIG. 16 shows the result of creating line parts K to O and P to T from the line blocks shown in FIG. 14. The combining unit 14d decides which line blocks to combine by judging connectivity using the length and angle of each line block, its approximate quadratic curve coefficients, and the positions of its ends. Missing portions are interpolated using the approximate quadratic curve coefficients between the line blocks.
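 The block-joining test and the gap interpolation can be sketched with quadratic fits. The descriptors match those named in the text (approximate quadratic curve coefficients and end positions), but the tolerances are assumed for illustration.

```python
import numpy as np

def quad_coeffs(block):
    """Fit col = a*row**2 + b*row + c to one line block of (row, col) points."""
    rows, cols = zip(*block)
    return np.polyfit(rows, cols, 2)

def should_join(b1, b2, coeff_tol=0.5, end_tol=15.0):
    """Heuristic join test for block b1 ending before block b2 starts: the
    quadratic fits must be similar, and extrapolating b1's curve to the
    first row of b2 must land near b2's first point."""
    c1, c2 = quad_coeffs(b1), quad_coeffs(b2)
    r2, col2 = b2[0]
    predicted = np.polyval(c1, r2)
    return bool(np.all(np.abs(c1 - c2) < coeff_tol)
                and abs(predicted - col2) < end_tol)

def fill_gap(b1, b2):
    """Interpolate the missing rows between b1 and b2 on b1's fitted curve."""
    c1 = quad_coeffs(b1)
    return [(r, float(np.polyval(c1, r)))
            for r in range(b1[-1][0] + 1, b2[0][0])]
```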
 The processing of steps S1 to S4 described above is performed for each camera (two cameras in this embodiment).
 Subsequently, in step S5, the association unit 14e associates the corresponding line parts between the images I1 and I2.
 The flow of processing in the association unit 14e is described in detail below with reference to FIG. 17.
 In the association unit 14e, first, in step S5-1, the line part data corresponding to the first and second line sensor cameras 11 and 12 (created for each of the images I1 and I2) by the combining unit 14d is input.
 Subsequently, in step S5-2, line parts that include the sliding surface portions extracted by the sliding surface extraction unit 14b are detected, and the detected line parts are preferentially associated as the main line or the sub main line. FIGS. 18A and 18B show the result of association based on the sliding surface extraction result shown in FIG. 9 and the line parts shown in FIG. 15, and FIG. 19 shows the result based on the sliding surface extraction result shown in FIG. 10 and the line parts shown in FIG. 16. As shown in FIGS. 18A and 18B, in the air section equipment, line part C and line part G are associated as the same line, and line part E and line part J are associated as the same line. As shown in FIG. 19, in the crossover line equipment, line part M and line part Q are associated as the same line.
 Subsequently, in step S5-3, for each filament including a sliding surface portion that was associated in step S5-2, a filament part having the following features (1) to (3) is determined to be the suspension wire of that filament and is associated accordingly. As a result, in the air section equipment, filament part B and filament part H are associated as the same filament, as are filament part D and filament part I.
(1) It has a similar shape.
(2) It is an overhead wire within the same section.
(3) Its deviation position is close.
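The three step S5-3 conditions can be sketched as a simple predicate. The shape metric (a single curvature value), the section label, and the thresholds are illustrative assumptions; the patent does not specify how similarity is quantified.

```python
# Hypothetical test for whether "part" is the suspension wire of an
# already-matched contact wire "contact_part".
def is_suspension_candidate(part, contact_part,
                            shape_tol=0.2, deviation_tol=30):
    similar_shape = abs(part["curvature"] - contact_part["curvature"]) < shape_tol  # (1)
    same_section = part["section"] == contact_part["section"]                       # (2)
    close_deviation = abs(part["x"] - contact_part["x"]) < deviation_tol            # (3)
    return similar_shape and same_section and close_deviation

contact = {"curvature": 0.10, "section": 2, "x": 120}
cand_b = {"curvature": 0.15, "section": 2, "x": 118}   # suspension wire just above
cand_a = {"curvature": 0.90, "section": 1, "x": 40}    # feeder wire: fails all three
print(is_suspension_candidate(cand_b, contact))  # True
print(is_suspension_candidate(cand_a, contact))  # False
```

Only a part passing all three tests, in both images, is paired as the suspension wire (e.g. B with H and D with I in the air section case).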
 Subsequently, in step S5-4, it is determined whether two or more unassociated filament parts (filaments that do not include a sliding surface portion) remain.
 For the five filament parts A to E and F to J shown in FIG. 15, excluding the four associated in step S5-3 as filaments or their suspension wires, only one unassociated part remains, corresponding to the feeder wire 5 (filament part A and filament part F), as shown in FIG. 20. When one or fewer unassociated filament parts remain in this way (NO), the process proceeds to step S5-8 described later.
 On the other hand, of the five filament parts K to O and P to T created from the image data for the crossover equipment shown in FIG. 16, excluding the two associated in step S5-3 as filaments or their suspension wires, three remain (filament parts K, N, and O, and filament parts P, S, and T), as shown in FIG. 21. When two or more unassociated filament parts remain in this way (YES), the process proceeds to step S5-5, and two unassociated filament parts of similar shape are detected as candidates for the crossover wire 4 and the crossover suspension wire 4a. In the example shown in FIG. 21, the similarly shaped filament parts N and O and filament parts S and T become the candidates for the crossover wire 4 and the crossover suspension wire 4a.
 Subsequently, in step S5-6, stereo measurement is performed for every combination of the two similarly shaped unassociated filament parts extracted from the image I1 (filament parts N and O in the example of FIG. 21) with the two unassociated filament parts extracted from the image I2 (filament parts S and T in the example of FIG. 21). As a result, four overhead wire candidates at different positions are obtained, as indicated by the solid and two-dot chain lines in FIG. 22.
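The step S5-6 exhaustive pairing can be sketched as a Cartesian product: two parts per image give four cross-image pairings, each triangulated to a candidate position. The stand-in `triangulate` function and the numeric positions below are invented for illustration; a real implementation would intersect the calibrated camera rays.

```python
from itertools import product

def stereo_candidates(parts_i1, parts_i2, triangulate):
    """Triangulate all cross-image pairings of the unmatched parts."""
    return {(a, b): triangulate(a, b) for a, b in product(parts_i1, parts_i2)}

# Toy triangulation stand-in mapping index pairs to assumed (deviation, height):
# (N, S) and (O, T) are the real wires, sharing deviation 100; the other two
# pairings are the fictitious candidates of FIG. 22.
positions = {("N", "S"): (100, 5600), ("O", "T"): (100, 5000),
             ("N", "T"): (150, 5300), ("O", "S"): (60, 5300)}
cands = stereo_candidates(["N", "O"], ["S", "T"], lambda a, b: positions[(a, b)])
print(len(cands))  # 4
```

Only two of the four candidates correspond to physical wires; step S5-7 selects them by their shared deviation position.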
 Subsequently, in step S5-7, the two filament parts at the same deviation among the four overhead wire candidates are identified as the crossover wire 4 and the crossover suspension wire 4a, and are associated. That is, since the crossover wire 4 and its suspension wire 4a lie at the same deviation position, the two candidates at the same deviation position, shown by the solid lines in FIG. 22, are the crossover wire 4 and the crossover suspension wire 4a, while the candidates at the other two deviation positions, shown by the two-dot chain lines in FIG. 22, are fictitious overhead wires that do not actually exist. Further, of the two overhead wires at the same deviation position, the lower one in the vertical direction is identified as the crossover wire 4 and the higher one as the crossover suspension wire 4a.
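The step S5-7 selection rule can be sketched as follows: find the pair of candidates sharing a deviation position, then order it by height. The tolerance and the (deviation, height) units are illustrative assumptions.

```python
def identify_crossover(candidates, deviation_tol=10):
    """Return (crossover, suspension): the same-deviation pair, lower wire first."""
    cands = list(candidates)
    for i, a in enumerate(cands):
        for b in cands[i + 1:]:
            if abs(a[0] - b[0]) <= deviation_tol:      # same deviation position
                low, high = sorted((a, b), key=lambda c: c[1])
                return low, high                        # crossover, then suspension
    return None

# Four candidates from the exhaustive stereo step; only two share a deviation.
cands = [(100, 5600), (150, 5300), (60, 5300), (100, 5000)]
print(identify_crossover(cands))  # ((100, 5000), (100, 5600))
```

The two candidates at deviations 150 and 60 are discarded as fictitious; of the matched pair, the lower wire (height 5000) is taken as the crossover wire 4 and the higher (5600) as its suspension wire 4a.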
 In step S5-8, the remaining filament parts are associated. For the air section equipment, when the processing of step S5-3 is complete, only one of the five filament parts A to E and F to J shown in FIG. 15 remains unassociated (filament part A and filament part F), as shown in FIG. 20. For the crossover equipment, when the processing of step S5-7 is complete, only one of the five filament parts K to O and P to T shown in FIG. 16 remains unassociated (filament part K and filament part P), as shown in FIG. 23. The remaining unassociated filament part is therefore identified as the feeder wire 5.
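A sketch of the step S5-8 elimination: after the contact, suspension, and crossover wires are matched, exactly one part per image should remain, and it is paired across the images as the feeder wire. The function and part names below follow the figures but are otherwise illustrative.

```python
def pair_feeder(parts_i1, parts_i2, matched_i1, matched_i2):
    """Pair the single leftover part in each image as the feeder wire."""
    rest1 = [p for p in parts_i1 if p not in matched_i1]
    rest2 = [p for p in parts_i2 if p not in matched_i2]
    if len(rest1) == 1 and len(rest2) == 1:
        return ("feeder", rest1[0], rest2[0])
    return None  # unexpected leftover count: no feeder identified

# Air section case: B-E matched with G-J in earlier steps, leaving A and F.
print(pair_feeder(list("ABCDE"), list("FGHIJ"),
                  {"B", "C", "D", "E"}, {"G", "H", "I", "J"}))
# ('feeder', 'A', 'F')
```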
 Thereafter, in step S5-9, the result of the filament-part association is output as filament association data.
 Following step S5 described above, in step S6 the deviation/height calculation unit 14f calculates the deviation and height of each filament based on the filament association data produced by the associating unit 14e. That is, the filament parts associated between the first and second line sensor cameras 11 and 12 are stereo-measured, and the deviation and height of each filament are calculated. FIG. 24 shows the deviations and heights calculated from the images I1 and I2 of FIG. 7, and FIG. 25 shows those calculated from the images I1 and I2 of FIG. 8.
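The stereo measurement of step S6 can be sketched as a two-ray intersection: each roof-mounted camera sees the matched wire at some elevation angle, and the two rays fix its deviation and height. The camera spacing, angle convention, and origin at the camera plane are illustrative assumptions; a real system would use the calibrated geometry of the line sensor cameras.

```python
import math

def triangulate(cam1_x, ang1, cam2_x, ang2):
    """Intersect two rays from roof-mounted cameras.

    ang is the elevation angle (radians) from each camera to the wire;
    cameras sit at height 0, offset cam_x from the vehicle centre.
    """
    # Ray i: z = tan(ang_i) * (x - cam_i_x). Solve the two equations for x, z.
    t1, t2 = math.tan(ang1), math.tan(ang2)
    x = (t1 * cam1_x - t2 * cam2_x) / (t1 - t2)
    z = t1 * (x - cam1_x)
    return x, z  # (deviation, height) relative to the camera plane

# Wire at deviation 0.2 m and height 5.0 m; cameras 1.3 m either side of centre.
a1 = math.atan2(5.0, 0.2 - (-1.3))   # angle seen from the left camera
a2 = math.atan2(5.0, 0.2 - 1.3)      # angle seen from the right camera
x, z = triangulate(-1.3, a1, 1.3, a2)
print(round(x, 3), round(z, 3))  # 0.2 5.0
```

Running this per matched filament part yields the per-wire deviation/height results plotted in FIGS. 24 and 25.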
 The filament discriminating device and filament discriminating method according to the present embodiment configured as described above provide the following effects (1) to (4).
(1) Heights and deviations can be measured over a wide range with only two line sensor cameras (the first and second line sensor cameras 11 and 12).
(2) Because stereo measurement is performed with the two (first and second) line sensor cameras 11 and 12, imaging at a high cycle keeps the measurement interval short even on a high-speed vehicle such as a commercial train.
(3) Measurement is possible without height limitation as long as the wires are within the performance range of the first and second line sensor cameras 11 and 12.
(4) In addition to distinguishing the main line from the other overhead wires using the information on the sliding surface 8, by using the positional relationship of the stereo measurement patterns of the two overhead wires other than the main line, a second overhead wire can be measured even without intersection information with the overhead wire having the sliding surface 8.
 The present invention is not limited to the embodiment described above; it goes without saying that various modifications are possible without departing from the scope of the invention. For example, although the embodiment extracts the sliding surface portions from the images I1 and I2 with the sliding surface extraction unit 14b before extracting the filament point groups from the images I1 and I2 with the filament extraction unit 14c, the filament point groups may instead be extracted first by the filament extraction unit 14c, and the sliding surface portions extracted afterwards by the sliding surface extraction unit 14b.
DESCRIPTION OF REFERENCE SYMBOLS
 1 Main line
 1a Suspension wire of the main line
 2 Sub main line
 2a Suspension wire of the sub main line
 3 Rail
 4 Crossover wire
 4a Suspension wire of the crossover wire
 5 Feeder wire
 6 Utility pole
 7 Pull-off fitting
 8 Sliding surface
 10 Train vehicle
 10a Roof of the vehicle
 11 First line sensor camera
 12 Second line sensor camera
 13 Illumination device
 14 Image processing unit
 14a Image input unit
 14b Sliding surface extraction unit
 14c Filament extraction unit
 14d Combining unit
 14e Associating unit
 14f Deviation/height calculation unit
 14g Storage unit
 A to T Filament parts
 I1 Image acquired by the first line sensor camera
 I2 Image acquired by the second line sensor camera

Claims (4)

  1.  A filament discriminating device comprising: a first line sensor camera and a second line sensor camera, which are arranged at both ends in the sleeper direction on the roof of a vehicle, each inclined toward the center of the vehicle in the sleeper direction, and which image filaments to be measured; and an image processing unit which, based on captured images acquired from the first line sensor camera and the second line sensor camera, extracts filament portions corresponding to the filaments and sliding surface portions corresponding to sliding surfaces of the filaments, associates corresponding filament portions between the captured images, and calculates heights and deviations of the filaments based on a result of the association, wherein
     the image processing unit
     associates, when a filament portion includes a sliding surface portion, the corresponding filament portions between the captured images with each other and associates filament portions similar to that filament portion with each other, and,
     after associating the filament portions that include a sliding surface portion, associates the remaining filament portions between the captured images with each other when only one filament portion not including a sliding surface portion exists, and, when two or more filament portions not including a sliding surface portion exist, selects two similar filament portions and associates the corresponding filament portions between the captured images based on a result of stereo measurement performed on the two filament portions between the captured images.
  2.  The filament discriminating device according to claim 1, wherein the image processing unit comprises:
     a sliding surface extraction unit which detects the sliding surface portions from the captured image acquired by the first line sensor camera and the captured image acquired by the second line sensor camera;
     a filament extraction unit which detects filament point groups corresponding to the filament portions from the captured image acquired by the first line sensor camera and the captured image acquired by the second line sensor camera;
     a combining unit which combines the filament point groups so as to correspond to the respective filaments to create filament parts;
     an associating unit which associates, when a filament part includes a sliding surface portion, the corresponding filament parts between the captured images with each other and associates filament parts similar to that filament part with each other, and, after associating the filament parts that include a sliding surface portion, associates the remaining filament parts between the captured images with each other when only one filament part not including a sliding surface portion exists, and, when two or more filament parts not including a sliding surface portion exist, selects two similar filament parts and associates the corresponding filament parts between the captured images based on a result of stereo measurement performed on the two filament parts between the captured images; and
     a stereo measurement unit which stereo-measures the filament parts respectively acquired from the captured image acquired by the first line sensor camera and the captured image acquired by the second line sensor camera, and calculates the heights and deviations of the filaments.
  3.  A filament discriminating method in which, from captured images acquired from a first line sensor camera and a second line sensor camera, which are arranged at both ends in the sleeper direction on the roof of a vehicle, each inclined toward the center of the vehicle in the sleeper direction, and which image filaments to be measured, sliding surface portions corresponding to sliding surfaces of the filaments and filament portions corresponding to the filaments are extracted, corresponding filament portions are associated between the captured images respectively captured by the first line sensor camera and the second line sensor camera, and heights and deviations of the filaments are calculated based on a result of the association, the method comprising
     an associating step of associating, when a filament portion includes a sliding surface portion, the corresponding filament portions between the captured images with each other and associating filament portions similar to that filament portion with each other, and, after associating the filament portions that include a sliding surface portion, associating the remaining filament portions between the captured images with each other when only one filament portion not including a sliding surface portion exists, and, when two or more filament portions not including a sliding surface portion exist, selecting two similar filament portions and associating the corresponding filament portions between the captured images based on a result of stereo measurement performed on the two filament portions between the captured images.
  4.  The filament discriminating method according to claim 3, comprising:
     a sliding surface extraction step of detecting the sliding surface portions from the captured image acquired by the first line sensor camera and the captured image acquired by the second line sensor camera;
     a filament extraction step of detecting filament point groups corresponding to the filaments from the captured image acquired by the first line sensor camera and the captured image acquired by the second line sensor camera;
     a combining step of combining the filament point groups so as to correspond to the respective filaments to create filament parts;
     the associating step; and
     a stereo measurement step of stereo-measuring the filament parts respectively acquired from the captured image acquired by the first line sensor camera and the captured image acquired by the second line sensor camera, and calculating the heights and deviations of the filaments,
     wherein, in the associating step,
     when a filament part includes a sliding surface portion, the corresponding filament parts between the captured images are associated with each other, and filament parts similar to that filament part are associated with each other, and,
     after the filament parts that include a sliding surface portion are associated, the remaining filament parts are associated between the captured images with each other when only one filament part not including a sliding surface portion exists, and, when two or more filament parts not including a sliding surface portion exist, two similar filament parts are selected and the corresponding filament parts are associated between the captured images based on a result of stereo measurement performed on the two filament parts between the captured images.
PCT/JP2018/022139 2017-06-12 2018-06-11 Line distinguishing device and line distinguishing method WO2018230480A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019525396A JP6844697B2 (en) 2017-06-12 2018-06-11 Line discriminating device and line discriminating method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017114894 2017-06-12
JP2017-114894 2017-06-12

Publications (1)

Publication Number Publication Date
WO2018230480A1 true WO2018230480A1 (en) 2018-12-20

Family

ID=64659761

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/022139 WO2018230480A1 (en) 2017-06-12 2018-06-11 Line distinguishing device and line distinguishing method

Country Status (2)

Country Link
JP (1) JP6844697B2 (en)
WO (1) WO2018230480A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200378750A1 (en) * 2019-05-29 2020-12-03 Hitachi High-Tech Fine Systems Corporation Overhead wire mutual separating situation measuring apparatus and overhead wire mutual separating situation measuring method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009236574A (en) * 2008-03-26 2009-10-15 Railway Technical Res Inst Method and device for measuring trolley wire deflection using stereo technique
WO2011088509A1 (en) * 2010-01-20 2011-07-28 Jrb Engineering Pty Ltd Optical overhead wire measurement
JP2016065838A (en) * 2014-09-26 2016-04-28 株式会社明電舎 Apparatus for measuring wire and method thereof
JP2017009446A (en) * 2015-06-23 2017-01-12 株式会社明電舎 Wire measurement device and method


Also Published As

Publication number Publication date
JPWO2018230480A1 (en) 2020-03-19
JP6844697B2 (en) 2021-03-17

Similar Documents

Publication Publication Date Title
JP5245445B2 (en) Crossover measuring device
JP6424362B2 (en) Filament measurement apparatus and method thereof
US10740936B2 (en) Trolley-wire display device, trolley-wire display system, and trolley-wire display data creation method
JP6450971B2 (en) Trolley wire wear measuring device and trolley wire wear measuring method
JP5698285B2 (en) Overhead wire position measuring apparatus and method
JP6644720B2 (en) Train wire fitting detection system and its detection method
JP6518940B2 (en) Filament measurement apparatus and method
WO2018230480A1 (en) Line distinguishing device and line distinguishing method
JP5549488B2 (en) Trolley wire inspection device
TW201903646A (en) Track identification device and program
JP7225616B2 (en) Wire measuring device and wire measuring method
JP5952759B2 (en) Overhead wire position measuring apparatus and method
JP6311757B2 (en) Insulator detecting device and insulator detecting method
JP2019218022A (en) Rail track detection device
JP2020179798A (en) Turnout detection device and turnout detection method
JP6308681B2 (en) Crossover equipment monitoring device and method
JP6389783B2 (en) Crossover relative position management apparatus and method
CN109697709B (en) Contact net tracking method and system in pantograph system
WO2024070612A1 (en) Line abnormality detection system and line abnormality detection method
JP2014198524A (en) Overhead wire inspection method and apparatus
JP7384083B2 (en) Contact wire inspection device and contact wire inspection method
JP2021157486A (en) Detection method of railroad rail with image
TW202421463A (en) Line abnormality detection system and line abnormality detection method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18817426; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2019525396; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 18817426; Country of ref document: EP; Kind code of ref document: A1)