EP2431706B1 - Device for measuring displacement of pantograph and method for detecting hard spot of trolley wire - Google Patents


Info

Publication number
EP2431706B1
Authority
EP
European Patent Office
Prior art keywords
pantograph
marker
image
template
input image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP10774953.3A
Other languages
German (de)
French (fr)
Other versions
EP2431706A4 (en)
EP2431706A1 (en)
Inventor
Takamasa Fujisawa
Yusuke Watabe
Kenji Kato
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Meidensha Corp
Meidensha Electric Manufacturing Co Ltd
Original Assignee
Meidensha Corp
Meidensha Electric Manufacturing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Meidensha Corp, Meidensha Electric Manufacturing Co Ltd filed Critical Meidensha Corp
Publication of EP2431706A1
Publication of EP2431706A4
Application granted
Publication of EP2431706B1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60LPROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
    • B60L5/00Current collectors for power supply lines of electrically-propelled vehicles
    • B60L5/18Current collectors for power supply lines of electrically-propelled vehicles using bow-type collectors in contact with trolley wire
    • B60L5/22Supporting means for the contact bow
    • B60L5/26Half pantographs, e.g. using counter rocking beams
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60MPOWER SUPPLY LINES, AND DEVICES ALONG RAILS, FOR ELECTRICALLY- PROPELLED VEHICLES
    • B60M1/00Power supply lines for contact with collector on vehicle
    • B60M1/12Trolley lines; Accessories therefor
    • B60M1/28Manufacturing or repairing trolley lines
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/248Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/30Noise filtering
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60LPROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
    • B60L2200/00Type of vehicles
    • B60L2200/26Rail vehicles
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/24Aligning, centring, orientation detection or correction of the image
    • G06V10/245Aligning, centring, orientation detection or correction of the image by locating a pattern; Special marks for positioning

Definitions

  • the present invention relates to a device for measuring displacement of a pantograph and a method for detecting a hard spot of a trolley wire, and particularly relates to a device for measuring displacement of a pantograph and a method for detecting a hard spot of a trolley wire in which a hard spot of a trolley wire is measured through pattern-matching processing using an image, captured by a line sensor camera, of a pantograph equipped with a marker designed for measuring a hard spot.
  • a trolley wire is in a state of being suspended from a messenger wire via a hanger, for example.
  • the weight of the trolley wire increases locally, as compared with other spots, at spots where hangers are placed and at spots such as where the trolley wire is connected to another trolley wire or where there is a pull-off. These spots are called "hard spots of a trolley wire."
  • a pantograph, which is a current collector placed on the roof of a car and configured to slide on the trolley wire, sinks abruptly in some cases at these spots due to the locally increased weight of the trolley wire.
  • the trolley wire loses its contact with the pantograph and causes a discharge phenomenon called arc discharge.
  • the trolley wire is locally worn away due to the heat produced by the arc discharge.
  • the trolley wire is thought to be worn away faster at hard spots than at the other spots.
  • the pantograph accelerates greatly in the vertical direction at hard spots of the trolley wire. For this reason, the vertical acceleration of the pantograph, whose displacement is equivalent to that of the trolley wire, is monitored in order to detect the hard spots of the trolley wire.
  • the acceleration of the pantograph can be found by measuring the displacement of the pantograph and performing second order differentiation on the displacement.
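This second-order differentiation of sampled displacement can be illustrated with a central finite difference; a minimal Python sketch (the function name, sampling period, and test values are illustrative, not from the patent):

```python
def vertical_acceleration(displacement_mm, dt_s):
    # Second-order central difference on uniformly sampled displacement:
    # a[i] = (d[i-1] - 2*d[i] + d[i+1]) / dt^2, for interior samples only.
    return [
        (displacement_mm[i - 1] - 2.0 * displacement_mm[i] + displacement_mm[i + 1]) / dt_s ** 2
        for i in range(1, len(displacement_mm) - 1)
    ]

# A parabolic displacement d(t) = 0.5 * a * t^2 recovers the constant a.
a_true = 9810.0   # 1 G expressed in mm/s^2
dt = 0.001        # a 1 kHz line rate gives one sample per millisecond
d = [0.5 * a_true * (i * dt) ** 2 for i in range(10)]
acc = vertical_acceleration(d, dt)
```

In practice the displacement is smoothed first (as the description notes later), since quantization noise is amplified heavily by double differentiation.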
  • the following methods have conventionally been known as pantograph displacement measuring methods.
  • in the first method, the displacement of a pantograph is measured by scanning the pantograph with a laser by using a mirror or the like, and studying the phase difference between reflected waves, the deformation of the shape of the reflected laser, or the like.
  • in the second method, the displacement of a pantograph is measured by projecting stripe-pattern light onto the pantograph, and receiving light with zigzag streaks which correspond to the shape of the pantograph.
  • in the third method, the displacement of a pantograph is measured by capturing an image of the pantograph by use of a line sensor camera placed on the roof of a car, and then performing processing such as model matching or pattern matching on the captured image by use of a processing computer (see Patent Documents 1 and 2, for example).
  • the image processing method is such that: from the image of the pantograph captured by the line sensor camera, a pixel position on the image is extracted at which a beforehand-prepared model of the pantograph finds a match; then, the actual height of the pantograph is calculated from the pixel position on the image on the basis of the distance from the line sensor camera to the pantograph, the focal length of the lens of the image capturing unit, and the like.
  • a pixel position at which the pre-acquired model of the pantograph finds a match is detected from the captured image of the pantograph as the position of the pantograph.
  • a marker in a black and white stripe pattern is attached to the pantograph placed on the roof of a car, and pattern matching is performed to detect the position of the marker, i.e., the position of the pantograph, from an image captured by the line sensor camera.
  • Patent Document 2 discloses a displacement measuring device and method for a pantograph having in common with the present invention the features in the pre-characterizing portions of the independent claims.
  • a line sensor camera 2 is placed on the roof of a car 1 in such a posture as to face obliquely upward in order to capture a marker 4 attached to a pantograph 1a.
  • a camera elevation angle θA of the line sensor camera 2 is small when the distance from the line sensor camera 2 to the pantograph 1a is long.
  • a camera elevation angle θB of the line sensor camera 2 is large when the distance from the line sensor camera 2 to the pantograph 1a is short.
  • the line sensor camera 2 illustrated with a solid line and the line sensor camera 2 illustrated with a broken line in Fig. 13 will be referred to as line sensor cameras 2A and 2B, respectively.
  • Parts (a) and (b) of Fig. 14 show example input images of the marker 4 captured by the line sensor cameras 2A and 2B, respectively.
  • the small camera elevation angle θA allows the width of a trace M of the marker to remain substantially the same in an input image 6A as shown in Part (a) of Fig. 14 , showing that different heights of the pantograph 1a will cause almost no resolution difference.
  • the large camera elevation angle θB makes the trace M of the marker appear differently depending on the height as shown in Part (b) of Fig. 14 , showing that different heights of the pantograph 1a cause a resolution difference in an input image 6B.
  • performing pattern matching processing on an image 6A captured by the line sensor camera 2A is highly likely to result in successful pattern matching as shown in Part (a) of Fig. 14 .
  • performing pattern-matching processing on an image 6B captured by the line sensor camera 2B, on the other hand, can cause a problem in which the trace no longer matches the size of the template 7 as shown in Part (b) of Fig. 14 , and the pattern matching fails.
  • the present invention is characterized by providing a device for measuring displacement of a pantograph and a method for detecting a hard spot of a trolley wire which can improve the accuracy of pattern-matching processing.
  • the device for measuring displacement of a pantograph and the method for detecting a hard spot of a trolley wire according to the present invention are defined in the independent claims. Further advantageous features are set out in the dependent claims.
  • "calibration means" refers to means that finds the resolution of each pixel by using, for example, the calibration method of Japanese Patent Application No. 2009-011648.
  • the device for measuring displacement of a pantograph of the invention allows the pixel position of the marker to be detected highly accurately, and therefore a hard spot of a trolley wire can be found.
  • the pattern-matching processing means could perform pattern-matching processing only on a predetermined range of the input image on the basis of a result of pattern-matching processing having been performed on an immediately previous line. Accordingly, it is possible to shorten the processing time and also to lower the probability of detecting noise and the like.
  • with the device for measuring displacement of a pantograph of the present invention, it is possible to efficiently perform the pattern-matching processing by setting the shifting pitch of the template lower in a case of a higher correlation value than in a case of a lower correlation value.
  • the pattern-matching processing means may automatically correct positions of the sections such that the trace of the marker is included within one of the sections on the basis of the position of the marker detected in an immediately previous line. Accordingly, it is possible to efficiently perform the pattern-matching processing by preventing the occurrence of the switching of the template within the processing range of the pattern matching attributable to the presence of a partitioning position within the processing range.
  • the pattern-matching processing means may detect a rough center position of a trace of the marker, calculate an average luminance value of a range starting from the center position and having half a width of the template, and extract the trace of the marker by using the average luminance value as a threshold. Accordingly, it is possible to avoid an error at a partitioning position in the image by preventing the offset between the pattern matching results by several pixels attributable to the changing of the template size at the partitioning position. It is also possible to perform stable edge extraction even when the luminance of the whole image changes.
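The luminance-threshold extraction described above can be sketched as follows; a hedged Python illustration (the function name and the way the rough center position is supplied are assumptions):

```python
def extract_marker_trace(line, center, template_width):
    """Threshold one scan line around a rough marker center.

    Per the description: the threshold is the average luminance of the
    range starting at the detected center and spanning half the template
    width. Returns the pixel indices whose luminance exceeds the threshold.
    """
    half = template_width // 2
    window = line[center:center + half]
    threshold = sum(window) / len(window)
    return [i for i, v in enumerate(line) if v > threshold]
```

Because the threshold is recomputed from the local window on every line, the extraction stays stable even when the luminance of the whole image drifts, which is the property claimed above.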
  • the method for detecting a hard spot of a trolley wire of the present invention allows the pixel position of the marker to be detected highly accurately, and therefore a hard spot of a trolley wire can be found.
  • the fifth step could be performed only on a predetermined range of the input image on the basis of a result of pattern-matching processing having been performed on an immediately previous line. Accordingly, it is possible to shorten the processing time and also to lower the probability of detecting noises and the like.
  • with the method for detecting a hard spot of a trolley wire of the present invention, it is possible to efficiently perform the pattern-matching processing by setting the shifting pitch of the template lower in a case of a higher correlation value and setting it higher in a case of a lower correlation value.
  • the fifth step may be performed by automatically correcting positions of the sections such that the trace of the marker is included within one of the sections on the basis of a position of the marker detected in the immediately previous line. Accordingly, it is possible to efficiently perform the pattern-matching processing by preventing the occurrence of the switching of the template within the processing range of the pattern matching attributable to the presence of a partitioning position within the processing range.
  • the fifth step may be performed by detecting a rough center position of a trace of the marker, calculating an average luminance value of a range starting from the center position and having half a width of the template, and using the average luminance value as a threshold. Accordingly, it is possible to avoid an error at a partitioning position in the image by preventing the offset between the pattern matching results by several pixels attributable to the changing of the template size at the partitioning position. It is also possible to perform stable edge extraction even when the luminance of the whole image changes.
  • FIG. 1 is an explanatory diagram showing an example of the placement of a device for measuring displacement of a pantograph of Embodiment 1 of the present invention.
  • Fig. 2 is a front view of a marker of the device for measuring displacement of a pantograph of Embodiment 1 of the present invention.
  • Fig. 3 is a block diagram showing a schematic configuration of the device for measuring displacement of a pantograph of Embodiment 1 of the present invention.
  • Fig. 4 is an explanatory diagram showing an example of an input image in Embodiment 1 of the present invention.
  • FIG. 5 is an explanatory diagram showing an example of a template in Embodiment 1 of the present invention.
  • Fig. 6 is an explanatory diagram showing an example of dividing an image in Embodiment 1 of the present invention.
  • Fig. 7 is a flowchart showing the flow of pantograph measurement processing of Embodiment 1 of the present invention.
  • the pantograph height measuring device in this embodiment includes a line sensor camera 2 as image capturing means fixed to the roof of a car 1, a lighting device 3, a marker 4, and a processing computer 5 placed inside the car 1.
  • the line sensor camera 2 is placed on the roof of the car 1 in such a way as to capture images of a pantograph 1a. Specifically, the orientation of the line sensor camera 2 is set such that: the optical axis thereof can be directed obliquely upward; and the scanning-line direction thereof can be orthogonal to the longitudinal direction of the pantograph 1a. Image signals acquired by this line sensor camera 2 are inputted into the processing computer 5.
  • the orientation and illuminating angle of the lighting device 3 are set such that a spot to be captured by the line sensor camera 2 can be illuminated with light.
  • the marker 4 is formed of a light-reflective material and a non-light-reflective material, and may be placed at any position on the line sensor camera 2-side end surface of the pantograph 1a within a range within which the line sensor camera 2 can capture the marker 4.
  • the marker 4 used in this embodiment is formed by alternately arranging two white portions 4w made of the light-reflective material and three black portions 4b made of the non-light-reflective material. Any size can be selected for the marker 4.
  • the processing computer 5 detects the vertical displacement of the pantograph 1a by analyzing an image inputted from the line sensor camera 2, and includes an arithmetic processing unit 5A as arithmetic processing means and a monitor 5B.
  • the arithmetic processing unit 5A includes an input image creating unit 5a, a template setting unit 5b, an image dividing unit 5c, a template enlarging/reducing unit 5d, a pattern matching unit 5e, a pantograph displacement calculating unit 5f, a filtering unit 5g, an acceleration output unit 5h, and memories m1 and m2.
  • the input image creating unit 5a as input image creating means creates an input image 6 as shown in Fig. 4 in which image signals inputted from the line sensor camera 2 are arranged in chronological order. As shown in Fig. 4 , since the marker 4 reflects light of the lighting device 3, the traces of the white portions of the marker 4 are displayed in the input image 6 as strip-shaped white regions 6a in a black region (a portion indicated with dotted lines in the drawing) 6b.
  • the input image 6 is sent to the template setting unit 5b or the image dividing unit 5c through the memories m1 and m2 as needed.
  • the template setting unit 5b as template setting means acquires in advance a marker pattern as shown in Fig. 5 as a matching template (hereinafter, referred to as reference template) 7A from an input image 6 as shown in Fig. 4 .
  • the template setting unit 5b acquires in advance a marker pattern as the reference template 7A to be used for the extraction of the marker 4 in the input image 6 in processing of the pattern matching unit 5e, and then registers the marker pattern to the memory m2.
  • the reference template 7A is sent to the template enlarging/reducing unit 5d through the memory m2.
  • the reference template 7A is one-dimensional luminance data of white regions 7a and black regions 7b obtained by extracting the marker portion from an image acquired in advance for the purpose of creating the reference template 7A. It is desirable to cut the image in such a way that the reference template 7A partially includes a black portion 4b of the marker 4 on the outer side of each white portion 4w as shown in Fig. 5 , rather than cutting the image at the boundary of the white portion 4w and the black portion 4b on the outer side. Doing so increases the feature amount of the reference template 7A and therefore reduces erroneous detections.
  • the template setting unit 5b registers the reference template 7A and also an offset width W OS and a template size W T (see Fig. 4 ) at the same time.
  • the image dividing unit 5c as image-division processing means provides partitioning positions 8 as shown in Fig. 6 in the input image 6 inputted from the input image creating unit 5a to thereby divide the input image 6 into a predetermined number of sections A 1 , A 2 , ..., A N (hereinafter, a given section will be referred to as a section A i ).
  • Information on all the sections A i is sent to the template enlarging/reducing unit 5d through the memory m2.
  • the number N of sections is automatically calculated based on the resolution of each pixel found in advance by use of a calibration method in Japanese Patent Application No. 2009-011648 , for example.
  • the number N of sections is set such that the resolution of the section with the lowest resolution is less than 1.1 times that of the section with the highest resolution. In this way, the resolutions can be calculated accurately. This 1.1-times criterion is based on the result of a verification test on the template size W T .
  • the number N of sections is based on the result of a test in which the size W T of the reference template 7A acquired from an image capturing the marker 4 is varied to find out to what extent the reference template 7A is allowed to be scaled up and down before failing to get a successful match in pattern matching performed on the image from which the reference template 7A is acquired.
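One plausible reading of this sectioning rule, sketched in Python: choose the smallest N such that, within every section, the per-pixel resolution varies by less than the 1.1 factor established by the verification test. The equal-width splitting and the function name are assumptions; the patent only fixes the ratio criterion.

```python
def number_of_sections(resolutions, ratio_limit=1.1):
    """Smallest N such that splitting the image rows into N contiguous,
    roughly equal sections keeps max/min resolution < ratio_limit in
    every section.

    resolutions: per-pixel-row resolution [mm/pix], e.g. from calibration.
    """
    total = len(resolutions)
    for n in range(1, total + 1):
        size = -(-total // n)  # ceiling division
        sections = [resolutions[i:i + size] for i in range(0, total, size)]
        if all(max(s) / min(s) < ratio_limit for s in sections):
            return n
    return total
```

For a resolution that grows linearly from 1.0 to about 1.4 mm/pix over 400 rows, this yields N = 4, since each 100-row section then spans less than a 10% resolution change.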
  • the template enlarging/reducing unit 5d as template scaling processing means performs processing to scale up or down the reference template 7A to change its size W T for each section A i on the basis of the reference template 7A inputted from the template setting unit 5b and the information on the section A i inputted from the image dividing unit 5c.
  • Data on each template 7B i with its size W T thus changed for the corresponding section A i (hereinafter, referred to as scaled template) is sent to the pattern matching unit 5e through the memory m2.
  • the template enlarging/reducing unit 5d creates the scaled templates 7B i corresponding to the sections A i by: calculating a factor by which the reference template 7A is scaled (hereinafter, referred to as scale factor) for each section A i ; and scaling up or down the reference template 7A through bilinear interpolation, which is a common technique for scaling an image. Since the size W T of the reference template 7A is registered at the time of registering the reference template 7A, a size W Ti of the scaled template 7B i can be found by multiplying the size W T of the reference template 7A by the corresponding scale factor.
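For the one-dimensional luminance data used as the template here, the 1-D analogue of bilinear interpolation is linear interpolation; a minimal sketch (names are illustrative, not from the patent):

```python
def scale_template(template, factor):
    """Resample 1-D template luminance data by `factor` using linear
    interpolation, the 1-D analogue of bilinear image scaling."""
    new_len = max(2, round(len(template) * factor))
    scaled = []
    for i in range(new_len):
        # Position of output sample i mapped back into the original template.
        x = i * (len(template) - 1) / (new_len - 1)
        lo = int(x)
        hi = min(lo + 1, len(template) - 1)
        t = x - lo
        scaled.append((1 - t) * template[lo] + t * template[hi])
    return scaled
```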
  • each scale factor is found from the following expressions (1) to (3).
  • an expression obtained by a calibration method in Japanese Patent Application No. 2009-011648 is used as an approximate expression (4) for converting a pixel position in an image into an actual height.
  • p n = {a(n+1)^2 + b(n+1) + c} - (an^2 + bn + c) ... (1)
    p ori = {a(ori+1)^2 + b(ori+1) + c} - (a·ori^2 + b·ori + c) ... (2)
    scale = p ori / p n ... (3)
  • "a,” "b,” and “c” are coefficients of the approximate expression (4) for finding actual displacement from a pixel position
  • p n is a resolution [mm/pix] at a pixel position n to be scaled up or down
  • p ori is a resolution [mm/pix] at a pixel position ori on the reference template 7A
  • scale is the scale factor.
  • the resolution can be found as a height [mm] per pixel.
  • the resolution [mm/pix] can be found by finding a height [mm] at a pixel position n and a height [mm] at a pixel position n+1 next thereto and then subtracting the height at the pixel position n from the height at the pixel position n+1.
  • the scale factor is set to 1 when the size W Ti of the scaled template 7B i is equal to the size W T of the reference template 7A.
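A sketch of the scale-factor computation under the quadratic calibration approximation h(x) = ax² + bx + c. The direction of the ratio (template acquired at pixel ori, rescaled for pixel n) follows the reasoning that a coarser resolution means the marker spans fewer pixels and the template must shrink; treat it as an assumption where the patent's expression differs. Coefficient values below are placeholders.

```python
def resolution(n, a, b, c):
    """p_n [mm/pix]: height difference between adjacent pixel positions
    under the calibration polynomial h(x) = a*x^2 + b*x + c."""
    return (a * (n + 1) ** 2 + b * (n + 1) + c) - (a * n ** 2 + b * n + c)

def scale_factor(n, ori, a, b, c):
    """Factor applied to the reference template acquired at pixel `ori`
    so its pixel width matches the marker's apparent size at pixel `n`.
    Coarser resolution at n (more mm per pixel) -> fewer pixels across
    the marker -> factor below 1 (direction is an assumption)."""
    return resolution(ori, a, b, c) / resolution(n, a, b, c)
```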
  • the pattern matching unit 5e as pattern-matching processing means detects the pixel position of the marker 4 in the input image 6 by performing pattern-matching processing for each section A i on the basis of the information on the section A i inputted from the image dividing unit 5c and the data on the corresponding scaled template 7B i inputted from the template enlarging/reducing unit 5d.
  • the pixel position of the marker 4 obtained by the pattern matching unit 5e is sent to the pantograph displacement calculating unit 5f through the memory m2.
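The per-section matching can be sketched with a one-dimensional normalized cross-correlation. The patent does not name the correlation measure, so this choice, and all names here, are assumptions:

```python
def match_template(line, template):
    """Slide a 1-D template across one image line; return the offset with
    the highest normalized cross-correlation and that score (in [-1, 1])."""
    def ncc(seg, tpl):
        n = len(tpl)
        ms, mt = sum(seg) / n, sum(tpl) / n
        num = sum((s - ms) * (t - mt) for s, t in zip(seg, tpl))
        den = (sum((s - ms) ** 2 for s in seg)
               * sum((t - mt) ** 2 for t in tpl)) ** 0.5
        return num / den if den else 0.0

    best_pos, best_score = -1, -2.0
    for pos in range(len(line) - len(template) + 1):
        score = ncc(line[pos:pos + len(template)], template)
        if score > best_score:
            best_pos, best_score = pos, score
    return best_pos, best_score
```

Within each section A i the same routine would simply be run with that section's scaled template and pixel range.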
  • the pantograph displacement calculating unit 5f as pantograph displacement calculating means converts the displacement of the marker 4 in the input image 6 into the actual displacement of the pantograph 1a on the basis of the pixel position of the marker 4 in the input image 6 inputted from the pattern matching unit 5e.
  • an approximate expression obtainable for example from Japanese Patent Application No. 2009-011648 or the like is found in advance and used as a calculation expression for converting the displacement of the trace of the pantograph 1a in the input image 6 into the actual displacement of the pantograph 1a.
  • Data on the actual displacement of the pantograph 1a obtained by the pantograph displacement calculating unit 5f is sent to the filtering unit 5g through the memory m2.
  • the filtering unit 5g as filtering processing means performs smoothing processing on the displacement data inputted from the pantograph displacement calculating unit 5f.
  • the actual displacement of the pantograph 1a is in a state of containing quantization errors of the image.
  • the actual displacement data is subjected to filtering processing to smooth the displacement data.
  • the displacement data after the smoothing (hereinafter, referred to as smoothed displacement data) is sent to the acceleration output unit 5h through the memory m2.
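The patent only states that the displacement data is smoothed before differentiation; a simple centered moving average is one possible filter, shown here purely for illustration:

```python
def moving_average(data, window=5):
    """Centered moving average with edge shrinking; the filter choice is
    an assumption, since the patent does not specify the smoothing method."""
    half = window // 2
    out = []
    for i in range(len(data)):
        lo, hi = max(0, i - half), min(len(data), i + half + 1)
        out.append(sum(data[lo:hi]) / (hi - lo))
    return out
```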
  • the acceleration output unit 5h as acceleration outputting means performs second order differentiation on the smoothed displacement data inputted from the filtering unit 5g to calculate the acceleration of the marker 4, i.e., the pantograph 1a, in the vertical direction.
  • the acceleration is found by performing second order differentiation on the displacement data smoothed by the filtering processing and then outputted to the monitor 5B.
  • a point where the acceleration of the pantograph 1a is 20 G or greater, for example, is detected as a hard spot in this embodiment.
  • the calculated acceleration data is outputted to the monitor 5B through the memory m2 and displayed on the monitor 5B.
  • the template setting unit 5b first performs the processing to register a reference template 7A (step P1). Then, the input image creating unit 5a performs the processing to create an input image 6 in which image signals outputted from the line sensor camera 2 are arranged in chronological order (step P2). Thereafter, as shown in Fig. 6 , the image dividing unit 5c performs the processing to divide the input image 6 into a predetermined number N of sections A 1 , A 2 , ..., A N (step P3).
  • the template enlarging/reducing unit 5d performs the processing to scale up or down the reference template 7A registered in step P1 for a given section A i (step P4).
  • the pattern matching unit 5e performs the pattern-matching processing to compare a scaled template 7B i , obtained by scaling up or down the reference template 7A for the section A i of the input image 6, with the input image 6 in an attempt to detect the position (pixel position) of the marker 4 in the input image 6 (step P5). Thereafter, it is judged whether or not the pattern matching for the section A i is completed (step P6).
  • the processing proceeds to step P7 if the judgment result shows that the pattern-matching processing for the section A i is not yet completed (NO). On the other hand, the processing returns to step P4 if the pattern-matching processing for the section A i is completed (YES).
  • in step P7, it is judged whether or not the pattern-matching processing is completed for the entire data of the input image.
  • the processing proceeds to step P8 if the judgment result shows that the pattern-matching processing is completed for the entire data of the input image (YES).
  • the processing returns to step P5 if the pattern-matching processing is not yet completed for the entire data of the input image (NO).
  • in step P8, the pantograph displacement calculating unit 5f performs the processing to convert the pixel position of the marker 4 in the input image 6 into the actual displacement of the pantograph 1a on the basis of the detected marker position, for the entire input image 6.
  • the filtering unit 5g performs the filtering processing (step P9).
  • the acceleration output unit 5h performs the processing to output the acceleration of the pantograph (step P10).
  • an input image 6 captured at a large camera elevation angle θB as shown in Part (b) of Fig. 13 is divided into the predetermined number N of sections A 1 , A 2 , ..., A N as shown in Fig. 6 , and then the pattern-matching processing is performed using the scaled templates 7B i obtained by scaling up or down the reference template 7A on the basis of the resolutions of the respective sections A i . Accordingly, highly accurate pattern-matching processing can be performed. Moreover, utilizing a calibration result allows accurate calculation of the resolution of each pixel of parts where the calibration is performed.
  • a second embodiment of the device for measuring displacement of a pantograph of the present invention will be described.
  • This embodiment differs from Embodiment 1 in the processing of the pattern matching unit 5e.
  • the other configurations are substantially the same as those described in Embodiment 1.
  • the processing units providing the same effects will be denoted by the same reference numerals, and overlapping descriptions will be omitted. The differences will be mainly described.
  • step P5 shown in Fig. 7 .
  • the same pattern-matching processing as Embodiment 1 is performed on the first line of an input image 6, and the detected marker position is stored in the memory m2. Thereafter, the pattern-matching processing is performed on the second and subsequent lines but only within a range of ⁇ N P [pix] from the pixel position of the marker obtained as a result of the pattern-matching processing on the immediately previous line.
  • the next line is subjected to the pattern-matching processing only within a predetermined range from the pixel position of the marker 4.
  • the range ⁇ N P [pix] to be subjected to the pattern-matching processing is determined by taking into account of the distance of movement of the marker (the distance of vertical displacement of the pantograph) per unit time in the image capturing using the line sensor camera 2.
  • the "Conventional Railway Structure Regulation" Section 62 states that in an area where a train travels at a speed faster than 50 km/h, the inclination of a trolley wire must be 5/1000 or smaller in a case where the trolley wire is suspended from a catenary or an overhead rigid conductor line, and 15/1000 or smaller otherwise.
  • An inclination of 5/1000 means a 5-m change in height over a distance of 1000 m.
  • the sampling frequency of the line sensor camera 2 is set to 1000 Hz (an image containing 1000 lines is captured in 1 second (1-ms intervals)).
  • when the car 1 travels at a speed of 50 km/h, for example, the car 1 advances a distance of approximately 13.888 m per second, which is approximately 0.013888 m per millisecond.
  • at the maximum inclination of 15/1000, the height of the pantograph thus changes by approximately 0.00021 m (about 0.21 mm) per unit time (1 ms).
  • a reference acceleration for detecting a hard spot is set to 20 G. This is the acceleration for a case assuming a 0.1-mm change per unit time (1 ms). Given the "Conventional Railway Structure Regulation" Section 62 mentioned above, a 10-mm change per unit time should be large enough. So, with reference to the pixel position of the marker 4 detected in the immediately previous line, a pixel width N P [pix] is calculated which corresponds to a 10-mm change per unit time (1 ms) on the basis of the image resolution, and the corresponding range is set as the range to be subjected to the pattern matching.
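The arithmetic above can be sketched as follows (illustrative only; the 0.5 mm/pix resolution is an assumed calibration value, not a figure from the patent):

```python
def search_half_width(speed_kmh=50.0, line_period_s=1e-3,
                      max_height_change_mm=10.0, resolution_mm_per_pix=0.5):
    """Compute the half-width N_P [pix] of the pattern-matching range.

    The distance the car advances per line is speed * line period
    (about 13.888 mm per 1-ms line at 50 km/h); the height change
    assumed per line (10 mm, per the text) is converted to pixels
    using the local image resolution obtained from calibration.
    resolution_mm_per_pix is an illustrative value only.
    """
    advance_m_per_line = speed_kmh * 1000.0 / 3600.0 * line_period_s
    n_p = int(round(max_height_change_mm / resolution_mm_per_pix))
    return advance_m_per_line, n_p
```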
  • This embodiment shows an example assuming a 10-mm change per unit time as a condition for calculating the pixel width N P [pix]. Note, however, that the condition for calculating the pixel width N P [pix] is not limited thereto. Any condition may be set when necessary.
  • the pattern-matching processing is performed only within a range of ⁇ N P [pix] from the pixel position of the marker 4 in the immediately previous line detected by pattern matching. Accordingly, the processing time can be shortened as compared to Embodiment 1. Further, narrowing the range to be subjected to the pattern-matching processing can lower the possibility of detecting noises and the like.
  • FIG. 8 is an explanatory diagram showing levels of a value of correlation to the template in an input image.
  • This embodiment differs from Embodiment 1 in the processing of the pattern matching unit 5e.
  • the other configurations are substantially the same as those described in Embodiment 1.
  • the processing units providing the same effects as the foregoing configurations shown in Figs. 1 to 7 will be denoted by the same reference numerals, and overlapping descriptions will be omitted. The differences will be mainly described.
  • correlation values R are first calculated as indexes each representing the degree of resemblance to the registered reference template 7A, and then the pattern-matching processing is performed while the pitch at which to shift the scaled template 7B i is changed in accordance with the corresponding correlation value R.
  • In this embodiment, the following processing is performed at a timing before performing the processing in step P5 shown in Fig. 7 described in Embodiment 1.
  • the correlation value R to the reference template 7A is calculated for each pixel position i in the input image 6.
  • the correlation value R can be found through calculation using the following expression (6). Note that the calculation targets one-dimensional correlation because the line sensor camera 2 captures one-dimensional images.
  • R is the correlation value
  • L is the width of the template image (set to be smaller than the width of the search image)
  • W i is a luminance value at the pixel position i in the search image
  • T i is a luminance value at the pixel position i in the template image.
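Expression (6) itself is not reproduced in this excerpt. The sketch below therefore uses a standard normalized one-dimensional correlation consistent with the variables listed above (L, W i , T i ); it may differ in detail from the patent's exact formula.

```python
import math

def correlation(search, template, offset):
    """Normalized 1-D correlation between a template of width L and the
    search line at a given offset. For non-negative luminance data the
    result lies in [0, 1], with 1 meaning a perfect (proportional)
    match. This is a common NCC form, assumed in place of expression (6).
    """
    L = len(template)                      # template width (< search width)
    window = search[offset:offset + L]
    num = sum(w * t for w, t in zip(window, template))
    den = math.sqrt(sum(w * w for w in window)) * \
          math.sqrt(sum(t * t for t in template))
    return num / den if den else 0.0
```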
  • the pitch at which to shift the scaled template 7B i during the pattern-matching processing is set in accordance with the correlation value R.
  • the shifting pitch of the scaled template 7B i is set such that the scaled template 7B i is shifted by a smaller pitch during the pattern-matching processing on a part having a higher correlation value R than on a part having a lower correlation value R.
  • the correlation value is 1 at the maximum and 0 at the minimum.
  • the correlation value R may be about 0.8 in a part having a low correlation and about 0.99 in a part having a high correlation, for example.
  • the shifting pitch of the scaled template 7B i during the pattern-matching processing can be set on the basis of the correlation value R in the following way.
  • the shifting pitch of the scaled template 7B i is increased by 1 [pix] upon every decrease of the correlation value R by 0.05, so that the shifting pitch is 1 [pix] when the correlation value R is 0.95 or higher while the shifting pitch is 2 [pix] when the correlation value R is 0.90 to 0.95, for example.
  • step P5 described in Embodiment 1 is performed after the shifting pitch of the scaled template 7B i is set as mentioned above.
  • a threshold of the correlation value R (0.05 in this embodiment) for changing the shifting pitch is set manually.
  • the shifting pitch of the scaled template 7B i is not limited to those mentioned above and may be set to any values as needed.
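One plausible reading of the pitch rule described above can be sketched as follows (the 0.95 threshold and 0.05 step come from the embodiment; the cap on the maximum pitch is an illustrative assumption):

```python
import math

def shifting_pitch(r, top=0.95, step=0.05, max_pitch=10):
    """Pitch [pix] for shifting the scaled template: 1 when R >= 0.95,
    plus one extra pixel for each further 0.05 drop in the correlation
    value R (max_pitch is an assumed safety cap, not from the patent)."""
    if r >= top:
        return 1
    return min(1 + math.ceil((top - r) / step), max_pitch)
```

With this rule, high-correlation regions near the marker are scanned pixel by pixel while low-correlation background is skipped over quickly.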
  • the shifting pitch is changed on the basis of the level of the correlation value R, and therefore the pattern-matching processing can be performed more efficiently than in Embodiment 1, in which the pattern-matching processing is performed on a pixel-by-pixel basis.
  • FIG. 9 is an explanatory diagram showing an example of a case where a partitioning position is present within a processing range of pattern matching.
  • Fig. 10 is an explanatory diagram showing an example of re-setting the partitioning position shown in Fig. 9.
  • This embodiment differs from Embodiments 1 and 2 in the method of the processing in step P5 shown in Fig. 7 .
  • the other configurations are substantially the same as those described in Embodiments 1 and 2.
  • the processing units providing the same effects as the foregoing configurations described in Embodiments 1 and 2 will be denoted by the same reference numerals, and overlapping descriptions will be omitted. The differences will be mainly described.
  • Embodiment 2 attempts to improve the processing efficiency by limiting the processing range of the pattern matching on the basis of the pixel position of the marker 4 detected by the pattern-matching processing on the immediately previous line.
  • a partitioning position 8 may be present within a processing range B of the pattern matching in the input image 6 as shown in Fig. 9 (a range of ⁇ N P [pix] from the pixel position of the marker 4 obtained through the pattern-matching processing on the previous line).
  • the scale factor of the scaled template 7B i needs to be switched for the two sections (the sections A i and A i+1 in this embodiment).
  • in this embodiment, in a case where a partitioning position 8 is included in the processing range B of the pattern matching as shown in Fig. 9 at the time of performing the pattern-matching processing, processing is performed so that: the partitioning positions 8 are automatically re-set to exclude the partitioning position 8 from the processing range B; and the size of each scaled template 7B i is re-set to match its corresponding newly-set section A i .
  • the following processing is performed as the processing in the foregoing step P5 shown in Fig. 7 .
  • the same pattern-matching processing as Embodiment 1 is performed on the first line of an input image 6, and the pixel position of the detected marker 4 is stored in the memory m2.
  • a range of ⁇ N P [pix] from the pixel position of the marker 4 detected through the pattern-matching processing on the immediately previous line is set as the processing range B of the pattern matching.
  • new scaled templates 7B i are set by: re-setting the sections A i on the basis of the pixel position of the marker 4 obtained from the immediately previous line; and re-calculating the scale factors of the scaled templates 7B i for the re-set sections A i .
  • the pattern-matching processing described in Embodiment 1 is performed after the above processing is performed. Note that the scale factors of the scaled templates 7B i and the partitioning positions 8 are calculated by use of the methods described in Embodiment 1.
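As an illustrative sketch of the re-setting described above (not the patent's exact procedure), a partitioning position falling inside the processing range B can be moved to the nearer edge of that range, so that the whole range is covered by a single section and hence a single template scale:

```python
def reset_partitions(partitions, marker_pos, n_p):
    """Sketch of Embodiment 4: shift any partitioning position that
    falls inside the processing range [marker_pos - n_p, marker_pos + n_p]
    to the nearer edge of that range, so the marker's processing range
    lies within one section (one template scale).
    """
    lo, hi = marker_pos - n_p, marker_pos + n_p
    new = []
    for p in partitions:
        if lo < p < hi:
            # move the boundary out of the processing range
            p = lo if (p - lo) <= (hi - p) else hi
        new.append(p)
    return new
```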
  • the reference template 7A is set manually, but the scaled templates 7B i are set automatically after this manual setting on the basis of the resolutions of the corresponding image positions.
  • FIG. 11 is an explanatory diagram showing an example of template sizes set to their respective sections.
  • Fig. 12 is an explanatory diagram showing an example of extracting an edge of one of white regions 6a representing traces of white portions 4w of the marker 4 in an input image.
  • In a case where a trace M of the marker 4 crosses the partitioning position 8 between the sections A i and A i+1 as shown in Fig. 11 , performing the pattern matching by use of any of the methods of Embodiments 1 to 4 requires the single marker trace M to be subjected to the pattern-matching processing using two scaled templates 7B i and 7B i+1 for the respective two sections A i and A i+1 .
  • In this case, the pattern matching results may be offset from each other by several pixels at the partitioning position 8. An offset of several pixels leads to a large error in the calculation of the acceleration.
  • the pattern-matching processing is performed in advance by using the scaled template 7B i to detect a rough pixel position of the marker trace M; and then an edge of the marker trace M is extracted on the basis of the detected pixel position of the marker trace M. This prevents the occurrence of an error attributable to the marker trace M crossing the partitioning position 8 between the sections A i and A i+1 .
  • Although any threshold can be used to extract the edge of the marker trace M, setting a constant as the threshold may prevent accurate edge extraction when the input image 6 appears dark or bright as a whole. For this reason, it is preferable to calculate the average luminance value in the processing range and set that average value as the threshold. In this way, stable edge extraction can be performed even when the luminance values of the image change as a whole.
  • Specifically, in this embodiment, the following processing is performed as the processing in step P5 shown in Fig. 7 .
  • a rough marker center position P C is detected through the pattern-matching processing using the scaled template 7B i .
  • an average value B A of luminance in a range starting from the marker center position P C and having half the current template size W T is calculated.
  • the last pixel position (the highest edge) P E in a region with higher luminance values than the average luminance value B A is found, and that position is extracted as the pixel position of the marker trace M.
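The three steps above can be sketched as follows (illustrative only; the scan direction within the range and the handling of a uniformly bright window are assumptions):

```python
def extract_marker_edge(line, center, template_width):
    """Sketch of Embodiment 5: the threshold is the average luminance
    B_A over a range of half the template size starting at the rough
    marker center P_C; the edge P_E is taken as the last pixel within
    that range whose luminance exceeds B_A."""
    half = template_width // 2
    window = line[center:center + half]
    b_a = sum(window) / len(window)        # average luminance B_A
    edge = None
    for i, v in enumerate(window):
        if v > b_a:
            edge = center + i              # last pixel above B_A so far
    return edge, b_a
```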
  • the term "rough" used in this embodiment refers to a range within which the pattern-matching processing results in the detection of a marker center position that will not cause an error in the matching results.
  • the allowable range of error in the template size W T has been found experimentally to be about ±10%.
  • a rough marker position is detected through the pattern matching, and then the edge of the marker 4 is extracted on the basis of the detected pixel position of the marker 4.
  • the template size is not changed at the partitioning position set in the input image 6. This makes it possible to avoid the offset between the pattern matching results attributable to the changing of the template size W T at the partitioning position in the image, and hence improve the accuracy.
  • the average luminance value of the range starting from the marker center position, detected through the pattern matching, and having half the current template size W T is calculated, and that value is used as the threshold for the edge extraction. Accordingly, stable edge extraction can be performed even when the luminance of the whole input image 6 changes.
  • the present invention is applicable to devices for measuring displacement of a pantograph and methods for detecting a hard spot of a trolley wire, and is preferably applicable particularly to devices for measuring displacement of a pantograph and methods for detecting a hard spot of a trolley wire in which a hard spot of a trolley wire is measured through pattern-matching processing using an image, captured by a line sensor camera, of a pantograph equipped with a marker designed for measuring a hard spot.

Description

    TECHNICAL FIELD
  • The present invention relates to a device for measuring displacement of a pantograph and a method for detecting a hard spot of a trolley wire, and particularly relates to a device for measuring displacement of a pantograph and a method for detecting a hard spot of a trolley wire in which a hard spot of a trolley wire is measured through pattern-matching processing using an image, captured by a line sensor camera, of a pantograph equipped with a marker designed for measuring a hard spot.
  • BACKGROUND ART
  • In electric railway facilities, measurement of hard spots of trolley wires can be cited as one of inspection items. A trolley wire is in a state of being suspended from a messenger wire via a hanger, for example. The weight of the trolley wire increases locally at a spot where the hanger is placed and at some spots such as where the trolley wire is connected to another trolley wire and where there is a pull-off, as compared to the other spots. These spots are called "hard spots of a trolley wire."
  • When passing these hard spots of a trolley wire, a pantograph, which is a current collector placed on the roof of a car and configured to slide on the trolley wire, sinks abruptly in some cases due to the weight of the trolley wire. In this case, the trolley wire loses its contact with the pantograph and causes a discharge phenomenon called arc discharge. In this event, the trolley wire is locally worn away due to the heat produced by the arc discharge. Thus, the trolley wire is thought to be worn away faster at hard spots than at the other spots.
  • In view of the above, detecting hard spots of trolley wires has been an important matter in maintaining, operating, and managing electric railway facilities.
  • As mentioned above, the pantograph accelerates greatly in the vertical direction at hard spots of the trolley wire. For this reason, the vertical acceleration of the pantograph whose displacement is equivalent to that of the trolley wire shall be monitored in order to detect the hard spots of the trolley wire. The acceleration of the pantograph can be found by measuring the displacement of the pantograph and performing second order differentiation on the displacement.
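As a minimal illustration of this relationship, the acceleration can be approximated by the second-order finite difference of the displacement samples (one sample per line-sensor scan, i.e. 1 ms at the 1000-Hz sampling rate described later):

```python
def vertical_acceleration(displacement_m, dt_s=1e-3):
    """Second-order finite difference of the pantograph displacement,
    giving the vertical acceleration in m/s^2 at each interior sample."""
    d = displacement_m
    return [(d[i + 1] - 2 * d[i] + d[i - 1]) / dt_s ** 2
            for i in range(1, len(d) - 1)]
```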
  • The following methods have conventionally been known as pantograph displacement measuring methods.
  • (A) Laser Sensor Method
  • This is a method in which the displacement of a pantograph is measured by: scanning the pantograph with a laser by using a mirror or the like; and analyzing the phase difference of the reflected waves, the deformation of the shape of the reflected laser light, or the like.
  • (B) Light Section Sensor Method
  • This is a method in which the displacement of a pantograph is measured by: projecting stripe-pattern light onto the pantograph; and receiving light with zigzag streaks which correspond to the shape of the pantograph.
  • (C) Image Processing Method
  • This is a method in which the displacement of a pantograph is measured by: capturing an image of the pantograph by use of a line sensor camera placed on the roof of a car; and then performing processing such as model matching or pattern matching on the captured image by use of a processing computer (see Patent Documents 1 and 2, for example).
  • Among the methods listed above, the image processing method is such that: from the image of the pantograph captured by the line sensor camera, a pixel position on the image is extracted at which a beforehand-prepared model of the pantograph finds a match; then, the actual height of the pantograph is calculated from the pixel position on the image on the basis of the distance from the line sensor camera to the pantograph, the focal length of the lens of the image capturing unit, and the like.
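A simplified pinhole-camera version of this conversion can be sketched as follows; it ignores the camera elevation angle and the per-pixel calibration that the actual device uses, so it is illustrative only:

```python
def pixel_to_height_mm(pixel_offset, pixel_pitch_mm, distance_mm, focal_mm):
    """Illustrative pinhole-camera conversion: a marker displaced by
    `pixel_offset` pixels on the sensor corresponds to an actual
    displacement of offset * pixel_pitch * distance / focal_length.
    All parameter values are assumptions for the example."""
    return pixel_offset * pixel_pitch_mm * distance_mm / focal_mm
```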
  • In this image processing method, a pixel position at which the pre-acquired model of the pantograph finds a match is detected from the captured image of the pantograph as the position of the pantograph. Alternatively, a marker in a black and white stripe pattern is attached to the pantograph placed on the roof of a car, and pattern matching is performed to detect the position of the marker, i.e., the position of the pantograph, from an image captured by the line sensor camera.
  • Then, after the position of the pantograph in the image is detected, the pixel position in the image is converted into the actual displacement of the pantograph on the basis of the distance to the pantograph, the focal length of the lens, and the like. Then, by performing second order differentiation on the displacement of the pantograph thus found, the acceleration is calculated. Meanwhile, as described in Patent Document 2, using a line sensor camera can increase the spatial resolution and thus improve the accuracy. This method allows the device to be smaller than those in the laser sensor method and the light section sensor method, hence bringing about an advantage that the device can be mounted not only on inspection cars manufactured exclusively for the measurement but also on cars in commercial service.
  • Patent Document 2 discloses a displacement measuring device and method for a pantograph having in common with the present invention the features in the pre-characterizing portions of the independent claims.
  • PRIOR ART DOCUMENTS PATENT DOCUMENTS
    • PATENT DOCUMENT 1: Japanese Patent Application Publication No. 2006-250774
    • PATENT DOCUMENT 2: Japanese Patent Application Publication No. 2008-104312
    SUMMARY OF THE INVENTION PROBLEM TO BE SOLVED BY THE INVENTION
  • Note that in the method using a line sensor camera, as shown in Fig. 13, a line sensor camera 2 is placed on the roof of a car 1 in such a posture as to face obliquely upward in order to capture a marker 4 attached to a pantograph 1a.
  • Here, as shown in Fig. 13, a camera elevation angle θA of the line sensor camera 2 is small when the distance from the line sensor camera 2 to the pantograph 1a is long. On the other hand, a camera elevation angle θB of the line sensor camera 2 is large when the distance from the line sensor camera 2 to the pantograph 1a is short. In the following, the line sensor camera illustrated with a solid line and the line sensor camera 2 illustrated with a broken line in Fig. 13 will be referred to as line sensor cameras 2A and 2B, respectively.
  • Parts (a) and (b) of Fig. 14 show example input images of the marker 4 captured by the line sensor cameras 2A and 2B, respectively. In a case of using the line sensor camera 2A, the small camera elevation angle θA allows the width of a trace M of the marker to remain substantially the same in an input image 6A as shown in Part (a) of Fig. 14, showing that different heights of the pantograph 1a will cause almost no resolution difference. In contrast, in a case of using the line sensor camera 2B, the large camera elevation angle θB makes the trace M of the marker appear differently depending on the height as shown in Part (b) of Fig. 14, showing that different heights of the pantograph 1a cause a resolution difference in an input image 6B.
  • Note that in Fig. 14, an image corresponding to a time range TI shows the pantograph 1a at position I shown in Fig. 13; an image corresponding to a time range TII shows the pantograph 1a at position II shown in Fig. 13; and an image corresponding to a time range TIII shows the pantograph 1a at position III shown in Fig. 13.
  • Assume for example that an image of the marker 4 as indicated by broken-line circles in Fig. 14 is acquired as a pattern-matching template 7 when the pantograph 1a is at the position denoted by II (= time range TII). In this case, performing pattern-matching processing on an image 6A captured by the line sensor camera 2A is highly likely to result in successful pattern matching as shown in Part (a) of Fig. 14. On the other hand, performing pattern-matching processing on an image 6B captured by the line sensor camera 2B may cause a problem of finding no match to the size of the template 7 as shown in Part (b) of Fig. 14, causing the pattern matching to fail.
  • In view of the above, an object of the present invention is to provide a device for measuring displacement of a pantograph and a method for detecting a hard spot of a trolley wire which can improve the accuracy of pattern-matching processing.
  • MEANS FOR SOLVING THE PROBLEM
  • The device for measuring displacement of a pantograph and the method for detecting a hard spot of a trolley wire according to the present invention are defined in the independent claims. Further advantageous features are set out in the dependent claims.
  • Note that the "calibration means" mentioned in the independent claims refers to means using a calibration method in Japanese Patent Application No. 2009-011648 , for example, to find the resolution of each pixel.
  • Note that the term "rough" used in the claims refers to such an extent that the marker center position can be detected in the pattern-matching processing within a range within which no error occurs in the matching results.
  • EFFECTS OF THE INVENTION
  • By automatically finding an appropriate number of divided image sections and performing appropriate pattern-matching processing, the device for measuring displacement of a pantograph of the invention allows the pixel position of the marker to be detected highly accurately, and therefore a hard spot of a trolley wire can be found.
  • Moreover, according to a device for measuring displacement of a pantograph related to the present invention, the pattern-matching processing means may perform pattern-matching processing only on a predetermined range of the input image on the basis of a result of pattern-matching processing having been performed on an immediately previous line. Accordingly, it is possible to shorten the processing time and also to lower the probability of detecting noises and the like.
  • Further, according to the device for measuring displacement of a pantograph of the present invention, it is possible to efficiently perform the pattern-matching processing by setting the shifting pitch of the template lower in a case of a higher correlation value than in a case of a lower correlation value.
  • Furthermore, according to the device for measuring displacement of a pantograph of the present invention, if a trace of the marker in the input image lies over any two adjacent ones of the sections, the pattern-matching processing means may automatically correct positions of the sections such that the trace of the marker is included within one of the sections on the basis of the position of the marker detected in an immediately previous line. Accordingly, it is possible to efficiently perform the pattern-matching processing by preventing the occurrence of the switching of the template within the processing range of the pattern matching attributable to the presence of a partitioning position within the processing range.
  • In addition, according to the device for measuring displacement of a pantograph of the present invention, the pattern-matching processing means may detect a rough center position of a trace of the marker, calculate an average luminance value of a range starting from the center position and having half a width of the template, and extract the trace of the marker by using the average luminance value as a threshold. Accordingly, it is possible to avoid an error at a partitioning position in the image by preventing the offset between the pattern matching results by several pixels attributable to the changing of the template size at the partitioning position. It is also possible to perform stable edge extraction even when the luminance of the whole image changes.
  • By automatically finding an appropriate number of divided image sections and performing appropriate pattern-matching processing, the method for detecting a hard spot of a trolley wire of the present invention allows the pixel position of the marker to be detected highly accurately, and therefore a hard spot of a trolley wire can be found.
  • Besides, according to a method for detecting a hard spot of a trolley wire related to the present invention, the fifth step may be performed only on a predetermined range of the input image on the basis of a result of pattern-matching processing having been performed on an immediately previous line. Accordingly, it is possible to shorten the processing time and also to lower the probability of detecting noises and the like.
  • Moreover, according to the method for detecting a hard spot of a trolley wire of the present invention, it is possible to efficiently perform the pattern-matching processing by setting the shifting pitch of the template lower in a case of a higher correlation value but setting the shifting pitch of the template higher in a case of a lower correlation value.
  • Further, according to the method for detecting a hard spot of a trolley wire of the present invention, if a trace of the marker in the input image lies over any two adjacent ones of the sections, the fifth step may be performed by automatically correcting positions of the sections such that the trace of the marker is included within one of the sections on the basis of a position of the marker detected in the immediately previous line. Accordingly, it is possible to efficiently perform the pattern-matching processing by preventing the occurrence of the switching of the template within the processing range of the pattern matching attributable to the presence of a partitioning position within the processing range.
  • Furthermore, according to the method for detecting a hard spot of a trolley wire of the present invention, the fifth step may be performed by detecting a rough center position of a trace of the marker, calculating an average luminance value of a range starting from the center position and having half a width of the template, and using the average luminance value as a threshold. Accordingly, it is possible to avoid an error at a partitioning position in the image by preventing the offset between the pattern matching results by several pixels attributable to the changing of the template size at the partitioning position. It is also possible to perform stable edge extraction even when the luminance of the whole image changes.
  • BRIEF DESCRIPTION OF THE DRAWINGS
    • [Fig. 1] Fig. 1 is an explanatory diagram showing an example of the placement of a device for measuring displacement of a pantograph of Embodiment 1 of the present invention.
    • [Fig. 2] Fig. 2 is a front view of a marker of the device for measuring displacement of a pantograph of Embodiment 1 of the present invention.
    • [Fig. 3] Fig. 3 is a block diagram showing a schematic configuration of the device for measuring displacement of a pantograph of Embodiment 1 of the present invention.
    • [Fig. 4] Fig. 4 is an explanatory diagram showing an example of an input image in Embodiment 1 of the present invention.
    • [Fig. 5] Fig. 5 is an explanatory diagram showing an example of a template in Embodiment 1 of the present invention.
    • [Fig. 6] Fig. 6 is an explanatory diagram showing an example of dividing an image in Embodiment 1 of the present invention.
    • [Fig. 7] Fig. 7 is a flowchart showing the flow of pantograph measurement processing of Embodiment 1 of the present invention.
    • [Fig. 8] Fig. 8 is an explanatory diagram showing levels of a value of correlation to the template in an input image in Embodiment 3 of the present invention.
    • [Fig. 9] Fig. 9 is an explanatory diagram showing an example of a case where a partitioning position is present within a processing range of pattern matching in Embodiment 4 of the present invention.
    • [Fig. 10] Fig. 10 is an explanatory diagram showing an example of re-setting the partitioning position shown in Fig. 9.
    • [Fig. 11] Fig. 11 is an explanatory diagram showing an example of template sizes set to their respective sections in Embodiment 5 of the present invention.
    • [Fig. 12] Fig. 12 is an explanatory diagram showing an example of extracting an edge of one of white regions representing traces of white portions of the marker in an input image in Embodiment 5 of the present invention.
    • [Fig. 13] Fig. 13 is an explanatory diagram showing an example of the placement of a device for measuring displacement of a pantograph.
    • [Fig. 14] Part (a) of Fig. 14 is an explanatory diagram showing an example of an input image in a case where the elevation angle of a line sensor camera is small, and Part (b) of Fig. 14 is an explanatory diagram showing an example of an input image in a case where the elevation angle of the line sensor camera is large.
    MODES FOR CARRYING OUT THE INVENTION
  • Hereinbelow, by referring to the drawings, description will be given of details of a method using image processing of the present invention to improve the accuracy of pattern matching in measurement of a hard spot of a trolley wire.
  • Embodiment 1 (Pattern Matching Using Template Scaled for Each Section of Image)
  • A first embodiment of a device for measuring displacement of a pantograph of the present invention will be described using Figs. 1 to 7. Fig. 1 is an explanatory diagram showing an example of the placement of a device for measuring displacement of a pantograph of Embodiment 1 of the present invention. Fig. 2 is a front view of a marker of the device for measuring displacement of a pantograph of Embodiment 1 of the present invention. Fig. 3 is a block diagram showing a schematic configuration of the device for measuring displacement of a pantograph of Embodiment 1 of the present invention. Fig. 4 is an explanatory diagram showing an example of an input image in Embodiment 1 of the present invention. Fig. 5 is an explanatory diagram showing an example of a template in Embodiment 1 of the present invention. Fig. 6 is an explanatory diagram showing an example of dividing an image in Embodiment 1 of the present invention. Fig. 7 is a flowchart showing the flow of pantograph measurement processing of Embodiment 1 of the present invention.
  • As shown in Fig. 1, the pantograph height measuring device in this embodiment includes a line sensor camera 2 as image capturing means fixed to the roof of a car 1, a lighting device 3, a marker 4, and a processing computer 5 placed inside the car 1.
  • The line sensor camera 2 is placed on the roof of the car 1 in such a way as to capture images of a pantograph 1a. Specifically, the orientation of the line sensor camera 2 is set such that: the optical axis thereof can be directed obliquely upward; and the scanning-line direction thereof can be orthogonal to the longitudinal direction of the pantograph 1a. Image signals acquired by this line sensor camera 2 are inputted into the processing computer 5.
  • The orientation and illuminating angle of the lighting device 3 are set such that a spot to be captured by the line sensor camera 2 can be illuminated with light.
  • The marker 4 is formed of a light-reflective material and a non-light-reflective material, and may be placed at any position on the line sensor camera 2-side end surface of the pantograph 1a within a range within which the line sensor camera 2 can capture the marker 4. As shown in Fig. 2, the marker 4 used in this embodiment is formed by alternately arranging two white portions 4w made of the light-reflective material and three black portions 4b made of the non-light-reflective material. Any size can be selected for the marker 4.
  • The processing computer 5 detects the vertical displacement of the pantograph 1a by analyzing an image inputted from the line sensor camera 2, and includes an arithmetic processing unit 5A as arithmetic processing means and a monitor 5B.
  • As shown in Fig. 3, the arithmetic processing unit 5A includes an input image creating unit 5a, a template setting unit 5b, an image dividing unit 5c, a template enlarging/reducing unit 5d, a pattern matching unit 5e, a pantograph displacement calculating unit 5f, a filtering unit 5g, an acceleration output unit 5h, and memories m1 and m2.
  • The input image creating unit 5a as input image creating means creates an input image 6 as shown in Fig. 4 in which image signals inputted from the line sensor camera 2 are arranged in chronological order. As shown in Fig. 4, since the marker 4 reflects light of the lighting device 3, the traces of the white portions of the marker 4 are displayed in the input image 6 as strip-shaped white regions 6a in a black region (a portion indicated with dotted lines in the drawing) 6b. The input image 6 is sent to the template setting unit 5b or the image dividing unit 5c through the memories m1 and m2 as needed.
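  • The chronological arrangement performed by the input image creating unit 5a can be sketched as follows. This is an illustrative reconstruction in Python, not the patented implementation; the function and variable names are invented, and each scan is assumed to arrive as a one-dimensional luminance array.

```python
import numpy as np

def build_input_image(scan_lines):
    """Stack successive 1-D line-sensor scans in chronological order,
    yielding a 2-D input image: one axis is time, the other is the
    scanning-line direction."""
    return np.stack(scan_lines, axis=0)

# Three hypothetical 4-pixel scans become a 3x4 input image; the bright
# (255) pixels stand for the reflective white portions of the marker.
scans = [np.array([0, 255, 255, 0]),
         np.array([0, 255, 255, 0]),
         np.array([0, 0, 255, 255])]
image = build_input_image(scans)
```

In such an image the traces of the white marker portions appear as bright strip-shaped regions running along the time axis, as in Fig. 4.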
  • The template setting unit 5b as template setting means acquires in advance a marker pattern as shown in Fig. 5 as a matching template (hereinafter, referred to as reference template) 7A from an input image 6 as shown in Fig. 4. To be specific, the template setting unit 5b acquires in advance a marker pattern as the reference template 7A to be used for the extraction of the marker 4 in the input image 6 in processing of the pattern matching unit 5e, and then registers the marker pattern to the memory m2. The reference template 7A is sent to the template enlarging/reducing unit 5d through the memory m2.
  • As shown in Fig. 5, the reference template 7A is one-dimensional luminance data of white regions 7a and black regions 7b obtained by extracting the marker portion from an image acquired in advance for the purpose of creating the reference template 7A. It is desirable to cut the image in such a way that the reference template 7A partially includes a black portion 4b of the marker 4 on the outer side of each white portion 4w as shown in Fig. 5, rather than cutting the image at the boundary of the white portion 4w and the black portion 4b on the outer side. Doing so increases the feature amount of the reference template 7A and therefore reduces erroneous detections. Note that the template setting unit 5b registers the reference template 7A and also an offset width WOS and a template size WT (see Fig. 4) at the same time.
  • The image dividing unit 5c as image-division processing means provides partitioning positions 8 as shown in Fig. 6 in the input image 6 inputted from the input image creating unit 5a to thereby divide the input image 6 into a predetermined number of sections A1, A2, ..., AN (hereinafter, a given section will be referred to as a section Ai). Information on all the sections Ai is sent to the template enlarging/reducing unit 5d through the memory m2. In this event, the number N of sections is automatically calculated based on the resolution of each pixel found in advance by use of a calibration method in Japanese Patent Application No. 2009-011648, for example. The number N of sections is set such that the resolution of the section with the lowest resolution is not 1.1 or more times that of the section with the highest resolution. In this way, the resolutions can be calculated accurately.
  • Meanwhile, in this embodiment, the number N of sections is set such that the resolution of the section with the lowest resolution is not 1.1 or more times that of the section with the highest resolution, for a reason based on the result of a verification test on the template size WT. Specifically, the number N of sections is based on the result of a test in which the size WT of the reference template 7A acquired from an image capturing the marker 4 is varied to find out to what extent the reference template 7A can be scaled up and down before failing to produce a successful match in pattern matching performed on the image from which the reference template 7A was acquired.
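  • One plausible way to choose the partitioning positions under the 1.1-times resolution constraint is a greedy scan over pixel positions, opening a new section whenever the resolution drifts too far from that at the section start. The sketch below assumes the quadratic calibration curve of expression (4) with hypothetical coefficients; none of the names appear in the patent.

```python
# Hypothetical calibration coefficients for y = a*x^2 + b*x + c
# (expression (4)); real values would come from the calibration method.
a, b, c = 0.0005, 1.0, 4000.0

def resolution(n):
    """Resolution [mm/pix] at pixel position n: height(n+1) - height(n)."""
    height = lambda x: a * x ** 2 + b * x + c
    return height(n + 1) - height(n)

def partition(num_pixels, max_ratio=1.1):
    """Greedily split the pixel range into sections so that, within each
    section, resolutions differ by a factor of less than max_ratio.
    Returns the starting pixel position of every section."""
    starts = [0]
    base = resolution(0)
    for n in range(1, num_pixels):
        r = resolution(n)
        if max(r, base) / min(r, base) >= max_ratio:
            starts.append(n)
            base = r
    return starts

starts = partition(1000)
```

With these sample coefficients the resolution roughly doubles over 1000 pixels, so the scan yields several sections, each internally within the 1.1-times bound.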
  • The template enlarging/reducing unit 5d as template scaling processing means performs processing to scale up or down the reference template 7A to change its size WT for each section Ai on the basis of the reference template 7A inputted from the template setting unit 5b and the information on the section Ai inputted from the image dividing unit 5c. Data on each template 7Bi with its size WT thus changed for the corresponding section Ai (hereinafter, referred to as scaled template) is sent to the pattern matching unit 5e through the memory m2.
  • To be specific, the template enlarging/reducing unit 5d creates the scaled templates 7Bi corresponding to the sections Ai by: calculating a factor by which the reference template 7A is scaled (hereinafter, referred to as scale factor) for each section Ai; and scaling up or down the reference template 7A through bilinear interpolation, which is a common technique for scaling an image. Since the size WT of the reference template 7A is registered at the time of registering the reference template 7A, a size WTi of the scaled template 7Bi can be found by multiplying the size WT of the reference template 7A by the corresponding scale factor.
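  • The scaling step can be sketched with one-dimensional linear interpolation, the 1-D counterpart of the bilinear interpolation named above; `scale_template` and the sample factor are illustrative, not from the patent.

```python
import numpy as np

def scale_template(template, factor):
    """Resample a 1-D luminance template to `factor` times its length WT
    by linear interpolation, giving a scaled template of size WTi."""
    wt = len(template)
    wti = max(1, int(round(wt * factor)))
    return np.interp(np.linspace(0, wt - 1, wti), np.arange(wt), template)

# A toy reference template (white/black pattern) enlarged by a factor of 1.5.
ref = np.array([0, 0, 255, 255, 0, 0, 255, 255, 0, 0], dtype=float)
enlarged = scale_template(ref, 1.5)
```

WTi here is simply WT multiplied by the scale factor and rounded to the nearest pixel, matching the size relation described in the text.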
  • Here, each scale factor is found from the following expressions (1) to (3). Note that an expression obtained by a calibration method in Japanese Patent Application No. 2009-011648, for example, is used as an approximate expression (4) for converting a pixel position in an image into an actual height.

    P_n = {a(n+1)^2 + b(n+1) + c} - {a·n^2 + b·n + c}    (1)
    P_ori = {a(ori+1)^2 + b(ori+1) + c} - {a·ori^2 + b·ori + c}    (2)
    scale = P_ori / P_n    (3)
    y = a·x^2 + b·x + c    (4)
    where "a," "b," and "c" are coefficients of the approximate expression (4) for finding actual displacement from a pixel position; "P_n" is the resolution [mm/pix] at a pixel position n to be scaled up or down; "P_ori" is the resolution [mm/pix] at the pixel position ori on the reference template 7A; and "scale" is the scale factor.
  • In this embodiment, the resolution can be found as a height [mm] per pixel. Specifically, the resolution [mm/pix] can be found by finding a height [mm] at a pixel position n and a height [mm] at a pixel position n+1 next thereto and then subtracting the height at the pixel position n from the height at the pixel position n+1.
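  • Expressions (1) to (3) translate directly into code; the coefficients below are hypothetical placeholders for a real calibration result.

```python
def height(n, a, b, c):
    """Actual height [mm] at pixel position n (expression (4))."""
    return a * n ** 2 + b * n + c

def scale_factor(n, ori, a, b, c):
    """Scale factor for pixel position n relative to the reference
    position ori: the ratio P_ori / P_n of the per-pixel resolutions
    (expressions (1) to (3))."""
    p_n = height(n + 1, a, b, c) - height(n, a, b, c)
    p_ori = height(ori + 1, a, b, c) - height(ori, a, b, c)
    return p_ori / p_n

a, b, c = 0.0005, 1.0, 4000.0  # hypothetical calibration coefficients
```

As the text notes, the factor is exactly 1 when the target position coincides with the reference position, and exceeds 1 where the resolution is finer than at the reference.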
  • Note that, needless to say, the scale factor is set to 1 when the size WTi of the scaled template 7Bi is equal to the size WT of the reference template 7A.
  • The pattern matching unit 5e as pattern-matching processing means detects the pixel position of the marker 4 in the input image 6 by performing pattern-matching processing for each section Ai on the basis of the information on the section Ai inputted from the image dividing unit 5c and the data on the corresponding scaled template 7Bi inputted from the template enlarging/reducing unit 5d. The pixel position of the marker 4 obtained by the pattern matching unit 5e is sent to the pantograph displacement calculating unit 5f through the memory m2.
  • The pantograph displacement calculating unit 5f as pantograph displacement calculating means converts the displacement of the marker 4 in the input image 6 into the actual displacement of the pantograph 1a on the basis of the pixel position of the marker 4 in the input image 6 inputted from the pattern matching unit 5e. Note that an approximate expression obtainable for example from Japanese Patent Application No. 2009-011648 or the like is found in advance and used as a calculation expression for converting the displacement of the trace of the pantograph 1a in the input image 6 into the actual displacement of the pantograph 1a. Data on the actual displacement of the pantograph 1a obtained by the pantograph displacement calculating unit 5f is sent to the filtering unit 5g through the memory m2.
  • The filtering unit 5g as filtering processing means performs smoothing processing on the displacement data inputted from the pantograph displacement calculating unit 5f. The actual displacement of the pantograph 1a is in a state of containing quantization errors of the image. Hence, the actual displacement data is subjected to filtering processing to smooth the displacement data. As a result, the quantization errors contained in the displacement data are reduced. The displacement data after the smoothing (hereinafter, referred to as smoothed displacement data) is sent to the acceleration output unit 5h through the memory m2.
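  • The patent does not prescribe a particular smoothing filter; a moving average is one common choice and is enough to illustrate how quantization errors in the displacement data are attenuated (sketch only, with invented names).

```python
def smooth(displacement, window=3):
    """Moving-average filter over the displacement data; edge samples
    use a correspondingly shorter window."""
    half = window // 2
    out = []
    for i in range(len(displacement)):
        lo = max(0, i - half)
        hi = min(len(displacement), i + half + 1)
        out.append(sum(displacement[lo:hi]) / (hi - lo))
    return out

# A one-sample quantization spike of 10 mm is attenuated by the filter.
data = [100.0, 100.0, 110.0, 100.0, 100.0]
smoothed = smooth(data)
```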
  • The acceleration output unit 5h as acceleration outputting means performs second order differentiation on the smoothed displacement data inputted from the filtering unit 5g to calculate the acceleration of the marker 4, i.e., the pantograph 1a, in the vertical direction. To be specific, the acceleration is found by performing second order differentiation on the displacement data smoothed by the filtering processing and then outputted to the monitor 5B. In this event, a point where the acceleration of the pantograph 1a is 20 G or greater, for example, is detected as a hard spot in this embodiment. The calculated acceleration data is outputted to the monitor 5B through the memory m2 and displayed on the monitor 5B.
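  • The acceleration computation and the 20-G hard-spot test can be sketched with a central second-order finite difference; the function below is illustrative, with the sampling interval and threshold following the figures given in the text.

```python
G = 9.80665  # m/s^2 per 1 G

def detect_hard_spots(displacement_mm, dt=0.001, threshold_g=20.0):
    """Second-order differentiation of the smoothed displacement [mm]
    at sampling interval dt [s]; returns the indices whose vertical
    acceleration is threshold_g or greater (hard-spot candidates)."""
    spots = []
    for i in range(1, len(displacement_mm) - 1):
        # central second difference, converted from mm to m
        acc = (displacement_mm[i - 1] - 2.0 * displacement_mm[i]
               + displacement_mm[i + 1]) / 1000.0 / dt ** 2
        if abs(acc) / G >= threshold_g:
            spots.append(i)
    return spots
```

For example, a 0.3-mm kink within one 1-ms sample corresponds to roughly 30 G and is flagged, whereas constant-height data yields no candidates.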
  • Hereinbelow, based on Fig. 7, a brief description will be given of the flow of trolley-wire hard-spot detection processing performed in the processing computer 5 of this embodiment.
  • As shown in Fig. 7, in the processing computer 5, the template setting unit 5b first performs the processing to register a reference template 7A (step P1). Then, the input image creating unit 5a performs the processing to create an input image 6 in which image signals outputted from the line sensor camera 2 are arranged in chronological order (step P2). Thereafter, as shown in Fig. 6, the image dividing unit 5c performs the processing to divide the input image 6 into a predetermined number N of sections A1, A2, ..., AN (step P3).
  • Then, the template enlarging/reducing unit 5d performs the processing to scale up or down the reference template 7A registered in step P1 for a given section Ai (step P4). Then, the pattern matching unit 5e performs the pattern-matching processing to compare a scaled template 7Bi, obtained by scaling up or down the reference template 7A for the section Ai of the input image 6, with the input image 6 in an attempt to detect the position (pixel position) of the marker 4 in the input image 6 (step P5). Thereafter, it is judged whether or not the pattern matching for the section Ai is completed (step P6). The processing returns to step P5 if the judgment result shows that the pattern-matching processing for the section Ai is not yet completed (NO). On the other hand, the processing proceeds to step P7 if the pattern-matching processing for the section Ai is completed (YES).
  • In step P7, it is judged whether or not the pattern-matching processing is completed for the entire data of the input image. The processing proceeds to step P8 if the judgment result shows that the pattern-matching processing is completed for the entire data of the input image (YES). On the other hand, the processing returns to step P4 if the pattern-matching processing is not yet completed for the entire data of the input image (NO).
  • In step P8, the pantograph displacement calculating unit 5f performs the processing to convert the pixel position of the marker 4 in the input image 6 into the actual displacement of the pantograph 1a on the basis of the detected marker position, for the entire input image 6.
  • Then, the filtering unit 5g performs the filtering processing (step P9). Lastly, the acceleration output unit 5h performs the processing to output the acceleration of the pantograph (step P10).
  • In the device for measuring displacement of a pantograph of this embodiment configured as above, an input image 6 captured at a large camera elevation angle θB as shown in Part (b) of Fig. 14 is divided into the predetermined number N of sections A1, A2, ..., AN as shown in Fig. 6, and then the pattern-matching processing is performed using the scaled templates 7Bi obtained by scaling up or down the reference template 7A on the basis of the resolutions of the respective sections Ai. Accordingly, highly accurate pattern-matching processing can be performed. Moreover, utilizing a calibration result allows accurate calculation of the resolution of each pixel of parts where the calibration is performed.
  • Embodiment 2 (Shortening Processing Time by Reducing Processing Range of Pattern Matching)
  • A second embodiment of the device for measuring displacement of a pantograph of the present invention will be described. This embodiment differs from Embodiment 1 in the processing of the pattern matching unit 5e. The other configurations are substantially the same as those described in Embodiment 1. In the following, the processing units providing the same effects will be denoted by the same reference numerals, and overlapping descriptions will be omitted. The differences will be mainly described.
  • In this embodiment, the following processing is performed as the processing in step P5 shown in Fig. 7.
  • First, the same pattern-matching processing as Embodiment 1 is performed on the first line of an input image 6, and the detected marker position is stored in the memory m2. Thereafter, the pattern-matching processing is performed on the second and subsequent lines but only within a range of ±NP[pix] from the pixel position of the marker obtained as a result of the pattern-matching processing on the immediately previous line.
  • In sum, once the trace of the marker 4 is detected in the input image 6, the next line is subjected to the pattern-matching processing only within a predetermined range from the pixel position of the marker 4. Here, the range ±NP [pix] to be subjected to the pattern-matching processing is determined by taking into account the distance of movement of the marker (the distance of vertical displacement of the pantograph) per unit time in the image capturing using the line sensor camera 2.
  • Specifically, "Conventional Railway Structure Regulation Section 62" states that in an area where a train travels at a speed faster than 50 km/h, the inclination of a trolley wire must be 5/1000 or smaller in a case where the trolley wire is suspended from a catenary or an overhead rigid conductor line, and be 15/1000 or smaller otherwise. An inclination of 5/1000 means a 5-m change in height over a distance of 1000 m.
  • Now, let us explain this while assuming that the sampling frequency of the line sensor camera 2 is set to 1000 Hz (an image containing 1000 lines is captured in 1 second, i.e., at 1-ms intervals). When the car 1 travels at a speed of 50 km/h, for example, the car 1 advances a distance of approximately 13.888 m per second, which is approximately 0.013888 m per millisecond. Then, in a case where the pantograph 1a is displaced vertically with an inclination of 15/1000, the height of the pantograph changes by approximately 0.21 mm per unit time (1 ms).
  • In this embodiment, a reference acceleration for detecting a hard spot is set to 20 G. This is an acceleration for a case assuming a 0.1-mm change per unit time (1 ms). Given "Conventional Railway Structure Regulation Section 62" mentioned above, a 10-mm change per unit time should be large enough. So, with reference to the pixel position of the marker 4 detected in the immediately previous line, a pixel width NP [pix] is calculated which assumes a 10-mm change per unit time (1 ms) based on the image resolution, and the corresponding range is set as the range to be subjected to the pattern matching. This embodiment shows an example assuming a 10-mm change per unit time as the condition for calculating the pixel width NP [pix]. Note, however, that the condition for calculating the pixel width NP [pix] is not limited thereto. Any condition may be set as needed.
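  • Under these assumptions, the half-width NP [pix] of the search range follows directly from the image resolution; the sketch below uses the 10-mm-per-unit-time condition from the text (the function name is invented).

```python
import math

def search_half_width(resolution_mm_per_pix, max_change_mm=10.0):
    """Half-width NP [pix] of the pattern-matching range: the number of
    pixels covering an assumed maximum marker movement of max_change_mm
    per sampling interval, rounded up."""
    return math.ceil(max_change_mm / resolution_mm_per_pix)
```

At a resolution of 1.0 mm/pix this gives NP = 10, so each line is matched only over 21 pixels centred on the previous detection.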
  • When the line sensor camera 2 captures images of the pantograph 1a, the distance of the vertical displacement of the pantograph 1a per unit time of the image capturing is small, and therefore the distance of the movement of the marker 4 is small as well. Thus, once the pixel position of the marker 4 is detected through the pattern-matching processing, the subsequent pattern-matching processing need only be performed within a range of ±NP [pix] from that position. Given that the width of the input image 6 captured by the line sensor camera 2 is "WIDTH," a time t required for completing the pattern-matching processing in this embodiment can be expressed by the following expression (5) using a time t0 required for completing the pattern-matching processing in Embodiment 1.
    [Formula 1]
    t = (2·NP / WIDTH) · t0    (5)
  • As described, in the device for measuring displacement of a pantograph of this embodiment, the pattern-matching processing is performed only within a range of ±NP[pix] from the pixel position of the marker 4 in the immediately previous line detected by pattern matching. Accordingly, the processing time can be shortened as compared to Embodiment 1. Further, narrowing the range to be subjected to the pattern-matching processing can lower the possibility of detecting noises and the like.
  • Embodiment 3 (Shortening Processing Time by Changing Pitch at Which to Shift Template on the Basis of Correlation Value)
  • A third embodiment of the device for measuring displacement of a pantograph of the present invention will be described by use of Fig. 8. Fig. 8 is an explanatory diagram showing levels of a value of correlation to the template in an input image.
  • This embodiment differs from Embodiment 1 in the processing of the pattern matching unit 5e. The other configurations are substantially the same as those described in Embodiment 1. In the following, the processing units providing the same effects as the foregoing configurations shown in Figs. 1 to 7 will be denoted by the same reference numerals, and overlapping descriptions will be omitted. The differences will be mainly described.
  • As shown in Fig. 8, a value R of correlation between the input image 6 and the scaled template 7Bi is highest (R = RH) in a part including the trace of the marker captured in the input image 6 and is low (R = RL) in the other parts. Moreover, the part in which the marker 4 is captured occupies merely a small part of the actual input image 6.
  • Thus, in this embodiment, in pattern matching on the input image 6 shown in Fig. 8, correlation values R are first calculated as indexes each representing the degree of resemblance to the registered reference template 7A, and then the pattern-matching processing is performed while the pitch at which to shift the scaled template 7Bi is changed in accordance with the corresponding correlation value R.
  • Specifically, in this embodiment, the following processing is performed as part of the processing in step P5 shown in Fig. 7, before the processing in step P5 described in Embodiment 1 is performed.
  • First, the correlation value R to the reference template 7A is calculated for each pixel position i in the input image 6. The correlation value R can be found through calculation using the following expression (6). Note that the calculation targets one-dimensional correlation because the line sensor camera 2 captures one-dimensional images.
    [Formula 2]
    R = Σ(i=1 to L) (Wi × Ti) / √( Σ(i=1 to L) Wi^2 · Σ(i=1 to L) Ti^2 )    (6)
  • Here, R is the correlation value; L is the width of the template image (set to be smaller than the width of the search image); Wi is a luminance value at the pixel position i in the search image; and Ti is a luminance value at the pixel position i in the template image.
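  • Expression (6) is a one-dimensional normalized cross-correlation, and can be transcribed directly (the names below are invented for illustration).

```python
import math

def correlation(window, template):
    """Normalized cross-correlation of expression (6):
    R = sum(W_i * T_i) / sqrt(sum(W_i^2) * sum(T_i^2)),
    where `window` is the L-pixel stretch of the search image and
    `template` is the template image."""
    num = sum(w * t for w, t in zip(window, template))
    den = math.sqrt(sum(w * w for w in window) * sum(t * t for t in template))
    return num / den if den else 0.0

marker = [0, 255, 255, 0]
```

R is 1 where the window matches the template exactly and falls toward 0 as the overlap of bright pixels disappears.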
  • Next, the pitch at which to shift the scaled template 7Bi during the pattern-matching processing is set in accordance with the correlation value R. To be specific, the shifting pitch of the scaled template 7Bi is set such that the scaled template 7Bi will shift shorter during the pattern-matching processing on a part having a higher correlation value R than during the pattern-matching processing on a part having a lower correlation value R.
  • The correlation value is 1 at the maximum and 0 at the minimum. In the pattern-matching processing of this embodiment, the correlation value R may be about 0.8 in a part having a low correlation and about 0.99 in a part having a high correlation, for example. In this respect, the shifting pitch of the scaled template 7Bi during the pattern-matching processing can be set on the basis of the correlation value R in the following way. The shifting pitch of the scaled template 7Bi is increased by 1 [pix] for every decrease of the correlation value R by 0.05, so that the shifting pitch is 1 [pix] when the correlation value R is 0.95 or higher while the shifting pitch is 2 [pix] when the correlation value R is between 0.90 and 0.95.
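  • The mapping from correlation value to shifting pitch might look as follows, using the example thresholds from the text (1 [pix] at R ≥ 0.95, one extra pixel per further 0.05 drop); the exact schedule is tunable and this function is only a sketch.

```python
def shifting_pitch(r, base=0.95, step=0.05):
    """Shifting pitch [pix] of the scaled template: 1 where the
    correlation value r is `base` or higher, growing by 1 for every
    further `step` drop in r."""
    if r >= base:
        return 1
    return 2 + int((base - r) // step)
```

High-correlation parts (near the marker trace) are thus scanned pixel by pixel, while low-correlation background is skipped over in larger strides.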
  • In this embodiment, the processing in step P5 described in Embodiment 1 is performed after the shifting pitch of the scaled template 7Bi is set as mentioned above.
  • Note that a threshold of the correlation value R (0.05 in this embodiment) for changing the shifting pitch is set manually. In addition, the shifting pitch of the scaled template 7Bi is not limited to those mentioned above and may be set to any values as needed.
  • In the device for measuring displacement of a pantograph of this embodiment configured as above, the shifting pitch is changed on the basis of the level of the correlation value R, and therefore the pattern-matching processing can be performed more efficiently than Embodiment 1 in which the pattern-matching processing is performed on a pixel basis.
  • Embodiment 4 (Improving Efficiency of Processing at Partitioning Position in Image)
  • A fourth embodiment of the device for measuring displacement of a pantograph of the present invention will be described by use of Figs. 9 and 10. Fig. 9 is an explanatory diagram showing an example of a case where a partitioning position is present within a processing range of pattern matching. Fig. 10 is an explanatory diagram showing an example of re-setting the partitioning position shown in Fig. 9.
  • This embodiment differs from Embodiments 1 and 2 in a method of the processing in step P5 shown in Fig. 7. The other configurations are substantially the same as those described in Embodiments 1 and 2. In the following, the processing units providing the same effects as the foregoing configurations described in Embodiments 1 and 2 will be denoted by the same reference numerals, and overlapping descriptions will be omitted. The differences will be mainly described.
  • Embodiment 2 mentioned above attempts to improve the processing efficiency by limiting the processing range of the pattern matching on the basis of the pixel position of the marker 4 detected by the pattern-matching processing on the immediately previous line. However, in Embodiment 2, a partitioning position 8 may be present within a processing range B of the pattern matching in the input image 6 as shown in Fig. 9 (a range of ±NP[pix] from the pixel position of the marker 4 obtained through the pattern-matching processing on the previous line). In this case, the scale factor of the scaled template 7Bi needs to be switched for the two sections (the sections Ai and Ai+1 in this embodiment).
  • To solve this, in this embodiment, if a partitioning position 8 is included in the processing range B of the pattern matching as shown in Fig. 9 at the time of performing the pattern-matching processing, processing is performed so that: all the partitioning positions 8 can be automatically re-set to exclude the partitioning position 8 from the processing range B; and the size of each scaled template 7Bi can be re-set to be adjusted to its corresponding newly-set section Ai.
  • Specifically, in this embodiment, the following processing is performed as the processing in the foregoing step P5 shown in Fig. 7. First, the same pattern-matching processing as Embodiment 1 is performed on the first line of an input image 6, and the pixel position of the detected marker 4 is stored in the memory m2. Thereafter, for the second and subsequent lines, a range of ±NP[pix] from the pixel position of the marker 4 detected through the pattern-matching processing on the immediately previous line is set as the processing range B of the pattern matching.
  • Then, it is checked, for the pixel position P [pix] of the marker 4 obtained from the immediately previous line, whether or not a partitioning position 8 is included in the processing range B of P±NP [pix]. If no partitioning position 8 is included, the pattern matching is performed using the same scaled templates 7Bi as those of the immediately previous line.
  • On the other hand, if a partitioning position 8 is included in the range of P±NP [pix], new scaled templates 7Bi are set by: re-setting the sections Ai on the basis of the pixel position of the marker 4 obtained from the immediately previous line; and re-calculating the scale factors of the scaled templates 7Bi for the re-set sections Ai. The pattern-matching processing described in Embodiment 1 is performed after the above processing is performed. Note that the scale factors of the scaled templates 7Bi and the partitioning positions 8 are calculated by use of the methods described in Embodiment 1.
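  • The check that triggers the re-setting reduces to a one-line predicate; this sketch uses invented names and stands in for the section bookkeeping described above.

```python
def needs_resection(p, half_width, partitions):
    """True when some partitioning position lies inside the processing
    range P +/- NP around the marker position p from the previous line,
    in which case the sections and scaled templates are re-set."""
    return any(p - half_width <= q <= p + half_width for q in partitions)
```

For instance, with a marker at pixel 100 and NP = 10, a partitioning position at pixel 95 forces a re-setting, while partitions at 300 and 600 do not.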
  • Note that in this embodiment, the reference template 7A is set manually, but the scaled templates 7Bi are set automatically after this manual setting on the basis of the resolutions of the corresponding image positions.
  • In the device for measuring displacement of a pantograph of this embodiment mentioned above, if a partitioning position is included in the processing range of the pattern matching, new scaled templates 7Bi and partitioning positions 8 are set automatically on the basis of the pixel position of the marker 4 obtained from the immediately previous line so that no partitioning position 8 may be included in the processing range. Accordingly, in addition to the effect of Embodiment 2, this embodiment makes it possible to perform highly efficient pattern-matching processing.
  • Embodiment 5 (Improving Accuracy of Partitioning Position by Changing Size of Template)
  • A fifth embodiment of the device for measuring displacement of a pantograph of the present invention will be described by use of Figs. 11 and 12. Fig. 11 is an explanatory diagram showing an example of template sizes set to their respective sections. Fig. 12 is an explanatory diagram showing an example of extracting an edge of one of white regions 6a representing traces of white portions 4w of the marker 4 in an input image.
  • If a trace M of the marker 4 (hereinafter, referred to as marker trace) crosses the partitioning position 8 between the sections Ai and Ai+1 as shown in Fig. 11, performing the pattern matching by use of any of the methods of Embodiments 1 to 4 requires the single marker trace M to be subjected to the pattern-matching processing using two scaled templates 7Bi and 7Bi+1 for the respective two sections Ai and Ai+1. As a consequence, the pattern matching results may possibly be offset from each other by several pixels at the partitioning position 8. An offset of several pixels leads to a large error in the calculation of the acceleration.
  • To solve this, in this embodiment, if the marker trace M crosses the partitioning position 8 between the sections Ai and Ai+1, the pattern-matching processing is performed in advance by using the scaled template 7Bi to detect a rough pixel position of the marker trace M; and then an edge of the marker trace M is extracted on the basis of the detected pixel position of the marker trace M. This prevents the occurrence of an error attributable to the marker trace M crossing the partitioning position 8 between the sections Ai and Ai+1.
  • Note that although any threshold can be used to extract the edge of the marker trace M, setting a constant as the threshold may prevent accurate edge extraction when the input image 6 appears dark or bright in whole. For this reason, it is preferable to calculate the average value of luminance in the processing range and set the average value as the threshold. In this way, stable edge extraction can be performed even when the luminance values of the image change in whole.
  • Specifically, in this embodiment, the following processing is performed as the processing in step P5 shown in Fig. 7.
  • First, if the marker trace M crosses the partitioning position 8 between the sections Ai and Ai+1, a rough marker center position PC is detected through the pattern-matching processing using the scaled template 7Bi. Thereafter, as shown in Fig. 12, an average value BA of luminance in a range starting from the marker center position PC and having half the current template size WT is calculated. Then, the last pixel position (the highest edge) PE in a region with higher luminance values than the average luminance value BA is found, and that position is extracted as the pixel position of the marker trace M.
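  • The edge extraction with an adaptive (average-luminance) threshold can be sketched as follows; the sample column data and all names are illustrative, not from the patent.

```python
def extract_marker_edge(column, center, half_width):
    """Around the roughly detected marker centre, take the average
    luminance of the local range as the threshold and return the last
    (highest) pixel position exceeding it, i.e. the marker edge PE.
    Using the local average keeps the extraction stable when the whole
    image is darker or brighter than usual."""
    lo = max(0, center - half_width)
    hi = min(len(column), center + half_width + 1)
    window = column[lo:hi]
    threshold = sum(window) / len(window)
    edge = None
    for i in range(lo, hi):
        if column[i] > threshold:
            edge = i
    return edge

# Bright marker trace (200-210) on a dark background (10-12).
col = [10, 10, 200, 210, 205, 12, 11]
```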
  • Note that the term "rough" used in this embodiment refers to a range within which the pattern-matching processing results in the detection of a marker center position that will not cause an error in the matching results. The allowable range of error in the template size WT is about ±10% experimentally.
  • In the device for measuring displacement of a pantograph of this embodiment configured as above, a rough marker position is detected through the pattern matching, and then the edge of the marker 4 is extracted on the basis of the detected pixel position of the marker 4. Thus, the template size is not changed at the partitioning position set in the input image 6. This makes it possible to avoid the offset between the pattern matching results attributable to the changing of the template size WT at the partitioning position in the image, and hence improve the accuracy. Moreover, the average luminance value of the range starting from the marker center position, detected through the pattern matching, and having half the current template size WT is calculated, and that value is used as the threshold for the edge extraction. Accordingly, stable edge extraction can be performed even when the luminance of the whole input image 6 changes.
  • INDUSTRIAL APPLICABILITY
  • The present invention is applicable to devices for measuring displacement of a pantograph and to methods for detecting a hard spot of a trolley wire, and is particularly suited to those in which a hard spot of a trolley wire is measured through pattern-matching processing on an image, captured by a line sensor camera, of a pantograph equipped with a marker designed for hard-spot measurement.
  • EXPLANATION OF REFERENCE NUMERALS
    • 1 car
    • 2 line sensor camera
    • 3 lighting device
    • 4 marker
    • 4w white portion
    • 4b black portion
    • 5 processing computer
    • 5A arithmetic processing unit
    • 5B monitor
    • 5a input image creating unit
    • 5b template setting unit
    • 5c image dividing unit
    • 5d template enlarging/reducing unit
    • 5e pattern matching unit
    • 5f pantograph displacement calculating unit
    • 5g filtering unit
    • 5h acceleration output unit
    • 6 input image
    • 6a white region
    • 6b black region
    • 7 template
    • 7a white region
    • 7b black region
    • 8 partitioning position
    • A1, A2, ..., Ai, ..., AN section
    • WT template size

Claims (6)

  1. A device for measuring displacement of a pantograph including image capturing means (2) adapted to be placed on a roof of a car (1) for capturing an image of a pantograph (1a), and image processing means (5) for acquiring displacement of the pantograph by performing image processing on an input image captured by the image capturing means,
    wherein the image processing means comprise:
    input image creating means (5a) for creating an input image by using an image signal inputted from the image capturing means;
    template setting means (5b) for acquiring, as a template, a marker pattern obtained by extracting a marker portion from an image acquired in advance;
    image-division processing means (5c) for dividing the input image into a predetermined number of sections on the basis of a resolution of each of the pixels obtained by calibration means;
    template scaling processing means (5d) for scaling up or down the template for each of the sections of the input image on the basis of the resolution of the section;
    pattern-matching processing means (5e) for detecting a pixel position of a marker (4) in the input image by using the template and the input image; and
    pantograph displacement calculating means (5f) for calculating actual displacement of the pantograph on the basis of the pixel position of the marker;
    characterized in that the image processing means further comprise:
    filtering processing means (5g) for performing smoothing processing on data on the displacement of the pantograph; and
    acceleration outputting means (5h) for outputting acceleration of the pantograph calculated on the basis of the data on the displacement of the pantograph smoothed by the filtering processing means;
    wherein the pattern-matching processing means are adapted to set a pitch at which to shift the template, in accordance with a value representing correlation to the template and detected from the input image.
  2. The device for measuring displacement of a pantograph according to claim 1, wherein, if a trace of the marker (4) lies over any two adjacent ones of the sections, the pattern-matching processing means (5e) are adapted to automatically correct positions of the sections such that the trace of the marker is included within one of the sections, on the basis of the position of the marker detected in the immediately previous line.
  3. The device for measuring displacement of a pantograph according to claim 1, wherein the pattern-matching processing means (5e) are adapted to detect a rough center position of a trace of the marker (4), to calculate an average luminance value of a range starting from the center position and having half a width of the template, and to extract the trace of the marker by using the average luminance value as a threshold.
  4. A method for detecting a hard spot of a trolley wire by using a device for measuring displacement of a pantograph including image capturing means (2) placed on a roof of a car (1) for capturing an image of a pantograph (1a), and image processing means (5) for acquiring displacement of the pantograph by performing image processing on an input image captured by the image capturing means, the method comprising:
    a first step of acquiring, as a template, a marker pattern obtained by extracting a marker portion from an image acquired in advance;
    a second step of creating an input image by using an image signal inputted from the image capturing means;
    a third step of dividing the input image into a predetermined number of sections on the basis of a resolution of each of the pixels obtained by calibration means;
    a fourth step of scaling up or down the template for each of the sections of the input image on the basis of the resolution of the section;
    a fifth step of detecting a pixel position of a marker (4) in the input image through pattern-matching processing of the template and the input image; and
    a sixth step of calculating actual displacement of the pantograph on the basis of the pixel position of the marker;
    the method characterized by:
    a seventh step of performing smoothing processing on data on the displacement of the pantograph; and
    an eighth step of outputting acceleration of the pantograph calculated on the basis of the data on the displacement of the pantograph smoothed in the seventh step;
    wherein the fifth step is performed by setting a pitch at which to shift the template, on the basis of a value representing correlation to the template and detected from the input image for each pixel position.
  5. The method for detecting a hard spot of a trolley wire according to claim 4, wherein, if a trace of the marker (4) lies over any two adjacent ones of the sections, the fifth step is performed by automatically correcting positions of the sections such that the trace of the marker is included within one of the sections on the basis of a position of the marker detected in the immediately previous line.
  6. The method for detecting a hard spot of a trolley wire according to claim 4, wherein the fifth step is performed by detecting a rough center position of a trace of the marker (4), calculating an average luminance value of a range starting from the center position and having half a width of the template, and using the average luminance value as a threshold.
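The correlation-dependent shift pitch recited in claims 1 and 4 can be sketched as follows. This is a hypothetical illustration, not the patented implementation: the normalized-correlation measure, the coarse pitch value, and the gate parameter are all our own assumptions; the claims only require that the pitch depend on a correlation value detected from the input image.

```python
import numpy as np

def match_marker(line: np.ndarray, template: np.ndarray,
                 coarse_pitch: int = 4, corr_gate: float = 0.5) -> int:
    """Shift the template along the line with a pitch set from the
    correlation value: step coarsely where correlation is low and
    refine to one pixel where it is high; return the best position."""
    n, m = len(line), len(template)
    t = template - template.mean()
    best_pos, best_corr = 0, -np.inf
    x = 0
    while x <= n - m:
        w = line[x : x + m] - line[x : x + m].mean()
        denom = np.linalg.norm(w) * np.linalg.norm(t)
        corr = float(w @ t) / denom if denom > 0 else 0.0
        if corr > best_corr:
            best_pos, best_corr = x, corr
        # the pitch for the next shift depends on the correlation here
        x += 1 if corr >= corr_gate else coarse_pitch
    return best_pos

# Flat background with the marker pattern at pixel 4; the search steps
# by 2 over flat regions and by 1 once correlation rises.
line = np.array([50, 50, 50, 50, 50, 200, 50, 50, 50], dtype=float)
pos = match_marker(line, np.array([50.0, 200.0, 50.0]), coarse_pitch=2)
```

Skipping positions where the correlation is low reduces the number of template evaluations per line, which matters when every line of a line-sensor image must be matched in real time; the coarse pitch must of course stay smaller than the marker width so the marker cannot be stepped over.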
EP10774953.3A 2009-05-15 2010-05-13 Device for measuring displacement of pantograph and method for detecting hard spot of trolley wire Active EP2431706B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009118285A JP5321235B2 (en) 2009-05-15 2009-05-15 Pantograph displacement measuring device and trolley wire hard spot detection method
PCT/JP2010/058083 WO2010131696A1 (en) 2009-05-15 2010-05-13 Device for measuring displacement of pantograph and method for detecting hard spot of trolley wire

Publications (3)

Publication Number Publication Date
EP2431706A1 EP2431706A1 (en) 2012-03-21
EP2431706A4 EP2431706A4 (en) 2014-04-02
EP2431706B1 true EP2431706B1 (en) 2017-08-16

Family ID=43085069

Family Applications (1)

Application Number Title Priority Date Filing Date
EP10774953.3A Active EP2431706B1 (en) 2009-05-15 2010-05-13 Device for measuring displacement of pantograph and method for detecting hard spot of trolley wire

Country Status (5)

Country Link
EP (1) EP2431706B1 (en)
JP (1) JP5321235B2 (en)
KR (1) KR101292897B1 (en)
CN (1) CN102428341B (en)
WO (1) WO2010131696A1 (en)


Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5861318B2 (en) * 2011-08-26 2016-02-16 株式会社明電舎 Trolley wire data comparison device
JP5534058B1 (en) 2013-02-19 2014-06-25 株式会社明電舎 Wear measuring apparatus and method
WO2016040997A1 (en) * 2014-09-15 2016-03-24 Dti Group Limited Arcing filtering using multiple image capture devices
CN104833904B (en) * 2015-04-13 2017-09-29 东莞市诺丽电子科技有限公司 Straddle-type monorail pantograph arcing detection method
CN105158257B (en) * 2015-05-21 2018-07-20 苏州华兴致远电子科技有限公司 Slide plate measurement method and device
SG11201803516QA (en) * 2015-12-15 2018-06-28 Mitsubishi Electric Corp Trolley-wire measurement device and trolley-wire measurement method
CN105539206B (en) * 2015-12-24 2017-10-17 湖南华宏铁路高新科技开发有限公司 A kind of acquisition methods of electrification railway contact net bar position information
ITUA20162698A1 (en) * 2016-04-19 2017-10-19 Mer Mec S P A OPTICAL SYSTEM FOR THE MEASUREMENT OF THE CONTACT FORCE BETWEEN THE PANTOGRAPH AND THE CATENARY
CN111091525A (en) * 2018-10-18 2020-05-01 株洲中车时代电气股份有限公司 Contact net hard spot detection system and method thereof
CN109186469B (en) * 2018-10-18 2019-11-15 北京华开领航科技有限责任公司 Bow net dynamic monitoring system
JP6858742B2 (en) * 2018-12-17 2021-04-14 株式会社フジタ Displacement measuring device
JP6669294B1 (en) * 2019-03-07 2020-03-18 株式会社明電舎 Pantograph displacement measuring device and trolley wire hard point detection method
CN112837260A (en) * 2019-11-22 2021-05-25 株洲中车时代电气股份有限公司 Contact net hard spot detection method, electronic device and readable storage medium
CN110849885B (en) * 2019-11-27 2022-08-23 苏州华兴致远电子科技有限公司 Hard spot monitoring method, device and system in bow net system
CN113320445B (en) * 2020-02-28 2022-12-30 中铁二院工程集团有限责任公司 Online monitoring and intelligent hidden danger and fault distinguishing and early warning system for contact network
JP2021181893A (en) * 2020-05-18 2021-11-25 シャープ株式会社 Railway facility measurement device, control method of railway facility measurement device, railway facility measurement program and recording medium
KR102276634B1 (en) 2020-09-15 2021-07-13 엠아이엠테크 주식회사 System for detecting abnormality of pantograph on electric train installed on vehicle and method for processing thereof
CN112161577B (en) * 2020-09-21 2021-05-25 北京运达华开科技有限公司 Contact net hard spot detection method and system
JP7505419B2 (en) 2021-02-25 2024-06-25 株式会社明電舎 Pantograph displacement measuring device and contact wire hard point detection method
CN113256723B (en) * 2021-06-29 2023-03-21 西南交通大学 Automatic detection method for pantograph lifting time and pantograph head displacement curve
DE102022208846A1 (en) 2022-08-26 2024-02-29 Siemens Mobility GmbH Road vehicle with a pantograph
CN117309875B (en) * 2023-09-20 2024-04-09 北京运达华开科技有限公司 Non-contact type bow net contact hard point detection device and method
KR102649465B1 (en) * 2023-11-22 2024-03-20 주식회사 미래건설안전 A system and a method for determining displacement of object based on image analysis regarding the object, and a marker module

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08336204A (en) * 1995-06-06 1996-12-17 Hitachi Electron Eng Co Ltd Motion analyzer for pantograph
JP3198267B2 (en) * 1997-04-30 2001-08-13 株式会社東芝 Image processing device and image forming device
JP3406195B2 (en) * 1997-08-26 2003-05-12 本田技研工業株式会社 Vehicle distance measuring device
JP4085588B2 (en) * 2001-03-22 2008-05-14 株式会社明電舎 Pantograph measuring device
JP4258340B2 (en) * 2003-10-15 2009-04-30 株式会社明電舎 Pantograph detection device
JP4690749B2 (en) * 2005-03-11 2011-06-01 株式会社明電舎 Pantograph motion measuring device by image processing
JP4635657B2 (en) * 2005-03-11 2011-02-23 株式会社明電舎 Trolley wire wear measuring device by image processing
JP4923942B2 (en) * 2006-10-20 2012-04-25 株式会社明電舎 Pantograph measuring device by image processing
JP2009011648A (en) 2007-07-06 2009-01-22 Panasonic Electric Works Co Ltd Attaching structure for wash basin
JP5097596B2 (en) * 2008-03-31 2012-12-12 公益財団法人鉄道総合技術研究所 Measuring device using line sensor

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11019548B2 (en) 2017-11-24 2021-05-25 Samsung Electronics Co., Ltd. Electronic device and communication method thereof
US11218938B2 (en) 2017-11-24 2022-01-04 Samsung Electronics Co., Ltd. Electronic device and communication method thereof

Also Published As

Publication number Publication date
KR101292897B1 (en) 2013-08-02
EP2431706A4 (en) 2014-04-02
JP2010266341A (en) 2010-11-25
EP2431706A1 (en) 2012-03-21
CN102428341B (en) 2014-05-28
KR20120022943A (en) 2012-03-12
CN102428341A (en) 2012-04-25
WO2010131696A1 (en) 2010-11-18
JP5321235B2 (en) 2013-10-23

Similar Documents

Publication Publication Date Title
EP2431706B1 (en) Device for measuring displacement of pantograph and method for detecting hard spot of trolley wire
JP5494286B2 (en) Overhead position measuring device
EP2960620A1 (en) Wear measuring device and method for same
JP4832321B2 (en) Camera posture estimation apparatus, vehicle, and camera posture estimation method
EP3199910B1 (en) Line measurement device and method
JP2010184527A (en) Train stop detection system and train travel speed and position detection system
EP3936369B1 (en) Pantograph displacement measuring device, and trolley-wire hard-spot detection method
EP2821747B1 (en) Pantograph measurement method, and pantograph measurement device
EP2966400A1 (en) Overhead line position measuring device and method
US20200250806A1 (en) Information processing apparatus, information processing method, and storage medium
JP2008299458A (en) Vehicle monitoring apparatus and vehicle monitoring method
EP2151667B1 (en) Equipment for measuring abrasion of trolley wire by image processing
JP2010127746A (en) Apparatus for measuring abrasion and deflection of trolley wire by image processing
JP2009276910A (en) Image processor, method and program
EP3885701A1 (en) Image processing device, image processing method, and program
EP4082867A1 (en) Automatic camera inspection system
JP3964077B2 (en) Trolley wire support insulator height measuring device
JPH07244717A (en) Travel environment recognition device for vehicle
JP3891730B2 (en) Trolley wire support bracket mounting angle measuring device
JP2020179798A (en) Turnout detection device and turnout detection method
JP2011180049A (en) Pantograph monitoring system
JP4165966B2 (en) Object recognition device
JP2008298733A (en) Apparatus for measuring wear of trolley wire by image processing
JPH11160046A (en) Appearance inspection method
JPH03286399A (en) Image processing type traffic flow measuring instrument

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20111208

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20140303

RIC1 Information provided on ipc code assigned before grant

Ipc: B60M 1/28 20060101ALI20140225BHEP

Ipc: G01B 11/00 20060101AFI20140225BHEP

Ipc: B60L 5/26 20060101ALI20140225BHEP

Ipc: G06T 1/00 20060101ALI20140225BHEP

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

RIC1 Information provided on ipc code assigned before grant

Ipc: B60M 1/28 20060101ALI20170209BHEP

Ipc: G01B 11/00 20060101AFI20170209BHEP

Ipc: B60L 5/26 20060101ALI20170209BHEP

Ipc: G06T 1/00 20060101ALI20170209BHEP

INTG Intention to grant announced

Effective date: 20170306

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 919489

Country of ref document: AT

Kind code of ref document: T

Effective date: 20170915

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602010044461

Country of ref document: DE

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20170816

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 919489

Country of ref document: AT

Kind code of ref document: T

Effective date: 20170816

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20171116

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170816

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170816

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170816

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170816

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170816

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20171116

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20171216

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20171117

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170816

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170816

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170816

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170816

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170816

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170816

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602010044461

Country of ref document: DE

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 9

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170816

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170816

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170816

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170816

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20180517

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170816

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 602010044461

Country of ref document: DE

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20180513

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20180531

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170816

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180531

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180531

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180513

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20181201

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180513

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180513

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180531

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180513

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170816

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170816

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20100513

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170816

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170816

Ref country code: MK

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170816

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170816

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20240529

Year of fee payment: 15