EP2796402A1 - System and control procedure for the positioning of bridge cranes

Info

Publication number
EP2796402A1
Authority
EP
European Patent Office
Prior art keywords
image
bridge crane
camera
positioning
bridge
Prior art date
Legal status
Withdrawn
Application number
EP13382156.1A
Other languages
German (de)
French (fr)
Inventor
Alberto Cogollos Martinez
Ivan Portas Arrondo
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mecatronia Automatizaciones S.L.
Original Assignee
Mecatronia Automatizaciones S.L.
Priority date: 2013-04-25
Filing date: 2013-04-25
Publication date: 2014-10-29
Application filed by Mecatronia Automatizaciones S.L.
Priority to EP13382156.1A
Publication of EP2796402A1
Application status: Withdrawn

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B66 HOISTING; LIFTING; HAULING
    • B66C CRANES; LOAD-ENGAGING ELEMENTS OR DEVICES FOR CRANES, CAPSTANS, WINCHES, OR TACKLES
    • B66C 13/00 Other constructional features or details
    • B66C 13/18 Control systems or devices
    • B66C 13/46 Position indicators for suspended loads or for crane elements

Abstract

The system comprises image capturing means (6), integrated into the bridge crane and focusing on a surface of reference, and data processing means (12), responsible for receiving and analysing the images acquired by the image capturing means (6), in order to recognise, using a characteristic extraction process, relevant characteristics in the image, and to determine the relative displacement of said relevant characteristics between successive images, and therefore the positioning of the bridge crane. This enables an accurate calculation of the distance covered by the bridge crane.

Description

    Technical field
  • The present invention falls within the field of bridge cranes, and specifically within that of systems and devices for controlling the distance covered by bridge cranes.
  • Background of the invention
  • A bridge crane is a type of crane used in factories and industrial facilities for hoisting heavy loads and displacing them horizontally and vertically. Currently, the control of the positioning of bridge cranes is mainly performed in three ways:
    • By means of an encoder (rotary encoder): The precision of this system depends greatly on the condition of the surfaces of the rotary elements and of the track along which the running trolley advances. In industry, the rails along which bridge cranes move suffer deformations both from the structure of the building itself and of their own, due to the high loads they support. Furthermore, oil and dust in suspension considerably reduce the grip coefficient of the wheels and make it irregular throughout the movement.
    • By means of a laser rangefinder: These are simpler systems, since they only consist of a laser emitter assembled on the crane and a reflector placed at a fixed point aligned with the emitter. With this type of sensor, it must be noted that the diameter of the light spot increases with the distance to the sensor, which can cause measurement problems at long distances due to the 'multipath' effect. Furthermore, in industrial buildings the deformations of the tracks and the non-uniform motion of the bridge mean that the direction of the light beam cannot be controlled, aggravating the 'multipath' effect. Finally, laser measurement systems are sensitive to changes in the type and intensity of the lighting, and the variety of light sources and reflections found in industrial buildings makes it difficult to establish a common measurement pattern.
    • By means of laser triangulation or a vision system: The operating principle of this system is to control the angle of the position of the screens relative to the laser emitter or the vision-system camera. The drawbacks of this system are practically the same as those described in the previous point: the non-uniformity of the crane's movement, the length of the building and the surrounding lighting conditions make it very difficult to establish this method as a standard for controlling the motion of the lifting centre.
  • The present invention solves the aforementioned problems by using artificial vision to enable the accurate calculation of the distance covered by the bridge crane.
  • Summary of the invention
  • The present invention relates to a system and a procedure for controlling the positioning of bridge cranes.
  • The system comprises image capturing means, integrated into the bridge crane and focusing on a surface of reference, and data processing means, responsible for receiving and analysing the images acquired by the image capturing means, in order to recognise, using a characteristic extraction process, relevant characteristics in the image, and to determine the relative displacement of said relevant characteristics between successive images, and therefore the positioning of the bridge crane.
  • In a particular embodiment, the image capturing means are implemented using a camera integrated into the running trolley of the bridge crane, preferably, with zenithal orientation (focusing on the floor).
  • The data processing means are preferably configured to determine regions in the image and to calculate their central, spatial and invariant moments in order to obtain a set of characteristic points Pi as relevant characteristics of the image.
  • Another aspect of the present invention relates to a procedure for controlling the positioning of the bridge crane, which comprises acquiring images using a camera integrated into the bridge crane and focusing on a surface of reference, and analysing the acquired images in order to recognise, using a characteristic extraction process, relevant characteristics in the image, and to determine the relative displacement of said relevant characteristics between successive images, and therefore the positioning of the bridge crane.
  • The procedure may comprise determining regions in the image and calculating their central, spatial and invariant moments to obtain a set of characteristic points Pi as relevant characteristics of the image.
  • Brief description of the drawings
  • Below is a very brief description of a set of drawings intended to aid a better understanding of the invention and expressly related to one embodiment of said invention, which is presented as a non-limiting example thereof.
    • Figure 1 shows a bridge crane with an image-capturing camera.
    • Figures 2A and 2B show successive images acquired by the camera installed in the crane.
    • Figure 3 shows the connection of the camera with the means responsible for processing the images in order to calculate the positioning of the bridge crane.
    Detailed description of the invention
  • Controlling the distance covered by the lifting centre of the bridge crane is essential when it comes to automating handling processes in which this type of machinery is involved.
  • Figure 1 shows, by way of example, a bridge crane known in the state of the art, which has been equipped with image capturing means, camera 6. A bridge crane consists of a pair of parallel rails positioned at a great height, the longitudinal guides 1, on which there is a displaceable bridge 2 that spans the space between the longitudinal guides 1. The bridge 2 has one or two transversal guides 3 (two, in the case shown in Figure 1) along which a running trolley 4 moves, with a block and tackle 5 for hoisting a loading hook that determines the lifting centre C.
  • The movement of the lifting centre C may be determined using the coordinates (X, Y, Z), where:
    • The X coordinate represents the position of the lifting centre C due to the movement of the bridge 2, caused by the action of the motor 7, along the longitudinal guides 1.
    • The Y coordinate represents the position of the lifting centre C due to the movement of the running trolley 4, caused by the action of the motor 8, along the transversal guides 3.
    • The Z coordinate represents the vertical position of the lifting centre C, due to the action of the block and tackle 5.
  • The motion in the (X, Y) plane, which is of interest for calculating the distance covered by the lifting centre C, can usually be calculated in two ways (a brief numerical sketch is given after the formulas below):
    • Measuring the X and Y coordinates with respect to their origins at the initial instant of the movement (X0, Y0) and at the final instant (Xf, Yf), and calculating the differences (DX, DY):

               (DX, DY) = (Xf, Yf) - (X0, Y0)

    • Directly measuring the path covered on both the X and Y axes by counting pulses SPulses, each of which corresponds to a known motion; (XC, YC) is the motion on X and Y covered between consecutive pulses:

               (DX, DY) = (XC, YC) × SPulses
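
As a purely illustrative aid, the following minimal sketch (in Python) reproduces the two calculations above with made-up values; the coordinates, the motion per pulse and the pulse count are hypothetical and do not come from the patent.

```python
# Hypothetical illustration of the two ways of computing the motion (DX, DY).
# All numerical values are made up for the example.

# 1) From the initial and final coordinates of the lifting centre C:
x0, y0 = 2.50, 1.20          # position at the initial instant, in metres
xf, yf = 7.30, 1.92          # position at the final instant, in metres
dx, dy = xf - x0, yf - y0
print(f"From endpoints: DX = {dx:.2f} m, DY = {dy:.2f} m")

# 2) From pulse counting, where each pulse corresponds to a known motion (XC, YC):
xc, yc = 0.010, 0.0015       # motion per pulse on X and Y, in metres (hypothetical)
s_pulses = 480               # number of pulses counted during the movement
dx_p, dy_p = xc * s_pulses, yc * s_pulses
print(f"From pulses:    DX = {dx_p:.2f} m, DY = {dy_p:.2f} m")
```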

  • The installation and operating conditions of bridge cranes mean that measuring the position of the lifting centre C steadily and accurately is an extremely complex process. The inherent tolerances of the crane's mechanisms, the misalignment of the guides (rails), the slippage of the wheels, the high vibrations, the oscillations of the bridge, the suspended particles and the light incident from different angles, among other factors, mean that measurements using traditional systems are not entirely effective.
  • With the objective of controlling the motion of the lifting centre C of bridge cranes, the system of the present invention makes it possible to determine the parameters of the movement (distance, orientation and direction) in real time, maintaining a permissible accumulated deviation over great lengths, irrespective of the path covered, the vibrations and the existing lighting conditions.
  • The present invention consists of an artificial vision system that measures the distance covered, based on the comparison of two images taken at a controlled time interval.
  • The area where the image-capturing camera 6 is placed depends fundamentally on which movements of the bridge crane are to be controlled. If both the X and Y axes need to be recorded, the camera is placed integrally attached to some element moving with the running trolley 4 (the actual structure of the running trolley, the hook, a handling element, etc.). On the other hand, if only the displacement of the bridge 2 along the X axis needs to be controlled, then the camera should be integrated into the bridge 2.
  • The orientation of the camera depends on where the surface of reference is. Usually the camera is placed in zenithal view to acquire images of the floor, but the sidewalls, or even the ceiling, can also be used as reference. In the example shown in Figure 1, the camera 6 is integrally attached to the running trolley 4 with zenithal orientation, focusing on the floor, in such a way that it can obtain the positioning of the running trolley 4 in the X and Y coordinates.
  • The control of displacement is based on the recognition of relevant characteristics in the image (by means of a characteristic extraction process, "feature extraction") and the observation of the relative displacement of these characteristics with respect to the following captured image. With this information, a displacement vector is obtained which, once calibrated, can be expressed in the physical measurement units of the crane.
  • To make the process more robust, areas known as "regions" are isolated in the image (using binarization, aggregation, etc.). In these regions, central, spatial and Hu invariant moments are calculated. The regions of the image where the eigenvalues are large are considered characteristic points Pi, and from these a set of coordinates (X0, Y0)i is obtained, one for each characteristic point Pi. Figures 2A and 2B show successive images acquired by the camera (with zenithal orientation) installed in the running trolley 4 during its displacement. The images have been filtered and show coils 10 on the floor, which are the handling target of the bridge crane, and the characteristic points Pi considered. They also show the coordinates X0, Y0 at the initial instant (first capture, Figure 2A) and the coordinates X1, Y1 at the instant of the second capture (Figure 2B) for a specific characteristic point.
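
The following sketch shows one possible implementation of this step using Python and the OpenCV library. It is only an interpretation of the processing described above (binarization, region extraction, central/spatial moments and Hu invariant moments); the Otsu threshold, the minimum-area filter and the use of region centroids as the characteristic points Pi are assumptions, since the patent does not specify them.

```python
import cv2


def extract_characteristic_points(gray, min_area=50.0):
    """Return a list of (centroid, Hu descriptor) pairs, one per region of the image.

    'gray' is a single-channel image; 'min_area' is a hypothetical filter that
    discards regions that are too small to be reliable.
    """
    # Binarization (Otsu's threshold is only one possible choice).
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # Region extraction (the aggregation step): each external contour is a region.
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    points = []
    for contour in contours:
        m = cv2.moments(contour)              # spatial (m..) and central (mu..) moments
        if m["m00"] < min_area:
            continue                          # skip regions below the area filter
        cx = m["m10"] / m["m00"]              # centroid of the region,
        cy = m["m01"] / m["m00"]              # taken as the characteristic point Pi
        hu = cv2.HuMoments(m).flatten()       # seven Hu invariant moments of the region
        points.append(((cx, cy), hu))
    return points
```

Applied to the two successive captures of Figures 2A and 2B, such a function would yield the coordinates (X0, Y0)i of the characteristic points of the first image and the candidate regions of the second.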
  • The second step is to find, by comparing the aforementioned parameters, similar regions in the second capture and to establish their new coordinates (X1, Y1)i. Finally, the average value of the differences between the coordinates of the first and second captures, (X1, Y1)i - (X0, Y0)i, is calculated, and an analysis of the standard deviation is performed, setting the permissible error according to the desired accuracy. The result is the distance covered on the X and Y axes by the elements of the crane during the interval of time between captures.
  • The sign of the difference between the coordinate values indicates the direction of the displacement; dividing this value by the time interval between captures gives the velocity of the movement.
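
Again as an illustrative sketch only, the second step and the velocity calculation could be expressed as follows. The matching by nearest Hu-moment descriptor, the k-sigma outlier rejection and the hypothetical calibration factor 'scale' are assumptions; the patent only states that similar regions are found, the coordinate differences are averaged and the standard deviation is analysed. The helper extract_characteristic_points() is the one sketched above.

```python
import numpy as np


def estimate_displacement(points_0, points_1, k_sigma=2.0):
    """Estimate the (DX, DY) displacement, in pixels, between two captures.

    'points_0' and 'points_1' are the (centroid, Hu descriptor) lists of the first
    and second capture. Regions are matched by the nearest Hu descriptor and the
    coordinate differences are averaged after a k-sigma outlier rejection.
    """
    diffs = []
    for (x0, y0), hu0 in points_0:
        # Find the most similar region in the second capture.
        (x1, y1), _ = min(points_1, key=lambda p: np.linalg.norm(p[1] - hu0))
        diffs.append((x1 - x0, y1 - y0))
    diffs = np.array(diffs)

    # Discard matches whose difference deviates too much from the mean.
    mean, std = diffs.mean(axis=0), diffs.std(axis=0)
    keep = np.all(np.abs(diffs - mean) <= k_sigma * std + 1e-9, axis=1)
    return diffs[keep].mean(axis=0)           # (DX, DY); its sign gives the direction


# Converting the pixel displacement into physical units and into a velocity.
# 'scale' (metres per pixel) would come from a prior calibration; 'dt' is the
# interval between captures. Both values here are hypothetical.
scale, dt = 0.0005, 0.01                      # 0.5 mm per pixel, captures every 10 ms
dx_px, dy_px = 12.0, -3.0                     # example output of estimate_displacement()
vx, vy = dx_px * scale / dt, dy_px * scale / dt
print(f"Velocity: vx = {vx:.3f} m/s, vy = {vy:.3f} m/s")
```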
  • The accuracy of the system depends on the resolution of the captured images, the frequency of the captures and the margin defined as permissible for the uncertainty of the statistical analysis.
  • The fundamental limitation of this measurement method relates to the capture time and the mathematical processing time. With current commercial hardware, it can be used to measure the displacement of systems that move no faster than 2 m/s and in which changes in visible light do not occur more frequently than 100 Hz.
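
As a rough consistency check of these limits, the displacement between two consecutive captures must remain a small fraction of the camera's field of view, so that the characteristic regions of one image are still visible in the next. Only the 2 m/s and 100 Hz figures below come from the description; the field-of-view value is hypothetical.

```python
# Rough check of the operating limits mentioned above.
max_speed = 2.0          # m/s, maximum speed quoted in the description
capture_rate = 100.0     # Hz, maximum frequency of light changes / captures
field_of_view = 0.5      # m, hypothetical width of the floor area seen by the camera

step = max_speed / capture_rate               # displacement between two captures
overlap = 1.0 - step / field_of_view          # fraction of the image shared by both
print(f"Displacement per capture: {step * 1000:.0f} mm")   # 20 mm
print(f"Image overlap between captures: {overlap:.0%}")    # 96%
```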
  • Figure 3 shows, in a schematic way, the connection of the camera 6 with the data processing means 12 responsible for processing the images in order to calculate the positioning of the bridge crane. In the case of Figure 1, the data processing means 12 (e.g. a processor or a mini-computer) are integrated into the running trolley 4, in such a way that the camera 6 sends the image data over the cable 13. However, the data processing means could be in another location, for example in a control station located in the actual hangar or factory, receiving information from a wireless camera.
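
To illustrate how the camera 6 and the data processing means 12 could be connected in practice, the following minimal acquisition loop reads successive frames and accumulates the estimated displacement. It is an assumption rather than the actual implementation: it reuses the two helper functions sketched above, OpenCV's generic VideoCapture interface, and hypothetical values for the capture interval and the calibration factor.

```python
import time

import cv2


def track_crane(camera_index=0, dt=0.01, scale=0.0005):
    """Continuously accumulate the (X, Y) displacement estimated from the camera.

    'dt' is the capture interval in seconds and 'scale' the metres-per-pixel
    calibration factor; both are hypothetical values.
    """
    cap = cv2.VideoCapture(camera_index)      # camera 6 (wired or wireless backend)
    x, y = 0.0, 0.0                           # accumulated displacement in metres

    ok, frame = cap.read()
    prev = extract_characteristic_points(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    while ok:
        time.sleep(dt)                        # controlled interval between captures
        ok, frame = cap.read()
        if not ok:
            break
        curr = extract_characteristic_points(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
        dx_px, dy_px = estimate_displacement(prev, curr)
        x += dx_px * scale                    # convert pixels to physical units
        y += dy_px * scale
        print(f"Running trolley position: X = {x:.3f} m, Y = {y:.3f} m")
        prev = curr

    cap.release()
```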

Claims (8)

  1. Control system for positioning bridge cranes, characterised in that it comprises:
    image capturing means (6) integrated into the bridge crane and focusing on a surface of reference, and
    data processing means (12), responsible for receiving and analysing the images acquired by the image capturing means (6) in order to:
    recognise, using a characteristic extraction process, relevant characteristics in the image, and
    determine the relative displacement of said relevant characteristics between successive images, and therefore, the positioning of the bridge crane.
  2. System according to claim 1, wherein the image capturing means (6) is a camera integrated into the running trolley (4) of the bridge crane.
  3. System according to claim 2, wherein the camera (6) has a zenithal orientation.
  4. System according to any one of the previous claims, wherein the data processing means (12) are configured to determine regions in the image and to calculate the central, spatial and invariant moments thereof in order to obtain a set of characteristic points Pi as relevant characteristics of the image.
  5. Procedure for controlling the positioning of bridge cranes, characterised in that it comprises:
    capturing images using a camera (6) integrated into the bridge crane and focusing on a surface of reference, and
    analysing the images captured in order to:
    recognise, using a characteristic extraction process, relevant characteristics in the image, and
    determine the relative displacement of said relevant characteristics between successive images, and therefore, the positioning of the bridge crane.
  6. Procedure according to claim 5, wherein the camera (6) is integrated into the running trolley (4) of the bridge crane.
  7. Procedure according to claim 6, wherein the camera (6) has a zenithal orientation.
  8. Procedure according to any one of claims 5 to 7, which comprises determining regions in the image and calculating the central, spatial and invariant moments thereof to obtain a set of characteristic points Pi as relevant characteristics of the image.
EP13382156.1A 2013-04-25 2013-04-25 System and control procedure for the positioning of bridge cranes Withdrawn EP2796402A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP13382156.1A EP2796402A1 (en) 2013-04-25 2013-04-25 System and control procedure for the positioning of bridge cranes

Publications (1)

Publication Number Publication Date
EP2796402A1 2014-10-29

Family

ID=48700503

Country Status (1)

Country Link
EP (1) EP2796402A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1997018153A1 (en) * 1995-11-14 1997-05-22 Sime Oy Method and device to pick up, transport and put down a load
US20090067673A1 (en) * 2007-09-12 2009-03-12 Hilmar Hofmann Method and apparatus for determining the position of a vehicle, computer program and computer program product
EP2305594A1 (en) * 2008-07-23 2011-04-06 Daifuku Co., Ltd. Learning device and learning method in article conveyance facility

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017091972A1 (en) * 2015-12-01 2017-06-08 Hong Kong R&D Centre for Logistics and Supply Chain Management Enabling Technologies Limited A safety system for a machine

Legal Events

Date Code Title Description
AK Designated contracting states:

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Extension of the European patent to

Countries concerned: BA ME

17P Request for examination filed

Effective date: 20130425

18D Deemed to be withdrawn

Effective date: 20150430