EP3418244B1 - Loading a container on a landing target - Google Patents

Loading a container on a landing target

Info

Publication number
EP3418244B1
Authority
EP
European Patent Office
Prior art keywords
landing target
container
control system
spreader
container crane
Prior art date
Legal status
Active
Application number
EP17176830.2A
Other languages
German (de)
French (fr)
Other versions
EP3418244A1 (en)
Inventor
Björn Holmberg
Current Assignee
ABB Schweiz AG
Original Assignee
ABB Schweiz AG
Priority date
Filing date
Publication date
Application filed by ABB Schweiz AG
Priority to ES17176830T (ES2865179T3)
Priority to EP17176830.2A (EP3418244B1)
Publication of EP3418244A1
Application granted
Publication of EP3418244B1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B66 HOISTING; LIFTING; HAULING
    • B66C CRANES; LOAD-ENGAGING ELEMENTS OR DEVICES FOR CRANES, CAPSTANS, WINCHES, OR TACKLES
    • B66C1/00 Load-engaging elements or devices attached to lifting or lowering gear of cranes or adapted for connection therewith for transmitting lifting forces to articles or groups of articles
    • B66C1/10 Load-engaging elements or devices attached to lifting or lowering gear of cranes or adapted for connection therewith for transmitting lifting forces to articles or groups of articles by mechanical means
    • B66C1/101 Load-engaging elements or devices attached to lifting or lowering gear of cranes or adapted for connection therewith for transmitting lifting forces to articles or groups of articles by mechanical means for containers
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B66 HOISTING; LIFTING; HAULING
    • B66C CRANES; LOAD-ENGAGING ELEMENTS OR DEVICES FOR CRANES, CAPSTANS, WINCHES, OR TACKLES
    • B66C19/00 Cranes comprising trolleys or crabs running on fixed or movable bridges or gantries
    • B66C19/002 Container cranes


Description

    TECHNICAL FIELD
  • The invention relates to a method, a container crane control system, a computer program and a computer program product for loading a container on a landing target.
  • BACKGROUND
  • Container cranes are used to handle freight containers and to transfer containers between transport modes at container terminals, freight harbours and the like. Standard shipping containers are used to transport a great and growing volume of freight around the world. Trans-shipment is a critical function in freight handling. Trans-shipment may occur at each point of transfer, and there is usually a tremendous number of containers that must be unloaded, transferred to a temporary stack, and later loaded onto another ship, back onto the same ship, or onto another form of transport such as a road vehicle or train.
  • Traditionally, container cranes have been controlled from an operator cabin mounted on the container crane. Recently, however, container cranes have become remote controlled and even fully automated. This reduces or eliminates the exposure of crane operators to inconvenience, danger and even injury.
  • WO 2015/022001 shows the preamble of claim 1 and discloses a method for automatically landing a container on a landing target using a container crane. The container crane comprises a trolley and spreader for holding and lifting the container and a crane control system for controlling movements of said container crane. A distance from the container to the landing target is measured and the container is moved towards the landing target dependent on the measured distance. A plurality of images of the landing target are captured using at least one camera mounted on the spreader. The images are processed to identify one or more landing features in the images of the landing target. Distances from the container to the landing target are calculated based on a measurement of distance between the container and the landing features in the images.
  • US 2015/070387 discloses structural modeling using depth sensors. Any improvement in how the landing target is identified is of great value.
  • SUMMARY
  • It is an object to improve the identification of a landing target for loading a container.
  • According to a first aspect, it is provided a method for loading a container on a landing target on a land vehicle using a container crane comprising a trolley and a spreader for holding and lifting the container. The method is performed in a container crane control system and comprises the steps of:
    • obtaining two-dimensional images of the landing target from a first pair of cameras arranged on the spreader; performing feature extraction based on the two-dimensional images to identify key features of the landing target;
    • generating a point cloud based on the feature extraction, wherein each point in the point cloud contains co-ordinates in three dimensions; and controlling movement of the container to the landing target based on the point cloud and the identified key features of the landing target.
  • The key features may comprise corners of the landing target. The key features may comprise twistlocks of the landing target. The key features may comprise any one or more of a gooseneck of a chassis, a guide structure, a girder, and a beam.
  • The landing target may be situated higher than surrounding surfaces.
  • The step of obtaining two-dimensional images also comprises obtaining two-dimensional images of the landing target from a second pair of cameras arranged on the spreader. In such a case, the step of performing feature extraction is based also on a two dimensional image from at least one camera of the second pair.
  • The first pair of cameras and the second pair of cameras may be arranged along the same side of the spreader.
  • The method may further comprise the step of: detecting orientation of the landing target based on lines in the two dimensional images. In such a case, the step of controlling movement is also based on the orientation of the landing target.
  • The step of performing feature extraction may be based on scale invariant feature transform, SIFT.
  • The step of generating a point cloud may also be based on stereo image matching based on the two-dimensional images.
  • The method may further comprise the step of: obtaining additional depth data from a depth detection device. In such a case, the step of generating a point cloud is also based on the additional depth data.
  • According to a second aspect, it is provided a container crane control system for loading a container on a landing target on a land vehicle using a container crane comprising a trolley and a spreader for holding and lifting the container. The container crane control system comprises: a processor; and a memory storing instructions that, when executed by the processor, cause the container crane control system to: obtain two-dimensional images of the landing target from a first pair of cameras arranged on the spreader; perform feature extraction based on the two-dimensional images to identify key features of the landing target; generate a point cloud based on the feature extraction, wherein each point in the point cloud contains co-ordinates in three dimensions; and control movement of the container to the landing target based on the point cloud and the identified key features of the landing target.
  • The landing target may be situated higher than surrounding surfaces.
  • The instructions to obtain two-dimensional images may comprise instructions that, when executed by the processor, cause the container crane control system to obtain two-dimensional images of the landing target from a second pair of cameras arranged on the spreader, and wherein the instructions to perform feature extraction comprise instructions that, when executed by the processor, cause the container crane control system to perform the feature extraction also based on a two-dimensional image from at least one camera of the second pair.
  • The first pair of cameras and the second pair of cameras may be arranged along the same side of the spreader.
  • According to a third aspect, it is provided a computer program for loading a container on a landing target on a land vehicle using a container crane comprising a trolley and a spreader for holding and lifting the container. The computer program comprises computer program code which, when run on a container crane control system causes the container crane control system to: obtain two-dimensional images of the landing target from a first pair of cameras arranged on the spreader; perform feature extraction based on the two-dimensional images to identify key features of the landing target; generate a point cloud based on the feature extraction, wherein each point in the point cloud contains co-ordinates in three dimensions; and control movement of the container to the landing target based on the point cloud and the identified key features of the landing target.
  • According to a fourth aspect, it is provided a computer program product comprising a computer program according to the third aspect and a computer readable means on which the computer program is stored.
  • Generally, all terms used in the claims are to be interpreted according to their ordinary meaning in the technical field, unless explicitly defined otherwise herein. All references to "a/an/the element, apparatus, component, means, step, etc." are to be interpreted openly as referring to at least one instance of the element, apparatus, component, means, step, etc., unless explicitly stated otherwise. The steps of any method disclosed herein do not have to be performed in the exact order disclosed, unless explicitly stated.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The invention is now described, by way of example, with reference to the accompanying drawings, in which:
    • Fig 1 is a schematic diagram illustrating a container crane environment in which embodiments presented herein can be applied;
    • Fig 2 is a schematic diagram illustrating the landing target and land vehicle of Fig 1 in more detail according to one embodiment;
    • Fig 3 is a schematic diagram illustrating a view from above of the landing target of Fig 2 according to one embodiment;
    • Fig 4 is a schematic diagram illustrating a view from below of the spreader of Fig 1 according to one embodiment;
    • Fig 5 is a schematic diagram illustrating lines of the landing target of Fig 1 in more detail according to one embodiment;
    • Fig 6 is a flow chart illustrating a method for loading a container on a landing target on a land vehicle using a container crane comprising a trolley and a spreader for holding and lifting the container;
    • Fig 7 is a schematic diagram illustrating components of the container crane control system of Fig 1; and
    • Fig 8 shows one example of a computer program product comprising computer readable means.
    DETAILED DESCRIPTION
  • The invention will now be described more fully hereinafter with reference to the accompanying drawings, in which certain embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided by way of example so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like numbers refer to like elements throughout the description.
  • Embodiments presented herein are based on identifying key features of a landing target from several two-dimensional images using stereo matching and feature extraction. Alternatively, any other type of 3D mapping sensor could be used, such as Lidar, time-of-flight cameras, etc. Key features can e.g. be corners of the landing target, twistlocks, etc. A point cloud with points in three dimensions is also generated to describe the landing target and the environment around the landing target, to improve the ability to identify the landing target. By basing the landing target identification on three dimensions, a more reliable identification is achieved, allowing operation to be continued also in conditions with poor visibility.
  • Fig 1 is a schematic diagram illustrating a container crane environment in which embodiments presented herein can be applied. The view of Fig 1 is along an x-y plane in a Cartesian coordinate system.
  • A container crane 51 uses a number of powerful electric motors mounted on a spreader 55 and on a trolley 53 to power moving parts and to retract or extend cables to raise or lower the spreader 55. The spreader 55 can hold a load 21 in the form of a container. Electric motors are also used to power the movements of the trolley 53 holding the spreader 55, to lift and transport containers out of the ship and onto a land vehicle 23, a stack, etc. The container crane 51 can be used for loading containers on a ship and/or for unloading containers from a ship to land or onto a landing target 59 on a land vehicle 23, e.g. a truck chassis or a train carriage chassis. Moving containers from one position on land to another landing target is also possible.
  • The width of shipping containers is standardised at 8 ft. (2.438 m), but the height varies, typically between 8 ft. (2.438 m) and 9.5 ft. (2.896 m). The most common standard lengths are 20 ft. (6.096 m) and 40 ft. (12.192 m). The 40 ft. (12.192 m) container is very common today, and even longer containers up to 53 ft. (16.154 m) long are also in use. International standard dimensions are based on a number of ISO recommendations made between 1968 and 1970, in particular recommendation R1161 from January 1970, which concerns the dimensions of corner fittings for standard containers. The distances between corner fittings on standard shipping containers are standardised in accordance with the ISO recommendations. The corner fittings, also known as corner castings, include standard openings so that a container may be picked up by inserting a hook of the spreader 55 into each of the four corner fittings at the top of the container 21. The size and shape of the oval-shaped openings are defined in another standard, ISO 1161 from 1984. The same type of corner fittings, e.g. those on the bottom of a container, may be used to lock a container in place in a position (e.g. in a hold or on deck) on board a ship, on a wagon or on a chassis.
  • The spreader 55 is thus used to grip the container 21, e.g. using twistlocks that engage with the standard-sized openings in the corner fittings on the container, and to lift it, lower it and release it. In this description, the term spreader 55 is used to denote the part of a lifting device that is in direct contact with a container 21. Spreaders 55 are normally designed to handle more than one size of container, typically 20-40 ft. (6.096 - 12.192 m) or 20-40-45 ft. (6.096 - 12.192 - 13.716 m) long containers. Some spreaders 55 can at any one time lift and handle a single 40 ft. (12.192 m) or 45 ft. (13.716 m) container, or two 20 ft. (6.096 m) containers. Some spreaders 55 are adjustable in use, so that the same spreader 55 can be used to pick up one 20 ft. (6.096 m) container or two 20 ft. (6.096 m) containers at a time by adjusting the length of the spreader.
  • The container crane 51 can thus be used to lift a container 21 up from a ship and land it on a landing target 59, or vice versa. Alternatively, the container crane 51 can be used to transfer the container 21 between the ship and ground or a container stack or any other suitable container movement.
  • A container crane control system 1 is used to control the operation of the crane 51. In order to enable autonomous control of the crane 51, the container crane control system 1 comprises several cameras (shown in more detail in Fig 4 and explained below) and/or other depth mapping devices such as Lidar, time-of-flight cameras, etc., and a control device 15. Optionally, the container crane can be manually controlled, e.g. by an operator 5 using an operator terminal 12 in an office 7.
  • The control device 15 is any suitable control device capable of performing logic operations and can comprise any combination of a central processing unit (CPU), graphics processing unit (GPU), a microcontroller unit (MCU), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), and discrete logic circuitry, optionally combined with persistent memory (e.g. read only memory, ROM).
  • Fig 2 is a schematic diagram illustrating the landing target 59 and land vehicle 23 of Fig 1 in more detail. The landing target 59 is a top surface or top structure of the land vehicle 23, comprising twistlocks for securing a container. The land vehicle 23 can e.g. be a truck chassis or a train carriage chassis. The landing target 59 is here higher than surrounding surfaces 20 (i.e. ground) at a certain height 25 from the surrounding surfaces 20.
  • Fig 3 is a schematic diagram illustrating a view from above of the landing target 59 of Fig 2. The view is along a z-x plane in the same coordinate system as for Fig 1. Hence, the view of Fig 3 is from above whereas the view in Fig 1 is from the front (or back). The four corners 31a-d of the landing target 59 can be seen here. Moreover, the landing target 59 comprises four twistlocks 30a-d, respectively provided at the four corners 31a-d. The twistlocks are used to secure the container on the landing target 59. Alternatively, the landing target can have fewer than four twistlocks and instead have other structures used to hold the container in position on the landing target.
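  • Because the corner spacing is standardised, a simple plausibility check can be applied to four detected corner candidates: they should form a rectangle whose side lengths match a standard container footprint. The following Python sketch is purely illustrative and not part of the claimed method; the footprint values are taken from the dimensions quoted above, and the tolerance and the assumption that the corner candidates are given in metres in a horizontal plane are assumptions.

```python
# Illustrative sketch only (not from the patent): check whether four detected
# corner candidates form a rectangle matching a standard container footprint.
# Tolerance and coordinate convention are assumptions.
import itertools
import numpy as np

FOOTPRINTS_M = {           # nominal width x length, from the dimensions above
    "20ft": (2.438, 6.096),
    "40ft": (2.438, 12.192),
}

def matches_footprint(corners_xz, tolerance_m=0.15):
    """corners_xz: four (x, z) positions in metres, in any order."""
    corners = np.asarray(corners_xz, dtype=float)
    if corners.shape != (4, 2):
        return None
    # Sorted pairwise distances of a rectangle: 2 widths, 2 lengths, 2 diagonals.
    dists = sorted(np.linalg.norm(a - b)
                   for a, b in itertools.combinations(corners, 2))
    widths, lengths, diagonals = dists[:2], dists[2:4], dists[4:]
    for name, (w, l) in FOOTPRINTS_M.items():
        diag = np.hypot(w, l)
        if (all(abs(d - w) < tolerance_m for d in widths)
                and all(abs(d - l) < tolerance_m for d in lengths)
                and all(abs(d - diag) < tolerance_m for d in diagonals)):
            return name
    return None
```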
  • Fig 4 is a schematic diagram illustrating a view from below of the spreader 55 of Fig 1 according to one embodiment. Here, eight cameras 32a-h are provided for identification of the landing target, etc. There could be fewer or more cameras, as long as there are at least two cameras. Alternatively, other depth mapping devices such as Lidar, time-of-flight cameras, etc. could be added.
  • The cameras are provided in pairs. Specifically, a first pair 35a comprises the first camera 32a and the second camera 32b. A second pair 35b comprises the third camera 32c and the fourth camera 32d. A third pair 35c comprises the fifth camera 32e and the sixth camera 32f. A fourth pair 35d comprises the seventh camera 32g and the eighth camera 32h.
  • The term end side is to be interpreted as one of the shorter sides of the spreader. The first pair 35a of cameras and the second pair 35b of cameras are provided on one end side of the spreader 55, while the third pair 35c of cameras and the fourth pair 35d of cameras are provided on the other end side of the spreader 55.
  • Fig 5 is a schematic diagram illustrating lines of the landing target 59 of Fig 1 in more detail according to one embodiment. The landing target 59 can comprise lines, including horizontal lines 37 and/or vertical lines 38. The lines can form part of the structure of the landing target 59 or the underlying chassis, and are predominantly provided along (horizontally in Fig 5) or across (vertically in Fig 5) the landing target 59. By analysing the lines in images captured by the cameras and in the depth map of the landing target, the container crane control system can accurately adjust the skew alignment of the container when it is placed on the landing target 59.
  • Fig 6 is a flow chart illustrating a method for loading a container on a landing target on a land vehicle using a container crane comprising a trolley and a spreader for holding and lifting the container. The method is performed in the container crane control system for automated control of the container crane.
  • In an obtain 2D images step 40, the container crane control system obtains two-dimensional images of the landing target from a first pair (e.g. any of the pairs 35a-d of Fig 4) of cameras arranged on the spreader.
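  • As an illustration only, an image pair could be obtained from the two cameras of a pair with OpenCV as sketched below; the device indices and the conversion to greyscale are assumptions for the example, not requirements of the method.

```python
# Illustrative sketch: grab one frame from each camera of a pair with OpenCV.
# Device indices are placeholders.
import cv2

def grab_stereo_pair(left_index=0, right_index=1):
    caps = [cv2.VideoCapture(left_index), cv2.VideoCapture(right_index)]
    try:
        # grab() on both devices first to keep the exposures close in time,
        # then retrieve() the buffered images.
        for cap in caps:
            if not cap.grab():
                raise RuntimeError("camera not available")
        frames = []
        for cap in caps:
            ok, frame = cap.retrieve()
            if not ok:
                raise RuntimeError("failed to retrieve frame")
            frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
        return frames  # [left_gray, right_gray]
    finally:
        for cap in caps:
            cap.release()
```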
  • Optionally, this comprises obtaining two-dimensional images of the landing target from a second pair of cameras arranged on the spreader. The first pair of cameras and the second pair of cameras can be arranged along the same end side of the spreader. The term end side is to be interpreted as one of the shorter sides of the spreader.
  • In a feature extraction step 42, the container crane control system performs feature extraction based on the two-dimensional images to identify key features of the landing target. The key features can be corners of the landing target. Alternatively or additionally, the key features can comprise twistlocks of the landing target. Optionally, other key features are identified, e.g. any one or more of a gooseneck of a chassis, guide structures, girders, beams, etc. Extracting more key features increases the reliability of the subsequent loading of the container on the landing target.
  • As shown in Fig 2, the landing target can be situated higher than surrounding surfaces, which simplifies the identification of the landing target in the feature extraction, since the extraction is based on at least two cameras and thus yields a depth dimension.
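  • One way to exploit this height difference, shown below purely as an illustrative sketch and not as the claimed procedure, is to segment out the points of the point cloud that lie clearly above an estimated ground level; the axis convention (y as height above ground) and the threshold are assumptions.

```python
# Illustrative sketch: keep only points well above an estimated ground level.
# Axis convention (y = height) and threshold are assumptions.
import numpy as np

def segment_landing_target(points_xyz, min_height_m=0.8):
    """points_xyz: (N, 3) array of points in metres."""
    points = np.asarray(points_xyz, dtype=float)
    ground_level = np.percentile(points[:, 1], 10)   # robust ground estimate
    return points[points[:, 1] > ground_level + min_height_m]
```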
  • When a two dimensional image from at least one camera of the second pair is available, this is also used in the feature extraction. The use of additional cameras improves the ability to identify the depth dimension.
  • The feature extraction is based on any suitable algorithm, e.g. scale invariant feature transform (SIFT) or similar.
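  • As an example of such an algorithm, the sketch below applies OpenCV's SIFT implementation to an image pair from one camera pair and keeps unambiguous matches between the two images; the file names are placeholders and the ratio-test threshold is a conventional choice rather than a value from the patent.

```python
# Illustrative sketch: SIFT keypoints and matching between the two cameras
# of one pair. File names are placeholders.
import cv2

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp_left, desc_left = sift.detectAndCompute(left, None)
kp_right, desc_right = sift.detectAndCompute(right, None)

# Match descriptors and keep the unambiguous matches (Lowe's ratio test).
matcher = cv2.BFMatcher(cv2.NORM_L2)
knn = matcher.knnMatch(desc_left, desc_right, k=2)
good = [pair[0] for pair in knn
        if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance]
```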
  • In an optional obtain additional depth data step 43, additional depth data from a depth detection device is obtained. The depth detection device can e.g. be in the form of Lidar, time of flight cameras, etc.
  • In a generate point cloud step 44, the container crane control system generates a point cloud based on the feature extraction, wherein each point in the point cloud contains co-ordinates in three dimensions. Each point may also contain light values in one or more colours, e.g. RGB (Red, Green and Blue) light values.
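  • A minimal sketch of one way to obtain such points is given below: matched keypoints (as produced by the SIFT sketch above) are triangulated with OpenCV, assuming calibrated cameras with known 3x4 projection matrices, and an RGB value sampled from the left colour image is attached to each point. The function signature and the calibration inputs are assumptions for illustration.

```python
# Illustrative sketch: triangulate matched keypoints into 3D points with
# attached RGB values. P_left/P_right are assumed 3x4 projection matrices
# from stereo calibration; kp_*/matches come from the SIFT sketch above.
import cv2
import numpy as np

def triangulate_matches(kp_left, kp_right, matches, P_left, P_right, left_bgr):
    pts_left = np.float32([kp_left[m.queryIdx].pt for m in matches]).T    # 2xN
    pts_right = np.float32([kp_right[m.trainIdx].pt for m in matches]).T  # 2xN

    points_4d = cv2.triangulatePoints(P_left, P_right, pts_left, pts_right)
    points_3d = (points_4d[:3] / points_4d[3]).T                          # Nx3

    # Sample one RGB value per point from the left colour image (BGR -> RGB).
    colours = np.array([left_bgr[int(round(y)), int(round(x))][::-1]
                        for x, y in pts_left.T])
    return np.hstack([points_3d, colours])   # N x 6: x, y, z, r, g, b
```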
  • The point cloud can also be derived using stereo image matching based on the two-dimensional images. This results in more dense depth maps than if only the feature extraction is used. The stereo image matching can e.g. be based on block matching.
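  • The sketch below shows one possible realisation of the dense stereo matching with OpenCV's block matcher; the matcher parameters and the 4x4 disparity-to-depth matrix Q (obtained from stereo rectification) are assumptions.

```python
# Illustrative sketch: dense block matching and reprojection to a 3D point map.
# Q is the 4x4 disparity-to-depth matrix from stereo rectification (assumed).
import cv2
import numpy as np

def dense_stereo_cloud(left_gray, right_gray, Q):
    stereo = cv2.StereoBM_create(numDisparities=128, blockSize=15)
    disparity = stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0
    points = cv2.reprojectImageTo3D(disparity, Q)   # H x W x 3 point map
    valid = disparity > 0                           # drop unmatched pixels
    return points[valid]                            # N x 3 point cloud
```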
  • Optionally, the additional depth data received in step 43 is fused with the camera images to yield even more reliable point cloud data. Hence, the point cloud can also be based on the additional depth data, to thereby obtain an even more extensive and accurate point cloud.
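  • A minimal fusion sketch is given below; it simply transforms the Lidar points into the camera frame with an assumed, pre-calibrated extrinsic transform and concatenates the two clouds, which is only one of many possible fusion strategies.

```python
# Illustrative sketch: concatenate a Lidar scan with the stereo-derived cloud.
# T_lidar_to_camera is an assumed, pre-calibrated 4x4 extrinsic transform.
import numpy as np

def fuse_clouds(stereo_xyz, lidar_xyz, T_lidar_to_camera):
    lidar_h = np.hstack([lidar_xyz, np.ones((len(lidar_xyz), 1))])  # homogeneous
    lidar_in_cam = (T_lidar_to_camera @ lidar_h.T).T[:, :3]
    return np.vstack([stereo_xyz, lidar_in_cam])
```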
  • In a detect orientation step 45, the container crane control system detects orientation of the landing target based on lines in the two dimensional images, see Fig 5 and corresponding text above. This improves skew control when controlling the container in relation to the landing target.
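  • One straightforward way to estimate such an orientation, sketched below for illustration only, is to run a Hough line transform on an edge image and take the median angle of the near-horizontal lines; the edge and line-detection thresholds are assumptions.

```python
# Illustrative sketch: estimate skew from the dominant near-horizontal lines.
# Thresholds are assumptions.
import cv2
import numpy as np

def estimate_skew_deg(gray_image):
    edges = cv2.Canny(gray_image, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                            minLineLength=100, maxLineGap=10)
    if lines is None:
        return 0.0
    angles = []
    for x1, y1, x2, y2 in lines[:, 0]:
        angle = np.degrees(np.arctan2(y2 - y1, x2 - x1))
        if abs(angle) < 30:            # keep lines running along the target
            angles.append(angle)
    return float(np.median(angles)) if angles else 0.0
```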
  • In a control movement step 46, the container crane control system controls movement of the container to the landing target based on the point cloud and the identified key features of the landing target. When available, movement is controlled also based on the orientation of the landing target.
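  • By way of illustration only, a movement command could be derived from the offset between the spreader and the identified landing-target centre with a simple proportional law, as in the sketch below; the gains, limits and output signal names are invented for the example and do not reflect an actual crane interface.

```python
# Illustrative sketch: proportional set-points from the target offset.
# Gains, limits and signal names are made up for the example.
import numpy as np

def movement_command(target_centre_xyz, skew_deg,
                     k_xy=0.4, k_skew=0.2, max_speed=0.5):
    """target_centre_xyz: landing-target centre in spreader coordinates (m)."""
    dx, _dy, dz = target_centre_xyz   # vertical motion (hoisting) omitted here
    return {
        "trolley_speed": float(np.clip(k_xy * dx, -max_speed, max_speed)),
        "gantry_speed":  float(np.clip(k_xy * dz, -max_speed, max_speed)),
        "skew_rate":     float(np.clip(k_skew * skew_deg, -max_speed, max_speed)),
        # Lower the container only once the horizontal offsets are small.
        "lower": bool(abs(dx) < 0.05 and abs(dz) < 0.05),
    }
```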
  • The method loops to provide continued feedback of position in relation to the landing target and appropriate movement control.
  • By using three-dimensional data in the feature extraction and the point cloud, a more reliable identification of the landing target is achieved. Moreover, due to the height difference (25 of Fig 2) between the landing target and the ground, the three-dimensional data greatly increases the reliability of the identification. This allows the container crane control system to continue operation also in conditions of limited visibility, such as rain, snow, lightning, etc.
  • A corresponding method can be applied for picking up a container, where instead of a landing target, the key features of a container to be picked up are identified.
  • Fig 7 is a schematic diagram illustrating components of the container crane control system 1 of Fig 1. A processor 60 is provided using any combination of one or more of a suitable central processing unit (CPU), graphics processing unit (GPU), multiprocessor, microcontroller, digital signal processor (DSP), application specific integrated circuit (ASIC), etc., capable of executing software instructions 67 stored in a memory 64, which can thus be a computer program product. The processor 60 can be configured to execute the method described with reference to Fig 6 above.
  • The memory 64 can be any combination of random access memory (RAM) and read only memory (ROM). The memory 64 also comprises persistent storage, which, for example, can be any single one or combination of magnetic memory, optical memory, solid-state memory or even remotely mounted memory.
  • A data memory 66 is also provided for reading and/or storing data during execution of software instructions in the processor 60. The data memory 66 can be any combination of random access memory (RAM) and read only memory (ROM).
  • The container crane control system 1 further comprises an I/O interface 62 for communicating with other external entities. Optionally, the I/O interface 62 also includes a user interface.
  • Other components of the container crane control system 1 are omitted in order not to obscure the concepts presented herein.
  • Fig 8 shows one example of a computer program product comprising computer readable means. On this computer readable means, a computer program 91 can be stored, which computer program can cause a processor to execute a method according to embodiments described herein. In this example, the computer program product is an optical disc, such as a CD (compact disc) or a DVD (digital versatile disc) or a Blu-Ray disc. As explained above, the computer program product could also be embodied in a memory of a device, such as the computer program product 67 of Fig 7. While the computer program 91 is here schematically shown as a track on the depicted optical disk, the computer program can be stored in any way which is suitable for the computer program product, such as a removable solid state memory, e.g. a Universal Serial Bus (USB) drive.
  • The invention has mainly been described above with reference to a few embodiments. However, as is readily appreciated by a person skilled in the art, other embodiments than the ones disclosed above are equally possible within the scope of the invention, as defined by the appended patent claims.

Claims (15)

  1. A method for loading a container on a landing target (59) on a land vehicle using a container crane comprising a trolley (53) and a spreader (55) for holding and lifting the container, the method being performed in a container crane control system (1) and comprising the step of:
    obtaining (40) two-dimensional images of the landing target (59) from a first pair of cameras arranged on the spreader (55), characterised in that the method further comprises the steps of:
    performing (42) feature extraction based on the two-dimensional images to identify key features of the landing target (59);
    generating (44) a point cloud based on the feature extraction, wherein each point in the point cloud contains co-ordinates in three dimensions; and
    controlling (46) movement of the container to the landing target (59) based on the point cloud and the identified key features of the landing target (59).
  2. The method according to claim 1, wherein the key features comprise corners (31a-d) of the landing target (59).
  3. The method according to claim 1 or 2, wherein the landing target (59) is situated higher than surrounding surfaces.
  4. The method according to any one of claims 1 to 3, wherein the step of obtaining (40) two-dimensional images also comprises obtaining two-dimensional images of the landing target (59) from a second pair of cameras arranged on the spreader (55), and wherein the step of performing (42) feature extraction is based also on a two dimensional image from at least one camera of the second pair.
  5. The method according to claim 4, wherein the first pair of cameras and the second pair of cameras are arranged along the same side of the spreader (55).
  6. The method according to any one of the preceding claims, further comprising the step of:
    detecting (45) orientation of the landing target (59) based on lines (37, 38) in the two dimensional images;
    and wherein the step of controlling (46) movement is also based on the orientation of the landing target.
  7. The method according to any one of the preceding claims, wherein the step of performing (42) feature extraction is based on scale invariant feature transform, SIFT.
  8. The method according to any one of the preceding claims, wherein the step of generating (44) a point cloud is also based on stereo image matching based on the two-dimensional images.
  9. The method according to any one of the preceding claims, further comprising the step of:
    obtaining (43) additional depth data from a depth detection device;
    and wherein the step of generating (44) a point cloud is also based on the additional depth data.
  10. A container crane control system (1) for loading a container on a landing target (59) on a land vehicle using a container crane comprising a trolley (53) and a spreader (55) for holding and lifting the container, the container crane control system (1) comprising:
    a processor (60); and
    a memory (64) storing instructions (67) that, when executed by the processor, cause the container crane control system (1) to:
    obtain two-dimensional images of the landing target (59) from a first pair of cameras arranged on the spreader (55),
    characterised in that the instructions further cause the container crane control system (1) to:
    perform feature extraction based on the two-dimensional images to identify key features of the landing target (59);
    generate a point cloud based on the feature extraction, wherein each point in the point cloud contains co-ordinates in three dimensions; and
    control movement of the container to the landing target (59) based on the point cloud and the identified key features of the landing target (59).
  11. The container crane control system (1) according to claim 10, wherein the landing target (59) is situated higher than surrounding surfaces.
  12. The container crane control system (1) according to claim 10 or 11, wherein the instructions to obtain two-dimensional images comprise instructions (67) that, when executed by the processor, cause the container crane control system (1) to obtain two-dimensional images of the landing target (59) from a second pair of cameras arranged on the spreader (55), and wherein the instructions to perform feature extraction comprise instructions (67) that, when executed by the processor, cause the container crane control system (1) to perform the feature extraction also based on a two dimensional image from at least one camera of the second pair.
  13. The container crane control system (1) according to claim 12, wherein the first pair of cameras and the second pair of cameras are arranged along the same side of the spreader (55).
  14. A computer program (67, 91) for loading a container on a landing target (59) on a land vehicle using a container crane comprising a trolley (53) and a spreader (55) for holding and lifting the container, the computer program comprising computer program code which, when run on a container crane control system (1) causes the container crane control system (1) to:
    obtain two-dimensional images of the landing target (59) from a first pair of cameras arranged on the spreader (55),
    characterised in that the computer program code further causes the container crane control system (1) to:
    perform feature extraction based on the two-dimensional images to identify key features of the landing target (59);
    generate a point cloud based on the feature extraction, wherein each point in the point cloud contains co-ordinates in three dimensions; and
    control movement of the container to the landing target (59) based on the point cloud and the identified key features of the landing target (59).
  15. A computer program product (64, 90) comprising a computer program according to claim 14 and a computer readable means on which the computer program is stored.
EP17176830.2A 2017-06-20 2017-06-20 Loading a container on a landing target Active EP3418244B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
ES17176830T ES2865179T3 (en) 2017-06-20 2017-06-20 Load a container into a warehouse target
EP17176830.2A EP3418244B1 (en) 2017-06-20 2017-06-20 Loading a container on a landing target

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP17176830.2A EP3418244B1 (en) 2017-06-20 2017-06-20 Loading a container on a landing target

Publications (2)

Publication Number Publication Date
EP3418244A1 EP3418244A1 (en) 2018-12-26
EP3418244B1 true EP3418244B1 (en) 2021-03-03

Family

ID=59091425

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17176830.2A Active EP3418244B1 (en) 2017-06-20 2017-06-20 Loading a container on a landing target

Country Status (2)

Country Link
EP (1) EP3418244B1 (en)
ES (1) ES2865179T3 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109987514A (en) * 2019-05-04 2019-07-09 华电重工股份有限公司 Gantry crane trolley suspender finely tunes shift system
CN110197028B (en) * 2019-05-29 2022-10-14 南京市特种设备安全监督检验研究院 Crane walking track rail engagement degree detection method based on vector center deviation sensitivity
CN113184707B (en) * 2021-01-15 2023-06-02 福建电子口岸股份有限公司 Method and system for preventing lifting of collection card based on laser vision fusion and deep learning
CN114604756B (en) * 2022-01-24 2023-06-02 杭州大杰智能传动科技有限公司 Cloud information system and method for intelligent tower crane operation data

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002104771A (en) * 2000-07-25 2002-04-10 Inst Of Physical & Chemical Res Container position detector
JP2007031102A (en) * 2005-07-28 2007-02-08 Mitsubishi Heavy Ind Ltd Remote controller of crane device, crane device, and remote control system of crane device
KR101699672B1 (en) * 2013-08-12 2017-01-24 에이비비 테크놀로지 리미티드 Method and system for automatically landing containers on a landing target using a container crane
US9934611B2 (en) * 2013-09-11 2018-04-03 Qualcomm Incorporated Structural modeling using depth sensors

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
ES2865179T3 (en) 2021-10-15
EP3418244A1 (en) 2018-12-26

Similar Documents

Publication Publication Date Title
US20220009748A1 (en) Loading A Container On A Landing Target
EP3033293B1 (en) Method and system for automatically landing containers on a landing target using a container crane
US11780101B2 (en) Automated package registration systems, devices, and methods
EP3418244B1 (en) Loading a container on a landing target
EP3275831B1 (en) Modified video stream for supporting remote control of a container crane
KR102461759B1 (en) Intelligent Forklift and Container Position and Posture Deviation Detection Method
KR101968057B1 (en) Container crane control system
CN110902570A (en) Dynamic measurement method and system for container loading and unloading operation
CN111891927B (en) First floor container placement method and computer readable storage medium
KR102438943B1 (en) Container crane comprising reference marker
JP7021620B2 (en) Manipulators and mobile robots

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20190626

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20200602

GRAJ Information related to disapproval of communication of intention to grant by the applicant or resumption of examination proceedings by the epo deleted

Free format text: ORIGINAL CODE: EPIDOSDIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

INTC Intention to grant announced (deleted)
RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: ABB SCHWEIZ AG

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20201001

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

Ref country code: AT

Ref legal event code: REF

Ref document number: 1367006

Country of ref document: AT

Kind code of ref document: T

Effective date: 20210315

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602017033658

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: FI

Ref legal event code: FGE

REG Reference to a national code

Ref country code: NL

Ref legal event code: FP

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210603

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210303

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210303

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210604

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210603

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210303

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210303

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210303

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210303

REG Reference to a national code

Ref country code: ES

Ref legal event code: FG2A

Ref document number: 2865179

Country of ref document: ES

Kind code of ref document: T3

Effective date: 20211015

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210303

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210303

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210303

REG Reference to a national code

Ref country code: AT

Ref legal event code: UEP

Ref document number: 1367006

Country of ref document: AT

Kind code of ref document: T

Effective date: 20210303

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210705

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210303

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210303

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210703

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602017033658

Country of ref document: DE

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210303

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210303

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210303

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

26N No opposition filed

Effective date: 20211206

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210303

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210620

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210630

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210303

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210620

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210630

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210703

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210630

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210303

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20170620

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20230620

Year of fee payment: 7

Ref country code: DE

Payment date: 20230620

Year of fee payment: 7

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FI

Payment date: 20230621

Year of fee payment: 7

Ref country code: AT

Payment date: 20230621

Year of fee payment: 7

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: BE

Payment date: 20230619

Year of fee payment: 7

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20230622

Year of fee payment: 7

Ref country code: ES

Payment date: 20230830

Year of fee payment: 7