GB2267360A - Method and system for interacting with floating objects - Google Patents
Method and system for interacting with floating objects
- Publication number
- GB2267360A GB2267360B GB9211000A
- Authority
- GB
- United Kingdom
- Prior art keywords
- points
- point
- image
- crane
- floating object
- Prior art date
- Legal status: Granted (assumed; Google has not performed a legal analysis)
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B63—SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
- B63B—SHIPS OR OTHER WATERBORNE VESSELS; EQUIPMENT FOR SHIPPING
- B63B27/00—Arrangement of ship-based loading or unloading equipment for cargo or passengers
- B63B27/10—Arrangement of ship-based loading or unloading equipment for cargo or passengers of cranes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B66—HOISTING; LIFTING; HAULING
- B66C—CRANES; LOAD-ENGAGING ELEMENTS OR DEVICES FOR CRANES, CAPSTANS, WINCHES, OR TACKLES
- B66C13/00—Other constructional features or details
- B66C13/02—Devices for facilitating retrieval of floating objects, e.g. for recovering crafts from water
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Ocean & Marine Engineering (AREA)
- Control And Safety Of Cranes (AREA)
Abstract
A ship's quarters are provided with orthogonal arrays of four light sources 10 to permit a CCD camera (22, figure 2) on a crane to monitor the motion of the ship in six degrees of freedom in real time. As shown in figure 2, data is passed to an image analyser (30) which first acquires and then tracks the two dimensional co-ordinates of one of the three dimensional orthogonal arrays, passing its data to a processor (40) which interprets these co-ordinates as motions of the ship. The processor outputs a signal indicative of sea state and other data to enable the crane (56) to lift cargo from the ship with enhanced efficiency and safety. Servo control of the relative position of the ship and the crane can be automatically effected or an indication thereof can be given to a human operator of the crane.
Description
METHOD AND SYSTEM FOR INTERACTING WITH FLOATING OBJECTS
This invention is generally concerned with co-ordinating the interaction between a floating object at sea and a reference object.
The floating object may typically be a ship or barge or other vessel, although the invention will be seen to be applicable to other categories of floating object. The reference object may be freely floating, such as another vessel; or may be tethered, such as a moored platform or anchored ship; or may be quite stationary, such as a jetty, pier, or platform standing rigidly on the sea bed.
The invention is concerned particularly but not exclusively with controlling a structure mounted on one of the said objects to operate upon and/or hold station with respect to the other said object. The structure may be a crane, a pipe, a conveyor, or a drawbridge, for example.
The several aspects of the invention can usefully be illustrated by reference to a specific application.
In the North Sea oil and gas fields, production platforms stand on the sea bed and are supplied with all material other than personnel by surface vessel. The platforms have cranes for lifting cargo containers from the decks of the vessels. During such operations, the vessel is not moored to the platform, but attempts to hover on station by the platform while experiencing the full motion of the open sea.
At this time the vessel will be moving with six degrees of freedom, namely the linear motions of heave, sway and surge, and the rotational motions of yaw, pitch and roll. All these affect the unloading operation. The heave velocities in particular will influence the minimum velocity of lifting (so that the load is lifted faster than the vessel is rising), and will demand load accelerations which will in turn govern the maximum load that the crane can handle. One result is that the crane must be de-rated, that is to say, operated below its theoretical capacity.
Crane derating may be performed by the crane operator on the basis of his visual assessment of the sea condition. He may decide to derate the crane by as much as 75%, and will ideally estimate the heave cycle - which may typically have a period between five and twelve seconds - so that he commences lifting just after a peak.
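The arithmetic behind the operator's estimate is straightforward: for a roughly sinusoidal heave of amplitude A and period T, the peak vertical velocity is 2πA/T, which the lifting speed must exceed. A minimal sketch; the sinusoidal model and the example figures are illustrative assumptions, not taken from the patent:

```python
import math

def peak_heave_velocity(amplitude_m: float, period_s: float) -> float:
    """Peak vertical velocity of a sinusoidal heave:
    v_max = 2 * pi * A / T."""
    return 2.0 * math.pi * amplitude_m / period_s

# A 2 m heave amplitude on an 8 s cycle gives roughly 1.6 m/s,
# so the hook must be hoisted faster than that near a trough:
v = peak_heave_velocity(2.0, 8.0)
print(round(v, 2))  # 1.57
```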
Prior to lifting, the crane wire is held in constant tension using a hydraulic error signal from the crane winch servo.
In addition, the operator may receive sea state information by telemetry from a wave reader buoy located in the vicinity of the platform and provided with sensors and a radio transmitter.
On the basis of all the information available to him, the operator will use his judgement to unload the vessel efficiently and with sufficient margins of safety.
This invention is concerned with co-ordinating the interaction between the floating body, that is to say the vessel, and the reference body, that is to say the crane, in this instance, so that the efficiency and the safety of the unloading operation are both enhanced.
According to the invention, the motion of the floating body in all six degrees of freedom is measured directly in real time relative to the reference object and the measurements are used to generate a signal which in turn is used to control an interaction between the objects. In the case of the crane, the interaction may include the proper positioning of the boom and boom trolley, and the control of the winch, to achieve the rapid and safe lifting of a container from the moving deck of the vessel. Control may be by means of a closed system, in which a signal derived from the measurements is automatically applied to the interaction, or may be through the medium of a human operator interpreting such a signal, and using his discretion in applying the measurements to his control of the interaction.
The invention may be implemented in a number of different ways, as will become apparent from this description. In particular, it encompasses a method of controlling the interaction between a floating object and a reference object, and a system for controlling the interaction between the said objects, in both general and specific applications. Because the invention concerns the relative positioning between objects, the floating object and the reference object can in principle be interchanged.
The generated signal may be one that defines the instantaneous position and orientation of the floating object relative to the reference object, or may be one that defines the change in position and orientation with respect to the previous measurement. Clearly one can be converted to the other, as may be required by any specific application of the invention.
For convenience, the invention will be described henceforth in its specific application to the unloading of ships at sea.
The direct real time measurement of the motions of the ship can in principle be carried out in many ways, any of which will provide a fundamental improvement over the former techniques exemplified by the use of a remote wave reader buoy. Gyros and accelerometers could be installed on the ship directly. However, a system which minimises the equipment to be installed on the ship is particularly attractive, because many different ships will visit any given platform, and on cost and maintenance grounds there should be as little duplication of shipboard equipment as possible. A system that is primarily installed on the platform has clear advantages, including a reduced or even nil telemetry requirement.
A preferred system includes the definition of at least three visible non-aligned points on the floating object, ie on the ship, at known distances apart; and the provision on the reference object, ie on the platform and preferably on the crane, of area imaging means adapted to capture an image including the said points; an image analyser adapted to locate the said points in the captured image; a data processor for interpreting the locations of the points in the captured image in terms of the motions of the floating object; and output means for producing therefrom a signal indicative of the said motions.
Three points may produce ambiguous results, and preferably four points in an orthogonal configuration are used. It is an advantage to use more points than any theoretical minimum, so that there are in-built redundancies which allow the system to tolerate inaccuracies or imaging failures. On the other hand, certain ambiguities may be tolerable if in practice the risk of error is not significant.
Further, approximations may be made in transforming the actual measurements into orientation and position data, if, for example, calculation speeds can be increased without significant detriment to accuracy. Finally, although the measurements are sufficient to calculate all orientation and position coordinates, all this information may not be essential in the particular application.
For example, monitoring range and vertical movements may be sufficient for many crane operations.
One embodiment of the invention is illustrated, by way of example, in the accompanying drawings, in which:
Figure 1 is a diagram showing two arrays of object points on the stern of a ship to be unloaded; and
Figure 2 is a diagrammatic representation of a system for use in the unloading of a ship hovering off a platform.
Figure 1 indicates the rearmost portion of a ship 1. Each of the port and starboard quarters carries an array 10 of four mutually orthogonal points A, B, C, R, orientated to the axes of the ship.
That is to say, the respective pairs of points AR, BR and CR define mutually perpendicular lines which intersect at R. Point A is horizontally inboard of point R, point B is vertically below point
R, and point C is horizontally forward of point R.
The points are designated by bright light sources. Alternatively, the points could be fluorescent spots, or reflectors, especially 180° reflectors adapted to reflect back to source any incident beam of light. Essentially, the points must be defined so as to be visibly distinct from any background that may be reasonably encountered.
The distances between each pair of points in each array of four are accurately known.
Figure 2 shows the parts of the system that are mounted on the platform. These include imaging means 20, an image analyser 30, a data processor 40 and a crane control interface 50 which functions as output means for the data processor.
When the ship 1 is in position for unloading, it hovers with its stern towards the platform. The provision of two orthogonal light arrays 10, one on each quarter, ensures that one array is always visible from the platform.
The imaging means 20 comprises an electro-optical sensor, in this case a high resolution CCD camera 22, contained in a weatherproof housing 24 mounted on the moving part of the crane and directed towards the area from which the crane is able to lift. This ensures that when the crane is lifting from a vessel, the vessel is always in the field of view of the camera, and a wide viewing angle is not required, to the benefit of camera resolution. If the points of the arrays are defined by 180° reflectors, a narrow beam lamp may be mounted alongside the camera to illuminate the points.
The image analyser 30 and the data processor 40 are self-contained units mounted in a shock resistant rack, which can be mounted near the camera or close to the crane operator. The image analyser comprises a processor card which will examine and use the signal from the imaging means 20 to produce accurate two dimensional co-ordinates of the received images of the light source array 10.
The data processor 40 can be any general purpose computer, or a special processor card embedded in a larger computer.
The crane control interface 50 is essentially an input and output device, but can take many forms. It can vary in levels of complexity from a simple video monitor to a computer controller for a crane servo.
The functioning of the different elements of the system is broadly as follows.
When a loaded vessel appears in the field of view of the camera 22, the operator at the crane control interface 50 causes a 'go' command to be passed to the image analyser 30, and also designates which quarter of the ship is visible. The image analyser then switches into acquisition mode.
The CCD camera 22 forms a two dimensional image point array from the three dimensional object point array 10 on the ship. In acquisition mode, the image analyser scans any light sources appearing in the image received from the camera and compares these to the position of the surrounding lights. Using previously stored data defining the configuration and dimensions of the array 10, the signal analyser recognises and allocates tracks to the individual points of the array. If the light sources are not entirely visible, or if a multiplicity of lights surround and obscure the array, the operator can manually identify the array points to the system. To this end, the operator can be provided with a display screen duplicating the camera output, and a light pen, for example.
Before processing the image information received from the camera, the image analyser may enhance the point images by the use of a pre-processing thresholding algorithm. A centroid algorithm is then applied, using interpolation, to find the centre of each light to sub-pixel accuracy.
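The thresholding and centroid steps described above can be sketched as follows. This is an illustrative implementation, not the patent's own algorithm: the flood-fill blob labelling and the intensity-weighted mean are assumptions standing in for details the text leaves unspecified.

```python
import numpy as np

def subpixel_centroids(image: np.ndarray, threshold: float):
    """Locate bright points to sub-pixel accuracy.

    Threshold the image, label 4-connected bright blobs with a flood
    fill, then take the intensity-weighted centroid of each blob.
    Returns a list of (x, y) centres in pixel coordinates.
    """
    mask = image > threshold
    labels = np.zeros(image.shape, dtype=int)
    h, w = image.shape
    current = 0
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and labels[sy, sx] == 0:
                current += 1
                stack = [(sy, sx)]
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and mask[y, x] and labels[y, x] == 0:
                        labels[y, x] = current
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    centroids = []
    for blob in range(1, current + 1):
        ys, xs = np.nonzero(labels == blob)
        weights = image[ys, xs]
        # Intensity weighting interpolates the centre between pixels.
        centroids.append((float((xs * weights).sum() / weights.sum()),
                          float((ys * weights).sum() / weights.sum())))
    return centroids
```

A spot whose light is split 3:1 across two adjacent pixels, for example, yields a centre a quarter of a pixel from the brighter one.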
Once the image analyser has recognised and acquired the light array 10, it switches into operational mode. Now the image analyser starts to pass the two dimensional coordinates of each of the four points A, B, C, R, to the data processor 40.
Within the data processor the image coordinates are converted back to true three dimensional coordinates, using the known geometry of the array 10 and the known focal length of the imaging system in camera 22. This information equates to the position and orientation of the vessel 1, and thus, over time, its motion.
Equations for the transformation of two dimensional coordinates to three dimensional coordinates are well known.
The following transformation equations are implemented by the data processor in this embodiment. The equations are used to produce three angles. These angles represent the angular deviation of the orientation of the vessel from given starting coordinates, in pitch, roll and yaw, while the camera remains static. The starting coordinates are defined by the line of sight of the camera and two axes perpendicular thereto which relate to the x and y coordinates of the camera image plane. Two lengths are also calculated, representing the range of the vessel from the reference point (eg the camera), and the vertical movement of the vessel.
The images of each of the four points A, B, C and R are expressed as (x,y) coordinates. The x component of the distance between points R and A, for example, is then expressed as X(RA), while the y component of the distance is expressed as Y(RA). The distances between other pairs of points are represented similarly.
The pitch angle L, the roll angle M and the yaw angle N are each given by a transformation equation which is reproduced only as a figure in the original document.
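The closed-form angle equations appear only as figures in the original document and are not recoverable from this text. As an illustration of how pitch, roll and yaw can be recovered from the four image points, the following sketch uses a standard weak-perspective pose estimate; the object-frame offsets, the Euler-angle convention and indeed the whole approach are assumptions, not the patent's equations:

```python
import numpy as np

def pose_from_array(img_R, img_A, img_B, img_C, S):
    """Weak-perspective pose from the four array points.

    img_* are (x, y) image coordinates in pixels; S is the true point
    separation in metres. Object-frame offsets from R are assumed to be
    A - R = (S, 0, 0), B - R = (0, -S, 0), C - R = (0, 0, S).
    Returns (pitch, roll, yaw) in radians and the scale k in pixels
    per metre (from which range follows, given the focal length).
    """
    a = (np.asarray(img_A, float) - img_R) / S
    b = (np.asarray(img_B, float) - img_R) / S
    c = (np.asarray(img_C, float) - img_R) / S
    # Under weak perspective, M equals k times the top two rows of the
    # rotation matrix; its rows therefore have norm k.
    M = np.column_stack([a, -b, c])
    k = 0.5 * (np.linalg.norm(M[0]) + np.linalg.norm(M[1]))
    r1 = M[0] / np.linalg.norm(M[0])
    r2 = M[1] / np.linalg.norm(M[1])
    r3 = np.cross(r1, r2)                 # third row from orthonormality
    R = np.vstack([r1, r2, r3])
    yaw = np.arctan2(R[1, 0], R[0, 0])    # ZYX Euler extraction
    pitch = -np.arcsin(R[2, 0])
    roll = np.arctan2(R[2, 1], R[2, 2])
    return pitch, roll, yaw, k
```

For an untilted array seen square-on, all three angles come out zero and k is simply the imaged separation divided by S.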
The range D is given by D = S.P / (2.L.tan(F/2)), where S is the point separation in metres, F is the angular field of view of the camera, and P is the resolution of the camera (in (x,y) coordinate units).
The vertical movement of the vessel is simply derived from the changes in range and movement of the point R between successive sightings.
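The range formula can be implemented directly. The symbol L is not defined in the text for this formula; the sketch below reads it as the imaged separation of the two points in pixels, which makes the dimensions come out in metres. That reading is an assumption on our part:

```python
import math

def range_from_image(S_m: float, L_px: float, F_rad: float, P_px: float) -> float:
    """Range D = S*P / (2*L*tan(F/2)).

    S_m:   true separation of two array points (metres)
    L_px:  their measured separation in the image (pixels); our reading
           of the undefined symbol L
    F_rad: angular field of view of the camera (radians)
    P_px:  camera resolution across that field (pixels)
    """
    return S_m * P_px / (2.0 * L_px * math.tan(F_rad / 2.0))
```

The formula is just similar triangles: P / (2 tan(F/2)) is the pixels-per-radian scale at the image centre, so L_px divided by that scale is the angle subtended by S, and range is separation over angle.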
This particular method of calculation has been selected to give a satisfactory compromise between speed and accuracy.
The data processor 40 then uses this directly measured, real time data on the velocity and acceleration of the vessel to give a value for the sea state, which is output to the crane control interface 50 after approximately two wave cycles.
The same data is also fed into a filtered wave model 44, which calculates values for the wavelength and amplitude of the waves being experienced by the loaded vessel. This allows the motion of the vessel to be predicted. Comparing the predicted motion with the observed motion gives a measure of the regularity of the wave train.
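The patent gives no detail of the filtered wave model. As an illustration only, a least-squares fit of a single sinusoid to recent heave samples yields an amplitude and a predictor in the same spirit; the fixed trial period and the single-component model are assumptions:

```python
import numpy as np

def fit_wave(t, heave, period_guess):
    """Fit heave(t) ~ a*sin(w*t) + b*cos(w*t) + c for a trial angular
    frequency w = 2*pi/period_guess by linear least squares.
    Returns the fitted amplitude and a function predicting future heave.
    """
    t = np.asarray(t, float)
    heave = np.asarray(heave, float)
    w = 2.0 * np.pi / period_guess
    A = np.column_stack([np.sin(w * t), np.cos(w * t), np.ones_like(t)])
    (a, b, c), *_ = np.linalg.lstsq(A, heave, rcond=None)
    amplitude = float(np.hypot(a, b))

    def predict(tq):
        return a * np.sin(w * tq) + b * np.cos(w * tq) + c

    return amplitude, predict
```

Comparing `predict` against subsequent observations gives exactly the regularity measure described above: a regular swell fits well, a confused sea does not.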
The output of the data processor 40 at the crane control interface 50 can take many forms. Generally, the sea state and vessel motion coefficients are displayed in various ways to assist the crane operator and to improve crane performance. The actual display and use of data depends on the implementation of the system.
The crane control interface 50 includes a display unit 52 in the operator's cabin. The display unit includes a simple TV monitor and an alphanumeric display of the current sea state, together with audible and visible lift advice for the operator. The operator also has access to system controls which switch the image analyser to acquisition mode, designate which quarter of the vessel is visible, designate an area of the ship for which lift advice is required, and switch lift advice on and off.
It is possible for the operator to inform the system of the point on the vessel from which he intends to lift. This information can be entered manually by the operator. Alternatively, the data processor can continually calculate the intended lift position by combining the calculated position and orientation of the vessel with data received from the crane in respect of the position of the lifting trolley on the crane boom. The assumed position from which the load is to be lifted would be the portion of the ship lying directly below the lift trolley.
Given this additional information, the data processor can specifically monitor the acceleration and velocity of the selected portion of the vessel. This then allows the crane operator to be more precisely informed as to when it is safest to commence his lift. This safe 'lift window' normally occurs when the load point has just begun to descend from the crest of a wave.
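The 'lift window' test reduces to detecting that the load point has just passed a crest. A minimal sketch, assuming heave is sampled at a fixed rate and that three recent samples suffice to spot the turn:

```python
def lift_window_open(heave_samples):
    """Return True when the load point has just begun to descend from a
    crest: the latest sample is below its predecessor, which was itself
    not falling. heave_samples: recent heave readings in metres, oldest
    first; only the last three are used.
    """
    h0, h1, h2 = heave_samples[-3:]
    return h1 >= h0 and h2 < h1
```

In practice the samples would come from the tracked lift point at the system's update rate, and the test would be combined with the predicted wave phase from the model.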
The sea state and vessel motion calculations are continually repeated with update times typically of the order of tens of milliseconds. This ensures that the information is always current, which allows safe and effective lift advice. Continuous real time co-ordination between the vessel motion and the crane operation is highly desirable, having regard to the rapid accelerations and the randomness associated with wave patterns around oil platforms located in the North Sea.
Of course, numerous safety options can be built in. The observed points on the light array 10 can be continuously evaluated for consistency, and if any light disappears or the positions show a lack of consistency, the operator is immediately informed, and the output sea state is modified to ensure that it is 'safe', based on historic data, using a large safety margin.
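The consistency evaluation can be sketched as a check of pairwise image distances against a reference sighting. The fractional tolerance and the frame-to-frame comparison are assumptions, since the patent leaves the test unspecified:

```python
import math

def points_consistent(points, ref_points, tol=0.05):
    """Check tracked array points against a reference sighting: every
    pairwise image distance must agree within a fractional tolerance.
    points, ref_points: equal-length lists of (x, y) pixel coordinates,
    in matching order. Returns False if any light has strayed, which
    would trigger the safe fallback described above.
    """
    def pair_dists(ps):
        return [math.dist(p, q) for i, p in enumerate(ps) for q in ps[i + 1:]]

    return all(abs(d - r) <= tol * r
               for d, r in zip(pair_dists(points), pair_dists(ref_points)))
```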
By providing a more accurate means of control of the lifting operation, the invention not only enhances safety, but also allows more efficient operation. By providing the operator with more reliable knowledge of the best time to lift, and of the likely behaviour of the vessel, crane pulleys can be sheaved only for the optimum derated load, so that cargo containers can be lifted at higher heave velocities and greater loads can be safely lifted in higher sea states.
Since the output of the data processor 40 provides accurate real time information the crane control interface 50 can include a crane controller or compensator 54 which operates on the crane servo 56 to directly position the crane hook with no further system elements required. The compensator function serves to maintain crane wire tension before the load is lifted from the vessel. The controller function initiates and controls the lifting sequence.
The system can analyse and track more than the four points of the arrays 10. This spare capacity can be utilised in several ways.
For example, extra light sources can be added to the arrays to improve safety should any of the other light sources be extinguished.
Further, a light source can be added to the crane lifting attachment, or to any other object moving relative to the target vessel. The image analyser 30 will then observe this source and is able to produce commands to control the distance between the target vessel and the moving object.
Variations in the described application are many. One is the substitution of vessel manoeuvring implements or tugs for the maritime crane described, and the control of winching operations involving very large vessels. The light array would be placed on a relevant point of a large vessel or object of known dimensions.
The movement and range of the light array would enable the position and range of all points on the object to be known and controlled.
One application of this kind is in dry docking submarines, which must be floated accurately over cradles before the dock is drained.
The invention can be used to control a multiplicity of winches, each associated with a cable to the ship, automatically or through the intervention of a human operator.
Another variation lies in lifting smaller objects from the sea, such as in the recovery of small vehicles by larger objects such as mother ships. An array of light sources would be present on the vehicle being recovered and on the recovery arm.
Generally, the invention is suitable for generating a control signal between any floating object and a reference object.
Analysis of the motion of the floating object leads to an indication of the sea state, and the data can be further utilised to determine the position and velocity of any point on the object.
Claims (30)
1 A system for co-ordinating interaction between a floating object and a reference object comprising means for measuring the motion in all six degrees of freedom of the floating object relative to the reference object in real time, means for generating a control signal from the measurements, and means for applying the control signal to control the said interaction between the objects.
2 A system according to claim 1 wherein the means for measuring the motion comprise at least three visible non-aligned points on the floating object at known distances apart and area imaging means mounted on the reference object adapted to capture an image including the said points.
3 A system according to claim 2 in which the means for measuring the motion further comprise an image analyser adapted to locate the said points in the captured image, a data processor for interpreting the locations of the points in the captured image in terms of the motions of the floating object, and output means for producing therefrom a signal indicative of the said motions.
4 A system according to claim 3 in which the image analyser is adapted to use the signal from the imaging means to produce coordinates in two dimensions of the received images of the points.
5 A system according to any one of claims 2 to 4 in which the points on the floating object are designated by bright light sources, fluorescent spots, or reflectors.
6 A system according to claim 5 in which the points are defined by 180° reflectors and a narrow beam lamp is mounted alongside the imaging means to illuminate the points.
7 A system according to any one of claims 2 to 6 in which the said at least three points comprise four points in an orthogonal configuration.
8 A system according to claim 7 in which the floating object is a ship whereon each of the port and starboard quarters carries an array of four mutually orthogonal points orientated to the axes of the ship.
9 A system according to any one of claims 2 to 8 in which the reference object comprises a crane with a movable boom and the imaging means comprise an electro-optical sensor mounted on the boom and directed towards the area from which the crane is able to lift.
10 A system according to claim 9 dependent upon claim 3 in which the output means comprise a controller for a crane servo adapted to initiate and control the crane lifting sequence.
11 A system according to claim 9 dependent upon claim 3 in which the output means comprise a compensator adapted to maintain crane wire tension before a load is lifted.
12 A system according to any one of claims 3 to 11 further comprising a reference point on a controllable object moving relative to the floating object, the reference point being observable by the image analyser, whereby the system is enabled to produce commands to control the distance between the controllable object and the floating object.
13 A system for co-ordinating interaction between a floating object and a reference object substantially as herein described with reference to and as illustrated in the accompanying drawings.
14 A method of co-ordinating an interaction between a floating object and a reference object comprising making measurements of the motion in all six degrees of freedom of the floating object relative to the reference object in real time, using the measurements to generate a control signal, and using the control signal to control the said interaction between the objects.
15 A method according to claim 14 comprising using the control signal to control a structure mounted on one of the said objects to operate upon and/or hold station with respect to the other said object.
16 A method according to claim 14 or claim 15 comprising measuring the motion of at least three visible non-aligned points on the floating object at known distances apart and using area imaging means mounted on the reference object to capture an image including the said points.
17 A method according to claim 16 in which the said at least three points comprise four points in an orthogonal configuration designated by bright light sources, fluorescent spots, or reflectors.
18 A method according to claim 16 or claim 17 further comprising using an image analyser to locate the said points in the captured image, and a data processor to interpret the locations of the points in the captured image in terms of the motions of the floating object, and producing therefrom an output signal indicative of the said motions.
19 A method according to claim 18 in which the imaging means form a two dimensional image point array from the three dimensional object point array, and the image analyser scans light sources appearing in the image point array, acquires the points corresponding to the object point array, and passes the two dimensional coordinates of each of the points to the data processor.
20 A method according to claim 19 in which the data processor converts the two dimensional image coordinates to three dimensional coordinates, using the known geometry of the array and the known focal length of the imaging system.
21 A method according to any one of claims 18 to 20 in which the data processor implements transformation equations to produce three angles representing the angular deviation of the orientation of the floating object from given starting coordinates, in pitch, roll and yaw, which are defined by the line of sight of the imaging means and two axes perpendicular thereto which relate to the x and y coordinates of the image plane.
22 A method according to claim 21 in which the said at least three points comprise four points in an orthogonal configuration and the transformation equations are as follows, wherein the image points of the four mutually orthogonal points on the floating object A, B, C, R, such that the respective pairs of points AR, BR and CR define mutually perpendicular lines which intersect at R, point A is horizontally inboard of point R, point B is vertically below point R, and point C is horizontally forward of point R, are expressed as (x,y) coordinates, the x component of the distance between points R and A is expressed as X(RA), the y component of the distance is expressed as Y(RA) and the distances between other pairs of points are represented similarly:
Pitch angle L, roll angle M and yaw angle N: equations reproduced only as figures in the original document.
23 A method according to any one of claims 18 to 22 in which the data processor calculates two lengths, representing the range and the vertical movement of the floating object.
24 A method according to claim 22 in which the range D of the floating object is calculated as D = S.P / (2.L.tan(F/2)) where:
S is the point separation
F is the angular field of view of the imaging means
P is the resolution of the imaging means in (x,y) co-ordinate
units, and the vertical movement of the object is derived from the changes in range and movement of the point R between successive sightings.
25 A method according to any one of claims 18 to 24 wherein the data processor uses the measured data to give a value for the sea state.
26 A method according to any one of claims 18 to 25 in which data from the data processor is fed into a wave model which calculates values for the wavelength and amplitude of the waves being experienced by the floating object, the motion of the object is predicted, and the predicted motion is compared with the subsequently observed motion to give a measure of the regularity of the wave train.
27 A method of unloading ships at sea according to any one of claims 14 to 26.
28 A method of manoeuvring a vessel according to any one of claims 14 to 26.
29 A method of lifting objects from the sea according to any one of claims 14 to 26.
30 A method of co-ordinating an interaction between a floating object and a reference object substantially as herein described with reference to and as illustrated in the accompanying drawings.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB9211000A GB2267360B (en) | 1992-05-22 | 1992-05-22 | Method and system for interacting with floating objects |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB9211000A GB2267360B (en) | 1992-05-22 | 1992-05-22 | Method and system for interacting with floating objects |
Publications (3)
Publication Number | Publication Date |
---|---|
GB9211000D0 GB9211000D0 (en) | 1992-07-08 |
GB2267360A true GB2267360A (en) | 1993-12-01 |
GB2267360B GB2267360B (en) | 1995-12-06 |
Family
ID=10715936
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB9211000A Expired - Fee Related GB2267360B (en) | 1992-05-22 | 1992-05-22 | Method and system for interacting with floating objects |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2267360B (en) |
Cited By (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2300082A (en) * | 1995-04-21 | 1996-10-23 | British Aerospace | Distance measuring apparatus |
GB2325578A (en) * | 1997-04-04 | 1998-11-25 | Evans & Sutherland Computer Co | Camera/lens calibration |
GB2325807A (en) * | 1997-05-30 | 1998-12-02 | British Broadcasting Corp | Position determination |
WO1999055579A1 (en) * | 1998-04-28 | 1999-11-04 | Oceantech Plc | Stabilized ship-borne access apparatus and control method for the same |
NL1018919C2 (en) * | 2001-09-10 | 2003-03-11 | Leenstra Machine En Staalbouw | Transhipment system, uses sensors to steer derrick relative to aiming block normally carried by boat |
US6556722B1 (en) | 1997-05-30 | 2003-04-29 | British Broadcasting Corporation | Position determination |
NO320731B1 (en) * | 2003-04-09 | 2006-01-23 | Mongstad Test Eiendom As | System and method for monitoring a workspace |
GB2418482A (en) * | 2004-09-23 | 2006-03-29 | Wayne Howell | Method for locating luminaires using optical feedback |
US7367464B1 (en) | 2007-01-30 | 2008-05-06 | The United States Of America As Represented By The Secretary Of The Navy | Pendulation control system with active rider block tagline system for shipboard cranes |
WO2011135310A2 (en) | 2010-04-29 | 2011-11-03 | National Oilwell Varco L.P. | Videometric systems and methods for offshore and oil-well drilling |
US8195368B1 (en) | 2008-11-07 | 2012-06-05 | The United States Of America As Represented By The Secretary Of The Navy | Coordinated control of two shipboard cranes for cargo transfer with ship motion compensation |
EP2524892A1 (en) * | 2011-05-19 | 2012-11-21 | Liebherr-Werk Nenzing Ges.m.b.H | Crane control |
WO2012161584A1 (en) * | 2011-05-20 | 2012-11-29 | Optilift As | System, device and method for tracking position and orientation of vehicle, loading device and cargo in loading device operations |
DE102011109157A1 (en) * | 2011-08-01 | 2013-02-07 | Horst Bredemeier | Method for setting load e.g. vessel, on deposition surface at sea with waves using hoist in offshore installation or ship, involves resuming transmission of control signals to cable winch to actuate winch to deposit load on surface |
US8606401B2 (en) | 2005-12-02 | 2013-12-10 | Irobot Corporation | Autonomous coverage robot navigation system |
US8656550B2 (en) | 2002-01-03 | 2014-02-25 | Irobot Corporation | Autonomous floor-cleaning robot |
US8670866B2 (en) | 2005-02-18 | 2014-03-11 | Irobot Corporation | Autonomous surface cleaning robot for wet and dry cleaning |
US8739355B2 (en) | 2005-02-18 | 2014-06-03 | Irobot Corporation | Autonomous surface cleaning robot for dry cleaning |
US8761931B2 (en) | 2005-12-02 | 2014-06-24 | Irobot Corporation | Robot system |
US8780342B2 (en) | 2004-03-29 | 2014-07-15 | Irobot Corporation | Methods and apparatus for position estimation using reflected light sources |
US8781626B2 (en) | 2002-09-13 | 2014-07-15 | Irobot Corporation | Navigational control system for a robotic device |
US8800107B2 (en) | 2010-02-16 | 2014-08-12 | Irobot Corporation | Vacuum brush |
US8838274B2 (en) | 2001-06-12 | 2014-09-16 | Irobot Corporation | Method and system for multi-mode coverage for an autonomous robot |
US8930023B2 (en) | 2009-11-06 | 2015-01-06 | Irobot Corporation | Localization by learning of wave-signal distributions |
US8972052B2 (en) | 2004-07-07 | 2015-03-03 | Irobot Corporation | Celestial navigation system for an autonomous vehicle |
US8978196B2 (en) | 2005-12-02 | 2015-03-17 | Irobot Corporation | Coverage robot mobility |
US8985127B2 (en) | 2005-02-18 | 2015-03-24 | Irobot Corporation | Autonomous surface cleaning robot for wet cleaning |
US9144361B2 (en) | 2000-04-04 | 2015-09-29 | Irobot Corporation | Debris sensor for cleaning apparatus |
CN105008218A (en) * | 2013-02-21 | 2015-10-28 | 利佩特控股(英国)有限公司 | Improved apparatus for and method of transferring object between marine transport vessel and construction or vessel |
US9215957B2 (en) | 2004-01-21 | 2015-12-22 | Irobot Corporation | Autonomous robot auto-docking and energy management systems and methods |
US9229454B1 (en) | 2004-07-07 | 2016-01-05 | Irobot Corporation | Autonomous mobile robot system |
US9317038B2 (en) | 2006-05-31 | 2016-04-19 | Irobot Corporation | Detecting robot stasis |
US9320398B2 (en) | 2005-12-02 | 2016-04-26 | Irobot Corporation | Autonomous coverage robots |
US9446521B2 (en) | 2000-01-24 | 2016-09-20 | Irobot Corporation | Obstacle following sensor scheme for a mobile robot |
US9480381B2 (en) | 2007-05-09 | 2016-11-01 | Irobot Corporation | Compact autonomous coverage robot |
US9486924B2 (en) | 2004-06-24 | 2016-11-08 | Irobot Corporation | Remote control scheduler and method for autonomous robotic device |
US9492048B2 (en) | 2006-05-19 | 2016-11-15 | Irobot Corporation | Removing debris from cleaning robots |
US9582005B2 (en) | 2001-01-24 | 2017-02-28 | Irobot Corporation | Robot confinement |
DE102008024513B4 (en) * | 2008-05-21 | 2017-08-24 | Liebherr-Werk Nenzing Gmbh | Crane control with active coast sequence |
WO2018228809A1 (en) | 2017-06-12 | 2018-12-20 | Siemens Wind Power A/S | Offshore wind turbine installation arrangement |
EP2572976B1 (en) * | 2010-05-20 | 2021-06-16 | Mitsubishi Shipbuilding Co., Ltd. | Transporting barge, floating structure installation system, and floating structure installation method |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8788092B2 (en) | 2000-01-24 | 2014-07-22 | Irobot Corporation | Obstacle following sensor scheme for a mobile robot |
US8396592B2 (en) | 2001-06-12 | 2013-03-12 | Irobot Corporation | Method and system for multi-mode coverage for an autonomous robot |
US9128486B2 (en) | 2002-01-24 | 2015-09-08 | Irobot Corporation | Navigational control system for a robotic device |
US8386081B2 (en) | 2002-09-13 | 2013-02-26 | Irobot Corporation | Navigational control system for a robotic device |
KR101074937B1 (en) | 2005-12-02 | 2011-10-19 | 아이로보트 코퍼레이션 | Modular robot |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4402350A (en) * | 1979-11-12 | 1983-09-06 | Fmc Corporation | System for the control of a marine loading arm |
EP0294760A1 (en) * | 1987-06-11 | 1988-12-14 | International Business Machines Corporation | Magnetically levitated fine motion robot wrist with programmable compliance |
US4854800A (en) * | 1984-08-22 | 1989-08-08 | British Aerospace Public Limited Company | Open sea transfer of articles |
GB2224613A (en) * | 1988-11-02 | 1990-05-09 | Electro Optics Ind Ltd | Navigation using triangle of light sources |
GB2233121A (en) * | 1989-04-27 | 1991-01-02 | Nissan Motor | Positioning in automated assembly line |
GB2234877A (en) * | 1989-08-09 | 1991-02-13 | Marconi Gec Ltd | Determining orientation of pilot's helmet for weapon aiming |
GB2246261A (en) * | 1990-07-16 | 1992-01-22 | Roke Manor Research | Tracking arrangements and systems |
1992-05-22: Application GB9211000A filed; granted as patent GB2267360B; status: not active, Expired - Fee Related
Cited By (76)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2300082B (en) * | 1995-04-21 | 1999-09-22 | British Aerospace | Altitude measuring methods |
GB2300082A (en) * | 1995-04-21 | 1996-10-23 | British Aerospace | Distance measuring apparatus |
GB2325578A (en) * | 1997-04-04 | 1998-11-25 | Evans & Sutherland Computer Co | Camera/lens calibration |
GB2325578B (en) * | 1997-04-04 | 2001-11-07 | Evans & Sutherland Computer Co | Camera/lens calibration apparatus and method |
GB2325807A (en) * | 1997-05-30 | 1998-12-02 | British Broadcasting Corp | Position determination |
GB2325807B (en) * | 1997-05-30 | 2002-03-20 | British Broadcasting Corp | Position determination |
US6556722B1 (en) | 1997-05-30 | 2003-04-29 | British Broadcasting Corporation | Position determination |
WO1999055579A1 (en) * | 1998-04-28 | 1999-11-04 | Oceantech Plc | Stabilized ship-borne access apparatus and control method for the same |
US6659703B1 (en) | 1998-04-28 | 2003-12-09 | Oceantech Plc | Stabilized ship-borne access apparatus and control method for the same |
US9446521B2 (en) | 2000-01-24 | 2016-09-20 | Irobot Corporation | Obstacle following sensor scheme for a mobile robot |
US9144361B2 (en) | 2000-04-04 | 2015-09-29 | Irobot Corporation | Debris sensor for cleaning apparatus |
US9167946B2 (en) | 2001-01-24 | 2015-10-27 | Irobot Corporation | Autonomous floor cleaning robot |
US9622635B2 (en) | 2001-01-24 | 2017-04-18 | Irobot Corporation | Autonomous floor-cleaning robot |
US9582005B2 (en) | 2001-01-24 | 2017-02-28 | Irobot Corporation | Robot confinement |
US9104204B2 (en) | 2001-06-12 | 2015-08-11 | Irobot Corporation | Method and system for multi-mode coverage for an autonomous robot |
US8838274B2 (en) | 2001-06-12 | 2014-09-16 | Irobot Corporation | Method and system for multi-mode coverage for an autonomous robot |
NL1018919C2 (en) * | 2001-09-10 | 2003-03-11 | Leenstra Machine En Staalbouw | Transhipment system, uses sensors to steer derrick relative to aiming block normally carried by boat |
US8656550B2 (en) | 2002-01-03 | 2014-02-25 | Irobot Corporation | Autonomous floor-cleaning robot |
US8671507B2 (en) | 2002-01-03 | 2014-03-18 | Irobot Corporation | Autonomous floor-cleaning robot |
US8781626B2 (en) | 2002-09-13 | 2014-07-15 | Irobot Corporation | Navigational control system for a robotic device |
US9949608B2 (en) | 2002-09-13 | 2018-04-24 | Irobot Corporation | Navigational control system for a robotic device |
NO320731B1 (en) * | 2003-04-09 | 2006-01-23 | Mongstad Test Eiendom As | System and method for monitoring a workspace |
US9215957B2 (en) | 2004-01-21 | 2015-12-22 | Irobot Corporation | Autonomous robot auto-docking and energy management systems and methods |
US9360300B2 (en) | 2004-03-29 | 2016-06-07 | Irobot Corporation | Methods and apparatus for position estimation using reflected light sources |
US8780342B2 (en) | 2004-03-29 | 2014-07-15 | Irobot Corporation | Methods and apparatus for position estimation using reflected light sources |
US9486924B2 (en) | 2004-06-24 | 2016-11-08 | Irobot Corporation | Remote control scheduler and method for autonomous robotic device |
US9223749B2 (en) | 2004-07-07 | 2015-12-29 | Irobot Corporation | Celestial navigation system for an autonomous vehicle |
US9229454B1 (en) | 2004-07-07 | 2016-01-05 | Irobot Corporation | Autonomous mobile robot system |
US8972052B2 (en) | 2004-07-07 | 2015-03-03 | Irobot Corporation | Celestial navigation system for an autonomous vehicle |
GB2418482A (en) * | 2004-09-23 | 2006-03-29 | Wayne Howell | Method for locating luminaires using optical feedback |
US8985127B2 (en) | 2005-02-18 | 2015-03-24 | Irobot Corporation | Autonomous surface cleaning robot for wet cleaning |
US8739355B2 (en) | 2005-02-18 | 2014-06-03 | Irobot Corporation | Autonomous surface cleaning robot for dry cleaning |
US8782848B2 (en) | 2005-02-18 | 2014-07-22 | Irobot Corporation | Autonomous surface cleaning robot for dry cleaning |
US10470629B2 (en) | 2005-02-18 | 2019-11-12 | Irobot Corporation | Autonomous surface cleaning robot for dry cleaning |
US8774966B2 (en) | 2005-02-18 | 2014-07-08 | Irobot Corporation | Autonomous surface cleaning robot for wet and dry cleaning |
US8966707B2 (en) | 2005-02-18 | 2015-03-03 | Irobot Corporation | Autonomous surface cleaning robot for dry cleaning |
US9445702B2 (en) | 2005-02-18 | 2016-09-20 | Irobot Corporation | Autonomous surface cleaning robot for wet and dry cleaning |
US8670866B2 (en) | 2005-02-18 | 2014-03-11 | Irobot Corporation | Autonomous surface cleaning robot for wet and dry cleaning |
US9144360B2 (en) | 2005-12-02 | 2015-09-29 | Irobot Corporation | Autonomous coverage robot navigation system |
US8761931B2 (en) | 2005-12-02 | 2014-06-24 | Irobot Corporation | Robot system |
US8978196B2 (en) | 2005-12-02 | 2015-03-17 | Irobot Corporation | Coverage robot mobility |
US8606401B2 (en) | 2005-12-02 | 2013-12-10 | Irobot Corporation | Autonomous coverage robot navigation system |
US9392920B2 (en) | 2005-12-02 | 2016-07-19 | Irobot Corporation | Robot system |
US9320398B2 (en) | 2005-12-02 | 2016-04-26 | Irobot Corporation | Autonomous coverage robots |
US9955841B2 (en) | 2006-05-19 | 2018-05-01 | Irobot Corporation | Removing debris from cleaning robots |
US9492048B2 (en) | 2006-05-19 | 2016-11-15 | Irobot Corporation | Removing debris from cleaning robots |
US10244915B2 (en) | 2006-05-19 | 2019-04-02 | Irobot Corporation | Coverage robots and associated cleaning bins |
US9317038B2 (en) | 2006-05-31 | 2016-04-19 | Irobot Corporation | Detecting robot stasis |
US7367464B1 (en) | 2007-01-30 | 2008-05-06 | The United States Of America As Represented By The Secretary Of The Navy | Pendulation control system with active rider block tagline system for shipboard cranes |
US11498438B2 (en) | 2007-05-09 | 2022-11-15 | Irobot Corporation | Autonomous coverage robot |
US10299652B2 (en) | 2007-05-09 | 2019-05-28 | Irobot Corporation | Autonomous coverage robot |
US10070764B2 (en) | 2007-05-09 | 2018-09-11 | Irobot Corporation | Compact autonomous coverage robot |
US11072250B2 (en) | 2007-05-09 | 2021-07-27 | Irobot Corporation | Autonomous coverage robot sensing |
US9480381B2 (en) | 2007-05-09 | 2016-11-01 | Irobot Corporation | Compact autonomous coverage robot |
DE102008024513B4 (en) * | 2008-05-21 | 2017-08-24 | Liebherr-Werk Nenzing Gmbh | Crane control with active coast sequence |
US8195368B1 (en) | 2008-11-07 | 2012-06-05 | The United States Of America As Represented By The Secretary Of The Navy | Coordinated control of two shipboard cranes for cargo transfer with ship motion compensation |
US8930023B2 (en) | 2009-11-06 | 2015-01-06 | Irobot Corporation | Localization by learning of wave-signal distributions |
US10314449B2 (en) | 2010-02-16 | 2019-06-11 | Irobot Corporation | Vacuum brush |
US8800107B2 (en) | 2010-02-16 | 2014-08-12 | Irobot Corporation | Vacuum brush |
US11058271B2 (en) | 2010-02-16 | 2021-07-13 | Irobot Corporation | Vacuum brush |
US9303473B2 (en) | 2010-04-29 | 2016-04-05 | National Oilwell Varco, L.P. | Videometric systems and methods for offshore and oil-well drilling |
WO2011135310A3 (en) * | 2010-04-29 | 2012-09-27 | National Oilwell Varco L.P. | Videometric systems and methods for offshore and oil-well drilling |
WO2011135310A2 (en) | 2010-04-29 | 2011-11-03 | National Oilwell Varco L.P. | Videometric systems and methods for offshore and oil-well drilling |
EP2572976B1 (en) * | 2010-05-20 | 2021-06-16 | Mitsubishi Shipbuilding Co., Ltd. | Transporting barge, floating structure installation system, and floating structure installation method |
EP2524892A1 (en) * | 2011-05-19 | 2012-11-21 | Liebherr-Werk Nenzing Ges.m.b.H | Crane control |
DE102011102025A1 (en) * | 2011-05-19 | 2012-11-22 | Liebherr-Werk Nenzing Gmbh | crane control |
NO342303B1 (en) * | 2011-05-20 | 2018-04-30 | Optilift As | System, device and method for tracking position and orientation of vehicles, loading devices and goods in operations with loading devices |
US9909864B2 (en) | 2011-05-20 | 2018-03-06 | Optilift As | System, device and method for tracking position and orientation of vehicle, loading device and cargo in loading device operations |
RU2623295C2 (en) * | 2011-05-20 | 2017-06-23 | Оптилифт Ас | System, device and method for current monitoring of vehicle, loading device and cargo position and orientation, while loading device operation |
GB2504903B (en) * | 2011-05-20 | 2016-05-25 | Optilift As | System, device and method for tracking position and orientation of vehicle, loading device and cargo in loading device operations |
WO2012161584A1 (en) * | 2011-05-20 | 2012-11-29 | Optilift As | System, device and method for tracking position and orientation of vehicle, loading device and cargo in loading device operations |
GB2504903A (en) * | 2011-05-20 | 2014-02-12 | Optilift As | System, device and method for tracking position and orientation of vehicle, loading device and cargo in loading device operations |
DE102011109157A1 (en) * | 2011-08-01 | 2013-02-07 | Horst Bredemeier | Method for setting load e.g. vessel, on deposition surface at sea with waves using hoist in offshore installation or ship, involves resuming transmission of control signals to cable winch to actuate winch to deposit load on surface |
CN105008218B (en) * | 2013-02-21 | 2019-03-01 | 利佩特控股(英国)有限公司 | Improved apparatus for and method of transferring object between marine transport vessel and construction or vessel |
CN105008218A (en) * | 2013-02-21 | 2015-10-28 | 利佩特控股(英国)有限公司 | Improved apparatus for and method of transferring object between marine transport vessel and construction or vessel |
WO2018228809A1 (en) | 2017-06-12 | 2018-12-20 | Siemens Wind Power A/S | Offshore wind turbine installation arrangement |
Also Published As
Publication number | Publication date |
---|---|
GB9211000D0 (en) | 1992-07-08 |
GB2267360B (en) | 1995-12-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
GB2267360A (en) | Method and system for interacting with floating objects | |
JP7498130B2 (en) | Offshore ship-to-ship lifting with target tracking assistance | |
RU2623295C2 (en) | System, device and method for current monitoring of vehicle, loading device and cargo position and orientation, while loading device operation | |
AU2023201477B2 (en) | System for determining position of objects | |
CN109739238A (en) | Automatic ship berthing system and working method thereof | |
EP2914540B1 (en) | Control system for cables or similar | |
CN108897272A (en) | Bank end intelligent monitoring system | |
US10773591B2 (en) | Video analytics based pilot safety devices | |
CN109437020B (en) | Ship floating condition and stability monitoring device for quayside container crane, and monitoring method thereof | |
KR20200077525A (en) | Vessel navigation support system | |
CN208802612U (en) | Ship loader operating system and ship loader | |
JP7345335B2 (en) | Crane operation support system and crane operation support method | |
AU2023202325A1 (en) | System and method for tug-boat line transfer | |
WO2020153472A1 (en) | Undocking/docking director assistance device, undocking/docking director assistance method, and ship | |
CN209904990U (en) | 30 ten thousand tons of large-scale oil tankers berthing monitoring devices based on virtual wall | |
JP2523041Y2 (en) | Manned gondola for ships | |
GB2041578A (en) | Articulated arm control system | |
CN116608860A (en) | Ship auxiliary berthing and leaving method based on computer vision and related equipment | |
CN116203942A (en) | Yacht autonomous berthing system and method with multi-sensor information fusion | |
NICKELSBURG et al. | A “PLUSS” FOR DEEP OCEAN SEARCH | |
Ghignone et al. | Self-powered underwater vehicle for MCM operations (MIN MK2 system) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
1998-05-22 | PCNP | Patent ceased through non-payment of renewal fee | |