EP3722197A1 - Système de compensation de mouvement entre deux objets, véhicule doté dudit système, structure fixe dotée dudit système et procédé doté dudit système - Google Patents

Système de compensation de mouvement entre deux objets, véhicule doté dudit système, structure fixe dotée dudit système et procédé doté dudit système Download PDF

Info

Publication number
EP3722197A1
EP3722197A1 (application EP20167485.0A)
Authority
EP
European Patent Office
Prior art keywords
sensor
target area
movement
information
component
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP20167485.0A
Other languages
German (de)
English (en)
Inventor
Markus Schleyer
Quang Huy Nguyen
Michael Erz
Daniel Neyer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Van Halteren Technologies Boxtel BV
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Publication of EP3722197A1
Legal status: Withdrawn

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B63SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63BSHIPS OR OTHER WATERBORNE VESSELS; EQUIPMENT FOR SHIPPING 
    • B63B27/00Arrangement of ship-based loading or unloading equipment for cargo or passengers
    • B63B27/30Arrangement of ship-based loading or unloading equipment for transfer at sea between ships or between ships and off-shore structures
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B63SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63BSHIPS OR OTHER WATERBORNE VESSELS; EQUIPMENT FOR SHIPPING 
    • B63B17/00Vessels parts, details, or accessories, not otherwise provided for
    • B63B2017/0072Seaway compensators

Definitions

  • the invention relates to a system for motion compensation between two objects, at least one of the objects, such as a vehicle or watercraft, being movable.
  • the invention also relates to a vehicle with the system and a fixed structure with the system. A procedure is also provided for the system.
  • a floating structure is designed, for example, as a floating drilling platform or as an object attached to ropes under water. People and goods are often transferred from one object to another during operations. For this it is necessary that the objects are connected to one another, for example by docking a ship on a fixed drilling platform, it being expected that neither people nor goods are injured or damaged.
  • unwanted and difficult-to-predict or unpredictable movements of the object or objects, for example due to wave motion or weather conditions, are problematic.
  • the invention is based on the object of creating a system with which an object can be docked to another object in a simple, inexpensive and secure manner in terms of device technology. Furthermore, the invention is based on the object of creating a vehicle that can be docked to an object in a simple, inexpensive and secure manner in terms of device technology. In addition, it is the object of the invention to create a fixed structure to which an object can be docked in a simple, inexpensive and secure manner in terms of device technology. Furthermore, it is the object of the invention to create a method for docking an object to another object in a simple, inexpensive and secure manner in terms of device technology.
  • disclosed is a system, in particular for motion compensation between two objects.
  • the system is used to bring a contact area of the object provided with the system closer, in particular autonomously, to a target area of another object and then to bring the areas into contact with one another, the contact in particular being maintained.
  • the areas can then also be positioned in a position relative to one another, in particular a position that can be determined and fixed in advance, for example also at a distance.
  • At least one of the objects is preferably movable, the object then being a vehicle, in particular a watercraft.
  • the system furthermore preferably has at least one sensor which can be arranged on the object provided with the system.
  • This object can be a component, for example a ship's gangway, or a load attached to a winch, which has the contact area.
  • the at least one sensor is preferably designed in such a way that sensor signals or data about the target area of the other object, in particular the object that does not have this sensor, such as a drilling platform, can be detected.
  • at least one actuator can be provided. This can then drive and move the component and / or the object with the component, in particular in order to change the position of the contact area. This allows movement compensation between the areas. Alternatively or additionally, this can change the relative position of the areas.
  • a control unit is preferably provided or the system is connected to a control unit, for example via the Internet or wirelessly or by cable.
  • the control unit is preferably designed in such a way that it is used to process the information, in particular via a suitable algorithm, in order in particular to determine movement and / or position information of the target area relative to the contact area.
  • the actuator can then be controlled based on the information processed by the control unit in such a way that the areas approach one another and then contact one another or then maintain the predetermined relative position, in particular a distance or relative angle, and / or that the movement compensation takes place.
  • It is conceivable that the relative movement is not fully compensated, but only minimized to the extent that the operation can be carried out safely. For example, with existing contact, slight changes in the angle between the gangway and the other ship could be tolerated.
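  • As a purely illustrative sketch (not taken from the patent), the following Python fragment shows how a control unit could turn a measured relative position into a proportional actuator command while tolerating a small residual error; all names and values (e.g. `compensation_step`, `gain`) are assumptions.

```python
# Minimal sketch (assumption, not the patented controller): a proportional
# loop that drives the contact area toward a desired offset from the target
# area, tolerating a small residual error as described above.

def compensation_step(relative_position, desired_offset, gain=0.5, tolerance=0.05):
    """Return an actuator velocity command for one control cycle.

    relative_position: measured position of the target area relative to the
                       contact area (metres, one axis for simplicity)
    desired_offset:    relative position to be held (0.0 for contact)
    """
    error = relative_position - desired_offset
    if abs(error) < tolerance:          # small residual motion is tolerated
        return 0.0
    return gain * error                 # command proportional to the error


if __name__ == "__main__":
    # simulated relative heave between gangway tip and docking point
    measurements = [0.8, 0.55, 0.30, 0.12, 0.04]
    for z in measurements:
        print(f"relative position {z:+.2f} m -> command {compensation_step(z, 0.0):+.2f} m/s")
```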
  • a relative movement between two objects can thus be measured, in particular continuously.
  • the movement information can then be used to compensate for the relative movement when docking, for example, a bridge or another element from one body to the other, in order to reach and maintain the target position.
  • This solution has the advantage that operations, for example between two floating objects or bodies, can also be carried out in less than optimal weather conditions. This is preferably done in that the relative movement is determined, for example, between the contact points, for example between an end side of a gangway and a docking point, between the two bodies and can then be compensated for.
  • the system can advantageously only be installed on one of the two objects or bodies. No aids or sensors are then required on the other object, for example on the other float.
  • safety is increased during operations between two objects, in particular between two floating bodies.
  • manual compensation of the relative movement is practically impossible in the prior art, so that no compensation currently takes place in operations of this kind; considerable risks always have to be accepted and accidents cannot be ruled out.
  • a quick docking operation can be carried out with the present system.
  • with the system according to the invention it is not necessary, for example for operations between two floating bodies, to determine a relative movement between their contact points using a sensor system that is provided on both floating bodies. There is therefore no need to install sensors that exchange data on both bodies or objects.
  • the relative position between the target area and the contact area can be determined. This not only determines the movement of the body on which the system is installed. If only the movement of this body were to be determined, then only absolute positioning in space would be possible, which would be problematic in particular for operations between two moving objects.
  • a relative movement between two objects for example between two floating objects or between a floating object and a fixed object, can thus be determined.
  • the relative movement is only determined from one object, which means that no tools are required for the other object. No aids need to be attached to the other object and no information needs to be communicated from the other object to the first object.
  • information in the form of movement and/or position information about the contact area of the component of the object is recorded and/or established and/or reported to the control unit at certain times or events or continuously. This is advantageous in order to easily determine the relative position between the contact area and the target area.
  • a selection algorithm can preferably be provided. This is designed, for example, in such a way that a target area can be selected via the control unit, in particular automatically, from areas detected by the at least one sensor. The target area can then be followed. If, for example, the object with the contact area approaches the further object, the object with the contact area can independently select a target area on the other object using the selection algorithm and then subsequently acquire the information required to enable the movement compensation and/or the approach of the areas.
  • an input means can be provided. This is preferably designed in such a way that a target area can be selected manually, in particular by an operator, from the areas detected by the at least one sensor, and can then preferably be followed.
  • manual selection of the target area can thus also be provided.
  • Such an input means is advantageous, for example, if an operator of the object with the contact area does not want to rely on a purely automatic target area selection. If both are possible, i.e. automatic or manual target selection, the operator can fall back on manual target area selection if necessary, for example in difficult situations.
  • a tracking algorithm is preferably provided. This can be configured in such a way that the, in particular the selected, target area is tracked via it on the basis of the information recorded by the at least one sensor. Thus, after the target area has been selected, an operator does not have to track this target area manually, but the tracking advantageously takes place automatically. It is conceivable that the selected target area is marked and can be shown in a display.
  • a position determination algorithm is advantageously provided, which can be designed in such a way that it is used to determine movement and / or position information of the, in particular selected and / or tracked, target area. The position and movement relative to the contact area can then easily be calculated.
  • a compensation algorithm is preferably provided, which is designed in such a way that the areas, i.e. the target area and the contact area, are returned to this relative position or brought closer to one another again after an undesired relative displacement or rotation, starting from a current relative position.
  • the movement compensation can be carried out via the compensation algorithm.
  • the contact area and target area would preferably be in a fixed relative position to one another, although the object moves, in particular due to waves.
  • the information is ascertained or determined in the form of movement and / or position information of the contact area.
  • a relative movement and / or a relative position can then be determined from the information about the target area and from the information about the contact area.
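  • For illustration only, the following sketch derives a relative position and a relative velocity from two simultaneously sampled position streams for the target area and the contact area; the function name and the simulated data are hypothetical.

```python
# Sketch under assumptions: both areas are described by 3D positions sampled
# at the same times; relative motion is obtained by differencing the streams.
import numpy as np

def relative_motion(target_positions, contact_positions, dt):
    """Return relative positions and finite-difference relative velocities."""
    target = np.asarray(target_positions, dtype=float)
    contact = np.asarray(contact_positions, dtype=float)
    rel_pos = target - contact                      # target position seen from the contact area
    rel_vel = np.gradient(rel_pos, dt, axis=0)      # relative velocity estimate
    return rel_pos, rel_vel

if __name__ == "__main__":
    dt = 0.1  # sampling interval in seconds
    t = np.arange(0, 1.0, dt)
    target = np.stack([np.zeros_like(t), np.zeros_like(t), 0.5 * np.sin(2 * np.pi * 0.5 * t)], axis=1)
    contact = np.stack([np.zeros_like(t), np.zeros_like(t), 0.2 * np.sin(2 * np.pi * 0.3 * t)], axis=1)
    rel_pos, rel_vel = relative_motion(target, contact, dt)
    print("relative heave at last sample: %.3f m, %.3f m/s" % (rel_pos[-1, 2], rel_vel[-1, 2]))
```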
  • the information of the contact area can be used, for example, in the position determination algorithm and can be recorded simultaneously or after or before the determination of the information for the target area.
  • one of the objects is preferably a stationary structure and the other object is movable, in particular in the form of a vehicle. It would also be conceivable that both objects can be moved.
  • the vehicle is preferably an aircraft or a water-based vehicle or a land-based vehicle.
  • An offshore wind turbine or a fixed drilling platform, for example, is provided as the fixed structure.
  • a ship, a floating platform or a buoy can be provided as a water-bound vehicle.
  • the system according to the invention is arranged either on the fixed structure or on the movable object.
  • the component is a load that can be moved or suspended by means of a winch and a crane arm.
  • the contact area is preferably an underside bearing surface of the load.
  • the target area can be a storage area for the load on the other object.
  • the component is a crane arm or an access bridge or staircase or a gangway of a watercraft.
  • the contact area is then preferably a contact surface thereof, which is to be brought into contact with the other object or to be brought close to the object.
  • the component is a component of the winch, such as a connecting element, such as a crane hook for the load. If the movement compensation is provided for the winch, then, for example, the lifting movement of the vehicle, in particular the watercraft, can be compensated for in the end position of the rope, where the crane hook is, for example. This means that the crane hook would essentially be in a fixed position in space, although the vehicle is moving due to waves.
  • the end of the gangway could be movement-compensated via the movement compensation of the system.
  • the end of the ship gangway can be positioned at a fixed position in space despite the ship's movement, which enables it to be guided to the target area, for example to dock on a wind turbine.
  • the at least one actuator or a plurality of actuators is preferably designed such that the contact area of the component can be moved in one, two or three translational direction (s) and / or in one, two or three rotary direction (s). It is therefore conceivable that some or all of the movements can be compensated for. Depending on the application, a certain number or configuration of the actuator or actuators is then used in order to move the contact area in the desired directions. It has been shown that, for example, in the case of a winch with the load on the rope, good results can be achieved if only translational relative lifting movements in the vertical direction are compensated for via the movement compensation of the system.
  • the component of the movable object can be movable relative to the object with the at least one actuator or a further actuator.
  • the component can be moved relative to its object via this at least one actuator in such a way that the contact and target areas approach one another and / or that the areas maintain a predetermined distance or contact and / or that motion compensation takes place.
  • the sensor, or the sensor in combination with the selection algorithm, is preferably designed in such a way that - as already mentioned above - no aids are required in the target area in order to detect it. This means that no adjustments to the object with the target area are necessary. For example, no aids or additional sensors need to be used in the target area so that the sensor can detect the target area. For example, no motion reference units (MRU) or permanently installed reference measuring points for position determination are then required, such as reflectors for radar sensors or lasers. It is therefore not necessary to measure reference measuring points and report them to the control unit in a complex manner.
  • the sensor is preferably an imaging sensor, in particular a 3D image sensor.
  • the sensor or another sensor can be designed as a camera.
  • a monochrome camera or a color camera or a stereo camera or an infrared camera or a time-of-flight (TOF) camera is suitable here.
  • the sensor or a further sensor can be configured as a light detection and ranging (LIDAR) sensor.
  • a measuring range of the sensor or sensors can preferably be adjusted via a respective adjustment actuator.
  • the adjustment actuator can be controlled, for example, by the control unit or the operator.
  • the sensor can be flexibly aligned to the target area via the adjustment actuator. This can take place automatically, for example, and/or positioning, in particular rough positioning, takes place via the operator. It is also conceivable that, in the event of a change in position between the target area and the contact area, the sensor automatically follows the target area by activating the adjustment actuator accordingly.
  • the sensor is attached to the movable component, for example.
  • it can be moved together with the component and / or independently of the component via the adjustment actuator.
  • Moving the sensor together with the component has the advantage that when the component moves with its contact area towards the target area, the sensor can, for example, also be brought closer to the target area.
  • otherwise it would be necessary to calculate the movement / position of the contact point of the component in a complex manner using the kinematic data of the component. This calculation is omitted or simplified the closer the sensor system or the sensor is attached to the contact area.
  • the sensor in the form of a LIDAR sensor is attached to the movable component, in particular in the form of a boom.
  • the sensor is arranged on the end face, that is to say for example at an end section or a tip of the boom, and thus for example close to the contact area or adjacent to the contact area.
  • the sensor is thereby advantageously in an exposed position, which leads to a free measurement area.
  • At least one sensor or a further sensor and / or a sensor pair can be attached to the object with the component and thus preferably not to the movable component.
  • a camera, in particular a pan-tilt-zoom (PTZ) surveillance camera, is provided here as the sensor.
  • if a sensor pair is arranged, a LIDAR sensor connected to it, in particular mechanically, can be used in addition to the camera.
  • the arrangement of the sensor or sensors on the object is preferably provided in such a way that, in this position, the target area is always provided in the measurement area despite movement or wave movement of the object. This is ensured in particular by the PTZ surveillance camera.
  • if the LIDAR sensor is mechanically connected to the camera, it is also moved along with the camera, for example via an adjustment actuator, which means that only one adjustment actuator is necessary. This also extends its measuring range according to the requirements.
  • the sensor or the sensor pair are arranged, for example, on a movable platform via which the component can be moved about a vertical axis of the vehicle or the structure. The sensor or the pair of sensors can then, for example, be moved together with the platform via an actuator or adjustment actuator.
  • a further sensor is preferably provided on the object, which is provided in particular for automatic detection or for the automatic selection of the target area.
  • the further sensor is attached, for example, to the platform or to the vehicle or the structure, i.e. preferably not to the component.
  • the sensor or some of the sensors or all sensors are preferably intrinsically and / or extrinsically calibrated.
  • in an intrinsic calibration, for example, a sensor in the form of a camera is calibrated for the associated optics.
  • if several sensors are provided, these or some of them are preferably calibrated against one another.
  • the calibrations can enable a coordinate transformation between sensor coordinate systems so that the control unit can easily determine the desired information.
  • the individual images (cameras) / point clouds (LIDAR) are fused and/or merged in the algorithm at certain points.
  • Reasons for this can include mutual plausibility checks, creating redundancy for sensor failure or temporary occlusion or increasing the stability of the algorithms.
  • the relative position (3 translations, 3 rotations) between the coordinate systems of the imaging sensors should be known. This is guaranteed by the extrinsic calibration.
  • an input means or a further input means can be provided for the at least one actuator and / or for the at least one adjustment actuator.
  • the input means is, for example, a joystick.
  • the input means can be configured in such a way that the component having the contact area and/or the at least one sensor can be positioned with respect to the target area, in particular in advance. Thus, for example, rough positioning by an operator is possible in a simple manner before automatic movement compensation and/or approach of the areas takes place.
  • a display or a screen is preferably provided. Possible target areas detected by the at least one sensor can then be displayed on it, for example.
  • an image captured by the sensor, for example by a camera, can be displayed.
  • the operator can use the display to select a target area via an input means.
  • a position or 3D position of the, in particular selected, target area is shown on the display based on the information recorded by the at least one sensor.
  • a vehicle with the system according to one or more of the preceding aspects is provided.
  • the vehicle can be used as an object with the sensor.
  • a stationary structure is provided to which a vehicle, in particular a watercraft, can be docked.
  • the structure can then have the system according to one or more of the preceding aspects.
  • the structure can then be used as an object with the sensor. It is conceivable that the structure has the movable component.
  • This solution has the advantage that a movement compensation between the areas and / or an approximation of the areas is made possible in a simple manner.
  • the compensation can take place, for example, by a corresponding control of the crane arm, the gangway or the winch, with the help of which contact with the other object is established. Operations between, for example, two floating bodies or one floating body and one solid body are thus made possible in a simple manner.
  • the target area is tracked with a suitable algorithm, in particular with the tracking algorithm, and the change in position is continuously tracked.
  • this can be done using one or more of the following methods: Kalman filters, particle filters, feature-based methods, region-based methods, contour-based methods, model-based methods.
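  • By way of example, the sketch below implements one of the methods listed above, a constant-velocity Kalman filter tracking a single coordinate of the target-area position; the noise parameters are illustrative assumptions, not values from the patent.

```python
# Minimal constant-velocity Kalman filter (one axis) as one possible tracking
# method; process/measurement noise values are assumptions for illustration.
import numpy as np

class KalmanTracker1D:
    def __init__(self, dt=0.1, process_noise=0.05, meas_noise=0.1):
        self.F = np.array([[1.0, dt], [0.0, 1.0]])       # state transition (position, velocity)
        self.H = np.array([[1.0, 0.0]])                  # only the position is measured
        self.Q = process_noise * np.eye(2)               # process noise covariance
        self.R = np.array([[meas_noise]])                # measurement noise covariance
        self.x = np.zeros((2, 1))                        # state estimate
        self.P = np.eye(2)                               # estimate covariance

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[0, 0]                              # predicted position

    def update(self, z):
        y = np.array([[z]]) - self.H @ self.x            # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)         # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return self.x[0, 0]                              # filtered position

if __name__ == "__main__":
    tracker = KalmanTracker1D()
    for z in [0.0, 0.12, 0.19, 0.33, 0.41]:              # noisy measured target positions
        tracker.predict()
        print(f"measurement {z:+.2f} -> estimate {tracker.update(z):+.2f}")
```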
  • Information or position information is preferably continuously fed to the control unit for regulating the at least one actuator, in particular for the movable component, in order to implement the movement compensation, in particular with a manual assistance function or automated contact establishment.
  • the sensor signals from a plurality of sensors can be used in the determination of the information in the form of the movement and / or position information of the selected target area, in particular via the position determination algorithm. This is preferably the case in particular when the distance between the target area and the contact area exceeds a certain threshold value and / or the sensor signals of a sensor are too imprecise.
  • the sensor or the sensor pair is first used on the object and not on the component, for example the camera or the camera and the LIDAR sensor. Under certain conditions, the sensor on the component, which is preferably the LIDAR sensor, can then also be used if necessary.
  • if the ascertained information of the selected target area is too imprecise, it can be provided that the operator is informed of this, in particular in the display. A new target area can then be selected manually or automatically. It is conceivable that the operator initiates the automatic selection.
  • the distance between the contact area and the target area is preferably monitored, for example continuously or for example after a respective activation of the at least one actuator.
  • the method can then be ended from a predetermined distance or when the areas come into contact. This means, for example, if the relative distance of the areas has fallen below a previously defined threshold value, then the contact can be regarded as established.
  • the control can then, for example, be transferred to a general movement compensation system for the vehicle.
  • the contact area and the target area can optionally be brought closer to one another manually via the input means by an operator or automatically via the control unit. It is conceivable, for example, that in uncritical and simple situations the operator manually approaches the areas or intervenes in the approach if necessary.
  • the target area on the other object can be selected either manually via the operator or automatically via the control unit.
  • the operator can first ensure that the target area is provided in the field of vision of the sensors. To this end, the operator can control the at least one actuator and/or the at least one adjustment actuator in order to align the at least one sensor accordingly.
  • the sensor signals detected by the sensor can preferably be shown to the operator, in particular as images. This can take place, for example, in the form of a camera image and / or in the form of a LIDAR point cloud and / or in the form of a radar image.
  • the operator can select the target area, for example via the input means.
  • the input means is, for example, a joystick; a touch-sensitive display (touchscreen) would also be conceivable.
  • the information relating to the selected target area can then be ascertained, in particular via the position-determining algorithm.
  • a 3D position of the target area can be determined using suitable sensor data and preferably displayed to the operator, in particular via the display. For example, a so-called sensor fusion between stereo camera data and the LIDAR point cloud can take place.
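  • As a rough illustration of such a sensor fusion, the sketch below projects a LIDAR point cloud into a camera image using an assumed calibration and averages the points falling into the selected image region to obtain a 3D position; the intrinsic matrix, the transform and all values are placeholders.

```python
# Illustrative sketch only: project a LIDAR point cloud into the camera image
# (using an assumed extrinsic transform T and intrinsic matrix K) and estimate
# the 3D position of the target area from the points inside the selected box.
import numpy as np

def target_3d_position(points_lidar, T_cam_from_lidar, K, box):
    """box = (u_min, v_min, u_max, v_max) selected in the camera image."""
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])   # homogeneous coordinates
    pts_cam = (T_cam_from_lidar @ pts_h.T).T[:, :3]                      # points in the camera frame
    pts_cam = pts_cam[pts_cam[:, 2] > 0]                                 # keep points in front of the camera
    pix = (K @ pts_cam.T).T
    u, v = pix[:, 0] / pix[:, 2], pix[:, 1] / pix[:, 2]                  # perspective projection
    u_min, v_min, u_max, v_max = box
    mask = (u >= u_min) & (u <= u_max) & (v >= v_min) & (v <= v_max)
    if not mask.any():
        return None                                                      # no LIDAR support inside the box
    return pts_cam[mask].mean(axis=0)                                    # 3D position in the camera frame

if __name__ == "__main__":
    K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])          # assumed intrinsics
    T = np.eye(4)                                                        # assumed LIDAR-to-camera transform
    cloud = np.random.uniform([-2, -2, 5], [2, 2, 15], size=(500, 3))
    print("estimated target position:", target_3d_position(cloud, T, K, (300, 220, 340, 260)))
```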
  • the selection algorithm can be used for the automatic selection of the target area. This is preferably configured in such a way that the target area can be determined from the sensor signals or sensor data. A surround view camera system or sensors with a large opening angle are preferably suitable as the sensor. One or more of the following methods can be used as the selection algorithm: Iterative Closest Point (ICP), RANSAC, template matching, segmentation methods such as Otsu's method, K-means clustering, watershed, or the grab-cut algorithm.
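  • For illustration, the following sketch shows one of the listed selection approaches, a simple RANSAC plane fit that could isolate a flat candidate docking surface in a point cloud; thresholds and iteration counts are arbitrary assumptions and this is not the patented selection algorithm.

```python
# Minimal RANSAC plane segmentation (illustrative assumption): find the
# dominant plane in a point cloud, which could serve as a candidate docking
# surface for the automatic target-area selection.
import numpy as np

def ransac_plane(points, n_iter=200, threshold=0.05, rng=None):
    rng = np.random.default_rng(rng)
    best_model = (np.array([0.0, 0.0, 1.0]), 0.0)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_iter):
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:                       # degenerate sample, skip
            continue
        normal /= norm
        d = -normal @ sample[0]
        dist = np.abs(points @ normal + d)    # point-to-plane distances
        inliers = dist < threshold
        if inliers.sum() > best_inliers.sum():
            best_model, best_inliers = (normal, d), inliers
    return best_model, best_inliers

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    plane = np.c_[rng.uniform(-1, 1, (300, 2)), 0.01 * rng.normal(size=300)]   # points near z = 0
    noise = rng.uniform(-1, 1, (100, 3))
    (normal, d), inliers = ransac_plane(np.vstack([plane, noise]), rng=0)
    print("plane normal:", np.round(normal, 2), "inliers:", inliers.sum())
```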
  • the data acquired by the sensor or by the sensors can preferably be represented visually or graphically on the display.
  • the selected target area can be marked here, for example.
  • the current 3D position of the target area can be shown in the display.
  • the tracking algorithm uses the sensor data of two sensors, with which the target area can advantageously be tracked reliably.
  • Suitable sensors are, for example, the sensor (camera) or the sensor pair on the object (and not on the component) and the sensor (LIDAR sensor) on the component.
  • the position of the target area in the current sensor data can preferably be tracked using a filter, in particular using a Kalman filter.
  • the future position can preferably be calculated or estimated via the filter based on the current position of the target area, or the current position can be calculated or estimated via the filter based on the last known position.
  • the movement information is used to compensate for the movement when a bridge or another element is docked from one body to the other in order to meet and maintain a target position.
  • Fig. 1a shows an object in the form of a floating body, which is a watercraft 1.
  • a stationary structure 2 is shown as the object, which is, for example, an offshore wind turbine or a fixed drilling platform.
  • the watercraft 1 has a component in the form of a gangway 4. This is to be docked to a docking area 6 of the structure 2.
  • the gangway 4 can be pivoted via an actuator, not shown, so that a free-standing front end section 8 can be adjusted in height.
  • the gangway 4 can be rotated about a vertical axis of the watercraft 1 via an actuator.
  • a system according to the invention for movement compensation and for bringing the end section 8 closer to the docking area 6 is provided in the watercraft 1.
  • in Fig. 1b, an object in the form of a watercraft 10 is also shown. Furthermore, an object in the form of a stationary structure 12 is shown. In contrast to Fig. 1a, it is not the watercraft 10 but the structure 12 that has the movable gangway 4. The watercraft 10 has the docking area 6. The system according to the invention for movement compensation and for bringing the gangway 4 closer to the docking area 6 is provided in the structure 12.
  • a first object in the form of a watercraft 14 and a second object in the form of a watercraft 16 are shown.
  • the watercraft 14 has a crane arm 18 which is rotatable so that its free end can be adjusted in height, and which is preferably movable about a vertical axis of the watercraft 14.
  • the crane arm 18 interacts with a winch (not shown) for a rope 20, at the end of which a load 22 is fastened via a crane hook.
  • the load 22 is intended to be unloaded onto the watercraft 16.
  • the watercraft 14 has the system for motion compensation and for bringing the load 22 closer to the watercraft 16.
  • At least one sensor can be provided, for example, on the free end section of the crane arm 18, i.e. on the end section remote from the watercraft 14. This can capture a target area on the watercraft 16.
  • a contact area is provided for the load 22, for example.
  • Fig. 1d shows, in addition to the watercraft 10 with the docking area 6, the watercraft 1 with the gangway 4.
  • the watercraft 1 here has the system for movement compensation and for bringing the gangway 4 closer to the docking area 6.
  • the system can be used for motion compensation in operations between a watercraft and a stationary structure or in operations between two watercraft. Advantageously, in each case it is only necessary that one of the objects from the respective Figures 1a to 1d has the system, while the corresponding other object does not have to be adapted for the system.
  • the system is only installed on one of the objects and then enables it to establish mechanical contact with a large number of other floating or stationary bodies, to avoid damage during the establishment of contact and to be able to maintain contact over a longer period of time without damage.
  • movements of the watercraft 1, 10, 14, 16 caused by wind and waves can be taken into account and compensated for by the system when the contact area is established with the target area. This is done, for example, by adjusting the position of the gangway 4 or the crane arm 18 or the length of the rope 20.
  • a fully automatic and / or a semi-automatic method can be provided for the system.
  • in Fig. 2, an arrangement for the semi-automatic process is shown, and in Fig. 3 an arrangement for the fully automatic process is shown.
  • in Fig. 2, the watercraft 1 is shown with the gangway 4 or bridge. Furthermore, an object in the form of a watercraft 24 is shown which has a docking area 6.
  • a sensor 26 in the form of a LIDAR sensor is arranged and fastened on the end face of the gangway 4, that is to say at its free end.
  • a pair of sensors 28 is provided, which is not attached to the gangway 4, but directly to the watercraft 1. For example, the pair of sensors 28 is arranged on a platform which is rotatable about a vertical axis of the watercraft 1 and on which the gangway 4 is movably mounted.
  • the sensor pair 28 has a camera, in particular in the form of a pan-tilt-zoom (PTZ) camera, and a LIDAR sensor.
  • the gangway has a contact area 30 at the end, which is to be brought into contact with a target area 32 for docking with the watercraft 24.
  • the sensor pair 28 is preferably arranged on the watercraft 1 in such a way that the target area 32 remains detectable up to a certain wave movement or even with any wave movement. This is ensured by the PTZ camera.
  • it would also be conceivable to at least partially compensate for the wave movement or other movements for the sensor pair 28 with a movement compensation.
  • the sensor pair 28 is mechanically firmly connected and can be moved together, for example via an adjustment actuator.
  • the LIDAR sensor, for example, is connected to the camera through the mechanical connection and its field of view is thereby expanded according to the requirements.
  • the sensors 26, 28 are calibrated intrinsically, extrinsically and against one another. This enables a coordinate transformation between the sensor coordinate systems, where $K_{B1}$ is the coordinate system of the first sensor of the sensor pair 28, $K_{B2}$ is the coordinate system of the second sensor of the sensor pair 28 and $K_A$ is the coordinate system of the sensor 26.
  • the coordinate transformation can be carried out with the following common formulas: $T_{K_{B1}}^{K_A}$, $T_{K_{B2}}^{K_A}$ and $T_{K_{B1}}^{K_{B2}}$, where $T$ is a transformation matrix.
  • the transformation matrix T is a 4x4 matrix with a sub-rotation matrix R, which describes the rotation between the coordinate systems, and the sub-translation matrix t, which describes the translation between the coordinate systems.
  • the representation of a vector with respect to the coordinate system $K_{B1}$ can thus be converted into a representation with respect to the coordinate system $K_{B2}$, for example via $v^{K_{B2}} = T_{K_{B1}}^{K_{B2}} \, v^{K_{B1}}$ in homogeneous coordinates.
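  • The sketch below shows this standard homogeneous transformation, building $T$ from $R$ and $t$ and converting a point from one sensor coordinate system into another; the numerical values are merely examples.

```python
# Sketch of the standard 4x4 homogeneous transformation T built from a
# rotation R and a translation t, used to convert a vector from one sensor
# coordinate system (e.g. K_B1) into another (e.g. K_B2). Values are examples.
import numpy as np

def make_transform(R, t):
    T = np.eye(4)
    T[:3, :3] = R          # rotation block
    T[:3, 3] = t           # translation block
    return T

def transform_point(T, p):
    """Apply T to a 3D point using homogeneous coordinates."""
    p_h = np.append(p, 1.0)
    return (T @ p_h)[:3]

if __name__ == "__main__":
    angle = np.deg2rad(30)                                   # example rotation about the z axis
    R = np.array([[np.cos(angle), -np.sin(angle), 0],
                  [np.sin(angle),  np.cos(angle), 0],
                  [0, 0, 1]])
    t = np.array([1.0, 0.5, 0.2])                            # example translation in metres
    T_B2_from_B1 = make_transform(R, t)
    p_B1 = np.array([2.0, 0.0, 1.0])                         # point expressed in K_B1
    print("point in K_B2:", transform_point(T_B2_from_B1, p_B1))
```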
  • via actuators 34, which are shown schematically with a block in Fig. 2, the gangway 4 can be pivoted so that the height of the contact area 30 can be changed, and can be rotated about a vertical axis 36 of the watercraft 1.
  • the actuators 34 can be controlled via a control unit 37.
  • in step 36, an operator ensures that the target area 32 from Fig. 2 is in the field of view of the sensors 26 and 28. This can take place in that the sensor pair 28 is moved together with the gangway 4 via the actuators 34, for example via the platform, and/or the watercraft 1 is moved by its drive, and/or in that an adjustment actuator is provided for the sensor pair 28 and/or for the sensor 26, via which it is/they are movable.
  • a manual input of the change in position for the sensors 26, 28 takes place, for example, with the aid of a joystick and / or a keyboard console and / or a touch screen.
  • in step 38, the sensor signals or sensor data from sensors 26, 28 are recorded.
  • the sensor data of the PTZ camera of the sensor pair 28 are determined as a camera image $I_B$.
  • the sensor data of the LIDAR sensor of the sensor pair 28 are recorded as a LIDAR data signature $D_B$.
  • the sensor data of the LIDAR sensor (sensor 26) are recorded as a LIDAR data signature $D_A$.
  • in step 40, the visualization takes place on the screen.
  • the sensor signals of the scene are shown to the operator in an image.
  • the image of the PTZ camera of the sensor pair 28 is shown superimposed on the LIDAR point clouds of the LIDAR sensors (sensor pair 28 and 26) on a display or monitor.
  • in step 42, the operator then selects a target area based on the sensor signals $I_B$, $D_B$ and $D_A$ displayed on the monitor, the monitor being, for example, a touchscreen.
  • a template camera image $I_{B,0}$ of the PTZ camera (sensor pair 28) and LIDAR data signatures $D_{B,0}$ and $D_{A,0}$ of the LIDAR sensors (sensor pair 28 and sensor 26), which characterize the target area, are then stored for the target area.
  • the selected target area is then marked in the monitor with a marking, for example a bounding box.
  • in step 44, the position determination algorithm is used to determine the information in the form of movement and position information of the target area 32 relative to the contact area 30.
  • a 3D position of the target area is determined using the sensor signals or sensor data from the sensors 26 and 28 and displayed to the operator.
  • the template image is usually smaller than the current camera image.
  • a moving window is placed over the camera image at every possible position and the correlation is calculated.
  • $u'$ and $v'$ are the two pixel coordinates of the captured images.
  • $u$ and $v$ denote the a priori unknown (lateral) shifts in the two pixel coordinates between the image section and the template image.
  • the aim is now to determine values for $u$ and $v$ such that the closest possible correspondence between the image section and the template image results.
  • the degree of agreement is expressed mathematically by the correlation, which is to be maximized by a suitable choice of $u$ and $v$.
  • the values of $u$ and $v$ that maximize the correlation are called $u_0$ and $v_0$.
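  • As an illustration of this correlation-based search, the following sketch slides the template over the image and keeps the shift $(u_0, v_0)$ that maximizes a normalized cross-correlation; it is a plain implementation of the general idea, not necessarily the exact correlation measure used here.

```python
# Illustrative normalized cross-correlation template matching: slide the
# template over the image and keep the shift (u0, v0) that maximizes the
# correlation, as described above. Pure-numpy sketch for small images.
import numpy as np

def match_template(image, template):
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    best, best_uv = -np.inf, (0, 0)
    for v in range(ih - th + 1):              # vertical shift v
        for u in range(iw - tw + 1):          # horizontal shift u
            window = image[v:v + th, u:u + tw]
            w = window - window.mean()
            denom = np.sqrt((w * w).sum() * (t * t).sum())
            if denom < 1e-12:
                continue
            corr = (w * t).sum() / denom      # normalized cross-correlation
            if corr > best:
                best, best_uv = corr, (u, v)
    return best_uv, best

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    image = rng.normal(size=(60, 80))
    template = image[20:30, 35:50].copy()     # known location for the demo
    (u0, v0), score = match_template(image, template)
    print(f"best shift u0={u0}, v0={v0}, correlation={score:.2f}")
```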
  • the 3D position of the target area can then be determined using the Random Sample Consensus (RANSAC) or an Iterative Closest Point (ICP) algorithm. If a relative distance between the contact area 30 and the target area 32 is above a certain threshold, the data $D_A$ of the sensor 26 are also used for more precise position determination. If the selected target area has only a low contrast in the detected sensor signals, so that the position determination can only be imprecise, then the unreliable measurement is signaled to the operator and he can mark a new point, which is shown by the arrow 46 in Fig. 4, whereupon step 44 is carried out again.
  • step 48 follows.
  • in step 48, the movement and/or position information is continuously fed to the control unit 37, which controls the actuators 34 accordingly. These are controlled in such a way that the contact area 30 is moved in the direction of the target area 32. In this case, a previously defined distance is covered and then a renewed recording of the sensor signals from sensors 26 and 28 is triggered, which is shown by step 50.
  • in step 50, it is thus provided that the sensors 26 and 28 record the sensor signals $I_B$, $D_B$ and $D_A$.
  • in a subsequent step 52, the selected target area 32 is tracked or found again in the sensor signals $I_B$, $D_B$ and $D_A$ using a tracking algorithm via the control unit 37.
  • This is preferably done using a Recursive Least Square (RLS) algorithm.
  • This is a set of mathematical equations that represent a computationally efficient, in particular recursive, estimation of the state of a process. The mean square error is minimized here.
  • the current measured position $r_t$ at time $t$ is used as an input value for estimating the future position $r_{t+1}$.
  • the position $r_t$ is the last determined position information of the target area, which was initially determined in step 44 and is determined again in the following step 54, which is explained in more detail below.
  • the estimate is formed as $\hat r_{t+1} = \tilde r_t^{T} w_t$, where $\tilde r_t$ is a vector formed from $N$ measurements up to time $t$ and $w_t$ are coefficients.
  • the gain vector is $K_t = \dfrac{P_{t-1}\,\tilde r_{t-1}}{\lambda + \tilde r_{t-1}^{T} P_{t-1}\,\tilde r_{t-1}}$.
  • $\lambda$ ($0 < \lambda \le 1$) is the so-called forgetting factor.
  • $P_t$ is the inverse correlation matrix of the measurement data.
  • the a priori estimation error is $\alpha_t = r_t - \tilde r_{t-1}^{T}\, w_{t-1}$.
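  • To make the recursion tangible, the sketch below implements a textbook recursive-least-squares predictor with forgetting factor $\lambda$ for a scalar position series; the parameter values are illustrative assumptions.

```python
# Sketch of a standard recursive-least-squares (RLS) predictor with forgetting
# factor lam, applied to a scalar position series r_t. The last n positions
# form the regression vector; parameter values are illustrative assumptions.
import numpy as np

class RLSPredictor:
    def __init__(self, n=3, lam=0.98, delta=100.0):
        self.n = n                         # number of past measurements used
        self.lam = lam                     # forgetting factor (0 < lam <= 1)
        self.w = np.zeros(n)               # coefficient vector
        self.P = delta * np.eye(n)         # inverse correlation matrix

    def update(self, r_hist, r_t):
        """r_hist: last n positions (newest last); r_t: new measurement."""
        phi = np.asarray(r_hist, dtype=float)
        k = self.P @ phi / (self.lam + phi @ self.P @ phi)   # gain vector
        alpha = r_t - phi @ self.w                           # a priori error
        self.w = self.w + k * alpha                          # coefficient update
        self.P = (self.P - np.outer(k, phi @ self.P)) / self.lam
        return phi @ self.w                                  # refreshed prediction

    def predict(self, r_hist):
        return np.asarray(r_hist, dtype=float) @ self.w

if __name__ == "__main__":
    t = np.arange(0, 6, 0.1)
    r = 0.5 * np.sin(t)                                      # simulated heave of the target area
    rls = RLSPredictor(n=3)
    for i in range(3, len(r)):
        rls.update(r[i - 3:i], r[i])
    print("next-position prediction: %.3f (true %.3f)" % (rls.predict(r[-3:]), 0.5 * np.sin(6.0)))
```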
  • a Kalman filter, for example, can also be used.
  • following step 52, the information in the form of movement and position information of the tracked target area is determined in subsequent step 54 on the basis of the sensor signals $I_B$, $D_B$ and $D_A$ from the sensors 26 and 28. The procedure corresponds to that in step 44.
  • in step 54, it can be determined how large the distance between the contact area 30 and the target area 32 is. If the contact area 30 has not yet reached the target area 32, the method is carried out again from step 48, which is shown by the arrow 56. If the contact point is reached, step 58 follows after step 54. That is, if a relative distance between the contact area 30 and the target area 32 has fallen below a previously defined threshold value, then the contact between these areas is regarded as established. The control is then passed on to the general movement compensation of the object, in this case the watercraft 1.
  • a further sensor 60 is provided in the watercraft 1.
  • This is a scanning sensor with a comparatively large opening angle.
  • a LIDAR sensor can be used.
  • with the sensor 60, an environment can then be scanned over a large area, this preferably being done at a comparatively high resolution.
  • a recording speed then plays a subordinate role.
  • the sensors 26 and 60 and the sensor pair 28 are intrinsically, extrinsically and mutually calibrated, which enables a coordinate transformation between the individual sensor coordinate systems, where $K_C$ is the coordinate system of the sensor 60.
  • the coordinate transformation can be carried out with the following common formulas: $T_{K_C}^{K_{B1}}$, $T_{K_C}^{K_{B2}}$, $T_{K_C}^{K_A}$, $T_{K_{B1}}^{K_A}$, $T_{K_{B2}}^{K_A}$ and $T_{K_{B1}}^{K_{B2}}$.
  • a step 62 is provided in Fig. 5, in which the sensor 60 picks up sensor signals. This is done in the form of the data $D_C$.
  • in step 66, the alignment of the sensor 26 and the sensor pair 28 towards the target area 32 then takes place.
  • step 68 corresponds to step 38 from Fig. 4, whereby sensor signals are detected.
  • step 70 corresponds to step 44 from Fig. 4. This means that the movement and/or position information of the selected target area is determined. If this is not possible, the method is repeated starting from step 62, which is indicated by the arrow 72. Once the desired information has been determined, steps 74, 76, 78, 80 and 82 follow after step 70, which correspond to steps 48, 50, 52, 54 and 58 of Fig. 4.
  • the target area 32, that is to say the docking point, is identified by the method itself, for example with object recognition.
  • the target area is specified by the operator using the sensor data in the semi-automatic process.

EP20167485.0A 2019-04-11 2020-04-01 Système de compensation de mouvement entre deux objets, véhicule doté dudit système, structure fixe dotée dudit système et procédé doté dudit système Withdrawn EP3722197A1 (fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DE102019205186.3A DE102019205186A1 (de) 2019-04-11 2019-04-11 System zur Bewegungskompensation zwischen zwei Objekten, Fahrzeug mit dem System, feststehende Struktur mit dem System und Verfahren mit dem System

Publications (1)

Publication Number Publication Date
EP3722197A1 true EP3722197A1 (fr) 2020-10-14

Family

ID=70154254

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20167485.0A Withdrawn EP3722197A1 (fr) 2019-04-11 2020-04-01 Système de compensation de mouvement entre deux objets, véhicule doté dudit système, structure fixe dotée dudit système et procédé doté dudit système

Country Status (2)

Country Link
EP (1) EP3722197A1 (fr)
DE (1) DE102019205186A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009141675A1 (fr) * 2008-05-22 2009-11-26 Fmc Technologies Sa Dispositif de commande destiné à un système de transfert de fluide en mer
WO2015044898A1 (fr) * 2013-09-27 2015-04-02 Rolls-Royce Canada, Ltd. Système de compensation de mouvement à deux corps pour applications marines
DE102014220597A1 (de) * 2014-10-10 2016-04-14 Robert Bosch Gmbh Wellengangkompensationssystem
US20180244505A1 (en) * 2017-02-28 2018-08-30 J. Ray Mcdermott S.A. Offshore ship-to-ship lifting with target tracking assistance

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MONSON H. HAYES: "Statistical Digital Signal Processing and Modeling", article "Recursive Least Squares"

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3988439A1 (fr) * 2020-10-22 2022-04-27 Robert Bosch GmbH Système doté d'un dispositif de compensation de mouvement et procédé
DE102020213322A1 (de) 2020-10-22 2022-04-28 Robert Bosch Gesellschaft mit beschränkter Haftung System mit einer Bewegungskompensationseinrichtung und Verfahren
CN113104153A (zh) * 2021-04-25 2021-07-13 大连海事大学 一种海上换乘栈桥波浪补偿控制系统及其工作方法
CN113232768A (zh) * 2021-04-25 2021-08-10 大连海事大学 一种具有波浪补偿功能的海上换乘栈桥及其工作方法
CN113232768B (zh) * 2021-04-25 2022-05-13 大连海事大学 一种具有波浪补偿功能的海上换乘栈桥及其工作方法
CN113104153B (zh) * 2021-04-25 2022-05-17 大连海事大学 一种海上换乘栈桥波浪补偿控制系统及其工作方法

Also Published As

Publication number Publication date
DE102019205186A1 (de) 2020-10-15

Similar Documents

Publication Publication Date Title
EP3722197A1 (fr) Système de compensation de mouvement entre deux objets, véhicule doté dudit système, structure fixe dotée dudit système et procédé doté dudit système
DE102010035898B3 (de) Unbemanntes Unterwasserfahrzeug und Verfahren zum Betrieb eines unbemannten Unterwasserfahrzeugs
DE102007019491B4 (de) Fahrzeugumgebungs-Überwachungsvorrichtung, Fahrzeug, Fahrzeugumgebungs-Überwachungsverfahren und Fahrzeugumgebungs-Überwachungsprogramm
DE102016013274A1 (de) Bildverarbeitungsvorrichtung und verfahren zur erkennung eines bilds eines zu erkennenden objekts aus eingabedaten
DE102014005181A1 (de) Positions- und Lagebestimmung von Objekten
EP1615047A2 (fr) Méthode de calibration d'un capteur optique de distance monté sur véhicule
DE112010004767T5 (de) Punktwolkedaten-Verarbeitungsvorrichtung, Punktwolkedaten-Verarbeitungsverfahren und Punktwolkedaten-Verarbeitungsprogramm
DE102017218366A1 (de) Verfahren und system zur fussgängererfassung in einem fahrzeug
DE102006055652A1 (de) Verfahren zur Aufarbeitung dreidimensionaler Daten und Vorrichtung zur Aufarbeitung dreidimensionaler Daten
EP2234081A2 (fr) Systèmes et procédés pour allouer et transmettre des transmissions de blocs de données en liaison ascendante
WO2009049750A2 (fr) Procédé d'étalonnage d'un ensemble constitué d'au moins une caméra omnidirectionnelle et d'une unité d'affichage optique
DE112015002764B4 (de) Montagewinkeleinstellverfahren und Montagewinkelerfassungseinrichtung für bordeigene Kamera
DE102018200060B4 (de) Verfahren zum Betreiben einer mobilen Arbeitsmaschine und mobile Arbeitsmaschine
DE102013202915A1 (de) Verfahren und Vorrichtung zum Vermessen einer Parklücke für ein Einparkassistenzsystem eines Kraftfahrzeugs
DE102016124747A1 (de) Erkennen eines erhabenen Objekts anhand perspektivischer Bilder
EP2350977B1 (fr) Procédé pour fusionner au moins deux images pour former une image panoramique
EP3770804A1 (fr) Chariot de manutention à détection d'objets
EP3435030A1 (fr) Procédé de création d'un modèle tridimensionnel d'un objet
DE102020133092A1 (de) Positionsschätzungseinrichtung und positionsschätzungsverfahren
DE102014219428B4 (de) Selbstkalibrierung eines Stereokamerasystems im Auto
DE102006007550A1 (de) Vorrichtung und Verfahren zur Erkennung einer Fahrbahnmarkierung für ein Kraftfahrzeug
WO2019162327A2 (fr) Procédé de calcul d'un éloignement entre un véhicule automobile et un objet
DE102020209841B4 (de) Reinigungssystem mit einem selbstfahrenden Reinigungsroboter und einer Ladestation
EP3833576B1 (fr) Systeme de camera de surveillance
DE102017104957A1 (de) Verfahren zum Bestimmen einer Bewegung von zueinander korrespondierenden Bildpunkten in einer Bildsequenz aus einem Umgebungsbereich eines Kraftfahrzeugs, Auswerteeinrichtung, Fahrerassistenzsystem sowie Kraftfahrzeug

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20210414

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: VAN HALTEREN TECHNOLOGIES BOXTEL B.V.

17Q First examination report despatched

Effective date: 20221028

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20230308