WO2018103929A1 - Vehicle and method for a vehicle with presentation of maximal load range limitation - Google Patents

Vehicle and method for a vehicle with presentation of maximal load range limitation

Info

Publication number
WO2018103929A1
WO2018103929A1 (PCT/EP2017/076323)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
image
load
limitation
loading
Prior art date
Application number
PCT/EP2017/076323
Other languages
English (en)
Inventor
Marcus RÖSTH
Original Assignee
Cargotec Patenter Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cargotec Patenter Ab filed Critical Cargotec Patenter Ab
Publication of WO2018103929A1

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B66 - HOISTING; LIFTING; HAULING
    • B66C - CRANES; LOAD-ENGAGING ELEMENTS OR DEVICES FOR CRANES, CAPSTANS, WINCHES, OR TACKLES
    • B66C23/00 - Cranes comprising essentially a beam, boom, or triangular structure acting as a cantilever and mounted for translatory or swinging movements in vertical or horizontal planes or a combination of such movements, e.g. jib-cranes, derricks, tower cranes
    • B66C23/88 - Safety gear
    • B66C23/90 - Devices for indicating or limiting lifting moment
    • B66C23/905 - Devices for indicating or limiting lifting moment electrical
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B66 - HOISTING; LIFTING; HAULING
    • B66C - CRANES; LOAD-ENGAGING ELEMENTS OR DEVICES FOR CRANES, CAPSTANS, WINCHES, OR TACKLES
    • B66C23/00 - Cranes comprising essentially a beam, boom, or triangular structure acting as a cantilever and mounted for translatory or swinging movements in vertical or horizontal planes or a combination of such movements, e.g. jib-cranes, derricks, tower cranes
    • B66C23/62 - Constructional features or details
    • B66C23/72 - Counterweights or supports for balancing lifting couples
    • B66C23/78 - Supports, e.g. outriggers, for mobile cranes
    • B66C23/80 - Supports, e.g. outriggers, for mobile cranes hydraulically actuated

Definitions

  • the present disclosure relates to a vehicle, and a method for a vehicle, providing a display unit presenting an overview image, e.g. a bird-view image, of the area surrounding the vehicle and including a superimposed image of at least one determined maximal load range limitation.
  • visual assistance systems comprise cameras for monitoring the ground area around the vehicle.
  • Cameras are known in trucks to facilitate manoeuvring of the vehicle, in particular when reversing. The driver may then watch the vehicle's position on a display during the driving operation.
  • In JP-2008074594, a monitoring system is disclosed for a vehicle provided with a crane and support legs.
  • the crane comprises a rotatable base and a telescopic crane boom mounted to the base.
  • Cameras are mounted to the crane and configured to monitor the area surrounding the vehicle. Images taken by the cameras are transformed into a bird-view image.
  • US-20150330146 discloses a monitoring system for a utility vehicle provided with an aerial apparatus, for instance in the form of a turnable ladder or a crane, and support legs. Cameras are mounted to the vehicle and configured to monitor the ground areas on which the support legs rest in the operating position.
  • the ground areas monitored by the cameras are shown on a display in the driver's cabin of the vehicle. Markings illustrating the position of the support legs in the operating position are superposed on the image shown on the display. Furthermore, a marking illustrating the vertical turning axis of the aerial apparatus is also superposed on the image shown on the display.
  • EP-2952467 discloses a monitoring system for a vehicle provided with a lifting device and support legs. Cameras are mounted to the vehicle and configured to monitor the area surrounding the vehicle. Images taken by the cameras are transformed into a bird view image and this bird view image is shown on a display in the driver's cabin of the vehicle. Markings illustrating the position of the support legs in different operating positions are superposed on the bird view.
  • This enables the driver, when parking the vehicle, to check that there are no obstacles in the surroundings which will interfere with the support legs.
  • EP-2543622 discloses a monitoring system for an extensible boom of a mobile crane, wherein a camera is mounted near the outer end of the extensible boom in order to monitor a load suspended by the boom. The image taken by the camera is shown on a display. An image illustrating the extensible boom may be superposed on the image shown on the display. A limit performance line illustrating an area of maximum operation radius in which the extensible boom can move is superposed on the image shown on the display.
  • The object of the present invention is to achieve an improved vehicle, and an improved method for such a vehicle, that provide a support tool helping the operator to accurately and easily position the vehicle in relation to an object, and that indicate an applicable extension of the support legs in the specific situation.
  • the invention relates to a vehicle, in particular a vehicle for loading objects, comprising at least one sensing device mounted on the vehicle and being configured to capture measurement data to monitor the entire area, or a part of the area surrounding the vehicle.
  • The at least one sensing device is configured to generate at least one measurement data signal including the captured measurement data. A processing unit is configured to receive the measurement data signal, to determine a real time overview image, e.g. a bird-view image, of the area surrounding the vehicle based upon the measurement data, and to generate a real time image signal to be applied to a display unit configured to show the real time overview image.
  • A loading arrangement is provided, configured to load objects to and from the vehicle and controlled by a loading control signal determined by the processing unit. At least one supporting leg is arranged at the vehicle and configured to be extended outside the vehicle to support the vehicle during a loading procedure; a supporting leg extension distance ranges from 0-100% of a maximal extension distance.
  • The processing unit is further configured to receive an object parameter comprising at least a weight of the object, and to determine a maximal load range limitation for the loading arrangement for at least one extension distance within the entire range of support leg extension distances, in dependence of the weight of the object.
  • The processing unit is configured to determine a load limitation image of at least one of the determined maximal load range limitations, wherein the load limitation image is a graphical illustration of the at least one maximal load range limitation in relation to the vehicle.
  • the processing unit is then configured to superimpose the load limitation image on the shown real time overview image, and more particularly, to superimpose the load limitation image in a fixed position in relation to a virtual image of the vehicle.
  • the invention comprises a method to be applied in a vehicle, in particular a vehicle for loading objects.
  • The method comprises capturing measurement data by at least one sensing device mounted on the vehicle, generating at least one measurement data signal including the captured measurement data, receiving the measurement data signal in a processing unit, determining a real time overview image, e.g. a bird-view image, of the area surrounding the vehicle based upon said measurement data, and generating a real time image signal applied to a display unit where the real time overview image is shown.
  • The method further comprises receiving an object parameter comprising at least a weight of the object, determining a maximal load range limitation for the loading arrangement, determining a load limitation image and superimposing it on the shown real time overview image.
  • Thereby a vehicle and a method are provided where a graphical illustration of the load range limitations for a loading arrangement is presented to the operator, superimposed on a real time overview image of the area surrounding the vehicle.
  • The maximal load range limitations are determined for 2-5 different support leg extension distances, e.g. 50%, 75% and 100% of the maximal extension. Thereby the operator will have a clear indication of the available options regarding how to position the vehicle in relation to the object.
  • the processing unit is configured to receive an input signal including information of a chosen load range limitation.
  • The input signal may e.g. be generated in response to an operator input via a touch screen of the display unit.
  • the processing unit is configured to determine a loading control signal in dependence of the chosen load range limitation, wherein the loading control signal comprises load instructions to control the loading arrangement to perform a loading procedure of an object within a loading area defined by the chosen load range limitation. This is advantageous in order to perform an accurate and fast loading of the object.
  • Figure 1 is a block diagram schematically illustrating a vehicle according to the present invention.
  • Figures 2 and 3 are schematic illustrations of a screen of a display unit according to embodiments of the present invention.
  • Figure 4 is a flow diagram illustrating the method according to the present invention.
  • The present invention relates to a vehicle 2, in particular a vehicle for loading objects 20, e.g. for emptying a waste bin or loading a pile of timber.
  • The vehicle is provided with a loading arrangement 24, e.g. a roof mounted crane (RMC) or a general articulated crane, capable of lifting loads from the ground to the load carrying part of the vehicle, and also of lifting loads from the vehicle to the ground.
  • The loading arrangement 24 is controlled by a loading control signal 22 determined by the processing unit 8.
  • The vehicle is further provided with at least one supporting leg 30, e.g. four supporting legs, configured to be extended outside the vehicle to support the vehicle during a loading procedure, the legs often being extended laterally, perpendicular to the longitudinal axis of the vehicle.
  • a supporting leg extension distance ranges from 0-100% of a maximal extension distance.
  • the extended legs are denoted by dashed lines.
  • the object to be handled is not a part of the vehicle 2 and not physically connected to the vehicle but instead positioned outside the vehicle in an environment being accessible by the vehicle.
  • the vehicle comprises at least one sensing device 4 mounted on the vehicle 2 and being configured to capture measurement data to monitor the entire area surrounding the vehicle.
  • the sensing device may be one or many of a camera, a radar, an infra-red sensor, a laser-scanner or any other type of sensing device.
  • Various combinations of different types of sensing devices may be applied, e.g. one or many cameras and one or many laser-scanners.
  • the at least one sensing device 4 is configured to generate at least one measurement data signal 6 including the captured measurement data.
  • the processing unit 8 is configured to receive the measurement data signal 6 and to determine a real time overview image 10 of the entire area, substantially the entire area, or a part of the area surrounding the vehicle 2 based upon the measurement data.
  • the part may be half of the area surrounding the vehicle, e.g. an area to the right of the vehicle.
  • the processing unit 8 is further configured to generate a real time image signal 12 to be applied to a display unit 14 configured to show the real time overview image 10 (see figure 2).
  • The processing unit is provided with the necessary processing capability and computer memory, and is e.g. realized by a general-purpose computer unit normally available in the vehicle.
  • The number of sensing devices, e.g. cameras, is naturally related to their visual fields, which should overlap in order to cover the entire surroundings of the vehicle. In the figure six cameras are illustrated, preferably arranged at elevated positions on the vehicle, but of course fewer or more sensing devices could be applied.
  • The measurement data, e.g. the images, received from the sensing devices (the cameras) are then combined into a single overview image.
  • the vehicle itself may be represented as a central rectangular marking in the centre of the image - see figures 2 and 3.
  • the measurement data received by the processing unit from the sensing devices represent the surroundings obtained at an angle that may range from 0 to 180 degrees in relation to the level of the ground.
  • These images obtained by the sensing devices should then be transformed into an overview image, e.g. an image from above, i.e. a so-called bird-view image.
  • a bird-eye view transformation technique is applied to generate a top view perspective of an image obtained by the sensing devices.
  • One applied technique can be classified, under digital image processing, as a geometrical image modification. Basically, the bird's-eye-view transform can be divided into three steps: first the image is represented in a shifted coordinate system, next a rotation of the image is performed, and then the image is projected onto a two-dimensional plane, as illustrated by the sketch below.
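  • For a planar ground surface, these three steps compose into a single planar homography per camera. The following is a minimal sketch of such an inverse-perspective (bird-view) warp using OpenCV; the function name, the four ground reference points and the output size are illustrative assumptions, not values from the patent.

```python
import cv2
import numpy as np

def to_bird_view(frame, src_pts, dst_pts, out_size=(800, 800)):
    """Warp one camera frame onto a common top-view ground plane."""
    h_matrix = cv2.getPerspectiveTransform(np.float32(src_pts), np.float32(dst_pts))
    return cv2.warpPerspective(frame, h_matrix, out_size)

# Hypothetical calibration: four points on the ground seen by a forward camera
# (pixel coordinates) and their target positions in the top-view image.
src = [(420, 720), (860, 720), (1100, 430), (180, 430)]
dst = [(300, 780), (500, 780), (500, 400), (300, 400)]
# top_view = to_bird_view(camera_frame, src, dst)
```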
  • the display unit is advantageously mounted in the cabin of the vehicle 2 such that images shown on the display are easily visible for the operator, e.g. the driver.
  • the display is configured to show an overview illustration, e.g. a view from above, of the area surrounding the vehicle, and an object 20, if visible, to be picked up.
  • The feature 'display' should be interpreted broadly and should e.g. also comprise projected images, e.g. images projected onto the windscreen, i.e. so-called head-up systems.
  • The processing unit 8 is further configured to receive an object parameter comprising at least a weight of the object, and to determine a maximal load range limitation for the loading arrangement 24 for at least one extension distance within the entire range of support leg extension distances, in dependence of the weight of the object.
  • the object parameter may be entered to the processing unit via the display unit, e.g. if the display unit is provided with a touchscreen.
  • a maximal load range limitation for an input weight of an object is calculated by taking into account the lifting capacity of the loading arrangement for the entire range of support leg extension distances.
  • the maximal distance Dmax to the load should be determined for the entire range of support leg extension distances SLdist using the following formula:
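  • The formula itself is not reproduced in this extract. As an assumed illustration only, a simple moment-balance model can be sketched: the allowed lifting moment grows with the support leg extension, and the maximal distance to the load is that moment divided by the weight of the object. All names and numbers below are hypothetical, not the patent's values.

```python
G = 9.81  # gravitational acceleration, m/s^2

def max_load_distance(object_mass_kg, sl_dist_fraction, m_max_full_nm=150_000.0):
    """Maximal horizontal distance to the load for a leg extension fraction (0..1)."""
    allowed_moment = m_max_full_nm * sl_dist_fraction   # assumed linear stability model
    return allowed_moment / (object_mass_kg * G)         # metres

# e.g. range limits for a 1200 kg object at the 50%, 75% and 100% extensions:
# [round(max_load_distance(1200, f), 1) for f in (0.5, 0.75, 1.0)]  # ~[6.4, 9.6, 12.7]
```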
  • The processing unit is then configured to determine a load limitation image 34 of at least one of the determined maximal load range limitations; the load limitation image 34 is a graphical illustration 32 of the at least one maximal load range limitation in relation to the vehicle, and preferably around the vehicle. It may also be possible to determine the load range limitation for only one side of the vehicle. The thus determined load limitation image 34 is then superimposed on the shown real time overview image 10. The combined image is shown in figure 2, where four different load range limitations 32 are shown.
  • the processing unit 8 is configured to superimpose the load limitation image 34 in a fixed position in relation to a virtual image 28 of the vehicle, which is illustrated in figure 2.
  • the virtual image of the vehicle is centred in relation to the graphical illustration 32 of the load range limitation(s) of the load limitation image.
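  • A minimal sketch of this superimposition step is given below: a virtual vehicle rectangle is drawn fixed in the centre of the bird-view image, and one range curve is drawn per maximal load range limitation. The circular shape, colours and pixel sizes are assumptions for illustration; as described further on, the actual limitation curves may vary in shape around the vehicle.

```python
import cv2

def draw_load_limits(bird_view, limit_radii_px, vehicle_size_px=(60, 140)):
    """Superimpose range-limit curves around a centred virtual vehicle image."""
    img = bird_view.copy()
    h, w = img.shape[:2]
    cx, cy = w // 2, h // 2
    vw, vh = vehicle_size_px
    # virtual image of the vehicle, fixed in the centre of the overview image
    cv2.rectangle(img, (cx - vw // 2, cy - vh // 2), (cx + vw // 2, cy + vh // 2),
                  (255, 255, 255), 2)
    # one curve per maximal load range limitation (approximated here as circles)
    for r in limit_radii_px:
        cv2.circle(img, (cx, cy), int(r), (0, 0, 255), 1)
    return img

# overlay = draw_load_limits(top_view, limit_radii_px=[120, 180, 240])
```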
  • the maximal load range limitations are determined for 2-5 different support leg extension distances, e.g. including 50% and 100% of a maximal extension distance.
  • The driver will then have a good overview of the different available options when arriving at an object to be loaded. If the environment around the object is such that the driver can position the vehicle close to the object, it may be enough to extend the applicable support legs only 25% of the maximal extension, which may be represented by the innermost dashed range limitation on the load limitation image 34. If, on the other hand, the environment is such that it is not possible to position the vehicle close to the object, due to e.g. trees, rocks or other obstacles, the full extension of the support legs may be required, which will be represented by the outermost dashed range limitation presented on the display unit.
  • the processing unit 8 is configured to receive an input signal 36 including information of a chosen load range limitation.
  • The input signal 36 may be generated in response to an operator input via a touch screen at the display unit.
  • the processing unit 8 is configured to determine a loading control signal 22 in dependence of the chosen load range limitation.
  • the loading control signal comprises load instructions to control the loading arrangement 24 to perform a loading procedure of an object 20 within a loading area defined by the chosen load range limitation.
  • The processing unit 8 may also be configured to control the extension of the supporting legs to an extension that corresponds to the chosen load range limitation.
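  • A minimal sketch of this selection step, under the same assumed moment-balance model as above: the operator's choice of a range limitation is mapped to a support leg extension fraction and to the radius of the loading area carried by the loading control signal. The data structure and field names are hypothetical, not the patent's interface.

```python
from dataclasses import dataclass

G = 9.81  # m/s^2

@dataclass
class LoadingControlSignal:
    leg_extension_fraction: float  # 0.0-1.0 of the maximal extension distance
    max_reach_m: float             # radius of the loading area for the chosen limitation

def on_limit_chosen(chosen_fraction, object_mass_kg, m_max_full_nm=150_000.0):
    """Map the operator's touch-screen choice to leg extension and load instructions."""
    reach = (m_max_full_nm * chosen_fraction) / (object_mass_kg * G)  # assumed model
    return LoadingControlSignal(leg_extension_fraction=chosen_fraction,
                                max_reach_m=reach)

# signal = on_limit_chosen(0.75, 1200)  # operator tapped the 75% range limitation
```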
  • Figure 3 is a schematic illustration of a real time overview image with a superimposed load limitation image.
  • The graphical illustrations 32A, 32B, 32C of the load range limitation(s) designate, in this example, support leg extensions of 50%, 75% and 100%, respectively.
  • The shape of the curves of the graphical illustrations of the load range limitations depends e.g. on the position of the loading arrangement on the vehicle and on the angle between the loading arrangement and the longitudinal extension of the vehicle when in a loading position. This is clearly illustrated in figure 3, where a particular load range limitation depends on the angle between the crane and the vehicle.
  • the present invention also relates to a method to be applied in a vehicle 2 for loading objects.
  • the vehicle comprises a loading arrangement 24 configured to load objects 20 to and from the vehicle, and being controlled by a loading control signal 22 determined by a processing unit 8.
  • the vehicle further comprises at least one supporting leg 30 configured to be extended outside the vehicle to support the vehicle during a loading procedure.
  • a supporting leg extension distance ranges from 0-100% of a maximal extension distance.
  • The object 20 is not a part of the vehicle 2, but is instead e.g. a waste bin, a pile of timber, or any other object to be picked up or handled by a loading arrangement of the vehicle.
  • the method will now be discussed in detail with references to the flow diagram shown in figure 4.
  • the method comprises capturing measurement data by at least one sensing device 4 mounted on the vehicle 2 to monitor the entire area, or a part of the area surrounding the vehicle, and generating at least one measurement data signal 6 including said captured measurement data.
  • the method further comprises receiving, in a processing unit 8, the measurement data signal 6, determining a real time overview image 10 of the area surrounding the vehicle 2 based upon the measurement data, generating a real time image signal 12 and applying it to a display unit 14 where the real time overview image 10 is shown.
  • The method further comprises receiving an object parameter comprising at least a weight of the object, and determining a maximal load range limitation for the loading arrangement 24 for at least one extension distance within the entire range of support leg extension distances, in dependence of the weight of the object. With regard to the detailed calculation of the load range limitation, reference is made to the description above.
  • The method further comprises determining a load limitation image 34 of at least one of the determined maximal load range limitations, wherein the load limitation image is a graphical illustration of the at least one maximal load range limitation in relation to the vehicle, and superimposing the load limitation image on the shown real time overview image, such that the load limitation image is superimposed in a fixed position in relation to a virtual image 28 of the vehicle.
  • the virtual image of the vehicle is preferably centred in relation to the graphical illustration 32 of the load range limitation(s) of the load limitation image.
  • The maximal load range limitations are advantageously determined for 2-5 different support leg extension distances, and the determined maximal load range limitations preferably include 50% and 100% of a maximal extension distance.
  • the method comprises receiving an input signal 36 including information of a chosen load range limitation.
  • The input signal may be generated in response to an operator input via a touch screen of said display unit.
  • the method comprises determining a loading control signal 22 in dependence of the chosen load range limitation.
  • the loading control signal comprises load instructions to control the loading arrangement 24 to perform a loading procedure of an object 20 within a loading area defined by the chosen load range limitation.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

A vehicle (2) for loading objects (20) comprises at least one sensing device (4) mounted on the vehicle (2) and configured to capture measurement data to monitor the entire area, or a part of the area, surrounding the vehicle, and a processing unit (8) configured to receive said measurement data, to determine a real time overview image (10), e.g. a bird-view image, of the area surrounding the vehicle (2) based upon the measurement data, and further to generate a real time image signal (12) to be applied to a display unit (14) configured to show said real time overview image (10). A loading arrangement (24) is configured to load objects (20) to and from the vehicle, and at least one supporting leg (30) is configured to be extended outside the vehicle to support the vehicle during a loading procedure, a supporting leg extension distance being in the range 0-100% of a maximal extension distance. The processing unit (8) is further configured to receive an object parameter comprising at least a weight of the object, to determine a maximal load range limitation for said loading arrangement (24) for at least one extension distance of the entire range of support leg extension distances in dependence of the weight of the object, and to determine a load limitation image (34) of at least one determined maximal load range limitation. The load limitation image (34) is a graphical illustration (32) of said at least one maximal load range limitation in relation to the vehicle, and said processing unit (8) is configured to superimpose said load limitation image (34) on said shown real time overview image (10).
PCT/EP2017/076323 2016-12-07 2017-10-16 Vehicle and method for a vehicle with presentation of maximal load range limitation WO2018103929A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP16202749.4A 2016-12-07 2016-12-07 Vehicle and method for a vehicle with presentation of maximal load range limitation
EP16202749.4 2016-12-07

Publications (1)

Publication Number Publication Date
WO2018103929A1 true WO2018103929A1 (fr) 2018-06-14

Family

ID=57542770

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2017/076323 2016-12-07 2017-10-16 Vehicle and method for a vehicle with presentation of maximal load range limitation WO2018103929A1 (fr)

Country Status (3)

Country Link
EP (1) EP3333114B1 (fr)
DK (1) DK3333114T3 (fr)
WO (1) WO2018103929A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5823370A (en) * 1995-03-03 1998-10-20 Komatsu Ltd. Movable range indicating apparatus for mobile crane vehicle
JP2008074594A (ja) 2006-09-25 2008-04-03 Tadano Ltd Surroundings monitoring device for a vehicle carrying a working machine
EP2543622A1 (fr) 2011-07-08 2013-01-09 Tadano, Ltd. Performance line display unit
US20150330146A1 (en) 2012-10-17 2015-11-19 Iveco Magirus Ag Utility vehicle with monitoring system for monitoring the position of the vehicle
EP2952467A1 (fr) 2014-06-03 2015-12-09 Palfinger Platforms GmbH Method for visualising the support position and/or the extension of at least one support of a vehicle, and vehicle with at least one support
EP3000761A1 (fr) * 2013-05-21 2016-03-30 Tadano, Ltd. Camera orientation detecting device and work region line displaying device

Also Published As

Publication number Publication date
DK3333114T3 (da) 2022-04-25
EP3333114A1 (fr) 2018-06-13
EP3333114B1 (fr) 2022-03-02

Similar Documents

Publication Publication Date Title
US20210405619A1 (en) Construction Machine, in Particular a Crane, and Method for the Control Thereof
EP3326958B1 Optical detection and analysis of boom angles of a crane
Fang et al. A framework for real-time pro-active safety assistance for mobile crane lifting operations
US10228700B2 (en) Method for supporting a vehicle docking operation and a support system
US12013702B2 (en) Machine dump body control using object detection
CN109071187B Safety system for a machine
US20200149248A1 (en) System and method for autonomous operation of heavy machinery
JP2006528122A Movable sensor device on the load support means of a forklift truck
JP6481178B2 Crane remote operation method and crane remote operation device
CN112368229A Crane
KR101811926B1 Operation assistance system for a tower crane using an unmanned aerial vehicle, and image providing method for a tower crane using the same
CN110027001A Method for operating a mobile working machine, and mobile working machine
US11649146B2 (en) Safety system
EP3333114B1 Vehicle and method for a vehicle with presentation of maximal load range limitation
EP3333113B1 Vehicle and method for a vehicle including a target marker on an overview image
EP3715993A1 Vehicle comprising a working equipment, a working equipment, and a method in relation thereto
CN216863543U Autonomous mobile transport vehicle
JP2020066520A Crane truck
EP3554983A1 Working vehicle comprising a crane
JP2022081069A Work vehicle
US11718509B2 (en) Vehicle comprising a working equipment, and a working equipment, and a method in relation thereto
JP2020148564A Work vehicle and position detection method for a work vehicle
US20220301260A1 (en) Systems and methods for area wide object dimensioning
JP2019163122A Work vehicle and remote operation terminal
WO2022065117A1 Position detection system

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 17792001

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry into the European phase

Ref document number: 17792001

Country of ref document: EP

Kind code of ref document: A1