WO2018169467A1 - Vehicle equipped with a crane with an object detecting device - Google Patents

Vehicle equipped with a crane with an object detecting device

Info

Publication number
WO2018169467A1
Authority
WO
WIPO (PCT)
Prior art keywords
crane
vehicle
detecting device
information
object detecting
Prior art date
Application number
PCT/SE2018/050206
Other languages
English (en)
Inventor
Hans LYNGBÄCK
Per Gustafsson
Marcus RÖSTH
Original Assignee
Cargotec Patenter Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cargotec Patenter Ab filed Critical Cargotec Patenter Ab
Publication of WO2018169467A1

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B66 - HOISTING; LIFTING; HAULING
    • B66C - CRANES; LOAD-ENGAGING ELEMENTS OR DEVICES FOR CRANES, CAPSTANS, WINCHES, OR TACKLES
    • B66C 13/00 - Other constructional features or details
    • B66C 13/18 - Control systems or devices
    • B66C 13/46 - Position indicators for suspended loads or for crane elements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B65 - CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65G - TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G 67/00 - Loading or unloading vehicles
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B66 - HOISTING; LIFTING; HAULING
    • B66F - HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
    • B66F 9/00 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
    • B66F 9/06 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
    • B66F 9/065 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks non-masted
    • B66F 9/0655 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks non-masted with a telescopic boom
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B66 - HOISTING; LIFTING; HAULING
    • B66F - HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
    • B66F 9/00 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
    • B66F 9/06 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
    • B66F 9/075 - Constructional features or details
    • B66F 9/0755 - Position control; Position detectors
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B66 - HOISTING; LIFTING; HAULING
    • B66F - HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
    • B66F 9/00 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
    • B66F 9/06 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
    • B66F 9/075 - Constructional features or details
    • B66F 9/20 - Means for actuating or controlling masts, platforms, or forks
    • B66F 9/24 - Electrical devices or systems
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 - Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/02 - Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G01C 11/025 - Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures by scanning the object
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10016 - Video; Image sequence
    • G06T 2207/10021 - Stereoscopic video; Stereoscopic image sequence

Definitions

  • A vehicle with a crane with object detecting device
  • the present disclosure relates to a vehicle provided with a crane and in particular a crane provided with an object detecting device to detect and to determine a three dimensional representation of an object.
  • Working vehicles are often provided with various movable cranes, which are attached to the vehicle via a joint.
  • These cranes comprise movable crane parts, e.g. booms, that may be extended, and that are joined together by joints such that the crane parts may be folded together at the vehicle and extended to reach a load.
  • Various tools e.g. buckets or forks, may be attached to the crane tip, often via a rotator.
  • An operator has normally visual control of the crane when performing various tasks.
  • a crane provided with extendible booms has load limitations related to how far the booms have been extended. The operator therefore needs to be aware of these load limitations when lifting loads.
  • aspects of a load that may be important when determining load limitations for a crane are, for example, the physical size and shape of the load, the position and orientation of the load on the ground, and also the weight and centre of gravity of the load.
  • US9415976 relates to a crane collision avoidance system.
  • a load locator is provided to determine a location of a load of a crane and provide the location information to a mapping module.
  • a tag scanner scans the site for tags, e.g. RFID tags, defining an obstacle.
  • a mapping module combines the location information, a map and the obstacle information into user accessible information that is displayed on a GUI. The tags mark objects on the job site which should be avoided during crane operations.
  • US9302890 relates to a crane control system configured to intervene with crane movements to avoid a collision with an obstacle.
  • a plurality of plans are stored in a memory for use by a control module, each of the plans representing an overhead plan view of a job site including at least one obstacle therein at a predetermined elevation or elevation range.
  • a plurality of crane configurations are stored in memory for use by the control module, and a display interface is configured to interface with the control module to display, via a real-time visualization, a selected one of the plurality of plans, a selected one of the crane configurations, and a real-time position of a crane.
  • US-2015/0249821 relates to a device for obtaining surrounding information for a vehicle.
  • a stereo camera which measures the distance from the end portion of the telescopic boom to an object is provided.
  • an image-processing controller is provided which obtains three-dimensional position information of the object, with the crane as reference, from the measurement data delivered by the stereo camera.
  • by moving the telescopic boom, three-dimensional position information is obtained for objects in the surrounding area centred on the crane.
  • a lifting device is disclosed to achieve efficient load delivery, load monitoring, collision avoidance, and load hazard avoidance.
  • Various sensors may be provided in a load monitor in a housing close to the load.
  • the disclosure also covers collision avoidance and various techniques for generating alarm signals.
  • the object of the present invention is to achieve a vehicle, and also a method, that improve the loading and unloading procedures of loads, such that the procedures are faster, safer and more accurate.
  • Summary
  • the invention comprises a vehicle comprising a movable crane mounted on the vehicle and movably attached to the vehicle, the crane comprising at least one crane part and a crane tip.
  • the vehicle further comprises at least one object detecting device provided at said crane and movable together with said crane, and configured to wirelessly capture information of an object.
  • the captured information comprises at least a distance and a direction to said object, defining a coordinate in a three dimensional coordinate system.
  • the object detecting device is configured to generate an object data signal comprising data representing captured information.
  • the vehicle also comprises a processing unit configured to receive said object data signal.
  • the processing unit is configured to perform an object scanning procedure that comprises: controlling movement of the object detecting device to a predetermined starting position in relation to the object, by controlling movement of the crane; controlling movement of the object detecting device in relation to said object according to predefined scanning movement rules, by controlling movement of the crane, such that measurements are performed from directions covering essentially the entire outer surface of the object, object information data being captured as a result of said measurements; and processing said object information data to determine a three dimensional (3D) representation of said object.
  • the predefined scanning movement rules include controlling the movements such that the object detecting device is moved around said object according to a predefined movement pattern. This is advantageous in order to cover the entire surface of the object.
  • the movements of the object detecting device are controlled such that the measurement distance from the object detecting device to the object is less than a predetermined maximal measurement distance. This is beneficial if an automatic scanning procedure is applied.
  • the processing of the object information data to determine a three dimensional (3D) representation of said object also includes determining and applying information regarding the position of the object detecting device. This is preferable as an accurate position of the object in relation to the vehicle, and also to the environment, may then be determined.
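  • As a purely illustrative sketch of how such position information could be applied, the snippet below transforms points measured in the detecting device's own coordinate system into the vehicle's coordinate system using an assumed device pose (for example derived from the crane joint angles and boom extensions); the pose convention, example values and function names are assumptions and are not taken from this disclosure.

```python
import numpy as np

def device_pose(x, y, z, yaw, pitch, roll):
    """Build a 4x4 homogeneous pose of the object detecting device in the
    vehicle frame from an assumed position and orientation (e.g. derived
    from the crane joint angles and boom extensions)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    # ZYX (yaw-pitch-roll) rotation convention, an assumption for this sketch.
    R = np.array([[cy*cp, cy*sp*sr - sy*cr, cy*sp*cr + sy*sr],
                  [sy*cp, sy*sp*sr + cy*cr, sy*sp*cr - cy*sr],
                  [-sp,   cp*sr,            cp*cr           ]])
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = [x, y, z]
    return T

def to_vehicle_frame(points_device, T_vehicle_device):
    """Transform an (N, 3) array of points from the device frame to the
    vehicle frame."""
    pts = np.hstack([points_device, np.ones((len(points_device), 1))])
    return (T_vehicle_device @ pts.T).T[:, :3]

# Example: one measured surface point 0.4 m in front of the device.
T = device_pose(x=2.0, y=0.5, z=1.8, yaw=0.3, pitch=-0.1, roll=0.0)
print(to_vehicle_frame(np.array([[0.4, 0.0, 0.0]]), T))
```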
  • the processing unit is further configured to apply the determined 3D-representation of said object (12) during a loading procedure of said object, preferably during an automatic loading procedure. This is advantageous as a loading procedure may then easily be adapted to the object at hand.
  • the processing unit is further configured to determine the centre of gravity (COG) of the object based upon the determined 3D-representation of said object.
  • COG: centre of gravity
  • the object detecting device is a camera system comprising two cameras that have essentially overlapping fields of view. Using a camera system is advantageous in that it is a passive system, and the camera system may also be used to take images of the object and of the environment.
  • the present invention relates to a method in a vehicle of the kind described above.
  • the method comprises performing an object scanning procedure that comprises: controlling movement of the object detecting device to a predetermined starting position in relation to the object; controlling movement of the object detecting device in relation to said object according to predefined scanning movement rules while measurements of said object are performed, object information data being captured as a result of said measurements; and processing said object information data to determine a three dimensional (3D) representation of said object.
  • Figure 1 is a schematic illustration of a vehicle according to the present invention.
  • Figure 2 is a block diagram illustrating various components of the present invention.
  • Figure 3a-3d show various positions of a crane during information capture in accordance with the present invention.
  • Figure 4 is a flow diagram showing the method steps according to the present invention.
  • Detailed description
  • the vehicle is any vehicle provided with a crane, and includes any working vehicle, forestry vehicle, transport vehicle, and loading vehicle.
  • the vehicle 2 comprises a movable crane 4, e.g. a foldable crane, mounted on the vehicle and movably attached to the vehicle.
  • the crane 4 is provided with a tool 9, e.g. a fork or a bucket, attached to a crane tip 8.
  • the crane 4 comprises at least one crane part 6, e.g. at least one boom, which may be one or more extendible booms, and is movable within a movement range.
  • the vehicle and the crane will not be described in greater detail here, as they are conventional and conventionally used, e.g. with regard to the joint between the crane and the vehicle, the joints between the crane parts, and the joint between the crane tip and a tool, which normally is a rotator.
  • the vehicle further comprises at least one object detecting device 10 provided at the crane 4 and movable together with said crane, and configured to wirelessly capture information of an object 12.
  • the object 12 is a general designation of any fixed or removable three- dimensional item within a working range of the crane.
  • the object may e.g. be a load to be picked up by the crane, or may be a part of the environment and the ground around the vehicle.
  • the object detecting device 10 is a camera system comprising at least two cameras, and preferably two cameras, that have essentially overlapping fields of view. This embodiment will be further discussed below.
  • the limits of the field of view of the object detecting device are indicated as dashed lines in figure 2.
  • the captured information comprises at least a distance and a direction to an object, defining a coordinate in a three dimensional coordinate system. The object detecting device 10 is configured to generate an object data signal 14 comprising data representing the captured information.
  • the vehicle further comprises a processing unit 16 configured to receive the object data signal 14.
  • the processing unit 16 may be embodied as a dedicated electronic control unit (ECU), or implemented as part of another ECU.
  • ECU: electronic control unit
  • the processing unit 16 is configured to perform an object scanning procedure, which e.g. is a software implemented procedure that may be stored in the processing unit.
  • the scanning procedure comprises controlling movement of the object detecting device 10 to a predetermined starting position in relation to the object 12, which could, for example, be at one side of the object, as illustrated in figure 3a. This is performed by controlling movement of the crane 4 such that the object detecting device reaches its starting position.
  • the movement of the object detecting device 10 in relation to the object 12 is then controlled, by controlling movement of the crane, according to predefined scanning movement rules such that measurements are performed from directions covering essentially the entire outer surface of the object. More particularly, the object detecting device should be moved such that its field of view is directed towards the object to be measured. If the object detecting device is attached to a tool which is connected to the crane tip via a rotator, the movement may be achieved by applying a rotation to the tool via the rotator. Measurements of the object are performed simultaneously with the movement.
  • in figures 3a-3d, the movement of the crane, and thus of the object detecting device, around the object 12 is illustrated.
  • the object is drawn with a bold line on the respective side where the measurement is performed.
  • the field of view of the object detecting device 10 is schematically indicated as a circle sector.
  • Object information data is captured as a result of the measurements, and the data is applied to the processing unit by being included in the object data signal 14.
  • the processing unit is then configured to process the object information data to determine a three dimensional (3D) representation of the object.
  • the object information data may e.g. include a point cloud where each point is a three dimensional coordinate of a point on the surface of the object.
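  • A minimal sketch of how such a point cloud can be assembled from the captured information (a distance and a direction per measurement) is given below; the angle convention and the example values are assumptions for illustration only.

```python
import math

def measurement_to_point(distance, azimuth, elevation):
    """Convert one captured measurement (a distance plus a direction given
    as azimuth and elevation angles, in radians) into an (x, y, z)
    coordinate in the object detecting device's coordinate system."""
    x = distance * math.cos(elevation) * math.cos(azimuth)
    y = distance * math.cos(elevation) * math.sin(azimuth)
    z = distance * math.sin(elevation)
    return (x, y, z)

# Assumed raw measurements: (distance [m], azimuth [rad], elevation [rad]).
measurements = [(0.82, 0.10, -0.05), (0.79, 0.15, -0.04), (0.85, 0.20, -0.06)]
point_cloud = [measurement_to_point(d, az, el) for d, az, el in measurements]
print(point_cloud)
```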
  • the predefined scanning movement rules include control instructions to control the movements of the crane such that the object detecting device is moved around the object according to a predefined movement pattern.
  • the predefined scanning movement rules include controlling the movements such that the object detecting device is moved around the object essentially in one plane, preferably a horizontal plane.
  • the scanning movement rules should include control instructions that ensure that the necessary measurements are performed in order to establish the 3D representation of the object.
  • the different movement patterns may include moving around and above the object, and in addition repeating the movement around the object, stopping the movement, moving up and down at one side, etc.
  • the movements of the object detecting device 10 are preferably controlled such that the measurement distance from the object detecting device to the object 12 is less than a predetermined maximal measurement distance, which is related to the overall size of the object and may, as an example, be within the range of 0.1-1 metre.
  • the scanning procedure steps may be performed automatically until enough data has been captured to determine the 3D representation of the object.
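  • One possible predefined movement pattern, sketched below under assumptions not stated in the disclosure, is a horizontal circle of waypoints around the object at a measurement distance within the 0.1-1 metre range, each waypoint facing the object.

```python
import math

def circular_scan_waypoints(center, radius, height, n_stops):
    """Generate a predefined movement pattern: n_stops poses evenly spaced on
    a horizontal circle around the object, each facing the object's centre.
    `center` is the object's (x, y) position, `radius` the desired
    measurement distance and `height` the scanning height, all in metres."""
    waypoints = []
    for k in range(n_stops):
        angle = 2.0 * math.pi * k / n_stops
        x = center[0] + radius * math.cos(angle)
        y = center[1] + radius * math.sin(angle)
        heading = angle + math.pi  # point the field of view back at the object
        waypoints.append((x, y, height, heading))
    return waypoints

# Assumed example: scan an object at (5.0, 2.0) from 0.8 m away, 8 stops.
for wp in circular_scan_waypoints((5.0, 2.0), radius=0.8, height=1.2, n_stops=8):
    print("move crane tip to x=%.2f y=%.2f z=%.2f, heading=%.2f rad" % wp)
```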
  • the processing unit 16 is further configured to apply the determined 3D-representation of the object 12 during a loading procedure of the object, preferably during an automatic loading procedure.
  • the determined 3D-representation of the object 12 is stored in a storage unit in the processing unit. Over time numerous different 3D-representations of objects are stored. These may be grouped in various typical object classes, e.g.
  • This stored information may be applied during the step of determining the 3D-representation such that the processing unit may recognize the scanned object among the stored 3D-objects. Thereby, the scanning procedure may be performed faster.
  • the scanned objects may also be shared with other users, e.g. other vehicles, or with a central database available to other vehicles.
  • the processing unit 16 is configured to determine the centre of gravity (COG) of the object (12) based upon the determined 3D-representation of the object. This may be useful information when determining a loading procedure.
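  • The disclosure does not specify how the COG is derived from the 3D representation; the sketch below shows one rough possibility, a voxel-based centroid of the scanned points that assumes an approximately uniform mass distribution.

```python
import numpy as np

def estimate_cog(points, voxel=0.05):
    """Estimate the centre of gravity of a scanned object from its point
    cloud by voxelising the points and averaging the occupied voxel centres.
    This is only an approximation that assumes a roughly uniform mass
    distribution over the scanned shape."""
    pts = np.asarray(points, dtype=float)
    idx = np.unique(np.floor(pts / voxel).astype(int), axis=0)  # occupied voxels
    centres = (idx + 0.5) * voxel
    return centres.mean(axis=0)

# Assumed example: a small cloud of surface points (metres).
cloud = [(0.0, 0.0, 0.0), (0.4, 0.0, 0.0), (0.4, 0.3, 0.0),
         (0.0, 0.3, 0.0), (0.2, 0.15, 0.25)]
print(estimate_cog(cloud))
```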
  • an identification (ID) tag 20 is preferably provided at the object.
  • the ID tag may be a visual tag, e.g. provided with a bar code or QR code.
  • the object detecting device 10 is then configured to detect and capture information about the object from the ID tag.
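  • As an illustration only, a visual ID tag carrying a QR code could be decoded from a camera image roughly as follows, assuming an OpenCV-based image pipeline; the disclosure does not prescribe any particular decoder or library, and the file name is an assumption.

```python
import cv2  # OpenCV, assumed to be available in the camera processing pipeline

def read_id_tag(image_path):
    """Detect and decode a visual ID tag (here a QR code) in a camera image
    and return its payload, or None if no tag is found. A bar code would
    need a different decoder; this sketch only covers the QR case."""
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(image_path)
    detector = cv2.QRCodeDetector()
    data, points, _ = detector.detectAndDecode(image)
    return data if data else None

# Assumed usage with an image captured by the object detecting device.
print(read_id_tag("captured_object_view.png"))
```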
  • the object detecting device 10 is preferably a camera system provided with two cameras arranged at a distance from each other such that the cameras capture overlapping fields of view.
  • the object detecting device 10 is a laser scanning device or a structured-light 3D scanner.
  • a camera system comprises at least two cameras, preferably two cameras, sometimes called a stereo camera. This is an advantageous embodiment of the object detecting device as stereo camera systems are more and more frequently used in various vehicles.
  • a stereo camera is a type of camera with two lenses with a separate image sensor for each lens. This allows the camera to simulate human binocular vision, and therefore gives it the ability to capture three-dimensional images, a process known as stereo photography.
  • Stereo cameras may be used for making 3D pictures, or for range imaging. Unlike most other approaches to depth sensing, such as structured light or time-of-flight measurements, stereo vision is a purely passive technology which also works in bright daylight.
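  • The underlying stereo relation is the classic pinhole model: depth Z = f * B / d, with focal length f (in pixels), camera baseline B and disparity d. A minimal sketch, with assumed example values:

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Classic pinhole stereo relation: depth Z = f * B / d, where f is the
    focal length in pixels, B the distance between the two cameras and d the
    disparity (horizontal pixel shift of the same point between images)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Assumed example values: 700 px focal length, 12 cm baseline, 105 px disparity.
print(round(depth_from_disparity(105, 700, 0.12), 3), "m")  # 0.8 m
```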
  • the object detecting device uses the Lidar-technology.
  • Lidar is sometimes considered an acronym of Light Detection And Ranging (sometimes Light Imaging, Detection, And Ranging), and is a surveying method that measures distance to a target by illuminating that target with a laser light.
  • Lidar is popularly used to make high-resolution maps, with applications in geodesy, forestry, laser guidance, airborne laser swath mapping (ALSM), and laser altimetry.
  • Lidar sometimes is called laser scanning and 3D scanning, with terrestrial, airborne, and mobile applications.
  • a 3D scanning device is used.
  • a 3D scanner is a device that analyses a real-world object or environment to collect data on its shape and possibly its appearance (e.g. colour). The collected data can then be used to construct digital three-dimensional models.
  • industrial computed tomography scanning can be used to construct digital 3D models, applying non-destructive testing.
  • The purpose of a 3D scanner is usually to create a point cloud of geometric samples on the surface of the subject. These points can then be used to extrapolate the shape of the subject (a process called reconstruction). If colour information is collected at each point, then the colours on the surface of the subject can also be determined.
  • 3D scanners share several traits with cameras. Like most cameras, they have a cone-like field of view, and like cameras, they can only collect information about surfaces that are not obscured. While a camera collects colour information about surfaces within its field of view, a 3D scanner collects distance information about surfaces within its field of view. The "picture" produced by a 3D scanner describes the distance to a surface at each point in the picture. This allows the three dimensional position of each point in the picture to be identified.
  • a so-called time-of-flight lidar scanner may be used to produce a 3D model.
  • the lidar can aim its laser beam in a wide range: its head rotates horizontally, a mirror flips vertically.
  • the laser beam is used to measure the distance to the first object on its path.
  • the time-of-flight 3D laser scanner is an active scanner that uses laser light to probe the subject.
  • a time-of-flight laser range finder finds the distance of a surface by timing the round-trip time of a pulse of light.
  • a laser is used to emit a pulse of light and the amount of time before the reflected light is seen by a detector is measured. Since the speed of light c is known, the round-trip time determines the travel distance of the light, which is twice the distance between the scanner and the surface.
  • the accuracy of a time-of-flight 3D laser scanner depends on how precisely the round-trip time t can be measured; approximately 3.3 picoseconds is the time taken for light to travel 1 millimetre.
  • the laser range finder only detects the distance of one point in its direction of view.
  • the scanner scans its entire field of view one point at a time by changing the range finder's direction of view to scan different points.
  • the view direction of the laser range finder can be changed either by rotating the range finder itself, or by using a system of rotating mirrors. The latter method is commonly used because mirrors are much lighter and can thus be rotated much faster and with greater accuracy.
  • Typical time-of-flight 3D laser scanners can measure the distance of 10,000 to 100,000 points every second.
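  • The distance relation used by such a scanner is simply d = c * t / 2 for a measured round-trip time t; a minimal sketch with assumed example values:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_time_s):
    """Distance from a time-of-flight measurement: the pulse travels to the
    surface and back, so the one-way distance is c * t / 2."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A 1 mm one-way distance corresponds to a round trip of roughly 6.67 ps,
# i.e. about 3.3 ps of light travel per millimetre each way.
print(tof_distance(6.67e-12))  # ~0.001 m
print(tof_distance(5.0e-9))    # a 5 ns round trip is ~0.75 m
```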
  • the object detecting device uses a structured-light 3D scanner that projects a pattern of light on the subject and looks at the deformation of the pattern on the subject.
  • the pattern is projected onto the subject using either an LCD projector or other stable light source.
  • a camera, offset slightly from the pattern projector, looks at the shape of the pattern and calculates the distance of every point in the field of view.
  • the advantages of structured-light 3D scanners are speed and precision. Instead of scanning one point at a time, structured-light scanners scan multiple points or the entire field of view at once. Scanning an entire field of view in a fraction of a second reduces or eliminates the problem of distortion from motion.
  • Some existing systems are capable of scanning moving objects in real-time.
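  • A simplified, purely illustrative triangulation for a single projected stripe (one projector ray and one camera ray in a common plane) can be sketched as follows; real structured-light scanners are considerably more involved, and the baseline and angles below are assumed values.

```python
import math

def structured_light_depth(proj_angle_rad, cam_angle_rad, baseline_m):
    """Simplified triangulation for a single projected stripe in the
    projector-camera plane: the projector (at the origin) casts the stripe at
    angle proj_angle_rad from its optical axis, the camera (offset by
    baseline_m) observes it at cam_angle_rad; intersecting the two rays gives
    depth z = B / (tan(a_proj) - tan(a_cam))."""
    denom = math.tan(proj_angle_rad) - math.tan(cam_angle_rad)
    if abs(denom) < 1e-9:
        raise ValueError("rays are parallel; no triangulation possible")
    return baseline_m / denom

# Assumed example: 20 cm baseline, stripe projected at 10 deg, seen at -15 deg.
print(round(structured_light_depth(math.radians(10), math.radians(-15), 0.20), 3), "m")
```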
  • Still another applicable technique is the so-called Photonic Mixer Devices (PMD) time-of-flight technology where invisible infrared light illuminates scenes and objects to be detected, and the reflected light is sensed by a PMD-sensor.
  • PMD: Photonic Mixer Devices
  • a display unit 22 is provided and the processing unit 16 is configured to generate and apply a presentation signal 24 to the display unit to present the 3D representation on the display unit.
  • the display unit may be a display arranged e.g. at a control unit or in the vehicle.
  • the display unit 22 is a pair of glasses, for example of the type sold under the trademark HoloLens.
  • the pair of glasses is structured to present the 3D representation such that the 3D representation is overlaid on the transparent glasses through which a user observes the object.
  • Various additional information may also be presented as overlaid information and preferably presented such that the additional information is presented close to an illustrated part of the object.
  • the display unit 22 is a pair of virtual reality goggles. These types of goggles comprise two displays to be arranged in front of the operator's eyes.
  • the present invention also comprises a method in a vehicle 2 comprising a movable crane 4 mounted on the vehicle and movably attached to the vehicle.
  • the crane 4 comprises at least one crane part 6 and a crane tip 8.
  • the vehicle further comprises at least one object detecting device 10 provided at the crane 4 and movable together with the crane.
  • the object detecting device is configured to wirelessly capture information of an object 12.
  • the captured information comprises at least a distance and a direction to the object defining a coordinate in a three dimensional coordinate system.
  • the object detecting device 10 is configured to generate an object data signal 14 comprising data representing the captured information.
  • the vehicle further comprises a processing unit 16 configured to receive said object data signal 14.
  • the method preferably comprises that the predefined scanning movement rules include controlling the movements such that the object detecting device is moved around the object according to a predefined movement pattern.
  • the method preferably comprises controlling movements of the object detecting device 10 such that a measurement distance from the object detecting device to the object 12 is less than a predetermined maximal measurement distance.
  • the method comprises processing the object information data to determine a three dimensional (3D) representation of the object, and also determining and applying information regarding the position of the object detecting device 10.
  • the method comprises applying the determined 3D-representation of the object 12 during a loading procedure of the object, preferably during an automatic loading procedure.
  • the method comprises determining the centre of gravity (COG) of the object 12 based upon the determined 3D-representation of the object.
  • the method may also comprise detecting and capturing information about the object from an identification (ID) tag provided at the object.
  • ID tag is e.g. provided with a bar code or a QR code.
  • the method may also comprise presenting the 3D representation on a display unit.
  • the display unit may be a pair of glasses structured to present the 3D representation such that the 3D representation is overlaid on the transparent glasses through which a user observes the object.
  • the display unit is a pair of virtual reality goggles.
  • the present invention is not limited to the above-described preferred embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • Structural Engineering (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Geology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Civil Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Multimedia (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Software Systems (AREA)
  • Automation & Control Theory (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Length Measuring Devices With Unspecified Measuring Means (AREA)
  • Control And Safety Of Cranes (AREA)

Abstract

The invention relates to a vehicle (2) comprising a movable crane (4) mounted on the vehicle and movably attached to the vehicle, the crane (4) comprising at least one crane part (6) and a crane tip (8), the vehicle comprising at least one object detecting device (10) provided at said crane (4) and movable together with said crane, and configured to wirelessly capture information of an object (12), the captured information comprising at least a distance and a direction to said object defining a coordinate in a three dimensional coordinate system. A processing unit (16) is provided and configured to perform an object scanning procedure comprising the following steps: controlling movement of the object detecting device (10) to a predetermined starting position in relation to the object (12), by controlling movement of the crane (4); controlling movement of the object detecting device (10) in relation to said object (12) in accordance with predefined scanning movement rules such that measurements are performed from directions such that essentially the entire outer surface of the object is covered, by controlling movement of the crane, while simultaneously performing measurements of said object, object information data being captured as a result of said measurements; and then processing said object information data to determine a three dimensional (3D) representation of said object.
PCT/SE2018/050206 2017-03-15 2018-03-05 Vehicle equipped with a crane with an object detecting device WO2018169467A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE1750308-7 2017-03-15
SE1750308A SE1750308A1 (sv) 2017-03-15 2017-03-15 A vehicle with a crane with object detecting device

Publications (1)

Publication Number Publication Date
WO2018169467A1 true WO2018169467A1 (fr) 2018-09-20

Family

ID=61628445

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2018/050206 WO2018169467A1 (fr) Vehicle equipped with a crane with an object detecting device

Country Status (2)

Country Link
SE (1) SE1750308A1 (fr)
WO (1) WO2018169467A1 (fr)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11173810A (ja) * 1997-12-12 1999-07-02 Nkk Corp Coil position detection device
DE102004041938A1 (de) * 2004-08-30 2006-03-09 Liebherr-Werk Nenzing Gmbh, Nenzing Stacking device, in particular a reach stacker, and method for gripping and stacking containers
US20110187548A1 (en) 2010-02-01 2011-08-04 Kurt Maynard Lifting device efficient load delivery, load monitoring, collision avoidance, and load hazard avoidance
JP2014105091A (ja) * 2012-11-29 2014-06-09 Tadano Ltd Crane monitoring camera
JP2014169184A (ja) * 2013-02-05 2014-09-18 Tadano Ltd Surrounding information acquisition device for a work vehicle
US20150249821A1 (en) 2012-09-21 2015-09-03 Tadano Ltd. Surrounding information-obtaining device for working vehicle
US9302890B1 (en) 2013-04-29 2016-04-05 TNV, Inc. Crane control system and method
US9415976B2 (en) 2012-05-10 2016-08-16 Trimble Navigation Limited Crane collision avoidance

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020076212A1 (fr) * 2018-10-12 2020-04-16 Indexator Rotator System Ab System for controlling a rotator by means of image detection
EP3715993A1 (fr) * 2019-03-25 2020-09-30 Cargotec Patenter Ab A vehicle comprising a working equipment, a working equipment and a method in relation thereto
CN110422767A (zh) * 2019-06-27 2019-11-08 三一海洋重工有限公司 Method, device and system for positioning a lifting spreader
CN110422767B (zh) * 2019-06-27 2020-09-29 三一海洋重工有限公司 Method, device and system for positioning a lifting spreader
AT523848B1 (de) * 2021-03-17 2021-12-15 Umweltdata G M B H Recording device and method for capturing a forest stand
AT523848A4 (de) * 2021-03-17 2021-12-15 Umweltdata G M B H Recording device and method for capturing a forest stand
WO2022221311A1 (fr) * 2021-04-12 2022-10-20 Structural Services, Inc. Systems and methods for assisting a crane operator
US11897734B2 (en) 2021-04-12 2024-02-13 Structural Services, Inc. Systems and methods for guiding a crane operator
US11932518B2 (en) 2021-04-12 2024-03-19 Structural Services, Inc. Systems and methods for calculating a path
US11939194B2 (en) 2021-04-12 2024-03-26 Structural Services, Inc. Drone systems and methods for assisting a crane operator

Also Published As

Publication number Publication date
SE1750308A1 (sv) 2018-09-16

Similar Documents

Publication Publication Date Title
US11292700B2 (en) Driver assistance system and a method
US10665012B2 (en) Augmented reality camera for use with 3D metrology equipment in forming 3D images from 2D camera images
US20190079522A1 (en) Unmanned aerial vehicle having a projector and being tracked by a laser tracker
WO2018169467A1 (fr) Vehicle equipped with a crane with an object detecting device
EP3589575B1 (fr) Vehicle equipped with an arrangement for determining a three dimensional representation of a movable member
US10132611B2 (en) Laser scanner
US9417317B2 (en) Three-dimensional measurement device having three-dimensional overview camera
US7777761B2 (en) Method and apparatus for specifying and displaying measurements within a 3D rangefinder data set
EP3657455B1 (fr) Methods and systems for detecting intrusions in a monitored volume
JP2019117188A (ja) 表面分析のためのシステムおよびその方法
US12025468B2 (en) Optical sensor with overview camera
EP3687937B1 (fr) Operator assistance system and a method in relation to the system
US20220414925A1 (en) Tracking with reference to a world coordinate system
US20240161435A1 (en) Alignment of location-dependent visualization data in augmented reality
EP4227708A1 (fr) Augmented reality alignment and visualization of a point cloud
JP2017111118A (ja) 3dスキャナからの2次元(2d)スキャンデータに基づく3次元(3d)スキャン間の位置合せ計算
WO2024102428A1 (fr) Alignment of location-dependent visualization data in augmented reality
WO2023163760A1 (fr) Tracking with reference to a world coordinate system
JP2017111117A (ja) 2次元スキャナによる測定値に基づいて各スキャン間で実行される3次元スキャナデータの位置合せ計算
ASCE Target-focused Local Workspace Modeling for Construction Automation Applications
Scherer, A Circleless 2D/3D Total Station: A Low Cost Instrument for Surveying, Recording Point Clouds, Documentation, Image Acquisition and Visualisation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18711185

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18711185

Country of ref document: EP

Kind code of ref document: A1