WO2019049511A1 - Crane device - Google Patents

Crane device

Info

Publication number
WO2019049511A1
Authority
WO
WIPO (PCT)
Prior art keywords
captured image
target
control unit
region
area
Prior art date
Application number
PCT/JP2018/026546
Other languages
English (en)
Japanese (ja)
Inventor
小林 雅人
Original Assignee
住友重機械搬送システム株式会社
Priority date
Filing date
Publication date
Application filed by 住友重機械搬送システム株式会社
Priority to CN202110179044.5A (CN112938766B, zh)
Priority to CN201880053626.5A (CN111032561B, zh)
Priority to JP2019540803A (JP6689467B2, ja)
Publication of WO2019049511A1 (fr)

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B66 HOISTING; LIFTING; HAULING
    • B66C CRANES; LOAD-ENGAGING ELEMENTS OR DEVICES FOR CRANES, CAPSTANS, WINCHES, OR TACKLES
    • B66C 13/00 Other constructional features or details
    • B66C 13/18 Control systems or devices
    • B66C 13/22 Control systems or devices for electric drives
    • B66C 13/46 Position indicators for suspended loads or for crane elements
    • B66C 13/52 Details of compartments for driving engines or motors or of operator's stands or cabins
    • B66C 13/54 Operator's stands or cabins

Definitions

  • One aspect of the present invention relates to a crane apparatus.
  • Patent Document 1 describes a crane that, when a placed container is gripped by a spreader or when a container gripped by the spreader is stacked on another placed container, detects the placed container based on a distance map acquired by a 3D camera provided on the spreader, and thereby acquires the positional relationship between the spreader and that container.
  • However, the 3D camera used in such a crane outputs measurement light toward the area including the container to be detected and detects the reflected light to obtain the distance map. For this reason, under the influence of strong sunlight or other disturbance light outdoors, the 3D camera may erroneously detect a container in an area where no container to be detected is present. In addition, when used outdoors in the rain, the 3D camera may detect raindrops and fail to detect the target object properly. Furthermore, if the container to be detected has a mirror-like surface portion, the measurement light may be specularly reflected in a direction in which the 3D camera cannot detect the reflected light. In that case, the 3D camera cannot acquire the distance map for the mirror-like surface, and the container may not be detected even though the container to be detected is present.
  • one aspect of the present invention aims to provide a crane apparatus capable of more reliably acquiring the positional relationship between a lifting gear and an object.
  • A crane apparatus according to one aspect of the present invention includes: a lifting tool that grips, holds, and unloads a load; an imaging unit that is provided on the lifting tool and captures an image below the lifting tool to obtain a captured image; a distance information acquisition unit that is provided on the lifting tool and acquires distance information to a plurality of measurement points within the range of the captured image below the lifting tool; a color information acquisition unit that is provided on the lifting tool and acquires color information within the range of the captured image below the lifting tool; a display unit that displays the captured image; and a control unit that processes the captured image and causes the display unit to display it. The control unit extracts, from the captured image, a target region including the region in which the object is located, based on the distance information and the color information, detects within the target region a corresponding portion corresponding to the shape of a reference image stored in advance, and causes the display unit to display the captured image with the corresponding portion emphasized.
  • In this crane apparatus, the region in which the object is located is estimated based on the distance information, and the region in which the object is located is also estimated based on the color information. The apparatus then extracts, from the captured image, a target region including the region in which the object is located based on these estimation results, and emphasizes and displays the corresponding portion corresponding to the shape of the reference image within the target region.
  • Thus, this apparatus can acquire the positional relationship between the lifting tool and the object by using the distance information and the color information in a complementary manner. Therefore, since this apparatus is less susceptible to, for example, the influence of disturbance light and the influence of specular reflection on the object, the positional relationship between the lifting tool and the object can be acquired more reliably.
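  • As an aid to reading the description that follows, the sketch below summarizes, in illustrative Python, how the two region estimates might be combined and the emphasized image formed. It is a minimal sketch of the idea only, not the patented implementation; the mask inputs, the OR/AND switch, and the graying-out step are assumptions for illustration.

```python
import numpy as np

def build_target_region_and_emphasis(captured, region_from_distance, region_from_color,
                                     mode="OR"):
    """Illustrative outline only, not the patented firmware.

    captured             : H x W x 3 uint8 image taken below the lifting tool
    region_from_distance : H x W boolean mask estimated from the distance information
    region_from_color    : H x W boolean mask estimated from the color information
    mode                 : "OR" keeps pixels in either mask, "AND" keeps pixels in both
    """
    if mode == "OR":
        target = region_from_distance | region_from_color
    else:
        target = region_from_distance & region_from_color

    # Emphasized image: keep the target region in color, gray out everything else.
    gray = np.repeat(captured.mean(axis=2, keepdims=True).astype(captured.dtype), 3, axis=2)
    emphasized = captured.copy()
    emphasized[~target] = gray[~target]
    return target, emphasized
```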
  • The control unit may refrain from detecting the corresponding portion outside the target region of the captured image. In that case, the corresponding portion corresponding to the shape of the reference image needs to be detected only within the target region of the captured image, so the time required for the processing can be shortened.
  • In the crane apparatus, the control unit may estimate, based on the distance information, a first region in the captured image in which the object is located, estimate, based on the color information, a second region in the captured image in which the object is located, and extract from the captured image, as the target region, a region including the region contained in at least one of the first region and the second region.
  • Alternatively, the control unit may estimate, based on the distance information, a first region in the captured image in which the object is located, estimate, based on the color information, a second region in the captured image in which the object is located, and extract from the captured image, as the target region, a region including the region contained in both the first region and the second region.
  • The crane apparatus may include a moving mechanism for moving the lifting tool, and the control unit may detect, when the lifting tool grips or unloads the load, a positional deviation between the corresponding portion in the captured image and a reference portion where the corresponding portion should be positioned, and cause the moving mechanism to move the lifting tool so as to eliminate the positional deviation. In this way, the lifting tool can be accurately positioned directly above the object.
  • the object may be at least a part of the load gripped by the lifting gear. According to this, in the situation where the load is gripped by the lifting tool, the positional relationship between the lifting tool and the load can be acquired more reliably.
  • the object may be at least a part of the mounting portion on which the load held by the lifting device is unloaded. According to this, in the situation where the load is unloaded by the lifting tool, the positional relationship between the lifting tool and the placement portion can be acquired more reliably.
  • FIG. 1 is a block diagram showing a crane apparatus according to the first embodiment.
  • FIG. 2 is a front view of the crane apparatus.
  • FIG. 3 is a side view of the crane apparatus.
  • FIG. 4 is a perspective view showing a hanger and a coil.
  • FIG. 5 is a view showing a captured image in the load grip control.
  • FIG. 6 is a diagram showing a distance accuracy map in the load grip control.
  • FIG. 7 is a diagram showing a color accuracy map in the load grip control.
  • FIG. 8 is a diagram showing an OR probability map in the load grip control.
  • FIG. 9 is a view showing a reference image.
  • FIG. 10 is a view showing a highlighted image in the load gripping control.
  • FIG. 11 is a diagram showing an AND probability map in the load grip control.
  • FIG. 12 is a diagram showing a highlighted image in unloading control.
  • FIG. 13 is a block diagram showing a crane apparatus according to a second embodiment.
  • FIG. 14 is a front view of the crane apparatus.
  • FIG. 15 is a perspective view of the crane apparatus.
  • FIG. 16 is a diagram showing a reference image.
  • FIG. 17 is a view showing a highlighted image in the load grip control.
  • FIG. 18 is a view showing a highlighted image in unloading control.
  • FIG. 1 is a block diagram showing a crane apparatus 1 according to the first embodiment.
  • FIG. 2 is a front view of the crane apparatus 1.
  • FIG. 3 is a side view of the crane apparatus 1.
  • FIG. 4 is a perspective view showing the hanger 30 and the coil C.
  • the crane apparatus 1 according to the first embodiment is an overhead crane that transports a coil (load) C that is a steel plate wound in a cylindrical shape.
  • the crane device 1 includes a moving mechanism 10, a lifting tool 30, an imaging unit 40, a distance information acquisition unit 41, a color information acquisition unit 42, a display unit 43, a control unit 50, and a cab 60.
  • the crane apparatus 1 also includes a transport control unit 51 that controls the moving mechanism 10 and the lifting tool 30, and an encoder 31 connected to the transport control unit 51.
  • the coil C has a substantially cylindrical shape. At the axial center position of the coil C, a hanging hole H penetrating from one end to the other end of the coil C is formed.
  • The coil C is placed, with its axial direction horizontal, on a skid (placement portion) S; the skids S are disposed in a matrix at predetermined intervals on a floor surface F of a building such as a factory or a warehouse.
  • the coil C has a predetermined shape, a predetermined size, and a predetermined color.
  • the outer peripheral surface of the coil C is mirror-like.
  • the moving mechanism 10 is a mechanism that moves the lifting tool 30 (horizontally, vertically, and rotationally).
  • the moving mechanism 10 has a girder 12 and a trolley 13.
  • the girder 12 supports the load of the trolley 13 and the hanger 30.
  • the girder 12 is horizontally laid across both walls near the ceiling of the building.
  • the girder 12 is movable in the horizontal direction orthogonal to the extending direction of the girder 12, and the trolley 13 and the hanger 30 can be moved in this direction.
  • the trolley 13 traverses the upper surface of the girder 12 along the extending direction of the girder 12. Thereby, the trolley 13 can move the hanger 30 along the extension direction of the girder 12.
  • the trolley 13 has a hoisting mechanism 11 for winding up and down the hanging wire rope 17.
  • the trolley 13 has a rotation mechanism (not shown) that rotationally moves the hanger 30 about the vertical axis in the vertical direction via the wire rope 17.
  • the hanger 30 is a device for gripping, holding, and unloading the coil C.
  • the hanger 30 is hung on a wire rope 17 hanging down from the trolley 13.
  • the lifting tool 30 is moved upward by winding up the wire rope 17 by the winding mechanism 11 of the trolley 13 and is moved downward by winding down the wire rope 17 by the winding mechanism 11.
  • the hanger 30 has a base 18 and a pair of opposed claws 19, 19.
  • the wire rope 17 is connected to the upper surface side of the base portion 18, and the claw portions 19, 19 are provided on the lower surface side so as to be able to open and close.
  • each of the claws 19 and 19 is provided with a projection 20 projecting so as to face each other.
  • the respective convex portions 20 are inserted into the suspension holes H of the coil C from both sides.
  • the hanger 30 holds the coil C.
  • the inner surfaces 21 and 21 of the claws 19 and 19 facing each other may or may not be in contact with the coil C.
  • The lifting tool 30 transfers the coil C in the following procedure: it opens and closes the claws 19, 19 to grip the coil C placed on a skid S, is moved onto another skid S by the moving mechanism 10 and the hoisting mechanism 11 while holding the coil C, and opens and closes the claws 19, 19 again to unload the coil C.
  • The encoder 31 is a sensor that detects the hoisting and lowering operation amount of the hoisting mechanism 11 and the traverse movement amount of the trolley 13. Based on the detection result of the encoder 31, the transport control unit 51 acquires height information indicating the current height of the lifting tool 30. The encoder 31 may detect only the hoisting and lowering operation amount of the hoisting mechanism 11, without detecting the traverse movement amount of the trolley 13.
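  • As an illustration of how the height information might be derived from the encoder output, the sketch below converts an encoder count into a lifting-tool height. All parameter names and numbers (counts per revolution, drum circumference, reference height) are hypothetical assumptions, not values from this description.

```python
def hoist_height_from_encoder(counts, counts_per_rev=1024, drum_circumference_m=1.5,
                              height_at_zero_m=12.0):
    """Hypothetical conversion from hoist-drum encoder counts to lifting-tool height.

    Paying out rope lowers the tool, so the height decreases with the paid-out length.
    All parameter values are illustrative assumptions, not values from this description.
    """
    rope_paid_out_m = counts / counts_per_rev * drum_circumference_m
    return height_at_zero_m - rope_paid_out_m

# Example: 4096 counts paid out from a 12 m reference height gives 6.0 m.
print(hoist_height_from_encoder(4096))
```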
  • the transfer control unit 51 controls the driving of the crane device 1.
  • The transport control unit 51 controls the movement of the girder 12, the traversing of the trolley 13, the winding up and down of the wire rope 17 by the hoisting mechanism 11 (that is, the vertical movement of the lifting tool 30), the rotational movement of the lifting tool 30 by the rotation mechanism, and the opening and closing of the claws 19, 19.
  • the conveyance control unit 51 outputs, to the control unit 50, the height information of the hanger 30 obtained based on the detection result of the encoder 31.
  • the imaging unit 40 is an imaging device that is provided on the lifting tool 30 and captures an image of the lower side of the lifting tool 30 to obtain a captured image.
  • the imaging unit 40 may be, for example, a camera.
  • the imaging unit 40 is provided on the base 18 of the hanger 30 so as to face downward.
  • a plurality of imaging units 40 may be provided on the hanger 30.
  • the distance information acquisition unit 41 is a device provided on the lifting tool 30 to obtain distance information to a plurality of measurement points within the range of the captured image captured by the imaging unit 40 below the lifting tool 30.
  • the distance information acquisition unit 41 acquires distance information to each measurement point in association with the position in the captured image.
  • The distance information acquisition unit 41 is not particularly limited as long as it is a device capable of acquiring distance information, and may be, for example, a ToF [Time of Flight] camera, a 3D camera, or a 3D scanner.
  • the distance information acquisition unit 41 is provided in the vicinity of the imaging unit 40 in the base 18 of the hanging tool 30 so as to face downward.
  • the distance information acquisition units 41 are provided by the number corresponding to the number of imaging units 40.
  • the “distance information” is information indicating the distance between the distance information acquisition unit 41 and each measurement point.
  • the measurement points are, for example, points set on the floor F, the skid S, the upper surface of the coil C, etc., and may be set in a matrix, for example.
  • the measurement point may be set to a predetermined relative position based on, for example, the position coordinate of the distance information acquisition unit 41 in the horizontal plane.
  • the color information acquisition unit 42 is a device provided on the hanging tool 30 to obtain color information within the range of the captured image captured by the imaging unit 40 below the lifting tool 30.
  • the color information acquisition unit 42 acquires color information in association with the position in the captured image.
  • the color information acquisition unit 42 is not particularly limited as long as it is an apparatus capable of acquiring color information, and may be, for example, a spectrum camera or a color camera.
  • the color information acquisition unit 42 is provided in the vicinity of the imaging unit 40 in the base 18 of the hanger 30 so as to face downward.
  • the color information acquisition units 42 are provided in the number corresponding to the number of imaging units 40.
  • The “color information” is, for example, information obtained by decomposing a color image into RGB signals and representing them as vectors.
  • the display unit 43 is a display device that displays a captured image. Further, the display unit 43 displays the emphasized image generated by the control unit 50.
  • the display unit 43 may be, for example, a display.
  • the display unit 43 is provided, for example, in the driver's cab 60.
  • the control unit 50 processes the captured image to generate an emphasized image, and causes the display unit 43 to display the generated emphasized image (the details will be described later).
  • The control unit 50 can generate an emphasized image in load grip control, in which the coil C placed on the skid S is gripped by the lifting tool 30, and in unloading control, in which the coil C held by the lifting tool 30 is unloaded onto the skid S.
  • FIG. 5 is a view showing a captured image P1 in the load grip control.
  • the control unit 50 causes the display unit 43 to display the captured image P1 captured by the imaging unit 40.
  • the coil C placed below the hanger 30, one claw portion 19 of the hanger 30, and the skid S on which the coil C is not placed are displayed.
  • the hanger 30 is at a position offset from immediately above the mounted coil C.
  • the coil C is placed on a skid S (not shown) below the coil C. Since the outer peripheral surface of the coil C is mirror-like, the coil C in the captured image P1 has a reflection portion R in which the surrounding scene is reflected.
  • control unit 50 extracts a target area including the area in which the target is located in the captured image P1 from the captured image P1 as follows.
  • the “target object” is the coil C gripped by the lifting tool 30.
  • FIG. 6 is a diagram showing the distance accuracy map P2 in the load grip control.
  • the distance information acquisition unit 41 corresponds distance information to a plurality of measurement points (not shown) in the range of the captured image P1 below the lifting tool 30 with the position in the captured image P1. Get it.
  • the measurement points may be set at positions where predetermined numbers (for example, 20) in the vertical direction of the captured image P1 and predetermined numbers (for example, 30) in the horizontal direction are arranged in a matrix at equal intervals.
  • the control unit 50 estimates, based on the distance information acquired by the distance information acquisition unit 41, a first area A1 in which the coil C, which is an object in the captured image P1, is located. More specifically, on the premise that the coil C has a predetermined shape and a predetermined size, the control unit 50 compares the lifting device 30 and the coil C with each other based on the current height information of the lifting device 30. Predict the current distance between. Then, the control unit 50 compares the predicted distance with the distance related to the distance information acquired by the distance information acquisition unit 41 to estimate the first area A1 in which the coil C is located in the captured image P1.
  • For example, if the difference between the predicted distance and the distance indicated by the distance information is within a predetermined range, the control unit 50 may determine that the coil C is located in that region, and if the difference is outside the predetermined range, it may determine that the coil C is not located in that region.
  • In the distance accuracy map P2, the control unit 50 correctly determines that the coil C is located in the vicinity of the axial center of the coil C in plan view (white area in FIG. 6).
  • On the other hand, in the region of the coil C away from the axial center, the measurement light of the distance information acquisition unit 41 strikes the mirror-like outer peripheral surface of the coil C at an angle close to parallel to that surface. In that region, the distance information acquisition unit 41 therefore cannot detect the reflected light, and as a result the control unit 50 erroneously determines that the coil C is not located there.
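  • A minimal sketch of the distance-based estimation just described, assuming the distance information has been resampled onto the image grid: the expected tool-to-coil distance is predicted from the current hoist height and the known coil size, and measurement points whose measured distance falls within a tolerance of that prediction are marked as the first region A1. The parameter names and the tolerance value are illustrative assumptions.

```python
import numpy as np

def estimate_first_region(distance_map, hoist_height_m, coil_top_height_m, tol_m=0.15):
    """Hypothetical sketch of the first-region (A1) estimation from distance information.

    distance_map      : H x W array of measured distances to the measurement points [m]
    hoist_height_m    : current height of the lifting tool above the floor (from the encoder)
    coil_top_height_m : height of the coil's upper surface, known from its predetermined
                        shape and size
    tol_m             : tolerance ("predetermined range"); the value is an assumption
    """
    predicted = hoist_height_m - coil_top_height_m        # expected tool-to-coil distance
    return np.abs(distance_map - predicted) <= tol_m      # True where the coil is assumed to lie
```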
  • FIG. 7 is a diagram showing the color accuracy map P3 in the load grip control.
  • The color information acquisition unit 42 acquires color information within the range of the captured image P1 below the lifting tool 30 in association with the position in the captured image P1.
  • The control unit 50 estimates, based on the color information acquired by the color information acquisition unit 42, the second region A2 in which the coil C, which is the object in the captured image P1, is located. More specifically, on the premise that the coil C has a predetermined color, the control unit 50 estimates the second region A2 in which the coil C is located based on the degree of color correlation with that predetermined color in the captured image P1.
  • A known method can be used to determine the degree of color correlation. For example, the control unit 50 may determine the degree of color correlation by converting the color information into coordinates on a chromaticity diagram and evaluating the distance between those coordinates and the coordinates, stored in advance, of the color of the coil C on the chromaticity diagram.
  • the control unit 50 stores the color of the coil C in advance.
  • The control unit 50 may determine that the coil C is located in a region if the degree of color correlation is equal to or higher than a predetermined value, and that the coil C is not located in the region if it is lower than the predetermined value.
  • In the color accuracy map P3, the control unit 50 correctly determines that the coil C is located in the portion of the coil C excluding the reflected portion R in plan view (white area in FIG. 7). On the other hand, since the reflected portion R has a color greatly different from the color of the coil, the control unit 50 erroneously determines that the coil C is not located there.
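  • A minimal sketch of the color-based estimation just described, using (r, g) chromaticity coordinates as one possible realization of the chromaticity-diagram comparison. The chromaticity space and the threshold are illustrative assumptions, not necessarily the embodiment's exact method.

```python
import numpy as np

def estimate_second_region(color_image, reference_rgb, max_chroma_dist=0.05):
    """Hypothetical sketch of the second-region (A2) estimation from color information.

    Each pixel is mapped to (r, g) chromaticity coordinates and kept if its chromaticity
    lies close to that of the stored object color. The chromaticity space and the
    threshold are illustrative choices, not necessarily the embodiment's exact method.
    """
    rgb = color_image.astype(np.float64) + 1e-9              # avoid division by zero
    chroma = rgb[..., :2] / rgb.sum(axis=-1, keepdims=True)  # (r, g) per pixel

    ref = np.asarray(reference_rgb, dtype=np.float64) + 1e-9
    ref_chroma = ref[:2] / ref.sum()

    dist = np.linalg.norm(chroma - ref_chroma, axis=-1)      # distance on the chromaticity plane
    return dist <= max_chroma_dist
```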
  • FIG. 8 is a diagram showing an OR probability map P4 in the load grip control.
  • the OR accuracy map P4 shown in FIG. 8 shows an OR region A3 (white part in FIG. 8) included in at least one of the first region A1 of the distance accuracy map P2 and the second region A2 of the color accuracy map P3.
  • the control unit 50 sets an area including the OR area A3 in the OR probability map P4 (for example, an area obtained by enlarging the outer edge of the OR area A3 by a predetermined width) as the target area A4.
  • the control unit 50 extracts, from the captured image P1, a target region A4 which is a region in which the coil C is located in the captured image P1.
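  • The "region obtained by enlarging the outer edge of the OR area A3 by a predetermined width" could be realized, for example, by a morphological dilation of the OR region, as in the sketch below. The margin value and the square structuring element are illustrative assumptions.

```python
import cv2
import numpy as np

def target_region_from_or_area(or_area, margin_px=10):
    """Hypothetical extraction of the target region A4 from the OR region A3.

    The OR region is enlarged outward by a fixed margin, mirroring the 'predetermined
    width' in the description; the margin and the square structuring element are
    illustrative assumptions.
    """
    kernel = np.ones((2 * margin_px + 1, 2 * margin_px + 1), np.uint8)
    enlarged = cv2.dilate(or_area.astype(np.uint8), kernel)
    return enlarged.astype(bool)
```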
  • control unit 50 detects a corresponding portion corresponding to the shape of the reference image stored in advance in the target area A4 of the captured image P1 as follows.
  • FIG. 9 is a view showing a reference image P5.
  • FIG. 10 is a view showing a highlight image P6 in the load grip control.
  • The control unit 50 stores in advance, as a reference image P5, the shape of the coil C, which is the object, as it appears in the captured image P1 (specifically, the shape of the coil C in plan view). Then, the control unit 50 detects, using a known image recognition method, a corresponding portion A5 corresponding to the shape of the reference image P5 in the region corresponding to the target region A4 in the captured image P1.
  • the "corresponding to the shape of the reference image P5" may have a size different from that of the reference image P5, or may be a shape in which the reference image P5 is inclined around an axis in the vertical direction.
  • image recognition method for example, pattern matching, machine learning, etc. may be used. Note that the control unit 50 does not detect the corresponding part A5 outside the target area A4 in the captured image P1.
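  • As one concrete stand-in for the "known image recognition method", the sketch below uses OpenCV template matching and discards any match whose window center lies outside the target region A4, so that nothing is detected outside the target region. Scale and rotation handling, which the description allows for, is omitted; the score threshold is an assumption.

```python
import cv2
import numpy as np

def detect_corresponding_part(captured_gray, reference_gray, target_region,
                              score_threshold=0.7):
    """Hypothetical corresponding-part (A5) detection by template matching.

    OpenCV template matching stands in for the 'known image recognition method';
    matches whose window center falls outside the target region A4 are discarded.
    """
    h, w = reference_gray.shape
    scores = cv2.matchTemplate(captured_gray, reference_gray, cv2.TM_CCOEFF_NORMED)

    # Keep only match positions whose window center lies inside the target region A4.
    centers_inside = target_region[h // 2:h // 2 + scores.shape[0],
                                   w // 2:w // 2 + scores.shape[1]]
    scores = np.where(centers_inside, scores, -1.0)

    y, x = np.unravel_index(np.argmax(scores), scores.shape)
    if scores[y, x] < score_threshold:
        return None                      # nothing resembling the reference image P5 found
    return (x, y, w, h)                  # top-left corner and size of the corresponding part
```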
  • the control unit 50 generates an emphasized image P6 in which the corresponding part A5 is emphasized in the captured image P1, and causes the display unit 43 to display the generated emphasized image P6.
  • the emphasis image P6 displays the target area A4 in color and displays areas other than the target area A4 in gray.
  • the corresponding part A5 is surrounded by a broken line.
  • The manner of emphasizing the corresponding portion A5 is not limited to surrounding it with a broken line, as long as it is emphasized so as to be easily recognized by the operator; for example, it may be displayed in a predetermined color, or it may be cut out.
  • The control unit 50 calculates, from the height information acquired based on the detection result of the encoder 31, a reference portion A6 at which the corresponding portion A5 should be positioned in the captured image P1 when the lifting tool 30 grips the coil C.
  • the reference portion A6 is, for example, the shape of the coil C in a plan view in the case where the coil C is placed on the skid S directly under the lifting tool 30 (downward in the vertical direction).
  • the control unit 50 detects positional deviation between the corresponding portion A5 and the reference portion A6 in the captured image P1 (or the enhanced image P6).
  • The control unit 50 may also detect the relative distance in the height direction between the lifting tool 30 and the coil C based on the ratio of the sizes of the corresponding portion A5 and the reference portion A6, and may detect a shift in the rotational direction between the lifting tool 30 and the coil C based on the angular deviation between the corresponding portion A5 and the reference portion A6.
  • the control unit 50 outputs information on the detected positional deviation to the conveyance control unit 51.
  • The transport control unit 51 causes the moving mechanism 10 to move the lifting tool 30 so as to eliminate the positional deviation, based on the information on the positional deviation input from the control unit 50. That is, the moving mechanism 10 moves the lifting tool 30 to a position directly above the coil C by moving the lifting tool 30 through the movement of the girder 12 and the traversing of the trolley 13 so that the positional deviation is eliminated. At this time, the lifting tool 30 may be rotationally moved about the vertical axis by driving the rotation mechanism as needed.
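  • The correction step can be pictured as converting the pixel offset between the corresponding portion A5 and the reference portion A6 into travel and traverse commands, as in the sketch below. The pixel-to-metre scale and the use of the size ratio as a height cue are illustrative assumptions, not values from this description.

```python
def positional_deviation(corresponding_box, reference_box, metres_per_pixel):
    """Hypothetical computation of the deviation the moving mechanism must cancel.

    corresponding_box, reference_box : (x, y, w, h) in the captured image P1
    metres_per_pixel                 : assumed image scale at the current hoist height
    Returns the horizontal offsets (girder travel, trolley traverse) in metres and the
    size ratio, which the description uses as a cue for the height-direction distance.
    """
    cx_c = corresponding_box[0] + corresponding_box[2] / 2.0
    cy_c = corresponding_box[1] + corresponding_box[3] / 2.0
    cx_r = reference_box[0] + reference_box[2] / 2.0
    cy_r = reference_box[1] + reference_box[3] / 2.0

    dx_m = (cx_r - cx_c) * metres_per_pixel   # move so the part ends up at the reference
    dy_m = (cy_r - cy_c) * metres_per_pixel
    size_ratio = corresponding_box[2] / reference_box[2]
    return dx_m, dy_m, size_ratio

# Example: a part detected 40 px left of where it should be, at 5 mm per pixel,
# asks the mechanism for a 0.2 m correction along that image axis.
print(positional_deviation((100, 200, 80, 80), (140, 200, 80, 80), 0.005))
```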
  • the method using the OR probability map P4 has been described above for the display of the enhanced image P6 in the load grip control by the control unit 50.
  • The control unit 50 can also use a method based on an AND probability map instead of the OR probability map P4 for displaying the emphasized image P6 in the load grip control.
  • FIG. 11 is a view showing an AND probability map P7 in the load grip control.
  • The AND probability map P7 shown in FIG. 11 shows the AND region A7 (white part in FIG. 11) that is included in the first region A1 of the distance accuracy map P2 and also included in the second region A2 of the color accuracy map P3.
  • the control unit 50 may set an area including the AND area A7 in the AND likelihood map P7 (for example, an area obtained by expanding the outer edge of the AND area A7 by a predetermined width) as the target area.
  • the control unit 50 may extract, from the captured image P1, a target region which is a region in which the coil C is located in the captured image P1. Note that whether to use the OR probability map P4 or the AND probability map P7 is appropriately selected according to the object, for example.
  • the control unit 50 can display a highlighted image also in the unloading control by processing the captured image in the same manner as the load grip control. Hereinafter, the display of the emphasized image in the unloading control will be described.
  • FIG. 12 is a view showing an emphasized image P8 in the unloading control.
  • the highlighted image P8 in FIG. 12 includes the coil C, the claw portion 19 of the hanger 30 holding the coil C, and a part of the skid S to which the coil C is unloaded.
  • the skid S has a loading mark M having a predetermined shape, a predetermined size, and a predetermined color.
  • the lifting tool 30 is at a position offset from immediately above the skid S to which the coil C is unloaded, and is tilted about the vertical axis with respect to the skid S.
  • the “object” is the stowing mark M of the skid S to which the coil C held by the hanger 30 is unloaded.
  • Here, the shape and size of the stowage mark M are taken as the shape and size, in plan view, of a virtual columnar portion of the skid S bearing the stowage mark M that extends vertically from the entire stowage mark M to the floor surface F.
  • The emphasized image P8 displays, with respect to the captured image taken before the coil C is unloaded onto the skid S, the target region A8 including the OR region in the OR probability map in color, and grays out the area other than the target region A8.
  • the target area A8 may be set based on the AND area in the AND likelihood map instead of the OR area in the OR likelihood map.
  • the area of the stowing mark M which is the corresponding portion A9 detected in the target area A8, is surrounded by a broken line.
  • the reference portion A10 where the corresponding portion A9 should be positioned when unloading the coil C is performed by the lifting tool 30 is indicated by a dashed dotted line.
  • the reference portion A10 is a shape of the stowage mark M in a plan view in the case where the stowing mark M is located immediately below the hanger 30 (downward in the vertical direction).
  • The control unit 50 detects the positional deviation between the corresponding portion A9 and the reference portion A10. Further, the control unit 50 may detect the relative distance in the height direction between the lifting tool 30 and the coil C based on the ratio of the sizes of the corresponding portion A9 and the reference portion A10, and may detect a shift in the rotational direction between the lifting tool 30 and the coil C based on the angular deviation between the corresponding portion A9 and the reference portion A10.
  • the control unit 50 outputs information on the detected positional deviation to the conveyance control unit 51.
  • the transport control unit 51 causes the moving mechanism 10 to move the hanger 30 so as to eliminate the positional deviation, based on the information on the positional deviation input from the control unit 50. That is, the moving mechanism 10 moves the lifting tool 30 immediately above the skid S by moving the lifting tool 30 so that the positional deviation is eliminated by the movement of the girder 12, the traversing of the trolley 13, and the drive of the rotation mechanism.
  • As described above, in the crane apparatus 1, the first region A1 in which the coil C or the stowage mark M, which is the object, is located is estimated based on the distance information in the captured image P1 obtained by imaging the area below the lifting tool 30, and the second region A2 in which the coil C or the stowage mark M, which is the object, is located is estimated based on the color information. Then, based on these estimation results, the crane apparatus 1 extracts from the captured image P1 the target regions A4 and A8 including the region in which the object is located, and emphasizes and displays the corresponding portions A5 and A9 corresponding to the shape of the reference image within the target regions A4 and A8.
  • Thereby, the crane apparatus 1 can acquire the positional relationship between the lifting tool 30 and the object by using the distance information and the color information in a complementary manner. Therefore, since the crane apparatus 1 is less susceptible to the influence of disturbance light and the influence of specular reflection on the object, it can acquire the positional relationship between the lifting tool 30 and the object more reliably.
  • the control unit 50 does not detect the corresponding parts A5 and A9 outside the target areas A4 and A8 of the captured image P1. As a result, it is only necessary to detect corresponding parts A5 and A9 corresponding to the shape of the reference image for only the target areas A4 and A8 in the captured image P1, so that the time required for processing can be shortened.
  • Further, the control unit 50 estimates, based on the distance information, the first region A1 in which the coil C or the stowage mark M is located in the captured image P1, estimates, based on the color information, the second region A2 in which the coil C or the stowage mark M is located in the captured image P1, and extracts from the captured image P1, as the target regions A4 and A8, regions including the OR region contained in at least one of the first region A1 and the second region A2. In this way, a region in which the object is estimated to be located based on the color information, which is less susceptible to specular reflection on the object, is also extracted from the captured image P1 as the target regions A4 and A8. Therefore, a failure to detect the object despite its presence is suppressed, and the positional relationship between the lifting tool 30 and the object can be acquired more reliably.
  • Alternatively, the control unit 50 estimates, based on the distance information, the first region A1 in which the coil C or the stowage mark M is located in the captured image P1, estimates, based on the color information, the second region A2 in which the coil C or the stowage mark M is located in the captured image P1, and extracts from the captured image P1, as the target regions A4 and A8, regions including the AND region contained in both the first region A1 and the second region A2. In this case, only a region in which the object is estimated to be located not only by the distance information, which is susceptible to disturbance light, but also by the color information, which is less easily affected by disturbance light, is extracted from the captured image P1 as the target regions A4 and A8. Therefore, erroneous detection of the object in a region where the object does not exist is suppressed, and the positional relationship between the lifting tool 30 and the object can be acquired more reliably.
  • Further, the crane apparatus 1 includes the moving mechanism 10 for moving the lifting tool 30, and the control unit 50 detects, when the lifting tool 30 grips or unloads the coil C, the positional deviation between the corresponding portions A5 and A9 in the captured image P1 and the reference portions A6 and A10 where the corresponding portions A5 and A9 should be positioned, and causes the moving mechanism 10 to move the lifting tool 30 so that the positional deviation is eliminated. Thereby, the lifting tool 30 can be accurately positioned directly above the object.
  • the object is a coil C gripped by the lifting tool 30.
  • the object is the stowing mark M of the skid S to which the coil C held by the hanger 30 is unloaded.
  • Second Embodiment [Configuration of crane device]
  • Next, the crane apparatus according to the second embodiment will be described.
  • the crane apparatus according to the second embodiment is different from the crane apparatus 1 according to the first embodiment in the type of crane and the load to be transferred.
  • points different from the crane apparatus 1 according to the first embodiment will be mainly described.
  • FIG. 13 is a block diagram showing a crane apparatus 1A according to a second embodiment.
  • FIG. 14 is a front view of the crane apparatus 1A.
  • FIG. 15 is a perspective view of the crane apparatus 1A.
  • The crane apparatus 1A according to the second embodiment is a container handling crane that is disposed, for example, in a container yard Y of a container terminal where containers (loads) D are transferred to and from a docked container ship, and that loads and unloads the containers D.
  • the crane apparatus 1A includes a moving mechanism 10A, a hanger 30A, an imaging unit 40, a distance information acquisition unit 41, a color information acquisition unit 42, a display unit 43, a control unit 50, and a cab 60A.
  • the hanger 30A is also referred to as a spreader.
  • the crane apparatus 1A includes a transport control unit 51A that controls the moving mechanism 10A and the hanging tool 30A, and an encoder 31 and a shake sensor 32 connected to the transport control unit 51A.
  • the container D is a container such as an ISO standard container.
  • the container D is in the form of a long rectangular parallelepiped and has a predetermined length of, for example, 20 feet or 40 feet in the longitudinal direction.
  • The container D is provided with locked portions G, each having a hole Gh, at the four corners of its upper surface (see FIG. 16).
  • the container D is stacked in one or more stages in a container yard Y to form a plurality of rows E.
  • the rows E are vertically and horizontally aligned such that the longitudinal direction of the containers D constituting the rows E is parallel to the longitudinal direction of the containers D constituting the other rows E.
  • the container D has a predetermined shape and a predetermined size.
  • the locked portion G has a predetermined shape, a predetermined size, and a predetermined color.
  • the container D which is about to be gripped by the hanger 30A is referred to as a target container D1 (see FIG. 14).
  • the container D held by the hanger 30A is referred to as a holding container D2
  • the container D on which the holding container D2 is about to be unloaded is referred to as a target container (placement unit) D3 (see FIG. 15).
  • Which of the designations target container D1, holding container D2, and target container D3 applies to a given container D changes according to the transfer status of that container D.
  • the moving mechanism 10A is a mechanism that moves the lifting tool 30A (horizontally, vertically, and rotationally).
  • the moving mechanism 10A includes a traveling device 14, two pairs of legs 15, 15, a girder 12A, and a trolley 13A.
  • The traveling device 14 includes tire wheels provided at the lower ends of the two pairs of leg portions 15, 15; by driving the tire wheels with a traveling motor, the two pairs of leg portions 15, 15 can be made to travel back and forth.
  • The girder 12A is laid substantially horizontally over the upper end portions of the two pairs of leg portions 15, 15. As a result, the girder 12A can be moved in the horizontal direction orthogonal to its extending direction by causing the traveling device 14 to move the two pairs of leg portions 15, 15 back and forth.
  • the trolley 13A and the hanger 30A can be moved in this direction.
  • the trolley 13A traverses the upper surface of the girder 12A along the extending direction of the girder 12A. Thereby, the trolley 13A can move the hanger 30A along the extension direction of the girder 12A.
  • the trolley 13A has a hoisting mechanism 11A that winds up and down the hanging wire rope 17A.
  • the trolley 13A has a rotation mechanism (not shown) that rotationally moves the hanger 30A around an axis in the vertical direction via the wire rope 17A.
  • a traveling path W of a transport carriage V such as a trailer or an AGV (Automated Guided Vehicle) is laid.
  • the crane apparatus 1A grips the container D carried in by the transport carriage V, and the container D is placed on the container yard Y or another container D (target container D3) placed on the container yard Y. Unload on top.
  • Also, the crane apparatus 1A grips the container D (target container D1) placed on the container yard Y or on another container D placed on the container yard Y, moves it, and unloads it onto the transport carriage V; the container D is then carried out by the transport carriage V.
  • the hanger 30A is a device for gripping, holding, and unloading the container D.
  • the hanger 30A holds the container D from the upper surface side.
  • the hanger 30A is suspended on a wire rope 17A suspended from the trolley 13A.
  • the lifting gear 30A moves upward by winding up the wire rope 17A by the winding mechanism 11A of the trolley 13A, and moves downward by rolling down the wire rope 17A by the winding mechanism 11A.
  • the hanger 30A has a main body 18A and four lock pins (not shown).
  • the main body portion 18A has a shape and a size corresponding to the shape and the size of the container D in a plan view. That is, the main body portion 18A has a long rectangular shape in plan view.
  • the body portion 18A includes a sheave 22 around which the wire rope 17A is wound, on the upper surface side of the central portion in the longitudinal direction.
  • the lock pin is a mechanism for holding the container D.
  • the lock pins are provided at the four corners of the lower surface of the main body portion 18A so as to project downward from the main body portion 18A.
  • the lock pin is provided at a position corresponding to the hole Gh of the engaged portion G of the container D when the hanger 30A holds the container D.
  • the lock pin is, for example, a twist lock pin, and includes at its lower end a locking piece rotatable about a vertical axis.
  • the lock pins are engaged with the container D by entering holes Gh of the engaged portions G provided at the four corners of the upper surface of the container D and rotating the locking pieces by 90 degrees.
  • The encoder 31 is a sensor that detects the hoisting and lowering operation amount of the hoisting mechanism 11A and the traverse movement amount of the trolley 13A. Based on the detection result of the encoder 31, the transport control unit 51A acquires height information indicating the current height of the lifting tool 30A. The encoder 31 may detect only the hoisting and lowering operation amount of the hoisting mechanism 11A, without detecting the traverse movement amount of the trolley 13A.
  • the shake sensor 32 is a sensor that detects the amount of shake of the hanger 30A due to the wire rope 17A swinging. Based on the detection result of the shake sensor 32, the conveyance control unit 51A acquires shake amount information indicating the current shake amount of the lifting tool 30A.
  • the transfer control unit 51A controls the drive of the crane device 1A.
  • The transport control unit 51A controls the movement of the girder 12A, the traversing of the trolley 13A, the winding up and down of the wire rope 17A by the hoisting mechanism 11A (that is, the vertical movement of the lifting tool 30A), the rotational movement of the lifting tool 30A by the rotation mechanism, and the rotation of the locking pieces of the lock pins.
  • The transport control unit 51A outputs, to the control unit 50, the height information of the lifting tool 30A acquired based on the detection result of the encoder 31 and the shake amount information of the lifting tool 30A acquired based on the detection result of the shake sensor 32.
  • the imaging unit 40 is an imaging device that is provided on the hanging tool 30A and captures an image of the lower side of the hanging tool 30A to obtain a captured image.
  • the configuration and function of the imaging unit 40 are the same as those described above in the first embodiment.
  • the distance information acquisition unit 41 is a device provided in the hanging tool 30A to obtain distance information to a plurality of measurement points within the range of the captured image captured by the imaging unit 40 below the lifting tool 30A.
  • the measurement points are, for example, points set on the upper surface of the container yard Y, the container D mounted on the container yard Y, and the like, and may be set, for example, in a matrix.
  • the measurement point may be set to a predetermined relative position based on, for example, the position coordinate of the distance information acquisition unit 41 in the horizontal plane.
  • the configuration and function of the distance information acquisition unit 41 are the same as those described above in the first embodiment.
  • the color information acquisition unit 42 is a device that is provided to the hanging tool 30A and acquires color information within the range of the captured image captured by the imaging unit 40 below the lifting tool 30A.
  • the configuration and function of the color information acquisition unit 42 are the same as those described above in the first embodiment.
  • the display unit 43 is a display device that displays a captured image. Further, the display unit 43 displays the emphasized image generated by the control unit 50.
  • the display unit 43 may be, for example, a display.
  • the display unit 43 is provided, for example, in the driver's cab 60A.
  • The control unit 50 processes the captured image to generate an emphasized image, and causes the display unit 43 to display the generated emphasized image.
  • the control unit 50 can generate an enhanced image in load gripping control in which the object container D1 is gripped by the lifting implement 30A, and an enhanced image in unloading control in which the holding container D2 is unloaded onto the target container D3.
  • The control unit 50 displays the emphasized image in the load grip control of the second embodiment by processing the captured image in the same manner as in the load grip control of the first embodiment.
  • FIG. 16 is a diagram showing a reference image P9.
  • FIG. 17 is a diagram showing a highlight image P10 in the load grip control.
  • the control unit 50 stores in advance the shape of the locked portion G of the container D as a reference image P9.
  • the highlighted image P10 in FIG. 17 includes a part of the main body 18A of the hanger 30A and the vicinity of the engaged portion G of the target container D1 which is about to be gripped by the hanger 30A.
  • the hanger 30A is at a position offset from immediately above the target container D1.
  • the “target object” is the locked portion G of the target container D1.
  • the control unit 50 estimates, based on the distance information acquired by the distance information acquisition unit 41, the first region in which the to-be-locked portion G of the target container D1 which is the target object is located. In addition, the control unit 50 estimates a second region in which the to-be-locked portion G of the target container D1 that is the target object is located, based on the color information acquired by the color information acquisition unit 42. The control unit 50 sets an area including the OR area included in at least one of the first area and the second area as the target area A11. As described above, the control unit 50 extracts the target area A11 from the captured image.
  • control unit 50 detects, in the target area A11, a corresponding portion corresponding to the shape of the reference image stored in advance using a known image recognition method.
  • the control unit 50 does not detect the corresponding part A12 outside the target area A11 in the captured image.
  • the control unit 50 generates an emphasized image P10 in which the corresponding part A12 detected in the captured image is emphasized, and causes the display unit 43 to display the generated emphasized image P10.
  • the highlighted image P10 displays the target area A11 in color and the areas other than the target area A11 in gray with respect to the captured image before the target container D1 is grasped.
  • the target area A11 may be set based on the AND area in the AND likelihood map instead of the OR area in the OR probability map.
  • the area of the locked portion G which is the corresponding part A12 detected in the target area A11 is surrounded by a broken line.
  • a reference portion A13 where the corresponding portion A12 should be positioned when the lifting tool 30A grips the target container D1 is indicated by a dashed-dotted line.
  • the reference portion A13 is a shape of the locked portion G in a plan view when the locked portion G is positioned directly below the hanging tool 30A (downward in the vertical direction).
  • the control unit 50 detects positional deviation between the corresponding part A12 and the reference part A13.
  • The control unit 50 may also detect the relative distance in the height direction between the lifting tool 30A and the target container D1 based on the ratio of the sizes of the corresponding portion A12 and the reference portion A13, and may detect a shift in the rotational direction between the lifting tool 30A and the target container D1 based on the angular deviation between the corresponding portion A12 and the reference portion A13.
  • the control unit 50 outputs information on the detected positional deviation to the conveyance control unit 51A.
  • the transport control unit 51A causes the moving mechanism 10A to move the hanger 30A based on the information on the positional deviation input from the control unit 50 so that the positional deviation is eliminated.
  • the moving mechanism 10A moves the hanger 30A to the position immediately above the target container D1 by moving the hanger 30A such that the positional deviation is eliminated by the movement of the girder 12A and the traversing of the trolley 13A.
  • At this time, the lifting tool 30A may be rotationally moved about the vertical axis by driving the rotation mechanism as needed.
  • the control unit 50 can display a highlighted image also in the unloading control by processing the captured image in the same manner as the load grip control. Hereinafter, the display of the emphasized image in the unloading control will be described.
  • FIG. 18 is a view showing a highlighted image P11 in unloading control.
  • The emphasized image P11 in FIG. 18 includes the vicinity of the locked portion G of the target container D3 onto which the holding container D2 is to be unloaded.
  • the hanger 30A is at a position deviated from immediately above the target container D3.
  • the “object” is the locked portion G of the target container D3.
  • The emphasized image P11 displays, with respect to the captured image taken before the holding container D2 is unloaded onto the target container D3, the target region A11, which is a region including the OR region in the OR probability map, in color, and displays the area other than the target region A11 in gray.
  • the target area A11 may be set based on the AND area in the AND likelihood map instead of the OR area in the OR probability map.
  • the area of the locked portion G which is the corresponding part A12 detected in the target area A11 is surrounded by a broken line.
  • a reference portion A13 where the corresponding portion A12 should be positioned when the lifting tool 30A unloads the holding container D2 is displayed by a dashed-dotted line.
  • the reference portion A13 is a shape of the locked portion G in a plan view when the locked portion G is positioned directly below the hanging tool 30A (downward in the vertical direction).
  • The control unit 50 detects the positional deviation between the corresponding portion A12 and the reference portion A13. Further, the control unit 50 may detect the relative distance in the height direction between the lifting tool 30A and the target container D3 based on the ratio of the sizes of the corresponding portion A12 and the reference portion A13, and may detect a shift in the rotational direction between the lifting tool 30A and the target container D3 based on the angular deviation between the corresponding portion A12 and the reference portion A13.
  • the control unit 50 outputs information on the detected positional deviation to the conveyance control unit 51A.
  • the transport control unit 51A causes the moving mechanism 10A to move the hanger 30A based on the information on the positional deviation input from the control unit 50 so that the positional deviation is eliminated.
  • That is, the moving mechanism 10A moves the lifting tool 30A to a position directly above the target container D3 by moving the lifting tool 30A through the movement of the girder 12A and the traversing of the trolley 13A so that the positional deviation is eliminated.
  • At this time, the lifting tool 30A may be rotationally moved about the vertical axis by driving the rotation mechanism as needed.
  • In addition, the crane apparatus 1A detects in advance the positional deviation between the corresponding portion A12 and the reference portion A13 in a state where the lifting tool 30A is positioned at a predetermined height, before the locked portion G of the target container D3 becomes hidden from view, and calibrates the detected value of the shake sensor 32 based on the detection result, so that the detected value of the shake sensor 32 is associated with the positional deviation.
  • The crane apparatus 1A then lowers the holding container D2 toward the target container D3 with the lifting tool 30A while monitoring the state of positional deviation between the corresponding portion A12 and the reference portion A13 based on the detection result of the shake sensor 32. Thereby, the crane apparatus 1A can maintain the state where the lifting tool 30A is located directly above the target container D3.
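  • One way the association between the shake-sensor value and the image-based positional deviation might be realized is a simple linear calibration, fitted while the locked portion G is still visible and then used to estimate the deviation after it disappears from view. The least-squares line below is a speculative sketch, not the method actually claimed; all names are assumptions.

```python
import numpy as np

def calibrate_shake_to_deviation(shake_samples, deviation_samples_px):
    """Fit a linear map from shake-sensor readings to image positional deviation.

    shake_samples        : shake amounts recorded while the locked portion G was
                           still visible in the captured image
    deviation_samples_px : positional deviations (pixels) measured at the same times
    Returns (gain, offset) such that deviation ~= gain * shake + offset.
    A least-squares line is an illustrative model, not the patent's method.
    """
    gain, offset = np.polyfit(np.asarray(shake_samples),
                              np.asarray(deviation_samples_px), deg=1)
    return gain, offset

def deviation_from_shake(shake_value, gain, offset):
    """Estimate the positional deviation from the current shake-sensor value."""
    return gain * shake_value + offset
```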
  • As described above, in the crane apparatus 1A, the first region in which the locked portion G of the target container D1 or of the target container D3, which is the object, is located is estimated based on the distance information, and the second region in which the locked portion G of the target container D1 or of the target container D3, which is the object, is located is estimated based on the color information.
  • Then, based on these estimation results, the crane apparatus 1A extracts from the captured image the target region A11 including the region in which the object is located, and emphasizes and displays the corresponding portion A12 corresponding to the shape of the reference image P9 within the target region A11.
  • Thereby, the crane apparatus 1A can acquire the positional relationship between the lifting tool 30A and the object by using the distance information and the color information in a complementary manner. Therefore, since the crane apparatus 1A is less susceptible to, for example, the influence of disturbance light, the influence of specular reflection on the object, and the detection of raindrops, it can acquire the positional relationship between the lifting tool 30A and the object more reliably.
  • the control unit 50 does not detect the corresponding part A12 outside the target area A11 of the captured image. As a result, it is only necessary to detect the corresponding part A12 corresponding to the shape of the reference image P9 only in the target area A11 of the captured image, so that the time required for processing can be shortened.
  • Further, the control unit 50 estimates, based on the distance information, the first region in the captured image in which the locked portion G of the target container D1 or of the target container D3, which is the object, is located, estimates, based on the color information, the second region in the captured image in which the locked portion G of the target container D1 or of the target container D3 is located, and extracts from the captured image, as the target region A11, a region including the OR region contained in at least one of the first region and the second region. In this way, a region in which the object is estimated to be located based on the color information is also extracted from the captured image as the target region A11. Therefore, a failure to detect the object despite its presence is suppressed, and the positional relationship between the lifting tool 30A and the object can be acquired more reliably.
  • the control unit 50 estimates, based on the distance information, a first region of the captured image in which the to-be-locked portion G of the target container D1 or of the target container D3, which is the target object, is located, estimates, based on the color information, a second region of the captured image in which the to-be-locked portion G is located, and extracts from the captured image, as the target area A11, a region including the AND region contained in both the first region and the second region.
  • in this way, only a region estimated to contain the target object not only by the distance information, which is susceptible to disturbance light, but also by the color information, which is less affected by disturbance light, is extracted from the captured image as the target area A11. Erroneously detecting the target object in a region where it does not exist is therefore suppressed, and the positional relationship between the lifting tool 30A and the target object can be acquired more reliably.
  • the crane device 1A includes the moving mechanism 10A for moving the lifting tool 30A. When the lifting tool 30A loads or unloads the container D as a load, the control unit 50 detects the positional deviation between the corresponding portion A12 in the captured image and the reference portion A13 at which the corresponding portion A12 should be positioned in the captured image, and causes the moving mechanism 10A to move the lifting tool 30A so that the positional deviation is eliminated (a deviation-correction sketch follows this list). The lifting tool 30A can thereby be accurately positioned directly above the target object.
  • the target object is the to-be-locked portion G of the target container D1 that is to be gripped by the lifting tool 30A.
  • the target object is the to-be-locked portion G of the target container D3 onto which the holding container D2 held by the lifting tool 30A is unloaded.
  • two or all of the imaging unit 40, the distance information acquisition unit 41, and the color information acquisition unit 42 may be configured as a common device.
  • for example, the imaging unit 40 and the color information acquisition unit 42 may be configured by a common color camera, or the imaging unit 40, the distance information acquisition unit 41, and the color information acquisition unit 42 may be configured by a common TOF camera with an integrated RGB image sensor.
  • when an object is detected, based on the OR accuracy map or the AND accuracy map, in a region where the target object should not exist, the control unit 50 may recognize that object as an obstacle.
  • the target object may be any object having a predetermined shape, a predetermined size, and a predetermined color.
  • the target object may be something other than the loading mark M of the coil C or the skid S.
  • for example, the target object may be a part of the coil C or the entire skid S.
  • the target object may be something other than the to-be-locked portion G of the target container D1 or the to-be-locked portion G of the target container D3.
  • for example, the target object may be the entire target container D1 or the entire target container D3.
  • the control unit 50 may store the reference portions A6, A10, and A13 in advance. In this case, the amount of calculation performed by the control unit 50 can be reduced.
  • the display unit 43 is not limited to the display provided in the driver's cab 60 or 60A, and any configuration can be adopted.
  • the display unit 43 may be a display of a portable terminal that can communicate with the crane devices 1 and 1A directly or indirectly through a network or the like.
  • the display unit 43 may be omitted. That is, the crane devices 1 and 1A need not include the display unit 43 when the control unit 50 performs automatic operation instead of manual operation by an operator.
  • the crane devices 1 and 1A are not limited to an overhead crane and a container handling crane, respectively, and may be various other cranes.
  • 1, 1A: crane device
  • 30, 30A: lifting tool
  • 40: imaging unit
  • 41: distance information acquisition unit
  • 42: color information acquisition unit
  • 43: display unit
  • 50: control unit
  • C: coil (load, target object)
  • D: container (load)
  • G: to-be-locked portion (target object)
  • M: loading mark (target object)
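
The OR/AND extraction of the target area A11 described in the list above can be illustrated with a minimal Python sketch. It assumes a depth map and an RGB image already aligned with the captured image, and that the to-be-locked portion G can be characterized in advance by an expected distance range and an expected HSV color range; the function names, parameters, and thresholds below are illustrative assumptions, not taken from the patent.

    import numpy as np
    import cv2

    def estimate_regions(depth_map, rgb_image, dist_range, hsv_lo, hsv_hi):
        # First region: pixels whose measured distance below the lifting tool
        # falls in the range expected for the to-be-locked portion G.
        first = (depth_map >= dist_range[0]) & (depth_map <= dist_range[1])
        # Second region: pixels whose color falls in the expected HSV range.
        hsv = cv2.cvtColor(rgb_image, cv2.COLOR_RGB2HSV)
        second = cv2.inRange(hsv, np.array(hsv_lo), np.array(hsv_hi)) > 0
        return first, second

    def extract_target_area(first, second, use_and=False):
        # AND keeps only pixels supported by both cues (fewer false detections);
        # OR keeps pixels supported by either cue (fewer missed detections).
        combined = (first & second) if use_and else (first | second)
        ys, xs = np.nonzero(combined)
        if xs.size == 0:
            return None  # no candidate region found
        # Bounding box (x, y, w, h) used as the target area A11.
        return (int(xs.min()), int(ys.min()),
                int(xs.max() - xs.min() + 1), int(ys.max() - ys.min() + 1))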
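
Detecting the corresponding portion A12 only inside the target area A11 and highlighting it can be done, for example, by ordinary template matching against the reference image P9. The following OpenCV sketch is one possible implementation under that assumption, not the patent's actual algorithm; it further assumes a grayscale template smaller than the target area, and the match threshold and drawing style are arbitrary choices.

    import cv2

    def detect_and_highlight(captured_bgr, target_area, reference_p9, threshold=0.7):
        # Search for the reference shape only inside the target area A11,
        # which keeps the processing time short.
        x, y, w, h = target_area
        gray = cv2.cvtColor(captured_bgr, cv2.COLOR_BGR2GRAY)
        roi = gray[y:y + h, x:x + w]

        result = cv2.matchTemplate(roi, reference_p9, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(result)
        if max_val < threshold:
            return captured_bgr, None  # corresponding portion not detected

        th, tw = reference_p9.shape[:2]
        top_left = (x + max_loc[0], y + max_loc[1])
        bottom_right = (top_left[0] + tw, top_left[1] + th)

        # Highlight the detected corresponding portion A12 on the displayed image.
        shown = captured_bgr.copy()
        cv2.rectangle(shown, top_left, bottom_right, (0, 0, 255), 3)
        return shown, (top_left[0] + tw // 2, top_left[1] + th // 2)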
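
The deviation-correction step can be sketched as converting the pixel offset between the detected corresponding portion A12 and the reference portion A13 into a girder-travel / trolley-traverse command. The axis mapping, the image scale factor, and the tolerance below are assumptions made only for illustration.

    def movement_command(a12_center, a13_center, px_per_metre, tol_px=3):
        # Pixel deviation of the detected corresponding portion A12 from the
        # reference portion A13 where it should be located.
        du = a12_center[0] - a13_center[0]
        dv = a12_center[1] - a13_center[1]
        if abs(du) <= tol_px and abs(dv) <= tol_px:
            return None  # lifting tool is already directly above the target

        # Assumed axis mapping: image u -> trolley traverse, image v -> girder travel.
        return {"trolley_traverse_m": -du / px_per_metre,
                "girder_travel_m": -dv / px_per_metre}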
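
The calibration of the shake sensor 32 against the image-based deviation can likewise be sketched as fitting a simple linear map while the to-be-locked portion G is still visible, and then using that map alone while the view is lost during lowering. A first-order fit is an assumption for illustration; the patent does not specify the form of the calibration.

    import numpy as np

    def calibrate_shake_sensor(sensor_values, image_deviations):
        # Fit deviation ~ gain * sensor + offset from samples gathered while the
        # corresponding portion A12 is still visible at a fixed, known height.
        gain, offset = np.polyfit(np.asarray(sensor_values, dtype=float),
                                  np.asarray(image_deviations, dtype=float), deg=1)
        return gain, offset

    def deviation_from_sensor(sensor_value, gain, offset):
        # Estimate the deviation from the shake sensor 32 alone, e.g. while the
        # holding container D2 is lowered and the view of portion G is blocked.
        return gain * sensor_value + offset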

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Control And Safety Of Cranes (AREA)

Abstract

The present invention relates to a crane device comprising: a lifting tool; an imaging unit that acquires a captured image by imaging below the lifting tool; a distance information acquisition unit that acquires distance information at a plurality of measurement points within the range of the captured image below the lifting tool; a color information acquisition unit that acquires color information within the range of the captured image below the lifting tool; a display unit; and a control unit that processes the captured image and causes it to be displayed on the display unit. The control unit extracts from the captured image, based on the distance information and the color information, a target area that includes the area in which a target object is located in the captured image, detects within the target area a corresponding portion that matches the shape of a reference image stored in advance, and displays the captured image on the display unit while highlighting the detected corresponding portion.
PCT/JP2018/026546 2017-09-05 2018-07-13 Dispositif de grue WO2019049511A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202110179044.5A CN112938766B (zh) 2017-09-05 2018-07-13 起重机装置
CN201880053626.5A CN111032561B (zh) 2017-09-05 2018-07-13 起重机装置
JP2019540803A JP6689467B2 (ja) 2017-09-05 2018-07-13 クレーン装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-170605 2017-09-05
JP2017170605 2017-09-05

Publications (1)

Publication Number Publication Date
WO2019049511A1 true WO2019049511A1 (fr) 2019-03-14

Family

ID=65633842

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/026546 WO2019049511A1 (fr) 2017-09-05 2018-07-13 Dispositif de grue

Country Status (3)

Country Link
JP (1) JP6689467B2 (fr)
CN (2) CN111032561B (fr)
WO (1) WO2019049511A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021048916A (ja) * 2019-09-20 2021-04-01 東芝ライテック株式会社 制御装置および制御方法
JP2021128059A (ja) * 2020-02-13 2021-09-02 コベルコ建機株式会社 ガイダンスシステム
WO2021193294A1 (fr) * 2020-03-24 2021-09-30 住友重機械搬送システム株式会社 Système de commande à distance et procédé de commande à distance

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113110166A (zh) * 2021-04-13 2021-07-13 镇江港务集团有限公司 一种卧式门机动力控制系统及其控制方法
RU2770940C1 (ru) * 2021-12-24 2022-04-25 Общество С Ограниченной Ответственностью "Малленом Системс" Система и способ автоматического определения положения мостового крана вдоль пути движения

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10245889B4 (de) * 2002-09-30 2008-07-31 Siemens Ag Verfahren und/oder Einrichtung zur Bestimmung einer Pendelung einer Last eines Hebezeuges
DE10245970B4 (de) * 2002-09-30 2008-08-21 Siemens Ag Verfahren bzw. Vorrichtung zur Erkennung einer Last eines Hebezeuges
CN103179369A (zh) * 2010-09-21 2013-06-26 株式会社锦宫事务 摄像对象物、图像处理程序及图像处理方法
CN102452611B (zh) * 2010-10-21 2014-01-15 上海振华重工(集团)股份有限公司 集装箱起重机的吊具空间姿态的检测方法和装置
FI125644B (fi) * 2011-07-18 2015-12-31 Konecranes Oyj Järjestelmä ja menetelmä nosturin tartuntaelimen sijainnin ja kiertymän määrittämiseksi
US9598836B2 (en) * 2012-03-29 2017-03-21 Harnischfeger Technologies, Inc. Overhead view system for a shovel
DE102012213604A1 (de) * 2012-08-01 2014-02-06 Ge Energy Power Conversion Gmbh Verladevorrichtung für Container sowie Verfahren zu deren Betrieb
CN203439940U (zh) * 2013-09-04 2014-02-19 贾来国 集装箱码头rtg/rmg双激光吊具防碰箱自动控制系统
CN204675650U (zh) * 2015-03-18 2015-09-30 苏州盈兴信息技术有限公司 一种生产物料贮运作业图像自动跟踪装置

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11167455A (ja) * 1997-12-05 1999-06-22 Fujitsu Ltd 手形状認識装置及び単色物体形状認識装置
JP2011198270A (ja) * 2010-03-23 2011-10-06 Denso It Laboratory Inc 対象認識装置及びそれを用いた制御装置、並びに対象認識方法
JP2015533747A (ja) * 2012-10-02 2015-11-26 コネクレーンズ ピーエルシーKonecranes Plc 積み荷操作装置を用いる積み荷の操作
JP2016151955A (ja) * 2015-02-18 2016-08-22 キヤノン株式会社 画像処理装置、撮像装置、距離計測装置、および画像処理方法

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021048916A (ja) * 2019-09-20 2021-04-01 東芝ライテック株式会社 制御装置および制御方法
JP7294024B2 (ja) 2019-09-20 2023-06-20 東芝ライテック株式会社 制御装置および制御方法
JP2021128059A (ja) * 2020-02-13 2021-09-02 コベルコ建機株式会社 ガイダンスシステム
JP7306291B2 (ja) 2020-02-13 2023-07-11 コベルコ建機株式会社 ガイダンスシステム
WO2021193294A1 (fr) * 2020-03-24 2021-09-30 住友重機械搬送システム株式会社 Système de commande à distance et procédé de commande à distance

Also Published As

Publication number Publication date
CN111032561B (zh) 2021-04-09
JP6689467B2 (ja) 2020-04-28
JPWO2019049511A1 (ja) 2020-05-28
CN112938766A (zh) 2021-06-11
CN111032561A (zh) 2020-04-17
CN112938766B (zh) 2023-08-15

Similar Documents

Publication Publication Date Title
JP6689467B2 (ja) クレーン装置
KR101699672B1 (ko) 컨테이너 크레인을 사용하여 랜딩 타깃 상에 컨테이너들을 자동적으로 랜딩하기 위한 방법 및 시스템
JP3785061B2 (ja) 荷役クレーンにおけるコンテナ位置検知方法及び装置並びにコンテナ着床、段積制御方法
JP4856394B2 (ja) コンテナクレーンの対象物位置計測装置と該対象物位置計測装置を用いた自動荷役装置
CN103030063B (zh) 用于确定集装箱吊架用的目标位置的方法和集装箱吊架
US20180282132A1 (en) Method, load handling device, computer program and computer program product for positioning gripping means
US20190262994A1 (en) Methods and systems for operating a material handling apparatus
US20220009748A1 (en) Loading A Container On A Landing Target
JP2018188299A (ja) コンテナターミナルシステムとその制御方法
EP3418244B1 (fr) Le chargement d'un conteneur sur une cible d'atterrissage
KR100624008B1 (ko) 크레인용 스프레더의 자동제어를 위한 자동착지시스템 및그 방법
JP6807131B2 (ja) 荷役搬送装置
JP7345335B2 (ja) クレーンの運転支援システム及びクレーンの運転支援方法
CN112996742B (zh) 起重机系统、起重机定位装置及起重机定位方法
CN110869305B (zh) 起重机装置
JP2022136849A (ja) 容器計測システム
WO2020184025A1 (fr) Grue et procédé de chargement avec une grue
KR20050007241A (ko) 컨테이너의 자동랜딩을 위한 스프레더의 절대위치검출방법 및 알고리즘
WO2024009342A1 (fr) Dispositif de transport et son procédé de commande
US20230382693A1 (en) Apparatus and method for determining the spatial position of pick-up elements of a container
Lim et al. A Visual Measurement System for Coil Shipping Automation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18853655

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019540803

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18853655

Country of ref document: EP

Kind code of ref document: A1