WO2019049511A1 - Crane device - Google Patents

Crane device

Info

Publication number
WO2019049511A1
Authority
WO
WIPO (PCT)
Prior art keywords
captured image
target
control unit
region
area
Application number
PCT/JP2018/026546
Other languages
French (fr)
Japanese (ja)
Inventor
小林 雅人
Original Assignee
住友重機械搬送システム株式会社
Application filed by 住友重機械搬送システム株式会社
Priority to CN202110179044.5A priority Critical patent/CN112938766B/en
Priority to CN201880053626.5A priority patent/CN111032561B/en
Priority to JP2019540803A priority patent/JP6689467B2/en
Publication of WO2019049511A1 publication Critical patent/WO2019049511A1/en

Images

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B66: HOISTING; LIFTING; HAULING
    • B66C: CRANES; LOAD-ENGAGING ELEMENTS OR DEVICES FOR CRANES, CAPSTANS, WINCHES, OR TACKLES
    • B66C 13/00: Other constructional features or details
    • B66C 13/18: Control systems or devices
    • B66C 13/22: Control systems or devices for electric drives
    • B66C 13/46: Position indicators for suspended loads or for crane elements
    • B66C 13/52: Details of compartments for driving engines or motors or of operator's stands or cabins
    • B66C 13/54: Operator's stands or cabins

Definitions

  • One aspect of the present invention relates to a crane apparatus.
  • Patent Document 1 describes a crane in which, when a placed container is to be grasped by a spreader or a container grasped by the spreader is to be stacked on another placed container, a distance map is acquired by a 3D camera provided on the spreader; based on the distance map, the crane detects the placed container and acquires the positional relationship between the spreader and that container.
  • the 3D camera used in such a crane outputs measurement light to the area including the container to be detected and detects the reflected light to obtain a distance map. For this reason, the 3D camera may erroneously detect a container in an area where no container to be detected is present, for example under the influence of strong sunlight or disturbance light outdoors. In addition, when used outdoors in the rain, the 3D camera may detect raindrops and fail to properly detect the target object. Furthermore, if the container to be detected has a mirror-surface portion, the measurement light may be specularly reflected in a direction in which the reflected light cannot be detected by the 3D camera. In this case, the 3D camera cannot acquire the distance map for the mirror surface, and as a result the container may fail to be detected despite the presence of the container to be detected.
  • one aspect of the present invention aims to provide a crane apparatus capable of more reliably acquiring the positional relationship between a lifting gear and an object.
  • a crane apparatus includes: a lifting tool that grips, holds, and unloads a load; an imaging unit, provided on the lifting tool, that captures an image below the lifting tool to obtain a captured image; a distance information acquisition unit, provided on the lifting tool, that acquires distance information to a plurality of measurement points within the range of the captured image below the lifting tool; a color information acquisition unit, provided on the lifting tool, that acquires color information within the range of the captured image; a display unit that displays the captured image; and a control unit that processes the captured image and causes the display unit to display it. Based on the distance information and the color information, the control unit extracts from the captured image a target area including the region where the target object is located, detects within the target area a corresponding portion corresponding to the shape of a reference image stored in advance, and causes the display unit to display the captured image with the corresponding portion highlighted.
  • in this crane apparatus, the region where the object is located is estimated based on the distance information, and the region where the object is located is also estimated based on the color information.
  • based on these estimation results, the apparatus extracts from the captured image a target area including the region in which the object is located, and highlights the corresponding portion corresponding to the shape of the reference image within the target area.
  • this device can acquire the positional relationship between the lifting gear and the object using the distance information and the color information in a complementary manner. It is therefore less susceptible to, for example, the influence of disturbance light and of specular reflection on the object, so the positional relationship between the lifting gear and the object can be acquired more reliably.
  • the control unit may refrain from detecting the corresponding portion outside the target area of the captured image. According to this, the corresponding portion corresponding to the shape of the reference image needs to be detected only within the target area of the captured image, so the time required for the processing can be shortened.
  • the control unit may estimate a first region in the captured image in which the target is located based on the distance information, estimate a second region in the captured image in which the target is located based on the color information, and extract from the captured image, as the target area, a region including the region included in at least one of the first region and the second region.
  • alternatively, the control unit may estimate a first region in the captured image in which the target is located based on the distance information, estimate a second region in the captured image in which the target is located based on the color information, and extract from the captured image, as the target area, a region including the region included in both the first region and the second region.
  • the crane apparatus may include a moving mechanism for moving the lifting gear, and when the lifting gear grips or unloads the load, the control unit may detect a positional deviation between the corresponding portion in the captured image and a reference portion at which the corresponding portion should be positioned, and cause the moving mechanism to move the hanger so as to eliminate the positional deviation. According to this, the lifting gear can be accurately positioned directly above the object.
  • the object may be at least a part of the load gripped by the lifting gear. According to this, in the situation where the load is gripped by the lifting tool, the positional relationship between the lifting tool and the load can be acquired more reliably.
  • the object may be at least a part of the mounting portion on which the load held by the lifting device is unloaded. According to this, in the situation where the load is unloaded by the lifting tool, the positional relationship between the lifting tool and the placement portion can be acquired more reliably.
  • FIG. 1 is a block diagram showing a crane apparatus according to the first embodiment.
  • FIG. 2 is a front view of the crane apparatus.
  • FIG. 3 is a side view of the crane apparatus.
  • FIG. 4 is a perspective view showing a hanger and a coil.
  • FIG. 5 is a view showing a captured image in the load grip control.
  • FIG. 6 is a diagram showing a distance accuracy map in the load grip control.
  • FIG. 7 is a diagram showing a color accuracy map in the load grip control.
  • FIG. 8 is a diagram showing an OR probability map in the load grip control.
  • FIG. 9 is a view showing a reference image.
  • FIG. 10 is a view showing a highlighted image in the load gripping control.
  • FIG. 11 is a diagram showing an AND probability map in the load grip control.
  • FIG. 12 is a diagram showing a highlighted image in unloading control.
  • FIG. 13 is a block diagram showing a crane apparatus according to a second embodiment.
  • FIG. 14 is a front view of the crane apparatus.
  • FIG. 15 is a perspective view of the crane apparatus.
  • FIG. 16 is a diagram showing a reference image.
  • FIG. 17 is a view showing a highlighted image in the load grip control.
  • FIG. 18 is a view showing a highlighted image in unloading control.
  • FIG. 1 is a block diagram showing a crane apparatus 1 according to the first embodiment.
  • FIG. 2 is a front view of the crane apparatus 1.
  • FIG. 3 is a side view of the crane apparatus 1.
  • FIG. 4 is a perspective view showing the hanger 30 and the coil C.
  • the crane apparatus 1 according to the first embodiment is an overhead crane that transports a coil (load) C that is a steel plate wound in a cylindrical shape.
  • the crane device 1 includes a moving mechanism 10, a lifting tool 30, an imaging unit 40, a distance information acquisition unit 41, a color information acquisition unit 42, a display unit 43, a control unit 50, and a cab 60.
  • the crane apparatus 1 also includes a transport control unit 51 that controls the moving mechanism 10 and the lifting tool 30, and an encoder 31 connected to the transport control unit 51.
  • the coil C has a substantially cylindrical shape. At the axial center position of the coil C, a hanging hole H penetrating from one end to the other end of the coil C is formed.
  • the coil C is placed, with its axial direction horizontal, on skids (placement units) S disposed in a matrix at predetermined intervals on the floor surface F of a building such as a factory or a warehouse.
  • the coil C has a predetermined shape, a predetermined size, and a predetermined color.
  • the outer peripheral surface of the coil C is mirror-like.
  • the moving mechanism 10 is a mechanism that moves the lifting tool 30 (horizontally, vertically, and rotationally).
  • the moving mechanism 10 has a girder 12 and a trolley 13.
  • the girder 12 supports the load of the trolley 13 and the hanger 30.
  • the girder 12 is horizontally laid across both walls near the ceiling of the building.
  • the girder 12 is movable in the horizontal direction orthogonal to the extending direction of the girder 12, and the trolley 13 and the hanger 30 can be moved in this direction.
  • the trolley 13 traverses the upper surface of the girder 12 along the extending direction of the girder 12. Thereby, the trolley 13 can move the hanger 30 along the extension direction of the girder 12.
  • the trolley 13 has a hoisting mechanism 11 for winding the hanging wire rope 17 up and down.
  • the trolley 13 has a rotation mechanism (not shown) that rotationally moves the hanger 30 about a vertical axis via the wire rope 17.
  • the hanger 30 is a device for gripping, holding, and unloading the coil C.
  • the hanger 30 is hung on a wire rope 17 hanging down from the trolley 13.
  • the lifting tool 30 is moved upward when the hoisting mechanism 11 of the trolley 13 winds up the wire rope 17, and is moved downward when the hoisting mechanism 11 winds the wire rope 17 down.
  • the hanger 30 has a base 18 and a pair of opposed claws 19, 19.
  • the wire rope 17 is connected to the upper surface side of the base portion 18, and the claw portions 19, 19 are provided on the lower surface side so as to be able to open and close.
  • each of the claws 19 and 19 is provided with a projection 20 projecting so as to face each other.
  • the respective convex portions 20 are inserted into the suspension holes H of the coil C from both sides.
  • the hanger 30 holds the coil C.
  • the inner surfaces 21 and 21 of the claws 19 and 19 facing each other may or may not be in contact with the coil C.
  • the hanger 30 transfers the coil C in the following procedure. That is, the hanger 30 opens and closes the claws 19, 19 to grip the coil C placed on a skid S; while holding the coil C, it is moved above the destination skid S by the moving mechanism 10 and the hoisting mechanism 11; and it opens and closes the claws 19, 19 again to unload the coil C.
  • the encoder 31 is a sensor that detects the hoisting/lowering operation amount of the hoisting mechanism 11 and the traverse movement amount of the trolley 13. Based on the detection result of the encoder 31, the conveyance control unit 51 acquires height information indicating the current height of the lifting tool 30. The encoder 31 may detect only the hoisting/lowering operation amount of the hoisting mechanism 11, without detecting the traverse movement amount of the trolley 13.
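The conversion from encoder output to height information might be sketched as follows. The drum geometry, parameter names, and pulse counts are illustrative assumptions, not values from the patent:

```python
def hanger_height(pulse_count: int, pulses_per_rev: int,
                  drum_circumference_m: float, top_height_m: float) -> float:
    """Current height of the hanger above the floor, derived from a
    hoist-drum encoder count (all parameter names are assumptions)."""
    paid_out_rope_m = drum_circumference_m * pulse_count / pulses_per_rev
    return top_height_m - paid_out_rope_m

# 2000 pulses at 1000 pulses/rev on a 1.5 m drum pays out 3.0 m of rope,
# leaving the hanger 7.0 m above the floor when fully raised it sits at 10.0 m.
h = hanger_height(2000, 1000, 1.5, 10.0)
```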
  • the transfer control unit 51 controls the driving of the crane device 1.
  • the transport control unit 51 controls the travel of the girder 12, the traversing of the trolley 13, the hoisting and lowering of the wire rope 17 by the hoisting mechanism 11 (vertical movement of the lifting tool 30), the rotational movement of the lifting tool 30 by the rotation mechanism, and the opening and closing of the claws 19, 19.
  • the conveyance control unit 51 outputs, to the control unit 50, the height information of the hanger 30 obtained based on the detection result of the encoder 31.
  • the imaging unit 40 is an imaging device that is provided on the lifting tool 30 and captures an image of the lower side of the lifting tool 30 to obtain a captured image.
  • the imaging unit 40 may be, for example, a camera.
  • the imaging unit 40 is provided on the base 18 of the hanger 30 so as to face downward.
  • a plurality of imaging units 40 may be provided on the hanger 30.
  • the distance information acquisition unit 41 is a device provided on the lifting tool 30 to obtain distance information to a plurality of measurement points within the range of the captured image captured by the imaging unit 40 below the lifting tool 30.
  • the distance information acquisition unit 41 acquires distance information to each measurement point in association with the position in the captured image.
  • the distance information acquisition unit 41 is not particularly limited as long as it is a device capable of acquiring distance information, and may be, for example, a ToF (Time of Flight) camera, a 3D camera, or a 3D scanner.
  • the distance information acquisition unit 41 is provided in the vicinity of the imaging unit 40 in the base 18 of the hanging tool 30 so as to face downward.
  • the distance information acquisition units 41 are provided in a number corresponding to the number of imaging units 40.
  • the “distance information” is information indicating the distance between the distance information acquisition unit 41 and each measurement point.
  • the measurement points are, for example, points set on the floor F, the skid S, the upper surface of the coil C, etc., and may be set in a matrix, for example.
  • the measurement point may be set to a predetermined relative position based on, for example, the position coordinate of the distance information acquisition unit 41 in the horizontal plane.
  • the color information acquisition unit 42 is a device provided on the hanging tool 30 to obtain color information within the range of the captured image captured by the imaging unit 40 below the lifting tool 30.
  • the color information acquisition unit 42 acquires color information in association with the position in the captured image.
  • the color information acquisition unit 42 is not particularly limited as long as it is an apparatus capable of acquiring color information, and may be, for example, a spectrum camera or a color camera.
  • the color information acquisition unit 42 is provided in the vicinity of the imaging unit 40 in the base 18 of the hanger 30 so as to face downward.
  • the color information acquisition units 42 are provided in the number corresponding to the number of imaging units 40.
  • the “color information” is, for example, information obtained by decomposing a color image into RGB signals and expressing them in vector representation.
  • the display unit 43 is a display device that displays a captured image. Further, the display unit 43 displays the emphasized image generated by the control unit 50.
  • the display unit 43 may be, for example, a display.
  • the display unit 43 is provided, for example, in the driver's cab 60.
  • the control unit 50 processes the captured image to generate an emphasized image, and causes the display unit 43 to display the generated emphasized image (the details will be described later).
  • the control unit 50 can generate an emphasized image both in the load grip control, in which the lifting tool 30 grips the coil C placed on the skid S, and in the unloading control, in which the coil C held by the lifting tool 30 is unloaded onto the skid S.
  • FIG. 5 is a view showing a captured image P1 in the load grip control.
  • the control unit 50 causes the display unit 43 to display the captured image P1 captured by the imaging unit 40.
  • in the captured image P1, the coil C placed below the hanger 30, one claw portion 19 of the hanger 30, and a skid S on which no coil C is placed are displayed.
  • the hanger 30 is at a position offset from immediately above the mounted coil C.
  • the coil C is placed on a skid S (not shown) below the coil C. Since the outer peripheral surface of the coil C is mirror-like, the coil C in the captured image P1 has a reflection portion R in which the surrounding scene is reflected.
  • control unit 50 extracts a target area including the area in which the target is located in the captured image P1 from the captured image P1 as follows.
  • the “target object” is the coil C gripped by the lifting tool 30.
  • FIG. 6 is a diagram showing the distance accuracy map P2 in the load grip control.
  • the distance information acquisition unit 41 acquires distance information to a plurality of measurement points (not shown) within the range of the captured image P1 below the lifting tool 30, in association with positions in the captured image P1.
  • the measurement points may be set at positions arranged in a matrix at equal intervals, with a predetermined number (for example, 20) in the vertical direction of the captured image P1 and a predetermined number (for example, 30) in the horizontal direction.
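Such an equally spaced matrix of measurement points could be generated as below. The 20x30 counts are the example values from the text; the captured-image size is an illustrative assumption:

```python
import numpy as np

rows, cols = 20, 30          # example counts from the text
height, width = 480, 640     # captured-image size: an illustrative assumption
ys = np.linspace(0, height - 1, rows)
xs = np.linspace(0, width - 1, cols)
# (row, col) pixel coordinates of every measurement point, as a 20x30 grid
points = np.stack(np.meshgrid(ys, xs, indexing="ij"), axis=-1)
```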
  • the control unit 50 estimates, based on the distance information acquired by the distance information acquisition unit 41, the first area A1 in which the coil C, which is the object in the captured image P1, is located. More specifically, on the premise that the coil C has a predetermined shape and a predetermined size, the control unit 50 predicts the current distance between the lifting tool 30 and the coil C based on the current height information of the lifting tool 30. Then, the control unit 50 estimates the first area A1 in which the coil C is located in the captured image P1 by comparing the predicted distance with the distances given by the distance information acquired by the distance information acquisition unit 41.
  • for example, if the difference between the predicted distance and the measured distance for an area is within a predetermined range, the control unit 50 may determine that the coil C is located in that area, and if the difference is out of the predetermined range, it may determine that the coil C is not located in that area.
  • in the distance accuracy map P2, the control unit 50 correctly determines that the coil C is located in the vicinity of the axial center of the coil C in plan view (white area in FIG. 6).
  • on the other hand, away from the axial center, the measurement light of the distance information acquisition unit 41 is incident on the mirror-like outer peripheral surface of the coil C at an angle close to parallel to the surface. In such areas, the distance information acquisition unit 41 therefore cannot detect the reflected light, and as a result the control unit 50 erroneously determines that the coil C is not located there.
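The distance-based estimate of the first area A1 can be sketched as follows: the predicted sensor-to-coil distance follows from the hanger's height and the coil's known size, and each measurement point is tested against it. The flat-top approximation, the tolerance value, and all names here are illustrative assumptions:

```python
import numpy as np

def estimate_first_region(distance_map: np.ndarray, hanger_height_m: float,
                          coil_diameter_m: float, tol_m: float = 0.1) -> np.ndarray:
    """First area A1: measurement points whose measured distance matches the
    distance predicted for the top of a coil of known size on the floor."""
    predicted_m = hanger_height_m - coil_diameter_m
    return np.abs(distance_map - predicted_m) <= tol_m

# Floor readings (5.0 m) are rejected; readings near the coil top (~3.0 m) pass.
dmap = np.array([[3.00, 3.02, 5.0],
                 [2.95, 3.00, 5.0]])
mask = estimate_first_region(dmap, hanger_height_m=5.0, coil_diameter_m=2.0)
```

A specular patch that returns no echo would simply yield an out-of-range reading and drop out of the mask, which is exactly the failure mode the color estimate compensates for.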
  • FIG. 7 is a diagram showing the color accuracy map P3 in the load grip control.
  • the color information acquisition unit 42 acquires color information within the range of the captured image P1 below the lifting tool 30, in association with positions in the captured image P1.
  • the control unit 50 estimates, based on the color information acquired by the color information acquisition unit 42, the second area A2 in which the coil C, which is the object in the captured image P1, is located. More specifically, on the premise that the coil C has a predetermined color, the control unit 50 estimates the second area A2 in which the coil C is located, based on the degree of color correlation with the predetermined color in the captured image P1.
  • a known method can be used as a method of determining "the degree of color correlation".
  • for example, the control unit 50 may determine the degree of color correlation by converting the color information into coordinates on a chromaticity diagram and evaluating the distance between those coordinates and the coordinates, stored in advance, of the color of the coil C on the chromaticity diagram.
  • the control unit 50 stores the color of the coil C in advance.
  • the control unit 50 may determine that the coil C is located in an area if the degree of color correlation is equal to or higher than a predetermined value, and that the coil C is not located in the area if it is less than the predetermined value.
  • in the color accuracy map P3, the control unit 50 correctly determines that the coil C is located in the portion of the coil C excluding the reflection portion R in plan view (white area in FIG. 7). On the other hand, since the reflection portion R has a color largely different from the color of the coil, the control unit 50 erroneously determines that the coil C is not located there.
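The chromaticity-diagram comparison can be sketched as below. The (r, g) projection, the threshold, and the example colors are illustrative assumptions standing in for the unspecified known method:

```python
import numpy as np

def chromaticity(rgb) -> np.ndarray:
    """Map RGB to (r, g) chromaticity coordinates, discarding brightness."""
    rgb = np.asarray(rgb, dtype=float)
    s = rgb.sum(axis=-1, keepdims=True)
    return rgb[..., :2] / np.where(s == 0.0, 1.0, s)

def estimate_second_region(image_rgb, coil_rgb, max_dist: float = 0.05):
    """Second area A2: pixels whose chromaticity lies close to the stored
    color of the coil C (threshold is an assumption)."""
    d = np.linalg.norm(chromaticity(image_rgb) - chromaticity(coil_rgb), axis=-1)
    return d <= max_dist

# A grey coil pixel matches the stored grey even though it is darker;
# a reddish reflection does not.
img = np.array([[[100, 100, 100], [200, 50, 50]]])
mask2 = estimate_second_region(img, coil_rgb=[120, 120, 120])
```

Because chromaticity divides out brightness, the comparison tolerates shading differences across the coil surface, which a raw RGB distance would not.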
  • FIG. 8 is a diagram showing an OR probability map P4 in the load grip control.
  • the OR probability map P4 shown in FIG. 8 shows the OR region A3 (white part in FIG. 8), which is included in at least one of the first region A1 of the distance accuracy map P2 and the second region A2 of the color accuracy map P3.
  • the control unit 50 sets an area including the OR area A3 in the OR probability map P4 (for example, an area obtained by enlarging the outer edge of the OR area A3 by a predetermined width) as the target area A4.
  • the control unit 50 extracts, from the captured image P1, a target region A4 which is a region in which the coil C is located in the captured image P1.
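The two steps above, taking the OR of the estimates and enlarging its outer edge by a predetermined width, can be sketched with a simple binary dilation. The square structuring element and the width value are illustrative assumptions:

```python
import numpy as np

def extract_target_area(first_region: np.ndarray, second_region: np.ndarray,
                        width: int = 1) -> np.ndarray:
    """Target area A4: the OR region A3 (pixels flagged by either estimate)
    grown on all sides by `width` pixels."""
    or_region = first_region | second_region
    padded = np.pad(or_region, width)
    h, w = or_region.shape
    out = np.zeros_like(or_region)
    for dy in range(2 * width + 1):          # OR together every shifted copy
        for dx in range(2 * width + 1):
            out |= padded[dy:dy + h, dx:dx + w]
    return out

a1 = np.zeros((5, 5), dtype=bool); a1[2, 2] = True   # distance estimate
a2 = np.zeros((5, 5), dtype=bool)                     # color estimate (empty)
a4 = extract_target_area(a1, a2, width=1)             # 3x3 block around (2, 2)
```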
  • control unit 50 detects a corresponding portion corresponding to the shape of the reference image stored in advance in the target area A4 of the captured image P1 as follows.
  • FIG. 9 is a view showing a reference image P5.
  • FIG. 10 is a view showing a highlight image P6 in the load grip control.
  • the control unit 50 stores in advance, as a reference image P5, the shape of the coil C as it appears in the captured image P1 (specifically, the shape of the coil C in plan view). Then, the control unit 50 detects a corresponding portion A5 corresponding to the shape of the reference image P5 within the region corresponding to the target area A4 in the captured image P1, using a known image recognition method.
  • here, a portion "corresponding to the shape of the reference image P5" may have a size different from that of the reference image P5, or may have the shape of the reference image P5 rotated about a vertical axis.
  • as the image recognition method, for example, pattern matching or machine learning may be used. Note that the control unit 50 does not detect the corresponding portion A5 outside the target area A4 in the captured image P1.
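A sketch of detection restricted to the target area follows. Plain sum-of-squared-differences matching stands in for the unspecified pattern-matching method, and all names and thresholds are illustrative assumptions; skipping every window outside A4 is what shortens the processing:

```python
import numpy as np

def detect_in_target_area(image, template, target_area, max_ssd: float = 1.0):
    """Return the top-left (y, x) of the best template match whose window
    lies entirely inside the target area, or None if nothing matches."""
    th, tw = template.shape
    best_ssd, best_pos = None, None
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            if not target_area[y:y + th, x:x + tw].all():
                continue                      # window leaves A4: skip it
            ssd = float(((image[y:y + th, x:x + tw] - template) ** 2).sum())
            if best_ssd is None or ssd < best_ssd:
                best_ssd, best_pos = ssd, (y, x)
    return best_pos if best_ssd is not None and best_ssd <= max_ssd else None

image = np.zeros((6, 6)); image[2:4, 3:5] = 1.0   # the real target
image[0:2, 0:2] = 1.0                              # identical decoy outside A4
template = np.ones((2, 2))
area = np.zeros((6, 6), dtype=bool); area[1:5, 2:6] = True
pos = detect_in_target_area(image, template, area)
```

The decoy patch is a perfect match too, but it is never examined because its window falls outside the target area, mirroring the behavior described in the text.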
  • the control unit 50 generates an emphasized image P6 in which the corresponding part A5 is emphasized in the captured image P1, and causes the display unit 43 to display the generated emphasized image P6.
  • the emphasized image P6 displays the target area A4 in color and the areas other than the target area A4 in gray.
  • the corresponding part A5 is surrounded by a broken line.
  • as long as it is emphasized so as to be easily recognized by the operator, the corresponding portion A5 is not limited to being surrounded by a broken line; it may, for example, be displayed in a predetermined color, or it may be cut out.
  • based on the height information acquired from the detection result of the encoder 31, the control unit 50 calculates a reference portion A6 at which the corresponding portion A5 should be positioned in the captured image P1 when the hanger 30 grips the coil C.
  • the reference portion A6 is, for example, the shape of the coil C in a plan view in the case where the coil C is placed on the skid S directly under the lifting tool 30 (downward in the vertical direction).
  • the control unit 50 detects positional deviation between the corresponding portion A5 and the reference portion A6 in the captured image P1 (or the enhanced image P6).
  • the control unit 50 may detect the relative distance in the height direction between the lifting tool 30 and the coil C based on the size ratio between the corresponding portion A5 and the reference portion A6, and may detect the shift in the rotational direction between the hanger 30 and the coil C based on the angular shift between the corresponding portion A5 and the reference portion A6.
  • the control unit 50 outputs information on the detected positional deviation to the conveyance control unit 51.
  • the transport control unit 51 causes the moving mechanism 10 to move the hanger 30 so as to eliminate the positional deviation, based on the information on the positional deviation input from the control unit 50. That is, the moving mechanism 10 positions the hanger 30 directly above the coil C by moving it, through the travel of the girder 12 and the traversing of the trolley 13, such that the positional deviation is eliminated. At this time, the hanger 30 may be rotationally moved about the vertical axis by driving the rotation mechanism as needed.
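Converting the detected deviation into motion commands might look like the sketch below: girder travel and trolley traverse cancel the planar offset, and the rotation mechanism cancels the angular shift. The sign conventions, the pixel-to-metre scale, and all names are illustrative assumptions:

```python
def deviation_to_commands(corr_centroid, ref_centroid,
                          corr_angle_deg, ref_angle_deg, m_per_px):
    """Motion commands that move the hanger so the corresponding portion
    comes to rest on the reference portion (axis conventions assumed)."""
    girder_m = (ref_centroid[0] - corr_centroid[0]) * m_per_px   # girder travel
    trolley_m = (ref_centroid[1] - corr_centroid[1]) * m_per_px  # trolley traverse
    rotate_deg = ref_angle_deg - corr_angle_deg                  # rotation command
    return {"girder_m": girder_m, "trolley_m": trolley_m,
            "rotate_deg": rotate_deg}

# Corresponding portion at pixel (100, 200) rotated 5 degrees; reference at (120, 260).
cmd = deviation_to_commands((100, 200), (120, 260), 5.0, 0.0, m_per_px=0.001)
```

A real controller would iterate this: move, re-capture, re-detect, until the deviation falls below a tolerance.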
  • the display of the emphasized image P6 in the load grip control by the control unit 50 has been described above using the method based on the OR probability map P4.
  • for the display of the emphasized image P6 in the load grip control, the control unit 50 can also execute a method using the AND probability map P7 instead of the OR probability map P4.
  • FIG. 11 is a view showing an AND probability map P7 in the load grip control.
  • the AND probability map P7 shown in FIG. 11 shows the AND region A7 (white part in FIG. 11), which is included in both the first region A1 of the distance accuracy map P2 and the second region A2 of the color accuracy map P3.
  • the control unit 50 may set an area including the AND area A7 in the AND probability map P7 (for example, an area obtained by expanding the outer edge of the AND area A7 by a predetermined width) as the target area.
  • the control unit 50 may then extract from the captured image P1 the target area, which is the area in which the coil C is located in the captured image P1. Note that whether to use the OR probability map P4 or the AND probability map P7 is appropriately selected according to, for example, the object.
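The choice between the two probability maps amounts to a one-operator change, which the sketch below illustrates (function and variable names are assumptions):

```python
import numpy as np

def combine_regions(first_region, second_region, use_and: bool = False):
    """OR keeps pixels flagged by either estimate (robust when one modality
    fails, e.g. specular reflection blinding the distance sensor); AND keeps
    only pixels both agree on (suppresses false positives such as
    disturbance-light artefacts)."""
    if use_and:
        return first_region & second_region
    return first_region | second_region

r1 = np.array([[True, True, False]])   # distance-based estimate
r2 = np.array([[False, True, True]])   # color-based estimate
or_region = combine_regions(r1, r2)           # three pixels survive
and_region = combine_regions(r1, r2, True)    # only the agreed pixel survives
```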
  • the control unit 50 can display a highlighted image also in the unloading control by processing the captured image in the same manner as the load grip control. Hereinafter, the display of the emphasized image in the unloading control will be described.
  • FIG. 12 is a view showing an emphasized image P8 in the unloading control.
  • the highlighted image P8 in FIG. 12 includes the coil C, the claw portion 19 of the hanger 30 holding the coil C, and a part of the skid S to which the coil C is unloaded.
  • the skid S has a stowage mark M having a predetermined shape, a predetermined size, and a predetermined color.
  • the lifting tool 30 is at a position offset from immediately above the skid S to which the coil C is unloaded, and is tilted about the vertical axis with respect to the skid S.
  • the “object” is the stowage mark M of the skid S onto which the coil C held by the hanger 30 is to be unloaded.
  • the shape and size of the stowage mark M referred to here are those of a virtual columnar portion of the skid S bearing the stowage mark M, extending vertically from the entire stowage mark M down to the floor surface F.
  • the emphasized image P8 displays in color the target area A8, which includes the OR area in the OR probability map obtained from the captured image before the coil C is unloaded onto the skid S, and grays out the areas other than the target area A8.
  • the target area A8 may be set based on the AND area in the AND probability map instead of the OR area in the OR probability map.
  • the area of the stowage mark M, which is the corresponding portion A9 detected in the target area A8, is surrounded by a broken line.
  • the reference portion A10, at which the corresponding portion A9 should be positioned when the lifting tool 30 unloads the coil C, is indicated by a dashed-dotted line.
  • the reference portion A10 is the shape of the stowage mark M in plan view when the stowage mark M is located directly below the hanger 30 (vertically downward).
  • the control unit 50 detects the positional deviation between the corresponding portion A9 and the reference portion A10. Further, the control unit 50 may detect the relative distance in the height direction between the lifting tool 30 and the skid S based on the size ratio between the corresponding portion A9 and the reference portion A10, and may detect the shift in the rotational direction between the hanger 30 and the skid S based on the angular shift between the corresponding portion A9 and the reference portion A10.
  • the control unit 50 outputs information on the detected positional deviation to the conveyance control unit 51.
  • the transport control unit 51 causes the moving mechanism 10 to move the hanger 30 so as to eliminate the positional deviation, based on the information on the positional deviation input from the control unit 50. That is, the moving mechanism 10 moves the lifting tool 30 immediately above the skid S by moving the lifting tool 30 so that the positional deviation is eliminated by the movement of the girder 12, the traversing of the trolley 13, and the drive of the rotation mechanism.
  • As described above, in the crane device 1, the first region A1 in which the object (the coil C or the stowage mark M) is located is estimated based on the distance information in the captured image P1 taken below the lifting tool 30, and the second region A2 in which the object is located is estimated based on the color information. The crane device 1 then extracts, based on these estimation results, the target areas A4 and A8 including the area in which the object is located from the captured image P1, and highlights the corresponding parts A5 and A9, which correspond to the shape of the reference image, within the target areas A4 and A8.
  • The crane device 1 can therefore acquire the positional relationship between the lifting tool 30 and the object by using the distance information and the color information complementarily. Since the crane device 1 is thus less susceptible to disturbance light and to specular reflection at the object, it can acquire the positional relationship between the lifting tool 30 and the object more reliably.
  • The control unit 50 does not search for the corresponding parts A5 and A9 outside the target areas A4 and A8 of the captured image P1. As a result, the corresponding parts A5 and A9 matching the shape of the reference image need only be detected within the target areas A4 and A8 of the captured image P1, so the processing time can be shortened.
  • The control unit 50 estimates, based on the distance information, the first region A1 in which the coil C or the stowage mark M is located in the captured image P1, estimates, based on the color information, the second region A2 in which the coil C or the stowage mark M is located in the captured image P1, and extracts, from the captured image P1, the area including the OR region, which is included in at least one of the first region A1 and the second region A2, as the target area A4 or A8.
  • In this way, an area estimated from the color information, which is less susceptible to specular reflection at the object, is also extracted from the captured image P1 as the target area A4 or A8. A failure to detect the object despite its presence is therefore suppressed, and the positional relationship between the lifting tool 30 and the object can be acquired more reliably.
  • Alternatively, the control unit 50 estimates, based on the distance information, the first region A1 in which the coil C or the stowage mark M is located in the captured image P1, estimates, based on the color information, the second region A2 in which the coil C or the stowage mark M is located in the captured image P1, and extracts, from the captured image P1, the area including the AND region, which is included in both the first region A1 and the second region A2, as the target area A4 or A8.
  • In this way, only an area in which the object is estimated to be located not only by the distance information, which is susceptible to disturbance light, but also by the color information, which is less susceptible to disturbance light, is extracted from the captured image P1 as the target area A4 or A8. Erroneous detection of the object in an area where it does not exist is therefore suppressed, and the positional relationship between the lifting tool 30 and the object can be acquired more reliably.
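The OR-region and AND-region extraction described above amounts to combining two per-pixel boolean masks over the captured image. A minimal sketch, assuming the two estimates are already available as boolean arrays (the names dist_mask and color_mask are illustrative, not from the patent):

```python
import numpy as np


def target_region(dist_mask, color_mask, mode="or"):
    """Combine the region estimated from distance information with the
    region estimated from color information.

    'or' keeps pixels flagged by either cue, so specular reflection
    defeating the distance cue does not hide the object; 'and' keeps only
    pixels flagged by both cues, so disturbance light fooling the distance
    cue does not produce a false detection.
    """
    if mode == "or":
        return dist_mask | color_mask
    return dist_mask & color_mask
```

The subsequent shape search then runs only where the returned mask is True.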
  • The crane device 1 includes the moving mechanism 10 for moving the lifting tool 30. The control unit 50 detects the positional deviation between the corresponding parts A5 and A9 in the captured image P1 and the reference parts A6 and A10 where the corresponding parts A5 and A9 should be positioned when the lifting tool 30 grips or unloads the coil C as the load, and the moving mechanism 10 moves the lifting tool 30 so that the positional deviation is eliminated. The lifting tool 30 can thereby be accurately positioned immediately above the object.
  • the object is a coil C gripped by the lifting tool 30.
  • The object is the stowage mark M of the skid S onto which the coil C held by the lifting tool 30 is unloaded.
  • Second Embodiment [Configuration of crane device]
  • A crane device according to a second embodiment will now be described.
  • The crane device according to the second embodiment differs from the crane device 1 according to the first embodiment in the type of crane and in the load to be transferred.
  • The following description focuses mainly on the points that differ from the crane device 1 according to the first embodiment.
  • FIG. 13 is a block diagram showing a crane apparatus 1A according to a second embodiment.
  • FIG. 14 is a front view of the crane apparatus 1A.
  • FIG. 15 is a perspective view of the crane apparatus 1A.
  • The crane device 1A according to the second embodiment is a container handling crane that is disposed, for example, in a container yard Y of a container terminal where containers (loads) D are transferred to and from a docked container ship, and that performs loading and unloading of the containers D.
  • the crane apparatus 1A includes a moving mechanism 10A, a hanger 30A, an imaging unit 40, a distance information acquisition unit 41, a color information acquisition unit 42, a display unit 43, a control unit 50, and a cab 60A.
  • the hanger 30A is also referred to as a spreader.
  • the crane apparatus 1A includes a transport control unit 51A that controls the moving mechanism 10A and the hanging tool 30A, and an encoder 31 and a shake sensor 32 connected to the transport control unit 51A.
  • the container D is a container such as an ISO standard container.
  • the container D is in the form of a long rectangular parallelepiped and has a predetermined length of, for example, 20 feet or 40 feet in the longitudinal direction.
  • The container D is provided with locked portions G, each having a hole Gh, at the four corners of its upper surface (see FIG. 16).
  • the container D is stacked in one or more stages in a container yard Y to form a plurality of rows E.
  • The rows E are aligned vertically and horizontally such that the longitudinal direction of the containers D constituting one row E is parallel to the longitudinal direction of the containers D constituting the other rows E.
  • the container D has a predetermined shape and a predetermined size.
  • the locked portion G has a predetermined shape, a predetermined size, and a predetermined color.
  • the container D which is about to be gripped by the hanger 30A is referred to as a target container D1 (see FIG. 14).
  • The container D held by the lifting tool 30A is referred to as a holding container D2.
  • the container D on which the holding container D2 is about to be unloaded is referred to as a target container (placement unit) D3 (see FIG. 15).
  • The designations target container D1, holding container D2, and target container D3 change according to the transfer status of the container D.
  • the moving mechanism 10A is a mechanism that moves the lifting tool 30A (horizontally, vertically, and rotationally).
  • the moving mechanism 10A includes a traveling device 14, two pairs of legs 15, 15, a girder 12A, and a trolley 13A.
  • The traveling device 14 includes tire wheels provided at the lower ends of the two pairs of legs 15, 15; by driving the tire wheels with a traveling motor, the two pairs of legs 15, 15 can be made to travel back and forth.
  • The girder 12A is laid substantially horizontally over the upper end portions of the two pairs of legs 15, 15. The girder 12A can therefore be moved in the horizontal direction orthogonal to its extending direction by causing the traveling device 14 to run the two pairs of legs 15, 15 back and forth, and the trolley 13A and the lifting tool 30A can be moved in this direction together with it.
  • the trolley 13A traverses the upper surface of the girder 12A along the extending direction of the girder 12A. Thereby, the trolley 13A can move the hanger 30A along the extension direction of the girder 12A.
  • the trolley 13A has a hoisting mechanism 11A that winds up and down the hanging wire rope 17A.
  • the trolley 13A has a rotation mechanism (not shown) that rotationally moves the hanger 30A around an axis in the vertical direction via the wire rope 17A.
  • In the container yard Y, a traveling path W for a transport carriage V such as a trailer or an AGV (Automated Guided Vehicle) is laid.
  • The crane device 1A grips the container D carried in by the transport carriage V and unloads the container D onto the container yard Y or onto another container D (target container D3) placed on the container yard Y.
  • The crane device 1A also grips a container D (target container D1) placed on the container yard Y or on another container D placed on the container yard Y, moves the container D, and unloads it onto the transport carriage V, which then carries the container D out of the yard.
  • the hanger 30A is a device for gripping, holding, and unloading the container D.
  • the hanger 30A holds the container D from the upper surface side.
  • the hanger 30A is suspended on a wire rope 17A suspended from the trolley 13A.
  • The lifting tool 30A moves upward when the winding mechanism 11A of the trolley 13A winds up the wire rope 17A, and moves downward when the winding mechanism 11A pays out the wire rope 17A.
  • the hanger 30A has a main body 18A and four lock pins (not shown).
  • the main body portion 18A has a shape and a size corresponding to the shape and the size of the container D in a plan view. That is, the main body portion 18A has a long rectangular shape in plan view.
  • the body portion 18A includes a sheave 22 around which the wire rope 17A is wound, on the upper surface side of the central portion in the longitudinal direction.
  • the lock pin is a mechanism for holding the container D.
  • the lock pins are provided at the four corners of the lower surface of the main body portion 18A so as to project downward from the main body portion 18A.
  • the lock pin is provided at a position corresponding to the hole Gh of the engaged portion G of the container D when the hanger 30A holds the container D.
  • the lock pin is, for example, a twist lock pin, and includes at its lower end a locking piece rotatable about a vertical axis.
  • the lock pins are engaged with the container D by entering holes Gh of the engaged portions G provided at the four corners of the upper surface of the container D and rotating the locking pieces by 90 degrees.
  • The encoder 31 is a sensor that detects the hoisting operation amount of the winding mechanism 11A and the traverse movement amount of the trolley 13A. Based on the detection result of the encoder 31, the transport control unit 51A acquires height information indicating the current height of the lifting tool 30A. The encoder 31 may detect only the hoisting operation amount of the winding mechanism 11A without detecting the movement amount of the trolley 13A.
  • The shake sensor 32 is a sensor that detects the amount of sway of the lifting tool 30A caused by swinging of the wire rope 17A. Based on the detection result of the shake sensor 32, the transport control unit 51A acquires sway amount information indicating the current sway amount of the lifting tool 30A.
  • the transfer control unit 51A controls the drive of the crane device 1A.
  • The transport control unit 51A controls the movement of the girder 12A, the traversing of the trolley 13A, the winding up and paying out of the wire rope 17A by the winding mechanism 11A (the vertical movement of the lifting tool 30A), the rotational movement of the lifting tool 30A by the rotation mechanism, and the rotation of the locking pieces of the lock pins.
  • The transport control unit 51A outputs, to the control unit 50, the height information of the lifting tool 30A acquired based on the detection result of the encoder 31 and the sway amount information of the lifting tool 30A acquired based on the detection result of the shake sensor 32.
  • the imaging unit 40 is an imaging device that is provided on the hanging tool 30A and captures an image of the lower side of the hanging tool 30A to obtain a captured image.
  • the configuration and function of the imaging unit 40 are the same as those described above in the first embodiment.
  • the distance information acquisition unit 41 is a device provided in the hanging tool 30A to obtain distance information to a plurality of measurement points within the range of the captured image captured by the imaging unit 40 below the lifting tool 30A.
  • the measurement points are, for example, points set on the upper surface of the container yard Y, the container D mounted on the container yard Y, and the like, and may be set, for example, in a matrix.
  • the measurement point may be set to a predetermined relative position based on, for example, the position coordinate of the distance information acquisition unit 41 in the horizontal plane.
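The matrix layout of measurement points at predetermined relative positions from the distance sensor can be sketched as below. The function name and the flat row-major ordering are illustrative assumptions, not from the patent.

```python
def measurement_grid(origin_x, origin_y, pitch, rows, cols):
    """Lay out measurement points in a matrix in the horizontal plane,
    each at a predetermined relative offset from the distance information
    acquisition unit's position (origin_x, origin_y)."""
    return [(origin_x + c * pitch, origin_y + r * pitch)
            for r in range(rows) for c in range(cols)]
```

Each returned point would then be paired with the measured distance to build the distance map over the captured-image range.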
  • the configuration and function of the distance information acquisition unit 41 are the same as those described above in the first embodiment.
  • the color information acquisition unit 42 is a device that is provided to the hanging tool 30A and acquires color information within the range of the captured image captured by the imaging unit 40 below the lifting tool 30A.
  • the configuration and function of the color information acquisition unit 42 are the same as those described above in the first embodiment.
  • the display unit 43 is a display device that displays a captured image. Further, the display unit 43 displays the emphasized image generated by the control unit 50.
  • the display unit 43 may be, for example, a display.
  • the display unit 43 is provided, for example, in the driver's cab 60A.
  • The control unit 50 processes the captured image to generate an emphasized image, and causes the display unit 43 to display the generated emphasized image.
  • The control unit 50 can generate an emphasized image in load grip control, in which the target container D1 is gripped by the lifting tool 30A, and an emphasized image in unloading control, in which the holding container D2 is unloaded onto the target container D3.
  • In the second embodiment, the control unit 50 displays the emphasized image in the load grip control by processing the captured image in the same manner as in the load grip control of the first embodiment.
  • FIG. 16 is a diagram showing a reference image P9.
  • FIG. 17 is a diagram showing a highlight image P10 in the load grip control.
  • the control unit 50 stores in advance the shape of the locked portion G of the container D as a reference image P9.
  • The highlighted image P10 in FIG. 17 includes a part of the main body 18A of the lifting tool 30A and the vicinity of the locked portion G of the target container D1 that is about to be gripped by the lifting tool 30A.
  • the hanger 30A is at a position offset from immediately above the target container D1.
  • the “target object” is the locked portion G of the target container D1.
  • The control unit 50 estimates, based on the distance information acquired by the distance information acquisition unit 41, the first region in which the locked portion G of the target container D1, which is the object, is located. The control unit 50 also estimates, based on the color information acquired by the color information acquisition unit 42, the second region in which the locked portion G of the target container D1 is located. The control unit 50 then sets, as the target area A11, the area including the OR region included in at least one of the first region and the second region. In this manner, the control unit 50 extracts the target area A11 from the captured image.
  • The control unit 50 detects, in the target area A11, a corresponding portion A12 corresponding to the shape of the reference image P9 stored in advance, using a known image recognition method.
  • the control unit 50 does not detect the corresponding part A12 outside the target area A11 in the captured image.
  • the control unit 50 generates an emphasized image P10 in which the corresponding part A12 detected in the captured image is emphasized, and causes the display unit 43 to display the generated emphasized image P10.
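Restricting the shape search to the target area, which is what shortens the processing time, can be sketched as below. This toy matcher uses a sum-of-squared-differences score in place of the "known image recognition method" the patent leaves unspecified; all names are illustrative.

```python
import numpy as np


def match_in_region(image, template, region_mask):
    """Search for the reference-image shape only at top-left positions
    where region_mask is True (the target area); positions outside the
    target area are skipped entirely.  Returns the best-scoring (row, col)
    top-left position, or None if the mask admits no candidate."""
    th, tw = template.shape
    best, best_pos = None, None
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            if not region_mask[y, x]:
                continue  # do not detect outside the target area
            score = np.sum((image[y:y + th, x:x + tw] - template) ** 2)
            if best is None or score < best:
                best, best_pos = score, (y, x)
    return best_pos
```

A production system would use a normalized correlation or feature-based matcher, but the masking logic, skipping every candidate outside the target area, is the point being illustrated.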
  • The highlighted image P10 displays, over the captured image taken before the target container D1 is gripped, the target area A11 in color and the areas other than the target area A11 in gray.
  • The target area A11 may instead be set based on the AND region of the AND likelihood map rather than the OR region of the OR likelihood map.
  • The area of the locked portion G, which is the corresponding part A12 detected in the target area A11, is surrounded by a broken line.
  • The reference portion A13, where the corresponding portion A12 should be positioned when the lifting tool 30A grips the target container D1, is indicated by a dashed-dotted line.
  • The reference portion A13 is the shape of the locked portion G in plan view when the locked portion G is located immediately below the lifting tool 30A (vertically downward).
  • the control unit 50 detects positional deviation between the corresponding part A12 and the reference part A13.
  • The control unit 50 may detect the relative distance in the height direction between the lifting tool 30A and the target container D1 from the size ratio of the corresponding part A12 to the reference part A13, and may detect the rotational deviation between the lifting tool 30A and the target container D1 from the angular deviation between the corresponding part A12 and the reference part A13.
  • The control unit 50 outputs information on the detected positional deviation to the transport control unit 51A.
  • the transport control unit 51A causes the moving mechanism 10A to move the hanger 30A based on the information on the positional deviation input from the control unit 50 so that the positional deviation is eliminated.
  • the moving mechanism 10A moves the hanger 30A to the position immediately above the target container D1 by moving the hanger 30A such that the positional deviation is eliminated by the movement of the girder 12A and the traversing of the trolley 13A.
  • The lifting tool 30A may also be rotationally moved about a vertical axis by driving the rotation mechanism as needed.
  • the control unit 50 can display a highlighted image also in the unloading control by processing the captured image in the same manner as the load grip control. Hereinafter, the display of the emphasized image in the unloading control will be described.
  • FIG. 18 is a view showing a highlighted image P11 in unloading control.
  • The highlighted image P11 in FIG. 18 includes a part of the main body 18A of the lifting tool 30A and the vicinity of the locked portion G of the target container D3.
  • the hanger 30A is at a position deviated from immediately above the target container D3.
  • the “object” is the locked portion G of the target container D3.
  • The highlighted image P11 displays, over the captured image taken before the holding container D2 is unloaded onto the target container D3, the target area A11 (the area including the OR region of the OR likelihood map) in color, and displays the areas other than the target area A11 in gray.
  • The target area A11 may instead be set based on the AND region of the AND likelihood map rather than the OR region of the OR likelihood map.
  • the area of the locked portion G which is the corresponding part A12 detected in the target area A11 is surrounded by a broken line.
  • The reference portion A13, where the corresponding portion A12 should be positioned when the lifting tool 30A unloads the holding container D2, is indicated by a dashed-dotted line.
  • The reference portion A13 is the shape of the locked portion G in plan view when the locked portion G is located immediately below the lifting tool 30A (vertically downward).
  • The control unit 50 detects the positional deviation between the corresponding part A12 and the reference part A13. The control unit 50 may further detect the relative distance in the height direction between the lifting tool 30A and the target container D3 from the size ratio of the corresponding part A12 to the reference part A13, and may detect the rotational deviation between the lifting tool 30A and the target container D3 from the angular deviation between the corresponding part A12 and the reference part A13.
  • The control unit 50 outputs information on the detected positional deviation to the transport control unit 51A.
  • the transport control unit 51A causes the moving mechanism 10A to move the hanger 30A based on the information on the positional deviation input from the control unit 50 so that the positional deviation is eliminated.
  • The moving mechanism 10A moves the lifting tool 30A to the position immediately above the target container D3 by moving the lifting tool 30A such that the positional deviation is eliminated through the movement of the girder 12A and the traversing of the trolley 13A.
  • The lifting tool 30A may also be rotationally moved about a vertical axis by driving the rotation mechanism as needed.
  • Here, the crane device 1A detects in advance the positional deviation between the corresponding portion A12 and the reference portion A13 in a state where the lifting tool 30A is positioned at a predetermined height, before the locked portion G of the target container D3 ceases to be displayed, and calibrates the detected value of the shake sensor 32 based on the detection result, so that the detected value of the shake sensor 32 is associated with the positional deviation.
  • The crane device 1A then lowers the holding container D2 toward the target container D3 with the lifting tool 30A while monitoring the state of the positional deviation between the corresponding portion A12 and the reference portion A13 based on the detection result of the shake sensor 32. The crane device 1A can thereby maintain the state in which the lifting tool 30A is located directly above the target container D3.
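The calibration of the shake sensor against the image-based deviation can be sketched as a least-squares fit collected at the calibration height; once fitted, the sensor reading alone estimates the deviation while the container is lowered. A hypothetical 1-D linear model, since the patent only states that the sensor value is "associated with" the positional deviation:

```python
def calibrate(sensor_samples, deviation_samples):
    """Fit deviation = gain * sensor + offset by ordinary least squares
    from paired samples taken while the locked portion is still visible.
    Returns a function mapping a sway-sensor reading to an estimated
    image-space positional deviation."""
    n = len(sensor_samples)
    mean_s = sum(sensor_samples) / n
    mean_d = sum(deviation_samples) / n
    num = sum((s - mean_s) * (d - mean_d)
              for s, d in zip(sensor_samples, deviation_samples))
    den = sum((s - mean_s) ** 2 for s in sensor_samples)
    gain = num / den
    offset = mean_d - gain * mean_s
    return lambda s: gain * s + offset
```

During the final lowering, when the locked portion G leaves the camera's view, the fitted function substitutes for the image-based measurement.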
  • As described above, in the crane device 1A, the first region in which the locked portion G of the target container D1 or of the target container D3, which is the object, is located is estimated based on the distance information, and the second region in which the locked portion G of the target container D1 or of the target container D3 is located is estimated based on the color information.
  • Then, based on these estimation results, the crane device 1A extracts the target area A11 including the area in which the object is located from the captured image, and highlights the corresponding portion A12 corresponding to the shape of the reference image P9 within the target area A11.
  • The crane device 1A can therefore acquire the positional relationship between the lifting tool 30A and the object by using the distance information and the color information complementarily. Since the crane device 1A is thus less susceptible to disturbance light, to specular reflection at the object, and to erroneous detection of raindrops, for example, it can acquire the positional relationship between the lifting tool 30A and the object more reliably.
  • The control unit 50 does not search for the corresponding part A12 outside the target area A11 of the captured image. As a result, the corresponding part A12 matching the shape of the reference image P9 need only be detected within the target area A11 of the captured image, so the processing time can be shortened.
  • The control unit 50 estimates, based on the distance information, the first region in which the locked portion G of the target container D1 or of the target container D3, which is the object, is located in the captured image, estimates, based on the color information, the second region in which the locked portion G of the target container D1 or of the target container D3 is located in the captured image, and extracts, from the captured image, the area including the OR region included in at least one of the first region and the second region as the target area A11.
  • In this way, the second region, which is estimated from the color information to contain the object, is also extracted from the captured image as part of the target area A11. A failure to detect the object despite its presence is therefore suppressed, and the positional relationship between the lifting tool 30A and the object can be acquired more reliably.
  • Alternatively, the control unit 50 estimates, based on the distance information, the first region in which the locked portion G of the target container D1 or of the target container D3, which is the object, is located in the captured image, estimates, based on the color information, the second region in which the locked portion G of the target container D1 or of the target container D3 is located in the captured image, and extracts, from the captured image, the area including the AND region included in both the first region and the second region as the target area A11.
  • In this way, only an area in which the object is estimated to be located not only by the distance information, which is susceptible to disturbance light, but also by the color information, which is less susceptible to disturbance light, is extracted from the captured image as the target area A11. Erroneous detection of the object in an area where it does not exist is therefore suppressed, and the positional relationship between the lifting tool 30A and the object can be acquired more reliably.
  • The crane device 1A includes the moving mechanism 10A for moving the lifting tool 30A. The control unit 50 detects the positional deviation between the corresponding part A12 in the captured image and the reference part A13 where the corresponding part A12 should be positioned when the lifting tool 30A grips or unloads the container D as the load, and the moving mechanism 10A moves the lifting tool 30A so that the positional deviation is eliminated. The lifting tool 30A can thereby be accurately positioned immediately above the object.
  • The object is the locked portion G of the target container D1 gripped by the lifting tool 30A.
  • The object is the locked portion G of the target container D3 onto which the holding container D2 held by the lifting tool 30A is unloaded.
  • Two or all of the devices constituting the imaging unit 40, the distance information acquisition unit 41, and the color information acquisition unit 42 may be a common device.
  • For example, the imaging unit 40 and the color information acquisition unit 42 may be configured as a common color camera, and the imaging unit 40, the distance information acquisition unit 41, and the color information acquisition unit 42 may be configured as a common TOF camera with an integrated RGB image sensor.
  • When an object is detected, based on the OR likelihood map or the AND likelihood map, in a region where no object should exist, the control unit 50 may recognize the detected object as an obstacle.
  • the object may have a predetermined shape, a predetermined size, and a predetermined color.
  • The object may be something other than the coil C or the stowage mark M of the skid S.
  • the object may be a part of the coil C, or the entire skid S.
  • the object may be something other than the locked portion G of the target container D1 or the locked portion G of the target container D3.
  • the object may be the entire target container D1 or the entire target container D3.
  • control unit 50 may store reference portions A6, A10, and A13 in advance. In this case, the amount of calculation by the control unit 50 can be reduced.
  • the display unit 43 is not limited to the display provided in the driver's cab 60 or 60A, and any configuration can be adopted.
  • the display unit 43 may be a display of a portable terminal that can communicate with the crane devices 1 and 1A directly or indirectly through a network or the like.
  • the display unit 43 may be omitted. That is, the crane devices 1 and 1A may not include the display unit 43 when the control unit 50 performs the automatic operation instead of the manual operation of the worker.
  • The crane devices 1 and 1A are not limited to an overhead crane and a container handling crane, respectively, and may be various other types of crane.
  • 1, 1A: crane device; 30, 30A: lifting tool
  • 40: imaging unit
  • 41: distance information acquisition unit
  • 42: color information acquisition unit
  • 43: display unit
  • 50: control unit
  • C: coil (load, object)
  • D: container (load)
  • G: locked portion (object)
  • M: stowage mark (object)

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Control And Safety Of Cranes (AREA)

Abstract

A crane device comprises: a lifting tool; an imaging unit that acquires a captured image by imaging below the lifting tool; a distance information acquisition unit that acquires distance information to a plurality of measurement points within the range of the captured image below the lifting tool; a color information acquisition unit that acquires color information within the range of the captured image below the lifting tool; a display unit; and a control unit that processes the captured image and causes it to be displayed on the display unit. The control unit extracts from the captured image, based on the distance information and the color information, a target area that includes the area in which an object is located in the captured image, detects within the target area a corresponding portion that corresponds to the shape of a reference image stored in advance, and displays the captured image on the display unit with the detected corresponding portion highlighted.

Description

Crane device
One aspect of the present invention relates to a crane device.
Patent Document 1 describes a crane that, when a placed container is to be picked up by a spreader, or when a container gripped by the spreader is to be stacked on another placed container, detects the placed container based on a distance map acquired by a 3D camera provided on the spreader and thereby acquires the positional relationship between the spreader and the container.
Japanese Patent Application Publication No. 2015-533747 (特表2015-533747号公報)
The 3D camera used in a crane as described above acquires a distance map by outputting measurement light toward the area containing the container to be detected and detecting the reflected light. For this reason, under the influence of, for example, strong outdoor sunlight or disturbance light, the 3D camera may erroneously detect that a container is present in an area where no container to be detected exists. In addition, when used outdoors in rainy weather, the 3D camera may detect raindrops and fail to detect the target object properly. Furthermore, if the container to be detected has a mirror-like surface, the measurement light may be specularly reflected in a direction in which the 3D camera cannot detect the reflected light. In this case, the 3D camera cannot acquire a distance map for the mirror-like portion, and as a result may fail to detect the container even though the container to be detected is present.
 An object of one aspect of the present invention is therefore to provide a crane device capable of acquiring the positional relationship between a lifting tool and a target object more reliably.
 A crane device according to one aspect of the present invention comprises: a lifting tool that grips, holds, and unloads a load; an imaging unit provided on the lifting tool that images the area below the lifting tool to acquire a captured image; a distance information acquisition unit provided on the lifting tool that acquires distance information to a plurality of measurement points within the range of the captured image below the lifting tool; a color information acquisition unit provided on the lifting tool that acquires color information within the range of the captured image below the lifting tool; a display unit that displays the captured image; and a control unit that processes the captured image and causes the display unit to display it. Based on the distance information and the color information, the control unit extracts from the captured image a target region containing the area in which the target object is located, detects within the target region a corresponding portion that matches the shape of a reference image stored in advance, and causes the display unit to display the captured image with the detected corresponding portion highlighted.
 With the crane device according to this aspect, the region in which the target object is located in the captured image of the area below the lifting tool is estimated both from the distance information and from the color information. Based on these two estimates, the device extracts from the captured image a target region containing the area in which the target object is located, and highlights the corresponding portion matching the shape of the reference image within that target region. The device can therefore acquire the positional relationship between the lifting tool and the target object by using the distance information and the color information complementarily. Because it is less susceptible to, for example, ambient light and specular reflection on the target object, the device can acquire the positional relationship between the lifting tool and the target object more reliably.
 In the crane device according to one aspect of the present invention, the control unit may refrain from detecting the corresponding portion outside the target region of the captured image. Since the corresponding portion matching the shape of the reference image then needs to be searched for only within the target region of the captured image, the time required for processing can be shortened.
 In the crane device according to one aspect of the present invention, the control unit may estimate, based on the distance information, a first region of the captured image in which the target object is located, estimate, based on the color information, a second region of the captured image in which the target object is located, and extract from the captured image, as the target region, a region containing the area included in at least one of the first region and the second region. In this way, in addition to the area in which the target object is estimated to be located from the distance information, which is susceptible to specular reflection on the target object, the area estimated from the color information, which is not easily affected by specular reflection, is also extracted as part of the target region. This suppresses failures to detect the target object when it is present, so the positional relationship between the lifting tool and the target object can be acquired more reliably.
 In the crane device according to one aspect of the present invention, the control unit may estimate, based on the distance information, a first region of the captured image in which the target object is located, estimate, based on the color information, a second region of the captured image in which the target object is located, and extract from the captured image, as the target region, a region containing the area that is included in both the first region and the second region. In this way, only an area in which the target object is estimated to be located not only by the distance information, which is susceptible to ambient light, but also by the color information, which is not easily affected by ambient light, is extracted as the target region. This suppresses false detection of the target object in an area where it is not present, so the positional relationship between the lifting tool and the target object can be acquired more reliably.
 The crane device according to one aspect of the present invention may further comprise a moving mechanism that moves the lifting tool, wherein the control unit detects a positional deviation between the corresponding portion in the captured image and a reference portion where the corresponding portion should be located when the lifting tool grips or unloads the load, and the moving mechanism moves the lifting tool so as to eliminate the positional deviation. The lifting tool can thereby be positioned accurately directly above the target object.
 In the crane device according to one aspect of the present invention, the target object may be at least a part of the load to be gripped by the lifting tool. This allows the positional relationship between the lifting tool and the load to be acquired more reliably in a situation where the lifting tool grips the load.
 In the crane device according to one aspect of the present invention, the target object may be at least a part of the placement section onto which the load held by the lifting tool is unloaded. This allows the positional relationship between the lifting tool and the placement section to be acquired more reliably in a situation where the lifting tool unloads the load.
 According to one aspect of the present invention, the positional relationship between the lifting tool and the target object can be acquired more reliably.
FIG. 1 is a block diagram showing a crane device according to a first embodiment.
FIG. 2 is a front view of the crane device.
FIG. 3 is a side view of the crane device.
FIG. 4 is a perspective view showing a lifting tool and a coil.
FIG. 5 is a view showing a captured image in load-gripping control.
FIG. 6 is a view showing a distance accuracy map in load-gripping control.
FIG. 7 is a view showing a color accuracy map in load-gripping control.
FIG. 8 is a view showing an OR accuracy map in load-gripping control.
FIG. 9 is a view showing a reference image.
FIG. 10 is a view showing a highlighted image in load-gripping control.
FIG. 11 is a view showing an AND accuracy map in load-gripping control.
FIG. 12 is a view showing a highlighted image in unloading control.
FIG. 13 is a block diagram showing a crane device according to a second embodiment.
FIG. 14 is a front view of the crane device.
FIG. 15 is a perspective view of the crane device.
FIG. 16 is a view showing a reference image.
FIG. 17 is a view showing a highlighted image in load-gripping control.
FIG. 18 is a view showing a highlighted image in unloading control.
 Embodiments of the present invention will now be described with reference to the drawings. In the drawings, identical or equivalent parts are denoted by the same reference numerals, and duplicate descriptions are omitted.
[First Embodiment]
[Configuration of the crane device]
 FIG. 1 is a block diagram showing a crane device 1 according to the first embodiment. FIG. 2 is a front view of the crane device 1. FIG. 3 is a side view of the crane device 1. FIG. 4 is a perspective view showing a lifting tool 30 and a coil C. As shown in FIGS. 1 to 4, the crane device 1 according to the first embodiment is an overhead crane that transports a coil (load) C, which is a steel sheet wound into a cylindrical shape. The crane device 1 comprises a moving mechanism 10, the lifting tool 30, an imaging unit 40, a distance information acquisition unit 41, a color information acquisition unit 42, a display unit 43, a control unit 50, and an operator's cab 60. The crane device 1 further comprises a transport control unit 51 that controls the moving mechanism 10 and the lifting tool 30, and an encoder 31 connected to the transport control unit 51.
 The coil C has a substantially cylindrical shape. A hanging hole H passing through the coil C from one end to the other is formed at its axial center. The coil C is placed, with its axis oriented horizontally, on skids (placement sections) S arranged in a matrix at predetermined intervals on a floor surface F of a building such as a factory or warehouse. The coil C has a predetermined shape, a predetermined size, and a predetermined color. The outer peripheral surface of the coil C is mirror-like.
 The moving mechanism 10 moves the lifting tool 30 horizontally, vertically, and rotationally. The moving mechanism 10 includes a girder 12 and a trolley 13. The girder 12 supports the load of the trolley 13 and the lifting tool 30, and is laid substantially horizontally between both walls near the ceiling of the building. The girder 12 is movable in the horizontal direction orthogonal to its extending direction and can thereby move the trolley 13 and the lifting tool 30 in that direction.
 The trolley 13 traverses the upper surface of the girder 12 along the extending direction of the girder 12, thereby moving the lifting tool 30 along that direction. The trolley 13 also has a hoisting mechanism 11 that winds up and lets out a suspended wire rope 17, as well as a rotation mechanism (not shown) that rotates the lifting tool 30 about a vertical axis via the wire rope 17.
 The lifting tool 30 is a device that grips, holds, and unloads the coil C. It is suspended from the wire rope 17 hanging down from the trolley 13; it moves upward when the hoisting mechanism 11 of the trolley 13 winds up the wire rope 17, and downward when the hoisting mechanism 11 lets it out. The lifting tool 30 has a base 18 and a pair of opposed claws 19, 19. The wire rope 17 is connected to the upper surface of the base 18, and the claws 19, 19 are provided on its lower surface so that they can open and close.
 A projection 20 protruding so as to directly face the other is provided at the tip of each claw 19, 19. When the claws 19, 19 close, the projections 20 are inserted into the hanging hole H of the coil C from both sides, bringing the lifting tool 30 into a state of holding the coil C. In this "state in which the lifting tool 30 holds the coil C," the mutually facing inner surfaces 21, 21 of the claws 19, 19 may or may not be in contact with the coil C.
 The lifting tool 30 transfers the coil C in the following manner: it opens and closes the claws 19, 19 to grip a coil C placed on a skid S, is moved over another skid S by the moving mechanism 10 and the hoisting mechanism 11 while holding the coil C, and then opens and closes the claws 19, 19 again to unload the coil C.
 The encoder 31 is a sensor that detects the winding and unwinding amount of the hoisting mechanism 11 and the traversing amount of the trolley 13. Based on the detection result of the encoder 31, the transport control unit 51 acquires height information indicating the current height of the lifting tool 30. Alternatively, the encoder 31 may detect only the winding and unwinding amount of the hoisting mechanism 11 without detecting the traversing amount of the trolley 13.
 The transport control unit 51 controls the driving of the crane device 1. For example, it controls the movement of the girder 12, the traversing of the trolley 13, the winding and unwinding of the wire rope 17 by the hoisting mechanism 11 (vertical movement of the lifting tool 30), the rotation of the lifting tool 30 by the rotation mechanism, and the opening and closing of the claws 19, 19. The transport control unit 51 also outputs to the control unit 50 the height information of the lifting tool 30 acquired from the detection result of the encoder 31.
 The imaging unit 40 is an imaging device, for example a camera, that is provided on the lifting tool 30 and images the area below the lifting tool 30 to acquire a captured image. The imaging unit 40 is mounted on the base 18 of the lifting tool 30 so as to face downward. A plurality of imaging units 40 may be provided on the lifting tool 30.
 The distance information acquisition unit 41 is a device provided on the lifting tool 30 that acquires distance information to a plurality of measurement points below the lifting tool 30 within the range of the captured image taken by the imaging unit 40. It acquires the distance information to each measurement point in association with its position in the captured image. Any device capable of acquiring distance information may be used; for example, it may be a ToF [Time of Flight] camera, a 3D camera, or a 3D scanner. The distance information acquisition unit 41 is mounted near the imaging unit 40 on the base 18 of the lifting tool 30, facing downward, and as many units are provided as there are imaging units 40. The "distance information" is information indicating the distance between the distance information acquisition unit 41 and each measurement point. Here, the measurement points are points set on upper surfaces such as the floor surface F, the skids S, and the coil C, and may be arranged in a matrix, for example. The measurement points may also be set at predetermined positions relative to the position coordinates of the distance information acquisition unit 41 in the horizontal plane.
 The color information acquisition unit 42 is a device provided on the lifting tool 30 that acquires color information below the lifting tool 30 within the range of the captured image taken by the imaging unit 40. It acquires the color information in association with positions in the captured image. Any device capable of acquiring color information may be used; for example, it may be a spectral camera or a color camera. The color information acquisition unit 42 is mounted near the imaging unit 40 on the base 18 of the lifting tool 30, facing downward, and as many units are provided as there are imaging units 40. The "color information" is, for example, information obtained by decomposing a color image into RGB signals and expressing them as vectors.
 The display unit 43 is a display device that displays the captured image as well as the highlighted image generated by the control unit 50. The display unit 43 may be, for example, a display, and is provided in the operator's cab 60, for example.
 The control unit 50 processes the captured image to generate a highlighted image and causes the display unit 43 to display it (described in detail later). The control unit 50 can generate a highlighted image both for load-gripping control, in which the lifting tool 30 grips a coil C placed on a skid S, and for unloading control, in which a coil C held by the lifting tool 30 is unloaded onto a skid S.
[Load-gripping control]
 The display of the highlighted image in load-gripping control is described below.
 FIG. 5 shows a captured image P1 in load-gripping control. As shown in FIG. 5, the control unit 50 causes the display unit 43 to display the captured image P1 taken by the imaging unit 40. FIG. 5 shows a coil C placed below the lifting tool 30, one claw 19 of the lifting tool 30, and a skid S on which no coil C is placed. In FIG. 5, the lifting tool 30 is offset from the position directly above the placed coil C. Note that in FIG. 5, the coil C is placed on a skid S (not shown) underneath it. Because the outer peripheral surface of the coil C is mirror-like, the coil C in the captured image P1 contains a reflection portion R in which the surrounding scenery is reflected.
 Next, the control unit 50 extracts from the captured image P1 a target region containing the area in which the target object is located, as follows. Here, the "target object" is the coil C to be gripped by the lifting tool 30.
 FIG. 6 shows a distance accuracy map P2 in load-gripping control. As shown in FIG. 6, the distance information acquisition unit 41 acquires distance information to a plurality of measurement points (not shown) below the lifting tool 30 within the range of the captured image P1, associating it with positions in the captured image P1. As an example, the measurement points may be set at positions arranged in a matrix at equal intervals, with a predetermined number (for example, 20) in the vertical direction of the captured image P1 and a predetermined number (for example, 30) in the horizontal direction.
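As an illustrative sketch only (the patent does not specify an implementation), such a 20 × 30 matrix of equally spaced measurement points over the captured image could be generated as follows; the 640 × 480 pixel image size is an assumption made for the example.

```python
import numpy as np

def measurement_grid(img_w, img_h, n_cols=30, n_rows=20):
    """Equally spaced measurement points covering the captured image.

    n_cols and n_rows follow the example counts in the text (30 x 20);
    the actual spacing used by the device is not specified.
    """
    xs = np.linspace(0, img_w - 1, n_cols)
    ys = np.linspace(0, img_h - 1, n_rows)
    gx, gy = np.meshgrid(xs, ys)
    # one (x, y) pixel coordinate per measurement point
    return np.stack([gx.ravel(), gy.ravel()], axis=1)

points = measurement_grid(640, 480)  # assumed image size
```

Each row of `points` is then the image position with which one measured distance is associated.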
 Based on the distance information acquired by the distance information acquisition unit 41, the control unit 50 estimates a first region A1 of the captured image P1 in which the coil C, the target object, is located. More specifically, on the premise that the coil C has a predetermined shape and a predetermined size, the control unit 50 predicts the current distance between the lifting tool 30 and the coil C from the current height information of the lifting tool 30. The control unit 50 then compares the predicted distance with the measured distances in the acquired distance information to estimate the first region A1 in which the coil C is located in the captured image P1. For example, if the difference between the predicted distance and the measured distance is within a predetermined range, the control unit 50 may determine that the coil C is located in that area; if it is outside the predetermined range, it may determine that the coil C is not located there.
 In FIG. 6, near the axial center of the coil C in plan view (the white area in FIG. 6), the control unit 50 correctly determines that the coil C is located there. In the area far from the axial center in plan view (the gray area in FIG. 6), however, the measurement light of the distance information acquisition unit 41 strikes the mirror-like outer peripheral surface of the coil C at an angle of incidence close to parallel. In that area, the distance information acquisition unit 41 therefore cannot detect the reflected light, and as a result the control unit 50 incorrectly determines that the coil C is not located there.
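The threshold comparison described above can be sketched as follows. This is a hypothetical reading of the text: the tolerance value, the toy distances, and the use of NaN to mark measurement points with no detectable reflected light (such as on the mirror-like coil surface) are all assumptions, not details given in the patent.

```python
import numpy as np

def estimate_first_region(distance_map, predicted_distance, tolerance):
    """First-region test: a measurement point counts as coil when its
    measured distance is within `tolerance` of the distance predicted
    from the lifting tool's current height.

    NaN marks points where no reflected light was detected; comparisons
    with NaN are False, so such points are excluded, reproducing the
    miss on the mirror-like outer surface of the coil.
    """
    return np.abs(distance_map - predicted_distance) <= tolerance

# toy 2 x 3 distance map: top of the coil at about 3.0 m, floor at
# 5.0 m, NaN where the specular surface returned no light (all assumed)
dmap = np.array([[3.0, 3.1, np.nan],
                 [5.0, 3.0, np.nan]])
first_region = estimate_first_region(dmap, predicted_distance=3.0, tolerance=0.2)
```

The resulting boolean mask corresponds to the white area of the distance accuracy map P2.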
 FIG. 7 shows a color accuracy map P3 in load-gripping control. As shown in FIG. 7, the color information acquisition unit 42 acquires color information below the lifting tool 30 within the range of the captured image P1, associating it with positions in the captured image P1. Based on the color information acquired by the color information acquisition unit 42, the control unit 50 estimates a second region A2 of the captured image P1 in which the coil C, the target object, is located. More specifically, on the premise that the coil C has a predetermined color, the control unit 50 estimates the second region A2 based on the degree of color correlation with that predetermined color in the captured image P1. Any known method of determining the "degree of color correlation" may be used. For example, the control unit 50 may convert the color information into coordinates on a chromaticity diagram and evaluate the distance between those coordinates and the chromaticity coordinates of the color of the coil C, which the control unit 50 stores in advance. The control unit 50 may determine that the coil C is located in an area if the degree of color correlation is equal to or greater than a predetermined value, and that it is not located there if it is less than the predetermined value.
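One simple way to realize the chromaticity-distance evaluation described above is sketched below. The rg-chromaticity projection, the 0.05 threshold, and the sample colors are all illustrative assumptions; the patent only states that a known color-correlation method is used.

```python
def chromaticity(rgb):
    """Project an RGB vector onto rg-chromaticity coordinates.

    The text only says colour information is converted to coordinates
    on 'the chromaticity diagram'; rg-chromaticity is a simple stand-in
    for whichever diagram the device actually employs.
    """
    r, g, b = (float(v) for v in rgb)
    s = r + g + b
    if s == 0:
        return (0.0, 0.0)
    return (r / s, g / s)

def coil_color_match(pixel_rgb, coil_rgb, max_dist=0.05):
    """Second-region test: a pixel counts as coil if its chromaticity
    lies within max_dist (assumed threshold) of the stored coil colour."""
    px, py = chromaticity(pixel_rgb)
    cx, cy = chromaticity(coil_rgb)
    return ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5 <= max_dist

coil_rgb = (120, 120, 130)  # stored coil colour (illustrative value)
```

A slightly darker coil pixel such as `(110, 110, 120)` matches, while a pixel from the reflected scenery such as `(200, 60, 40)` does not, which is why the reflection portion R drops out of the second region A2.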
 In FIG. 7, for the portion of the coil C excluding the reflection portion R in plan view (the white area in FIG. 7), the control unit 50 correctly determines that the coil C is located there. For the reflection portion R, however, whose color differs greatly from the color of the coil, the control unit 50 incorrectly determines that the coil C is not located there.
 FIG. 8 shows an OR accuracy map P4 in load-gripping control. The OR accuracy map P4 shown in FIG. 8 contains an OR region A3 (the white portion in FIG. 8) consisting of the areas included in at least one of the first region A1 of the distance accuracy map P2 and the second region A2 of the color accuracy map P3. The control unit 50 sets as the target region A4 a region containing the OR region A3 (for example, a region obtained by enlarging the outer edge of the OR region A3 by a predetermined width). In this way, the control unit 50 extracts from the captured image P1 the target region A4, the region of the captured image P1 in which the coil C is located.
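The extraction of the target region A4 from the two estimated regions can be sketched as a logical OR of two boolean masks followed by a small dilation; the one-cell margin used here is an assumption standing in for the "predetermined width" mentioned above.

```python
import numpy as np

def or_region(first_region, second_region, margin=1):
    """Combine the distance-based and colour-based estimates with a
    logical OR (region A3), then grow the result by `margin` cells so
    the target region A4 contains A3 with some slack."""
    combined = first_region | second_region
    grown = combined.copy()
    # simple dilation: union the mask with its four axis-direction shifts
    for _ in range(margin):
        padded = np.pad(grown, 1, constant_values=False)
        grown = (padded[1:-1, 1:-1] | padded[:-2, 1:-1] | padded[2:, 1:-1]
                 | padded[1:-1, :-2] | padded[1:-1, 2:])
    return grown

# toy masks: the two estimates each cover one adjacent cell
a1 = np.array([[0, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 0]], dtype=bool)
a2 = np.array([[0, 0, 0, 0], [0, 0, 1, 0], [0, 0, 0, 0]], dtype=bool)
a4 = or_region(a1, a2)
```

Because the union is taken, an area missed by the distance estimate but caught by the color estimate (or vice versa) still ends up inside the target region.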
 Next, the control unit 50 detects, within the target region A4 of the captured image P1, a corresponding portion matching the shape of the reference image stored in advance, as follows.
 FIG. 9 shows a reference image P5, and FIG. 10 shows a highlighted image P6 in load-gripping control. As shown in FIGS. 9 and 10, the control unit 50 stores in advance, as the reference image P5, the shape of the target coil C as it appears in the captured image P1 (specifically, a plan view of the coil C). The control unit 50 then detects, in the part of the captured image P1 corresponding to the target region A4, a corresponding portion A5 matching the shape of the reference image P5, using a known image recognition method. Here, "matching the shape of the reference image P5" allows the detected portion to differ in size from the reference image P5 or to be the reference image P5 tilted about a vertical axis. Examples of usable image recognition methods include pattern matching and machine learning. Note that the control unit 50 does not detect the corresponding portion A5 outside the target region A4 of the captured image P1.
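A minimal sketch of restricting shape detection to the target region is shown below. A plain sum-of-squared-differences template match stands in for whatever pattern-matching or machine-learning detector the actual device uses; windows that do not overlap the target-region mask are skipped, mirroring the rule that no corresponding portion is detected outside A4.

```python
import numpy as np

def match_in_region(image, template, region_mask):
    """Return the (row, col) of the best template match whose window
    overlaps the target region, or None if no window qualifies."""
    th, tw = template.shape
    best_score, best_pos = None, None
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            if not region_mask[y:y + th, x:x + tw].any():
                continue  # window lies entirely outside the target region A4
            score = np.sum((image[y:y + th, x:x + tw] - template) ** 2)
            if best_score is None or score < best_score:
                best_score, best_pos = score, (y, x)
    return best_pos

# toy image: a bright 2 x 2 'coil' at rows 2-3, cols 2-3 (values assumed)
image = np.zeros((6, 6))
image[2:4, 2:4] = 1.0
template = np.ones((2, 2))                  # stand-in for reference image P5
region_mask = np.zeros((6, 6), dtype=bool)
region_mask[1:5, 1:5] = True                # stand-in for target region A4
pos = match_in_region(image, template, region_mask)
pos_outside = match_in_region(image, template, np.zeros((6, 6), dtype=bool))
```

Skipping windows outside the mask is also what shortens the processing time: only the target region is searched.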
 The control unit 50 then generates a highlighted image P6 in which the corresponding portion A5 is emphasized in the captured image P1, and causes the display unit 43 to display it. Here, relative to the captured image P1, the highlighted image P6 shows the target region A4 in color and the area outside the target region A4 in gray, and the corresponding portion A5 is outlined with a broken line. The emphasis on the corresponding portion A5 is not limited to a broken-line outline, as long as the operator can easily recognize it; for example, it may be displayed in a predetermined color, made to blink, or cut out.
 Next, correction of the positional deviation of the lifting tool 30 with respect to the coil C (the object) will be described.
 As shown in FIG. 10, when the lifting tool 30 is to grip the coil C, the control unit 50 sets a reference portion A6, at which the corresponding portion A5 should be located in the captured image P1, calculated from height information obtained from the detection result of the encoder 31. The reference portion A6 is, for example, the plan-view shape of the coil C when the coil C rests on a skid S directly below the lifting tool 30 (below it in the vertical direction). The control unit 50 detects the positional deviation between the corresponding portion A5 and the reference portion A6 in the captured image P1 (or the emphasized image P6). The control unit 50 may also detect the relative distance in the height direction between the lifting tool 30 and the coil C from the size ratio between the corresponding portion A5 and the reference portion A6, and may detect the rotational deviation between the lifting tool 30 and the coil C from the angular deviation between the corresponding portion A5 and the reference portion A6. The control unit 50 outputs information on the detected positional deviation to the conveyance control unit 51.
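The three comparisons in this paragraph, planar offset, size ratio as a height cue, and angular misalignment, can be sketched as one small function. Representing a detected portion by centre, size, and angle is an assumption made for illustration; the text does not specify a data layout.

```python
def detect_deviation(corresponding, reference):
    """Each portion is (centre_x, centre_y, size_px, angle_deg).
    Returns the planar offset, the size ratio (a cue for the relative
    distance in the height direction), and the rotational deviation."""
    cx, cy, cs, ca = corresponding
    rx, ry, rs, ra = reference
    dx, dy = cx - rx, cy - ry
    scale = cs / rs                              # >1: object appears larger
    dtheta = (ca - ra + 180.0) % 360.0 - 180.0   # wrap to (-180, 180]
    return dx, dy, scale, dtheta

# corresponding portion A5 found at (130, 90); reference portion A6 at (120, 100)
print(detect_deviation((130, 90, 80.0, 5.0), (120, 100, 100.0, 0.0)))
# (10, -10, 0.8, 5.0)
```

The offset (dx, dy) would drive girder/trolley motion, the scale the hoisting, and dtheta the rotation mechanism.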
 Based on the positional-deviation information input from the control unit 50, the conveyance control unit 51 causes the moving mechanism 10 to move the lifting tool 30 so that the deviation is eliminated. That is, by moving the girder 12 and traversing the trolley 13, the moving mechanism 10 moves the lifting tool 30 until the deviation disappears, placing the lifting tool 30 directly above the coil C. At this time, the rotation mechanism may be driven as necessary to rotate the lifting tool 30 about a vertical axis.
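Converting the detected pixel deviation into girder-travel and trolley-traverse distances requires a pixel-to-ground scale; nothing in the text fixes how this is obtained, so the pinhole-camera relation below (ground offset = height × pixel offset / focal length, with the camera looking straight down) is purely an assumed illustration.

```python
def pixel_offset_to_metres(dx_px, dy_px, height_m, focal_px):
    """Assumed downward-looking pinhole camera: one pixel of deviation
    corresponds to height_m / focal_px metres on the ground, giving the
    distances the girder / trolley would move to null the deviation."""
    return dx_px * height_m / focal_px, dy_px * height_m / focal_px

print(pixel_offset_to_metres(50, -20, 6.0, 1200.0))   # (0.25, -0.1)
```

The height value here would come from the encoder-derived height information mentioned above.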
 The display of the emphasized image P6 in load-gripping control by the control unit 50 has been described above using the OR certainty map P4. For this display, the control unit 50 can also execute a method that uses an AND certainty map instead of the OR certainty map P4.
 FIG. 11 shows an AND certainty map P7 in load-gripping control. The AND certainty map P7 shown in FIG. 11 indicates an AND region A7 (the white part in FIG. 11) that is contained both in the first region A1 of the distance certainty map P2 and in the second region A2 of the color certainty map P3. The control unit 50 may set a region containing the AND region A7 (for example, a region obtained by expanding the outer edge of the AND region A7 by a predetermined width) as the target region. In this way, the control unit 50 may extract from the captured image P1 a target region in which the coil C is located. Whether the OR certainty map P4 or the AND certainty map P7 is used is selected as appropriate, for example according to the object.
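Treating the first region A1 and second region A2 as boolean masks, the OR/AND combination and the "expand the outer edge by a predetermined width" step can be sketched like this. The 4-neighbour dilation and all names are illustrative choices, not details given in the text.

```python
import numpy as np

def target_region(first_region, second_region, mode="OR", grow=0):
    """OR keeps pixels estimated by either cue (misses fewer real
    objects); AND keeps pixels estimated by both cues (fewer false
    positives). `grow` expands the outer edge by that many pixels."""
    m = first_region | second_region if mode == "OR" else first_region & second_region
    for _ in range(grow):            # naive 4-neighbour binary dilation
        g = m.copy()
        g[1:, :] |= m[:-1, :]
        g[:-1, :] |= m[1:, :]
        g[:, 1:] |= m[:, :-1]
        g[:, :-1] |= m[:, 1:]
        m = g
    return m

a = np.zeros((5, 5), dtype=bool); a[1:3, 1:3] = True   # distance-based region
b = np.zeros((5, 5), dtype=bool); b[2:4, 2:4] = True   # color-based region
print(int(target_region(a, b, "OR").sum()),            # 7 pixels in the union
      int(target_region(a, b, "AND").sum()),           # 1 pixel, at (2, 2)
      int(target_region(a, b, "AND", grow=1).sum()))   # 5 after expansion
```

Choosing OR or AND per object type is then a one-argument switch.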
[Unloading control]
 The control unit 50 can also display an emphasized image during unloading control by processing the captured image in the same way as in load-gripping control. The display of the emphasized image in unloading control is described below.
 FIG. 12 shows an emphasized image P8 in unloading control. The emphasized image P8 in FIG. 12 contains the coil C, the claw portions 19 of the lifting tool 30 holding the coil C, and part of the skid S onto which the coil C is to be unloaded. The skid S carries a stowage mark M having a predetermined shape, a predetermined size, and a predetermined color. In the emphasized image P8, the lifting tool 30 is offset from the position directly above the skid S onto which the coil C is to be unloaded, and is rotated about a vertical axis relative to that skid S. Here, the "object" is the stowage mark M of the skid S onto which the coil C held by the lifting tool 30 is to be unloaded. The shape and size of the stowage mark M here mean the shape and size of a virtual columnar portion of the skid S extending vertically from the whole of the stowage mark M down to the floor surface F.
 The emphasized image P8 shows, relative to the captured image taken before the coil C is unloaded onto the skid S, the target region A8 containing the OR region of the OR certainty map in color, and the area outside the target region A8 in gray. The target region A8 may instead be set on the basis of the AND region of an AND certainty map rather than the OR region of the OR certainty map. In the emphasized image P8, the region of the stowage mark M, detected in the target region A8 as the corresponding portion A9, is enclosed by a broken line. The emphasized image P8 also shows, as a dash-dotted line, a reference portion A10 at which the corresponding portion A9 should be located when the lifting tool 30 unloads the coil C. The reference portion A10 is the plan-view shape of the stowage mark M when the stowage mark M lies directly below the lifting tool 30 (below it in the vertical direction).
 The control unit 50 detects the positional deviation between the corresponding portion A9 and the reference portion A10. The control unit 50 may also detect the relative distance in the height direction between the lifting tool 30 and the coil C from the size ratio between the corresponding portion A9 and the reference portion A10, and may detect the rotational deviation between the lifting tool 30 and the coil C from the angular deviation between the corresponding portion A9 and the reference portion A10. The control unit 50 outputs information on the detected positional deviation to the conveyance control unit 51. Based on this information, the conveyance control unit 51 causes the moving mechanism 10 to move the lifting tool 30 so that the deviation is eliminated. That is, by moving the girder 12, traversing the trolley 13, and driving the rotation mechanism, the moving mechanism 10 moves the lifting tool 30 until the deviation disappears, placing the lifting tool 30 directly above the skid S.
[Operation and effects]
 As described above, according to the crane device 1, in the captured image P1 of the area below the lifting tool 30, a first region A1 in which the object (the coil C or the stowage mark M) is located is estimated from the distance information, and a second region A2 in which the object is located is estimated from the color information. Based on these estimation results, the crane device 1 extracts from the captured image P1 the target regions A4 and A8 containing the region in which the object is located, and highlights the corresponding portions A5 and A9 that match the shape of the reference image within the target regions A4 and A8. The crane device 1 can therefore obtain the positional relationship between the lifting tool 30 and the object by using the distance information and the color information complementarily. Because it is thus less susceptible, for example, to ambient light and to specular reflection at the object, the crane device 1 can obtain the positional relationship between the lifting tool 30 and the object more reliably.
 In the crane device 1, the control unit 50 does not search for the corresponding portions A5 and A9 outside the target regions A4 and A8 of the captured image P1. Since corresponding portions matching the shape of the reference image need to be detected only within the target regions A4 and A8 of the captured image P1, the time required for processing can be shortened.
 In the crane device 1, the control unit 50 estimates, from the distance information, the first region A1 in which the object (the coil C or the stowage mark M) is located in the captured image P1; estimates, from the color information, the second region A2 in which the object is located in the captured image P1; and extracts from the captured image P1, as the target region A4 or A8, a region containing the OR region included in at least one of the first region A1 and the second region A2. In this way, not only the first region A1, estimated from distance information that is susceptible to specular reflection at the object, but also the second region A2, estimated from color information that is less susceptible to specular reflection, is extracted from the captured image P1 as part of the target region A4 or A8. This suppresses failures to detect an object that is actually present, so the positional relationship between the lifting tool 30 and the object can be obtained more reliably.
 Alternatively, in the crane device 1, the control unit 50 estimates, from the distance information, the first region A1 in which the object (the coil C or the stowage mark M) is located in the captured image P1; estimates, from the color information, the second region A2 in which the object is located in the captured image P1; and extracts from the captured image P1, as the target region A4 or A8, a region containing the AND region included both in the first region A1 and in the second region A2. In this way, only an area in which the object is estimated to be located not only by the distance information, which is susceptible to ambient light, but also by the color information, which is less susceptible to ambient light, is extracted from the captured image P1 as the target region A4 or A8. This suppresses false detections of an object in areas where no object exists, so the positional relationship between the lifting tool 30 and the object can be obtained more reliably.
 The crane device 1 includes the moving mechanism 10 that moves the lifting tool 30. The control unit 50 detects the positional deviation between the corresponding portion A5 or A9 in the captured image P1 and the reference portion A6 or A10 at which that corresponding portion should be located when the lifting tool 30 grips or unloads the coil C (the load), and the moving mechanism 10 moves the lifting tool 30 so that the deviation is eliminated. The lifting tool 30 can thereby be positioned accurately directly above the object.
 In the crane device 1, the object is the coil C to be gripped by the lifting tool 30. In the situation where the lifting tool 30 grips the coil C, the positional relationship between the lifting tool 30 and the coil C can therefore be obtained more reliably.
 Alternatively, in the crane device 1, the object is the stowage mark M of the skid S onto which the coil C held by the lifting tool 30 is to be unloaded. In the situation where the lifting tool 30 unloads the coil C, the positional relationship between the lifting tool 30 and the skid S can therefore be obtained more reliably.
[Second embodiment]
[Configuration of the crane device]
 A crane device according to a second embodiment will now be described. Compared with the crane device 1 according to the first embodiment, the crane device according to the second embodiment differs in the type of crane and in the load to be transferred. The description below focuses mainly on the differences from the crane device 1 of the first embodiment.
 FIG. 13 is a block diagram showing a crane device 1A according to the second embodiment, FIG. 14 is a front view of the crane device 1A, and FIG. 15 is a perspective view of the crane device 1A. As shown in FIGS. 13 to 15, the crane device 1A according to the second embodiment is a container-handling crane that is placed in a container yard Y and handles containers D, for example at a container terminal where containers (loads) D are transferred to and from a berthed container ship. The crane device 1A includes a moving mechanism 10A, a lifting tool 30A, an imaging unit 40, a distance information acquisition unit 41, a color information acquisition unit 42, a display unit 43, a control unit 50, and an operator's cab 60A. The lifting tool 30A is also called a spreader. The crane device 1A further includes a conveyance control unit 51A that controls the moving mechanism 10A and the lifting tool 30A, as well as an encoder 31 and a sway sensor 32 connected to the conveyance control unit 51A.
 The container D is a container such as an ISO-standard container. It has the form of a long rectangular parallelepiped with a predetermined length in its longitudinal direction, for example 20 feet or 40 feet. At the four corners of its upper surface, the container D has locked portions G, each formed with a hole Gh (see FIG. 16). Containers D are stacked in one or more tiers in the container yard Y to form a plurality of rows E. The rows E are aligned lengthwise and crosswise so that the longitudinal direction of the containers D forming one row E is parallel to the longitudinal direction of the containers D forming the other rows E. The container D has a predetermined shape and a predetermined size, and each locked portion G has a predetermined shape, a predetermined size, and a predetermined color.
 In the following description, among the containers D, the container D that is about to be gripped by the lifting tool 30A is called the target container D1 (see FIG. 14). The container D held by the lifting tool 30A is called the held container D2, and the container D onto which the held container D2 is about to be unloaded is called the destination container (placement portion) D3 (see FIG. 15). Target container D1, held container D2, and destination container D3 are simply different names for a container D depending on the state of its transfer.
 The moving mechanism 10A is a mechanism that moves the lifting tool 30A (horizontal, vertical, and rotational movement). The moving mechanism 10A includes a traveling device 14, two pairs of legs 15, 15, a girder 12A, and a trolley 13A. The traveling device 14 includes tired wheels provided at the lower ends of the two pairs of legs 15, 15; by driving the tired wheels with traveling motors, it enables the two pairs of legs 15, 15 to travel back and forth. The girder 12A spans substantially horizontally between the upper ends of the two pairs of legs 15, 15. The girder 12A can thus be moved in the horizontal direction orthogonal to its extending direction by the traveling device 14 driving the two pairs of legs 15, 15 back and forth, and the trolley 13A and the lifting tool 30A can be moved in that direction.
 The trolley 13A traverses the upper surface of the girder 12A along the extending direction of the girder 12A, and can thereby move the lifting tool 30A along that direction. The trolley 13A has a hoisting mechanism 11A that winds up and pays out a suspended wire rope 17A. The trolley 13A further has a rotation mechanism (not shown) that rotates the lifting tool 30A about a vertical axis via the wire rope 17A.
 A travel path W for transport vehicles V, such as trailers and AGVs (Automated Guided Vehicles), is laid below the girder 12A. The crane device 1A grips a container D carried in by a transport vehicle V and unloads it onto the container yard Y or onto another container D (the destination container D3) placed on the container yard Y. The crane device 1A also grips a container D (the target container D1) placed on the container yard Y, or on another container D on the container yard Y, unloads it onto a transport vehicle V, and has the transport vehicle V carry the container D out.
 The lifting tool 30A is a device that grips, holds, and unloads a container D, holding the container D from its upper-surface side. The lifting tool 30A is suspended from the wire rope 17A hanging from the trolley 13A: it moves upward when the hoisting mechanism 11A of the trolley 13A winds up the wire rope 17A, and downward when the hoisting mechanism 11A pays the wire rope 17A out. The lifting tool 30A has a main body 18A and four lock pins (not shown).
 In plan view, the main body 18A has a shape and a size corresponding to those of the container D; that is, the main body 18A has a long rectangular shape in plan view. On the upper-surface side of its longitudinal center, the main body 18A includes a sheave 22 around which the wire rope 17A is wound.
 The lock pins are a mechanism for holding the container D. They are provided at the four corners of the lower surface of the main body 18A so as to project downward from it, at positions corresponding to the holes Gh of the locked portions G of the container D when the lifting tool 30A holds that container D. Each lock pin is, for example, a twist-lock pin, and includes at its lower end a locking piece that can rotate about a vertical axis. Each lock pin engages with the container D by entering the hole Gh of one of the locked portions G provided at the four corners of the upper surface of the container D and rotating its locking piece by 90 degrees.
 The encoder 31 is a sensor that detects the winding and paying-out movement of the hoisting mechanism 11 and the traversing movement of the trolley 13A. Based on the detection result of the encoder 31, the conveyance control unit 51A obtains height information indicating the current height of the lifting tool 30A. The encoder 31 may alternatively detect only the winding and paying-out movement of the hoisting mechanism 11, without detecting the traversing movement of the trolley 13A. The sway sensor 32 is a sensor that detects the amount of sway of the lifting tool 30A caused by swinging of the wire rope 17A. Based on the detection result of the sway sensor 32, the conveyance control unit 51A obtains sway-amount information indicating the current sway of the lifting tool 30A.
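How the height information follows from the encoder reading is not spelled out here; one plausible reduction, assuming the encoder counts hoist-drum rotation, is sketched below. Every name and the drum-circumference model are assumptions made for illustration.

```python
def spreader_height(counts, counts_per_rev, drum_circumference_m, ref_height_m):
    """Rope paid out = drum revolutions x drum circumference; the
    spreader height is then the trolley-side reference height minus
    the paid-out rope length (a purely illustrative model)."""
    paid_out_m = counts / counts_per_rev * drum_circumference_m
    return ref_height_m - paid_out_m

# two full drum revolutions of 1.5 m circumference from a 20 m reference
print(spreader_height(2048, 1024, 1.5, 20.0))   # 17.0
```

The resulting height feeds both the reference-portion calculation and any pixel-to-ground scaling.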
 The conveyance control unit 51A controls the driving of the crane device 1A. For example, the conveyance control unit 51A controls the movement of the girder 12A, the traversing of the trolley 13A, the winding up and paying out of the wire rope 17A by the hoisting mechanism 11A (the vertical movement of the lifting tool 30A), the rotational movement of the lifting tool 30A by the rotation mechanism, and the rotation of the locking pieces of the lock pins. The conveyance control unit 51A also outputs to the control unit 50 the height information of the lifting tool 30A obtained from the detection result of the encoder 31 and the sway-amount information of the lifting tool 30A obtained from the detection result of the sway sensor 32.
 The imaging unit 40 is imaging equipment provided on the lifting tool 30A that captures an image of the area below the lifting tool 30A to obtain a captured image. The configuration and functions of the imaging unit 40 are the same as those described above in the first embodiment.
 The distance information acquisition unit 41 is a device provided on the lifting tool 30A that obtains distance information to a plurality of measurement points, below the lifting tool 30A, within the range of the image captured by the imaging unit 40. Here, the measurement points are points set on the upper surfaces of, for example, the container yard Y and the containers D placed on it, and may be set in a matrix, for example. The measurement points may be set at predetermined relative positions with respect to, for example, the position coordinates of the distance information acquisition unit 41 in a horizontal plane. The configuration and functions of the distance information acquisition unit 41 are the same as those described above in the first embodiment.
 The color information acquisition unit 42 is a device provided on the lifting tool 30A that obtains color information, below the lifting tool 30A, within the range of the image captured by the imaging unit 40. The configuration and functions of the color information acquisition unit 42 are the same as those described above in the first embodiment.
 The display unit 43 is display equipment that displays the captured image, and also displays the emphasized image generated by the control unit 50. The display unit 43 may be, for example, a display, and is provided, for example, in the operator's cab 60A.
 The control unit 50 processes the captured image to generate an emphasized image and causes the display unit 43 to display it. The control unit 50 can generate an emphasized image for load-gripping control, in which the target container D1 is gripped by the lifting tool 30A, and an emphasized image for unloading control, in which the held container D2 is unloaded onto the destination container D3.
[Load-gripping control]
 The display of the emphasized image in load-gripping control is described below. The control unit 50 displays the emphasized image in the load-gripping control of the second embodiment by processing the captured image in the same way as in the load-gripping control of the first embodiment.
 FIG. 16 shows a reference image P9, and FIG. 17 shows an emphasized image P10 in load-gripping control. As shown in FIGS. 16 and 17, the control unit 50 stores in advance the shape of a locked portion G of the container D as the reference image P9. The emphasized image P10 in FIG. 17 contains part of the main body 18A of the lifting tool 30A and the vicinity of a locked portion G of the target container D1 that is about to be gripped by the lifting tool 30A. In the emphasized image P10, the lifting tool 30A is offset from the position directly above the target container D1. Here, the "object" is the locked portion G of the target container D1.
 Based on the distance information obtained by the distance information acquisition unit 41, the control unit 50 estimates a first region in which the locked portion G of the target container D1 (the object) is located. Based on the color information obtained by the color information acquisition unit 42, the control unit 50 estimates a second region in which the locked portion G of the target container D1 is located. The control unit 50 sets a region containing the OR region included in at least one of the first region and the second region as the target region A11. In this way, the control unit 50 extracts the target region A11 from the captured image.
 Then, in the target region A11, the control unit 50 detects a corresponding portion A12 that matches the shape of the reference image stored in advance, using a known image recognition method. The control unit 50 does not search for the corresponding portion A12 outside the target region A11 of the captured image. The control unit 50 generates an emphasized image P10 that highlights the corresponding portion A12 detected in the captured image, and causes the display unit 43 to display the generated emphasized image P10.
 The highlighted image P10 renders the target region A11 in color and the regions other than the target region A11 in gray, over the captured image taken before the target container D1 is gripped. The target region A11 may instead be set based on the AND region of the AND likelihood map rather than the OR region of the OR likelihood map. In the highlighted image P10, the region of the engaged portion G detected as the corresponding portion A12 in the target region A11 is outlined with a broken line. In addition, a reference portion A13, where the corresponding portion A12 should be located when the lifting tool 30A grips the target container D1, is drawn with a dash-dotted line. The reference portion A13 is the shape of the engaged portion G in plan view when the engaged portion G is located directly below (vertically beneath) the lifting tool 30A.
 The control unit 50 detects the positional deviation between the corresponding portion A12 and the reference portion A13. The control unit 50 may also detect the relative height between the lifting tool 30A and the target container D1 from the size ratio of the corresponding portion A12 to the reference portion A13, and may detect the rotational deviation between the lifting tool 30A and the target container D1 from the angular deviation between the corresponding portion A12 and the reference portion A13. The control unit 50 outputs information on the detected positional deviation to the transport control unit 51A. Based on this information, the transport control unit 51A causes the moving mechanism 10A to move the lifting tool 30A so that the positional deviation is eliminated. That is, by moving the girder 12A and traversing the trolley 13A, the moving mechanism 10A moves the lifting tool 30A until it is positioned directly above the target container D1. At this time, the rotation mechanism may be driven as necessary to rotate the lifting tool 30A about a vertical axis.
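The deviation-driven positioning described above reduces, in essence, to computing the offset between the detected portion and the reference portion and commanding the girder/trolley axes to cancel it. A simplified one-shot sketch follows (centroid offset and size ratio only; the box representation and all numbers are illustrative assumptions, not from the patent):

```python
# Estimate the correction the moving mechanism must apply, from the
# detected corresponding portion A12 and the reference portion A13.
# Each portion is represented here as (cx, cy, width) in image
# coordinates -- an illustrative simplification.

def positioning_correction(corresponding, reference):
    cx_a, cy_a, w_a = corresponding
    cx_r, cy_r, w_r = reference
    dx = cx_r - cx_a            # girder-direction correction (pixels)
    dy = cy_r - cy_a            # trolley-direction correction (pixels)
    scale = w_a / w_r           # >1.0: detected portion appears larger,
                                # i.e. the lifting tool is relatively low
    return dx, dy, scale

# The detected portion sits 12 px left of and 5 px above the reference
# portion, and appears 10% larger than the reference shape.
dx, dy, scale = positioning_correction((100, 80, 44), (112, 85, 40))
```

In a closed-loop system this computation would be repeated each frame until `dx` and `dy` fall below a tolerance, with `scale` informing the height estimate as the specification suggests.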
[Unloading control]
 The control unit 50 can display a highlighted image in the unloading control as well, by processing the captured image in the same manner as in the load-gripping control. The display of the highlighted image in the unloading control is described below.
 FIG. 18 shows the highlighted image P11 used in the unloading control. The highlighted image P11 of FIG. 18 includes a part of the main body 18A of the lifting tool 30A, a part of the holding container D2 held by the lifting tool 30A, and the vicinity of the engaged portion G of the target container D3 onto which the holding container D2 is to be unloaded. In the highlighted image P11, the lifting tool 30A is offset from the position directly above the target container D3. Here, the "object" is the engaged portion G of the target container D3.
 The highlighted image P11 renders the target region A11, a region including the OR region of the OR likelihood map, in color and the regions other than the target region A11 in gray, over the captured image taken before the holding container D2 is unloaded onto the target container D3. The target region A11 may instead be set based on the AND region of the AND likelihood map rather than the OR region of the OR likelihood map. In the highlighted image P11, the region of the engaged portion G detected as the corresponding portion A12 in the target region A11 is outlined with a broken line. In addition, a reference portion A13, where the corresponding portion A12 should be located when the lifting tool 30A unloads the holding container D2, is drawn with a dash-dotted line. The reference portion A13 is the shape of the engaged portion G in plan view when the engaged portion G is located directly below (vertically beneath) the lifting tool 30A.
 The control unit 50 detects the positional deviation between the corresponding portion A12 and the reference portion A13. The control unit 50 may also detect the relative height between the lifting tool 30A and the target container D3 from the size ratio of the corresponding portion A12 to the reference portion A13, and may detect the rotational deviation between the lifting tool 30A and the target container D3 from the angular deviation between the corresponding portion A12 and the reference portion A13. The control unit 50 outputs information on the detected positional deviation to the transport control unit 51A. Based on this information, the transport control unit 51A causes the moving mechanism 10A to move the lifting tool 30A so that the positional deviation is eliminated. That is, by moving the girder 12A and traversing the trolley 13A, the moving mechanism 10A moves the lifting tool 30A until it is positioned directly above the target container D3. At this time, the rotation mechanism may be driven as necessary to rotate the lifting tool 30A about a vertical axis.
 In the unloading control, because the lifting tool 30A holds the holding container D2, the engaged portion G of the target container D3 may become hidden behind the holding container D2 and disappear from the captured image as the holding container D2 descends toward the target container D3. Therefore, before the engaged portion G of the target container D3 disappears from view, the crane device 1A detects the positional deviation between the corresponding portion A12 and the reference portion A13 while the lifting tool 30A is at a predetermined height, and calibrates the detection value of the sway sensor 32 based on this detection result, thereby associating the detection value of the sway sensor 32 with the positional deviation. Thereafter, while lowering the holding container D2 toward the target container D3 with the lifting tool 30A, the crane device 1A monitors the positional deviation between the corresponding portion A12 and the reference portion A13 based on the detection result of the sway sensor 32. In this way, the crane device 1A can keep the lifting tool 30A positioned directly above the target container D3.
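The calibration step can be pictured as fitting a simple linear map from the sway-sensor reading to the image-space deviation while the engaged portion is still visible, then using that map once the view is occluded. This is a minimal one-axis sketch; the actual calibration procedure and sensor model are not detailed in the specification:

```python
# One-axis calibration: while the engaged portion G is still visible,
# pair sway-sensor readings with measured image deviations, fit a line,
# then predict the deviation from the sensor alone during the descent.

def fit_line(samples):
    """Least-squares fit: deviation = a * sensor + b (illustrative)."""
    n = len(samples)
    sx = sum(s for s, _ in samples)
    sy = sum(d for _, d in samples)
    sxx = sum(s * s for s, _ in samples)
    sxy = sum(s * d for s, d in samples)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Calibration samples taken at the predetermined height:
# (sensor reading, deviation of A12 from A13 in pixels).
samples = [(-1.0, -22.0), (0.0, -2.0), (1.0, 18.0)]
a, b = fit_line(samples)

def predicted_deviation(sensor_value):
    return a * sensor_value + b

# Later, with G hidden behind the held container, the sensor alone
# is used to monitor whether the lifting tool stays directly above D3.
dev = predicted_deviation(0.5)
```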
[Operation and effects]
 As described above, in the crane device 1A, a first region in which the object (the engaged portion G of the target container D1 or of the target container D3) is located is estimated from the distance information in the captured image of the area below the lifting tool 30A, and a second region in which the object is located is estimated from the color information. Based on these estimation results, the crane device 1A extracts from the captured image a target region A11 that includes the region where the object is located, and displays with emphasis the corresponding portion A12 that matches the shape of the reference image P9 within the target region A11. The crane device 1A can therefore obtain the positional relationship between the lifting tool 30A and the object by using the distance information and the color information complementarily. Because the crane device 1A is thus less susceptible to, for example, disturbance light, specular reflection at the object, and false detection of raindrops, it can obtain the positional relationship between the lifting tool 30A and the object more reliably.
 In the crane device 1A, the control unit 50 does not search for the corresponding portion A12 outside the target region A11 of the captured image. Since the corresponding portion A12 matching the shape of the reference image P9 need only be detected within the target region A11, the time required for processing can be shortened.
 In the crane device 1A, the control unit 50 estimates, based on the distance information, a first region of the captured image in which the object (the engaged portion G of the target container D1 or of the target container D3) is located; estimates, based on the color information, a second region of the captured image in which the object is located; and extracts from the captured image, as the target region A11, a region including the OR region contained in at least one of the first and second regions. Consequently, in addition to the first region, which is estimated from the distance information and is susceptible to disturbance light and to false detection of raindrops, the second region, which is estimated from the color information and is less affected by specular reflection at the object, is also extracted as part of the target region A11. This suppresses failures to detect an object that is actually present, so the positional relationship between the lifting tool 30A and the object can be obtained more reliably.
 Alternatively, in the crane device 1A, the control unit 50 estimates, based on the distance information, a first region of the captured image in which the object is located; estimates, based on the color information, a second region of the captured image in which the object is located; and extracts from the captured image, as the target region A11, a region including the AND region contained in both the first region and the second region. As a result, only a region in which the object is estimated to be located not only by the distance information, which is susceptible to disturbance light, but also by the color information, which is less affected by disturbance light, is extracted as the target region A11. This suppresses false detection of an object in a region where no object exists, so the positional relationship between the lifting tool 30A and the object can be obtained more reliably.
 The crane device 1A includes the moving mechanism 10A that moves the lifting tool 30A. The control unit 50 detects the positional deviation between the corresponding portion A12 in the captured image and the reference portion A13 where the corresponding portion A12 should be located when the lifting tool 30A grips or unloads the container D (the load), and the moving mechanism 10A moves the lifting tool 30A so that the positional deviation is eliminated. The lifting tool 30A can thereby be positioned accurately, directly above the object.
 In the crane device 1A, the object is the engaged portion G of the target container D1 to be gripped by the lifting tool 30A. Thus, when the target container D1 is gripped by the lifting tool 30A, the positional relationship between the lifting tool 30A and the target container D1 can be obtained more reliably.
 Alternatively, in the crane device 1A, the object is the engaged portion G of the target container D3 onto which the holding container D2 held by the lifting tool 30A is unloaded. Thus, when the holding container D2 is unloaded by the lifting tool 30A, the positional relationship between the lifting tool 30A and the target container D3 can be obtained more reliably.
[Modifications]
 The embodiments described above can be implemented in various forms with various modifications and improvements based on the knowledge of those skilled in the art.
 For example, in the first and second embodiments, two or all of the devices constituting the imaging unit 40, the distance information acquisition unit 41, and the color information acquisition unit 42 may be a common device. For example, the imaging unit 40 and the color information acquisition unit 42 may be a common color camera, and the imaging unit 40, the distance information acquisition unit 41, and the color information acquisition unit 42 may be a common TOF camera with an integrated RGB image sensor.
 In the first and second embodiments, when an object is detected, based on the OR likelihood map or the AND likelihood map, in a region where no object should exist, the control unit 50 may recognize that object as an obstacle.
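This obstacle-recognition variant can be sketched as a simple mask comparison: any detection inside an area that is expected to be empty is flagged. The sketch is illustrative only; the patent does not specify how the expected-empty area is defined:

```python
# Flag an obstacle when the combined (OR/AND) detection mask reports
# an object inside an area where no object should exist.

def has_obstacle(detection_mask, empty_expected_mask):
    """True if any detected pixel falls inside the expected-empty area."""
    return any(d and e
               for drow, erow in zip(detection_mask, empty_expected_mask)
               for d, e in zip(drow, erow))

# 2x2 example: one detected pixel overlaps the expected-empty area.
detection      = [[False, True],  [False, False]]
expected_empty = [[False, True],  [True,  False]]
obstacle = has_obstacle(detection, expected_empty)
```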
 The object need only have a predetermined shape, a predetermined size, and a predetermined color. For example, in the first embodiment, the object may be something other than the loading mark M of the coil C or the skid S; alternatively, the object may be a part of the coil C, or the whole of the skid S. In the second embodiment, the object may be something other than the engaged portion G of the target container D1 or of the target container D3; alternatively, the object may be the whole of the target container D1, or the whole of the target container D3.
 The control unit 50 may also store the reference portions A6, A10, and A13 in advance. In that case, the amount of computation performed by the control unit 50 can be reduced.
 In the first and second embodiments, the display unit 43 is not limited to a display provided in the operator's cab 60 or 60A; any configuration can be adopted. For example, the display unit 43 may be the display of a portable terminal capable of communicating with the crane device 1 or 1A directly, or indirectly through a network or the like. When the crane device 1 or 1A is operated automatically by the control unit 50 rather than manually by an operator (driver), the display unit 43 may be omitted. That is, the crane device 1 or 1A need not include the display unit 43 when it is driven automatically by the control unit 50 instead of being operated manually by an operator.
 The crane devices 1 and 1A are not limited to an overhead crane and a container-handling crane, respectively, and may be various other types of cranes.
 1, 1A: crane device; 30, 30A: lifting tool; 40: imaging unit; 41: distance information acquisition unit; 42: color information acquisition unit; 43: display unit; 50: control unit; C: coil (load, object); D: container (load); G: engaged portion (object); M: loading mark (object).

Claims (7)

  1.  A crane device comprising:
     a lifting tool that grips, holds, and unloads a load;
     an imaging unit that is provided on the lifting tool and captures an image of the area below the lifting tool to acquire a captured image;
     a distance information acquisition unit that is provided on the lifting tool and acquires distance information to a plurality of measurement points within the range of the captured image below the lifting tool;
     a color information acquisition unit that is provided on the lifting tool and acquires color information within the range of the captured image below the lifting tool;
     a display unit that displays the captured image; and
     a control unit that processes the captured image and causes the display unit to display it,
     wherein the control unit
     extracts from the captured image, based on the distance information and the color information, a target region including a region in which an object is located in the captured image,
     detects, in the target region, a corresponding portion corresponding to the shape of a reference image stored in advance, and
     causes the display unit to display the captured image with the detected corresponding portion emphasized.
  2.  The crane device according to claim 1, wherein the control unit does not detect the corresponding portion outside the target region of the captured image.
  3.  The crane device according to claim 1 or 2, wherein the control unit
     estimates, based on the distance information, a first region in which the object is located in the captured image,
     estimates, based on the color information, a second region in which the object is located in the captured image, and
     extracts from the captured image, as the target region, a region including a region contained in at least one of the first region and the second region.
  4.  The crane device according to claim 1 or 2, wherein the control unit
     estimates, based on the distance information, a first region in which the object is located in the captured image,
     estimates, based on the color information, a second region in which the object is located in the captured image, and
     extracts from the captured image, as the target region, a region including a region contained in both the first region and the second region.
  5.  The crane device according to any one of claims 1 to 4, further comprising a moving mechanism that moves the lifting tool,
     wherein the control unit detects a positional deviation between the corresponding portion in the captured image and a reference portion where the corresponding portion should be located in the captured image when the lifting tool grips or unloads the load, and
     the moving mechanism moves the lifting tool so that the positional deviation is eliminated.
  6.  The crane device according to any one of claims 1 to 5, wherein the object is at least a part of the load gripped by the lifting tool.
  7.  The crane device according to any one of claims 1 to 5, wherein the object is at least a part of a mounting portion onto which the load held by the lifting tool is unloaded.
PCT/JP2018/026546 2017-09-05 2018-07-13 Crane device WO2019049511A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202110179044.5A CN112938766B (en) 2017-09-05 2018-07-13 Crane device
CN201880053626.5A CN111032561B (en) 2017-09-05 2018-07-13 Crane device
JP2019540803A JP6689467B2 (en) 2017-09-05 2018-07-13 Crane equipment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017170605 2017-09-05
JP2017-170605 2017-09-05

Publications (1)

Publication Number Publication Date
WO2019049511A1 true WO2019049511A1 (en) 2019-03-14

Family

ID=65633842

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/026546 WO2019049511A1 (en) 2017-09-05 2018-07-13 Crane device

Country Status (3)

Country Link
JP (1) JP6689467B2 (en)
CN (2) CN112938766B (en)
WO (1) WO2019049511A1 (en)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113110166A (en) * 2021-04-13 2021-07-13 镇江港务集团有限公司 Power control system of horizontal gantry crane and control method thereof
RU2770940C1 (en) * 2021-12-24 2022-04-25 Общество С Ограниченной Ответственностью "Малленом Системс" System and method for automatic determination of position of bridge crane along path of movement

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11167455A (en) * 1997-12-05 1999-06-22 Fujitsu Ltd Hand form recognition device and monochromatic object form recognition device
JP2011198270A (en) * 2010-03-23 2011-10-06 Denso It Laboratory Inc Object recognition device and controller using the same, and object recognition method
JP2015533747A (en) * 2012-10-02 2015-11-26 コネクレーンズ ピーエルシーKonecranes Plc Load handling using load handling equipment
JP2016151955A (en) * 2015-02-18 2016-08-22 キヤノン株式会社 Image processing apparatus, imaging device, distance measuring device, and image processing method

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10245889B4 (en) * 2002-09-30 2008-07-31 Siemens Ag Method and / or device for determining a pendulum of a load of a hoist
DE10245970B4 (en) * 2002-09-30 2008-08-21 Siemens Ag Method and device for detecting a load of a hoist
CN103179369A (en) * 2010-09-21 2013-06-26 株式会社锦宫事务 Imaging object, image processing program and image processing method
CN102452611B (en) * 2010-10-21 2014-01-15 上海振华重工(集团)股份有限公司 Detection method and detection device for space attitude of suspender of container crane
FI125644B (en) * 2011-07-18 2015-12-31 Konecranes Oyj System and method for determining the position and rotation of a crane gripper
US8768583B2 (en) * 2012-03-29 2014-07-01 Harnischfeger Technologies, Inc. Collision detection and mitigation systems and methods for a shovel
DE102012213604A1 (en) * 2012-08-01 2014-02-06 Ge Energy Power Conversion Gmbh Loading device for containers and method for their operation
CN203439940U (en) * 2013-09-04 2014-02-19 贾来国 Automatic control system for RTG/RMG dual-laser sling crash-proof box at container terminal
CN204675650U (en) * 2015-03-18 2015-09-30 苏州盈兴信息技术有限公司 A kind of production material storing flow diagram is as autotracker


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021048916A (en) * 2019-09-20 2021-04-01 東芝ライテック株式会社 Control device and control method
JP7294024B2 (en) 2019-09-20 2023-06-20 東芝ライテック株式会社 Control device and control method
JP2021128059A (en) * 2020-02-13 2021-09-02 コベルコ建機株式会社 Guidance system
JP7306291B2 (en) 2020-02-13 2023-07-11 コベルコ建機株式会社 guidance system
WO2021193294A1 (en) * 2020-03-24 2021-09-30 住友重機械搬送システム株式会社 Remote operation system and remote operation method

Also Published As

Publication number Publication date
CN111032561B (en) 2021-04-09
CN111032561A (en) 2020-04-17
JP6689467B2 (en) 2020-04-28
JPWO2019049511A1 (en) 2020-05-28
CN112938766A (en) 2021-06-11
CN112938766B (en) 2023-08-15

Similar Documents

Publication Publication Date Title
JP6689467B2 (en) Crane equipment
KR101699672B1 (en) Method and system for automatically landing containers on a landing target using a container crane
JP3785061B2 (en) Container position detection method and apparatus for cargo handling crane, container landing and stacking control method
JP4856394B2 (en) Object position measuring apparatus for container crane and automatic cargo handling apparatus using the object position measuring apparatus
CN103030063B (en) For determining method and the container spreader of target position for container spreader
US20180282132A1 (en) Method, load handling device, computer program and computer program product for positioning gripping means
US20190262994A1 (en) Methods and systems for operating a material handling apparatus
US20220009748A1 (en) Loading A Container On A Landing Target
JP2018188299A (en) Container terminal system and control method of the same
EP3418244B1 (en) Loading a container on a landing target
KR100624008B1 (en) Auto landing system and the method for control spreader of crane
JP6807131B2 (en) Cargo handling and transportation equipment
JP7345335B2 (en) Crane operation support system and crane operation support method
CN112996742B (en) Crane system, crane positioning device and crane positioning method
CN110869305B (en) Crane device
JP2022136849A (en) Container measurement system
WO2020184025A1 (en) Crane and method for loading with crane
KR20050007241A (en) Absolute-position detection method and algorithm of spreader for the auto-landing of containers
WO2024009342A1 (en) Conveyance device and control method thereof
US20230382693A1 (en) Apparatus and method for determining the spatial position of pick-up elements of a container
Lim et al. A Visual Measurement System for Coil Shipping Automation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18853655

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019540803

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18853655

Country of ref document: EP

Kind code of ref document: A1