CN111032561A - Crane device

Publication number
CN111032561A
CN111032561A (application CN201880053626.5A)
Authority
CN
China
Prior art keywords
region
spreader
captured image
image
control unit
Legal status
Granted
Application number
CN201880053626.5A
Other languages
Chinese (zh)
Other versions
CN111032561B (en)
Inventor
小林雅人
Current Assignee
Sumitomo Heavy Industries Material Handling Systems Co Ltd
Original Assignee
Sumitomo Heavy Industries Material Handling Systems Co Ltd
Application filed by Sumitomo Heavy Industries Material Handling Systems Co Ltd filed Critical Sumitomo Heavy Industries Material Handling Systems Co Ltd
Priority to CN202110179044.5A (published as CN112938766B)
Publication of CN111032561A
Application granted
Publication of CN111032561B
Legal status: Active

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B66: HOISTING; LIFTING; HAULING
    • B66C: CRANES; LOAD-ENGAGING ELEMENTS OR DEVICES FOR CRANES, CAPSTANS, WINCHES, OR TACKLES
    • B66C13/00: Other constructional features or details
    • B66C13/18: Control systems or devices
    • B66C13/22: Control systems or devices for electric drives
    • B66C13/46: Position indicators for suspended loads or for crane elements
    • B66C13/52: Details of compartments for driving engines or motors or of operator's stands or cabins
    • B66C13/54: Operator's stands or cabins

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Control And Safety Of Cranes (AREA)

Abstract

The crane device includes: a spreader; an imaging unit that is provided on the spreader and captures an image of the area below the spreader to acquire a captured image; a distance information acquisition unit that acquires distance information from the spreader to a plurality of measurement points within the range of the captured image below the spreader; a color information acquisition unit that acquires color information within the range of the captured image below the spreader; a display unit; and a control unit that processes the captured image and displays the processing result on the display unit. The control unit performs the following processing: a target region including the region where an object is present in the captured image is extracted from the captured image based on the distance information and the color information, a corresponding portion matching the shape of a reference image stored in advance is detected within the target region, the detected corresponding portion is emphasized in the captured image, and the captured image with the emphasized corresponding portion is displayed on the display unit.

Description

Crane device
Technical Field
One embodiment of the present invention relates to a crane apparatus.
Background
Patent document 1 describes a crane that, when a container placed below the spreader is to be gripped by the spreader, or when a container gripped by the spreader is to be stacked on another container placed below the spreader, detects the container below the spreader from a distance map acquired by a 3D camera provided on the spreader, and thereby acquires the positional relationship between the spreader and the container.
Prior art documents
Patent document
Patent document 1: japanese laid-open patent publication No. 2015-533747
Disclosure of Invention
Technical problem to be solved by the invention
The 3D camera used in this crane acquires the distance map by emitting measurement light toward an area containing the container to be detected and detecting the reflected light. Therefore, if the 3D camera is affected outdoors by strong sunlight or other disturbance light, for example, it may erroneously detect a container in an area where no container to be detected is present. When used outdoors on a rainy day, the 3D camera may detect raindrops and fail to detect the target object accurately. Furthermore, if a mirror-like portion exists on the container to be detected, the measurement light may be specularly reflected in a direction in which the 3D camera cannot detect the reflected light. In that case, the 3D camera cannot acquire a distance map for the mirror-like portion, and as a result the container may go undetected even though the container to be detected is present.
An object of one embodiment of the present invention is therefore to provide a crane apparatus capable of acquiring the positional relationship between a spreader and an object more reliably.
Means for solving the technical problem
A crane apparatus according to one embodiment of the present invention includes: a spreader that grips, holds, and unloads a load; an imaging unit that is provided on the spreader and captures an image of the area below the spreader to acquire a captured image; a distance information acquisition unit that is provided on the spreader and acquires distance information from the spreader to a plurality of measurement points within the range of the captured image below the spreader; a color information acquisition unit that is provided on the spreader and acquires color information within the range of the captured image below the spreader; a display unit that displays the captured image; and a control unit that processes the captured image and displays the processing result on the display unit. The control unit performs the following processing: a target region including the region where an object is present in the captured image is extracted from the captured image based on the distance information and the color information, a corresponding portion matching the shape of a reference image stored in advance is detected within the target region, the detected corresponding portion is emphasized in the captured image, and the captured image with the emphasized corresponding portion is displayed on the display unit.
According to the crane apparatus of this embodiment, in the captured image of the area below the spreader, the region where the object is present is estimated both from the distance information and from the color information. The apparatus then extracts a target region including the region where the object is present from the captured image based on these estimation results, and highlights the corresponding portion matching the shape of the reference image within the target region. The apparatus can therefore acquire the positional relationship between the spreader and the object by using the distance information and the color information in a complementary manner. Being less susceptible to, for example, disturbance light and specular reflection from the object, it can acquire this positional relationship more reliably.
In the crane apparatus according to this embodiment, the control unit does not have to search for the corresponding portion in regions of the captured image other than the target region. Since the corresponding portion matching the shape of the reference image then only needs to be detected within the target region, the time required for the processing can be shortened.
In the crane apparatus according to this embodiment, the control unit may perform the following processing: a 1st region where the object is present in the captured image is estimated from the distance information, a 2nd region where the object is present in the captured image is estimated from the color information, and a region including the area contained in at least one of the 1st region and the 2nd region is extracted from the captured image as the target region. The target region is thus extracted not only from the distance information, which is easily affected by specular reflection from the object, but also from the color information, which is hardly affected by it. Therefore, the positional relationship between the spreader and the object can be acquired more reliably while preventing an existing object from going undetected.
In the crane apparatus according to this embodiment, the control unit may instead perform the following processing: a 1st region where the object is present in the captured image is estimated from the distance information, a 2nd region where the object is present in the captured image is estimated from the color information, and a region including the area contained in both the 1st region and the 2nd region is extracted from the captured image as the target region. The target region is thus extracted based not only on the distance information, which is easily affected by disturbance light, but also on the color information, which is hardly affected by it. Therefore, the positional relationship between the spreader and the object can be acquired more reliably while suppressing false detection of an object in an area where none is present.
The crane apparatus according to one embodiment of the present invention may further include a moving mechanism that moves the spreader. The control unit detects the misalignment between the corresponding portion in the captured image and a reference portion where the corresponding portion should be located in the captured image when the spreader grips or unloads the load, and the moving mechanism moves the spreader so as to eliminate this misalignment. This makes it possible to position the spreader directly above the object with high accuracy.
In the crane apparatus according to this embodiment, the object may be at least a part of the load to be gripped by the spreader. In a situation where the spreader grips the load, the positional relationship between the spreader and the load can then be acquired more reliably.
In the crane apparatus according to this embodiment, the object may be at least a part of the placement portion onto which the load held by the spreader is to be unloaded. In a situation where the spreader unloads the load, the positional relationship between the spreader and the placement portion can then be acquired more reliably.
Effects of the invention
According to one embodiment of the present invention, the positional relationship between the spreader and the object can be acquired more reliably.
Drawings
Fig. 1 is a block diagram showing a crane apparatus according to embodiment 1.
Fig. 2 is a front view of the crane apparatus.
Fig. 3 is a side view of the crane apparatus.
Fig. 4 is a perspective view showing the spreader and a steel coil.
Fig. 5 is a diagram showing a captured image in gripping control.
Fig. 6 is a diagram showing a distance accuracy map in gripping control.
Fig. 7 is a diagram showing a color accuracy map in gripping control.
Fig. 8 is a diagram showing an OR accuracy map in gripping control.
Fig. 9 is a diagram showing a reference image.
Fig. 10 is a diagram showing an emphasized image in gripping control.
Fig. 11 is a diagram showing an AND accuracy map in gripping control.
Fig. 12 is a diagram showing an emphasized image in unloading control.
Fig. 13 is a block diagram showing a crane apparatus according to embodiment 2.
Fig. 14 is a front view of the crane apparatus.
Fig. 15 is a perspective view of the crane apparatus.
Fig. 16 is a diagram showing a reference image.
Fig. 17 is a diagram showing an emphasized image in gripping control.
Fig. 18 is a diagram showing an emphasized image in unloading control.
Detailed Description
Hereinafter, embodiments of the present invention will be described with reference to the drawings. In the drawings, the same or equivalent portions are denoted by the same reference numerals, and redundant description thereof is omitted.
[ Embodiment 1 ]
[ Structure of the crane apparatus ]
Fig. 1 is a block diagram showing the crane apparatus 1 according to embodiment 1. Fig. 2 is a front view of the crane apparatus 1. Fig. 3 is a side view of the crane apparatus 1. Fig. 4 is a perspective view showing the spreader 30 and a steel coil C. As shown in figs. 1 to 4, the crane apparatus 1 according to embodiment 1 is a bridge crane that transports steel coils (cargo) C, each a steel sheet wound into a cylindrical shape. The crane apparatus 1 includes a moving mechanism 10, a spreader 30, an imaging unit 40, a distance information acquisition unit 41, a color information acquisition unit 42, a display unit 43, a control unit 50, and a cab 60. The crane apparatus 1 further includes a conveyance control unit 51 that controls the moving mechanism 10 and the spreader 30, and an encoder 31 connected to the conveyance control unit 51.
The steel coil C is substantially cylindrical, and a hanging hole H penetrates it along its axis from one end to the other. The steel coil C is placed, with its axis oriented horizontally, on saddles (placement portions) S arranged in a matrix at predetermined intervals on the floor F of a building such as a factory or a warehouse. The steel coil C has a predetermined shape, a predetermined size, and a predetermined color, and its outer peripheral surface is mirror-like.
The moving mechanism 10 is a mechanism for moving the spreader 30 (horizontally, vertically, and rotationally). The moving mechanism 10 has a main beam 12 and a crane trolley 13. The main beam 12 supports the load of the crane trolley 13 and the spreader 30, and spans substantially horizontally between both side walls near the ceiling of the building. The main beam 12 can move in the horizontal direction orthogonal to its extending direction, thereby moving the crane trolley 13 and the spreader 30 in that direction.
The crane trolley 13 traverses the upper surface of the main beam 12 along its extending direction, thereby moving the spreader 30 along the extending direction of the main beam 12. The crane trolley 13 includes a hoisting mechanism 11 that hoists and releases the suspended wire rope 17, and a rotation mechanism (not shown) that rotates the spreader 30 about a vertical axis via the wire rope 17.
The spreader 30 is an implement for gripping, holding, and unloading the steel coil C. The spreader 30 is suspended from the wire rope 17 hanging from the crane trolley 13; it moves upward when the hoisting mechanism 11 of the crane trolley 13 hoists the wire rope 17 and downward when the hoisting mechanism 11 releases it. The spreader 30 includes a base portion 18 and a pair of mutually facing claw portions 19, 19. The wire rope 17 is connected to the upper surface side of the base portion 18, and the claw portions 19, 19 are provided on its lower surface side so as to open and close.
The distal ends of the claw portions 19, 19 are provided with convex portions 20 projecting toward each other. When the claw portions 19, 19 are closed, the convex portions 20 are inserted into the hanging hole H of the steel coil C from both sides, putting the spreader 30 in a state of holding the steel coil C. In this state, the mutually facing inner surfaces 21, 21 of the claw portions 19, 19 may or may not abut against the steel coil C.
The spreader 30 transports the steel coil C in the following sequence: it opens and closes the claw portions 19, 19 to grip the steel coil C placed on a saddle S, moves over another saddle S while holding the steel coil C through the driving of the moving mechanism 10 and the hoisting mechanism 11, and opens the claw portions 19, 19 again to unload the steel coil C.
The encoder 31 is a sensor that detects the hoisting and releasing operation amount of the hoisting mechanism 11 and the traversing amount of the crane trolley 13. The conveyance control unit 51 acquires height information indicating the current height of the spreader 30 from the detection result of the encoder 31. The encoder 31 may detect only the hoisting and releasing operation amount of the hoisting mechanism 11, without detecting the traversing amount of the crane trolley 13.
The conveyance control unit 51 controls the driving of the crane apparatus 1. For example, the conveyance control unit 51 controls the movement of the main beam 12, the traversing of the crane trolley 13, the hoisting and releasing of the wire rope 17 by the hoisting mechanism 11 (i.e., the vertical movement of the spreader 30), the rotational movement of the spreader 30 by the rotation mechanism, and the opening and closing of the claw portions 19, 19. The conveyance control unit 51 also outputs the height information of the spreader 30, acquired from the detection result of the encoder 31, to the control unit 50.
The imaging unit 40 is an imaging device that is provided on the spreader 30 and captures an image of the area below the spreader 30 to acquire a captured image. The imaging unit 40 may be, for example, a camera, and is provided facing downward on the base portion 18 of the spreader 30. A plurality of imaging units 40 may be provided on the spreader 30.
The distance information acquisition unit 41 is a device that is provided on the spreader 30 and acquires distance information to a plurality of measurement points within the range of the captured image captured by the imaging unit 40 below the spreader 30. The distance information acquisition unit 41 acquires the distance information to each measurement point in association with a position in the captured image. The distance information acquisition unit 41 is not particularly limited as long as it can acquire distance information; it may be, for example, a ToF (Time of Flight) camera, a 3D camera, or a 3D scanner. It is provided facing downward near the imaging unit 40 on the base portion 18 of the spreader 30, and as many distance information acquisition units 41 are provided as there are imaging units 40. The "distance information" is information indicating the distance between the distance information acquisition unit 41 and each measurement point. The measurement points are, for example, points set on the upper surfaces of the floor F, the saddles S, the steel coils C, and the like, and may be set in a matrix. Each measurement point may be set at a predetermined relative position based on the position coordinates of the distance information acquisition unit 41 in the horizontal plane.
The color information acquisition unit 42 is a device that is provided on the spreader 30 and acquires color information within the range of the captured image captured by the imaging unit 40 below the spreader 30. The color information acquisition unit 42 acquires the color information in association with a position in the captured image. The color information acquisition unit 42 is not particularly limited as long as it can acquire color information; it may be, for example, a spectral camera or a color camera. It is provided facing downward near the imaging unit 40 on the base portion 18 of the spreader 30, and as many color information acquisition units 42 are provided as there are imaging units 40. The "color information" is, for example, information obtained by decomposing a color image into RGB signals and expressing them in vector form.
The display unit 43 is a display device that displays the captured image, and it displays the emphasized image generated by the control unit 50. The display unit 43 may be, for example, a display, and is provided, for example, in the cab 60.
The control unit 50 processes the captured image to generate an emphasized image and displays the generated emphasized image on the display unit 43 (details will be described later). The control unit 50 can generate an emphasized image both in the gripping control, in which the spreader 30 grips a steel coil C placed on a saddle S, and in the unloading control, in which a steel coil C held by the spreader 30 is unloaded onto a saddle S.
[ Gripping control ]
The display of the emphasized image in gripping control is described below.
Fig. 5 is a diagram showing the captured image P1 in gripping control. As shown in fig. 5, the control unit 50 causes the display unit 43 to display the captured image P1 captured by the imaging unit 40. Fig. 5 shows a steel coil C placed below the spreader 30, one claw portion 19 of the spreader 30, and a saddle S on which no steel coil C is placed. In fig. 5, the spreader 30 is offset from the position directly above the steel coil C to be gripped, and the steel coil C rests on a saddle S, not shown, beneath it. Since the outer peripheral surface of the steel coil C is mirror-like, a reflection portion R, in which the surrounding scene is reflected, appears on the steel coil C in the captured image P1.
Next, the control unit 50 extracts, from the captured image P1, a target region including the region where the object is present in the captured image P1, as follows. Here, the "object" is the steel coil C to be gripped by the spreader 30.
Fig. 6 is a diagram showing the distance accuracy map P2 in gripping control. As shown in fig. 6, the distance information acquisition unit 41 acquires the distance information to a plurality of measurement points (not shown) within the range of the captured image P1 below the spreader 30, in association with positions in the captured image P1. The measurement points may be set, for example, in a matrix of a predetermined number of points (for example, 20) in the vertical direction of the captured image P1 and a predetermined number of points (for example, 30) in the horizontal direction.
The control unit 50 estimates the 1st region A1, in which the object (i.e., the steel coil C) is present in the captured image P1, from the distance information acquired by the distance information acquisition unit 41. More specifically, on the premise that the steel coil C has a predetermined shape and a predetermined size, the control unit 50 predicts the current distance between the spreader 30 and the steel coil C from the current height information of the spreader 30. The control unit 50 then compares the predicted distance with the distance given by the distance information acquired by the distance information acquisition unit 41, and estimates the 1st region A1 in which the steel coil C is present in the captured image P1. For example, the control unit 50 may determine that the steel coil C is present at a measurement point if the difference between the predicted distance and the measured distance is within a predetermined range, and that it is not present if the difference is outside that range.
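As an illustration of this comparison, the following is a minimal sketch, not taken from the patent: it assumes the distance map is delivered as a NumPy array, that undetected reflections are marked NaN, and that the tolerance value shown is merely a placeholder for the "predetermined range".

```python
import numpy as np

def estimate_region_1(measured: np.ndarray, predicted: float,
                      tolerance: float = 0.05) -> np.ndarray:
    """Boolean mask of measurement points whose measured distance lies
    within `tolerance` (metres) of the distance predicted from the
    spreader height and the known coil size.

    NaN entries in `measured` mark points where no reflected light was
    detected (e.g. grazing incidence on the mirror-like coil surface);
    they are treated as "no coil present".
    """
    diff = np.abs(measured - predicted)
    return np.nan_to_num(diff, nan=np.inf) <= tolerance
```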
In fig. 6, the control unit 50 correctly determines that the vicinity of the axis of the steel coil C in plan view (the white area in fig. 6) is an area where the steel coil C is present. In the areas of the steel coil C away from the axis in plan view (the gray areas in fig. 6), however, the measurement light of the distance information acquisition unit 41 strikes the mirror-like outer peripheral surface of the steel coil C at a nearly grazing angle. In these areas the distance information acquisition unit 41 cannot detect the reflected light, and as a result the control unit 50 erroneously determines that the steel coil C is not present.
Fig. 7 is a diagram showing the color accuracy map P3 in gripping control. As shown in fig. 7, the color information acquisition unit 42 acquires the color information within the range of the captured image P1 below the spreader 30, in association with positions in the captured image P1. The control unit 50 estimates the 2nd region A2, in which the object (i.e., the steel coil C) is present in the captured image P1, from the color information acquired by the color information acquisition unit 42. More specifically, on the premise that the steel coil C has a predetermined color, which the control unit 50 stores in advance, the control unit 50 estimates the 2nd region A2 from the degree of color correlation with that predetermined color. A known method may be used to determine the "degree of color correlation"; for example, the control unit 50 may convert the color information into coordinates on a chromaticity diagram and evaluate the distance between those coordinates and the chromaticity coordinates of the stored color of the steel coil C. The control unit 50 determines that the steel coil C is present in an area if the degree of color correlation is equal to or greater than a predetermined value, and that it is not present if the degree is below that value.
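One possible reading of this chromaticity-based correlation is sketched below; the (r, g) projection and the distance threshold are illustrative assumptions rather than values from the patent.

```python
import numpy as np

def chromaticity(rgb: np.ndarray) -> np.ndarray:
    """Project RGB values onto (r, g) chromaticity coordinates, keeping
    the colour direction while discarding overall intensity."""
    s = rgb.sum(axis=-1, keepdims=True)
    return rgb[..., :2] / np.clip(s, 1e-6, None)

def estimate_region_2(image: np.ndarray, coil_rgb: np.ndarray,
                      max_dist: float = 0.05) -> np.ndarray:
    """Boolean mask of pixels whose chromaticity lies within `max_dist`
    of the stored coil colour; the colour correlation is treated as high
    exactly where this chromaticity distance is small."""
    d = np.linalg.norm(chromaticity(image.astype(float))
                       - chromaticity(coil_rgb.astype(float)), axis=-1)
    return d <= max_dist
```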
In fig. 7, the control unit 50 correctly determines that the steel coil C is present in the portion of the steel coil C other than the reflection portion R in plan view (the white area in fig. 7). The reflection portion R, however, has a color significantly different from that of the steel coil, so the control unit 50 erroneously determines that the steel coil C is not present there.
Fig. 8 is a diagram showing the OR accuracy map P4 in gripping control. The OR accuracy map P4 shows the OR region A3 (the white portion in fig. 8), which is the area included in at least one of the 1st region A1 of the distance accuracy map P2 and the 2nd region A2 of the color accuracy map P3. The control unit 50 sets a region including the OR region A3 (for example, a region obtained by enlarging the outer edge of the OR region A3 by a predetermined width) as the target region A4. The control unit 50 thereby extracts the region where the steel coil C is present in the captured image P1 (i.e., the target region A4) from the captured image P1.
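Combining the two estimates into the target region might look like the sketch below; treating "enlarging the outer edge by a predetermined width" as a morphological dilation, and the width itself, are assumptions.

```python
import numpy as np
import cv2

def extract_target_region(region_1: np.ndarray, region_2: np.ndarray,
                          widen_px: int = 15) -> np.ndarray:
    """Union of the 1st and 2nd regions (the OR region), enlarged by
    `widen_px` pixels in every direction to give the target region."""
    or_region = (region_1 | region_2).astype(np.uint8)
    kernel = np.ones((2 * widen_px + 1, 2 * widen_px + 1), np.uint8)
    return cv2.dilate(or_region, kernel).astype(bool)
```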
Next, the control unit 50 detects, within the target region A4 of the captured image P1, a corresponding portion matching the shape of a reference image stored in advance, as follows.
Fig. 9 is a diagram showing the reference image P5. Fig. 10 is a diagram showing the emphasized image P6 in gripping control. As shown in figs. 9 and 10, the control unit 50 stores in advance the shape of the object in the captured image P1 (specifically, the plan-view shape of the steel coil C) as the reference image P5. The control unit 50 then detects, by a known image recognition method, the corresponding portion A5 matching the shape of the reference image P5 within the area of the captured image P1 corresponding to the target region A4. "Matching the shape of the reference image P5" means that the detected shape may differ in size from the reference image P5 and may be the shape of the reference image P5 rotated about a vertical axis. As the "image recognition method", for example, pattern matching or machine learning can be used. The control unit 50 does not search for the corresponding portion A5 in regions of the captured image P1 other than the target region A4.
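Pattern matching is one of the recognition methods the text names. The sketch below, a rough stand-in rather than the patent's algorithm, restricts an OpenCV template search to the target region and scans scale and rotation coarsely; the scale steps, angle steps, and masking-by-zeroing are all assumptions.

```python
import numpy as np
import cv2

def detect_corresponding_portion(captured_bgr, reference_gray, target_mask,
                                 scales=(0.8, 1.0, 1.2),
                                 angles=range(0, 360, 15)):
    """Search for the image portion matching the reference shape, allowing
    a different size and a rotation about the vertical axis.
    Returns (score, (x, y, w, h)) for the best match found."""
    gray = cv2.cvtColor(captured_bgr, cv2.COLOR_BGR2GRAY)
    gray[~target_mask] = 0      # ignore everything outside the target region
    best_score, best_box = 0.0, None
    for s in scales:
        scaled = cv2.resize(reference_gray, None, fx=s, fy=s)
        h, w = scaled.shape[:2]
        for a in angles:
            rot = cv2.getRotationMatrix2D((w / 2, h / 2), a, 1.0)
            template = cv2.warpAffine(scaled, rot, (w, h))
            result = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
            _, score, _, loc = cv2.minMaxLoc(result)
            if score > best_score:
                best_score, best_box = score, (loc[0], loc[1], w, h)
    return best_score, best_box
```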
Next, the control unit 50 generates the emphasized image P6, in which the corresponding portion A5 is emphasized in the captured image P1, and displays the generated emphasized image P6 on the display unit 43. Here, the emphasized image P6 is the captured image P1 with the target region A4 displayed in color, the area other than the target region A4 displayed in gray, and the corresponding portion A5 surrounded by a broken line. The emphasis of the corresponding portion A5 is not limited to a broken-line frame as long as the operator can easily recognize it: the corresponding portion A5 may, for example, be displayed in a predetermined color, blinked, or shown as a cutout.
Next, the correction of the misalignment of the spreader 30 with respect to the object (i.e., the steel coil C) will be described.
As shown in fig. 10, the control unit 50 calculates and sets, from the height information acquired from the detection result of the encoder 31, the reference portion A6 where the corresponding portion A5 should be located in the captured image P1 when the spreader 30 grips the steel coil C. The reference portion A6 is, for example, the plan-view shape of a steel coil C placed on a saddle S directly below (vertically below) the spreader 30. The control unit 50 detects the misalignment between the corresponding portion A5 and the reference portion A6 in the captured image P1 (or the emphasized image P6). The control unit 50 may also detect the relative distance in the height direction between the spreader 30 and the steel coil C from the magnification between the size of the corresponding portion A5 and the size of the reference portion A6, and the deviation in the rotational direction between the spreader 30 and the steel coil C from the angular deviation between the corresponding portion A5 and the reference portion A6. The control unit 50 outputs information on the detected misalignment to the conveyance control unit 51.
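These misalignment quantities reduce to simple differences between the detected and reference portions. A minimal sketch follows; the Portion container and its field names are assumptions introduced for illustration.

```python
from dataclasses import dataclass

@dataclass
class Portion:
    cx: float     # centre x of the portion in the captured image (px)
    cy: float     # centre y (px)
    size: float   # characteristic size, e.g. detected width (px)
    angle: float  # in-plane rotation about the vertical axis (deg)

def misalignment(corresponding: Portion, reference: Portion):
    """Horizontal offset by which to move the spreader, the size
    magnification used as a height-direction distance cue, and the
    rotational deviation to be removed by the rotation mechanism."""
    dx = reference.cx - corresponding.cx
    dy = reference.cy - corresponding.cy
    magnification = reference.size / corresponding.size
    rotation_deg = reference.angle - corresponding.angle
    return dx, dy, magnification, rotation_deg
```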
Based on the misalignment information input from the control unit 50, the conveyance control unit 51 controls the moving mechanism 10 to move the spreader 30 so as to eliminate the misalignment. That is, the moving mechanism 10 moves the main beam 12 and traverses the crane trolley 13 so as to bring the spreader 30 directly above the steel coil C. If necessary, the rotation mechanism may also be driven to rotate the spreader 30 about a vertical axis.
The method by which the control unit 50 displays the emphasized image P6 in gripping control using the OR accuracy map P4 has been described above. The control unit 50 may instead display the emphasized image P6 in gripping control using an AND accuracy map.
Fig. 11 is a diagram showing the AND accuracy map P7 in gripping control. The AND accuracy map P7 shows the AND region A7 (the white portion in fig. 11), which is the area included both in the 1st region A1 of the distance accuracy map P2 and in the 2nd region A2 of the color accuracy map P3. The control unit 50 may set a region including the AND region A7 (for example, a region obtained by enlarging the outer edge of the AND region A7 by a predetermined width) as the target region, and thereby extract the region where the steel coil C is present in the captured image P1 from the captured image P1. Whether the OR accuracy map P4 or the AND accuracy map P7 is used can be selected as appropriate, for example according to the object.
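Under the same assumptions as the OR sketch above, the AND variant differs only in how the two masks are combined.

```python
import numpy as np
import cv2

def extract_target_region_and(region_1: np.ndarray, region_2: np.ndarray,
                              widen_px: int = 15) -> np.ndarray:
    """Intersection of the 1st and 2nd regions (the AND region), enlarged
    by `widen_px` pixels to give the target region."""
    and_region = (region_1 & region_2).astype(np.uint8)
    kernel = np.ones((2 * widen_px + 1, 2 * widen_px + 1), np.uint8)
    return cv2.dilate(and_region, kernel).astype(bool)
```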
[ Unloading control ]
The control unit 50 can display an emphasized image in the unloading control as well, by processing the captured image in the same manner as in the gripping control. The display of the emphasized image in unloading control is described below.
Fig. 12 is a diagram showing the emphasized image P8 in unloading control. The emphasized image P8 of fig. 12 includes a steel coil C, the claw portion 19 of the spreader 30 holding the steel coil C, and part of the saddle S onto which the steel coil C is to be unloaded. The saddle S bears a loading mark M having a predetermined shape, a predetermined size, and a predetermined color. In the emphasized image P8, the spreader 30 is offset from the position directly above the saddle S onto which the steel coil C is to be unloaded, and is rotated about a vertical axis with respect to the saddle S. Here, the "object" is the loading mark M on the saddle S onto which the steel coil C held by the spreader 30 is to be unloaded. The shape and size of the loading mark M referred to here are the shape and size of an imaginary columnar portion of the saddle S extending vertically from the entire loading mark M down to the floor surface F.
The emphasized image P8 is generated from the captured image taken before the steel coil C is unloaded onto the saddle S, by displaying the target region A8, which includes the OR region of the OR accuracy map, in color and the area other than the target region A8 in gray. The target region A8 may instead be set from the AND region of an AND accuracy map. In the emphasized image P8, the corresponding portion A9 detected in the target region A8 (i.e., the loading mark M) is surrounded by a broken line, and the reference portion A10, where the corresponding portion A9 should be located when the spreader 30 unloads the steel coil C, is shown by a dash-dot line. The reference portion A10 is the plan-view shape of the loading mark M when the loading mark M is located directly below (vertically below) the spreader 30.
The control unit 50 detects the misalignment between the corresponding portion A9 and the reference portion A10. The control unit 50 may also detect the relative distance in the height direction between the spreader 30 and the saddle S from the magnification between the size of the corresponding portion A9 and the size of the reference portion A10, and the deviation in the rotational direction between the spreader 30 and the saddle S from the angular deviation between the corresponding portion A9 and the reference portion A10. The control unit 50 outputs information on the detected misalignment to the conveyance control unit 51. Based on this information, the conveyance control unit 51 controls the moving mechanism 10 to move the spreader 30 so as to eliminate the misalignment. That is, the moving mechanism 10 moves the main beam 12, traverses the crane trolley 13, and drives the rotation mechanism so as to bring the spreader 30 directly above the saddle S.
[ Operation and effects ]
As described above, according to the crane apparatus 1, in the captured image P1 of the area below the spreader 30, the 1st region A1 where the object (i.e., the steel coil C or the loading mark M) is present is estimated from the distance information, and the 2nd region A2 where the object is present is estimated from the color information. The crane apparatus 1 then extracts the target region A4 or A8, which includes the region where the object is present, from the captured image P1 based on these estimation results, and highlights the corresponding portion A5 or A9 matching the shape of the reference image within the target region. The crane apparatus 1 can therefore acquire the positional relationship between the spreader 30 and the object by using the distance information and the color information in a complementary manner. Being less susceptible to, for example, disturbance light and specular reflection from the object, the crane apparatus 1 can acquire this positional relationship more reliably.
In the crane apparatus 1, the control unit 50 does not search for the corresponding portions A5 and A9 in the regions of the captured image P1 other than the target regions A4 and A8. Since the corresponding portions A5 and A9 matching the shape of the reference image then only need to be detected within the target regions A4 and A8, the time required for the processing can be shortened.
In the crane apparatus 1, the control unit 50 estimates the 1st region A1 where the object (i.e., the steel coil C or the loading mark M) is present in the captured image P1 from the distance information, estimates the 2nd region A2 where the object is present from the color information, and extracts a region including the OR region, the area included in at least one of the 1st region A1 and the 2nd region A2, from the captured image P1 as the target region A4 or A8. The target region is thus extracted not only from the 1st region A1, based on the distance information that is easily affected by specular reflection from the object, but also from the 2nd region A2, based on the color information that is hardly affected by it. Therefore, the positional relationship between the spreader 30 and the object can be acquired more reliably while preventing an existing object from going undetected.
Alternatively, in the crane apparatus 1, the control unit 50 estimates the 1st region A1 where the object (i.e., the steel coil C or the loading mark M) is present in the captured image P1 from the distance information, estimates the 2nd region A2 where the object is present from the color information, and extracts a region including the AND region, the area included in both the 1st region A1 and the 2nd region A2, from the captured image P1 as the target region A4 or A8. The target region is thus extracted based not only on the distance information, which is easily affected by disturbance light, but also on the color information, which is hardly affected by it. Therefore, the positional relationship between the spreader 30 and the object can be acquired more reliably while suppressing false detection of an object in an area where none is present.
The crane apparatus 1 includes the moving mechanism 10 that moves the spreader 30. The control unit 50 detects the misalignment between the corresponding portion A5 or A9 in the captured image P1 and the reference portion A6 or A10 where it should be located when the spreader 30 grips or unloads the load (i.e., the steel coil C), and controls the moving mechanism 10 to move the spreader 30 so as to eliminate the misalignment. This allows the spreader 30 to be positioned directly above the object with high accuracy.
In the crane apparatus 1, the object is the steel coil C to be gripped by the spreader 30. In a situation where the spreader 30 grips the steel coil C, the positional relationship between the spreader 30 and the steel coil C can thus be acquired more reliably.
Alternatively, in the crane apparatus 1, the object is the loading mark M on the saddle S onto which the steel coil C held by the spreader 30 is to be unloaded. In a situation where the spreader 30 unloads the steel coil C, the positional relationship between the spreader 30 and the saddle S can thus be acquired more reliably.
[ Embodiment 2 ]
[ Structure of the crane apparatus ]
The crane apparatus according to embodiment 2 will be described next. It differs from the crane apparatus 1 according to embodiment 1 in the type of crane and in the load to be transported; the description below focuses mainly on these differences.
Fig. 13 is a block diagram showing a crane apparatus 1A according to embodiment 2. Fig. 14 is a front view of the crane apparatus 1A. Fig. 15 is a perspective view of the crane apparatus 1A. As shown in figs. 13 to 15, the crane apparatus 1A according to embodiment 2 is, for example, a container handling crane that is disposed in a container yard Y of a container terminal, where containers (cargo) D are transferred to and from container ships at the quay, and that handles the containers D. The crane apparatus 1A includes a moving mechanism 10A, a spreader 30A, an imaging unit 40, a distance information acquisition unit 41, a color information acquisition unit 42, a display unit 43, a control unit 50, and a cab 60A. The spreader 30A is also referred to as a spreader device. The crane apparatus 1A further includes a conveyance control unit 51A that controls the moving mechanism 10A and the spreader 30A, and an encoder 31 and a swing sensor 32 connected to the conveyance control unit 51A.
The container D is, for example, an ISO standard container. It has an elongated rectangular parallelepiped shape with a predetermined longitudinal length of, for example, 20 or 40 feet. The container D includes engaged portions G (see fig. 16), each having a hole Gh, at the four corners of its upper surface. The containers D are stacked in one or more tiers in the container yard Y to form a plurality of rows E. The rows E are arranged side by side so that the longitudinal direction of the containers D constituting one row E is parallel to the longitudinal direction of the containers D constituting the other rows E. The container D has a predetermined shape and a predetermined size, and the engaged portion G has a predetermined shape, a predetermined size, and a predetermined color.
In the following description, the container D to be gripped by the spreader 30A is referred to as the object container D1 (see fig. 14), the container D held by the spreader 30A as the holding container D2, and the container D onto which the holding container D2 is to be unloaded as the target container (placement portion) D3 (see fig. 15). Object container D1, holding container D2, and target container D3 are simply names given to a container D according to its state of transport.
The moving mechanism 10A is a mechanism that moves the spreader 30A (horizontally, vertically, and rotationally). The moving mechanism 10A includes a traveling device 14, two pairs of leg portions 15, 15, a main beam 12A, and a crane trolley 13A. The traveling device 14 includes tired wheels provided at the lower ends of the two pairs of leg portions 15, 15 and driven by a traveling motor, so that the two pairs of leg portions 15, 15 can travel forward and backward. The main beam 12A spans substantially horizontally between the upper ends of the two pairs of leg portions 15, 15. By traveling the two pairs of leg portions 15, 15 forward and backward, the traveling device 14 can move the main beam 12A in the horizontal direction orthogonal to its extending direction, thereby moving the crane trolley 13A and the spreader 30A in that direction.
The crane trolley 13A traverses the upper surface of the main beam 12A along its extending direction, thereby moving the spreader 30A along the extending direction of the main beam 12A. The crane trolley 13A includes a hoisting mechanism 11A that hoists and releases the suspended wire rope 17A, and a rotation mechanism (not shown) that rotates the spreader 30A about a vertical axis via the wire rope 17A.
Containers D are carried into and out of the container yard Y by a conveyance carriage V such as a trailer or an AGV (Automated Guided Vehicle). The crane apparatus 1A grips a container D carried in by the conveyance carriage V and unloads it onto the container yard Y or onto another container D (target container D3) in the container yard Y. Conversely, the crane apparatus 1A grips a container D (object container D1) placed on the container yard Y or on another container D there, unloads it onto the conveyance carriage V, and the conveyance carriage V carries it out.
The spreader 30A is an implement for gripping, holding, and unloading the container D, and it holds the container D from its upper surface side. The spreader 30A is suspended from the wire rope 17A hanging from the crane trolley 13A; it moves upward when the hoisting mechanism 11A of the crane trolley 13A hoists the wire rope 17A and downward when the hoisting mechanism 11A releases it. The spreader 30A includes a body portion 18A and four lock pins (not shown).
The body portion 18A has a shape and size corresponding to those of the container D in plan view, i.e., an elongated rectangular shape. On the upper surface side of its longitudinal center, the body portion 18A includes a sheave 22 around which the wire rope 17A is wound.
The lock pins are the mechanism for holding the container D. They are provided at the four corners of the lower surface of the body portion 18A so as to protrude downward from the body portion 18A, at positions corresponding to the holes Gh of the engaged portions G of the container D when the spreader 30A holds the container D. Each lock pin is, for example, a twist lock pin with a locking piece at its lower end that can rotate about a vertical axis. Each lock pin enters the hole Gh of one of the engaged portions G provided at the four corners of the upper surface of the container D, and the container D is engaged by rotating each locking piece by 90 degrees.
The encoder 31 is a sensor that detects the hoisting and releasing operation amount of the hoisting mechanism 11A and the traversing amount of the crane trolley 13A. The conveyance control unit 51A acquires height information indicating the current height of the spreader 30A from the detection result of the encoder 31. The encoder 31 may detect only the hoisting and releasing operation amount of the hoisting mechanism 11A, without detecting the traversing amount of the crane trolley 13A. The swing sensor 32 is a sensor that detects the swing amount of the spreader 30A caused by the swinging of the wire rope 17A. The conveyance control unit 51A acquires swing amount information indicating the current swing amount of the spreader 30A from the detection result of the swing sensor 32.
The conveyance control unit 51A controls the driving of the crane apparatus 1A. For example, the conveyance control unit 51A controls the movement of the main beam 12A, the traversing of the crane trolley 13A, the hoisting and releasing of the wire rope 17A by the hoisting mechanism 11A (i.e., the vertical movement of the spreader 30A), the rotational movement of the spreader 30A by the rotation mechanism, and the rotation of the locking pieces of the lock pins. The conveyance control unit 51A outputs to the control unit 50 the height information of the spreader 30A obtained from the detection result of the encoder 31 and the swing amount information of the spreader 30A obtained from the detection result of the swing sensor 32.
The imaging unit 40 is an imaging device that is provided on the spreader 30A and captures an image of the area below the spreader 30A to acquire a captured image. Its configuration and function are the same as in embodiment 1.
The distance information acquisition unit 41 is a device that is provided on the spreader 30A and acquires distance information to a plurality of measurement points within the range of the captured image captured by the imaging unit 40 below the spreader 30A. Here, the measurement points are, for example, points set on the upper surfaces of the container yard Y or of the containers D placed on it, and may be set in a matrix. Each measurement point may be set at a predetermined relative position based on the position coordinates of the distance information acquisition unit 41 in the horizontal plane. The configuration and function of the distance information acquisition unit 41 are otherwise the same as in embodiment 1.
The color information acquisition unit 42 is a device that is provided on the spreader 30A and acquires color information within the range of the captured image captured by the imaging unit 40 below the spreader 30A. Its configuration and function are the same as in embodiment 1.
The display unit 43 is a display device that displays the captured image, and it displays the emphasized image generated by the control unit 50. The display unit 43 may be, for example, a display, and is provided, for example, in the cab 60A.
The control unit 50 processes the captured image to generate an emphasized image, and displays the generated emphasized image on the display unit 43. The control unit 50 can generate an emphasized image in the gripping control for gripping the object container D1 by the spreader 30A and an emphasized image in the unloading control for unloading the holding container D2 onto the target container D3.
[ Gripping control ]
The display of the emphasized image in gripping control is described below. The control unit 50 displays the emphasized image in the gripping control of embodiment 2 by processing the captured image in the same manner as in the gripping control of embodiment 1.
Fig. 16 is a diagram showing the reference image P9. Fig. 17 is a diagram showing the emphasized image P10 in gripping control. As shown in figs. 16 and 17, the control unit 50 stores the shape of the engaged portion G of the container D as the reference image P9. The emphasized image P10 in fig. 17 includes part of the body portion 18A of the spreader 30A and the vicinity of an engaged portion G of the object container D1 to be gripped by the spreader 30A. In the emphasized image P10, the spreader 30A is offset from the position directly above the object container D1. Here, the "object" is the engaged portion G of the object container D1.
The control unit 50 estimates the 1st region, where the object (i.e., the engaged portion G of the object container D1) is present, from the distance information acquired by the distance information acquisition unit 41, and estimates the 2nd region, where the object is present, from the color information acquired by the color information acquisition unit 42. The control unit 50 sets a region including the OR region, the area included in at least one of the 1st region and the 2nd region, as the target region A11. The control unit 50 thereby extracts the target region A11 from the captured image.
The control unit 50 then detects, by a known image recognition method, the corresponding portion A12 matching the shape of the reference image stored in advance within the target region A11; it does not search for the corresponding portion A12 in regions of the captured image other than the target region A11. The control unit 50 generates the emphasized image P10, in which the detected corresponding portion A12 is emphasized in the captured image, and displays the generated emphasized image P10 on the display unit 43.
The emphasized image P10 is generated from the captured image taken before the object container D1 is gripped, by displaying the target region A11 in color and the area other than the target region A11 in gray. The target region A11 may instead be set from the AND region of an AND accuracy map rather than the OR region of the OR accuracy map. In the emphasized image P10, the corresponding portion A12 detected in the target region A11 (i.e., the region of the engaged portion G) is surrounded by a broken line, and the reference portion A13, where the corresponding portion A12 should be located when the spreader 30A grips the object container D1, is indicated by a dash-dot line. The reference portion A13 is the plan-view shape of the engaged portion G when the engaged portion G is located directly below (vertically below) the spreader 30A.
The control unit 50 detects the misalignment between the corresponding portion A12 and the reference portion A13. The control unit 50 may also detect the relative distance in the height direction between the spreader 30A and the object container D1 from the magnification between the size of the corresponding portion A12 and the size of the reference portion A13, and the deviation in the rotational direction between the spreader 30A and the object container D1 from the angular deviation between the corresponding portion A12 and the reference portion A13. The control unit 50 outputs information on the detected misalignment to the conveyance control unit 51A. Based on this information, the conveyance control unit 51A controls the moving mechanism 10A to move the spreader 30A so as to eliminate the misalignment. That is, the moving mechanism 10A moves the main beam 12A and traverses the crane trolley 13A so as to bring the spreader 30A directly above the object container D1. If necessary, the rotation mechanism may also be driven to rotate the spreader 30A about a vertical axis.
[ unloading control ]
The control unit 50 processes the captured image in the unloading control in the same manner as in the gripping control, and can therefore display an emphasized image in the unloading control as well. The display of the emphasized image in the unloading control is described below.
Fig. 18 is a diagram showing the emphasized image P11 in the unloading control. The emphasized image P11 in fig. 18 includes a part of the main body portion 18A of the spreader 30A, a part of the holding container D2 held by the spreader 30A, and the vicinity of the engaged portion G of the target container D3 onto which the holding container D2 is to be unloaded. In the emphasized image P11, the spreader 30A is offset from the position directly above the target container D3. Here, the "object" is the engaged portion G of the target container D3.
The emphasized image P11 is produced, from the captured image taken before the holding container D2 is unloaded onto the target container D3, by displaying the region including the OR region in the OR accuracy map (i.e., the object region A11) in color and displaying the region other than the object region A11 in gray. The object region A11 may be set from the AND region in the AND accuracy map instead of the OR region in the OR accuracy map. In the emphasized image P11, the corresponding portion A12 detected in the object region A11 (i.e., the region of the engaged portion G) is surrounded by a broken line. In the emphasized image P11, the reference portion A13, where the corresponding portion A12 should be located when the holding container D2 is unloaded by the spreader 30A, is indicated by a one-dot chain line. The reference portion A13 is the shape of the engaged portion G in a plan view when the engaged portion G is positioned directly below (vertically below) the spreader 30A.
The control unit 50 detects the misalignment between the corresponding portion A12 and the reference portion A13. Further, the control unit 50 may detect the relative distance in the height direction between the spreader 30A and the target container D3 from the magnification between the size of the corresponding portion A12 and the size of the reference portion A13, and may detect the deviation in the rotational direction between the spreader 30A and the target container D3 from the angular deviation between the corresponding portion A12 and the reference portion A13. The control unit 50 outputs information on the detected misalignment to the conveyance control unit 51A. Based on this information, the conveyance control unit 51A controls the moving mechanism 10A to move the spreader 30A so as to eliminate the misalignment. That is, the moving mechanism 10A moves the spreader 30A to the position directly above the target container D3 by traveling of the main beam 12A and traversing of the crane carriage 13A. At this time, if necessary, the rotation mechanism may be driven to rotate the spreader 30A about an axis in the vertical direction.
In the unloading control, the spreader 30A holds the holding container D2, so when the holding container D2 is lowered toward the target container D3, the engaged portion G of the target container D3 may be hidden by the holding container D2 and disappear from the captured image. Therefore, while the engaged portion G of the target container D3 is still visible, the crane apparatus 1A detects the misalignment between the corresponding portion A12 and the reference portion A13 with the spreader 30A at a predetermined height, and corrects the detection value of the swing sensor 32 based on the detection result, thereby associating the detection value of the swing sensor 32 with the misalignment. Then, while lowering the holding container D2 toward the target container D3 with the spreader 30A, the crane apparatus 1A monitors the misalignment between the corresponding portion A12 and the reference portion A13 based on the detection result of the swing sensor 32. The crane apparatus 1A can thereby keep the spreader 30A positioned directly above the target container D3.
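The following sketch illustrates this hand-over from image-based measurement to the swing sensor 32; the two-axis interface and the simple additive bias model are assumptions, since the publication only states that the sensor value is corrected and then monitored.

    class SwingCalibration:
        """Associates swing sensor 32 readings with image-based misalignment."""

        def __init__(self):
            self.bias = (0.0, 0.0)

        def calibrate(self, image_misalign_xy, sensor_xy):
            # At the predetermined height, take the image-based misalignment
            # between A12 and A13 as ground truth and store the sensor bias.
            self.bias = (image_misalign_xy[0] - sensor_xy[0],
                         image_misalign_xy[1] - sensor_xy[1])

        def estimate(self, sensor_xy):
            # Misalignment estimate used while the engaged portion G is
            # hidden by the holding container D2.
            return (sensor_xy[0] + self.bias[0],
                    sensor_xy[1] + self.bias[1])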
[Operation and Effect]
As described above, according to the crane apparatus 1A, the 1st region where the object (i.e., the engaged portion G of the object container D1 or the engaged portion G of the target container D3) is located is estimated from the distance information in the image captured below the spreader 30A, and the 2nd region where the object is located is estimated from the color information. The crane apparatus 1A then extracts the object region A11, which includes the region where the object is present, from the captured image based on these estimation results, and highlights the corresponding portion A12 matching the shape of the reference image P9 within the object region A11. The crane apparatus 1A can therefore acquire the positional relationship between the spreader 30A and the object using the distance information and the color information complementarily. As a result, the crane apparatus 1A is less susceptible to, for example, disturbance light, specular reflection on the object, and erroneous detection of raindrops, and can acquire the positional relationship between the spreader 30A and the object more reliably.
In the crane apparatus 1A, the control unit 50 does not search for the corresponding portion A12 in the region of the captured image other than the object region A11. Since the corresponding portion A12 matching the shape of the reference image P9 need only be detected within the object region A11 of the captured image, the time required for the processing can be shortened.
In the crane apparatus 1A, the control unit 50 estimates the 1st region where the object (i.e., the engaged portion G of the object container D1 or the engaged portion G of the target container D3) is located in the captured image based on the distance information, estimates the 2nd region where the object is located in the captured image based on the color information, and extracts from the captured image, as the object region A11, a region including the OR region included in at least one of the 1st region and the 2nd region. Thus, the object region A11 is not limited to the 1st region estimated from the distance information, which is easily affected by disturbance light and by erroneous detection of raindrops; it also covers the 2nd region estimated from the color information, which is not easily affected by specular reflection on the object. Therefore, the positional relationship between the spreader 30A and the object can be acquired more reliably while suppressing cases in which the object is present but cannot be detected.
Alternatively, in the crane apparatus 1A, the control unit 50 estimates the 1st region where the object (i.e., the engaged portion G of the object container D1 or the engaged portion G of the target container D3) is located in the captured image based on the distance information, estimates the 2nd region where the object is located in the captured image based on the color information, and extracts from the captured image, as the object region A11, a region including the AND region included in both the 1st region and the 2nd region. Thus, a region is extracted as the object region A11 only when the object is estimated to be present based not only on the distance information, which is easily affected by disturbance light, but also on the color information, which is not easily affected by disturbance light. Therefore, the positional relationship between the spreader 30A and the object can be acquired more reliably while suppressing erroneous detection of the object in a region where it is not present.
The crane apparatus 1A includes the moving mechanism 10A for moving the spreader 30A. The control unit 50 detects the misalignment between the corresponding portion A12 in the captured image and the reference portion A13 where the corresponding portion A12 should be located in the captured image when the spreader 30A grips or unloads the cargo (i.e., the container D), and controls the moving mechanism 10A to move the spreader 30A so as to eliminate the misalignment. This allows the spreader 30A to be positioned directly above the object with high accuracy.
In the crane apparatus 1A, the object is the engaged portion G of the object container D1 to be gripped by the spreader 30A. Thus, when the spreader 30A grips the object container D1, the positional relationship between the spreader 30A and the object container D1 can be acquired more reliably.
Alternatively, in the crane apparatus 1A, the object is the engaged portion G of the target container D3 onto which the holding container D2 held by the spreader 30A is to be unloaded. Thus, when the spreader 30A unloads the holding container D2, the positional relationship between the spreader 30A and the target container D3 can be acquired more reliably.
[Modified Examples]
The above-described embodiments may be variously modified and improved according to the knowledge of those skilled in the art.
For example, in embodiment 1 and embodiment 2, two or all of the devices constituting the imaging unit 40, the distance information acquisition unit 41, and the color information acquisition unit 42 may be a common device. For example, the imaging unit 40 and the color information acquisition unit 42 may be constituted by a general-purpose color camera, or the imaging unit 40, the distance information acquisition unit 41, and the color information acquisition unit 42 may be constituted by a general-purpose TOF camera with an integrated RGB image sensor.
In embodiments 1 and 2, when the control unit 50 detects an object, based on the OR accuracy map or the AND accuracy map, in a region where the object should not exist, the control unit 50 can recognize that object as an obstacle.
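A minimal sketch of this classification follows, assuming an expected_mask marking where the object should exist; the name and the binary object/obstacle split are illustrative only.

    def classify_detection(detection_center, expected_mask):
        # A detection inside the accuracy map but outside the area where
        # the object should exist is treated as an obstacle.
        x, y = detection_center
        return "object" if expected_mask[y, x] else "obstacle"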
The object may be anything that has a predetermined shape, size, and color. For example, in embodiment 1, the object may be an object other than the steel coil C or the loading mark M of the saddle S. Alternatively, the object may be a part of the steel coil C or the entire saddle S. In embodiment 2, the object may be an object other than the engaged portion G of the object container D1 or the engaged portion G of the target container D3. Alternatively, the object may be the entire object container D1 or the entire target container D3.
Further, the control unit 50 may store the reference portions A6, A10, and A13 in advance. In this case, the amount of calculation by the control unit 50 can be reduced.
In embodiments 1 and 2, the display unit 43 is not limited to a display provided in the cab 60 or 60A and may have any configuration. For example, the display unit 43 may be a display of a portable terminal that can communicate with the crane apparatuses 1 and 1A directly or indirectly via a network or the like. In addition, when the crane apparatuses 1 and 1A are operated not by a manual operation of an operator (driver) but automatically by the control unit 50, the display unit 43 may be omitted.
The crane apparatuses 1 and 1A are not limited to a bridge crane and a container handling crane, and may be various other types of crane.
Description of the symbols
1, 1A - crane apparatus, 30A - spreader, 40 - imaging unit, 41 - distance information acquisition unit, 42 - color information acquisition unit, 43 - display unit, 50 - control unit, C - steel coil (cargo, object), D - container (cargo), G - engaged portion (object), M - loading mark (object).

Claims (7)

1. A crane device is characterized by comprising:
a spreader which performs gripping, holding, and unloading of cargo;
an imaging unit which is provided on the spreader and captures an image of an area below the spreader to acquire a captured image;
a distance information acquisition unit which is provided on the spreader and acquires distance information from the spreader to a plurality of measurement points within a range of the captured image below the spreader;
a color information acquisition unit which is provided on the spreader and acquires color information within a range of the captured image below the spreader;
a display unit which displays the captured image; and
a control unit which processes the captured image and displays a processing result on the display unit,
wherein the control unit performs the following processing:
extracting, from the captured image, an object region including a region where an object is present in the captured image, based on the distance information and the color information,
detecting, in the object region, a corresponding portion corresponding to a shape of a reference image stored in advance, and
emphasizing the detected corresponding portion in the captured image and displaying it on the display unit.
2. The crane device according to claim 1, wherein
the control unit does not detect the corresponding portion in a region of the captured image other than the object region.
3. The crane device according to claim 1 or 2, wherein
the control unit performs the following processing:
estimating a 1st region where the object is located in the captured image, based on the distance information,
estimating a 2nd region where the object is located in the captured image, based on the color information, and
extracting, from the captured image, as the object region, a region including a region included in at least one of the 1st region and the 2nd region.
4. The crane device according to claim 1 or 2, wherein
the control unit performs the following processing:
estimating a 1st region where the object is located in the captured image, based on the distance information,
estimating a 2nd region where the object is located in the captured image, based on the color information, and
extracting, from the captured image, as the object region, a region including a region included in both the 1st region and the 2nd region.
5. The crane device according to any one of claims 1 to 4, further comprising a moving mechanism which moves the spreader, wherein
the control unit detects a misalignment between the corresponding portion in the captured image and a reference portion where the corresponding portion should be located in the captured image when the spreader performs the gripping or unloading of the cargo, and
the moving mechanism moves the spreader so as to eliminate the misalignment.
6. The crane device according to any one of claims 1 to 5, wherein
the object is at least a part of the cargo gripped by the spreader.
7. The crane device according to any one of claims 1 to 5, wherein
the object is at least a part of a placement portion onto which the cargo held by the spreader is to be unloaded.
CN201880053626.5A 2017-09-05 2018-07-13 Crane device Active CN111032561B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110179044.5A CN112938766B (en) 2017-09-05 2018-07-13 Crane device

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2017170605 2017-09-05
JP2017-170605 2017-09-05
PCT/JP2018/026546 WO2019049511A1 (en) 2017-09-05 2018-07-13 Crane device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202110179044.5A Division CN112938766B (en) 2017-09-05 2018-07-13 Crane device

Publications (2)

Publication Number Publication Date
CN111032561A true CN111032561A (en) 2020-04-17
CN111032561B CN111032561B (en) 2021-04-09

Family

ID=65633842

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202110179044.5A Active CN112938766B (en) 2017-09-05 2018-07-13 Crane device
CN201880053626.5A Active CN111032561B (en) 2017-09-05 2018-07-13 Crane device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202110179044.5A Active CN112938766B (en) 2017-09-05 2018-07-13 Crane device

Country Status (3)

Country Link
JP (1) JP6689467B2 (en)
CN (2) CN112938766B (en)
WO (1) WO2019049511A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113110166A (en) * 2021-04-13 2021-07-13 镇江港务集团有限公司 Power control system of horizontal gantry crane and control method thereof
WO2023121502A1 (en) * 2021-12-24 2023-06-29 Общество С Ограниченной Ответственностью "Малленом Системс" System for automatically determining the position of an overhead crane

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7294024B2 (en) * 2019-09-20 2023-06-20 東芝ライテック株式会社 Control device and control method
JP7306291B2 (en) * 2020-02-13 2023-07-11 コベルコ建機株式会社 guidance system
JP2021151909A (en) * 2020-03-24 2021-09-30 住友重機械搬送システム株式会社 Remote operation system and remote operation method

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11167455A (en) * 1997-12-05 1999-06-22 Fujitsu Ltd Hand form recognition device and monochromatic object form recognition device
CN1684901A (en) * 2002-09-30 2005-10-19 西门子公司 Method and device for recognition of a load on a lifting gear
JP2011198270A (en) * 2010-03-23 2011-10-06 Denso It Laboratory Inc Object recognition device and controller using the same, and object recognition method
CN203439940U (en) * 2013-09-04 2014-02-19 贾来国 Automatic control system for RTG/RMG dual-laser sling crash-proof box at container terminal
CN103693551A (en) * 2012-08-01 2014-04-02 通用电气能源电力转换有限责任公司 Unloading device for containers and method for operating them
CN103781717A (en) * 2011-07-18 2014-05-07 科恩起重机有限公司 System and method for determining location and skew of crane grappling member
CN104302848A (en) * 2012-03-29 2015-01-21 哈尼施费格尔技术公司 Overhead view system for shovel
US20160239977A1 (en) * 2015-02-18 2016-08-18 Canon Kabushiki Kaisha Image processing apparatus, imaging apparatus, depth measuring apparatus and image processing method

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10245889B4 (en) * 2002-09-30 2008-07-31 Siemens Ag Method and / or device for determining a pendulum of a load of a hoist
CN103179369A (en) * 2010-09-21 2013-06-26 株式会社锦宫事务 Imaging object, image processing program and image processing method
CN102452611B (en) * 2010-10-21 2014-01-15 上海振华重工(集团)股份有限公司 Detection method and detection device for space attitude of suspender of container crane
FI125689B (en) * 2012-10-02 2016-01-15 Konecranes Global Oy Handling a load with a load handler
CN204675650U (en) * 2015-03-18 2015-09-30 苏州盈兴信息技术有限公司 A kind of production material storing flow diagram is as autotracker


Also Published As

Publication number Publication date
CN111032561B (en) 2021-04-09
JP6689467B2 (en) 2020-04-28
JPWO2019049511A1 (en) 2020-05-28
CN112938766A (en) 2021-06-11
CN112938766B (en) 2023-08-15
WO2019049511A1 (en) 2019-03-14

Similar Documents

Publication Publication Date Title
CN111032561B (en) Crane device
JP3785061B2 (en) Container position detection method and apparatus for cargo handling crane, container landing and stacking control method
KR101699672B1 (en) Method and system for automatically landing containers on a landing target using a container crane
JP6167179B2 (en) Load handling using load handling equipment
EP3613699A1 (en) Inspection system for container
KR100624008B1 (en) Auto landing system and the method for control spreader of crane
EP3418244B1 (en) Loading a container on a landing target
KR101089265B1 (en) Container docking system, container crane, and container docking method
JP2018172188A (en) Container terminal and method for operating the same
CN112678688B (en) Crane device
KR20110066764A (en) Spreader control system of crane for container
CN112996742A (en) Crane system, crane positioning device and crane positioning method
KR20050007241A (en) Absolute-position detection method and algorithm of spreader for the auto-landing of containers
WO2020184025A1 (en) Crane and method for loading with crane
JPH09267990A (en) Device for detecting position of hoisted load of rope suspension type crane
JPH09328291A (en) Automatic coiling direction recognition method
WO2020111062A1 (en) Container crane remote operation system and remote operation method
Lim et al. A Visual Measurement System for Coil Shipping Automation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant