CN113788332B - Unloading device - Google Patents

Info

Publication number
CN113788332B
CN113788332B (application CN202111088722.3A)
Authority
CN
China
Prior art keywords
edge
point
unit
measurement
hatch coaming
Prior art date
Legal status
Active
Application number
CN202111088722.3A
Other languages
Chinese (zh)
Other versions
CN113788332A (en)
Inventor
久保谅太郎
坂野肇
香月良夫
阿久根圭
水崎纪彦
Current Assignee
IHI Corp
IHI Transport Machinery Co Ltd
Original Assignee
IHI Corp
IHI Transport Machinery Co Ltd
Priority date
Filing date
Publication date
Priority claimed from JP2018206073A (JP7129314B2)
Application filed by IHI Corp and IHI Transport Machinery Co Ltd
Publication of CN113788332A
Application granted
Publication of CN113788332B
Legal status: Active

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B65 CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65G TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G43/00 Control devices, e.g. for safety, warning or fault-correcting
    • B65G67/00 Loading or unloading vehicles
    • B65G67/60 Loading or unloading ships
    • B65G67/606 Loading or unloading ships using devices specially adapted for bulk material
    • B65G69/00 Auxiliary measures taken, or devices used, in connection with loading or unloading
    • B65G69/04 Spreading out the materials conveyed over the whole surface to be loaded; Trimming heaps of loose materials
    • B65G2201/00 Indexing codes relating to handling devices, e.g. conveyors, characterised by the type of product or load being conveyed or handled
    • B65G2201/04 Bulk
    • B65G2203/00 Indexing code relating to control or detection of the articles or the load carriers during conveying
    • B65G2203/04 Detection means
    • B65G2203/042 Sensors
    • B65G2203/044 Optical

Abstract

The unloading device (100) is provided with a scooping unit (112) that scoops up the cargo (6) in the cabin (5), and distance measuring sensors (133-136) whose measurement ranges respectively include the traveling direction side of the scooping unit (112) and the side opposite to the traveling direction side. The unloading device (100) may further comprise a display unit (230) that displays the measurement results of the distance measuring sensors (133, 135) disposed on the side surface of the scooping unit (112) on the traveling direction side and the measurement results of the distance measuring sensors (134, 136) disposed on the side surface opposite to the traveling direction side. An unloading device (100) according to another embodiment is provided with a main body having a scooping unit (112) inserted into the cabin (5), ranging sensors (130-132) which are arranged on the main body and can measure distance downward, and an edge detection unit (152) which detects the edge of the upper end of a hatch coaming (7) arranged at the upper part of the cabin (5) using a plurality of measurement points measured by the ranging sensors (130-132).

Description

Unloading device
The present application is a divisional application of the invention patent application No. 2019800056455, filed on January 31, 2019 and entitled "Unloading device".
Technical Field
The present disclosure relates to an unloading device. The present application claims the benefit of priority based on Japanese Patent Application No. 2018-017504 filed on February 2, 2018 and Japanese Patent Application No. 2018-206073 filed on October 31, 2018, the contents of which are incorporated herein by reference.
Background
The unloading device carries the cargo loaded in a ship's cabin out of the cabin. A continuous unloader is one example of such an unloading device. In many cases it is difficult or impossible for the operator of the unloading device to directly observe the state of the cargo, the distance to the wall surface of the cabin, and the like. A technique has therefore been developed in which a sensor is attached to the scooping portion of the unloading device to measure the distance to the wall surface of the cabin (for example, Patent Document 1).
Prior art literature
Patent literature
Patent document 1: japanese patent laid-open No. 8-012094
Disclosure of Invention
Problems to be solved by the invention
In the technique described in patent document 1, it is difficult to grasp the condition of cargo in the cabin.
In view of the above problem, an object of the present disclosure is to provide an unloading device that allows the scooping state of the cargo to be grasped even from outside the cabin.
Means for solving the problems
In order to solve the above problem, an unloading device according to an aspect of the present disclosure includes: a scooping unit that scoops up the cargo in the cabin; and distance measuring sensors whose measurement ranges respectively include the traveling direction side of the scooping unit and the side opposite to the traveling direction side.
Preferably, the unloading device includes a display unit that displays a measurement result of the distance measuring sensor disposed on a side surface of the shovel unit on the traveling direction side and a measurement result of the distance measuring sensor disposed on a side surface opposite to the traveling direction side.
Preferably, the distance measuring sensor is capable of measuring distance toward the lower side.
The distance measuring sensor is preferably capable of measuring, in a direction orthogonal to the traveling direction, a range equal to or longer than the length over which the scooping unit can scoop the cargo.
Preferably, the display unit displays a first image showing the scooping unit, the cargo on the traveling direction side of the scooping unit based on the measurement result of the distance measuring sensor arranged on the side surface on the traveling direction side, and the cargo on the side opposite to the traveling direction side based on the measurement result of the distance measuring sensor arranged on the side surface on the opposite side.
Preferably, the display unit displays a second image showing the cargo on the travel direction side of the shovel unit and the shovel unit based on the measurement result of the distance measuring sensor disposed on the travel direction side of the shovel unit.
Preferably, the display unit displays a third image showing the cargo and the scooping unit on the side opposite to the traveling direction side based on the measurement result of the distance measuring sensor arranged on the side opposite to the traveling direction side.
Preferably, the display unit displays the depth of penetration of the cargo formed by the scooping unit based on the measurement result of the distance measuring sensor disposed on the side surface of the scooping unit on the traveling direction side and the measurement result of the distance measuring sensor disposed on the side surface opposite to the traveling direction side.
The distance measuring sensor is preferably disposed in the scooping section or in a vertical conveying mechanism section that holds the scooping section.
In order to solve the above problems, an unloading device according to an aspect of the present disclosure includes: a main body part provided with a shovel part inserted into the cabin; a distance measuring sensor which is disposed in the main body and can measure distance toward the lower side; and an edge detection unit that detects an edge of an upper end of a hatch coaming provided in an upper portion of the cabin, using a plurality of measurement points measured by the distance measurement sensor.
The edge detection unit preferably includes: a direction determination unit that determines a direction between the measurement point and an adjacent measurement point; a grouping unit that groups the measurement points into clusters based on the angle difference in the direction determined by the direction determining unit; and an edge point extraction unit that extracts edge points of the hatch coaming based on the end points of the clustered clusters.
Preferably, the distance measuring sensor measures measurement points of each of the plurality of measurement lines as measurement point groups, and the edge point extracting unit extracts edge points for each of the measurement point groups.
The grouping unit preferably determines continuity between measurement points measured successively by the distance measuring sensor based on the direction between the measurement points, extracts measurement points having continuity as a cluster, and extracts the end points of the cluster as continuous end points. The edge point extraction unit preferably extracts, from among the continuous end points, the continuous end point closest in horizontal distance to the main body portion as an edge candidate point, which is a candidate for an edge point, and extracts an edge point based on the edge candidate point. The edge detection unit preferably includes an edge derivation unit that derives the edge of the upper end of the hatch coaming based on the edge points extracted by the edge point extraction unit.
Preferably, the edge point extraction unit derives the direction of the continuous point group including the edge candidate point, and, when that direction is close to the horizontal direction, extracts, as the edge point, the measurement point closest in horizontal distance to the main body portion from among the measurement points on the same plane as the edge candidate point.
Preferably, the edge point extraction unit derives the direction of the continuous point group including the edge candidate point, and, when that direction is close to the vertical direction, extracts, as the edge point, the measurement point highest in the vertical direction from among the measurement points on the same plane as the edge candidate point.
Preferably, the unloading device further includes a coordinate transformation deriving unit that derives edge side information on each side of the edge at the upper end of the hatch coaming based on the plurality of edge points detected by the edge detecting unit, and derives transformation parameters of the coordinate system of the unloading device and the coordinate system of the cabin based on the derived edge side information.
The coordinate transformation deriving unit preferably associates a straight line of the upper edge of the hatch coaming in the edge side information with the upper edge of the three-dimensional model of the hatch coaming based on the posture of the unloading device, and derives the transformation parameter based on the positional relationship between the straight line of the edge and the upper edge after the association.
Preferably, the coordinate transformation deriving unit derives the transformation parameter by representing a straight line of the upper edge of the hatch coaming based on the edge side information by using a three-dimensional point group and minimizing a sum of values obtained based on a distance between the three-dimensional point group and the upper edge of the hatch coaming in the three-dimensional model.
Advantageous Effects of Invention
Even from outside the cabin, the scooping state of the cargo can be grasped.
Drawings
Fig. 1 is a diagram illustrating an unloading system.
Fig. 2 is a view illustrating the structure of the unloading device.
Fig. 3 is a diagram illustrating a measurement range of the ranging sensor.
Fig. 4 is a diagram illustrating a measurement range of the ranging sensor.
Fig. 5 is a diagram illustrating a measurement range of the ranging sensor.
Fig. 6 is a diagram illustrating a measurement range of the ranging sensor.
Fig. 7 is a diagram illustrating an electrical configuration of the unloading system in the first embodiment.
Fig. 8A is a diagram illustrating a coordinate system of the unloading device.
Fig. 8B is a diagram illustrating the coordinate system of the unloading device.
Fig. 9 is a diagram illustrating measurement points of the ranging sensor.
Fig. 10 is a diagram showing a state in which edge points are detected.
Fig. 11A, 11B, and 11C are diagrams illustrating the arrangement of the three-dimensional model.
Fig. 12 is a diagram illustrating an upper viewpoint image.
Fig. 13 is a diagram illustrating an image around the shovel.
Fig. 14A and 14B are diagrams illustrating an automatic path.
Fig. 15 is a diagram illustrating an electrical configuration of the unloading system in the second embodiment.
Fig. 16 is a diagram illustrating a functional configuration of the edge detection section.
Fig. 17 is a flowchart showing the flow of the continuous end point extraction processing by the continuous end point extraction unit and the edge point extraction processing by the edge point extraction unit.
Fig. 18 is a diagram illustrating a continuous endpoint extraction process.
Fig. 19 is a diagram illustrating extraction of edge candidate points.
Detailed Description
Hereinafter, an embodiment of the present disclosure will be described in detail with reference to the accompanying drawings. The dimensions, materials, other specific numerical values, and the like shown in such embodiments are merely examples for easy understanding, and are not limiting of the present disclosure unless specifically described. In the present specification and the drawings, elements having substantially the same functions and structures are denoted by the same reference numerals, and repetitive description thereof will be omitted, and elements not directly related to the present disclosure will be omitted.
< first embodiment >
Fig. 1 is a diagram illustrating the unloading system 1. As shown in fig. 1, the unloading system 1 includes one or more unloading devices 100 and a control device 200. Although an example in which four unloading devices 100 are provided is described here, the number of unloading devices 100 is arbitrary.
The unloading device 100 is capable of traveling on a pair of rails 3 laid along the quay wall 2 in the extending direction of the rails 3. In fig. 1, the plurality of discharge devices 100 are arranged on the same rail 3, but may be arranged on different rails 3.
The unloading device 100 is communicably connected to the control device 200. The communication method between the unloading device 100 and the control device 200 may be wired or wireless.
The unloading device 100 carries out the cargo 6 loaded in the hold 5 of the ship 4 moored to the quay wall 2 to the outside. The cargo 6 is assumed to be bulk cargo, and coal is exemplified.
Fig. 2 is a diagram illustrating the structure of the unloading device 100. In fig. 2, the quay wall 2 and the ship 4 are shown in cross section. As shown in fig. 2, the unloading device 100 includes a traveling body 102, a revolving body 104, a boom 106, a top frame 108, a lift 110, a shovel 112, and a boom conveyor 114. The top frame 108, the elevator 110, and the scooping portion 112 function as a main body portion including the scooping portion 112 inserted into the cabin 5.
By driving a driver, not shown, the traveling body 102 can travel on the track 3. The traveling body 102 is provided with a position sensor 116. The position sensor 116 is, for example, a rotary encoder. The position sensor 116 measures the position of the traveling body 102 on the horizontal plane with respect to a predetermined origin position based on the rotational speed of the wheels of the traveling body 102.
The revolving unit 104 is provided rotatably about a vertical axis at an upper portion of the traveling body 102. The turning body 104 can turn with respect to the traveling body 102 by driving a driver not shown.
The boom 106 is provided at an upper portion of the revolving unit 104 so as to be capable of changing an inclination angle. By driving a not-shown actuator, the boom 106 can change the inclination angle with respect to the revolving unit 104.
The rotation body 104 is provided with a rotation angle sensor 118 and an inclination angle sensor 120. The rotation angle sensor 118 and the inclination angle sensor 120 are, for example, rotary encoders. The rotation angle sensor 118 measures the rotation angle of the rotation body 104 with respect to the traveling body 102. Inclination angle sensor 120 measures an inclination angle of boom 106 with respect to revolving unit 104.
A top frame 108 is provided at the front end of the cantilever 106. The top frame 108 is provided with a driver for rotating the elevator 110.
The lifter 110 is formed in a substantially cylindrical shape. The lifter 110 is rotatably supported by the top frame 108 about its central axis. A pivot angle sensor 122 is provided in the top frame 108. The pivot angle sensor 122 is, for example, a rotary encoder. The pivot angle sensor 122 measures the pivot angle of the lifter 110 with respect to the top frame 108.
The scooping portion 112 is provided at the lower end of the elevator 110. The scooping portion 112 rotates integrally with the elevator 110 as the elevator 110 rotates. In this way, the scooping portion 112 is rotatably held by the top frame 108 and the lifter 110 functioning as the vertical conveyance mechanism portion.
The shovel 112 includes a plurality of buckets 112a and a chain 112b. The plurality of buckets 112a are arranged in series in the chain 112b. The chain 112b is installed inside the shovel 112 and the lifter 110.
The scooping portion 112 is provided with a link mechanism, not shown. The link mechanism is movable, so that the length of the bottom of the scooping portion 112 is variable. Thereby, the scooping portion 112 changes the number of the buckets 112a that are in contact with the cargo 6 in the hold 5. The scooping portion 112 scoops the cargo 6 in the hold 5 by the bucket 112a at the bottom by rotating the chain 112b. The bucket 112a that scoops the load 6 moves upward of the elevator 110 with the rotation of the chain 112b.
A boom conveyor 114 is provided below the boom 106. The boom conveyor 114 carries out the load 6 moved to the upper portion of the elevator 110 by the bucket 112 a.
The unloading device 100 having such a configuration is moved along the extending direction of the rail 3 by the traveling body 102 to adjust the relative positional relationship with the ship 4 in the longitudinal direction. Further, the unloading device 100 rotates the boom 106, the top frame 108, the lifter 110, and the scooping portion 112 by the rotation body 104, thereby adjusting the relative positional relationship with the ship 4 in the short side direction. Further, the unloading device 100 moves the top frame 108, the elevator 110, and the shovel 112 in the vertical direction by the boom 106, and adjusts the relative positional relationship with the ship 4 in the vertical direction. The unloading device 100 rotates the elevator 110 and the shovel 112 by the top frame 108. Thus, the unloading device 100 can move the shovel 112 to an arbitrary position and angle.
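To illustrate how these adjustments combine, the following sketch computes an approximate position of the scooping portion 112 from the travel distance, the turning angle, and the boom inclination angle. The link lengths and all function and variable names are hypothetical and are not taken from the patent; the sketch only reflects the general geometry described above.

```python
import math

def shovel_position(travel_y, turn_angle_deg, boom_incline_deg,
                    boom_length=30.0, elevator_drop=20.0):
    """Hypothetical forward kinematics of the unloading device 100.

    travel_y         : travel of the traveling body 102 along the rail (Y axis) [m]
    turn_angle_deg   : turning angle of the revolving unit 104 about the vertical axis [deg]
    boom_incline_deg : inclination angle of the boom 106 relative to horizontal [deg]
    boom_length      : assumed boom length [m]
    elevator_drop    : assumed vertical length from the top frame 108 to the scooping portion 112 [m]

    Returns (x, y, z) of the scooping portion in the above-ground coordinate system
    (X orthogonal to the rail, Y along the rail, Z vertical).
    """
    turn = math.radians(turn_angle_deg)
    incline = math.radians(boom_incline_deg)

    # Horizontal reach and lift of the boom tip, i.e. of the top frame 108.
    reach = boom_length * math.cos(incline)
    lift = boom_length * math.sin(incline)

    # The revolving unit rotates the boom about the vertical axis of the traveling body.
    x = reach * math.cos(turn)
    y = travel_y + reach * math.sin(turn)

    # The elevator 110 hangs down from the top frame; the scooping portion is at its lower end.
    z = lift - elevator_drop
    return x, y, z

print(shovel_position(travel_y=12.0, turn_angle_deg=35.0, boom_incline_deg=10.0))
```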
Here, the vessel 4 is divided into a plurality of cabins 5. A hatch coaming 7 is provided in the upper part of the hold 5. The hatch coaming 7 has a wall surface of a predetermined height in the vertical direction. The opening area of the hatch coaming 7 is smaller than the horizontal cross section near the center of the cabin 5. That is, the cabin 5 has a shape in which an opening is narrowed by the hatch coaming 7. A hatch cover 8 for opening and closing the hatch coaming 7 is provided above the hatch coaming 7.
Since the opening is narrowed by the hatch coaming 7 in this way, it is difficult for the operator to visually confirm the condition in the cabin 5 when the operator scoops the cargo 6 by the scooping section 112. Accordingly, the discharge device 100 of the present disclosure is provided with the distance measuring sensors 130 to 136. The unloading system 1 of the present disclosure displays the positional relationship between the unloading device 100 and the cabins 5 and the cargoes 6 based on the distances measured by the distance measuring sensors 130 to 136, and can allow the operator to grasp the situation in the cabins 5.
The distance measuring sensors 130 to 136 are, for example, laser sensors capable of measuring distance, such as the VLP-16 or VLP-32 manufactured by Velodyne or the M8 manufactured by Quanergy. Each of the distance measuring sensors 130 to 136 has, for example, 16 laser irradiation sections spaced apart in the axial direction on the side surface of a cylindrical main body. The laser irradiation sections are provided in the main body so as to be rotatable through 360 degrees. Adjacent laser irradiation sections differ in laser emission angle by 2 degrees in the axial direction. That is, the distance measuring sensors 130 to 136 can irradiate laser light over a range of 360 degrees in the circumferential direction of the main body, and can emit laser light within ±15 degrees of a plane orthogonal to the axial direction of the main body. The main body of each of the distance measuring sensors 130 to 136 is also provided with a receiving section that receives the reflected laser light.
The distance measuring sensors 130 to 136 irradiate laser light at a predetermined angle while rotating the laser irradiation part. The distance measuring sensors 130 to 136 receive the laser beams irradiated (projected) from the plurality of laser irradiation units and reflected by the object (measurement point), respectively, by the reception units. The distance measuring sensors 130 to 136 derive the distance to the object based on the time from the irradiation of the laser light to the reception. That is, the distance measuring sensors 130 to 136 measure a plurality of measurement points on one measurement line (a track of the irradiated laser light or a track of an object reflecting the laser light in a cross section formed by the irradiated laser light) by one laser light irradiation unit. The distance measuring sensors 130 to 136 measure a plurality of measurement points on a plurality of measurement lines by a plurality of laser irradiation units, respectively.
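As a rough sketch of this measurement principle, the snippet below converts a round-trip time on one laser channel into a range and then into a three-dimensional point in the sensor's own frame, using the 16-channel, 2-degree-spacing layout described above. The function names and the example time-of-flight value are illustrative assumptions.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

# 16 laser irradiation sections spaced 2 degrees apart, covering -15 .. +15 degrees.
CHANNEL_ELEVATIONS_DEG = [-15 + 2 * i for i in range(16)]

def range_from_time_of_flight(t_seconds):
    """Distance to the measurement point from the round-trip time of the laser pulse."""
    return SPEED_OF_LIGHT * t_seconds / 2.0

def point_in_sensor_frame(channel, azimuth_deg, t_seconds):
    """Three-dimensional position of one measurement point in the sensor's own frame.

    channel     : index of the laser irradiation section (0..15)
    azimuth_deg : rotation angle of the irradiation section about the sensor axis
    t_seconds   : time from irradiation of the laser to its reception
    """
    r = range_from_time_of_flight(t_seconds)
    elev = math.radians(CHANNEL_ELEVATIONS_DEG[channel])
    azim = math.radians(azimuth_deg)
    # Spherical to Cartesian; z is along the sensor's central axis.
    x = r * math.cos(elev) * math.cos(azim)
    y = r * math.cos(elev) * math.sin(azim)
    z = r * math.sin(elev)
    return (x, y, z)

# One 360-degree sweep of one channel gives the measurement points of one measurement line.
line = [point_in_sensor_frame(channel=0, azimuth_deg=a, t_seconds=8e-8)
        for a in range(0, 360, 2)]
print(len(line), line[0])
```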
Fig. 3 and 4 are diagrams illustrating measurement ranges of the ranging sensors 130 to 132. Fig. 3 is a diagram illustrating the measurement ranges of the ranging sensors 130 to 132 when the unloading device 100 is viewed from above. Fig. 4 is a diagram illustrating the measurement ranges of the ranging sensors 130 to 132 when the unloading device 100 is viewed from the side. In fig. 3 and 4, the measurement ranges of the ranging sensors 130 to 132 are shown by single-dot chain lines.
The distance measuring sensors 130 to 132 are mainly used for detecting the edge of the upper end of the hatch coaming 7. As shown in fig. 3 and 4, the distance measuring sensors 130 to 132 are mounted on the side surface of the top frame 108. Specifically, the distance measuring sensors 130 to 132 are disposed at 120 degrees apart from each other in the circumferential direction with reference to the central axis of the elevator 110. The distance measuring sensors 130 to 132 are arranged such that the central axis of the body portion is along the radial direction of the lifter 110. The upper half of the range sensors 130 to 132 in the vertical direction is covered with a cover, not shown.
Therefore, as shown in fig. 3 and 4, the distance measuring sensors 130 to 132 can measure the distance to an object whose measurement direction is below the horizontal plane and within ±15 degrees of a plane tangent to the side surface of the top frame 108.
Fig. 5 and 6 are diagrams illustrating measurement ranges of the ranging sensors 133 to 136. Fig. 5 is a diagram illustrating the measurement ranges of the ranging sensors 133 to 136 when the shovel 112 is viewed from above. In fig. 5, only the scooping portion 112 of the unloading device 100 is shown. Fig. 5 shows a horizontal cross section of the ship 4 at the same position as the scooping portion 112 in the vertical direction. Fig. 6 is a diagram illustrating the measurement ranges of the ranging sensors 133 to 136 when the unloading device 100 is viewed from the side. In fig. 5 and 6, the measurement ranges of the distance measuring sensors 133 and 134 are shown by single-dot chain lines. In fig. 5 and 6, the measurement ranges of the distance measuring sensors 135 and 136 are shown by two-dot chain lines.
The distance measuring sensors 133 to 136 are mainly used for detecting the cargo 6 in the cabin 5 and the wall surface of the cabin 5. As shown in fig. 5 and 6, the distance measuring sensors 133 and 134 are attached to the side surfaces 112c and 112d of the shovel 112, respectively. The distance measuring sensors 133 and 134 are disposed such that the central axes of the body portion are orthogonal to the side surfaces 112c and 112d of the shovel 112, respectively. The upper half portions of the ranging sensors 133 and 134 in the vertical direction are covered with covers not shown.
Accordingly, the distance measuring sensors 133 and 134 can measure the distance to an object whose measurement direction is below the side surfaces 112c and 112d of the shovel 112 and within ±15 degrees of a plane parallel to the side surfaces 112c and 112d of the shovel 112. More specifically, the distance measuring sensors 133 and 134 can measure the distance to an object (the cargo 6) located on the bottom side of the shovel 112 and on both sides of the shovel 112. The distance measuring sensors 133 and 134 are arranged so as to be able to measure a range equal to or longer than the maximum length of the bottom of the shovel 112 on the plane on which the bottom of the shovel 112 lies.
The distance measuring sensors 135 and 136 are attached to the side surface 112c and the side surface 112d of the shovel 112, respectively. The distance measuring sensors 135 and 136 are disposed such that the central axis of the main body is perpendicular to the bottom surface of the shovel 112.
Accordingly, the distance measuring sensors 135 and 136 can measure the distance to an object whose measurement direction is outside the shovel 112 and within ±15 degrees of a horizontal plane orthogonal to the side surfaces 112c and 112d of the shovel 112.
Fig. 7 is a diagram illustrating an electrical configuration of the unloading system 1 in the first embodiment. As shown in fig. 7, the unloading device 100 is provided with a discharge control unit 140, a storage unit 142, and a communication device 144.
The discharge control unit 140 is connected to the position sensor 116, the rotation angle sensor 118, the inclination angle sensor 120, the pivot angle sensor 122, the distance measuring sensors 130 to 136, the storage unit 142, and the communication device 144. The discharge control unit 140 is constituted by a semiconductor integrated circuit including a CPU (central processing unit). The discharge control unit 140 reads programs, parameters, and the like for operating the CPU from a ROM, and manages and controls the entire unloading device 100 in cooperation with a RAM serving as a work area and other electronic circuits. The discharge control unit 140 functions as a drive control unit 150, an edge detection unit 152, a coordinate transformation deriving unit 154, a model arrangement unit 156, a state monitoring unit 158, a route generating unit 160, an automatic operation command unit 162, an automatic operation end determination unit 164, and an anti-collision unit 166. The discharge control unit 140 is described in detail below.
The storage unit 142 is a storage medium such as a hard disk or a nonvolatile memory. The storage unit 142 stores data of three-dimensional models of the unloading device 100 and the ship 4. The data of the three-dimensional model of the unloading device 100 is voxel (three-dimensional pixel) data of at least the outer shapes of the elevator 110 and the shovel 112. The data of the three-dimensional model of the ship 4 is voxel data of the outer shape of the hatch coaming 7 and of the wall shape and internal space of the cabin 5. The three-dimensional model data may be any data from which the three-dimensional shapes of the unloading device 100 and the ship 4 can be grasped, such as polygon data, contours (straight lines), or point groups, and these forms may also be used in combination. The data of the three-dimensional model of the ship 4 is prepared for each type of ship 4.
The data of the three-dimensional model of the unloading device 100 can be calculated from the shape information at the time of design and the measurement results of the position sensor 116, the rotation angle sensor 118, the inclination angle sensor 120, and the pivot angle sensor 122 of the unloading device 100. The data of the three-dimensional model of the ship 4 may use the design data of the ship, or data measured during a past port call. The measurement at the time of port entry can be performed using a device capable of generating three-dimensional model data, such as a laser sensor. The three-dimensional model data may also be restored by accumulating information from the ranging sensors 130 to 136.
The communication device 144 communicates with the control device 200 by wire or wirelessly.
The control device 200 includes a monitoring control unit 210, an operation unit 220, a display unit 230, and a communication device 240. The monitoring control unit 210 is constituted by a semiconductor integrated circuit including a CPU (central processing unit). The monitoring control unit 210 reads programs, parameters, and the like for operating the CPU from a ROM, and manages and controls the plurality of unloading devices 100 in a unified manner in cooperation with a RAM serving as a work area and other electronic circuits. The monitoring control unit 210 functions as a remote operation switching unit 212, a display switching unit 214, and a situation determination unit 216. The monitoring control unit 210 is described in detail below.
The operation unit 220 receives input operations for operating the unloading device 100. As will be described in detail below, the display unit 230 displays images from which the operator can grasp the relative positional relationship between the unloading device 100, the cabin 5, and the cargo 6. The communication device 240 communicates with the unloading device 100 by wire or wirelessly.
Fig. 8A and 8B are diagrams illustrating the coordinate system of the unloading device 100. As shown in fig. 8A and 8B, the unloading device 100 has three coordinate systems, namely, an above-ground coordinate system 300, a top frame coordinate system 310, and a hatch coaming coordinate system 320.
The above-ground coordinate system 300 sets the preset initial position of the discharge device 100 as the origin. The above-ground coordinate system 300 sets a direction orthogonal to the extending direction and the vertical direction of the rail 3 as the X-axis direction. The above-ground coordinate system 300 sets the extending direction of the rail 3 as the Y-axis direction. The above-ground coordinate system 300 sets the vertical direction as the Z-axis direction.
The top frame coordinate system 310 sets the lower end of the top frame 108 in the vertical direction, which is located on the central axis of the elevator 110, as the origin. The top frame coordinate system 310 sets the extending direction of the cantilever 106 as the X-axis direction. The top frame coordinate system 310 sets a direction orthogonal to the extending direction and the vertical direction of the cantilever 106 as a Y-axis direction. The top frame coordinate system 310 sets the vertical direction as the Z-axis direction.
The hatch coaming coordinate system 320 sets the upper end of the hatch coaming 7 located at the center position of the stern-side wall surface of the hatch coaming 7 of the ship 4 as the origin. The hatch coaming coordinate system 320 sets the longitudinal direction of the ship 4, that is, the extending direction of the hatch coaming 7 along the ship 4, as the X-axis direction. The hatch coaming coordinate system 320 sets the short side direction (width direction) of the ship 4 as the Y-axis direction. The hatch coaming coordinate system 320 sets a direction orthogonal to the upper end surface of the hatch coaming 7 as the Z-axis direction.
Here, the above-ground coordinate system 300 and the top frame coordinate system 310 can be transformed into each other based on the shape of the unloading device 100 and the movement of the unloading device 100.
For example, since the distance measuring sensors 133 to 136 are attached to the shovel 112, their positions relative to the shovel 112 are known in advance, and their positions in the top frame coordinate system 310 can be derived based on the pivot angle of the lifter 110.
Further, since the distance measuring sensors 130 to 132 are mounted on the top frame 108, their positions in the top frame coordinate system 310 are known in advance.
Here, the relative positional relationship between the top frame coordinate system 310 and the hatch coaming coordinate system 320 changes with the movement of the unloading device 100 and the ship 4. For example, since the ship 4 is swayed and the ship 4 moves in the vertical direction due to the tide or the load of the cargo 6, the relative positional relationship between the top frame coordinate system 310 and the hatch coaming coordinate system 320 changes.
Therefore, the edge detection unit 152 detects the edge of the upper end of the hatch coaming 7 based on the measurement points measured by the ranging sensors 130 to 132. The coordinate transformation deriving unit 154 derives transformation parameters of the top frame coordinate system 310 and the hatch coaming coordinate system 320 based on the detected edge of the upper end of the hatch coaming 7.
First, the edge detection unit 152 derives the three-dimensional position of the measurement point in the top frame coordinate system 310 based on the positions of the ranging sensors 130 to 132 and the distances between the measurement points measured by the ranging sensors 130 to 132.
Fig. 9 is a diagram illustrating measurement points of the ranging sensors 130 to 132. In fig. 9, the measurement ranges of the distance measuring sensors 130 to 132 on the hatch coaming 7 are shown as thick lines. As shown in fig. 9, the distance measuring sensors 130 to 132 measure the distance to an object located below the horizontal plane and within ±15 degrees of the plane tangent to the top frame 108 at each sensor. Therefore, the edges of the hatch coaming 7 on the front side and on the rear side with respect to the point vertically below the distance measuring sensors 130 to 132 (the rotation center of the elevator 110) fall within the measurement range of the distance measuring sensors 130 to 132. The front side refers to the measurement range measured in the first half of one measurement; the rear side refers to the measurement range measured in the latter half of one measurement.
Therefore, the measurement points measured by the distance measuring sensors 130 to 132 are divided into two parts on the front side and the rear side with reference to the vertically lower sides of the distance measuring sensors 130 to 132.
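A minimal sketch of this step is shown below: sensor-frame points are expressed in the top frame coordinate system 310 using the known mounting pose of each sensor, and one 360-degree sweep is split into its front-side and rear-side halves. The mounting pose values and helper names are assumptions for illustration.

```python
import numpy as np

def to_top_frame(points_sensor, R_mount, t_mount):
    """Transform (N, 3) sensor-frame points into the top frame coordinate system 310.

    R_mount : 3x3 rotation of the sensor relative to the top frame (known from the mounting)
    t_mount : position of the sensor in the top frame coordinate system
    """
    return points_sensor @ R_mount.T + t_mount

def split_front_rear(points_in_sweep_order):
    """Split one 360-degree sweep into its front-side and rear-side halves.

    The front side is the range measured in the first half of one measurement and the
    rear side the range measured in the latter half, as described in the text above.
    """
    n = len(points_in_sweep_order)
    return points_in_sweep_order[: n // 2], points_in_sweep_order[n // 2:]

# Hypothetical mounting pose of ranging sensor 130 on the side surface of the top frame 108.
R_mount = np.eye(3)                   # assumed: sensor axes aligned with the top frame axes
t_mount = np.array([2.0, 0.0, -1.0])  # assumed: 2 m radially outward, 1 m below the origin

sweep = np.random.rand(720, 3) * 10.0  # stand-in for one sweep of sensor-frame points
front, rear = split_front_rear(to_top_frame(sweep, R_mount, t_mount))
print(front.shape, rear.shape)
```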
Fig. 10 is a diagram showing a state in which edge points are detected. In fig. 10, the measurement points are shown as black dots. Fig. 10 shows measurement points reflected by laser light irradiated from one laser irradiation section of the ranging sensors 130 to 132 at a predetermined angle.
The edge detection unit 152 performs the following processing for each measurement point group (front side and rear side, respectively) of one measurement line irradiated and measured by one laser irradiation section. The edge detection unit 152 derives a vector (direction) for each measured point. The vector of a measurement point is the direction from that measurement point to the next measurement point among the successively measured measurement points.
The edge detection unit 152 extracts the measurement points whose vectors point in the vertical direction. This is because the wall surface (side surface) of the hatch coaming 7 measured by the ranging sensors 130 to 132 extends substantially in the vertical direction, so that when a measurement point lies on the wall surface of the hatch coaming 7, its vector points in the vertical direction.
When there are a plurality of continuously extracted measurement points among the extracted measurement points, the edge detection unit 152 extracts the uppermost point in the vertical direction. This is because: in order to detect the edge of the upper end of the hatch coaming 7, the uppermost point may be the edge of the upper end of the hatch coaming 7 among the continuously measured measurement point group.
Next, the edge detection unit 152 extracts the measurement point closest to the origin in the X-axis direction and the Y-axis direction in the top frame coordinate system 310 from among the extracted measurement points. This is because: the hatch coaming 7 is located at a position closest to the elevator 110 among the structures of the ship 4.
Then, the edge detection unit 152 extracts the measurement points located within a predetermined range (for example, a range of several tens cm) in the X-axis direction and the Y-axis direction in the top frame coordinate system 310 again for the extracted measurement points. Here, the measurement points on the hatch coaming 7 are extracted.
Then, the edge detection unit 152 extracts, as the edge point of the hatch coaming 7, the upper-most measurement point in the vertical direction among the re-extracted measurement points, that is, the measurement points on the hatch coaming 7.
The edge detection unit 152 extracts edge points on the front side and the rear side for each of the measured point groups, that is, the measured point groups measured on the same surface, which are irradiated and measured by one laser irradiation unit of the range sensors 130 to 132.
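The per-measurement-line extraction described above can be sketched as follows. The thresholds (the cosine used to judge a vertical vector and the re-extraction window of a few tens of centimetres) and the function name are illustrative assumptions, not values given in the patent.

```python
import numpy as np

def extract_edge_point(points, vertical_cos=0.9, window=0.3):
    """Extract one edge point of the hatch coaming from one measurement point group.

    points       : (N, 3) points of one measurement line (front or rear side), in
                   measurement order, expressed in the top frame coordinate system (Z up).
    vertical_cos : threshold on |cos| between a point-to-point vector and the Z axis.
    window       : XY re-extraction window in metres ("several tens of cm" in the text).
    Both threshold values are illustrative assumptions.
    """
    pts = np.asarray(points, dtype=float)

    # Vector of each point = direction to the next measured point.
    vecs = pts[1:] - pts[:-1]
    vecs = vecs / (np.linalg.norm(vecs, axis=1, keepdims=True) + 1e-12)

    # Keep points whose vector is close to vertical (the wall surface of the coaming).
    idx = np.where(np.abs(vecs[:, 2]) > vertical_cos)[0]
    if idx.size == 0:
        return None

    # For each run of consecutive vertical points, keep only the uppermost point.
    runs = np.split(idx, np.where(np.diff(idx) > 1)[0] + 1)
    candidates = np.array([pts[r[np.argmax(pts[r, 2])]] for r in runs])

    # The hatch coaming is the structure closest to the elevator: take the candidate
    # closest to the top frame origin in the XY plane.
    nearest = candidates[np.argmin(np.linalg.norm(candidates[:, :2], axis=1))]

    # Re-extract all measurement points within a small XY window around that point,
    # then take the uppermost one as the edge point of the coaming.
    near = pts[np.all(np.abs(pts[:, :2] - nearest[:2]) < window, axis=1)]
    return near[np.argmax(near[:, 2])]
```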
When all the edge points are extracted, the edge detection unit 152 detects a straight line of the edge of the hatch coaming 7. Specifically, the edge detection unit 152 sets, as one group, edge points extracted on the front side of the ranging sensor 130. Similarly, the edge detection unit 152 sets the edge points extracted on the rear side of the distance measuring sensor 130 as one group. The edge detection unit 152 groups edge points extracted on the front and rear sides of the range sensors 131 and 132, respectively.
Here, as shown in fig. 9, when a corner of the hatch coaming 7 is included in the measurement range, two straight edge lines of the upper end of the hatch coaming 7 may be measured on the front side or the rear side of the ranging sensors 130 to 132.
Therefore, for each group, the edge detection unit 152 derives, as a candidate vector, the direction shared by the largest number of line segments between the extracted edge points. The edge detection unit 152 extracts the edge points within a predetermined range of the candidate vector, and then calculates a straight line again using the extracted edge points.
Next, the edge detection unit 152 repeats the above-described processing using the edge points that are not extracted. However, when the number of extracted edge points is smaller than a predetermined threshold value, no straight line is derived. Thus, even when the corner of the hatch coaming 7 is included, straight lines of both edges can be derived.
The edge detection unit 152 repeats the above-described processing for each group to derive a straight line of the edge.
Since at most two edge straight lines are detected per group, at most 12 straight lines are detected in total.
The edge detection unit 152 derives the included angle between each pair of detected straight lines. When the included angle is equal to or smaller than a predetermined threshold value, the edge detection unit 152 merges the two straight lines. Specifically, using the edge points constituting the straight lines whose included angle is equal to or smaller than the predetermined threshold value, a straight line is derived again by least-squares approximation.
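A sketch of this line extraction and merging procedure is given below, assuming the edge points of one group are available as three-dimensional coordinates. The candidate-direction test, the distance threshold, the minimum number of points, and the merging angle are all assumed values.

```python
import numpy as np

def fit_line(points):
    """Least-squares 3D line through a set of edge points: (centroid, unit direction)."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)   # principal direction via SVD
    return centroid, vt[0]

def extract_edge_lines(edge_points, dist_thresh=0.2, min_points=5):
    """Repeatedly extract straight edge lines from one group of edge points."""
    remaining = [np.asarray(p, dtype=float) for p in edge_points]
    lines = []
    while len(remaining) >= min_points:
        pts = np.array(remaining)
        segs = pts[1:] - pts[:-1]
        segs = segs / (np.linalg.norm(segs, axis=1, keepdims=True) + 1e-12)

        # Candidate vector: the segment direction shared by the most segments.
        counts = [np.sum(np.abs(segs @ s) > 0.98) for s in segs]
        k = int(np.argmax(counts))
        candidate, anchor = segs[k], pts[k]

        # Edge points close to the line through the anchor along the candidate vector.
        d = np.linalg.norm(np.cross(pts - anchor, candidate), axis=1)
        inliers = pts[d < dist_thresh]
        if len(inliers) < min_points:
            break                               # too few points: no straight line derived
        lines.append((*fit_line(inliers), inliers))
        remaining = list(pts[d >= dist_thresh]) # repeat with the edge points not extracted
    return lines                                # list of (centroid, direction, edge points)

def merge_similar_lines(lines, angle_thresh_deg=5.0):
    """Merge lines whose included angle is at or below the threshold by refitting one
    line to the union of their edge points (least-squares approximation)."""
    merged, used = [], set()
    for i, (_, di, pi) in enumerate(lines):
        if i in used:
            continue
        group = [pi]
        for j in range(i + 1, len(lines)):
            if j in used:
                continue
            dj = lines[j][1]
            angle = np.degrees(np.arccos(np.clip(abs(float(di @ dj)), 0.0, 1.0)))
            if angle <= angle_thresh_deg:
                group.append(lines[j][2])
                used.add(j)
        merged.append(fit_line(np.vstack(group)))
    return merged
```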
Next, the edge detection unit 152 derives edge side information including a three-dimensional direction vector of each side, a three-dimensional barycentric coordinate of each side, a length of each side, and coordinates of an end point of each side from the straight line of the detected edge. In this way, by deriving the edge information of the hatch coaming 7 provided at the upper portion of the cabin 5 using the ranging sensors 130 to 132 provided above the ship 4, the position (attitude) of the cabin 5 can be derived with high accuracy and ease.
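From the edge points belonging to one fitted straight line, the edge side information can be computed as in the following sketch; the dictionary layout and function name are illustrative.

```python
import numpy as np

def edge_side_info(edge_points):
    """Edge side information for one detected side of the hatch coaming upper end.

    edge_points : (N, 3) edge points belonging to one straight edge line, expressed
                  in the top frame coordinate system.
    """
    pts = np.asarray(edge_points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    direction = vt[0]                            # three-dimensional direction vector of the side

    # Project onto the direction to find the two end points of the side.
    t = (pts - centroid) @ direction
    p_min, p_max = pts[np.argmin(t)], pts[np.argmax(t)]
    return {
        "direction": direction,                  # 3D direction vector
        "centroid": centroid,                    # 3D barycentric coordinate
        "length": float(np.linalg.norm(p_max - p_min)),
        "endpoints": (p_min, p_max),
    }

demo = np.array([[0.0, 0.0, 10.0], [1.0, 0.02, 10.0], [2.0, 0.0, 10.0], [3.0, 0.01, 10.0]])
print(edge_side_info(demo)["length"])
```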
Next, the coordinate transformation deriving unit 154 reads the three-dimensional model information of the hatch coaming 7 stored in advance in the storage unit 142 from the storage unit 142. The three-dimensional model information includes a three-dimensional direction vector of the side of the upper end of the hatch coaming 7, a three-dimensional barycentric coordinate of each side, a length of each side, and coordinates of an end point of each side. And, the three-dimensional model information is represented by the hatch coaming coordinate system 320. The coordinate transformation deriving unit 154 derives transformation parameters of the top frame coordinate system 310 and the hatch coaming coordinate system 320 based on the read three-dimensional model information and the edge information (detection result) expressed by the top frame coordinate system 310.
The coordinate transformation deriving unit 154 first performs a rough correction by rotating the direction of the detected straight line of the edge of the hatch coaming 7 by the rotation angle of the boom 106. Then, the coordinate transformation deriving unit 154 associates each detected straight line of the edge of the hatch coaming 7 with the closest side of the upper end of the hatch coaming 7 in the three-dimensional model information. In this way a correct correspondence is established, and transformation parameters close to the correct solution can be obtained stably. In establishing the correspondence, the detected straight line of the edge of the hatch coaming 7 may be represented by a three-dimensional point group, and each point group may be associated with the side of the upper end of the hatch coaming 7 in the three-dimensional model information for which the average of the shortest distances is smallest. The correspondence may also be established by considering both the direction of the edge and the average of the shortest distances.
The coordinate transformation deriving unit 154 obtains the rotation angles α, β, γ about the X-axis, Y-axis, and Z-axis and the translation vector t = (tx, ty, tz), which are the transformation parameters, by, for example, the LM (Levenberg-Marquardt) method. In the LM method, for example, the sum of squares of the distances between the edge points and the upper edge sides of the hatch coaming 7 in the three-dimensional model information is used as an evaluation function, and the transformation parameters that minimize the evaluation function are solved for. Specifically, the transformation parameters are solved for so as to minimize the sum of the distances between the edge points and the upper edge sides of the hatch coaming 7 in the three-dimensional model information, or the area of the surface formed between the straight line of the edge and the upper edge side of the hatch coaming 7 in the three-dimensional model information. In other words, the coordinate transformation deriving unit 154 derives the transformation parameters that minimize the sum of values obtained from the distances between the edge points (three-dimensional point group) and the upper edge sides of the hatch coaming 7 in the three-dimensional model information. The method of solving for the transformation parameters is not limited to the LM method; other methods such as the steepest descent method or Newton's method may be used.
In this way, the coordinate transformation derivation unit 154 derives transformation parameters for transforming the top frame coordinate system 310 into the hatch coaming coordinate system 320.
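A sketch of this parameter estimation is shown below using scipy's Levenberg-Marquardt solver. The parameterisation as Euler angles plus a translation, the point-to-segment residual, and the assumption that each edge point has already been associated with one model edge follow the description above, but the concrete function names and interfaces are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def point_to_segment(p, a, b):
    """Distance from point p to the segment a-b (one upper edge side of the 3D model)."""
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

def residuals(params, edge_points, model_edges, correspondence):
    """Distances between transformed edge points and their associated model edge sides.

    params         : (alpha, beta, gamma, tx, ty, tz) - rotations about X, Y, Z and translation
    edge_points    : (N, 3) edge points in the top frame coordinate system
    model_edges    : list of (a, b) segments of the hatch coaming upper edge (hatch coaming coords)
    correspondence : index of the model edge associated with each edge point
    """
    R = Rotation.from_euler("xyz", params[:3]).as_matrix()
    t = params[3:]
    transformed = edge_points @ R.T + t
    return np.array([point_to_segment(p, *model_edges[k])
                     for p, k in zip(transformed, correspondence)])

def solve_transform(edge_points, model_edges, correspondence, initial=np.zeros(6)):
    """Levenberg-Marquardt fit of the top frame -> hatch coaming transformation parameters.

    The "lm" method needs at least six residuals (edge points).
    """
    result = least_squares(residuals, initial, method="lm",
                           args=(edge_points, model_edges, correspondence))
    return result.x   # (alpha, beta, gamma, tx, ty, tz)
```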
Thus, the unloading device 100 can grasp the relative positional relationship between the elevator 110 and the shovel 112, which are represented by the top frame coordinate system 310, and the cabin 5 and the hatch coaming 7, which are represented by the hatch coaming coordinate system 320.
Further, the unloading device 100 can easily derive the positional relationship between the unloading device 100 and the hold 5 by a simple configuration in which only the ranging sensors 130 to 132 capable of ranging toward the lower side are arranged on the side surface of the top frame 108.
Further, the unloading apparatus 100 can estimate the position and posture of the hatch coaming 7 expressed by the hatch coaming coordinate system 320 in the top frame coordinate system 310.
In addition, with only two distance measuring sensors, two edge sides having different directions may not be measurable depending on the posture of the unloading device 100, except in the case of a square hatch coaming 7. However, when the ranging sensors 130 to 132 are arranged in the circumferential direction of the elevator 110 with their directions shifted by 120 degrees, two edge sides having different directions can be detected for a hatch coaming 7 whose aspect ratio is up to 1.73:1, regardless of the position and posture of the unloading device 100.
Next, a process of disposing the three-dimensional model of the elevator 110, the shovel 112, the cabin 5, and the hatch coaming 7 will be described.
Fig. 11A, 11B, and 11C are diagrams illustrating the arrangement of the three-dimensional model. As shown in fig. 11A, 11B, and 11C, the model arrangement unit 156 first arranges the three-dimensional model 400 of the elevator 110 and the shovel 112 stored in the storage unit 142 on the hatch coaming coordinate system 320. The three-dimensional model 400 of the elevator 110 and the shovel 112 is represented by a top frame coordinate system 310. Therefore, the model arrangement unit 156 converts the three-dimensional model 400 of the elevator 110 and the shovel 112 into the hatch coaming coordinate system 320 using the conversion parameters derived by the coordinate conversion derivation unit 154.
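Applying the derived transformation parameters to the model points can be sketched as follows; the function name and the representation of the model as an (N, 3) point array are assumptions.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def top_frame_to_hatch_coaming(points_top_frame, params):
    """Express model points given in the top frame coordinate system 310 in the
    hatch coaming coordinate system 320.

    points_top_frame : (N, 3) voxel centres or vertices of the three-dimensional model
    params           : (alpha, beta, gamma, tx, ty, tz) derived by the coordinate
                       transformation deriving unit 154 (see the previous sketch)
    """
    R = Rotation.from_euler("xyz", params[:3]).as_matrix()
    t = np.asarray(params[3:], dtype=float)
    return points_top_frame @ R.T + t

# Example: a single point transformed with a small rotation and a 1 m shift in X.
print(top_frame_to_hatch_coaming(np.array([[0.0, 2.0, -5.0]]),
                                 np.array([0.0, 0.0, 0.1, 1.0, 0.0, 0.0])))
```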
When the elevator 110 and the shovel 112 move relative to the top frame 108, the model arrangement unit 156 reflects the rotation of the elevator 110, the length of the shovel 112, and the like in the three-dimensional model 400 based on the measurement results of the position sensor 116, the rotation angle sensor 118, the inclination angle sensor 120, and the pivot angle sensor 122 of the unloading device 100.
The three-dimensional model 400 may be a model in which noise has been filtered from the measured values accumulated during the scooping of the cargo 6, a model in which the measured values obtained at the end of past scooping operations are accumulated, a model based on a design drawing, or a model obtained by temporarily carrying another measuring device into the cabin and measuring.
The model arrangement unit 156 arranges the three-dimensional model 400 of the elevator 110 and the shovel 112 converted into the hatch coaming coordinate system 320 on the hatch coaming coordinate system 320 (fig. 11A).
Next, the model arrangement unit 156 superimposes the three-dimensional model 410 of the hatch coaming 7 stored in the storage unit 142 on the three-dimensional model 400 of the elevator 110 and the shovel 112 (fig. 11B). Further, since the three-dimensional model 410 of the hatch coaming 7 is represented by the hatch coaming coordinate system 320, it is directly configured without performing coordinate transformation.
The model arrangement unit 156 superimposes the three-dimensional model 420 of the cabin 5 stored in the storage unit 142 on the three-dimensional model 400 of the elevator 110 and the shovel 112 and the three-dimensional model 410 of the hatch coaming 7 (fig. 11C).
In this way, by using the three-dimensional models arranged by the model arrangement unit 156, the relative positions of the elevator 110 and the shovel 112, which are part of the unloading device 100, and the hatch coaming 7 and the cabin 5, which are part of the ship 4, can be easily grasped.
In particular, by disposing the three-dimensional model of the lifter 110 that may collide with the hatch coaming 7 and the three-dimensional model of the hatch coaming 7, the position of the lifter 110 with respect to the hatch coaming 7 can be easily grasped.
Further, by arranging the three-dimensional model of the scooping portion 112 that may collide with the cabin 5 and the three-dimensional model of the cabin 5, the position of the scooping portion 112 with respect to the cabin 5 can be easily grasped.
Next, the state monitoring process by the state monitoring unit 158 will be described. The state monitoring unit 158 cyclically derives the distances (distance information) between the three-dimensional model 410 of the hatch coaming 7 and the three-dimensional model 420 of the cabin 5, which are arranged on the hatch coaming coordinate system 320 by the model arrangement unit 156, and the three-dimensional model 400 of the elevator 110 and the shovel 112.
The state monitoring unit 158 derives the state of the cabin 5 based on the measurement points measured by the ranging sensors 133 to 136. Specifically, the state monitoring unit 158 derives the three-dimensional position of the measurement point in the top frame coordinate system 310 based on the distance to the measurement point measured by the ranging sensors 133 to 136 and the positions of the ranging sensors 133 to 136.
Further, the state monitoring unit 158 converts the three-dimensional positions of the measurement points in the top frame coordinate system 310 into the hatch coaming coordinate system 320 using the transformation parameters. The position of each measurement point and the three-dimensional model 420 of the cabin 5 are then used to determine whether each measurement point belongs to the wall surface of the cabin 5 or to the cargo 6. Here, a measurement point whose position lies within a predetermined range of the three-dimensional model 420 of the cabin 5 is determined to be the wall surface of the cabin 5, and the other measurement points are determined to be the cargo 6.
The state monitoring unit 158 treats the voxels that contain measurement points determined to be the cargo 6, and the voxels below them, as voxels of the cargo 6. The model arrangement unit 156 arranges, as the three-dimensional model of the cargo 6, the voxels determined to be voxels of the cargo 6 among the voxels of the internal space of the three-dimensional model 420 of the cabin 5. This enables the condition of the cargo 6 in the cabin 5 to be grasped.
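A minimal sketch of this classification and voxelisation is given below. The voxel size, the wall-distance threshold, the grid layout, and the wall_distance helper (which would query the three-dimensional model 420 of the cabin 5) are all assumptions for illustration.

```python
import numpy as np

def classify_and_voxelize(points_hc, wall_distance, grid_origin, voxel_size=0.5,
                          wall_thresh=0.3, grid_shape=(60, 40, 40)):
    """Classify measurement points as hold wall or cargo and mark cargo voxels.

    points_hc     : (N, 3) measurement points in the hatch coaming coordinate system
    wall_distance : callable giving the distance from a point to the nearest wall surface
                    of the three-dimensional model 420 of the cabin (assumed helper)
    grid_origin   : coordinates of the voxel grid corner (assumed to be the hold bottom)
    Returns a boolean voxel grid in which True marks voxels of the cargo 6.
    """
    cargo_grid = np.zeros(grid_shape, dtype=bool)
    for p in np.asarray(points_hc, dtype=float):
        if wall_distance(p) <= wall_thresh:
            continue                                  # wall surface of the cabin, not cargo
        i, j, k = np.floor((p - grid_origin) / voxel_size).astype(int)
        if not (0 <= i < grid_shape[0] and 0 <= j < grid_shape[1] and 0 <= k < grid_shape[2]):
            continue
        # The voxel containing a cargo point, and every voxel below it, is treated as cargo.
        cargo_grid[i, j, : k + 1] = True              # the k index increases upward from the bottom
    return cargo_grid

# Example with a stand-in wall model that treats points within 0.3 m of x = 0 as wall.
demo_points = np.array([[5.0, 3.0, 2.0], [0.1, 3.0, 4.0]])
grid = classify_and_voxelize(demo_points, wall_distance=lambda p: abs(p[0]),
                             grid_origin=np.zeros(3))
print(int(grid.sum()))
```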
Further, the unloading device 100 uses the three-dimensional models of the hatch coaming 7 and of the unloading device 100, whose relative positions are known with high accuracy. Therefore, even if not all edge sides of the hatch coaming 7 can be detected by the distance measuring sensors 130 to 132, collision with, or approach to, any side surface of the hatch coaming 7 can be detected and prevented.
The distance measuring sensors 133 and 135 are provided on the side surface 112c of the shovel 112. The distance measuring sensors 134 and 136 are provided on the side surface 112d of the shovel 112. The scooping portion 112 scoops the cargo 6 while moving from the side surface 112d to the side surface 112c. Accordingly, the unloading device 100 can grasp the condition of the load 6 on the traveling direction side of the shovel 112 by the distance measuring sensors 133 and 135. Further, the unloading device 100 can grasp the condition of the load 6 on the side opposite to the traveling direction of the scooping portion 112 by the distance measuring sensors 134 and 136.
The above-described processes by the coordinate transformation deriving unit 154, the model arranging unit 156, and the state monitoring unit 158 are repeated at predetermined intervals. The communication device 144 transmits the data of the three-dimensional model configured by the model configuration unit 156 and the distance information derived by the state monitoring unit 158 to the control device 200.
Fig. 12 is a diagram illustrating the upper viewpoint image 500. Fig. 13 is a diagram illustrating a shovel periphery image 510. The monitoring control unit 210 of the control device 200 receives the data of the three-dimensional model and the distance information transmitted from the unloading device 100 by the communication device 240. The monitor control unit 210 displays the upper viewpoint image 500 and the shovel periphery image 510 on the display unit 230 based on the received data.
As shown in fig. 12, the upper viewpoint image 500 displays the three-dimensional model 410 of the hatch coaming 7 and the three-dimensional model 400 of the elevator 110 at the same position as the hatch coaming 7 in the Z-axis direction. That is, the upper viewpoint image 500 displays a cross section perpendicular to the Z-axis direction (a cross section parallel to the upper surface of the hatch coaming 7 or parallel to the horizontal) at the position where the three-dimensional model 410 of the hatch coaming 7 is located.
In the upper viewpoint image 500, a three-dimensional model 400 of the shovel 112, a three-dimensional model 420 of the cabin 5 located at the same position as the shovel 112 in the Z-axis direction, and a three-dimensional model 430 of the cargo 6 are displayed. That is, a cross section perpendicular to the Z-axis direction at a position where the three-dimensional model 400 of the shovel 112 is located is displayed in the upper viewpoint image 500.
That is, the upper viewpoint image 500 displays the XY section of the position where the three-dimensional model 410 of the hatch coaming 7 is located and the XY section of the position where the three-dimensional model 400 of the shovel 112 is located in an overlapping manner.
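The overlapped display could, for example, be built from horizontal slices of boolean voxel grids, as in the following illustrative sketch; the grid conventions are assumptions, not part of the disclosure.

import numpy as np

def xy_cross_section(voxel_grid, origin_z, voxel_size, z_height):
    """Return the horizontal (XY) slice of a boolean voxel grid at height z_height [m].

    voxel_grid: (nx, ny, nz) boolean array whose z index is assumed to increase upward.
    origin_z:   z coordinate of the bottom layer of the grid in the hatch coaming frame.
    """
    iz = int((z_height - origin_z) // voxel_size)
    iz = max(0, min(iz, voxel_grid.shape[2] - 1))
    return voxel_grid[:, :, iz]

# The upper viewpoint image 500 overlays two such slices: one taken at the height of the
# hatch coaming 7 (coaming and device models) and one at the height of the shovel 112
# (device, cabin, and cargo models).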
The upper viewpoint image 500 displays, outside the three-dimensional model 420 of the cabin 5, the distance between the hatch coaming 7 and the elevator 110 ("hatch ○ m") and the distance between the shovel 112 and the wall surface of the cabin 5 ("cabin ○ m"), where ○ stands for the numeric value. The distance between the hatch coaming 7 and the elevator 110 is displayed only when it is equal to or less than a preset first threshold value (here, 1.5 m). More specifically, when the distance is equal to or less than the first threshold value and equal to or greater than a second threshold value (here, 1.0 m) smaller than the first threshold value, the distance is displayed on a yellow background. When the distance is smaller than the second threshold value, the distance is displayed on a red background. Similarly, the distance between the scooping portion 112 and the wall surface of the cabin 5 is displayed only when it is equal to or less than a preset third threshold value (here, 1.5 m). More specifically, when the distance is equal to or less than the third threshold value and equal to or greater than a fourth threshold value (here, 1.0 m) smaller than the third threshold value, the distance is displayed on a yellow background. When the distance is smaller than the fourth threshold value, the distance is displayed on a red background.
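The threshold logic just described can be summarized by the following illustrative sketch; the return values are assumptions used only to show the yellow/red/no-display decision.

def distance_display(distance, outer=1.5, inner=1.0):
    """Decide whether and how a clearance distance is displayed.

    Returns None when the distance exceeds the outer threshold (nothing is shown);
    otherwise a (text, background) pair: yellow between the thresholds, red below
    the inner threshold.
    """
    if distance > outer:
        return None
    text = f"{distance:.1f} m"
    if distance >= inner:
        return text, "yellow"  # outer >= distance >= inner
    return text, "red"         # distance < inner

print(distance_display(2.0))  # None
print(distance_display(1.2))  # ('1.2 m', 'yellow')
print(distance_display(0.8))  # ('0.8 m', 'red')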
By displaying the distance between the hatch coaming 7 and the elevator 110 and the distance between the scooping portion 112 and the wall surface of the cabin 5 in this way, it is possible to quantitatively grasp whether there is a concern about collision or the like.
Further, since the distances are displayed only when they are equal to or less than the first threshold value or the third threshold value, the distance between the hatch coaming 7 and the elevator 110 and the distance between the scooping portion 112 and the wall surface of the cabin 5 can be easily grasped. Further, since the case where the distance is equal to or less than the first threshold value and equal to or greater than the second threshold value and the case where the distance is smaller than the second threshold value are displayed in different display modes, the sense of distance can be easily grasped. Similarly, the case where the distance is equal to or less than the third threshold value and equal to or greater than the fourth threshold value and the case where the distance is smaller than the fourth threshold value are displayed in different display modes. The distance between the hatch coaming 7 and the elevator 110 and the distance between the scooping portion 112 and the wall surface of the cabin 5 are derived based on the distance information transmitted from the unloading device 100.
Thus, with the upper viewpoint image 500, the positional relationship between the hatch coaming 7 and the elevator 110, which may collide with each other, and the positional relationship between the scooping portion 112 and the wall surface of the cabin 5, which may collide with each other, can be easily grasped. Therefore, by viewing the upper viewpoint image 500, the operator can avoid a collision between the hatch coaming 7 and the elevator 110 and a collision between the shovel 112 and the wall surface of the cabin 5. Further, since a commander for directing the operation of the unloading device 100 need not be stationed in the hold 5 or on the ship 4, the number of people required for scooping the cargo 6 can be reduced. In addition, since only the portions that are likely to collide are extracted and displayed, the amount of information given to the operator does not become excessive, and the operator can make appropriate judgments. Further, the upper viewpoint image 500 makes it easy to grasp the condition of the cargo 6 at the height position of the shovel 112.
As shown in fig. 13, the shovel periphery image 510 displays, arranged side by side, a shovel periphery image 512 viewed from the side surface 112c side of the shovel 112, a shovel periphery image 514 viewed from the front of the shovel 112, and a shovel periphery image 516 viewed from the side surface 112d side of the shovel 112.
The three-dimensional model 400 of the shovel 112, the three-dimensional model 430 of the cargo 6, and the three-dimensional model 420 (only the bottom surface) of the cabin 5 are displayed in the shovel periphery images 512, 514, 516.
Here, the scooping portion 112 scoops the cargo 6 while moving from the side surface 112d toward the side surface 112c. Accordingly, the three-dimensional model 430 of the cargo 6 that has not yet been scooped is displayed in the scooping portion peripheral image 512 (second image). On the other hand, the three-dimensional model 430 of the cargo 6 that has already been scooped by the scooping portion 112 is displayed in the scooping portion peripheral image 516 (third image). In the scooping portion peripheral image 514 (first image), the cargo 6 is displayed with its side surface 112c side not yet scooped and its side surface 112d side already scooped. This makes it possible to easily grasp the scooping state of the cargo 6. For example, by observing the shovel periphery image 514 or by comparing the shovel periphery images 512 and 516, the operator can grasp the difference in height of the cargo 6 between the traveling direction of the shovel 112 and the direction opposite to the traveling direction, or the height of the cargo 6 in the traveling direction. Thereby, the operator can appropriately scoop the cargo 6 with the scooping portion 112. Further, even from outside the cabin 5, the operator can quantitatively grasp the depth to which the cargo 6 should be scooped. Further, since a commander for directing the operation of the unloading device 100 need not be stationed in the hold 5 or on the ship 4, the number of people required for scooping the cargo 6 can be reduced. Further, since the shovel periphery image 510 is displayed in the hatch coaming coordinate system 320, the cargo 6 and the shovel 112 in the cabin 5 are always presented from a viewpoint fixed to the ship 4, and the operator can easily grasp the situation.
The scooping portion peripheral image 514 also displays the scooping depth (cut depth) of the scooping portion 112 and the distance between the scooping portion 112 and the bottom surface of the hold 5. The scooping depth is shown for the side surface 112c side and the side surface 112d side, respectively. The scooping depth and the distance between the scooping portion 112 and the bottom surface of the cabin 5 are derived based on the distance information transmitted from the unloading device 100.
The upper viewpoint image 500 and the shovel periphery image 510 described above are updated and redisplayed each time the data of the three-dimensional models and the distance information are transmitted from the unloading device 100.
Next, the processing of the path generating unit 160, the automatic operation instructing unit 162, and the automatic operation end determining unit 164 of the unloading device 100 will be described.
Fig. 14A and 14B are diagrams illustrating an automatic path. The scooping of the cargo 6 by the unloading device 100 is divided into roughly three steps. When the cargo 6 in the cabin 5 has not yet been scooped at all, the cargo 6 is piled up in a mountain shape in the cabin 5. Therefore, as the first step, the cargo 6 in the hold 5 is flattened. The first step is performed by the operator operating the unloading device 100 via the operation unit 220. More specifically, when a signal corresponding to the operation of the operation unit 220 is transmitted to the unloading device 100, the drive control unit 150 operates the various drive mechanisms to drive the unloading device 100 in accordance with the operation of the operation unit 220.
Thereafter, when the surface of the cargo 6 stacked in the cabin 5 has become substantially flat, as a second step, the scooping portion 112 is moved for several laps along the wall surface of the cabin 5 and is then moved once through the center. In this second step, the path along which the scooping portion 112 moves is simple and the scooping amount of the cargo 6 is stable, so that automation can be realized.
Then, when the amount of the cargo 6 in the hold 5 has decreased, as a third step, the remaining cargo 6 is scooped up by the scooping portion 112. In this third step, the scooping portion 112 must be moved to wherever the cargo 6 remains in the hold 5, and must be able to move in the vicinity of the bottom surface of the cabin 5. Accordingly, the third step is performed by the operator operating the unloading device 100 via the operation unit 220. Here, as well, when a signal corresponding to the operation of the operation unit 220 is transmitted to the unloading device 100, the drive control unit 150 operates the various drive mechanisms to drive the unloading device 100 in accordance with the operation of the operation unit 220.
In this way, the second step of the three steps when the cargo 6 is scooped up by the unloading device 100 can be automated by the unloading device 100.
Therefore, the path generation unit 160 generates an automatic path in which the scooping portion 112 advances along the side wall of the cabin 5 from a predetermined position shown by the solid line in fig. 14A. The scooping portion 112 is advanced to a position where it can pivot about the central axis of the elevator 110, and is then pivoted 90 degrees along the side wall of the cabin 5. The scooping portion 112 is then made to advance along the side wall of the cabin 5 again. By repeating these operations, the scooping portion 112 is moved 360 degrees along the side wall of the cabin 5. The scooping depth is then changed, and the scooping portion 112 is moved for several more laps.
Finally, as shown in fig. 14B, the scooping portion 112 is pivoted 90 degrees at the center of the cabin 5, and then moved along the center of the cabin 5. Thereby, the cargo 6 remaining in the center is scooped by the scooping portion 112.
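As an illustrative sketch of the kind of waypoint list the path generation unit 160 could produce for the second step (several laps along the walls with 90-degree turns, a depth change between laps, and a final pass through the center), assuming a rectangular hold and illustrative margins and depths:

def generate_automatic_path(hold_length, hold_width, margin, lap_depths, center_depth):
    """Return a list of (x, y, depth) waypoints for the automated second step.

    hold_length, hold_width: inner dimensions of the hold [m].
    margin:                  clearance kept between the scooping portion and the walls [m].
    lap_depths:              one scooping depth per lap along the walls [m].
    center_depth:            scooping depth for the final pass through the center [m].
    """
    x0, x1 = margin, hold_length - margin
    y0, y1 = margin, hold_width - margin
    corners = [(x0, y0), (x1, y0), (x1, y1), (x0, y1)]  # one 360-degree lap

    path = []
    for depth in lap_depths:                 # several laps, changing the depth each lap
        for x, y in corners + [corners[0]]:  # 90-degree pivot at each corner, back to start
            path.append((x, y, depth))
    yc = hold_width / 2.0                    # final pass along the center of the hold
    path.append((x0, yc, center_depth))
    path.append((x1, yc, center_depth))
    return path

waypoints = generate_automatic_path(20.0, 15.0, 1.5, lap_depths=[0.5, 1.0], center_depth=1.0)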
Here, the control device 200 can control a plurality of unloading devices 100 in parallel. The operator who operates the operation unit 220 of the control device 200 selects one unloading device 100 as the target of remote operation, and performs the first step and the third step of the three steps described above on the selected unloading device 100. In addition, an unloading device 100 that is ready to perform the second step is selected as the unloading device 100 to be operated automatically, and that unloading device 100 is operated automatically.
When there is an unloading device 100 to be remotely operated for the first step or the third step, the operator operates the operation unit 220 to select that unloading device 100 as the target of remote operation. The remote operation switching unit 212 determines the unloading device 100 to be remotely operated in accordance with the operation of the operation unit 220, and establishes bidirectional communication with the unloading device 100 to be remotely operated via the communication device 240. However, the monitoring control unit 210 continues to receive the data of the three-dimensional models and the distance information from the unloading devices 100 that are not the target of remote operation.
The display switching unit 214 causes the display unit 230 to display the images (the upper viewpoint image 500 and the shovel periphery image 510) formed based on the data of the three-dimensional models and the distance information received from the unloading device 100 to be remotely operated. This makes it possible to easily grasp the state of the unloading device 100 to be remotely operated.
When there is an unloading device 100 to be operated automatically in the second step, the operator operates the operation unit 220 to select that unloading device 100 as the target of automatic operation. The remote operation switching unit 212 determines the unloading device 100 to be operated automatically in accordance with the operation of the operation unit 220, and transmits an automation instruction command to it. In the unloading device 100, upon receiving the automation instruction command, the automatic operation instruction unit 162 causes the path generation unit 160 to generate an automatic path. The drive control unit 150 then drives the unloading device 100 based on the automatic path.
When the automatic operation end condition is satisfied or when an error occurs, the automatic operation end determination unit 164 stops (restricts) the driving of the unloading device 100. Examples of the automatic operation end condition include the case where the position of the scooping portion 112 becomes lower than the position determined by the automatic path and the case where the scooped amount of the cargo 6 exceeds a preset amount.
The display switching unit 214 displays only minimum information necessary for the automatic operation on the display unit 230 based on the data of the three-dimensional model and the distance information received from the unloading device 100 during the automatic operation.
For each unloading device 100 that is operating automatically, the situation determination unit 216 predicts, from the change in height of the scooping unit 112, the average scooping amount, and the like, the time until the scooping unit 112 reaches its target height and the integrated scooping amount reaches its target. When there are unloading devices 100 whose second steps are predicted to end at nearly the same time, the timings of remote operation would overlap, so the situation determination unit 216 issues a predetermined warning.
Based on the data of the three-dimensional models and the distance information, the state monitoring unit 158 derives the minimum distance between the hatch coaming 7 and the wall surface of the cabin 5 on the one hand and the elevator 110 and the scooping unit 112 on the other, as well as the direction in which that distance is minimum. When the derived minimum distance is equal to or less than a predetermined threshold value, the collision prevention unit 166 restricts (stops) the operation of the unloading device 100 (collision prevention function). Further, when the derived minimum distance is equal to or less than the predetermined threshold value, the collision prevention unit 166 may restrict the movement of the elevator 110 and the shovel 112 in the derived direction. This makes it possible to perform the automatic operation of the unloading device 100 more safely.
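A simplified, illustrative sketch of such a collision prevention check, in which both the unloading device and the obstacles are represented by sample points (a simplification of the three-dimensional models used in the disclosure):

import numpy as np

def collision_guard(device_points, obstacle_points, threshold=0.5):
    """Return (minimum distance, direction toward the nearest obstacle, blocked flag).

    device_points:   (N, 3) sample points on the elevator 110 and the scooping unit 112.
    obstacle_points: (M, 3) sample points on the hatch coaming 7 and the cabin walls.
    Motion along the returned direction would be restricted when blocked is True.
    """
    diff = obstacle_points[None, :, :] - device_points[:, None, :]  # (N, M, 3) offsets
    dists = np.linalg.norm(diff, axis=2)
    i, j = np.unravel_index(np.argmin(dists), dists.shape)
    min_dist = float(dists[i, j])
    direction = diff[i, j] / max(min_dist, 1e-9)
    return min_dist, direction, min_dist <= threshold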
For example, while the operator performs the first step on one unloading device 100, the remaining three unloading devices 100 carry out the second step. When the first step is completed, the operator transmits an automation instruction command to that unloading device 100 via the operation unit 220. The operator then performs the third step on an unloading device 100 that has completed the second step.
In this way, in the unloading system 1, by automating a part of the plurality of steps, the plurality of unloading devices 100 can be controlled by one control device 200. Thus, the unloading system 1 can reduce the number of personnel required. In addition, when the distance between the hatch coaming 7 and the elevator 110 or the distance between the scooping portion 112 and the wall surface of the cabin 5 becomes smaller than a distance at which a collision could occur, the state monitoring unit 158 may cause the drive control unit 150 to stop the automatic operation.
< second embodiment >
Fig. 15 is a diagram illustrating an electrical configuration of an unloading system 600 according to the second embodiment. The unloading system 600 of the second embodiment is provided with an unloading device 700 instead of the unloading device 100 of the unloading system 1 of the first embodiment. Except for the unloading device 700, the unloading system 600 of the second embodiment is the same as the unloading system 1 of the first embodiment.
The unloading device 700 is provided with an unloading control unit 740 instead of the unloading control unit 140 of the unloading device 100 of the first embodiment. Except for the unloading control unit 740, the unloading device 700 is the same as the unloading device 100 of the first embodiment.
The unloading control unit 740 is provided with an edge detection unit 752 instead of the edge detection unit 152 of the unloading control unit 140 of the first embodiment. Except for the edge detection unit 752, the unloading control unit 740 has the same configuration as the unloading control unit 140 of the first embodiment.
Fig. 16 is a diagram illustrating a functional configuration of the edge detection section 752. As shown in fig. 16, the edge detection unit 752 functions as the continuous end point extraction unit 800, the edge point extraction unit 802, and the edge derivation unit 804. The continuous end point extraction unit 800 extracts end points of a continuous point group composed of a plurality of continuous measurement points from the measurement points measured by the ranging sensors 130 to 132. The edge point extraction unit 802 extracts, as edge candidate points, measurement points that are candidates for the edge of the upper end of the hatch coaming 7 based on the end points extracted by the continuous end point extraction unit 800. Then, the edge point extraction unit 802 extracts edge points based on the extracted edge candidate points. The edge derivation section 804 derives (detects) the edge of the upper end of the hatch coaming 7 based on the edge point extracted by the edge point extraction section 802. Hereinafter, the specific processing performed by the continuous endpoint extraction unit 800 and the edge point extraction unit 802 will be mainly described.
Fig. 17 is a flowchart showing the flow of the continuous end point extraction process performed by the continuous end point extraction unit 800 and the edge point extraction process performed by the edge point extraction unit 802. In fig. 17, the processing of S100 is continuous endpoint extraction processing, and the processing of S102 to S146 is edge point extraction processing.
As a pre-stage of the continuous endpoint extraction process (S100) shown in fig. 17, the continuous endpoint extraction unit 800 derives the three-dimensional position of the measurement point in the top frame coordinate system 310 based on the positions of the ranging sensors 130 to 132 and the distances to the measurement point measured by the ranging sensors 130 to 132. Further, as in the first embodiment, the continuous end point extraction unit 800 divides the measurement points measured by the ranging sensors 130 to 132 into two parts on the front side and the rear side with reference to the vertical lower sides of the ranging sensors 130 to 132.
Fig. 18 is a diagram illustrating a continuous endpoint extraction process. In fig. 18, the measurement points are shown as black dots and hollow circles. The black dots in fig. 18 show the continuous end points. The open circles in fig. 18 show measurement points other than the continuous end points. Fig. 18 shows measurement points reflected by laser light irradiated from one laser irradiation section of the ranging sensors 130 to 132 at a predetermined angle.
The continuous end point extraction unit 800 performs the continuous end point extraction process on each of the divided measurement point groups (S100). Specifically, the continuous end point extraction unit 800 derives a vector for each measurement point measured by one laser irradiation unit. The vector of a measurement point is derived as a vector toward a surrounding measurement point, that is, a measurement point located within a predetermined distance range determined by, for example, a threshold value. In other words, the continuous end point extraction unit 800 also functions as a direction determination unit that determines the direction between each measurement point included in the measurement point group and its adjacent measurement points.
Next, the continuous end point extraction unit 800 derives, as a measurement point vector angle, the angle formed by the vector of one measurement point and the vector of a surrounding measurement point. When the measurement point vector angle is equal to or smaller than a predetermined continuity angle threshold (for example, 5 degrees), the continuous end point extraction unit 800 determines that the two measurement points have continuity. The continuous end point extraction unit 800 takes, as references, first and second measurement points located within a predetermined distance range, and determines the continuity of each of them with the surrounding measurement points. In this case, the continuous end point extraction unit 800 may further take the nearest of the surrounding measurement points having continuity as a new reference and continue determining continuity with the measurement points around it. In addition to determining the continuity of the surrounding measurement points, the continuous end point extraction unit 800 also derives the vector angle with respect to the first and second measurement points, and determines that continuity exists when the derived vector angle is within the continuity angle threshold. When the continuity is interrupted, the continuous end point extraction unit 800 determines the plurality of measurement points determined to be continuous up to that point as one continuous point group. That is, the continuous end point extraction unit 800 also functions as a grouping unit that groups a plurality of measurement points into continuous point groups (clusters) based on the parallelism of the vectors, that is, based on whether the measurement point vector angle (the acute angle formed by the two vectors, in other words the angle difference between them) is equal to or smaller than the continuity angle threshold. The grouping method described here is only an example. For example, instead of taking the nearest continuous point among the surrounding points, a plurality of points having continuity may be extracted, and points having continuity around a representative point may be extracted with the representative point as the base point. Any grouping method based on parallelism can be used, independently of the other methods described here.
Next, the continuous end point extraction unit 800 determines whether or not the continuous point group contains a predetermined number (for example, five) or more of continuous measurement points. When the continuous point group contains the predetermined number or more of measurement points, the end points of the continuous point group are extracted as continuous end points (the black dots in fig. 18).
A continuous point group having the predetermined number or more of continuous measurement points can be regarded as lying on a single plane of the measurement object. The hatch coaming 7 has side surfaces (wall surfaces) extending in the vertical direction and an upper surface extending in the horizontal direction. Thus, a continuous end point is likely to be the measurement point on the side surface or the upper surface of the hatch coaming 7 that is closest to the edge of the upper end.
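An illustrative sketch of the grouping by vector parallelism and the extraction of continuous end points, simplified to use the vectors between adjacent points of one measurement line in measurement order; the 5-degree and five-point values mirror the examples in the text, and the distance-range criterion is omitted:

import numpy as np

def extract_continuous_end_points(points, angle_thresh_deg=5.0, min_points=5):
    """Group one ordered measurement line into continuous point groups and return
    the end points of every group that is long enough.

    points: (N, 3) measurement points in measurement order.
    """
    def angle(u, v):
        c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12)
        return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

    groups, current = [], [points[0]]
    prev_vec = None
    for p_prev, p in zip(points[:-1], points[1:]):
        vec = p - p_prev                               # direction between adjacent points
        if prev_vec is None or angle(prev_vec, vec) <= angle_thresh_deg:
            current.append(p)                          # vectors parallel: same group
            prev_vec = vec
        else:
            groups.append(current)                     # continuity interrupted
            current, prev_vec = [p], None
    groups.append(current)

    end_points = []
    for g in groups:
        if len(g) >= min_points:                       # only sufficiently long groups
            end_points.extend([g[0], g[-1]])           # continuous end points
    return groups, end_points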
Fig. 19 is a diagram illustrating extraction of edge candidate points. In fig. 19, the measurement points are shown as black dots and hollow circles. The black dots in fig. 19 show the consecutive end points closest to the origin (ranging sensors 130 to 132) in the top frame coordinate system 310. The open circles in fig. 19 show the measurement points in the top frame coordinate system 310 except for the consecutive end points nearest to the origin. Fig. 19 shows measurement points reflected by laser light emitted from one laser light emitting section of the ranging sensors 130 to 132 at a predetermined angle.
The edge point extraction unit 802 extracts, as a first edge candidate point, the continuous end point closest to the origin in the top frame coordinate system 310 from among the continuous end points detected by the continuous end point extraction unit 800 (S102). That is, among the continuous end points, the edge point extraction unit 802 extracts the measurement point whose horizontal distance from the main body is the smallest. Here, when the edge of the upper end of the hatch coaming 7 is detected, the distance measuring sensors 130 to 132 are located near the center of the opening of the hatch coaming 7. Therefore, the continuous end point closest to the origin in the top frame coordinate system 310 is highly likely to be the measurement point closest to the edge of the upper end of the hatch coaming 7. Moreover, even if the continuous end point closest to the origin in the top frame coordinate system 310 is not the measurement point closest to the edge of the upper end of the hatch coaming 7, it can be inferred to lie at least on a side surface or the upper surface of the hatch coaming 7. Therefore, the continuous end point closest to the origin in the top frame coordinate system 310, that is, the continuous end point closest to the distance measuring sensors 130 to 132, is extracted as the first edge candidate point.
Then, the edge point extraction unit 802 derives, as a first vector, a vector passing through the first edge candidate point and the other end point of the continuous point group including the first edge candidate point (S104).
Next, the edge point extraction unit 802 derives an angle formed by the first vector and the vertical direction as a first vector angle (S106). Then, the edge point extraction unit 802 determines whether or not the first vector angle is equal to or larger than a predetermined horizontal angle threshold (for example, 45 degrees) (S108). Here, it is determined whether the first vector is near the vertical direction or near the horizontal direction. That is, it is determined whether the continuous point group including the first edge candidate point is located on the upper surface of the hatch coaming 7 or on the side surface (wall surface) of the hatch coaming 7.
When the first vector angle is equal to or greater than the horizontal angle threshold, the edge point extraction unit 802 determines that the first vector is close to the horizontal direction. That is, it is determined that the continuous point group including the first edge candidate point is located on the upper surface of the hatch coaming 7.
On the other hand, when the first vector angle is smaller than the horizontal angle threshold, the edge point extraction unit 802 determines that the first vector is near the vertical direction. That is, it is determined that the continuous point group including the first edge candidate point is located on the side surface of the hatch coaming 7.
When the first vector angle is equal to or greater than the horizontal angle threshold (yes in S108), the edge point extraction unit 802 sets the first edge candidate point as the second edge candidate point (S110).
Next, the edge point extraction unit 802 uses all the measurement points measured by one laser irradiation unit as processing target points, and selects one measurement point as a processing target point in the order of measurement (S112).
Next, the edge point extraction unit 802 determines whether or not the selected processing target point (measurement point) is a point other than an independent point, that is, a point included in one of the continuous point groups (S114). An independent point is an erroneously measured measurement point, for example a point on a rope, that is not a measurement point of the hatch coaming 7. Here, independent points, which cannot be candidates for the edge of the upper end of the hatch coaming 7, are removed.
If the processing target point is an independent point (no in S114), the processing proceeds to S126. On the other hand, if the processing target point is not an independent point (yes in S114), the edge point extraction unit 802 derives a vector of the first edge candidate point and the processing target point as a second vector (S116). Then, the edge point extraction unit 802 derives an angle formed by the first vector and the second vector as a second vector angle (S118).
The edge point extraction unit 802 determines whether or not the second vector angle is equal to or smaller than a predetermined similar angle threshold (e.g., 5 degrees) (S120). Here, the similar angle threshold is set to a value that divides whether the first edge candidate point and the processing target point lie on the same plane. If the second vector angle is larger than the similar angle threshold, the edge point extraction unit 802 determines that the processing target point is not located on the same plane as the first edge candidate point. That is, if the second vector angle is larger than the similar angle threshold, the edge point extraction unit 802 determines that the processing target point is not located on the upper surface of the hatch coaming 7. On the other hand, if the second vector angle is equal to or smaller than the similar angle threshold, the edge point extraction unit 802 determines that the processing target point and the first edge candidate point are on the same plane. That is, if the second vector angle is equal to or smaller than the similar angle threshold, the edge point extraction unit 802 determines that the processing target point is located on the upper surface of the hatch coaming 7.
If the second vector angle is equal to or smaller than the similar angle threshold (yes in S120), the edge point extraction unit 802 determines whether or not the processing target point is closer to the origin in the top frame coordinate system 310 than the second edge candidate point (S122). In the case where the processing target point is closer to the origin in the top frame coordinate system 310 than the second edge candidate point, it can be said that the processing target point is closer to the edge of the upper end of the hatch coaming 7 than the second edge candidate point.
Therefore, when the processing target point is closer to the origin in the top frame coordinate system 310 than the second edge candidate point (yes in S122), the edge point extraction unit 802 updates the second edge candidate point to the processing target point (S124). Further, in the case where the second vector angle is larger than the similar angle threshold (no in S120), and in the case where the processing target point is further away from the origin in the top frame coordinate system 310 than the second edge candidate point (no in S122), the processing moves to the processing of S126 without performing the processing of S124.
The edge point extraction unit 802 determines whether or not the processing of S112 to S124 is completed for all the processing target points (S126). If the processing in S112 to S124 is not completed for all the processing target points (no in S126), the edge point extraction unit 802 returns to the processing in S112, and the processing in S114 to S124 is performed with the next measurement point as the processing target point.
On the other hand, when the processing in S112 to S124 is completed for all the processing target points (yes in S126), the edge point extraction unit 802 extracts the measurement point that finally remains as the second edge candidate point as an edge point (S128), and ends the edge point extraction processing. Through the processing in S112 to S128, the edge point extraction unit 802 extracts, as the edge point, the measurement point whose horizontal distance from the main body is the smallest among the measurement points on the same plane as the first edge candidate point.
In this way, when the vector of the continuous point group including the first edge candidate point is nearly horizontal, the edge point extraction unit 802 extracts, as the edge point, the closest measurement point to the origin in the top frame coordinate system 310 from the measurement points on the same plane as the first edge candidate point. This makes it possible to extract, as an edge point, a measurement point closest to the origin in the top frame coordinate system 310 from among measurement points located on the upper surface of the hatch coaming 7.
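An illustrative sketch of this near-horizontal branch (S110 to S128), assuming the Z axis of the top frame coordinate system points vertically upward, the origin is at the sensor, and the horizontal distance is taken over the x and y components:

import numpy as np

def _angle_deg(u, v):
    c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12)
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

def edge_point_horizontal(first_candidate, first_vector, points, grouped_mask,
                          similar_angle_deg=5.0):
    """Edge point when the first candidate's group lies on the upper surface of the coaming.

    first_candidate: (3,) first edge candidate point.
    first_vector:    (3,) vector along the candidate's continuous point group.
    points:          (N, 3) all measurement points of the line.
    grouped_mask:    (N,) bool; False marks independent points that are skipped (S114).
    """
    best = first_candidate                                  # second edge candidate (S110)
    for p, grouped in zip(points, grouped_mask):
        if not grouped:
            continue                                        # independent point
        second_vector = p - first_candidate                 # S116
        if np.linalg.norm(second_vector) < 1e-9:
            continue
        if _angle_deg(first_vector, second_vector) > similar_angle_deg:
            continue                                        # not on the same plane (S120: no)
        if np.linalg.norm(p[:2]) < np.linalg.norm(best[:2]):
            best = p                                        # closer to the origin (S122/S124)
    return best                                             # final edge point (S128)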
On the other hand, if the first vector angle is smaller than the horizontal angle threshold (no in S108), the edge point extraction unit 802 sets the first edge candidate point as the third edge candidate point (S130).
Next, the edge point extraction unit 802 sets all the measurement points irradiated and measured by one laser irradiation unit as processing target points, and selects (updates) one measurement point as the processing target in the order of measurement (S132).
Next, the edge point extraction unit 802 derives a vector of the first edge candidate point and the processing target point as a third vector (S134). Then, the edge point extraction unit 802 derives an angle formed by the first vector and the third vector as a third vector angle (S136).
The edge point extraction unit 802 determines whether or not the third vector angle is equal to or smaller than a predetermined similar angle threshold (e.g., 5 degrees) (S138). If the third vector angle is larger than the similar angle threshold, the edge point extraction unit 802 determines that the processing target point is not located on the same plane as the first edge candidate point. That is, if the third vector angle is larger than the similar angle threshold, the edge point extraction unit 802 determines that the processing target point is not located on the side surface of the hatch coaming 7. On the other hand, if the third vector angle is equal to or smaller than the similar angle threshold, the edge point extraction unit 802 determines that the processing target point and the first edge candidate point are on the same plane. That is, if the third vector angle is equal to or smaller than the similar angle threshold, the edge point extraction unit 802 determines that the processing target point is located on the side surface of the hatch coaming 7.
If the third vector angle is equal to or smaller than the similar angle threshold (yes in S138), the edge point extraction unit 802 determines whether or not the processing target point is vertically above the third edge candidate point in the top frame coordinate system 310 (S140). When the processing target point is vertically above the third edge candidate point in the top frame coordinate system 310 (when the processing target point is high), it can be said that the processing target point is closer to the edge of the upper end of the hatch coaming 7 than the third edge candidate point.
Therefore, when the processing target point is vertically above the third edge candidate point in the top frame coordinate system 310 (yes in S140), the edge point extraction unit 802 updates the third edge candidate point to the processing target point (S142). Further, in the case where the third vector angle is larger than the similar angle threshold (no in S138), and in the case where the processing target point is further vertically below the third edge candidate point in the top frame coordinate system 310 (no in S140), the processing shifts to the processing of S144 without performing the processing of S142.
The edge point extraction unit 802 determines whether or not the processing of S132 to S142 is completed for all the processing target points (S144). If the processing in S132 to S142 is not completed for all the processing target points (no in S144), the edge point extraction unit 802 returns to the processing in S132, and the processing in S134 to S142 is performed with the next measurement point as the processing target point.
On the other hand, when the processing in S132 to S142 is completed for all the processing target points (yes in S144), the edge point extraction unit 802 extracts the measurement point that eventually becomes the third edge candidate point as the edge point (S146), and ends the edge point extraction processing.
In this way, when the vector of the continuous point group including the first edge candidate point is close to the vertical direction, the edge point extraction unit 802 extracts, as the edge point, the highest measurement point in the vertical direction from the measurement points on the same plane as the first edge candidate point. This makes it possible to extract, as an edge point, the measurement point highest in the vertical direction from the measurement points located on the side surface of the hatch coaming 7.
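The corresponding sketch for the near-vertical branch (S130 to S146), with the same coordinate assumptions as above; here the vertically highest coplanar point is kept:

import numpy as np

def edge_point_vertical(first_candidate, first_vector, points, similar_angle_deg=5.0):
    """Edge point when the first candidate's group lies on the side surface of the coaming."""
    def angle(u, v):
        c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12)
        return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

    best = first_candidate                                  # third edge candidate (S130)
    for p in points:
        third_vector = p - first_candidate                  # S134
        if np.linalg.norm(third_vector) < 1e-9:
            continue
        if angle(first_vector, third_vector) > similar_angle_deg:
            continue                                        # not on the side surface (S138: no)
        if p[2] > best[2]:
            best = p                                        # vertically higher (S140/S142)
    return best                                             # final edge point (S146)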
The continuous end point extraction unit 800 and the edge point extraction unit 802 extract the edge points on the front side and the rear side by the continuous end point extraction process and the edge point extraction process described above for each of the measurement point groups measured by one laser irradiation unit of the range sensors 130 to 132.
When all the edge points have been extracted, the edge deriving unit 804 detects straight lines of the edge of the hatch coaming 7 in the same manner as the edge detection unit 152 in the first embodiment. The edge deriving unit 804 derives the angle formed between the detected straight lines, and when the formed angle is equal to or smaller than a predetermined threshold value, treats those straight lines as the same group. Next, the edge deriving unit 804 derives, from the detected straight lines of the edge, edge side information including a three-dimensional direction vector of each side, three-dimensional barycentric coordinates of each side, the length of each side, and the coordinates of the end points of each side. Then, for each group, the edge deriving unit 804 derives, as a candidate vector, the vector of the most similar line segment among the line segments between the extracted edge points. The edge deriving unit 804 then extracts the edge points that lie within a predetermined range of the candidate vector, and calculates a straight line again using the extracted edge points.
Next, the edge derivation unit 804 repeats the above-described processing using the edge points that have not yet been extracted. However, when the number of extracted edge points is smaller than a predetermined threshold value, no straight line is derived. In this way, even when a corner of the hatch coaming 7 is included, the straight lines of both edges can be derived. The edge derivation unit 804 repeats the above-described processing for each group to derive the straight lines of the edge.
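An illustrative sketch of the per-group straight-line calculation, using a principal-component fit; the candidate-vector selection and regrouping logic of the edge derivation unit 804 is not reproduced:

import numpy as np

def fit_edge_line(edge_points):
    """Fit a 3D straight line to edge points by a principal-component (least-squares) fit.

    Returns (centroid, direction): a point on the line (the barycentric coordinate of the
    side) and a unit direction vector (the three-dimensional direction vector of the side).
    """
    pts = np.asarray(edge_points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)   # dominant right singular vector = direction
    direction = vt[0] / np.linalg.norm(vt[0])
    return centroid, direction

def point_line_distance(p, centroid, direction):
    """Perpendicular distance from a point to the fitted line; edge points within a
    predetermined range of a candidate line can be selected with this value."""
    d = np.asarray(p, dtype=float) - centroid
    return float(np.linalg.norm(d - np.dot(d, direction) * direction))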
As described above, the edge detection unit 752 derives the vector of the continuous point group that includes the first edge candidate point, which is a candidate point for the edge of the upper end of the hatch coaming 7, and detects (extracts) the edge point by different processes depending on whether the derived vector is close to the horizontal direction or close to the vertical direction.
Here, the edge detection unit 152 in the first embodiment detects edge points only with respect to the vertical direction. That is, the edge detection unit 152 detects edge points from the measurement points on the side surface of the hatch coaming 7. However, when the distance measuring sensors 130 to 132 are relatively far from the hatch coaming 7, the distance from the distance measuring sensors 130 to 132 to the side surface of the hatch coaming 7 is large, and the incident angle of the laser light emitted from the distance measuring sensors 130 to 132 onto the side surface of the hatch coaming 7 is large. Therefore, the number of measurement points on the side surface of the hatch coaming 7 may be small.
Therefore, in the second embodiment, edge points are extracted not only in the vertical direction but also in the horizontal direction. This also makes it possible to detect an edge point from the measurement points on the upper surface of the hatch coaming 7, and to improve the accuracy of detecting the edge of the upper end of the hatch coaming 7 as compared with the first embodiment.
The embodiments have been described above with reference to the drawings, but the present disclosure is not limited to these embodiments. It is obvious to those skilled in the art that various changes and modifications can be conceived within the scope described in the claims, and it is understood that such changes and modifications naturally also belong to the technical scope of the present disclosure.
For example, in the above embodiment, the plurality of unloading devices 100 and 700 are controlled by one control device 200. However, one control device 200 may be provided for each unloading device 100, 700. In this case, the unloading control unit 140, 740 and the monitoring control unit 210 may be unified into one unit. Further, the communication device 144 and the communication device 240 need not be provided.
In the above embodiment, the unloading control unit 140 (740) functions as the drive control unit 150, the edge detection unit 152 (752), the coordinate transformation deriving unit 154, the model arrangement unit 156, the state monitoring unit 158, the path generation unit 160, the automatic operation instruction unit 162, the automatic operation end determination unit 164, and the collision prevention unit 166. However, the monitoring control unit 210 may function as some or all of the drive control unit 150, the edge detection unit 152 (752), the coordinate transformation deriving unit 154, the model arrangement unit 156, the state monitoring unit 158, the path generation unit 160, the automatic operation instruction unit 162, the automatic operation end determination unit 164, and the collision prevention unit 166.
In the above embodiment, the distance measuring sensors 130 to 132 are disposed on the top frame 108. However, the distance measuring sensors 130 to 132 may be disposed on the top frame 108 or on the elevator 110. That is, the distance measuring sensors 130 to 132 only need to be disposed on the main body. In the above embodiment, the distance measuring sensors 133 to 136 are disposed on the shovel 112. However, the distance measuring sensors 133 to 136 may be disposed on the half of the elevator 110 closer to the shovel 112.
In the above embodiment, a part (cross section) of the three-dimensional model is displayed as the upper viewpoint image 500, but the measurement results (measurement points) measured by the ranging sensors 130 to 132 may be displayed as images as they are, or a straight line of the edge detected by the edge detecting unit 152 may be displayed as images. That is, the upper viewpoint image 500 showing at least a part of the elevator 110 and the shovel 112, the cabin 5, and the hatch coaming 7 may be displayed based on the measurement results measured by the ranging sensors 130 to 132.
In the above embodiment, the portion (cross section) of the three-dimensional model is displayed as the shovel portion peripheral image 510, but the measurement results (measurement points) measured by the ranging sensors 133 to 136 may be displayed as images as they are. That is, the shovel periphery image 510 showing at least a part of the elevator 110, the shovel 112, and the cabin 5 may be displayed based on the measurement results measured by the ranging sensors 133 to 136.
In the above embodiment, the unloading devices 100 and 700 are described as an example of the unloading device. However, the unloading device may be a continuous unloading device (skip type, belt type, vertical screw conveyor, etc.), a pneumatic unloading device, etc.
In the above embodiment, the three distance measuring sensors 130 to 132 are provided so as to be spaced 120 degrees apart in the circumferential direction of the elevator 110 and measure over a constant angle range from the plane tangential to the cylinder. However, the number of distance measuring sensors may be any number of three or more. The distance measuring sensors need not be provided so as to measure in the plane tangential to the cylinder, and may be inclined with respect to that plane. At least one of the distance measuring sensors may be oriented in a direction that differs from the other distance measuring sensors by 45 degrees or more in the circumferential direction (including within the circumferential plane). The distance measuring sensors may also have measurement ranges different from one another.
In the above embodiment, the vertical conveyance mechanism exemplified by the elevator 110 and the like only needs to convey the cargo upward from the scooping portion 112 as its main function, and is not limited to a mechanism that conveys the cargo strictly vertically.
In the above embodiment, the continuous end point extraction process and the edge point extraction process shown in fig. 17 are performed for each measurement point group on one measurement line. However, it is also possible to extract a predetermined measurement point from all the measurement points and perform continuous end point extraction processing and edge point extraction processing on the extracted measurement points.
Industrial applicability
The present disclosure can be used for unloading devices.
Description of symbols
100, 700 - unloading device, 110 - elevator (vertical conveyance mechanism), 112 - scooping portion, 133, 134, 135, 136 - distance measuring sensor, 156 - model arrangement unit, 230 - display unit.

Claims (8)

1. An unloading device, comprising:
a main body part provided with a shovel part inserted into the cabin;
a distance measuring sensor which is disposed on the main body and can measure distance toward the lower side; and
an edge detection unit for detecting an edge of an upper end of a hatch coaming provided on an upper portion of the cabin using a plurality of measurement points measured by the distance measurement sensor,
the edge detection unit includes:
a direction determination unit that determines a direction between the measurement point and the adjacent measurement point;
a grouping unit configured to group the measurement points into clusters based on the angle difference in the direction determined by the direction determining unit; and
an edge point extraction unit that extracts edge points of the hatch coaming based on end points of the clusters after the grouping,
the unloading device comprises a coordinate transformation deriving unit for deriving edge side information on each side of the edge of the upper end of the hatch coaming based on the plurality of edge points detected by the edge detecting unit, and deriving transformation parameters of the coordinate system of the unloading device and the coordinate system of the cabin based on the derived edge side information.
2. The unloading device according to claim 1, wherein,
the distance measuring sensor measures the measurement points of each of a plurality of measurement lines as a measurement point group,
the edge point extraction unit extracts the edge points for each of the measurement point groups.
3. The unloading device according to claim 1, wherein,
the grouping unit determines continuity of the measurement points continuously measured by the ranging sensor based on directions among the plurality of measurement points, extracts the measurement points having continuity as the cluster, extracts end points of the cluster as continuous end points,
the edge point extraction unit extracts, as an edge candidate point that is a candidate for the edge point, the continuous end point whose horizontal distance from the main body is the smallest among the continuous end points, and extracts the edge point based on the edge candidate point,
the edge detection unit includes an edge derivation unit that derives an edge of the upper end of the hatch coaming based on the edge points extracted by the edge point extraction unit.
4. An unloading device as defined in claim 3, wherein,
the edge point extraction unit derives a direction of a continuous point group including the edge candidate point, and, when the direction of the continuous point group including the edge candidate point is close to the horizontal direction, extracts, as the edge point, the measurement point whose horizontal distance from the main body is the smallest among the measurement points on the same plane as the edge candidate point.
5. An unloading device as defined in claim 3, wherein,
the edge point extraction unit derives a direction of a continuous point group including the edge candidate point, and extracts, as the edge point, the measurement point highest in the vertical direction from among the measurement points on the same plane as the edge candidate point when the direction of the continuous point group including the edge candidate point is close to the vertical direction.
6. The unloading device according to claim 1, wherein,
the coordinate transformation deriving unit associates a straight line of an edge of the upper end of the hatch coaming with an edge of the upper end in the three-dimensional model of the hatch coaming based on the posture of the unloading device, and derives the transformation parameter based on a positional relationship between the straight line of the edge and the edge of the upper end after the association.
7. The unloading device according to claim 1, wherein,
the coordinate transformation deriving unit derives the transformation parameter by representing a straight line of an edge of the upper end of the hatch coaming based on the edge side information by using a three-dimensional point group and minimizing a sum of values obtained based on a distance between the three-dimensional point group and an edge of the upper end in the three-dimensional model of the hatch coaming.
8. The unloading device as defined in claim 6, wherein,
the coordinate transformation deriving unit derives the transformation parameter by representing a straight line of an edge of the upper end of the hatch coaming based on the edge side information by using a three-dimensional point group and minimizing a sum of values obtained based on a distance between the three-dimensional point group and an edge of the upper end in the three-dimensional model of the hatch coaming.

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2018-017504 2018-02-02
JP2018017504 2018-02-02
JP2018-206073 2018-10-31
JP2018206073A JP7129314B2 (en) 2018-02-02 2018-10-31 Unloading device
CN201980005645.5A CN111328318B (en) 2018-02-02 2019-01-31 Unloading device
PCT/JP2019/003537 WO2019151460A1 (en) 2018-02-02 2019-01-31 Unloading device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201980005645.5A Division CN111328318B (en) 2018-02-02 2019-01-31 Unloading device

Publications (2)

Publication Number Publication Date
CN113788332A CN113788332A (en) 2021-12-14
CN113788332B true CN113788332B (en) 2023-07-14

Family

ID=67479373

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201980005645.5A Active CN111328318B (en) 2018-02-02 2019-01-31 Unloading device
CN202111088722.3A Active CN113788332B (en) 2018-02-02 2019-01-31 Unloading device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201980005645.5A Active CN111328318B (en) 2018-02-02 2019-01-31 Unloading device

Country Status (2)

Country Link
CN (2) CN111328318B (en)
WO (1) WO2019151460A1 (en)


Also Published As

Publication number Publication date
CN113788332A (en) 2021-12-14
WO2019151460A1 (en) 2019-08-08
CN111328318B (en) 2022-01-07
CN111328318A (en) 2020-06-23


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (Ref country code: HK; Ref legal event code: DE; Ref document number: 40059884; Country of ref document: HK)
GR01 Patent grant