WO2016199366A1 - Dimension measuring device and dimension measuring method - Google Patents
- Publication number
- WO2016199366A1 (PCT/JP2016/002571)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- cargo
- distance image
- dimension measuring
- measuring device
- processor
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/02—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
- G01B11/022—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by means of tv-camera scanning
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/0007—Image acquisition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/20—Processor architectures; Processor configuration, e.g. pipelining
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/60—Memory management
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
- This disclosure relates to a technique for measuring the dimensions of cargo or cargo with a loading platform.
- In logistics, much cargo is irregular in shape and not necessarily a rectangular parallelepiped, so a technique for smoothly and accurately measuring the dimensions of such irregular cargo is required.
- Patent Document 1 discloses a three-dimensional shape measurement method.
- In that method, a spatial code is assigned to the position corresponding to each light-receiving element of a camera, based on the order in which the measurement light is received.
- Light-receiving elements that did not receive the measurement light are identified as unmeasurable points, and for each unmeasurable point an interpolated spatial code is calculated from the spatial codes of the surrounding points. Based on this interpolated value, the distance to a portion that the measurement light cannot reach can be measured, so the method can be applied to measuring the dimensions of irregular cargo.
- Patent Document 2 discloses an object identification method.
- In that method, an object distance image measured by a distance-image measurement unit is plotted in a matching space: a three-dimensional space in a world coordinate system whose origin is a predetermined work table.
- The object surface position is then virtually reconstructed in the matching space, and the resulting three-dimensional object-surface image is projected onto the horizontal ZX plane of the world coordinate system to obtain a projected image.
- From this a circumscribed cuboid, i.e., a cuboid that circumscribes the object, can be obtained, so the method can be applied to dimension measurement of irregular cargo.
- Patent Document 1: JP 2000-193433 A
- Patent Document 2: JP 2004-259114 A
- With these techniques, a circumscribed cuboid of an object such as irregular cargo can be obtained, and the dimensions of the object can be measured.
- Typically, a background distance image that does not include the cargo is acquired first, and then a cargo distance image that includes the cargo is acquired. From the difference between the background distance image and the cargo distance image, a distance image of the cargo alone is obtained and its dimensions are measured.
- However, the conditions under which the cargo is measured usually change from moment to moment. In many cases, the environment when the background distance image is acquired differs from the environment when the cargo distance image is acquired. An appropriate background distance image may then be unavailable, and as a result an accurate cargo distance image may not be obtained.
- Moreover, cargo is generally distributed loaded on a predetermined loading platform such as a pallet. Because such distribution is customary, attempts have been made to measure the dimensions of cargo together with its loading platform, but demand for measuring the dimensions of the cargo itself, excluding the loading platform, is also increasing.
- This disclosure relates to technology for appropriately measuring the dimensions of cargo or cargo with a loading platform.
- The present disclosure relates to a dimension measuring device for measuring the dimensions of cargo placed on a loading platform, or of cargo with a loading platform. The device includes a transmitter that transmits a measurement wave, a receiver that receives the reflected measurement wave to acquire a first cargo distance image including the cargo and the loading platform, a processor, and a memory.
- Based on the shape of the loading platform present in the first cargo distance image, a region where an unnecessary object other than the cargo (or the cargo with loading platform) exists is identified, and a second cargo distance image is generated by writing predetermined information into the identified region.
- The present disclosure is also a dimension measuring method for measuring the dimensions of cargo placed on a loading platform, or of cargo with a loading platform, in which a transmitter transmits a measurement wave; a receiver receives the reflected measurement wave to acquire a first cargo distance image including the cargo and the loading platform; and a processor, cooperating with a memory, stores the first cargo distance image in the memory, identifies, based on the shape of the loading platform present in the first cargo distance image, a region in that image where an unnecessary object other than the cargo (or the cargo with loading platform) exists, and generates a second cargo distance image by writing predetermined information into the identified region.
- In the first cargo distance image, a region where an unnecessary object other than the cargo exists is identified, and a second cargo distance image is generated by writing predetermined information into that region.
- By using the second cargo distance image, a circumscribed rectangular parallelepiped of the cargo can be obtained, and the dimensions of the cargo can therefore be measured appropriately.
- FIG. 1 is a block diagram illustrating a configuration of the dimension measuring apparatus according to the first embodiment.
- FIG. 2A is a diagram illustrating a state where cargo is not mounted on the fork.
- FIG. 2B is a diagram illustrating a state where cargo with a loading platform is mounted on a fork.
- FIG. 3A is a diagram illustrating a background distance image.
- FIG. 3B is a diagram illustrating a background distance image.
- FIG. 3C is a diagram showing a cargo distance image.
- FIG. 3D is a diagram illustrating a cargo distance image.
- FIG. 4A is a diagram illustrating a state where cargo is not mounted on the fork.
- FIG. 4B is a diagram illustrating a background distance image acquired and generated in the state of FIG. 4A.
- FIG. 4C is a diagram illustrating a state where cargo with a loading platform is mounted on the fork.
- FIG. 4D is a diagram illustrating a cargo distance image acquired and generated in the state of FIG. 4C.
- FIG. 5A is a diagram illustrating a state in which an object that was not initially present in the image is displayed on the image as the forklift moves.
- FIG. 5B is a diagram illustrating processing of an image related to a wall.
- FIG. 5C is a diagram illustrating processing of an image related to a foreign object on the floor.
- FIG. 6A is a diagram illustrating a state in which another object that was not initially present in the image is displayed in the image.
- FIG. 6B is a diagram illustrating processing of an image related to another object.
- FIG. 7 is a flowchart showing an outline of an operation procedure performed by the dimension measuring apparatus according to the first embodiment.
- FIG. 8A is a flowchart showing a detailed processing procedure (subroutine) in step S10 of FIG.
- FIG. 8B is a diagram illustrating distance images generated each time the height of the fork is changed over a plurality of stages.
- FIG. 9 is a flowchart showing a detailed processing procedure (subroutine) in step S40 of FIG.
- FIG. 10A is a flowchart showing a detailed processing procedure (subroutine) in step S41 of FIG.
- FIG. 10B is a distance image illustrating the processing in step S412.
- FIG. 10C is a distance image illustrating the processing in step S413.
- FIG. 10D is a distance image illustrating the process in step S414.
- FIG. 10E is a conceptual diagram of processing when obtaining the distance of the image on the front surface of the loading platform.
- FIG. 11A is a flowchart showing a detailed processing procedure (subroutine) in step S42 of FIG.
- FIG. 11B is a distance image illustrating the processing in step S421.
- FIG. 11C is a distance image illustrating the process in step S422.
- FIG. 11D is a distance image illustrating the process in step S423.
- FIG. 11E is a distance image illustrating the process in step S424.
- FIG. 12A is a diagram showing a process of generating a final cargo distance image by combining a plurality of second cargo distance images.
- FIG. 12B is a diagram illustrating a process of developing the coordinates of the final cargo distance image in the three-dimensional matching space.
- FIG. 12C is a diagram illustrating a process of correcting the direction of the principal axis of inertia and performing coordinate transformation of the three-dimensional matching space.
- FIG. 12D is a diagram illustrating a process of calculating a circumscribed cuboid of the cargo in the three-dimensional matching space after the coordinate conversion.
- FIG. 13 is a flowchart showing an outline of an operation procedure when the dimension measuring apparatus does not acquire a background distance image.
- FIG. 14A is a flowchart showing a detailed processing procedure (subroutine) in step S40A of FIG.
- FIG. 14B is a distance image illustrating the processing in step S40A1.
- FIG. 14C is a distance image illustrating the processing in step S40A2.
- FIG. 14D is a distance image illustrating the process in step S40A3.
- FIG. 14E is a distance image illustrating the process in step S40A4.
- FIG. 14F is a distance image illustrating the process in step S40A5.
- FIG. 15A is an image diagram of the entire process performed by the dimension measuring apparatus according to the second embodiment.
- FIG. 15B is a diagram illustrating processing performed by the dimension measuring apparatus according to the second embodiment.
- FIG. 15C is a diagram illustrating processing performed by the dimension measuring apparatus according to the second embodiment.
- FIG. 15D is a diagram illustrating a process performed by the dimension measuring apparatus according to the second embodiment.
- FIG. 15E is a diagram illustrating processing performed by the dimension measuring apparatus according to the second embodiment.
- FIG. 16A is a diagram illustrating a modification of the process performed by the dimension measuring apparatus according to the second embodiment.
- FIG. 16B is a diagram illustrating a modification of the process performed by the dimension measuring apparatus according to the second embodiment.
- FIG. 16C is a diagram illustrating a process of a modification of the process performed by the dimension measuring apparatus according to the second embodiment.
- FIG. 16D is a diagram illustrating a modification of the process performed by the dimension measuring apparatus according to the second embodiment.
- FIG. 16E is a diagram illustrating a process of a modification of the process performed by the dimension measuring apparatus according to the second embodiment.
- FIG. 17 is a diagram illustrating a state in which a forklift loaded with cargo with a loading platform moves along a predetermined route in a warehouse.
- FIG. 18 is a conceptual diagram illustrating processing performed by a business diagnosis system including a dimension measuring device.
- FIG. 19 is a conceptual diagram showing a modification of the embodiment.
- The following embodiments specifically disclose the dimension measuring apparatus according to the present disclosure.
- the present embodiment will be described in detail with reference to the drawings as appropriate.
- However, description more detailed than necessary may be omitted; for example, detailed descriptions of well-known matters and repeated descriptions of substantially identical configurations may be omitted. This is to avoid unnecessary redundancy in the following description and to facilitate understanding by those skilled in the art.
- the accompanying drawings and the following description are provided to enable those skilled in the art to fully understand the present disclosure, and are not intended to limit the claimed subject matter.
- FIG. 1 is a block diagram illustrating a configuration of the dimension measuring apparatus according to the first embodiment.
- The dimension measuring apparatus 100 shown in FIG. 1 is installed in a logistics facility such as a distribution warehouse (on a cargo transport vehicle such as a forklift, on the warehouse floor, etc.), and includes at least a camera 110, a projector 120, a processor 130, and a memory 140.
- the dimension measuring apparatus 100 is an apparatus that measures the dimensions of cargo placed on a cargo bed (pallet) or cargo with a cargo bed.
- Hereinafter, unless otherwise specified, the term "cargo" (cargo C) whose dimensions are to be measured may include not only the cargo itself but also the "cargo with loading platform" (cargo CP), i.e., the cargo together with its loading platform.
- When the cargo is a rectangular parallelepiped, the dimension measuring apparatus 100 measures the dimensions of its three sides; when the cargo is irregular and not a rectangular parallelepiped, it measures, as described later, the dimensions of the three sides of a circumscribed rectangular parallelepiped that circumscribes the cargo.
- the camera 110 as an imaging device includes various elements such as a lens and an image sensor (not shown).
- the projector 120 as a light projecting device projects measurement light necessary for measuring the size of the cargo as the measurement target.
- the projector 120 projects measurement light onto the cargo, and the camera 110 receives the measurement light reflected by the cargo.
- The projection pattern of the measurement light is deformed by the surface of the cargo; the camera 110 captures this deformation of the reflected pattern, which is converted into depth information, i.e., distance information for each pixel of the generated image.
- the image generated here is generally called a “distance image”.
- the distance image may be generated by the camera 110, or may be generated by the processor 130 described later after obtaining information from the camera 110.
- Distance image refers to an image containing distance information to the position (including the surface of the cargo) indicated by each pixel.
- the distance image can be visually expressed by expressing the distance information of each pixel on a display using different colors, for example.
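As an illustration of this representation, the following sketch (with hypothetical values, not taken from the patent) stores a small distance image as an array, where each pixel holds the distance to the surface it sees, and maps each distance to a grey level so that nearer surfaces render brighter:

```python
import numpy as np

# A toy distance image: each pixel holds the distance (e.g. in metres) from
# the sensor to the surface it sees; 0 marks "no measurement".
distance_image = np.array([
    [2.0, 2.0, 1.2],
    [2.0, 1.2, 1.2],
    [0.0, 1.2, 1.2],
], dtype=np.float64)

def colorize(img, near=1.0, far=3.0):
    """Map distances to a 0-255 grey level (near = bright, far = dark);
    pixels with no measurement stay 0. The near/far limits are
    illustrative assumptions."""
    valid = img > 0
    scaled = np.zeros_like(img)
    scaled[valid] = 255 * (1 - np.clip((img[valid] - near) / (far - near), 0, 1))
    return scaled.astype(np.uint8)

grey = colorize(distance_image)
```

A real display would use a color map rather than grey levels, but the principle is the same: the rendered intensity of each pixel encodes its distance information.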
- The processor 130 is a general-purpose arithmetic unit; it reads various control programs and data from a built-in storage device (not shown) and the memory 140, and controls the overall operation of the dimension measuring apparatus 100.
- a memory 140 as a storage device performs operations such as reading of a control program necessary for various processes and saving of data by the processor 130. That is, the processor 130 and the memory 140 cooperate to control various processes by the dimension measuring apparatus 100.
- the dimension measuring apparatus 100 can also be configured by connecting an independent computer functioning as the processor 130 and the memory 140 to the camera 110 and the projector 120 by wired communication or wireless communication.
- the dimension measuring apparatus 100 may be configured as an integrated apparatus in which the camera 110, the projector 120, the processor 130, and the memory 140 are housed in an integrated housing.
- FIGS. 2A and 2B are explanatory diagrams showing a dimension measuring system 200 in which two dimension measuring apparatuses 100A and 100B (hereinafter also collectively "dimension measuring apparatus 100") are fixed to a forklift FL, and illustrating the situation in which the problem addressed here arises. The two integrated dimension measuring apparatuses 100 are fixed to the front of the forklift FL and photograph the fork F, which can move up and down while holding the cargo with loading platform, and its vicinity.
- the dimension measuring apparatus 100 acquires a background distance image that does not include the cargo to be measured.
- In the state of FIG. 2B, the fork F carries a cargo CP with loading platform, in which the cargo C is placed on the loading platform P, and the dimension measuring apparatus 100 acquires a cargo distance image including the cargo C (and possibly the cargo CP with loading platform) whose dimensions are to be measured.
- One could compare the background distance image obtained in the state of FIG. 2A with the cargo distance image obtained in the state of FIG. 2B, and extract only the distance image of the cargo C from the difference between the two. In practice, however, the height of the fork F may differ between the two states: the fork F is likely to be lower when the cargo CP with loading platform is mounted (FIG. 2B) than when it is not (FIG. 2A). The position of the fork F is therefore expected to change between the acquisition of the background distance image and that of the cargo distance image, making it difficult to extract the distance image of the cargo C.
- FIGS. 3A to 3D are distance images that illustrate this problem concretely.
- FIG. 3A shows the background distance image taken by the dimension measuring apparatus 100A in the state of FIG. 2A.
- FIG. 3B shows the background distance image taken by the dimension measuring apparatus 100B in the state of FIG. 2A.
- FIG. 3C shows the cargo distance image taken by the dimension measuring apparatus 100A in the state of FIG. 2B.
- FIG. 3D shows the cargo distance image taken by the dimension measuring apparatus 100B in the state of FIG. 2B.
- The fork F is an unnecessary object other than the cargo C to be measured, and identifying the region where the fork F exists is essential for extracting the cargo C and measuring its dimensions.
- When, as in this example, the position of the fork F in the image differs between the background distance image and the cargo distance image, it is difficult to identify the region where the fork F exists, which can hinder measurement of the dimensions of the cargo C.
- FIG. 4A to 4D are conceptual diagrams for explaining processing performed by the dimension measuring apparatus 100 of the present embodiment in order to solve the above-described problem.
- FIG. 4A shows a state in which no cargo is mounted on the fork F; the distance from the dimension measuring apparatus 100 to the fork F is d1, and the dimension measuring apparatus 100 acquires a background distance image that does not include the cargo. In this case, the dimension measuring apparatus 100 first acquires a first background distance image including the fork F, as shown in FIG. 4B.
- FIG. 4C shows a state in which the cargo CP with loading platform is mounted on the fork F; the distance from the dimension measuring apparatus 100 to the fork F is d2, and the apparatus acquires a cargo distance image including the cargo C whose dimensions are to be measured.
- the dimension measuring apparatus 100 first acquires the first cargo distance image including the fork F and the cargo CP with a cargo bed as shown in FIG. 4D.
- Because the cargo CP with loading platform is mounted, the position of the fork F is lower than in FIG. 4A, so d2 > d1.
- Thus the position of the fork F differs between the first background distance image of FIG. 4B and the first cargo distance image of FIG. 4D, and extracting the distance image of the cargo C is expected to be difficult.
- Therefore, the processor 130, in cooperation with the memory 140, excludes the pixels containing the fork F, an unnecessary object, from the first background distance image, and interpolates new pixels containing distance information derived from the surrounding pixels. As a result, a second background distance image that does not include the fork F is generated.
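A minimal sketch of this exclude-and-interpolate step, on a toy scene and with a simple neighbour-mean fill (the text does not specify the interpolation method, so this choice is an assumption):

```python
import numpy as np

def remove_object(depth, mask):
    """Excise masked (unwanted-object) pixels and refill each one with the
    mean distance of its valid neighbours, repeating until the hole closes.
    A sketch of the interpolation idea; a real implementation would use a
    proper inpainting routine."""
    out = depth.copy()
    out[mask] = np.nan
    while np.isnan(out).any():
        nan_r, nan_c = np.where(np.isnan(out))
        for r, c in zip(nan_r, nan_c):
            neigh = out[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
            vals = neigh[~np.isnan(neigh)]
            if vals.size:
                out[r, c] = vals.mean()
    return out

# Toy first background image: uniform 2.0 m background with a fork at 1.0 m.
first_bg = np.full((4, 4), 2.0)
fork_mask = np.zeros((4, 4), dtype=bool)
fork_mask[1:3, 1:3] = True
first_bg[fork_mask] = 1.0        # the fork sits nearer to the sensor

second_bg = remove_object(first_bg, fork_mask)
```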
- Next, the processor 130, in cooperation with the memory 140, extracts the edge E of the loading platform P from the first cargo distance image, by edge detection (for example, detecting abrupt changes in the distance information of the pixels) or by pattern matching against a pre-registered loading-platform pattern.
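The "abrupt change in distance" cue can be sketched as a simple threshold on pixel-to-pixel distance differences; the threshold value and the toy scene below are illustrative assumptions, not values from the text:

```python
import numpy as np

def depth_edges(depth, jump=0.3):
    """Flag pixels where the distance changes abruptly relative to the pixel
    to the right or below -- the 'abrupt change in distance information'
    cue. The jump threshold (metres) is an illustrative choice."""
    edges = np.zeros(depth.shape, dtype=bool)
    edges[:, :-1] |= np.abs(np.diff(depth, axis=1)) > jump
    edges[:-1, :] |= np.abs(np.diff(depth, axis=0)) > jump
    return edges

# Toy scene: a pallet face at 1.5 m in front of a background at 2.5 m.
depth = np.full((5, 6), 2.5)
depth[2:4, 1:4] = 1.5
edge_map = depth_edges(depth)
```

Pattern matching against a registered pallet shape would be an alternative route to the same edge E, as the text notes.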
- From the extracted edge, the processor 130 identifies the shape of the loading platform P and, among its side surfaces, the shape of the front surface FS.
- The processor 130 can also detect the fork F as an object protruding from the front surface FS; the region where this protruding object exists (including the front surface FS of the loading platform) is treated as a region where an unnecessary object exists, and the pixels in this region are excluded from the first cargo distance image.
- For each excluded pixel, a new pixel containing the distance information of the corresponding pixel in the second background distance image is interpolated, generating the second cargo distance image.
- The processor 130 then takes the difference between the second background distance image and the second cargo distance image, generating a final cargo distance image that represents the shape of the cargo C or the cargo CP with loading platform.
- This final cargo distance image is a target of coordinate development by a three-dimensional matching space described later and generation of a circumscribed cuboid.
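The difference step above can be sketched as a per-pixel comparison with a tolerance; the tolerance and the toy images are illustrative assumptions:

```python
import numpy as np

def extract_cargo(cargo_img, background_img, tol=0.05):
    """Keep only pixels whose distance differs from the background by more
    than tol (metres); everywhere else write 0 ('no cargo'). A minimal
    sketch of the background-difference step."""
    diff = np.abs(cargo_img - background_img) > tol
    return np.where(diff, cargo_img, 0.0)

second_bg = np.full((4, 5), 2.5)          # empty scene at 2.5 m
second_cargo = second_bg.copy()
second_cargo[1:3, 1:4] = 1.8              # cargo surface nearer to the sensor

final_cargo = extract_cargo(second_cargo, second_bg)
```

Because the fork pixels in both images were replaced by the same background values beforehand, they cancel in the difference, leaving only the cargo.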
- FIG. 5A to FIG. 5C are conceptual diagrams illustrating a first modification of the process performed by the dimension measuring apparatus 100 of the present embodiment.
- In FIG. 5A, objects that were not present at the start of imaging, such as a wall W of the distribution warehouse and foreign matter M that has fallen on the floor, appear in the image as the forklift moves. Like the fork F described above, these are unnecessary objects and must be excluded when measuring the dimensions. For simplicity, the cargo C is not shown.
- Here, the vertical and horizontal dimensions of the cargo C are defined to be at most those of the loading platform P; that is, as a packing condition, the cargo C must not protrude beyond the loading platform P. This example uses this constraint: pixels corresponding to objects existing outside the area of the loading platform P are deleted in the same manner as the fork F described above.
- FIG. 5B shows processing of an image related to the wall W.
- the process related to the background distance image is the same as that in FIG. 4B, and the dimension measuring apparatus 100 obtains the first background distance image including the fork F and then generates the second background distance image not including the fork F.
- Then, by the same processing as in FIG. 4D, the processor 130 excludes the images of the front surface FS of the loading platform P and of the fork F from the first cargo distance image, and interpolates new pixels containing the distance information of the corresponding pixels of the second background distance image. Furthermore, the processor 130 determines that the wall W present in the first cargo distance image lies outside the area of the loading platform P.
- Because the area of the loading platform P viewed from above, i.e., its vertical and horizontal dimensions, is known in advance, the processor 130 identifies, as shown in FIG. 5A, a rectangular-parallelepiped region (the extraction effective range) whose bottom face is the top surface US of the loading platform P. Any region containing an object outside this range can be identified as a region containing an unnecessary object. The processor 130 can therefore determine that the wall W is an unnecessary object, set the region where the wall W exists as a region where an unnecessary object exists, and exclude the pixels of this region from the first cargo distance image. In the present embodiment, the processor 130 interpolates, for each excluded pixel, a new pixel containing the distance information of the corresponding pixel of the second background distance image.
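The extraction effective range can be sketched as a simple axis-aligned cuboid test on 3-D points; all coordinates below are illustrative assumptions:

```python
import numpy as np

def inside_effective_range(points, origin, size):
    """True for 3-D points inside the axis-aligned cuboid whose bottom face
    is the pallet top surface: origin = one corner (x, y, z of the top
    surface), size = (width, depth, height). Illustrative geometry only."""
    p = np.asarray(points, dtype=float)
    lo = np.asarray(origin, dtype=float)
    hi = lo + np.asarray(size, dtype=float)
    return np.all((p >= lo) & (p <= hi), axis=1)

# Pallet top surface corner at (0, 0, 0.15), pallet 1.1 m x 1.1 m,
# allowing 2 m of cargo height above the top surface.
pts = np.array([
    [0.5, 0.5, 0.6],    # on the cargo -> keep
    [2.0, 0.5, 0.6],    # a wall beyond the pallet area -> unnecessary
    [0.5, 0.5, 0.05],   # foreign matter below the top surface -> unnecessary
])
keep = inside_effective_range(pts, origin=(0.0, 0.0, 0.15), size=(1.1, 1.1, 2.0))
```

Pixels whose 3-D positions fail this test would be excluded and refilled from the second background distance image, as described above.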
- FIG. 5C shows the processing of the image with respect to the foreign matter M on the floor. As with the wall W, the processor 130 can determine that the foreign matter M is an unnecessary object; it sets the region where the foreign matter M exists as a region where an unnecessary object exists and excludes the pixels of this region from the first cargo distance image. In the present embodiment, the processor 130 interpolates, for each excluded pixel, a new pixel containing the distance information of the corresponding pixel of the second background distance image.
- FIG. 6A and FIG. 6B are conceptual diagrams for explaining a modification example 2 of the process performed by the dimension measuring apparatus 100 of the present embodiment.
- In FIG. 6A, another object X (in this example, another forklift carrying cargo) that was not present at the start of imaging appears in the image. The other object X is also an unnecessary object and must be excluded when measuring the dimensions. For simplicity, the cargo C is not shown.
- FIG. 6B shows processing of an image related to another object X.
- the process related to the background distance image is the same as that in FIG. 4B, and the dimension measuring apparatus 100 obtains the first background distance image including the fork F and then generates the second background distance image not including the fork F.
- Then, by the same processing as in FIG. 4C, the processor 130 excludes the images of the front surface FS of the loading platform P and of the fork F from the first cargo distance image, and interpolates new pixels containing the distance information of the corresponding pixels of the second background distance image. Furthermore, the processor 130 determines that the other object X present in the first cargo distance image lies outside the area of the loading platform P.
- Because the area of the loading platform P viewed from above, i.e., its vertical and horizontal dimensions, is known in advance, the processor 130 identifies, as shown in FIG. 6A, a rectangular-parallelepiped region (the extraction effective range) whose bottom face is the top surface US of the loading platform P, and can identify any region containing an object outside this range as a region containing an unnecessary object. The processor 130 can therefore determine that the other object X is unnecessary; it sets the region where the other object X exists as a region where an unnecessary object exists, excludes the pixels of this region from the first cargo distance image, and interpolates, for each excluded pixel, a new pixel containing the distance information of the corresponding pixel of the second background distance image.
- FIG. 7 is a flowchart showing an outline of an operation procedure performed by the dimension measuring apparatus 100 according to the first embodiment.
- First, triggered, for example, by an operator's manual start command or by a signal from a specific sensor, the projector 120 of the first dimension measuring apparatus 100 projects the measurement light in the state shown in FIG. 2A, and a background distance image (first background distance image) is generated from the measurement light reflected back to the camera 110 (step S10). Since a plurality of n (n ≥ 2) dimension measuring apparatuses 100 is generally provided, the operation of step S10 is repeated until all n apparatuses have generated their background distance images (step S20).
- After all the dimension measuring apparatuses 100 have generated their background distance images (step S20; Yes), the projector 120 of the first dimension measuring apparatus 100 (100A) projects the measurement light in the state shown in FIG. 2B, for example, and a cargo distance image (first cargo distance image and second cargo distance image) is generated from the measurement light reflected back to the camera 110 (step S30). Then, from the obtained background distance image and cargo distance image, the processor 130 generates a final cargo distance image that contains only the cargo C (or the cargo CP with loading platform) to be measured and represents its shape (step S40). Steps S30 and S40 are repeated until all n dimension measuring apparatuses 100 have generated their final cargo distance images (step S50).
- Step S10 can be performed in parallel with steps S30 and S40: while one dimension measuring apparatus 100 (for example, 100A) generates a final cargo distance image in steps S30 and S40, another (for example, 100B) can generate a background distance image in step S10. The order of processing is not particularly limited.
- Next, the processor 130 of a specific dimension measuring apparatus 100 combines all the obtained final cargo distance images and develops their coordinates in a three-dimensional matching space (step S60). The processor 130 then generates a circumscribed cuboid for the final cargo distance image and calculates the dimensions of its three sides (step S70).
- Here, the circumscribed cuboid means the smallest rectangular parallelepiped, with sides parallel to the principal axes of inertia, that contains the cargo C (or cargo CP with loading platform); that is, it is a rectangular parallelepiped each of whose six faces touches the cargo C (or cargo CP with loading platform) at least at one point. Steps S60 and S70 will be described in detail later.
- FIG. 8A is a flowchart showing a detailed processing procedure (subroutine) in step S10 of FIG.
- the projector 120 and the camera 110 of the dimension measuring apparatus 100 generate a (first) background distance image using the predetermined input described above as a trigger.
- The operator changes the height of the fork F of the forklift FL in a plurality of stages, and the dimension measuring device 100 generates a first background distance image corresponding to each stage every time the height is changed.
- The processor 130 stores each first background distance image in the memory 140. Since the height of the fork F at the time of loading is expected to differ depending on the cargo CP with loading platform, first background distance images with different fork F positions are generated and stored in this way.
- When first background distance images have been generated for all of the plurality of fork heights (step S12; Yes), the processor 130 combines all the first background distance images stored in the memory 140 and generates a second background distance image from which the fork F has been removed (step S13).
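The fusion in step S13 can be sketched as follows; this is an illustrative assumption rather than the patent's exact algorithm. Distance images are modeled as 2D arrays (NaN marking pixels with no valid reading), and since the fork F is always nearer to the camera than the true background, the farthest distance seen at each pixel across the captures is taken as fork-free:

```python
import numpy as np

def remove_fork_by_fusion(background_images):
    """Fuse several first background distance images, taken with the
    fork F at different heights, into one fork-free second background
    distance image.  Assumption: the fork is always nearer to the
    camera than the true background, so the largest (farthest)
    distance seen at each pixel across the captures is fork-free.
    NaN marks pixels with no valid measurement."""
    stack = np.stack(background_images)   # shape: (n, H, W)
    # nanmax ignores missing readings; a pixel stays NaN only if it
    # was invalid in every capture.
    return np.nanmax(stack, axis=0)
```

A pixel covered by the fork in one capture is recovered from another capture in which the fork was at a different height.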
- FIG. 9 is a flowchart showing a detailed processing procedure (subroutine) in step S40 of FIG.
- First, the processor 130 estimates the distance to the front surface (the connection surface with the fork F) among the side surfaces of the loading platform, and excludes the pixels of the fork portion on that front surface (step S41). Subsequently, the processor 130 estimates the upper surface of the loading platform P (the surface on which the cargo C rests), and from the cargo distance image excluding the portion above the upper surface (first cargo distance image) and the corrected background distance image (second background distance image), generates a second cargo distance image from which pixels of objects other than the cargo are excluded, by interpolating pixels from the corrected background distance image into the cargo distance image (step S42).
- FIG. 10A is a flowchart showing a detailed processing procedure (subroutine) in step S41 of FIG.
- the processor 130 detects the edge E1 of the outline of the loading platform P by a method such as edge detection or pattern matching (step S411).
- Based on the detected contour edge E1, the processor 130 estimates, as shown in FIG. 10C, the edge E2 of the front surface FS (the connection surface with the fork F) of the loading platform P, which cannot be detected directly because the fork F covers it (step S412).
- Next, the processor 130 estimates the distance from the dimension measuring device 100 to each pixel in the front surface FS, and associates the distance information with the cargo distance image (step S413).
- The estimation of the distance can be performed by various methods. For example, as shown in FIG. 10E, when estimating the distance to an arbitrary pixel A on the front surface FS, the processor 130 calculates by triangulation the distance to the intersection of a line passing through pixel A parallel to the horizontal edge of the front surface FS and a vertical edge. As described above, in step S41, based on the shape (edge) of the loading platform present in the first cargo distance image, the area (FS) in which an unnecessary object (the fork) other than the cargo or the cargo with loading platform exists is identified in the first cargo distance image, and distance information is written as predetermined information into the identified area, whereby the presence of the fork is excluded from the cargo distance image. Note that the information written as the predetermined information may be Null data (an identifiable code that is ignored in later calculations).
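The write of predetermined information in step S41 can be sketched as follows, assuming (for illustration only) that the distance images are 2D arrays and that the fork-region mask and the per-pixel triangulated FS distances have already been computed:

```python
import numpy as np

def overwrite_fork_with_surface(cargo_img, fork_mask, fs_distance):
    """Sketch of step S41: pixels where the fork protrudes from the
    estimated front surface FS (fork_mask, a boolean array) are
    overwritten with the distance estimated for the front surface
    itself (fs_distance, a per-pixel array of triangulated FS
    distances), so the fork disappears from the cargo distance image.
    The array shapes and the precomputed mask/plane estimate are
    assumptions for illustration.  A copy is returned so the first
    cargo distance image stays intact in memory."""
    second = cargo_img.copy()
    second[fork_mask] = fs_distance[fork_mask]
    return second
```

Writing Null data instead would simply replace `fs_distance[fork_mask]` with a sentinel such as `np.nan`.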
- the method of excluding the fork from the cargo distance image is not limited to this.
- Whether or not such an area exists can be determined by the processor 130 from the distance information expressed by the first cargo distance image.
- The front surface FS is similarly specified, and the processor 130 may then calculate whether an object with a linear surface gradient protrudes from the specified FS in a predetermined direction.
- Null data may be written as the predetermined information into the specified area. Alternatively, the specified region may be complemented using the corresponding pixels (pixels at substantially the same coordinates) of the second background distance image as the predetermined information, and the difference from the second background distance image may be acquired later.
- In the above, the case where the fork F, as an unnecessary object, protrudes from the front surface FS of the loading platform P, and the method for removing the fork F, have been described.
- However, the surface from which an unnecessary object protrudes is not limited to the front surface FS; it may be any of the side surfaces, including the front surface FS.
- FIG. 11A is a flowchart showing a detailed processing procedure (subroutine) in step S42 of FIG.
- the processor 130 detects the edge E1 of the outline of the loading platform P by a method such as edge detection or pattern matching (step S421).
- the processor 130 estimates an edge E3 of the upper surface (the surface on which the cargo C rides) US of the loading platform P (step S422).
- Next, the processor 130 determines whether there is a difference between the first cargo distance image, excluding the pixels above the upper surface US, and the corrected background distance image (second background distance image) (step S423).
- The processor 130 checks the difference for each pixel and determines whether each differing pixel lies above the upper surface US of the loading platform P; if a differing pixel is not located above the upper surface of the loading platform P, it is regarded as an interpolation target and excluded (step S423; Yes).
- The processor 130 then interpolates the pixels excluded in step S423 using the corresponding pixels of the corrected background distance image (second background distance image), and generates a second cargo distance image (step S424).
- In this way, in step S42, based on the shape (edge) of the loading platform present in the first cargo distance image, the area in which unnecessary objects other than the cargo or the cargo with loading platform exist is specified in the first cargo distance image, and the presence of those unnecessary objects is excluded from the cargo distance image.
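A per-pixel sketch of step S42's difference check and interpolation, under the assumption (illustrative, not from the patent) that the images are 2D arrays and that a boolean mask of pixels lying above the upper surface US has been precomputed:

```python
import numpy as np

def interpolate_non_cargo(first_cargo, second_background,
                          above_us_mask, tol=0.01):
    """Sketch of steps S423/S424: keep only pixels that both differ
    from the corrected background (second_background) and lie above
    the estimated upper surface US of the loading platform; every
    other pixel is interpolated from the corresponding background
    pixel.  above_us_mask is True where the pixel's 3D position is
    above US (assumed computed upstream); tol is an illustrative
    noise tolerance for the per-pixel difference test."""
    differs = np.abs(first_cargo - second_background) > tol
    cargo_pixels = differs & above_us_mask
    # Differing pixels below US (e.g. the platform side) are replaced
    # by background, so the later background difference removes them.
    return np.where(cargo_pixels, first_cargo, second_background)
```

Taking the difference of the result against `second_background` then leaves only cargo pixels, i.e. the final cargo distance image.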
- FIG. 12 is an explanatory diagram showing an overview of the processing of step S60 and step S70 in FIG.
- First, the processor 130 develops the coordinates of the final cargo distance image, generated based on the second cargo distance image, in the three-dimensional matching space, generates a circumscribed cuboid of the coordinate-developed final cargo distance image, and calculates the dimensions of its three sides.
- Here, the “final cargo distance image” is an image composed only of pixels representing the shape of the cargo C (or cargo CP with loading platform), which basically does not include any background other than the cargo C (or cargo CP with loading platform).
- As one method of generating the final cargo distance image, there is a method of acquiring the difference between the second background distance image and the second cargo distance image. This method is used when, in steps S413 to S424, the second cargo distance image was generated using the corresponding pixels of the second background distance image.
- In the other case, the second cargo distance image itself becomes the final cargo distance image. Then, as shown in FIG. 12, the processor 130 combines the final cargo distance images obtained by all the dimension measuring devices 100 in step S50 of FIG. 7 to obtain a final cargo distance image ready to be coordinate-developed in the three-dimensional matching space. Then, as shown in FIG. 12B, the processor 130 develops the coordinates of the obtained final cargo distance image in the three-dimensional matching space.
- However, the bottom surface in the final cargo distance image is not necessarily parallel to the ZX plane that forms the bottom of the three-dimensional matching space.
- the processor 130 corrects the direction of the principal axis of inertia and performs coordinate conversion of the three-dimensional matching space.
- the “main axis of inertia” refers to an axis that minimizes the projected area on the ZX plane (horizontal plane) when the cargo is rotated around the axis.
- Finally, the processor 130 calculates a circumscribed cuboid of the cargo in the three-dimensional matching space after the coordinate conversion. This circumscribed cuboid gives the minimum dimensions for packing the cargo.
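Steps S60/S70 can be sketched as follows. Instead of computing the principal axis of inertia analytically, this illustrative version brute-forces the horizontal rotation that minimises the projected bounding-rectangle area in the ZX plane, which matches the definition of the main axis of inertia given above; the point-cloud representation and the axis convention (Y vertical, ZX horizontal) are assumptions:

```python
import numpy as np

def circumscribed_cuboid_dims(points, n_angles=180):
    """Sketch of steps S60/S70: points is an (N, 3) array of cargo
    coordinates developed in the matching space, with points[:, 1]
    the vertical (Y) axis and (X, Z) the horizontal plane.  The
    inertia-axis correction is approximated by a grid search over
    yaw angles for the rotation minimising the bounding-rectangle
    area in the ZX plane; the three returned values are the edge
    lengths of the circumscribed cuboid."""
    xz = points[:, [0, 2]]
    best_area, best_ext = None, None
    for theta in np.linspace(0.0, np.pi / 2, n_angles):
        c, s = np.cos(theta), np.sin(theta)
        rot = xz @ np.array([[c, -s], [s, c]])       # rotate in plane
        ext = rot.max(axis=0) - rot.min(axis=0)       # AABB extents
        area = ext[0] * ext[1]
        if best_area is None or area < best_area:
            best_area, best_ext = area, ext
    height = points[:, 1].max() - points[:, 1].min()
    return float(best_ext[0]), float(height), float(best_ext[1])
```

A rectangle repeats every 90 degrees of rotation, so searching one quadrant is sufficient; production code would use an exact minimum-area-rectangle routine instead of a grid.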
- the dimension measuring apparatus 100 may acquire a background distance image and use it later for interpolation of pixels excluded in the cargo distance image.
- Note that the interpolation of excluded pixels described above is not limited to using pixels of the background distance image; for example, the pixels may instead be interpolated by writing another identifiable code such as Null data.
- the processor 130 writes predetermined information to the pixels in the area identified as the area where the unnecessary object exists, and generates the second cargo distance image.
- FIG. 13 shows a processing procedure when a background distance image is not acquired.
- In this case, the dimension measuring apparatus 100 starts from the process of step S30 without performing the processes of steps S10 and S20 in FIG. 7. The dimension measuring apparatus 100 then performs the process of step S40A instead of step S40 of FIG. 7, and subsequently performs the processes of steps S50, S60, and S70 of FIG. 7.
- FIG. 14A is a flowchart showing a detailed processing procedure (subroutine) in step S40A of FIG.
- the processor 130 detects the edge E1 of the outline of the loading platform P by a method such as edge detection or pattern matching (step S40A1).
- Based on the detected edge E1, the processor 130 estimates, as shown in FIG. 14C, the edge E2 of the front surface FS (the connection surface with the fork F) of the loading platform P, which cannot be detected directly because the fork F covers it (step S40A2).
- the processor 130 estimates the distance of each pixel in the front surface FS from the dimension measuring device 100 and associates it with the cargo distance image (step S40A3). The estimation of the distance here is performed by the method shown in FIG. 10E, for example.
- Next, the processor 130 estimates the edge E3 of the upper surface US (the surface on which the cargo C rests) of the loading platform P (step S40A4). Based on the edge E3, the processor 130 estimates the space above the upper surface US of the loading platform P, as shown in FIG. 14F (step S40A5). Further, when the position of a specific pixel in the image lies in an area outside this space (step S40A6; Yes), the processor 130 excludes that pixel and interpolates it, for example by writing Null data (an identifiable code) (step S40A7). Finally, the processor 130 checks all pixels for any remaining pixels to be complemented, and when this check is completed, generates the second cargo distance image (step S40A8).
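Steps S40A5 to S40A8 amount to masking every pixel whose 3D position falls outside the space above the upper surface US. A minimal sketch, assuming (illustratively) that the distance image is a 2D array and that a boolean mask of the above-US space has been computed from the estimated edge E3:

```python
import numpy as np

def mask_outside_space(cargo_distance_image, above_us_mask):
    """Sketch of steps S40A5-S40A8: above_us_mask is True for pixels
    whose estimated 3D position lies in the space above the upper
    surface US of the loading platform (assumed precomputed from the
    estimated edge E3).  Pixels outside that space are excluded by
    writing Null data, represented here as NaN."""
    out = cargo_distance_image.astype(float).copy()
    out[~above_us_mask] = np.nan   # write the identifiable code
    return out
```

Later steps can then ignore NaN pixels exactly as the patent's Null data is ignored in later calculations.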
- FIG. 15A is an image diagram of the entire process: the dimension measuring apparatus 100, fixed to the floor, photographs the cargo CP with loading platform placed at a specific location, excludes the loading platform P, and can calculate the circumscribed cuboid of the cargo C.
- the apparatus configuration in the present embodiment is the same as the configuration of the dimension measuring apparatus according to the first embodiment shown in FIG. 1, and is the same in the other embodiments described below.
- The dimension measuring apparatuses 100 (100A, 100B) first generate a background distance image by photographing the specific place with no cargo CP with loading platform present (that is, substantially photographing only the floor).
- the dimension measuring apparatuses 100A and 100B capture the cargo CP with a loading platform, and obtain a first cargo distance image including the cargo CP with a loading platform as illustrated in FIG. 15C.
- the processor 130 of the dimension measuring apparatus 100 estimates the upper surface US after detecting the edge E3 of the upper surface US of the loading platform P from the first cargo distance image.
- Further, the processor 130 specifies a rectangular parallelepiped area (extraction effective range) having the upper surface US as its bottom face, and specifies the region where objects exist outside that area as the region where unnecessary objects exist. Then, by eliminating the pixels of objects in that region (for example, writing Null data), the processor 130 generates, as illustrated in FIG. 15E, the final cargo distance image (which in the present embodiment is also the second cargo distance image). In this way, the circumscribed cuboid of the cargo C without the loading platform P can be calculated.
- That is, the region outside the rectangular parallelepiped area based on the upper surface US is the region in which unnecessary objects other than the cargo or the cargo with loading platform exist in the first cargo distance image.
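The extraction effective range can be sketched as a simple box filter over the coordinate-developed points. For illustration, this version drops points outside the box rather than writing Null data into the corresponding pixels, and the box parameters (US corners, US height, range top) are assumed to come from the upper-surface estimation:

```python
import numpy as np

def apply_extraction_range(points, us_min, us_max, us_height, max_height):
    """Sketch of the extraction effective range: keep only 3D points
    inside the cuboid whose bottom face is the estimated upper surface
    US of the loading platform.  points is an (N, 3) array with Y
    vertical; us_min/us_max are the (x, z) corners of US, us_height
    its Y coordinate, and max_height the top of the range.  Points
    outside are dropped (the patent instead writes Null data into the
    corresponding pixels)."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    inside = ((x >= us_min[0]) & (x <= us_max[0]) &
              (z >= us_min[1]) & (z <= us_max[1]) &
              (y >= us_height) & (y <= max_height))
    return points[inside]
```

Everything surviving the filter belongs to the cargo C; the platform (below US) and walls or forklift parts (outside the footprint) are gone.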
- In the present embodiment, unlike the examples in FIGS. 4 to 6, no change occurs between the environment at the time of acquiring the background distance image and the environment at the time of acquiring the cargo distance image, except for the presence or absence of the cargo CP with loading platform. However, in some cases, due to a request from a distributor, the dimensions of the cargo C itself, excluding the loading platform P, are required instead of those of the cargo CP with loading platform. In the present embodiment, the cargo C alone can be extracted by excluding only the loading platform P, which is part of the cargo CP with loading platform constituting the difference between the background distance image and the cargo distance image.
- FIGS. 16A to 16E are conceptual diagrams for explaining a modification of the processing performed by the dimension measuring apparatus 100 according to the second embodiment.
- In this modification, the dimension measuring apparatuses 100 (100A, 100B) first acquire a background distance image including the forklift FL with no cargo CP with loading platform present (substantially photographing only the forklift); then, as shown in FIG. 16A, the cargo CP with loading platform mounted on the forklift FL is photographed together with the forklift FL, and a first cargo distance image including the cargo CP with loading platform is acquired (FIG. 16B).
- The processor 130 of the dimension measuring apparatus 100 detects the edge E3 of the upper surface US of the loading platform P from the first cargo distance image, and estimates the upper surface US as shown in FIG. 16D. Then, the processor 130 specifies a rectangular parallelepiped area (extraction effective range) having the upper surface US as a cross section, and specifies the region where objects exist outside that area as the region where unnecessary objects exist. Then, by eliminating the pixels of objects in that region (for example, writing Null data), the processor 130 generates, as illustrated in FIG. 16E, the final cargo distance image (which in the present embodiment is also the second cargo distance image).
- That is, the region outside the rectangular parallelepiped area based on the upper surface US is the region in which unnecessary objects other than the cargo or the cargo with loading platform exist in the first cargo distance image.
- In the present embodiment, the final cargo distance image can be generated even while the cargo C is loaded on a forklift or the like and is moving along a predetermined route.
- FIG. 17 is a conceptual diagram illustrating processing performed by the dimension measuring apparatus 100 of the present embodiment.
- a forklift FL carrying a cargo CP with a loading platform moves along a predetermined route in a warehouse, and a dimension measuring device 100 is arranged beside the route.
- The projector 120 projects the measurement light, and the camera 110 that receives the reflected measurement light acquires a plurality of first background distance images while the cargo CP with loading platform is moving.
- the processor 130 generates the final cargo distance image by performing the above-described processing (see FIGS. 13 and 14).
- FIG. 18 is a conceptual diagram illustrating processing performed by a business diagnosis system including the dimension measuring apparatus 100 of the present embodiment.
- The business diagnosis system is a system that aggregates attribute information for each cargo C and uses it for business diagnosis at a hub site.
- For example, when photographing cargo, the dimension measuring apparatus 100 transmits attribute information, including the dimensions calculated by the dimension measuring apparatus 100 itself and various information acquired from tags attached to the cargo, to the business determination computer 300.
- the attribute information includes the cargo size, weight, carry-in gate, carry-in time, destination (country), carry-out gate, carry-out time, and the like.
- the data totaling unit 301 of the business determination computer 300 receives the attribute information, totals it in a predetermined format, and sends it to the optimization algorithm generation unit 302.
- The optimization algorithm generation unit 302 generates an improvement plan for the received attribute information, taking into consideration the hub site configuration information and the work reduction target values at the hub site.
- The hub site configuration information includes, for example, a site map, the number of carry-in gates, the number of carry-out gates, the types and numbers of forklifts, the types and numbers of containers, and the like.
- The work reduction target values include, for example, a target value for reducing the total movement distance of forklifts, a target value for reducing the total movement distance of containers, a target value for reducing total manual work time, and the like. Improvement plans include the placement of trailer carry-in gates by day of the week and time of day, the placement (or number) of forklifts by hour, and the placement of aircraft carry-out gates by time of day.
- The business diagnosis system analyzes, for example, what kind of cargo is present in which time zone, and improves the arrangement of trailers, forklifts, containers, and the like. Thereby, it is possible to reduce the total operation time of the forklifts, the lead time for cargo evacuation, the waiting time for trailers, and the like. These analysis results can be viewed in various data formats, and work can be improved accordingly.
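The totalling performed by the data totaling unit 301 can be sketched as a simple aggregation of attribute records per time zone and gate; the field names below are illustrative assumptions, not defined in the patent:

```python
from collections import defaultdict

def aggregate_by_hour(records):
    """Sketch of the data totaling unit (301): records are attribute
    dicts sent by each dimension measuring device.  They are totalled
    per carry-in hour and gate, so the optimization algorithm
    generation unit (302) can propose gate and forklift placement per
    time zone.  The keys 'carry_in_hour', 'carry_in_gate' and
    'dimensions' are hypothetical field names for illustration."""
    totals = defaultdict(lambda: {"count": 0, "volume": 0.0})
    for r in records:
        key = (r["carry_in_hour"], r["carry_in_gate"])
        w, h, d = r["dimensions"]          # circumscribed-cuboid edges
        totals[key]["count"] += 1
        totals[key]["volume"] += w * h * d
    return dict(totals)
```

Summing circumscribed-cuboid volume per hour and gate shows, for example, which time zones concentrate the cargo flow.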
- FIG. 19 is a conceptual diagram illustrating a modified example of the embodiment of the present disclosure.
- In the example of the above embodiments, the number of dimension measuring devices 100 installed on the forklift FL is two, but it may be one.
- the installation position of the dimension measuring apparatus 100 is arbitrary.
- a rail for the dimension measuring device 100 may be attached to the forklift FL, and the position of the device may be changed in accordance with the handle operation of the forklift FL.
- In that case, the background distance image and the cargo distance image are each taken at least twice from different angles.
- For example, the first shot can be taken from the front and the second from diagonally above.
- Further, a brightness sensor 101 may be attached to the dimension measuring apparatus 100, because the operating environment (lighting conditions, etc.) is assumed to differ for each shooting location. For example, when the forklift FL enters an area of a certain darkness, the dimension measuring apparatus 100 can be activated so as to eliminate the influence of the external light conditions that depend on the location.
- In the above embodiments, the projector 120 that projects the measurement light and the camera 110 that receives the reflected measurement light are used.
- However, the medium for obtaining the distance image is not limited to visible light; it may be infrared light or a laser. Therefore, a transmitter that transmits a general measurement wave can be used instead of the projector 120, and a receiver that receives the reflected measurement wave can be used instead of the camera 110.
- In the above embodiments, when the camera 110 obtains the cargo distance image of the cargo C or the cargo CP with loading platform, the relative positional relationship between the camera 110 and the cargo C or the cargo CP with loading platform is constant.
- However, the present disclosure is not limited to this. That is, the camera 110 may obtain the cargo distance image by photographing the cargo C or the cargo CP with loading platform while rotating around it.
- Alternatively, the cargo distance image may be obtained by the camera 110 photographing the cargo C or the cargo CP with loading platform while it rotates with respect to the camera 110. In this way, not only can the cargo distance image be obtained with a single camera 110, but the cargo C or the cargo CP with loading platform can also be photographed continuously from different angles, so that multiple cargo distance images can be obtained. As a result, the accuracy of the distance information included in the cargo distance image is further increased.
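Combining several cargo distance images from such a rotating capture can be sketched as a per-pixel median, assuming (illustratively) that the captures have already been registered to a common viewpoint upstream:

```python
import numpy as np

def fuse_distance_images(images):
    """Sketch of combining multiple cargo distance images taken while
    the cargo (or the camera) rotates: after the images have been
    registered to a common viewpoint (assumed done upstream), a
    per-pixel median suppresses single-shot noise.  NaN marks pixels
    with no valid reading in a given capture and is ignored unless
    no capture saw the pixel."""
    stack = np.stack(images)          # shape: (n, H, W)
    return np.nanmedian(stack, axis=0)
```

This is one simple way the accuracy of the distance information can increase with the number of captures; the patent does not prescribe a specific fusion rule.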
- As described above, the dimension measuring apparatus 100 of the present embodiment is a dimension measuring apparatus 100 that measures the dimensions of the cargo C placed on the loading platform P, or of the cargo CP with loading platform, and includes a transmitter (projector 120) that transmits a measurement wave, a receiver (camera 110) that receives the reflected measurement wave and obtains a first cargo distance image including the cargo C and the loading platform P, a processor 130, and a memory 140.
- The processor 130, in cooperation with the memory 140, stores the first cargo distance image in the memory 140, specifies, based on the shape of the loading platform P present in the first cargo distance image, a region in the first cargo distance image where an unnecessary object other than the cargo C or the cargo CP with loading platform exists, and generates a second cargo distance image by writing predetermined information into the specified region.
- the dimension measuring apparatus 100 generates a second cargo distance image in which predetermined information is written in an area where an unnecessary object other than the cargo C or the cargo CP with a loading platform exists. Therefore, the dimension measuring apparatus 100 can appropriately generate a circumscribed cuboid of the cargo C or the cargo CP with a loading platform, and can appropriately measure the dimensions of the cargo C or the cargo CP with a loading platform.
- The receiver (camera 110) further acquires a background distance image that does not include the cargo, and the processor 130 generates a final cargo distance image representing the shape of the cargo or the cargo with loading platform by acquiring the difference between the background distance image and the second cargo distance image. Thereby, the dimension measuring apparatus 100 can acquire an appropriate difference between the background distance image and the cargo distance image.
- the processor 130 may specify the shape of the loading platform P by extracting the edge E1 of the loading platform P from the first cargo distance image. Thereby, the dimension measuring apparatus 100 can appropriately exclude unnecessary objects based on the shape of the loading platform P.
- the processor 130 may specify the shape of the side surface of the loading platform P, and specify the region where the object protruding from the side surface exists as the region where the unnecessary object exists. Thereby, the dimension measuring apparatus 100 can appropriately exclude an unnecessary object on the side surface.
- The processor 130 may also specify a rectangular parallelepiped area having the upper surface US of the loading platform P as either its bottom face or a cross section, and specify the region where objects exist outside that area as the region where unnecessary objects exist.
- the predetermined information may be pixel information of a region in the background distance image corresponding to a region where an unnecessary object exists.
- Thereby, the dimension measuring apparatus 100 can make effective use of the background distance image and generate an appropriate final cargo distance image.
- the predetermined information may be null data for an unnecessary object.
- In this case, the second cargo distance image can be handled as the final cargo distance image. Therefore, the second cargo distance image can be coordinate-developed in the three-dimensional matching space, a circumscribed cuboid of the coordinate-developed second cargo distance image can be generated, and the dimensions of the circumscribed cuboid can be calculated. Thereby, the dimension measuring apparatus 100 can acquire the exact dimensions of the cargo C or the cargo CP with loading platform.
- The dimension measuring apparatus 100 may be fixed at a predetermined position while the cargo C or the cargo CP with loading platform moves at a predetermined speed; the receiver (camera 110) obtains a plurality of background distance images and a plurality of first cargo distance images while the cargo C or the cargo CP with loading platform is moving, and the processor 130 may generate a final cargo distance image from the plurality of background distance images and the plurality of first cargo distance images, taking the predetermined speed into consideration. Thereby, the dimension measuring apparatus 100 can make effective use of the background distance images and generate an appropriate final cargo distance image.
- the processor 130 can expand the coordinates of the final cargo distance image in the three-dimensional matching space, generate a circumscribed rectangular parallelepiped of the final cargo distance image that has been coordinate-expanded, and calculate the dimensions of the circumscribed rectangular parallelepiped. Thereby, the dimension measuring apparatus 100 can acquire the exact dimension of the cargo C or the cargo CP with a loading platform.
- The transmitter may be a projector 120 that projects measurement light as the measurement wave, and the receiver may be a camera 110 that receives the reflected measurement light.
- Thereby, the dimension measuring apparatus 100 can be configured inexpensively and reliably.
- In the above description, the cargo distance image and/or the background distance image have been described as being obtained by a single shot with the camera.
- However, the cargo distance image and/or the background distance image in the present disclosure are not limited to those obtained by a single shot with the camera. That is, an image obtained by combining two or more cargo distance images and/or background distance images captured by the camera at different times can also be used as the cargo distance image and/or the background distance image in the present embodiment. Specifically, a smaller number of cargo distance images and/or background distance images may be generated by superimposing two or more cargo distance images and/or background distance images captured at different times, and applied to the present embodiment.
- As described above, the dimension measuring method of this embodiment is a dimension measuring method for measuring the dimensions of the cargo C placed on the loading platform P, or of the cargo CP with loading platform, in which a transmitter (projector 120) transmits a measurement wave and a receiver (camera 110) receives the reflected measurement wave and obtains a first cargo distance image including the cargo C and the loading platform P.
- the processor 130 cooperates with the memory 140 to store the first cargo distance image in the memory 140 and based on the shape of the loading platform P present in the first cargo distance image, in the first cargo distance image, A region where an unnecessary object other than the cargo C or the cargo CP with cargo bed is present is specified, and predetermined information is written in the specified region to generate a second cargo distance image.
- This dimension measurement method generates a second cargo distance image in which predetermined information is written in an area where an unnecessary object other than the cargo C or the cargo CP with cargo bed exists. According to this method, it is possible to appropriately generate a circumscribed rectangular parallelepiped of the cargo C or the cargo CP with a loading platform, and thus it is possible to appropriately measure the dimensions of the cargo C or the cargo CP with a loading platform.
- the present disclosure is useful as a dimension measuring apparatus and a dimension measuring method that can appropriately measure the dimensions of cargo or cargo with a loading platform.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Geometry (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
Description
Hereinafter, Embodiment 1 will be described with reference to FIGS. 1 to 12.
FIG. 1 is a block diagram showing the configuration of the dimension measuring apparatus according to Embodiment 1. The dimension measuring apparatus 100 shown in FIG. 1 is installed in logistics-related equipment such as a distribution warehouse (a cargo transport device such as a forklift, the floor of a warehouse, etc.), and includes at least a camera 110, a projector 120, a processor 130, and a memory 140. The dimension measuring apparatus 100 is a device that measures the dimensions of cargo placed on a loading platform (pallet) or of cargo with a loading platform. Hereinafter, the term “cargo” or cargo C, the object of dimension measurement, may include not only the “cargo” itself but also, except in special cases, the “cargo with loading platform” or cargo CP with loading platform, which comprises the cargo and the loading platform. When the cargo is a rectangular parallelepiped, the dimension measuring apparatus 100 measures the dimensions of its three sides; when the cargo has an irregular, non-cuboid shape, it measures, as described later, the dimensions of the three sides of the circumscribed cuboid (described later) that circumscribes the cargo.
FIGS. 2A and 2B are explanatory diagrams illustrating a state in which a dimension measuring system 200 including two dimension measuring apparatuses 100A and 100B (hereinafter also referred to as “dimension measuring apparatuses 100”) is fixed to a forklift FL, and a situation in which the problem addressed by the present invention arises. Two integrated dimension measuring apparatuses 100 are fixed to the front of the forklift FL, and each photographs the fork F, which can be raised and lowered and holds the cargo with loading platform, and its surroundings.
FIGS. 7 to 11 are flowcharts showing specific operation procedures performed by the dimension measuring apparatus 100 of the present embodiment. FIG. 7 is a flowchart showing an outline of the operation procedure performed by the dimension measuring apparatus 100 of Embodiment 1.
FIGS. 15A to 15E are conceptual diagrams illustrating processing performed by the dimension measuring apparatus 100 of Embodiment 2. FIG. 15A is an image diagram of the entire process: the dimension measuring apparatus 100, fixed to the floor, photographs the cargo CP with loading platform placed at a specific location, excludes the loading platform P, and can calculate the circumscribed cuboid of the cargo C. The apparatus configuration in this embodiment is the same as that of the dimension measuring apparatus according to Embodiment 1 shown in FIG. 1, and is also the same in the other embodiments described below.
FIG. 18 is a conceptual diagram illustrating processing performed by a business diagnosis system including the dimension measuring apparatus 100 of the present embodiment. The business diagnosis system aggregates attribute information for each cargo C and uses it for business diagnosis at a hub site. For example, when photographing cargo, the dimension measuring apparatus 100 transmits attribute information, including the dimensions calculated by the dimension measuring apparatus 100 itself and various information acquired from tags attached to the cargo, to the business determination computer 300. The attribute information includes the cargo's dimensions, weight, carry-in gate, carry-in time, destination (country), carry-out gate, carry-out time, and the like.
FIG. 19 is a conceptual diagram showing a modified example of the embodiments of the present disclosure. In the examples of the above embodiments, the number of dimension measuring apparatuses 100 installed on the forklift FL is two, but it may be one. Of course, the installation position of the dimension measuring apparatus 100 is arbitrary. For example, a rail for the dimension measuring apparatus 100 may be attached to the forklift FL, and the position of the apparatus may be changed in accordance with the steering operation of the forklift FL.
110 Camera
120 Projector
130 Processor
140 Memory
200 Dimension measuring system
300 Business determination computer
301 Data totaling unit
302 Optimization algorithm generation unit
C Cargo
CP Cargo with loading platform
F Fork
FL Forklift
M Foreign object
W Wall
P Loading platform (pallet)
Claims (11)
- A dimension measuring device for measuring the dimensions of cargo placed on a loading platform, or of cargo with a loading platform, the device comprising:
a transmitter that transmits a measurement wave;
a receiver that receives the reflected measurement wave and acquires a first cargo distance image including the cargo and the loading platform;
a processor; and
a memory, wherein
the processor, in cooperation with the memory,
stores the first cargo distance image in the memory,
identifies, based on a shape of the loading platform present in the first cargo distance image, a region within the first cargo distance image in which an unwanted object other than the cargo or the cargo with the loading platform exists, and
generates a second cargo distance image by writing predetermined information into the identified region.
A dimension measuring device. - The dimension measuring device according to claim 1, wherein
the receiver further acquires a background distance image not including the cargo, and
the processor generates a final cargo distance image representing a shape of the cargo or the cargo with the loading platform by taking a difference between the background distance image and the second cargo distance image.
A dimension measuring device. - The dimension measuring device according to claim 1, wherein
the processor
identifies the shape of the loading platform by extracting edges of the loading platform from the first cargo distance image.
A dimension measuring device. - The dimension measuring device according to claim 3, wherein
the processor
identifies a shape of a side face of the loading platform, and identifies a region in which an object protruding from the side face exists as the region in which the unwanted object exists.
A dimension measuring device. - The dimension measuring device according to claim 3, wherein
the processor
identifies a cuboid area having a top face of the loading platform as either its bottom face or a cross-section thereof, and identifies a region in which an object existing outside the area exists as the region in which the unwanted object exists.
A dimension measuring device. - The dimension measuring device according to claim 2, wherein
the predetermined information is pixel information of a region in the background distance image corresponding to the region in which the unwanted object exists.
A dimension measuring device. - The dimension measuring device according to claim 1, wherein
the predetermined information is null data.
A dimension measuring device. - The dimension measuring device according to claim 2, wherein
the processor
expands the coordinates of the final cargo distance image in a three-dimensional matching space, and
generates a circumscribed cuboid of the coordinate-expanded final cargo distance image and calculates the dimensions of the circumscribed cuboid.
A dimension measuring device. - The dimension measuring device according to claim 7, wherein
the processor
expands the coordinates of the second cargo distance image in a three-dimensional matching space, and
generates a circumscribed cuboid of the coordinate-expanded second cargo distance image and calculates the dimensions of the circumscribed cuboid.
A dimension measuring device. - The dimension measuring device according to claim 1, wherein
the transmitter is a projector that projects measurement light as the measurement wave, and
the receiver is a camera that receives the reflected measurement light.
A dimension measuring device. - A dimension measuring method for measuring the dimensions of cargo placed on a loading platform, or of cargo with a loading platform, wherein
a transmitter transmits a measurement wave,
a receiver receives the reflected measurement wave and acquires a first cargo distance image including the cargo and the loading platform, and
a processor, in cooperation with a memory,
stores the first cargo distance image in the memory,
identifies, based on a shape of the loading platform present in the first cargo distance image, a region within the first cargo distance image in which an unwanted object other than the cargo or the cargo with the loading platform exists, and
generates a second cargo distance image by writing predetermined information into the identified region.
A dimension measuring method.
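The masking step recited in the claims above — writing predetermined information (所定の情報), for example null data, into the identified unwanted-object region of the first cargo distance image to obtain the second cargo distance image — can be sketched as follows. The nested-list image format and the function name are illustrative assumptions, not the claimed implementation:

```python
# Illustrative sketch (assumption): produce a "second cargo distance image"
# by writing predetermined information (here None, as null data) into the
# region where an unwanted object was identified.

def write_predetermined_info(distance_image, region, fill=None):
    """Return a copy of 'distance_image' with 'fill' written into 'region'."""
    second = [row[:] for row in distance_image]  # copy; keep the first image intact
    for r, c in region:
        second[r][c] = fill
    return second

first = [[1.5, 1.4, 1.3],
         [1.5, 0.9, 1.3]]                 # 0.9: a protruding foreign object M
second = write_predetermined_info(first, [(1, 1)])
print(second)  # [[1.5, 1.4, 1.3], [1.5, None, 1.3]]
```

Writing background pixel values instead of None would correspond to the variant in which the predetermined information is taken from the background distance image.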
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201680003580.7A CN107076541A (zh) | 2015-06-11 | 2016-05-27 | Dimension measuring device and dimension measuring method |
EP16807083.7A EP3309505A4 (en) | 2015-06-11 | 2016-05-27 | Dimension measurement device and dimension measurement method |
JP2017523100A JPWO2016199366A1 (ja) | 2015-06-11 | 2016-05-27 | Dimension measuring device and dimension measuring method |
US15/525,732 US20170336195A1 (en) | 2015-06-11 | 2016-05-27 | Dimension measurement device and dimension measurement method |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-118565 | 2015-06-11 | ||
JP2015118565 | 2015-06-11 | ||
JP2015-143369 | 2015-07-17 | ||
JP2015143369 | 2015-07-17 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016199366A1 true WO2016199366A1 (ja) | 2016-12-15 |
Family
ID=57503215
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/002571 WO2016199366A1 (ja) | Dimension measuring device and dimension measuring method | 2015-06-11 | 2016-05-27 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20170336195A1 (ja) |
EP (1) | EP3309505A4 (ja) |
JP (1) | JPWO2016199366A1 (ja) |
CN (1) | CN107076541A (ja) |
WO (1) | WO2016199366A1 (ja) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10281924B2 (en) * | 2016-12-07 | 2019-05-07 | Bendix Commerical Vehicle Systems Llc | Vision system for vehicle docking |
US11430148B2 (en) * | 2016-12-28 | 2022-08-30 | Datalogic Ip Tech S.R.L. | Apparatus and method for pallet volume dimensioning through 3D vision capable unmanned aerial vehicles (UAV) |
US10005564B1 (en) * | 2017-05-05 | 2018-06-26 | Goodrich Corporation | Autonomous cargo handling system and method |
CN107862712A (zh) * | 2017-10-20 | 2018-03-30 | Chen Chen | Size data determination method and device, storage medium, and processor |
US10657666B2 (en) * | 2017-12-22 | 2020-05-19 | Symbol Technologies, Llc | Systems and methods for determining commercial trailer fullness |
JP7285470B2 | 2018-05-17 | 2023-06-02 | Panasonic IP Management Co., Ltd. | Projection system, projection device, and projection method |
CN110910445B (zh) * | 2019-11-26 | 2023-08-04 | Shenzhen Fengchao Technology Co., Ltd. | Object size detection method and apparatus, detection device, and storage medium |
DE102021114067A1 | 2021-05-31 | 2022-12-01 | Jungheinrich Aktiengesellschaft | Industrial truck with an optical monitoring device |
CN113640177A (zh) * | 2021-06-29 | 2021-11-12 | Alibaba Singapore Holding Pte. Ltd. | Cargo density measurement method and system, and electronic device |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS62157503A (ja) * | 1985-12-28 | 1987-07-13 | Honda Motor Co Ltd | Image processing method |
JPH05164529A (ja) * | 1991-12-16 | 1993-06-29 | Shinko Electric Co Ltd | Visual device for vibrating feeder |
JP2002029631A (ja) * | 2000-07-12 | 2002-01-29 | Suehiro Giken Kk | Loading method and loading system |
JP2002222412A (ja) * | 2000-09-27 | 2002-08-09 | Canon Inc | Image processing device |
JP2006528122A (ja) * | 2003-05-26 | 2006-12-14 | DaimlerChrysler AG | Movable sensor device on the load-support means of a forklift |
US20090059004A1 (en) * | 2007-08-31 | 2009-03-05 | Speed Trac Technologies, Inc. | System and Method for Monitoring the Handling of a Shipment of Freight |
JP2013196355A (ja) * | 2012-03-19 | 2013-09-30 | Toshiba Corp | Object measuring device and object measuring method |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH05165429A (ja) * | 1991-12-12 | 1993-07-02 | Noritake Co Ltd | Color liquid crystal display device |
KR100356016B1 (ko) * | 1999-12-21 | 2002-10-18 | Electronics and Telecommunications Research Institute | Volume measuring system and method for parcel post using image recognition |
US7277187B2 (en) * | 2001-06-29 | 2007-10-02 | Quantronix, Inc. | Overhead dimensioning system and method |
US8284988B2 (en) * | 2009-05-13 | 2012-10-09 | Applied Vision Corporation | System and method for dimensioning objects using stereoscopic imaging |
CN102102981B (zh) * | 2009-12-21 | 2013-02-27 | Chongqing Technology and Business University | Method and device for measuring displacement by frame matching using two-dimensional single-primary-color contrast as the feature |
EP2439487B1 (de) * | 2010-10-06 | 2012-08-22 | Sick Ag | Volume measuring device for moving objects |
EP2902960A4 (en) * | 2012-09-27 | 2015-09-23 | Panasonic Ip Man Co Ltd | Stereoscopic image processing device and stereoscopic image processing method |
CN204202558U (zh) * | 2014-11-18 | 2015-03-11 | Ningbo University of Technology | Inverted image size detection and measurement device |
2016
- 2016-05-27 WO PCT/JP2016/002571 patent/WO2016199366A1/ja active Application Filing
- 2016-05-27 CN CN201680003580.7A patent/CN107076541A/zh active Pending
- 2016-05-27 EP EP16807083.7A patent/EP3309505A4/en not_active Withdrawn
- 2016-05-27 US US15/525,732 patent/US20170336195A1/en not_active Abandoned
- 2016-05-27 JP JP2017523100A patent/JPWO2016199366A1/ja active Pending
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019008861A1 (ja) * | 2017-07-05 | 2019-01-10 | Sony Semiconductor Solutions Corporation | Information processing device, information processing method, and individual imaging device |
US10964045B2 (en) | 2017-07-05 | 2021-03-30 | Sony Semiconductor Solutions Corporation | Information processing device, information processing method, and individual imaging device for measurement of a size of a subject |
WO2020054118A1 (ja) * | 2018-09-10 | 2020-03-19 | NEC Platforms, Ltd. | Package measuring device, package acceptance system, package measuring method, and non-transitory computer-readable medium |
JP2020041886A (ja) * | 2018-09-10 | 2020-03-19 | NEC Platforms, Ltd. | Package measuring device, package acceptance system, package measuring method, and program |
US11836941B2 (en) | 2018-09-10 | 2023-12-05 | Nec Platforms, Ltd. | Package measuring apparatus, package accepting system, package measuring method, and non-transitory computer readable medium |
JP7140612B2 (ja) | 2018-09-11 | 2022-09-21 | Yazaki Energy System Corporation | Operation management system for work vehicles |
JP2020042537A (ja) | 2018-09-11 | 2020-03-19 | Yazaki Energy System Corporation | Operation management system for work vehicles |
WO2020066847A1 (ja) | 2018-09-28 | 2020-04-02 | Panasonic IP Management Co., Ltd. | Measuring device and measuring method |
JP2020189746A (ja) | 2019-05-24 | 2020-11-26 | Yazaki Energy System Corporation | Driving support system for work vehicles |
JP7306876B2 (ja) | 2019-05-24 | 2023-07-11 | Yazaki Energy System Corporation | Driving support system for work vehicles |
JP7444687B2 (ja) | 2020-04-15 | 2024-03-06 | Yazaki Energy System Corporation | On-board device, operation management device, driving support system, and driving support program |
CN111899373A (zh) * | 2020-08-05 | 2020-11-06 | Industrial and Commercial Bank of China | Method and apparatus for determining machine room inspection points, robot, and storage medium |
WO2022195705A1 (ja) * | 2021-03-16 | 2022-09-22 | Fuji Corporation | Moving body |
Also Published As
Publication number | Publication date |
---|---|
US20170336195A1 (en) | 2017-11-23 |
EP3309505A1 (en) | 2018-04-18 |
CN107076541A (zh) | 2017-08-18 |
JPWO2016199366A1 (ja) | 2018-04-05 |
EP3309505A4 (en) | 2018-04-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2016199366A1 (ja) | Dimension measuring device and dimension measuring method | |
US11629964B2 (en) | Navigation map updating method and apparatus and robot using the same | |
CN108474653B (zh) | Three-dimensional measuring device and measurement assistance processing method therefor | |
CN110837814B (zh) | Vehicle navigation method, device, and computer-readable storage medium | |
EP3434626A1 (en) | Projection instruction device, parcel sorting system, and projection instruction method | |
US20160364853A1 (en) | Image processing device and image processing method | |
US20210039257A1 (en) | Workpiece picking device and workpiece picking method | |
JP2005072888A (ja) | Image projection method and image projection device | |
US11488354B2 (en) | Information processing apparatus and information processing method | |
EP3434621B1 (en) | Instruction projecting device, parcel sorting system and instruction projecting method | |
EP3434622A1 (en) | Instruction projecting device, package sorting system and instruction projecting method | |
JP2013158873A (ja) | Image processing device with a function for automatically adjusting the search window | |
EP3434623B1 (en) | Projection indicator, cargo assortment system, and projection indicating method | |
KR101030317B1 (ko) | Apparatus and method for tracking obstacles using stereo vision | |
CN110816522B (zh) | Vehicle attitude control method, apparatus, and computer-readable storage medium | |
JP7111172B2 (ja) | Position detection device, position detection system, remote control device, remote control system, position detection method, and program | |
US10841559B2 (en) | Systems and methods for detecting if package walls are beyond 3D depth camera range in commercial trailer loading | |
EP3647236B1 (en) | Projection instruction device, parcel sorting system, and projection instruction method | |
WO2023213070A1 (zh) | Method, apparatus, device, and storage medium for acquiring cargo pose based on a 2D camera | |
EP3689789B1 (en) | Baggage recognition device, baggage sorting system, and baggage recognition method | |
US11145066B2 (en) | Parcel recognition device, parcel sorting system, and parcel recognition method | |
EP3434625B1 (en) | Projection instruction device, parcel sorting system, and projection instruction method | |
JP7364269B2 (ja) | Object detection device, image processing display method, and program | |
US20230039203A1 (en) | Information processing apparatus, moving body, method for controlling information processing apparatus, and recording medium | |
KR102619083B1 (ko) | Method and system for estimating the position of an artificial landmark | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16807083; Country of ref document: EP; Kind code of ref document: A1 |
| REEP | Request for entry into the european phase | Ref document number: 2016807083; Country of ref document: EP |
| WWE | Wipo information: entry into national phase | Ref document number: 2016807083; Country of ref document: EP |
| ENP | Entry into the national phase | Ref document number: 2017523100; Country of ref document: JP; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |