WO2020008538A1 - Material estimation device and robot - Google Patents
- Publication number
- WO2020008538A1 (application PCT/JP2018/025262)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- light
- reflectance
- shape
- estimating
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/47—Scattering, i.e. diffuse reflection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/55—Specular reflectivity
Definitions
- the present invention relates to a material estimation device and a robot for estimating the material of an object.
- a robot that manipulates an object grasps it after recognizing the shape and the grasping position of the object with a shape measurement sensor.
- when two or more types of objects must be distinguished and handled, the robot likewise grasps and manipulates each object after recognizing its shape and grasping position with the shape measurement sensor, but the gripping force must be changed according to the material of the object.
- Patent Document 1 proposes a method of irradiating an object to be grasped with light or radiation to estimate its material in a non-contact manner.
- however, the method disclosed in Patent Document 1 requires an analyzer such as an X-ray fluorescence analyzer or an infrared spectrophotometer, so applying it at a production site requires securing space there to install the analyzer. Securing installation space for an analyzer separately from the space for the robot lowers the density of the equipment performing the work. That is, at a production site it is difficult to estimate the material of an object by applying the method disclosed in Patent Document 1.
- the present invention has been made in view of the above, and its object is to provide a material estimating apparatus capable of estimating the material of an object without using an analyzer.
- the present invention provides a shape measurement sensor including a light projecting unit that projects a light beam onto an object and a light receiving unit that receives the light beam reflected by the object,
- a storage unit that stores position and orientation information indicating the positional and postural relationship between the light projecting unit and the light receiving unit, and reflectance material relationship information indicating the relationship between the material of the object and its reflectance,
- a shape measurement unit that generates a distance image indicating the surface shape of the object based on the position and orientation information and on image data generated from the light received by the light receiving unit,
- and a material estimating unit that estimates the material of the object based on the distance image, the position and orientation information, and the reflectance material relationship information.
- the material estimation device has the effect that the material of an object can be estimated without using an analyzer.
- FIG. 1 is a diagram showing a configuration of a material estimation device and a robot according to Embodiment 1 of the present invention.
- FIG. 2 is a diagram showing the operation of the information processing unit of the material estimation device according to the first embodiment.
- FIG. 3 is a flowchart showing the operation of the material estimating unit of the material estimating device according to Embodiment 1.
- FIG. 1 is a diagram illustrating a configuration of a material estimation device and a robot according to Embodiment 1 of the present invention.
- the material estimating device 70 includes the shape measurement sensor 2 that acquires a three-dimensional shape of the object 5 and the information processing unit 3.
- the robot 60 includes a hand 4 as an operation unit for operating the target 5 and a robot arm 1 for moving the hand 4 as the operation unit.
- the robot arm 1 moves the hand 4. Note that the hand 4 may be moved using a linear motion stage instead of the robot arm 1.
- the shape measurement sensor 2 includes a light projecting unit 21 that projects light and a light receiving unit 22 that receives reflected light of light projected from the light projecting unit 21.
- a projector or a laser slit light projecting device can be applied to the light projecting unit 21.
- a digital camera can be applied to the light receiving unit 22.
- the information processing unit 3 includes the database 31, a storage unit storing position and orientation information 311 indicating the positional and postural relationship between the light projecting unit 21 and the light receiving unit 22 of the shape measurement sensor 2, reflectance material relationship information 312 indicating the relationship between the material of the object 5 and its reflectance, and material gripping parameter relationship information 313 associating the material of the object 5 with gripping parameters.
- the information processing unit 3 further includes the shape measurement unit 32, which restores the surface shape of the object 5 from information acquired from the shape measurement sensor 2, the material estimating unit 33, which estimates the material of the object 5 from the ray path the projected light follows until it is received by the light receiving unit 22, and the control unit 34, which determines the gripping method or gripping means based on the material estimation result.
- the gripping parameters include parameters representing the force with which the hand 4 clamps the object 5, the opening/closing amount of the hand 4, the gripping position at which the hand 4 clamps the object 5, the approach direction when the hand 4 grips the object 5, the operating speed of the robot arm 1, and the motion trajectory of the robot arm 1.
- the material gripping parameter relation information 313 can be realized by a table in which material names and gripping parameters are related. Details of the processing in the information processing unit 3 will be described later.
- the hand 4 is a finger-shaped gripper that grips the target object 5 therebetween.
- the object 5 is housed in the supply box 6.
- the object 5 is not limited to a specific material. Further, the object 5 accommodated in the supply box 6 may include two or more types of objects. How to place the object 5 in the supply box 6 is not limited to a specific way.
- the placement of the object 5 in the supply box 6 can be exemplified by bulk loading and flat placement, but other placements are also possible.
- FIG. 2 is a diagram illustrating an operation of the information processing unit of the material estimation device according to the first embodiment.
- the acquired data 41 of the shape measurement sensor 2 is input to the shape measurement unit 32.
- the acquired data 41 indicates data of an image obtained by capturing the light projection pattern of the light beam emitted by the light projection unit 21 with the light receiving unit 22.
- an image data generation unit that generates image data from the output signal of the shape measurement sensor 2 may be provided separately from the shape measurement sensor 2, in which case the image data output by the image data generation unit is used as the acquired data 41.
- the shape measurement unit 32 creates the shape measurement data 42 from the acquired data 41 and the position and orientation information 311 using the principle of triangulation.
- the shape measurement data is what is called a distance image: image data in which objects closer to the light receiving unit 22 appear brighter and objects farther from it appear darker. Therefore, when the light projecting unit 21 projects the light beam onto the object 5, the distance image contains the surface shape of the object 5.
- the material estimation section 33 receives the shape measurement data 42 as the distance image, the position and orientation information 311 and the reflectance material relation information 312. A specific processing flowchart of the material estimating unit 33 will be described later.
- the material estimating unit 33 outputs material estimation data 43 which is a result of estimating the material of the object shown in the image.
- the controller 34 receives the material estimation data 43 estimated by the material estimator 33 and the material gripping parameter relation information 313 stored in the database 31.
- the control unit 34 determines a grip parameter 44 suitable for the estimated material based on the material grip parameter relation information 313 and outputs the determined grip parameter 44 to the robot 60.
- when the object 5 is formed of a soft material, it may deform under its own weight if grasped at a position away from its center of gravity, so it is preferable to grasp it at a position close to the center of gravity. Moreover, if the hand 4 collides with a soft object 5, a dent may remain on the object 5, so it is preferable that the robot arm 1 move the hand 4 at a low speed. In this way, the control unit 34 changes the operation of the robot arm 1 and the hand 4 by changing the gripping parameters based on the material estimation data 43.
- FIG. 3 is a flowchart showing the operation of the material estimating unit of the material estimating apparatus according to Embodiment 1.
- in step S1, the material estimating unit 33 estimates, from the shape measurement data 42, the three-dimensional shape of the region onto which the light projecting unit 21 projects the light beam.
- FIG. 4 is a diagram schematically illustrating the process of estimating a three-dimensional shape by the material estimating unit of the material estimating device according to the first embodiment. Since the light projecting unit 21 emits light in a specific projection pattern, the pattern projected onto the surface of an object deforms according to that surface. Accordingly, the material estimating unit 33 compares the shape of the light-dark pattern in the image indicated by the shape measurement data 42 with the projection pattern of the light emitted by the light projecting unit 21, and thereby estimates the three-dimensional surface of the region onto which the light projecting unit 21 projects the light beam.
- the shape measurement data 42 alone yields only the shape of the object surface as seen from the shape measurement sensor 2. That is, as shown in FIG. 4, when the shape measurement sensor 2 measures the distance to the object 5 in the region where the light projecting unit 21 projects the light beam, the shape measurement data 42 represents an incomplete shape in which only the visible surface of the object 5 has been extracted. The material estimating unit 33 therefore interpolates the shape measurement data 42, or fits a predefined primitive shape to it, to estimate the three-dimensional shape of the object in that region.
- the primitive shape is a geometrically simple shape such as a rectangular parallelepiped, a cube, a cylinder, a sphere, and a cone.
- if computer-aided design (CAD) data of the object 5 is stored in the database 31, the CAD data may be fitted instead to estimate the three-dimensional shape of the object 5.
- in this way, the material estimating unit 33 matches predefined shape data, such as a primitive shape or CAD data, against the shape measurement data 42 to estimate the three-dimensional shape of the object in the region where the light projecting unit 21 projects the light beam.
- FIG. 4 shows, by broken lines, the estimation result of the three-dimensional shape of the object existing in the area where the light projecting unit 21 projects the light beam.
- although the target object 5 shown in FIG. 4 is a solid object, the three-dimensional shape estimated by the material estimating unit 33 is only an outer shape. That is, whether the object 5 is solid or hollow, the estimation result of the three-dimensional shape by the material estimating unit 33 is the same.
- in step S2, the material estimating unit 33 estimates the reflection state on the surface of the three-dimensional shape of the object in the region where the light projecting unit 21 projects the light beam.
- estimating the reflection state on the surface of the three-dimensional shape means determining, based on the position and orientation information 311 and the three-dimensional shape, whether each point on that surface corresponds to a regular reflection point 10, a diffuse reflection point 11, or a secondary reflection point 12, which are defined below.
- the regular reflection point 10 is also called a specular reflection point.
- FIG. 5 is a diagram illustrating a definition of a specular reflection point in a process of estimating a reflection state of a three-dimensional surface by the material estimating unit of the material estimating device according to the first embodiment.
- when a light beam emitted from the light projecting unit 21 is specularly reflected at a point on the three-dimensional shape and enters the light receiving unit 22, that point is defined as a regular reflection point 10.
- the positions that can be regular reflection points 10 can be identified from the distance and posture of the light projecting unit 21 and the light receiving unit 22.
- FIG. 6 is a diagram showing the definition of the diffuse reflection point in the process of estimating the reflection state of the three-dimensional surface by the material estimating unit of the material estimating device according to the first embodiment.
- a point where diffuse reflection occurs when the diffuse reflection component of the light beam emitted from the light projecting unit 21 enters the light receiving unit 22 is defined as a diffuse reflection point 11.
- when a light beam from the light projecting unit 21 is diffusely reflected and enters the light receiving unit 22, the light entering the light receiving unit 22 has an intensity equal to the reference value. Therefore, when light of intensity equal to the reference value enters the light receiving unit 22, the point can be judged to be a diffuse reflection point 11.
- FIG. 7 is a diagram showing the definition of the secondary reflection point in the process of estimating the reflection state of the three-dimensional surface by the material estimating unit of the material estimating device according to the first embodiment.
- when a light beam emitted from the light projecting unit 21 is specularly reflected twice on the three-dimensional shape and the reflected light finally enters the light receiving unit 22, the point at which the last specular reflection occurs is defined as a secondary reflection point 12.
- when secondarily reflected light enters the light receiving unit 22, its intensity is lower than the reference value.
- the light rays reflected twice or more in the three-dimensional shape are greatly attenuated.
- reflected light that is specularly reflected three or more times on the three-dimensional shape, that is, reflected light of the third or higher order, is unlikely to affect the material estimation result even if ignored.
- consideration of third and higher order reflected light is therefore omitted.
- FIG. 8 is a diagram showing the definition of points that are both regular reflection points and secondary reflection points in the process of estimating the reflection state of the three-dimensional surface by the material estimation unit of the material estimation device according to Embodiment 1.
- depending on the three-dimensional shape, both the light reflected at a regular reflection point 10 and secondary reflected light may enter the light receiving unit 22. That is, there can be a point that is both a regular reflection point 10 and a secondary reflection point 12.
- in that case, the light entering the light receiving unit 22 has an intensity exceeding the reference value.
- therefore, when light reflected at a point on the surface of the three-dimensional shape enters the light receiving unit 22 with an intensity below the reference value or above it, that point can be judged to be a secondary reflection point 12.
- the reference value may also be given a range. That is, values between a lower threshold and an upper threshold count as the reference value, and a point may be judged to be a secondary reflection point 12 when the intensity of the light entering the light receiving unit 22 exceeds the upper threshold or falls below the lower threshold.
- in step S3, the material estimating unit 33 estimates the diffuse reflectance at the diffuse reflection points 11 on the surface of the three-dimensional shape.
- when a point on the surface corresponds to a diffuse reflection point 11, the diffuse reflectance can be estimated from the light intensity and ray direction using a model of the light reflected by the object. That is, the intensity of the light reaching the diffuse reflection point 11 from the light projecting unit 21 is determined from the positional relationship between the light projecting unit 21 and the diffuse reflection point 11 and from the projection pattern of the light emitted by the light projecting unit 21.
- the diffuse reflectance can be estimated by dividing the intensity of the light beam incident on the light receiving unit 22 by the intensity of the light beam reaching the diffuse reflection point 11 from the light projecting unit 21. In the case where a point on the surface of the three-dimensional shape corresponds to the diffuse reflection point 11, if the diffuse reflectance can be estimated, the specular reflectance can also be estimated from the light intensity and the light ray direction.
- in step S4, the material estimating unit 33 estimates the specular reflectance on the surface of the three-dimensional shape.
- the specular reflectance is also referred to as a regular reflectance.
- similarly, when a point on the three-dimensional shape corresponds to a regular reflection point 10, the specular reflectance can be estimated from the light intensity and ray direction.
- in that case the diffuse reflectance must be known, but the specular reflectance can be estimated by using the diffuse reflectance estimated at the diffuse reflection points 11 around the regular reflection point 10.
- in step S5, the material estimating unit 33 interpolates the regular reflection points 10, diffuse reflection points 11 and secondary reflection points 12.
- in general, the regular reflection points 10, diffuse reflection points 11 and secondary reflection points 12 are distributed over regions of a certain size on the surface of the object 5. Moreover, due to disturbances, the intensity of light reflected at a given point and entering the light receiving unit 22 may rise above or fall below the reference value. A small reflection point of a different kind may therefore be estimated to exist inside a region of regular reflection points 10, diffuse reflection points 11 or secondary reflection points 12.
- the material estimating unit 33 therefore performs interpolation: a region whose size is at most a threshold and which is surrounded by regular reflection points 10, diffuse reflection points 11 or secondary reflection points 12 is treated as the same kind of point as the region surrounding it, and its reflectance is assumed to equal the reflectance of the surrounding points.
- in step S6, the material estimating unit 33 estimates object regions based on the three-dimensional shape and the reflectances of the objects in the region where the light projecting unit 21 projects the light beam. That is, the material estimating unit 33 estimates where in that region the target objects 5 exist.
- when objects 5 of different materials are mixed, their diffuse reflectances and specular reflectances differ from one another.
- the diffuse reflectance and specular reflectance of the supply box 6 containing the objects 5 also differ from those of the objects 5. Separate objects 5 of the same material have the same diffuse and specular reflectances, but their shapes are discontinuous.
- therefore, the material estimating unit 33 estimates a region in which the shape, diffuse reflectance and specular reflectance change smoothly to be one object.
- “the change is smooth” means that at two points corresponding to adjacent pixels in the distance image as the shape measurement data 42, a difference in position or a difference in reflectance is equal to or smaller than a threshold.
- the material estimating unit 33 also estimates a secondary reflection region surrounded by a region where the shape, diffuse reflectance and specular reflectance change smoothly to belong to the same object. Through estimation based on such specific conditions, or through region segmentation based on machine learning, the material estimating unit 33 can recognize as a single object region an object whose estimate was split into several three-dimensional shapes.
- in step S7, the material estimating unit 33 estimates the material of each object region based on the reflectance material relationship information 312.
- the material estimating unit 33 estimates the material of the object from the reflectance material relation information 312 stored in the database 31 and the diffuse reflectance and the specular reflectance for each estimated object region. That is, the material estimation unit 33 estimates the material associated with the reflectance at the diffuse reflection point in the reflectance material relationship information 312 as the material of the target object 5.
- the material estimation device 70 according to the first embodiment can estimate the material of an object using the shape measurement sensor already used for object manipulation, without using an expensive special-purpose sensor. Therefore, it is possible to control operations according to various types of objects.
- the material estimating device 70 according to the first embodiment estimates the material of an object using the shape measurement sensor 2 having the light projecting unit 21 and the light receiving unit 22 that is usually used for the robot 60. This makes it possible to estimate the material of various types of objects with an inexpensive and realistic device configuration without using an analyzer such as a fluorescent X-ray analyzer or an infrared spectrophotometer.
- Embodiment 2. FIG. 9 is a diagram showing a configuration of a material estimation device according to Embodiment 2 of the present invention.
- the robot 60 according to the second embodiment is different from the robot 60 according to the first embodiment in that the suction hand 13 is provided on the robot arm 1.
- the database 31 stores data in which the type of the operation unit and the material of the target object 5 are associated with each other.
- which operation unit is suitable for holding the object 5 depends on the material of the object 5. Therefore, by estimating the material of the object 5 in the same manner as the material estimating device 70 according to the first embodiment, the operation unit can be chosen appropriately for the material of the object 5. More specifically, if the target object 5 is a flexible object, it is easier to pick it up by suction with the suction hand 13 than to clamp it with the hand 4. Conversely, if the target object 5 has a mesh-like shape, it is easier to grip it by clamping it with the hand 4 than by suction with the suction hand 13.
- based on the estimation result of the material of the target object 5, the material estimating device 70 can switch between clamping the object 5 with the hand 4 and picking it up by suction with the suction hand 13, so the object 5 can be gripped efficiently.
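As an illustration of this switching, a minimal sketch is given below that maps an estimated material name to an operation unit. The material names and the mapping are hypothetical; the patent only states that the database 31 associates operation unit types with materials, with flexible objects favoring suction and mesh-like objects favoring clamping.

```python
# Hypothetical mapping from estimated material to operation unit.
# The concrete material names and choices are illustrative only; the
# text states that flexible objects are easier to pick up by suction
# (suction hand 13) and mesh-like objects by clamping (hand 4).
EFFECTOR_BY_MATERIAL = {
    "cloth": "suction_hand_13",
    "sponge": "suction_hand_13",
    "wire_mesh": "finger_hand_4",
    "sheet_metal": "finger_hand_4",
}

def select_operation_unit(estimated_material: str) -> str:
    """Return the operation unit suited to the estimated material,
    defaulting to the finger hand when the material is unknown."""
    return EFFECTOR_BY_MATERIAL.get(estimated_material, "finger_hand_4")
```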
- Embodiment 3. FIG. 10 is a diagram showing a configuration of a material estimation device according to Embodiment 3 of the present invention.
- the material estimating device 70 according to the third embodiment is different from the material estimating device 70 according to the first embodiment in that the information processing unit 3 is connected to the shape measurement sensor 2 via the network 50.
- the material estimation device 70 according to the third embodiment allows the information processing unit 3 to be installed separately from the robot arm 1.
- An increase in the installation space for the information processing unit 3 at the production site where the robot arm 1 is installed hinders an improvement in productivity.
- in the material estimation device 70 according to the third embodiment, installing the information processing unit 3 away from the robot arm 1 frees up work space at the production site, so productivity can be improved.
- FIG. 11 is a diagram showing a configuration of a modification of the material estimating device according to the third embodiment.
- the shape measurement sensors 2 are installed on the robot arms 1 of the plurality of robots 60, and the information processing unit 3 is connected to the plurality of shape measurement sensors 2 via the network 50.
- as described above, as the amount of information stored in the database 31 increases, the accuracy of material estimation improves. Therefore, by installing one information processing unit 3 in common for a plurality of robots 60, the gripping parameters can be changed based on highly accurate material estimation results.
- furthermore, by updating the relationship between materials and gripping parameters, or by changing the estimation method of the material estimating unit 33, higher-performance operation can be realized.
- the function of the information processing unit 3 according to the first, second, or third embodiment is realized by a processing circuit.
- the processing circuit may be dedicated hardware or an arithmetic device that executes a program stored in a storage device.
- FIG. 12 is a diagram illustrating a configuration in which the function of the information processing unit according to the first, second, or third embodiment is realized by hardware.
- the processing circuit 29 incorporates a logic circuit 29a that realizes the function of the information processing unit 3.
- the hardware that implements the processing circuit 29 can be exemplified by a microcontroller.
- when the processing circuit 29 is an arithmetic device, the function of the information processing unit 3 is realized by software, firmware, or a combination of software and firmware.
- FIG. 13 is a diagram illustrating a configuration in which the function of the information processing unit according to the first, second, or third embodiment is realized by software.
- the processing circuit 29 has a central processing unit 291 that executes the program 29b, a random access memory 292 used by the central processing unit 291 as a work area, and a storage device 293 that stores the program 29b.
- the function of the information processing unit 3 is realized by the central processing unit 291 loading the program 29b stored in the storage device 293 into the random access memory 292 and executing it.
- the software or firmware is described in a programming language and stored in the storage device 293.
- the processing circuit 29 implements the function of the information processing unit 3 by reading and executing the program 29b stored in the storage device 293. It can be said that the program 29b causes a computer to execute a procedure and a method for realizing the function of the information processing unit 3.
- processing circuit 29 may be partially realized by dedicated hardware and partially realized by software or firmware.
- the processing circuit 29 can realize the above-described functions by hardware, software, firmware, or a combination thereof.
Landscapes
- Engineering & Computer Science (AREA)
- Immunology (AREA)
- General Physics & Mathematics (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Pathology (AREA)
- General Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- Biochemistry (AREA)
- Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Investigating Or Analysing Materials By Optical Means (AREA)
- Manipulator (AREA)
Abstract
A material estimation device (70) comprising: a shape measurement sensor (2) that comprises a projection unit (21) for projecting a beam of light onto an object (5), and a light reception unit (22) for receiving a beam of light reflected by the object (5); a database (31) that stores position and orientation information (311) indicating the relationship between the light projection unit (21) and the light reception unit (22) in terms of position and orientation, and reflectance material relationship information (312) indicating the relationship between the material and reflectance of the object (5); a shape measurement unit (32) that, on the basis of the position and orientation information (311) and of image data generated on the basis of the beam of light received by the light reception unit (22), generates a distance image indicating the surface shape of the object (5); and a material estimation unit (33) that estimates the material of the object (5) on the basis of the distance image, the position and orientation information (311), and the reflectance material relationship information (312).
Description
The present invention relates to a material estimation device and a robot for estimating the material of an object.
A robot that manipulates an object grasps it after recognizing the shape and the grasping position of the object with a shape measurement sensor. When two or more types of objects must be distinguished and handled, the robot likewise grasps and manipulates each object after recognizing its shape and grasping position with the shape measurement sensor, but the gripping force must be changed according to the material of the object.
Patent Document 1 proposes a method of irradiating an object to be grasped with light or radiation to estimate its material in a non-contact manner.
However, since the method disclosed in Patent Document 1 requires an analyzer such as an X-ray fluorescence analyzer or an infrared spectrophotometer, applying it at a production site requires securing space there to install the analyzer. Securing installation space for an analyzer separately from the space for the robot lowers the density of the equipment performing the work. That is, at a production site it is difficult to estimate the material of an object by applying the method disclosed in Patent Document 1.
The present invention has been made in view of the above, and its object is to provide a material estimating apparatus capable of estimating the material of an object without using an analyzer.
In order to solve the above problems and achieve the object, the present invention provides a shape measurement sensor including a light projecting unit that projects a light beam onto an object and a light receiving unit that receives the light beam reflected by the object, and a storage unit that stores position and orientation information indicating the positional and postural relationship between the light projecting unit and the light receiving unit, and reflectance material relationship information indicating the relationship between the material of the object and its reflectance. The present invention further provides a shape measurement unit that generates a distance image indicating the surface shape of the object based on the position and orientation information and on image data generated from the light received by the light receiving unit, and a material estimating unit that estimates the material of the object based on the distance image, the position and orientation information, and the reflectance material relationship information.
The material estimation device according to the present invention has the effect that the material of an object can be estimated without using an analyzer.
Hereinafter, a material estimation device and a robot according to an embodiment of the present invention will be described in detail with reference to the drawings. It should be noted that the present invention is not limited by the embodiment.
Embodiment 1.
FIG. 1 is a diagram illustrating the configuration of the material estimation device and the robot according to Embodiment 1 of the present invention. The material estimating device 70 includes the shape measurement sensor 2, which acquires the three-dimensional shape of the object 5, and the information processing unit 3. The robot 60 includes the hand 4, an operation unit for manipulating the object 5, and the robot arm 1, which moves the hand 4.
The robot arm 1 moves the hand 4. Note that the hand 4 may be moved using a linear motion stage instead of the robot arm 1.
The shape measurement sensor 2 includes a light projecting unit 21 that projects light and a light receiving unit 22 that receives the reflected light of the light projected from the light projecting unit 21. A projector or a laser slit light projector can be used as the light projecting unit 21, and a digital camera can be used as the light receiving unit 22. When the positional and postural relationship between the light projecting unit 21 and the light receiving unit 22 is known, the distance from the shape measurement sensor 2 to the object 5 can be obtained by the principle of triangulation: the light from the light projecting unit 21 reflected by the object 5 is received by the light receiving unit 22.
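As a minimal sketch of this triangulation geometry, assuming the light projecting unit and the light receiving unit are separated by a known baseline and the outgoing and incoming ray angles are measured from that baseline (the patent does not fix a parameterization), the depth of a surface point is the intersection of the two rays:

```python
import numpy as np

def triangulate_depth(baseline_m, proj_angle_rad, cam_angle_rad):
    """Depth of a surface point by active triangulation.

    The projector and camera sit at the two ends of a baseline of
    length `baseline_m`; `proj_angle_rad` and `cam_angle_rad` are the
    angles of the projected and received rays measured from the
    baseline. The depth z satisfies
        z / tan(proj_angle) + z / tan(cam_angle) = baseline.
    """
    return baseline_m / (1.0 / np.tan(proj_angle_rad) + 1.0 / np.tan(cam_angle_rad))

# Example: 100 mm baseline, rays at 60 and 70 degrees from the baseline.
print(triangulate_depth(0.1, np.radians(60.0), np.radians(70.0)))
```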
The information processing unit 3 includes: the database 31, a storage unit storing position and orientation information 311 indicating the positional and postural relationship between the light projecting unit 21 and the light receiving unit 22 of the shape measurement sensor 2, reflectance material relationship information 312 indicating the relationship between the material of the object 5 and its reflectance, and material gripping parameter relationship information 313 associating the material of the object 5 with gripping parameters; the shape measurement unit 32, which restores the surface shape of the object 5 from information acquired from the shape measurement sensor 2; the material estimating unit 33, which estimates the material of the object 5 from the ray path the light projected by the light projecting unit 21 follows until it is received by the light receiving unit 22; and the control unit 34, which determines the gripping method or gripping means based on the material estimation result. The gripping parameters include parameters representing the force with which the hand 4 clamps the object 5, the opening/closing amount of the hand 4, the gripping position at which the hand 4 clamps the object 5, the approach direction when the hand 4 grips the object 5, the operating speed of the robot arm 1, and the motion trajectory of the robot arm 1. The material gripping parameter relationship information 313 can be realized as a table relating material names to gripping parameters. Details of the processing in the information processing unit 3 will be described later.
The hand 4 is a finger-shaped gripper that grips the target object 5 therebetween.
The object 5 is housed in the supply box 6. The object 5 is not limited to a specific material. Further, the object 5 accommodated in the supply box 6 may include two or more types of objects. How to place the object 5 in the supply box 6 is not limited to a specific way. The placement of the object 5 in the supply box 6 can be exemplified by bulk loading and flat placement, but other placements are also possible.
FIG. 2 is a diagram illustrating the operation of the information processing unit of the material estimation device according to the first embodiment. The acquired data 41 of the shape measurement sensor 2 is input to the shape measurement unit 32. The acquired data 41 is the data of an image in which the light receiving unit 22 has captured the projection pattern of the light emitted by the light projecting unit 21. Alternatively, an image data generation unit that generates image data from the output signal of the shape measurement sensor 2 may be provided separately from the shape measurement sensor 2, and the image data it outputs may be used as the acquired data 41. The shape measurement unit 32 creates the shape measurement data 42 from the acquired data 41 and the position and orientation information 311 using the principle of triangulation. The shape measurement data is what is called a distance image: image data in which objects closer to the light receiving unit 22 appear brighter and objects farther from it appear darker. Therefore, when the light projecting unit 21 projects the light beam onto the object 5, the distance image contains the surface shape of the object 5.
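A minimal sketch of rendering measured depths as such a distance image follows, assuming a dense depth map in meters and a linear brightness mapping (the patent does not specify the encoding):

```python
import numpy as np

def depth_to_distance_image(depth_m, d_min, d_max):
    """Render a depth map as a distance image: pixels closer to the
    light receiving unit appear brighter, farther pixels darker.
    Assumes d_min < d_max; invalid depths (zero/NaN) render as black."""
    img = np.zeros(depth_m.shape, dtype=np.uint8)
    valid = np.isfinite(depth_m) & (depth_m > 0.0)
    clipped = np.clip(depth_m, d_min, d_max)
    # Linear map: d_min -> 255 (bright), d_max -> 0 (dark).
    img[valid] = (255.0 * (d_max - clipped[valid]) / (d_max - d_min)).astype(np.uint8)
    return img
```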
The material estimating unit 33 receives the shape measurement data 42 (the distance image), the position and orientation information 311, and the reflectance material relationship information 312. A specific processing flowchart of the material estimating unit 33 is described later. The material estimating unit 33 outputs the material estimation data 43, the result of estimating the material of the objects shown in the image.
The control unit 34 receives the material estimation data 43 estimated by the material estimating unit 33 and the material gripping parameter relationship information 313 stored in the database 31. The control unit 34 determines gripping parameters 44 suited to the estimated material based on the material gripping parameter relationship information 313 and outputs them to the robot 60.
When the object 5 is formed of a soft material, it may deform under its own weight if grasped at a position away from its center of gravity, so it is preferable to grasp it at a position close to the center of gravity. Furthermore, when the object 5 is formed of a soft material, a dent may remain on the object 5 if the hand 4 collides with it, so it is preferable that the robot arm 1 move the hand 4 at a low speed. In this way, the control unit 34 changes the operation of the robot arm 1 and the hand 4 by changing the gripping parameters based on the material estimation data 43.
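A minimal sketch of such a material-to-gripping-parameter table and lookup is shown below, corresponding to the control unit 34 consulting the material gripping parameter relationship information 313. All material names and numeric values here are hypothetical placeholders, not values from the patent:

```python
from dataclasses import dataclass

@dataclass
class GripParams:
    force_n: float          # clamping force applied by the hand 4
    opening_mm: float       # opening/closing amount of the hand 4
    approach: str           # approach direction toward the object 5
    arm_speed_mm_s: float   # operating speed of the robot arm 1

# Hypothetical table entries; the patent only states that the table
# relates material names to gripping parameters.
MATERIAL_GRIP_TABLE = {
    "metal":  GripParams(force_n=20.0, opening_mm=40.0, approach="top", arm_speed_mm_s=300.0),
    "rubber": GripParams(force_n=5.0, opening_mm=45.0, approach="top", arm_speed_mm_s=100.0),
}

def select_grip_params(estimated_material: str) -> GripParams:
    """Look up gripping parameters 44 for the estimated material."""
    return MATERIAL_GRIP_TABLE[estimated_material]
```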
FIG. 3 is a flowchart showing the operation of the material estimating unit of the material estimating device according to Embodiment 1. In step S1, the material estimating unit 33 estimates, from the shape measurement data 42, the three-dimensional shape of the region onto which the light projecting unit 21 projects the light beam.
FIG. 4 is a diagram schematically illustrating the process of estimating a three-dimensional shape by the material estimating unit of the material estimating device according to the first embodiment. Since the light projecting unit 21 emits light in a specific projection pattern, the pattern projected onto the surface of an object deforms according to that surface. Accordingly, the material estimating unit 33 compares the shape of the light-dark pattern in the image indicated by the shape measurement data 42 with the projection pattern of the light emitted by the light projecting unit 21, and thereby estimates the three-dimensional surface of the region onto which the light projecting unit 21 projects the light beam.
As described above, the shape measurement data 42 alone yields only the shape of the object surface as seen from the shape measurement sensor 2. That is, as shown in FIG. 4, when the shape measurement sensor 2 measures the distance to the object 5 in the region where the light projecting unit 21 projects the light beam, the shape measurement data 42 represents an incomplete shape in which only the visible surface of the object 5 has been extracted. The material estimating unit 33 therefore interpolates the shape measurement data 42, or fits a predefined primitive shape to it, to estimate the three-dimensional shape of the object in that region. A primitive shape is a geometrically simple shape such as a rectangular parallelepiped, cube, cylinder, sphere or cone. If computer-aided design (CAD) data of the object 5 is stored in the database 31, the CAD data may be fitted instead to estimate the three-dimensional shape of the object 5. In this way, the material estimating unit 33 matches predefined shape data, such as a primitive shape or CAD data, against the shape measurement data 42 to estimate the three-dimensional shape of the object in the region where the light projecting unit 21 projects the light beam. In FIG. 4, the estimated three-dimensional shape is shown by broken lines. Although the object 5 shown in FIG. 4 is a solid object, the three-dimensional shape estimated by the material estimating unit 33 is only an outer shape; that is, whether the object 5 is solid or hollow, the estimation result is the same.
In step S2, the material estimating unit 33 estimates the reflection state on the surface of the three-dimensional shape of the object in the region where the light projecting unit 21 projects the light beam. Estimating the reflection state on the surface of the three-dimensional shape means determining, based on the position and orientation information 311 and the three-dimensional shape, whether each point on that surface corresponds to a regular reflection point 10, a diffuse reflection point 11 or a secondary reflection point 12, which are defined below. The regular reflection point 10 is also called a specular reflection point.
FIG. 5 is a diagram illustrating the definition of a regular reflection point in the process of estimating the reflection state of the three-dimensional surface by the material estimating unit of the material estimating device according to the first embodiment. When a light beam emitted from the light projecting unit 21 is specularly reflected at a point on the surface of the three-dimensional shape and enters the light receiving unit 22, that point is defined as a regular reflection point 10. The positions that can be regular reflection points 10 can be identified from the distance and posture of the light projecting unit 21 and the light receiving unit 22.
FIG. 6 is a diagram showing the definition of a diffuse reflection point in the process of estimating the reflection state of the three-dimensional surface by the material estimating unit of the material estimating device according to the first embodiment. A point on the surface of the three-dimensional shape at which diffuse reflection occurs, such that the diffusely reflected component of the light emitted from the light projecting unit 21 enters the light receiving unit 22, is defined as a diffuse reflection point 11. When light from the light projecting unit 21 is diffusely reflected and enters the light receiving unit 22, the light entering the light receiving unit 22 has an intensity equal to the reference value. Therefore, when light of intensity equal to the reference value enters the light receiving unit 22, the point can be judged to be a diffuse reflection point 11.
FIG. 7 is a diagram showing the definition of a secondary reflection point in the process of estimating the reflection state of the three-dimensional surface by the material estimating unit of the material estimating device according to the first embodiment. When a light beam emitted from the light projecting unit 21 is specularly reflected twice on the three-dimensional shape and the reflected light finally enters the light receiving unit 22, the point at which the last specular reflection occurs is defined as a secondary reflection point 12. When secondarily reflected light enters the light receiving unit 22, its intensity is lower than the reference value. Light reflected twice or more on the three-dimensional shape is greatly attenuated, so reflected light that is specularly reflected three or more times, that is, reflected light of the third or higher order, is unlikely to affect the material estimation result even if ignored. In Embodiment 1, third and higher order reflected light is therefore not considered.
FIG. 8 is a diagram showing the definition of points that are both regular reflection points and secondary reflection points in the process of estimating the reflection state of the three-dimensional surface by the material estimating unit of the material estimating device according to Embodiment 1. Depending on the three-dimensional shape, both the light reflected at a regular reflection point 10 and secondary reflected light may enter the light receiving unit 22; that is, there can be a point that is both a regular reflection point 10 and a secondary reflection point 12. When both enter the light receiving unit 22, the light entering the light receiving unit 22 has an intensity exceeding the reference value. Therefore, when light reflected at a point on the surface of the three-dimensional shape enters the light receiving unit 22 with an intensity below the reference value or above it, that point can be judged to be a secondary reflection point 12. The reference value may also be given a range: values between a lower threshold and an upper threshold count as the reference value, and a point may be judged to be a secondary reflection point 12 when the intensity of the light entering the light receiving unit 22 exceeds the upper threshold or falls below the lower threshold.
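A minimal sketch of this intensity test follows, assuming per-point received intensities and a reference band given by the lower and upper thresholds; purely regular reflection points are located geometrically from the position and orientation information 311 and are not covered by this test:

```python
def classify_reflection_point(received_intensity, ref_lower, ref_upper):
    """Classify a surface point from the intensity received at the
    light receiving unit 22. Within the reference band: diffuse
    reflection point. Below the band: secondary reflection point.
    Above the band: regular reflection combined with secondary
    reflection (FIG. 8)."""
    if received_intensity < ref_lower:
        return "secondary"
    if received_intensity > ref_upper:
        return "regular+secondary"
    return "diffuse"
```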
In step S3, the material estimating unit 33 estimates the diffuse reflectance at the diffuse reflection points 11 on the surface of the three-dimensional shape. When a point on the surface of the three-dimensional shape corresponds to a diffuse reflection point 11, the diffuse reflectance can be estimated from the light intensity and ray direction using a model of the light reflected by the object. That is, the intensity of the light reaching the diffuse reflection point 11 from the light projecting unit 21 is determined from the positional relationship between the light projecting unit 21 and the diffuse reflection point 11 and from the projection pattern of the light emitted by the light projecting unit 21. The diffuse reflectance can then be estimated by dividing the intensity of the light incident on the light receiving unit 22 by the intensity of the light reaching the diffuse reflection point 11 from the light projecting unit 21. For a point corresponding to a diffuse reflection point 11, once the diffuse reflectance has been estimated, the specular reflectance can also be estimated from the light intensity and ray direction.
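The core of this step is a ratio of intensities. A minimal sketch, assuming the incident intensity at the point has already been derived from the projector-point geometry and the projection pattern:

```python
def estimate_diffuse_reflectance(incident_intensity, received_intensity):
    """Diffuse reflectance at a diffuse reflection point 11: intensity
    arriving at the light receiving unit 22 divided by the intensity
    reaching the point from the light projecting unit 21."""
    if incident_intensity <= 0.0:
        raise ValueError("incident intensity must be positive")
    return received_intensity / incident_intensity
```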
In step S4, the material estimating unit 33 estimates the specular reflectance on the surface of the three-dimensional shape. The specular reflectance is also referred to as the regular reflectance. Similarly, when a point on the three-dimensional shape corresponds to a regular reflection point 10, the specular reflectance can be estimated from the light intensity and ray direction. In that case the diffuse reflectance must be known, but the specular reflectance can be estimated by using the diffuse reflectance estimated at the diffuse reflection points 11 around the regular reflection point 10.
In step S5, the material estimating unit 33 interpolates the regular reflection points 10, diffuse reflection points 11 and secondary reflection points 12. In general, these points are distributed over regions of a certain size on the surface of the object 5. Moreover, due to disturbances, the intensity of light reflected at a given point and entering the light receiving unit 22 may rise above or fall below the reference value. A small reflection point of a different kind may therefore be estimated to exist inside a region of regular reflection points 10, diffuse reflection points 11 or secondary reflection points 12. The material estimating unit 33 therefore performs interpolation: a region whose size is at most a threshold and which is surrounded by regular reflection points 10, diffuse reflection points 11 or secondary reflection points 12 is treated as the same kind of point as the region surrounding it, and its reflectance is assumed to equal the reflectance of the surrounding points.
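A minimal sketch of this interpolation on a per-pixel label image of reflection types, assuming SciPy for connected-component analysis; the size threshold and the label encoding are assumptions for illustration:

```python
import numpy as np
from scipy import ndimage

def interpolate_small_regions(labels, max_size):
    """Relabel small patches as in step S5: a connected patch of one
    reflection type whose area is at most `max_size`, surrounded
    entirely by a single other type, is absorbed into that type (and,
    per the text, would inherit the surrounding reflectance).
    `labels` is a 2-D integer image of reflection-type codes."""
    out = labels.copy()
    for value in np.unique(labels):
        components, count = ndimage.label(labels == value)
        for i in range(1, count + 1):
            patch = components == i
            if patch.sum() > max_size:
                continue
            border = ndimage.binary_dilation(patch) & ~patch
            neighbors = np.unique(out[border])
            if len(neighbors) == 1 and neighbors[0] != value:
                out[patch] = neighbors[0]
    return out
```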
In step S6, the material estimating unit 33 estimates object regions based on the three-dimensional shape and the reflectance of the objects present in the region onto which the light projecting unit 21 projects rays. That is, the material estimating unit 33 estimates where in the projection region the objects 5 are located. When objects 5 of different materials are mixed, their diffuse and specular reflectances differ from one another, and the diffuse and specular reflectances of the supply box 6 holding the objects 5 differ from those of the objects 5. Separate objects 5 of the same material have the same diffuse and specular reflectance but discontinuous shapes. The material estimating unit 33 therefore estimates a region in which the shape, the diffuse reflectance, and the specular reflectance all vary smoothly to be a single object. Here, "varies smoothly" means that, for two points corresponding to adjacent pixels in the distance image constituting the shape measurement data 42, the difference in position or in reflectance is no greater than a threshold. The material estimating unit 33 also treats a secondary reflection region enclosed by such a smoothly varying region as part of the same object. Through estimation based on such specific conditions, or through methods such as segmenting regions with machine learning, the material estimating unit 33 can merge a single object that was initially estimated as several separate three-dimensional shapes into one object region.
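The "specific conditions" variant of this segmentation can be sketched as a flood fill that joins adjacent pixels while the positional and reflectance differences stay within thresholds; the array layouts and tolerance parameters below are assumptions introduced for illustration.

```python
import numpy as np
from collections import deque

def segment_objects(points, diffuse, specular, pos_tol, refl_tol):
    """Group distance-image pixels into object regions, treating adjacent
    pixels as 'smoothly varying' when the positional and reflectance
    differences both stay within thresholds.
    points: (H, W, 3) 3-D positions; diffuse/specular: (H, W) reflectance maps."""
    h, w = diffuse.shape
    region = np.full((h, w), -1, dtype=int)
    next_id = 0
    for y in range(h):
        for x in range(w):
            if region[y, x] != -1:
                continue
            region[y, x] = next_id
            queue = deque([(y, x)])
            while queue:
                cy, cx = queue.popleft()
                for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                    if not (0 <= ny < h and 0 <= nx < w) or region[ny, nx] != -1:
                        continue
                    smooth = (
                        np.linalg.norm(points[cy, cx] - points[ny, nx]) <= pos_tol
                        and abs(diffuse[cy, cx] - diffuse[ny, nx]) <= refl_tol
                        and abs(specular[cy, cx] - specular[ny, nx]) <= refl_tol
                    )
                    if smooth:
                        region[ny, nx] = next_id
                        queue.append((ny, nx))
            next_id += 1
    return region
```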
In step S7, the material estimating unit 33 estimates the material of each object region based on the reflectance material relationship information 312. The material estimating unit 33 estimates the material of each object from the reflectance material relationship information 312 stored in the database 31 together with the diffuse and specular reflectance of each estimated object region. That is, the material estimating unit 33 estimates, as the material of the object 5, the material associated in the reflectance material relationship information 312 with the reflectance at the diffuse reflection points.
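The lookup itself can be as simple as nearest-neighbour matching against the stored table. The sketch below uses invented table values and a Euclidean matching rule, neither of which the document prescribes.

```python
# Placeholder reflectance-material relationship entries; the numbers and
# material names are invented for illustration, not data from the patent.
REFLECTANCE_MATERIAL_TABLE = [
    # (diffuse reflectance, specular reflectance, material)
    (0.70, 0.05, "paper"),
    (0.35, 0.60, "plastic"),
    (0.10, 0.90, "metal"),
]

def estimate_material(diffuse, specular, table=REFLECTANCE_MATERIAL_TABLE):
    """Return the material whose stored reflectance pair lies closest to the
    estimated pair (nearest-neighbour matching is an assumption; the document
    only says the reflectance is related to a material)."""
    return min(table,
               key=lambda row: (row[0] - diffuse) ** 2 + (row[1] - specular) ** 2)[2]
```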
The material estimation device 70 according to the first embodiment can estimate the material of an object without an expensive, specialized sensor, using the shape measurement sensor already employed for object manipulation; operations can therefore be controlled according to objects of various kinds. Because the device 70 estimates the material with the shape measurement sensor 2, which has the light projecting unit 21 and the light receiving unit 22 and is commonly mounted on the robot 60, the materials of various kinds of objects can be estimated with an inexpensive and practical configuration, without an analyzer such as a fluorescent X-ray analyzer or an infrared spectrophotometer.
Embodiment 2.
FIG. 9 is a diagram showing the configuration of a material estimation device according to Embodiment 2 of the present invention. The robot 60 according to the second embodiment differs from the robot 60 according to the first embodiment in that a suction hand 13 is mounted on the robot arm 1. In the material estimation device 70 according to the second embodiment, the database 31 stores data associating the type of operation unit with the material of the object 5.
Whether an operation unit that manipulates the object 5 is suitable for holding it depends on the material of the object 5. By estimating the material of the object 5 in the same way as the material estimation device 70 according to the first embodiment, the operation unit can be chosen to suit that material. Specifically, if the object 5 is flexible, it is easier to hold by suction with the suction hand 13 than by pinching with the hand 4; if the object 5 is mesh-like, it is easier to hold by pinching with the hand 4 than by suction with the suction hand 13.
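As a hedged sketch of this selection logic, the mapping below mirrors the flexible-object and mesh examples just given; the material names, hand identifiers, and mapping itself are invented for illustration, not taken from the patent.

```python
# Hypothetical material-to-end-effector mapping for embodiment 2.
MATERIAL_TO_HAND = {
    "cloth": "suction_hand",      # flexible objects are easier to pick by suction
    "rubber": "suction_hand",
    "wire_mesh": "gripper_hand",  # mesh parts leak air, so pinch instead
    "metal": "gripper_hand",
}

def select_operation_unit(material, default="gripper_hand"):
    """Return which operation unit to use for the estimated material."""
    return MATERIAL_TO_HAND.get(material, default)
```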
Based on the result of estimating the material of the object 5, the material estimation device 70 according to the second embodiment can switch between gripping the object 5 by pinching it with the hand 4 and holding it by suction with the suction hand 13, and can therefore grip the object 5 efficiently.
Embodiment 3.
FIG. 10 is a diagram showing the configuration of a material estimation device according to Embodiment 3 of the present invention. The material estimation device 70 according to the third embodiment differs from the material estimation device 70 according to the first embodiment in that the information processing unit 3 is connected to the shape measurement sensor 2 via a network 50.
In the material estimation device 70 according to the third embodiment, the information processing unit 3 can be installed away from the robot arm 1. The more data on the relationship between material and reflectance is stored in the database 31, the more accurate the material estimation becomes; but the database 31 then grows, and so does the installation space required for the information processing unit 3. A larger footprint for the information processing unit 3 at the production site where the robot arm 1 is installed hinders productivity. By installing the information processing unit 3 away from the robot arm 1, the material estimation device 70 according to the third embodiment frees working space at the production site and improves productivity.
FIG. 11 is a diagram showing the configuration of a modification of the material estimation device according to the third embodiment. Shape measurement sensors 2 are mounted on the robot arms 1 of a plurality of robots 60, and the information processing unit 3 is connected to the plurality of shape measurement sensors 2 via the network 50. As noted above, the more information the database 31 holds, the more accurate the material estimation. Sharing a single information processing unit 3 among a plurality of robots 60 therefore allows the gripping parameters to be changed based on highly accurate material estimation results.
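As one concrete, entirely hypothetical way to share a single information processing unit among several robots, the sketch below exposes a material lookup over HTTP using only the Python standard library; the transport, the payload fields, and the stand-in lookup are assumptions, not the patent's design.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def estimate_material(diffuse, specular):
    """Stand-in for the shared database lookup (placeholder logic)."""
    return "metal" if specular > diffuse else "paper"

class EstimationHandler(BaseHTTPRequestHandler):
    """Each robot POSTs its estimated reflectances and receives a material."""
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        material = estimate_material(payload["diffuse"], payload["specular"])
        body = json.dumps({"material": material}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), EstimationHandler).serve_forever()
```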
Furthermore, based on information about the success or failure of operations by any of the plurality of robots 60, the relationship between materials and gripping parameters can be updated, or the estimation method of the material estimating unit 33 can be changed, realizing operations with higher performance.
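A minimal sketch of such an update loop follows; the running-average rule, the field names, and the learning rate are assumptions introduced for illustration, not the patent's method.

```python
# Shared per-material gripping-parameter store (hypothetical structure).
material_grip_params = {"metal": {"grip_force": 20.0, "successes": 0, "trials": 0}}

def report_outcome(material, grip_force, succeeded, learning_rate=0.1):
    """Nudge the stored grip force toward forces that succeeded and away
    from forces that failed, tracking the trial counts per material."""
    entry = material_grip_params.setdefault(
        material, {"grip_force": grip_force, "successes": 0, "trials": 0})
    entry["trials"] += 1
    if succeeded:
        entry["successes"] += 1
        entry["grip_force"] += learning_rate * (grip_force - entry["grip_force"])
    else:
        # Back away from a force that failed (e.g. dropped or crushed the part).
        entry["grip_force"] -= learning_rate * (grip_force - entry["grip_force"])
    return entry
```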
The functions of the information processing unit 3 according to the first, second, or third embodiment are realized by a processing circuit. The processing circuit may be dedicated hardware or an arithmetic device that executes a program stored in a storage device.
When the processing circuit is dedicated hardware, it may be a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an application-specific integrated circuit, a field-programmable gate array, or a combination of these. FIG. 12 is a diagram showing a configuration in which the functions of the information processing unit according to the first, second, or third embodiment are realized in hardware. The processing circuit 29 incorporates a logic circuit 29a that realizes the functions of the information processing unit 3. A microcontroller is one example of hardware that implements the processing circuit 29.
When the processing circuit 29 is an arithmetic device, the functions of the information processing unit 3 are realized by software, firmware, or a combination of software and firmware.
FIG. 13 is a diagram showing a configuration in which the functions of the information processing unit according to the first, second, or third embodiment are realized in software. The processing circuit 29 has a central processing unit 291 that executes a program 29b, a random access memory 292 that the central processing unit 291 uses as a work area, and a storage device 293 that stores the program 29b. The functions of the information processing unit 3 are realized when the central processing unit 291 loads the program 29b stored in the storage device 293 into the random access memory 292 and executes it. The software or firmware is written in a programming language and stored in the storage device 293.
The processing circuit 29 realizes the functions of the information processing unit 3 by reading and executing the program 29b stored in the storage device 293. The program 29b can also be said to cause a computer to execute the procedures and methods that realize the functions of the information processing unit 3.
Note that the processing circuit 29 may be realized partly in dedicated hardware and partly in software or firmware.
In this way, the processing circuit 29 can realize each of the functions described above by hardware, software, firmware, or a combination of these.
The configurations described in the above embodiments illustrate examples of the subject matter of the present invention; they can be combined with other known techniques, and parts of them can be omitted or modified without departing from the gist of the present invention.
1 robot arm, 2 shape measurement sensor, 3 information processing unit, 4 hand, 5 object, 6 supply box, 10 specular reflection point, 11 diffuse reflection point, 12 secondary reflection point, 13 suction hand, 21 light projecting unit, 22 light receiving unit, 29 processing circuit, 29a logic circuit, 29b program, 31 database, 32 shape measurement unit, 33 material estimating unit, 34 control unit, 41 acquired data, 42 shape measurement data, 43 material estimation data, 44 gripping parameters, 50 network, 60 robot, 70 material estimation device, 291 central processing unit, 292 random access memory, 293 storage device, 311 position and orientation information, 312 reflectance material relationship information, 313 material gripping parameter relationship information.
Claims (7)
- A material estimation device comprising: a shape measurement sensor including a light projecting unit that projects a light beam onto an object and a light receiving unit that receives the light beam reflected by the object; a storage unit that stores position and orientation information indicating the relationship between the positions and orientations of the light projecting unit and the light receiving unit, and reflectance material relationship information indicating the relationship between the material of the object and its reflectance; a shape measurement unit that generates, based on the position and orientation information and on image data generated from the light beam received by the light receiving unit, a distance image indicating the surface shape of the object; and a material estimating unit that estimates the material of the object based on the distance image, the position and orientation information, and the reflectance material relationship information.
- The material estimation device according to claim 1, wherein the material estimating unit estimates the reflectance at a diffuse reflection point, a point on the surface of the object at which the intensity of the light beam reflected at that point and incident on the light receiving unit is below a reference value or exceeds the reference value, based on the position and orientation information, the projection pattern of the light beam, and the intensity of the light beam received by the light receiving unit, and estimates, as the material of the object, the material associated with the reflectance at the diffuse reflection point in the reflectance material relationship information.
- The material estimation device according to claim 1 or 2, wherein the material estimating unit estimates the three-dimensional shape of the region onto which the light beam is projected by matching the surface shape indicated by the distance image against predefined shape data, and estimates the three-dimensional shape of the object based on the reflectance of the light beam at each point on the surface of that three-dimensional shape.
- The material estimation device according to any one of claims 1 to 3, wherein the storage unit, the shape measurement unit, and the material estimating unit are connected to the shape measurement sensor via a network.
- The material estimation device according to any one of claims 1 to 4, comprising a control unit that controls a robot arm including an operation unit that grips the object, wherein the control unit changes, based on the result of estimating the material of the object by the material estimating unit, a gripping parameter indicating conditions of the operation by which the operation unit grips the object.
- The material estimation device according to claim 5, wherein the control unit controls the robot arm including a plurality of the operation units and determines, based on the result of estimating the material of the object, the operation unit to be used in the operation of manipulating the object.
- A robot comprising: a robot arm including an operation unit that grips an object; and the material estimation device according to claim 5 or 6, which estimates the material of the object.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2018/025262 WO2020008538A1 (en) | 2018-07-03 | 2018-07-03 | Material estimation device and robot |
JP2019545826A JPWO2020008538A1 (en) | 2018-07-03 | 2018-07-03 | Material estimation device and robot |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2018/025262 WO2020008538A1 (en) | 2018-07-03 | 2018-07-03 | Material estimation device and robot |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020008538A1 true WO2020008538A1 (en) | 2020-01-09 |
Family
ID=69059411
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/025262 WO2020008538A1 (en) | 2018-07-03 | 2018-07-03 | Material estimation device and robot |
Country Status (2)
Country | Link |
---|---|
JP (1) | JPWO2020008538A1 (en) |
WO (1) | WO2020008538A1 (en) |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6635690B2 (en) * | 2015-06-23 | 2020-01-29 | キヤノン株式会社 | Information processing apparatus, information processing method and program |
JP6752615B2 (en) * | 2015-07-29 | 2020-09-09 | キヤノン株式会社 | Information processing device, information processing method, robot control device and robot system |
WO2018092860A1 (en) * | 2016-11-16 | 2018-05-24 | 三菱電機株式会社 | Interference avoidance device |
2018-07-03: JP JP2019545826A patent JPWO2020008538A1 (en), active, Pending
2018-07-03: WO PCT/JP2018/025262 patent WO2020008538A1 (en), active, Application Filing
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06288912A (en) * | 1993-02-25 | 1994-10-18 | Black & Decker Inc | Equipment and method for identification of diffused reflection material |
JP2001137828A (en) * | 1999-11-12 | 2001-05-22 | Takeshi Hashimoto | Method for judging kind of material of waste |
JP2004144557A (en) * | 2002-10-23 | 2004-05-20 | Fanuc Ltd | Three-dimensional visual sensor |
WO2006006624A1 (en) * | 2004-07-13 | 2006-01-19 | Matsushita Electric Industrial Co., Ltd. | Article holding system, robot and robot control method |
WO2007080733A1 (en) * | 2006-01-13 | 2007-07-19 | Matsushita Electric Industrial Co., Ltd. | Device and method for controlling robot arm, robot and program |
JP2008055584A (en) * | 2006-09-04 | 2008-03-13 | Toyota Motor Corp | Robot for holding object and holding method of object by robot |
JP2013019890A (en) * | 2011-06-13 | 2013-01-31 | Canon Inc | Information processor and information processing method |
Non-Patent Citations (1)
Title |
---|
STONE, R. S. ET AL.: "An automated handling system for soft compact shaped non-rigid products", MECHATRONICS, vol. 8, 1998, pages 85-102, XP010206458 *
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021115658A (en) * | 2020-01-24 | 2021-08-10 | 株式会社東芝 | Cargo handling gear and article gripping mechanism |
JP7419082B2 (en) | 2020-01-24 | 2024-01-22 | 株式会社東芝 | Cargo handling equipment and article gripping mechanism |
JP2022526473A (en) * | 2020-03-11 | 2022-05-25 | ベイジン バイドゥ ネットコム サイエンス テクノロジー カンパニー リミテッド | Methods and devices for acquiring information, electronic devices, storage media and computer programs |
JPWO2022050169A1 (en) * | 2020-09-02 | 2022-03-10 | ||
JP7481468B2 (en) | 2020-09-02 | 2024-05-10 | ファナック株式会社 | Robot system and control method |
JP2023024068A (en) * | 2021-08-06 | 2023-02-16 | 東芝エレベータ株式会社 | User detection system of elevator |
JP7276992B2 (en) | 2021-08-06 | 2023-05-18 | 東芝エレベータ株式会社 | Elevator user detection system |
Also Published As
Publication number | Publication date |
---|---|
JPWO2020008538A1 (en) | 2020-07-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020008538A1 (en) | Material estimation device and robot | |
US11103998B2 (en) | Method and computing system for performing motion planning based on image information generated by a camera | |
JP6692107B1 (en) | Method and computing system for object identification | |
US11511415B2 (en) | System and method for robotic bin picking | |
US9026234B2 (en) | Information processing apparatus and information processing method | |
JP6126437B2 (en) | Image processing apparatus and image processing method | |
JP6541397B2 (en) | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM | |
JP6703812B2 (en) | 3D object inspection device | |
US11426876B2 (en) | Information processing apparatus, information processing method, and program | |
JP2019018272A (en) | Motion generation method, motion generation device, system, and computer program | |
JP6598814B2 (en) | Information processing apparatus, information processing method, program, system, and article manufacturing method | |
JP2009523623A (en) | Method and apparatus for automatic workpiece gripping | |
KR20190070875A (en) | Calibration and operation of vision-based manipulation systems | |
US20210370518A1 (en) | Method and computing system for performing container detection and object detection | |
JP2021020285A (en) | Robot setting device and robot setting method | |
US10656097B2 (en) | Apparatus and method for generating operation program of inspection system | |
JP7202966B2 (en) | Three-dimensional measuring device and three-dimensional measuring method | |
JP2019060695A (en) | Three-dimensional object detector, robot, and program | |
CN113313803B (en) | Stack type analysis method, apparatus, computing device and computer storage medium | |
JP7164451B2 (en) | Three-dimensional measuring device | |
JP7519222B2 (en) | Image Processing Device | |
CN115082550A (en) | Apparatus and method for locating position of object from camera image of object | |
JP5332873B2 (en) | Bag-like workpiece recognition device and method | |
Román-Ibáñez et al. | Online simulation as a collision prevention layer in automated shoe sole adhesive spraying | |
JP2022017739A (en) | Image processing apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| ENP | Entry into the national phase | Ref document number: 2019545826; Country of ref document: JP; Kind code of ref document: A |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18925571; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 18925571; Country of ref document: EP; Kind code of ref document: A1 |