WO2016152288A1 - Material detection device, material detection method, and program - Google Patents
Material detection device, material detection method, and program
- Publication number
- WO2016152288A1 (PCT/JP2016/053810)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- target object
- material detection
- shape
- information
- unit
- Prior art date
- 2015-03-25
Classifications
- G—PHYSICS; G01—MEASURING, TESTING; G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/255—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures for measuring radius of curvature
- G—PHYSICS; G01—MEASURING, TESTING; G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/3563—Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry, using infrared light for analysing solids; Preparation of samples therefor
Definitions
- The present disclosure relates to a material detection device, a material detection method, and a program.
- A basic 3D data format holds the coordinate positions of the point cloud, polygon sizes, and inter-point distance information for a three-dimensional object. Recently, in addition to this basic information, standards that can hold color, shading, and material information for each point in the cloud have become widespread.
- Current 3D scanners detect three-dimensional shapes, but none can also detect materials. Consequently, to hold material information for each point group, the 3D data produced by the scanner must first be imported into a data editing environment and the material set manually for each point group.
- Patent Document 1 describes a material recognition technique that provides material information of an object included in an image captured by a user.
- The technique described in that patent document transmits an incident wave from the transmitter of an exploration radar unit toward the object, and receives a surface-reflected wave whose amplitude is lower than that of the incident wave and whose phase is the same.
- From these waves, physical property information of the object is detected and compared with reference physical property information corresponding to candidate materials to recognize what the object is.
- However, since the physical property information is detected only at a single specific point, it is difficult to acquire the physical properties of the entire three-dimensional object.
- According to the present disclosure, there is provided a material detection device including: a shape acquisition unit that acquires the three-dimensional shape of a target object; an imaging information acquisition unit that acquires imaging information obtained by imaging the target object; a region dividing unit that divides the three-dimensional shape into regions based on the three-dimensional shape and the imaging information; and a material acquisition unit that acquires the material of the target object for each region based on information on the material detection wave irradiated onto each region of the target object and the material detection wave reflected by the target object.
- According to the present disclosure, there is also provided a material detection method including: acquiring the three-dimensional shape of a target object; acquiring imaging information obtained by imaging the target object; dividing the three-dimensional shape into regions based on the three-dimensional shape and the imaging information; and acquiring the material of the target object for each region based on information on the material detection wave irradiated onto each region and the material detection wave reflected by the target object.
- According to the present disclosure, there is further provided a program for causing a computer to function as: means for acquiring the three-dimensional shape of a target object; means for acquiring imaging information obtained by imaging the target object; means for dividing the three-dimensional shape into regions based on the three-dimensional shape and the imaging information; and means for acquiring the material of the target object for each region based on information on the material detection wave irradiated onto each region and the material detection wave reflected by the target object.
- As described above, according to the present disclosure, the material of each element of a three-dimensional object can be detected with simple processing.
- Note that the above effect is not necessarily limiting; together with or in place of it, any of the effects shown in this specification, or other effects that can be understood from this specification, may be achieved.
- FIG. 1 is a schematic diagram showing the configuration of a system according to an embodiment of the present disclosure. FIG. 2 is a schematic diagram showing an example in which scissors are captured by the system of this embodiment. FIG. 3 is a schematic diagram showing how the system detects the material of the scissors region by region.
- FIG. 4 is a flowchart showing the processing of this embodiment. FIGS. 5 and 7 are schematic diagrams showing examples in which the system is realized by a mobile device, and FIGS. 6 and 8 are schematic diagrams showing the configurations of those mobile devices.
- FIG. 1 is a schematic diagram showing a configuration of a system 1000 according to the present embodiment.
- As shown in FIG. 1, the system 1000 according to this embodiment includes recording hardware 100 and an analysis unit (software) 200.
- The recording hardware 100 includes a shape recognition infrared light emitting unit (projector) 102, a shape recognition infrared light receiving unit (sensor) 104, an RGB color recognition camera 106, a material recognition infrared light emitting unit (projector) 108, and a material recognition infrared light receiving unit (sensor) 110.
- The analysis unit 200 includes a target object shape detection unit 202, a specific point material detection unit 204, a material determination library 206, and a region-specific material allocation unit 208.
- Each component of the analysis unit 200 shown in FIG. 1 can be composed of a central processing unit such as a CPU and software (program) for causing it to function.
- The program can be stored in a memory (not shown) included in the recording hardware 100 or on a recording medium connected from the outside.
- The recording hardware 100 and the analysis unit 200 may be configured as an integrated device, or as separate devices.
- 3D scanners come in contact and non-contact types; the system 1000 of this embodiment uses a non-contact type as an example.
- The shape detection unit 202 detects the shape and surface of the target object (a three-dimensional object) by irradiating it with infrared light from the shape recognition infrared light emitting unit 102 and receiving the reflected infrared light with the shape recognition infrared light receiving unit 104.
- In one method, the shape is estimated from the phase difference between the irradiated light and the reflected light. Specifically, light is received in synchronization with the infrared emission timing, the phase difference between the emitted and received signals is calculated from the received signal intensity, and the time of flight (and hence the distance) is computed from that phase difference to estimate the shape.
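- As an illustrative sketch of this phase-to-distance computation (the modulation frequency and phase values are hypothetical, not from the publication):

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def tof_distance(phase_shift_rad: float, mod_freq_hz: float) -> float:
    # A phase shift of 2*pi corresponds to one modulation period of
    # round-trip travel; halve the round trip for the one-way distance.
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

# A pi/2 shift at 30 MHz modulation corresponds to about 1.25 m.
print(f"{tof_distance(math.pi / 2, 30e6):.3f} m")
```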
- In another method, the distance to the object is estimated by triangulation from the difference between the irradiated pattern and the reflected pattern, yielding both shape and depth information. Specifically, light is received in accordance with the emission timing, the emitted pattern is matched against the received pattern, and the distance is calculated from the projection angle and the incident angle based on the principle of triangulation.
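- A minimal sketch of this triangulation step, assuming the projection and incident angles are measured from the projector-sensor baseline (all values illustrative):

```python
import math

def triangulated_depth(baseline_m: float, proj_angle_rad: float,
                       recv_angle_rad: float) -> float:
    # The projector ray and the received ray leave the two ends of the
    # baseline; the point's perpendicular distance z satisfies
    # z * (cot(alpha) + cot(beta)) = baseline.
    return baseline_m / (1.0 / math.tan(proj_angle_rad)
                         + 1.0 / math.tan(recv_angle_rad))

# An 8 cm projector-sensor baseline with both rays at 80 degrees
# from the baseline gives a depth of roughly 0.23 m.
print(f"{triangulated_depth(0.08, math.radians(80), math.radians(80)):.3f} m")
```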
- The pattern irradiation type is considered to measure faster than the type that sweeps a laser beam.
- The laser-sweeping type is called the ToF (Time of Flight) method, and the pattern irradiation type is also called the active stereo method. Note that light other than infrared may be used for shape recognition.
- The shape detection method is not particularly limited; shapes may also be detected by methods other than the above.
- The target object shape detection unit 202 also analyzes shape information from images or video captured by the recording hardware 100. Specifically, the RGB color recognition camera 106 images the target object, shape information is analyzed from the captured data, and region division is performed. Performing shape analysis at this stage speeds up the later step of writing material information into the 3D data format.
- FIG. 2 is a schematic diagram showing an example in which the scissors 300 are captured by the system 1000 of the present embodiment.
- The shape detection unit 202 analyzes shape information, together with the colors of the handles, to segment the metal portion and the handle portions of the scissors 300.
- The shape detection unit 202 includes an imaging information acquisition unit 202a that acquires imaging information captured by the RGB color recognition camera 106.
- The shape detection unit 202 also includes a region dividing unit 202b that divides (segments) the three-dimensional shape of the target object into regions, using color information, edge information, luminance and saturation, contrast, depth information, background information, light reflectance, flat (low-texture) regions, temperature, and the like as reference information.
- The shape detection unit 202 extracts edge information from the background-separated, segmented three-dimensional object and performs region division based on that edge information.
- The region dividing unit 202b divides the three-dimensional shape into regions by drawing boundary lines in the three-dimensional information (mesh information or depth information) that represents the shape.
- The boundary lines are set based on imaging information such as color, edge, luminance/saturation, and contrast information, which is captured by the RGB color recognition camera 106 and obtained through the imaging information acquisition unit 202a. The user may also supply auxiliary information, such as marking object and background, so that segmentation can be performed by the graph cut method.
- For example, the target object can be distinguished from the background by selecting it with a touch operation at shooting time, which allows the background to be separated with high accuracy, as in the sketch below.
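- A minimal sketch of such touch-seeded segmentation using OpenCV's GrabCut implementation of the graph cut method (the rectangle size and function names are illustrative choices, not from the publication):

```python
import cv2
import numpy as np

def segment_by_touch(image_bgr: np.ndarray, touch_xy: tuple,
                     half: int = 60) -> np.ndarray:
    """Separate the touched object from the background with GrabCut,
    seeded by a rectangle around the touch point. Returns a 0/255 mask."""
    h, w = image_bgr.shape[:2]
    x, y = touch_xy
    rect = (max(x - half, 0), max(y - half, 0),
            min(2 * half, w - 1), min(2 * half, h - 1))
    mask = np.zeros((h, w), np.uint8)
    bgd = np.zeros((1, 65), np.float64)  # internal GMM state
    fgd = np.zeros((1, 65), np.float64)
    cv2.grabCut(image_bgr, mask, rect, bgd, fgd, 5, cv2.GC_INIT_WITH_RECT)
    fg = (mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD)
    return np.where(fg, 255, 0).astype(np.uint8)
```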
- In the example of FIG. 2, the scissors 300 are segmented into a metal portion (region A), a handle portion (region B), and a resin portion other than the handle (region C).
- Regions B and C are both made of resin but have different colors.
- Next, the material at a specific point is measured.
- For this, the material recognition infrared light emitting unit 108 irradiates infrared light for material detection, and the material recognition infrared light receiving unit 110 receives the reflected light.
- The material is estimated by a known method, by irradiating a specific point on the three-dimensional object with the infrared light emitted by the material recognition infrared light emitting unit 108.
- The infrared light emitted from the material recognition infrared light emitting unit 108 strikes the specific point of the target object and is received by the material recognition infrared light receiving unit 110.
- The material detection unit 204 computes the reflectance from the emitted and received infrared light, and predicts the material at the specific point by comparing that reflectance with the reflectances held in the material determination library 206.
- In general, methods are used in which signals such as infrared rays, ultrasonic waves, or electromagnetic waves are irradiated onto the object, the reflectance (response) is measured, and the material is predicted by comparing the coefficients of the reflected components.
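- A toy sketch of this reflectance comparison; the library contents, wavelengths, and matching rule are hypothetical stand-ins for the material determination library 206:

```python
# Hypothetical reference reflectance coefficients per material, keyed by
# IR wavelength in nm (the patent's library contents are not published).
LIBRARY = {
    "metal":     {850.0: 0.78, 940.0: 0.74},
    "ABS resin": {850.0: 0.35, 940.0: 0.31},
    "rubber":    {850.0: 0.12, 940.0: 0.10},
}

def reflectance(emitted_mw: float, received_mw: float) -> float:
    return received_mw / emitted_mw

def predict_material(measured: dict) -> str:
    """Nearest reference spectrum by squared error over the wavelengths."""
    def err(ref: dict) -> float:
        return sum((measured[w] - ref[w]) ** 2 for w in measured)
    return min(LIBRARY, key=lambda name: err(LIBRARY[name]))

sample = {850.0: reflectance(10.0, 3.3), 940.0: reflectance(10.0, 3.0)}
print(predict_material(sample))  # -> "ABS resin"
```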
- For example, material detection can be realized by irradiating part of the target object with a point-source infrared laser and measuring the intensity, transfer function, and degree of attenuation of the reflected infrared light.
- With point-source irradiation, however, only the spot actually hit by the infrared laser is identified, so measuring the material of the entire three-dimensional object takes a long time.
- The time required to detect the material is proportional to the size of the object: with a conventional method, detecting the material of an area corresponding to 256 pixels is known to take about 20 ms. This is the time needed to irradiate the object with infrared light, measure the reflectance, and compare how close each material coefficient is to that reflectance.
- In this embodiment, the target object is therefore divided (segmented) into regions and material determination is performed once per region, which greatly reduces processing time and load, as the rough arithmetic below illustrates.
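- Rough arithmetic based on the 20 ms per 256-pixel figure above (the object size is hypothetical):

```python
DETECT_MS = 20                      # ~20 ms per 256-pixel spot (figure above)
SPOT_PIXELS = 256

object_pixels = 1920 * 1080         # hypothetical full-frame object
per_point_ms = object_pixels / SPOT_PIXELS * DETECT_MS
per_region_ms = 3 * DETECT_MS       # one probe each for regions A, B, C

print(f"point-by-point: {per_point_ms / 1000:.0f} s, "
      f"per-region: {per_region_ms} ms")  # ~162 s vs 60 ms
```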
- For example, the metal part is irradiated with infrared light (as an example of a material detection wave) to determine that its material is metal, and the handle portion is irradiated to determine that its material is plastic or resin.
- By performing material recognition once for each of the regions A, B, and C shown in FIG. 2, the material can be determined for each region.
- Each material determined in this way is held in association with the world coordinate system. Material recognition is thus performed only at a specific point within each limited region, and the result is reused when it is later written into the 3D data format. Material detection therefore does not have to be performed for every point group representing the three-dimensional shape, which greatly simplifies processing.
- After the material of the target object has been estimated for each region, the result is written into the 3D data format.
- In other words, material is determined region by region, and this per-region material prediction speeds up material prediction for the whole three-dimensional object.
- The segmentation result from the shape detection unit 202 is used for this speed-up.
- As described above, the shape detection unit 202 extracts edge information from the background-separated, segmented three-dimensional object.
- To extract edge information, there are methods that apply the Sobel, Laplacian, or Canny operators to the image derivatives. Edge information can also be extracted using pixel luminance differences, saturation differences, template matching, and the like.
- When the background is separated, the extracted edges of the 3D object form continuous closed areas; for example, a contour line starting from point A eventually returns to point A. A closed region bounded by such a contour can be judged likely to be a single component of the object.
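- A minimal sketch of this edge-to-closed-region step using the Canny operator and contour extraction in OpenCV (the thresholds and minimum-area filter are illustrative):

```python
import cv2
import numpy as np

def closed_regions(image_bgr: np.ndarray, min_area: float = 100.0):
    """Return the closed contours of the (background-separated) object;
    each closed contour is a candidate component region."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    # Close one-pixel gaps so that edge loops actually return to their start.
    edges = cv2.morphologyEx(edges, cv2.MORPH_CLOSE,
                             np.ones((3, 3), np.uint8))
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.contourArea(c) >= min_area]
```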
- The region-specific material allocation unit 208 then assigns the material information detected at a specific point by the material detection unit 204 to the whole closed region.
- For this assignment, a paint-routine algorithm can be used.
- A paint routine is an algorithm for filling a closed area with a color, as used in the paint application among the Windows (registered trademark) accessories. Applying it here gives every point in the cloud material information, as if the whole region had been painted.
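- A minimal sketch of the paint routine as a flood fill over a per-pixel label map (OpenCV's floodFill; the map and material ids are illustrative):

```python
import cv2
import numpy as np

def paint_material(label_map: np.ndarray, seed_xy: tuple,
                   material_id: int) -> np.ndarray:
    """Propagate the material id measured at one seed point to every
    pixel of the enclosing closed region (flood fill / paint routine)."""
    h, w = label_map.shape
    mask = np.zeros((h + 2, w + 2), np.uint8)  # floodFill wants a padded mask
    cv2.floodFill(label_map, mask, seed_xy, material_id)
    return label_map

regions = np.zeros((4, 4), np.uint8)
regions[:, 2:] = 9                 # a second region with a different value
print(paint_material(regions, (0, 0), 5))  # left region painted with id 5
```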
- A 3D data format generally holds world coordinate information, polygon sizes and positions, texture information, and so on.
- Formats that can also hold material information, such as the 3DS, FBX, and OBJ formats, have appeared. Since the world coordinates of each segmented object (regions A, B, and C in FIG. 2) can be specified, material information can be assigned to each object.
- Alternatively, a format such as the AMF (Additive Manufacturing File) format, which is based on the STL format used in 3D printers and can describe colors, materials, and internal structures, may be used.
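- A minimal sketch of emitting per-region material assignments in AMF-style XML (element names follow the public AMF specification; the material ids, names, and file name are hypothetical, and mesh geometry is omitted):

```python
import xml.etree.ElementTree as ET

amf = ET.Element("amf", unit="millimeter")
for mid, name in (("1", "metal"), ("2", "ABS resin"), ("3", "PLA resin")):
    material = ET.SubElement(amf, "material", id=mid)
    ET.SubElement(material, "metadata", type="name").text = name

obj = ET.SubElement(amf, "object", id="1")
mesh = ET.SubElement(obj, "mesh")
ET.SubElement(mesh, "volume", materialid="1")  # region A -> metal
ET.SubElement(mesh, "volume", materialid="2")  # region B -> ABS resin
ET.SubElement(mesh, "volume", materialid="3")  # region C -> PLA resin

ET.ElementTree(amf).write("scissors.amf", xml_declaration=True,
                          encoding="utf-8")
```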
- FIG. 3 is a schematic diagram showing how the system 1000 according to this embodiment detects the material of the scissors 300 for each of the areas A, B, and C.
- As shown in FIG. 3, infrared light is emitted from the material recognition infrared light emitting unit 108 toward each of the regions A, B, and C, and the reflected light is received by the material recognition infrared light receiving unit 110.
- The material detection unit 204 detects the material for each of the regions A, B, and C by comparing the measured response with the material coefficients (for ABS resin, PLA resin, nylon, plastic, gypsum, rubber, metal, and so on) held in the material determination library 206.
- To detect the material of regions A, B, and C, light is emitted only once toward a specific point in each region, and the material detected there is assigned to the entire region. As a result, processing time and load are reduced significantly.
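- A toy sketch of this once-per-region economy (the point counts and seed results are hypothetical):

```python
# One probe per region: the material found at a single seed point is
# reused for every point of that region.
region_points = {"A": 51200, "B": 34800, "C": 12000}
seed_results = {"A": "metal", "B": "ABS resin", "C": "PLA resin"}

point_materials = {region: [seed_results[region]] * count
                   for region, count in region_points.items()}

total = sum(region_points.values())
print(f"3 probes labelled {total} points")  # 3 probes labelled 98000 points
```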
- FIG. 4 is a flowchart showing processing in the present embodiment.
- In step S10, the shape recognition infrared light emitting unit 102 irradiates infrared light and the shape recognition infrared light receiving unit 104 receives the reflection, so that the shape detection unit 202 detects the shape of the target object.
- In step S12, imaging information from the RGB color recognition camera 106 is acquired.
- In step S14, the shape is classified into regions based on the imaging information.
- In step S16, background separation is performed.
- In step S18, the shape is divided into regions and segments based on the edge information. The region division by the region dividing unit 202b is thus performed in steps S14 to S18.
- In step S20, the material determination library 206 is consulted and material detection is performed for each region. In steps S22, S24, and S26, it is determined which of the regions A, B, and C is being processed. For region A, the process proceeds to step S28 and the material of region A is spread over the region by the paint routine; region B is handled likewise in step S30, and region C in step S32. After steps S28, S30, and S32, the process proceeds to step S34, where the 3D data format is created.
- FIG. 5 is a schematic diagram illustrating an example in which the system 1000 is realized by a mobile device 1010 such as a smartphone.
- FIG. 6 is a schematic diagram showing the configuration of the mobile device 1010.
- As shown in FIG. 6, the mobile device 1010 includes a display unit 114, a touch sensor 116, and an irradiation position guide unit 210 in addition to the configuration of the system 1000 illustrated in FIG. 1.
- The mobile device 1010 has a touch panel formed by providing the touch sensor 116 on the display unit 114.
- A captured image of the scissors 300, taken by the RGB color recognition camera 106, is displayed on the display unit 114.
- The irradiation position guide unit 210 is an application for guiding the irradiation position of the sensor signal to the area where material detection is desired.
- The infrared irradiation position of the material recognition infrared light emitting unit 108 (mark 400, indicated by a cross in FIG. 5) is displayed superimposed on the captured image on the display unit 114.
- The position of the mark 400 can be calculated from, for example, the infrared irradiation direction and the distance to the target object (the scissors 300), as in the projection sketch below. The user can therefore detect the material of any region by aligning the mark 400 with the desired position on the scissors 300 in the captured image, and can notify the mobile device 1010, by operating the touch sensor 116, that the area to be detected is being irradiated.
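- A minimal sketch of such a mark-position computation as a pinhole projection, assuming the emitter and camera are co-located (all parameters are illustrative):

```python
import math

def mark_position(azimuth_rad: float, elevation_rad: float,
                  distance_m: float, fx: float, fy: float,
                  cx: float, cy: float) -> tuple:
    """Project the IR spot into the camera image with a pinhole model."""
    x = distance_m * math.cos(elevation_rad) * math.sin(azimuth_rad)
    y = distance_m * math.sin(elevation_rad)
    z = distance_m * math.cos(elevation_rad) * math.cos(azimuth_rad)
    return (round(fx * x / z + cx), round(fy * y / z + cy))

# Spot 0.5 m away, slightly off the optical axis.
print(mark_position(math.radians(2), math.radians(-1), 0.5,
                    800, 800, 320, 240))  # -> (348, 226)
```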
- In FIG. 6, the mobile device 1010 includes the shape recognition infrared light emitting unit 102, the shape recognition infrared light receiving unit 104, and the RGB color recognition camera 106, but these components may also be omitted.
- FIG. 7 is a schematic diagram illustrating an example in which the system 1000 is realized by a mobile device 1020 such as a smartphone.
- In the mobile device 1010 described above, the built-in material recognition infrared light emitting unit 108 and material recognition infrared light receiving unit 110 were used for material detection.
- FIG. 8 is a schematic diagram showing the configuration of the mobile device 1020. As shown in FIG. 8, unlike the mobile device 1010 of FIG. 6, the mobile device 1020 does not include the material recognition infrared light emitting unit 108 or the material recognition infrared light receiving unit 110.
- Instead, the mobile device 1020 includes a communication unit 118 for communicating with an external infrared sensor unit 1030 via Bluetooth (registered trademark) or the like.
- The infrared sensor unit 1030 includes a material recognition infrared light emitting unit 1032, a material recognition infrared light receiving unit 1034, and a communication unit 1036.
- The material recognition infrared light emitting unit 1032 and the material recognition infrared light receiving unit 1034 correspond to the material recognition infrared light emitting unit 108 and the material recognition infrared light receiving unit 110 described above.
- The communication unit 1036 communicates with the mobile device 1020 via Bluetooth (registered trademark) or the like.
- The infrared sensor unit 1030 sends information on the infrared light irradiated by the material recognition infrared light emitting unit 1032 and the infrared light received by the material recognition infrared light receiving unit 1034 from the communication unit 1036 to the mobile device 1020.
- The material detection unit 204 of the mobile device 1020 computes the infrared reflectance from the information sent by the infrared sensor unit 1030 and compares it with the reflectances held in the material determination library 206, detecting the material of the object for each region.
- Thus, even though the mobile device 1020 has no infrared sensor of its own, the material can be estimated as long as communication with the infrared sensor unit 1030 is possible and the sensor unit sends its signals to the device side; a toy sketch of such a link follows.
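- A toy sketch of such a unit-to-device link; plain TCP and JSON stand in for the Bluetooth transport, and all field names are hypothetical:

```python
import json
import socket

def serve_one_probe(port: int = 9000) -> float:
    """Device side: receive one probe record, return estimated reflectance."""
    with socket.create_server(("", port)) as server:
        conn, _ = server.accept()
        with conn:
            record = json.loads(conn.recv(1024).decode("utf-8"))
    return record["received_mw"] / record["emitted_mw"]

def send_probe(emitted_mw: float, received_mw: float,
               host: str = "localhost", port: int = 9000) -> None:
    """Sensor-unit side: send one probe record (run in a separate process)."""
    payload = json.dumps({"emitted_mw": emitted_mw,
                          "received_mw": received_mw}).encode("utf-8")
    with socket.create_connection((host, port)) as conn:
        conn.sendall(payload)
```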
- As described above, according to this embodiment, the three-dimensional shape obtained by shape detection is divided into regions based on the imaging information from the RGB color recognition camera 106, and infrared light is irradiated onto a specific point in each divided region.
- Material detection is thus performed region by region.
- The material detected at the specific point is written into the 3D data format as the material of the entire region, so material detection does not have to be performed for every point group in the 3D data format, and processing time and load are greatly reduced.
- Additionally, the present technology may also be configured as below.
- (1) A material detection device including: a shape acquisition unit that acquires the three-dimensional shape of a target object; an imaging information acquisition unit that acquires imaging information obtained by imaging the target object; a region dividing unit that divides the three-dimensional shape into regions based on the three-dimensional shape and the imaging information; and a material acquisition unit that acquires the material of the target object for each region based on information on the material detection wave irradiated onto each region of the target object and the material detection wave reflected by the target object.
- (2) The material detection device according to (1), wherein the material detection wave is infrared light.
- (3) The material detection device according to (1), wherein the region dividing unit divides the three-dimensional shape into regions by providing boundary lines in three-dimensional information including mesh information or depth information indicating the three-dimensional shape.
- (4) The material detection device according to (3), wherein the region dividing unit provides the boundary lines based on the imaging information.
- (5) The material detection device according to (1), further including a region-specific material allocation unit that allocates the material of the target object acquired for each region to the corresponding region of the three-dimensional shape data acquired by the shape acquisition unit.
- (7) The material detection device according to (2), wherein the material acquisition unit acquires the material based on the reflectance, at the target object, of the infrared light irradiated onto the target object.
- (8) The material detection device according to (7), wherein an infrared reflectance for each material is held in advance, and the material acquisition unit acquires the material by comparing the reflectance of the infrared light irradiated onto the target object with the infrared reflectances held in advance for the respective materials.
- (9) The material detection device according to (1), wherein the shape acquisition unit acquires the three-dimensional shape based on information on a light beam applied to the target object and the light beam reflected by the target object.
- (10) The material detection device according to (9), wherein the light beam is infrared light, and the device includes a shape recognition infrared light emitting unit that emits infrared light for acquiring the three-dimensional shape of the target object and a shape recognition infrared light receiving unit that receives the infrared light emitted by the shape recognition infrared light emitting unit and reflected by the target object.
- The material detection device according to (10), wherein the shape recognition infrared light emitting unit irradiates infrared light in a predetermined pattern, and the shape acquisition unit acquires the three-dimensional shape by triangulation from the difference between the irradiated infrared pattern and the reflected pattern.
- (15) The material detection device according to (1), further including a communication unit that receives, from an external infrared sensor unit, information on the infrared light irradiated onto the target object and the infrared light reflected by the target object.
- (16) A material detection method including: acquiring the three-dimensional shape of a target object; acquiring imaging information obtained by imaging the target object; dividing the three-dimensional shape into regions based on the three-dimensional shape and the imaging information; and acquiring the material of the target object for each region based on information on the material detection wave irradiated onto each region of the target object and the material detection wave reflected by the target object.
- (17) A program for causing a computer to function as: means for acquiring the three-dimensional shape of a target object; means for acquiring imaging information obtained by imaging the target object; means for dividing the three-dimensional shape into regions based on the three-dimensional shape and the imaging information; and means for acquiring the material of the target object for each region based on information on the material detection wave irradiated onto each region of the target object and the material detection wave reflected by the target object.
- 102 Shape recognition infrared light emitting unit; 104 Shape recognition infrared light receiving unit; 106 RGB color recognition camera; 108 Material recognition infrared light emitting unit; 110 Material recognition infrared light receiving unit; 114 Display unit; 202 Shape detection unit; 202a Imaging information acquisition unit; 202b Region dividing unit; 204 Material detection unit; 208 Region-specific material allocation unit; 1000 System; 1010, 1020 Mobile device
Abstract
Provided is a material detection device including: a shape acquisition unit that acquires the three-dimensional shape of an object; an imaging information acquisition unit that acquires imaging information obtained by imaging the object; a region dividing unit that divides the three-dimensional shape into regions based on the three-dimensional shape and the imaging information; and a material acquisition unit that acquires, for each region, the material of the object based on information on the material detection waves with which each region of the object is irradiated and the material detection waves reflected by the object. With this configuration, the material of each element of the three-dimensional object can be detected with simple processing.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
JP2015-062639 | 2015-03-25 | |
JP2015062639 | 2015-03-25 | |
Publications (1)
Publication Number | Publication Date
---|---
WO2016152288A1 | 2016-09-29
Family
ID=56977576
Family Applications (1)
Application Number | Title | Priority Date | Filing Date
---|---|---|---
PCT/JP2016/053810 (WO2016152288A1) | Material detection device, material detection method, and program | 2015-03-25 | 2016-02-09
Country Status (1)
Country | Link
---|---
WO | WO2016152288A1
- 2016-02-09: PCT/JP2016/053810 filed; published as WO2016152288A1; status: active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
JP2007216381A * | 2004-07-13 | 2007-08-30 | Matsushita Electric Industrial Co., Ltd. | Robot
JP2013250263A * | 2012-05-31 | 2013-12-12 | Korea Institute of Science and Technology | Apparatus and method for recognizing the material of an object
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
CN110794418A * | 2018-08-03 | 2020-02-14 | Topcon Corporation | Measuring device
CN110794418B * | 2018-08-03 | 2024-05-28 | Topcon Corporation | Measuring device
CN112926702A * | 2019-12-06 | 2021-06-08 | Li Wenyu | Active-light-source object material identification system and method
JP7483267B2 | 2021-05-31 | 2024-05-15 | Nippon Conveyor Co., Ltd. | Rolling monitoring system for pipe conveyor
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 16768191; Country of ref document: EP; Kind code of ref document: A1
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: PCT application non-entry in European phase | Ref document number: 16768191; Country of ref document: EP; Kind code of ref document: A1
| NENP | Non-entry into the national phase | Ref country code: JP