US20230089139A1 - Image processing device and image processing method

Image processing device and image processing method

Info

Publication number
US20230089139A1
Authority
US
United States
Prior art keywords
distance
image processing
image
workpiece
reference plane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/908,082
Other languages
English (en)
Inventor
Yuusuke MURATA
Shouta TAKIZAWA
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fanuc Corp
Original Assignee
Fanuc Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fanuc Corp filed Critical Fanuc Corp
Assigned to FANUC CORPORATION reassignment FANUC CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MURATA, YUUSUKE, TAKIZAWA, SHOUTA
Publication of US20230089139A1
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/10Geometric effects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/521Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/64Three-dimensional objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/04Indexing scheme for image data processing or generation, in general involving 3D image data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10012Stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30164Workpiece; Machine component

Definitions

  • The present invention relates to an image processing device and an image processing method. More particularly, the present invention relates to an image processing device and an image processing method for performing image processing using three-dimensional data acquired by a three-dimensional sensor.
  • Examples of image processing devices for performing image processing using three-dimensional data acquired by a three-dimensional sensor include devices disclosed in Patent Documents 1 and 2.
  • Patent Document 1 discloses a device that detects a three-dimensional object through two-dimensional image processing. Specifically, the device disclosed in Patent Document 1 captures an image of a three-dimensional object using a pair of imagers and calculates disparity data for each of regions obtained by segmenting the image. The device then generates a grayscale image in which each region has a corresponding grayscale value based on the disparity data and based on the distance from the imagers. The device models the three-dimensional object and calculates correlation values representing the similarity between the resulting model and an image area in the grayscale image.
  • The model is a two-dimensional image having shape features of the three-dimensional object as seen in a direction from the position of the imagers, and each of the regions obtained by segmenting the two-dimensional image has a grayscale value representing the distance between a corresponding part of the three-dimensional object and the imagers.
  • The correlation values are calculated based on the grayscale values of the model and the grayscale values of the image area in the grayscale image.
  • The device detects the three-dimensional object by detecting, from the grayscale image, the image area having the greatest correlation value with the model.
  • Patent Document 2 discloses a motion detection device capable of detecting a three-dimensional motion with high accuracy.
  • The motion detection device disclosed in Patent Document 2 includes: an image acquisition unit that acquires a distance image; a segmentation unit that segments the distance image acquired by the image acquisition unit into sub-regions having a predetermined size; a first detection unit that detects a motion in a plane direction for each set of similar sub-regions between distance images successively acquired by the image acquisition unit; a calculation unit that calculates depth information for each sub-region; and a second detection unit that detects a motion in a depth direction for each set of similar sub-regions based on the depth information calculated by the calculation unit.
  • Patent Document 1: Japanese Unexamined Patent Application, Publication No. 2007-213353
  • Patent Document 2: Japanese Unexamined Patent Application, Publication No. 2000-222585
  • When an object is detected using a three-dimensional sensor, the surface of the object may not be perpendicular to the optical axis of the three-dimensional sensor. In that case, the distance from the three-dimensional sensor to the surface that is not perpendicular to its optical axis is not constant, which makes the object difficult to detect. It is therefore desirable to make an object easier to detect even when its surface is not perpendicular to the optical axis of a three-dimensional sensor.
  • FIG. 1 is a block diagram illustrating a configuration example of an image processing device according to an embodiment of the present invention;
  • FIG. 2 is a diagram illustrating a method by which a three-dimensional sensor included in the image processing device detects a loading surface on which a workpiece is loaded;
  • FIG. 3 is a diagram illustrating an arrangement of the three-dimensional sensor and the workpiece, and an image obtained by converting a distance image so that distances from the loading surface each represent a luminance;
  • FIG. 4 is a diagram illustrating a case where a distance image is created using, as pixel values thereof, values calculated based on the distance between the three-dimensional sensor and each of points on a surface of the workpiece;
  • FIG. 5 is a flowchart showing operation of the image processing unit according to the embodiment; and
  • FIG. 6 is a diagram illustrating the three-dimensional sensor included in the image processing device and a hexagonal prism-shaped workpiece loaded on the loading surface.
  • FIG. 1 is a block diagram illustrating a configuration example of an image processing device according to an embodiment of the present invention.
  • An image processing device 10 includes a three-dimensional sensor 101, a reference plane calculation unit 102, a distance calculation unit 103, a distance image creation unit 104, and an image processing unit 105.
  • The three-dimensional sensor 101 captures an image of a workpiece 20 and a loading surface 30 on which the workpiece 20 is loaded, and outputs the resulting three-dimensional data to the reference plane calculation unit 102 and the distance calculation unit 103.
  • The three-dimensional data is a three-dimensional point cloud or a distance image in this example, but is not limited as such; it may be any other form of three-dimensional data.
  • The workpiece 20 is the detection object.
  • The loading surface 30 is, for example, a surface of a table on which the workpiece 20 is loaded. As shown in FIG. 1, the loading surface 30 and the workpiece 20 are disposed obliquely to the optical axis of the three-dimensional sensor 101.
  • The workpiece 20 is trapezoid-shaped in this example.
  • Examples of three-dimensional sensors usable as the three-dimensional sensor 101 include a stereo camera that measures the distance to the detection object (the workpiece 20) by performing matching between images respectively captured by two cameras, and a stereo camera that measures that distance by performing matching between an image of a pattern projected from a projector and images captured by cameras.
  • Another example of a three-dimensional sensor usable as the three-dimensional sensor 101 is a stereo camera that measures the distance to the detection object by performing matching between images respectively captured by two cameras while a pattern is projected from a projector.
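  • Although the embodiment gives no formulas for these sensors, depth in such stereo setups conventionally follows the pinhole-stereo relation Z = f * B / d for focal length f (in pixels), baseline B, and disparity d. The following is a minimal sketch of that standard conversion, assuming NumPy and a precomputed disparity map; it is an illustration, not the sensors' actual implementation.

```python
import numpy as np

def depth_from_disparity(disparity, focal_px, baseline_m):
    """Standard pinhole-stereo relation Z = f * B / d.

    disparity: per-pixel disparity map in pixels; non-positive values mark
    pixels where matching failed. Returns depth in metres, NaN where invalid.
    """
    depth = np.full(disparity.shape, np.nan, dtype=np.float64)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth
```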
  • The reference plane calculation unit 102 determines a reference plane for detecting the workpiece 20 based on the three-dimensional data outputted from the three-dimensional sensor 101.
  • Alternatively, the reference plane may be determined using a design value or through measurement by another method.
  • The reference plane is a surface that is not parallel to the three-dimensional sensor 101.
  • In this example, the reference plane is the loading surface 30 or a surface parallel to the loading surface 30.
  • Surfaces parallel to the loading surface 30 include the contact surface of the workpiece 20 that is in contact with the loading surface 30. In the example described below, the loading surface 30 is used as the reference plane.
  • FIG. 2 is a diagram illustrating a method by which the three-dimensional sensor included in the image processing device detects the loading surface on which the workpiece 20 is loaded.
  • The reference plane calculation unit 102 determines, based on the three-dimensional data outputted by the three-dimensional sensor 101, three-dimensional coordinates of the three-dimensional sensor 101 and of at least three points on the loading surface 30, such as the three points A1, A2, and A3 shown in FIG. 2, in a three-dimensional coordinate system of the three-dimensional sensor 101.
  • The reference plane calculation unit 102 can determine the three points A1, A2, and A3 on the loading surface 30 by detecting a portion of the loading surface 30 around the workpiece 20.
  • The reference plane calculation unit 102 can then determine an equation representing the loading surface 30 in the three-dimensional coordinate system of the three-dimensional sensor 101 (hereinafter referred to as the "three-dimensional coordinate system") based on the three-dimensional coordinate values of the three points A1, A2, and A3 shown in FIG. 2, as sketched below.
  • Coordinates on the respective axes of the three-dimensional coordinate system are referred to as "three-dimensional coordinates".
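  • As an illustration of what the reference plane calculation unit 102 might compute, the sketch below derives the plane equation n · p + d = 0 from three non-collinear points such as A1, A2, and A3. NumPy and the function name are assumptions of this sketch, not the patent's implementation.

```python
import numpy as np

def plane_from_points(a1, a2, a3):
    """Plane n . p + d = 0 through three non-collinear 3-D points
    (e.g. A1, A2, A3 sampled on the loading surface 30)."""
    a1, a2, a3 = (np.asarray(p, dtype=np.float64) for p in (a1, a2, a3))
    normal = np.cross(a2 - a1, a3 - a1)  # perpendicular to the plane
    length = np.linalg.norm(normal)
    if length == 0.0:
        raise ValueError("points are collinear and do not define a plane")
    normal /= length                     # unit normal
    d = -normal.dot(a1)                  # plane offset
    return normal, d
```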
  • The distance calculation unit 103 can calculate the distance between the loading surface 30 (reference plane) and each of the points on a surface of the workpiece 20, based on the equation representing the loading surface 30 determined by the reference plane calculation unit 102 and on the three-dimensional coordinates of the points on the surface of the workpiece 20 and the loading surface 30 determined from the three-dimensional data outputted from the three-dimensional sensor 101.
  • The distance between the loading surface 30 and the surface 20a of the workpiece 20 is a constant distance D1, as shown in FIG. 3.
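  • With the plane in Hessian normal form n · p + d = 0 (unit normal n), the distance from a point p to the reference plane is |n · p + d|. A minimal sketch of the per-point calculation performed by the distance calculation unit 103, continuing the helper above (the signed-distance convention is an assumption):

```python
import numpy as np

def distances_to_plane(points, normal, d):
    """Signed distance of each 3-D point to the reference plane.

    points: (N, 3) array of coordinates in the sensor frame. With a unit
    normal, points @ normal + d is the point-to-plane distance; for the
    surface 20a parallel to the loading surface 30 it is the constant D1.
    """
    return np.asarray(points, dtype=np.float64) @ normal + d
```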
  • The distance image creation unit 104 creates a distance image that has, as pixel values, values calculated based on the calculated distance between the loading surface 30 and each of the points on the surface of the workpiece 20.
  • FIG. 3 shows a case where the pixel value of a point B at a corner of the surface 20a of the workpiece 20 is D1.
  • The distance image herein means an image created by measuring a surface of an object and capturing an image thereof, and giving each pixel of the captured image, as its pixel value, a value calculated based on the calculated distance between the loading surface 30 and the corresponding point on the surface of the workpiece 20.
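  • A sketch of the distance image creation unit 104, under the assumption (not stated in the patent) that the sensor outputs an organized point cloud, i.e. an H x W x 3 array aligned with the image grid, with NaN where measurement failed:

```python
import numpy as np

def make_distance_image(organized_points, normal, d):
    """Distance image: each pixel holds the distance from the reference
    plane to the 3-D point measured at that pixel."""
    h, w, _ = organized_points.shape
    dist = organized_points.reshape(-1, 3) @ normal + d
    return dist.reshape(h, w)  # NaN propagates where no point was measured
```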
  • The image processing unit 105 performs image processing on the distance image. For example, the image processing unit 105 converts the distance image to an image in which the values (pixel values) calculated based on the distance between the loading surface 30 and each of the points on the surface of the workpiece 20 each represent a luminance; in other words, an image in which the pixel values are exhibited as gradations.
  • The image in which the pixel values are exhibited as gradations is, for example, a single chromatic color image or a grayscale image. In the image resulting from the conversion of the distance image, the luminance of the surface 20a of the workpiece 20 is constant within this surface, as shown in FIG. 3.
  • FIG. 3 shows a case where the image in which the pixel values are exhibited as gradations is a grayscale image.
  • The luminance of the surface 20a of the workpiece 20 being constant within this surface makes it easier to detect the workpiece 20.
  • The luminance is highest on the surface 20a of the workpiece 20, which is farthest from the loading surface 30, and decreases with decreasing distance from the loading surface 30.
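  • One plausible realization of this conversion maps the largest plane distance to the highest luminance; the linear 8-bit scaling below is an assumption of the sketch, not prescribed by the patent.

```python
import numpy as np

def distance_image_to_grayscale(dist_img):
    """Scale plane distances to 8-bit luminance: farthest from the
    reference plane -> brightest, on the plane -> darkest."""
    gray = np.zeros(dist_img.shape, dtype=np.uint8)
    valid = np.isfinite(dist_img)
    if not valid.any():
        return gray
    lo, hi = dist_img[valid].min(), dist_img[valid].max()
    if hi > lo:
        gray[valid] = ((dist_img[valid] - lo) / (hi - lo) * 255).astype(np.uint8)
    return gray
```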
  • In contrast, FIG. 4 illustrates a case where a distance image is created using, as pixel values thereof, values calculated based on the distance between the three-dimensional sensor 101 and each of the points on a surface of the workpiece 20.
  • In this case, the distance D2 between the point B at one corner of the surface 20a of the workpiece 20 and the three-dimensional sensor 101 differs from the distance D3 between a point C at another corner of the surface 20a and the three-dimensional sensor 101 in the Z direction of the three-dimensional coordinate system (X, Y, Z) of the three-dimensional sensor 101.
  • Consequently, in a grayscale image obtained by converting a distance image whose pixel values are calculated based on the distance between the surface 20a of the workpiece 20 and the three-dimensional sensor 101, the luminance of the surface 20a is not constant within this surface.
  • The image processing unit 105 may further perform image processing on the converted image, such as the grayscale image in which the pixel values are exhibited as gradations, for displaying the image on a display device such as a liquid crystal display device.
  • The image processing device 10 may include a computer having an arithmetic processor such as a central processing unit (CPU).
  • The image processing device 10 further includes an auxiliary storage device such as a hard disk drive (HDD) storing various control programs such as application software and an operating system (OS), and a main storage device such as random access memory (RAM) for storing data transitorily needed when the arithmetic processor executes the programs.
  • The arithmetic processor reads the application software and the OS from the auxiliary storage device, and performs computing based on them while deploying them to the main storage device. Various types of hardware included in the image processing device 10 are controlled based on the results of the computing. In this way, the functional blocks according to the present embodiment are implemented; that is, the present embodiment can be implemented through cooperation of hardware and software.
  • In Step S11, the three-dimensional sensor 101 captures an image of the workpiece 20 and the loading surface 30 on which the workpiece 20 is loaded, and outputs the resulting three-dimensional data to the reference plane calculation unit 102 and the distance calculation unit 103.
  • Next, the reference plane calculation unit 102 determines, based on the three-dimensional data, three-dimensional coordinates of at least three points on the loading surface 30, such as the three points A1, A2, and A3 on the loading surface 30 shown in FIG. 2.
  • The reference plane calculation unit 102 then determines an equation representing the loading surface 30 in the coordinate system of the three-dimensional sensor 101, based on the three-dimensional coordinates of the three-dimensional sensor 101 and the three points A1, A2, and A3 on the loading surface 30 shown in FIG. 2.
  • The distance calculation unit 103 calculates the distance between the loading surface 30 (reference plane) and each of the points on the surface of the workpiece 20, based on the equation representing the loading surface 30 determined by the reference plane calculation unit 102 and on the three-dimensional coordinates of the points on the surface of the workpiece 20 and the loading surface 30 determined from the three-dimensional data outputted from the three-dimensional sensor 101.
  • The distance image creation unit 104 creates a distance image that has, as pixel values, values calculated based on the calculated distance between the loading surface 30 and each of the points on the surface of the workpiece 20.
  • Finally, the image processing unit 105 performs image processing on the distance image; the whole pass is sketched in code below. For example, as described above, the image processing unit 105 converts the distance image to a grayscale image in which the values calculated based on the distance between the loading surface 30 and each of the points on the surface of the workpiece 20 each represent a luminance. In the resulting grayscale image, the luminance of the surface 20a of the workpiece 20 is constant within this surface as shown in FIG. 3, because the distance from the loading surface 30 to the surface 20a is the constant distance D1.
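  • Chaining the sketches above gives one hypothetical pass matching this flow; sensor.capture() and pick_loading_surface_points() are placeholder names, not an API defined by the patent.

```python
# Hypothetical end-to-end pass (placeholder names; see the sketches above).
points = sensor.capture()                          # Step S11: organized H x W x 3 point cloud
a1, a2, a3 = pick_loading_surface_points(points)   # three points around the workpiece
normal, d = plane_from_points(a1, a2, a3)          # equation of the loading surface 30
dist_img = make_distance_image(points, normal, d)  # per-pixel distance from the plane
gray = distance_image_to_grayscale(dist_img)       # constant luminance on surface 20a
```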
  • As described above, the workpiece loading surface to be used as the reference plane is specified, a distance image is created that has, as pixel values, values calculated based on the distance between the specified loading surface and each of points on a surface of the workpiece, and image processing is performed on the distance image. As a result, a distance image that squarely confronts the workpiece can be obtained even if the workpiece is oblique to the optical axis of the sensor.
  • This configuration makes it easier to detect the workpiece.
  • The image processing device can be used, for example, to detect a workpiece on a table in a machine tool, or to detect a workpiece on a table in a configuration in which the workpiece is conveyed by a robot arm.
  • In the embodiment described above, the loading surface on which a workpiece is loaded, or a surface parallel to the loading surface, is used as the reference plane.
  • However, a surface other than the loading surface and the surfaces parallel to it may be used as the reference plane.
  • The following describes an example in which such a surface is used as the reference plane.
  • In this example, the workpiece is hexagonal prism-shaped.
  • FIG. 6 is a diagram illustrating the three-dimensional sensor included in the image processing device and a hexagonal prism-shaped workpiece loaded on the loading surface.
  • In the case shown in FIG. 6, a side surface of a hexagonal prism-shaped workpiece 21 that is oblique to the loading surface 30 is specified as the reference plane, and the three-dimensional sensor 101 cannot capture an image of this reference plane.
  • In this case, an additional three-dimensional sensor having the same configuration as the three-dimensional sensor 101 is disposed in a position from which the side surface of the hexagonal prism-shaped workpiece 21 can be observed.
  • This allows the reference plane calculation unit 102 to determine an equation representing the side surface of the workpiece 21 in the coordinate system of the additional three-dimensional sensor.
  • Specifically, the reference plane calculation unit 102 performs calibration between the coordinate system of the three-dimensional sensor 101 and the coordinate system of the additional three-dimensional sensor, and then determines the equation representing the side surface of the workpiece 21 in the coordinate system of the additional three-dimensional sensor. Based on this equation, the reference plane calculation unit 102 can determine an equation representing the side surface of the workpiece 21 in the coordinate system of the three-dimensional sensor 101, as sketched below.
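  • Once calibration gives the rigid transform p_main = R · p_extra + t between the two sensor frames, a plane n · p + d = 0 expressed in the additional sensor's frame becomes n' = R · n, d' = d - n' · t in the frame of the three-dimensional sensor 101. A minimal sketch of that step (names assumed):

```python
import numpy as np

def transform_plane(normal, d, R, t):
    """Re-express the plane n . p + d = 0 from the additional sensor's
    frame in the main sensor's frame, given p_main = R @ p_extra + t."""
    n_main = R @ np.asarray(normal, dtype=np.float64)
    d_main = d - n_main @ np.asarray(t, dtype=np.float64)
    return n_main, d_main
```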
  • The distance calculation unit 103 calculates the distance between the side surface (reference plane) of the workpiece 21 and each of the points on a different surface of the workpiece 21. The calculation is performed in the same manner as in the example described above in which the loading surface is used as the reference plane, except that the equation representing the side surface of the workpiece 21 is used instead of the equation representing the loading surface 30; it is based on that equation and on the three-dimensional coordinates of the points on the different surface of the workpiece 21 and the loading surface 30 determined from the three-dimensional data outputted from the three-dimensional sensor 101.
  • As shown in FIG. 6, the distance from the reference plane to a surface 21a of the workpiece 21 is a constant distance D4.
  • The distance image creation unit 104 creates a distance image that has, as pixel values, values calculated based on the distance between the side surface of the workpiece 21 and each of the points on the different surface of the workpiece 21, calculated from the three-dimensional data outputted from the three-dimensional sensor 101.
  • The image processing unit 105 converts the distance image to, for example, a grayscale image.
  • In the resulting image, the luminance of the surface 21a of the workpiece 21 is constant within this surface, because the distance from the reference plane to the surface 21a is the constant distance D4.
  • The luminance of the surface 21a of the workpiece 21 being constant within this surface makes it easier to detect the workpiece 21.
  • Constituent elements of the image processing unit according to the embodiment described above can be implemented by hardware, software, or a combination thereof.
  • For example, the constituent elements may be implemented by an electronic circuit.
  • An image processing method that is carried out through cooperation of the aforementioned constituent elements can also be implemented by hardware, software, or a combination thereof.
  • Being implemented by software herein means being implemented through a computer reading and executing a program.
  • The program can be supplied to the computer by being stored on any of various types of non-transitory computer readable media.
  • the non-transitory computer readable media include various types of tangible storage media. Examples of the non-transitory computer readable media include magnetic storage media (such as hard disk drives), magneto-optical storage media (such as magneto-optical disks), compact disc read only memory (CD-ROM), compact disc recordable (CD-R), compact disc rewritable (CD-R/W), and semiconductor memory (such as mask ROM, programmable ROM (PROM), erasable PROM (EPROM), flash ROM, and random access memory (RAM)).
  • The embodiment described above uses, as the detection object, a workpiece, which is a piece of material to be machined in a machine tool.
  • However, the detection object is not limited to a workpiece and may be anything other than a workpiece.
  • For example, the object may be a manufactured article, a commodity, or a packaging material such as a cardboard box containing a manufactured article or a commodity.
  • A reference plane storage unit that stores the equation representing the loading surface 30 or the side surface of the workpiece 21 may be provided instead of the reference plane calculation unit 102.
  • In that case, the distance calculation unit 103 may read the equation representing the loading surface 30 or the side surface of the workpiece 21 from the reference plane storage unit, and convert the received image into a distance image that has, as pixel values, values of the distance from the loading surface 30 or the side surface of the workpiece 21.
  • The three-dimensional sensor 101 does not need to be provided inside the image processing device 10, and may be provided outside the image processing device 10.
  • The workpiece is not limited to being trapezoid-shaped or hexagonal prism-shaped, and may have another shape such as a cube shape or a rectangular prism shape.
  • The image processing device and the image processing method according to the present disclosure can take various embodiments having configurations such as the embodiment described above.
  • According to this image processing device, it is easy to detect an object even if the object is oblique to the optical axis of the three-dimensional sensor.
  • According to this image processing method, it is easy to detect an object even if the object is oblique to the optical axis of the three-dimensional sensor.


Applications Claiming Priority (3)

Application Number  Priority Date  Filing Date  Title
JP2020050744  2020-03-23
JP2020-050744  2020-03-23
PCT/JP2021/010613  2020-03-23  2021-03-16  Image processing device and image processing method

Publications (1)

Publication Number Publication Date
US20230089139A1  2023-03-23

Family

ID=77892155

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/908,082 Pending US20230089139A1 (en) 2020-03-23 2021-03-16 Image processing device and image processing method

Country Status (5)

Country  Publication Number
US (1)  US20230089139A1
JP (1)  JP7436633B2
CN (1)  CN115335658A
DE (1)  DE112021001777T5
WO (1)  WO2021193236A1

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3637226B2  1999-02-01  2005-04-13  Toshiba Corporation  Motion detection method, motion detection device, and recording medium
JP5041458B2  2006-02-09  2012-10-03  Honda Motor Co., Ltd.  Device for detecting three-dimensional objects
JP5458885B2 *  2007-08-30  2014-04-02  Yaskawa Electric Corporation  Object detection method, object detection device, and robot system
JP5493105B2  2010-03-19  2014-05-14  Optex Co., Ltd.  Object dimension measurement method and object dimension measurement device using a range image camera
JP2015170907A *  2014-03-05  2015-09-28  Canon Inc.  Scanner system, data processing method for the scanner system, and program
JP6191019B2 *  2014-12-25  2017-09-06  Panasonic Intellectual Property Management Co., Ltd.  Projection device and projection method
JP6736383B2 *  2016-06-27  2020-08-05  Keyence Corporation  Measuring device
JP2018107642A  2016-12-27  2018-07-05  Canon Inc.  Image processing device, control method for the image processing device, and program
JP7107015B2  2018-06-19  2022-07-27  Oki Electric Industry Co., Ltd.  Point cloud processing device, point cloud processing method, and program
JP7200994B2  2018-07-13  2023-01-10  Nikon Corporation  Processing device, detection device, processing method, and processing program

Also Published As

Publication number Publication date
JPWO2021193236A1 (ja) 2021-09-30
JP7436633B2 (ja) 2024-02-21
CN115335658A (zh) 2022-11-11
WO2021193236A1 (ja) 2021-09-30
DE112021001777T5 (de) 2023-01-12


Legal Events

Date Code Title Description
AS Assignment

Owner name: FANUC CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MURATA, YUUSUKE;TAKIZAWA, SHOUTA;REEL/FRAME:060940/0805

Effective date: 20220823

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION