WO2024166602A1 - Three-dimensional measuring device - Google Patents
Three-dimensional measuring device
- Publication number
- WO2024166602A1 (PCT/JP2024/000596)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- scanner
- unit
- dimensional
- display
- measurement
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2518—Projection by scanning of the object
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/20—Finite element generation, e.g. wire-frame surface description, tesselation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2513—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with several lines being projected in more than one direction, e.g. grids, patterns
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/26—Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/04—Texture mapping
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20092—Interactive image processing based on input by user
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/56—Particle system, point based geometry or rendering
Description
- This disclosure relates to a three-dimensional measuring device equipped with a three-dimensional scanner.
- Patent Document 1 discloses measuring the three-dimensional coordinates of an object to be measured using a contact probe having a contact portion that is brought into contact with a desired portion of the object to be measured.
- multiple markers on the contact probe are captured by an imaging unit installed away from the contact probe, and the three-dimensional coordinates of the contact position of the contact probe can be calculated based on the marker images generated by the imaging unit.
- the contact probe in Patent Document 1 is provided with a display unit that displays a setting screen including measurement items, and the measurement operator can select the setting items while looking at the setting screen displayed on the display unit.
- the device in Patent Document 1 uses a contact probe, so it can only measure the coordinates of the part that is in contact with the probe. Therefore, if a non-contact three-dimensional scanner is used, it becomes possible to measure a wider range of the object to be measured, i.e., to scan a wide area.
- When a measurement operator scans an object to be measured with a three-dimensional scanner, he or she must pay attention to matters such as whether the distance between the object to be measured and the three-dimensional scanner is appropriate, whether the pattern light is being irradiated onto the part of the object that is to be measured, and the extent of the current scan completion range.
- the contact probe in Patent Document 1 is provided with a display unit, but this display unit only displays a setting screen.
- This disclosure has been made in consideration of these points, and its purpose is to make it possible to easily check information regarding the measurement results obtained by a 3D scanner on the 3D scanner.
- the three-dimensional measuring device includes: a three-dimensional scanner having a scanner light source that irradiates pattern light, a scanner imaging unit that captures the pattern light irradiated by the scanner light source to generate an image including the pattern light, a scanner display unit, and a first communication unit that receives display data for generating a display screen to be displayed on the scanner display unit; a position and orientation identification unit that identifies the position and orientation of the three-dimensional scanner; and a three-dimensional data generation means that generates display data showing the three-dimensional shape of a measurement target based on the image including the pattern light generated by the scanner imaging unit and the position and orientation of the three-dimensional scanner identified by the position and orientation identification unit, and that has a second communication unit that transmits the generated display data.
- the first communication unit of the three-dimensional scanner receives the display data transmitted via the second communication unit.
- the scanner display unit can display the display screen generated based on the received display data.
- the three-dimensional data generating means generates display data showing the three-dimensional shape of the measurement object based on the image generated by the scanner imaging unit of the three-dimensional scanner and the position and orientation of the three-dimensional scanner identified by the position and orientation identifying unit.
- the display data is received from the second communication unit of the three-dimensional data generating means via the first communication unit of the three-dimensional scanner.
- the scanner display unit displays a display screen generated based on the display data received via the first communication unit, so that the measurement operator can easily check information on the measurement results by the three-dimensional scanner, i.e., whether the distance (working distance) between the measurement object and the three-dimensional scanner is appropriate, whether the pattern light is being irradiated to the part of the measurement object to be measured, and the current scan completion range, simply by looking at the scanner display unit.
- the display screen may be a screen that displays a point cloud showing the three-dimensional shape of the measurement object, or a screen that displays mesh data showing the three-dimensional shape of the measurement object.
- the three-dimensional scanner may further include a scanner image processing unit that processes an image including the pattern light generated by the scanner image capturing unit to generate first measurement information, and a plurality of self-luminous markers.
- the first communication unit may also transmit the first measurement information generated by the scanner image processing unit.
- the position and orientation identification unit also includes a movable image capturing unit that moves the field of view so that the three-dimensional scanner is within the field of view, captures the self-luminous markers to generate an image including the self-luminous markers in order to measure the position and orientation of the three-dimensional scanner, a camera image processing unit that processes the image including the self-luminous markers generated by the movable image capturing unit to generate second measurement information, and a third communication unit that transmits the second measurement information generated by the camera image processing unit.
- the three-dimensional data generating means can receive the first measurement information generated by the scanner image processing unit and transmitted via the first communication unit, and the second measurement information generated by the camera image processing unit and transmitted via the third communication unit, and generate display data showing the three-dimensional shape of the measurement object based on the received first measurement information and second measurement information.
- the three-dimensional data generating means can transmit the display data, and the first communication unit of the three-dimensional scanner can receive the transmitted display data.
- the three-dimensional data generating means may further include a measurement setting unit that receives settings for at least one of the type of pattern light emitted by the scanner light source and the exposure time of the scanner imaging unit, and a measurement control unit that controls the scanner light source or the scanner imaging unit based on the settings received by the measurement setting unit.
- the scanner display unit also displays a setting screen that accepts settings for at least one of the type of pattern light emitted by the scanner light source and the exposure time of the scanner imaging unit, so the measurement operator can easily configure these settings while looking at the setting screen.
- the setting information received via the setting screen can be written to the measurement setting section of the three-dimensional data generating means, and in this case, the measurement control section of the three-dimensional data generating means can also control the scanner light source or the scanner imaging section based on the setting information written to the measurement setting section.
- the three-dimensional data generating means can also generate new display data showing the three-dimensional shape of the measurement object based on a new image including a pattern light generated by the scanner imaging unit controlled based on the setting information written in the measurement setting unit and the position and orientation of the three-dimensional scanner identified by the position and orientation identifying unit, and can transmit the new display data generated.
- the scanner display unit can display a display screen generated based on the new display data received via the first communication unit.
- distance information indicating the distance between the object to be measured and the three-dimensional scanner can be displayed on the display screen.
- difference information indicating the difference between the CAD data of the object to be measured and the measured three-dimensional shape, etc. can be displayed on the display screen.
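The patent does not say how this difference information is computed; as an illustration only, the following Python sketch measures, for each scanned point, the distance to the nearest point sampled from the CAD reference. The function name and the point-to-point (rather than point-to-surface) comparison are assumptions for the example.

```python
import numpy as np

def deviation_from_reference(measured_points, reference_points):
    """For each measured point, distance to the nearest reference (CAD-sampled) point.

    Brute-force nearest neighbour; a real implementation would use a KD-tree
    and signed point-to-surface distances against the CAD mesh.
    """
    diffs = measured_points[:, None, :] - reference_points[None, :, :]
    dists = np.linalg.norm(diffs, axis=2)
    return dists.min(axis=1)
```

The resulting per-point deviations could then be mapped to a colour scale on the display screen.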
- the three-dimensional measuring device may have a scanner light source that irradiates a pattern light for measuring the three-dimensional shape of a measurement object, and a scanner imaging unit that captures the pattern light irradiated by the scanner light source and generates an image including the pattern light.
- the three-dimensional measuring device may include: an input unit that accepts input of a reference model of the measurement object; a display data generation unit that generates display data for displaying the reference model as a solid on a display unit; a geometric element extraction unit that extracts geometric elements by accepting user input on the reference model displayed as a solid on the display unit; a coordinate system creation unit that creates a coordinate system of the reference model based on the geometric elements extracted by the geometric element extraction unit; a position and orientation identification unit that identifies the position and orientation of the three-dimensional scanner; a three-dimensional data generation means that sequentially generates point cloud data of the measurement object in a measurement coordinate system based on the image including the pattern light generated by the scanner imaging unit and the position and orientation of the three-dimensional scanner identified by the position and orientation identification unit; and a coordinate system matching unit that aligns the coordinate system of the reference model created by the coordinate system creation unit with the measurement coordinate system.
- the reference model of the object to be measured may be, for example, CAD data of the object to be measured.
- the display data generation unit generates display data for displaying the reference model on a display unit with the coordinate system of the reference model and the measurement coordinate system aligned by the coordinate system alignment unit, and in response to the start of point cloud data generation by the three-dimensional data generation means, switches the reference model from a solid display to an edge line display with emphasized edge lines, and generates display data for cumulatively displaying, on the edge-line displayed reference model, a three-dimensional shape based on the point cloud data of the measurement object sequentially generated by the three-dimensional data generation means.
- the display data generation unit can generate, as the edge-line displayed reference model, display data in which the solid of the reference model is displayed semi-transparently.
- the three-dimensional measuring device may further include a measurement processing unit that performs measurement processing of the three-dimensional shape of the measurement object based on a series of point cloud data sequentially generated by the three-dimensional data generating means.
- the measurement processing includes, for example, geometric measurement, comparative measurement, cross-sectional measurement, etc.
- the three-dimensional measuring device may further include a contact probe that indicates the positions of measurement points, and a coordinate calculation unit that calculates the coordinates of multiple measurement points indicated by the contact probe.
- the coordinate system creation unit can create a measurement coordinate system based on the coordinates of the multiple measurement points calculated by the coordinate calculation unit.
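The patent does not specify how the measurement coordinate system is built from the probed coordinates; a common three-point construction (origin, axis direction, plane normal) can serve as an illustration. The function name and point roles below are assumptions for the sketch.

```python
import numpy as np

def coordinate_system_from_points(origin, x_point, plane_point):
    """Build an orthonormal frame from three probed points: origin at the
    first point, X axis toward the second, Z axis normal to the plane of
    all three. Rows of the returned matrix are the frame axes."""
    x = x_point - origin
    x = x / np.linalg.norm(x)
    v = plane_point - origin
    z = np.cross(x, v)
    z = z / np.linalg.norm(z)
    y = np.cross(z, x)  # completes a right-handed frame
    return np.vstack([x, y, z]), origin
```

With more than three points, a least-squares plane fit would make the frame less sensitive to probing noise.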
- the three-dimensional scanner may further include a scanner display unit and a first communication unit that receives the display data generated by the display data generation unit.
- the scanner display unit can receive and display display data via the first communication unit for cumulatively displaying a three-dimensional shape based on point cloud data of the measurement object in the measurement coordinate system created by the coordinate system creation unit.
- the three-dimensional measuring device may further include a second communication unit that transmits display data to the first communication unit of the three-dimensional scanner.
- the scanner display unit can display the display data generated by the display data generator and transmitted to the three-dimensional scanner via the second communication unit and the first communication unit.
- the display data generating unit can also generate display data so that the pattern light contained in the image generated by the scanner imaging unit is displayed on the reference model in different colors on the display unit depending on the distance between the scanner imaging unit and the object to be measured.
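One plausible way to realise this distance-dependent colouring is a simple banded mapping from working distance to an RGB value. The thresholds, colours, and function name below are assumptions, not taken from the patent.

```python
def distance_to_color(distance_mm, near_mm=150.0, far_mm=250.0):
    """Map scanner-to-object distance to a display colour:
    blue when too close, green within the assumed optimal band,
    red when too far."""
    if distance_mm < near_mm:
        return (0, 0, 255)    # too close
    if distance_mm > far_mm:
        return (255, 0, 0)    # too far
    return (0, 255, 0)        # within working range
```

Applied per point along the projected pattern, this gives the operator immediate feedback on whether the working distance is appropriate.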
- the three-dimensional measuring device may further include a region deletion unit that deletes the point cloud data of the region designated by the user input by accepting user input on the three-dimensional shape of the measurement object displayed on the display unit as a result of display data being generated by the display data generation unit.
- the three-dimensional measuring device may further include a texture camera that captures an image of a measurement object and generates a texture image including the texture of the measurement object.
- the position and orientation identification unit can identify the position and orientation of the texture camera.
- the display data generation unit can display a three-dimensional texture image obtained by adding the texture image acquired by the texture image acquisition unit to the three-dimensional shape data based on the position and orientation of the texture camera when the texture image was acquired.
- the display data generation unit generates display data for superimposing a texture image acquired by the texture camera at a predetermined time point on the point cloud data that the three-dimensional data generation means has generated sequentially and cumulatively over a predetermined period, and can transmit the generated display data to the first communication unit.
- the predetermined period of time can be, for example, the period from the start of measurement to the completion of measurement.
- the scanner display unit can then display the display data received via the first communication unit.
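Mapping the texture onto the point cloud requires projecting each 3D point into the texture camera image using the camera pose supplied by the position and orientation identification unit. A minimal pinhole-camera sketch, with hypothetical names (the patent does not give a projection model):

```python
import numpy as np

def project_points_to_texture(points_world, R, t, K):
    """Project world-frame points into the texture camera image.

    R, t map world coordinates into the camera frame (the texture camera
    pose at acquisition time); K is the 3x3 intrinsic matrix. Returns
    pixel coordinates at which each point's colour can be looked up."""
    pts_cam = points_world @ R.T + t   # world -> camera frame
    uvw = pts_cam @ K.T                # camera frame -> image plane
    return uvw[:, :2] / uvw[:, 2:3]    # perspective divide
```

Points projecting outside the image, or occluded from the camera, would be skipped in a full implementation.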
- a display screen generated based on display data showing the three-dimensional shape of the object to be measured can be displayed on the scanner display unit of the three-dimensional scanner, allowing the measurement operator to easily check information related to the measurement results obtained by the three-dimensional scanner on the three-dimensional scanner.
- FIG. 1 is a diagram showing a configuration of a three-dimensional scanner according to an embodiment of the present invention.
- FIG. 2 is a block diagram of an imaging unit and a processing unit.
- FIG. 2 is a perspective view of the three-dimensional scanner seen from below.
- FIG. 2 is a right side view of the three-dimensional scanner.
- FIG. 2 is a plan view of a three-dimensional scanner.
- FIG. 2 is a bottom view of the three-dimensional scanner.
- FIG. 2 is a front view of the three-dimensional scanner.
- FIG. 2 is a cross-sectional view of the three-dimensional scanner taken along the vertical direction.
- FIG. 2 is a block diagram showing the circuit configuration of a three-dimensional scanner.
- FIG. 13 illustrates an example of a setting screen.
- FIG. 10 is a flowchart illustrating an example of a procedure for reflecting setting information.
- FIG. 13 is a diagram showing a first example of a display screen on which a point cloud is displayed.
- FIG. 13 is a diagram showing a second example of a display screen on which a point cloud is displayed.
- FIG. 13 is a diagram showing a third example of a display screen on which a point cloud is displayed.
- FIG. 13 is a diagram showing a fourth example of a display screen on which a point cloud is displayed.
- FIG. 13 is a flowchart illustrating an example of a processing procedure when a viewpoint is fixed.
- FIG. 13 is a diagram showing an example of a display screen showing the difference between CAD data of a measurement object and a measurement result.
- FIG. 13 is a diagram showing an example of a display screen when a texture is reflected.
- FIG. 13 is a flowchart showing an example of a processing procedure for reflecting a texture.
- FIG. 1 is a flowchart showing an example of a procedure for measuring the three-dimensional shape of a measurement object using a three-dimensional scanner.
- FIG. 13 is a flowchart illustrating an example of a data matching process.
- FIG. 13 is a diagram showing a case where a three-dimensional scanner is attached to an arm and used.
- FIG. 1 is a flowchart showing an example of a procedure for three-dimensional measurement.
- FIG. 3 is a view corresponding to FIG. 2 according to another example.
- FIG. 1 is a diagram showing an example of a three-dimensional model displayed on a monitor as a solid object.
- FIG. 2 is a block diagram showing a circuit configuration of a probe.
- FIGS. 13A to 13C are diagrams illustrating an example of a user interface when the three-dimensional scanner is switched from an inactive state to an active state to perform a measurement operation.
- FIG. 13 is a diagram showing an example of a display form in which edge lines are emphasized.
- FIG. 1 is a diagram showing the configuration of a three-dimensional measuring device 1 according to an embodiment of the present invention.
- the three-dimensional measuring device 1 is a shape measuring device that measures the three-dimensional shape and three-dimensional coordinates of a measurement object W without contacting the measurement object W, and is equipped with a three-dimensional scanner 2 having multiple self-luminous markers, an imaging unit 3 that images the multiple self-luminous markers of the three-dimensional scanner 2, and a processing unit 4 that measures the three-dimensional shape and three-dimensional coordinates of the measurement object based on the marker image generated by the imaging unit 3 and the bright line image generated by the three-dimensional scanner 2.
- the three-dimensional scanner 2 is separate from the imaging unit 3 and the processing unit 4, and the measurement operator can bring the three-dimensional scanner 2 close to the measurement object W located away from the imaging unit 3 and the processing unit 4 and have the three-dimensional scanner 2 generate the bright line image.
- the imaging unit 3 is an example of a position and orientation identification unit that identifies the position and orientation of the three-dimensional scanner 2.
- the imaging unit 3 is a unit that captures a plurality of self-luminous markers (described later) provided on the three-dimensional scanner 2 to generate a marker image including the plurality of self-luminous markers.
- the marker image including the self-luminous markers can also be called a second image.
- the imaging unit 3 includes a base 30 and a movable imaging unit 3A that moves the field of view so that the three-dimensional scanner 2 is within the field of view, and captures the self-luminous markers to measure the position and orientation of the three-dimensional scanner 2 and generates a marker image including the self-luminous markers.
- the movable imaging unit 3A includes a movable stage 31 supported by the base 30 and a scanner imaging camera 32 fixed to the upper part of the movable stage 31.
- the movable stage 31 includes a stage driving unit 31a.
- the stage driving unit 31a includes an actuator such as a motor, and is configured to rotate the movable stage 31 around a vertical axis as well as around a horizontal axis. By rotating the movable stage 31 around the vertical axis, the scanner imaging camera 32 rotates around the vertical axis, and by rotating the movable stage 31 around the horizontal axis, the scanner imaging camera 32 rotates around the horizontal axis.
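The camera orientation resulting from the two stage rotations can be expressed as a composition of elementary rotation matrices. The axis conventions and function name below are assumptions for illustration; the patent only states that the stage rotates about a vertical and a horizontal axis.

```python
import numpy as np

def stage_rotation(pan_rad, tilt_rad):
    """Orientation of the scanner imaging camera after rotating the movable
    stage by pan_rad about the vertical (z) axis and tilt_rad about the
    horizontal (x) axis."""
    cz, sz = np.cos(pan_rad), np.sin(pan_rad)
    cx, sx = np.cos(tilt_rad), np.sin(tilt_rad)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])   # pan (vertical axis)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])   # tilt (horizontal axis)
    return Rz @ Rx
```

The multiplication order encodes that the tilt axis is carried along by the pan rotation, which matches a pan-tilt stage mechanically stacked in that order.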
- the stage driving unit 31a is controlled by the main body control unit 33 possessed by the imaging unit 3.
- a plurality of light emitters 31b are provided at a predetermined interval on a two-dimensional plane below the movable stage 31, and the light emitters 31b can be switched between a lit state and an unlit state by the lighting control unit 31c.
- the plurality of light emitters 31b move with the movement of the scanner imaging camera 32 and the movable stage 31.
- the lighting control unit 31c is controlled by the main body control unit 33.
- a reference camera 34 that captures the movable imaging unit 3A is provided on the base 30. This reference camera 34 captures the light emitters 31b that are turned on by the lighting control unit 31c.
- the reference camera 34 captures the light emitters 31b provided on the movable imaging unit 3A to generate an image including the light emitters 31b.
- the reference camera 34 can also be called a fixed imaging unit, and the image including the light emitters 31b can also be called a third image.
- the multiple light emitters 31b can also be called self-luminous markers provided on the movable imaging unit 3A.
- the markers provided on the movable imaging unit 3A may be composed of a marking material other than the light emitters 31b.
- the imaging unit 3 is provided with a camera image processing unit 35.
- the camera image processing unit 35 has an image processing circuit and controls the scanner imaging camera 32 to perform imaging at a predetermined timing.
- Examples of the image processing circuit include a GPU (Graphics Processing Unit), an FPGA (Field Programmable Gate Array), and a DSP (Digital Signal Processor).
- the camera image processing unit 35 receives the marker image captured by the scanner imaging camera 32, as well as the image of the light-emitting body 31b captured by the reference camera 34.
- the camera image processing unit 35 processes the marker image captured by the scanner imaging camera 32 to generate center position information of the self-luminous marker (corresponding to the second measurement information of the present invention). Specifically, the camera image processing unit 35 performs processing to extract the center of the self-luminous marker from the marker image. Then, based on the extraction result, it generates center position information of the self-luminous marker. Furthermore, the camera image processing unit 35 generates position and orientation information of the self-luminous marker relative to the movable imaging unit 3A based on the center position information of the self-luminous marker obtained as a result of the processing to extract the center of the self-luminous marker.
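The patent does not detail the centre-extraction step; one plausible method for a self-luminous marker is an intensity-weighted centroid over the bright pixels, sketched below with hypothetical names and threshold.

```python
import numpy as np

def marker_center(image, threshold=128):
    """Intensity-weighted centroid of pixels brighter than `threshold` —
    one way to extract the centre of a self-luminous marker blob from
    the marker image. Returns (cx, cy) in pixel coordinates."""
    ys, xs = np.nonzero(image > threshold)
    weights = image[ys, xs].astype(float)
    cx = (xs * weights).sum() / weights.sum()
    cy = (ys * weights).sum() / weights.sum()
    return cx, cy
```

A production implementation would first segment the image into one blob per marker and run the centroid per blob.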
- the central position information of the self-luminous markers 71-77, 81-87, 91-97, 101-107 is generated by the following method.
- the camera image processing unit 35 acquires the arrangement information of each of the self-luminous markers 71-77, 81-87, 91-97, 101-107 stored in the three-dimensional scanner 2.
- the camera image processing unit 35 calculates the position at which each marker will be imaged by the imaging unit 3 when the relative position or attitude of the three-dimensional scanner 2 with respect to the imaging unit 3 is changed, and matches each calculated marker position with the marker position in the image 102.
- the relative position and orientation of the three-dimensional scanner 2 with respect to the imaging unit 3 that minimizes the error between the calculated marker positions and the marker positions of the image 102 is calculated, and generated as central position information of the self-luminous markers 71 to 77, 81 to 87, 91 to 97, and 101 to 107.
- the camera image processing unit 35 virtually changes the position and orientation of the three-dimensional scanner 2 to virtually change the arrangement information of the self-luminous markers 71 to 77, 81 to 87, 91 to 97, and 101 to 107 acquired from the three-dimensional scanner 2, calculates the position and orientation that matches the marker image generated by the camera image processing unit 35, and generates central position information of the self-luminous markers 71 to 77, 81 to 87, 91 to 97, and 101 to 107.
- This calculation process of the position and orientation information may be called bundle adjustment.
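The pose search described above (vary the scanner pose until predicted marker positions best match the observed ones) is a nonlinear least-squares problem. As a toy 2D illustration, assuming SciPy is available, the following sketch recovers a planar pose (tx, ty, theta) of a known marker layout from observed positions; all names are hypothetical.

```python
import numpy as np
from scipy.optimize import least_squares

def fit_pose_2d(model_pts, observed_pts):
    """Find the planar pose (tx, ty, theta) of the marker layout that
    minimises the residual between predicted and observed marker
    positions — a 2D toy version of the bundle-adjustment-style
    pose search described in the text."""
    def residuals(p):
        tx, ty, th = p
        R = np.array([[np.cos(th), -np.sin(th)],
                      [np.sin(th),  np.cos(th)]])
        pred = model_pts @ R.T + [tx, ty]
        return (pred - observed_pts).ravel()

    return least_squares(residuals, x0=np.zeros(3)).x
```

The real system solves the analogous 6-degree-of-freedom problem with a camera projection model in the residual.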
- some of the self-luminous markers 71-77, 81-87, 91-97, 101-107 included in the marker image may be selectively used as representative markers.
- the circular self-luminous markers 71-77, 81-87, 91-97, 101-107 may become elliptical depending on the position and orientation of the three-dimensional scanner 2.
- the flattening ratio, which is the ratio of the length of the short axis to the long axis of the self-luminous markers 71-77, 81-87, 91-97, 101-107 included in the marker image, may be used; markers whose flattening ratio is equal to or less than a predetermined value may be excluded from the calculation, and the self-luminous markers 71-77, 81-87, 91-97, 101-107 whose flattening ratio is equal to or greater than the predetermined value may be used as representative markers.
- a marker whose image is close to a perfect circle may be selected as the representative marker.
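The selection of representative markers can be sketched as a simple aspect-ratio filter, using the ratio of the imaged ellipse's minor to major axis (1.0 for a perfect circle). The threshold value and names are assumptions for the example.

```python
def select_representative_markers(ellipses, min_ratio=0.8):
    """Return indices of markers whose imaged ellipse is close to a
    perfect circle. `ellipses` is a list of (minor_axis, major_axis)
    pairs; minor/major is 1.0 for a circle and shrinks as the marker
    is viewed more obliquely."""
    return [i for i, (minor, major) in enumerate(ellipses)
            if minor / major >= min_ratio]
```

Near-circular markers face the camera most directly, so their centres are localised more reliably.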
- the central position information of the self-luminous markers 71-77, 81-87, 91-97, 101-107 calculated here is referenced to the scanner imaging camera 32. Therefore, the camera image processing unit 35 calculates the position and orientation of the three-dimensional scanner 2 relative to the reference camera 34 from the position and orientation of the scanner imaging camera 32 relative to the reference camera 34 and the position and orientation of the three-dimensional scanner 2 relative to the scanner imaging camera 32, thereby generating central position information of the self-luminous markers referenced to the reference camera 34.
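Re-referencing a pose through an intermediate frame is a composition of rigid transforms: the scanner's pose in the reference-camera frame is the scanner-imaging-camera pose in that frame composed with the scanner's pose in the scanner-imaging-camera frame. A minimal sketch with 4x4 homogeneous matrices (names are illustrative):

```python
import numpy as np

def make_transform(R, t):
    """Pack a rotation matrix and translation vector into a 4x4
    homogeneous rigid transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def compose(T_ab, T_bc):
    """Pose of frame C in frame A, from pose of B in A and pose of C
    in B — e.g. scanner-in-reference-camera from
    (imaging-camera-in-reference-camera) @ (scanner-in-imaging-camera)."""
    return T_ab @ T_bc
```

The same composition chains any number of intermediate frames, which is how the marker positions end up expressed relative to the reference camera.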
- the imaging unit 3 is equipped with a wireless communication unit 36 that is controlled by the main body control unit 33.
- the wireless communication unit 36 is a communication module or the like that is configured to be able to communicate with devices other than the imaging unit 3.
- the imaging unit 3 communicates with the three-dimensional scanner 2 via the wireless communication unit 36, and is able to send and receive various data, such as image data captured by the scanner imaging camera 32, as well as various signals.
- the imaging unit 3 also includes a communication section (corresponding to a third communication section of the present invention) 37 controlled by the main body control section 33.
- the communication section 37 is a communication module or the like configured to be able to communicate with the processing section 4.
- the imaging unit 3 communicates with the processing section 4 via the communication section 37, enabling transmission and reception of various data such as image data and various signals.
- the communication by the communication section 37 may be wired communication or wireless communication.
- the communication section 37 transmits central position information of the self-luminous marker generated by the camera image processing section 35.
- the imaging unit 3 has a trigger generation section 38 that generates identification information for identifying the timing of synchronization execution based on a measurement instruction. For example, when a measurement operator performs a predetermined measurement start operation, the main body control section 33 of the imaging unit 3 accepts the measurement start operation. When the main body control section 33 accepts the measurement start operation, it causes the trigger generation section 38 to generate a trigger as the above-mentioned identification information. The trigger is transmitted to the three-dimensional scanner 2, for example, via the wireless communication section 36.
- the trigger generation section 38 can also be referred to as a synchronization means.
- the main body control unit 33 synchronizes the emission of light from the self-luminous markers of the three-dimensional scanner 2, the imaging of the self-luminous markers of the three-dimensional scanner 2 by the movable imaging unit 3A, the lighting of the light emitters 31b of the movable stage 31, and the imaging of the light emitters 31b by the reference camera 34.
- the light emitters 31b of the movable stage 31 may be constantly lit, and therefore the main body control unit 33 synchronizes at least the emission of light from the self-luminous markers of the three-dimensional scanner 2, the imaging by the movable imaging unit 3A, and the imaging by the reference camera 34.
- the timing of the emission of light from the self-luminous markers of the three-dimensional scanner 2 may be slightly earlier than the timing of the imaging by the movable imaging unit 3A, and in this case too, the emission of light from the self-luminous markers of the three-dimensional scanner 2 and the imaging by the movable imaging unit 3A are considered to be synchronized.
- the communication unit 37 links and transmits the central position information of the self-luminous marker generated by the camera image processing unit 35 and the identification information corresponding to the central position information of the self-luminous marker generated by the trigger generation unit 38.
- “Linking” means associating or correlating two or more pieces of information.
- the central position information of the self-luminous marker is associated with identification information for distinguishing the central position information of this self-luminous marker from central position information of other self-luminous markers. Therefore, the central position information of the desired self-luminous marker can be identified based on the identification information.
- the communication unit 37 corresponds to the second transmission unit of the present invention.
- the central position information of the self-luminous marker and the identification information may be transmitted by wireless communication.
- Processing unit 4 is a part that receives the positions and orientations of multiple markers obtained by processing the marker images generated by imaging unit 3 from imaging unit 3, receives edge data of the bright line images obtained by processing the bright line images generated by the three-dimensional scanner 2, and measures the three-dimensional shape of the measurement object W based on the received positions and orientations of the markers and edge data.
- the multiple light-emitting bodies 31b of the imaging unit 3 are mounted on a movable stage 31 to which the scanner imaging camera 32 is fixed, so the positional relationship of the multiple light-emitting bodies 31b with respect to the scanner imaging camera 32 is known.
- when the scanner imaging camera 32 is moved by the stage drive unit 31a, it moves within a range in which the reference camera 34 can image the light-emitting bodies 31b.
- the position and orientation of the three-dimensional scanner 2 with respect to the scanner imaging camera 32 is determined based on the marker image of the three-dimensional scanner 2 captured by the scanner imaging camera 32.
- similarly, the position and orientation of the scanner imaging camera 32 relative to the reference camera 34 is determined based on images of the multiple light-emitting bodies 31b captured by the reference camera 34.
- the camera image processing unit 35 acquires the arrangement information of the light-emitting bodies 31b stored in the memory unit 39c of the imaging unit 3, processes the image of the light-emitting bodies 31b generated by the reference camera 34 based on the arrangement information of the light-emitting bodies 31b, and generates position and orientation information of the scanner imaging camera 32 relative to the reference camera 34.
- the position and orientation information of the scanner imaging camera 32 relative to the reference camera 34 can be called third measurement information.
- the position and orientation of the three-dimensional scanner 2 relative to the scanner imaging camera 32 and the position and orientation of the scanner imaging camera 32 relative to the reference camera 34 are used to determine the position and orientation of the three-dimensional scanner 2 relative to the reference camera 34, and the coordinates of the measurement points are obtained, making it possible to perform three-dimensional coordinate measurement, i.e., to measure the three-dimensional shape.
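- The chaining of poses described above amounts to composing homogeneous transforms: the scanner pose relative to the reference camera is the product of the camera-32 pose relative to the reference camera 34 and the scanner pose relative to camera 32. A minimal Python sketch; all numeric poses and the measurement point are hypothetical values, not from the patent.

```python
import numpy as np

def make_pose(R, t):
    """4x4 homogeneous transform from rotation matrix R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(a):
    """Rotation about the z axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.], [s, c, 0.], [0., 0., 1.]])

# Hypothetical poses: scanner relative to the scanner imaging camera 32,
# and camera 32 relative to the reference camera 34
T_cam_scanner = make_pose(rot_z(0.2), [100., 0., 300.])
T_ref_cam = make_pose(rot_z(-0.05), [0., 50., 0.])

# Chaining the two poses gives the scanner pose in the reference frame
T_ref_scanner = T_ref_cam @ T_cam_scanner

# A measurement point expressed in the scanner frame ...
p_scanner = np.array([10., 20., 30., 1.])
# ... becomes a three-dimensional coordinate in the reference frame
p_ref = T_ref_scanner @ p_scanner
print(p_ref[:3])
```

Because matrix composition is associative, transforming in one step with the chained pose gives the same coordinates as transforming through camera 32 first.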
- the processing unit 4 is configured as a general-purpose notebook personal computer, but it may also be configured as a desktop personal computer or a controller dedicated to the three-dimensional measuring device 1. In either case, it can be used as the processing unit 4 by installing a program or application for realizing the functions of the three-dimensional measuring device 1.
- the processing unit 4 may be separate from the imaging unit 3, or may be integrated with the imaging unit 3. Furthermore, a part of the processing unit 4 may be incorporated into the imaging unit 3, or a part of the imaging unit 3 may be incorporated into the processing unit 4.
- the processing unit 4 includes a control unit 40, a monitor 41, and an operation input unit 42.
- the monitor 41 is configured to display various images, user interfaces, etc., and is configured to include a liquid crystal display or an organic EL display.
- the operation input unit 42 is the part where the user performs various input operations.
- the operation input unit 42 is composed of, for example, a keyboard and a mouse.
- the control unit 40 includes a control unit 43, a display control unit 44, a storage unit 45, and a communication unit 46.
- the display control unit 44 controls the monitor 41 based on signals output from the control unit 43, and causes the monitor 41 to display various images, a user interface, and the like. Operations performed by the user on the user interface are acquired by the control unit 43 based on signals output from the operation input unit 42.
- the storage unit 45 may be a ROM, a solid state drive, a hard disk drive, etc.
- the storage unit 45 stores the placement information of each self-luminous marker in the marker blocks of the three-dimensional scanner 2.
- the placement information of the marker blocks and each self-luminous marker includes information indicating the distance between the marker blocks, the relative positional relationship of the self-luminous markers provided in each marker block, etc.
- the communication unit 46 of the processing unit 4 is controlled by the control unit 43.
- the communication unit 46 is a communication module or the like configured to be able to communicate with the communication unit 37 of the imaging unit 3.
- the three-dimensional scanner 2 is a handheld, portable scanner that is configured so that the measurement operator can measure the shape of the measurement object W while holding it in one or both hands and moving it freely.
- the power source may be supplied from an external source, or may be supplied from a built-in battery.
- the front, back, left, right, and up and down of the three-dimensional scanner 2 are defined as shown in Figures 3 to 7. That is, the side that is located to the right when the measurement operator holds the three-dimensional scanner 2 in his or her hand is called the right, and the side that is located to the left is called the left.
- the front of the three-dimensional scanner 2 is the side that faces the measurement object W
- the rear of the three-dimensional scanner 2 is the side opposite to the side that faces the measurement object W.
- the top of the three-dimensional scanner 2 is the side that is the upper side when the gripping unit 112 described later is held in a predetermined natural posture
- the bottom of the three-dimensional scanner 2 is the side that is the lower side when the gripping unit 112 is held in a predetermined natural posture.
- the three-dimensional shape of the measurement object W can be measured while the three-dimensional scanner 2 is held and moved by hand, so the orientation of the three-dimensional scanner 2 may be inverted upside down, the top side may be located to the right or left, or the rear side may be located at the top or bottom.
- the three-dimensional scanner 2 comprises a scanner body 20, a first marker block 21, a second marker block 22, a third marker block 23, and a fourth marker block 24.
- the first to fourth marker blocks 21 to 24 each have a self-luminous marker that faces in multiple directions.
- the scanner body 20 has a first arm portion 51 extending upward from the center, a second arm portion 52 extending downward from the center, a third arm portion 53 extending left from the center, and a fourth arm portion 54 extending right from the center.
- a first marker block 21 is attached to the tip of the first arm section 51
- a second marker block 22 is attached to the tip of the second arm section 52
- a third marker block 23 is attached to the tip of the third arm section 53
- a fourth marker block 24 is attached to the tip of the fourth arm section 54.
- the scanner body 20 has a scanner unit 60 and an optical base 61.
- the scanner unit 60 has a first scanner light source 62, a second scanner light source 63, a first scanner imaging unit 64, a second scanner imaging unit 65, and a texture camera 66.
- the part above the center of the optical base 61 is a part that constitutes the first arm unit 51, and serves as an upper support part 61a that supports the first marker block 21. Therefore, the first marker block 21 is attached to the upper end of the upper support part 61a.
- the part below the center of the optical base 61 is a part that constitutes the second arm unit 52, and serves as a lower support part 61b that supports the second marker block 22. Therefore, the second marker block 22 is attached to the lower end of the lower support part 61b.
- Two first scanner light sources 62 are attached at a distance in the left-right direction to the vertical center of the optical base 61, i.e., the part between the upper support part 61a and the lower support part 61b.
- the two first scanner light sources 62 are multi-line light sources that irradiate multiple linear light in the measurement direction (forward), and are arranged so that the light emission surface faces the measurement object W during measurement.
- the light irradiated by the first scanner light source 62 can be called multi-line light, and multi-line light is included in the pattern light.
- a second scanner light source 63 is attached above the first scanner light source 62 in the vertical center of the optical base 61.
- the second scanner light source 63 is a single-line light source that irradiates one linear light in the measurement direction (forward), and is disposed so that the light emission surface faces the measurement target W during measurement.
- the light irradiated by the second scanner light source 63 can be called single-line light, and the single-line light is also included in the pattern light.
- the first scanner light source 62 and the second scanner light source 63 have laser light sources that emit laser light, but the type of light source is not particularly limited. In this example, a total of three scanner light sources 62, 63 are provided, but this is not limited, and it is sufficient that one or more scanner light sources are provided. In addition, the type of pattern light is not particularly limited, and the scanner light source may emit pattern light other than multi-line light and single-line light.
- the first scanner imaging unit 64 and the second scanner imaging unit 65 each have a light receiving element such as a CMOS sensor, an optical system for forming an image of light incident from the outside on the light receiving surface of the light receiving element, and the like.
- the first scanner imaging unit 64 is attached to the upper part of the optical base 61, which is a part above and away from the scanner light sources 62 and 63.
- the second scanner imaging unit 65 is attached to the lower part of the optical base 61, which is a part below and away from the scanner light sources 62 and 63.
- the first scanner imaging unit 64 and the second scanner imaging unit 65 are arranged so that their optical axes face the direction of irradiation of the pattern light by the scanner light sources 62 and 63, which makes it possible to image the pattern light irradiated by the scanner light sources 62 and 63 in the measurement direction and generate a bright line image including the pattern light.
- the bright line image including the pattern light can also be called a first image.
- the distance between the first scanner imaging unit 64 and the second scanner imaging unit 65 can be secured long, and the accuracy of the stereo measurement method can be improved. That is, the distance between the optical axes of the first scanner imaging unit 64 and the second scanner imaging unit 65 is known, and the pattern light irradiated from the first scanner light source 62 or the second scanner light source 63 is simultaneously captured by the first scanner imaging unit 64 and the second scanner imaging unit 65 to obtain corresponding points of the respective images generated, and the three-dimensional coordinates of the corresponding points can be obtained by using the stereo measurement method.
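- The stereo step described above can be illustrated with midpoint triangulation of two viewing rays whose baseline is known. This is a generic sketch with hypothetical geometry, not the device's actual algorithm; the 200 mm baseline and target coordinates are made up for illustration.

```python
import numpy as np

def triangulate(ray1_origin, ray1_dir, ray2_origin, ray2_dir):
    """Midpoint triangulation: closest point between two viewing rays."""
    d1 = ray1_dir / np.linalg.norm(ray1_dir)
    d2 = ray2_dir / np.linalg.norm(ray2_dir)
    # Solve for ray parameters s, t minimising |o1 + s*d1 - (o2 + t*d2)|
    A = np.array([[d1 @ d1, -d1 @ d2],
                  [d1 @ d2, -d2 @ d2]])
    b = np.array([(ray2_origin - ray1_origin) @ d1,
                  (ray2_origin - ray1_origin) @ d2])
    s, t = np.linalg.solve(A, b)
    return ((ray1_origin + s * d1) + (ray2_origin + t * d2)) / 2

# Two imaging units separated by a known baseline (hypothetical: 200 mm)
o1 = np.array([0., 100., 0.])   # first scanner imaging unit
o2 = np.array([0., -100., 0.])  # second scanner imaging unit
target = np.array([30., 10., 400.])
p = triangulate(o1, target - o1, o2, target - o2)
print(p)  # recovers the target point
```

With noisy corresponding points the two rays no longer intersect exactly, and the midpoint of the shortest connecting segment serves as the estimated three-dimensional coordinate; a longer baseline between the two imaging units reduces the depth error, which is why securing a long distance between them improves accuracy.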
- the stereo measurement method may be a passive stereo using the first scanner imaging unit 64 and the second scanner imaging unit 65, or an active stereo using one scanner imaging unit.
- when the measurement target W is a mirror-reflecting object, or when a deep hole is measured, one of the images generated by the first scanner imaging unit 64 and the second scanner imaging unit 65 may not include the pattern light.
- in this case, the three-dimensional coordinates may be calculated using the active stereo method, based on the positional relationship between the scanner imaging unit that captured the image containing the pattern light and the scanner light source.
- the texture camera 66 has a light receiving element, such as a CMOS sensor capable of acquiring color images, and an optical system for forming an image of light incident from the outside on the light receiving surface of the light receiving element.
- the texture camera 66 is attached to the optical base 61 between the first scanner imaging unit 64 and the second scanner imaging unit 65.
- the texture camera 66 is positioned so that its optical axis faces the measurement object W during measurement, and captures an image of the measurement object W to generate a texture image.
- the first to fourth marker blocks 21 to 24 have the same structure.
- the first marker block 21 has the first to seventh self-luminous markers 71 to 77 that face multiple directions.
- the second to fourth marker blocks 22 to 24 are configured in the same manner as the first marker block 21. That is, as shown in Figures 3 to 8, the second marker block 22 has first to seventh self-light-emitting markers 81 to 87, the third marker block 23 has first to seventh self-light-emitting markers 91 to 97, and the fourth marker block 24 has first to seventh self-light-emitting markers 101 to 107.
- the first self-luminous marker 71 of the first marker block 21 and the first self-luminous marker 81 of the second marker block 22 are arranged so as to be offset in position around the straight line B, and the optical axis of the first self-luminous marker 71 of the first marker block 21 and the optical axis of the first self-luminous marker 81 of the second marker block 22 are oriented in different directions. This is because the multiple side faces formed on the second marker block 22 are arranged so as to be offset in position around an axis extending in a first direction (up-down direction) relative to the multiple side faces formed on the first marker block 21.
- the multiple side faces formed on the fourth marker block 24 are arranged so as to be offset in position around an axis extending in a second direction (left-right direction) relative to the multiple side faces formed on the third marker block 23. This makes it difficult for multiple solutions to be obtained when processing the marker image, which will be described later.
- the scanner body 20 is equipped with a resin exterior member 110 that covers the optical base 61.
- the front part of the exterior member 110 has a scanner cover part 111 that covers the first scanner light source 62, the second scanner light source 63, the first scanner imaging part 64, and the second scanner imaging part 65.
- the rear part of the exterior member 110 has a grip part 112 that is gripped by the measurement operator.
- the gripping portion 112 has a vertically long shape, its upper end portion is integrated with the main body of the exterior member 110, and is located away from the optical base 61 on the opposite side (rear side) of the measurement direction.
- the upper end of the gripping part 112 is provided with a scanner display unit 113 for displaying information on the measurement results by the scanner unit 60, a setting screen, etc., and an operation unit 114 for operating the scanner unit 60.
- the scanner display unit 113 is composed of a liquid crystal display, an organic EL display, etc., and is arranged so that the display surface is inclined. In addition, the display surface faces the measurement subject side, so that the three-dimensional scanner 2 can be moved while looking at the display contents of the scanner display unit 113.
- This scanner display unit 113 is, so to speak, a built-in display unit: a display unit incorporated in the three-dimensional scanner 2, a display unit provided integrally with the three-dimensional scanner 2, or a display unit attached inseparably to the main body of the three-dimensional scanner 2 during operation.
- the scanner display unit 113 is provided above the gripping part 112, near the part where the gripping part 112 and the scanner main body 20 are connected.
- the display surface of the scanner display unit 113 is embedded above the grip unit 112 so that it faces in the opposite direction to the direction in which light is emitted from the scanner light sources 62 and 63. Furthermore, as shown in FIG., the periphery of the scanner display unit 113 is covered with the exterior member 110, and is disposed integrally and continuously with the exterior member 110 that covers the grip unit 112.
- the operation unit 114 is also an operation unit incorporated in the three-dimensional scanner 2 like the scanner display unit 113, and is a so-called built-in operation unit.
- the operation unit 114 is disposed below the scanner display unit 113, and is surrounded by the exterior member 110.
- the exterior member 110 has projections and recesses corresponding to the shape of the operation unit 114, and the operation unit 114 is also covered by the exterior member 110.
- a touch panel 113a that can be operated by touch is also provided on the display surface side of the scanner display unit 113.
- the operation unit 114 is composed of a number of operation buttons, including, for example, a measurement start button and a measurement stop button, and is located below the scanner display unit 113.
- the touch panel 113a can also be part of the operation unit.
- Three-dimensional scanner 2 comprises a display control unit 140, a marker lighting control unit 141, a scanner control unit 142, and a memory unit 143.
- the display control unit 140 is a part that controls the scanner display unit 113 based on signals output from the scanner control unit 142, and causes the scanner display unit 113 to display various images, user interfaces, etc. Operations performed by the user on the scanner display unit 113 are acquired by the scanner control unit 142 based on signals output from the touch panel 113a.
- the marker lighting control unit 141 is a part that controls the self-luminous markers 71-77, 81-87, 91-97, and 101-107 (only 71 is shown in FIG. 16).
- the self-luminous markers 71-77, 81-87, 91-97, and 101-107 are switched between a lit state and an unlit state by the marker lighting control unit 141.
- the marker lighting control unit 141 is controlled by the scanner control unit 142.
- the memory unit 143 is capable of temporarily storing programs, images captured by the scanner unit 60, and the like.
- the three-dimensional scanner 2 is equipped with a wireless communication unit (first communication unit) 144 that is controlled by the scanner control unit 142.
- the wireless communication unit 144 is a communication module or the like that is configured to be able to communicate with devices other than the three-dimensional scanner 2.
- the three-dimensional scanner 2 communicates with the imaging unit 3 via the wireless communication unit 144, and it is possible to send and receive various data, such as image data captured by the scanner unit 60, and various signals, for example.
- the three-dimensional scanner 2 is equipped with a motion sensor 145.
- the motion sensor 145 is composed of sensors that detect the acceleration and angular velocity of the three-dimensional scanner 2, and the detected values are output to the scanner control unit 142 and used for various calculation processes.
- the value output from the motion sensor 145 can be used to obtain an initial solution for the posture of the three-dimensional scanner 2, i.e., the postures of the first to fourth marker blocks 21 to 24, thereby improving matching accuracy and improving processing speed during posture calculation.
- Processing using the value output from the motion sensor 145 may be executed by the imaging unit 3 or the processing unit 4.
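- One way such an initial attitude solution could be obtained from the angular-velocity output is by integrating the gyro signal over time. The following is a generic Python sketch using the Rodrigues rotation formula with hypothetical sensor values; the patent does not specify the integration scheme, so this is an assumption for illustration only.

```python
import numpy as np

def integrate_gyro(R, omega, dt):
    """One integration step: update orientation R from angular velocity omega
    (rad/s, body frame) over dt seconds, via the Rodrigues formula."""
    angle = np.linalg.norm(omega) * dt
    if angle < 1e-12:
        return R
    axis = omega / np.linalg.norm(omega)
    K = np.array([[0., -axis[2], axis[1]],
                  [axis[2], 0., -axis[0]],
                  [-axis[1], axis[0], 0.]])
    dR = np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)
    return R @ dR  # right-multiply for a body-frame angular velocity

# Integrating a constant 0.5 rad/s yaw rate for 1 s gives ~0.5 rad of
# rotation, usable as an initial solution for the marker-based pose
# calculation before it is refined against the marker image.
R = np.eye(3)
for _ in range(100):
    R = integrate_gyro(R, np.array([0., 0., 0.5]), 0.01)
print(np.degrees(np.arctan2(R[1, 0], R[0, 0])))  # ~28.6 degrees
```

Gyro integration drifts over time, which is consistent with using it only as an initial solution that the marker-based matching then corrects.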
- the three-dimensional scanner 2 includes a scanner light source control unit 146 and a scanner image processing unit 147.
- the scanner light source control unit 146 controls the first scanner light source 62 and the second scanner light source 63.
- the first scanner light source 62 and the second scanner light source 63 are switched between a lit state and an off state by the scanner light source control unit 146.
- the scanner light source control unit 146 is controlled by the scanner control unit 142.
- the scanner image processing unit 147 controls the first scanner imaging unit 64, the second scanner imaging unit 65, and the texture camera 66 to perform imaging at a predetermined timing.
- the images captured by the first scanner imaging unit 64, the second scanner imaging unit 65, and the texture camera 66 are input to the scanner image processing unit 147.
- the scanner image processing unit 147 performs various image processing such as extraction of edge data on the input images.
- the scanner image processing unit 147 generates edge data (corresponding to the first measurement information of the present invention) by performing edge extraction processing on the bright line image generated by the first scanner imaging unit 64 or the second scanner imaging unit 65.
- for example, when the first scanner light source 62 irradiates multi-line light, the first scanner imaging unit 64 and the second scanner imaging unit 65 generate a multi-line image, and the scanner image processing unit 147 processes the multi-line image to generate edge data.
- the wireless communication unit 144 links the edge data generated by the scanner image processing unit 147 with the identification information generated by the trigger generation unit 38 and corresponding to the edge data, and transmits them.
- the edge data is associated with the identification information for distinguishing this edge data from other edge data. Therefore, the desired edge data can be identified based on the identification information.
- the wireless communication unit 144 corresponds to the first transmission unit of the present invention. Note that the edge data and the identification information may also be transmitted by wired communication.
- the scanner control section 142 of the three-dimensional scanner 2 receives the trigger transmitted from the imaging unit 3.
- when the scanner control section 142 receives the trigger, it causes the scanner light source control section 146 to execute irradiation of pattern light from the first scanner light source 62 or the second scanner light source 63, causes the scanner image processing unit 147 to execute imaging by the first scanner imaging unit 64 and the second scanner imaging unit 65, and causes the marker lighting control unit 141 to make the self-luminous markers 71 to 77, 81 to 87, 91 to 97, and 101 to 107 emit light.
- Irradiation of pattern light from the first scanner light source 62 or the second scanner light source 63, imaging by the first scanner imaging unit 64 and the second scanner imaging unit 65, and emission of the self-luminous markers 71 to 77, 81 to 87, 91 to 97, and 101 to 107 are synchronized.
- the main body control unit 33 of the imaging unit 3 and the scanner control unit 142 of the three-dimensional scanner 2 work together to synchronize, in response to a trigger being generated by the trigger generation unit 38, the irradiation of pattern light from the scanner light sources 62, 63, the imaging by the scanner imaging units 64, 65, the emission of light from the self-luminous markers 71-77, 81-87, 91-97, 101-107, and the imaging by the movable imaging unit 3A.
- the three-dimensional scanner 2 is equipped with an indicator light 148 and a communication control unit 149.
- the indicator light 148 displays the operating status of the three-dimensional scanner 2, and is controlled by the scanner control unit 142.
- the communication control unit 149 is a part that performs processing to execute communication of, for example, image data, etc.
- the processing unit 4 shown in FIG. 2 is a three-dimensional data generation means that generates a point cloud indicating the three-dimensional shape of the measurement object W based on the edge data generated by the scanner image processing unit 147, the center position information of the self-luminous marker generated by the camera image processing unit 35, and the position and orientation information of the scanner imaging camera 32.
- the point cloud data indicating the three-dimensional shape of the measurement object W is an example of display data indicating the three-dimensional shape of the measurement object W.
- the processing unit 4 includes a three-dimensional data generating unit 43a.
- the processing unit 4 receives the edge data generated by the scanner image processing unit 147 of the three-dimensional scanner 2, identification information corresponding to the edge data, the center position information of the self-luminous marker generated by the camera image processing unit 35 of the imaging unit 3, and identification information corresponding to the center position information of the self-luminous marker.
- after receiving each piece of data and information, the three-dimensional data generating unit 43a generates point cloud data indicating the three-dimensional shape of the measurement object W based on the received edge data, the identification information corresponding to the edge data, the center position information of the self-luminous marker, and the identification information corresponding to the center position information of the self-luminous marker.
- the imaging unit 3 is equipped with a memory 39a that sequentially stores edge data generated by the scanner image processing unit 147, and a matching unit 39b that matches the edge data with central position information of the self-luminous marker based on identification information.
- when multiple measurement objects W are measured sequentially, or when different parts of the same measurement object W are measured sequentially, the scanner image processing unit 147 generates multiple pieces of edge data.
- the multiple generated edge data are each linked to different identification information and transmitted from the wireless communication unit 144 of the three-dimensional scanner 2 to the imaging unit 3.
- the multiple edge data transmitted from the wireless communication unit 144 of the three-dimensional scanner 2 are stored in the memory 39a of the imaging unit 3 with the identification information linked to them.
- the matching unit 39b identifies the center position information of the self-luminous marker to be transmitted to the processing unit 4.
- the matching unit 39b identifies edge data having identification information linked to the identified center position information of the self-luminous marker from among the multiple edge data stored in the memory 39a.
- the matching unit 39b matches the identified edge data with the center position information of the self-luminous marker.
- the communication unit 37 of the imaging unit 3 transmits the edge data identified by the matching unit 39b in a state in which it is associated with the center position information of the self-luminous marker to the processing unit 4.
- the timing at which processing ends may differ between the generation of the edge data and the generation of the central position information of the self-luminous marker, but by synchronizing using a trigger ID as in this example, a point cloud indicating a three-dimensional shape can be generated from corresponding items regardless of the difference in the timing at which the processing ends.
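- Matching results by trigger ID can be sketched as a small buffer keyed by identification information: each half of a measurement is stored as it arrives, and a pair is emitted only once both halves with the same ID exist. The class and method names below are illustrative, not from the patent.

```python
# Minimal sketch of the matching performed by the matching unit 39b:
# edge data from the scanner and marker centre positions from the imaging
# unit may finish processing at different times, so each result is stored
# keyed by its trigger ID and paired only when both halves are present.

class MatchingUnit:
    def __init__(self):
        self.edge_buffer = {}  # trigger_id -> edge data (cf. memory 39a)
        self.matched = []      # (edge data, marker centres) pairs

    def on_edge_data(self, trigger_id, edges):
        # Edge data arrives linked to its identification information
        self.edge_buffer[trigger_id] = edges

    def on_marker_centers(self, trigger_id, centers):
        # Identify the edge data linked to the same identification info
        edges = self.edge_buffer.pop(trigger_id, None)
        if edges is not None:
            self.matched.append((edges, centers))

m = MatchingUnit()
m.on_edge_data(7, ["edge-7"])          # scanner finishes first ...
m.on_edge_data(8, ["edge-8"])
m.on_marker_centers(8, ["centers-8"])  # ... imaging unit catches up later
print(m.matched)  # [(['edge-8'], ['centers-8'])]
```

Unmatched entries (ID 7 above) simply wait in the buffer until their counterpart arrives, which is why the order and timing of completion do not matter.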
- the processing unit 4 has a measurement setting unit 48 that accepts settings for at least one of the type of pattern light emitted by the scanner light sources 62, 63 of the three-dimensional scanner 2 and the exposure time of the scanner imaging units 64, 65.
- Types of pattern light include multi-line light and single-line light. The settings for the type of pattern light and the exposure time can be accepted via a setting screen, which will be described later, and this setting process will be described later.
- the control unit 43 of the processing unit 4 is a measurement control unit that controls the scanner light sources 62, 63 or the scanner imaging units 64, 65 based on the settings accepted by the measurement setting unit 48.
- When the measurement setting unit 48 accepts a setting for multi-line light, information (setting information) indicating that multi-line light has been set is written to the measurement setting unit 48, and when single-line light has been set, information (setting information) indicating that single-line light has been set is written to the measurement setting unit 48. Furthermore, when an exposure time has been set, the set exposure time (setting information) is written to the measurement setting unit 48.
- the control unit 43 controls the scanner light sources 62, 63 or the scanner imaging units 64, 65 based on the setting information written in the measurement setting unit 48. For example, when multi-line light is set, the control unit 43 reads information indicating that multi-line light has been set from the measurement setting unit 48 and transmits the read setting information to the three-dimensional scanner 2 via the communication unit 46.
- the scanner light source control unit 146 of the three-dimensional scanner 2 controls the first scanner light source 62 so that multi-line light is emitted.
- the scanner light source control unit 146 of the three-dimensional scanner 2 controls the second scanner light source 63 so that single-line light is emitted.
- When the exposure time is set, the control unit 43 reads the set exposure time from the measurement setting unit 48 and transmits it to the three-dimensional scanner 2 via the communication unit 46.
- the scanner image processing unit 147 of the three-dimensional scanner 2 controls the scanner imaging units 64 and 65 so that they operate with the set exposure time.
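The setting flow described above (setting information written to the measurement setting unit 48, read back by the control unit, transmitted to the scanner side, and applied there) can be sketched as follows. This is an illustrative sketch only; all class and attribute names are assumptions, not taken from the disclosure.

```python
# Hypothetical sketch of the setting-propagation flow: the measurement
# setting unit stores setting information, the control unit reads it
# back and forwards it to the scanner side, which applies it.

class MeasurementSettingUnit:
    """Holds the currently accepted settings (cf. measurement setting unit 48)."""
    def __init__(self):
        self.pattern_light = None      # "multi-line" or "single-line"
        self.exposure_time_us = None   # set exposure time

class ScannerSide:
    """Applies received settings (cf. light source control 146 / image processing 147)."""
    def __init__(self):
        self.active_light = None
        self.exposure_time_us = None

    def receive(self, settings):
        # Select the light source according to the pattern-light setting.
        if settings.pattern_light == "multi-line":
            self.active_light = "first_scanner_light_source_62"
        elif settings.pattern_light == "single-line":
            self.active_light = "second_scanner_light_source_63"
        # Apply the exposure time to the scanner imaging units.
        if settings.exposure_time_us is not None:
            self.exposure_time_us = settings.exposure_time_us

def control_unit_sync(setting_unit, scanner):
    """cf. control unit 43: read the settings and transmit them to the scanner."""
    scanner.receive(setting_unit)
```

For example, writing a multi-line setting and an exposure time into the setting unit and then calling `control_unit_sync` would select the first scanner light source and apply the exposure time on the scanner side.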
- When the processing unit 4 generates display data showing the three-dimensional shape of the measurement object W, it transmits the generated display data via the communication unit 46 (corresponding to the second communication unit of the present invention).
- the wireless communication unit 144 of the three-dimensional scanner 2 receives the display data transmitted via the communication unit 46 of the processing unit 4.
- the scanner display unit 113 displays a display screen generated based on the display data received via the wireless communication unit 144.
- the scanner display unit 113 may display a display screen generated based on the display data received via the communication control unit 149.
- the following mainly describes the case where the three-dimensional scanner 2 wirelessly communicates with at least one of the imaging unit 3 and the processing unit 4, but wired communication via a communication cable may also be used.
- the processing unit 4 receives edge data generated by the scanner image processing unit 147 of the three-dimensional scanner 2 and transmitted via the wireless communication unit 144, and central position information of the self-luminous markers 71-77, 81-87, 91-97, 101-107 generated by the camera image processing unit 35 of the imaging unit 3 and transmitted via the wireless communication unit 36, and generates display data showing the three-dimensional shape of the measurement object W based on the received edge data and the central position information of the self-luminous markers 71-77, 81-87, 91-97, 101-107.
- the display data is generated each time imaging is completed, and the processing unit 4 transmits the generated display data to the three-dimensional scanner 2.
- the scanner display unit 113 also displays a setting screen 200 (FIG. 10) that accepts settings for at least one of the type of pattern light emitted by the scanner light sources 62, 63 described above and the exposure time of the scanner imaging units 64, 65.
- the setting screen 200 is a so-called user interface screen, and is generated by the display control unit 140 or scanner control unit 142 of the three-dimensional scanner 2 and displayed on the scanner display unit 113.
- the information required to generate the setting screen 200 may be transmitted from the processing unit 4, or may be generated by the imaging unit 3.
- the setting screen 200 is provided with a pattern light setting area 201 for setting the type of pattern light, an exposure time setting area 202 for setting the exposure time, and a resolution setting area 203 for setting the resolution.
- the pattern light setting area 201 is provided with a first button 201a for setting multi-line light and a second button 201b for setting single-line light.
- the touch panel 113a detects that the first button 201a has been pressed, the first button 201a is displayed in a form that makes it clear that it has been pressed, and the scanner control unit 142 transmits the detection result to the processing unit 4.
- the control unit 43 of the processing unit 4 that receives the detection result writes information indicating that multi-line light has been set to the measurement setting unit 48.
- the scanner control unit 142 determines whether the current setting is multi-line light or not, and if the current setting is not multi-line light, controls the first scanner light source unit 62 so that multi-line light is irradiated from the first scanner light source unit 62.
- the second button 201b is pressed, the second button 201b is displayed in a form that makes it clear that it has been pressed, and the scanner control unit 142 sends the detection result to the processing unit 4.
- the control unit 43 of the processing unit 4 that receives the detection result writes information indicating that single line light has been set to the measurement setting unit 48.
- the scanner control unit 142 also determines whether the current setting is single line light or not, and if the current setting is not single line light, controls the second scanner light source unit 63 so that single line light is irradiated from the second scanner light source unit 63.
- the exposure time setting area 202 is provided with a decrease button 202a that is operated to shorten the exposure time, an increase button 202b that is operated to lengthen the exposure time, and an exposure time display section 202c that displays the set exposure time numerically.
- the measurement operator can easily set the desired exposure time by operating the increase button 202b or the decrease button 202a while looking at the exposure time displayed in the exposure time display section 202c.
- the scanner control section 142 transmits the set exposure time to the processing section 4.
- the control section 43 of the processing section 4 that receives the exposure time writes the exposure time in the measurement setting section 48.
- the scanner control section 142 also controls the scanner imaging sections 64 and 65 based on the set exposure time.
- An automatic button 202d may also be provided. When this automatic button 202d is operated, the three-dimensional measuring device 1 executes a process to automatically obtain the optimal exposure time, and the obtained exposure time is automatically set.
- the resolution setting area 203 is provided with a low resolution button 203a that is operated when increasing the amount of thinning out and lowering the resolution when generating a point cloud, a high resolution button 203b that is operated when decreasing the amount of thinning out and raising the resolution when generating a point cloud, and a resolution display section 203c that displays the set resolution.
- the measurement operator can set a desired resolution by operating the low resolution button 203a or the high resolution button 203b.
- the scanner light source control section 146 transmits the set resolution to the processing section 4.
- the control section 43 of the processing section 4 that receives the resolution writes the resolution into the measurement setting section 48.
- the three-dimensional data generating section 43a generates a point cloud so that the resolution becomes the resolution written into the measurement setting section 48.
- the resolution can be set in stages, such as "high resolution", "standard", and "low resolution".
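The staged resolution setting described above can be illustrated as a simple decimation of the generated point cloud. The mapping of stages to thinning steps below is an assumption for illustration, not a value from the disclosure.

```python
# Hypothetical mapping of resolution stages to decimation steps: a
# larger step means more points are thinned out when generating the
# point cloud, lowering the resolution.
DECIMATION_STEP = {"high resolution": 1, "standard": 2, "low resolution": 4}

def thin_point_cloud(points, resolution):
    """Keep every n-th point according to the selected resolution stage."""
    step = DECIMATION_STEP[resolution]
    return points[::step]
```

With this sketch, "low resolution" keeps one point out of every four, while "high resolution" keeps all points.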
- both the settings for the three-dimensional scanner 2 and the settings for the processing unit 4 can be set.
- the scanner control unit 142 controls at least one of the scanner imaging units 64, 65 and the scanner light sources 62, 63 based on the setting information accepted on the setting screen 200.
- the setting information for the three-dimensional scanner 2 accepted on the setting screen 200 is also sent to the processing unit 4 and used when generating a point cloud.
- the scanner control unit 142 sends the setting information accepted on the setting screen 200 to the processing unit 4.
- the setting items set on the setting screen 200 may include items that control the operation of the three-dimensional scanner 2 and are also transferred to the processing unit 4, and items that are transferred to the processing unit 4 without controlling the operation of the three-dimensional scanner 2.
- step S11 pressing (operation) of a setting button is detected.
- the setting buttons are the first button 201a, the second button 201b, the decrease button 202a, the increase button 202b, etc. shown in FIG. 10.
- step S12 notification that the setting button has been pressed and the setting information has been accepted (including the setting information itself) is transmitted to the imaging unit 3.
- step S13 the imaging unit 3 receives the fact that the setting information has been accepted.
- step S14 the imaging unit 3 transmits the fact that the setting information has been accepted to the processing unit 4.
- step S15 the processing unit 4 receives the fact that the setting information has been accepted.
- step S16 the setting information is reflected.
- the three-dimensional data generating unit 43a of the processing unit 4 shown in FIG. 2 generates new display data showing the three-dimensional shape of the measurement target W based on a new image including the pattern light generated by the scanner imaging units 64, 65 controlled based on the setting information written in the measurement setting unit 48, and the position and orientation of the three-dimensional scanner 2 specified by the imaging unit 3. That is, when the above-mentioned setting operation is performed by the measurement operator and the exposure time is changed, the scanner imaging units 64, 65 are controlled to have the changed exposure time, so that a new image different from the image generated before the change is generated. The display data generated based on this new image and the position and orientation of the three-dimensional scanner 2 is different from the display data before the change.
- the processing unit 4 transmits the new display data after the exposure time change to the three-dimensional scanner 2.
- the scanner display unit 113 can display a display screen generated based on the new display data transmitted from the processing unit 4, so the measurement operator can determine whether the changed exposure time is appropriate just by looking at the scanner display unit 113 of the three-dimensional scanner 2. Similarly, for pattern light settings, the measurement operator can determine whether the changed pattern light is appropriate.
- Figure 12 is a diagram showing a first example of a shape display screen 210 that displays a point cloud that indicates the three-dimensional shape of a measurement object W.
- Figure 12 shows an example in which a point cloud is displayed as the measurement result of a measurement object W irradiated with multi-line light.
- the scanner display unit 113 is capable of displaying a shape display screen 210 that has a distance information display area 211 that indicates the distance between the measurement object W and the three-dimensional scanner 2.
- the distance between the measurement object W and the three-dimensional scanner 2 can also be called the working distance, and therefore the distance information display area 211 is also a working distance display area.
- the shape display screen 210 is provided with a viewpoint fixing button 500, a texture capture button 501, a setting button 502, a scan stop button 503, and a scan start button 504.
- the viewpoint fixing button 500 is a button operated when fixing the viewpoint of the image displayed on the shape display screen 210.
- the texture capture button 501 is a button operated when acquiring a texture image with the texture camera 66, and a trigger signal for texture acquisition is generated when the texture capture button 501 is operated.
- the setting button 502 is a button operated when making various settings, and when the setting button 502 is operated, a setting screen (not shown) is displayed and various setting operations can be accepted.
- the scan stop button 503 is a button operated when stopping scanning by the three-dimensional scanner 2.
- the scan start button 504 is a button operated when starting scanning by the three-dimensional scanner 2.
- the distance information display area 211 may display whether the distance between the measurement object W and the three-dimensional scanner 2 is relatively close or far, or may display the distance numerically. In this example, the distance between the measurement object W and the three-dimensional scanner 2 is displayed in the form of a color bar.
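As a rough illustration of the distance information display, the working distance could be mapped to a color-bar category as follows. The near/far limits and color labels are hypothetical values for illustration, not taken from the disclosure.

```python
def working_distance_color(distance_mm, near_mm=150.0, far_mm=350.0):
    """Map the scanner-to-object distance onto a color-bar category.

    The near/far limits (in millimeters) are illustrative assumptions;
    an actual device would use limits derived from its optics.
    """
    if distance_mm < near_mm:
        return "red (too close)"
    if distance_mm > far_mm:
        return "blue (too far)"
    return "green (in range)"
```

A display of this kind lets the operator see at a glance whether the working distance is appropriate, which matches the purpose of the distance information display area 211.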
- the shape display screen 210 also has a scale change section 212. When the measurement operator operates the scale change section 212, the displayed three-dimensional shape of the measurement object W is enlarged or reduced.
- FIG. 13 and 14 show second and third examples of the shape display screen 210 displaying a point cloud showing the three-dimensional shape of the measurement object W, in which the same measurement object W shown in FIG. 12 is measured multiple times from different angles.
- the number of points obtained increases as the number of measurements of the same measurement object W increases, and it is also possible to understand where the unmeasured areas (white areas in the measurement object W) are.
- FIG. 15 shows a fourth example of the shape display screen 210 displaying a point cloud showing the three-dimensional shape of the measurement object W, and it can be seen that by further increasing the number of measurements, the obtained point cloud becomes denser and the number of unmeasured areas decreases.
- the shape display screen 210 can also be called an image showing the measurement range by the three-dimensional scanner 2, or an image showing the measurement completion area by the three-dimensional scanner 2.
- the scanner display unit 113 can display a point cloud indicating the three-dimensional shape of the measurement target W with the viewpoint fixed.
- the processing when fixing the viewpoint will be described with reference to FIG. 16.
- step S21 it is detected that the viewpoint fixing button included in the operation unit 114 of the three-dimensional scanner 2 has been pressed.
- step S22 the fact that the viewpoint fixing button has been pressed is transmitted to the imaging unit 3.
- step S23 the imaging unit 3 receives that the viewpoint fixing button has been pressed.
- step S24 the imaging unit 3 transmits that the viewpoint fixing button has been pressed to the processing unit 4.
- step S25 the processing unit 4 receives that the viewpoint fixing button has been pressed.
- step S26 the setting to fix the viewpoint is reflected.
- step S27 the viewer's viewpoint is fixed.
- step S28 display data is created with the viewer's viewpoint fixed.
- step S29 the display data created in step S28 is sent to the three-dimensional scanner 2 via the imaging unit 3.
- step S30 the three-dimensional scanner 2 receives the display data.
- step S31 the screen displayed on the scanner display unit 113 is updated based on the display data received in step S30.
- FIG. 17 shows an example of a display screen 210 showing difference information representing the difference between the CAD data of the measurement object W and the three-dimensional shape of the measurement object W generated by the three-dimensional data generation unit 43a.
- the CAD data of the measurement object W can be input to the processing unit 4 from the outside.
- the control unit 43 matches the origin on the CAD data with the origin of the three-dimensional shape data generated by the three-dimensional data generation unit 43a, and then calculates the difference between the CAD data and the three-dimensional shape data.
- the display screen 210 can display the difference information between the CAD data and the three-dimensional shape data in a heat map format. For example, in FIG. 17, the larger the difference, the brighter the display; conversely, the display may be made darker as the difference becomes larger.
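The heat-map difference display can be illustrated as follows: once the origins of the two coordinate systems are aligned, each measured point is compared against the nearest CAD point and the resulting distance is mapped to a display brightness. The brute-force nearest-neighbor search and the normalization constant are illustrative assumptions; a real implementation would use a spatial index.

```python
import math

def difference_heatmap(measured, cad, max_diff=1.0):
    """Return per-point (difference, brightness in 0..255) pairs.

    `measured` and `cad` are lists of 3D points assumed to already share
    a common origin; `max_diff` is a hypothetical normalization constant.
    Larger differences map to brighter display values.
    """
    result = []
    for p in measured:
        # Nearest-neighbor distance to the CAD model (brute force for clarity).
        d = min(math.dist(p, q) for q in cad)
        brightness = round(255 * min(d / max_diff, 1.0))
        result.append((d, brightness))
    return result
```

A point coinciding with the CAD surface maps to brightness 0, while a point deviating by `max_diff` or more maps to full brightness.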
- the scanner display unit 113 can display the display screen 210 showing the difference information representing the difference between the CAD data and the three-dimensional shape data obtained by measurement, so that the measurement operator can check the difference information on the three-dimensional scanner 2.
- FIG. 18 is a diagram showing an example of the display screen 210 when the texture of the measurement object W is reflected.
- the texture of the measurement object W can be acquired as a texture image (color image) by the texture camera 66 of the three-dimensional scanner 2.
- the texture image is transmitted to the processing unit 4.
- the control unit 43 of the processing unit 4 generates superimposed display data for superimposing the texture image on the point cloud representing the three-dimensional shape of the measurement object W generated based on the display data.
- the control unit 43 can generate the superimposed display data by matching the origin of the texture image with the origin of the three-dimensional shape data generated by the three-dimensional data generating unit 43a and superimposing the texture image on the three-dimensional shape data.
- the generated superimposed display data is transmitted to the three-dimensional scanner 2 and displayed on the scanner display unit 113.
- the scanner display unit 113 displays the display screen 210 in which the color image of the measurement object generated by the texture camera 66 is superimposed on the point cloud representing the three-dimensional shape of the measurement object W generated based on the display data.
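The superimposition of the texture image on the point cloud can be sketched as a projection of each point into the texture image through a pinhole camera model, sampling the pixel color at the projected position. The intrinsic parameters, image format, and function names below are hypothetical assumptions for illustration.

```python
def project_point(point, fx=500.0, fy=500.0, cx=32.0, cy=32.0):
    """Project a texture-camera-frame 3D point to pixel coordinates
    using a pinhole model; intrinsics are illustrative values."""
    x, y, z = point
    if z <= 0:
        return None  # point behind the camera cannot be textured
    return (fx * x / z + cx, fy * y / z + cy)

def colorize(points_cam, texture, width, height):
    """Attach a texture color to each point that projects inside the image.

    `points_cam`: points expressed in the texture camera frame (the known
    relative pose of the texture camera would be applied beforehand).
    `texture`: a row-major list of rows of (r, g, b) tuples.
    """
    colored = []
    for p in points_cam:
        uv = project_point(p)
        if uv is None:
            continue
        u, v = int(uv[0]), int(uv[1])
        if 0 <= u < width and 0 <= v < height:
            colored.append((p, texture[v][u]))
    return colored
```

Because the positional relationship between the texture camera and the self-luminous markers is known in advance, the transform from the reference frame to the texture camera frame is available, so projecting and sampling in this way yields the superimposed display data.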
- FIG 19 is a flow chart showing an example of a processing procedure for reflecting texture.
- step S41 it is detected that the texture camera button included in the operation unit 114 of the three-dimensional scanner 2 has been pressed.
- step S42 the texture camera 66 is started.
- step S43 a texture camera preview, i.e., an image captured by the texture camera 66, is displayed on the scanner display unit 113.
- step S44 it is detected that the imaging button included in the operation unit 114 has been pressed.
- step S45 the image captured by the texture camera 66 when the imaging button is pressed is captured.
- step S46 it is detected that the confirmation button (included in the operation unit 114) for confirming the image captured by the texture camera 66 has been pressed.
- step S47 the confirmed texture image is sent to the imaging unit 3.
- step S48 the imaging unit 3 executes image bridging processing, which relays the image to the processing unit 4, and sends the texture image to the processing unit 4.
- step S49 the processing unit 4 receives the texture image.
- step S50 a process is executed in which the texture image received in step S49 is superimposed on a point cloud representing the three-dimensional shape of the measurement object generated based on the display data, i.e., texture processing is executed.
- step S51 texture-processed display data is created.
- step S52 the display data created in step S51 is sent to the three-dimensional scanner 2 via the imaging unit 3.
- step S53 the three-dimensional scanner 2 receives the display data.
- step S54 the screen displayed on the scanner display unit 113 is updated based on the display data received in step S53.
- step SA1 the imaging unit 3 issues a trigger.
- An ID is assigned to the trigger.
- the trigger issued by the imaging unit 3 is received by the wireless communication section 144 of the three-dimensional scanner 2 via the wireless communication section 36 of the imaging unit 3.
- step SA2 the scanner control section 142 of the three-dimensional scanner 2 outputs a light emission instruction to the marker lighting control section 141, and the marker lighting control section 141 causes the self-luminous markers 71 to 77, 81 to 87, 91 to 97, and 101 to 107 to emit light.
- step SA3 the scanner control unit 142 of the three-dimensional scanner 2 outputs a light emission instruction to the scanner light source control unit 146, and the scanner light source control unit 146 causes the first scanner light source 62 or the second scanner light source 63 to emit light. Which of the first scanner light source 62 or the second scanner light source 63 is caused to emit light is based on setting information.
- step SA4 simultaneously with step SA3, the scanner control unit 142 of the three-dimensional scanner 2 outputs an imaging instruction to the scanner image processing unit 147, and the scanner image processing unit 147 causes the first scanner imaging unit 64 and the second scanner imaging unit 65 to perform imaging.
- the exposure times of the first scanner imaging unit 64 and the second scanner imaging unit 65 are set based on the setting information.
- step SA5 a bright line image is acquired by imaging with the first scanner imaging unit 64 and the second scanner imaging unit 65.
- a trigger ID is assigned to the bright line image.
- step SA6 the bright line image is input to the scanner image processing unit 147, which extracts edge data from the bright line image.
- the edge data is received by the wireless communication unit 36 of the imaging unit 3 via the wireless communication unit 144 of the three-dimensional scanner 2.
- step SA7 the main body control unit 33 outputs an imaging instruction to the camera image processing unit 35, and the camera image processing unit 35 causes the scanner imaging camera 32 to perform imaging.
- step SA8 the scanner imaging camera 32 can acquire a marker image including multiple self-luminous markers. A trigger ID is assigned to the marker image.
- step SA9 the marker image is input to the camera image processing unit 35 of the imaging unit 3, which extracts the marker image coordinates, and in step SA10, the marker external parameters are calculated.
- the marker external parameters are six-axis parameters.
- step SA11 data matching is performed between the edge data sent from the three-dimensional scanner 2 and the marker image coordinates based on the trigger ID. Details of data matching will be described later.
- step SA12 the data obtained in step SA11 is sent to the communication unit 46 of the processing unit 4 via the communication unit 37.
- the control unit 43 of the processing unit 4 processes the data sent from the imaging unit 3.
- step SA14 the control unit 43 performs three-dimensional point cloud generation. This allows the three-dimensional shape of the measurement object W to be obtained.
- FIG. 21 is a flowchart showing an example of the data matching processing procedure.
- step SB1 the imaging unit 3 acquires the marker external parameter data calculated in step SA10 of the flowchart shown in FIG. 20.
- step SB2 the three-dimensional scanner 2 acquires the edge data extracted in step SA6 of the flowchart shown in FIG. 20 and transmits it to the imaging unit 3.
- step SB3 the imaging unit 3 temporarily stores the marker external parameter data acquired in step SB1 and the edge data acquired in step SB2.
- step SB4 ID matching is performed between the marker external parameter data and the edge data based on the previously assigned trigger ID.
- step SB5 it is determined whether the trigger IDs match. If the trigger IDs match, the marker external parameter data and the edge data are linked in step SB6. If the trigger IDs do not match, the marker external parameter data and the edge data are discarded in step SB7.
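The match-or-discard logic of steps SB4 to SB7 can be sketched as follows; the data structures and names are illustrative assumptions, not taken from the disclosure.

```python
# Hypothetical sketch of trigger-ID matching: marker external-parameter
# data and edge data are buffered per trigger ID; entries whose IDs
# match are linked, and entries without a counterpart are discarded.
def match_by_trigger_id(marker_data, edge_data):
    """marker_data / edge_data: dicts mapping trigger ID -> payload.

    Returns (linked, discarded), where `linked` holds
    (trigger_id, marker_payload, edge_payload) tuples and `discarded`
    holds trigger IDs that had no counterpart on the other side.
    """
    linked, discarded = [], []
    for tid in sorted(set(marker_data) | set(edge_data)):
        if tid in marker_data and tid in edge_data:
            linked.append((tid, marker_data[tid], edge_data[tid]))
        else:
            discarded.append(tid)
    return linked, discarded
```

Keying both data streams by the trigger ID in this way is what allows corresponding items to be paired regardless of the order in which the two processing paths finish.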
- step SB8 data transmission processing to the processing unit 4 is performed.
- step SB9 the processing unit 4 receives the data.
(Generation of texture image)
- the scanner control unit 142 can control the texture camera 66 to perform imaging.
- separate trigger signals may be used for three-dimensional shape measurement and for texture acquisition, or the trigger signals may be partially or completely shared; sharing them improves the synchronization between imaging by the scanner imaging units 64 and 65 and imaging by the texture camera 66.
- the trigger signal for acquiring the texture may be generated by the trigger generating section 38 of the imaging unit 3 in response to an operation signal received by the operation input section 42 of the processing section 4.
- the reference camera 34 captures an image of the light-emitting body 31b.
- the camera image processing unit 35 acquires the position information of the light-emitting body 31b stored in the memory unit 39c of the imaging unit 3, processes the image of the light-emitting body 31b generated by the reference camera 34 based on the position information of the light-emitting body 31b, and generates position and orientation information of the scanner imaging camera 32 relative to the reference camera 34.
- the scanner imaging camera 32 generates a marker image including the self-luminous markers 71-77, 81-87, 91-97, 101-107 of the three-dimensional scanner 2.
- the reference camera 34 also captures the images of the multiple light-emitting bodies 31b provided on the movable imaging unit 3A to generate an image including the light-emitting bodies 31b.
- the camera image processing unit 35 of the imaging unit 3 calculates the position and orientation information of the three-dimensional scanner 2 with respect to the scanner imaging camera 32 based on the marker image including the self-luminous markers 71-77, 81-87, 91-97, 101-107 and the arrangement information of the self-luminous markers 71-77, 81-87, 91-97, 101-107 acquired from the memory unit 143 of the three-dimensional scanner 2. This also calculates the position and orientation of the three-dimensional scanner 2 with respect to the reference camera 34.
- the texture camera 66 of the three-dimensional scanner 2 is controlled to generate a texture image.
- the texture image generated here is based on the texture camera 66. Since the positional relationship between the texture camera 66 of the three-dimensional scanner 2 and the self-luminous markers 71-77, 81-87, 91-97, 101-107 is known in advance, the texture image can be superimposed on the point cloud of the measurement object W based on the reference camera 34, based on the position and orientation of the three-dimensional scanner 2 based on the reference camera 34 (center position information of the self-luminous markers 71-77, 81-87, 91-97, 101-107) and the texture image of the measurement object W based on the texture camera 66.
- the three-dimensional data generation section 43a of the processing section 4 can generate display data indicating the three-dimensional shape of the measurement object W.
- the display data generated by the three-dimensional data generation section 43a is received by the three-dimensional scanner 2 and displayed as the display screen 210 on the scanner display section 113 possessed by the three-dimensional scanner 2.
- the measurement operator can easily check information regarding the measurement results obtained by the three-dimensional scanner 2, such as whether the working distance is appropriate, whether the pattern light is being irradiated onto the part of the object W to be measured that is to be measured, and the current extent of the completed scan.
- the measurement operator does not need to go back and forth between the processing unit 4 and the measurement object W, improving measurement workability.
- information related to the measurement results can be viewed even during measurement operations using the three-dimensional scanner 2.
- the display screen 210 is a screen that displays a point cloud showing the three-dimensional shape of the measurement object W, but it is not limited to this and may be a screen that displays mesh data showing the three-dimensional shape of the measurement object W.
- the present invention can also be applied to a case in which the three-dimensional scanner 2 is attached to an arm 600 for operation, as shown in FIG. 22.
- the arm 600 is a multi-degree-of-freedom arm equipped with a plurality of arm components 600a, 600b, 600c and rotating parts 600d, 600e, 600f that rotatably connect these.
- a position and orientation determination unit 601 that determines the position and orientation of the three-dimensional scanner 2 is composed of a sensor or the like that detects the rotation angles of the arm components 600a, 600b, 600c. In other words, if the rotation angles of each arm component 600a, 600b, 600c are known, the position and orientation of the three-dimensional scanner 2 can be calculated based on a predetermined relational expression.
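The relational expression mentioned above can be illustrated with a simplified planar forward-kinematics computation: given the rotation angle of each joint and the length of each arm component, the scanner's position follows by accumulating the joint rotations. The link lengths and the reduction to a plane are assumptions for illustration only.

```python
import math

def scanner_position(angles, lengths=(0.3, 0.25, 0.2)):
    """Planar forward kinematics for a 3-link arm (illustrative sketch).

    `angles`: joint rotation angles in radians, as detected by the
    sensors of the position and orientation determination unit.
    `lengths`: hypothetical arm-component lengths in meters.
    Returns (x, y, end orientation).
    """
    x = y = theta = 0.0
    for a, l in zip(angles, lengths):
        theta += a                 # accumulate joint rotation
        x += l * math.cos(theta)   # advance along the rotated link
        y += l * math.sin(theta)
    return x, y, theta
```

With all joints at zero, the scanner sits at the sum of the link lengths along the base axis; rotating the first joint by 90 degrees swings the whole chain accordingly.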
- the processing unit 4 includes a model input unit 400, a coordinate matching unit 401, a geometric element extraction unit 402, an operation control unit 403, a coordinate system creation unit 404, a measurement processing unit 405, and a coordinate calculation unit 406.
- the coordinate matching unit 401, the geometric element extraction unit 402, the operation control unit 403, the coordinate system creation unit 404, the measurement processing unit 405, and the coordinate calculation unit 406 are provided in the control unit 40, and are configured by a microcomputer possessed by the control unit 40 and a program executed by the microcomputer. The functions of each unit will be explained below with reference to the flowchart shown in FIG. 23.
- Step SD1 after the start of the flowchart shown in FIG. 23 is a model input step.
- the model input unit 400 executes model input processing.
- the model input unit 400 is a part that accepts input of a reference model of the measurement object W.
- the reference model of the measurement object W is a three-dimensional model of the measurement object W, and examples of such models include CAD data, polygon data (STL data), and mesh data acquired in the past.
- the data of the three-dimensional model of the measurement object W is stored, for example, in the memory unit 45 or another memory unit (not shown).
- the user operates the operation input unit 42 to perform an operation to search for or identify data of the three-dimensional model of the measurement object W, and then executes a read operation to read the identified data.
- When the model input unit 400 detects that a read operation has been executed, it accepts input of the three-dimensional model specified by the user and temporarily stores it in a specified storage area so that it can be handled by the processing unit 4.
- the reference model input by the model input unit 400 has a three-dimensional coordinate system.
- Step SD2 is a model display step.
- the three-dimensional model of the measurement object W input in step SD1 is displayed on the monitor 41.
- the three-dimensional model M1 is displayed as a solid body on the monitor 41.
- Step SD3 is a step of creating a coordinate reference for the measurement object W. Step SD3 may be executed before step SD1.
- In this step, the user uses a contact-type probe 5 (shown in FIG. 26) having a plurality of self-luminous markers (probe markers).
- a trigger is transmitted to the probe 5 via an optical communication interface 36a.
- the optical communication interface 36a is a part for performing optical communication using visible light or invisible light, and can be configured, for example, by an infrared communication interface.
- the wireless communication unit 36 also has a radio communication interface 36b.
- the radio communication interface 36b may be, for example, a part for constructing a wireless LAN, or may be a part capable of short-distance digital wireless communication using radio waves, such as Bluetooth (registered trademark) communication.
- Optical communication is characterized by high directivity and precise timing of information transfer.
- the imaging unit 3 tracks the multiple probe markers that the probe 5 has, and is also capable of imaging the probe markers.
- the scanner imaging camera 32 images the probe markers of the probe 5, thereby generating a probe marker image including the probe markers.
- the camera image processing unit 35 processes the probe marker image captured by the scanner imaging camera 32 to generate central position information of the probe marker. Based on the central position information of the probe marker, position and orientation information of the probe marker relative to the movable imaging unit 3A can be generated.
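As a rough illustration of how center position information might be derived from a probe marker image, an intensity-weighted centroid of the bright pixels is a common approach. The function below is a hypothetical sketch; the patent does not disclose the actual processing performed by the camera image processing unit 35, and the function name and threshold are assumptions.

```python
def marker_center(image, threshold):
    """Return the intensity-weighted centroid (row, col) of pixels at or
    above `threshold`, or None if no pixel is bright enough.

    `image` is a 2D list of scalar intensities (an illustrative stand-in
    for the probe marker image)."""
    total = sx = sy = 0.0
    for r, row in enumerate(image):
        for c, v in enumerate(row):
            if v >= threshold:
                total += v
                sx += v * r
                sy += v * c
    return (sx / total, sy / total) if total else None
```

Centroids of several markers, combined with the known marker layout on the probe body, would then allow a position/orientation estimate relative to the camera.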
- the probe 5 is also separate from the imaging unit 3 and the processing unit 4, and the measurement operator can bring the probe 5 close to the measurement object W located away from the imaging unit 3 and the processing unit 4, and identify the desired measurement point using the probe 5.
- the configuration of the probe 5 will be explained below with reference to Figures 26 and 27.
- the contact probe 5 is a handheld, portable probe, similar to the three-dimensional scanner 2. As shown in FIG. 26, the probe 5 comprises a probe body 120 and a stylus 121 protruding from the probe body 120. A contactor 121a is provided at the tip of the stylus 121 for contacting the measurement object W. This contactor 121a is, for example, spherical. The contactor 121a is a part for indicating the position of the measurement point on the measurement object W.
- the probe body 120 also has a gripping portion 5A in the middle part in the longitudinal direction, and the measurement operator can hold the gripping portion 5A with one hand to move or change the orientation of the probe 5 during measurement.
- the probe body 120 has a plurality of probe markers 5B spaced apart from one another.
- a plurality of probe markers 5B are spaced apart from one another on one longitudinal end side of the probe body 120, and a plurality of probe markers 5B are also spaced apart from one another on the other longitudinal end side of the probe body 120.
- the circuit configuration of the probe 5 is shown in FIG. 27. Although only one probe marker 5B is shown in FIG. 27, multiple probe markers 5B are provided in reality.
- a probe camera 122 is provided near the stylus 121.
- the probe 5 is equipped with a display unit 123a configured with a liquid crystal display, an organic EL display, or the like, a touch panel 123b that can be operated by touch, and a display control unit 123c.
- an operation unit 124 having multiple buttons and the like is provided near the display unit 123a.
- the probe 5 is also equipped with a probe control unit 125, a memory unit 126, a probe marker lighting control unit 127, a fourth wireless communication unit 128, a motion sensor 129, and the like.
- the probe 5 is also equipped with a battery 5C that serves as a power source.
- the display control unit 123c controls the display unit 123a based on signals output from the probe control unit 125, and causes the display unit 123a to display various images, user interfaces, and the like. User operations performed on the display unit 123a are acquired by the probe control unit 125 based on signals output from the touch panel 123b.
- the probe marker lighting control unit 127 is a part that controls the probe marker 5B.
- the probe marker 5B can be switched between a lit state and an off state by the probe marker lighting control unit 127.
- the probe marker lighting control unit 127 is controlled by the probe control unit 125.
- the memory unit 126 is capable of storing programs and the like.
- the fourth wireless communication unit 128 has an optical communication interface 128a and a radio communication interface 128b, similar to the first wireless communication unit 36 of the imaging unit 3.
- the optical communication interface 128a is a part that receives a trigger transmitted via the optical communication interface 36a of the imaging unit 3.
- When this trigger is received by the optical communication interface 128a, the probe marker lighting control unit 127 lights up the probe marker 5B. This allows the imaging of the probe marker by the imaging unit 3 to be synchronized with the lighting of the probe marker 5B.
- the radio communication interface 128b may have a different radio communication method from the radio communication interface 144b of the three-dimensional scanner 2.
- the radio communication interface 128b of the fourth wireless communication unit 128 can be configured as a part that is capable of Bluetooth communication or the like, which has a communication speed slower than that of a wireless LAN.
- The radio communication interface 36b of the imaging unit 3 may be compatible with both the radio communication interface 144b of the three-dimensional scanner 2 and the radio communication interface 128b of the probe 5.
- In other words, the radio communication interface 36b of the imaging unit 3 may be compatible with both wireless LAN and Bluetooth communication.
- the data volume of data exchanged between the probe 5 and the imaging unit 3 is smaller than that of data exchanged between the three-dimensional scanner 2, which continuously transmits measurement data, and the imaging unit 3. Therefore, the radio communication established between the probe 5 and the imaging unit 3 may use Bluetooth communication, which consumes less power and is expected to have a long battery life.
- the motion sensor 129 is composed of sensors that detect the acceleration and angular velocity of the probe 5, and the detected values are output to the probe control unit 125 and used for various calculation processes such as calculating the attitude of the probe 5, similar to the attitude calculation of the three-dimensional scanner 2.
- the coordinates of the multiple measurement points indicated by the probe 5 are calculated by the coordinate calculation unit 406 shown in FIG. 24. For example, the coordinates of the multiple measurement points indicated by the probe 5 can be calculated based on the position and orientation information of the probe 5 and information output from the imaging unit 3.
- In step SD3 of FIG. 23, the user holds the probe 5 and touches the contactor 121a of the stylus 121 to the measurement object W to perform a measurement operation.
- This makes it possible to obtain the three-dimensional coordinates of the point that the contactor 121a contacts (the measurement point indicated by the probe 5).
- This measurement operation is performed multiple times while changing the point that the contactor 121a contacts, to obtain the positions of multiple measurement points, and a coordinate system for the display data is generated based on the positions of the multiple measurement points obtained.
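The construction of a coordinate system from probed measurement points can be sketched, for illustration, as a datum-style (plane, line, point) construction, often called 3-2-1 alignment. The helper names and the scheme below are assumptions for illustration; the patent does not specify how the coordinate system creation unit 404 combines the measurement points.

```python
import math

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])
def norm(v):
    m = math.sqrt(sum(x * x for x in v))
    return tuple(x / m for x in v)

def build_frame(p1, p2, p3, line_a, line_b, origin):
    """Build an orthonormal frame from probed points: three points on a
    plane define Z (the plane normal), two points define an in-plane X
    direction, and one point fixes the origin."""
    z = norm(cross(sub(p2, p1), sub(p3, p1)))       # plane normal -> Z axis
    d = sub(line_b, line_a)
    d = sub(d, tuple(dot(d, z) * c for c in z))     # project line onto plane
    x = norm(d)                                     # in-plane direction -> X axis
    y = cross(z, x)                                 # right-handed Y axis
    return x, y, z, origin
```

In practice each probed "point" would itself come from the contactor position computed by the coordinate calculation unit 406, and more points per feature would be fitted in a least-squares sense.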
- the scanner display unit 113 displays display data showing the three-dimensional shape of the measurement object W in a coordinate system created based on the positions of the multiple measurement points indicated by the probe 5.
- In this way, a coordinate system is created from the measurement points indicated by the probe 5.
- the coordinate system creation unit 404 can create a measurement coordinate system based on the coordinates of multiple measurement points calculated by the coordinate calculation unit 406. It is also possible to create a coordinate system of the reference model based on geometric elements. For example, when the geometric element extraction unit 402 receives an input of a position designation on the reference model from the user, it extracts a geometric element at the designated position.
- the geometric element to be extracted is, for example, a plane, a cylinder, a circle, etc.
- the three-dimensional model M1 is displayed on the monitor 41 as a solid body as shown in FIG. 25, so that the desired geometric element can be selected more easily than when the edge line display (also called wire frame display) is used.
- When the edge line display is used, the user may have difficulty recognizing the surfaces of the three-dimensional model, or may mistakenly recognize a part that is not a surface as a surface, making it difficult to select the desired geometric element.
- By displaying the three-dimensional model M1 as a solid body, the user can clearly recognize the surfaces, and as a result, the desired geometric element can be selected easily and without mistakes.
- Data for identifying the geometric elements extracted by the geometric element extraction unit 402 is sent to the coordinate system creation unit 404.
- the coordinate system creation unit 404 creates a coordinate system for the reference model based on the geometric elements extracted by the geometric element extraction unit 402.
- Step SD4 is a coordinate alignment processing step.
- the coordinate system alignment unit 401 shown in FIG. 24 executes the coordinate alignment processing.
- the coordinate system alignment unit 401 is a part that aligns the coordinate system of the reference model input by the model input unit 400 with the coordinate system of the display data generated by the processing unit 4.
- the coordinate system alignment unit 401 acquires information on the coordinate system of the reference model and information on the coordinate system of the display data. Having acquired this information, the coordinate system alignment unit 401 executes processing to align the coordinate system of the reference model with the coordinate system of the display data based on the acquired information.
- the alignment between the reference model and the display data does not have to be based on coordinates, and may be based on techniques such as best fit or three-plane alignment.
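For illustration, the best-fit option mentioned here is commonly realized as a rigid point-set fit (Kabsch/Procrustes). The sketch below assumes known point correspondences and is only one possible realization, not the patent's actual implementation.

```python
import numpy as np

def best_fit_transform(src, dst):
    """Return rotation R and translation t minimizing ||R @ p + t - q||
    over corresponding point pairs (p, q) from src and dst."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)   # centroids
    H = (src - cs).T @ (dst - cd)                 # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                      # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cd - R @ cs
    return R, t
```

In a full pipeline, correspondences are usually found iteratively (as in ICP) or seeded by the three-plane alignment the text mentions.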
- Step SD5 is an input step for starting the shape measurement of the measurement object W by the three-dimensional scanner 2.
- a home screen 113A generated by the display control unit 140 is displayed on the scanner display unit 113 (shown in FIG. 9) of the three-dimensional scanner 2.
- While the three-dimensional scanner 2 is inactive, the marker lighting control unit 141 turns off the self-luminous markers. This allows the user to understand that the three-dimensional scanner 2 is inactive and cannot perform three-dimensional measurement.
- To activate the scanner, the user operates the switch button 113B provided on the home screen 113A.
- the operation of the switch button 113B is detected by the touch panel 113a shown in FIG. 8.
- When it is detected that the switch button 113B has been operated, the display control unit 140 generates a message window 113C and displays it on the scanner display unit 113.
- the message window 113C displays an operation confirmation message such as "This scanner will be activated," as well as an OK button 113D and a cancel button 113E.
- When the touch panel 113a detects that the OK button 113D has been operated, the operation control unit 403 switches the three-dimensional scanner 2 from an inactive state to an active state.
- When the operation control unit 403 detects an operation input that switches the operation state of the three-dimensional scanner 2 to an active state, it switches the operation state of the three-dimensional scanner 2 to active, and further, the marker lighting control unit 141 lights up the self-luminous marker.
- If no such operation input is detected (for example, if the cancel button 113E is operated), the operation control unit 403 keeps the three-dimensional scanner 2 in an inactive state.
- When the operation control unit 403 detects an operation input that activates the operation state of the contact-type probe 5, it switches the operation state of the three-dimensional scanner 2 to an inactive state.
- the operation input that activates the operation state of the probe 5 can be the same as the operation input that activates the operation state of the three-dimensional scanner 2.
- When the operation state of the three-dimensional scanner 2 is switched to active by the operation control unit 403, the display control unit 140 generates a measurement screen 113F and displays it on the scanner display unit 113.
- the processing unit 4 switches the display form of the reference model from a solid body to a display form in which the ridgelines are emphasized, as shown in Fig. 29. As a result, the reference model with the ridgelines emphasized is displayed on the scanner display unit 113 of the three-dimensional scanner 2.
- This display form switching step is step SD6 in Fig. 23.
- the processing unit 4 sequentially generates point cloud data indicating the three-dimensional shape of the measurement object W based on the images including the pattern light generated by the scanner imaging units 64, 65 and the position and orientation of the three-dimensional scanner 2 identified by the imaging unit 3, with the coordinate system of the reference model and the coordinate system of the display data aligned by the coordinate system alignment unit 401.
- This step is the point cloud data acquisition step of step SD7 in FIG. 23.
- the main body control unit 33 and the scanner control unit 142 can accept a switching input to switch between irradiating multi-line light and irradiating single-line light.
- This switching input may be an input from the user, or may be an input by a control signal that is automatically generated when a certain condition is met.
- When multi-line irradiation is selected, the scanner control unit 142 controls the first scanner light source 62 to irradiate multi-line light.
- The irradiation timing of the multi-line light from the first scanner light source 62, the imaging by the first and second scanner imaging units 64 and 65, and the emission of the markers can be synchronized.
- When single-line irradiation is selected, the scanner control unit 142 controls the second scanner light source 63 to irradiate single-line light.
- the scanner control unit 142 stops irradiating multi-line light from the first scanner light source 62, and then executes the irradiation process of single-line light from the second scanner light source 63.
- Similarly, the irradiation timing of the single-line light from the second scanner light source 63, the imaging by the first and second scanner imaging units 64 and 65, and the emission of the markers can be synchronized.
- the processing unit 4 determines whether or not the marker blocks 21-24 are detectable from the movable imaging unit 3A. For example, if the second image contains a self-luminous marker, the processing unit 4 can determine which of the first to fourth marker blocks 21-24 the self-luminous marker belongs to, thereby determining which of the first to fourth marker blocks 21-24 is detectable. If there are no self-luminous markers in the second image, the processing unit 4 determines that none of the marker blocks 21-24 can be detected.
- If the processing unit 4 determines that marker blocks are detectable, it causes the self-luminous markers included in the marker blocks detectable from the movable imaging unit 3A to emit light in a first color of visible light wavelength.
- If the processing unit 4 determines that some of the marker blocks 21-24 are not detectable from the movable imaging unit 3A, it causes the self-luminous markers included in those marker blocks to emit light in a second color different from the first color of visible light wavelength.
- the first color can be green, blue, etc.
- the second color can be red, yellow, etc., but is not limited to this, and it is sufficient if the display form is different between when detection is possible and when detection is not possible.
- the first color and the second color are different from the emission color of the marker.
- the processing unit 4 also determines whether or not the position and orientation of the three-dimensional scanner 2 can be identified from the movable imaging unit 3A based on the second image generated by the movable imaging unit 3A, and if it determines that the position and orientation of the three-dimensional scanner 2 can be identified, it can cause the self-luminous markers included in the marker blocks 21 to 24 to emit light in a first color of visible light wavelength, and if it determines that the position and orientation of the three-dimensional scanner 2 cannot be identified, it can cause the self-luminous markers included in the marker blocks 21 to 24 to emit light in a second color different from the first color of visible light wavelength.
- If the processing unit 4 determines that the position and orientation of the three-dimensional scanner 2 can be identified, it can also make the markers emit light so as to distinguish between the marker blocks 21-24 that can be detected by the movable imaging unit 3A and those that cannot.
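The color feedback described above can be summarized in a small sketch. The concrete colors are examples mentioned in the text (green as a first color, red as a second color) and the block IDs 21-24 follow the description, while the function name and data shapes are assumptions.

```python
FIRST_COLOR = "green"    # example first color from the text (detectable)
SECOND_COLOR = "red"     # example second color from the text (not detectable)

def block_colors(detectable_ids, all_ids=(21, 22, 23, 24)):
    """Marker blocks detectable from the movable imaging unit emit the
    first color; blocks that cannot be detected emit the second color."""
    return {b: FIRST_COLOR if b in detectable_ids else SECOND_COLOR
            for b in all_ids}
```

Glancing at the marker colors thus tells the user which side of the scanner is currently visible to the imaging unit.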
- the three-dimensional data generation unit 43a can also generate display data so that the pattern light contained in the image generated by the scanner imaging units 64, 65 is displayed on the scanner display unit 113 in different colors depending on the distance between the scanner imaging units 64, 65 and the measurement object W.
- the pattern light is displayed in the image in different colors depending on whether the distance between the scanner imaging units 64, 65 and the measurement object W is within an appropriate measurement range. This allows the user to easily determine whether the three-dimensional scanner 2 is within a range appropriate for measurement with pattern light, or outside of the range appropriate for measurement with pattern light, simply by looking at the scanner display unit 113.
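A minimal sketch of this range feedback follows, assuming example near/far thresholds; the actual appropriate measurement range is device-specific and not given in the text.

```python
def pattern_color(distance_mm, near=250.0, far=450.0):
    """Classify the displayed pattern light by whether the scanner-to-object
    distance falls inside the appropriate measurement range.

    `near` and `far` are illustrative threshold values in millimeters."""
    if distance_mm < near:
        return "too_close"
    if distance_mm > far:
        return "too_far"
    return "in_range"
```

The display would then color the pattern differently for each of the three cases, so the user can correct the scanner distance at a glance.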
- In step SD8, the processing unit 4 generates display data that accumulates and displays the sequentially generated point cloud data on the reference model with emphasized ridgelines when displaying the data on the scanner display unit 113.
- the display form is automatically switched from the solid body shown in FIG. 25 to the display form in which the ridge lines are emphasized shown in FIG. 29 before or at the same time as performing measurement by the three-dimensional scanner 2.
- The processing unit 4 displays a large amount of point cloud data showing the three-dimensional shape of the measurement object W by sequentially superimposing it on the reference model with emphasized ridgelines shown in FIG. 29, as shown in FIGS. 12 to 15.
- the display form in which the ridge lines are emphasized may be a form in which only the ridge lines are displayed, or a form in which the faces are displayed faintly in addition to the ridge lines (a display form in which the point cloud data is visible from the front side even if it exists behind the face).
- Such a display form of the three-dimensional model can be called, for example, a semi-transparent display. By displaying the faces semi-transparently, it becomes possible to view the point cloud data existing behind them from the front side.
- In the display mode where ridgelines are emphasized, all ridgelines are displayed (both the ridgelines on the front and back sides of the model). However, the display is not limited to all ridgelines; for example, a display mode where only the ridgelines on the front side are emphasized, or a display mode where only the ridgelines on the back side are emphasized, may be used.
- In the example above, the display mode is automatically switched from a solid body to a display mode where ridgelines are emphasized, but this is not a limitation, and the display mode may instead be switched after a user's operation to switch the display mode is received.
- the timing for switching the display mode from a solid body to a display mode where ridgelines are emphasized may be a timing after measurement by the 3D scanner 2 has begun.
- When measurement by the three-dimensional scanner 2 is completed in step SD9, the process proceeds to step SD10, where mesh data is generated based on the point cloud data acquired in step SD7.
- When a user input indicating that measurement is complete is made on the touch panel 113a, that operation is accepted by the touch panel 113a.
- the touch panel 113a is an example of an operation unit that accepts user input indicating that measurement is complete; for example, when a user presses the completion button 505 on the measurement screen 113F of FIG. 28, that operation is accepted by the touch panel 113a as user input indicating that measurement is complete.
- the user input indicating that measurement is complete may be an operation of a physical button or the like other than the operation of the touch panel 113a.
- the processing unit 4 meshes the point cloud data representing the three-dimensional shape of the measurement object W, and generates display data representing the meshed three-dimensional shape.
- mesh shaping processing such as removing unnecessary points may be performed.
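The removal of unnecessary points can be illustrated with a simple nearest-neighbor distance filter: isolated stray points far from all other points are discarded before meshing. The method and threshold below are assumptions, since the patent does not specify the shaping algorithm.

```python
import math

def remove_outliers(points, max_nn_dist):
    """Keep only points whose nearest neighbor lies within `max_nn_dist`.

    `points` is a list of (x, y, z) tuples; this O(n^2) scan is for
    illustration only (a real implementation would use a spatial index)."""
    kept = []
    for i, p in enumerate(points):
        nn = min(math.dist(p, q) for j, q in enumerate(points) if j != i)
        if nn <= max_nn_dist:
            kept.append(p)
    return kept
```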
- the communication unit 46 of the processing unit 4 transmits display data showing the meshed three-dimensional shape to the wireless communication unit 144 of the three-dimensional scanner 2.
- the display control unit 140 sends the display data showing the meshed three-dimensional shape transmitted from the communication unit 46 of the processing unit 4 to the scanner display unit 113.
- the scanner display unit 113 displays a display screen generated based on the display data showing the meshed three-dimensional shape received via the wireless communication unit 144. This allows the user to check the display data showing the meshed three-dimensional shape at hand.
- The measurement processing unit 405 shown in FIG. 24 performs measurement processing of the three-dimensional shape of the measurement object W based on the series of point cloud data acquired in step SD7.
- Measurement processing includes, for example, geometric measurement, comparative measurement, cross-sectional measurement, etc.
- the three-dimensional data generation unit 43a acquires a texture image at a specified time point with the texture camera 66.
- the texture image at a specified time point can be an image that is within the field of view of the texture camera 66 at the time the texture capture button 501 is operated.
- the processing unit 4 sequentially generates point cloud data over a predetermined period (for example, from the start of measurement to the completion of measurement).
- the processing unit 4 generates display data for superimposing and displaying a texture image acquired by the texture camera 66 at a predetermined time on the point cloud data that has been sequentially generated over this predetermined period and cumulatively displayed.
- This display data is transmitted by the communication unit 46 of the processing unit 4 to the wireless communication unit 144 of the three-dimensional scanner 2.
- the scanner display unit 113 displays a display screen generated based on the display data for superimposing and displaying the texture image.
- a comparison can also be made between the reference model input by the model input unit 400 and the point cloud data acquired by the three-dimensional scanner 2.
- the processing unit 4 acquires the reference model and the point cloud data, and calculates the difference between the reference model and the point cloud data. Based on the calculated difference, the processing unit 4 generates display data of the difference in heat map format. It is also possible to display display data in heat map format as shown in FIG. 17. In display data in heat map format, the display color changes depending on the magnitude of the difference, so the user can easily grasp the areas where the difference between the reference model and the point cloud data is large and small.
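The heat-map display can be illustrated by mapping each point's deviation from the reference model to a color bucket. The tolerance value and color choices below are assumptions for illustration; the text states only that the display color changes with the magnitude of the difference.

```python
def heatmap_color(deviation, tol=0.1):
    """Map a point's signed deviation from the reference model to a color.

    `tol` is an illustrative tolerance; positive deviations mean material
    excess relative to the reference, negative ones mean material deficit."""
    if abs(deviation) <= tol:
        return "green"                                       # within tolerance
    if deviation > 0:
        return "red" if deviation > 3 * tol else "yellow"    # excess
    return "blue" if deviation < -3 * tol else "cyan"        # deficit
```

Coloring every displayed point this way yields the heat-map view in which large- and small-difference regions are visible at a glance.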
- Switching from display data with a texture image superimposed to display data in heat map format is performed by accepting a switching operation by the user.
- When the processing unit 4 accepts an operation to switch from display data with a texture image superimposed to display data in heat map format, it hides the texture. This makes the display in heat map format easier to see.
- the present invention can be used to measure the three-dimensional shapes of various measurement objects.
- Reference Signs List 1 Three-dimensional measuring device 2 Three-dimensional scanner 3 Imaging unit (position and orientation identification unit) 4 Processing unit (three-dimensional data generation means) 37 Communication unit (third communication unit) 43 Control unit (measurement control unit) 46 Communication unit (second communication unit) 48 Measurement setting units 62, 63 Scanner light sources 64, 65 Scanner imaging units 71 to 77 First to seventh scanner markers 113 Scanner display unit 144 Wireless communication unit (first communication unit) 601 Position and orientation identification unit
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2024576182A JPWO2024166602A1 (en) | 2023-02-07 | 2024-01-12 | |
| CN202480009961.0A CN120615158A (zh) | 2023-02-07 | 2024-01-12 | 三维测量装置 |
| US19/254,067 US20250329114A1 (en) | 2023-02-07 | 2025-06-30 | Three-dimensional measurement device |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2023-016774 | 2023-02-07 | ||
| JP2023016774 | 2023-02-07 | ||
| JP2023207986 | 2023-12-08 | ||
| JP2023-207986 | 2023-12-08 |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US19/254,067 Continuation US20250329114A1 (en) | 2023-02-07 | 2025-06-30 | Three-dimensional measurement device |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024166602A1 (ja) | 2024-08-15 |
Family
ID=92262308
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2024/000596 Ceased WO2024166602A1 (ja) | 2023-02-07 | 2024-01-12 | 三次元測定装置 |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US20250329114A1 (en) |
| JP (1) | JPWO2024166602A1 (en) |
| CN (1) | CN120615158A (zh) |
| WO (1) | WO2024166602A1 (en) |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH09126735A (ja) * | 1995-11-02 | 1997-05-16 | Tokai Rika Co Ltd | 多重結像カメラ及びこのカメラを用いた形状測定方法 |
| JPH11509928A (ja) * | 1995-07-26 | 1999-08-31 | ジェームス クランプトン,ステファン | スキャニング装置および方法 |
| JP2001082935A (ja) * | 1999-09-10 | 2001-03-30 | Keyence Corp | 三次元測定装置 |
| JP2008076384A (ja) * | 2006-08-23 | 2008-04-03 | Canon Inc | 情報処理方法、情報処理装置およびプログラム |
| JP2010122209A (ja) * | 2008-10-16 | 2010-06-03 | Hexagon Metrology Inc | レーザスキャナを伴う関節式測定アーム |
| JP2019138686A (ja) * | 2018-02-07 | 2019-08-22 | オムロン株式会社 | 3次元測定装置、3次元測定方法及び3次元測定プログラム |
| JP2020020699A (ja) * | 2018-08-01 | 2020-02-06 | 株式会社キーエンス | 三次元座標測定装置 |
2024
- 2024-01-12 CN CN202480009961.0A patent/CN120615158A/zh active Pending
- 2024-01-12 JP JP2024576182A patent/JPWO2024166602A1/ja active Pending
- 2024-01-12 WO PCT/JP2024/000596 patent/WO2024166602A1/ja not_active Ceased
2025
- 2025-06-30 US US19/254,067 patent/US20250329114A1/en active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| CN120615158A (zh) | 2025-09-09 |
| JPWO2024166602A1 (en) | 2024-08-15 |
| US20250329114A1 (en) | 2025-10-23 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN103988049B (zh) | 具有摄像头的坐标测量机 | |
| US10940573B2 (en) | Hand-held tool system | |
| CN110871441B (zh) | 感测系统、作业系统、增强现实图像的显示方法、以及存储有程序的存储介质 | |
| JP5316118B2 (ja) | 3次元視覚センサ | |
| JP5073256B2 (ja) | 位置測定装置及び位置測定方法及び位置測定プログラム | |
| JP6291562B2 (ja) | 有向性のプローブ処理による、三次元スキャナにおける多経路干渉の診断および排除 | |
| WO2016179448A1 (en) | Three-dimensional measuring device removably coupled to robotic arm on motorized mobile platform | |
| JP2010211746A (ja) | 3次元認識結果の表示方法および3次元視覚センサ | |
| US11547273B2 (en) | User interface for a dental measurement system | |
| JP2010237193A (ja) | キャリブレーション装置および3次元計測のためのパラメータの精度の確認支援方法 | |
| JP2021177157A (ja) | アイウェア表示システム | |
| US12228401B2 (en) | Survey system | |
| WO2024166602A1 (ja) | 三次元測定装置 | |
| JP2025092238A (ja) | 三次元測定装置、三次元測定方法及び三次元測定プログラム | |
| US10724850B2 (en) | Displacement measuring apparatus | |
| WO2024166601A1 (ja) | 三次元測定装置 | |
| JP2024111981A (ja) | 三次元測定装置 | |
| US10724849B2 (en) | Displacement measuring apparatus | |
| JP2021177156A (ja) | アイウェア表示システム | |
| JP4167453B2 (ja) | デジタイザおよび座標認識用シート | |
| US12508114B2 (en) | User interface for a dental measurement system | |
| JP7731104B2 (ja) | 情報処理装置、ロボットの軌道生成方法、ロボットの制御方法、プログラム、移動体及びロボットシステム | |
| DE112024000776T5 (de) | Dreidimensionale messvorrichtung | |
| JP7081941B2 (ja) | 三次元形状計測装置及び三次元形状計測方法 | |
| JP2013000182A (ja) | 表示制御装置、表示制御方法、及び表示制御プログラム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 24753049 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2024576182 Country of ref document: JP Kind code of ref document: A |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2024576182 Country of ref document: JP |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 202480009961.0 Country of ref document: CN |
|
| WWP | Wipo information: published in national office |
Ref document number: 202480009961.0 Country of ref document: CN |