WO2017146202A1 - Three-dimensional shape data and texture information generation system, imaging control program, and three-dimensional shape data and texture information generation method - Google Patents
- Publication number
- WO2017146202A1 (PCT/JP2017/007054)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- distance
- subject
- camera
- texture information
- dimensional shape
- Prior art date
Classifications
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G06T7/40—Image analysis; analysis of texture
- G06T7/55—Image analysis; depth or shape recovery from multiple images
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
- H04N5/2226—Studio circuitry; determination of depth image, e.g. for foreground/background separation
- G01B11/026—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring distance between sensor and object
Definitions
- The present invention relates to the technical field of systems for acquiring the three-dimensional shape and texture information of a subject.
- Patent Document 1 discloses a portable three-dimensional shape measuring device that can quickly obtain highly accurate three-dimensional shape data.
- In that device, a three-dimensional shape sensor capable of detecting a three-dimensional shape is attached to the tip of an articulated arm standing on a base. The sensor is positioned, in a non-contact state, to face the object to be measured, and the 3D shape data of the object is output by surface scanning while the sensor is held stationary.
- Patent Document 2 discloses a technique for generating three-dimensional shape data from a plurality of images.
- In this technique, the position and orientation of the camera that captured each image are obtained by calculation, so a three-dimensional shape can be generated that does not depend on the accuracy of the camera's position and orientation.
- Furthermore, the three-dimensional shape and texture information of a subject can be generated with a single camera by shooting while changing the illumination conditions; using a single camera avoids misalignment between the three-dimensional shape and the texture caused by lens distortion.
- However, the three-dimensional shape sensor disclosed in Patent Document 1 functions effectively only within a fixed distance range; outside that range, data cannot be acquired or errors occur, so the operator must move the sensor to a position where no error occurs. Although the accuracy of the acquired 3D shape could be improved if the position and orientation of the sensor relative to the shape were determined accurately, this has not been realized.
- Because the three-dimensional shape generated by the method of Patent Document 2 depends on the resolution of the camera, generating high-definition three-dimensional shape data and texture information requires a high-resolution camera and close-up photography.
- The present invention therefore provides a three-dimensional shape data and texture information generation system, a photographing control program, and a three-dimensional shape data and texture information generation method capable of suppressing blur and noise caused by the photographing conditions of a subject and obtaining high-definition three-dimensional shape data and texture information.
- The invention described in claim 1 includes: photographing means that photographs a subject one partial area at a time; distance measuring means that measures the distance from the position of the photographing means to a distance measurement target point; driving means that drives at least one of the subject and the photographing means; control means that controls the photographing means, the distance measuring means, and the driving means; and generating means that generates three-dimensional shape data and texture information of the subject based on the image of each partial area photographed by the photographing means. In this three-dimensional shape data and texture information generation system, the control means causes the distance measuring means to measure the distance from the position of the photographing means to a distance measurement target point on the subject while the photographing means is focused on the partial area containing that point, drives at least one of the subject and the photographing means so that the partial areas are changed sequentially while the measured distance, or a distance calculated from it, is kept within the depth of field, and acquires an image of each partial area by this driving and by causing the photographing means to photograph it.
- In the invention described in claim 2, the control means causes the driving means to drive at least one of the subject and the photographing means so that, while the distance is kept within the depth of field, each newly selected partial area overlaps the partial area adjacent to it.
- In the invention described in claim 3, the control means causes the driving means to drive at least one of the subject and the photographing means so that the same portion of the subject is photographed a plurality of times.
- In the invention described in claim 4, the driving means includes a turntable on which the subject is placed, and the control means drives the turntable so that the partial areas are changed sequentially.
- In the invention described in claim 5, the photographing means and the distance measuring means are attached to a slider that can change the distance to the distance measurement target point, and the control means drives the slider so that the measured distance, or a distance calculated from it, is kept within the depth of field.
- In the invention described in claim 6, the photographing means and the distance measuring means are attached to a robot arm that can change at least one of their position and posture with respect to the distance measurement target point, and the control means drives the robot arm so that the partial areas are changed sequentially while the measured distance, or a distance calculated from it, is kept within the depth of field.
- The invention according to claim 7 is the three-dimensional shape data and texture information generation system according to any one of claims 1 to 6, wherein the lens of the photographing means is a macro lens.
- The invention described in claim 8 includes a camera that photographs a subject one partial area at a time, a distance measuring sensor, a driving device that drives at least one of the subject and the camera, and a controller that controls the camera, the distance measuring sensor, and the driving device. The controller causes the distance measuring sensor to measure the distance from the camera position to a distance measurement target point on the subject while the camera is focused on the partial area containing that point, drives at least one of the subject and the camera with the driving device so that the partial areas are changed sequentially while the measured distance, or a distance calculated from it, is kept within the depth of field, and causes the camera to photograph each partial area of the subject. A processor then executes a generation process that generates the subject's three-dimensional shape data and texture information based on the images of the partial areas photographed by the camera.
- The invention described in claim 9 is a photographing control program for a system comprising photographing means that photographs a subject one partial area at a time, distance measuring means that measures the distance from the position of the photographing means to a distance measurement target point, driving means that drives at least one of the subject and the photographing means, control means that controls the photographing means, the distance measuring means, and the driving means, and generating means that generates three-dimensional shape data and texture information of the subject based on the image of each partial area photographed by the photographing means. The program causes a computer included in the control means to execute: a step of causing the distance measuring means to measure the distance from the position of the photographing means to a distance measurement target point on the subject while the photographing means is focused on the partial area containing that point; a step of causing the driving means to drive at least one of the subject and the photographing means so that the partial areas are changed sequentially while the measured distance, or a distance calculated from it, is kept within the depth of field; and a step of acquiring an image of each partial area by causing the photographing means to photograph it.
- The invention described in claim 10 is a method comprising: measuring, with a distance measuring sensor, the distance from the camera position to a distance measurement target point on the subject while the camera is focused on the partial area containing that point; driving, with a driving device, at least one of the subject and the camera so that the partial areas are changed sequentially while the measured distance, or a distance calculated from it, is kept within the depth of field; causing the camera to photograph each partial area of the subject as the partial areas are changed; and generating, with a processor, three-dimensional shape data and texture information of the subject based on the images of the partial areas photographed by the camera.
- According to the present invention, it is possible to suppress blur and noise caused by the photographing conditions of the subject and to obtain high-definition three-dimensional shape data and texture information.
- Moreover, the obtained 3D shape and texture information are generated in a mutually aligned state.
- (Embodiment 1) FIG. 1 is a block diagram showing the schematic configuration of the three-dimensional shape data and texture information generation system S according to the present embodiment.
- (Embodiment 1) FIG. 2 shows an example of the positional relationship between a subject (for example, a globe) placed on the turntable and the camera 1 and distance measuring sensor 2.
- (Embodiment 1) FIG. 3 is a flowchart illustrating an example of the imaging control process and the three-dimensional shape data and texture information generation process.
- (Embodiment 2) FIG. 4 shows an example of the positional relationship between a subject placed on the base and the camera 1 and distance measuring sensor 2.
- (Embodiment 2) FIG. 5 is a flowchart illustrating an example of the imaging control process and the three-dimensional shape data and texture information generation process.
- FIG. 1 is a block diagram showing a schematic configuration of the three-dimensional shape data and texture information generation system S according to the present embodiment.
- the three-dimensional shape data and texture information generation system S includes a camera 1, a distance measuring sensor 2, a driving device 3, a three-dimensional shape data and texture information generation device 4, and the like.
- Here, the camera 1 is an example of the photographing means in the present invention, the distance measuring sensor 2 is an example of the distance measuring means, and the driving device 3 is an example of the driving means.
- The camera 1 and the distance measuring sensor 2 are used as sensors for acquiring the three-dimensional shape of the subject (the object to be photographed).
- The three-dimensional shape data and texture information generation system S according to the present embodiment is used when the three-dimensional shape of an object, and textures such as patterns drawn on it, must be acquired with high accuracy: for example, for detecting the three-dimensional shape before machining by an industrial robot, for verifying work after machining, or for acquiring detailed shape and texture in the digitization of artworks.
- The camera 1 includes a macro lens, a shutter, an image sensor, and the like. Under the control of the three-dimensional shape data and texture information generation device 4, it photographs the subject one partial area at a time and outputs each photographed image to the device 4. A single-lens reflex camera is used as the camera 1.
- The macro lens is a kind of fixed-focal-length (prime) lens: its focal length (the distance from the center of the lens to the focal point) does not change, its in-focus range is very narrow (for example, at an aperture value of about F5.6), and it photographs a narrow area at high magnification.
- Theoretically the plane of exact focus is a single plane, but there is an allowable range of a certain depth before and behind that plane in the depth direction (the depth of field) that can be treated as in focus.
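- As optics background (this formula does not appear in the patent text), the total depth of field at magnification $m$, f-number $N$, and permissible circle of confusion $c$ is approximately
  $$\mathrm{DoF} \approx \frac{2\,N\,c\,(m+1)}{m^{2}},$$
  which shows why a macro lens working at high magnification leaves only a few millimetres in focus even at an aperture of about F5.6.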
- Focusing is performed using the image formed through the macro lens, either as captured into the three-dimensional shape data and texture information generation device 4 or as confirmed through the viewfinder of the camera 1.
- A camera 1 without an autofocus mechanism is used, or, if the camera 1 has an autofocus mechanism, the mechanism is not used.
- The distance measuring sensor 2 includes a light source (for example, a light-emitting diode or a laser diode), a light-receiving element, and the like. Under the control of the three-dimensional shape data and texture information generation device 4, it measures the distance from the position of the camera 1 (its position in real space) to a distance measurement target point on the subject (corresponding to the center of a pixel in a partial area), and outputs the measured distance to the device 4.
- Here, the position of the camera 1 is a point on the optical axis passing through the center of the lens of the camera 1, for example the focal point of the lens.
- The distance measured by the distance measuring sensor 2 could be used as it is, but the position of the camera 1 and the position of the distance measuring sensor 2 do not coincide. The three-dimensional shape data and texture information generation device 4 therefore corrects the measured distance, based on the known principle of triangulation, so that it represents the distance from the position of the camera 1.
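- As a concrete illustration of this correction, the following is a minimal sketch; the function name and the calibrated sensor/camera poses are illustrative assumptions, not part of the patent. Given the sensor's calibrated position and measurement axis, the measured point is located in a common frame and its distance from the camera position is recomputed:

```python
import numpy as np

def distance_from_camera(d_measured, sensor_pos, sensor_axis, camera_pos):
    """Re-express a range reading taken at the sensor position as a
    distance from the camera position (illustrative helper, assumed names).

    d_measured  -- range along the sensor's measurement axis
    sensor_pos  -- 3D position of the ranging sensor (from calibration)
    sensor_axis -- direction vector of the sensor's measurement axis
    camera_pos  -- 3D position of the camera (e.g. the lens focal point)
    """
    axis = sensor_axis / np.linalg.norm(sensor_axis)
    point = sensor_pos + d_measured * axis            # measured point, common frame
    return float(np.linalg.norm(point - camera_pos))  # distance from the camera

# Example: sensor mounted 50 mm beside the lens, both aimed along +Z (metres).
cam = np.array([0.0, 0.0, 0.0])
sen = np.array([0.05, 0.0, 0.0])
print(distance_from_camera(0.300, sen, np.array([0.0, 0.0, 1.0]), cam))
```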
- The distance measuring sensor 2 may be provided integrally with the camera 1, within the housing of the camera 1.
- Furthermore, an offset can be calculated and applied, taking the shape of the subject into account, so that a wider area of the subject is in focus.
- The driving device 3 drives at least one of the subject and the camera 1 (that is, moves at least one of them under power) under the control of the three-dimensional shape data and texture information generation device 4, and includes a drive mechanism and a control circuit for controlling the drive mechanism.
- The drive mechanism includes, for example, a turntable on which the subject is placed and a slider to which the camera 1 and the distance measuring sensor 2 are attached (Embodiment 1), or an articulated robot arm to which the camera 1 and the distance measuring sensor 2 are attached (Embodiment 2).
- The three-dimensional shape data and texture information generation device 4 includes interface units 41a to 41c, a display unit 42, an operation unit 43, a storage unit 44, a control unit (controller) 45, a calculation unit 46, and the like, which are connected to one another via a bus 47.
- The interface unit 41a serves as an interface when the control unit 45 communicates with the camera 1 by wire or wirelessly.
- The interface unit 41b serves as an interface when the control unit 45 communicates with the distance measuring sensor 2 by wire or wirelessly.
- The interface unit 41c serves as an interface when the control unit 45 communicates with the driving device 3 by wire or wirelessly.
- The display unit 42 has a display screen for displaying various kinds of information.
- The display screen shows, for example, the image (two-dimensional image) of each partial area of the subject photographed by the camera 1, a three-dimensional image representing the three-dimensional shape of the entire subject, and the distance between the subject and the camera 1.
- The operation unit 43 receives operation instructions from the user and outputs signals corresponding to the received instructions to the control unit 45; a mouse, for example, is used as the operation unit 43.
- A touch panel, having a display function for showing various information on a display screen and an input function for accepting operations with a finger or a pen, may serve as both the display unit 42 and the operation unit 43.
- The storage unit 44 includes, for example, a hard disk drive or a non-volatile semiconductor memory, and stores an OS (Operating System), the imaging control program, the three-dimensional shape generation program, various setting data, and the like.
- The storage unit 44 also stores the image data captured for each partial area of the subject, as well as the three-dimensional shape data and texture information generated based on those images.
- The three-dimensional shape data and texture information constitute a three-dimensional image representing the three-dimensional shape and texture of the whole or a part of the subject.
- The control unit 45 includes a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and the like. The control unit 45 executes the subject photographing control process in accordance with the photographing control program stored in the storage unit 44, functioning in this process as control means that controls the camera 1, the distance measuring sensor 2, and the driving device 3. Specifically, the control unit 45 causes the distance measuring sensor 2 to measure the distance from the position of the camera 1 to the distance measurement target point while the camera 1 is focused on the partial area containing the distance measurement target point on the subject.
- The control unit 45 then drives at least one of the subject and the camera 1 with the driving device 3 so that the partial areas are changed sequentially while the measured distance, or a distance calculated from it, is kept constant (constant to within the depth of field of a few millimetres), and acquires an image of each partial area by having the camera 1 photograph it. That is, while keeping the distance between the subject and the camera 1 constant to within the depth of field, the control unit 45 moves at least one of the subject and the camera 1 in a predetermined direction, by a few centimetres for example, so that each newly selected partial area overlaps the adjacent, previously photographed one. During this drive control, in which the partial areas are changed sequentially, the control unit 45 drives the driving device 3 by sending it a drive-ON signal and has the camera 1 photograph each partial area by sending it a shutter-ON signal, thereby acquiring the image of each partial area from the camera 1.
- In this way, the same portion of the subject is photographed at least twice, from relatively different photographing positions, and the three-dimensional position of that portion can be determined by photogrammetry from the images acquired at the different positions.
- The calculation unit 46 includes a CPU, a ROM, a RAM, and the like. In accordance with the three-dimensional shape and texture information generation program stored in the storage unit 44, the calculation unit 46 executes the process that generates the three-dimensional shape data and texture information of the subject; in this process it functions as generating means that generates the subject's three-dimensional shape data and texture information from the images (two-dimensional images) of the partial areas photographed by the camera 1. When a subject in three-dimensional space is photographed by the camera 1 to obtain a two-dimensional image, the coordinates (X, Y, Z) in three-dimensional space are converted to the coordinates (u, v) in the two-dimensional image by the following equation (1), based on the product of a matrix M_outer representing the position and orientation of the camera 1 and a matrix M_inner representing the optical transformation in the camera 1.
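- The equation itself did not survive this text extraction; what follows is the standard pinhole-projection form that the surrounding description identifies (the notation in the original publication may differ):
  $$s \begin{pmatrix} u \\ v \\ 1 \end{pmatrix} = M_{\mathrm{inner}}\, M_{\mathrm{outer}} \begin{pmatrix} X \\ Y \\ Z \\ 1 \end{pmatrix} \qquad (1)$$
  where $s$ is a scale factor, $M_{\mathrm{inner}}$ is the $3 \times 3$ intrinsic matrix, and $M_{\mathrm{outer}} = (R \mid t)$ is the $3 \times 4$ extrinsic matrix composed of a rotation $R$ and a translation $t = (t_x\ t_y\ t_z)^{\top}$.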
- By photographing the same location on the subject from a plurality of positions, the calculation unit 46 can compute the matrix M_outer and the matrix M_inner. Once M_outer and M_inner are obtained, the calculation unit 46 can compute the coordinates (X, Y, Z) in three-dimensional space from the coordinates (u, v) in the two-dimensional images, and can therefore generate the three-dimensional shape data and texture information.
- In general, the matrix M_outer representing the position and orientation of the camera 1 can be calculated from a set of correspondences between coordinates (X, Y, Z) in three-dimensional space and coordinates (u, v) in the two-dimensional image; if the photographed images contain sufficiently distinctive features, the correspondences needed to calculate M_outer can be obtained.
- The matrix M_inner, representing the optical transformation in the camera 1, is often calculated by photographing a calibration pattern whose coordinates (X, Y, Z) in three-dimensional space are known, on the assumption that M_inner is constant.
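- As background, such pattern-based calibration of M_inner is commonly performed with a chessboard target, for example with OpenCV; the patent names no specific tool, so the sketch below is illustrative (the pattern size, square size, and file names are placeholders):

```python
import cv2
import numpy as np

PATTERN = (9, 6)     # inner corners of the chessboard (assumed)
SQUARE_MM = 10.0     # printed square size (assumed)

# Known 3D coordinates of the pattern corners, in the pattern's own plane.
grid = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
grid[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM

obj_points, img_points = [], []
for path in ["calib_01.png", "calib_02.png", "calib_03.png"]:  # placeholder files
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        obj_points.append(grid)
        img_points.append(corners)

# m_inner is the 3x3 intrinsic matrix; dist holds lens-distortion coefficients.
ret, m_inner, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("M_inner =\n", m_inner)
```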
- However, the matrix M_inner is affected by focus-position adjustments made by the autofocus mechanism of the camera 1, which becomes an obstacle to restoring the coordinates (X, Y, Z) in three-dimensional space with high accuracy.
- In the present embodiment, the task of focusing on the subject is realized not through the matrix M_inner but through the position component (t_x, t_y, t_z) of the matrix M_outer.
- Therefore, the matrix M_inner calibrated or calculated with high accuracy can be used to compute the coordinates (X, Y, Z) in three-dimensional space without any loss of accuracy.
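- The patent does not spell out its reconstruction algorithm; as an illustration of the photogrammetric step, the following minimal numpy sketch performs linear (DLT) triangulation, recovering (X, Y, Z) from corresponding pixels in two views once each view's 3x4 projection matrix (M_inner times M_outer) is known:

```python
import numpy as np

def triangulate_dlt(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one subject point from two views.

    P1, P2   -- 3x4 projection matrices (M_inner @ M_outer of each view)
    uv1, uv2 -- (u, v) pixel coordinates of the same point in each view
    Returns the (X, Y, Z) coordinates in the common 3D frame.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    # Each view contributes two linear constraints on the homogeneous point.
    A = np.stack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # Solve A X = 0 via SVD; the solution is the last right-singular vector.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```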
- The three-dimensional shape data and texture information generation device 4 may be divided across a plurality of PCs (personal computers). For example, a shooting PC stores the captured data on a hard disk, and another PC connected over a network reads that data and performs the calculations that generate the three-dimensional shape data and texture information; alternatively, the shooting PC stores the images on an external hard disk, which is then moved to another PC for the calculation. In either case, the shooting PC receives the image data from the camera and stores it (that the camera has taken a shot is determined from the camera's strobe signal).
- FIG. 2 is a diagram illustrating an example of a positional relationship between a subject (for example, a globe) installed on the turntable and the camera 1 and the distance measuring sensor 2 in the first embodiment.
- As shown in FIG. 2, the turntable rotates about its central axis (in the direction of the broken-line arrow in FIG. 2) under the control of the control unit 45.
- The camera 1 and the distance measuring sensor 2 are attached to a slider.
- The slider is provided so that its distance from the distance measurement target point on the subject can be changed under the control of the control unit 45.
- FIG. 3 is a flowchart illustrating an example of the imaging control process and the three-dimensional shape data and texture information generation process in Embodiment 1.
- The process illustrated in FIG. 3 is started, for example, when the user gives a start instruction by operating the operation unit 43.
- First, the control unit 45 activates the camera 1 and the distance measuring sensor 2 and reads in, from the storage unit 44, setting data indicating an assumed path created based on the shape of the subject (step S1).
- The assumed path indicates, for example, that the subject is to be rotated horizontally through 360 degrees in 10-degree steps; it is created in advance and stored in the storage unit 44.
- Next, the control unit 45 determines a distance measurement target point on the subject based on the assumed path indicated by the setting data read in step S1 (step S2).
- Next, the control unit 45 adjusts the slider by driving it (that is, moving it back and forth with respect to the distance measurement target point) so that the camera 1 comes into focus on the partial area containing the point determined in step S2 (step S3). When the camera 1 is focused on that partial area, the process proceeds to step S4.
- In step S4, with the camera 1 focused on the partial area containing the distance measurement target point, the control unit 45 sends a measurement-ON signal to the distance measuring sensor 2, causing it to measure the distance from the position of the camera 1 to the point, and acquires (stores in the RAM) distance data indicating the measured distance.
- Next, the control unit 45 sends the camera 1 a shutter-ON signal so that the camera 1 photographs the partial area containing the distance measurement target point, and acquires the image (two-dimensional image) photographed by the camera 1 (step S5).
- Next, the control unit 45 determines whether to end photographing, based on the assumed path indicated by the setting data read in step S1 (step S6).
- When the assumed path has been completed, the control unit 45 determines that photographing is to end (step S6: YES) and proceeds to step S11. At this point, images of a band of predetermined vertical width covering one full revolution of the subject have been obtained. To photograph another band, the user changes the angle of the subject placed on the turntable (for example, re-seats the subject on the turntable) and issues a start instruction again via the operation unit 43, whereupon the process shown in FIG. 3 is restarted.
- On the other hand, when the control unit 45 determines that photographing is not to end (step S6: NO), the process proceeds to step S7.
- In step S7, based on the assumed path indicated by the setting data read in step S1, the control unit 45 sends a drive-ON signal to the driving device 3 so that the turntable is driven (that is, rotated) and the partial area of the subject is changed (for example, by 10 degrees horizontally). Because the subject rotates together with the turntable, the position of the subject is prevented from shifting against the user's intention.
- Next, the control unit 45 sends a measurement-ON signal to the distance measuring sensor 2, causing it to measure the distance from the position of the camera 1 to the distance measurement target point in the changed partial area, and acquires distance data indicating the measured distance (step S8).
- Next, the control unit 45 determines whether the difference (distance difference) between the distance indicated by the distance data acquired in step S4 and the distance indicated by the distance data acquired in the immediately preceding step S8 is within ± a predetermined value (ideally 0, but set within a few millimetres in view of the depth of field) (step S9). If the control unit 45 determines that the distance difference is within ± the predetermined value (step S9: YES), it returns to step S5, sends a shutter-ON signal to the camera 1 so that the partial area containing the distance measurement target point is photographed, and acquires the photographed image.
- On the other hand, if the control unit 45 determines that the distance difference is not within ± the predetermined value (step S9: NO), it sends a drive-ON signal to the driving device 3, driving the slider (that is, moving it back and forth with respect to the distance measurement target point) so that the distance difference is eliminated (step S10), and the process returns to step S9. In other words, the control unit 45 drives the slider so that the measured distance is kept within the depth of field (the ± predetermined range). Because the camera 1 is attached to the slider, the distance from the position of the camera 1 to the distance measurement target point can be kept within the depth of field with high accuracy. In this way, the partial areas can be changed sequentially while the distance to the distance measurement target point measured by the distance measuring sensor 2 is kept within the depth of field.
- In step S11, the calculation unit 46 executes, in accordance with the three-dimensional shape and texture information generation program stored in the storage unit 44, the generation process that produces the three-dimensional shape data and texture information from the images (two-dimensional images) acquired as described above.
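- To make the flow of steps S1 to S10 concrete, the following is a minimal control-loop sketch; the device classes (Turntable, Slider, RangeSensor, Camera), method names, and tolerance values are illustrative placeholders, not interfaces defined by the patent:

```python
# Illustrative pseudocode for the Embodiment-1 capture loop (steps S1-S10).
DOF_TOLERANCE_MM = 2.0   # the "± predetermined value": a few mm per the text
STEP_DEG = 10            # horizontal rotation per partial area

def capture_full_revolution(turntable, slider, sensor, camera):
    images = []
    reference = sensor.measure()          # step S4: distance once focus is achieved
    images.append(camera.shoot())         # step S5: photograph the partial area
    for _ in range(360 // STEP_DEG - 1):  # step S6: end after one full revolution
        turntable.rotate(STEP_DEG)        # step S7: advance to the next partial area
        d = sensor.measure()              # step S8: re-measure the distance
        while abs(d - reference) > DOF_TOLERANCE_MM:   # step S9: within DoF?
            slider.move_toward_subject(d - reference)  # step S10: correct distance
            d = sensor.measure()
        images.append(camera.shoot())     # back to step S5
    return images
```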
- FIG. 4 is a diagram illustrating an example of a positional relationship between the subject installed on the base and the camera 1 and the distance measuring sensor 2 in the second embodiment.
- As shown in FIG. 4, the camera 1 and the distance measuring sensor 2 are attached to the tip of an articulated robot arm.
- The articulated robot arm is composed of a plurality of arms connected by joints and is driven under the control of the control unit 45. That is, the angles (rotation angle and bending angle) of the joints of the articulated robot arm are changed by motors (not shown) driven by the control unit 45, which makes it possible to change at least one of the position and the posture of the camera 1 and the distance measuring sensor 2 with respect to the distance measurement target point on the subject.
- FIG. 5 is a flowchart illustrating an example of the imaging control process and the three-dimensional shape data and texture information generation process in Embodiment 2.
- The process illustrated in FIG. 5 is started, for example, when the user issues a start instruction by operating the operation unit 43.
- First, the control unit 45 activates the camera 1 and the distance measuring sensor 2 and accepts the user's designation, made via the operation unit 43, of a distance measurement target point on the subject (step S21).
- For example, the image formed through the macro lens (the image as confirmed in the viewfinder) is displayed on the display screen of the display unit 42, and the user designates the distance measurement target point by moving the mouse pointer to the corresponding position on the image and clicking.
- In step S21, as in Embodiment 1, setting data indicating an assumed path created based on the shape of the subject may be read in instead; in that case, the articulated robot arm is driven based on an assumed path determined from the distance measurement target points designated by the user.
- Next, the control unit 45 determines the distance measurement target point designated in step S21 (step S22).
- Next, the control unit 45 drives the articulated robot arm so that the camera 1 comes into focus on the partial area containing the distance measurement target point determined in step S22 (step S23). When the camera 1 is focused on that partial area, the process proceeds to step S24.
- In step S24, with the camera 1 focused on the partial area containing the distance measurement target point, the control unit 45 performs a state-confirmation process for the camera 1 with respect to that point: it causes the distance measuring sensor 2 to measure the distance from the position of the camera 1 to the point, acquires distance data indicating the measured distance, and also acquires the normal vector at the distance measurement target point.
- Next, the control unit 45 determines whether the state of the camera 1 is appropriate (step S25). For example, it determines whether the normal vector at the distance measurement target point coincides with the optical axis of the camera 1, and if they coincide, the state of the camera 1 is judged appropriate. If the control unit 45 determines that the state of the camera 1 is appropriate (step S25: YES), it proceeds to step S27; if it determines that the state is not appropriate (step S25: NO), it proceeds to step S26.
- In step S26, the control unit 45 sends a drive-ON signal to the driving device 3 to drive the articulated robot arm so that the posture (orientation) of the camera 1 and the distance measuring sensor 2 changes, then returns to step S25 and again determines whether the state of the camera 1 is appropriate. Through this process, the normal vector and the optical axis of the camera 1 are brought into coincidence.
- In step S27, the control unit 45 sends the camera 1 a shutter-ON signal, causing it to photograph the partial area containing the distance measurement target point, and acquires the image photographed by the camera 1.
- Next, the control unit 45 determines whether to end photographing (step S28). For example, when the user issues an end instruction via the operation unit 43, the control unit 45 determines that photographing is to end (step S28: YES) and proceeds to step S30; when it determines that photographing is not to end (step S28: NO), the process proceeds to step S29.
- In step S29, the control unit 45 sends a drive-ON signal to the driving device 3, in response to an instruction from the user via the operation unit 43 (or based on the assumed path), to drive the articulated robot arm so that the partial area of the subject is changed, and the process returns to step S24.
- In the state-confirmation process of the second and subsequent iterations, the control unit 45 again causes the distance measuring sensor 2 to measure the distance from the position of the camera 1 to the distance measurement target point, acquires distance data indicating the measured distance along with the normal vector at the point, and determines whether the state of the camera 1 is appropriate (step S25). If the difference (distance difference) between the distance indicated by the distance data acquired this time and the distance indicated by the distance data acquired previously is within ± the predetermined value, and the normal vector at the distance measurement target point coincides with the optical axis of the camera 1, the state of the camera 1 is judged appropriate and the process proceeds to step S27. If the distance difference is not within ± the predetermined value, or the normal vector and the optical axis of the camera 1 do not coincide, the process proceeds to step S26.
- In this way, the control unit 45 drives the articulated robot arm so that the partial areas are changed sequentially while the measured distance is kept within the depth of field. Because the position and posture of the camera 1 and the distance measuring sensor 2 can be adjusted freely while the subject remains fixed, the distance from the position of the camera 1 to the distance measurement target point can be kept within the depth of field with even higher accuracy.
- In step S30, the calculation unit 46 executes, in accordance with the three-dimensional shape and texture information generation program stored in the storage unit 44, the generation process that produces the three-dimensional shape data and texture information from the images (two-dimensional images) acquired as described above.
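- As an illustration of the check in step S25, the following sketch tests whether the camera state is appropriate, that is, whether the distance change stays within the depth-of-field tolerance and the surface normal at the target point is aligned with the camera's optical axis; the function name and tolerance values are assumptions, not defined by the patent:

```python
import numpy as np

DOF_TOLERANCE_MM = 2.0      # "± predetermined value" (assumed)
ANGLE_TOLERANCE_RAD = 0.01  # allowed normal/axis misalignment (assumed)

def camera_state_ok(d_now, d_prev, surface_normal, optical_axis):
    # Distance criterion: the change since the previous measurement must stay
    # within the depth-of-field tolerance (skipped on the first iteration).
    if d_prev is not None and abs(d_now - d_prev) > DOF_TOLERANCE_MM:
        return False
    n = surface_normal / np.linalg.norm(surface_normal)
    a = optical_axis / np.linalg.norm(optical_axis)
    # The outward surface normal should be anti-parallel to the viewing
    # direction, i.e. the dot product n . a should be close to -1.
    angle = np.arccos(np.clip(-np.dot(n, a), -1.0, 1.0))
    return angle <= ANGLE_TOLERANCE_RAD
```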
- Although the articulated robot arm is used in Embodiment 2, in another embodiment a turntable and a single-joint robot arm may be used instead of the articulated robot arm. In that case, the subject is placed on the turntable, and the camera 1 and the distance measuring sensor 2 are attached to the tip of the single-joint robot arm. The control unit 45 then drives the turntable and the single-joint robot arm so that the partial areas are changed sequentially while the distance, measured as described above, is kept within the depth of field.
- As described above, according to the present embodiment, the three-dimensional shape data and texture information generation device 4 causes the distance measuring sensor 2 to measure the distance from the position of the camera 1 to a distance measurement target point while the camera 1 is focused on the partial area containing that point, drives at least one of the subject and the camera 1 with the driving device 3 so that the partial areas are changed sequentially while the measured distance, or a distance calculated from it, is kept within the depth of field, and acquires an image of each partial area by having the camera 1 photograph it. Because the three-dimensional shape data and texture information of the subject are generated from these images, blur and noise caused by the photographing conditions of the subject are suppressed, and high-definition three-dimensional shape data and texture information can be obtained.
- Moreover, the obtained 3D shape and texture information are generated in a mutually aligned state.
- In other words, the user can obtain high-quality images merely by setting rough shooting conditions such as the position and orientation of the camera 1.
- Simply by moving the camera 1 roughly to the position and orientation from which the subject is to be photographed, the user can capture it at an appropriate position and orientation without changing the focus position.
- In texture photographing with a single-lens reflex camera, the texture image of the entire circumference of the shape can be photographed in focus with the lens-calibrated parameters left unchanged.
- Furthermore, even a subject placed several millimetres or more away from the centre of rotation is always photographed from the same distance, and the texture of the region centred on the intersection of the subject and the optical axis can be photographed in focus.
- Because the distance is kept constant and the overlap between adjacent partial areas is kept constant, the photographing position and orientation can be calculated and the camera 1 moved and made to shoot accordingly; keeping the overlap constant makes it possible to acquire images of stable quality over the entire circumference of the subject.
- Conventionally, because the position and orientation suitable for shooting differ from subject to subject, the shooting environment had to be constructed manually; with the present system, an environment suitable for photographing, including illumination, is constructed automatically, so a group of images of stable quality can be photographed.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Studio Devices (AREA)
- Measurement Of Optical Distance (AREA)
- Image Input (AREA)
- Image Analysis (AREA)
Abstract
Description
First, the configuration and functions of the three-dimensional shape data and texture information generation system S according to the present embodiment will be described with reference to FIG. 1 and the other drawings. FIG. 1 is a block diagram showing the schematic configuration of the three-dimensional shape data and texture information generation system S according to the present embodiment. As shown in FIG. 1, the system S comprises a camera 1, a distance measuring sensor 2, a driving device 3, a three-dimensional shape data and texture information generation device 4, and the like. Here, the camera 1 is an example of the photographing means of the present invention, the distance measuring sensor 2 is an example of the distance measuring means, and the driving device 3 is an example of the driving means. The camera 1 and the distance measuring sensor 2 are used as sensors for acquiring the three-dimensional shape of the subject (the object to be photographed). The system S is used when the three-dimensional shape of an object, and textures such as patterns drawn on it, must be acquired with high accuracy: for example, for detecting the three-dimensional shape before machining by an industrial robot, for verifying work after machining, or for acquiring detailed shape and texture in the digitization of artworks.
Next, the operation of the three-dimensional shape data and texture information generation system S according to the present embodiment will be described separately for Embodiment 1 and Embodiment 2.
First, Embodiment 1 will be described with reference to FIGS. 2 and 3. FIG. 2 shows an example of the positional relationship between a subject placed on the turntable (for example, a globe) and the camera 1 and distance measuring sensor 2 in Embodiment 1. As shown in FIG. 2, the turntable rotates about its central axis (in the direction of the broken-line arrow in FIG. 2) under the control of the control unit 45. The camera 1 and the distance measuring sensor 2 are attached to a slider. The slider is provided so that its distance from the distance measurement target point on the subject can be changed under the control of the control unit 45; for example, it moves back and forth with respect to the point (in the direction of the broken-line arrow in FIG. 2).
Next, Embodiment 2 will be described with reference to FIGS. 4 and 5. FIG. 4 shows an example of the positional relationship between a subject placed on the base and the camera 1 and distance measuring sensor 2 in Embodiment 2. As shown in FIG. 4, the camera 1 and the distance measuring sensor 2 are attached to the tip of an articulated robot arm. The articulated robot arm is composed of a plurality of arms connected by joints and is driven under the control of the control unit 45; that is, the angles (rotation angle and bending angle) of the joints are changed by motors (not shown) driven by the control unit 45. This makes it possible to change at least one of the position and the posture of the camera 1 and the distance measuring sensor 2 with respect to the distance measurement target point on the subject.
2 Distance measuring sensor
3 Driving device
4 Three-dimensional shape data and texture information generation device
41a-41c Interface units
42 Display unit
43 Operation unit
44 Storage unit
45 Control unit
46 Calculation unit
S Three-dimensional shape data and texture information generation system
Claims (10)
- 1. A three-dimensional shape data and texture information generation system comprising: photographing means for photographing a subject one partial area at a time; distance measuring means for measuring a distance from a position of the photographing means to a distance measurement target point; driving means for driving at least one of the subject and the photographing means; control means for controlling the photographing means, the distance measuring means, and the driving means; and generating means for generating three-dimensional shape data and texture information of the subject based on the image of each partial area photographed by the photographing means, wherein the control means: causes the distance measuring means to measure the distance from the position of the photographing means to the distance measurement target point while the photographing means is focused on the partial area containing the distance measurement target point on the subject; causes the driving means to drive at least one of the subject and the photographing means so that the partial areas are changed sequentially while the measured distance, or a distance calculated from the measured distance, is kept within the depth of field; and acquires the image of each partial area by causing the photographing means to photograph each partial area of the subject.
- 2. The three-dimensional shape data and texture information generation system according to claim 1, wherein the control means causes the driving means to drive at least one of the subject and the photographing means so that, while the distance is kept within the depth of field, the newly selected partial area and the partial area adjacent to it overlap.
- 3. The three-dimensional shape data and texture information generation system according to claim 2, wherein the control means causes the driving means to drive at least one of the subject and the photographing means so that the same portion of the subject is photographed a plurality of times.
- 4. The three-dimensional shape data and texture information generation system according to any one of claims 1 to 3, wherein the driving means comprises a turntable on which the subject is placed, and the control means drives the turntable so that the partial areas are changed sequentially.
- 5. The three-dimensional shape data and texture information generation system according to any one of claims 1 to 4, wherein the photographing means and the distance measuring means are attached to a slider capable of changing the distance from the distance measurement target point, and the control means drives the slider so that the measured distance is kept within the depth of field.
- 6. The three-dimensional shape data and texture information generation system according to any one of claims 1 to 3, wherein the photographing means and the distance measuring means are attached to a robot arm capable of changing at least one of the position and the posture of the photographing means and the distance measuring means with respect to the distance measurement target point, and the control means drives the robot arm so that the partial areas are changed sequentially while the measured distance, or a distance calculated from the measured distance, is kept within the depth of field.
- 7. The three-dimensional shape data and texture information generation system according to any one of claims 1 to 6, wherein the lens of the photographing means is a macro lens.
- 8. A three-dimensional shape data and texture information generation system comprising: a camera that photographs a subject one partial area at a time; a distance measuring sensor that measures a distance from a position of the camera to a distance measurement target point; a driving device that drives at least one of the subject and the camera; and a controller that controls the camera, the distance measuring sensor, and the driving device, wherein the controller causes the distance measuring sensor to measure the distance from the position of the camera to the distance measurement target point while the camera is focused on the partial area containing the distance measurement target point on the subject, causes the driving device to drive at least one of the subject and the camera so that the partial areas are changed sequentially while the measured distance, or a distance calculated from the measured distance, is kept within the depth of field, and causes the camera to photograph each partial area of the subject, the system further comprising a processor that executes a generation process of generating three-dimensional shape data and texture information of the subject based on the image of each partial area photographed by the camera.
- 9. A photographing control program for a three-dimensional shape data and texture information generation system comprising photographing means for photographing a subject one partial area at a time, distance measuring means for measuring a distance from a position of the photographing means to a distance measurement target point, driving means for driving at least one of the subject and the photographing means, control means for controlling the photographing means, the distance measuring means, and the driving means, and generating means for generating three-dimensional shape data and texture information of the subject based on the image of each partial area photographed by the photographing means, the program causing a computer included in the control means to execute: a step of causing the distance measuring means to measure the distance from the position of the photographing means to the distance measurement target point while the photographing means is focused on the partial area containing the distance measurement target point on the subject; a step of causing the driving means to drive at least one of the subject and the photographing means so that the partial areas are changed sequentially while the measured distance, or a distance calculated from the measured distance, is kept within the depth of field; and a step of acquiring the image of each partial area by causing the photographing means to photograph each partial area of the subject.
- 10. A three-dimensional shape data and texture information generation method comprising: a step of measuring, with a distance measuring sensor, the distance from a position of a camera to a distance measurement target point while the camera is focused on a partial area containing the distance measurement target point on a subject; a step of driving, with a driving device, at least one of the subject and the camera so that the partial areas are changed sequentially while the measured distance, or a distance calculated from the measured distance, is kept within the depth of field; a step of causing the camera to photograph each partial area of the subject while the partial areas are being changed sequentially; and a step of generating, with a processor, three-dimensional shape data and texture information of the subject based on the image of each partial area photographed by the camera.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201780013460.XA CN108700408B (zh) | 2016-02-25 | 2017-02-24 | Three-dimensional shape data and texture information generation system and method, and imaging control method |
EP17756636.1A EP3421930B1 (en) | 2016-02-25 | 2017-02-24 | Three-dimensional shape data and texture information generation system, photographing control program, and three-dimensional shape data and texture information generation method |
US16/077,873 US10571254B2 (en) | 2016-02-25 | 2017-02-24 | Three-dimensional shape data and texture information generating system, imaging control program, and three-dimensional shape data and texture information generating method |
MX2018010282A MX2018010282A (es) | 2016-02-25 | 2017-02-24 | Three-dimensional shape data and texture information generation system, imaging control program, and three-dimensional shape data and texture information generation method |
JP2017567832A JP6504274B2 (ja) | 2016-02-25 | 2017-02-24 | Three-dimensional shape data and texture information generation system, photographing control program, three-dimensional shape data and texture information generation method, and information recording medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016034458 | 2016-02-25 | ||
JP2016-034458 | 2016-02-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017146202A1 true WO2017146202A1 (ja) | 2017-08-31 |
Family
ID=59686301
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/007054 WO2017146202A1 (ja) | 2016-02-25 | 2017-02-24 | 三次元形状データおよびテクスチャ情報生成システム、撮影制御プログラム、及び三次元形状データおよびテクスチャ情報生成方法 |
Country Status (6)
Country | Link |
---|---|
US (1) | US10571254B2 (ja) |
EP (1) | EP3421930B1 (ja) |
JP (1) | JP6504274B2 (ja) |
CN (1) | CN108700408B (ja) |
MX (1) | MX2018010282A (ja) |
WO (1) | WO2017146202A1 (ja) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7140209B2 (ja) * | 2018-12-21 | 2022-09-21 | 株式会社ニコン | Detection device, information processing device, detection method, and information processing program |
DE102020105215A1 (de) * | 2020-02-27 | 2021-09-02 | Jungheinrich Aktiengesellschaft | Method for calibrating a sensor unit of an industrial truck |
CN111787236B (zh) * | 2020-08-14 | 2021-08-17 | 广东申义实业投资有限公司 | Remote control device and method for panoramic picture shooting, and panoramic picture shooting system |
CN112312113B (zh) * | 2020-10-29 | 2022-07-15 | 贝壳技术有限公司 | Method, apparatus, and system for generating a three-dimensional model |
KR102611537B1 (ko) * | 2021-06-04 | 2023-12-08 | 한국전자통신연구원 | Method and apparatus for generating ultra-high-quality digital data |
CN113865509A (zh) * | 2021-09-29 | 2021-12-31 | 苏州华兴源创科技股份有限公司 | Automatic following detection device |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11223516A (ja) * | 1998-02-09 | 1999-08-17 | Fuji Xerox Co Ltd | Three-dimensional image capturing apparatus |
JP2001197521A (ja) | 2000-01-06 | 2001-07-19 | Toppan Printing Co Ltd | Imaging apparatus, imaging method, and recording medium storing data on imaging conditions |
JP2001338279A (ja) | 2000-05-30 | 2001-12-07 | Mitsubishi Electric Corp | Three-dimensional shape measuring apparatus |
JP2003148926A (ja) | 2001-11-13 | 2003-05-21 | Kanto Auto Works Ltd | Portable three-dimensional shape measuring apparatus |
JP3944039B2 (ja) * | 2002-09-13 | 2007-07-11 | キヤノン株式会社 | Focus adjustment apparatus and program |
US7391424B2 (en) * | 2003-08-15 | 2008-06-24 | Werner Gerhard Lonsing | Method and apparatus for producing composite images which contain virtual objects |
JP4419570B2 (ja) * | 2003-12-26 | 2010-02-24 | 富士ゼロックス株式会社 | Three-dimensional image photographing apparatus and method |
JP2006162250A (ja) * | 2004-12-02 | 2006-06-22 | Ushio Inc | Pattern inspection apparatus for film workpieces |
JP2007304429A (ja) * | 2006-05-12 | 2007-11-22 | Fujifilm Corp | Camera system |
JP2012098265A (ja) * | 2010-11-02 | 2012-05-24 | Beru Techno:Kk | Apparatus for measuring weight, shape, and other properties |
JP5610579B2 (ja) * | 2011-01-28 | 2014-10-22 | 三菱日立パワーシステムズ株式会社 | Three-dimensional dimension measuring apparatus |
JP5894426B2 (ja) * | 2011-12-13 | 2016-03-30 | パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America | Measurement target extraction device, face shape estimation device, measurement target extraction method, and face shape estimation method |
US9258550B1 (en) * | 2012-04-08 | 2016-02-09 | Sr2 Group, Llc | System and method for adaptively conformed imaging of work pieces having disparate configuration |
JP6029394B2 (ja) | 2012-09-11 | 2016-11-24 | 株式会社キーエンス | Shape measuring device |
US8848201B1 (en) * | 2012-10-20 | 2014-09-30 | Google Inc. | Multi-modal three-dimensional scanning of objects |
JP6075644B2 (ja) * | 2014-01-14 | 2017-02-08 | ソニー株式会社 | Information processing apparatus and method |
-
2017
- 2017-02-24 JP JP2017567832A patent/JP6504274B2/ja active Active
- 2017-02-24 EP EP17756636.1A patent/EP3421930B1/en active Active
- 2017-02-24 CN CN201780013460.XA patent/CN108700408B/zh active Active
- 2017-02-24 WO PCT/JP2017/007054 patent/WO2017146202A1/ja active Application Filing
- 2017-02-24 US US16/077,873 patent/US10571254B2/en active Active
- 2017-02-24 MX MX2018010282A patent/MX2018010282A/es unknown
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1075391A (ja) * | 1996-08-30 | 1998-03-17 | Canon Inc | Subject shape construction apparatus and subject shape construction method |
JP2000111322A (ja) * | 1998-10-01 | 2000-04-18 | Sony Corp | Three-dimensional data processing apparatus and method |
US20030160970A1 (en) * | 2002-01-30 | 2003-08-28 | Anup Basu | Method and apparatus for high resolution 3D scanning |
JP2003269932A (ja) * | 2002-03-13 | 2003-09-25 | Olympus Optical Co Ltd | Three-dimensional image photographing apparatus and three-dimensional image photographing method |
WO2015008587A1 (ja) * | 2013-07-16 | 2015-01-22 | 富士フイルム株式会社 | Imaging device and three-dimensional measuring device |
Non-Patent Citations (1)
Title |
---|
See also references of EP3421930A4 * |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108022297A (zh) * | 2017-11-30 | 2018-05-11 | 山东九维度网络科技有限公司 | Three-dimensional imaging modeling method, storage medium, and device |
CN108022297B (zh) * | 2017-11-30 | 2021-11-19 | 山东九维度网络科技有限公司 | Three-dimensional imaging modeling method, storage medium, and device |
WO2019227143A1 (en) | 2018-05-28 | 2019-12-05 | MMAPT IP Pty Ltd | A system for capturing media of a product |
EP3802009A4 (en) * | 2018-05-28 | 2022-03-09 | MMAPT IP Pty Ltd | System for capturing media of a product |
US11838688B2 (en) | 2018-05-28 | 2023-12-05 | MMAPT IP Pty Ltd. | System for capturing media of a product |
WO2020217331A1 (ja) * | 2019-04-24 | 2020-10-29 | 住友電工焼結合金株式会社 | Sintered body manufacturing system and manufacturing method |
JPWO2020217331A1 (ja) * | 2019-04-24 | 2021-12-23 | 住友電工焼結合金株式会社 | Sintered body manufacturing system and manufacturing method |
CN114383521A (zh) * | 2020-10-02 | 2022-04-22 | 贝克休斯油田作业有限责任公司 | Automated turbine blade and shroud clearance measurement |
CN114383521B (zh) * | 2020-10-02 | 2024-03-08 | 贝克休斯油田作业有限责任公司 | Automated turbine blade and shroud clearance measurement |
KR102581442B1 (ko) * | 2022-10-28 | 2023-09-20 | 포항공과대학교 산학협력단 | Control method for an underwater robot capable of three-dimensional scanning underwater |
Also Published As
Publication number | Publication date |
---|---|
US10571254B2 (en) | 2020-02-25 |
JPWO2017146202A1 (ja) | 2018-03-29 |
US20190339067A1 (en) | 2019-11-07 |
CN108700408A (zh) | 2018-10-23 |
CN108700408B (zh) | 2020-06-30 |
EP3421930B1 (en) | 2021-03-17 |
JP6504274B2 (ja) | 2019-04-24 |
MX2018010282A (es) | 2018-12-19 |
EP3421930A4 (en) | 2019-03-06 |
EP3421930A1 (en) | 2019-01-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017146202A1 (ja) | Three-dimensional shape data and texture information generation system, imaging control program, and three-dimensional shape data and texture information generation method | |
CN108965690B (zh) | Image processing system, image processing apparatus, and computer-readable storage medium | |
CN109382821B (zh) | Calibration method, calibration system, and program | |
WO2012053521A1 (ja) | Optical information processing device, optical information processing method, optical information processing system, and optical information processing program | |
JP4306006B2 (ja) | Three-dimensional data input method and apparatus | |
JP2009042162A (ja) | Calibration apparatus and method | |
JP2019049467A (ja) | Distance measurement system and distance measurement method | |
JP5093058B2 (ja) | Method of combining robot coordinates | |
JP6973233B2 (ja) | Image processing system, image processing apparatus, and image processing program | |
JP6653143B2 (ja) | Method and apparatus for measuring 3D coordinates of an object | |
JP5198078B2 (ja) | Measuring apparatus and measuring method | |
JP5573537B2 (ja) | Robot teaching system | |
WO2020240918A1 (ja) | Work support system, work support method, and program | |
JP4221808B2 (ja) | Three-dimensional data input method and apparatus | |
JP2009264898A (ja) | Workpiece position/attitude measurement method and measurement apparatus | |
WO2017057426A1 (ja) | Projection device, content determination device, projection method, and program | |
JP7173825B2 (ja) | Camera system, control method therefor, and program | |
JP5610579B2 (ja) | Three-dimensional dimension measuring apparatus | |
JP2005186193A (ja) | Robot calibration method and three-dimensional position measurement method | |
JP2004170277A (ja) | Three-dimensional measurement method, three-dimensional measurement system, image processing apparatus, and computer program | |
JP2008154195A (ja) | Method for creating a lens calibration pattern, lens calibration pattern, lens calibration method using the calibration pattern, lens calibration device, imaging device calibration method, and imaging device calibration device | |
JP4839858B2 (ja) | Remote instruction system and remote instruction method | |
JP3740848B2 (ja) | Three-dimensional input device | |
JP6234335B2 (ja) | Imaging device and moving body equipped with the imaging device | |
JP2008175635A (ja) | Three-dimensional measuring apparatus and measurement program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2017567832 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: MX/A/2018/010282 Country of ref document: MX |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2017756636 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2017756636 Country of ref document: EP Effective date: 20180925 |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17756636 Country of ref document: EP Kind code of ref document: A1 |