WO2023223916A1 - Three-dimensional reconfiguration system and three-dimensional reconfiguration method - Google Patents


Info

Publication number
WO2023223916A1
WO2023223916A1 (PCT/JP2023/017602)
Authority
WO
WIPO (PCT)
Prior art keywords
dimensional
marker
camera
reconstruction system
dimensional reconstruction
Prior art date
Application number
PCT/JP2023/017602
Other languages
French (fr)
Japanese (ja)
Inventor
叡一 松元
泰輔 橋本
Original Assignee
株式会社Preferred Networks
Priority date
Filing date
Publication date
Application filed by 株式会社Preferred Networks
Publication of WO2023223916A1


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images

Definitions

  • the present disclosure relates to a three-dimensional reconstruction system and a three-dimensional reconstruction method.
  • Three-dimensional reconstruction systems are known that generate a three-dimensional model of an object based on a plurality of images obtained by photographing the object from different directions with a camera.
  • Such a three-dimensional reconstruction system is used in 3DCG (3 Dimensional Computer Graphics) and the like.
  • For example, a method is disclosed in which camera posture information is acquired based on an image, taken by a camera, of a marker provided on the mounting surface of an object, and is used for three-dimensional reconstruction.
  • An object of the present disclosure is to provide a three-dimensional reconstruction system in which markers are easily photographed regardless of the photographing direction of the camera.
  • According to one aspect, a three-dimensional reconstruction system includes: one or more cameras for photographing an object; a three-dimensional marker; and a control device configured to output reconstruction information of the object based on a plurality of captured images of the object photographed from different directions by the one or more cameras and on posture information of the camera based on an image of the three-dimensional marker included in the plurality of captured images.
  • the three-dimensional marker has a marker portion arranged at an angle with respect to a mounting surface of the three-dimensional marker.
  • FIG. 1 is a diagram illustrating the overall configuration of a three-dimensional reconstruction system according to an embodiment.
  • FIG. 2 is a diagram illustrating a two-dimensional code formed on a three-dimensional marker according to the embodiment.
  • FIG. 3 is a diagram illustrating the corners of the marker portion according to the embodiment.
  • FIG. 4 is a flow diagram showing an example of processing by the control device according to the embodiment.
  • FIG. 5 is a diagram illustrating the overall configuration of a three-dimensional reconstruction system according to a modified example.
  • FIG. 6 is a block diagram of an example hardware configuration of a control device according to the embodiment.
  • the x direction, y direction, and z direction are directions perpendicular to each other.
  • the z direction is a normal direction to the mounting surface, and is typically a vertical direction.
  • the z positive direction side is written as the upper side, and the z negative direction side is written as the lower side.
  • the x direction and the y direction are directions in which the mounting surface extends, and are typically horizontal directions.
  • FIG. 1 is a diagram illustrating the overall configuration of a three-dimensional reconstruction system 100.
  • FIG. 2 is a diagram illustrating a two-dimensional code formed on the three-dimensional marker 6 included in the three-dimensional reconstruction system 100.
  • FIG. 3 is a diagram illustrating the corners 61 to 64 of the marker portion 6b of the three-dimensional marker 6.
  • The three-dimensional reconstruction system 100 includes a support base 2, upper cameras 3A and 3B, lower cameras 4A and 4B, a rotating section 5, four three-dimensional markers 6, two reference markers 6R, lights 8A and 8B, and a control device 9.
  • the three-dimensional reconstruction system 100 generates a three-dimensional model of an object based on a plurality of images obtained by photographing the object 10 from different directions using an upper camera 3A, an upper camera 3B, a lower camera 4A, and a lower camera 4B.
  • Examples of the object 10 include a container-shaped object such as a cup, as shown in FIG. 1.
  • The term "object" covers both an object that actually exists and is the target of three-dimensional reconstruction, and an object that is reconstructed based on a plurality of images taken of that object. Whether "object" refers to the actually existing object or to the reconstructed object can be distinguished from the context. For example, when referring to the handling of an object in real space, as in "placing an object on the support stand," "object" means the actually existing object. On the other hand, when referring to the handling of an object in a virtual space, as in "storing an object" or "reconstructing an object," "object" means the reconstructed object.
  • the support stand 2 is an example of a support member that supports the object 10.
  • the support base 2 includes an upper surface 2A.
  • the support base 2 is a base on which the object 10 is placed and on which the placed object 10 is rotated.
  • the support base 2 is, for example, a transparent plate such as an acrylic plate.
  • The upper surface 2A corresponds to a placement surface on which the object 10 is placed. Since the support base 2 is a transparent plate, the object 10 and the three-dimensional markers 6 placed on the upper surface 2A can be photographed from the lower surface 2B side, which is the back side of the upper surface 2A, as shown in FIG. 1. Because the three-dimensional reconstruction system 100 uses a transparent plate for the support base 2 and can therefore photograph the object 10 from all directions, it can obtain three-dimensional reconstruction information of the object 10 without missing information.
  • the upper camera 3A, the upper camera 3B, the lower camera 4A, and the lower camera 4B are examples of one or more cameras for photographing an object.
  • Each of the upper camera 3A, the upper camera 3B, the lower camera 4A, and the lower camera 4B can photograph the object 10 placed on the upper surface 2A from different directions.
  • the upper cameras 3A and 3B are installed in the direction of the upper surface 2A of the support base 2 (above the mounting surface), and can photograph the object 10 by being directed toward the object 10 from diagonally above the upper surface 2A.
  • the upper cameras 3A and 3B are installed at different inclination angles.
  • the lower cameras 4A and 4B are installed toward the lower surface 2B side of the support base 2 (below the mounting surface), and can photograph the object 10 by being directed toward the object 10 from diagonally below the upper surface 2A.
  • the lower cameras 4A and 4B are installed at different inclination angles.
  • Each of the upper camera 3A, upper camera 3B, lower camera 4A, and lower camera 4B is, for example, an RGB-D camera.
  • the RGB-D camera can capture an RGB image including each color of R (Red), G (Green), and B (Blue) of the object 10 and a depth image.
  • the depth image is an image that includes depth information (depth) up to the object 10.
  • Hereinafter, the upper cameras 3A and 3B will be collectively referred to as the "upper camera 3", and the lower cameras 4A and 4B will be collectively referred to as the "lower camera 4". Note that the positions of the upper camera 3 and the lower camera 4 may be interchanged.
  • the "camera” means an element that can capture an RGB image or a depth image of the object 10.
  • This "camera” includes the entire camera device such as an RGB-D camera and a depth camera, sensors such as a CMOS sensor and a depth sensor built into the camera device, and sensors used alone.
  • the upper camera 3A, the upper camera 3B, the lower camera 4A, and the lower camera 4B each have an internal parameter and an external parameter.
  • the internal parameters include information related to distortion of the lens included in each camera. In this embodiment, the internal parameters are known based on simulation results and the like.
  • the external parameters include posture information of each camera.
  • the attitude of the camera includes a relative attitude of the camera with respect to the three-dimensional marker 6 and an absolute attitude in the world coordinate system. Further, the attitude of the camera corresponds to the inclination of the optical axis of an optical system such as a lens included in each camera.
  • the external parameters are acquired using the three-dimensional marker 6.
  • the rotating unit 5 is an example of a rotating mechanism that rotates the object 10 by rotating the support base 2.
  • the rotating unit 5 rotates the support base 2 in the direction of an arrow 50, for example.
  • Each of the upper camera 3A, the upper camera 3B, the lower camera 4A, and the lower camera 4B photographs the object 10 at each of a plurality of rotation angles by the rotation unit 5.
  • a known power transmission system can be used for the mechanism of the rotating part 5.
  • the rotating section 5 includes a motor and a gear mechanism.
  • the rotating part 5 may be configured such that the driving force of the motor is transmitted to the rotating shaft of the support base 2 via a gear mechanism.
  • the rotating unit 5 may be configured to apply a driving force to the outer edge of the support base 2 to rotate the support base 2 .
  • the three-dimensional marker 6 is arranged around the object 10 on the upper surface 2A.
  • the three-dimensional marker 6 is a mark used to recognize the relative position of the object 10 and each of the upper camera 3A, upper camera 3B, lower camera 4A, and lower camera 4B.
  • the three-dimensional marker 6 is placed on the upper surface 2A.
  • The three-dimensional marker 6 is not fixed to the upper surface 2A, and its position and orientation with respect to the upper surface 2A are variable. However, since the three-dimensional markers 6 are not moved while the support base 2 is rotating so that the upper camera 3 and the lower camera 4 can photograph the object 10, the position and orientation of each three-dimensional marker 6 with respect to the upper surface 2A remain unchanged during photographing.
  • There is no particular limit to the number of three-dimensional markers 6. However, if the number of three-dimensional markers 6 is small, a three-dimensional marker 6 may be hidden behind the object 10 during shooting and the posture information of the upper camera 3 and the lower camera 4 may not be properly acquired, so a larger number is preferable. It is therefore preferable that a plurality of three-dimensional markers 6 be arranged around the object 10, including two three-dimensional markers 6 placed in front of and behind the object 10 with respect to the photographing direction of the upper camera 3 or the lower camera 4. In this embodiment, the number of three-dimensional markers 6 is four, as an example. Note that the three-dimensional markers 6 in FIG. 1 each have different two-dimensional codes.
  • the three-dimensional marker 6 includes a three-dimensional member 6a and a marker portion 6b arranged at an angle with respect to the mounting surface of the three-dimensional marker 6.
  • the mounting surface of the object 10 may be the same as the mounting surface of the three-dimensional marker 6. That is, the mounting surface of the three-dimensional marker 6 may be the upper surface 2A.
  • the above-mentioned angle may be an angle that is not parallel to the upper surface 2A, and may be an angle determined according to the shooting direction of each camera so that the marker portion 6b can be detected from the RGB image captured by each camera.
  • the three-dimensional member 6a is a three-dimensional member having a predetermined height with respect to the upper surface 2A, and has a shape for arranging the marker portion 6b at an angle with respect to the upper surface 2A.
  • the material of the three-dimensional member 6a can be made of resin, metal, wood, paper, or the like.
  • the three-dimensional member 6a may be a hollow box-like member or may be a solid member that is not hollow.
  • the three-dimensional member 6a is a polyhedron including six sides, and is a cubic box-shaped member in which all six sides have the same shape.
  • the marker portions 6b are provided on each of six surfaces (one top surface, four side surfaces, and one bottom surface) of the three-dimensional member 6a.
  • the marker portion 6b provided on the side surface of the three-dimensional member 6a is arranged at an angle (90° in this embodiment) with respect to the upper surface 2A.
  • The marker portion 6b provided on the bottom surface of the three-dimensional member 6a is arranged so that its surface is in contact with the upper surface 2A; however, since the support base 2 is transparent, the marker portion 6b on the bottom surface can be photographed by the lower camera 4 through the support base 2.
  • As shown in FIG. 2, the marker portion 6b uses a two-dimensional code such as an AR marker, which is an image of a fixed pattern that serves as a marker in an image-recognition-type AR (Augmented Reality) system.
  • Alternatively, a two-dimensional code such as a QR code (registered trademark) or another two-dimensional code may be used. This embodiment is described assuming that an AR marker is used for the marker portion 6b.
  • a separate marker ID is assigned to each surface of the marker portion 6b, and the marker ID can be recognized by a known image recognition technique.
  • In this embodiment, a planar AR marker is used. In general, an AR marker is used to calculate the distance, angle, and the like with respect to the camera from the shape of its image, such as the degree of distortion with which it is captured by the camera, or to display 3DCG or the like at the marker position based on the acquired information.
  • the marker section 6b is rectangular and includes a two-dimensional code (two-dimensional bit pattern) in two colors, black and white. However, the marker section 6b may include a one-dimensional code such as a barcode instead of the two-dimensional code.
  • the marker portions 6b provided on the six surfaces of the three-dimensional member 6a each have a different two-dimensional code. Therefore, the four three-dimensional markers 6 include a total of 24 types of AR markers with different bit patterns.
  • The marker portion 6b may be an AR marker printed on each surface of the three-dimensional member 6a, or may be a plate-like or sheet-like member on which an AR marker is formed and which is fixed to each surface of the three-dimensional member 6a with an adhesive member or the like. Note that the three-dimensional marker 6 may also be shaped using a 3D printer so that an AR marker appears on each surface.
  • Identification information of the marker section 6b is recorded in the two-dimensional code.
  • The identification information of the marker portion 6b may be, for example, an identification number indicating which marker portion of which three-dimensional marker it is, or may be a serial identification number of the marker portion 6b that does not distinguish between three-dimensional markers.
  • In each marker portion 6b, the bit pattern of the two-dimensional code and the orientation of the marker portion 6b are associated in advance, and the orientation of the bit pattern directly or indirectly indicates the orientation of the marker portion 6b. That is, as shown in FIG. 3, each marker portion 6b has corners 61 to 64 at its four corners that can be distinguished from each other and that represent its orientation. When the marker portion 6b is identified, each of the corners 61 to 64 is recognized as distinct; for example, the four corners of each marker portion 6b can be identified by mutually different identification numbers 1 to 4, as in "the corner with identification number 4 in the marker portion 6b with identification number 1."
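As one concrete illustration of the serial-numbering option described above, a serial marker ID can encode both the three-dimensional marker (cube) and the face. The scheme and names below are hypothetical, not taken from the patent:

```python
# Hypothetical serial-numbering scheme: with 4 cubes of 6 faces each,
# a serial ID 0..23 can encode (cube index, face index).

def encode_marker_id(cube: int, face: int, faces_per_cube: int = 6) -> int:
    """Map (cube, face) to a serial marker ID."""
    assert 0 <= face < faces_per_cube
    return cube * faces_per_cube + face

def decode_marker_id(marker_id: int, faces_per_cube: int = 6) -> tuple:
    """Recover (cube, face) from a serial marker ID."""
    return divmod(marker_id, faces_per_cube)

# A detected marker portion can then be represented by its serial ID plus
# its four corners in a fixed order (corners 61 to 64), which conveys its
# orientation; the pixel coordinates here are illustrative values.
detection = {"marker_id": encode_marker_id(2, 4),
             "corners_px": [(102.3, 80.1), (160.8, 82.4),
                            (158.9, 140.2), (100.5, 138.7)]}
```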
  • The two reference markers 6R are an example of two planar markers that are fixed to the upper surface 2A and are parallel to the upper surface 2A.
  • The reference marker 6R also has a marker portion with a two-dimensional code in which its own identification information is recorded, and, like the marker portion 6b of the three-dimensional marker 6, the marker portion of the reference marker 6R has four corners that indicate its orientation.
  • the reference marker 6R is also an AR marker, for example.
  • The two reference markers 6R may be AR markers printed at predetermined positions on the support base 2, or may be plate-like or sheet-like members on which AR markers are formed and which are fixed at predetermined positions on the support base 2 with an adhesive member or the like.
  • the two reference markers 6R are provided to define a world coordinate system used in three-dimensional reconstruction.
  • The distance between the two reference markers 6R is predetermined. This distance is used to determine the actual size of the object 10.
  • the upper surface 2A on which the two reference markers 6R are provided corresponds to the xy plane in the world coordinate system.
  • the center position between the two reference markers 6R becomes the origin of the world coordinate system.
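The world-frame construction described above (origin at the midpoint of the two reference markers, metric scale from their known distance, and the upper surface 2A as the xy plane) can be sketched as follows. The axis conventions chosen here are an assumption for illustration:

```python
import numpy as np

# Assumed construction of the world frame from the two reference markers:
# origin at their midpoint, x axis along the line joining them, z axis as
# the normal to the mounting surface, and the known physical distance
# fixing the metric scale.

def world_frame_from_reference_markers(p1, p2, known_distance_m):
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    origin = (p1 + p2) / 2.0                      # world origin (midpoint)
    measured = np.linalg.norm(p2 - p1)
    scale = known_distance_m / measured           # metric scale factor
    x_axis = (p2 - p1) / measured                 # x axis along the markers
    z_axis = np.array([0.0, 0.0, 1.0])            # normal to the surface 2A
    y_axis = np.cross(z_axis, x_axis)             # completes a right-handed frame
    return origin, scale, np.stack([x_axis, y_axis, z_axis])
```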
  • In this embodiment, the reference markers 6R, which are planar markers, are illustrated, but the reference markers may instead be two or more of the three-dimensional markers 6. That is, at least two of the plurality of three-dimensional markers 6 included in the three-dimensional reconstruction system 100 may be fixed to the upper surface 2A and used as reference markers separated by a predetermined distance.
  • the illumination 8A is arranged in the direction toward the upper surface 2A of the support base 2.
  • the illumination 8B is arranged toward the lower surface 2B of the support base 2.
  • the lights 8A and 8B each illuminate the object 10 with light.
  • The lights 8A and 8B are arranged according to the installation positions of the upper camera 3 and the lower camera 4 so that no shadow appears on the surface of the object 10, particularly in the images taken by the upper camera 3 and the lower camera 4.
  • the control device 9 controls the operation of the three-dimensional reconstruction system 100.
  • The control device 9 is configured to output reconstruction information of the object 10 based on a plurality of captured images of the object 10 captured from different directions by the upper camera 3A, the upper camera 3B, the lower camera 4A, and the lower camera 4B, and on posture information of each camera based on the images of the three-dimensional markers 6 included in the plurality of captured images. More specifically, the control device 9 can output reconstruction information of the object 10 using the RGB images and the depth images from the upper camera 3 and the lower camera 4, and the posture information of the upper camera 3 and the lower camera 4 based on the images of the three-dimensional markers 6 included in the RGB images.
  • control device 9 controls photographing of the object 10 by the upper camera 3 and the lower camera 4. Further, the control device 9 reconstructs the object 10 by generating a three-dimensional model of the object 10 based on the captured image of the object 10.
  • the control device 9 includes an imaging control section 91, an attitude calculation section 92, and a model generation section 93 as functions related to these.
  • The photographing control section 91 controls the rotating section 5, the upper camera 3, and the lower camera 4 so that the upper camera 3 and the lower camera 4 photograph the object 10 during rotation by the rotating section 5 and a plurality of photographed images (in this embodiment, RGB images and depth images) are obtained.
  • The attitude calculation unit 92 calculates the attitude information of the upper camera 3A, the upper camera 3B, the lower camera 4A, and the lower camera 4B based on the images of the three-dimensional markers 6 included in the plurality of images captured by the upper camera 3 and the lower camera 4.
  • The model generation unit 93 reconstructs the object 10 by generating a three-dimensional model of the object 10 based on a plurality of captured images captured by the upper camera 3 and the lower camera 4 and on the attitude information of the upper camera 3 and the lower camera 4 acquired by the attitude calculation unit 92.
  • the model generation unit 93 outputs the generated three-dimensional model of the object 10 as reconstruction information of the object 10.
  • the model generation unit 93 can reconstruct the object 10 by generating a model shape using a depth image and generating a texture using an RGB image.
  • the three-dimensional reconstruction method by the model generation unit 93 is not limited to this method using depth images.
  • The model generation unit 93 can also reconstruct the object 10 using so-called photogrammetry, or using a machine-learning-based three-dimensional reconstruction method that updates the three-dimensional information of the object 10 based on the error between an image (two-dimensional image) obtained by rendering the current three-dimensional information of the object 10 and a photographed image.
  • FIG. 4 is a flowchart showing an example of processing by the control device 9.
  • The control device 9 starts the process shown in FIG. 4 upon receiving, through its operation unit, an operation input from the user instructing the start of reconstruction.
  • the object 10 is placed on the upper surface 2A of the support base 2.
  • the mounting position of the object 10 is preferably the center of rotation of the support base 2.
  • In this embodiment, the upper surface 2A has a substantially circular shape and the three-dimensional markers 6 are arranged approximately concentrically with the outer edge of the upper surface 2A, so it is preferable that the object 10 be placed at the center of the upper surface 2A.
  • In step S41, the control device 9 causes the photographing control section 91 to drive the rotating section 5 and start rotating the support base 2.
  • In step S42, the control device 9 causes the photographing control unit 91 to operate the upper camera 3 and the lower camera 4 to photograph the rotating object 10 from both above and below.
  • Each of the upper camera 3 and the lower camera 4 acquires an RGB image and a depth image.
  • the photographing control unit 91 causes the upper camera 3 and the lower camera 4 to perform photographing at the same timing or at different timings while the object 10 is rotating. Thereby, the photographing control unit 91 can acquire RGB images of each camera in parallel when the object 10 is at a position of an arbitrary rotation angle.
  • the photographing control unit 91 also acquires depth images from each camera.
  • the depth image is represented by a group of points that are colored according to the distance from the camera to the surface of the object 10 at each pixel.
  • In step S42, the photographing control unit 91 continues the photographing operation and acquires images while the rotating unit 5 rotates the object 10 once.
  • each of the upper camera 3 and the lower camera 4 can photograph the object 10 from directions inclined at different angles with respect to the upper surface 2A, and from a plurality of directions along the rotation direction.
  • the upper camera 3 whose photographing range includes the upper surface of the three-dimensional marker 6 can photograph the marker portion 6b on the upper surface of the three-dimensional marker 6.
  • the photographing control unit 91 can simultaneously photograph the object 10 from different directions using a plurality of cameras.
  • the photographing control section 91 outputs the photographed image to each of the posture calculation section 92 and the model generation section 93.
  • In step S43, the control device 9 causes the photographing control section 91 to stop the rotating section 5 and stop the rotation of the support base 2.
  • In step S44, the control device 9 causes the posture calculation section 92 to estimate the posture information of each of the upper camera 3 and the lower camera 4 relative to the reference markers 6R, that is, in the world coordinate system, based on the images of the three-dimensional markers 6 (marker portions 6b) in the RGB images.
  • First, the posture calculation unit 92 detects the marker portions (the marker portions 6b of the three-dimensional markers 6 and the marker portions of the reference markers 6R) photographed by each camera. In this detection, the posture calculation section 92 specifies, for each photographed marker portion, the position of each of the four corners 61 to 64 in the camera coordinate system, as well as the identification number of the marker portion.
  • Next, the posture calculation unit 92 can calculate the relative attitude (relative position and relative rotation angle) of a camera with respect to a marker portion based on the positions, in the camera coordinate system, of the corners 61 to 64 of the marker portion photographed by that camera, the known size of the marker portion, and the internal parameters of the camera. That is, the attitude calculation unit 92 can calculate the relative attitude of the camera with respect to the marker portion based on how the marker portion appears.
  • Here, the positions of the corners 61 to 64 of each marker portion in the camera coordinate system are information representing how the marker portion appears.
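The relative-pose computation from the four corner positions, the known marker size, and the internal parameters can be sketched as a standard planar pose estimate: a DLT homography followed by the H = K [r1 r2 t] decomposition. The patent does not prescribe a specific algorithm, so this is one illustrative implementation:

```python
import numpy as np

# Illustrative planar pose estimation from the four marker corners (in a
# fixed order), the known physical marker size, and the intrinsics K.

def pose_from_square(corners_px, marker_size, K):
    s = marker_size / 2.0
    # Corner positions in the marker's own plane (z = 0), in corner order.
    obj = np.array([[-s, s], [s, s], [s, -s], [-s, -s]])
    rows = []
    for (X, Y), (u, v) in zip(obj, corners_px):
        rows.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
        rows.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, float))
    H = Vt[-1].reshape(3, 3)                      # homography, up to scale
    B = np.linalg.inv(K) @ H
    if B[2, 2] < 0:                               # marker must lie in front of the camera
        B = -B
    lam = 1.0 / np.linalg.norm(B[:, 0])
    r1, r2, t = lam * B[:, 0], lam * B[:, 1], lam * B[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt2 = np.linalg.svd(R)                  # re-orthonormalize the rotation
    return U @ Vt2, t                             # marker pose in the camera frame
```

The inverse of this transform gives the camera's relative attitude with respect to the marker portion.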
  • The attitude calculation unit 92 estimates posture information indicating the absolute attitudes of all the cameras based on the relative attitude of each camera with respect to the marker portions, using, for example, one reference marker 6R as a reference. An example of this calculation is explained below. First, the attitude calculation unit 92 regards the relative attitude of a camera (herein referred to as camera 1) that is photographing (detecting) the marker portion of the reference marker 6R as the absolute attitude of camera 1. This makes it possible to estimate the absolute attitude of camera 1 with the reference marker 6R as the reference.
  • Next, the posture calculation unit 92 identifies another camera (herein referred to as camera 2) that is photographing a marker portion 6b of a three-dimensional marker 6 also photographed by camera 1, and calculates the absolute posture of camera 2 with respect to the reference marker 6R based on the absolute posture of camera 1 and the relative postures of cameras 1 and 2 with respect to that marker portion 6b. By performing such calculations for all cameras, the pose calculation unit 92 can estimate the pose information indicating the absolute poses of all the cameras. Note that when a plurality of marker portions 6b are included in the RGB image taken by one camera, the attitude calculation section 92 may improve the accuracy of estimating the absolute attitude of the camera by estimating the absolute attitude that minimizes the error with respect to the relative attitudes to the individual marker portions 6b, that is, by optimizing the absolute attitude of the camera. The posture calculation section 92 outputs the estimated posture information to the model generation section 93.
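The chaining step described above (camera 1 anchored to the reference marker, camera 2 anchored through a shared marker portion 6b) can be written with 4x4 homogeneous transforms; the function and naming below are illustrative:

```python
import numpy as np

# T_cam_marker maps marker coordinates to camera coordinates. Camera 1 sees
# the reference marker (taken as the world frame); camera 2 sees a
# three-dimensional marker M that camera 1 also sees.

def to_hom(R, t):
    """Pack a rotation and translation into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def absolute_pose_via_shared_marker(T_c1_world, T_c1_M, T_c2_M):
    # Pose of marker M in the world: T_world_M = inv(T_c1_world) @ T_c1_M.
    # Absolute pose of camera 2:     T_c2_world = T_c2_M @ inv(T_world_M).
    T_world_M = np.linalg.inv(T_c1_world) @ T_c1_M
    return T_c2_M @ np.linalg.inv(T_world_M)
```

Repeating this for every camera that shares a marker portion with an already-anchored camera propagates absolute poses to all cameras.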
  • In step S45 (model generation step), the control device 9 causes the model generation unit 93 to generate a three-dimensional model of the object 10 by combining the depth images of the object 10 taken in each shooting direction, which were acquired in step S42, based on the posture information of each of the upper camera 3 and the lower camera 4 with respect to the reference markers 6R.
  • the attitude information of each camera here is information on the absolute attitude of each camera.
  • the model generation unit 93 can generate a three-dimensional model with a pattern or color by creating a three-dimensional mesh model based on the depth image, for example, and pasting a texture created based on the RGB image on the surface of the mesh model.
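A minimal sketch of the combination step, assuming pinhole intrinsics K and an absolute pose T_cam_world (world-to-camera) per camera, unprojects each depth pixel and maps it into the world frame; the meshing and texturing stages are not shown:

```python
import numpy as np

# Each depth pixel (u, v, d) is unprojected with the intrinsics K and
# mapped into world coordinates with the camera's absolute pose.

def depth_to_world_points(depth, K, T_cam_world):
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3)
    rays = pix @ np.linalg.inv(K).T              # unit-depth camera rays
    pts_cam = rays * depth.reshape(-1, 1)        # 3D points in the camera frame
    T_world_cam = np.linalg.inv(T_cam_world)
    pts = pts_cam @ T_world_cam[:3, :3].T + T_world_cam[:3, 3]
    return pts[depth.reshape(-1) > 0]            # drop invalid (zero) depths
```

Point clouds produced this way for every camera and rotation angle share one world frame and can be merged directly before meshing.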
  • In step S46, the control device 9 causes the model generation unit 93 to output the generated three-dimensional model as reconstruction information and store it in the ROM, the auxiliary storage device, or the like of the control device 9.
  • the model generation unit 93 can also output the reconstruction information to an external device such as a display device or a PC (Personal Computer).
  • control device 9 may collectively execute each process of steps S44 and S45 using the RGB image and depth image of the object 10 at each position in the rotational direction acquired in step S42. That is, the functions of the attitude calculation section 92 and the model generation section 93 in the control device 9 may be integrated.
  • As described above, the three-dimensional reconstruction system 100 includes the upper camera 3 and the lower camera 4 (cameras) that can photograph the object 10 placed on the upper surface 2A (placement surface) from different directions, and the three-dimensional markers 6 arranged around the object 10.
  • The three-dimensional reconstruction system 100 also has the control device 9 configured to output reconstruction information of the object 10 based on the RGB images and the depth images (a plurality of captured images) from the upper camera 3 and the lower camera 4, and on posture information of the upper camera 3 and the lower camera 4 based on the images of the three-dimensional markers 6 included in those RGB images and depth images.
  • the three-dimensional marker 6 has a marker portion 6b arranged at an angle with respect to the upper surface 2A.
  • the three-dimensional marker 6 has a three-dimensional member 6a having a predetermined height with respect to the upper surface 2A, and the marker portion 6b is provided on the three-dimensional member 6a.
  • Conventional three-dimensional reconstruction methods use planar markers parallel to the mounting surface; when such a marker is photographed by a camera at a shallow angle to the mounting surface, the marker image may not be identified in the photographed image, and, as a result, camera posture information may not be obtained.
  • Here, a shallow angle with respect to the mounting surface means, for example, an inclination angle of 15 degrees or less with respect to the mounting surface. If camera posture information cannot be obtained, the accuracy of three-dimensional reconstruction of the object decreases.
• in this embodiment, a photographed image of the marker portion 6b, which is arranged at an angle with respect to the upper surface 2A, is used. Therefore, even when the three-dimensional marker 6 is photographed at a shallow angle with respect to the upper surface 2A, the three-dimensional reconstruction system 100 can include an image of a marker portion 6b provided on one of the surfaces of the three-dimensional marker 6 in the photographed image, and camera pose information can be obtained. Thus, this embodiment provides a three-dimensional reconstruction system 100 in which the markers are easily photographed regardless of the photographing direction of each of the upper camera 3 and the lower camera 4.
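As an illustrative aside (not part of the disclosed embodiment), the geometric advantage of a tilted marker portion can be sketched numerically: the apparent (projected) area of a planar patch scales with the cosine of the angle between its surface normal and the viewing direction. A minimal Python sketch, assuming the camera and the face normal lie in the same vertical plane:

```python
import math

def foreshortening(face_tilt_deg: float, camera_elevation_deg: float) -> float:
    """Projected-area factor |n . v| for a marker face whose normal is tilted
    face_tilt_deg above the horizontal, viewed by a camera elevated
    camera_elevation_deg above the mounting surface (both in the same
    vertical plane, facing each other)."""
    t = math.radians(face_tilt_deg)
    e = math.radians(camera_elevation_deg)
    # face normal n = (cos t, 0, sin t); view direction v = (cos e, 0, sin e)
    return abs(math.cos(t) * math.cos(e) + math.sin(t) * math.sin(e))  # = |cos(t - e)|

# A planar marker lying on the surface has its normal pointing straight up (tilt 90 deg).
flat = foreshortening(90.0, 15.0)    # heavily foreshortened at a 15-degree camera
tilted = foreshortening(45.0, 15.0)  # a 45-degree face remains clearly visible
```

At a 15-degree camera elevation, a marker lying flat on the placement surface is foreshortened to roughly a quarter of its apparent area, while a face tilted 45 degrees toward the camera retains about 87 percent, which is consistent with why a marker portion arranged at an angle remains identifiable.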
  • the three-dimensional reconstruction system 100 only needs to have one or more cameras.
  • the three-dimensional member 6a is a cube having six planes (a polyhedron including a plurality of planes).
  • the marker portions 6b are provided on two or more of the six planes of this cube.
  • the marker section 6b has a two-dimensional code that records identification information of the marker section 6b.
  • camera attitude information can be acquired no matter which marker portion 6b included in the three-dimensional marker 6 is included in the captured image.
  • the marker section 6b may include a one-dimensional code instead of a two-dimensional code.
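The identification scheme can be illustrated with a small, hypothetical sketch (the 3x3 bit payloads and the dictionary below are invented for illustration and are not the actual codes used by the marker section 6b): a detected bit grid is matched against a dictionary of known patterns under all four in-plane rotations, so a marker's identity can be recovered regardless of the orientation in which it appears in the captured image.

```python
def rotate90(grid):
    """Rotate a square bit grid 90 degrees clockwise."""
    return tuple(zip(*grid[::-1]))

def identify(grid, dictionary):
    """Return the marker ID whose pattern matches the observed bit grid
    in any of its four in-plane rotations, or None if nothing matches."""
    g = tuple(tuple(row) for row in grid)
    for _ in range(4):
        for marker_id, pattern in dictionary.items():
            if g == pattern:
                return marker_id
        g = rotate90(g)
    return None

# Hypothetical 3x3 payloads for two marker portions (illustration only).
DICT = {
    0: ((1, 0, 1), (0, 1, 0), (1, 1, 0)),
    1: ((0, 1, 1), (1, 0, 0), (0, 0, 1)),
}

# DICT[0] as it would appear rotated 90 degrees clockwise in a photograph.
observed = ((1, 0, 1), (1, 1, 0), (0, 0, 1))
```

Calling `identify(observed, DICT)` recovers ID 0 even though the marker was seen rotated; an all-zero grid matches nothing and returns None.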
• the three-dimensional reconstruction system 100 has at least two reference markers 6R, which are planar markers fixed to the upper surface 2A and parallel to it.
  • the distance between the two reference markers 6R is predetermined.
  • the three-dimensional reconstruction system 100 can calibrate the size information of the marker section 6b based on the distance information between the two reference markers 6R included in the photographed image. As a result, camera posture information can be acquired accurately, so that the reconstruction accuracy by the three-dimensional reconstruction system 100 can be improved.
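One plausible way to use the known spacing of the reference markers 6R for size calibration (a sketch under assumed values, not the disclosed algorithm) is to compare the reconstructed distance between the two markers with their known physical distance and derive a metric scale factor:

```python
import math

def scale_from_reference(p1, p2, known_distance_m):
    """Scale factor that maps reconstruction units to metres, given the
    reconstructed positions p1 and p2 of the two reference markers and
    their known physical separation."""
    measured = math.dist(p1, p2)
    if measured == 0:
        raise ValueError("reference markers coincide")
    return known_distance_m / measured

# Hypothetical values: the two reference markers came out 2.5 units apart
# in the reconstruction, but are physically 0.25 m apart.
s = scale_from_reference((0.0, 0.0, 0.0), (2.5, 0.0, 0.0), 0.25)
marker_size = 2.0 * s  # a marker portion measuring 2.0 units is 0.20 m
```

The same factor can then rescale every marker and object dimension so that the camera posture information is expressed in consistent physical units.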
• instead of the reference marker 6R, which is a flat marker, one of the three-dimensional markers 6 may be used as the reference marker, or two three-dimensional markers 6 separated by a predetermined distance may be used as a pair of reference markers.
• when a flat marker is used as the reference marker, it can be formed on the support base 2 in advance by printing or the like, which is advantageous in that the reference marker can be provided easily.
• when a three-dimensional marker is used as the reference marker, it is advantageous in that the reference marker can be photographed regardless of the photographing direction of the camera.
  • FIG. 5 is a diagram illustrating the overall configuration of the three-dimensional reconstruction system 100a.
  • the three-dimensional reconstruction system 100a uses an opaque floor 11 as a mounting surface, and images an object 10 mounted on the floor 11 from different directions using upper cameras 3A and 3B.
  • the three-dimensional reconstruction system 100a includes the upper cameras 3A and 3B as a plurality of cameras each capable of photographing the object 10 from different directions.
• the control device 9 outputs reconstruction information of the object 10 based on the plurality of images taken by the upper cameras 3A and 3B and on the attitude information of each of the upper cameras 3A and 3B derived from the images of the three-dimensional marker 6 included in those images.
• the three-dimensional reconstruction system 100a uses a photographed image of a marker section 6b provided on a three-dimensional member 6a having a predetermined height with respect to the floor 11. For this reason, even when the three-dimensional marker 6 is photographed at a shallow angle with respect to the floor 11, the image of a marker section 6b provided on one of the surfaces of the three-dimensional marker 6 is captured in the photographed image, and camera pose information can be obtained. Thus, this modification provides a three-dimensional reconstruction system 100a that can acquire the posture information of each camera regardless of the shooting direction of each of the upper cameras 3A and 3B.
  • the object 10 can be three-dimensionally reconstructed with a simple configuration. Further, the three-dimensional reconstruction system 100a can three-dimensionally reconstruct the object 10 placed on an arbitrary mounting surface such as the floor 11, and can improve the usability of the three-dimensional reconstruction system 100a.
• each device of the three-dimensional reconstruction systems 100 and 100a in the embodiments described above may be configured with hardware, or may be configured by information processing of software (a program) executed by a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), or the like.
• when the information processing is configured by software, the software that realizes at least some of the functions of each device in the above-described embodiments may be stored in a non-transitory storage medium (non-transitory computer-readable medium) such as a CD-ROM (Compact Disc-Read Only Memory) or USB (Universal Serial Bus) memory, and the information processing by the software may be executed by reading the software into a computer. The software may also be downloaded via a communication network. Furthermore, the information processing by the software may be executed by hardware, with all or part of the software processing implemented in a circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
  • the storage medium that stores the software may be a removable one such as an optical disk, or a fixed storage medium such as a hard disk or memory. Further, the storage medium may be provided inside the computer (main storage device, auxiliary storage device, etc.) or may be provided outside the computer.
  • FIG. 6 is a block diagram showing an example of the hardware configuration of each device (three-dimensional reconstruction systems 100 and 100a) in the embodiment described above.
• Each device may be realized as, for example, a computer 7 that includes a processor 71, a main storage device 72 (memory), an auxiliary storage device 73 (memory), a network interface 74, and a device interface 75, which are connected via a bus 76.
• each device (three-dimensional reconstruction systems 100 and 100a) in the embodiments described above may be configured as a system in which one or more computers execute instructions stored in one or more storage devices to realize its functions. Alternatively, information transmitted from a terminal may be processed by one or more computers provided on the cloud, and the processing results may be sent back to the terminal.
• various operations of each device (three-dimensional reconstruction systems 100 and 100a) in the embodiments described above may be executed in parallel using one or more processors or using multiple computers connected via a network. Various calculations may also be distributed among multiple computing cores within a processor and executed in parallel. Some or all of the processing, means, and the like of the present disclosure may be realized by at least one of a processor and a storage device provided on a cloud capable of communicating with the computer 7 via a network. In this way, each device in the embodiments described above may take the form of parallel computing using one or more computers.
• the processor 71 may be an electronic circuit (processing circuit, processing circuitry, CPU, GPU, FPGA, ASIC, etc.) that performs at least one of computer control and computation. The processor 71 may be a general-purpose processor, a dedicated processing circuit designed to execute a specific operation, or a semiconductor device including both a general-purpose processor and a dedicated processing circuit. The processor 71 may also include an optical circuit or an arithmetic function based on quantum computing.
  • the processor 71 may perform calculation processing based on data and software input from each device in the internal configuration of the computer 7, and may output calculation results and control signals to each device.
  • the processor 71 may control each component constituting the computer 7 by executing the OS (Operating System) of the computer 7, applications, and the like.
  • Each device in the embodiments described above may be realized by one or more processors 71.
• the processor 71 may refer to one or more electronic circuits arranged on one chip, or to one or more electronic circuits arranged on two or more chips or two or more devices. When multiple electronic circuits are used, they may communicate by wire or wirelessly.
  • the main memory device 72 may store instructions and various data to be executed by the processor 71, and the information stored in the main memory device 72 may be read by the processor 71.
  • the auxiliary storage device 73 is a storage device other than the main storage device 72. Note that these storage devices are any electronic components capable of storing electronic information, and may be semiconductor memories. Semiconductor memory may be either volatile memory or nonvolatile memory.
• the storage device for storing various data in each device (three-dimensional reconstruction systems 100 and 100a) in the embodiments described above may be realized by the main storage device 72 or the auxiliary storage device 73, or by memory built into the processor 71.
  • Each of the devices (three-dimensional reconstruction systems 100 and 100a) in the embodiments described above includes at least one storage device (memory) and at least one processor connected to (coupled with) this at least one storage device.
  • at least one processor may be connected to one storage device.
  • at least one storage device may be connected to one processor.
  • the present invention may include a configuration in which at least one processor among the plurality of processors is connected to at least one storage device among the plurality of storage devices. Further, this configuration may be realized by a storage device and a processor included in a plurality of computers.
• a configuration in which the storage device is integrated with the processor (for example, a cache memory including an L1 cache and an L2 cache) may also be included.
  • the network interface 74 is an interface for connecting to the communication network 8 wirelessly or by wire. As the network interface 74, an appropriate interface such as one that complies with existing communication standards may be used. Information may be exchanged with an external device 9A connected via the communication network 8 through the network interface 74.
• the communication network 8 may be any one or a combination of a WAN (Wide Area Network), a LAN (Local Area Network), a PAN (Personal Area Network), and the like, over which information is exchanged between the computer 7 and the external device 9A. Examples of a WAN include the Internet, examples of a LAN include IEEE 802.11 and Ethernet (registered trademark), and examples of a PAN include Bluetooth (registered trademark) and NFC (Near Field Communication).
  • the device interface 75 is an interface such as a USB that is directly connected to the external device 9B.
  • the external device 9A is a device connected to the computer 7 via a network.
  • the external device 9B is a device directly connected to the computer 7.
  • the external device 9A or the external device 9B may be an input device, for example.
  • the input device is, for example, a camera, a microphone, a motion capture device, various sensors, a keyboard, a mouse, a touch panel, or other devices, and provides the acquired information to the computer 7.
  • the device may be a device including an input unit, a memory, and a processor, such as a personal computer, a tablet terminal, or a smartphone.
  • the external device 9A or the external device 9B may be an output device, for example.
  • the output device may be, for example, a display device such as an LCD (Liquid Crystal Display) or an organic EL (Electro Luminescence) panel, or may be a speaker that outputs audio or the like.
  • the device may be a device including an output unit, a memory, and a processor, such as a personal computer, a tablet terminal, or a smartphone.
  • the external device 9A or the external device 9B may be a storage device (memory).
  • the external device 9A may be a network storage or the like, and the external device 9B may be a storage such as an HDD.
  • the external device 9A or the external device 9B may be a device that has some functions of the components of each device (three-dimensional reconstruction systems 100 and 100a) in the above-described embodiments.
  • the computer 7 may transmit some or all of the processing results to the external device 9A or 9B, or may receive some or all of the processing results from the external device 9A or 9B. .
• when the expression "at least one of a, b, and c" or "at least one of a, b, or c" (including similar expressions) is used, it includes any of a, b, c, a-b, a-c, b-c, or a-b-c. It also covers cases with multiple instances of an element, such as a-a, a-b-b, or a-a-b-b-c-c. Furthermore, it includes adding elements other than the listed elements (a, b, and c), such as a-b-c-d with d added.
• the terms "connected" and "coupled" are intended as non-limiting terms that include direct connection/coupling, indirect connection/coupling, electrical connection/coupling, communicative connection/coupling, functional connection/coupling, physical connection/coupling, and the like. The terms should be interpreted as appropriate depending on the context in which they are used, but forms of connection/coupling that are intentionally or naturally excluded should be construed, in a limited manner, as not included in the terms.
• the expression "A configured to B," when used, means that the physical structure of element A has a configuration capable of performing operation B, and includes element A being permanently or temporarily set/configured to actually perform operation B.
• for example, when element A is a general-purpose processor, it is sufficient that the processor has a hardware configuration capable of executing operation B and is configured to actually execute operation B by a permanently or temporarily set program (instructions).
• when element A is a dedicated processor, a dedicated arithmetic circuit, or the like, it is sufficient that the circuit structure of the processor is implemented so as to actually execute operation B, regardless of whether control instructions and data are actually attached.
• the term "minimize" includes finding a global minimum, finding an approximation of a global minimum, finding a local minimum, and finding an approximation of a local minimum, and should be interpreted as appropriate depending on the context in which it is used; it also includes finding approximate values of these minimums probabilistically or heuristically. Similarly, terms such as "optimize" or "optimization" include finding a global optimum, finding an approximation of a global optimum, finding a local optimum, and finding an approximation of a local optimum, and should be interpreted as appropriate depending on the context; they also include finding approximate values of these optimums probabilistically or heuristically.
• when multiple pieces of hardware perform a predetermined process, the pieces of hardware may cooperate to perform the process, or some of the hardware may perform all of it. Some hardware may also perform part of a predetermined process while other hardware performs the rest.
• when expressions such as "one or more pieces of hardware perform a first process, and the one or more pieces of hardware perform a second process" (including similar expressions) are used, the hardware that performs the first process and the hardware that performs the second process may be the same or different. In other words, each of them only needs to be included in the one or more pieces of hardware.
  • the hardware may include an electronic circuit, a device including an electronic circuit, and the like.
• when multiple storage devices (memories) store data, each storage device among them may store only part of the data or may store the whole of the data. A configuration in which some of the multiple storage devices store the data may also be included.

Abstract

Provided is a three-dimensional reconfiguration system capable of easily photographing a marker irrespective of the photographing direction. This three-dimensional reconfiguration system comprises: one or more cameras that photograph an object; a stereoscopic marker; and a control device configured to output reconfiguration information on the object on the basis of a plurality of photographed images of the object photographed from various directions by the cameras and attitude information of the cameras based on an image of the stereoscopic marker included in the photographed images. The stereoscopic marker includes a marker portion that is disposed at an angle with respect to a mounting surface for the stereoscopic marker.

Description

Three-dimensional reconstruction system and three-dimensional reconstruction method
 The present disclosure relates to a three-dimensional reconstruction system and a three-dimensional reconstruction method.
 Conventionally, three-dimensional reconstruction systems are known that generate a three-dimensional model of an object based on a plurality of images obtained by photographing the object from different directions with a camera. Such three-dimensional reconstruction systems are used in 3DCG (3 Dimensional Computer Graphics) and the like.
 A method is also disclosed in which camera posture information is acquired based on an image, taken by a camera, of a marker provided on the mounting surface of an object, and the information is used for three-dimensional reconstruction.
International Publication No. 2020/075768
 However, with conventional three-dimensional reconstruction methods, when a marker is photographed by a camera at a shallow angle with respect to the mounting surface, the marker image may not be identifiable in the photographed image, and as a result, the posture information of the camera may not be obtained. If the camera posture information cannot be obtained, the accuracy of three-dimensional reconstruction of the object is reduced.
 An object of the present disclosure is to provide a three-dimensional reconstruction system in which markers are easily photographed regardless of the photographing direction of the camera.
 A three-dimensional reconstruction system according to one aspect of an embodiment of the present disclosure includes: one or more cameras for photographing an object; a three-dimensional marker; and a control device configured to output reconstruction information of the object based on a plurality of captured images of the object photographed from different directions by the one or more cameras and on posture information of the cameras based on images of the three-dimensional marker included in the plurality of captured images, wherein the three-dimensional marker has a marker portion arranged at an angle with respect to the mounting surface of the three-dimensional marker.
 According to the present disclosure, it is possible to provide a three-dimensional reconstruction system in which markers are easily photographed.
FIG. 1 is a diagram illustrating the overall configuration of a three-dimensional reconstruction system according to an embodiment.
FIG. 2 is a diagram illustrating a two-dimensional code formed on a three-dimensional marker according to the embodiment.
FIG. 3 is a diagram illustrating corners of a marker portion according to the embodiment.
FIG. 4 is a flow diagram showing an example of processing by a control device according to the embodiment.
FIG. 5 is a diagram illustrating the overall configuration of a three-dimensional reconstruction system according to a modified example.
FIG. 6 is a block diagram showing an example hardware configuration of the control device according to the embodiment.
 Hereinafter, embodiments will be described in detail with reference to the accompanying drawings. To facilitate understanding of the description, the same components in each drawing are given the same reference numerals as much as possible, and overlapping descriptions are omitted as appropriate.
 In each figure, the x direction, y direction, and z direction are mutually perpendicular. The z direction is the normal direction of the mounting surface, typically the vertical direction. The positive z side is referred to as the upper side, and the negative z side as the lower side. The x direction and the y direction are the directions in which the mounting surface extends, typically horizontal directions.
<Configuration example of three-dimensional reconstruction system 100>
 The configuration of the three-dimensional reconstruction system 100 according to the first embodiment will be described with reference to FIGS. 1 to 3. FIG. 1 is a diagram illustrating the overall configuration of the three-dimensional reconstruction system 100. FIG. 2 is a diagram illustrating a two-dimensional code formed on the three-dimensional marker 6 included in the three-dimensional reconstruction system 100. FIG. 3 is a diagram illustrating the corners 61 to 64 of the marker portion 6b of the three-dimensional marker 6.
 As shown in FIG. 1, the three-dimensional reconstruction system 100 includes a support base 2, upper cameras 3A and 3B, lower cameras 4A and 4B, a rotating section 5, four three-dimensional markers 6, two reference markers 6R, lights 8A and 8B, and a control device 9.
 The three-dimensional reconstruction system 100 generates a three-dimensional model of an object based on a plurality of images obtained by photographing the object 10 from different directions with the upper camera 3A, the upper camera 3B, the lower camera 4A, and the lower camera 4B. Examples of the object 10 include a container-shaped object such as the cup illustrated in FIG. 1, as well as an object of any shape having a size that allows it to be placed inside the three-dimensional markers 6 on the support base 2 in top view.
 In this specification, "object" includes both an object that actually exists and is the target of three-dimensional reconstruction, and the reconstruction of that object based on a plurality of images taken of it. Whether "object" refers to an actually existing object or to a reconstruction can be appropriately distinguished depending on the context. For example, when referring to the handling of an object in real space, such as "placing an object on the support base," "object" means an actually existing object. On the other hand, when referring to the handling of an object in virtual space, such as "storing an object" or "reconstructing an object," "object" means a reconstruction of an actually existing object.
 The support base 2 is an example of a support member that supports the object 10. The support base 2 includes an upper surface 2A. In this embodiment, the support base 2 is a pedestal on which the object 10 is placed and which rotates the placed object 10. The support base 2 is a transparent plate, for example an acrylic plate. The upper surface 2A corresponds to the placement surface on which the object 10 is placed. Since the support base 2 is a transparent plate, the object 10 and the three-dimensional markers 6 placed on the upper surface 2A can also be photographed from the lower surface 2B, the back side of the upper surface 2A, as shown in FIG. 1. By using a transparent plate for the support base 2, the three-dimensional reconstruction system 100 can photograph the object 10 from all directions, and can therefore obtain three-dimensional reconstruction information of the object 10 without missing information.
 The upper camera 3A, the upper camera 3B, the lower camera 4A, and the lower camera 4B are an example of one or more cameras for photographing an object. Each of them can photograph the object 10 placed on the upper surface 2A from a different direction. The upper cameras 3A and 3B are installed on the upper surface 2A side of the support base 2 (above the placement surface) and are directed at the object 10 from diagonally above the upper surface 2A so as to photograph it. The upper cameras 3A and 3B are installed at different inclination angles. The lower cameras 4A and 4B are installed on the lower surface 2B side of the support base 2 (below the placement surface) and are directed at the object 10 from diagonally below the upper surface 2A so as to photograph it. The lower cameras 4A and 4B are installed at different inclination angles.
 Each of the upper camera 3A, the upper camera 3B, the lower camera 4A, and the lower camera 4B is, for example, an RGB-D camera. An RGB-D camera can capture an RGB image containing the R (Red), G (Green), and B (Blue) colors of the object 10, as well as a depth image. The depth image is an image containing depth information (distance) to the object 10. Hereinafter, the upper cameras 3A and 3B are collectively referred to as the "upper camera 3," and the lower cameras 4A and 4B as the "lower camera 4." Note that the positions of the upper camera 3 and the lower camera 4 may be rotated.
 In this embodiment, "camera" means an element capable of capturing an RGB image or a depth image of the object 10. The term encompasses an entire camera device such as an RGB-D camera or a depth camera, sensors such as a CMOS sensor or a depth sensor built into a camera device, and a sensor used alone.
 Each of the upper camera 3A, the upper camera 3B, the lower camera 4A, and the lower camera 4B has internal parameters and external parameters. The internal parameters include information related to the distortion of the lens of each camera. In this embodiment, the internal parameters are known based on simulation results and the like. The external parameters include the posture information of each camera. The posture of a camera includes its relative posture with respect to the three-dimensional marker 6 and its absolute posture in the world coordinate system. The posture of a camera also corresponds to the inclination of the optical axis of the optical system, such as the lens, included in the camera. The external parameters are acquired using the three-dimensional markers 6.
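The relationship between the relative and absolute postures mentioned above can be sketched as a composition of homogeneous transforms (an illustrative sketch, not the disclosed algorithm): if the marker's pose in the world frame is known and the camera's pose relative to the marker has been estimated from a captured image, the camera's absolute pose follows by matrix multiplication. The poses below are hypothetical values chosen for illustration.

```python
def matmul4(a, b):
    """Multiply two 4x4 homogeneous transforms given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# Hypothetical poses: the marker sits 0.3 m along x in the world frame,
# and the camera was estimated to be 0.5 m along z in the marker frame.
T_world_marker = [[1, 0, 0, 0.3],
                  [0, 1, 0, 0.0],
                  [0, 0, 1, 0.0],
                  [0, 0, 0, 1.0]]
T_marker_camera = [[1, 0, 0, 0.0],
                   [0, 1, 0, 0.0],
                   [0, 0, 1, 0.5],
                   [0, 0, 0, 1.0]]

# Absolute camera pose in the world frame: T_world_camera = T_world_marker . T_marker_camera
T_world_camera = matmul4(T_world_marker, T_marker_camera)
```

With identity rotations, the resulting translation is simply the sum of the two offsets, i.e. the camera ends up at (0.3, 0.0, 0.5) in the world frame.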
 The rotating section 5 is an example of a rotation mechanism that rotates the object 10 by rotating the support base 2. The rotating section 5 rotates the support base 2 in the direction of the arrow 50, for example. Each of the upper camera 3A, the upper camera 3B, the lower camera 4A, and the lower camera 4B photographs the object 10 at each of a plurality of rotation angles produced by the rotating section 5.
 A known power transmission system can be used for the mechanism of the rotating section 5. For example, the rotating section 5 includes a motor and a gear mechanism. The rotating section 5 may be configured such that the driving force of the motor is transmitted to the rotation shaft of the support base 2 via the gear mechanism. Alternatively, the rotating section 5 may be configured to apply a driving force to the outer edge of the support base 2 to rotate it.
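For illustration only (the number of stops and the marker position below are assumed, not taken from the disclosure), photographing at evenly spaced rotation angles can be sketched as follows; because the three-dimensional markers 6 rest on the rotating support base, each stop also rotates the markers about the z axis:

```python
import math

def turntable_angles(n_stops):
    """Evenly spaced turntable angles (degrees) covering one full revolution."""
    return [i * 360.0 / n_stops for i in range(n_stops)]

def rotate_xy(point, angle_deg):
    """Position of a point on the turntable after rotating about the z axis."""
    a = math.radians(angle_deg)
    x, y, z = point
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a),
            z)

angles = turntable_angles(8)                    # e.g. photograph every 45 degrees
moved = rotate_xy((0.1, 0.0, 0.0), angles[2])   # marker position after a 90-degree step
```

Each stop therefore yields a new relative pose between the cameras and the markers, which is what allows the object 10 to be covered from many directions with fixed cameras.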
 The three-dimensional markers 6 are arranged around the object 10 on the upper surface 2A. A three-dimensional marker 6 is a sign used to recognize the relative position between the object 10 and each of the upper cameras 3A and 3B and the lower cameras 4A and 4B. The three-dimensional markers 6 are placed on, but not fixed to, the upper surface 2A, so their position and orientation with respect to the upper surface 2A are variable. However, because the three-dimensional markers 6 are not moved while the support base 2 rotates so that the upper cameras 3 and the lower cameras 4 can photograph the object 10, their position and orientation with respect to the upper surface 2A remain unchanged during photographing.
 There is no particular limit on the number of three-dimensional markers 6. However, if the number is small, a three-dimensional marker 6 may be hidden behind the object 10 during photographing and the posture information of the upper cameras 3 and the lower cameras 4 may not be acquired properly, so a larger number is preferable. For this reason, it is preferable that a plurality of three-dimensional markers 6, including two markers placed in front of and behind the object 10 in the photographing direction of an upper camera 3 or a lower camera 4, be arranged around the object 10. In this embodiment, the number of three-dimensional markers 6 is four, as an example. Note that the three-dimensional markers 6 in FIG. 1 each carry different two-dimensional codes.
 A three-dimensional marker 6 includes a three-dimensional member 6a and marker portions 6b arranged at an angle with respect to the mounting surface of the three-dimensional marker 6. In this embodiment, the mounting surface of the object 10 may be the same as that of the three-dimensional markers 6; that is, the mounting surface of the three-dimensional markers 6 may be the upper surface 2A. The above angle is any angle at which the marker portion 6b is not parallel to the upper surface 2A, and may be chosen according to the photographing direction of each camera so that the marker portion 6b can be detected in the RGB images captured by the cameras. The three-dimensional member 6a is a solid member with a predetermined height above the upper surface 2A, shaped so as to hold the marker portions 6b at an angle to the upper surface 2A. There is no particular restriction on the material of the three-dimensional member 6a; it can be made of resin, metal, wood, paper, or the like. The three-dimensional member 6a may be a hollow box-shaped member or a non-hollow solid member. In this embodiment, as an example, the three-dimensional member 6a is a polyhedron with six faces, namely a cubic box-shaped member whose six faces all have the same shape.
 Marker portions 6b are provided on each of the six faces (one top face, four side faces, and one bottom face) of the three-dimensional member 6a. A marker portion 6b on a side face of the three-dimensional member 6a is arranged at an angle (90° in this embodiment) to the upper surface 2A. The marker portion 6b on the bottom face of the three-dimensional member 6a is arranged with its face in contact with the upper surface 2A; since the support base 2 is transparent, this bottom marker portion 6b can be photographed through the support base 2 by a lower camera 4. As shown in FIG. 2, the marker portion 6b may use a two-dimensional code such as an AR marker, which is a fixed-pattern image serving as a sign in an image-recognition AR (Augmented Reality) system, a two-dimensional code such as a QR code (registered trademark), or another two-dimensional code. This embodiment is described assuming that AR markers are used for the marker portions 6b.
 A separate marker ID is assigned to the marker portion 6b on each face, and each marker ID can be recognized by known image recognition techniques. Planar AR markers are used. An AR marker is used to calculate the distance, angle, and so on with respect to the camera from its shape in the image, such as its degree of distortion. Alternatively, an AR marker is used to display 3DCG or the like at the marker position based on the acquired information.
 A marker portion 6b is rectangular and includes a two-dimensional code (a two-dimensional bit pattern) in two colors, black and white. However, the marker portion 6b may include a one-dimensional code such as a barcode instead of a two-dimensional code. Among the four three-dimensional markers 6 arranged around the object 10, the marker portions 6b provided on the six faces of each three-dimensional member 6a each carry a different two-dimensional code. The four three-dimensional markers 6 therefore include a total of 24 AR markers with mutually different bit patterns.
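Because markers may be photographed in arbitrary orientations, the 24 bit patterns must remain distinguishable even after rotation, i.e., no pattern should be a 90°-rotation of another. A minimal sketch of such a uniqueness check follows; the 3x3 patterns are hypothetical (real AR marker dictionaries use larger grids and also account for bit errors).

```python
def rotate90(bits):
    """Rotate a square 0/1 bit pattern 90 degrees clockwise."""
    n = len(bits)
    return [[bits[n - 1 - r][c] for r in range(n)] for c in range(n)]

def canonical(bits):
    """Canonical form: the lexicographically smallest of the pattern's
    four 90-degree rotations, as a tuple of tuples."""
    forms, cur = [], bits
    for _ in range(4):
        forms.append(tuple(tuple(row) for row in cur))
        cur = rotate90(cur)
    return min(forms)

def all_distinct(patterns):
    """True if no two patterns can be confused under rotation."""
    canon = [canonical(p) for p in patterns]
    return len(set(canon)) == len(canon)

# Hypothetical patterns: b is a rotation of a, so a detector could confuse them.
a = [[1, 0, 0], [0, 1, 0], [0, 0, 0]]
b = rotate90(a)
c = [[1, 1, 0], [0, 1, 0], [0, 0, 1]]
```

A dictionary of marker patterns would be validated with `all_distinct` before printing the markers.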
 The marker portions 6b may be AR markers printed directly on the faces of the three-dimensional member 6a, or plate-like or sheet-like members bearing AR markers may be fixed to the faces of the three-dimensional member 6a with an adhesive member or the like. Alternatively, the three-dimensional marker 6 may be fabricated with a 3D printer so that an AR marker appears on each face.
 The two-dimensional code records identification information of the marker portion 6b. The identification information may be, for example, an identification number indicating which marker portion of which three-dimensional marker it is, or a serial identification number of the marker portion 6b that does not distinguish between three-dimensional markers.
 In each marker portion 6b, the bit pattern of the two-dimensional code is associated in advance with the orientation of the marker portion 6b, so the orientation of the bit pattern directly or indirectly indicates the orientation of the marker portion. That is, as shown in FIG. 3, each marker portion 6b has corners 61 to 64 at its four corners that can be recognized as distinct from one another and that represent its orientation. When a marker portion 6b is identified, each of the corners 61 to 64 is recognized as different from the others. For example, the four corners of each marker portion 6b can be identified by mutually different identification numbers 1 to 4, as in "the corner with identification number 4 of the marker portion 6b with identification number 1".
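As a hedged sketch of how the bit pattern can indicate orientation: matching the observed pattern against the four 90° rotations of the stored reference pattern yields a rotation index, from which the detected corners can be relabeled consistently as corners 61 to 64 in every image. The patterns below are hypothetical.

```python
def rotate90(bits):
    """Rotate a square 0/1 bit pattern 90 degrees clockwise."""
    n = len(bits)
    return [[bits[n - 1 - r][c] for r in range(n)] for c in range(n)]

def find_rotation(reference, observed):
    """Return k in 0..3 such that rotating `reference` clockwise k times
    yields `observed`, or None if the patterns do not match at all."""
    cur = reference
    for k in range(4):
        if cur == observed:
            return k
        cur = rotate90(cur)
    return None

# Hypothetical reference pattern and an observation rotated 90 deg clockwise.
ref = [[1, 0, 0], [0, 1, 1], [0, 0, 0]]
obs = rotate90(ref)
k = find_rotation(ref, obs)
# Knowing k, detected corner i corresponds to canonical corner (i + k) % 4,
# so corners 61-64 can be labeled consistently regardless of marker orientation.
```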
 The two reference markers 6R are an example of two planar markers fixed to the upper surface 2A and lying flat, parallel to the upper surface 2A. Each reference marker 6R also has a marker portion with a two-dimensional code recording its own identification information, and like the marker portions 6b of the three-dimensional markers 6, the marker portion of a reference marker 6R has four corners that indicate its orientation. The reference markers 6R are also, for example, AR markers. The two reference markers 6R may be AR markers printed at predetermined positions on the support base 2, or plate-like or sheet-like members bearing AR markers fixed to the support base 2 with an adhesive member or the like.
 The two reference markers 6R are provided to define the world coordinate system used for three-dimensional reconstruction. The distance between the two reference markers 6R is predetermined and is used to determine the length of the object 10. The upper surface 2A on which the two reference markers 6R are provided corresponds to the xy plane of the world coordinate system, and the midpoint between the two reference markers 6R is the origin of the world coordinate system.
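These conventions can be sketched as follows, assuming the two reference markers have been located at (hypothetical) coordinates in an as-yet unscaled reconstruction frame and their physical separation is known; the axis orientation of the world frame is omitted for brevity.

```python
import math

def calibrate_scale_and_origin(p1, p2, known_distance):
    """Return (scale, origin): `scale` maps reconstruction units to
    physical units given two reference points a `known_distance` apart;
    `origin` is their midpoint, taken as the world-frame origin."""
    measured = math.dist(p1, p2)
    scale = known_distance / measured
    origin = tuple((a + b) / 2 for a, b in zip(p1, p2))
    return scale, origin

def to_world(p, scale, origin):
    """Express point p in the world frame defined above (no rotation here)."""
    return tuple((a - o) * scale for a, o in zip(p, origin))

# Hypothetical detections: the two reference markers found at these unscaled
# coordinates, known to be 0.30 m apart physically.
scale, origin = calibrate_scale_and_origin(
    (1.0, 2.0, 0.0), (4.0, 6.0, 0.0), 0.30)
p = to_world((4.0, 6.0, 0.0), scale, origin)  # second marker in world frame
```

Each reference marker then lies half the known distance from the world origin, which is a handy consistency check.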
 Although this embodiment illustrates planar reference markers 6R, the reference markers may instead be two or more of the three-dimensional markers 6. That is, at least two of the plurality of three-dimensional markers 6 in the three-dimensional reconstruction system 100 may be fixed to the upper surface 2A and used as reference markers with a predetermined distance between them.
 The light 8A is arranged on the upper surface 2A side of the support base 2, and the light 8B on the lower surface 2B side. The lights 8A and 8B each illuminate the object 10. They are arranged according to the installation positions of the upper cameras 3 and the lower cameras 4 so that, in particular, shadows on the surface of the object 10 are eliminated in the images captured by those cameras.
 The control device 9 controls the operation of the three-dimensional reconstruction system 100. In this embodiment, the control device 9 is configured to output reconstruction information of the object 10 based on a plurality of images of the object 10 captured from different directions by the upper cameras 3A and 3B and the lower cameras 4A and 4B, and on the posture information of each camera derived from the images of the three-dimensional markers 6 contained in those captured images. The control device 9 can also output reconstruction information of the object 10 based on the RGB images and depth images from the upper cameras 3 and the lower cameras 4 and on the posture information of those cameras derived from the images of the three-dimensional markers 6 contained in the RGB images.
 Specifically, the control device 9 controls the photographing of the object 10 by the upper cameras 3 and the lower cameras 4, and reconstructs the object 10 by generating a three-dimensional model of it from the captured images. For these functions, the control device 9 has a photographing control unit 91, a posture calculation unit 92, and a model generation unit 93.
 The photographing control unit 91 controls the operations of the rotating unit 5, the upper cameras 3, and the lower cameras 4 so that the upper cameras 3 and the lower cameras 4 photograph the object 10 during rotation by the rotating unit 5 and acquire a plurality of captured images (in this embodiment, RGB images and depth images).
 The posture calculation unit 92 calculates the posture information of the upper cameras 3A and 3B and the lower cameras 4A and 4B based on the images of the three-dimensional markers 6 contained in the images captured by the upper cameras 3 and the lower cameras 4.
 The model generation unit 93 reconstructs the object 10 by generating a three-dimensional model of it based on the plurality of images captured by the upper cameras 3 and the lower cameras 4 and on the camera posture information obtained by the posture calculation unit 92. The model generation unit 93 outputs the generated three-dimensional model as the reconstruction information of the object 10.
 For example, the model generation unit 93 can reconstruct the object 10 by generating the model shape from the depth images and the texture from the RGB images. However, the three-dimensional reconstruction method used by the model generation unit 93 is not limited to this depth-image-based method. For example, the model generation unit 93 can also reconstruct the object 10 using so-called photogrammetry, or using a machine-learning-based reconstruction method that updates the object's current three-dimensional information based on the error between an image (a two-dimensional image) obtained by rendering that information and a captured image.
 <Example of processing by the control device 9>
 FIG. 4 is a flowchart showing an example of processing by the control device 9. For example, the control device 9 starts the processing of FIG. 4 when it receives a reconstruction-start instruction from the user via its operation unit. Before the processing of FIG. 4 starts, the object 10 is placed on the upper surface 2A of the support base 2. The object 10 is preferably placed at the center of rotation of the support base 2. In this embodiment, as shown in FIG. 1, the upper surface 2A is roughly circular and the three-dimensional markers 6 are arranged roughly concentrically with its outer edge, so the object 10 is preferably placed at the center of the upper surface 2A.
 First, in step S41, the control device 9 causes the photographing control unit 91 to drive the rotating unit 5 so that the support base 2 starts rotating.
 Next, in step S42 (photographing step), the control device 9 causes the photographing control unit 91 to operate the upper cameras 3 and the lower cameras 4 to photograph the rotating object 10 from above and below, with each camera acquiring an RGB image and a depth image. The photographing control unit 91 has the upper cameras 3 and the lower cameras 4 photograph at the same timing or at different timings while the object 10 rotates. The photographing control unit 91 can thereby acquire, in parallel, RGB images from each camera while the object 10 is at an arbitrary rotation angle, together with each camera's depth image. A depth image is represented as a point cloud colored according to the distance from the camera to the surface of the object 10 at each pixel.
 In step S42, the photographing control unit 91 continues the photographing operation while the rotating unit 5 rotates the object 10 through one full turn. Each of the upper cameras 3 and the lower cameras 4 can thus photograph the object 10 from directions inclined at different angles to the upper surface 2A and from a plurality of directions along the rotation. An upper camera 3 whose field of view includes the top face of a three-dimensional marker 6 can photograph the marker portion 6b on that top face. Furthermore, because three-dimensional markers 6 are used, even when the photographing directions of an upper camera 3 and a lower camera 4 make shallow angles with the upper surface 2A, both cameras can photograph the same marker portion 6b arranged on a side face of a three-dimensional marker 6 at an angle to the upper surface 2A. A lower camera 4 can also photograph the marker portion 6b on the bottom face of a three-dimensional marker 6 through the transparent support base 2. In this way, each camera can acquire, for each photographing direction, an RGB image containing an image of a marker portion 6b, together with a depth image. In other words, the photographing control unit 91 can photograph the object 10 from different directions simultaneously using the plurality of cameras. The photographing control unit 91 outputs the captured images to the posture calculation unit 92 and the model generation unit 93.
 Next, in step S43, the control device 9 causes the photographing control unit 91 to stop the rotating unit 5, stopping the rotation of the support base 2.
 Next, in step S44 (posture information estimation step), the control device 9 causes the posture calculation unit 92 to estimate the posture information of each of the upper cameras 3 and the lower cameras 4 with respect to the reference markers 6R, i.e., in the world coordinate system, based on the images of the three-dimensional markers 6 (marker portions 6b) in the RGB images.
 The procedure for estimating each camera's posture information in step S44 is as follows. The posture calculation unit 92 detects the marker portions photographed by each camera (the marker portions 6b of the three-dimensional markers 6 and the marker portions of the reference markers 6R). In this detection, the posture calculation unit 92 determines, for each photographed marker portion, the position of each of the four corners 61 to 64 in the camera coordinate system, and also identifies the identification number of the photographed marker portion.
 The posture calculation unit 92 can calculate the relative posture (relative position and relative rotation angle) of a camera with respect to a marker portion based on the positions of that marker portion's corners 61 to 64 in the camera coordinate system, the known size of the marker portion, and the camera's internal parameters. In other words, the posture calculation unit 92 can calculate the camera's relative posture from the way the marker portion appears; the positions of the corners 61 to 64 of each marker portion in the camera coordinate system are the information representing that appearance.
 The posture calculation unit 92 estimates posture information indicating the absolute postures of all the cameras by calculating each camera's posture (absolute posture) from its relative postures with respect to the marker portions, using, for example, one reference marker 6R as the reference. One example of this calculation is as follows. First, the posture calculation unit 92 regards the relative posture of a camera that photographs (detects) the marker portion of the reference marker 6R (call it camera 1) as camera 1's absolute posture; this gives camera 1's absolute posture with respect to the reference marker 6R. Next, the posture calculation unit 92 identifies another camera (call it camera 2) that photographs a marker portion 6b of a three-dimensional marker 6 also photographed by camera 1, and calculates camera 2's absolute posture with respect to the reference marker 6R from camera 1's absolute posture and the relative postures of cameras 1 and 2 with respect to that marker portion 6b. By performing such calculations for all cameras, the posture calculation unit 92 can estimate posture information indicating the absolute postures of all the cameras. Note that when the RGB image captured by one camera contains a plurality of marker portions 6b, the posture calculation unit 92 may improve the estimation accuracy by estimating, as that camera's absolute posture, the absolute posture that minimizes the error with respect to the relative postures for the individual marker portions 6b, i.e., by optimizing the camera's absolute posture. The posture calculation unit 92 outputs the estimated posture information to the model generation unit 93.
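The chaining step above can be sketched with 4x4 rigid transforms. Writing T_a_b for the pose of frame b expressed in frame a, if camera 1's absolute pose T_w_c1 is known and both cameras see the same marker m, then T_w_c2 = T_w_c1 · T_c1_m · inv(T_c2_m). This is a generic illustration with hypothetical poses, not code from the publication.

```python
import math

def mat_mul(a, b):
    """Multiply two 4x4 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def inv_rigid(t):
    """Invert a 4x4 rigid transform [R t; 0 1]: inverse is [R^T  -R^T t]."""
    r = [[t[j][i] for j in range(3)] for i in range(3)]  # R^T
    p = [-sum(r[i][j] * t[j][3] for j in range(3)) for i in range(3)]
    return [r[0] + [p[0]], r[1] + [p[1]], r[2] + [p[2]], [0.0, 0.0, 0.0, 1.0]]

def rot_z(deg, x=0.0, y=0.0, z=0.0):
    """Rigid transform: rotation about z by `deg`, plus translation (x, y, z)."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, -s, 0.0, x], [s, c, 0.0, y], [0.0, 0.0, 1.0, z],
            [0.0, 0.0, 0.0, 1.0]]

# Hypothetical poses. T_ab reads "frame b expressed in frame a".
T_w_c1 = rot_z(0.0, 0.0, 0.0, 1.0)     # camera 1, fixed via a reference marker
T_c1_m = rot_z(30.0, 0.2, 0.0, 0.5)    # shared marker as seen by camera 1
T_c2_m = rot_z(-60.0, -0.1, 0.3, 0.4)  # the same marker as seen by camera 2

# Chain: T_w_c2 = T_w_c1 * T_c1_m * inv(T_c2_m)
T_w_c2 = mat_mul(mat_mul(T_w_c1, T_c1_m), inv_rigid(T_c2_m))
```

A quick consistency check is that the marker's world pose computed through either camera must agree.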
 Next, in step S45 (model generation step), the control device 9 causes the model generation unit 93 to combine the depth images of the object 10 acquired in step S42 for the individual photographing directions, based on the posture information of each of the upper cameras 3 and the lower cameras 4 with respect to the reference markers 6R, and thereby generate a three-dimensional model of the object 10. The camera posture information used here is each camera's absolute posture. For example, the model generation unit 93 can create a three-dimensional mesh model from the depth images and paste onto its surface a texture created from the RGB images, generating a three-dimensional model with patterns and colors.
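Combining the per-view depth images presupposes mapping every depth pixel into the common world frame: each pixel is back-projected through the camera intrinsics into the camera frame and then transformed by that camera's absolute pose. A minimal sketch with hypothetical intrinsics and pose:

```python
def backproject(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) at depth `depth` (measured along the
    optical axis) into camera coordinates using pinhole intrinsics."""
    return ((u - cx) * depth / fx, (v - cy) * depth / fy, depth)

def cam_to_world(p, pose):
    """Map a camera-frame point to the world frame using a 4x4 pose
    matrix T_world_camera (rotation in the top-left 3x3, translation in
    the last column)."""
    return tuple(sum(pose[i][j] * p[j] for j in range(3)) + pose[i][3]
                 for i in range(3))

# Hypothetical intrinsics and an axis-aligned pose 1 m above the world origin.
fx = fy = 800.0
cx, cy = 320.0, 240.0
pose = [[1.0, 0.0, 0.0, 0.0],
        [0.0, 1.0, 0.0, 0.0],
        [0.0, 0.0, 1.0, 1.0],
        [0.0, 0.0, 0.0, 1.0]]

p_cam = backproject(400.0, 240.0, 2.0, fx, fy, cx, cy)
p_world = cam_to_world(p_cam, pose)
```

Running this over all pixels of all views yields a merged world-frame point cloud from which a mesh can be extracted.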
 Next, in step S46, the control device 9 causes the model generation unit 93 to output the generated three-dimensional model as reconstruction information and store it in the ROM, auxiliary storage, or the like of the control device 9. The model generation unit 93 can also output the reconstruction information to an external device such as a display device or a PC (Personal Computer). When the processing of step S46 is complete, this processing flow ends.
 Note that the control device 9 may execute the processing of steps S44 and S45 together using the RGB images and depth images of the object 10 acquired in step S42 at each rotational position; that is, the functions of the posture calculation unit 92 and the model generation unit 93 in the control device 9 may be integrated.
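The flow of steps S41 to S46 can be summarized as a small driver loop. The function and parameter names below are hypothetical stand-ins for the photographing control unit 91, posture calculation unit 92, and model generation unit 93, wired with trivial stubs so the flow can be exercised end to end.

```python
def reconstruct(rotate, capture, estimate_poses, build_model, n_views=8):
    """Driver for steps S41-S46: start rotation, capture RGB-D images at
    each rotation angle, stop rotation, estimate camera poses from the
    markers, and build/output the model. All callables are stand-ins."""
    rotate(start=True)                                                # S41
    shots = [capture(angle=360.0 * i / n_views) for i in range(n_views)]  # S42
    rotate(start=False)                                               # S43
    poses = estimate_poses(shots)                                     # S44
    return build_model(shots, poses)                                  # S45/S46

# Minimal stubs standing in for the hardware and the processing units.
log = []
model = reconstruct(
    rotate=lambda start: log.append("start" if start else "stop"),
    capture=lambda angle: {"angle": angle, "rgb": None, "depth": None},
    estimate_poses=lambda shots: [f"pose@{s['angle']:.0f}" for s in shots],
    build_model=lambda shots, poses: {"views": len(shots), "poses": poses},
    n_views=4,
)
```

Integrating steps S44 and S45, as the text allows, would simply merge `estimate_poses` and `build_model` into one callable.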
 <Effects of the three-dimensional reconstruction system 100>
 As explained above, the three-dimensional reconstruction system 100 has the upper cameras 3 and the lower cameras 4 (cameras), which can photograph the object 10 placed on the upper surface 2A (mounting surface) from different directions, and the three-dimensional markers 6 arranged around the object 10 on the upper surface 2A. The three-dimensional reconstruction system 100 also has the control device 9, which is configured to output reconstruction information of the object 10 based on the RGB images and depth images (the plurality of captured images) from the upper cameras 3 and the lower cameras 4 and on the posture information of those cameras derived from the images of the three-dimensional markers 6 contained in those images. A three-dimensional marker 6 has marker portions 6b arranged at an angle to the upper surface 2A, and a three-dimensional member 6a with a predetermined height above the upper surface 2A on which the marker portions 6b are provided.
 Conventional three-dimensional reconstruction methods use planar markers parallel to the mounting surface, so when, for example, a camera photographs a marker at a shallow angle to the mounting surface, the marker may not be identifiable in the captured image, and as a result the camera's posture information cannot be acquired. A shallow angle with respect to the mounting surface means, for example, an inclination of 15 degrees or less relative to the mounting surface. If the camera posture information cannot be acquired, the accuracy of the three-dimensional reconstruction of the object decreases.
 In this embodiment, captured images of marker portions 6b arranged at an angle to the upper surface 2A are used. Therefore, even when photographing a three-dimensional marker 6 at a shallow angle to the upper surface 2A, the three-dimensional reconstruction system 100 can include, in the captured image, an image of a marker portion 6b provided on one of the faces of the three-dimensional marker 6, and can thus acquire the camera posture information. This embodiment thereby provides a three-dimensional reconstruction system 100 in which the markers are easily photographed regardless of the photographing direction of each of the upper cameras 3 and the lower cameras 4.
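A back-of-envelope calculation illustrates why the angled marker portions help (a generic geometric sketch, not from the publication): viewed from elevation θ above the mounting surface, a flat marker of side s foreshortens roughly by sin θ, while a vertical side face foreshortens by cos θ, so at shallow elevations the vertical face presents far more image area. The marker size below is hypothetical.

```python
import math

def apparent_height(side, elevation_deg, face="flat"):
    """Approximate foreshortened extent of a square marker of side `side`
    seen from elevation `elevation_deg` above the mounting surface.
    A flat marker shrinks with sin(elevation); a vertical side face of a
    three-dimensional marker shrinks with cos(elevation)."""
    e = math.radians(elevation_deg)
    return side * (math.sin(e) if face == "flat" else math.cos(e))

s = 0.05  # 5 cm marker (hypothetical)
flat_15 = apparent_height(s, 15.0, "flat")      # planar marker, shallow view
side_15 = apparent_height(s, 15.0, "vertical")  # vertical face, shallow view
```

At a 15° elevation the vertical face is foreshortened far less than the flat marker, matching the text's motivation; directly overhead (90°) the relationship reverses, which is why faces at several orientations are provided.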
 This embodiment also has the support base 2 (support member) and the rotating unit 5 (rotating mechanism), and each of the upper cameras 3 and the lower cameras 4 photographs the object 10 at each of a plurality of rotation angles set by the rotating unit 5. With this configuration, the object 10 placed on the upper surface 2A (mounting surface) can be photographed from different directions, and the object 10 can be three-dimensionally reconstructed using the captured images. Note that in this embodiment the three-dimensional reconstruction system 100 only needs to have one or more cameras.
 Furthermore, in this embodiment, the three-dimensional member 6a is a cube having six planes (a polyhedron including a plurality of planes), and the marker portions 6b are provided on two or more of the six planes of the cube. This configuration increases the likelihood that an image of a marker portion 6b is included in the photographed image, so a three-dimensional reconstruction system 100 in which the marker is easily photographed can be provided regardless of the photographing direction of each of the upper camera 3 and the lower camera 4.
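The benefit of marker portions on two or more cube faces can be sketched as follows; the edge length and coordinate frame are assumptions for illustration. For any viewing direction, the best-aligned of the six outward face normals is within arccos(1/√3) ≈ 54.7 degrees of frontal, so at least one face is seen at a decodable angle:

```python
import numpy as np

def cube_face_normals(edge):
    """Centers and outward unit normals of the six faces of a marker cube
    of edge length `edge` resting on the mounting surface (z = 0)."""
    h = edge / 2.0
    return {
        "+x": (np.array([h, 0.0, h]), np.array([1.0, 0.0, 0.0])),
        "-x": (np.array([-h, 0.0, h]), np.array([-1.0, 0.0, 0.0])),
        "+y": (np.array([0.0, h, h]), np.array([0.0, 1.0, 0.0])),
        "-y": (np.array([0.0, -h, h]), np.array([0.0, -1.0, 0.0])),
        "top": (np.array([0.0, 0.0, edge]), np.array([0.0, 0.0, 1.0])),
        "bottom": (np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, -1.0])),
    }

def best_face(faces, to_camera):
    """Face whose outward normal points most directly at the camera."""
    v = np.asarray(to_camera, dtype=float)
    v = v / np.linalg.norm(v)
    return max(faces.items(), key=lambda kv: float(kv[1][1] @ v))

faces = cube_face_normals(0.05)  # 5 cm edge length is a hypothetical value
name, (center, normal) = best_face(faces, [0.0, 0.0, -1.0])  # lower camera looking up
print(name)  # bottom
```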
 Furthermore, in this embodiment, the marker portion 6b has a two-dimensional code that records identification information of the marker portion 6b. With this configuration, the camera pose information can be obtained whichever marker portion 6b of the three-dimensional marker 6 appears in the photographed image. This provides a three-dimensional reconstruction system 100 in which the marker is easily photographed regardless of the photographing direction of each of the upper camera 3 and the lower camera 4. Note that the marker portion 6b may include a one-dimensional code instead of the two-dimensional code.
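One simple way to use the identification information recorded in the codes is a lookup from decoded ID to cube face, so that decoding any single code fixes which face is visible. The IDs and face layout below are hypothetical, for illustration only:

```python
# Hypothetical ID layout: each face of the marker cube carries a code
# with a distinct ID, so decoding any one code identifies which face is
# visible and anchors the camera pose to the cube as a whole.
marker_faces = {
    10: "top",
    11: "+x",
    12: "-x",
    13: "+y",
    14: "-y",
}

def face_for(marker_id):
    """Cube face for a decoded marker ID, or None when the ID does not
    belong to this cube (e.g. a misread or a different marker)."""
    return marker_faces.get(marker_id)

print(face_for(13), face_for(99))  # +y None
```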
 Furthermore, in this embodiment, the three-dimensional reconstruction system 100 has at least two reference markers 6R (planar markers) that are fixed to the upper surface 2A and are planar markers parallel to the upper surface 2A. The distance between the two reference markers 6R is predetermined. The three-dimensional reconstruction system 100 can calibrate the size information of the marker portion 6b with reference to the distance information between the two reference markers 6R included in the photographed image. As a result, the camera pose information can be obtained accurately, and the reconstruction accuracy of the three-dimensional reconstruction system 100 can be improved.
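Calibration against a known distance between two reference markers reduces to a single scale factor. The following sketch uses hypothetical values and is illustrative only:

```python
import numpy as np

def scale_factor(p_a, p_b, true_distance):
    """Scale that maps the reconstruction's arbitrary units to metric
    units, from two reference markers whose real separation is known."""
    measured = np.linalg.norm(np.asarray(p_a, dtype=float) - np.asarray(p_b, dtype=float))
    return true_distance / measured

# Reconstructed reference-marker centers in arbitrary units (hypothetical):
p_a = [0.0, 0.0, 0.0]
p_b = [2.0, 0.0, 0.0]
s = scale_factor(p_a, p_b, true_distance=0.10)  # markers really 10 cm apart
print(s)  # multiply reconstructed coordinates by this to get metres
```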
 Note that, instead of the reference markers 6R, which are planar markers, one three-dimensional marker 6 among the plurality of three-dimensional markers 6 may be used as a reference marker, or two three-dimensional markers 6 whose mutual distance is predetermined may be used as a pair of reference markers. Using a planar marker as the reference marker is advantageous in that the reference marker can be formed on the support base 2 in advance by printing or the like and is therefore easy to provide. Using a three-dimensional marker as the reference marker is advantageous in that the reference marker can be photographed regardless of the photographing direction of the camera.
 <Modification>
 A three-dimensional reconstruction system 100a according to a modification will be described. Components that are the same as those in the embodiment described above are given the same reference numerals, and redundant description is omitted as appropriate.
 FIG. 5 is a diagram illustrating the overall configuration of the three-dimensional reconstruction system 100a. The three-dimensional reconstruction system 100a uses an opaque floor 11 as the mounting surface and photographs an object 10 placed on the floor 11 from different directions with upper cameras 3A and 3B. In other words, the three-dimensional reconstruction system 100a includes the upper cameras 3A and 3B as a plurality of cameras each capable of photographing the object 10 from a different direction.
 The control device 9 outputs reconstruction information of the object 10 based on a plurality of images photographed by the upper cameras 3A and 3B and on pose information of each of the upper cameras 3A and 3B based on the images of the three-dimensional marker 6 included in the plurality of photographed images.
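The control device's inputs can be sketched as image-pose pairs, keeping only the shots in which the three-dimensional marker was actually decoded. The data layout below is an assumption for illustration, not the disclosed implementation:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class View:
    """One reconstruction input: an image identifier plus the camera pose
    recovered from the three-dimensional marker seen in that image."""
    image_id: str
    rotation: np.ndarray     # 3x3 rotation, world to camera
    translation: np.ndarray  # 3-vector camera translation

def gather_views(detections):
    """Keep only the shots in which a marker was decoded, i.e. the shots
    that contribute a usable camera pose to the reconstruction."""
    return [View(i, R, t) for (i, R, t) in detections if R is not None]

# Hypothetical detection results for upper cameras 3A and 3B:
detections = [
    ("3A_000", np.eye(3), np.zeros(3)),
    ("3B_000", None, None),  # marker not decoded in this shot
]
views = gather_views(detections)
print(len(views), views[0].image_id)  # 1 3A_000
```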
 The three-dimensional reconstruction system 100a uses a photographed image of the marker portion 6b provided on the three-dimensional member 6a, which has a predetermined height with respect to the floor 11. Therefore, even when the three-dimensional marker 6 is photographed at a shallow angle with respect to the floor 11, the three-dimensional reconstruction system 100a can include, in the photographed image, an image of a marker portion 6b provided on one of the surfaces of the three-dimensional marker 6, and can obtain the camera pose information. This modification can thus provide a three-dimensional reconstruction system 100a capable of obtaining the pose information of each camera regardless of the photographing direction of each of the upper cameras 3A and 3B.
 Furthermore, since the three-dimensional reconstruction system 100a does not have the support base 2, the rotating part 5, and the like, the object 10 can be three-dimensionally reconstructed with a simple configuration. The three-dimensional reconstruction system 100a can also three-dimensionally reconstruct the object 10 placed on an arbitrary mounting surface such as the floor 11, which improves the usability of the three-dimensional reconstruction system 100a.
 <Example of hardware configuration for the embodiment and modification described above>
 Part or all of each device (the three-dimensional reconstruction systems 100 and 100a) in the embodiments described above may be implemented in hardware, or may be implemented by information processing of software (a program) executed by a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), or the like. When implemented by software information processing, the software that realizes at least some of the functions of each device in the embodiments described above may be stored on a non-transitory storage medium (non-transitory computer-readable medium) such as a CD-ROM (Compact Disc-Read Only Memory) or a USB (Universal Serial Bus) memory and read into a computer, whereby the software information processing is executed. The software may also be downloaded via a communication network. Furthermore, all or part of the software processing may be implemented in a circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array), so that the information processing performed by the software is executed by hardware.
 The storage medium storing the software may be removable, such as an optical disc, or fixed, such as a hard disk or a memory. The storage medium may be provided inside the computer (as a main storage device, an auxiliary storage device, or the like) or outside the computer.
 FIG. 6 is a block diagram showing an example of the hardware configuration of each device (the three-dimensional reconstruction systems 100 and 100a) in the embodiments described above. As one example, each device may be realized as a computer 7 that includes a processor 71, a main storage device 72 (memory), an auxiliary storage device 73 (memory), a network interface 74, and a device interface 75, connected via a bus 76.
 Although the computer 7 of FIG. 6 includes one of each component, it may include a plurality of the same component. Also, although one computer 7 is shown in FIG. 6, the software may be installed on a plurality of computers, and each of the plurality of computers may execute the same part or a different part of the processing of the software. In this case, the system may take the form of distributed computing in which the computers communicate with one another via the network interface 74 or the like to execute the processing. In other words, each device (the three-dimensional reconstruction systems 100 and 100a) in the embodiments described above may be configured as a system that realizes its functions by one or more computers executing instructions stored in one or more storage devices. The system may also be configured such that information transmitted from a terminal is processed by one or more computers provided on a cloud and the processing results are transmitted to the terminal.
 The various operations of each device (the three-dimensional reconstruction systems 100 and 100a) in the embodiments described above may be executed in parallel using one or more processors or using a plurality of computers connected via a network. The various operations may also be distributed to a plurality of arithmetic cores in a processor and executed in parallel. Some or all of the processing, means, and the like of the present disclosure may be realized by at least one of a processor and a storage device provided on a cloud capable of communicating with the computer 7 via a network. In this way, each device in the embodiments described above may take the form of parallel computing by one or more computers.
 The processor 71 may be an electronic circuit (processing circuit or processing circuitry, such as a CPU, a GPU, an FPGA, or an ASIC) that performs at least computer control or computation. The processor 71 may be a general-purpose processor, a dedicated processing circuit designed to execute specific operations, or a semiconductor device including both a general-purpose processor and a dedicated processing circuit. The processor 71 may also include an optical circuit or a computation function based on quantum computing.
 The processor 71 may perform arithmetic processing based on data and software input from the devices of the internal configuration of the computer 7, and may output arithmetic results and control signals to those devices. The processor 71 may control the components of the computer 7 by executing the OS (Operating System) of the computer 7, applications, and the like.
 Each device (the three-dimensional reconstruction systems 100 and 100a) in the embodiments described above may be realized by one or more processors 71. Here, the processor 71 may refer to one or more electronic circuits arranged on a single chip, or to one or more electronic circuits arranged on two or more chips or two or more devices. When a plurality of electronic circuits are used, the electronic circuits may communicate by wire or wirelessly.
 The main storage device 72 may store instructions executed by the processor 71, various data, and the like, and information stored in the main storage device 72 may be read by the processor 71. The auxiliary storage device 73 is a storage device other than the main storage device 72. These storage devices mean arbitrary electronic components capable of storing electronic information, and may be semiconductor memories. The semiconductor memory may be either a volatile memory or a nonvolatile memory. The storage device for storing various data in each device (the three-dimensional reconstruction systems 100 and 100a) in the embodiments described above may be realized by the main storage device 72 or the auxiliary storage device 73, or by a memory built into the processor 71.
 When each device (the three-dimensional reconstruction systems 100 and 100a) in the embodiments described above is configured with at least one storage device (memory) and at least one processor connected (coupled) to the at least one storage device, at least one processor may be connected to a single storage device. Also, at least one storage device may be connected to a single processor. A configuration in which at least one processor among a plurality of processors is connected to at least one storage device among a plurality of storage devices may also be included. This configuration may be realized by storage devices and processors included in a plurality of computers. Furthermore, a configuration in which the storage device is integrated with the processor (for example, a cache memory including an L1 cache and an L2 cache) may be included.
 The network interface 74 is an interface for connecting to the communication network 8 wirelessly or by wire. Any appropriate interface, such as one conforming to an existing communication standard, may be used as the network interface 74. Information may be exchanged via the network interface 74 with an external device 9A connected through the communication network 8. Note that the communication network 8 may be any one of a WAN (Wide Area Network), a LAN (Local Area Network), a PAN (Personal Area Network), and the like, or a combination of these, as long as information is exchanged between the computer 7 and the external device 9A. Examples of a WAN include the Internet, examples of a LAN include IEEE 802.11 and Ethernet (registered trademark), and examples of a PAN include Bluetooth (registered trademark) and NFC (Near Field Communication).
 The device interface 75 is an interface, such as USB, that connects directly to the external device 9B.
 The external device 9A is a device connected to the computer 7 via a network. The external device 9B is a device connected directly to the computer 7.
 The external device 9A or the external device 9B may be, as one example, an input device. The input device is, for example, a device such as a camera, a microphone, a motion capture device, various sensors, a keyboard, a mouse, or a touch panel, and provides acquired information to the computer 7. It may also be a device including an input unit, a memory, and a processor, such as a personal computer, a tablet terminal, or a smartphone.
 The external device 9A or the external device 9B may also be, as one example, an output device. The output device may be, for example, a display device such as an LCD (Liquid Crystal Display) or an organic EL (Electro Luminescence) panel, or may be a speaker or the like that outputs audio. It may also be a device including an output unit, a memory, and a processor, such as a personal computer, a tablet terminal, or a smartphone.
 The external device 9A or the external device 9B may also be a storage device (memory). For example, the external device 9A may be a network storage or the like, and the external device 9B may be a storage such as an HDD.
 The external device 9A or the external device 9B may also be a device having some of the functions of the components of each device (the three-dimensional reconstruction systems 100 and 100a) in the embodiments described above. That is, the computer 7 may transmit some or all of its processing results to the external device 9A or the external device 9B, and may receive some or all of the processing results from the external device 9A or the external device 9B.
 In this specification (including the claims), when the expression "at least one of a, b, and c" or "at least one of a, b, or c" (including similar expressions) is used, it includes any of a, b, c, a-b, a-c, b-c, and a-b-c. It also includes cases with a plurality of instances of any element, such as a-a, a-b-b, and a-a-b-b-c-c. Furthermore, it includes adding an element other than the listed elements (a, b, and c), such as including d as in a-b-c-d.
 In this specification (including the claims), when expressions such as "with data as input," "using data," "based on data," "according to data," or "in accordance with data" (including similar expressions) are used, unless otherwise noted, they include cases where the data itself is used and cases where data subjected to some processing (for example, data with added noise, normalized data, feature quantities extracted from the data, or intermediate representations of the data) is used. When it is stated that some result is obtained "with data as input," "using data," "based on data," "according to data," or "in accordance with data" (including similar expressions), unless otherwise noted, this includes cases where the result is obtained based on the data alone and cases where the result is obtained under the influence of other data, factors, conditions, and/or states in addition to the data. When it is stated that "data is output" (including similar expressions), unless otherwise noted, this includes cases where the data itself is used as the output and cases where data subjected to some processing (for example, data with added noise, normalized data, feature quantities extracted from the data, or intermediate representations of various data) is used as the output.
 In this specification (including the claims), the terms "connected" and "coupled" are intended as non-limiting terms that include any of direct connection/coupling, indirect connection/coupling, electrical connection/coupling, communicative connection/coupling, operative connection/coupling, physical connection/coupling, and the like. These terms should be interpreted appropriately according to the context in which they are used, but forms of connection/coupling that are not intentionally or naturally excluded should be interpreted as being included in the terms in a non-limiting manner.
 In this specification (including the claims), when the expression "A configured to B" is used, it may include that the physical structure of element A has a configuration capable of executing operation B, and that a permanent or temporary setting/configuration of element A is configured/set to actually execute operation B. For example, when element A is a general-purpose processor, it suffices that the processor has a hardware configuration capable of executing operation B and is configured to actually execute operation B by a permanent or temporary setting of a program (instructions). When element A is a dedicated processor, a dedicated arithmetic circuit, or the like, it suffices that the circuit structure or the like of the processor is implemented so as to actually execute operation B, regardless of whether control instructions and data are actually attached.
 In this specification (including the claims), when terms meaning inclusion or possession (for example, "comprising/including" and "having") are used, they are intended as open-ended terms, including cases of containing or possessing things other than the object indicated by the term's object. When the object of these terms meaning inclusion or possession is an expression that does not specify a quantity or that suggests a singular (an expression with the article a or an), the expression should be interpreted as not being limited to a specific number.
 In this specification (including the claims), even if expressions such as "one or more" or "at least one" are used in some places and expressions that do not specify a quantity or that suggest a singular (expressions with the article a or an) are used in other places, the latter expressions are not intended to mean "one." In general, expressions that do not specify a quantity or that suggest a singular (expressions with the article a or an) should be interpreted as not necessarily being limited to a specific number.
 In this specification, when it is stated that a specific advantage/result is obtained by a specific configuration of an embodiment, it should be understood that, unless there is a reason to the contrary, the advantage is also obtained by one or more other embodiments having that configuration. However, it should be understood that the presence or absence of the advantage generally depends on various factors, conditions, and/or states, and that the advantage is not necessarily obtained by the configuration. The advantage is merely obtained by the configuration described in the embodiments when various factors, conditions, and/or states are satisfied, and the advantage is not necessarily obtained in a claimed invention that defines the configuration or a similar configuration.
 In this specification (including the claims), when terms such as "minimize/minimization" are used, they include finding a global minimum, finding an approximation of a global minimum, finding a local minimum, and finding an approximation of a local minimum, and should be interpreted appropriately according to the context in which they are used. They also include finding approximations of these minimums probabilistically or heuristically. Similarly, when terms such as "optimize/optimization" are used, they include finding a global optimum, finding an approximation of a global optimum, finding a local optimum, and finding an approximation of a local optimum, and should be interpreted appropriately according to the context in which they are used. They also include finding approximations of these optimums probabilistically or heuristically.
 In this specification (including the claims), when a plurality of pieces of hardware perform predetermined processing, the pieces of hardware may cooperate to perform the predetermined processing, or some of the hardware may perform all of the predetermined processing. Also, some of the hardware may perform part of the predetermined processing and other hardware may perform the rest of the predetermined processing. In this specification (including the claims), when an expression such as "one or more pieces of hardware perform a first process and the one or more pieces of hardware perform a second process" (including similar expressions) is used, the hardware performing the first process and the hardware performing the second process may be the same or different. That is, it suffices that the hardware performing the first process and the hardware performing the second process are included in the one or more pieces of hardware. Note that the hardware may include an electronic circuit, a device including an electronic circuit, and the like.
 In this specification (including the claims), when a plurality of storage devices (memories) store data, each of the plurality of storage devices may store only part of the data or may store the whole of the data. A configuration in which some of the plurality of storage devices store the data may also be included.
 Although embodiments of the present disclosure have been described in detail above, the present disclosure is not limited to the individual embodiments described above. Various additions, changes, substitutions, partial deletions, and the like are possible without departing from the conceptual idea and spirit of the present invention derived from the content defined in the claims and its equivalents. For example, where numerical values or formulas are used in the description of the embodiments above, they are presented for illustrative purposes and do not limit the scope of the present disclosure. The order of the operations shown in the embodiments is also illustrative and does not limit the scope of the present disclosure.
 This application is based on and claims priority from Japanese Patent Application No. 2022-082762 filed with the Japan Patent Office on May 20, 2022, the entire contents of which are incorporated herein.

Claims (15)

  1.  A three-dimensional reconstruction system comprising:
      one or more cameras for photographing an object;
      a three-dimensional marker; and
      a control device configured to output reconstruction information of the object based on a plurality of captured images of the object photographed from different directions by the one or more cameras and on posture information of the camera based on images of the three-dimensional marker included in the plurality of captured images,
      wherein the three-dimensional marker has a marker section arranged at an angle with respect to a mounting surface of the three-dimensional marker.
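The control device of claim 1 derives the camera's posture from images of the angled marker section. As an illustrative sketch only (not the claimed implementation), assume the marker's corner points are known in the marker's own coordinate frame and have also been located in the camera's coordinate frame; the rigid transform between the two frames, i.e. the camera pose relative to the marker, can then be recovered with the Kabsch algorithm:

```python
import numpy as np

def kabsch_pose(marker_pts, camera_pts):
    """Least-squares rigid transform (R, t) with camera_pts ~= R @ marker_pts + t.

    marker_pts: (N, 3) marker-corner coordinates in the marker frame.
    camera_pts: (N, 3) the same corners expressed in the camera frame.
    """
    P = np.asarray(marker_pts, dtype=float)
    Q = np.asarray(camera_pts, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)               # 3x3 cross-covariance of centered point sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t
```

In a live system the 2D marker detections would more typically feed a PnP solver; the 3D-to-3D form above keeps the sketch dependency-free.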
  2.  The three-dimensional reconstruction system according to claim 1, wherein the mounting surface of the object is the same as the mounting surface of the three-dimensional marker.
  3.  The three-dimensional reconstruction system according to claim 1, wherein the three-dimensional marker has a three-dimensional member having a predetermined height with respect to the mounting surface, and
      the marker section is provided on the three-dimensional member.
  4.  The three-dimensional reconstruction system according to claim 1, further comprising:
      a support member that includes the mounting surface and supports the object; and
      a rotation mechanism that rotates the object by rotating the support member,
      wherein the camera photographs the object at each of a plurality of rotation angles produced by the rotation mechanism.
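Claim 4 photographs the object at each of several turntable angles with a fixed camera. One way to see why this yields multi-view data (a sketch under our reading, not the patent's method): rotating the object by Rz about the table axis is geometrically equivalent to composing the camera's extrinsics with Rz, so each captured frame contributes a distinct effective viewpoint of the unrotated object:

```python
import numpy as np

def effective_extrinsics(R_cam, t_cam, angle_rad):
    """Extrinsics of the fixed camera relative to the *unrotated* object.

    If the support rotates the object by angle_rad about the table's z axis,
    a point X in the object frame is imaged at R_cam @ (Rz @ X) + t_cam, so
    the effective pose for this frame is (R_cam @ Rz, t_cam).
    """
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    Rz = np.array([[c, -s, 0.0],
                   [s,  c, 0.0],
                   [0.0, 0.0, 1.0]])
    return R_cam @ Rz, t_cam
```

A capture loop would call this once per turntable step (e.g. every 30 degrees) to register all frames into the common object frame.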
  5.  The three-dimensional reconstruction system according to claim 1, wherein the camera includes a plurality of cameras each capable of photographing the object from a different direction.
  6.  The three-dimensional reconstruction system according to claim 5, wherein the plurality of cameras are capable of photographing, together with the object, the same marker section arranged at an angle with respect to the mounting surface.
  7.  The three-dimensional reconstruction system according to claim 6, wherein the plurality of cameras include an upper camera installed above the mounting surface and capable of photographing the object from diagonally above, and a lower camera installed below the mounting surface and capable of photographing the object from diagonally below.
  8.  The three-dimensional reconstruction system according to claim 3, wherein the three-dimensional member is a polyhedron including a plurality of planes, and
      the marker section is provided on two or more of the plurality of planes of the polyhedron.
  9.  The three-dimensional reconstruction system according to claim 1, wherein the marker section has a one-dimensional code or a two-dimensional code that records identification information of the marker section.
  10.  The three-dimensional reconstruction system according to claim 1, wherein a plurality of the three-dimensional markers are arranged around the object.
  11.  The three-dimensional reconstruction system according to claim 10, wherein the plurality of three-dimensional markers include two three-dimensional markers arranged in front of and behind the object in the photographing direction of the camera.
  12.  The three-dimensional reconstruction system according to claim 10, wherein at least two of the plurality of three-dimensional markers are fixed to the mounting surface, and the distance between them is predetermined.
  13.  The three-dimensional reconstruction system according to claim 1, further comprising at least two planar markers, each being a planar marker fixed to the mounting surface and parallel to the mounting surface,
      wherein the distance between the two planar markers is predetermined.
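Claims 12 and 13 fix markers at a predetermined mutual distance. A plausible use for such a known separation (our reading; the claims do not state a purpose) is metric scale recovery: multi-view reconstruction is determined only up to a global scale, and one known inter-marker distance resolves it. A minimal sketch:

```python
import math

def metric_scale(p_a, p_b, known_distance):
    """Scale factor converting reconstruction units to metric units.

    p_a, p_b: reconstructed 3D positions of the two fixed markers
    (arbitrary reconstruction units); known_distance: their true,
    predetermined separation, e.g. in metres.
    """
    d = math.dist(p_a, p_b)
    return known_distance / d

def rescale(points, scale):
    """Apply the recovered scale to every reconstructed point."""
    return [[c * scale for c in p] for p in points]
```

For robustness, a real pipeline would average the scale over several known marker pairs rather than rely on a single measurement.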
  14.  The three-dimensional reconstruction system according to claim 1, wherein the three-dimensional marker further has another marker section having a surface in contact with the mounting surface,
      the mounting surface is a mounting surface of a transparent member, and
      the camera is capable of photographing the other marker section together with the object from below the mounting surface.
  15.  A three-dimensional reconstruction method performed by a three-dimensional reconstruction system, wherein the three-dimensional reconstruction system:
      photographs, with a camera, an object placed on a mounting surface from different directions; and
      outputs, with a control device, reconstruction information of the object based on a plurality of images captured by the camera and on posture information of the camera based on images of a three-dimensional marker included in the plurality of captured images,
      wherein the three-dimensional marker has a marker section arranged at an angle with respect to the mounting surface.
PCT/JP2023/017602 2022-05-20 2023-05-10 Three-dimensional reconfiguration system and three-dimensional reconfiguration method WO2023223916A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-082762 2022-05-20
JP2022082762 2022-05-20

Publications (1)

Publication Number Publication Date
WO2023223916A1 true WO2023223916A1 (en) 2023-11-23

Family

ID=88835389

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/017602 WO2023223916A1 (en) 2022-05-20 2023-05-10 Three-dimensional reconfiguration system and three-dimensional reconfiguration method

Country Status (1)

Country Link
WO (1) WO2023223916A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015075429A (en) * 2013-10-10 2015-04-20 国立大学法人 筑波大学 Marker, evaluation method of marker, information processing apparatus, information processing method, and program
WO2020075768A1 (en) * 2018-10-10 2020-04-16 株式会社Preferred Networks Three-dimensional scanning device, three-dimensional model generation method, training data, and machine learning model


Similar Documents

Publication Publication Date Title
CN111862179B (en) Three-dimensional object modeling method and apparatus, image processing device, and medium
US9432655B2 (en) Three-dimensional scanner based on contours from shadow images
JP4497772B2 (en) Image processing device
JPWO2018235163A1 (en) Calibration apparatus, calibration chart, chart pattern generation apparatus, and calibration method
JP2007036482A (en) Information projection display and program
TWI510052B (en) Scanner
CN107517346A (en) Photographic method, device and mobile device based on structure light
CN109906471B (en) Real-time three-dimensional camera calibration
JP7460532B2 (en) systems, methods and devices
CN116250017A (en) Systems, methods, and media for directly restoring planar surfaces in a scene using structured light
CN108062790B (en) Three-dimensional coordinate system establishing method applied to object three-dimensional reconstruction
WO2023223916A1 (en) Three-dimensional reconfiguration system and three-dimensional reconfiguration method
JP2020004085A (en) Image processor, image processing method and program
WO2020240210A1 (en) 3d model capture system
US20240013437A1 (en) Method for providing calibration data for calibrating a camera, method for calibrating a camera, method for producing at least one predefined point-symmetric region, and device
JP5506371B2 (en) Image processing apparatus, image processing method, and program
CN115294213A (en) Calibration tower, camera calibration method and device, electronic equipment and storage medium
JP2005114642A (en) System for forming three-dimensional object and method
WO2023223958A1 (en) Three-dimensional reconstruction method and three-dimensional reconstruction system
TWI518442B (en) Three-dimensional scanner
JP2006338167A (en) Image data creation method
TWI480507B (en) Method and system for three-dimensional model reconstruction
JP2006059165A (en) Three-dimensional modeling device, geometric pattern, three-dimensional modeling data generating method, three-dimensional modeling program and recording medium
KR102505659B1 (en) Three demension scanning apparatus using light based on smartphone
WO2022113582A1 (en) Calibration method, calibration device, calibration system, and recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23807520

Country of ref document: EP

Kind code of ref document: A1