WO2006082825A1 - Index arrangement measurement method, position and orientation estimation method, index arrangement measurement apparatus, and position and orientation estimation apparatus - Google Patents

Index arrangement measurement method, position and orientation estimation method, index arrangement measurement apparatus, and position and orientation estimation apparatus Download PDF

Info

Publication number
WO2006082825A1
WO2006082825A1 PCT/JP2006/301620 JP2006301620W
Authority
WO
WIPO (PCT)
Prior art keywords
index
imaging
group
arrangement
auxiliary
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2006/301620
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
Rika Takemoto
Shinji Uchiyama
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Publication of WO2006082825A1 publication Critical patent/WO2006082825A1/ja
Priority to US11/611,404 priority Critical patent/US7664341B2/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Definitions

  • Index arrangement measurement method, position and orientation estimation method, index arrangement measurement apparatus, and position and orientation estimation apparatus
  • the present invention relates to a technique for obtaining the position and orientation of each index arranged in a real space.
  • MR: mixed reality
  • AR: augmented reality
  • Display devices for experiencing the MR space in accordance with the movement of the user's head can be classified into the following two types according to the implementation method.
  • One is the video see-through method, which superimposes and displays an image of a virtual space (for example, virtual objects and text information rendered by computer graphics, hereinafter CG) generated according to the position and orientation of an imaging device, such as a video camera, onto an image of the real space captured by that imaging device.
  • The other is the optical see-through method, which displays an image of a virtual space generated according to the position and orientation of the observer's viewpoint on an optical see-through display mounted on the observer's head.
  • AR technology is expected to be applied in various fields, for example surgical support that superimposes the state of the inside of the body onto the patient's body surface, architectural simulation that superimposes a virtual building onto a vacant lot, and assembly support that superimposes work procedures and wiring patterns during assembly work.
  • From the correspondence between the three-dimensional coordinates of indices and their image coordinates, the position and orientation of the camera viewpoint can be determined.
  • As described in Non-Patent Documents 1 and 2, methods for calculating the position and orientation of an imaging device from sets of three-dimensional coordinates and image coordinates of indices have been proposed in the field of photogrammetry.
  • A point index has the advantage that it can be placed even in a narrow space.
  • A square index is easy to identify and, because a single index carries a large amount of information, has the merit that the position and orientation of the imaging device can be obtained from that one index alone. Therefore, point indices and square indices can be used complementarily.
  • Alternatively, a 6-DOF position/orientation sensor such as a magnetic sensor or an ultrasonic sensor is attached to the imaging device to be measured, and its position and orientation are measured with that sensor. Although the accuracy of the sensor output varies depending on the measurement range, it can be obtained stably, so methods that use both the sensor and image processing can improve robustness compared with methods that use image processing alone (Patent Document 2, Non-Patent Document 6).
  • Measurement of the position and orientation of indices in the reference coordinate system has been performed manually, using a tape measure and a protractor, or with a surveying instrument.
  • The positions of point indices can also be obtained by a method called bundle adjustment.
  • In the bundle adjustment method, a large number of point indices are photographed with an imaging device. Based on the condition that the position of a point index in the real space, its projected point on the image, and the viewpoint of the imaging device lie on the same straight line, the error (projection error) between the position where the index is actually observed on the image and the projected position calculated from the position and orientation of the imaging device and the position of the index is minimized.
  • That is, it is a method for obtaining, by iterative calculation, the position and orientation of the imaging device that captured each image and the positions of the point indices.
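For readers who want to see the shape of this computation, the following is a minimal, illustrative sketch of the reprojection-error minimization that bundle adjustment performs, not the patent's implementation: a simple pinhole camera with assumed intrinsics (`f`, `cx`, `cy`), axis-angle camera poses, and `scipy.optimize.least_squares` as the iterative solver.

```python
# Minimal bundle-adjustment sketch (illustrative only, not the patent's code).
# Unknowns: one 6-DOF pose per image and one 3D position per point index;
# the residual is the reprojection error at every observation.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def project(pts, rvec, tvec, f=800.0, cx=320.0, cy=240.0):
    """Pinhole projection of 3D points into one camera (assumed intrinsics)."""
    cam = Rotation.from_rotvec(rvec).apply(pts) + tvec
    return np.column_stack((f * cam[:, 0] / cam[:, 2] + cx,
                            f * cam[:, 1] / cam[:, 2] + cy))

def residuals(params, n_cams, n_pts, cam_idx, pt_idx, observed_uv):
    poses = params[:n_cams * 6].reshape(n_cams, 6)   # rvec (3) + tvec (3)
    pts = params[n_cams * 6:].reshape(n_pts, 3)      # point-index positions
    pred = np.vstack([project(pts[j][None], poses[i, :3], poses[i, 3:])
                      for i, j in zip(cam_idx, pt_idx)])
    return (pred - observed_uv).ravel()

# cam_idx[k] and pt_idx[k] say which camera and which index produced the k-th
# observed image coordinate observed_uv[k]; x0 stacks initial guesses for all
# camera poses and index positions.
# result = least_squares(residuals, x0,
#                        args=(n_cams, n_pts, cam_idx, pt_idx, observed_uv))
```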
  • However, to use this method, an arbitrary index Mi must have been photographed simultaneously with at least one other index Mj (i ≠ j).
  • When groups are formed from indices that were photographed at the same time, the whole set of indices must form a single group. See Figure 1 for a specific example.
  • FIG. 1 is a diagram illustrating a problem in obtaining an arrangement relationship between markers when two markers are grouped into one group and each of the two groups is imaged by a camera.
  • 101 and 102 are cameras, the camera 101 images the markers 101a and 101b, and the camera 102 images the markers 102a and 102b.
  • reference numeral 103 denotes an image taken by the camera 101, and the image 103 includes markers 101a and 101b.
  • Reference numeral 104 denotes an image captured by the camera 102; this image 104 includes the markers 102a and 102b.
  • Using the image 103, the arrangement relationship between the marker 101a and the marker 101b can be obtained, and using the image 104, the arrangement relationship between the marker 102a and the marker 102b can be obtained. However, since no image contains markers from both groups, the arrangement relationship between the two groups cannot be obtained.
  • As a device that can capture a wide range at once, a camera with a wide angle of view is generally conceivable, but it has the problem that the influence of image distortion becomes large.
  • Photographing from a distant position is also conceivable, but in either case it is difficult to calculate an accurate arrangement relationship, for example because a high-resolution image cannot be obtained.
  • In Non-Patent Document 9, a large number of markers are arranged so that the arrangement relationship of all the markers can be calculated.
  • However, placing a large number of markers may spoil the scenery, and it is desirable that only the minimum number of markers necessary for alignment be placed during the MR experience (see Non-Patent Document 9).
  • Patent Document 1 Japanese Unexamined Patent Publication No. 2000-041173
  • Patent Document 2 Japanese Patent Laid-Open No. 2002-228442
  • Non-Patent Document 1: R. M. Haralick, C. Lee, K. Ottenberg, and M. Nolle: "Review and analysis of solutions of the three point perspective pose estimation problem", Int'l. J. Computer Vision, vol. 13, no. 3, pp. 331-356, 1994.
  • Non-Patent Document 2: M. A. Fischler and R. C. Bolles: "Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography", Comm. ACM, vol. 24, no. 6, pp. 381-395, 1981.
  • Non-Patent Document 3: Junichi Rekimoto: "A method of constructing augmented reality using a two-dimensional matrix code", Interactive Systems and Software IV, Kindai Kagaku Sha, 1996.
  • Non-Patent Document 4: Kato, M. Billinghurst, Asano, Tachibana: "An augmented reality system based on marker tracking and its calibration", Transactions of the Virtual Reality Society of Japan, vol. 4, no. 4, pp. 607-616, 1999.
  • Non-Patent Document 5: H. Kato, M. Billinghurst, I. Poupyrev, K. Imamoto and K. Tachibana: "Virtual object manipulation on a table-top AR environment", Proc. ISAR 2000, pp. 111-119, 2000.
  • Non-Patent Document 6: A. State, G. Hirota, D. T. Chen, W. F. Garrett and M. A. Livingston: "Superior augmented reality registration by integrating landmark tracking and magnetic tracking", Proc. SIGGRAPH '96, pp. 429-438, 1996.
  • Non-Patent Document 7: G. Baratoff, A. Neubeck and H. Regenbrecht: "Interactive multi-marker calibration for augmented reality applications", Proc. ISMAR 2002, pp. 107-116, 2002.
  • Non-Patent Document 8: G. Baratoff, A. Neubeck and H. Regenbrecht: "Interactive multi-marker calibration for augmented reality applications", Proc. ISMAR 2002, pp. 107-116, 2002.
  • Non-Patent Document 9: G. Baratoff, A. Neubeck and H. Regenbrecht: "Interactive multi-marker calibration for augmented reality applications", Proc. ISMAR 2002, pp. 107-116, 2002.
  • The present invention has been made in view of the above problems, and its purpose is to provide a technique for obtaining the position of each index by a vision-based marker calibration method even when the indices are arranged sparsely so as not to affect the scenery during the MR experience.
  • the index arrangement measuring method of the present invention comprises the following arrangement.
  • a first imaging step of capturing a first image so that both a part or all of a first index/index group and an auxiliary index, temporarily placed in the real space and whose position and orientation in three-dimensional space can be defined, are captured simultaneously;
  • a second imaging step of capturing a second image so that both a part or all of a second index/index group and the auxiliary index are captured simultaneously; and
  • an arrangement calculation step of calculating the relative arrangement relationship between the first index/index group, the auxiliary index, and the second index/index group from one or more images obtained by repeating the first imaging step one or more times and one or more images obtained by repeating the second imaging step one or more times.
  • Alternatively, the index arrangement measuring method of the present invention comprises the following arrangement.
  • a first imaging step of capturing a first image so that both a part or all of a first index/index group and an auxiliary index, temporarily placed in the real space and whose position and orientation in three-dimensional space can be defined, are captured simultaneously;
  • a second imaging step of capturing a second image so that both a part or all of a second index/index group and the auxiliary index are captured simultaneously;
  • a first arrangement calculation step of calculating the relative arrangement relationship between the first index/index group and the auxiliary index from one or more images obtained by repeating the first imaging step one or more times;
  • a second arrangement calculation step of calculating the relative arrangement relationship between the second index/index group and the auxiliary index from one or more images obtained by repeating the second imaging step one or more times; and
  • a third arrangement calculation step of calculating the relative arrangement relationship between the first index/index group and the second index/index group from the results of the first arrangement calculation step and the second arrangement calculation step.
  • the position and orientation estimation method of the present invention comprises the following arrangement.
  • a first imaging step of capturing a first image so that both a part or all of a first index/index group and an auxiliary index, temporarily placed in the real space and whose position and orientation in three-dimensional space can be defined, are captured simultaneously;
  • a second imaging step of capturing a second image so that both a part or all of a second index/index group and the auxiliary index are captured simultaneously;
  • a first arrangement calculation step of calculating the relative arrangement relationship between the first index/index group and the auxiliary index from one or more images obtained by repeating the first imaging step one or more times;
  • a second arrangement calculation step of calculating the relative arrangement relationship between the second index/index group and the auxiliary index from one or more images obtained by repeating the second imaging step one or more times;
  • a third arrangement calculation step of calculating the relative arrangement relationship between the first index/index group and the second index/index group from the results of the first arrangement calculation step and the second arrangement calculation step; and
  • an imaging apparatus position/orientation estimation step of estimating the position and/or orientation of the imaging apparatus using the arrangement of the first index/index group and the second index/index group, without using the auxiliary index.
  • an index arrangement measuring apparatus of the present invention has the following configuration.
  • first imaging means for capturing a first image so that both a part or all of a first index/index group and an auxiliary index, temporarily placed in the real space and whose position and orientation in three-dimensional space can be defined, are captured simultaneously;
  • second imaging means for capturing a second image so that both a part or all of a second index/index group and the auxiliary index are captured simultaneously; and
  • arrangement calculation means for calculating the relative arrangement relationship between the first index/index group, the auxiliary index, and the second index/index group from one or more images obtained by repeating imaging by the first imaging means one or more times and one or more images obtained by repeating imaging by the second imaging means one or more times.
  • an index arrangement measuring apparatus of the present invention has the following configuration.
  • first imaging means for capturing a first image so that both a part or all of a first index/index group and an auxiliary index, temporarily placed in the real space and whose position and orientation in three-dimensional space can be defined, are captured simultaneously;
  • second imaging means for capturing a second image so that both a part or all of a second index/index group and the auxiliary index are captured simultaneously;
  • first arrangement calculation means for calculating the relative arrangement relationship between the first index/index group and the auxiliary index from one or more images obtained by repeating imaging by the first imaging means one or more times;
  • second arrangement calculation means for calculating the relative arrangement relationship between the second index/index group and the auxiliary index from one or more images obtained by repeating imaging by the second imaging means one or more times; and
  • third arrangement calculation means for calculating the relative arrangement relationship between the first index/index group and the second index/index group from the results of the first arrangement calculation means and the second arrangement calculation means.
  • a position / orientation estimation apparatus of the present invention comprises the following arrangement.
  • a position and orientation estimation apparatus that estimates the position and/or orientation of a movable imaging apparatus based on indices existing in the real space, comprising:
  • first imaging means for capturing a first image so that both a part or all of a first index/index group and an auxiliary index, temporarily placed in the real space and whose position and orientation in three-dimensional space can be defined, are captured simultaneously;
  • second imaging means for capturing a second image so that both a part or all of a second index/index group and the auxiliary index are captured simultaneously;
  • first arrangement calculation means for calculating the relative arrangement relationship between the first index/index group and the auxiliary index from one or more images obtained by repeating imaging by the first imaging means one or more times;
  • second arrangement calculation means for calculating the relative arrangement relationship between the second index/index group and the auxiliary index from one or more images obtained by repeating imaging by the second imaging means one or more times;
  • third arrangement calculation means for calculating the relative arrangement relationship between the first index/index group and the second index/index group from the results of the first arrangement calculation means and the second arrangement calculation means; and
  • imaging apparatus position and orientation estimation means for estimating the position and orientation of the imaging apparatus using the arrangement of the first index/index group and the second index/index group, without using the auxiliary index.
  • According to the above configurations, the position of each index can be obtained using a vision-based marker calibration method even when the indices are arranged sparsely.
  • FIG. 1 is a diagram illustrating a problem in obtaining an arrangement relationship between markers when two markers are grouped into one group and each of the two groups is imaged with a camera.
  • FIG. 2A is a diagram showing various markers arranged in the real space.
  • FIG. 2B is a diagram showing various markers arranged in the real space.
  • FIG. 2C is a diagram showing various markers arranged in the real space.
  • FIG. 3 is a diagram showing a functional configuration of a system according to the embodiment of the present invention.
  • FIG. 4 is a flowchart of a process for obtaining an arrangement position and orientation of each alignment marker arranged in the real space.
  • FIG. 5 is a diagram showing a state in which a bridging marker is shared between the captured images when two cameras capture a real space in which a bridging marker is placed in addition to the alignment markers.
  • FIG. 6 is a block diagram showing the basic configuration of a computer having the functions of the operation input unit 350, the captured video capturing unit 320, the index detection/identification unit 340, the marker calibration unit 360, the index information management unit 370, and the display object generation/display unit 380.
  • FIG. 7 is a flowchart of the work for finding the position and orientation of each alignment marker placed in the real space.
  • In the present embodiment, the arrangement relationship of the indices is obtained using images of a real space in which a plurality of indices are arranged. These indices are used for alignment between the real space and the virtual space, as in the technical fields of MR and AR.
  • a “bridging index” is newly placed in the real space.
  • In the following, the term "marker" is used to mean "index".
  • An index that is not a bridging index is referred to as an "alignment marker" to distinguish it from a bridging index, and a bridging index is referred to as a "bridging marker".
  • The term "marker" by itself includes both alignment markers and bridging markers.
  • The bridging marker is an index that the user places artificially for the purpose of marker calibration.
  • FIGS. 2A to 2C are diagrams showing various markers arranged in the real space.
  • Figure 2A shows point indices. Each index has a different color, and all are circular in shape so that they can be identified and detected in an image of the real space. As the image position of a point marker, the position of the center of gravity of the point marker region in the image is used.
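A minimal sketch of this detection step, assuming OpenCV and an assumed HSV color range for one marker color (the range, threshold, and function name are illustrative, not from the patent):

```python
# Detect one colored point marker and use the centroid of its region as the
# marker's image position (the HSV range is an assumption).
import cv2
import numpy as np

def detect_point_marker(image_bgr, hsv_lo=(100, 120, 80), hsv_hi=(130, 255, 255)):
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_lo), np.array(hsv_hi))
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] == 0:          # no pixels of this marker color found
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])   # centroid (u, v)
```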
  • FIGS. 2B and 2C each show a polygonal index
  • FIG. 2B shows a triangular marker
  • FIG. 2C shows a square marker. Each polygonal index is surrounded by a colored frame so that it can be easily detected in the image; the region inside the frame is treated as the index.
  • Different patterns are embedded inside the polygons; for example, each index can be identified by template matching against the unique pattern placed inside it.
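A sketch of such identification, assuming OpenCV and a registry mapping marker identifiers to interior pattern images (the registry layout and score threshold are assumptions):

```python
# Identify a polygonal marker by matching its rectified interior against the
# registered patterns; returns the best-matching marker id, or None.
import cv2

def identify_marker(interior_gray, templates, threshold=0.8):
    """templates: {marker_id: grayscale pattern image of the same size}."""
    best_id, best_score = None, threshold
    for marker_id, pattern in templates.items():
        score = cv2.matchTemplate(interior_gray, pattern,
                                  cv2.TM_CCOEFF_NORMED).max()
        if score > best_score:
            best_id, best_score = marker_id, score
    return best_id
```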
  • Any type of marker may be used for either the alignment markers or the bridging markers.
  • However, once it is decided which markers are used as alignment markers and which as bridging markers, information about each marker must be registered in the computer described later.
  • the bridging marker will be further described.
  • The bridging marker is placed in the real space in order to obtain the arrangement relationship between the alignment markers in one image and the alignment markers in another image.
  • That is, it is used to determine the position and orientation relationship between the coordinate system A to which the alignment markers in one image belong and the coordinate system B to which the alignment markers in the other image belong.
  • For this purpose, it is necessary to calculate the position and orientation of the bridging marker in coordinate system A and its position and orientation in coordinate system B.
  • If the marker has three identifiable points that do not lie on the same straight line (for example, a triangular index whose vertices can be individually identified), its position and orientation can be obtained.
  • Furthermore, when the constraint that "the bridging marker lies on a plane that is known in both coordinate system A and coordinate system B" is given, the position and orientation in coordinate system A and in coordinate system B can be calculated even from an index with only two identifiable points. Therefore, the bridging marker may have a shape with two distinguishable feature points. In the present embodiment, a triangular index whose orientation can be identified is used as the bridging marker so that it can serve as a bridging marker without any special constraint condition.
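To illustrate why three identifiable, non-collinear points determine a pose, the following numpy-only sketch builds an orthonormal marker frame from three labeled 3D points (illustrative, not from the patent):

```python
# Three non-collinear labeled points span a full frame: a rotation matrix and
# an origin, i.e. the marker's position and orientation in that coordinate system.
import numpy as np

def pose_from_three_points(p0, p1, p2):
    x = p1 - p0
    x /= np.linalg.norm(x)
    n = np.cross(x, p2 - p0)       # normal of the marker plane
    n /= np.linalg.norm(n)
    y = np.cross(n, x)             # completes a right-handed frame
    return np.column_stack((x, y, n)), p0   # (rotation R, translation t)
```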
  • FIG. 3 is a diagram showing a functional configuration of the system according to the present embodiment.
  • the photographing unit 310 functions as a camera that captures an image of a real space.
  • the photographing unit 310 may be of a type that captures a still image of the real space in multiple times, or may be of a type that captures a moving image of the real space.
  • the photographing position / orientation measuring unit 330 is attached to the photographing unit 310, measures the position and orientation of the photographing unit 310, and outputs the measurement result to the index detection / identification unit 340.
  • In the present embodiment, the shooting position/orientation measurement unit 330 is a magnetic sensor, and it measures the position and orientation of the photographing unit 310 in the sensor coordinate system.
  • However, the sensor applicable to the shooting position/orientation measurement unit 330 is not limited to a magnetic sensor; other types of sensors, such as optical sensors, may be used. It is also possible to use a gyro that measures only orientation (3 degrees of freedom) instead of a sensor that measures 6 degrees of freedom.
  • The captured video capturing unit 320 captures each image taken by the photographing unit 310 and outputs it to the index detection/identification unit 340.
  • The index detection/identification unit 340 detects the image coordinate values of the markers (both alignment markers and bridging markers) included in each image received from the captured video capturing unit 320, and uniquely identifies each marker. Marker identification is performed using the marker-specific information registered in the index information management unit 370. The identifier of each identified marker i, the image coordinate value of marker i, an identifier unique to the image (for example, the frame number), and the position/orientation information of the photographing unit 310 received from the shooting position/orientation measurement unit 330 are then output as a set to the marker calibration unit 360. This processing is performed for each image received from the captured video capturing unit 320.
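An illustrative picture of the set output per detected marker (field names are assumptions, mirroring the description above):

```python
# One record handed from the index detection/identification unit (340) to the
# marker calibration unit (360) for each marker detected in one image.
from dataclasses import dataclass

@dataclass
class DetectionRecord:
    marker_id: str        # identifier of the identified marker i
    image_uv: tuple       # image coordinate value of marker i: (u, v)
    frame_id: int         # identifier unique to this image, e.g. frame number
    camera_pose: tuple    # position/orientation of the photographing unit 310
```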
  • the marker calibration unit 360 performs processing for obtaining arrangement information (arrangement position or arrangement position and orientation) of each alignment marker arranged in the real space.
  • The obtained arrangement information and an identifier for identifying each alignment marker i are registered as a set in the index information management unit 370.
  • the marker calibration unit 360 uses both the alignment marker and the bridging marker to obtain the alignment information of the alignment marker.
  • Specifically, the marker calibration unit 360 calculates the arrangement information using, for example, the bundle adjustment method; that is, by treating the alignment markers and the bridging markers without distinction, the arrangement information of each alignment marker i is calculated.
  • The operation input unit 350 inputs operation instructions to the shooting position/orientation measurement unit 330, the captured video capturing unit 320, and the index detection/identification unit 340.
  • The shooting position/orientation measurement unit 330, the captured video capturing unit 320, and the index detection/identification unit 340 operate upon receiving an operation instruction from the operation input unit 350.
  • In the index information management unit 370, information for identifying each marker is registered (for each marker: an identifier unique to the marker, the shape of the marker, the features inside the marker (characters, symbols, etc. written inside the frame), and the marker type information (alignment marker or bridging marker)). The sets of alignment marker arrangement information output from the marker calibration unit 360 and the identifiers identifying those alignment markers are also registered there.
  • The index information setting unit 390 deletes the bridging marker information from the index information updated by the marker calibration unit 360, using the marker type information, and generates and sets the index information necessary for alignment in MR or AR. The bridging markers placed during marker calibration are removed when the MR or AR is experienced, so information on the bridging markers is not necessary in the alignment process during MR or AR execution. According to this embodiment, since the index information setting unit 390 can automatically delete the information on the bridging markers, the load on the user can be reduced.
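The cleanup performed by the index information setting unit 390 amounts to a simple filter over the calibrated index information; a sketch, assuming each record carries the marker type described above:

```python
# Drop bridging markers so that only alignment markers remain for use in the
# alignment process during the MR/AR experience (record layout is assumed).
def strip_bridging_markers(index_info):
    return [rec for rec in index_info if rec["type"] != "bridging"]
```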
  • The display object generation/display unit 380 displays the set data registered in the index information management unit 370 by the marker calibration unit 360 and the result of the isolated point marker check.
  • the display format is not particularly limited.
  • FIG. 6 is a block diagram showing the basic configuration of a computer having the functions of the operation input unit 350, the captured video capturing unit 320, the index detection/identification unit 340, the marker calibration unit 360, the index information management unit 370, and the display object generation/display unit 380.
  • Reference numeral 601 denotes a CPU that controls the entire computer using programs and data stored in the RAM 602 and the ROM 603, and executes each process described below performed by the computer.
  • Reference numeral 602 denotes a RAM which includes an area for temporarily storing programs and data loaded from the external storage device 606, and a work area used when the CPU 601 executes each process.
  • Reference numeral 603 denotes a ROM which stores setting data and a boot program of the computer.
  • An operation unit 604 includes a keyboard and a mouse, and various instructions can be input to the CPU 601 by an operator of the computer.
  • Reference numeral 605 denotes a display unit, which includes a CRT, a liquid crystal screen, and the like, and can display processing results by the CPU 601 using images, characters, and the like.
  • Reference numeral 606 denotes an external storage device that functions as a large-capacity information storage device such as a hard disk drive device.
  • The external storage device 606 stores the OS (operating system) as well as the programs and data for causing the CPU 601 to execute each process described later as being performed by the computer; some or all of these are loaded into the RAM 602 under the control of the CPU 601 and processed by the CPU 601.
  • Reference numeral 607 denotes an I/F (interface) to which the photographing unit 310 and the shooting position/orientation measurement unit 330 can be connected. Images captured by the photographing unit 310 and the position and orientation of the photographing unit measured by the shooting position/orientation measurement unit 330 are input to this computer via this I/F 607.
  • Reference numeral 608 denotes a bus connecting the above-described units.
  • In FIG. 7, the processing in step S701 is performed by the computer having the above-described configuration, while the work in the other steps is performed by a person.
  • First, marker calibration processing (corresponding to steps S410 to S450 in FIG. 4) is performed by the computer (step S701).
  • Next, it is checked whether or not any alignment marker remains as an isolated point (isolated marker) (step S702).
  • The check result is displayed. Specifically, in the display of the registered set data, isolated point markers are displayed in a distinguishing manner (by coloring or marking). By checking this display, the user can easily grasp which markers are isolated.
  • Note that bridging markers are not subject to the determination of whether a marker is isolated.
  • This is because a bridging marker is used only in an auxiliary manner to calculate the arrangement information of the alignment markers, and it is an index that is not used during the AR experience.
  • An isolated marker is an alignment marker that cannot share a coordinate system with the other markers.
  • For example, in the case shown in FIG. 1, the marker 101a and the marker 101b are defined in a first coordinate system, while the marker 102a and the marker 102b are defined in a second coordinate system.
  • Since the first coordinate system and the second coordinate system are independent of each other, if the first coordinate system is taken as the reference, the markers 102a and 102b defined in the second coordinate system are determined to be isolated markers.
  • In the case shown in FIG. 5, on the other hand, the marker 101a, the marker 101b, the marker 102a, the marker 102b, and the bridging marker 505 are all defined in the same coordinate system; therefore, in the case shown in FIG. 5, no isolated marker exists.
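The isolated-marker check of step S702 is a graph-connectivity question: markers observed in the same image can share a coordinate system. A sketch using union-find, assuming each captured image yields the set of marker identifiers it contains (bridging markers provide the links but, as noted above, are excluded from the result):

```python
# Markers co-observed in one image are linked; a marker is isolated when it is
# not connected to the reference marker's group (illustrative sketch).
def find_isolated(images, reference_marker, bridging_ids=frozenset()):
    parent = {}

    def find(a):
        parent.setdefault(a, a)
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path halving
            a = parent[a]
        return a

    def union(a, b):
        parent[find(a)] = find(b)

    for markers_in_image in images:         # co-observed markers are linked
        markers = list(markers_in_image)
        for m in markers[1:]:
            union(markers[0], m)
    root = find(reference_marker)
    return {m for m in parent
            if m not in bridging_ids and find(m) != root}

# FIG. 1's situation: two images with no shared marker -> the second group is
# isolated:
#   find_isolated([{"101a", "101b"}, {"102a", "102b"}], "101a")
#     -> {"102a", "102b"}
# FIG. 5's situation: the bridging marker 505 links the two images -> no
# isolation:
#   find_isolated([{"101a", "101b", "505"}, {"102a", "102b", "505"}], "101a",
#                 bridging_ids={"505"})
#     -> set()
```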
  • If any alignment marker remains as an isolated marker, it is because the number of images of the real space is too small or the number of bridging markers is insufficient. A bridging marker is therefore placed so that the isolated marker can be linked to the markers that are not isolated (step S703). The work from step S701 onward is then repeated.
  • If no isolated marker remains, the bridging markers placed for calibration are removed (step S704).
  • In other words, the bridging markers are placed only while marker calibration is performed, and they are removed when marker calibration is completed.
  • Next, the processing in step S701 performed by the computer having the above-described configuration, that is, the processing for obtaining the arrangement position and orientation of each alignment marker placed in the real space, will be described with reference to the flowchart in FIG. 4.
  • The program and data for causing the CPU 601 to execute the processing according to this flowchart are stored in the external storage device 606; they are loaded into the RAM 602 under the control of the CPU 601, and when the CPU 601 executes them, the computer performs each process described below.
  • First, information about each marker placed in the real space is registered in the external storage device 606 (step S410).
  • For example, a GUI for registering information about each marker placed in the real space is displayed on the display screen of the display unit 380, and the operator operates the operation unit 604 to enter the information about each marker via the GUI.
  • When the entered information is confirmed, the CPU 601 detects this in step S410 and registers the information about each marker input via the GUI in the external storage device 606.
  • The information about each marker includes an identifier unique to the marker (for example, an ID), the shape of the marker, the features inside the marker (characters, symbols, etc. written inside the frame), and the marker type information (alignment marker or bridging marker); these sets are registered in the external storage device 606 for each marker placed in the real space.
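An illustrative shape for one registered entry (the keys are assumptions):

```python
# Example of the per-marker information registered in step S410.
marker_registry = {
    "M01": {"shape": "square",   "interior": "pattern_A", "type": "alignment"},
    "M02": {"shape": "triangle", "interior": "pattern_B", "type": "bridging"},
}
```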
  • Next, an operation instruction input from the operation unit 604 is awaited (step S420).
  • When an instruction to capture an image of the real space is detected in step S420, the process proceeds to step S430.
  • The image input from the photographing unit 310 is temporarily stored in the RAM 602 (step S430), then each marker included in the image is detected and identified, and the coordinate position of each marker in the image is obtained (step S440).
  • The identification result of each marker (the identifier of the identified marker), the image coordinate value of the marker, and the (approximate) position/orientation information of the photographing unit 310 acquired from the shooting position/orientation measurement unit 330 are registered as a set of detection information in the external storage device 606 (step S445).
  • In this way, detection information is registered in the external storage device 606 for each marker included in the image acquired in step S430.
  • The process then returns to step S420 to wait for the next operation instruction.
  • When the CPU 601 detects in step S420 that an operation instruction to perform marker calibration has been input via the operation unit 604, the process proceeds to step S450, where the arrangement information of each alignment marker placed in the real space is obtained, that is, the so-called marker calibration process described below is performed (step S450).
  • At this time, the (approximate) position/orientation information of the photographing unit 310 acquired from the shooting position/orientation measurement unit 330 is used.
  • This "position and orientation" may be expressed in the world coordinate system (a coordinate system in which one point in the real space is taken as the origin and three mutually orthogonal axes at that origin are taken as the x-, y-, and z-axes) or, depending on the content of the processing, in any other appropriate coordinate system.
  • In the marker calibration process, using the sets of identification results (identifiers of the identified markers), image coordinate values, and position and orientation information of the imaging device at the time each image was captured, the arrangement position and orientation of each marker are obtained by, for example, bundle adjustment, and the result is sent to the index information management unit 370.
  • FIG. 5 shows a situation in which the same bridging marker is photographed in two captured images when a real space in which a bridging marker is placed in addition to the alignment markers is imaged from two positions.
  • In FIG. 5, the same parts as those in FIG. 1 are denoted by the same reference numerals, and their description is omitted.
  • The camera 101 and the camera 102 obtain captured images 501 and 502, respectively, which share the bridging marker 505. Note that the camera 101 and the camera 102 may be different cameras, or the same camera may be used to take both pictures.
  • When obtaining the arrangement position and orientation of the alignment markers, placing such a bridging marker makes it possible to calculate the arrangement positions of the alignment markers even when the alignment markers are few and sparsely placed. Since the bridging marker is placed only while the positions of the alignment markers are being calculated and is removed during the MR experience, it does not spoil the scenery.
  • marker type information is used as index information in order to distinguish between the alignment marker and the bridging marker.
  • the alignment marker and the bridging marker may be distinguished using other methods.
  • a marker having a specific shape different from the alignment marker may be used for the bridging marker.
  • As a modification of the first embodiment, an example in which the marker calibration process is performed by a different method will now be described. In the marker calibration process according to this modification, first, the arrangement position and orientation of each marker included in an arbitrary image (hereinafter referred to as the reference image), together with the position and orientation of the photographing unit 310 at the time the image was captured, are obtained by bundle adjustment using, for example, the image coordinate values of the respective markers.
  • One of the alignment markers for which the arrangement position and orientation have been obtained is hereinafter referred to as the "first alignment marker".
  • the reference image also includes a bridging marker.
  • the position and orientation of the bridging marker are also found.
  • the bridging marker included in the reference image is hereinafter referred to as “reference bridging marker”.
  • Let T1 be the arrangement relationship between the arrangement position and orientation of the first alignment marker and the arrangement position and orientation of the reference bridging marker, both obtained from the reference image.
  • the following processing is performed.
  • Next, the arrangement position and orientation of each marker included in another image that contains the reference bridging marker (hereinafter referred to as the other image) are obtained by bundle adjustment.
  • The obtained arrangement positions and orientations are based on the coordinate system followed by the markers included in the other image.
  • The arrangement positions and orientations of all alignment markers are then unified into the coordinate system to which the first alignment marker belongs (hereinafter referred to as the reference coordinate system).
  • To do so, the arrangement position and orientation relationship between each alignment marker obtained from the other image and the reference bridging marker in that image is corrected by the inverse of the transformation matrix representing T1.
  • As a result, the arrangement position and orientation of each alignment marker obtained from the other image follow the reference coordinate system, and thus the arrangement relationship between those alignment markers and the first alignment marker can be obtained.
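A minimal sketch of this unification using 4×4 homogeneous transforms (numpy only; variable names are illustrative): the bridging marker's pose, known in both coordinate systems, yields the mapping from the other image's system into the reference system, which plays the role of the T1-based correction described above.

```python
import numpy as np

def to_homogeneous(R, t):
    """Pack a rotation matrix and translation into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

# T_ref_bridge: the reference bridging marker's pose in the reference system
# T_other_bridge: the same bridging marker's pose in the other image's system
# T_other_marker: an alignment marker's pose computed in the other system
def unify(T_ref_bridge, T_other_bridge, T_other_marker):
    T_ref_other = T_ref_bridge @ np.linalg.inv(T_other_bridge)  # other -> ref
    return T_ref_other @ T_other_marker   # marker pose in the reference system
```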
  • Note that, in order to perform this process, the images must be captured so that at least one of the bridging markers included in a given image is also included in some other image.
  • Needless to say, the object of the present invention can also be achieved by supplying a recording medium (or storage medium) on which the program code of software that realizes the functions of the above-described embodiments is recorded to a system or an apparatus, and having the computer (or CPU or MPU) of the system or apparatus read out and execute the program code stored on the recording medium.
  • In this case, the program code itself read from the recording medium realizes the functions of the above-described embodiments, and the recording medium on which the program code is recorded constitutes the present invention.
  • Furthermore, the program code read from the recording medium may be written into a memory provided in a function expansion card inserted into the computer or in a function expansion unit connected to the computer; based on the instructions of the program code, the CPU of the function expansion card or function expansion unit then performs part or all of the actual processing, and it goes without saying that the functions of the above-described embodiments can also be realized by that processing.
  • the recording medium stores program codes corresponding to the flowcharts described above.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
PCT/JP2006/301620 2005-02-02 2006-02-01 Index arrangement measurement method, position and orientation estimation method, index arrangement measurement apparatus, and position and orientation estimation apparatus Ceased WO2006082825A1 (ja)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/611,404 US7664341B2 (en) 2005-02-02 2006-12-15 Index layout measurement method, position and orientation estimation method, index layout measurement apparatus, and position and orientation estimation apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2005026879A JP4599184B2 (ja) 2005-02-02 2005-02-02 Index arrangement measurement method and index arrangement measurement apparatus
JP2005-026879 2005-02-02

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/611,404 Continuation US7664341B2 (en) 2005-02-02 2006-12-15 Index layout measurement method, position and orientation estimation method, index layout measurement apparatus, and position and orientation estimation apparatus

Publications (1)

Publication Number Publication Date
WO2006082825A1 true WO2006082825A1 (ja) 2006-08-10

Family

ID=36777209

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/301620 Ceased WO2006082825A1 (ja) 2006-02-01 Index arrangement measurement method, position and orientation estimation method, index arrangement measurement apparatus, and position and orientation estimation apparatus

Country Status (3)

Country Link
US (1) US7664341B2 (en)
JP (1) JP4599184B2 (en)
WO (1) WO2006082825A1 (en)


Families Citing this family (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4914039B2 (ja) * 2005-07-27 2012-04-11 Canon Inc Information processing method and apparatus
US20070065004A1 (en) * 2005-08-01 2007-03-22 Topcon Corporation Three-dimensional measurement system and method of the same, and color-coded mark
JP4876056B2 (ja) * 2007-11-16 2012-02-15 Canon Inc Image processing apparatus and image processing method
US9058764B1 (en) * 2007-11-30 2015-06-16 Sprint Communications Company L.P. Markers to implement augmented reality
JP5207719B2 (ja) * 2007-12-05 2013-06-12 Topcon Corp Color-coded sign, color code extraction means, and three-dimensional measurement system
EP2250623A4 (en) 2008-03-05 2011-03-23 Ebay Inc METHOD AND DEVICE FOR IMAGE RECOGNITION SERVICES
US9495386B2 (en) 2008-03-05 2016-11-15 Ebay Inc. Identification of items depicted in images
JP5111210B2 (ja) * 2008-04-09 2013-01-09 Canon Inc Image processing apparatus and image processing method
US8542906B1 (en) 2008-05-21 2013-09-24 Sprint Communications Company L.P. Augmented reality image offset and overlay
US8839121B2 (en) * 2009-05-06 2014-09-16 Joseph Bertolami Systems and methods for unifying coordinate systems in augmented reality applications
JP5290864B2 (ja) * 2009-05-18 2013-09-18 Canon Inc Position and orientation estimation apparatus and method
JP5567908B2 (ja) * 2009-06-24 2014-08-06 Canon Inc Three-dimensional measurement apparatus, measurement method therefor, and program
US9164577B2 (en) * 2009-12-22 2015-10-20 Ebay Inc. Augmented reality system, method, and apparatus for displaying an item image in a contextual environment
JP5746477B2 (ja) 2010-02-26 2015-07-08 Canon Inc Model generation apparatus, three-dimensional measurement apparatus, control methods therefor, and program
JP5297403B2 (ja) 2010-02-26 2013-09-25 Canon Inc Position and orientation measurement apparatus, position and orientation measurement method, program, and storage medium
JP5544942B2 (ja) * 2010-03-11 2014-07-09 Toyota Motor Corp Object information detection system, information detection method, and program
EP2381214B1 (en) * 2010-04-22 2020-06-10 Metronor A/S Optical measurement system
JP5612916B2 (ja) 2010-06-18 2014-10-22 Canon Inc Position and orientation measurement apparatus, processing method therefor, program, and robot system
JP5624394B2 (ja) 2010-07-16 2014-11-12 Canon Inc Position and orientation measurement apparatus, measurement processing method therefor, and program
JP5496008B2 (ja) 2010-08-06 2014-05-21 Canon Inc Position and orientation measurement apparatus, position and orientation measurement method, and program
US20130289407A1 (en) * 2010-09-14 2013-10-31 Samsung Medison Co., Ltd. 3d ultrasound system for extending view of image and method for operating the 3d ultrasound system
US10127606B2 (en) 2010-10-13 2018-11-13 Ebay Inc. Augmented reality system and method for visualizing an item
JP5691629B2 (ja) * 2011-02-24 2015-04-01 Obayashi Corp Image composition method
JP5764969B2 (ja) * 2011-02-24 2015-08-19 Obayashi Corp Image composition method
US9449342B2 (en) 2011-10-27 2016-09-20 Ebay Inc. System and method for visualization of items in an environment using augmented reality
JP6004809B2 (ja) 2012-03-13 2016-10-12 Canon Inc Position and orientation estimation apparatus, information processing apparatus, and information processing method
US10846766B2 (en) 2012-06-29 2020-11-24 Ebay Inc. Contextual menus based on image recognition
JP6111790B2 (ja) * 2013-03-28 2017-04-12 Fujitsu Ltd Imaging assistance apparatus, image generation system, and stereoscopic image generation method
JP6192454B2 (ja) * 2013-09-17 2017-09-06 Westunitis Co., Ltd. Display system
JP6253368B2 (ja) 2013-11-25 2017-12-27 Canon Inc Three-dimensional shape measurement apparatus and control method therefor
US9191620B1 (en) 2013-12-20 2015-11-17 Sprint Communications Company L.P. Voice call using augmented reality
US20170032349A1 (en) * 2014-04-18 2017-02-02 Nec Solution Innovators, Ltd. Information processing apparatus
WO2016163563A1 (ja) * 2015-04-09 2016-10-13 NEC Corp Map generation apparatus, map generation method, and program recording medium
US10546173B2 (en) 2015-04-09 2020-01-28 Nec Corporation Information processing device, information processing system, position reporting method, and program recording medium
JP6915611B2 (ja) * 2016-04-14 2021-08-04 NEC Corp Information processing apparatus, information processing method, and program
CN110313172B (zh) * 2017-02-22 2021-11-23 Sony Corp Imaging system and information processing method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000041173A (ja) * 1998-07-23 2000-02-08 MR System Kenkyusho KK Viewpoint position and orientation determination method, camera apparatus, and viewpoint position sensor
JP2002071313A (ja) * 2000-08-31 2002-03-08 Toshiba Corp Target marker, object position measurement method using the same, and robot system
JP2002156229A (ja) * 2000-11-17 2002-05-31 Kajima Corp Mobile displacement measurement method and apparatus for structures
JP2004342067A (ja) * 2003-04-22 2004-12-02 3D Media Co Ltd Image processing method, image processing apparatus, and computer program

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3467017B2 (ja) 2000-11-30 2003-11-17 Canon Inc Position and orientation determination method and apparatus, and storage medium
US7249139B2 (en) * 2001-07-13 2007-07-24 Accenture Global Services Gmbh Secure virtual marketplace for virtual objects and services
JP3796449B2 (ja) 2002-01-31 2006-07-12 Canon Inc Position and orientation determination method and apparatus, and computer program
EP1349114A3 (en) 2002-03-19 2011-06-15 Canon Kabushiki Kaisha Sensor calibration apparatus, sensor calibration method, program, storage medium, information processing method, and information processing apparatus
JP4136859B2 (ja) * 2003-01-10 2008-08-20 Canon Inc Position and orientation measurement method
JP4367926B2 (ja) 2004-05-17 2009-11-18 Canon Inc Image composition system, image composition method, and image composition apparatus
JP4914039B2 (ja) 2005-07-27 2012-04-11 Canon Inc Information processing method and apparatus


Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008122109A (ja) * 2006-11-08 2008-05-29 Canon Inc Information processing apparatus and information processing method
JP2008249407A (ja) * 2007-03-29 2008-10-16 Canon Inc Image processing apparatus and image processing method
CN105791626A (zh) * 2013-04-25 2016-07-20 Hu Yan Folding photographing-type scanner
CN105872294A (zh) * 2013-04-25 2016-08-17 Gao Yanni Folding scanner
CN105872295A (zh) * 2013-04-25 2016-08-17 Gao Yanni Portable document camera
CN105872301A (zh) * 2013-04-25 2016-08-17 Shen Qingzhang Compact scanner with an induction switch
CN105872282A (zh) * 2013-04-25 2016-08-17 Gao Yanni Portable document camera with an induction switch
CN105915747A (zh) * 2013-04-25 2016-08-31 Gao Yanni Folding scanner
CN105915748A (zh) * 2013-04-25 2016-08-31 Gao Yanni Folding scanner
CN106101504A (zh) * 2013-04-25 2016-11-09 Chuzhou Huazun Electric Technology Co., Ltd. High-speed scanner with an induction switch
CN106101478A (zh) * 2013-04-25 2016-11-09 Chuzhou Huazun Electric Technology Co., Ltd. High-speed scanner
CN106101477A (zh) * 2013-04-25 2016-11-09 Chuzhou Huazun Electric Technology Co., Ltd. High-speed scanner with a camera
CN106131369A (zh) * 2013-04-25 2016-11-16 Chuzhou Huazun Electric Technology Co., Ltd. High-speed scanner with an induction switch
CN106210427A (zh) * 2013-04-25 2016-12-07 Chuzhou Huazun Electric Technology Co., Ltd. High-speed scanner with a camera
CN106210428A (zh) * 2013-04-25 2016-12-07 Chuzhou Huazun Electric Technology Co., Ltd. High-speed scanner
CN106231148A (zh) * 2013-04-25 2016-12-14 Chuzhou Huazun Electric Technology Co., Ltd. High-speed scanner with an indication of the area to be scanned
CN105872301B (zh) * 2013-04-25 2020-10-13 Yueqing Huazun Electric Co., Ltd. Compact scanner with an induction switch

Also Published As

Publication number Publication date
US20070091125A1 (en) 2007-04-26
US7664341B2 (en) 2010-02-16
JP4599184B2 (ja) 2010-12-15
JP2006214832A (ja) 2006-08-17

Similar Documents

Publication Publication Date Title
JP4599184B2 (ja) Index arrangement measurement method and index arrangement measurement apparatus
JP5196825B2 (ja) Image processing apparatus and image processing method
US7529387B2 (en) Placement information estimating method and information processing device
JP4926817B2 (ja) Index arrangement information measurement apparatus and method
US7657065B2 (en) Marker placement information estimating method and information processing device
JP4886560B2 (ja) Information processing apparatus and information processing method
US8019114B2 (en) Position and orientation measurement method and apparatus
JP4976756B2 (ja) Information processing method and apparatus
JP4739004B2 (ja) Information processing apparatus and information processing method
JP4502361B2 (ja) Index orientation detection method and apparatus
JPWO2021111613A1 (ja) Three-dimensional map creation apparatus, three-dimensional map creation method, and three-dimensional map creation program
JP5726024B2 (ja) Information processing method and apparatus
JP2007048068A (ja) Information processing method and apparatus
JP4926598B2 (ja) Information processing method and information processing apparatus
JP2008046749A (ja) Image processing method and apparatus
JP4810403B2 (ja) Information processing apparatus and information processing method
JP2008140047A (ja) Information processing method and information processing apparatus
Araujo et al. Life cycle of a slam system: Implementation, evaluation and port to the project tango device
JP5127165B2 (ja) Information processing method and apparatus
JP2016065830A (ja) Image processing apparatus and image processing method
JP2006293485A (ja) Index display method and image display apparatus
JP2005107966A (ja) Index identification method and apparatus
JP2014215821A (ja) Information processing apparatus, information processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06712763

Country of ref document: EP

Kind code of ref document: A1

WWW Wipo information: withdrawn in national office

Ref document number: 6712763

Country of ref document: EP