JP5517397B2 - Image processing method and image processing apparatus - Google Patents

Image processing method and image processing apparatus Download PDF

Info

Publication number
JP5517397B2
JP5517397B2 (application number JP2007038430A)
Authority
JP
Japan
Prior art keywords
marker
step
image
markers
definition information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2007038430A
Other languages
Japanese (ja)
Other versions
JP2008064735A (en)
JP2008064735A5 (en)
Inventor
その子 前田
博紀 米澤
憲司 守田
昭宏 片山
利果 武本
清秀 佐藤
俊広 小林
Original Assignee
キヤノン株式会社 (Canon Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to JP2006220638
Application filed by キヤノン株式会社 (Canon Inc.)
Priority to JP2007038430A
Publication of JP2008064735A
Publication of JP2008064735A5
Application granted
Publication of JP5517397B2
Application status: Active
Anticipated expiration

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING; COUNTING
    • G06K — RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 9/00 — Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K 9/20 — Image acquisition
    • G06K 9/32 — Aligning or centering of the image pick-up or image-field
    • G06K 9/3216 — Aligning or centering of the image pick-up or image-field by locating a pattern
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 — Image analysis
    • G06T 7/70 — Determining position or orientation of objects or cameras
    • G06T 7/73 — Determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 2200/00 — Indexing scheme for image data processing or generation, in general
    • G06T 2200/24 — Indexing scheme involving graphical user interfaces [GUIs]
    • G06T 2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 — Image acquisition modality
    • G06T 2207/10016 — Video; Image sequence
    • G06T 2207/30 — Subject of image; Context of image processing
    • G06T 2207/30204 — Marker
    • G06T 2207/30208 — Marker matrix

Description

  The present invention relates to a technique for obtaining the position and orientation of an index existing in a real space.

  One important technique for providing high-quality mixed reality is alignment of the real space and the virtual space. Various alignment methods have been proposed. Among them, methods that calculate (calibrate) the position and orientation of the camera from an image obtained by photographing indices (an index image) are widely used from the standpoint of cost and usability.

  Many of these methods are based on the premise that the positions and orientations of all the indices in the space are known, so the positions and orientations of the indices must be measured and input in advance. As the index, an artificial marker (hereinafter referred to as a marker) or a natural feature is generally used. When markers are used, the more markers are placed, the less practical it becomes for a person to measure them manually.

  In order to expand the range of the space in which the alignment process can be performed, it is necessary to install a large number of markers in a wide range. Therefore, there is a need for means for efficiently and highly accurately measuring the position and orientation of these many widely installed markers.

  In recent years, a method for calibrating the relative position and orientation of markers from an image obtained by photographing the markers has been proposed (see Non-Patent Document 1).

  After the relative positions and orientations of all the markers have been obtained by this method, determining the position and orientation of any one marker in the world coordinate system yields the positions and orientations of all the markers in the world coordinate system.

  The work required for the alignment process in this method and the outline of the calibration process are as follows.

  (1.1) Create a marker and place it in the space where the alignment process is performed.

  (1.2) Input the position and orientation, in the world coordinate system, of one marker chosen as the reference marker.

  (1.3) Photograph the markers.

  (1.4) Perform calibration.

      (A) The relative position and orientation of a certain marker and the camera are calculated from an image including a plurality of markers.

      (B) Calculate the relative position and orientation of another marker and the camera.

      (C) Calculate the relative position and orientation between the two markers.

      (D) Repeat (A) to (C) over images containing the plurality of markers until the relative positions and orientations between all the markers have been calculated.

      (E) Calculate the absolute position and orientation of all markers (relative markers) other than the reference marker from the reference marker.

  (1.5) Alignment processing is performed using a marker whose position and orientation are obtained by calibration.
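The core of step (1.4) is the composition of rigid transforms. A minimal sketch, assuming 4 × 4 homogeneous matrices where `T_cam_m` maps marker coordinates into camera coordinates (the function names and conventions are illustrative, not from the patent):

```python
import numpy as np

def relative_pose(T_cam_a, T_cam_b):
    # Steps (A)-(C): when markers a and b appear in the same image, composing
    # the two camera-marker transforms gives the pose of marker b expressed
    # in marker a's coordinate system.
    return np.linalg.inv(T_cam_a) @ T_cam_b

def world_pose(T_world_ref, T_ref_m):
    # Step (E): chaining the reference marker's world pose with a relative
    # pose gives the absolute pose of a relative marker.
    return T_world_ref @ T_ref_m
```

Repeating `relative_pose` over images with overlapping marker sets (step (D)) links every marker to the reference marker through a chain of relative poses.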

  General alignment using natural features is composed of two steps: creation of a natural feature database and estimation of the position and orientation of the camera using the natural feature database (Non-Patent Document 2). The creation of the natural feature database is performed using a large number of images obtained by photographing the real space, similarly to the calibration of the marker. The outline of the work necessary for the alignment processing using natural features is as follows.

  (2.1) Take multiple images of real space.

  (2.2) Extract features from the image.

  (2.3) The position of a certain feature is obtained from a plurality of images.

      (A) Establish correspondences of features between the multiple images.

      (B) The relative positions and orientations between the images are obtained based on the above correspondences.

      (C) The three-dimensional position of the feature is obtained based on the relative position and orientation.

      (D) Perform this for all extracted features.

(2.4) The position and orientation of the camera are estimated using the natural feature database created above, and alignment is performed.
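Step (A) of (2.3) can be sketched as naive nearest-neighbour matching of feature descriptors; this is a toy illustration only (real systems use invariant descriptors and more robust matching, as in Non-Patent Document 2):

```python
def match_features(desc_a, desc_b):
    """For each descriptor in image A, find the closest descriptor in image B
    by squared Euclidean distance. Returns (index_in_a, index_in_b) pairs."""
    matches = []
    for i, da in enumerate(desc_a):
        best = min(range(len(desc_b)),
                   key=lambda j: sum((u - v) ** 2 for u, v in zip(da, desc_b[j])))
        matches.append((i, best))
    return matches
```

Because it always returns the nearest neighbour, this matcher also shows why visually similar features risk being associated as the same feature.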
[Non-Patent Document 1] Kotake, Uchiyama, Yamamoto: "A marker calibration method using a priori knowledge about marker placement", Transactions of the Virtual Reality Society of Japan, vol.10, no.3, pp.401-410, 2005.
[Non-Patent Document 2] Iryna Gordon, David G. Lowe: "Scene Modeling, Recognition and Tracking with Invariant Image Features", Proc. Third IEEE and ACM International Symposium on Mixed and Augmented Reality, pp.110-119, 2004.

  In the existing method, when markers are used, all the markers photographed in operation (1.3) are calibrated in (1.4), and all of them are then used in the alignment processing of (1.5).

  As the number of markers used increases, the alignment accuracy improves. However, since the load of the alignment processing and calibration processing increases, it is desirable that the number of markers used be small. It is possible for the operator to reduce the number of markers to some extent by taking care that only the necessary markers are shot at the time of shooting. However, this requires operator experience and skills. Therefore, in practice, the above operations (1.1) to (1.5) are repeatedly performed, and the optimum arrangement and number of markers are determined empirically.

  At this time, in the existing method, excluding any marker from the calibration and alignment processing required discarding all or part of the acquired images, acquiring new images that do not contain that marker, and then performing calibration again. Alternatively, the marker was removed from the space, after which images were acquired again and calibration was performed.

  Further, when a plurality of image processing apparatuses that perform calibration and alignment processing are used in adjacent spaces, there is a high possibility that, in addition to the markers installed in operation (1.1) for use by an apparatus itself, markers installed for use by an adjacent image processing apparatus are accidentally photographed. In this case, unnecessary alignment and calibration processing occurs.

  Furthermore, the marker IDs of markers arranged for use by an apparatus itself may duplicate the marker IDs of markers arranged for use by an adjacent image processing apparatus. In this case, calibration and alignment processing will not operate correctly.

  Duplicate marker IDs also occur, for example, when a marker created for a preliminary run of operation (1.1) is inadvertently left in the space and is accidentally photographed during calibration and alignment processing. In addition, duplication of marker IDs also arises from erroneous marker recognition in image processing.

  Marker misrecognition is caused by disturbances such as photographing a subject that resembles a marker, parameters such as the exposure and aperture of the device that captures the marker, and the lighting conditions of the shooting environment, and is therefore difficult to prevent.

  In the existing method, in order to exclude the duplicated markers from the calibration and alignment processing, all or part of the acquired images are discarded, and then all the markers other than the duplicated ones are re-photographed.

  On the other hand, when natural features are used, the positions of the features extracted in operation (2.2) are calculated by the processing of (2.3) to create a natural feature database, which is then used in the alignment processing of (2.4).

  With existing methods, many similar features may appear. In this case, there is a risk that similar features may be mistakenly associated with each other as the same feature in the creation of the natural feature database or the alignment process, and the operation may not be performed correctly.

  The present invention has been made in view of the above problems, and an object of the present invention is to provide a technique for automatically or semi-automatically selecting an index used for calibration and alignment processing.

  Another object of the present invention is to reduce the load of the alignment setup work performed by the operator, by notifying the operator of the work necessary for executing calibration in accordance with the marker definition information.

  In order to achieve the object of the present invention, for example, an image processing method of the present invention comprises the following arrangement.

That is, an acquisition step of acquiring an image of a real space in which a plurality of markers having identifiable patterns are arranged;
an identification step of identifying the ID of each marker from the pattern of the marker in the image acquired in the acquisition step, and registering the identified ID in a predetermined file;
a detection step of comparing the IDs identified in the identification step and registered in the predetermined file, and detecting a plurality of markers having the same identified ID as duplicated markers; and
a calculation step of calculating the position and/or orientation, in the real space, of the markers identified in the identification step other than the duplicated markers detected in the detection step.
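The identification and detection steps can be sketched as follows; this is a minimal illustration in which the "predetermined file" is modelled as an in-memory list of registered IDs, and the function names are not from the patent:

```python
from collections import Counter

def detect_duplicates(registered_ids):
    """Detection step: IDs registered by the identification step that occur
    more than once are treated as duplicated (overlapping) markers."""
    counts = Counter(registered_ids)
    return {marker_id for marker_id, n in counts.items() if n > 1}

def calibration_targets(registered_ids):
    """The calculation step operates only on markers whose IDs are not duplicated."""
    duplicated = detect_duplicates(registered_ids)
    return [marker_id for marker_id in registered_ids if marker_id not in duplicated]
```

For example, if ID 7 is registered twice, it is excluded and only the remaining markers are passed to the position/orientation calculation.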

  In order to achieve the object of the present invention, for example, an image processing apparatus of the present invention comprises the following arrangement.

That is, an acquisition means for acquiring an image of a real space in which a plurality of markers having identifiable patterns are arranged;
an identification means for identifying the ID of each marker from the pattern of the marker in the image acquired by the acquisition means, and registering the identified ID in a predetermined file;
a detection means for comparing the IDs identified by the identification means and registered in the predetermined file, and detecting a plurality of markers having the same identified ID as duplicated markers; and
a calculation means for calculating the position and/or orientation, in the real space, of the markers identified by the identification means other than the duplicated markers detected by the detection means.

  According to the configuration of the present invention, a marker to be used for calibration and alignment processing can be selected automatically or semi-automatically.

  Hereinafter, the present invention will be described in detail according to preferred embodiments with reference to the accompanying drawings.

[First Embodiment]
In this embodiment, a case where a marker is used as an example of an index will be described.

  FIG. 1 is a diagram illustrating a configuration example (format example, pattern example) of a marker used in the present embodiment. As shown in the figure, the marker used in the present embodiment is composed of 4 × 4 small squares, which are classified into the four corner squares (blk1 to blk3 and wht), c1 to c4, and x1 to x8. Each small square is filled with either white or black and represents the bit value “0” or “1” depending on the fill color: a square filled with white represents the bit value “0”, and a square filled with black represents the bit value “1”.

  First, the small squares at the four corners (blk1 to blk3, wht) will be described. One of the four corner squares is white (wht), and the remaining three are black (blk1 to blk3). These are used to define the marker coordinate system, a coordinate system defined for each marker. Specifically, it is the coordinate system whose positive x axis is the direction of the line connecting blk1 and blk2, whose positive y axis is the direction of the line connecting blk3 and blk2, whose positive z axis is the normal direction of the marker, and whose origin is the center position of the marker. FIG. 3 is a diagram showing the marker coordinate system. The position of the marker in the world coordinate system is determined by the position of the origin of the marker coordinate system in the world coordinate system, and the orientation of the marker is determined by how the marker coordinate system is rotated with respect to the world coordinate system.

  From this, the relative position and orientation between the camera that images such a marker and the marker can be obtained by a known technique using the marker shape, side length, and center position in the marker image.
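One such known technique is a planar pose (PnP) solver, e.g. OpenCV's `solvePnP`, whose 3D input is the marker's corner coordinates expressed in the marker coordinate system defined above. A sketch of those object points (the counter-clockwise corner ordering here is an assumption for illustration):

```python
def marker_object_points(side):
    # Corners of a square marker of edge length `side`, expressed in the
    # marker coordinate system (origin at the center, z along the normal).
    # Together with the marker's corners detected in the image, these are
    # the inputs to a standard planar pose estimator such as cv2.solvePnP.
    h = side / 2.0
    return [(-h, h, 0.0), (h, h, 0.0), (h, -h, 0.0), (-h, -h, 0.0)]
```

The solver then returns the rotation and translation of the marker relative to the camera, i.e. the relative position and orientation mentioned above.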

  Next, c1 to c4 and x1 to x8 will be described. x1 to x8 represent an identifier (marker ID) unique to the marker. Since x1 to x8 express 8 bits, the marker ID can take a value from 0 to 255. c1 to c4 are check bits, used for correction when an information bit is misrecognized by image processing.

  The values of the check bits are obtained from the following equations.

c1 = not (x1 + x2 + x3 + x5 + x6)
c2 = not (x2 + x3 + x4 + x6 + x7)
c3 = not (x3 + x4 + x5 + x7 + x8)
c4 = not (x1 + x4 + x5 + x6 + x8)
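Reading "+" as modulo-2 addition (XOR) and "not" as bit inversion, which is an interpretation since the patent does not spell out the arithmetic, the check bits and the marker ID can be computed as:

```python
def check_bits(x):
    """x: information bits [x1, ..., x8], each 0 or 1. Returns [c1, c2, c3, c4]."""
    def c(*idx):
        p = 0
        for i in idx:        # modulo-2 sum of the listed information bits
            p ^= x[i - 1]
        return p ^ 1         # "not" of the sum
    return [c(1, 2, 3, 5, 6), c(2, 3, 4, 6, 7), c(3, 4, 5, 7, 8), c(1, 4, 5, 6, 8)]

def marker_id(x):
    # ID encoded by x1..x8 (0-255); x1 is taken as the most significant bit,
    # an assumption -- the patent does not specify the bit order.
    v = 0
    for b in x:
        v = (v << 1) | b
    return v
```

For example, an all-zero information field yields check bits of all 1s, since each parity sum is 0 before inversion.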
FIG. 2 is a diagram showing an example of an actual marker created in accordance with the format shown in FIG. 1. As described above, each small square is filled with white or black and represents the information assigned to it. The size of the marker is not particularly limited, but in this embodiment it is assumed that all the markers arranged in the real space have the same side length.

  Next, the image processing apparatus according to the present embodiment for obtaining the position and orientation of such a marker will be described. FIG. 4 is a block diagram illustrating a functional configuration of the image processing apparatus according to the present embodiment.

  In the figure, reference numeral 100 denotes the image processing apparatus main body. The image input unit 101 acquires an image supplied from an external device such as a digital camera or a hard disk device. The image acquired here is an image of a real space in which a plurality of the markers described with reference to FIGS. 1 and 2 are arranged. The image input unit 101 supplies the acquired image to the subsequent image acquisition unit 102.

  The image acquisition unit 102 performs various preprocessing on the image, such as changing the image received from the image input unit 101 to a format that can be handled by the apparatus. The processed image is sent to the subsequent image management unit 106.

  When the image management unit 106 receives an image from the image acquisition unit 102, the image management unit 106 stores the image in the storage unit 110. Further, an image stored in the storage unit 110 and marker definition information described later are read out as necessary, and are sent to the index management unit 107 and the image composition unit 111.

  The storage unit 110 is for storing an image received from the image management unit 106 and marker definition information described later.

  Here, the marker definition information is a collection of various information related to the marker and exists for each marker. In the present embodiment, the marker definition information includes a marker ID, a marker state, a marker size, and a marker position and orientation. The marker definition information is stored in the storage unit 110 in a table format as shown in FIG. 5, for example.

  FIG. 5 is a block diagram illustrating a configuration example of a table (marker definition information table) in which marker definition information for each marker is registered. As shown in the figure, in the table, “marker ID”, “marker state”, “marker size”, and “marker position and orientation” are registered for each marker.

  As described above, “marker ID” corresponds to x1 to x8 marked on the marker. That is, the values indicated by the bit strings x1 to x8 are registered in the marker ID.

  As shown in the figure, the “marker state” includes a reference marker flag, a detection flag, a target flag, and a completion status.

  The reference marker flag is a flag indicating whether or not the marker is a reference marker. If the reference marker flag is “1”, it is a reference marker, and if it is “0”, it is not a reference marker (in other words, a relative marker). In the figure, only the marker with marker ID = 7 is the reference marker flag = 1, and it is shown that only this marker is the reference marker.

  The detection flag is a flag indicating whether or not the marker has been detected from the image transmitted from the image acquisition unit 102: “1” indicates that it has been detected, and “0” that it has not. In the figure, the markers with marker ID = 1, 7, 45, 125 have detection flag = 1, indicating that these markers have been detected from the image.

  The target flag is a flag indicating whether or not the marker is a position / orientation calculation target (that is, a calibration target). If the target flag is “1”, it indicates that it is a calculation target, and if it is “0”, it indicates that it is not a calculation target. In the figure, the markers with marker ID = 1, 7, 45 are the target flag = 1, and the markers with marker ID = 1, 7, 45 are calibration targets. On the other hand, the marker with marker ID = 125 has a target flag = 0, indicating that it is not a calibration target.

  The completion status indicates whether or not marker calibration has been executed: “1” indicates that the position/orientation calculation has been executed, “0” that it has not been executed, and “2” that it was executed but did not yield a correct position and orientation. In the figure, the marker with marker ID = 1 has completion status = 2, so its position and orientation have not been determined correctly. The markers with marker ID = 7, 45 have completion status = 1, so their positions and orientations have been determined correctly. The marker with marker ID = 125 has completion status = 0, so the position/orientation calculation has not yet been executed for it.

  “Marker size” indicates the length of one side of the marker. As described above, in this embodiment all the markers have the same side length. In the figure, the markers with marker ID = 1, 7, 45, 125 all have a side length of “40”.

  “Marker position and orientation” indicate the position and orientation of the marker in the world coordinate system. Here, the orientation of the marker is expressed by a rotation axis vector and a rotation angle: starting from the state in which the axes of the marker coordinate system coincide with those of the world coordinate system, the marker coordinate system is rotated clockwise about the rotation axis vector by the rotation angle.

  The table data described above is registered in the storage unit 110.
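One way to model a row of this table in code (a sketch; the field names are illustrative, not from the patent):

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class MarkerDefinition:
    marker_id: int          # value encoded by bits x1-x8 (0-255)
    reference_flag: bool    # True: reference marker, False: relative marker
    detected: bool          # detected in the current image
    target: bool            # subject to position/orientation calculation
    completion_status: int  # 0: not run, 1: done, 2: done but incorrect
    size: float             # side length of the marker
    position: Tuple[float, float, float]            # origin of the marker
                                                    # coordinate system in world coordinates
    orientation: Tuple[float, float, float, float]  # rotation axis vector + rotation angle
```

For example, the reference marker row of FIG. 5 would be `MarkerDefinition(7, True, True, True, 1, 40.0, ...)`.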

  Returning to FIG. 4, the index management unit 107 performs processing for transmitting information transmitted from the storage unit 110 and the image management unit 106 to the index position / orientation calculation unit 108 and the index extraction unit 109.

  The index position / orientation calculation unit 108 performs a process of calculating the position / orientation in the world coordinate system of the marker specified as the position / orientation calculation target by the process described later. The index extraction unit 109 performs processing for detecting a marker from an image received from the storage unit 110 via the index management unit 107. The results obtained by the index position and orientation calculation unit 108 and the index extraction unit 109 are sent to the index management unit 107. That is, the index management unit 107 manages various information related to the index.

  For each marker on the image sent from the image acquisition unit 102 that has been designated as a position/orientation calculation target by processing described later, the image composition unit 111 generates computer graphics (CG) that make marker information, such as the marker ID and the outline shape of the marker, easier to understand. It then superimposes the generated CG on the image received from the storage unit 110 via the index management unit 107. The image processed by the image composition unit 111 is sent to the user interface management unit 103 at the subsequent stage.

  The user interface management unit 103 manages the information to be displayed on the display screen of the display unit 105, and sends the image received from the image composition unit 111 to the display unit 105. The user interface management unit 103 also receives various instructions from the control unit 104, controls the above-described units in accordance with those instructions, and displays the results of that control on the display screen of the display unit 105.

  The display unit 105 is configured by a CRT, a liquid crystal screen, or the like, and can display various processing results as images or characters. The control unit 104 receives an operation instruction from the operator of the apparatus and notifies the user interface management unit 103 of the operation instruction.

  FIG. 30 is a block diagram showing a hardware configuration of a computer applicable to the image processing apparatus 100.

  Reference numeral 3001 denotes a CPU, which controls the entire computer using programs (computer programs) and data stored in the RAM 3002 and ROM 3003, and executes each of the later-described processes performed by the computer. For example, in the configuration of the image processing apparatus 100 illustrated in FIG. 4, the functions of every unit other than the image input unit 101, the storage unit 110, and the display unit 105 are realized by the CPU 3001.

  Reference numeral 3002 denotes a RAM, which provides an area for temporarily storing programs and data loaded from the external storage device 3006, and data acquired from the outside via the I/F (interface) 3007 (in this embodiment, images captured by an imaging device such as a camera). The RAM 3002 also provides the work area used when the CPU 3001 executes various processes. In this way, the RAM 3002 can provide areas for various uses as appropriate.

  Reference numeral 3003 denotes a ROM which stores setting data and a boot program for the computer.

  An operation unit 3004 includes a keyboard and a mouse, and various instructions can be input to the CPU 3001 when operated by an operator of the computer.

  A display unit 3005 includes a CRT, a liquid crystal screen, and the like, and can display a processing result by the CPU 3001 using an image, text, or the like. The display unit 3005 corresponds to the display unit 105 in FIG.

  Reference numeral 3006 denotes an external storage device, which is a large-capacity information storage device represented by a hard disk drive device. Here, an OS (Operating System), various types of information (data) held by the storage unit 110, and programs and data for causing the CPU 3001 to execute processes described below performed by the computer are stored. These programs and data are appropriately loaded into the RAM 3002 under the control of the CPU 3001. The CPU 3001 executes processes using the loaded program and data, thereby executing processes to be described later performed by the computer.

  Reference numeral 3007 denotes an I / F. In the case of the present embodiment, the I / F functions as an interface for connecting an imaging device for imaging a real space in which a plurality of the markers are arranged to the computer. An image captured by the imaging apparatus is input to the RAM 3002 or the external storage device 3006 of the computer via the I / F 3007.

  A bus 3008 connects the above-described units.

  Note that the hardware configuration of a computer applicable to the image processing apparatus 100 according to the present embodiment is not limited to the configuration shown in the figure; any configuration capable of executing each of the processes described below performed by the image processing apparatus 100 may be used.

  Next, a GUI (graphical user interface) that operates on this computer and is used to obtain the position and orientation of the indices placed in the real space will be described. FIG. 6 is a diagram showing a display example of this GUI. Note that the programs and data related to the GUI shown in FIG. 6 are stored in the external storage device 3006. When these are loaded into the RAM 3002 and the CPU 3001 executes processing using them, a window having the configuration shown in the figure is displayed on the display screen of the display unit 3005. The same applies to the other windows appearing in the following description.

  Unless otherwise specified in the following description, operations on the GUI are performed by an operator of the computer using the operation unit 3004, and processing executed by this operation is performed by the CPU 3001.

  When the “file” menu 210 is designated in the GUI shown in FIG. 6, a file menu as shown in FIG. 7 is displayed on the display screen of the display unit 3005. When the “open marker definition file” menu 212 is designated in the file menu shown in FIG. 7, the window shown in FIG. 10 is displayed on the display screen of the display unit 3005. In the window shown in FIG. 10, reference numeral 300 denotes an area for displaying a list of the file names of the marker definition files stored in the external storage device 3006. The operator of the computer can use the operation unit 3004 to select any one of the file names displayed in this area 300. A marker definition file is a file in which the data of the table shown in FIG. 5 is recorded. If a plurality of marker definition files are stored in the external storage device 3006, any of these files can be selected using the GUI shown in FIG. 10.

  Here, when the operator selects one file and designates the “OK” button 310, the image processing apparatus deletes the display of the window shown in the figure and then updates the display in the area 230 of the GUI shown in FIG. 6 according to the contents of the selected file. Details of the information displayed in the area 230 will be described later.

  Returning to the file menu of FIG. 7, when the “create new marker definition file” menu 214 is instructed, a window as shown in FIG. 11 is displayed on the display screen of the display unit 3005. In the window shown in FIG. 11, reference numeral 400 denotes an area for inputting the file name of a marker definition file to be newly created. When the operator of the computer inputs the file name of a new marker definition file in the area 400 using the operation unit 3004 and instructs the “OK” button 410, the image processing apparatus deletes the display of the window shown in FIG. 11, and thereafter creates a marker definition file with the file name input in the area 400 in the external storage device 3006.

  Returning to the file menu of FIG. 7, when a “save marker definition file” menu 216 is instructed, a process of saving the marker definition file currently opened in the external storage device 3006 is performed.

  When an “end” menu 218 is instructed, the CPU 3001 performs a process related to shutdown using a known shutdown function of the OS, and shuts down the computer.

  Returning to the GUI shown in FIG. 6, when the “edit” menu 220 is designated, the edit menu shown in FIG. 8 is displayed on the display screen of the display unit 3005. When the “create new marker definition information” menu 222 is instructed in the edit menu shown in FIG. 8, the marker definition information is newly created and additionally registered in the currently opened marker definition file. After the additional registration, the display in the area 230 in the GUI of FIG. 6 is updated according to the contents of the marker definition file that is currently open. Here, in order to avoid duplication of marker IDs, unused values in the marker definition information table are used as marker IDs of newly created marker definition information. The newly created marker definition information can be changed on a window to be described later.

  Returning to the editing menu of FIG. 8, when the “delete marker definition information” menu 224 is designated, the marker definition information currently selected in the area 230 in the GUI of FIG. 6 is deleted from the marker definition file that is currently open. After deletion, the display in the area 230 in the GUI of FIG. 6 is updated according to the contents of the marker definition file that is currently open, so the deleted marker definition information is no longer displayed.

  When the “edit marker definition information” menu 226 is designated, a window as shown in FIG. 9 is displayed. The window shown in the figure is used for editing the marker definition information currently selected in the area 230. Reference numeral 250 denotes a window body.

  Reference numeral 252 denotes an area for inputting a marker ID in the currently selected marker definition information. The value input in this area 252 is the marker ID in the currently selected marker definition information.

  Reference numeral 254 denotes an area for inputting a marker position in the currently selected marker definition information. The value input in this area 254 is the marker position in the currently selected marker definition information.

  An area 256 is used to input a marker posture in the currently selected marker definition information. The value input in this area 256 is the marker posture in the currently selected marker definition information.

  Reference numeral 258 denotes an area for inputting a marker size in the currently selected marker definition information. The value input in this area 258 becomes the marker size in the currently selected marker definition information. In the present embodiment, the marker size input to the area 258 is reflected in the marker definition information of all markers.

  Reference numeral 259 denotes a pull-down menu for editing the reference marker flag included in the marker state in the currently selected marker definition information. In this pull-down menu 259, “reference marker” and “relative marker” are displayed, and either one can be selected. If the operator selects “reference marker”, the reference marker flag is set to “1”, and if “relative marker” is selected, the reference marker flag is set to “0”.

  However, in this embodiment, only one reference marker can exist in the marker definition file. Accordingly, if marker definition information with the reference marker flag set to “1” already exists, only “relative marker” can be selected even if other marker definition information is operated.

  Reference numeral 251 denotes an “OK” button. When this button is instructed, the marker definition information being selected is updated with the information input to the areas 252 to 259. That is, the marker ID, marker position, orientation, marker size, and reference marker flag in the currently selected marker definition information are updated to the information input in the areas 252 to 259, respectively. However, when the marker ID input to the area 252 is already used in the same file, a warning is displayed on the display screen of the display unit 3005, and the marker definition information is not updated.
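The duplicate-ID guard on the “OK” button 251 can be sketched as follows. This is a hedged illustration; the function name and table layout are assumptions, and in the actual apparatus a warning window would be displayed instead of returning a value.

```python
# Sketch: reject an edited marker ID if another entry in the same
# marker definition file already uses it.

def try_update_marker_id(marker_table, selected_index, new_id):
    """Apply the new marker ID only if no other entry already uses it.

    Returns True on success; False when a duplicate exists (in which
    case the entry is left unchanged and a warning would be shown).
    """
    for i, entry in enumerate(marker_table):
        if i != selected_index and entry["marker_id"] == new_id:
            return False  # duplicate ID: do not update
    marker_table[selected_index]["marker_id"] = new_id
    return True
```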

  Next, the area 230 in the GUI of FIG. 6 will be described. In the area 230, a display corresponding to the marker definition information of each marker described in the marker definition file currently opened is displayed. More specifically, each row is displayed in accordance with marker definition information for one marker.

  Reference numeral 231 denotes a slider bar. By indicating the upper part or the lower part, the information displayed in the area 230 can be scrolled upward or downward.

  Reference numeral 232 denotes a check box for setting the target flag in the corresponding marker definition information. When checked (a check mark is displayed in the check box), the target flag = 1; when unchecked (the check mark is deleted from the check box), the target flag = 0. For example, when the check box in the row of marker ID = 2 is designated, a check mark is displayed in the check box, and the target flag of the marker with marker ID = 2 becomes “1”. When the check box in the row of marker ID = 2 is designated again, the check mark is deleted from the check box, and the target flag of the marker with marker ID = 2 becomes “0”. Thus, the target flag can be set arbitrarily for each marker, which means that the operator of this computer can arbitrarily select markers to be subjected to position and orientation calculation. Even if the target flag is changed by some process other than an operation on the check box, the change is reflected in the check mark in the check box.

  An area 234 displays the completion status in the corresponding marker definition information as an icon. When the position and orientation calculation of the corresponding marker has been executed successfully, the completion status = 1; in this case, an icon “◎” is displayed in the area 234. When the position and orientation calculation of the corresponding marker has been executed but failed, the completion status = 2; in this case, an icon “×” is displayed in the area 234. When the position and orientation calculation of the corresponding marker has not been executed, the completion status = 0; in this case, nothing is displayed in the area 234. In the example of the figure, since the icon “◎” is displayed in the area 234 in the row of the marker with ID = 2, it can be determined that the position and orientation calculation has been executed. On the other hand, since nothing is displayed in the area 234 in the row of the marker with ID = 1, it can be determined that the position and orientation calculation has not yet been executed. Of course, any icon may be displayed according to the value of the completion status, and any information may be displayed as long as the value of the completion status can be visually understood.
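The completion status to icon mapping described above can be summarized in a small table. The icon strings are placeholders; as noted, any visually distinguishable representation may be used.

```python
# Sketch: mapping from completion status values to the icons shown
# in the area 234.

COMPLETION_ICONS = {
    0: "",    # position/orientation calculation not yet executed
    1: "◎",  # calculation executed successfully
    2: "×",  # calculation executed but failed
}

def completion_icon(completion_status):
    """Return the icon string for a completion status value."""
    return COMPLETION_ICONS.get(completion_status, "")
```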

  Reference numeral 236 denotes an area for displaying the detection flag in the corresponding marker definition information as an icon. Detection flag = 1 when the corresponding marker is detected from the image. In this case, a camera icon is displayed in the area 236. Further, when the corresponding marker is not detected from the image, the detection flag = 0. In this case, nothing is displayed in the area 236. In the case of the figure, since the camera icon is displayed in the area 236 in the marker row of ID = 2, it can be determined that the marker is detected from the image. On the other hand, since nothing is displayed in the area 236 in the marker row with ID = 1, it can be determined that the marker is not detected from the image. Of course, any icon may be displayed according to the value of the detection flag, and any information may be displayed as long as the value of the detection flag can be visually understood.

  Reference numeral 238 denotes an area for displaying a marker ID in the corresponding marker definition information. In the case of a reference marker, the marker ID is displayed with an underline. In the case of the figure, since the underline is displayed for the marker ID of ID = 2, it can be determined that this is the reference marker. Of course, what is displayed according to the value of the reference marker flag is not limited to the underline, and any information can be displayed as long as the reference marker can be visually understood.

  Reference numeral 260 denotes an area for displaying a real space image. The real space image displayed in this area 260 may be an image acquired in the RAM 3002 via the I/F 3007 or an image acquired in the RAM 3002 from the external storage device 3006. Also, an enlarged version of the thumbnail image selected from among one or more thumbnail images displayed in an area 270 to be described later may be displayed (an image obtained by enlarging the thumbnail may be used, or the image from which the thumbnail was created may be prepared separately and used). The displayed image is a real space image in which a plurality of markers are arranged. In the area 260, a CG indicating the ID and contour of each marker is superimposed on the real space image.

  An area 270 displays a list of thumbnails of real space images acquired via the I/F 3007. Reference numeral 272 denotes a slider bar, which can scroll the information displayed in the area 270 leftward and rightward by designating its left end or right end. When one of the images displayed in the area 270 is designated, a frame is first displayed on the selected thumbnail. Then, as described above, an enlarged version of the designated image (an image obtained by enlarging the thumbnail may be used, or the image from which the thumbnail was created may be prepared separately and used) is displayed in the area 260.

  A button 282 is used to input an instruction to sequentially acquire the images (that is, a moving image) of each frame sent from the imaging device via the I/F 3007 into the external storage device 3006 and to sequentially display them in the area 260.

  A button 284 is pressed to input an instruction to create a thumbnail of the one frame acquired via the I/F 3007 at the time the button is pressed and to additionally display it in the area 270. Of course, the acquired image of that frame is also stored in the external storage device 3006. The button 284 is pressed to acquire one frame of the moving image, and can be pressed only while the button 282 for acquiring the moving image is instructed. When the one-frame image is acquired, marker detection processing is performed on this image, and information relating to the detected markers is described in the marker definition file that is currently open. Furthermore, the display in the area 230 is updated according to the marker definition file updated by this processing.

  Reference numeral 286 denotes a button for inputting an instruction to delete the currently selected thumbnail in the area 270. If the thumbnail to be deleted is created by reducing the original image, the original image is also deleted. If the deleted image includes a marker, information on the marker is also deleted in the marker definition file that is currently open. Further, the display in the area 230 is updated according to the marker definition file updated by this processing.

  Reference numeral 288 denotes a button for inputting an instruction to calculate the positions and orientations of the markers with the target flag = 1 in the marker definition file that is currently open.

  FIGS. 12 and 13 are flowcharts of various processes performed using the GUI. Note that programs and data for causing the CPU 3001 to execute the processing according to the flowcharts of FIGS. 12 and 13 are stored in the external storage device 3006 and are loaded into the RAM 3002 as appropriate under the control of the CPU 3001. When the CPU 3001 executes processing using the loaded programs and data, the computer performs each process described below.

  Here, whether or not the calibration is executed is determined not only by the selection by the operator but also by the marker state of the marker definition information. Therefore, the description will be given focusing on the process of changing the marker definition information from the entire process.

  In addition, before the following processing starts, the programs and data related to the GUI of FIG. 6 have already been loaded into the RAM 3002, and the CPU 3001 has executed processing using them, so that the GUI of FIG. 6 is displayed.

  After detecting the instruction of the “file” menu 210 in the GUI of FIG. 6, when the instruction of the “create new marker definition file” menu 214 is detected, the process proceeds to step S101 via step S100. In step S101, a window as shown in FIG. 11 is displayed on the display screen of the display unit 3005. The operator of this computer inputs the file name of the marker definition file to be newly created in the area 400 of this window. In step S101, the file name input in the area 400 is further acquired. Then, the process proceeds to step S102, and a marker definition file having the file name acquired in step S101 is created in the external storage device 3006. The marker definition information table in the created file is set to empty. Then, the process proceeds to step S108, all the real space images already acquired in the external storage device 3006 are deleted, and the display in the area (acquired image display area) 270 is updated. As a result, nothing is displayed in the area 270. Then, the process proceeds to step S146, and the display in area 230 is updated. That is, nothing is displayed in the area 230. Then, the process returns to step S100.

  On the other hand, if an instruction of the “open marker definition file” menu 212 is detected after detecting an instruction of the “file” menu 210 in the GUI of FIG. 6, the process proceeds to step S106 via steps S100 and S105. In step S106, a window as shown in FIG. 10 is displayed on the display screen of the display unit 3005. The operator of this computer selects one marker definition file in this window. In step S106, the selected marker definition file is further loaded from the external storage device 3006 to the RAM 3002. Then, the process proceeds to step S107, and both the detection flag and the target flag of the loaded marker definition information table are initialized to 0. Then, the processes in steps S108 and S146 are performed, and the process returns to step S100.

  On the other hand, after detecting an instruction of the “edit” menu 220 in the GUI of FIG. 6, if an instruction of the “create new marker definition information” menu 222 is detected, the process proceeds to step S112 via steps S100, S105, and S110. In step S112, one piece of new marker definition information is added to the currently opened marker definition file. As the marker ID of the added marker definition information, a marker ID different from the marker IDs used by the marker definition information existing in the same file is assigned. In addition, the reference marker flag, detection flag, target flag, and completion status of the added marker definition information are all initialized to 0, and the marker size is set to that of the marker definition information existing in the same file. Note that when there is no other marker definition information, a default value is set as the marker size. Then, after performing the process in step S146, the process returns to step S100.

  On the other hand, if an instruction of the “delete marker definition information” menu 224 is detected after detecting an instruction of the “edit” menu 220 in the GUI of FIG. 6, the process proceeds to step S116 via steps S100, S105, S110, and S114. Proceed. In step S116, a process of deleting the marker definition information currently selected in the area 230 from the marker definition file is performed. Then, after performing the process in step S146, the process returns to step S100.

  On the other hand, when an instruction of the “edit marker definition information” menu 226 is detected after detecting an instruction of the “edit” menu 220 in the GUI of FIG. 6, the process proceeds to step S120 via steps S100, S105, S110, S114, and S118. In step S120, a window as shown in FIG. 9 is displayed on the display screen of the display unit 3005. The operator of this computer inputs corresponding information to one or more of the areas 252 to 259 in this window. When the “OK” button 251 is instructed, in step S120, the information input to each of the areas 252 to 259 is acquired. Then, it is checked whether the marker ID acquired from the area 252 is used by two or more pieces of marker definition information in the marker definition file that is currently open.

  If the marker ID is used by two or more pieces of marker definition information as a result of this check, the process proceeds to step S140, and a warning is displayed on the display screen of the display unit 3005. Then, the process proceeds to step S150. The processing after step S150 will be described later. On the other hand, if it is not, the process proceeds to step S121, and the marker definition information currently selected in the area 230 is updated to the information acquired in step S120. Then, the process in step S146 is performed. When updating the display in the area 230 in step S146, an underline is displayed for the marker ID of the marker definition information with the reference marker flag = 1. Then, the process returns to step S100.

  On the other hand, in the GUI of FIG. 6, when an instruction of the button 284 is detected in a state where the button 282 has already been instructed, that is, in a state where a moving image of the real space is being acquired into the external storage device 3006 via the I/F 3007, the process proceeds to step S124 via steps S100, S105, S110, S114, S118, and S122. In step S124, from the image acquired into the external storage device 3006 near the timing at which the button 284 was instructed, the markers appearing in this image are detected and their marker IDs are identified. Then, among the marker definition information of each marker registered in the currently opened marker definition file, it is checked whether there is marker definition information having the same marker ID as a marker ID identified from the image. If such marker definition information exists, both its target flag and detection flag are set to 1. That is, the fact that the marker detected from the image has been detected and is a position and orientation calculation target is recorded in the marker definition information of this marker.

  On the other hand, in step S124, if there is no marker definition information having the same marker ID as a marker ID identified from the image among the marker definition information of each marker registered in the currently opened marker definition file, marker definition information for the detected marker is newly created and added to the currently open marker definition file. In this case, the marker ID identified from the image is assigned as the marker ID, and both the target flag and the detection flag are set to 1.
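The bookkeeping of step S124 can be sketched as follows. This is an illustrative sketch only; the entry layout and function name are assumptions, and the actual marker detection from the image is outside its scope.

```python
# Sketch of step S124 bookkeeping: for each marker ID identified in
# the captured frame, set target flag and detection flag to 1 on the
# matching entry; create a new entry for IDs with no matching entry.

def register_detected_markers(marker_table, detected_ids):
    known = {entry["marker_id"]: entry for entry in marker_table}
    for marker_id in detected_ids:
        entry = known.get(marker_id)
        if entry is None:
            # No matching marker definition information: add one.
            entry = {"marker_id": marker_id}
            marker_table.append(entry)
            known[marker_id] = entry
        entry["target_flag"] = 1     # position/orientation calculation target
        entry["detection_flag"] = 1  # detected from the image
    return marker_table
```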

  Then, the process proceeds to step S125, and the thumbnail of the image acquired into the external storage device 3006 near the timing when the button 284 was designated is additionally displayed in the area 270. Then, after performing the process in step S146, the process returns to step S100.

  On the other hand, when it is detected that the button 286 is instructed in the GUI of FIG. 6, the process proceeds to step S128 via steps S100, S105, S110, S114, S118, S122, and S126.

  In step S128, markers are detected both from the original image of the thumbnail selected in the area 270 and from the other images acquired in the external storage device 3006, and the marker IDs of the detected markers are identified. Subsequently, it is checked whether each marker detected from the original image of the thumbnail selected in the area 270 also exists in the other images acquired in the external storage device 3006. If it exists, nothing is done; otherwise, the marker definition information of that marker is deleted.

  Then, the process proceeds to step S129, and a process of deleting the currently selected thumbnail and original from the area 270 from the external storage device 3006 is performed. Then, after performing the process in step S146, the process returns to step S100.

  These processes make it possible, when an image is deleted, to automatically delete from the marker definition information table the marker definition information of any marker that is no longer detected in the acquired images, and thereby exclude it from the calibration target.
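The pruning performed in step S128 can be sketched as follows. Names and data layout are assumptions for illustration; the sketch keeps marker definition information only for markers still detected in at least one remaining image.

```python
# Sketch of step S128: drop definitions of markers that appear only
# in the deleted image and in none of the remaining acquired images.

def prune_after_image_deletion(marker_table, ids_in_deleted_image,
                               ids_in_remaining_images):
    """Remove marker definitions no longer backed by any acquired image."""
    still_visible = set(ids_in_remaining_images)
    removed = set(ids_in_deleted_image) - still_visible
    # In-place filter so callers holding the table see the update.
    marker_table[:] = [e for e in marker_table
                       if e["marker_id"] not in removed]
    return marker_table
```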

  On the other hand, when an instruction for the check box 232 is detected in the GUI of FIG. 6, the process proceeds to step S132 via steps S100, S105, S110, S114, S118, S122, S126, and S130. In step S132, the target flag in the marker definition information corresponding to the designated check box 232 (marker definition information having the marker ID displayed on the same line as the designated check box 232) is inverted. That is, if the current target flag is “1”, it is set to “0”, and if the current target flag is “0”, it is set to “1”. Then, after performing the process in step S146, the process returns to step S100.
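The inversion performed in step S132 amounts to the following one-line sketch (the entry layout is an assumption for illustration):

```python
# Sketch of step S132: invert the target flag of the marker definition
# information on the same row as the designated check box.

def toggle_target_flag(entry):
    entry["target_flag"] = 0 if entry.get("target_flag", 0) == 1 else 1
    return entry["target_flag"]
```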

  On the other hand, when it is detected that the button 288 is instructed in the GUI of FIG. 6, the process proceeds to step S136 via steps S100, S105, S110, S114, S118, S122, S126, S130, and S134.

  In step S136, first, among all the marker definition information registered in the marker definition file that is currently open, the reference marker flag of each piece of marker definition information with the target flag = 1 is referred to, and it is checked whether two or more pieces of marker definition information with the reference marker flag = 1 exist. If there are two or more, the process proceeds to step S140.

  In step S140, a warning that calibration cannot be performed because there are two or more reference markers is displayed on the display screen of the display unit 3005. Then, the process proceeds to step S150. The processing after step S150 will be described later.

  On the other hand, if there is only one marker definition information with the reference marker flag = 1, the process proceeds to step S138. In step S138, it is checked whether there is marker definition information in which the target flag = 1 and the detection flag = 0 among the marker definition information registered in the marker definition file that is currently open. If it exists, the process proceeds to step S140, and since the position and orientation of such a marker cannot be calculated, a warning indicating that fact is displayed, and the process proceeds to step S150. The processing after step S150 will be described later.

  On the other hand, in the marker definition information registered in the marker definition file that is currently open, if there is no marker definition information with the target flag = 1 and the detection flag = 0, the process proceeds to step S142.
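The checks of steps S136 and S138 that gate the calculation of step S142 can be sketched as follows. This is a hedged illustration; the function name, entry layout, and message strings are assumptions, and in the apparatus the warnings are displayed in step S140 rather than returned.

```python
# Sketch of the pre-calibration checks in steps S136 and S138.

def validate_before_calibration(marker_table):
    targets = [e for e in marker_table if e.get("target_flag") == 1]

    # Step S136: at most one reference marker may be a calculation target.
    reference_count = sum(1 for e in targets if e.get("reference_flag") == 1)
    if reference_count >= 2:
        return "warning: two or more reference markers"   # -> step S140

    # Step S138: every target marker must have been detected from an image.
    if any(e.get("detection_flag", 0) == 0 for e in targets):
        return "warning: target marker not detected"      # -> step S140

    return "ok"                                           # -> step S142
```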

  In step S142, position and orientation calculation processing is performed for the markers corresponding to the marker definition information with the target flag = 1 among the marker definition information registered in the marker definition file that is currently open. Then, the process proceeds to step S144. In step S144, when the position and orientation have been obtained correctly as a result of this calculation processing, the value of the completion status in the corresponding marker definition information is set to “1”; if they could not be obtained correctly, the value of the completion status is set to “2”. Then, the process in step S146 is performed, and the process returns to step S100.

  When the operator has executed the above processing an arbitrary number of times and has obtained the positions and orientations of all the markers whose positions and orientations are desired, the operator instructs the “file” menu 210 in the GUI of FIG. 6 and then the “save marker definition file” menu 216. When this instruction is detected, the process proceeds to step S152 via steps S100, S105, S110, S114, S118, S122, S126, S130, S134, and S150. Even when the process has advanced from step S140 to step S150, the process proceeds to step S152 if an instruction of the “save marker definition file” menu 216 is detected; if it is not detected, the process proceeds to step S154.

  In step S152, the marker definition file that is currently open is saved in the external storage device 3006. Although a detailed description is omitted in this embodiment, if this marker definition file is read and alignment processing is performed using the marker definition information with the completion status = 1, it is possible to perform alignment processing using the markers that the operator selected and whose positions and orientations have been calculated.

  Then, the process proceeds to step S154. In step S154, it is checked whether or not the “end” menu 218 has been instructed. If it has not been instructed, the process returns to step S100. If it has been instructed, the process ends, and the computer is shut down as described above.

  As described above, according to the present embodiment, the operator can set the position and orientation calculation target marker and the reference marker among the markers detected from the image.

  In the above processing, a marker detected from an image is automatically set as a position and orientation calculation target in step S124, and the operator of the computer can further narrow down the position and orientation calculation targets from among them. That is, both semi-automatic and manual selection are possible, so a flexible and easy-to-use method can be provided.

  Further, in addition to the selection methods described above, the operator can be notified of the work necessary to execute calibration according to the marker definition information, as in the processes in steps S112, S121, S136, and S138, so the load of the setting work for alignment performed by the operator can be lightened.

  In step S112, which is the process for creating new marker definition information, the target flag of the marker to be added is set to 0; however, it may be set to 1. With this variation, the system can guide the operator to capture and calibrate a marker for which marker definition information has been created and to use it in the alignment processing. This variation is effective when the operator wants to manage the markers used for alignment based on the presence or absence of marker definition information.

  In step S124, which is the process at the time of image acquisition, the target flag of a marker detected from the image is changed to 1; however, it may be left unchanged. This variation is effective when the operator places importance on manually selecting the markers to be calibrated.

  In step S128, which is the process at the time of image deletion, the marker definition information of a marker that no longer exists in any image is deleted; however, only the detection flag of the marker definition information may be set to 0 instead. This variation is effective when the operator wants to keep managing the marker definition information of a marker once it has been detected.

  Which of the variations described above is used may be set in advance according to the preference of the operator.

[Second Embodiment]
In the first embodiment, the positions and orientations of all the markers detected from the images, or of the markers selected by the operator from among all the detected markers, are obtained. In the present embodiment, the operator images markers that are not to be subjected to position and orientation calculation, thereby actively determining the markers excluded from position and orientation calculation. Such an embodiment is effective when a plurality of devices that perform calibration and alignment processing are used in adjacent spaces and a marker used by another device should not be used in the calibration and alignment processing.

  The present embodiment is premised on the configuration according to the first embodiment, but the difference between the present embodiment and the first embodiment will be described in detail below. Accordingly, points not particularly mentioned in the following description are the same as those in the first embodiment.

  FIG. 15 is a diagram illustrating a configuration example of marker definition information according to the present embodiment. Also in the present embodiment, one piece of marker definition information exists for each marker as in the first embodiment; however, the marker definition information according to the present embodiment has a configuration in which an “exclusion flag” is added to the marker state of the marker definition information according to the first embodiment.

  The exclusion flag is a flag indicating whether or not the marker is excluded from the position and orientation calculation processing. If the flag is “1”, the marker is excluded; if it is “0”, the marker is not excluded. In this embodiment, the default value of the exclusion flag is “0”.
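The resulting marker definition information can be sketched as a record with the first embodiment's fields plus the exclusion flag. The field names and default values are assumptions for illustration; only the flag semantics come from the description above.

```python
# Sketch of the second embodiment's marker definition information:
# the first embodiment's fields plus the "exclusion flag" (default 0).
from dataclasses import dataclass

@dataclass
class MarkerDefinition:
    marker_id: int
    position: tuple = (0.0, 0.0, 0.0)
    orientation: tuple = (0.0, 0.0, 0.0)
    size: float = 40.0
    reference_flag: int = 0
    detection_flag: int = 0
    target_flag: int = 0
    completion_status: int = 0
    exclusion_flag: int = 0  # 1: excluded from position/orientation calculation
```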

  FIG. 16 is a diagram showing a display example of a GUI (Graphical User Interface) that operates on the computer and is used to obtain the positions and orientations of the indices placed in the real space. The GUI shown in FIG. 16 is obtained by adding a “mode” menu 240 to the GUI shown in FIG. 6. Further, in the area 230, as shown in the areas 236 in the rows of marker IDs = 7 and 10, an “×” icon reflecting the value of the exclusion flag is superimposed on the icon, which differs from the GUI of FIG. 6.

  When the “mode” menu 240 is designated in the GUI shown in FIG. 16, a mode menu as shown in FIG. 14 is displayed on the display screen of the display unit 3005. When the “acquisition mode” menu 242 is instructed in the mode menu shown in FIG. 14, the acquisition mode is set, and markers detected from images captured by the imaging apparatus become position and orientation calculation targets. That is, when the “acquisition mode” menu 242 has been instructed, the exclusion flag of a marker is set to “0” when the marker is identified from an image, and thereafter the same operation as in the first embodiment is performed.

  On the other hand, when the “exclusion mode” menu 244 is instructed, the exclusion mode is set, and markers detected from images captured by the imaging device are excluded from the position and orientation calculation targets. That is, when the “exclusion mode” menu 244 has been instructed, the exclusion flag of a marker is set to “1” when the marker is identified from an image.

  That is, for the marker detected from the image in the acquisition mode, the detection flag is set to 1 and the exclusion flag is set to 0. On the other hand, for the marker detected from the image in the exclusion mode, the detection flag is set to 1, and the exclusion flag is set to 1.
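The mode-dependent flag setting just described can be sketched as follows (the function name and entry layout are illustrative assumptions):

```python
# Sketch: in both modes a detected marker gets detection flag = 1;
# the exclusion flag becomes 1 only in exclusion mode (step S258),
# and 0 in acquisition mode.

def apply_mode(entry, mode):
    entry["detection_flag"] = 1
    entry["exclusion_flag"] = 1 if mode == "exclusion" else 0
    return entry
```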

  In the area 236, an icon corresponding to the value of the detection flag is displayed as in the first embodiment. In addition, in the present embodiment, an icon corresponding to the value of the exclusion flag is superimposed and displayed. In the figure, an “×” icon is displayed in the areas 236 corresponding to the markers with IDs = 7 and 10, indicating that the exclusion flags of these markers are “1”. When the exclusion flag = 0, the “×” icon is not displayed in the area 236. Of course, any icon may be displayed according to the value of the exclusion flag, and any information may be displayed as long as the value of the exclusion flag can be visually understood.

  FIGS. 17 and 18 are flowcharts of various processes performed using the GUI shown in FIG. 16. The process (steps S200, S201, S202, S208, and S246) performed when an instruction of the “create new marker definition file” menu 214 is detected after detecting an instruction of the “file” menu 210 in the GUI of FIG. 16 is the same as steps S100, S101, S102, S108, and S146 in FIG. 12.

  Also, the process (steps S205, S206, S207, S208, and S246) performed when an instruction of the “open marker definition file” menu 212 is detected after detecting an instruction of the “file” menu 210 in the GUI of FIG. 16 is the same as steps S105, S106, S107, S108, and S146 in FIG. 12, except that in step S207, in addition to the detection flag and the target flag, the exclusion flag is also initialized to 0.

  In addition, when an instruction of the “create new marker definition information” menu 222 is detected after detecting an instruction of the “edit” menu 220 in the GUI of FIG. 16, the process proceeds to step S212 via steps S200, S205, and S210. In step S212, one piece of new marker definition information is added to the currently opened marker definition file. As the marker ID of the added marker definition information, a marker ID different from the marker IDs used by the marker definition information existing in the same file is assigned. In addition, the reference marker flag, detection flag, target flag, exclusion flag, and completion status of the added marker definition information are all initialized to 0, and the marker size is set to that of the marker definition information existing in the same file. Note that when there is no other marker definition information, a default value is set as the marker size. Then, after performing the process in step S246, the process returns to step S200.

  Also, the processing (steps S214, S216, and S246) performed when an instruction of the “delete marker definition information” menu 224 is detected after an instruction of the “edit” menu 220 is detected in the GUI of FIG. 16 is the same as steps S114, S116, and S146 in FIG. 12.

  Also, since the processing (steps S218, S220, S221, S246, and S240) performed when an instruction of the “edit marker definition information” menu 226 is detected after an instruction of the “edit” menu 220 is detected in the GUI of FIG. 16 is the same as steps S118, S120, S121, and S146 in FIG. 12 and step S140, a description thereof is omitted.

  When an instruction of the button 284 is detected in the GUI of FIG. 16 in a state where the button 282 has already been instructed, that is, in a state where a moving image of the real space is being acquired into the external storage device 3006 via the I/F 3007, the process proceeds to step S250 via steps S200, S205, S210, S214, S218, and S222. In step S250, from the image acquired into the external storage device 3006 near the timing at which the button 284 was instructed, the markers appearing in this image are detected and their marker IDs are identified. In step S252, it is checked whether the mode currently set in the “mode” menu 240 is the exclusion mode or the acquisition mode. If it is the exclusion mode, the process proceeds to step S258; if it is the acquisition mode, the process proceeds to step S254.

  In step S258, if marker definition information having the same marker ID as a marker ID identified from the image exists among the marker definition information of the markers registered in the currently opened marker definition file, both the exclusion flag and the detection flag in this marker definition information are set to 1. That is, the fact that the marker detected from the image is excluded from the position and orientation calculation targets, and the fact that it has been detected from the image, are recorded in the marker definition information of this marker.

  On the other hand, if no marker definition information having the same marker ID as a marker ID identified from the image exists among the marker definition information registered in the currently opened marker definition file, marker definition information for the detected marker is newly created in step S258 and added to the currently opened marker definition file. In this case, the marker ID identified from the image is assigned as the marker ID, and both the exclusion flag and the detection flag are set to 1. Then, the process proceeds to step S260.

  In step S254, on the other hand, if marker definition information having the same marker ID as a marker ID identified from the image exists among the marker definition information registered in the currently opened marker definition file, both the target flag and the detection flag in this marker definition information are set to 1, and the exclusion flag is set to 0. That is, the fact that the marker detected from the image is a position and orientation calculation target, rather than being excluded from the calculation, is recorded in the marker definition information of this marker.

  Also in step S254, if no marker definition information having the same marker ID as a marker ID identified from the image exists among the marker definition information registered in the currently opened marker definition file, marker definition information for the detected marker is newly created and added to the currently opened marker definition file. In this case, the marker ID identified from the image is assigned as the marker ID, both the target flag and the detection flag are set to 1, and the exclusion flag is set to 0.
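The mode-dependent flag updates of steps S252, S254, and S258 can be sketched together as follows. This is a hedged sketch: the dict-based marker definition table, the field names, and the mode strings are illustrative assumptions.

```python
def process_captured_markers(defs, detected_ids, mode):
    """Sketch of steps S252/S254/S258: update flags for the markers
    identified in a captured image, depending on the current mode.

    In exclusion mode, detected markers get exclusion flag = 1; in
    acquisition mode, they become calculation targets (exclusion flag = 0).
    Unknown marker IDs get a newly created definition entry.
    """
    by_id = {d["marker_id"]: d for d in defs}
    for mid in detected_ids:
        d = by_id.get(mid)
        if d is None:
            # No definition with this ID yet: create one (the "add" case)
            d = {"marker_id": mid, "target": 0, "detected": 0, "excluded": 0}
            defs.append(d)
            by_id[mid] = d
        d["detected"] = 1
        if mode == "exclusion":        # S258: exclude from position/orientation calc
            d["excluded"] = 1
        else:                          # acquisition mode: S254
            d["target"] = 1
            d["excluded"] = 0
    return defs
```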

  Then, the process proceeds to step S260, and a thumbnail of the image acquired into the external storage device 3006 near the timing at which the button 284 was instructed is additionally displayed in the area 270. The process then proceeds to step S262, in which the display of the area 236 for each marker in the area 230 is updated. As described above, the “x” icon is displayed in the area 236 of a marker whose exclusion flag = 1, and is not displayed when the exclusion flag = 0. Then, after the process of step S246 is performed, the process returns to step S200.

  With the above processing, a marker detected from an image acquired in the acquisition mode can automatically be set as a calibration target, and a marker detected from an image acquired in the exclusion mode can automatically be excluded from the calibration.

  Further, when it is detected that the button 286 has been instructed in the GUI of FIG. 16, the process proceeds to step S270 via steps S200, S205, S210, S214, S218, S222, and S226. In step S270, markers are detected from the original image, stored in the external storage device 3006, of the thumbnail selected in the area 270, and the marker IDs of the detected markers are identified. Subsequently, it is checked whether each marker detected from this original image also appears in any other image acquired in the external storage device 3006. If it does, nothing is done; if it does not, the marker definition information for that marker is deleted.

  Then, the process proceeds to step S272, and the currently selected thumbnail is deleted from the area 270 and its original image is deleted from the external storage device 3006. Then, after the processes in steps S262 and S246 are performed, the process returns to step S200.
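The pruning that follows an image deletion (steps S270 and S272) can be sketched as follows. The helper name, the per-image representation as sets of marker IDs, and the dict-based table are illustrative assumptions, not the patent's implementation.

```python
def prune_after_image_delete(defs, deleted_image_ids, remaining_images):
    """Sketch of steps S270/S272: when an acquired image is deleted, drop
    the marker definitions of markers that no longer appear in any
    remaining image.

    `deleted_image_ids` is the set of marker IDs found in the deleted
    image; `remaining_images` is a list of per-image marker-ID sets.
    """
    still_visible = set().union(*remaining_images) if remaining_images else set()
    for mid in deleted_image_ids:
        if mid not in still_visible:
            # Marker no longer detected anywhere: remove its definition
            defs[:] = [d for d in defs if d["marker_id"] != mid]
    return defs
```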

  Through these processes, as images are deleted, the marker definition information of markers that are no longer detected from any acquired image can be automatically deleted from the marker definition information table and thereby excluded from the calibration targets. In addition, since the “x” icon is displayed for markers detected from images acquired in the exclusion mode, it is easy to see which markers are excluded.

  Also, the processing (steps S230, S232, and S246) performed when an instruction for the check box 232 is detected in the GUI of FIG. 16 is the same as steps S130, S132, and S146 in FIG. 12.

  On the other hand, when it is detected that the button 288 has been instructed in the GUI of FIG. 16, the process proceeds to step S236 via steps S200, S205, S210, S214, S218, S222, S226, S230, and S234. In step S236, the reference marker flag is first referred to in all marker definition information with the target flag = 1 among the marker definition information registered in the currently opened marker definition file, and it is checked whether two or more pieces of marker definition information with the reference marker flag = 1 exist. If there are two or more, the process proceeds to step S240. In step S240, a warning that calibration cannot be performed because there are two or more reference markers is displayed on the display screen of the display unit 3005. Then, the process proceeds to step S250. The processing after step S250 will be described later.

  On the other hand, if there is only one piece of marker definition information with the reference marker flag = 1, the process proceeds to step S238. In step S238, it is checked whether, among the marker definition information registered in the currently opened marker definition file, there exists marker definition information with the target flag = 1 and the detection flag = 0, or with the target flag = 1 and the exclusion flag = 1. If such information exists, the process proceeds to step S240; since the position and orientation of such a marker cannot be calculated, a warning to that effect is displayed, and the process proceeds to step S250. The processing after step S250 will be described later.
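The pre-calibration checks of steps S236 and S238 can be sketched as a single validation function. This is a hedch sketch under assumed field names; the warning strings are illustrative, not the patent's actual messages.

```python
def validate_before_calibration(defs):
    """Sketch of steps S236/S238: return a warning string when calibration
    cannot proceed, or None if all checks pass.

    Checks (over entries with target flag = 1):
      1. at most one reference marker (S236)
      2. no target marker that is undetected or excluded (S238)
    """
    targets = [d for d in defs if d["target"] == 1]
    n_ref = sum(1 for d in targets if d["reference_marker"] == 1)
    if n_ref >= 2:
        return "cannot calibrate: two or more reference markers"
    for d in targets:
        if d["detected"] == 0 or d["excluded"] == 1:
            return "cannot compute position/orientation for marker %d" % d["marker_id"]
    return None
```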

  By these processes, the likelihood of calibration failure can be reduced in advance. In particular, problems that occur when the target flag is changed manually can be avoided in advance.

  On the other hand, if neither marker definition information with the target flag = 1 and the detection flag = 0 nor marker definition information with the target flag = 1 and the exclusion flag = 1 exists among the marker definition information registered in the currently opened marker definition file, the process proceeds to step S242. In step S242, the position and orientation calculation process is performed for the markers corresponding to the marker definition information with the target flag = 1 among the marker definition information registered in the currently opened marker definition file. Then, the process proceeds to step S244; if the position and orientation have been obtained correctly as a result of the calculation, the value of the completion status in the marker definition information is set to “1”. If they could not be obtained correctly, the completion status value is set to “2”. Then, the process in step S246 is performed, after which the process returns to step S200.

  When the operator has executed the above processing an arbitrary number of times and has obtained the positions and orientations of all the desired markers, the operator instructs the “file” menu 210 in the GUI of FIG. 16 and then the “save marker definition file” menu 216. When this instruction is detected, the process proceeds to step S252 via steps S200, S205, S210, S214, S218, S222, S226, S230, S234, and S250. Even when the process has proceeded from step S240 to step S250, the process proceeds to step S252 if an instruction of the “save marker definition file” menu 216 is detected; if it is not detected, the process proceeds to step S254.

  In step S252, the currently opened marker definition file is saved in the external storage device 3006. Although a detailed description is omitted in this embodiment, if this marker definition file is read and the alignment process is performed using only the marker definition information with the completion status = 1 and the exclusion flag = 0, the alignment process can be performed without using the markers detected from the images acquired in the exclusion mode.
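The selection described above, using only markers with completion status = 1 and exclusion flag = 0 for alignment, can be sketched as follows (field names are illustrative assumptions).

```python
def markers_for_alignment(defs):
    """Sketch of the selection applied when the saved marker definition
    file is later used for alignment: keep only markers whose position and
    orientation were obtained correctly (completion status 1) and that are
    not excluded."""
    return [d for d in defs
            if d["completion_status"] == 1 and d["excluded"] == 0]
```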

  Then, the process proceeds to step S254. In step S254, it is checked whether the “end” menu 218 has been instructed. If it has not been instructed, the process returns to step S200; if it has, this process ends and the computer is shut down as described above.

  In the present embodiment, an example has been described in which markers are excluded from the calibration by shooting in a special mode called the exclusion mode. By shooting a marker to be excluded in the exclusion mode, a marker to be excluded from the calibration and alignment processes can be specified. Since the mode can be changed at any time, markers to be excluded from the calibration can easily be added by switching to the exclusion mode and shooting them.

  Similarly to the first embodiment, adjustments according to the preference and policy of the operator may be made by changing the values set in the marker state in each process. For example, by not resetting the exclusion flag to 0 in step S254, a marker once photographed in the exclusion mode may be kept out of the position and orientation calculation targets. This is advantageous when the markers to be excluded are photographed in advance and the markers subject to position and orientation calculation are photographed afterward.

  Further, it may be possible to set in advance which of these changes is made according to the preference of the operator.

[Third Embodiment]
In the present embodiment, when a plurality of markers having the same marker ID are arranged in the real space, this is detected and such markers are excluded from the calibration process. The present embodiment is premised on the configuration according to the first embodiment, and the differences between the present embodiment and the first embodiment are described in detail below. Accordingly, points not particularly mentioned in the following description are the same as in the first embodiment.

  First, the markers used in the present embodiment are the same as those in the first embodiment. In addition, it is assumed that marker duplication detection from the acquired image is realized in cooperation with the image management unit 106, the index management unit 107, the index extraction unit 109, and the storage unit 110.

  FIG. 19 is a diagram illustrating a configuration example of the marker definition information according to the present embodiment. As in the first embodiment, one piece of marker definition information exists for each marker; however, the marker definition information according to the present embodiment differs from that of the first embodiment in that a “duplicate flag” is added to the marker state.

  The duplicate flag is a flag that is set to “1” for a marker that has been determined to exist in plural in the real space, and is “0” for a marker for which no such duplication has been determined. In the present embodiment, the default value of the duplicate flag is “0”.

  FIG. 20 is a diagram showing a display example of a GUI (Graphical User Interface) that operates on the computer and is used for obtaining the positions and orientations of indices placed in the real space. The GUI shown in FIG. 20 has a region 237 newly added within the region 230 of the GUI according to the first embodiment. The region 237 is for displaying an icon indicating that a plurality of corresponding markers exist in the real space; in the figure, it has been detected that a plurality of markers with the marker ID = 2 exist in the real space. That is, an icon is displayed if the duplicate flag = 1, and no icon is displayed if the duplicate flag = 0. Of course, any icon may be displayed according to the value of the duplicate flag, and any form of display may be used as long as the value of the duplicate flag can be recognized visually.

  FIGS. 21 and 22 are flowcharts of the various processes performed using the GUI shown in FIG. 20. The processing (steps S300, S301, S302, S308, and S346) performed when an instruction of the “create new marker definition file” menu 214 is detected after an instruction of the “file” menu 210 is detected in the GUI of FIG. 20 is the same as steps S100, S101, S102, S108, and S146 in FIG. 12, and a description thereof is therefore omitted.

  Also, the processing (steps S305, S306, S307, S308, and S346) performed when an instruction of the “open marker definition file” menu 212 is detected after an instruction of the “file” menu 210 is detected in the GUI of FIG. 20 is the same as steps S105, S106, S107, S108, and S146 in FIG. 12, except that in step S307 the duplicate flag is also initialized to 0 in addition to the detection flag and the target flag.

  In addition, when an instruction of the “new creation of marker definition information” menu 222 is detected after an instruction of the “edit” menu 220 is detected in the GUI of FIG. 20, the process proceeds to step S312 via steps S300, S305, and S310. In step S312, one piece of new marker definition information is added to the currently opened marker definition file. To the added marker definition information, a marker ID different from any marker ID used by the marker definition information already existing in the same file is assigned. The reference marker flag, detection flag, target flag, duplicate flag, and completion status of the added marker definition information are all initialized to 0, and the marker size is set to the marker size of the marker definition information existing in the same file. Note that when no such marker definition information exists, a default value is set. Then, after the process in step S346 is performed, the process returns to step S300.

  Also, the processing (steps S314, S316, and S346) performed when an instruction of the “delete marker definition information” menu 224 is detected after an instruction of the “edit” menu 220 is detected in the GUI of FIG. 20 is the same as steps S114, S116, and S146 in FIG. 12.

  Also, since the processing (steps S318, S320, S321, S346, and S340) performed when an instruction of the “edit marker definition information” menu 226 is detected after an instruction of the “edit” menu 220 is detected in the GUI of FIG. 20 is the same as steps S118, S120, S121, and S146 in FIG. 12 and step S140, a description thereof is omitted.

  When an instruction of the button 284 is detected in the GUI shown in FIG. 20 in a state where the button 282 has already been instructed, that is, in a state where a moving image of the real space is being acquired into the external storage device 3006 via the I/F 3007, the process proceeds to step S323 via steps S300, S305, S310, S314, S318, and S322. In step S323, from the image acquired into the external storage device 3006 near the timing at which the button 284 was instructed, the markers appearing in this image are detected and their marker IDs are identified. If marker definition information having the same marker ID as a marker ID identified from the image exists among the marker definition information of the markers registered in the currently opened marker definition file, both the target flag and the detection flag in that marker definition information are set to 1. That is, the fact that the marker detected from the image has been detected and that it is a position and orientation calculation target is recorded in the marker definition information of this marker.

  On the other hand, in step S323, if no marker definition information having the same marker ID as a marker ID identified from the image exists among the marker definition information registered in the currently opened marker definition file, marker definition information for the detected marker is newly created and added to the currently opened marker definition file. In this case, the marker ID identified from the image is assigned as the marker ID, and both the target flag and the detection flag are set to 1.

  Next, in step S324, the marker IDs identified from the same image are compared with one another, and it is checked whether the same marker ID appears two or more times, that is, whether two or more markers having the same marker ID exist in one image. If two or more exist, the process proceeds to step S325, and the duplicate flag in the marker definition information having the duplicated marker ID is set to 1. Then, the process proceeds to step S326.
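The duplicate check of steps S324 and S325 can be sketched as follows. The helper name and the dict-based table are illustrative assumptions; only the logic (two or more occurrences of one ID in a single image set the duplicate flag) comes from the text.

```python
from collections import Counter

def flag_duplicates(defs, ids_in_one_image):
    """Sketch of steps S324/S325: if the same marker ID is identified two
    or more times within a single image, set the duplicate flag of the
    corresponding marker definition. Returns the set of duplicated IDs."""
    counts = Counter(ids_in_one_image)
    dup_ids = {mid for mid, n in counts.items() if n >= 2}
    for d in defs:
        if d["marker_id"] in dup_ids:
            d["duplicate"] = 1
    return dup_ids
```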

  On the other hand, if two or more do not exist as a result of this check, the process proceeds directly to step S326. In step S326, a thumbnail of the image acquired into the external storage device 3006 near the timing at which the button 284 was instructed is additionally displayed in the area 270. In step S330, the display of the area 237 for each marker in the area 230 is updated. As described above, an icon is displayed in the area 237 of a marker whose duplicate flag = 1, and no icon is displayed when the duplicate flag = 0. Then, after the process of step S346 is performed, the process returns to step S300.

  By these processes, markers to be calibrated can be set automatically and, at the same time, markers detected as duplicated can automatically be excluded from the calibration.

  When it is detected that the button 286 has been instructed in the GUI of FIG. 20, the process proceeds to step S328 via steps S300, S305, S310, S314, S318, S322, and S327. In step S328, markers are detected from the original image, stored in the external storage device 3006, of the thumbnail selected in the area 270, and the marker IDs of the detected markers are identified. Subsequently, it is checked whether each marker detected from this original image also appears in any other image acquired in the external storage device 3006. If it does, nothing is done; if it does not, the marker definition information for that marker is deleted.

  Then, the process proceeds to step S329, and the currently selected thumbnail is deleted from the area 270 and its original image is deleted from the external storage device 3006. Then, after the processes in steps S330 and S346 are performed, the process returns to step S300.

  Through these processes, as images are deleted, the marker definition information of markers that are no longer detected from any acquired image can be automatically deleted from the marker definition information table and thereby excluded from the calibration targets. In addition, since an icon is displayed for a duplicated marker, it is easy to see which markers are duplicated.

  Also, the processing (steps S332, S333, and S346) performed when an instruction for the check box 232 is detected in the GUI of FIG. 20 is the same as steps S130, S132, and S146 in FIG. 12.

  On the other hand, when it is detected that the button 288 is instructed in the GUI of FIG. 20, the process proceeds to step S336 via steps S300, S305, S310, S314, S318, S322, S327, S332, and S334.

  In step S336, the reference marker flag is first referred to in all marker definition information with the target flag = 1 among the marker definition information registered in the currently opened marker definition file, and it is checked whether two or more pieces of marker definition information with the reference marker flag = 1 exist. If there are two or more, the process proceeds to step S340.

  In step S340, a warning that calibration cannot be performed because two or more reference markers exist is displayed on the display screen of the display unit 3005. Then, the process proceeds to step S350. The processing after step S350 will be described later.

  On the other hand, if there is only one piece of marker definition information with the reference marker flag = 1, the process proceeds to step S338. In step S338, it is checked whether, among the marker definition information registered in the currently opened marker definition file, there exists marker definition information with the target flag = 1 and the detection flag = 0, or with the target flag = 1 and the duplicate flag = 1. If such information exists, the process proceeds to step S340; since the position and orientation of such a marker cannot be calculated, a warning to that effect is displayed, and the process proceeds to step S350. The processing after step S350 will be described later.

  By these processes, the likelihood of calibration failure can be reduced in advance. In particular, problems that occur when the target flag is changed manually can be avoided in advance.

  On the other hand, if neither marker definition information with the target flag = 1 and the detection flag = 0 nor marker definition information with the target flag = 1 and the duplicate flag = 1 exists among the marker definition information registered in the currently opened marker definition file, the process proceeds to step S342. In step S342, the position and orientation calculation process is performed for the markers corresponding to the marker definition information with the target flag = 1 among the marker definition information registered in the currently opened marker definition file. Then, the process proceeds to step S344; if the position and orientation have been obtained correctly as a result of the calculation, the value of the completion status in this marker definition information is set to “1”. If they could not be obtained correctly, the completion status value is set to “2”. Then, the process in step S346 is performed, after which the process returns to step S300.

  When the operator has executed the above processing an arbitrary number of times and has obtained the positions and orientations of all the desired markers, the operator instructs the “file” menu 210 in the GUI of FIG. 20 and then the “save marker definition file” menu 216. When this instruction is detected, the process proceeds to step S352 via steps S300, S305, S310, S314, S318, S322, S327, S332, S334, and S350. Even when the process has proceeded from step S340 to step S350, the process proceeds to step S352 if an instruction of the “save marker definition file” menu 216 is detected; if it is not detected, the process proceeds to step S354.

  In step S352, the currently opened marker definition file is saved in the external storage device 3006. Although a detailed description is omitted in the present embodiment, if this marker definition file is read and the alignment process is performed using the marker definition information with the completion status = 1, the alignment process can be performed using the markers for which the position and orientation have been calculated.

  Then, the process proceeds to step S354. In step S354, it is checked whether the “end” menu 218 has been instructed. If it has not been instructed, the process returns to step S300; if it has, this process ends and the computer is shut down as described above.

  In the present embodiment, an example has been described in which the presence of two or more markers having the same marker ID in the real space is detected and such markers are excluded from the calibration. A marker determined to be duplicated cannot be used unless its marker definition information is deleted, so the possibility of calibration failure due to duplication can be reduced.

  As in the first embodiment, the values set in the marker state in each process may be changed to make adjustments according to the operator's preference and policy. Further, it may be possible to set in advance which of these modifications is made according to the preference of the operator.

[Fourth Embodiment]
The present embodiment is an example in which a printing function is added to the configuration according to the first embodiment, and the fact that a marker has been printed is used as an element for selecting calibration targets. Such an embodiment is useful under a policy that regards the act of printing a marker as indicating the operator's intention to place and calibrate it.

  The present embodiment is premised on the configuration according to the first embodiment, but the difference between the present embodiment and the first embodiment will be described in detail below. Accordingly, points not particularly mentioned in the following description are the same as those in the first embodiment.

  FIG. 23 is a block diagram illustrating the functional configuration of the image processing apparatus according to the present embodiment. As shown in the figure, an image processing apparatus 2300 according to the present embodiment has a configuration in which an index generation unit 120 is added to the image processing apparatus 100 according to the first embodiment, and a printing unit 121 is connected to the index generation unit 120.

  The index generation unit 120 receives the marker definition information read from the storage unit 110 by the index management unit 107, and generates a marker image in the marker format used in the first embodiment. Print data is then generated based on the generated image and sent to the printing unit 121. The printing unit 121 receives the sent print data and records it on a recording medium such as paper. Accordingly, an apparatus having a printing function, such as a printer or a multifunction machine, is applied as the printing unit 121.

  In the image processing apparatus 2300, the operations performed by the units other than the index generation unit 120 are the same as those in the first embodiment, and a description thereof will be omitted.

  FIG. 24 is a diagram illustrating a configuration example of the marker definition information according to the present embodiment. As in the first embodiment, one piece of marker definition information exists for each marker; however, the marker definition information according to the present embodiment differs from that of the first embodiment in that a “print flag” is added to the marker state.

  The print flag is a flag indicating whether or not the marker has been printed by the printing unit 121. For a printed marker, the print flag is set to “1”; for a marker that has not yet been printed, it is set to “0”. The print flag is assumed to be set by the index generation unit 120; for example, it may be set to 1 when the print data of the marker is generated and sent to the printing unit 121. In the present embodiment, the default value of the print flag is “0”.
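The print-flag handling described above can be sketched as follows. This is a hedged sketch: `send_to_printer`, the `printer` callable, and the field names are illustrative assumptions, not the patent's interfaces.

```python
def send_to_printer(defs, marker_id, printer):
    """Sketch of the described behavior: the print flag is set to 1 when
    print data for a marker is generated and handed to the printing unit.

    `printer` is any callable accepting the generated print data.
    Returns True if a definition with the given ID was found and printed.
    """
    for d in defs:
        if d["marker_id"] == marker_id:
            data = "print-data for marker %d" % marker_id  # stand-in for real image data
            printer(data)
            d["printed"] = 1   # flag set as the data is sent, per the text
            return True
    return False
```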

  FIG. 25 is a diagram showing a display example of a GUI (Graphical User Interface) that operates on the computer and is used for obtaining the positions and orientations of indices placed in the real space. The GUI shown in FIG. 25 has a region 239 newly added to the region 230 of the GUI according to the first embodiment. The region 239 is for displaying an icon indicating that the corresponding marker has already been printed; in the figure, the marker with the marker ID = 2 has been printed. That is, an icon is displayed if the print flag = 1, and no icon is displayed if the print flag = 0. Of course, any icon may be displayed according to the value of the print flag, and any form of display may be used as long as the value of the print flag can be recognized visually.

  Here, when the “edit” menu 220 is instructed in the GUI shown in FIG. 25, the edit menu shown in FIG. 26 is displayed on the display screen of the display unit 3005. The edit menu shown in FIG. 26 is obtained by adding a “print marker” menu 228 to the edit menu according to the first embodiment. When this “print marker” menu 228 is instructed, a window as shown in FIG. 27 is displayed on the display screen of the display unit 3005.

  In the window of FIG. 27, reference numeral 510 denotes an area for inputting the marker ID of the marker to be printed. When the operator of the image processing apparatus 2300 inputs the marker ID of the marker to be printed in this area 510 and then instructs the OK button 520, print data of the marker having the instructed marker ID is generated by the index management unit 107 and the index generation unit 120, and the printing unit 121 performs a printing process according to the print data. Thereby, the marker having the instructed marker ID is printed on a recording medium such as paper.

  Note that the initial value displayed in advance in the area 510 is one of the marker IDs displayed in the area 230, but may be freely changed within the range of usable marker IDs.

  FIGS. 28 and 29 are flowcharts of the various processes performed using the GUI shown in FIG. 25. The processing (steps S400, S401, S402, S408, and S446) performed when an instruction of the “create new marker definition file” menu 214 is detected after an instruction of the “file” menu 210 is detected in the GUI of FIG. 25 is the same as steps S100, S101, S102, S108, and S146 in FIG. 12, and a description thereof is therefore omitted.

  In addition, the processing (steps S405, S406, S407, S408, and S446) performed when an instruction of the “open marker definition file” menu 212 is detected after an instruction of the “file” menu 210 is detected in the GUI of FIG. 25 is the same as steps S105, S106, S107, S108, and S146 in FIG. 12, except that in step S407 the print flag is also initialized to 0 in addition to the detection flag and the target flag.

  In addition, when an instruction of the “new creation of marker definition information” menu 222 is detected after an instruction of the “edit” menu 220 is detected in the GUI of FIG. 25, the process proceeds to step S412 via steps S400, S405, and S410. In step S412, one piece of new marker definition information is added to the currently opened marker definition file. To the added marker definition information, a marker ID different from any marker ID used by the marker definition information already existing in the same file is assigned. The reference marker flag, detection flag, target flag, print flag, and completion status of the added marker definition information are all initialized to 0, and the marker size is set to the marker size of the marker definition information existing in the same file. Note that when no such marker definition information exists, a default value is set. Then, after the process in step S446 is performed, the process returns to step S400.

  Also, the processing (steps S414, S416, and S446) performed when the instruction of the “delete marker definition information” menu 224 is detected after the instruction of the “edit” menu 220 is detected in the GUI of FIG. 25 is the same as steps S114, S116, and S146 in FIG. 12.

  In addition, the processing (steps S418, S420, S421, S446, and S440) performed when the instruction of the “edit marker definition information” menu 226 is detected after the instruction of the “edit” menu 220 is detected in the GUI of FIG. 25 is the same as steps S118, S120, S121, and S146 in FIG. 12 and step S140 in FIG. 13, so a description thereof is omitted.

  In the GUI shown in FIG. 25, if an instruction in the “edit marker” menu 228 is detected after an instruction in the “edit” menu 220 is detected, the process proceeds to step S462 via steps S400, S405, S410, S414, S418, and S460, and the window shown in FIG. 27 is displayed on the display screen of the display unit 3005. The operator of this computer inputs the marker ID of the marker to be printed in the area 510 in this window and then presses the OK button 520. In step S462, when the instruction of the OK button 520 is detected, the input marker ID is acquired, and it is checked whether the acquired marker ID is registered in the currently opened marker definition file (i.e., is displayed in the area 230).

  If, as a result of this check, the marker ID is not registered, the process proceeds to step S464, where marker definition information having the acquired marker ID is newly created and additionally registered in the currently opened marker definition file. The process in step S464 is performed in the same manner as the process in step S412. Then, the process proceeds to step S466.

  On the other hand, if the marker ID is registered as a result of the check in step S462, the process proceeds to step S466, and print data of the image of this marker is created based on the marker definition information having the marker ID input to the area 510. That is, a marker image representing the marker ID in the marker definition information is created according to the format shown in FIG. 1, and then print data of the image is created. The created print data is sent to the printing unit 121. In this embodiment, since the printing unit 121 is connected to the I/F 3007 of the computer, the print data is sent to the printing unit 121 via the I/F 3007. As a result, the marker having the marker ID input to the area 510 is printed on a recording medium such as paper by the printing unit 121.

  When the processing is shifted from step S464 to step S466, in step S466, print data of the marker image is created based on the newly created marker definition information. Then, the created print data is sent to the printing unit 121.

  Then, the process proceeds to step S468, and both the target flag and the print flag in the marker definition information of the printed marker are set to 1. Then, after performing the process in step S446, the process returns to step S400.
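The flow from the marker ID input (step S462) through printing (step S466) to the flag update (step S468) can be sketched as follows (illustrative; `send_to_printer` stands in for the printing unit 121, and the field names are assumptions, not the patent's own identifiers):

```python
# Sketch of steps S462-S468: look up (or create) the definition for the
# marker ID typed into area 510, print the marker, then mark it as both
# printed and a calibration target.

def print_marker(definitions, marker_id, send_to_printer):
    by_id = {d["marker_id"]: d for d in definitions}
    if marker_id not in by_id:                 # step S464: create if absent
        d = {"marker_id": marker_id, "target_flag": 0, "print_flag": 0}
        definitions.append(d)
        by_id[marker_id] = d
    send_to_printer(marker_id)                 # step S466: create/send print data
    by_id[marker_id]["target_flag"] = 1        # step S468: auto-add to targets
    by_id[marker_id]["print_flag"] = 1
    return by_id[marker_id]
```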

  By automatically including these printed markers in the calibration target, the marker print history can be used as an element for selecting the marker to be calibrated.

  Further, by automatically adding the marker definition information of the printed marker to the marker definition information table, it is possible to reduce the operator's work.

  In the GUI shown in FIG. 25, suppose that an instruction of the button 284 is detected in a state where the button 282 has already been instructed, that is, in a state where a moving image of the real space is being acquired in the external storage device 3006 via the I/F 3007. Then, the process proceeds to step S424 via steps S400, S405, S410, S414, S418, S460, and S422.

  In step S424, a marker appearing in the image acquired in the external storage device 3006 in the vicinity of the timing at which the button 284 was instructed is detected from that image, and its marker ID is identified. If, among the marker definition information of each marker registered in the currently opened marker definition file, there is marker definition information having the same marker ID as the marker ID identified from the image, the detection flag in that marker definition information is set to 1. That is, the fact that this marker has been detected from the image is recorded in the marker definition information of this marker among the marker definition information registered in the currently opened marker definition file.

  On the other hand, if in step S424 there is no marker definition information having the same marker ID as the marker ID identified from the image among the marker definition information registered in the currently opened marker definition file, marker definition information for the detected marker is created and added to the currently opened marker definition file. In this case, the marker ID identified from the image is assigned as the marker ID, and the detection flag is set to 1.
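The update of the marker definition information performed in step S424 can be sketched as follows (illustrative field names; only the flags relevant to this step are shown):

```python
# Sketch of step S424: for every marker ID identified in the acquired
# image, either set the detection flag of its existing definition or
# create a new definition on the fly with the detection flag already set.

def record_detections(definitions, detected_ids):
    by_id = {d["marker_id"]: d for d in definitions}
    for marker_id in detected_ids:
        if marker_id in by_id:
            by_id[marker_id]["detection_flag"] = 1
        else:
            d = {"marker_id": marker_id, "detection_flag": 1,
                 "target_flag": 0, "print_flag": 0}
            definitions.append(d)
            by_id[marker_id] = d
```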

  Then, the process proceeds to step S425, and the thumbnail of the image acquired in the external storage device 3006 in the vicinity of the timing at which the button 284 was instructed is additionally displayed in the area 270. Then, after the process in step S446 is performed, the process returns to step S400.

  On the other hand, when it is detected in the GUI of FIG. 25 that the button 286 has been designated, the process proceeds to step S428 via steps S400, S405, S410, S414, S418, S460, S422, and S427.

  In step S428, markers are detected both from the original image of the thumbnail selected in the area 270 and from the other images acquired in the external storage device 3006, and the marker IDs of the detected markers are identified. Subsequently, it is checked whether each marker detected from the original image of the thumbnail selected in the area 270 also exists in some other image acquired in the external storage device 3006. If it exists, nothing is done; otherwise, the marker definition information for that marker is deleted.

  Then, the process proceeds to step S429, where the currently selected thumbnail is deleted from the area 270 and its original image is deleted from the external storage device 3006. Then, after the process in step S446 is performed, the process returns to step S400.

  With these processes, the marker definition information of a marker that is no longer detected from any acquired image as a result of the image deletion can be automatically deleted from the marker definition information table and excluded from the calibration target.

  Also, the processing (steps S432, S433, and S446) performed when an instruction for the check box 232 is detected in the GUI of FIG. 25 is the same as steps S130, S132, and S146 in FIG.

  On the other hand, when it is detected that the button 288 is instructed in the GUI of FIG. 25, the process proceeds to step S436 via steps S400, S405, S410, S414, S418, S460, S422, S427, S432, and S434.

  In step S436, the reference marker flag in the marker definition information with the target flag = 1 is first referred to among all the marker definition information registered in the currently opened marker definition file, and it is checked whether two or more pieces of marker definition information with the reference marker flag = 1 exist. If there are two or more, the process proceeds to step S440.

  In step S440, a warning that calibration cannot be performed because there are two or more reference markers is displayed on the display screen of the display unit 3005. Then, the process proceeds to step S450. The processing after step S450 will be described later.

  On the other hand, if there is only one marker definition information with the reference marker flag = 1, the process proceeds to step S438. In step S438, it is checked whether there is marker definition information in which the target flag = 1 and the detection flag = 0 among the marker definition information registered in the currently opened marker definition file. If it exists, the process proceeds to step S440, and since the position and orientation of such a marker cannot be calculated, a warning indicating that fact is displayed, and the process proceeds to step S450. The processing after step S450 will be described later.

  This process avoids mistakes in which undetected markers are subject to calibration, and can prompt the operator to acquire an image for detection.

  On the other hand, in the marker definition information registered in the marker definition file that is currently open, if there is no marker definition information with the target flag = 1 and the detection flag = 0, the process proceeds to step S439.

  In step S439, it is checked whether or not there is marker definition information with the target flag = 0 and the print flag = 1 among the marker definition information registered in the marker definition file that is currently open. As a result of this check, if it exists, the process proceeds to step S441.

  In step S441, it is determined that it is unnatural for a marker to be excluded from calibration despite having been printed, and a warning to that effect, together with a prompt asking whether to suspend or continue the calibration processing, is displayed on the display screen of the display unit 3005. This process makes it possible to avoid the mistake of the operator manually excluding a marker from the calibration target despite its having been printed and placed, and to prompt the operator to calibrate the printed marker.

  If an instruction to interrupt the calculation is input as a result of the operation on the GUI, the process proceeds to step S450. On the other hand, if an instruction to continue the calculation is input, the process proceeds to step S442.

  On the other hand, if, as a result of the check in step S439, the marker definition information registered in the currently opened marker definition file does not include marker definition information with the target flag = 0 and the print flag = 1, the process proceeds to step S442.
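The three checks of steps S436, S438, and S439 can be summarized in a sketch like the following (illustrative; the returned warning strings merely stand in for the warnings displayed in steps S440 and S441, and the field names are assumptions):

```python
# Sketch of the pre-calibration checks in steps S436, S438, and S439.
# Returns a warning string, or None when calibration may proceed.

def check_before_calibration(definitions):
    targets = [d for d in definitions if d["target_flag"] == 1]
    # Step S436: more than one reference marker among the targets is an error.
    if sum(d["reference_flag"] for d in targets) >= 2:
        return "two or more reference markers"            # -> step S440
    # Step S438: every calibration target must have been detected in an image.
    if any(d["detection_flag"] == 0 for d in targets):
        return "undetected marker is a calibration target"  # -> step S440
    # Step S439: a printed marker excluded from calibration is suspicious.
    if any(d["print_flag"] == 1 and d["target_flag"] == 0 for d in definitions):
        return "printed marker excluded from calibration"   # -> step S441
    return None
```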

  In step S442, marker position and orientation calculation processing is performed for the marker definition information with the target flag = 1 among the marker definition information registered in the currently opened marker definition file. Then, the process proceeds to step S444. When the position and orientation are obtained correctly as a result of the calculation processing, the value of the completion status in the marker definition information is set to “1”; if they cannot be obtained correctly, the completion status value is set to “2”. Then, the process in step S446 is performed, and the process returns to step S400.

  When the above processing has been executed an arbitrary number of times and the positions and orientations of all the markers the operator wants have been obtained, the operator instructs the “save marker definition file” menu 216 after instructing the “file” menu 210 in the GUI of FIG. 25. When this instruction is detected, the process proceeds to step S452 via steps S400, S405, S410, S414, S418, S460, S422, S427, S432, S434, and S450. Even when the process proceeds from step S440 or step S441 to step S450, if the instruction of the “save marker definition file” menu 216 is detected, the process proceeds to step S452; if it is not detected, the process proceeds to step S454.

  In step S452, the currently opened marker definition file is saved in the external storage device 3006. Although a detailed description is omitted in the present embodiment, if this marker definition file is read and alignment processing is performed using the marker definition information with the completion status = 1, alignment processing can be performed using the markers whose positions and orientations the operator selected and calculated.

  Then, the process proceeds to step S454, where it is checked whether the “end” menu 218 has been instructed. If it has not been instructed, the process returns to step S400. If it has been instructed, this processing ends and, as described above, the computer is shut down.

  In the present embodiment, an example has been described in which the marker printing function is added to the first embodiment and the fact that a marker has been printed is used as an element for selecting the calibration target. With such a configuration, when some of the printed markers are not detected from the images, a warning can be issued to guide the operator so that the printed markers can be calibrated.

  Further, when it is desired to exclude the printed marker from the calibration target, it can be realized by a simple operation on the calibration target check box 232.

  Combining such manual selection by the operator with the automatic addition of a printed marker to the calibration target at the time of marker printing (the processing in step S468) makes it possible to provide a flexible and easy-to-use selection method.

  Further, in addition to the selection methods described above, by notifying the operator of the work necessary for executing calibration according to the marker definition information, as in the processes of steps S436, S438, and S439, it has become possible to reduce the burden of the setting work for alignment performed by the operator.

  In step S407, which is a process of opening the marker definition file, the print flag of the marker definition information to be read is set to 0. However, the print history may be saved by setting this to 1. As a result, the operator can manage the print history with the marker definition file.

  In step S468, which is a marker printing process, the target flag of the printed marker is set to 1, but this may be set to 0, and steps S439 and S441 may be deleted from the processing flow. This is effective when the operator confirms the print flag and operates with a policy of manually selecting a calibration target.

  Further, the target flag of the detected marker may be set to 1 in step S424 which is an image acquisition process. This is effective when the detected marker is also a calibration target.

  In step S424, which is an image acquisition process, the print flag of the detected marker may be set to 1. This is effective when it is determined that all detected markers are printed markers installed in the space.

  In step S424, which is an image acquisition process, the target flag of the detected marker may be set to 1, and the print flag may be set to 1. This is effective when it is determined that all the detected markers are printed markers installed in the space and the printed markers are to be calibrated.

  In addition, the marker ID input to the area 510 in the window shown in FIG. 27 is not limited to one. For example, a plurality of marker IDs separated by commas, or a range of marker IDs connected by a hyphen, may be input. This makes it possible to reduce the operator's operations.
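A possible parser for such an input can be sketched as follows (illustrative; the comma/hyphen syntax is the one suggested above, but the parsing details are an assumption):

```python
# Sketch of parsing the area-510 input when multiple marker IDs are
# allowed: comma-separated values and hyphenated ranges, e.g. "1,3,5-7".

def parse_marker_ids(text):
    ids = set()
    for part in text.split(","):
        part = part.strip()
        if "-" in part:                       # a hyphenated range "lo-hi"
            lo, hi = part.split("-", 1)
            ids.update(range(int(lo), int(hi) + 1))
        elif part:                            # a single marker ID
            ids.add(int(part))
    return sorted(ids)
```

For example, `parse_marker_ids("1,3,5-7")` yields the five IDs 1, 3, 5, 6, and 7, so a batch of markers can be printed with one input.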

  In addition, processing for automatically creating new marker definition information and printing markers for the number of markers specified by the operator may be added. Such processing reduces the operator's work when a large number of new markers are to be printed and used.

  In this embodiment, only the two states printed and not printed are managed. However, the number of times of printing and the printing time may be added to the marker definition information, and the processing may be changed depending on the printing history. For example, a marker printed a plurality of times may be determined to have a high possibility of being duplicated, and may be excluded from the calibration target.

  Needless to say, the modifications described above may be combined.

  For example, setting the print flag of a marker detected at the time of image acquisition to 1 may be combined with processing that automatically prints a specified number of markers whose marker IDs do not exist in the marker definition information table.

  When processing is performed in this way, images of all the markers already installed in the real space are acquired in advance before printing, so that the marker IDs in use are determined and excluded from printing. A specified number of markers other than the excluded markers can then be generated automatically.
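The exclusive ID allocation described here can be sketched as follows (illustrative; the ID space is assumed to be the non-negative integers):

```python
# Sketch of auto-allocating marker IDs that do not collide with the IDs
# already detected in the space (or already present in the definition table).

def allocate_new_ids(existing_ids, count):
    """Return `count` marker IDs not present in `existing_ids`."""
    ids, candidate = [], 0
    taken = set(existing_ids)
    while len(ids) < count:
        if candidate not in taken:
            ids.append(candidate)
        candidate += 1
    return ids
```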

  This is effective when a plurality of systems that perform calibration and alignment processing are used in adjacent spaces and duplicate markers must not be used (markers are used exclusively).

  It may also be possible to set in advance which of the modifications described above is to be applied, according to the operator's preference.

[Fifth Embodiment]
In the first to fourth embodiments, all the markers present in the image are handled equally. However, the present invention is not limited to this; how reliable a marker in the image is as data for calibration may be obtained as a reliability, and the calibration target and processing may be changed according to (an evaluation of) the reliability.
As examples of marker reliability, the following may be used, alone or in combination:
(1) the appearance frequency of the index in the acquired image group;
(2) the angle formed by the index in the acquired image and the image acquisition device;
(3) the area of the index on the acquired image;
(4) the contrast of the acquired image;
(5) the position of the index on the acquired image.

  (1) is used under the policy that an index whose appearance frequency is extremely low is regarded as noise from misrecognition of something other than a marker, and its reliability is therefore considered low.

  (2) is used under the policy that the reliability of a marker with a small angle is low, because the accuracy of marker position and orientation calculation by image recognition decreases as the angle decreases.

  (3) is used under the policy that the reliability of a marker with a small area is low, because the marker position and orientation calculation accuracy decreases as the area of the index on the image decreases.

  (4) is used under the policy that a marker detected from an image with lower contrast has lower reliability, because the marker detection error from the image increases as the contrast decreases.

  (5) is used under the policy that an index positioned at the edge of the image is less reliable, because positions nearer the edge of the image are more susceptible to the optical distortion of the image acquisition device.

  Needless to say, the reliability may be determined by combining the above examples. Moreover, the conditions for selecting the markers to be subjected to calibration and alignment processing are not limited to those described above, and any incidental information regarding the markers may be used.
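One possible way to combine criteria (1) to (5) into a single score is sketched below (the normalization constants and the equal weighting are illustrative assumptions, not taken from the embodiment):

```python
# Sketch of a combined reliability score in [0, 1] from the five criteria.
import math

def marker_reliability(frequency, angle_deg, area_px, contrast, dist_from_center):
    scores = [
        min(frequency / 10.0, 1.0),          # (1) appearance frequency
        math.sin(math.radians(angle_deg)),   # (2) angle to the image plane
        min(area_px / 1000.0, 1.0),          # (3) area on the image
        contrast,                            # (4) contrast, assumed in [0, 1]
        1.0 - dist_from_center,              # (5) closeness to the image center
    ]
    return sum(scores) / len(scores)
```

A threshold on this score would then decide whether a marker is included in the calibration and alignment processing.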

  According to each of the above embodiments, the following effects can be obtained. That is, an arbitrary marker or a duplicated marker can be excluded from the calibration and alignment processing, and the series of operations for adding it back to the processing is facilitated. In addition, it is not necessary to re-photograph everything except the excluded markers or to remove the excluded markers from the space, so the man-hours of the setting work for calibration and alignment processing can be reduced.

  As a selection method, a manual method by the operator and a semi-automatic method based on the marker definition information held by the image processing apparatus can be used together, providing a flexible and easy-to-use method.

  Furthermore, in addition to the selection methods described above, the operator can be notified of the work required for the image processing apparatus to execute calibration according to the marker definition information, thereby reducing the burden of the operator's setting work.

[Sixth Embodiment]
In each of the above embodiments, a marker is used as an example of an index. However, the index used for alignment is not limited to the marker, and may be a natural feature, for example.

  FIG. 31 is a diagram illustrating an example of an image obtained when a house as a real object is photographed. For example, when an image of a real space as shown in FIG. 31 is captured, corners in the image (the three corners indicated by “x” in FIG. 31) can be used as natural features (feature points). SIFT features in the image may also be used as feature points.

  When natural features are used, as an application of the second and third embodiments, the duplication flag of a natural feature is set to “1” when it is determined that many similar natural features exist in the real space (when their number is larger than a predetermined number). Such feature points are then excluded from calibration and alignment processing.

  FIGS. 32A to 32C are diagrams illustrating an example of images obtained by photographing the real space when many similar natural features exist in it. In the images shown in FIGS. 32A to 32C, the natural feature represented by “x” occurs frequently. For example, the duplication flag is set to “1” for this natural feature, and it is removed.

  FIGS. 33A to 33C show the images obtained when the natural features whose duplication flag is set to “1” are removed from the natural features shown in FIGS. 32A to 32C, respectively.

  As a result, it is possible to perform calibration and alignment processing on the object excluding similar natural features.

  As an application of the fifth embodiment, the number of similar features may be used as the reliability. For example, 1 / (number of similar features) may be obtained for all features existing in one image or all features existing in all images, and this may be used as the reliability. In this case, when the value is “1”, it is determined that the reliability is the highest, and the closer the value is to 0, the lower the reliability is. Then, a reliability threshold value is set, and features whose reliability is lower than the set threshold value are excluded from calibration and alignment processing.
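The 1/(number of similar features) reliability with a threshold can be sketched as follows (illustrative; feature "labels" stand in for whatever similarity grouping the feature matcher produces):

```python
# Sketch of the similar-feature reliability: each feature's reliability is
# 1 / (number of similar features); features below the threshold are removed.
from collections import Counter

def filter_features(feature_labels, threshold=0.5):
    """Keep features whose reliability 1/count is at least `threshold`."""
    counts = Counter(feature_labels)
    return [f for f in feature_labels if 1.0 / counts[f] >= threshold]
```

A unique feature has reliability 1 (the highest), while a feature that appears three times has reliability 1/3 and, with the default threshold of 0.5, is excluded from calibration and alignment processing.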

[Other Embodiments]
Needless to say, the object of the present invention can be achieved as follows. That is, a recording medium (or storage medium) in which a program code of software that realizes the functions of the above-described embodiments is recorded is supplied to the system or apparatus. Then, the computer (or CPU or MPU) of the system or apparatus reads and executes the program code stored in the recording medium. In this case, the program code itself read from the recording medium realizes the functions of the above-described embodiment, and the recording medium on which the program code is recorded constitutes the present invention.

  Needless to say, the present invention also includes the case where an operating system (OS) or the like running on the computer performs part or all of the actual processing based on the instructions of the program code read by the computer, and the functions of the above-described embodiments are realized by that processing.

  Furthermore, needless to say, the present invention also includes the case where the program code read from the recording medium is written in a memory provided in a function expansion card inserted into the computer or a function expansion unit connected to the computer, and then, based on the instructions of the program code, the CPU provided in the function expansion card or function expansion unit performs part or all of the actual processing, and the functions of the above-described embodiments are realized by that processing.

  When the present invention is applied to the recording medium, program code corresponding to the flowchart described above is stored in the recording medium.

[Brief Description of the Drawings]
FIG. 1 is a diagram showing a configuration example (format example) of a marker used in the first embodiment of the present invention.
FIG. 2 is a diagram showing an example of an actual marker produced according to the format shown in FIG. 1.
FIG. 3 is a diagram showing a marker coordinate system.
FIG. 4 is a block diagram showing the functional configuration of the image processing apparatus according to the first embodiment of the present invention.
FIG. 5 is a diagram showing a configuration example of a table (marker definition information table) in which marker definition information is registered for each marker.
FIG. 6 is a diagram showing a display example of a GUI (graphical user interface) that operates on a computer and is used to obtain the positions and orientations of the indices arranged in the real space.
FIG. 7 is a diagram showing a display example of a file menu.
FIG. 8 is a diagram showing a display example of an edit menu.
FIG. 9 is a diagram showing a display example of a window.
FIG. 10 is a diagram showing a display example of a window.
FIG. 11 is a diagram showing a display example of a window.
FIG. 12 is a flowchart of various processes performed using the GUI shown in FIG. 6.
FIG. 13 is a flowchart of various processes performed using the GUI shown in FIG. 6.
FIG. 14 is a diagram showing a display example of a mode menu.
FIG. 15 is a diagram showing a configuration example of the marker definition information according to the second embodiment of the present invention.
FIG. 16 is a diagram showing a display example of the GUI used in the second embodiment.
FIG. 17 is a flowchart of various processes performed using the GUI shown in FIG. 16.
FIG. 18 is a flowchart of various processes performed using the GUI shown in FIG. 16.
FIG. 19 is a diagram showing a configuration example of the marker definition information according to the third embodiment of the present invention.
FIG. 20 is a diagram showing a display example of the GUI used in the third embodiment.
FIG. 21 is a flowchart of various processes performed using the GUI shown in FIG. 20.
FIG. 22 is a flowchart of various processes performed using the GUI shown in FIG. 20.
FIG. 23 is a block diagram showing the functional configuration of the image processing apparatus according to the fourth embodiment of the present invention.
FIG. 24 is a diagram showing a configuration example of the marker definition information according to the fourth embodiment of the present invention.
FIG. 25 is a diagram showing a display example of the GUI used in the fourth embodiment.
FIG. 26 is a diagram showing a display example of an edit menu.
FIG. 27 is a diagram showing a display example of a window.
FIG. 28 is a flowchart of various processes performed using the GUI shown in FIG. 25.
FIG. 29 is a flowchart of various processes performed using the GUI shown in FIG. 25.
FIG. 30 is a block diagram showing a hardware configuration of a computer applicable to the image processing apparatus 100.
FIG. 31 is a diagram showing an example of an image obtained when a house as a real object is photographed.
FIGS. 32A to 32C are diagrams showing an example of images obtained by photographing a real space in which many similar natural features exist.
FIGS. 33A to 33C are diagrams showing the images obtained when the natural features whose duplication flag is set to “1” are removed from the natural features shown in FIGS. 32A to 32C, respectively.

Claims (13)

  1. An image processing method comprising:
    an acquisition step of acquiring an image of a real space in which a plurality of markers having identifiable patterns are arranged;
    an identification step of identifying the ID of each marker from the pattern of the marker in the image acquired in the acquisition step, and registering the identified ID in a predetermined file;
    a detection step of comparing the IDs identified in the identification step and registered in the predetermined file, and detecting a plurality of markers having the same identified ID as overlappingly arranged markers; and
    a calculation step of calculating a position and/or orientation in the real space of markers other than the overlappingly arranged markers detected in the detection step, among the markers identified in the identification step.
  2. The image processing method according to claim 1, further comprising:
    a selection step of selecting, from among the markers identified in the identification step, a marker to be a calculation target in the calculation step,
    wherein, in the calculation step, a position and/or orientation in the real space of the marker selected in the selection step is calculated.
  3. The image processing method according to claim 2, wherein:
    in the identification step, first information indicating that the marker has been identified and second information indicating whether or not the marker is to be a calculation target in the calculation step are recorded; and
    in the selection step, whether or not the marker identified in the identification step is to be a calculation target in the calculation step is selected by changing the recorded second information.
  4. The image processing method according to claim 1, further comprising:
    a setting step of setting either a first mode in which the marker identified in the identification step is a calculation target in the calculation step or a second mode in which it is not a calculation target,
    wherein, in the calculation step, a position and/or orientation in the real space is calculated for the marker set as a calculation target.
  5. The image processing method according to claim 1, further comprising:
    a creation step of creating print data of the marker and sending it to a printing device,
    wherein, in the calculation step, among the markers identified in the identification step, a position and/or orientation in the real space is calculated for the markers whose print data has been created in the creation step, excluding the markers whose print data has not been created.
  6. The image processing method according to claim 1, further comprising:
    an evaluation step of evaluating the reliability of the markers identified in the identification step,
    wherein, in the calculation step, a position and/or orientation in the real space is calculated for the markers evaluated as having high reliability in the evaluation step, excluding the markers evaluated as having low reliability.
  7. The image processing method according to claim 6, wherein, in the evaluation step, the reliability of the marker is evaluated using any one of: the appearance frequency of the marker in the images acquired in the acquisition step; the angle formed between the marker and the image; the area of the marker in the image; the position of the marker in the image; and the contrast between the marker and other parts of the image.
  8. The image processing method according to any one of claims 1 to 7, wherein a pattern is recorded on the marker so that the marker can be identified, and the pattern is configured such that a coordinate system based on the marker can be defined.
  9. The image processing method according to claim 1 , wherein in the identification step, the identified ID is registered in accordance with the position of the detected marker.
  10. The image processing method according to any one of claims 1 to 9, further comprising a step of aligning an image of the real space and an image of a virtual space using the position and/or orientation in the real space of the marker calculated in the calculation step.
  11. An image processing apparatus comprising:
    acquisition means for acquiring an image of a real space in which a plurality of markers having identifiable patterns are arranged;
    identification means for identifying the ID of each marker from the pattern of the marker in the image acquired by the acquisition means, and registering the identified ID in a predetermined file;
    detection means for comparing the IDs identified by the identification means and registered in the predetermined file, and detecting a plurality of markers having the same identified ID as overlappingly arranged markers; and
    calculation means for calculating a position and/or orientation in the real space of markers other than the overlappingly arranged markers detected by the detection means, among the markers identified by the identification means.
  12.   A computer program for causing a computer to function as each of the means of the image processing apparatus according to claim 11.
  13.   A computer-readable storage medium storing the computer program according to claim 12.
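The reliability evaluation of claims 6 and 7 can be sketched as follows. This is an illustrative sketch only: the scoring function, the normalization constants, and the marker data layout are assumptions, not taken from the patent, which names only the cues (appearance frequency, angle, area, position, contrast) without specifying how they are combined.

```python
import math

def marker_reliability(area_px, tilt_deg, contrast):
    """Score one detected marker from three of the cues named in claim 7:
    its projected area in the image, the angle between the marker plane and
    the image plane, and its contrast against the surrounding image.
    The constants and the product weighting are illustrative assumptions."""
    area_term = min(area_px / 2000.0, 1.0)                   # larger projections score higher
    angle_term = max(math.cos(math.radians(tilt_deg)), 0.0)  # frontal views score higher
    contrast_term = min(max(contrast, 0.0), 1.0)             # clamp contrast to [0, 1]
    return area_term * angle_term * contrast_term

def reliable_markers(markers, threshold=0.3):
    """Keep only markers whose score clears the threshold, mirroring the
    exclusion of low-reliability markers in claim 6."""
    return [m for m in markers
            if marker_reliability(m["area"], m["tilt"], m["contrast"]) >= threshold]
```

In this sketch, only the markers returned by `reliable_markers` would feed the position/orientation calculation of the calculation step.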
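The duplicate-ID detection of claim 11 (detecting overlappingly arranged markers and excluding them from the pose calculation) can be sketched as a simple partition of one frame's detections. The `(marker_id, image_xy)` pair layout is an assumption for illustration; the patent itself registers IDs in a predetermined file rather than an in-memory list.

```python
from collections import Counter

def split_by_duplicate_id(detections):
    """Partition a frame's detections into usable markers and overlapping
    (duplicate-ID) markers. `detections` is a list of (marker_id, image_xy)
    pairs. Returns (usable, overlapping)."""
    counts = Counter(marker_id for marker_id, _ in detections)
    duplicate_ids = {marker_id for marker_id, n in counts.items() if n > 1}
    # Markers whose ID appears exactly once may enter the pose calculation;
    # markers sharing an ID are excluded, as in the claimed calculation means.
    usable = [d for d in detections if d[0] not in duplicate_ids]
    overlapping = [d for d in detections if d[0] in duplicate_ids]
    return usable, overlapping
```

Only `usable` would feed the position/orientation calculation; `overlapping` can be reported so that the physically duplicated markers are found and replaced.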
JP2007038430A 2006-08-11 2007-02-19 Image processing method and image processing apparatus Active JP5517397B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2006220638 2006-08-11
JP2006220638 2006-08-11
JP2007038430A JP5517397B2 (en) 2006-08-11 2007-02-19 Image processing method and image processing apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2007038430A JP5517397B2 (en) 2006-08-11 2007-02-19 Image processing method and image processing apparatus
US11/833,396 US8103127B2 (en) 2006-08-11 2007-08-03 Image processing method and image processing apparatus of calculating position and orientation of target objects located in image

Publications (3)

Publication Number Publication Date
JP2008064735A JP2008064735A (en) 2008-03-21
JP2008064735A5 JP2008064735A5 (en) 2010-03-11
JP5517397B2 true JP5517397B2 (en) 2014-06-11

Family

ID=39050878

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2007038430A Active JP5517397B2 (en) 2006-08-11 2007-02-19 Image processing method and image processing apparatus

Country Status (2)

Country Link
US (1) US8103127B2 (en)
JP (1) JP5517397B2 (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4249649B2 (en) * 2004-04-09 2009-04-02 ホリゾン・インターナショナル株式会社 Vertical collating machine
JP5403861B2 (en) * 2006-11-06 2014-01-29 キヤノン株式会社 Information processing apparatus and information processing method
JP5132138B2 (en) * 2006-11-28 2013-01-30 キヤノン株式会社 Position and orientation measurement method, position and orientation measurement device
KR101457404B1 (en) * 2008-06-13 2014-11-06 삼성전자주식회사 Electronic picture frame and image display method thereof
JP5156571B2 (en) 2008-10-10 2013-03-06 キヤノン株式会社 Image processing apparatus and image processing method
JP5695821B2 (en) * 2009-08-31 2015-04-08 株式会社トプコン Color code target, color code discrimination device, and color code discrimination method
US8271895B2 (en) * 2010-03-22 2012-09-18 Mitutoyo Corporation GUI for programming step and repeat operations in a machine vision inspection system
KR20130056529A (en) * 2011-11-22 2013-05-30 삼성전자주식회사 Apparatus and method for providing augmented reality service in portable terminal
JP5728372B2 (en) * 2011-11-30 2015-06-03 キヤノン株式会社 Information processing apparatus, information processing apparatus control method, and program
JP5984181B2 (en) * 2012-09-11 2016-09-06 ポールトゥウィン・ピットクルーホールディングス株式会社 Motion capture device and program thereof
JP6368142B2 (en) * 2014-05-14 2018-08-01 キヤノン株式会社 Information processing apparatus and information processing method
JP6413521B2 (en) * 2014-09-08 2018-10-31 富士通株式会社 Display control method, information processing program, and information processing apparatus
JP6448457B2 (en) * 2015-04-30 2019-01-09 三菱電機株式会社 Imaging direction variation detection apparatus and imaging direction variation detection method
US10029788B2 (en) 2016-03-28 2018-07-24 Zipline International Inc. Vision based calibration system for unmanned aerial vehicles

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001506820A (en) * 1996-11-27 2001-05-22 プリンストン ビデオ イメージ,インコーポレイテッド Motion tracking using the image texture template
US7016539B1 (en) * 1998-07-13 2006-03-21 Cognex Corporation Method for fast, robust, multi-dimensional pattern recognition
US6839466B2 (en) * 1999-10-04 2005-01-04 Xerox Corporation Detecting overlapping images in an automatic image segmentation device with the presence of severe bleeding
JP4282216B2 (en) * 2000-09-19 2009-06-17 オリンパス株式会社 3D position and orientation sensing device
JP2003254716A (en) * 2002-03-04 2003-09-10 Sony Corp Instrument and method for measuring three-dimensional position and posture, storage medium, and computer program
US7505048B2 (en) * 2003-04-25 2009-03-17 Microsoft Corporation Estimation of overlap of polygons
JP2005031044A (en) * 2003-07-11 2005-02-03 Olympus Corp Three-dimensional error measuring device
JP2005147894A (en) * 2003-11-17 2005-06-09 Canon Inc Measuring method and measuring instrument
JP2006172016A (en) * 2004-12-14 2006-06-29 Toshiba Corp Mobile robot, mobile robot control method and mobile robot control program
US7460730B2 (en) * 2005-08-04 2008-12-02 Microsoft Corporation Video registration and image sequence stitching

Also Published As

Publication number Publication date
US20080037901A1 (en) 2008-02-14
US8103127B2 (en) 2012-01-24
JP2008064735A (en) 2008-03-21

Similar Documents

Publication Publication Date Title
US7698094B2 (en) Position and orientation measurement method and apparatus
US6816187B1 (en) Camera calibration apparatus and method, image processing apparatus and method, program providing medium, and camera
CN1699916B (en) System and method for excluding extraneous features from image inspection operations
JP3504054B2 (en) Document processing apparatus and document processing method
CN101093581B (en) Information processing method and apparatus for calculating information regarding measurement target on the basis of captured images
US8081815B2 (en) Marker arrangement information measuring apparatus and method
JP2006217638A (en) Image processing method and image processor
JP2008224626A (en) Information processor, method for processing information, and calibration tool
US8520931B2 (en) Position and orientation measurement apparatus and method thereof
US7933450B2 (en) Writing information processing system, writing information generating device and computer readable storage medium
US20090051946A1 (en) Image area selecting method
US20130236062A1 (en) Information processing apparatus, processing method thereof, and computer-readable storage medium
US8780110B2 (en) Computer vision CAD model
US7342572B2 (en) System and method for transforming an ordinary computer monitor into a touch screen
US20060273984A1 (en) Image processing method and image processing apparatus
CN100578141C (en) Setting information estimating method and information processing device
JP4914039B2 (en) Information processing method and apparatus
US20080292180A1 (en) Position and orientation measurement apparatus and control method thereof
JP4434890B2 (en) Image composition method and apparatus
JPH0764756A (en) Display edition system
CN1323427A (en) Pen type input device with camera
JP2008309631A (en) Information processing method and information processor
US9135513B2 (en) Image processing apparatus and method for obtaining position and orientation of imaging apparatus
US20070091125A1 (en) Index layout measurement method, position and orientation estimation method, index layout measurement apparatus, and position and orientation estimation apparatus
US20110218776A1 (en) Model producing apparatus, model producing method, and computer-readable recording medium in which model producing program is stored

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20100125

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20100125

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20111114

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20111118

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20120117

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20121019

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20121218

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20130823

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20131125

A911 Transfer of reconsideration by examiner before appeal (zenchi)

Free format text: JAPANESE INTERMEDIATE CODE: A911

Effective date: 20140109

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20140320

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20140401