WO2005005924A1 - Information presentation device and information presentation system using the same - Google Patents
Information presentation device and information presentation system using the same
- Publication number
- WO2005005924A1 (PCT/JP2004/009786; application JP2004009786W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- image
- unit
- shape information
- input
- Prior art date
Links
- 238000001514 detection method Methods 0.000 claims abstract description 57
- 238000013461 design Methods 0.000 claims description 34
- 230000008859 change Effects 0.000 claims description 17
- 238000000034 method Methods 0.000 claims description 5
- 238000010586 diagram Methods 0.000 description 21
- 239000003550 marker Substances 0.000 description 16
- 230000006870 function Effects 0.000 description 9
- 238000012986 modification Methods 0.000 description 6
- 230000004048 modification Effects 0.000 description 6
- 238000012546 transfer Methods 0.000 description 3
- 238000004891 communication Methods 0.000 description 2
- 238000006073 displacement reaction Methods 0.000 description 2
- 230000000694 effects Effects 0.000 description 2
- 239000000463 material Substances 0.000 description 2
- 238000007781 pre-processing Methods 0.000 description 2
- 238000010276 construction Methods 0.000 description 1
- 238000012937 correction Methods 0.000 description 1
- 239000000284 extract Substances 0.000 description 1
- 238000003384 imaging method Methods 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 230000004807 localization Effects 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 239000011159 matrix material Substances 0.000 description 1
- 238000002844 melting Methods 0.000 description 1
- 230000009467 reduction Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
Definitions
- The present invention relates to an information presenting apparatus capable of presenting information to an image observer at high density by superimposing information related to a captured image on that captured image and displaying them simultaneously, and to an information presenting system using it.
- A barcode reader is a well-known information presentation device that presents certain relevant information associated with real-world objects and/or predetermined markers.
- USP 6,389,182 discloses an apparatus for presenting information using spatial localization information of an object and/or a predetermined marker. A two-dimensional code printed on a business card (the object) is read with a camera, and a program analyzes the ID encoded in the two-dimensional code. Photo image data of the person corresponding to the analyzed ID is then read from an image database and displayed beside the two-dimensional code in the image of the business card shown on the computer display, so that a face photograph appears to sit next to the 2D code on the business card.
- Japanese Patent Application Laid-Open No. 2002-92044 proposes a system in which an image of an object in real space, input by image input means such as a CCD imaging device (for example, an image of an existing building), is compared with the three-dimensional design information held in a CAD system; the system detects the displacement (shift) between them and matches the design information with the object in the real space.
- The information on an object in real space usually means enormous information such as the detailed shape, dimensions, materials, and part names of a building or plant, and is generally managed by 3D CAD or the like.
- However, such 3D CAD requires a large-capacity storage device for recording this huge amount of information and a high-performance display device for displaying shapes and dimensions in detail.
- In the apparatus of USP 6,389,182, the display position of the face photo always has a fixed positional relationship with respect to the two-dimensional code; since the business card is a flat card, the two-dimensional code and the displayed photo lie in the same plane, and there is no problem.
- This is not the case, however, for relevant information about large-scale structures such as plant facilities.
- An object of the present invention is to provide an information presenting device that, although portable, can accurately grasp the error between an object in real space and its design information and match the design information with the actual object, and an information presenting system using the same.
- Another object of the present invention is to provide an information presenting apparatus, and an information presenting system using it, with which the registration operation can be performed easily.
- According to one aspect of the present invention, there is provided an information presenting device comprising: an image input unit configured to input an image of a target object in real space; a three-dimensional position/posture relationship detection unit configured to detect, from the image of the object input by the image input unit, the relative three-dimensional position and posture relationship between the image input unit and the object; a shape information storage unit configured to store shape information of the object; an error comparison and detection unit configured to compare the position and orientation of the object detected by the three-dimensional position/posture relationship detection unit with the values stored in the shape information storage unit for the part of the object input from the image input unit, and to detect the difference between the actual position and orientation of the input object and the stored shape information of the object; a superimposed image generation unit configured to generate an image in which the image of the object input from the image input unit is superimposed on the shape information stored in the shape information storage unit, with the difference value from the error comparison and detection unit reflected; and a display unit configured to display the image generated by the superimposed image generation unit.
- Preferably, the three-dimensional position/posture relationship detection unit includes: a three-dimensional position/posture relationship output unit configured to detect and output, from the image of the object input by the image input unit, the relative three-dimensional position and posture relationship between the image input unit and the object; and a three-dimensional position/posture relationship information management unit configured to reflect the difference value from the error comparison and detection unit in the relative three-dimensional position and posture relationship detected by the three-dimensional position/posture relationship output unit.
- According to another aspect of the present invention, there is provided an information presenting device comprising: an image input unit configured to input an image of an object in real space; a three-dimensional position/posture relationship detection unit configured to detect, from the image of the object input by the image input unit, the relative three-dimensional position and posture relationship between the image input unit and the object; a shape information storage unit configured to store shape information of the object; an error comparison and detection unit configured to compare the position and orientation of the object detected by the three-dimensional position/posture relationship detection unit with the values stored in the shape information storage unit for the part of the object input from the image input unit, and to detect the difference between the input position and orientation of the object and the stored shape information; a shape information management unit configured to reflect the difference value from the error comparison and detection unit in the shape information stored in the shape information storage unit; a superimposed image generation unit configured to generate an image in which the image of the object input from the image input unit is superimposed on the shape information of the object stored in the shape information storage unit; and a display unit configured to display the image generated by the superimposed image generation unit.
- According to a further aspect, there is provided an information presenting system comprising the above information presenting device and a three-dimensional CAD having: a three-dimensional design information storage unit configured to store three-dimensional design information of the object; and a shape information generation unit configured to generate shape information from the three-dimensional design information stored in the three-dimensional design information storage unit; wherein the shape information storage unit of the information presenting device stores the shape information generated by the shape information generation unit.
- Preferably, the three-dimensional position/posture relationship detection unit of the information presenting device includes: a three-dimensional position/posture relationship output unit configured to detect and output, from the image of the object input by the image input unit, the relative three-dimensional position and posture relationship between the image input unit and the object; and a three-dimensional position/posture relationship information management unit configured to reflect the difference value from the error comparison and detection unit in the relative three-dimensional position and posture relationship detected by the three-dimensional position/posture relationship output unit.
- FIG. 1 is a block diagram showing a configuration of an information presenting apparatus and an information presenting system using the same according to a first embodiment of the present invention.
- FIG. 2 is a flowchart showing the procedure for associating the real space with the model space (registration).
- FIG. 3A is a diagram showing an object and a display state before the completion of the registration work.
- FIG. 3B is a diagram showing the target object and the display state after the completion of the registration operation.
- FIG. 4 is a flowchart showing a procedure of an operation for matching an object in real space with design information (shape information).
- FIG. 5A is a diagram showing the object, the display state, and the shape information of the object before completing the operation of matching the object in the real space with the design information (shape information).
- FIG. 5B is a diagram showing the object, the display state, and the shape information of the object after completing the operation of matching the object in the real space with the design information (shape information).
- FIG. 6A is a diagram showing an object and a display state before the completion of the registration work in the second embodiment of the present invention.
- FIG. 6B is a diagram showing the target object and the display state after the completion of the registration work.
- FIG. 7 is a diagram illustrating an object and a display state before and after completion of a registration operation for explaining a modification of the second embodiment.
- FIG. 8A is a diagram showing an object, a display state, and shape information of the object before the completion of the registration operation in the third embodiment of the present invention.
- FIG. 8B is a diagram showing the target object, the display state, and the shape information of the target object after the completion of the registration operation.
- FIG. 9A is a diagram illustrating a target object, a display state, and shape information of the target object before the completion of the registration work for explaining a modification of the third embodiment.
- FIG. 9B is a diagram illustrating the object, the display state, and the shape information of the object after the completion of the registration work, for explaining the modification of the third embodiment.
- FIG. 10 is a block diagram showing the configuration of an information presenting device according to the fourth embodiment of the present invention and an information presenting system using the same.
- FIG. 11 is a block diagram showing the configuration of an information presenting device according to the fifth embodiment of the present invention and an information presenting system using the same.
- FIG. 12 is a block diagram showing the configuration of an information presenting device and an information presenting system using it, for explaining a modification of the fifth embodiment.
- FIG. 13A is a diagram showing, for the sixth embodiment of the present invention, the object in the real space, the display state, and the shape information of the object before completing the operation of matching the object in the real space with the design information (shape information) in the information presenting device and the information presentation system using it.
- FIG. 13B is a diagram showing the object, the display state, and the shape information of the object after completing the operation of matching the object in the real space with the design information (shape information).
- FIG. 14 is a block diagram of an information presenting device and an information presenting system using it, for explaining a modification of the first embodiment.
- FIG. 15 is a block diagram of an information presenting device and an information presenting system using it, for explaining a modification of the fourth embodiment.
- FIG. 16 is a block diagram of an information presenting device and an information presenting system using it, for explaining a configuration obtained by further modifying the modification shown in FIG. 14.
- FIG. 17 is a block diagram of an information presenting apparatus and an information presenting system using the information presenting apparatus for explaining a configuration obtained by further modifying the modification shown in FIG.
- As shown in FIG. 1, the information presenting system according to the first embodiment of the present invention includes a small portable information presenting device 10 according to the first embodiment, such as a PDA or a notebook personal computer, that can be carried by a user, and a 3D CAD 12 installed in an office or the like.
- The information presentation device 10 includes an image input unit 14, a 3D position/posture relationship output unit 16, a shape information storage unit 18, an error comparison and detection unit 20, a shape information management unit 22, a superimposed image generation unit 24, a display unit 26, and a three-dimensional position/posture relationship information management unit 28.
- The image input unit 14 inputs an image of an object in real space.
- The three-dimensional position/posture relationship output unit 16 has the function of detecting the relative three-dimensional position and posture relationship between the image input unit 14 and the object from the image of the object input by the image input unit 14.
- the shape information storage section 18 is a memory or the like for storing shape information of the object.
- The error comparison and detection unit 20 has the function of comparing the position and orientation of the object detected by the three-dimensional position/posture relationship output unit 16 with the values stored in the shape information storage unit 18 for the part of the object input from the image input unit 14, and of detecting the difference between the actual position and orientation of the input object and the stored shape information of the object.
- the shape information management unit 22 has a function of reflecting the difference value from the error comparison detection unit 20 in the shape information stored in the shape information storage unit 18.
- The superimposed image generation unit 24 has the function of generating an image in which the image of the object input from the image input unit 14 and the shape information of the object stored in the shape information storage unit 18 are superimposed.
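The superposition just described can be sketched as a simple alpha blend in which rendered shape-information pixels are mixed over the camera image. The function name, array conventions, and the all-zero background mask are assumptions of this sketch, not details taken from the patent.

```python
import numpy as np

def superimpose(camera_image, model_render, alpha=0.5):
    """Blend a rendered shape-information image over the camera image.

    Both inputs are H x W x 3 float arrays in [0, 1]. Pixels where the
    render is all-zero (treated here as background) keep the camera
    image unchanged, so only the drawn model is overlaid.
    """
    camera_image = np.asarray(camera_image, dtype=float)
    model_render = np.asarray(model_render, dtype=float)
    mask = np.any(model_render > 0, axis=-1, keepdims=True)
    blended = (1 - alpha) * camera_image + alpha * model_render
    return np.where(mask, blended, camera_image)
```

In practice the `model_render` input would be the object model 40 drawn at the pose reported by the three-dimensional position/posture relationship output unit, so any residual error shows up visually as a shifted overlay.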
- the display unit 26 is a liquid crystal display or the like that displays an image generated by the superimposed image generation unit 24.
- The three-dimensional position/posture relationship information management unit 28 has the function of reflecting the difference value from the error comparison and detection unit 20 in the relative three-dimensional position and posture relationship between the image input unit 14 and the object detected by the three-dimensional position/posture relationship output unit 16.
- The three-dimensional position/posture relationship output unit 16 and the three-dimensional position/posture relationship information management unit 28 constitute a three-dimensional position/posture relationship detection unit 30.
- the three-dimensional CAD 12 includes a three-dimensional design information storage unit 32 and a shape information generation unit 34.
- The 3D design information storage unit 32 is a hard disk or the like that stores the 3D design information of the object.
- The shape information generation unit 34 has the function of generating shape information from the three-dimensional design information stored in the three-dimensional design information storage unit 32.
- The shape information stored in the shape information storage unit 18 of the information presenting device 10 is generated by the shape information generation unit 34 of the 3D CAD 12.
- The transfer of the shape information from the shape information generation unit 34 to the shape information storage unit 18 may be performed online by wireless or wired communication, or offline via some storage medium.
- Here, the "3D design information of the object" means the information necessary for manufacturing and installing the object, such as its shape, color, materials, and part names; the shape information generation unit 34 extracts only the shape information from this three-dimensional design information.
- the object and the shape information have their own coordinate systems.
- In the following, the coordinate system of the object is called the "real space" and the coordinate system of the shape information is called the "model space".
- The shape information management unit 22 of the information presenting device 10 has the function of reflecting the difference value from the error comparison and detection unit 20 in the extracted shape information stored in the shape information storage unit 18.
- the exchange of information with the 3D design information storage section 32 of the 3D CAD 12 may be performed either online or offline.
- First, the work of associating the real space with the model space (registration) is performed. That is, using the object itself, the object in the real space and the object model in the model space created from the shape information are displayed, and the object model in the model space is moved so that the two overlap. The movement amount of the object model at the time of this association work is recorded and used thereafter for associating the real space with the model space.
- the image input unit 14 takes an image of an object (step S10).
- The object used in this registration is one whose shape information is known in advance to be consistent between the real space and the model space; it may be registered as a landmark of the real space.
- Next, the three-dimensional position/posture relationship output unit 16 detects the relative three-dimensional position and posture of the image input unit 14 and the object from the image of the object input by the image input unit 14 (step S12). At this time, there is a difference between the real space and the model space because the association (registration) has not yet been completed.
- The detection of the three-dimensional position and orientation can be performed, for example, by regarding the object itself as a marker and using the marker detection and position/orientation estimation method disclosed in USP 6,577,249.
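The cited marker method is not reproduced here, but its core idea, recovering a transform from detected marker points, can be illustrated with a least-squares 2D similarity fit. The function name, the 2D simplification, and the four-corner input are assumptions of this sketch only.

```python
import numpy as np

def estimate_similarity(ref_pts, obs_pts):
    """Least-squares 2D similarity fit (scale s, rotation R, translation t)
    mapping reference marker corners onto their observed image positions.
    This is the classic Procrustes/Umeyama solution, used here only as a
    stand-in for the full 3D marker pose-estimation method cited above."""
    ref = np.asarray(ref_pts, float)
    obs = np.asarray(obs_pts, float)
    mr, mo = ref.mean(axis=0), obs.mean(axis=0)
    r, o = ref - mr, obs - mo                           # centered point sets
    cov = o.T @ r / len(ref)                            # cross-covariance
    U, S, Vt = np.linalg.svd(cov)
    d = np.sign(np.linalg.det(U @ Vt))                  # guard against reflection
    D = np.diag([1.0, d])
    R = U @ D @ Vt                                      # best-fit rotation
    s = np.trace(np.diag(S) @ D) / r.var(axis=0).sum()  # best-fit scale
    t = mo - s * R @ mr                                 # best-fit translation
    return s, R, t
```

For a square marker of known side length, `s`, `R`, and `t` recover how far and in which direction the detected marker sits relative to its reference pose; the method of USP 6,577,249 generalizes this to position and orientation in 3D space.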
- The three-dimensional position/posture relationship output unit 16 further selects, from the shape information storage unit 18, the shape information corresponding to the object, based on the image of the object input by the image input unit 14 (step S14). Since the object used for this registration is one whose shape information has been confirmed in advance to be identical between the real space and the model space, it can be selected automatically.
- Although the figure is drawn as if the selected shape information were supplied directly to the error comparison and detection unit 20, the selected shape information is actually deformed based on the three-dimensional position and orientation detected by the three-dimensional position/posture relationship output unit 16 and then input to the error comparison and detection unit 20.
- The error comparison and detection unit 20 detects the error between the input image from the image input unit 14 and the shape information selected from the shape information storage unit 18 (step S16). The error at this time is the difference between the real space and the model space.
- As shown in FIG. 3A, the superimposed image generation unit 24 displays on the screen 42 of the display unit 26 the image 36 of the object 38 input from the image input unit 14 and the object model 40 created from the shape information of the object; at this stage the two are displayed shifted from each other.
- Next, based on the error detected by the error comparison and detection unit 20, the three-dimensional position/posture relationship information management unit 28 corrects the information on the relative three-dimensional position and orientation relationship between the image input unit 14 and the object 38 so that the error is minimized (step S18).
- The three-dimensional position/posture relationship information management unit 28 then supplies the corrected three-dimensional position/posture relationship information to the three-dimensional position/posture relationship output unit 16, completing the work of associating the coordinate system of the real space with the model space (step S20). As a result, as shown in FIG. 3B, the image 36 of the object 38 and the object model 40 are displayed on the screen 42 of the display unit 26 in alignment with each other.
- Next, the error between the object in the real space and the design information (in this embodiment, the shape information) is accurately grasped, and the work of matching the design information with the actual object is performed.
- the image input unit 14 photographs the object 38 (step S30).
- Next, the three-dimensional position/posture relationship output unit 16 detects the relative three-dimensional position and orientation of the image input unit 14 and the object 38 from the image 36 of the object 38 input by the image input unit 14 (step S32). At this time, there is no difference between the real space and the model space, since the association (registration) has been completed.
- Then, the three-dimensional position/posture relationship output unit 16 searches the shape information stored in the shape information storage unit 18 for the data corresponding to the detected three-dimensional position and posture (step S34).
- the retrieved shape information is input to the error comparison detection unit 20.
- The error comparison and detection unit 20 detects an error between the input image from the image input unit 14 and the shape information retrieved from the shape information storage unit 18 (step S36).
- the error to be detected is, for example, the movement amount of the shape information.
- this movement amount can be parallel movement, enlargement / reduction, rotation, or a combination of these.
- The shape information management unit 22 determines whether the error detected by the error comparison and detection unit 20 is within an allowable range (step S38); if it is, no further work is required.
- If the error is outside the allowable range, the shape information management unit 22 adds the detected error to the shape information retrieved from the shape information storage unit 18, changing the shape information (step S40). In other words, since the shape of the already-built object 38 cannot be changed, the shape information is changed instead.
- the shape information management unit 22 stores the shape change information in the shape information storage unit 18 to reflect the change (step S42).
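Steps S38 to S42 amount to baking a detected movement (translation, enlargement/reduction, rotation, or a combination) into the stored shape data. The class below is a hedged sketch of the shape information storage unit 18 together with the management unit 22; all names and the vertex-array representation are invented for illustration.

```python
import numpy as np

class ShapeInfoStore:
    """Minimal stand-in for the shape information storage unit (18)
    plus the shape information management unit (22)."""

    def __init__(self):
        self._shapes = {}

    def put(self, name, vertices):
        self._shapes[name] = np.asarray(vertices, float)

    def get(self, name):
        return self._shapes[name]

    def reflect_error(self, name, scale=1.0, R=None, t=(0.0, 0.0, 0.0)):
        # Step S40: change the shape information, not the real object.
        R = np.eye(3) if R is None else np.asarray(R, float)
        v = self._shapes[name]
        moved = scale * v @ R.T + np.asarray(t, float)
        self._shapes[name] = moved  # Step S42: store the change.
        return moved
```

A deviation of part 46, for example, would be corrected by calling `reflect_error` with the movement amount reported by the error comparison and detection unit, after which the stored model matches the as-built object.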
- the image input unit 14 takes an image of the object 38 as shown in FIG. 5A in which the parts 44 and 46 are attached to the main body 48.
- The shape information 50 corresponding to the object 38, composed of the shape information 52 of the main body 48, the shape information 54 of the part 44, and the shape information 56 of the part 46, is retrieved, and the image 36 of the object 38 and this shape information 50 are compared by the error comparison and detection unit 20.
- The superimposed image generation unit 24 generates an image in which the object model 40 created from the shape information 50 (the model 58 of the main body 48, the model 60 of the part 44, and the model 62 of the part 46) is superimposed on the image 36 of the object 38, and displays it on the display unit 26.
- In this case, the image 64 of the component 46 displayed on the screen 42 of the display unit 26 and the model 62 of the component 46 are displayed with a deviation, which is then corrected by the operations of step S40 and step S42.
- If the detected error is within the allowable range, step S40 and step S42 are not performed.
- As described above, the shape change information can also be returned from the shape information management unit 22 to the 3D design information storage unit 32 of the 3D CAD 12.
- In the second embodiment, an object 38 whose shape information does not match between the real space and the model space in advance is used in such a registration.
- A marker 76 is movably arranged on the object 38 by sticking or the like.
- The three-dimensional position and orientation are detected by the three-dimensional position/posture relationship output unit 16, and the error comparison and detection unit 20 detects the error for the object. The actual marker 76 on the object 38 is then moved so that the object 38 and the shape information 50 of the object 38 match. That is, while observing the image 36 of the photographed object 38 displayed on the display unit 26 together with its object model 40, the image observer moves the marker 76 from the previous marker position 80 so that the two coincide.
- A scale 82 may additionally be displayed on the object model 40 shown on the screen 42 of the display unit 26. By displaying such a scale 82, the observer can read the amount of displacement from the scale 82 on the screen 42 and move the marker 76 on the object 38 according to the confirmed amount of displacement.
- The third embodiment, like the second embodiment described above, uses an object 38 whose shape information does not match between the real space and the model space in advance during the registration.
- the marker 76 is attached to the object by printing or the like.
- The three-dimensional position/posture relationship output unit 16 detects the three-dimensional position and posture, and the error comparison and detection unit 20 detects the error, so that the image 36 of the object 38 and the object model 40 are displayed shifted from each other.
- The three-dimensional position/posture relationship information management unit 28 reflects the difference value from the error comparison and detection unit 20 in the relative three-dimensional position and orientation relationship between the image input unit 14 detected by the three-dimensional position/posture relationship output unit 16 and the object 38. In the present embodiment, this is done by moving the position and orientation of the shape information 84 of the marker 76 relative to the shape information 50 of the object 38, obtaining the shape information 86 as shown in FIG. 8B. As a result, as shown in FIG. 8B, the corrected result is displayed on the screen of the display unit 26.
- The image input unit 14 captures the object 38 from a plurality of directions to form still images, and when the position and orientation of the shape information 84 of the marker 76 relative to the shape information 50 of the object 38 are moved on any one of the still images, the movement may be reflected in the other still images as well.
- FIG. 9A shows the state before the movement of the shape information 84 of the marker 76 relative to the shape information 50 of the object 38, and FIG. 9B shows the state after the movement to the shape information 86 has been completed.
- The left column shows the case viewed from one direction a, and the right column shows the case viewed from another direction b.
- The movement of the shape information 84 of the marker 76 may be performed automatically, or the observer may specify the movement direction and amount using a keyboard or the like.
- Further, the scale 82 described in the second embodiment may additionally be displayed with the object model 40 on the display screen 42 of the display unit 26.
- In the embodiments above, the shape information of the object is stored, and the error from the actually photographed object is detected and reflected.
- Attribute information of an object, such as a part name, may also be stored so that changes in the attribute information are reflected. The information presentation system according to the fourth embodiment is therefore configured as shown in FIG. 10.
- In addition to the configuration of the first embodiment, the three-dimensional CAD 12 includes an attribute information generation unit 90 for the object stored in the three-dimensional design information storage unit 32.
- The information presenting device 10, in addition to the components of the first embodiment, further includes an attribute information storage unit 92, a keypad 94, and an attribute information management unit 96.
- The attribute information management unit 96 has the function of reflecting the change of the attribute information input via the keypad 94, serving as an input unit, in the attribute information stored in the attribute information storage unit 92.
- the superimposed image generation unit 24 of the information presentation device 10 superimposes the image of the object input from the image input unit 14, the shape information of the object stored in the shape information storage unit 18, and also the attribute information stored in the attribute information storage unit 92, and the result is displayed on the display unit 26.
- the transfer of the attribute information from the attribute information generating unit 90 of the three-dimensional CAD 12 to the attribute information storage unit 92 of the information presentation device 10 may be performed online by wireless or wired communication, or off-line via some storage medium.
- the attribute information management unit 96 of the information presentation device 10 also has a function of reflecting a change of attribute information input from the keyboard 94 in the three-dimensional design information stored in the three-dimensional design information storage unit 32 of the three-dimensional CAD 12, that is, in the original attribute information, in the same manner as for the attribute information stored in the attribute information storage unit 92.
- the transfer of information between the attribute information management unit 96 and the three-dimensional design information storage unit 32 of the three-dimensional CAD 12 may likewise be performed in any of the manners described above.
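As a sketch of this write-back behaviour, an attribute change entered from the input unit is applied to the device-side attribute store and, when requested, propagated back to the CAD's original design data. All class, method, and attribute names here are invented for illustration; they do not appear in the patent.

```python
class AttributeStore:
    """Holds attribute information keyed by (part id, attribute name)."""
    def __init__(self, attributes):
        self.attributes = dict(attributes)

class AttributeManager:
    """Reflects an entered change in the device store and, optionally,
    in the original design information held by the 3-D CAD."""
    def __init__(self, device_store, cad_store):
        self.device_store = device_store   # attribute information storage unit
        self.cad_store = cad_store         # 3-D design information storage unit

    def apply_change(self, part_id, key, value, reflect_to_cad=True):
        self.device_store.attributes[(part_id, key)] = value
        if reflect_to_cad:                 # write back to the original data
            self.cad_store.attributes[(part_id, key)] = value

cad = AttributeStore({("part98", "material"): "steel"})
dev = AttributeStore(cad.attributes)       # copy transferred to the device
mgr = AttributeManager(dev, cad)
mgr.apply_change("part98", "material", "copper")
print(dev.attributes[("part98", "material")],
      cad.attributes[("part98", "material")])   # -> copper copper
```

Whether the write-back happens over a wireless/wired link or via a storage medium is a transport detail; the management logic is the same either way.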
- in a modification, the information presentation system is provided with a keyboard 94 in place of the error comparison detection unit 20 in the configuration shown in FIG., so that the movement amount of the shape information is directly input as a value from the keyboard 94 to the three-dimensional position/posture relation information management unit 28 and the shape information management unit 22.
- in the embodiments described above, the deviation was corrected automatically by the error comparison detection unit 20.
- in this modification, the observer instructs the three-dimensional position and orientation relation information management unit 28 and the shape information management unit 22 so as to minimize the deviation between the image of the object and the object model created from the shape information of the object, and the deviation is thereby corrected.
- the input may thus be made directly to the three-dimensional position/posture relation information management unit 28 and the shape information management unit 22. With such a configuration, the same operation and effect as in the fourth embodiment can be obtained.
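This manual adjustment loop can be sketched very simply: the observer keeps entering movement amounts, and the on-screen deviation between the object image and the drawn model shrinks toward zero. The positions and entered values below are illustrative only.

```python
def deviation(image_pos, model_pos):
    """Euclidean distance between the object image and the drawn model."""
    return sum((i - m) ** 2 for i, m in zip(image_pos, model_pos)) ** 0.5

image_pos = (130.0, 80.0)    # where the object appears on the screen
model_pos = [100.0, 80.0]    # where the model is currently drawn

# Movement amounts typed in by the observer while watching the overlay.
entered_moves = [(20.0, 0.0), (10.0, 0.0)]

history = [deviation(image_pos, model_pos)]
for dx, dy in entered_moves:
    model_pos[0] += dx       # value passed to the management units
    model_pos[1] += dy
    history.append(deviation(image_pos, model_pos))

print(history)   # deviation decreases step by step: [30.0, 10.0, 0.0]
```

The same loop works whether the values go to the position/posture relation side (moving the whole model) or to the shape information side (editing a part), since both end in a re-rendered overlay the observer can judge.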
- in the above description, the observer observes the superimposed screen and, for example from the keyboard 94, instructs the three-dimensional position/posture relation information management unit 28 and the shape information management unit 22 so that the deviation between the image of the object and the object model created from the shape information of the object is minimized, whereby the deviation is corrected.
- the shape information management unit 22 may also be configured to change the shape itself or to instruct a component replacement.
- for example, the image input unit 14 captures an image of a main body 48 with a part 98 and a part 100 attached thereto as the object 38, as shown in FIG. 13A.
- in accordance with the result of the three-dimensional position/posture relation detection unit 30, the shape information 50 of the object 38, consisting of the shape information 52 of the main body 48, the shape information 102 of the part 98, and the shape information 104 of the part 100, is retrieved.
- the superimposed image generation unit 24 generates an image in which the image 36 of the object 38 and the object model 40 created from the shape information 50 (that is, a model 58 of the main body 48, a model 106 of the part 98, and a model 108 of the part 100) are superimposed, and displays it on the display unit 26.
- if, as shown here, the shapes of the actual parts 98 and 100 differ from their stored shape information, the images of the parts and the models 106 and 108 displayed on the screen 42 of the display unit 26 will be displaced from each other.
- that is, the design information gives the shape information 102 of a straight pipe (long) and the shape information 104 of a straight pipe (short), whereas for the object in the real space the part 98 is an elbow pipe and the part 100 is a straight pipe (long); this is the difference.
- therefore, as shown in FIG., the shape information management unit 22 exchanges the shape information 102 of the straight pipe (long) for the shape information 114 of the elbow pipe, and changes the length of the shape information 104 of the straight pipe (short) to obtain the updated shape information 116.
- thereby the real space and the design information are matched. That is, when the shape information 102 of the part 98 is changed to the shape information 114 and the shape information 104 of the part 100 is changed to the shape information 116, the deviation is canceled.
- as a result, the image 110 of the part 98 displayed on the screen 42 of the display unit 26 is displayed superimposed on the changed model 118, and the image 112 of the part 100 is displayed superimposed on the changed model 120.
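The two edits above — exchanging one part's shape entry and editing another part's dimension — can be sketched as a small reconciliation step over a shape-information table. The dictionary keys and the `reconcile` helper are invented names for illustration, not the patent's own data structures.

```python
# Stored design information (straight pipe long / straight pipe short).
shape_info = {
    "part98":  {"kind": "straight_pipe", "length": "long"},    # info 102
    "part100": {"kind": "straight_pipe", "length": "short"},   # info 104
}

# What is actually observed in real space (elbow pipe / straight pipe long).
observed = {
    "part98":  {"kind": "elbow_pipe",    "length": "standard"},
    "part100": {"kind": "straight_pipe", "length": "long"},
}

def reconcile(stored, seen):
    """Make the stored shape information match the observed parts."""
    for part, actual in seen.items():
        if stored[part]["kind"] != actual["kind"]:
            stored[part] = dict(actual)                 # exchange the entry
        elif stored[part]["length"] != actual["length"]:
            stored[part]["length"] = actual["length"]   # edit the dimension

reconcile(shape_info, observed)
print(shape_info["part98"]["kind"], shape_info["part100"]["length"])
# -> elbow_pipe long
```

After this step the models rendered from `shape_info` line up with the part images, which is exactly the cancellation of the deviation described above.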
- note that the changes made by the shape information management unit 22 in the first embodiment and/or the attribute information management unit 96 in the fourth embodiment need not be reflected in the three-dimensional design information storage unit 32 of the three-dimensional CAD 12.
- the input unit is not limited to the keyboard 94, but may be a mouse, a trackball, a touch panel, or the like.
- an image input unit configured to input an image of an object placed in a real space
- a 3D position/posture relation output unit configured to detect and output a relative 3D position/posture relation between the image input unit and the object from the image of the object input from the image input unit;
- a shape information storage unit configured to store shape information of the object;
- a three-dimensional position and orientation information management unit configured to reflect a value input from an input unit in the relative 3D position and orientation relation between the image input unit and the object detected by the 3D position and orientation relation output unit;
- a superimposed image generation unit configured to generate an image in which the image of the object input from the image input unit and the shape information of the object stored in the shape information storage unit are superimposed;
- a display unit configured to display the image generated by the superimposed image generation unit.
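The component list in (1) can be sketched as a minimal pipeline in which the detected position/posture is adjusted by the value from the input unit before the model is drawn over the camera image. Every class and function name below is a hypothetical stand-in, not terminology from the claims.

```python
class Device:
    """Wires together the units of item (1): detection, shape storage,
    position/posture management, and superimposed image generation."""
    def __init__(self, detect, shape_store):
        self.detect = detect            # 3-D position/posture relation output
        self.shape_store = shape_store  # shape information storage unit
        self.offset = (0.0, 0.0, 0.0)   # managed position/posture adjustment

    def input_value(self, dx, dy, dz):
        """Value from the input unit, reflected in the managed relation."""
        ox, oy, oz = self.offset
        self.offset = (ox + dx, oy + dy, oz + dz)

    def render(self, image):
        """Superimposed image generation: model drawn at the adjusted pose."""
        pose = self.detect(image)
        adjusted = tuple(p + o for p, o in zip(pose, self.offset))
        return {"image": image, "model": self.shape_store, "pose": adjusted}

dev = Device(detect=lambda img: (1.0, 2.0, 3.0), shape_store="pipe-model")
dev.input_value(0.5, 0.0, 0.0)          # observer nudges the model
print(dev.render("frame-0")["pose"])    # -> (1.5, 2.0, 3.0)
```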
- the information presentation device according to (1), wherein the value input from the input unit includes a movement amount of the shape information.
- the information presentation device according to (1) or (2), further including a shape information management unit configured to reflect the value input from the input unit in the shape information stored in the shape information storage unit.
- the information presentation device according to any one of (1) to (3), wherein the three-dimensional position and orientation information management unit uses a first movement amount of the shape information input from the input unit based on an image of the object captured from a first direction, and a second movement amount of the shape information input from the input unit based on an image captured from a second direction different from the first direction.
- the information presentation device according to any one of (1) to (4), wherein the display unit displays a first image generated by the superimposed image generation unit based on an image of the object captured from a first direction, and a second image generated by the superimposed image generation unit based on an image captured from a second direction different from the first direction.
- the information presentation device according to (1), wherein the value input from the input unit includes a value expressing a shape change of the shape information.
- the device further including a shape information management unit configured to reflect the value input from the input unit in the shape information stored in the shape information storage unit.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Description
Claims
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP04747254A EP1653191A4 (en) | 2003-07-11 | 2004-07-02 | INFORMATION PRESENTATION APPARATUS AND INFORMATION PRESENTATION SYSTEM USING THE DEVICE |
CN200480019973.4A CN1823256B (zh) | 2003-07-11 | 2004-07-02 | 信息呈现装置 |
US11/327,776 US7433510B2 (en) | 2003-07-11 | 2006-01-06 | Information presentation apparatus and information presentation system using the same |
HK06110939.7A HK1090414A1 (en) | 2003-07-11 | 2006-10-03 | Information presentation device |
Applications Claiming Priority (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003-273566 | 2003-07-11 | ||
JP2003273565A JP2005031045A (ja) | 2003-07-11 | 2003-07-11 | 情報呈示装置及び情報呈示システム |
JP2003-273565 | 2003-07-11 | ||
JP2003273566 | 2003-07-11 | ||
JP2003-355462 | 2003-10-15 | ||
JP2003355462 | 2003-10-15 | ||
JP2003400768A JP4540329B2 (ja) | 2003-07-11 | 2003-11-28 | 情報呈示装置 |
JP2003-400768 | 2003-11-28 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/327,776 Continuation US7433510B2 (en) | 2003-07-11 | 2006-01-06 | Information presentation apparatus and information presentation system using the same |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005005924A1 true WO2005005924A1 (ja) | 2005-01-20 |
Family
ID=34069248
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/009786 WO2005005924A1 (ja) | 2003-07-11 | 2004-07-02 | 情報呈示装置及びそれを用いた情報呈示システム |
Country Status (5)
Country | Link |
---|---|
US (1) | US7433510B2 (ja) |
EP (1) | EP1653191A4 (ja) |
KR (1) | KR20060030902A (ja) |
HK (1) | HK1090414A1 (ja) |
WO (1) | WO2005005924A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7653263B2 (en) * | 2005-06-30 | 2010-01-26 | General Electric Company | Method and system for volumetric comparative image analysis and diagnosis |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005005924A1 (ja) * | 2003-07-11 | 2005-01-20 | Olympus Corporation | 情報呈示装置及びそれを用いた情報呈示システム |
US8131055B2 (en) * | 2008-01-31 | 2012-03-06 | Caterpillar Inc. | System and method for assembly inspection |
US20090287450A1 (en) * | 2008-05-16 | 2009-11-19 | Lockheed Martin Corporation | Vision system for scan planning of ultrasonic inspection |
WO2011002441A1 (en) * | 2009-06-29 | 2011-01-06 | Snap-On Incorporated | Vehicle measurement system with user interface |
US8413341B2 (en) * | 2009-06-29 | 2013-04-09 | Snap-On Incorporated | Vehicle measurement system with user interface |
KR101044252B1 (ko) * | 2010-09-14 | 2011-06-28 | 주식회사 티엠이앤씨 | 공간모델을 이용한 시설물 유지관리 시스템 및 시설물 유지관리 방법 |
EP2717181A1 (en) | 2012-10-08 | 2014-04-09 | Hexagon Technology Center GmbH | Method and system for virtual assembly of a structure |
CN107403050A (zh) * | 2017-08-01 | 2017-11-28 | 贺州学院 | 装配式建筑检查方法、装置及系统 |
JP6970817B2 (ja) | 2018-04-11 | 2021-11-24 | 富士フイルム株式会社 | 構造物管理装置、構造物管理方法、及び構造物管理プログラム |
US11769268B1 (en) * | 2019-09-30 | 2023-09-26 | United Services Automobile Association (Usaa) | System and method for detecting deviations in building structures using reconstructed 3d models |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6274155A (ja) * | 1985-09-27 | 1987-04-04 | Olympus Optical Co Ltd | 医用画像フアイル装置 |
JP2002092044A (ja) * | 2000-09-19 | 2002-03-29 | Olympus Optical Co Ltd | 設備管理システム及び方法ならびに設備管理プログラムを記録した記録媒体 |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6405072B1 (en) * | 1991-01-28 | 2002-06-11 | Sherwood Services Ag | Apparatus and method for determining a location of an anatomical target with reference to a medical apparatus |
US6445814B2 (en) * | 1996-07-01 | 2002-09-03 | Canon Kabushiki Kaisha | Three-dimensional information processing apparatus and method |
TW548572B (en) | 1998-06-30 | 2003-08-21 | Sony Corp | Image processing apparatus, image processing method and storage medium |
JP3525896B2 (ja) * | 1999-03-19 | 2004-05-10 | 松下電工株式会社 | 3次元物体認識方法および同方法を使用したビンピッキングシステム |
US6577249B1 (en) | 1999-10-19 | 2003-06-10 | Olympus Optical Co., Ltd. | Information display member, position detecting method using the same, apparatus and method of presenting related information, and information presenting apparatus and information presenting method |
US6697761B2 (en) * | 2000-09-19 | 2004-02-24 | Olympus Optical Co., Ltd. | Three-dimensional position/orientation sensing apparatus, information presenting system, and model error detecting system |
JP4674948B2 (ja) * | 2000-09-29 | 2011-04-20 | オリンパス株式会社 | 手術ナビゲーション装置および手術ナビゲーション装置の作動方法 |
JP2002324239A (ja) * | 2001-04-25 | 2002-11-08 | Olympus Optical Co Ltd | 情報呈示システム |
JP4635392B2 (ja) * | 2001-08-09 | 2011-02-23 | コニカミノルタホールディングス株式会社 | 3次元物体の表面形状モデリング装置、および、プログラム |
WO2005005924A1 (ja) * | 2003-07-11 | 2005-01-20 | Olympus Corporation | 情報呈示装置及びそれを用いた情報呈示システム |
-
2004
- 2004-07-02 WO PCT/JP2004/009786 patent/WO2005005924A1/ja active Application Filing
- 2004-07-02 KR KR1020067000665A patent/KR20060030902A/ko active Search and Examination
- 2004-07-02 EP EP04747254A patent/EP1653191A4/en not_active Withdrawn
-
2006
- 2006-01-06 US US11/327,776 patent/US7433510B2/en not_active Expired - Fee Related
- 2006-10-03 HK HK06110939.7A patent/HK1090414A1/xx not_active IP Right Cessation
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6274155A (ja) * | 1985-09-27 | 1987-04-04 | Olympus Optical Co Ltd | 医用画像フアイル装置 |
JP2002092044A (ja) * | 2000-09-19 | 2002-03-29 | Olympus Optical Co Ltd | 設備管理システム及び方法ならびに設備管理プログラムを記録した記録媒体 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1653191A4 * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7653263B2 (en) * | 2005-06-30 | 2010-01-26 | General Electric Company | Method and system for volumetric comparative image analysis and diagnosis |
Also Published As
Publication number | Publication date |
---|---|
EP1653191A4 (en) | 2010-09-08 |
US7433510B2 (en) | 2008-10-07 |
KR20060030902A (ko) | 2006-04-11 |
EP1653191A1 (en) | 2006-05-03 |
US20060111797A1 (en) | 2006-05-25 |
HK1090414A1 (en) | 2006-12-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4434890B2 (ja) | 画像合成方法及び装置 | |
US7433510B2 (en) | Information presentation apparatus and information presentation system using the same | |
US7193633B1 (en) | Method and apparatus for image assisted modeling of three-dimensional scenes | |
JP4002983B2 (ja) | 投影装置、投影装置の制御方法、複合投影システム、投影装置の制御プログラム、投影装置の制御プログラムが記録された記録媒体 | |
JP6171671B2 (ja) | 情報処理装置、位置指定方法および位置指定プログラム | |
JP5991423B2 (ja) | 表示装置、表示方法、表示プログラムおよび位置設定システム | |
US7602404B1 (en) | Method and apparatus for image assisted modeling of three-dimensional scenes | |
Zhang | Flexible camera calibration by viewing a plane from unknown orientations | |
CN104463975B (zh) | 设置方法以及信息处理装置 | |
JP3486613B2 (ja) | 画像処理装置およびその方法並びにプログラム、記憶媒体 | |
JP4834424B2 (ja) | 情報処理装置、情報処理方法、及びプログラム | |
Harders et al. | Calibration, registration, and synchronization for high precision augmented reality haptics | |
JP4743818B2 (ja) | 画像処理装置、画像処理方法、コンピュータプログラム | |
Behringer et al. | Augmented Reality: Placing artificial objects in real scenes | |
WO2015107665A1 (ja) | 作業支援用データ作成プログラム | |
JP2009278456A (ja) | 映像表示装置 | |
JP6381361B2 (ja) | データ処理装置、データ処理システム、データ処理装置の制御方法、並びにプログラム | |
US11395102B2 (en) | Field cooperation system and management device | |
JP2005031045A (ja) | 情報呈示装置及び情報呈示システム | |
CN113129362A (zh) | 一种三维坐标数据的获取方法及装置 | |
JP4540329B2 (ja) | 情報呈示装置 | |
JP2006018444A (ja) | 画像処理システム及び付加情報指示装置 | |
Siegl et al. | An augmented reality human–computer interface for object localization in a cognitive vision system | |
Chen et al. | Proposal of a rescue operation support system based on 3D reconstruction, GPS, and digital pen | |
JP2009008577A (ja) | オブジェクト認識装置及びオブジェクト認識用プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200480019973.4 Country of ref document: CN |
|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
REEP | Request for entry into the european phase |
Ref document number: 2004747254 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2004747254 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 11327776 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020067000665 Country of ref document: KR |
|
WWP | Wipo information: published in national office |
Ref document number: 2004747254 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 11327776 Country of ref document: US |