JP4537557B2 - Information presentation system - Google Patents

Information presentation system

Info

Publication number
JP4537557B2
JP4537557B2 (application JP2000283292A)
Authority
JP
Japan
Prior art keywords
object
presentation system
information presentation
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
JP2000283292A
Other languages
Japanese (ja)
Other versions
JP2002092647A (en)
JP2002092647A5 (en)
Inventor
Akito Saito
Takao Shibasaki
Yuichiro Akatsuka
Original Assignee
Olympus Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corporation
Priority to JP2000283292A
Priority claimed from US09/951,873 (US6697761B2)
Publication of JP2002092647A
Publication of JP2002092647A5
Application granted
Publication of JP4537557B2
Legal status: Expired - Fee Related
Anticipated expiration

Description

[0001]
BACKGROUND OF THE INVENTION
The present invention relates to an information presentation system and a model error detection system.
[0002]
[Prior art]
Conventionally, various techniques for superimposing information on an object in a camera-captured image have been developed. For example, Japanese Patent Laid-Open No. 10-267671 discloses a technique in which a landscape image is captured by a camera, the visual field space within a map information space is determined from the camera position, camera angle, focal length, and image size at the time of capture, the structures inside that visual field are acquired, their names and attributes are created as label information, and the label information is superimposed on the landscape image.
[0003]
Japanese Patent Laid-Open No. 7-56624 discloses a system for maintenance and repair work at a job site, in which a head-mounted display superimposes an image obtained by two-dimensionally developing information about the subjects around the operator onto a live-action image masked so that only the subject portions are displayed; the output of a gyro sensor built into the head-mounted display is used to control the mask.
[0004]
[Problems to be solved by the invention]
However, the above prior art must take in the position and angle of the imaging means, the focal length, the image size, and the like, and cannot superimpose accurately once sensor error and drift are considered. The shooting position is also restricted, because the position must be measurable with reasonable accuracy. In addition, displaying names and attributes on the live image merely aids the operator's understanding.
[0005]
On the other hand, in industrial fields, particularly equipment construction and maintenance, there has been a need to compare object data with the actual object in an image, and the conventional techniques described above do not satisfy this need.
[0006]
The present invention has been made in view of the above problems, and an object of the present invention is to provide an information presentation system and a model error detection system with which a captured image of an actual object can easily be compared with data of that object.
[0007]
[Means for Solving the Problems]
In order to achieve the above object, an information presentation system according to a first aspect of the present invention comprises: image input means for inputting an image, captured by an imaging device, of markers whose three-dimensional positional relationship with an object is known; position and orientation detection means for recognizing information of the plurality of markers in the image and obtaining the position and orientation relationship between the object and the imaging device; and display means for displaying three-dimensional model data of the object superimposed on the image at a position, size, and orientation based on the position and orientation relationship.
[0027]
DETAILED DESCRIPTION OF THE INVENTION
Embodiments of the present invention will be described below with reference to the drawings.
[0028]
FIG. 1 is a block diagram showing a configuration of an information presentation system according to the first embodiment of the present invention. Hereinafter, the configuration and operation of this system will be described in detail.
[0029]
First, in the configuration example of the information presentation system shown in FIG. 1A, the output of the image input unit 1 is connected to the inputs of the three-dimensional position / orientation relationship detection unit 2, the model data superimposition display unit 3, and the camera parameter acquisition unit 4. The outputs of the three-dimensional position / orientation relationship detection unit 2 and the camera parameter acquisition unit 4 are connected to the input of the information database 5. The outputs of the information database 5 and the object instruction unit 6 are connected to the input of the model data superimposing display unit 3, respectively.
[0030]
On the other hand, in the configuration example of the information presentation system shown in FIG. 1B, the output of the image input unit 1 is connected to the inputs of the three-dimensional position / orientation relationship detection unit 2, the model data superimposed display unit 3, and the camera parameter acquisition unit 4. The outputs of the three-dimensional position / orientation relationship detection unit 2 and the camera parameter acquisition unit 4 are connected to the input of the information database 5. The information database 5 and the output of the image input unit 1 are connected to the input of the model data superimposed display unit 3. In addition, the output of the object instruction unit 6 is connected to the input of the three-dimensional position / orientation relationship detection unit 2.
[0031]
Here, FIG. 1A is a configuration example in which the object instruction unit 6 performs two-dimensional pointing on the display screen of the model data superimposing display unit 3 (a screen showing a two-dimensional projection), whereas FIG. 1B is a configuration example in which a laser pointer serving as the object instruction unit 6 points at a location in real space, that is, within three-dimensional coordinates.
[0032]
In the configuration described above, markers whose three-dimensional positional relationship with the object is known are photographed by an imaging device (not shown), and the image containing the markers is input to the system via the image input unit 1. The three-dimensional position / orientation relationship detection unit 2 obtains the position and orientation relationship between the object and the imaging device from the positions of the markers in the image. The model data superimposing display unit 3 displays the real image of the object input via the image input unit 1, and superimposes the three-dimensional model data of the object on that image at the position, size, and orientation determined by the position and orientation relationship.
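As a rough illustration of this pipeline (a sketch only; the patent relies on the marker-recognition method of Japanese Patent Application Laid-Open No. 2000-227309, which is not reproduced here), the camera pose can be estimated from known marker points and the model reprojected onto the live image along the following lines. The intrinsics, marker coordinates, and function names are hypothetical:

```python
import cv2
import numpy as np

# Hypothetical camera intrinsics and known 3D marker corner positions
# in the object coordinate system (units: millimeters).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assume negligible lens distortion for simplicity

marker_3d = np.array([[0, 0, 0], [100, 0, 0],
                      [100, 100, 0], [0, 100, 0]], dtype=np.float64)

def overlay_model(frame, marker_2d, model_points_3d):
    """Estimate the camera pose from detected marker corners and
    reproject 3D model points onto the live image."""
    ok, rvec, tvec = cv2.solvePnP(marker_3d, marker_2d, K, dist)
    if not ok:
        return frame
    pts, _ = cv2.projectPoints(model_points_3d, rvec, tvec, K, dist)
    for p in pts.reshape(-1, 2):
        # draw model points in a color distinct from the object
        cv2.circle(frame, (int(p[0]), int(p[1])), 2, (0, 255, 0), -1)
    return frame
```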
[0033]
Further, attribute information about a desired object designated on the screen of the model data superimposing display unit 3 via the object instruction unit 6 can also be superimposed on the model data superimposing display unit 3, with the information database 5 consulted as appropriate. The attribute information includes a name display for the object and a symbol, such as a pointer, leading from the name display to the object. The display positions of the attribute information can also be set via the object instruction unit 6 so that the various items of attribute information do not interfere with one another.
[0034]
In the superimposed display, the three-dimensional model data or attribute information is displayed on the screen in a color different from the display color of the object. The attribute information can also be displayed on a wall or ceiling surface around the object in the image.
[0035]
The three-dimensional model data is displayed as a painted surface with a transparent background.
[0036]
The three-dimensional model data can be displayed on the model data superimposing display unit 3 even when the object is shielded by a shield and is not displayed in the image, for example. This display mode will be described in detail later with reference to the drawings.
[0037]
Here, the three-dimensional position / orientation relationship detection unit 2 recognizes the markers in the image input to the system via the image input unit 1 and obtains the position and orientation relationship between the imaging device and the object. Since this method has already been proposed by the present applicant in, for example, Japanese Patent Application Laid-Open No. 2000-227309, its description is omitted here.
[0038]
As the marker, a visual marker placed on the object or in its vicinity can be employed. Compared with magnetic and acoustic markers, such a visual marker widens the detectable range. When the appearance of the object, such as a building, should not be impaired, a marker invisible to the human eye can be employed. Alternatively, the position and orientation relationship can be obtained by detecting feature points of the object itself, or a marker projected onto the object by a projector or the like can be used.
[0039]
Hereinafter, with reference to FIGS. 2 to 17, various actual display modes by the information presentation system according to the first embodiment will be described in detail.
[0040]
First, an annotation display mode will be described with reference to FIGS.
[0041]
When a live video image is input from the camera via the image input unit 1, the live video image is displayed on the model data superimposed display unit 3 as shown in FIG. 2. At this time, the three-dimensional position / orientation relationship detection unit 2 recognizes the marker information in the live video image and begins estimating the position and orientation of the camera, and the names of all objects in the field of view for which related data exists are read from the information database 5 and superimposed on the model data superimposing display unit 3.
[0042]
In FIG. 2, “warm water (feed) 2”, “warm water (return) 2”, “air conditioning (return) 1”, and “air conditioning (feed) 1” are superimposed on the live video image as the names.
[0043]
In this state, when the user operates a mouse, trackball, or the like serving as the object instruction unit 6 to designate an object for which detailed data is to be displayed (for example, “warm water (feed) 2”), then as shown in FIG. 3, the member attribute information is displayed at the upper right of the screen, the repair history information at the lower right of the screen, and the system information at the lower part of the screen.
[0044]
As the member attribute information, a related file name, type, device number, model number, product name, construction type name, length, number, and the like of the selected object are displayed. Further, as the repair history information, the repair date / time and contents of the selected object are displayed. As the system information, a camera position and an object position are displayed. These are displayed as text files.
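A minimal sketch of how the records behind this display might be organized in the information database 5; the field names follow the member attribute and repair history items listed above, and everything else (types, example values, the lookup helper) is hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class ObjectRecord:
    """One entry of the information database (fields are illustrative)."""
    name: str                  # e.g. "warm water (feed) 2"
    related_file: str
    device_number: str
    model_number: str
    product_name: str
    construction_type: str
    length_mm: float
    quantity: int
    repair_history: list = field(default_factory=list)  # (date, contents)

db = {
    "warm water (feed) 2": ObjectRecord(
        name="warm water (feed) 2", related_file="hw_feed_2.dxf",
        device_number="HW-002", model_number="P-40A",
        product_name="steel pipe 40A", construction_type="piping",
        length_mm=2500.0, quantity=1,
        repair_history=[("2000-04-01", "joint replaced")]),
}

def attribute_info(name: str) -> ObjectRecord:
    """Look up the attribute information of the designated object."""
    return db[name]
```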
[0045]
In addition, as shown in FIG. 3, 3D model data of a pipe model as a selected object is superimposed on the live video image. In this example, wireframe display is performed for the selected object.
[0046]
In this state, when the object instruction unit 6 is further operated, detailed design information of the selected object can be displayed in another window as shown in FIG. Here, a design drawing is displayed.
[0047]
Of course, the live video image can be enlarged and reduced. In this case, the model data superimposed display unit 3 displays the live video image based on the corresponding camera parameters supplied by the camera parameter acquisition unit 4.
[0048]
Alternatively, a method may of course be adopted in which a macro image obtained by the camera is displayed first, a desired area of the display is designated with a mouse or the like serving as the object instruction unit 6, and a detailed image or detailed information for that area is then displayed. Since this method is publicly known, a detailed description is omitted here.
[0049]
Next, referring to FIGS. 5 to 7, a mode of comparison display between the finished work and the design drawing will be described. In this mode, when a live video image is input from the camera via the image input unit 1, the live video image is displayed on the model data superimposed display unit 3 as shown in FIG. 5. At this time, the three-dimensional position / orientation relationship detection unit 2 recognizes the marker information in the live video image and begins estimating the position and orientation of the camera, and the 3D model data of all objects in the field of view for which related data exists is superimposed on the live video image in a translucent state.
[0050]
The transparency of the 3D model data can be adjusted as appropriate.
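One common way to realize such adjustable translucency (a sketch of a plausible implementation, not necessarily how the patent's display unit does it) is alpha blending of the rendered model layer with the live frame:

```python
import cv2

def blend_overlay(live_frame, model_layer, alpha=0.5):
    """Blend a rendered 3D-model layer over the live video frame.

    alpha = 0.0 shows only the live image and alpha = 1.0 only the
    model layer, so the degree of translucency is freely adjustable.
    """
    return cv2.addWeighted(model_layer, alpha, live_frame, 1.0 - alpha, 0.0)
```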
[0051]
In FIG. 5, for convenience of explanation, the live video image is indicated by a thin line, and the 3D model data is indicated by a thick line.
[0052]
By looking at the display of the model data superimposed display unit 3, the user can compare the live video image of the finished work with the 3D model data generated from the design drawing.
[0053]
In this state, when the user operates a mouse, trackball, or the like serving as the object instruction unit 6 to designate an object for which detailed data is to be displayed, then as shown in FIG. 6, the member attribute information is displayed at the upper right of the screen, the repair history information at the lower right of the screen, and the system information at the lower part of the screen.
[0054]
As the member attribute information, a related file name, type, device number, model number, product name, construction type name, length, number, and the like of the selected object are displayed. Further, as the repair history information, the repair date / time and contents of the selected object are displayed. As the system information, a camera position and an object position are displayed. These are displayed as text files.
[0055]
In addition, as shown in FIG. 6, the 3D model data of the pipe model that is the selected object is superimposed on the live video image in a wire frame display. Whereas in FIG. 5 the 3D model data is superimposed in a semi-transparent state, the 3D model data can also, as shown in FIG. 7, be displayed entirely in wire frame.
[0056]
Next, a model display mode for objects on the back side of a wall will be described with reference to FIGS. 8 to 11. First, in the initial state shown in FIG. 8, only the marker is displayed on the model data superimposed display unit 3.
[0057]
Since the position of this marker in the coordinate system is known, the three-dimensional position / orientation relationship detection unit 2 starts estimating the position and orientation of the camera, and as shown in FIG. 9 the 3D model data is displayed superimposed on the model data superimposed display unit 3.
[0058]
At this time, as shown in FIG. 10, the names of all objects for which related data exists are read from the information database 5 and superimposed on the model data superimposed display unit 3. That is, “warm water (feed) 2”, “warm water (return) 2”, “air conditioning (return) 1”, and “air conditioning (feed) 1” are superimposed and displayed as the names.
[0059]
In this state, when the user operates a mouse, trackball, or the like serving as the object instruction unit 6 to designate an object for which detailed data is to be displayed, then as shown in FIG. 11, the member attribute information is displayed at the upper right of the screen, the repair history information at the lower right of the screen, and the system information at the lower part of the screen.
[0060]
As the member attribute information, a related file name, type, device number, model number, product name, construction type name, length, number, and the like of the selected object are displayed. Further, as the repair history information, the repair date / time and contents of the selected object are displayed. As the system information, a camera position and an object position are displayed. These are displayed as text files.
[0061]
Further, as shown in FIG. 11, the 3D model data of the piping model selected by operating the mouse or the like serving as the object instruction unit 6 is superimposed and displayed in a wire frame display.
[0062]
Next, a model error detection system that detects a dynamic error will be described in detail as a second embodiment of the present invention.
[0063]
Here, the “deviation” between the object and the model data is measured.
[0064]
When high accuracy is required, the measurement is performed in combination with a laser length measuring device or the like.
[0065]
If the registered position of the object at the reference point coordinates and the actual object are misaligned (due to external factors, design errors, and the like), the registered position needs to be corrected. In this embodiment, the registered position is corrected by the following method.
[0066]
That is, in the second embodiment, the object model M is displayed on the screen according to the registered position of the object at the reference point coordinates, while the actual object R is photographed and the two are superimposed. Features such as the side lengths and thickness of the object R in the image are then recognized and compared with the object model M, and the error E on the projection screen is obtained by taking the difference between them. For example, the object region is extracted from the image of the object R, the constituent elements of the object R are thinned, and the error E is measured by taking the difference between the thinned object R and the object model M and recognizing the deviation.
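A rough sketch of this thinning-and-difference step (one plausible realization, not the patent's exact algorithm), assuming the model M has already been rendered as a binary mask in the same projection as the extracted region of R:

```python
import numpy as np
from skimage.morphology import skeletonize

def projection_error(object_mask: np.ndarray, model_mask: np.ndarray) -> float:
    """Thin the extracted region of object R, compare it with the
    rendered model M, and return a pixel-level error measure E.

    object_mask, model_mask: boolean images in the same projection.
    """
    r_thin = skeletonize(object_mask)   # thin the constituent elements of R
    m_thin = skeletonize(model_mask)    # thin the model for a like comparison
    diff = np.logical_xor(r_thin, m_thin)
    return diff.sum() / max(r_thin.sum(), 1)  # normalized mismatch
```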
[0067]
As a method for measuring the error E, for example, the following method can be adopted.
[0068]
That is, first, corresponding points on the object R and the object model M are designated with a pointing device such as a mouse, and the difference between their coordinates is taken to obtain the error in the two-dimensional projection coordinates in the line-of-sight direction. If this processing for obtaining the error in the two-dimensional projection coordinates is performed on the location designated by the pointing device using images from a plurality of directions, three-dimensional error data can be obtained.
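To illustrate how the per-view 2D differences could be lifted to a 3D error (a sketch assuming the 3x4 projection matrices of two views are known, for instance from the marker-based pose estimation):

```python
import cv2
import numpy as np

def triangulated_error(P1, P2, r_view1, r_view2, m_view1, m_view2):
    """Triangulate the pointed location on the real object R and on the
    model M from two views and return their 3D displacement.

    P1, P2: 3x4 camera projection matrices of the two views.
    r_*, m_*: 2D image coordinates (x, y) of the pointed locations.
    """
    def tri(p1, p2):
        X = cv2.triangulatePoints(P1, P2,
                                  np.float64(p1).reshape(2, 1),
                                  np.float64(p2).reshape(2, 1))
        return (X[:3] / X[3]).ravel()  # homogeneous -> Euclidean

    return tri(r_view1, r_view2) - tri(m_view1, m_view2)
```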
[0069]
Second, an image of the object R is captured into a computer by the image capturing means, and the error E can be detected by performing pattern matching between an image obtained by two-dimensionally projecting the three-dimensional model data of the object R and the captured image of the object R.
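As a simple stand-in for this pattern matching (normalized cross-correlation is only one of many possible matching criteria), the offset at which the projected model best matches the captured image gives the on-screen error directly:

```python
import cv2
import numpy as np

def match_offset(captured, projected_model, expected_xy):
    """Locate the 2D projection of the model within the captured image
    and return the pixel offset from its expected (registered) position."""
    result = cv2.matchTemplate(captured, projected_model,
                               cv2.TM_CCOEFF_NORMED)
    _, _, _, best_loc = cv2.minMaxLoc(result)  # top-left of best match
    return np.subtract(best_loc, expected_xy)  # error E on the screen
```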
[0070]
Third, the error E can also be obtained by taking the difference between the current image of the object R and an image of the object R taken in the past in the same field of view. In this case, the model data of the object R is the image of the object photographed in the past, or data created based on that past image.
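A minimal sketch of this difference-based check, assuming the past and current images were taken from the same registered viewpoint:

```python
import cv2

def change_map(current, past, thresh=30):
    """Difference the current image of object R against a past image
    taken in the same field of view and return a binary change mask."""
    diff = cv2.absdiff(current, past)
    if diff.ndim == 3:
        diff = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    return mask  # nonzero pixels mark deviation of R between then and now
```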
[0071]
Fourth, when the relative three-dimensional position and orientation relationship of the imaging device (or between imaging devices) is known in advance, the distance between the imaging device and the object R is measured with a length measuring device, and the error E of the position coordinate data can be obtained by comparing this distance with the distance between the imaging device and the object R registered as a three-dimensional model in the reference point coordinates.
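A trivial sketch of this comparison; the camera position is assumed to come from the marker-based pose estimation and the registered object position from the information database:

```python
import numpy as np

def distance_error(camera_pos, registered_obj_pos, measured_distance):
    """Compare the rangefinder-measured distance to object R with the
    distance implied by its registered 3D position; return the error E."""
    expected = np.linalg.norm(np.asarray(registered_obj_pos) -
                              np.asarray(camera_pos))
    return measured_distance - expected  # positive: R farther than registered
```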
[0072]
Fifth, an error can be obtained by comparing the volume calculated from the image of the object R and the distance between the object R and the imaging device measured by the length measuring device, with the volume data stored in advance.
[0073]
The embodiments of the present invention have been described above; however, the present invention is not limited thereto, and various improvements and modifications can of course be made without departing from the spirit of the present invention.
[0074]
[Effects of the Invention]
As described above in detail, according to the present invention, it is possible to provide an information presentation system and a model error detection system that can easily compare a captured image of an actual object and data of the object.
[Brief description of the drawings]
FIG. 1 is a block diagram showing a configuration of an information presentation system according to a first embodiment of the present invention.
FIG. 2 is a diagram for explaining an annotation display mode by the information presentation system according to the first embodiment.
FIG. 3 is a diagram for explaining a mode of annotation display by the information presentation system according to the first embodiment.
FIG. 4 is a diagram for explaining an aspect of annotation display by the information presentation system according to the first embodiment.
FIG. 5 is a diagram for explaining an aspect of comparison display between a completed product and a design drawing by the information presentation system according to the first embodiment.
FIG. 6 is a diagram for explaining an aspect of comparison display between a completed product and a design drawing by the information presentation system according to the first embodiment.
FIG. 7 is a diagram for explaining an aspect of comparison display between a completed product and a design drawing by the information presentation system according to the first embodiment.
FIG. 8 is a diagram for explaining a model display mode for the back side of a wall by the information presentation system according to the first embodiment.
FIG. 9 is a diagram for explaining a model display mode for the back side of a wall by the information presentation system according to the first embodiment.
FIG. 10 is a diagram for explaining a model display mode for the back side of a wall by the information presentation system according to the first embodiment.
FIG. 11 is a diagram for explaining a model display mode for the back side of a wall by the information presentation system according to the first embodiment.
[Explanation of symbols]
DESCRIPTION OF SYMBOLS
1 Image input unit
2 Three-dimensional position / orientation relationship detection unit
3 Model data superimposed display unit
4 Camera parameter acquisition unit
5 Information database
6 Object instruction unit

Claims (22)

  1. An information presentation system comprising:
    image input means for inputting an image obtained by capturing, with an imaging device, markers whose three-dimensional positional relationship with an object is known;
    position and orientation detection means for recognizing information of the plurality of markers in the image and obtaining a position and orientation relationship between the object and the imaging device; and
    display means for displaying three-dimensional model data of the object superimposed on the image at a position, size, and orientation based on the position and orientation relationship.
  2. The information presentation system according to claim 1, further comprising a database that stores attribute information of the object, wherein the display means further displays the attribute information of the object.
  3. The information presentation system according to claim 1, further comprising object instruction means for designating a desired object from among a plurality of objects.
  4.   The information presentation system according to claim 1, wherein the three-dimensional model data of the object is displayed on the screen in a color different from that of the object.
  5.   The information presentation system according to claim 2, wherein the attribute information is displayed on the screen in a color different from that of the object.
  6. The information presentation system according to claim 2, wherein the display of the attribute information includes a name display of the object and a symbol pointing from the name display to the object.
  7.   The information presentation system according to claim 2, further comprising means for setting a display position so that the display of the attribute information does not interfere with each other.
  8. The information presentation system according to claim 1, wherein the three-dimensional model data is displayed as a painted surface with a transparent background.
  9.   The information presentation system according to claim 1, wherein the three-dimensional model data is displayed even when the object is shielded by a shield and is not displayed in the image.
  10.   The information presentation system according to claim 1, wherein the marker is a visual marker attached to at least one of an object and the vicinity of the object.
  11.   The information presentation system according to claim 1, wherein the marker is invisible to human eyes.
  12.   The information presentation system according to claim 1, wherein the marker uses a characteristic portion of the object as a marker.
  13.   The information presentation system according to claim 1, wherein the marker is projected by a projector.
  14.   The information presentation system according to claim 2, wherein the attribute information is displayed on at least one of a wall surface and a ceiling surface around the object on the image.
  15. The information presentation system according to claim 1, further comprising:
    error detection means for comparing the three-dimensional model data of the object with the image of the object acquired by the image input means on the basis of the position and orientation relationship, and detecting an error between the three-dimensional model data of the object and the actual object; and
    correction means for correcting the three-dimensional model data of the object based on the detection result of the error detection means.
  16.   The information presentation system according to claim 15, wherein the image of the object acquired by the image input unit is an image obtained by photographing the object.
  17.   The information presentation system according to claim 15, wherein the image of the object acquired by the image input unit is an image on which the object is projected.
  18. The information presentation system according to claim 15, further comprising designation means for designating a location to be compared on the display of the display means, wherein the error detection means obtains a three-dimensional error by performing, using images from a plurality of directions, the processing for obtaining the error in the two-dimensional projection coordinates in the line-of-sight direction at the location designated by the designation means.
  19. The information presentation system according to claim 15, wherein the error detection means detects the error by performing pattern matching between an image obtained by two-dimensionally projecting the three-dimensional model data of the object and an image of the object.
  20. The information presentation system according to claim 15, wherein the three-dimensional model data of the object is an image of the object projected in the past, or data created based on the image of the object projected in the past.
  21. The information presentation system according to claim 15, further comprising a database that stores attribute information of the object, the database holding position coordinate data as attribute information of the object, wherein the error detection means obtains an error of the position coordinate data from an actually measured value of the distance between the object and the image input means, using the position coordinates of the object.
  22. The information presentation system according to claim 15, further comprising a database that stores attribute information of the object, the database holding volume data as attribute information of the object, wherein the error detection means obtains an error between the volume data and a volume obtained based on an actually measured value of the distance between the object and the image input means.
JP2000283292A 2000-09-19 2000-09-19 Information presentation system Expired - Fee Related JP4537557B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2000283292A JP4537557B2 (en) 2000-09-19 2000-09-19 Information presentation system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2000283292A JP4537557B2 (en) 2000-09-19 2000-09-19 Information presentation system
US09/951,873 US6697761B2 (en) 2000-09-19 2001-09-13 Three-dimensional position/orientation sensing apparatus, information presenting system, and model error detecting system

Publications (3)

Publication Number Publication Date
JP2002092647A JP2002092647A (en) 2002-03-29
JP2002092647A5 JP2002092647A5 (en) 2007-06-21
JP4537557B2 true JP4537557B2 (en) 2010-09-01

Family

ID=18767677

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2000283292A Expired - Fee Related JP4537557B2 (en) 2000-09-19 2000-09-19 Information presentation system

Country Status (1)

Country Link
JP (1) JP4537557B2 (en)

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3864150B2 (en) * 2003-06-18 2006-12-27 オリンパス株式会社 Information presentation device and information presentation method
FR2889761A1 * 2005-08-09 2007-02-16 Total Immersion Sa System for user to locate a camera for quickly adjusted insertion of virtual images in video images of camera-captured actual elements
US9323055B2 (en) * 2006-05-26 2016-04-26 Exelis, Inc. System and method to display maintenance and operational instructions of an apparatus using augmented reality
JP5052166B2 (en) * 2007-03-06 2012-10-17 株式会社小松製作所 Embedded object detection apparatus, management apparatus, and construction machine monitoring system embedded object search method
JP5427577B2 (en) * 2009-12-04 2014-02-26 パナソニック株式会社 Display control apparatus and display image forming method
JP5633345B2 (en) * 2010-12-03 2014-12-03 富士通株式会社 Image processing apparatus, image display system, and image processing program
JP5014494B2 (en) * 2011-01-21 2012-08-29 パナソニック株式会社 Information processing apparatus, augmented reality system, information processing method, and information processing program
JP4856291B1 (en) * 2011-01-26 2012-01-18 パイオニア株式会社 Display device and control method
JP5565331B2 (en) * 2011-01-28 2014-08-06 コニカミノルタ株式会社 Display system, display processing apparatus, display method, and display program
JP5912059B2 (en) 2012-04-06 2016-04-27 ソニー株式会社 Information processing apparatus, information processing method, and information processing system
JP5915996B2 (en) * 2012-06-20 2016-05-11 清水建設株式会社 Composite image display system and method
US9710573B2 (en) * 2013-01-22 2017-07-18 General Electric Company Inspection data graphical filter
JP5991423B2 (en) 2013-02-21 2016-09-14 富士通株式会社 Display device, display method, display program, and position setting system
JP5895250B2 (en) * 2013-06-17 2016-03-30 株式会社 デジタルコラボレーションズ Knowledge management device, knowledge management device terminal, and knowledge management device program
JP2015090525A (en) * 2013-11-05 2015-05-11 隅田設計株式会社 Plant facility modification design system
US9530250B2 (en) * 2013-12-10 2016-12-27 Dassault Systemes Augmented reality updating of 3D CAD models
JP6299234B2 (en) 2014-01-23 2018-03-28 富士通株式会社 Display control method, information processing apparatus, and display control program
JP6211428B2 (en) * 2014-01-30 2017-10-11 Kddi株式会社 Guidance display device, method and program
JP2015184778A (en) * 2014-03-20 2015-10-22 コニカミノルタ株式会社 Augmented reality display system, augmented reality information generation device, augmented reality display device, server, augmented reality information generation program, augmented reality display program, and data structure of augmented reality information
JP6265027B2 (en) 2014-04-22 2018-01-24 富士通株式会社 Display device, position specifying program, and position specifying method
JP6500355B2 (en) 2014-06-20 2019-04-17 富士通株式会社 Display device, display program, and display method
JP6659927B2 (en) * 2015-05-11 2020-03-04 株式会社 デジタルコラボレーションズ Information grasp management device, terminal of information grasp management device, and program of information grasp management device
JP2016157458A (en) * 2016-03-31 2016-09-01 ソニー株式会社 Information processing apparatus
JP6230678B2 (en) * 2016-10-25 2017-11-15 三菱電機株式会社 Terminal device, terminal device program, and invisible object displacement state visualization method
JP6410874B1 (en) * 2017-05-30 2018-10-24 株式会社タカラトミー AR video generator

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06317090A (en) * 1993-05-07 1994-11-15 Tokyu Constr Co Ltd Three-dimensional display device
JPH0773338A * 1993-07-02 1995-03-17 Matsushita Electric Ind Co Ltd Virtual reality apparatus mainly based on vision
JPH0886615A (en) * 1994-09-19 1996-04-02 Mitsubishi Electric Corp Image display apparatus and image display system
JPH113195A (en) * 1997-06-11 1999-01-06 Fujitsu Ltd Interactive system and storage medium
JPH11183143A (en) * 1997-12-24 1999-07-09 Anima Kk Dynamic form measuring equipment
JPH11239989A (en) * 1998-02-25 1999-09-07 Fujitsu Ltd Calibration device in robot simulation
JPH11351826A (en) * 1998-06-09 1999-12-24 Mitsubishi Electric Corp Camera position identifier
JP2000047577A (en) * 1998-07-28 2000-02-18 Hitachi Eng Co Ltd Method and device for displaying electronic map
JP2000194859A (en) * 1998-12-25 2000-07-14 Canon Inc Object shape extraction method, object shape extraction device and recording medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3208939B2 (en) * 1993-08-09 2001-09-17 日産自動車株式会社 Monitoring device with head-mounted display device
JPH07200777A (en) * 1993-12-28 1995-08-04 Hitachi Ltd Location estimating method in mobile object
JP3512992B2 (en) * 1997-01-07 2004-03-31 株式会社東芝 Image processing apparatus and image processing method
JP3770991B2 (en) * 1997-03-06 2006-04-26 東レエンジニアリング株式会社 Method and apparatus for generating analysis model and method for analyzing injection molding process
JP3225882B2 (en) * 1997-03-27 2001-11-05 日本電信電話株式会社 Landscape labeling system
JP4794708B2 (en) * 1999-02-04 2011-10-19 オリンパス株式会社 3D position and orientation sensing device

Also Published As

Publication number Publication date
JP2002092647A (en) 2002-03-29


Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20070501

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20070501

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20090901

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20090915

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20091112

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20100112

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20100408

A911 Transfer of reconsideration by examiner before appeal (zenchi)

Free format text: JAPANESE INTERMEDIATE CODE: A911

Effective date: 20100419

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20100608

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20100618

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130625

Year of fee payment: 3

S531 Written request for registration of change of domicile

Free format text: JAPANESE INTERMEDIATE CODE: R313531

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

LAPS Cancellation because of no payment of annual fees