CN102693125A - Augmented reality apparatus and method of windows form


Info

Publication number
CN102693125A
CN102693125A · CN2012100089976A · CN201210008997A
Authority
CN
China
Prior art keywords
area
mark
augmented reality
image
view direction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2012100089976A
Other languages
Chinese (zh)
Inventor
安启赫
朴炯一
李钟权
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pantech Co Ltd
Original Assignee
Pantech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pantech Co Ltd filed Critical Pantech Co Ltd
Publication of CN102693125A publication Critical patent/CN102693125A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A window-type augmented reality (AR) providing apparatus and method offer an interface to the user by displaying an AR object of window form corresponding to a marker, taking into account the size of the window determined from the marker and the location of the AR apparatus. A marker recognition unit recognizes a marker in a preview mode. A window detection unit detects a window area from the marker. A location estimation unit determines a location relationship between the window area or the marker, and the AR service apparatus. An information processing unit determines AR information and, based on the location relationship, determines the area of that AR information to be displayed through the window area. An image processing unit displays the AR information area on the window area.

Description

Apparatus and method for providing augmented reality in window form
Technical field
The following disclosure relates to an apparatus and a method for providing an augmented reality (AR) service, and more particularly to an apparatus and a method for providing an AR object in window form.
Background art
Augmented reality (AR) is a form of virtual reality that provides an image combining the real-world view observable through the user's eyes with a virtual world carrying additional information. AR technology supplements the real-world view with virtual-world imagery: the virtual environment is realized with computer graphics, while the real environment remains the principal component of the AR scene. The computer imagery can provide additional information about the real world. For example, a three-dimensional (3D) virtual image may be overlapped with the real image the user is watching, so that the real-world and virtual-world environments become difficult to distinguish from each other. Virtual reality technology, by contrast, immerses the user in a virtual-world environment. To realize AR, a computer may recognize a marker and display, on a display screen, a 3D graphics model connected to that marker. A marker may physically exist on a two-dimensional (2D) plane, and may provide size, view-direction, and position information associated with the 3D graphics model connected to it. The marker and the 3D graphics model may be displayed in various shapes based on the user's selection.
Summary of the invention
Exemplary embodiments of the present invention provide an apparatus and a method for providing an augmented reality (AR) service in window form, which can display an AR object as if the AR object were shown through a window.
Additional features of the invention will be set forth in the description which follows, and in part will be apparent from the description, or may be learned by practice of the invention.
An exemplary embodiment of the present invention provides an apparatus to provide augmented reality, the apparatus including: a window detector to determine a first area and a second area based on a captured image of an object; an information processor to identify a first portion of a virtual-world image layer to be displayed in the first area, based on a first view direction toward the object corresponding to the first area; and an image processor to display the first portion in the first area and to display a real-world image layer in the second area.
An exemplary embodiment of the present invention provides an apparatus to provide augmented reality, the apparatus including: an image capture device to capture an image; a marker recognition unit to recognize a marker from the image; a window detector to determine a first area corresponding to the marker; an information processor to determine a first portion of a virtual-world image layer to be displayed in the first area, based on a view direction with respect to the marker, the virtual-world image layer including one or more augmented reality objects; and an image processor to display the first portion in the first area.
An exemplary embodiment of the present invention provides a method for providing augmented reality, the method including the steps of: determining a first area and a second area based on a captured image of an object; identifying a first portion of a virtual-world image layer to be displayed in the first area, based on a first view direction toward the object corresponding to the first area; and displaying the first portion in the first area and displaying a real-world image layer in the second area.
An exemplary embodiment of the present invention provides a method for providing augmented reality, the method including the steps of: capturing an image using an image capture device; recognizing a marker from the image; determining a first area corresponding to the marker; determining a first portion of a virtual-world image layer to be displayed in the first area, based on a view direction with respect to the marker, the virtual-world image layer including one or more augmented reality objects; and displaying the first portion in the first area.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory, and are intended to provide further explanation of the invention as claimed. Further features and aspects will become apparent from the detailed description, the drawings, and the claims.
Brief description of the drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate exemplary embodiments of the invention and, together with the description, serve to explain the principles of the invention.
Fig. 1 is a diagram illustrating an augmented reality (AR) service provided in window form according to an exemplary embodiment of the present invention;
Fig. 2 is a diagram illustrating an apparatus for providing an augmented reality (AR) service in window form according to an exemplary embodiment of the present invention;
Fig. 3 is a flowchart illustrating a method for providing an augmented reality (AR) service in window form according to an exemplary embodiment of the present invention.
Detailed description of the illustrated embodiments
Exemplary embodiments are described more fully below with reference to the accompanying drawings, in which exemplary embodiments are shown. The disclosure may, however, be embodied in many different forms and should not be construed as limited to the exemplary embodiments set forth herein. Rather, these exemplary embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art. In the description, details of well-known features and techniques may be omitted to avoid unnecessarily obscuring the presented embodiments.
The terminology used herein is for the purpose of describing embodiments only and is not intended to limit the disclosure. As used herein, the singular forms are intended to include the plural forms as well, unless the context clearly indicates otherwise. Accordingly, the singular does not denote a limitation of quantity, but rather denotes the presence of at least one of the referenced items. The terms "first", "second", and the like do not imply a specific order, but are included to identify individual elements; the use of the terms first, second, etc. does not denote any order or importance, but rather they may be used to distinguish one element from another. It will be further understood that the terms "comprises/comprising", when used in this specification, specify the presence of stated features, regions, integers, steps, operations, elements, or components, but do not preclude the presence or addition of one or more other features, regions, integers, steps, operations, elements, components, and/or combinations thereof. For the purposes of this disclosure, "at least one of" shall be interpreted as any combination of the enumerated elements following the respective language, including combinations of multiples of the enumerated elements. For example, "at least one of X, Y, and Z" shall be construed as X only, Y only, Z only, or any combination of two or more of X, Y, and Z (e.g., XYZ, XZ, XZZ, YZ, X).
Exemplary embodiments of the present invention provide an augmented reality (AR) service apparatus and method that display an AR object in window form.
Fig. 1 is a diagram illustrating an augmented reality (AR) service provided in window form according to an exemplary embodiment of the present invention.
Referring to Fig. 1, a user 110 may capture a marker to receive an AR service from an AR service apparatus 200. The marker may be an image that conveys specific information, such as a QR code. Alternatively, the marker may be an image including one or more feature points that can be identified by an image capture device, such as a camera. For example, a rectangular object captured by the AR service apparatus 200 may be recognized as a marker, and the area of the rectangular object may be determined as a window area 120. Further, if the rectangular object is captured from a view direction 310, the angle between the plane of the window area 120 and the view direction 310 may be obtained and stored as a view angle θ. The area of the window area 120 may then be determined based on the view angle θ, for example according to Equation 1 below.
[Equation 1]
A_w = k · A_m · sin θ
where A_w denotes the area of the window area 120, k denotes a coefficient, A_m denotes the area of the marker, and θ denotes the view angle. The coefficient k may be obtained based on the distance between the AR service apparatus 200 and the marker; for example, k may be inversely proportional to that distance.
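As a quick illustrative check of the formula above (A_w = k · A_m · sin θ), the following Python sketch computes the window area from the marker area, the view angle, and the device-to-marker distance. Modeling k as c / distance is an assumption: the patent only states that k may be inversely proportional to the distance, and the proportionality constant c is a hypothetical tuning parameter.

```python
import math

def window_area(marker_area, distance, theta, c=1.0):
    """Window area per Equation 1: A_w = k * A_m * sin(theta).

    The coefficient k is modeled here as c / distance, since the patent
    says k may be inversely proportional to the device-marker distance;
    c is a hypothetical proportionality constant, not part of the patent.
    """
    k = c / distance                      # inverse-distance coefficient (assumed form)
    return k * marker_area * math.sin(theta)
```

Under this model, the window shrinks both as the device moves away from the marker and as the viewing angle becomes more oblique (sin θ → 0), which matches the qualitative behavior the description attributes to the view angle.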
The AR service apparatus 200 may show the rear-side AR objects 130, 140, and 150 through the window area 120 corresponding to the marker.
The AR service apparatus 200 may display the rear-side AR objects 130, 140, and 150 through the window area 120 in the display screen, so that the portion of the rear-side AR objects 130, 140, and 150 that is shown depends on the position or projection angle of the AR service apparatus 200 with respect to the window area 120 corresponding to the marker. Hereinafter, the portion of the rear-side AR objects 130, 140, and 150 shown through the window area 120 is referred to as the AR information area.
With respect to the plane containing the window area 120 determined from the marker, the front side of the plane may be referred to as a real-world space 160, and the rear side as a virtual-world space 170. The real-world space 160 may include one or more front-side AR objects that augment the captured real-world image. The virtual-world space 170 may be a virtual image including one or more rear-side AR objects. An augmented reality (AR) object may be a two-dimensional or three-dimensional image displayed in the display screen of the AR service apparatus 200.
With respect to the plane containing the window area 120, an AR object placed behind the plane may be referred to as a rear-side AR object, and an AR object placed in front of the plane may be referred to as a front-side AR object.
In the AR service apparatus 200, the real-world space 160 may be generated as a first layer (a "real-world image layer"), and the virtual-world space 170 may be generated as a second layer (a "virtual-world image layer"). The display screen of the AR service apparatus 200 may show the first layer, and the corresponding second layer may be obtained and matched with the first layer. If the window area 120 is displayed in the display screen, a portion of the second layer may be shown through the window area 120. If the AR service apparatus 200 is positioned centrally with respect to the window area 120, the rear-side AR object 140 may be shown, while the rear-side objects 130 and 150 are not shown through the window area 120 because they are located in blind spots. In addition, the transparency of the first layer may be dynamically controlled within a range from about 1 to about 0. For example, if the transparency of the first layer changes from 1 to 0.7, the transparency of the second layer may change from 0 to 0.3; the rear-side objects 130 and 150 may then be shown together with the real-world space 160 of the first layer.
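The complementary layer transparencies described above can be illustrated with a one-pixel blend. The sketch below is purely illustrative and not part of the patent; the function name and per-channel RGB weighting are assumptions. It mirrors the example in which the two layers' weights always sum to about 1 (e.g., 0.7 and 0.3).

```python
def blend_pixel(real_px, virtual_px, first_weight):
    """Blend one RGB pixel of the first (real-world) layer with the
    matching pixel of the second (virtual-world) layer. The two layer
    weights are complementary, so raising one layer's contribution
    lowers the other's by the same amount."""
    second_weight = 1.0 - first_weight
    return tuple(int(first_weight * r + second_weight * v)
                 for r, v in zip(real_px, virtual_px))
```

With first_weight at 1.0 only the real-world layer is visible; lowering it lets the rear-side objects of the second layer show through everywhere, not just inside the window area, which is the effect the description attributes to dynamically controlling the first layer's transparency.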
Fig. 2 is a diagram illustrating an apparatus for providing an augmented reality (AR) service in window form according to an exemplary embodiment of the present invention. The apparatus may be referred to as the AR service apparatus 200.
Referring to Fig. 2, the AR service apparatus 200 includes a controller 210, a marker storage unit 220, an object storage unit 230, a camera 240, a display 250, a marker recognition unit 211, a window detector 212, a position estimator 213, an information processor 214, an image processor 215, and an augmented reality (AR) executor 216.
The marker storage unit 220 may store marker information and window area information. The marker information may include at least one of a marker image, a marker identifier (ID) for identifying the marker, and feature points used to track the position or angle of the marker. The window area information may include the size, position, shape, angle, and the like of the window area 120.
The object storage unit 230 may store AR object information corresponding to the marker.
The stored AR object information may be three-dimensional (3D) model information used to generate a 3D AR image. An AR object may include a two-dimensional (2D) image or a 3D image.
The stored AR object information may be motion information and 3D model information that can be associated with the motion of the corresponding AR object. The AR object information may be used, together with the corresponding AR object, for an AR game.
In addition, the stored AR object information may be data associated with 3D model information for various purposes, for example, file data, music data, audio data, and the like. The stored AR object information may also include information associated with a transmission medium that has the authority to access the corresponding data. The transmission medium may be connected through a network. For example, a user may use the stored AR object information to play a game or share data in various ways. If an AR object associated with a sound is displayed as approaching the user, the sound associated with that AR object may increase; if the AR object is displayed as moving away from the user, the sound may decrease. If an AR object is selected, an audio file may be received in real time or shared based on the characteristics of the selected AR object.
Based on the plane in which the window area lies, the AR object information may be classified into information associated with rear-side AR objects and information associated with front-side AR objects. A rear-side AR object is an AR object positioned behind the plane in which the window area lies; a front-side AR object is an AR object positioned in front of that plane. A front-side AR object may include an AR object corresponding to the window frame of the window area 120. Specifically, an AR object in the shape of a window frame may be displayed on the display 250, and may be repositioned according to the position of the marker corresponding to the window area 120 captured by the camera 240 of the AR service apparatus 200.
The camera 240 may provide a still image or a preview image to the marker recognition unit 211 and/or the display 250. If the position or view angle of the camera 240 changes, the preview image may change in real time. Before being provided to the marker recognition unit 211 and/or the display 250, an image captured by the camera 240 may be modified by an image correction process or a camera calibration process.
The display 250 may show information associated with the AR service apparatus 200, indicators, numbers, characters, moving images, still images, and the like. The display 250 may show the marker received from the camera 240, and may show front-side AR objects and/or part or all of the AR information area generated by the image processor 215.
The marker recognition unit 211 may recognize a marker included in a preview image or a still image captured by the camera 240.
The window detector 212 may detect the window area 120 corresponding to the marker. The size of the window area 120 may be unrelated to the size of the marker, or the window area 120 may have the same size as the marker or a corresponding or related size. Further, the size of the window area 120 may be determined based on the size of the marker and/or the distance between the marker and the AR service apparatus 200.
The position estimator 213 may determine the positional relationship between the marker and the AR service apparatus 200. In addition, the position estimator 213 may determine the positional relationship between the window area 120 and the AR service apparatus 200.
The information processor 214 may determine AR information and an AR object corresponding to the marker and to be displayed behind the window area 120. In addition, the information processor 214 may determine, based on the positional relationship, the portion of the AR information area to be shown through the window area 120.
To display a rear-side AR object as if it were shown through the window area 120, rear-side AR objects located in blind spots of the virtual-world space 170 behind the window area 120 may be excluded from display on the display 250. In addition, the information processor 214 may determine the AR information area of the virtual-world space 170 that can be shown through the window area 120. Specifically, the information processor 214 may determine the AR information area corresponding to the region of the rear-side AR objects visible through the window area 120 when viewed from the view direction of the AR service apparatus 200.
The image processor 215 may display the determined AR information area in the window area 120. The AR information area may correspond to the region of the rear-side AR objects shown through the window area 120 when viewed from the view direction of the AR service apparatus 200. For example, returning to Fig. 1, if the window area 120 is viewed from its left side, the flower-shaped AR object 150 may be shown through the window area 120. If the window area 120 is viewed from its right side, the butterfly-shaped AR object 130 may be shown through the window area 120. If the window area 120 is viewed from the front, the tree-shaped AR object 140 may be shown through the window area 120. Positional information of the rear-side AR objects 130, 140, and 150 may be stored in the object storage unit 230. The rear-side AR objects 130, 140, and 150 may be displayed as moving around the display 250, and, being 3D images, may be shown as moving through the window area 120 to become front-side AR objects.
The image processor 215 may determine a front-side AR object corresponding to the marker, and may display the front-side AR object based on the positions of the window area 120 and the AR service apparatus 200.
As AR is executed, the AR executor 216 may manipulate a rear-side AR object or the motion of a rear-side AR object.
The controller 210 may control the operation of the AR service apparatus 200. The controller 210 may perform some or all of the operations of the marker recognition unit 211, the window detector 212, the position estimator 213, the information processor 214, the image processor 215, and the AR executor 216. The controller 210, the marker recognition unit 211, the window detector 212, the position estimator 213, the information processor 214, the image processor 215, and the AR executor 216 are illustrated separately in order to describe their respective operations; the controller 210 may include one or more processors to perform some or all of those operations.
Hereinafter, a method for providing an AR service in window form according to an exemplary embodiment of the present invention will be described with reference to Fig. 3.
Fig. 3 is a flowchart illustrating a method for providing an augmented reality (AR) service in window form according to an exemplary embodiment of the present invention. Fig. 3 will be described as if performed by the AR service apparatus 200 shown in Fig. 2, but is not limited thereto.
Referring to Fig. 3, in operation 310, the AR service apparatus 200 may capture an image. The image may be a preview image captured in real time while the AR service apparatus 200 operates the camera 240. In operation 312, the AR service apparatus 200 may determine whether a marker is recognized from the captured image. If no marker is recognized, the AR service apparatus 200 may return to operation 310.
If a marker is recognized, then in operation 314 the AR service apparatus 200 may track and detect the window area corresponding to the marker. The window area may be the window area 120 shown in Fig. 1.
In operation 316, the AR service apparatus 200 may determine the positional relationship between the marker and the AR service apparatus 200 based on the feature points of the marker. In addition, the AR service apparatus 200 may determine the positional relationship between the window area and the AR service apparatus 200 based on the feature points of the marker.
In operation 318, the AR service apparatus 200 may determine AR information indicating the rear-side AR objects associated with the marker.
In operation 320, the AR service apparatus 200 may determine, based on the positional relationship, the AR information area to be shown through the window area.
In operation 322, the AR service apparatus 200 may determine whether to display a front-side object, and may determine the information associated with the front-side AR object.
In operation 324, the AR service apparatus 200 may display the AR information area in the window area and, if it is determined that a front-side AR object is to be displayed, may display the front-side AR object in the real-world space 160. The AR information area may be displayed in the window area so that the rear-side AR objects in the AR information area appear as if shown through a window.
In operation 326, if no front-side AR object exists, the AR service apparatus 200 may display the AR information area in the display area.
In operation 328, as AR is executed, the AR executor 216 may manipulate a rear-side AR object or the motion of a rear-side AR object.
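Taken together, operations 310 through 328 amount to a per-frame pipeline. The sketch below wires the stages as injected callables; every function name is hypothetical, since the patent names the processing stages but prescribes no API.

```python
def ar_window_service(frame, recognize_marker, detect_window, estimate_pose,
                      select_rear_objects, has_front_object, render):
    """One pass of the Fig. 3 loop with the stages supplied as callables
    (all names are illustrative assumptions, not the patent's API)."""
    marker = recognize_marker(frame)               # operation 312
    if marker is None:
        return None                                # no marker: back to capture (310)
    window = detect_window(marker)                 # operation 314: window area
    pose = estimate_pose(marker)                   # operation 316: positional relationship
    info_area = select_rear_objects(window, pose)  # operations 318-320: AR information area
    front = has_front_object(marker)               # operation 322: front-side object?
    return render(frame, window, info_area, front=front)  # operations 324-326
```

Running this once per preview frame reproduces the loop structure of the flowchart: frames without a recognized marker fall through immediately, and recognized markers drive window detection, pose estimation, and rendering.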
Exemplary embodiments of the present invention may be recorded in computer-readable media including program instructions to implement various operations embodied by a computer. The media may also include, alone or in combination with the program instructions, data files, data structures, and the like. The media and program instructions may be those specially designed and constructed for the purposes of the present invention, or they may be of the kind well known and available to those having skill in the art. Examples of computer-readable media include magnetic media, such as hard disks, floppy disks, and magnetic tape; optical media, such as CD-ROMs and DVDs; magneto-optical media, such as floptical disks; and hardware devices specially configured to store and perform program instructions, such as read-only memory (ROM), random access memory (RAM), flash memory, and the like. Examples of program instructions include machine code, such as that produced by a compiler, and files containing higher-level code that may be executed by the computer using an interpreter. The described hardware devices may be configured to act as one or more software modules in order to perform the operations of the above-described embodiments of the present invention.
According to exemplary embodiments of the present invention, the apparatus and method for providing an AR service may display an AR object in window form corresponding to a marker, based on the position of the AR service apparatus and on the size of the window determined from the marker, and may display AR objects positioned behind the window area so as to provide a window form.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention cover all modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.
This application claims priority from Korean Patent Application No. 10-2011-0003691, filed on January 13, 2011, and Korean Patent Application No. 10-2011-0055515, filed on June 9, 2011, which are incorporated by reference herein in their entirety as if fully set forth herein.

Claims (20)

1. An apparatus to provide augmented reality, the apparatus comprising:
a window detector to determine a first area and a second area based on a captured image of an object;
an information processor to identify a first portion of a virtual-world image layer to be displayed in the first area, based on a first view direction toward the object corresponding to the first area; and
an image processor to display the first portion in the first area and to display a real-world image layer in the second area.
2. The apparatus of claim 1, further comprising:
a marker recognition unit to recognize a marker captured by an image capture device, the marker comprising feature points; and
a position estimator to obtain positional information of the marker based on the feature points,
wherein the first area is determined based on the positional information of the marker.
3. The apparatus of claim 1, wherein the first area is configured to show the first portion of the virtual-world image layer if the object is viewed from the first view direction, and
the first area is configured to show a second portion of the virtual-world image layer if the object is viewed from a second view direction.
4. The apparatus of claim 1, further comprising:
a marker recognition unit to recognize a marker captured by an image capture device; and
a position estimator to obtain the first view direction from the image capture device toward the marker,
wherein the information processor identifies a first augmented reality object arranged in the first portion to be displayed in the first area, based on the first view direction corresponding to the first portion.
5. The apparatus of claim 4, wherein:
the position estimator obtains a second view direction from the image capture device toward the marker;
the information processor identifies a second augmented reality object arranged in a second portion of the virtual-world image layer to be displayed in the first area, based on the second view direction corresponding to the second portion; and
the image processor displays the second augmented reality object in the first area.
6. The apparatus of claim 1, wherein the first portion of the virtual-world image layer comprises a first augmented reality object, and wherein, if the first augmented reality object is displayed as moving toward the real-world image layer, the first augmented reality object changes into part of the real-world image layer.
7. An apparatus to provide augmented reality, the apparatus comprising:
an image capture device to capture an image;
a marker recognition unit to recognize a marker from the image;
a window detector to determine a first area corresponding to the marker;
an information processor to determine a first portion of a virtual-world image layer to be displayed in the first area, based on a view direction with respect to the marker, the virtual-world image layer comprising one or more augmented reality objects; and
an image processor to display the first portion in the first area.
8. The apparatus of claim 7, further comprising a position estimator to determine the position of the first area based on the position of the marker.
9. The apparatus of claim 7, further comprising a position estimator to calculate the distance between the image capture device and the marker and to obtain the view direction,
wherein the window detector determines the size of the first area based on the distance and the view direction.
10. The apparatus of claim 7, wherein the image processor displays a first augmented reality object arranged in the first portion of the virtual-world image layer.
11. A method for providing augmented reality, the method comprising:
determining a first area and a second area based on a captured image of an object;
identifying a first portion of a virtual-world image layer to be displayed in the first area, based on a first view direction toward the object corresponding to the first area; and
displaying the first portion in the first area, and displaying a real-world image layer in the second area.
12. The method according to claim 11, further comprising:
recognizing a marker captured by an image capture device, the marker comprising feature points; and
obtaining position information of the marker based on the feature points,
wherein the first area is determined based on the position information of the marker.
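Claim 12 derives the marker's position information from its feature points and anchors the first area to that position. A hedged sketch under the assumption that the centroid of the detected feature points stands in for the marker position (the patent does not specify the estimator, and both function names are hypothetical):

```python
def marker_position(feature_points):
    # feature_points: [(x, y), ...] image coordinates of detected points.
    # Assumed estimator: the centroid stands in for the marker position.
    xs = [x for x, _ in feature_points]
    ys = [y for _, y in feature_points]
    n = len(feature_points)
    return (sum(xs) / n, sum(ys) / n)

def first_area_rect(feature_points, width, height):
    # Center the first area on the estimated marker position,
    # returning (left, top, width, height).
    cx, cy = marker_position(feature_points)
    return (cx - width / 2, cy - height / 2, width, height)
```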
13. The method according to claim 11, wherein the first area is configured to display the first portion of the virtual world image layer if the object is viewed from the first view direction, and
the first area is configured to display a second portion of the virtual world image layer if the object is viewed from a second view direction.
14. The method according to claim 11, further comprising:
recognizing a marker captured by an image capture device;
obtaining the first view direction from the image capture device toward the marker; and
recognizing, according to the first view direction corresponding to the first portion, a first augmented reality object arranged in the first portion to be displayed in the first area.
15. The method according to claim 14, further comprising:
obtaining a second view direction from the image capture device toward the marker; and
displaying, in the first area, a second portion of the virtual world image layer, the second portion corresponding to the second view direction.
16. The method according to claim 11, wherein, if a first augmented reality object of the virtual world image layer is displayed as moving toward the real world image layer, the first augmented reality object is changed into a part of the real world image layer.
17. A method of providing augmented reality, the method comprising:
capturing an image using an image capture device;
recognizing a marker from the image;
determining a first area corresponding to the marker;
determining, according to a view direction with respect to the marker, a first portion of a virtual world image layer to be displayed in the first area, the virtual world image layer comprising one or more augmented reality objects; and
displaying the first portion in the first area.
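The window metaphor running through claims 13, 15, and 17 is that the view direction selects which portion of a wider virtual world image layer shows through the first area. A sketch of one way to map a horizontal view angle to a slice of the layer — the linear mapping, the ±45° range, and the function name are assumptions for illustration only:

```python
def visible_portion(layer_width, window_width, view_angle_deg, max_angle_deg=45.0):
    # Normalize the view angle into [0, 1] across the assumed +/-45 degree
    # range, then slide the window across the virtual layer accordingly.
    t = (view_angle_deg + max_angle_deg) / (2.0 * max_angle_deg)
    t = min(max(t, 0.0), 1.0)
    offset = int(t * (layer_width - window_width))
    return offset, offset + window_width  # column range of the layer to display
```

Viewed head-on (0°) the window shows the middle slice of the layer; moving the camera to one side reveals a different portion, like looking through a real window.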
18. The method according to claim 17, further comprising determining a position of the first area based on a position of the marker.
19. The method according to claim 17, further comprising:
calculating a distance between the image capture device and the marker, and obtaining the view direction; and
determining a size of the first area based on the distance and the view direction.
20. The method according to claim 17, further comprising displaying a first augmented reality object arranged in the first portion of the virtual world image layer.
CN2012100089976A 2011-01-13 2012-01-12 Augmented reality apparatus and method of windows form Pending CN102693125A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20110003691 2011-01-13
KR10-2011-0003691 2011-01-13
KR1020110055515A KR101308184B1 (en) 2011-01-13 2011-06-09 Augmented reality apparatus and method of windows form
KR10-2011-0055515 2011-06-09

Publications (1)

Publication Number Publication Date
CN102693125A true CN102693125A (en) 2012-09-26

Family

ID=46714163

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2012100089976A Pending CN102693125A (en) 2011-01-13 2012-01-12 Augmented reality apparatus and method of windows form

Country Status (3)

Country Link
JP (1) JP2012146305A (en)
KR (1) KR101308184B1 (en)
CN (1) CN102693125A (en)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014182597A (en) * 2013-03-19 2014-09-29 Yasuaki Iwai Virtual reality presentation system, virtual reality presentation device, and virtual reality presentation method
CN113190120B (en) * 2021-05-11 2022-06-24 浙江商汤科技开发有限公司 Pose acquisition method and device, electronic equipment and storage medium

Citations (3)

Publication number Priority date Publication date Assignee Title
CN101101505A (en) * 2006-07-07 2008-01-09 华为技术有限公司 Method and system for implementing three-dimensional enhanced reality
US20100067882A1 (en) * 2006-07-06 2010-03-18 Sundaysky Ltd. Automatic generation of video from structured content
US20100185529A1 (en) * 2009-01-21 2010-07-22 Casey Chesnut Augmented reality method and system for designing environments and buying/selling goods

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
US20020158873A1 (en) * 2001-01-26 2002-10-31 Todd Williamson Real-time virtual viewpoint in simulated reality environment
JP4926598B2 (en) * 2006-08-08 2012-05-09 キヤノン株式会社 Information processing method and information processing apparatus
JP4774346B2 (en) * 2006-08-14 2011-09-14 日本電信電話株式会社 Image processing method, image processing apparatus, and program
JP2008108246A (en) * 2006-10-23 2008-05-08 Internatl Business Mach Corp <Ibm> Method, system and computer program for generating a virtual image according to the position of the viewer
KR101465668B1 (en) * 2008-06-24 2014-11-26 삼성전자주식회사 Terminal and method for blogging thereof


Non-Patent Citations (2)

Title
C. Bichlmeier et al.: "Virtual Window for Improved Depth Perception in Medical AR", The Proceedings of the International Workshop on Augmented Reality Environments for Medical Imaging and Computer-Aided Surgery *
Den Ivanov: "dFusion virtual 3D window", http://vimeo.com/15323327 *

Cited By (4)

Publication number Priority date Publication date Assignee Title
CN107004303A (en) * 2014-12-04 2017-08-01 微软技术许可有限责任公司 Mixed reality visualization and method
CN109871674A (en) * 2017-12-04 2019-06-11 上海聚虹光电科技有限公司 VR or AR equipment subregion operating right management method
CN111771227A (en) * 2017-12-21 2020-10-13 Cy游戏公司 Program, system, electronic device, and method for recognizing three-dimensional object
CN111771227B (en) * 2017-12-21 2023-10-31 Cy游戏公司 Method, system, electronic device and storage medium for recognizing three-dimensional object

Also Published As

Publication number Publication date
KR101308184B1 (en) 2013-09-12
JP2012146305A (en) 2012-08-02
KR20120082319A (en) 2012-07-23

Similar Documents

Publication Publication Date Title
KR101930657B1 (en) System and method for immersive and interactive multimedia generation
US20190206115A1 (en) Image processing device and method
EP3151202B1 (en) Information processing device and information processing method
EP2477160A1 (en) Apparatus and method for providing augmented reality perceived through a window
CN109743892B (en) Virtual reality content display method and device
CN104574267A (en) Guiding method and information processing apparatus
CN111833458B (en) Image display method and device, equipment and computer readable storage medium
CN107562189B (en) Space positioning method based on binocular camera and service equipment
CN102693125A (en) Augmented reality apparatus and method of windows form
US20140253591A1 (en) Information processing system, information processing apparatus, information processing method, and computer-readable recording medium recording information processing program
JP5791434B2 (en) Information processing program, information processing system, information processing apparatus, and information processing method
CN104735435A (en) Image processing method and electronic device
EP2717227A2 (en) Image processing program, image processing device, image processing system, and image processing method
US11978232B2 (en) Method for displaying three-dimensional augmented reality
JP6345381B2 (en) Augmented reality system
CN110737326A (en) Virtual object display method and device, terminal equipment and storage medium
JP2022507502A (en) Augmented Reality (AR) Imprint Method and System
CN113178017A (en) AR data display method and device, electronic equipment and storage medium
US20220277520A1 (en) Information processing apparatus, information processing method, and storage medium
CN113486941B (en) Live image training sample generation method, model training method and electronic equipment
CN106384365B (en) Augmented reality system comprising depth information acquisition and method thereof
Moares et al. Inter ar: Interior decor app using augmented reality technology
WO2021065607A1 (en) Information processing device and method, and program
AU2012203857B2 (en) Automatic repositioning of video elements
US20240078743A1 (en) Stereo Depth Markers

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20120926