CN105005970A - Augmented reality implementation method and apparatus - Google Patents
Abstract
The invention discloses an augmented reality implementation method and apparatus. The method comprises the steps of: obtaining depth information of a target object in the real world; establishing a three-dimensional coordinate system for real space according to the depth information, and obtaining a stereo (three-dimensional) image of the target object; fitting a virtual image to the stereo image of the target object to obtain a new three-dimensional fitted image; and obtaining, through coordinate conversion, a two-dimensional fitted image corresponding to the three-dimensional fitted image, and displaying the two-dimensional fitted image. With the method and apparatus, interaction between a target object in the real world and a virtual object can be realized on the screen of a camera terminal, thereby solving a series of problems such as occlusion, illumination, and shadow.
Description
Technical field
The present invention relates to the field of communication technology, and in particular to an augmented reality implementation method and apparatus.
Background art
Augmented reality (AR) is an emerging computer application and human-computer interaction technology developed on the basis of virtual reality. It takes an object in the real world as a target object; after the target object is identified, enhancement information related to it is obtained, such as advertising information, distance information, or another virtual object related to the target object. The target object and the enhancement information are then superimposed in the same picture by computer and visualization technology, thereby realizing augmented reality.
Because the target object captured by a camera terminal is two-dimensional information and lacks the three-dimensional information of the real world, current AR technology can only establish a two-dimensional plane in real space. If the enhancement information is a virtual object, the virtual object can only move within that two-dimensional plane and cannot interact with the three-dimensional target object in the real world; phenomena such as occlusion, illumination, and shadow cannot be realized because the three-dimensional information of the real world is missing.
Summary of the invention
In view of this, embodiments of the present invention provide an augmented reality implementation method and apparatus, which realize interaction between a target object in the real world and a virtual object on the screen of a camera terminal, so as to solve a series of problems such as occlusion, illumination, and shadow.
In a first aspect, an embodiment of the present invention provides an augmented reality implementation method, comprising:
obtaining depth information of a target object in the real world;
establishing a three-dimensional coordinate system for real space according to the depth information, and obtaining a stereo image of the target object;
fitting a virtual image to the obtained stereo image of the target object to obtain a new three-dimensional fitted image; and
obtaining, through coordinate conversion, a two-dimensional fitted image corresponding to the new three-dimensional fitted image, and displaying the two-dimensional fitted image.
In a second aspect, an embodiment of the present invention further provides an augmented reality implementation apparatus, comprising:
a depth information acquisition unit, configured to obtain depth information of a target object in the real world;
a stereo image acquisition unit, configured to establish a three-dimensional coordinate system for real space according to the depth information and obtain a stereo image of the target object;
a three-dimensional fitted image acquisition unit, configured to fit a virtual image to the obtained stereo image of the target object and obtain a new three-dimensional fitted image; and
a two-dimensional fitted image display unit, configured to obtain, through coordinate conversion, a two-dimensional fitted image corresponding to the new three-dimensional fitted image and display the two-dimensional fitted image.
The embodiments of the present invention provide an augmented reality method and apparatus. The method establishes a three-dimensional coordinate system for the real world according to the depth information of a target object in the real world and obtains a stereo image of the target object; fits a virtual image to the stereo image of the target object to obtain a new three-dimensional fitted image; and finally converts the three-dimensional fitted image into a two-dimensional fitted image through coordinate conversion and displays it. On the screen of a camera terminal, the embodiments of the present invention realize interaction between a target object in the real world and a virtual object, so as to solve a series of problems such as occlusion, illumination, and shadow.
Brief description of the drawings
Other features, objects, and advantages of the present invention will become more apparent by reading the following detailed description of non-limiting embodiments, made with reference to the accompanying drawings:
Fig. 1(a) is a flowchart of an augmented reality implementation method provided by Embodiment one of the present invention;
Fig. 1(b) is a view of target objects in real space provided by Embodiment one of the present invention;
Fig. 1(c) is a stereo view of the real-space target objects obtained by the camera terminal, provided by Embodiment one of the present invention;
Fig. 1(d) is a stereo view in which the camera terminal fits the obtained real-space target objects with virtual objects, provided by Embodiment one of the present invention;
Fig. 1(e) is a schematic diagram of the two-dimensional fitted image of the real-space target objects and the virtual objects displayed on the screen of the camera terminal, provided by Embodiment one of the present invention;
Fig. 2 is a flowchart of an augmented reality implementation method provided by Embodiment two of the present invention;
Fig. 3 is a flowchart of an augmented reality implementation method provided by Embodiment three of the present invention;
Fig. 4 is a structural block diagram of an augmented reality implementation apparatus provided by Embodiment four of the present invention.
Detailed description of the embodiments
The present invention is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described herein are only used to explain the present invention and do not limit it. It should also be noted that, for convenience of description, the drawings show only the parts related to the present invention rather than the full content.
Embodiment one
Fig. 1(a) is a flowchart of an augmented reality implementation method provided by Embodiment one of the present invention. The method is performed by an augmented reality implementation apparatus, which is implemented in software and/or hardware and is provided in a camera terminal. As shown in Fig. 1(a), the method comprises:
Step 101: obtain depth information of a target object in the real world.
Here, the depth information is the distance from each point of the target object to the camera terminal, and it can be collected by the camera of the camera terminal.
Step 102: according to the depth information, establish a three-dimensional coordinate system for real space and obtain a stereo image of the target object.
In this embodiment, the camera terminal establishes a three-dimensional coordinate system for real space with itself as the origin, and obtains the stereo image of the target object from the distance of each point of the target object to the camera terminal; the three-dimensional coordinate system may be a stereo (spatial) coordinate system. For example, as shown in Fig. 1(b), there are three target objects on the ground in real space: a book, a square box, and a roll of adhesive tape, the book lying under the square box. The camera terminal obtains the depth information of these three target objects, establishes the three-dimensional coordinate system, and obtains the stereo images of these objects (shown in Fig. 1(c)).
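The back-projection described in this step can be illustrated with a short sketch. The following Python snippet (our own illustration, not code from the patent) converts a per-pixel depth map into camera-centred three-dimensional coordinates using a pinhole camera model; the intrinsic parameters fx, fy, cx, cy and the synthetic depth map are assumed example values.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project an (H, W) depth map to an (H, W, 3) array of X, Y, Z coordinates
    in a coordinate system whose origin is the camera terminal."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel column / row indices
    z = depth                                       # distance of each point to the camera
    x = (u - cx) * z / fx                           # pinhole camera model
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1)

# Example: a synthetic 480x640 depth map with every point 2 m from the camera.
depth = np.full((480, 640), 2.0)
cloud = depth_to_point_cloud(depth, fx=525.0, fy=525.0, cx=320.0, cy=240.0)
print(cloud.shape)  # (480, 640, 3)
```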
Step 103: fit the virtual image to the obtained stereo image of the target object to obtain a new three-dimensional fitted image.
In this embodiment, the camera terminal fits the virtual image to the stereo image of the target object to obtain a new fitted image. Continuing the example of step 102, as shown in Fig. 1(d), virtual figures are fitted with the book, the square box, and the adhesive tape in the camera terminal: the first virtual figure 1 is inside the tape roll, the second virtual figure 2 stands on the book, the third virtual figure 3 stands on the square box, the fourth virtual figure 4 stands behind the square box, and the fifth virtual figure 5 stands on the ground. As can be seen from Fig. 1(d), because a three-dimensional coordinate system has been established inside the camera terminal and the book, the square box, and the adhesive tape are stereo images, the tape roll blocks part of the body of the first virtual figure 1 in the new fitted image, and the square box likewise blocks part of the body of the fourth virtual figure 4.
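The occlusion effect in Fig. 1(d) amounts to a per-pixel depth comparison: a point of the virtual figure is drawn only where it is closer to the camera than the real scene. A minimal sketch of this idea follows (our own illustration, with assumed array names, not code from the patent):

```python
import numpy as np

def composite_with_occlusion(real_rgb, real_depth, virt_rgb, virt_depth, virt_mask):
    """Overlay a rendered virtual object on the real image, respecting depth order."""
    visible = virt_mask & (virt_depth < real_depth)  # virtual point lies in front of the real scene
    out = real_rgb.copy()
    out[visible] = virt_rgb[visible]
    return out

# Toy example: real scene at 2 m; one virtual pixel at 1 m (visible),
# one at 3 m (occluded, like the figure hidden by the tape roll).
real_rgb   = np.zeros((4, 4, 3), dtype=np.uint8)
real_depth = np.full((4, 4), 2.0)
virt_rgb   = np.full((4, 4, 3), 255, dtype=np.uint8)
virt_depth = np.full((4, 4), 1.0)
virt_depth[0, 0] = 3.0                               # this virtual point is behind the real object
virt_mask  = np.zeros((4, 4), dtype=bool)
virt_mask[0, 0] = virt_mask[1, 1] = True
print(composite_with_occlusion(real_rgb, real_depth, virt_rgb, virt_depth, virt_mask)[1, 1])  # [255 255 255]
```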
Step 104: obtain, through coordinate conversion, the two-dimensional fitted image corresponding to the new three-dimensional fitted image, and display the two-dimensional fitted image.
In this embodiment, the camera terminal converts the three-dimensional coordinate system into a two-dimensional coordinate system, converts the three-dimensional fitted image into a two-dimensional fitted image, and displays the two-dimensional fitted image on its screen. As shown in Fig. 1(e), the image displayed on the screen of the camera terminal is the two-dimensional fitted image, which realizes the interaction between the virtual figures and the real-world target objects.
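The coordinate conversion of this step is the inverse of the back-projection sketched under step 102: fitted three-dimensional points are projected back to two-dimensional screen coordinates. A minimal Python sketch with the same assumed pinhole intrinsics:

```python
import numpy as np

def project_to_screen(points_xyz, fx, fy, cx, cy):
    """Map N x 3 camera-space points to N x 2 pixel coordinates of the displayed image."""
    x, y, z = points_xyz[:, 0], points_xyz[:, 1], points_xyz[:, 2]
    u = fx * x / z + cx
    v = fy * y / z + cy
    return np.stack([u, v], axis=-1)

pts = np.array([[0.0, 0.0, 2.0], [0.5, -0.2, 1.5]])  # two points of the 3D fitted image
print(project_to_screen(pts, fx=525.0, fy=525.0, cx=320.0, cy=240.0))
```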
This embodiment provides an augmented reality implementation method that establishes a three-dimensional coordinate system for the real world according to the depth information of a target object in the real world and obtains a stereo image of the target object; fits a virtual image to the stereo image of the target object to obtain a new three-dimensional fitted image; and finally converts the three-dimensional fitted image into a two-dimensional fitted image through coordinate conversion and displays it. On the screen of a camera terminal, the embodiment realizes interaction between a target object in the real world and a virtual object, so as to solve a series of problems such as occlusion, illumination, and shadow.
Embodiment two
Fig. 2 is a flowchart of an augmented reality implementation method provided by Embodiment two. On the basis of the above embodiment, the following operation is added before the depth information of the target object in the real world is obtained: obtaining image color information of the target object in the real world. This operation provides the basis for displaying a color image of the target object on the screen of the camera terminal.
Further, on the basis of the above embodiment, obtaining, through coordinate conversion, the two-dimensional fitted image corresponding to the new three-dimensional fitted image and displaying the two-dimensional fitted image comprises:
obtaining, through coordinate conversion, the two-dimensional fitted image corresponding to the new three-dimensional fitted image; performing color configuration for the target object in the two-dimensional fitted image according to the obtained image color information of the target object; and displaying the color-configured two-dimensional fitted image.
Based on the above optimization, the scheme provided by this embodiment is as follows:
Step 201: obtain image color information of the target object in the real world.
The image color information may be the RGB value of each point of the target object.
Step 202: obtain depth information of the target object in the real world.
Step 203: according to the depth information, establish a three-dimensional coordinate system for real space and obtain a stereo image of the target object.
Step 204: obtain, through coordinate conversion, the two-dimensional fitted image corresponding to the new three-dimensional fitted image; perform color configuration for the target object in the two-dimensional fitted image according to the obtained image color information of the target object; and display the color-configured two-dimensional fitted image.
Specifically, the RGB value of each point of the target object in the real world, once obtained, is fixed. In the camera terminal, color configuration is performed for the target object in the two-dimensional fitted image according to the RGB value of each point, and the image is displayed in color on the screen of the camera terminal.
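A minimal sketch of this colour configuration (array names are illustrative, not from the patent): each point of the target object keeps the RGB value captured with the original image, and that value is written into the pixel of the two-dimensional fitted image onto which the point projects.

```python
import numpy as np

def apply_color(fitted_2d, screen_uv, point_rgb):
    """Write the stored per-point RGB values at their projected pixel positions."""
    h, w, _ = fitted_2d.shape
    for (u, v), rgb in zip(screen_uv.astype(int), point_rgb):
        if 0 <= v < h and 0 <= u < w:   # ignore points projected outside the screen
            fitted_2d[v, u] = rgb
    return fitted_2d

canvas = np.zeros((480, 640, 3), dtype=np.uint8)               # blank 2D fitted image
uv  = np.array([[320, 240], [100, 50]])                        # projected pixel positions
rgb = np.array([[200, 30, 30], [30, 200, 30]], dtype=np.uint8)  # stored per-point colors
print(apply_color(canvas, uv, rgb)[240, 320])                  # [200  30  30]
```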
Embodiment two provides an augmented reality implementation method that can display the virtual image and the real-world target object in color, improving the visual quality of the captured picture.
Embodiment three
Fig. 3 is a flowchart of an augmented reality implementation method provided by this embodiment. On the basis of the above embodiments, this embodiment optimizes the step of "fitting the virtual image to the obtained stereo image of the target object to obtain a new three-dimensional fitted image".
In the above scheme, when the camera terminal is static, fitting the virtual image to the obtained stereo image of the target object to obtain a new three-dimensional fitted image comprises: inserting the virtual image at a preset coordinate position so that the virtual image and the image of the target object are fitted into a new image, which serves as the new three-dimensional fitted image.
Specifically, the three-dimensional coordinate system established for real space exists inside the camera terminal, and the user can set a coordinate position for the virtual image as needed. When the virtual image is added, it is inserted at the preset coordinate position, realizing the fitting of the virtual image and the target object.
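For the static-camera case, the fitting reduces to placing the virtual object's points at the user-preset coordinate position in the real-space coordinate system. A minimal sketch under that reading (names are assumed, not from the patent):

```python
import numpy as np

def insert_virtual_object(virtual_points, preset_position):
    """Translate a virtual object (N x 3 points in its own local frame) to a preset position."""
    return virtual_points + np.asarray(preset_position)

# A virtual figure represented by three sample points, placed 0.3 m to the right
# and 1.5 m in front of the camera in the real-space coordinate system.
figure = np.array([[0.0, 0.0, 0.0], [0.0, -0.2, 0.0], [0.0, -0.4, 0.0]])
fitted = insert_virtual_object(figure, preset_position=(0.3, 0.0, 1.5))
print(fitted)
```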
On the basis of the above embodiments, when the camera terminal moves, this embodiment optimizes the step of "obtaining, through coordinate conversion, the two-dimensional fitted image corresponding to the new three-dimensional fitted image, and displaying the two-dimensional fitted image".
Further, based on the above optimization, the technical scheme provided by this embodiment when the camera terminal moves is as follows:
Step 301: obtain image color information of the target object in the real world.
Step 302: obtain depth information of the target object in the real world.
Step 303: according to the depth information, establish a three-dimensional coordinate system for real space and obtain a stereo image of the target object.
Step 304: fit the virtual image to the obtained stereo image of the target object to obtain a new three-dimensional fitted image.
Step 305: obtain the position coordinates of the camera terminal in real time.
The position coordinates of the camera terminal can be obtained by sensors provided on the camera terminal, such as GPS, an accelerometer, and a gyroscope.
Step 306: according to the position coordinates of the camera terminal, calculate the position coordinates of the virtual image and the target object in the new three-dimensional fitted image through a six-axis coordinate system, and obtain the fitted image of the target object and the virtual image after the camera terminal has moved.
Specifically, when the camera terminal moves, a six-axis coordinate system is established inside the camera terminal, and six-axis coordinates are used to represent the position of each point of the virtual image and the target object. When the camera terminal comes to rest after moving, the position coordinates of each point on the virtual image and the target object have all changed compared with before, so the fitted image of the target object and the virtual image obtained after the movement also differs from the one before the movement. Using the six-axis coordinate system to determine the position coordinates of each point on the virtual image and the target object makes it possible to clearly reflect the different occlusion, illumination, shadow, and other effects presented from different viewing directions.
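One way to picture this re-calculation is as a rigid transform of the fitted scene into the moved camera's frame: a rotation (e.g. integrated from the gyroscope) and a translation (e.g. from GPS and accelerometer data) describe the terminal's new pose, and applying them to every point lets occlusion and shading be re-evaluated from the new viewpoint. The sketch below is our own illustration of that idea, not the patent's six-axis formulation; the pose values are assumed.

```python
import numpy as np

def transform_to_new_view(points_world, rotation, translation):
    """Express world-frame points in the moved camera's frame: p_cam = R^T (p_world - t),
    written in row-vector form as (p - t) @ R."""
    return (points_world - translation) @ rotation

# Example pose: the terminal moved 0.5 m to the right and rotated 10 degrees about the vertical axis.
theta = np.radians(10.0)
rotation = np.array([[ np.cos(theta), 0.0, np.sin(theta)],
                     [ 0.0,           1.0, 0.0          ],
                     [-np.sin(theta), 0.0, np.cos(theta)]])
translation = np.array([0.5, 0.0, 0.0])
scene = np.array([[0.0, 0.0, 2.0], [0.3, 0.0, 1.5]])   # real and virtual points in the world frame
print(transform_to_new_view(scene, rotation, translation))
```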
Step 307: convert the fitted image of the target object and the virtual image obtained after the camera terminal has moved into a three-dimensional fitted image.
In this embodiment, the fitted image of the target object and the virtual image obtained after the camera terminal has moved is represented in six-axis coordinates and is converted into a three-dimensional fitted image.
Step 308: convert the converted three-dimensional fitted image into the corresponding two-dimensional fitted image and display it.
This embodiment provides an augmented reality implementation method. When the camera terminal is static, the method optimizes the step of "fitting the virtual image to the obtained stereo image of the target object to obtain a new three-dimensional fitted image", allowing the user to set the position of the virtual figure as needed. When the camera terminal moves, the method optimizes the step of "obtaining, through coordinate conversion, the two-dimensional fitted image corresponding to the new three-dimensional fitted image, and displaying the two-dimensional fitted image", so that pictures from different shooting angles can be presented, with different occlusion, illumination, shadow, and other effects under different shooting angles.
Embodiment four
Fig. 4 is a structural block diagram of an augmented reality implementation apparatus provided by Embodiment four. As shown in Fig. 4, the apparatus comprises:
a depth information acquisition unit 401, configured to obtain depth information of a target object in the real world;
a stereo image acquisition unit 402, configured to establish a three-dimensional coordinate system for real space according to the depth information and obtain a stereo image of the target object, the depth information being the distance from each point of the target object to the camera terminal;
a three-dimensional fitted image acquisition unit 403, configured to fit a virtual image to the obtained stereo image of the target object and obtain a new three-dimensional fitted image; and
a two-dimensional fitted image display unit 404, configured to obtain, through coordinate conversion, a two-dimensional fitted image corresponding to the new three-dimensional fitted image and display the two-dimensional fitted image.
On the basis of the above embodiment, the apparatus further comprises
an image color acquisition unit 405, configured to obtain image color information of the target object in the real world before the depth information of the target object in the real world is obtained.
The two-dimensional fitted image display unit 404 is specifically configured to obtain, through coordinate conversion, the two-dimensional fitted image corresponding to the new three-dimensional fitted image; perform color configuration for the target object in the two-dimensional fitted image according to the obtained image color information of the target object; and display the color-configured two-dimensional fitted image.
On the basis of the above embodiment, when the camera terminal is static,
the three-dimensional fitted image acquisition unit 403 is specifically configured to insert the virtual image at a preset coordinate position so that the virtual image and the image of the target object are fitted into a new image.
On the basis of the above embodiment, when the camera terminal moves, the two-dimensional fitted image display unit 404 is specifically configured to:
obtain the position coordinates of the camera terminal in real time;
according to the position coordinates of the camera terminal, calculate the position coordinates of the virtual image and the target object in the new three-dimensional fitted image through a six-axis coordinate system, and obtain the fitted image of the target object and the virtual image after the camera terminal has moved;
convert the fitted image of the target object and the virtual image obtained after the camera terminal has moved into a three-dimensional fitted image; and
convert the converted three-dimensional fitted image into the corresponding two-dimensional fitted image and display it.
This embodiment provides an augmented reality implementation apparatus. The apparatus establishes a three-dimensional coordinate system for the real world according to the depth information of a target object in the real world and obtains a stereo image of the target object; fits a virtual image to the stereo image of the target object to obtain a new three-dimensional fitted image; and finally converts the three-dimensional fitted image into a two-dimensional fitted image through coordinate conversion and displays it. In the display picture of the camera terminal, the embodiment realizes interaction between a target object in the real world and a virtual object, so as to solve a series of problems such as occlusion, illumination, and shadow.
The apparatus can be used to perform the augmented reality implementation method of the present invention and has the corresponding functions and beneficial effects.
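As a rough object-oriented reading of the four units above, the following Python sketch mirrors the apparatus structure; the class and method names are our own illustration, not an API defined by the patent.

```python
import numpy as np

class DepthInfoAcquisitionUnit:
    def acquire(self, camera_frame):
        """Return the distance of each point of the target object to the camera terminal."""
        return camera_frame["depth"]

class StereoImageAcquisitionUnit:
    def build(self, depth, intrinsics):
        """Establish the camera-centred 3D coordinate system and return the stereo image (point cloud)."""
        fx, fy, cx, cy = intrinsics
        h, w = depth.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        return np.stack([(u - cx) * depth / fx, (v - cy) * depth / fy, depth], axis=-1)

class FittedImageAcquisitionUnit:
    def fit(self, stereo_image, virtual_points, preset_position):
        """Insert the virtual object at a preset position to form the new 3D fitted image."""
        return stereo_image, virtual_points + np.asarray(preset_position)

class FittedImageDisplayUnit:
    def to_2d(self, fitted_points, intrinsics):
        """Project fitted 3D points to the 2D fitted image shown on the screen."""
        fx, fy, cx, cy = intrinsics
        x, y, z = fitted_points[..., 0], fitted_points[..., 1], fitted_points[..., 2]
        return np.stack([fx * x / z + cx, fy * y / z + cy], axis=-1)
```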
Note that the above are only preferred embodiments of the present invention and the technical principles applied. Those skilled in the art will appreciate that the present invention is not limited to the specific embodiments described herein; various obvious changes, readjustments, and substitutions can be made by those skilled in the art without departing from the protection scope of the present invention. Therefore, although the present invention has been described in further detail through the above embodiments, it is not limited to them and may include other equivalent embodiments without departing from the concept of the present invention, and the scope of the present invention is determined by the appended claims.
Claims (10)
1. An augmented reality implementation method, characterized by comprising:
obtaining depth information of a target object in the real world;
establishing a three-dimensional coordinate system for real space according to the depth information, and obtaining a stereo image of the target object;
fitting a virtual image to the obtained stereo image of the target object to obtain a new three-dimensional fitted image; and
obtaining, through coordinate conversion, a two-dimensional fitted image corresponding to the new three-dimensional fitted image, and displaying the two-dimensional fitted image.
2. The method according to claim 1, characterized in that, before the depth information of the target object in the real world is obtained, the method further comprises obtaining image color information of the target object in the real world; and
obtaining, through coordinate conversion, the two-dimensional fitted image corresponding to the new three-dimensional fitted image and displaying the two-dimensional fitted image comprises:
obtaining, through coordinate conversion, the two-dimensional fitted image corresponding to the new three-dimensional fitted image; performing color configuration for the target object in the two-dimensional fitted image according to the obtained image color information of the target object; and displaying the color-configured two-dimensional fitted image.
3. The method according to claim 1, characterized in that the depth information is the distance from each point of the target object to a camera terminal.
4. The method according to claim 1, characterized in that, when the camera terminal is static,
fitting the virtual image to the obtained stereo image of the target object to obtain the new three-dimensional fitted image comprises: inserting the virtual image at a preset coordinate position so that the virtual image and the image of the target object are fitted into a new image, which serves as the new three-dimensional fitted image.
5. The method according to claim 1, characterized in that, when the camera terminal moves,
obtaining, through coordinate conversion, the two-dimensional fitted image corresponding to the new three-dimensional fitted image and displaying the two-dimensional fitted image comprises:
obtaining the position coordinates of the camera terminal in real time;
according to the position coordinates of the camera terminal, calculating the position coordinates of the virtual image and the target object in the new three-dimensional fitted image through a six-axis coordinate system, and obtaining the fitted image of the target object and the virtual image after the camera terminal has moved;
converting the fitted image of the target object and the virtual image obtained after the camera terminal has moved into a three-dimensional fitted image; and
converting the converted three-dimensional fitted image into the corresponding two-dimensional fitted image, and displaying it.
6. An augmented reality implementation apparatus, characterized by comprising:
a depth information acquisition unit, configured to obtain depth information of a target object in the real world;
a stereo image acquisition unit, configured to establish a three-dimensional coordinate system for real space according to the depth information and obtain a stereo image of the target object;
a three-dimensional fitted image acquisition unit, configured to fit a virtual image to the obtained stereo image of the target object and obtain a new three-dimensional fitted image; and
a two-dimensional fitted image display unit, configured to obtain, through coordinate conversion, a two-dimensional fitted image corresponding to the new three-dimensional fitted image and display the two-dimensional fitted image.
7. The apparatus according to claim 6, characterized by further comprising
an image color acquisition unit, configured to obtain image color information of the target object in the real world before the depth information of the target object in the real world is obtained; wherein
the two-dimensional fitted image display unit is specifically configured to obtain, through coordinate conversion, the two-dimensional fitted image corresponding to the new three-dimensional fitted image; perform color configuration for the target object in the two-dimensional fitted image according to the obtained image color information of the target object; and display the color-configured two-dimensional fitted image.
8. The apparatus according to claim 6, characterized in that the depth information is the distance from each point of the target object to a camera terminal.
9. The apparatus according to claim 6, characterized in that, when the camera terminal is static,
the three-dimensional fitted image acquisition unit is specifically configured to insert the virtual image at a preset coordinate position so that the virtual image and the image of the target object are fitted into a new image, which serves as the new three-dimensional fitted image.
10. The apparatus according to claim 6, characterized in that, when the camera terminal moves,
the two-dimensional fitted image display unit is specifically configured to:
obtain the position coordinates of the camera terminal in real time;
according to the position coordinates of the camera terminal, calculate the position coordinates of the virtual image and the target object in the new three-dimensional fitted image through a six-axis coordinate system, and obtain the fitted image of the target object and the virtual image after the camera terminal has moved;
convert the fitted image of the target object and the virtual image obtained after the camera terminal has moved into a three-dimensional fitted image; and
convert the converted three-dimensional fitted image into the corresponding two-dimensional fitted image, and display it.
Priority Application

Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN201510363780.0A | 2015-06-26 | 2015-06-26 | Augmented reality implementation method and apparatus

Publications

Publication Number | Publication Date
---|---
CN105005970A (application) | 2015-10-28
CN105005970B (grant) | 2018-02-16

Legal status: granted; the patent right later lapsed (expired, fee related).