CN110223394A - AR display method for low-light conditions - Google Patents
AR display method for low-light conditions
- Publication number
- CN110223394A (application CN201910410282.5A)
- Authority
- CN
- China
- Prior art keywords
- wearable device
- virtual
- image
- virtual image
- display methods
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
Abstract
The invention discloses an AR display method for low-light conditions, comprising: performing a panoramic scan of the actual scene with a low-light night-vision device mounted on an AR wearable device to obtain a live image; transferring the live image to the AR wearable device over a data cable and generating a virtual image; performing a pose calculation from the position and angle relationship between the low-light night-vision device and the AR wearable device, and obtaining and saving the relative spatial pose between the two; based on the relative spatial pose, adjusting the shape and/or angle of the virtual image so that it meets the display requirements of the AR wearable device; and sending the adjusted virtual image to the AR wearable device, which superimposes it on the real scene to complete the display. The invention uses a low-light night-vision device to enhance the environment under dim light and then locates and superimposes virtual information in the returned image, improving the usability of AR wearable devices in dim light.
Description
Technical field
The present invention relates to the field of virtual display technology, and in particular to an AR display method for low-light conditions.
Background technique
Existing AR wearable devices can only superimpose virtual information on real-world objects when the scene is well lit. When the ambient light is weak, for example at night, the dim scene prevents the AR wearable device from acquiring adequate environment data, the superposition of virtual information on real objects cannot be completed, and the device becomes unusable.
Summary of the invention
The present invention provides an AR display method for low-light conditions, to solve the technical problem that existing AR wearable devices cannot superimpose virtual information on real objects when the ambient light is weak. A low-light night-vision device enhances the environment under dim light, and the virtual information is then located and superimposed in the returned image, improving the usability of the AR wearable device in dim light.
To solve the above technical problem, an embodiment of the invention provides an AR display method for low-light conditions, comprising:
performing a panoramic scan of the actual scene with a low-light night-vision device mounted on the AR wearable device to obtain a live image;
transferring the live image to the AR wearable device over a data cable and generating a virtual image;
performing a pose calculation from the position and angle relationship between the low-light night-vision device and the AR wearable device, and obtaining and saving the relative spatial pose between the low-light night-vision device and the AR wearable device;
based on the relative spatial pose, adjusting the shape and/or angle of the virtual image so that it meets the display requirements of the AR wearable device;
sending the adjusted virtual image to the AR wearable device, so that the AR wearable device superimposes the adjusted virtual image on the real scene to complete the display.
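The five-step flow above can be sketched end to end in code. This is only an illustrative sketch, not the patented implementation: every name (`Pose`, `capture_live_image`, the stub scan function) and every numeric value is a hypothetical stand-in, and a real system would use full 3-D rotations rather than a single yaw angle.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Relative spatial pose: yaw angle (degrees) and XYZ offset (metres)."""
    yaw_deg: float
    offset: tuple  # (x, y, z)

def capture_live_image(scan_fn):
    """Step 1: panoramic scan via the night-vision device (stubbed as scan_fn)."""
    return scan_fn()

def generate_virtual_image(live_image):
    """Step 2: derive the virtual overlay from the transferred live image."""
    return {"base": live_image, "overlay": []}

def compute_relative_pose(cam_yaw_deg, hmd_yaw_deg, cam_pos, hmd_pos):
    """Step 3: pose from the position/angle relationship of the two devices."""
    offset = tuple(c - h for c, h in zip(cam_pos, hmd_pos))
    return Pose(yaw_deg=cam_yaw_deg - hmd_yaw_deg, offset=offset)

def adjust_virtual_image(virtual_image, pose):
    """Step 4: adjust shape/angle to match the headset's display needs."""
    virtual_image["rotation_deg"] = -pose.yaw_deg  # undo the mounting yaw
    return virtual_image

def display(virtual_image):
    """Step 5: superimpose the adjusted virtual image on the real scene."""
    return {"displayed": True, **virtual_image}

live = capture_live_image(lambda: "panorama-frame")
virt = generate_virtual_image(live)
pose = compute_relative_pose(95.0, 90.0, (0.0, 0.08, 0.0), (0.0, 0.0, 0.0))
out = display(adjust_virtual_image(virt, pose))
print(out["rotation_deg"], out["displayed"])  # → -5.0 True
```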
Preferably, performing the pose calculation from the position and angle relationship between the low-light night-vision device and the AR wearable device comprises: extracting feature points from the live image, and extracting from the live image a corner-feature map covering the display screen; and, after the feature points are extracted, obtaining the corresponding matching relationship.
Preferably, before adjusting, based on the relative spatial pose, the shape and/or angle of the virtual image so that it meets the display requirements of the AR wearable device, the method further comprises: setting a corresponding connected-domain identifier on the virtual image; identifying from the virtual image the local image that contains the connected-domain identifier; and determining, based on the connected-domain identifier and the local image, the relative positional relationship between the scene environment and the AR wearable device.
Preferably, identifying from the virtual image the local image that contains the connected-domain identifier comprises: obtaining the space-invariant features of a pre-stored reference image of the connected-domain identifier; and identifying from the virtual image the image region that contains the space-invariant features of the reference image as the local image.
Preferably, determining, based on the connected-domain identifier and the local image, the relative positional relationship between the scene environment and the AR wearable device comprises: establishing a world coordinate system; determining the position of the scene environment in the world coordinate system from the position of the connected-domain identifier on the virtual image; calculating the relative relationship between the local image and the reference image from their non-space-invariant features; and determining the relative positional relationship between the scene environment and the AR device based on the relative relationship between the local image and the reference image and the position of the connected-domain identifier on the virtual image.
Preferably, after the live image is transferred to the AR wearable device over the data cable and the virtual image is generated, the method further comprises: after the AR wearable device enters a debugging mode, displaying the virtual image and the live image on the same interface, setting virtual scales in the virtual image, and adjusting the virtual scales to achieve a preliminary alignment of the virtual image with the scene environment.
Preferably, after the virtual scales are set in the virtual image, the method further comprises: adjusting each virtual scale in the virtual image to coincide with the corresponding shot scale in the live image.
Preferably, a virtual scale is a scale on a virtual marker in the virtual image, and a shot scale is a scale on a shot marker in the captured field scene.
Preferably, sending the adjusted virtual image to the AR wearable device so that the AR wearable device superimposes the adjusted virtual image on the real scene comprises: obtaining the superposition range parameters of the superposed target object in the virtual image of the AR wearable device; packing the target object and the superposition range parameters to obtain a virtual data packet; sending the virtual data packet to the AR wearable device; and invoking the virtual data packet in the AR wearable device to superimpose the virtual target object into the video scene of the AR wearable device according to the superposition range parameters.
Preferably, before the target object and the superposition range parameters are packed, the method further comprises: compressing and encrypting this group of data.
Compared with the prior art, embodiments of the present invention have the following beneficial effects:
The present invention uses a low-light night-vision device to enhance the environment under dim light and then locates and superimposes virtual information in the returned image, solving the technical problem that existing AR wearable devices cannot superimpose virtual information on real objects when the ambient light is weak, and thereby improving the usability of AR wearable devices in dim light.
Description of the drawings
Fig. 1 is a flow diagram of the AR display method for low-light conditions in an embodiment of the present invention.
Specific embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. The described embodiments are obviously only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Referring to Fig. 1, a preferred embodiment of the present invention provides an AR display method for low-light conditions, comprising:
S1: perform a panoramic scan of the actual scene with the low-light night-vision device mounted on the AR wearable device to obtain a live image;
S2: transfer the live image to the AR wearable device over a data cable and generate a virtual image;
In this embodiment, after the live image is transferred to the AR wearable device over the data cable and the virtual image is generated, the method further comprises: after the AR wearable device enters a debugging mode, displaying the virtual image and the live image on the same interface, setting virtual scales in the virtual image, and adjusting the virtual scales to achieve a preliminary alignment of the virtual image with the scene environment.
In this embodiment, after the virtual scales are set in the virtual image, the method further comprises: adjusting each virtual scale in the virtual image to coincide with the corresponding shot scale in the live image.
In this embodiment, a virtual scale is a scale on a virtual marker in the virtual image, and a shot scale is a scale on a shot marker in the captured field scene.
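Aligning each virtual scale with its shot scale amounts to solving for one scale factor and one offset between the two sets of tick positions. A least-squares fit is one plausible way to do this; the function below is a hypothetical sketch, not the patent's stated method.

```python
def fit_scale_offset(virtual_ticks, shot_ticks):
    """Least-squares fit of (scale, offset) so that
    scale * virtual_tick + offset approximates the shot tick positions."""
    n = len(virtual_ticks)
    mean_v = sum(virtual_ticks) / n
    mean_s = sum(shot_ticks) / n
    var_v = sum((v - mean_v) ** 2 for v in virtual_ticks)
    cov = sum((v - mean_v) * (s - mean_s)
              for v, s in zip(virtual_ticks, shot_ticks))
    scale = cov / var_v
    offset = mean_s - scale * mean_v
    return scale, offset

# Toy tick positions: shot scale = 2 * virtual scale + 5 (pixels).
virtual = [0.0, 10.0, 20.0, 30.0]
shot = [5.0, 25.0, 45.0, 65.0]
s, t = fit_scale_offset(virtual, shot)
print(round(s, 6), round(t, 6))  # → 2.0 5.0
```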
S3: perform a pose calculation from the position and angle relationship between the low-light night-vision device and the AR wearable device, and obtain and save the relative spatial pose between the low-light night-vision device and the AR wearable device;
In this embodiment, performing the pose calculation from the position and angle relationship between the low-light night-vision device and the AR wearable device comprises: extracting feature points from the live image, and extracting from the live image a corner-feature map covering the display screen; and, after the feature points are extracted, obtaining the corresponding matching relationship.
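The patent does not name a specific extraction or matching algorithm. A common way to obtain a matching relationship between two sets of feature points is nearest-neighbour descriptor matching with a ratio test, sketched below with toy descriptors; all names and values are hypothetical.

```python
def match_features(desc_a, desc_b, ratio=0.8):
    """Nearest-neighbour matching with a Lowe-style ratio test.
    desc_a, desc_b: lists of equal-length numeric descriptor vectors.
    Returns (index_in_a, index_in_b) pairs for unambiguous matches."""
    def dist(u, v):
        return sum((x - y) ** 2 for x, y in zip(u, v)) ** 0.5

    matches = []
    for i, da in enumerate(desc_a):
        ranked = sorted(range(len(desc_b)), key=lambda j: dist(da, desc_b[j]))
        best, second = ranked[0], ranked[1]
        # Accept only if clearly closer than the runner-up.
        if dist(da, desc_b[best]) < ratio * dist(da, desc_b[second]):
            matches.append((i, best))
    return matches

# Toy corner descriptors from the live image and the headset view.
live_desc = [[0.0, 1.0], [5.0, 5.0], [9.0, 0.0]]
hmd_desc = [[9.1, 0.1], [0.1, 1.1], [5.2, 4.9]]
print(match_features(live_desc, hmd_desc))  # → [(0, 1), (1, 2), (2, 0)]
```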
S4: based on the relative spatial pose, adjust the shape and/or angle of the virtual image so that it meets the display requirements of the AR wearable device;
In this embodiment, before adjusting, based on the relative spatial pose, the shape and/or angle of the virtual image so that it meets the display requirements of the AR wearable device, the method further comprises: setting a corresponding connected-domain identifier on the virtual image; identifying from the virtual image the local image that contains the connected-domain identifier; and determining, based on the connected-domain identifier and the local image, the relative positional relationship between the scene environment and the AR wearable device.
In this embodiment, identifying from the virtual image the local image that contains the connected-domain identifier comprises: obtaining the space-invariant features of a pre-stored reference image of the connected-domain identifier; and identifying from the virtual image the image region that contains the space-invariant features of the reference image as the local image.
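One plausible way to isolate the local image containing the connected-domain identifier is plain connected-component analysis on a binary detection mask: flood-fill the component around a seed pixel and take its bounding box. The sketch below assumes such a mask already exists; it is an illustration, not the patent's stated algorithm.

```python
from collections import deque

def component_bbox(mask, seed):
    """Flood-fill the 4-connected component of `mask` containing `seed`
    and return its bounding box as (top, left, bottom, right), inclusive."""
    rows, cols = len(mask), len(mask[0])
    r0, c0 = seed
    if not mask[r0][c0]:
        return None
    seen = {(r0, c0)}
    queue = deque([(r0, c0)])
    top, left, bottom, right = r0, c0, r0, c0
    while queue:
        r, c = queue.popleft()
        top, bottom = min(top, r), max(bottom, r)
        left, right = min(left, c), max(right, c)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and mask[nr][nc] and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append((nr, nc))
    return top, left, bottom, right

# Toy binary mask: a 2x2 marker blob and a separate 2x1 blob.
mask = [
    [0, 1, 1, 0, 0],
    [0, 1, 1, 0, 1],
    [0, 0, 0, 0, 1],
]
print(component_bbox(mask, (0, 1)))  # → (0, 1, 1, 2)
```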
In this embodiment, determining, based on the connected-domain identifier and the local image, the relative positional relationship between the scene environment and the AR wearable device comprises: establishing a world coordinate system; determining the position of the scene environment in the world coordinate system from the position of the connected-domain identifier on the virtual image; calculating the relative relationship between the local image and the reference image from their non-space-invariant features; and determining the relative positional relationship between the scene environment and the AR device based on the relative relationship between the local image and the reference image and the position of the connected-domain identifier on the virtual image.
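Once the identifier is located on the image, the scene's position relative to the device can be estimated with a pinhole camera model when the marker's physical size is known: depth follows from the apparent size, and lateral offsets from the pixel position. The intrinsics and marker values below are hypothetical; the patent does not specify this model.

```python
def marker_relative_position(center_px, apparent_w_px, real_w_m,
                             fx, fy, cx, cy):
    """Back-project a marker of known physical width under a pinhole model.
    Returns (X, Y, Z) of the marker centre in the camera frame, in metres."""
    z = fx * real_w_m / apparent_w_px   # depth from apparent size
    x = (center_px[0] - cx) * z / fx    # lateral offset
    y = (center_px[1] - cy) * z / fy    # vertical offset
    return x, y, z

# Hypothetical intrinsics: fx = fy = 800 px, principal point (640, 360);
# a 20 cm marker appearing 160 px wide, centred at pixel (800, 360).
X, Y, Z = marker_relative_position(
    center_px=(800, 360), apparent_w_px=160, real_w_m=0.20,
    fx=800.0, fy=800.0, cx=640.0, cy=360.0,
)
print(round(X, 3), round(Y, 3), round(Z, 3))  # → 0.2 0.0 1.0
```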
S5: send the adjusted virtual image to the AR wearable device, so that the AR wearable device superimposes the adjusted virtual image on the real scene to complete the display.
In this embodiment, sending the adjusted virtual image to the AR wearable device so that the AR wearable device superimposes the adjusted virtual image on the real scene comprises: obtaining the superposition range parameters of the superposed target object in the virtual image of the AR wearable device; packing the target object and the superposition range parameters to obtain a virtual data packet; sending the virtual data packet to the AR wearable device; and invoking the virtual data packet in the AR wearable device to superimpose the virtual target object into the video scene of the AR wearable device according to the superposition range parameters.
In this embodiment, before the target object and the superposition range parameters are packed, the method further comprises: compressing and encrypting this group of data.
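The packing and compression step can be sketched with standard serialisation tools. The format below (a JSON payload, zlib compression, a 4-byte length header) is an assumption for illustration only; encryption is deliberately omitted, since a real system would apply an authenticated cipher to the compressed payload rather than anything sketched here.

```python
import json
import struct
import zlib

def pack_virtual_data(target_object, range_params):
    """Serialise, compress, and frame the target object plus its
    superposition range parameters into one 'virtual data packet'."""
    payload = json.dumps(
        {"object": target_object, "range": range_params}
    ).encode("utf-8")
    compressed = zlib.compress(payload)
    # 4-byte big-endian length header so the receiver can frame the packet.
    return struct.pack(">I", len(compressed)) + compressed

def unpack_virtual_data(packet):
    """Reverse of pack_virtual_data: read the header, decompress, parse."""
    (length,) = struct.unpack(">I", packet[:4])
    payload = zlib.decompress(packet[4:4 + length])
    data = json.loads(payload.decode("utf-8"))
    return data["object"], data["range"]

# Hypothetical target object and superposition range parameters.
obj = {"id": "arrow-01", "mesh": "arrow.obj"}
rng = {"x": 120, "y": 80, "w": 64, "h": 64}
packet = pack_virtual_data(obj, rng)
print(unpack_virtual_data(packet) == (obj, rng))  # → True
```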
The present invention uses a low-light night-vision device to enhance the environment under dim light and then locates and superimposes virtual information in the returned image, solving the technical problem that existing AR wearable devices cannot superimpose virtual information on real objects when the ambient light is weak, and thereby improving the usability of AR wearable devices in dim light.
The specific embodiments described above further explain the purpose, technical solution, and beneficial effects of the present invention in detail. It should be understood that the above are only specific embodiments of the present invention and are not intended to limit its protection scope. In particular, for those skilled in the art, any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall be included in the protection scope of the present invention.
Claims (10)
1. An AR display method for low-light conditions, characterized in that it comprises:
performing a panoramic scan of the actual scene with a low-light night-vision device mounted on an AR wearable device to obtain a live image;
transferring the live image to the AR wearable device over a data cable and generating a virtual image;
performing a pose calculation from the position and angle relationship between the low-light night-vision device and the AR wearable device, and obtaining and saving the relative spatial pose between the low-light night-vision device and the AR wearable device;
based on the relative spatial pose, adjusting the shape and/or angle of the virtual image so that it meets the display requirements of the AR wearable device;
sending the adjusted virtual image to the AR wearable device, so that the AR wearable device superimposes the adjusted virtual image on the real scene to complete the display.
2. The AR display method for low-light conditions of claim 1, wherein performing the pose calculation from the position and angle relationship between the low-light night-vision device and the AR wearable device comprises: extracting feature points from the live image, and extracting from the live image a corner-feature map covering the display screen; and, after the feature points are extracted, obtaining the corresponding matching relationship.
3. The AR display method for low-light conditions of claim 1, wherein before adjusting, based on the relative spatial pose, the shape and/or angle of the virtual image so that it meets the display requirements of the AR wearable device, the method further comprises: setting a corresponding connected-domain identifier on the virtual image; identifying from the virtual image the local image that contains the connected-domain identifier; and determining, based on the connected-domain identifier and the local image, the relative positional relationship between the scene environment and the AR wearable device.
4. The AR display method for low-light conditions of claim 3, wherein identifying from the virtual image the local image that contains the connected-domain identifier comprises: obtaining the space-invariant features of a pre-stored reference image of the connected-domain identifier; and identifying from the virtual image the image region that contains the space-invariant features of the reference image as the local image.
5. The AR display method for low-light conditions of claim 4, wherein determining, based on the connected-domain identifier and the local image, the relative positional relationship between the scene environment and the AR wearable device comprises: establishing a world coordinate system; determining the position of the scene environment in the world coordinate system from the position of the connected-domain identifier on the virtual image; calculating the relative relationship between the local image and the reference image from their non-space-invariant features; and determining the relative positional relationship between the scene environment and the AR device based on the relative relationship between the local image and the reference image and the position of the connected-domain identifier on the virtual image.
6. The AR display method for low-light conditions of claim 1, wherein after the live image is transferred to the AR wearable device over the data cable and the virtual image is generated, the method further comprises: after the AR wearable device enters a debugging mode, displaying the virtual image and the live image on the same interface, setting virtual scales in the virtual image, and adjusting the virtual scales to achieve a preliminary alignment of the virtual image with the scene environment.
7. The AR display method for low-light conditions of claim 6, wherein after the virtual scales are set in the virtual image, the method further comprises: adjusting each virtual scale in the virtual image to coincide with the corresponding shot scale in the live image.
8. The AR display method for low-light conditions of claim 7, wherein a virtual scale is a scale on a virtual marker in the virtual image, and a shot scale is a scale on a shot marker in the captured field scene.
9. The AR display method for low-light conditions of claim 1, wherein sending the adjusted virtual image to the AR wearable device so that the AR wearable device superimposes the adjusted virtual image on the real scene comprises: obtaining the superposition range parameters of the superposed target object in the virtual image of the AR wearable device; packing the target object and the superposition range parameters to obtain a virtual data packet; sending the virtual data packet to the AR wearable device; and invoking the virtual data packet in the AR wearable device to superimpose the virtual target object into the video scene of the AR wearable device according to the superposition range parameters.
10. The AR display method for low-light conditions of claim 9, wherein before the target object and the superposition range parameters are packed, the method further comprises: compressing and encrypting this group of data.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910410282.5A CN110223394A (en) | 2019-05-16 | 2019-05-16 | A kind of AR display methods under faint light condition |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110223394A true CN110223394A (en) | 2019-09-10 |
Family
ID=67821160
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910410282.5A Pending CN110223394A (en) | 2019-05-16 | 2019-05-16 | A kind of AR display methods under faint light condition |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110223394A (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106875493A (en) * | 2017-02-24 | 2017-06-20 | 广东电网有限责任公司教育培训评价中心 | The stacking method of virtual target thing in AR glasses |
CN206311847U (en) * | 2016-12-23 | 2017-07-07 | 王国清 | A kind of Night vision helmet of use AR technologies |
CN107340870A (en) * | 2017-07-13 | 2017-11-10 | 深圳市未来感知科技有限公司 | A kind of fusion VR and AR virtual reality display system and its implementation |
WO2018019272A1 (en) * | 2016-07-29 | 2018-02-01 | 成都理想境界科技有限公司 | Method and apparatus for realizing augmented reality on the basis of plane detection |
CN108762501A (en) * | 2018-05-23 | 2018-11-06 | 歌尔科技有限公司 | AR display methods, intelligent terminal, AR equipment and system |
CN109491497A (en) * | 2018-10-19 | 2019-03-19 | 华中科技大学 | A kind of human assistance assembly application system based on augmented reality |
CN109725733A (en) * | 2019-01-25 | 2019-05-07 | 中国人民解放军国防科技大学 | Human-computer interaction method and human-computer interaction equipment based on augmented reality |
- 2019
- 2019-05-16: application CN201910410282.5A filed (CN); published as CN110223394A, status pending
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111812841A (en) * | 2020-03-06 | 2020-10-23 | 谷东科技有限公司 | Volume holographic grating two-dimensional pupil expanding waveguide plate and pupil expanding method thereof |
CN111812841B (en) * | 2020-03-06 | 2023-07-07 | 谷东科技有限公司 | Volume holographic grating two-dimensional pupil expansion waveguide piece and pupil expansion method thereof |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104835117B (en) | Spherical panorama generation method based on overlapping mode | |
CN105809701B (en) | Panoramic video posture scaling method | |
CN106101689B (en) | The method that using mobile phone monocular cam virtual reality glasses are carried out with augmented reality | |
TWI397317B (en) | Method for providing output image in either cylindrical mode or perspective mode | |
CN104519340B (en) | Panoramic video joining method based on many depth images transformation matrix | |
Barreto et al. | Issues on the geometry of central catadioptric image formation | |
CN109003311B (en) | Calibration method of fisheye lens | |
CN110782394A (en) | Panoramic video rapid splicing method and system | |
CN108389232A (en) | Irregular surfaces projected image geometric correction method based on ideal viewpoint | |
CN108830940A (en) | Hiding relation processing method, device, terminal device and storage medium | |
WO2017217296A1 (en) | Image processing device | |
CN110855972A (en) | Image processing method, electronic device, and storage medium | |
JP2008033531A (en) | Method for processing information | |
CN107145224B (en) | Human eye sight tracking and device based on three-dimensional sphere Taylor expansion | |
CN108307183A (en) | Virtual scene method for visualizing and system | |
CN106886976B (en) | Image generation method for correcting fisheye camera based on internal parameters | |
CN108717704A (en) | Method for tracking target, computer installation based on fish eye images and computer readable storage medium | |
CN107845056A (en) | Fish eye images panorama generation method based on cylinder model | |
KR20120008191A (en) | A method and device for display of mobile device, and mobile device using the same | |
CN208506731U (en) | Image display systems | |
CN110223394A (en) | A kind of AR display methods under faint light condition | |
CN110430421A (en) | A kind of optical tracking positioning system for five face LED-CAVE | |
CN207366930U (en) | A kind of 3D stereopsis training system | |
Xiang et al. | Towards mobile projective AR for construction co-robots | |
CN109963143A (en) | A kind of image acquiring method and system of AR glasses |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20190910 |