CN112435347A - Augmented reality e-book reading system and method - Google Patents


Info

Publication number
CN112435347A
CN112435347A
Authority
CN
China
Prior art keywords
book
augmented reality
computing unit
plane
calibration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011320395.5A
Other languages
Chinese (zh)
Inventor
于�玲
孙善宝
罗清彩
谭强
徐驰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jinan Inspur Hi Tech Investment and Development Co Ltd
Original Assignee
Jinan Inspur Hi Tech Investment and Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jinan Inspur Hi Tech Investment and Development Co Ltd
Priority to CN202011320395.5A
Publication of CN112435347A


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F15/00 Digital computers in general; Data processing equipment in general
    • G06F15/02 Digital computers in general; Data processing equipment in general manually operated with input through keyboard and computation using a built-in program, e.g. pocket calculators
    • G06F15/025 Digital computers in general; Data processing equipment in general manually operated with input through keyboard and computation using a built-in program, e.g. pocket calculators adapted to a specific application
    • G06F15/0291 Digital computers in general; Data processing equipment in general manually operated with input through keyboard and computation using a built-in program, e.g. pocket calculators adapted to a specific application for reading, e.g. e-books
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B2027/0178 Eyeglass type

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Computing Systems (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Optics & Photonics (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides an augmented reality e-book reading system and method, belonging to the fields of augmented reality, artificial intelligence and computer vision. The system analyzes environmental information with a computer vision positioning and mapping method based on feature point matching, finds a placement plane for the e-book, records feature points, and projects the e-book in augmented reality glasses. The hardware module comprises a computing unit, an imaging system and a front camera that can be fixed in front of the head; the software module comprises the Android OS and a SLAM framework. By collecting and analyzing environmental information, the invention finds planes and feature anchor points in the environment and projects the e-book into the user's field of view through the augmented reality glasses, so that text is no longer rendered as plain, monotonous characters but can carry simulation effects, such as projecting a book shape and turning pages with gestures. The invention also protects privacy, allows the display effect to be adjusted freely, and is no longer limited by a fixed screen size.

Description

Augmented reality e-book reading system and method
Technical Field
The invention relates to the fields of augmented reality, artificial intelligence and computer vision, and in particular to an augmented reality e-book reading system and method.
Background
Current e-book readers still display content on a screen through software on mobile phones, tablets and similar devices, and have seen no substantial evolution for a long time.
When reading, the user is limited by a fixed screen size, and text is rendered as plain, monotonous characters with no simulation effect. In addition, traditional e-readers protect privacy poorly, and the display effect cannot be adjusted freely to the reader's needs.
Disclosure of Invention
The technical task of the invention is to provide an augmented reality e-book reading system and method that remedies the above defects of the prior art.
The technical scheme adopted by the invention for solving the technical problems is as follows:
1. the invention provides an augmented reality electronic book reading system, which analyzes environmental information through a computer vision positioning and mapping method based on feature point matching, finds a placement plane of an electronic book reader, records feature points and projects an electronic book in augmented reality glasses;
the system hardware module comprises a computing unit, an imaging system and a front camera which can be fixed in front of the head, wherein the front camera is responsible for image acquisition and is sent to the computing unit for analysis and processing; the computing unit is used for analyzing the image acquired by the front camera, selecting a characteristic point or a plane and projecting the electronic book to the position; the imaging system is used for generating an anchor point, focusing the gaze on the anchor point and calculating a calibration parameter; the software module comprises an Android OS and a SLAM framework.
Optionally, the projection location is a fixed position in the environment or a fixed position in the field of view, and the projection form is plain text and/or a simulated book.
Optionally, the computing unit is either a dedicated computing unit for the AR glasses or an Android mobile phone; in both cases it is based on an ARM processor.
Optionally, the imaging system is an arrayed optical waveguide lens, in a monocular or binocular configuration.
Optionally, the system further comprises an eye tracking system: a set of paired cameras that tracks the movement of the eyeballs in real time and, after calibration, models the line of sight, so that the imaging position relative to the user's gaze can be computed and the target projected naturally at the focus of the gaze.
Optionally, during gaze calibration of the eye tracking system, several anchor points are displayed in the field of view; the user focuses on each anchor while the system records the corresponding eye gaze pattern, from which a set of calibration parameters is computed.
Optionally, the SLAM framework is the open-source ORB-SLAM, or the built-in SLAM framework of ARCore.
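The feature point matching that underpins ORB-SLAM-style positioning, mentioned above, compares binary descriptors by Hamming distance (ORB descriptors are 256-bit strings). The sketch below illustrates only that matching step, under stated assumptions: the descriptors are synthetic stand-ins for real ORB output, and `match_hamming` is a hypothetical helper, not an API of ORB-SLAM or ARCore.

```python
import numpy as np

def match_hamming(desc_a, desc_b, max_dist=30):
    """Brute-force Hamming matching of binary descriptors (e.g. ORB's 256-bit).

    desc_a, desc_b: (N, 32) uint8 arrays, one 32-byte descriptor per row.
    Returns a list of (i, j) index pairs whose distance is below max_dist
    and that survive a cross-check (mutual nearest neighbours).
    """
    # XOR every pair of descriptors, then count differing bits.
    xor = desc_a[:, None, :] ^ desc_b[None, :, :]
    dist = np.unpackbits(xor, axis=2).sum(axis=2)
    matches = []
    for i in range(len(desc_a)):
        j = int(dist[i].argmin())
        if dist[i, j] < max_dist and int(dist[:, j].argmin()) == i:
            matches.append((i, j))
    return matches

# Fake descriptors: b is a shuffled copy of a with one bit flipped per row,
# so every descriptor has exactly one near-identical partner.
rng = np.random.default_rng(2)
a = rng.integers(0, 256, (10, 32), dtype=np.uint8)
perm = rng.permutation(10)
b = a[perm].copy()
b[:, 0] ^= 1                       # Hamming distance 1 to its partner
matches = match_hamming(a, b)
print(len(matches), all(perm[j] == i for i, j in matches))
```

Cross-checking (keeping a pair only when each descriptor is the other's nearest neighbour) is a common way to suppress spurious matches before they reach pose estimation.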
2. The invention also provides an augmented reality electronic book reading method, which is based on an electronic book reading system, wherein the system analyzes environmental information through a computer vision positioning and mapping method based on feature point matching, finds a placement plane of an electronic book reader, records feature points, and projects an electronic book in augmented reality glasses;
the system comprises a hardware module, a software module and a control module, wherein the hardware module comprises a computing unit, an imaging system and a front camera which can be fixed in front of the head, the software module comprises an Android OS and a SLAM frame,
the method comprises the following implementation steps: eyeball position calibration, characteristic point detection, plane detection, characteristic point recording, projection by selecting characteristic points and imaging according to calibration data, specifically:
s1, generating an anchor point in the imaging system, focusing the gaze on the anchor point, and calculating calibration parameters;
s2, generating the characteristics of the image collected by the front camera by using a local characteristic extraction method;
s3, carrying out plane detection through local features and the combination of ARCORes;
s4, storing the recorded feature points for subsequent positioning and drawing;
s5, selecting a feature point or a plane by a controller or a gesture, etc., and projecting the electronic book to the position.
Compared with the prior art, the augmented reality e-book reading system and method of the invention have the following advantages:
the system finds planes and feature anchor points in the environment by collecting and analyzing environmental information, and projects the e-book into the user's field of view through augmented reality glasses, so that text is no longer rendered as plain, monotonous characters but can carry various simulation effects, such as projecting a book shape and turning pages with gestures. At the same time, privacy is protected, the display effect can be adjusted freely, and the reader is no longer limited by a fixed screen size.
The invention realizes an augmented reality e-book reading system on inexpensive hardware using common open-source languages and libraries, and can broaden the application scenarios of AR glasses.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example one
The system analyzes environmental information through a computer vision positioning and mapping method based on feature point matching, finds a placement plane for the e-book, records feature points, and projects the e-book in augmented reality glasses.
The system hardware module comprises a computing unit, an imaging system and a front camera that can be fixed in front of the head. The front camera captures images and sends them to the computing unit for analysis; the computing unit analyzes the captured images, selects a feature point or plane, and projects the e-book to that position; the imaging system generates anchor points, lets the user focus the gaze on them, and computes calibration parameters. The software module comprises the Android OS and a SLAM framework.
The projection location is a fixed position in the environment or a fixed position in the field of view, and the projection form is plain text and/or a simulated book.
The computing unit is either a dedicated computing unit for the AR glasses or an Android mobile phone; in both cases it is based on an ARM processor.
The imaging system is an arrayed optical waveguide lens, in a monocular or binocular configuration.
The system further comprises an eye tracking system: a set of paired cameras that tracks the movement of the eyeballs in real time and, after calibration, models the line of sight, so that the imaging position relative to the user's gaze can be computed and the target projected naturally at the focus of the gaze.
During gaze calibration of the eye tracking system, several anchor points are displayed in the field of view; the user focuses on each anchor while the system records the corresponding eye gaze pattern, from which a set of calibration parameters is computed.
The SLAM framework is the open-source ORB-SLAM, or the built-in SLAM framework of ARCore.
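The calibration procedure above (display anchors, have the user fixate them, record the eye pattern, solve for parameters) can be modeled, in its simplest form, as fitting an affine map from measured pupil coordinates to the known anchor coordinates by least squares. The function names and the simulated measurements below are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def calibrate_gaze(pupil_xy, anchor_xy):
    """Fit an affine map from pupil coordinates to display coordinates.

    pupil_xy:  (N, 2) pupil centres measured while the user fixates anchors.
    anchor_xy: (N, 2) known anchor positions in the imaging system.
    Returns a (3, 2) parameter matrix A with [x, y, 1] @ A ~= anchor.
    """
    n = pupil_xy.shape[0]
    X = np.hstack([pupil_xy, np.ones((n, 1))])   # homogeneous coordinates
    A, *_ = np.linalg.lstsq(X, anchor_xy, rcond=None)
    return A

def gaze_to_display(pupil, A):
    """Map one pupil measurement to a display position using calibration A."""
    return np.hstack([pupil, 1.0]) @ A

# Simulated session: the true mapping is scale 2 plus an offset; five anchors.
true_A = np.array([[2.0, 0.0], [0.0, 2.0], [10.0, 20.0]])
anchors = np.array([[100, 100], [500, 100], [300, 300],
                    [100, 500], [500, 500]], dtype=float)
pupils = (anchors - true_A[2]) / 2.0   # invert the true map to fake measurements
A = calibrate_gaze(pupils, anchors)
print(np.allclose(A, true_A))
```

Real eye trackers typically use richer models (polynomial mappings or 3-D eyeball models), but an affine fit already captures the scale and offset between pupil space and display space.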
Example two
The invention relates to an augmented reality e-book reading method, which is based on the e-book reading system described above: the system analyzes environmental information through a computer vision positioning and mapping method based on feature point matching, finds a placement plane for the e-book, records feature points, and projects the e-book in augmented reality glasses.
The system comprises a hardware module, a software module and a control module; the hardware module includes a computing unit, an imaging system and a front camera that can be fixed in front of the head, and the software module includes the Android OS and a SLAM framework.
The method comprises the following steps: eye position calibration, feature point detection, plane detection, feature point recording, selecting a feature point for projection, and imaging according to the calibration data. Specifically:
S1, generate anchor points in the imaging system, focus the gaze on the anchor points, and compute calibration parameters;
S2, extract features from the images captured by the front camera using a local feature extraction method;
S3, perform plane detection by combining the local features with ARCore;
S4, store the recorded feature points for subsequent positioning and mapping;
S5, select a feature point or plane via a controller, gesture or similar input, and project the e-book to that position.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (8)

1. An augmented reality e-book reading system is characterized in that the system analyzes environmental information through a computer vision positioning and mapping method based on feature point matching, finds a placement plane of an e-book reader, records feature points, and projects an e-book in augmented reality glasses;
the system hardware module comprises a computing unit, an imaging system and a front camera which can be fixed in front of the head, wherein the front camera is responsible for image acquisition and is sent to the computing unit for analysis and processing; the computing unit is used for analyzing the image acquired by the front camera, selecting a characteristic point or a plane and projecting the electronic book to the position; the imaging system is used for generating an anchor point, focusing the gaze on the anchor point and calculating a calibration parameter; the software module comprises an Android OS and a SLAM framework.
2. An augmented reality e-book reading system as claimed in claim 1, wherein the projection location is a fixed location in the environment or a fixed location in the field of view, the projection being in the form of a plain text and/or book simulation.
3. The system of claim 1, wherein the computing unit is a computing unit dedicated to AR glasses or a mobile phone with Android system, and the computing unit is based on an ARM processor.
4. The augmented reality electronic book reading system of claim 1, wherein the imaging system is an arrayed optical waveguide lens, a monocular lens or a binocular lens.
5. The system of claim 1, further comprising an eye tracking system, wherein the eye tracking system is a set of paired cameras for tracking eye movement trajectory in real time and simulating the line of sight by calibration, so as to calculate an imaging position relative to the line of sight, and enable the target to project more naturally on the focal point of the line of sight.
6. The system of claim 5, wherein the eye tracking system is calibrated by loading several anchor points in the visual field, focusing the eye on the anchor points, and detecting the eye gaze pattern to calculate a set of parameters for calibration.
7. The system of claim 1, wherein the SLAM framework uses the open-source ORB-SLAM or the built-in SLAM framework of ARCore.
8. An augmented reality e-book reading method is characterized in that the method is based on an e-book reading system, the system analyzes environmental information through a computer vision positioning and mapping method based on feature point matching, finds a placement plane of an e-book reader and records feature points, and projects the e-book in augmented reality glasses;
the system comprises a hardware module, a software module and a control module, wherein the hardware module comprises a computing unit, an imaging system and a front camera which can be fixed in front of the head, the software module comprises an Android OS and a SLAM frame,
the method comprises the following implementation steps: eyeball position calibration, characteristic point detection, plane detection, characteristic point recording, projection by selecting characteristic points and imaging according to calibration data, specifically:
s1, generating an anchor point in the imaging system, focusing the gaze on the anchor point, and calculating calibration parameters;
s2, generating the characteristics of the image collected by the front camera by using a local characteristic extraction method;
s3, carrying out plane detection through local features and the combination of ARCORes;
s4, storing the recorded feature points for subsequent positioning and drawing;
s5, selecting a feature point or a plane by a controller or a gesture, etc., and projecting the electronic book to the position.
CN202011320395.5A, priority and filing date 2020-11-23, "Augmented reality e-book reading system and method", published as CN112435347A, status: Pending.

Priority Applications (1)

Application number: CN202011320395.5A; priority date: 2020-11-23; filing date: 2020-11-23; title: Augmented reality e-book reading system and method


Publications (1)

Publication number: CN112435347A; publication date: 2021-03-02

Family

ID=74692887

Family Applications (1)

Application number: CN202011320395.5A; filing date: 2020-11-23; title: Augmented reality e-book reading system and method; status: Pending

Country Status (1)

Country Link
CN (1) CN112435347A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113163134A (en) * 2021-04-21 2021-07-23 山东新一代信息产业技术研究院有限公司 Harsh environment vision enhancement method and system based on augmented reality

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106056092A (en) * 2016-06-08 2016-10-26 华南理工大学 Gaze estimation method for head-mounted device based on iris and pupil
CN108681389A (en) * 2018-05-11 2018-10-19 亮风台(上海)信息科技有限公司 A kind of method and apparatus read by arrangement for reading
CN108885803A (en) * 2016-03-30 2018-11-23 微软技术许可有限责任公司 Virtual object manipulation in physical environment
CN109086726A (en) * 2018-08-10 2018-12-25 陈涛 A kind of topography's recognition methods and system based on AR intelligent glasses
CN110310373A (en) * 2019-06-28 2019-10-08 京东方科技集团股份有限公司 A kind of image processing method and augmented reality equipment of augmented reality equipment
KR20190128962A (en) * 2018-05-09 2019-11-19 서강대학교산학협력단 METHOD AND WEARABLE DISPLAY APPARATUS FOR PROVIDING eBOOK BASED ON AUGMENTED REALLITY
CN111510701A (en) * 2020-04-22 2020-08-07 Oppo广东移动通信有限公司 Virtual content display method and device, electronic equipment and computer readable medium
CN111739359A (en) * 2020-06-30 2020-10-02 上海乂学教育科技有限公司 Augmented reality courseware generation system
CN211786373U (en) * 2020-03-30 2020-10-27 哈雷医用(广州)智能技术有限公司 Portable AR wears display device


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
YANG Bingshu: "Research on the Application of Mobile Augmented Reality Technology Based on ORB-SLAM", China Masters' Theses Full-text Database, Information Science and Technology Series *


Similar Documents

Publication Publication Date Title
CN109086726B (en) Local image identification method and system based on AR intelligent glasses
CN114020156B (en) Wearable device capable of eye tracking
Itoh et al. Interaction-free calibration for optical see-through head-mounted displays based on 3d eye localization
US9898082B1 (en) Methods and apparatus for eye tracking
Mehrubeoglu et al. Real-time eye tracking using a smart camera
US20190102956A1 (en) Information processing apparatus, information processing method, and program
US10254831B2 (en) System and method for detecting a gaze of a viewer
KR101706992B1 (en) Apparatus and method for tracking gaze, recording medium for performing the method
US9622654B2 (en) Device for and method of corneal imaging
US9696798B2 (en) Eye gaze direction indicator
Rakhmatulin et al. Deep neural networks for low-cost eye tracking
GB2560340A (en) Verification method and system
WO2015026645A1 (en) Automatic calibration of scene camera for optical see-through head mounted display
CN107656619A (en) A kind of intelligent projecting method, system and intelligent terminal
CN110051319A (en) Adjusting method, device, equipment and the storage medium of eyeball tracking sensor
Lander et al. hEYEbrid: A hybrid approach for mobile calibration-free gaze estimation
Rakhmatulin A review of the low-cost eye-tracking systems for 2010-2020
CN112435347A (en) E-book reading system and method for enhancing reality
KR20190102651A (en) Apparatus and Method for Digital Holographic Display
Parada et al. ExpertEyes: Open-source, high-definition eyetracking
EP3051386A1 (en) Eye tracking system and method
CN116382473A (en) Sight calibration, motion tracking and precision testing method based on self-adaptive time sequence analysis prediction
JP2015123262A (en) Sight line measurement method using corneal surface reflection image, and device for the same
Lander et al. Eyemirror: Mobile calibration-free gaze approximation using corneal imaging
CN106662911A (en) Gaze detector using reference frames in media

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210302