US20160171277A1 - Method and system for visually-biased sensory-enhanced e-reading - Google Patents

Method and system for visually-biased sensory-enhanced e-reading

Info

Publication number
US20160171277A1
Authority
US
United States
Prior art keywords
user
reader
eye movement
indicator
visual enhancement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/570,609
Inventor
Sarah FLAWN
Benjamin Landau
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kobo Inc
Rakuten Kobo Inc
Original Assignee
Kobo Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US14/533,700 (published as US20160124505A1)
Priority claimed from US14/533,890 (published as US20160121348A1)
Application filed by Kobo Inc
Priority to US14/570,609
Assigned to Kobo Inc.: assignment of assignors' interest (see document for details). Assignors: FLAWN, SARAH; LANDAU, BENJAMIN
Assigned to Rakuten Kobo Inc.: change of name (see document for details). Assignor: Kobo Inc.
Publication of US20160171277A1
Legal status: Abandoned

Classifications

    • G06K9/0061
    • G06F3/013 Eye tracking input arrangements
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G06F3/0483 Interaction with page-structured environments, e.g. book metaphor
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06K9/00617
    • G06T7/20 Analysis of motion
    • G06V40/193 Eye characteristics; Preprocessing; Feature extraction
    • G06V40/197 Eye characteristics; Matching; Classification

Definitions

  • Examples described herein relate to a system and method for visually-biased sensory-enhanced e-Reading.
  • An electronic personal display is a mobile computing device that displays information to a user. While an electronic personal display may be capable of many of the functions of a personal computer, a user can typically interact directly with an electronic personal display without the use of a keyboard that is separate from, or coupled to but distinct from, the electronic personal display itself.
  • Some examples of electronic personal displays include mobile digital devices/tablet computers (e.g., Apple iPad®, Microsoft® Surface™, Samsung Galaxy Tab® and the like), handheld multimedia smartphones (e.g., Apple iPhone®, Samsung Galaxy S®, and the like), and handheld electronic readers (e-readers) (e.g., Amazon Kindle®, Barnes and Noble Nook®, Kobo Aura HD, Kobo Aura H2O, Kobo GLO and the like).
  • a purpose-built device may include a display that reduces glare, performs well in high lighting conditions, and/or mimics the look of text as presented via actual discrete pages of paper. While such purpose-built devices may excel at displaying content for a user to read, they may also perform other functions, such as displaying images, emitting audio, recording audio, and web surfing, among others.
  • Electronic personal displays are among numerous kinds of consumer devices that can receive services and utilize resources across a network service. Such devices can operate applications or provide other functionality that links a device to a particular account of a specific service.
  • the electronic reader (e-reader) devices typically link to an online bookstore, and media playback devices often include applications that enable the user to access an online media electronic library (or e-library).
  • the user accounts can enable the user to receive the full benefit and functionality of the device.
  • Such devices may incorporate a touch screen display having integrated touch sensors and touch sensing functionality, whereby user input commands via touch-based gestures are received thereon.
  • FIG. 1 illustrates a system utilizing applications and providing e-book services on a computing device configured for operation of an e-book reading launch interface, in an embodiment.
  • FIG. 2 illustrates a schematic architecture of a computing device for configuring and launching an e-book reading interface, according to an embodiment.
  • FIG. 3 illustrates example embodiments for visually-biased sensory-enhanced e-reading.
  • FIG. 4 illustrates a method of operation for configuring and launching an e-book reading interface on a computing device having a touchscreen display, according to an embodiment.
  • Embodiments include eye tracking while e-reading and providing visual enhancements based on the eye tracking.
  • booklovers will be able to select an immersive reading experience based on visual sensory enhancements. For example, when reaching the climax of a horror novel (end of chapter or end of book), or when triggering a specific word such as “murder” or “blood”, a faint red light, or blotches of red light, could begin pulsating behind the text.
  • red blood or claw marks could appear in the background, in the margins of the page, or as a translucent overlay.
  • the just-read word “ocean” could trigger blue illumination in the background or subtle ripples behind the text like waves on the surface of the sea.
  • Embodiments include a multi-layered sensory-driven reading experience for sight that includes an extensive electronic depository of words that trigger corresponding images or other visual enhancements such as the examples above.
  • the feature could also be customizable, allowing users to program certain words to trigger particular images or image types.
  • E-books are a form of electronic publication content stored in digital format in a computer non-transitory memory, viewable on a computing device having display functionality.
  • An e-book can correspond to, or mimic, the paginated format of a printed publication for viewing, such as provided by printed literary works (e.g., novels) and periodicals (e.g., magazines, comic books, journals, etc.).
  • some e-books may have chapter designations, as well as content that corresponds to graphics or images (e.g., such as in the case of magazines or comic books).
  • Multi-function devices, such as cellular-telephony or messaging devices, can utilize specialized applications (e.g., specialized e-reading application software) to view e-books in a format that mimics the paginated printed publication.
  • some devices can display digitally-stored content in a more reading-centric manner, while also providing, via a user input interface, the ability to manipulate that content for viewing, such as via discrete pages arranged sequentially (that is, pagination) corresponding to an intended or natural reading progression, or flow, of the content therein.
  • an “e-reading device”, variously referred to herein as an electronic personal display or mobile computing device, can refer to any computing device that can display or otherwise render an e-book.
  • an e-reading device can include a mobile computing device on which an e-reading application can be executed to render content that includes e-books (e.g., comic books, magazines, etc.).
  • Such mobile computing devices can include, for example, a multi-functional computing device for cellular telephony/messaging (e.g., feature phone or smart phone), a tablet computer device, an ultra-mobile computing device, or a wearable computing device with a form factor of a wearable accessory device (e.g., smart watch or bracelet, glass-wear integrated with a computing device, etc.).
  • an e-reading device can include an e-reader device, such as a purpose-built device that is optimized for an e-reading experience (e.g., with E-ink displays).
  • a digitally rendered e-book may be configured in other, more fluid arrangements that allow alternative ways for a user to conveniently access a particular content portion or page of the e-book.
  • FIG. 1 illustrates a system 100 for utilizing applications and providing e-book services on a computing device configured for operation of an e-book reading launch interface, according to an embodiment.
  • system 100 includes an electronic personal display device, shown by way of example as an e-reading device 110 , and a network service 121 .
  • the network service 121 can include multiple servers and other computing resources that provide various services in connection with one or more applications that are installed on the e-reading device 110 .
  • the network service 121 may provide visual enhancements that correspond with e-reading content.
  • the network service 121 can provide e-book services that communicate with the e-reading device 110 .
  • the e-book services provided through network service 121 can, for example, include services in which e-books are sold, shared, downloaded and/or stored. More generally, the network service 121 can provide various other content services, including content rendering services (e.g., streaming media) or other network application environments or services.
  • the e-reading device 110 can correspond to any electronic personal display device on which applications and application resources (e.g., e-books, media files, documents) can be rendered and consumed.
  • the e-reading device 110 can correspond to a tablet or a telephony/messaging device (e.g., smart phone).
  • e-reading device 110 can run an e-reader application that links the device to the network service 121 and enables e-books provided through the service to be viewed and consumed by way of e-reading.
  • the e-reading device 110 can run a media playback or streaming application that receives files or streaming data from the network service 121 .
  • the e-reading device 110 can be equipped with hardware and software to optimize certain application activities, such as reading electronic content (e.g., e-books).
  • the e-reading device 110 can have a tablet-like form factor, although variations are possible.
  • the e-reading device 110 can also have an E-ink display.
  • the network service 121 can include a device interface 128 , a content store 122 and a user account electronic library (e-library) 124 storing e-books or digital content items.
  • Content store 122 may be an online store for purchasing of digital content items for download therefrom onto a resident memory of e-reading device 110 and/or user account e-library 124 .
  • User account e-library 124 associates the e-reading device 110 with a user having an account 123 .
  • the account 123 can also be associated with ownership of, and/or accessibility to, one or more digital content items stored in content store 122 .
  • the digital content items are e-books, and the content store 122 is an online store having e-books for purchase or other licensed use.
  • the device interface 128 can handle requests from the e-reading device 110 with regard to services and functionality of the network service 121 .
  • the device interface 128 can utilize information provided with user account 123 in order to enable services, such as purchasing and downloading of e-books into user account e-library 124 , and determining what e-books and content items providable via content store 122 are associated with, and accessible to, user account 123 .
  • the device interface 128 can provide the e-reading device 110 with access to the on-line content store 122 .
  • the device interface 128 can handle input to identify content items (e.g., e-books), and further to link content items to the account 123 of the user.
  • the user account e-library 124 can retain metadata for individual accounts 123 to identify e-books or other digital content items that have been purchased or made available for consumption for a given account.
  • information relating to e-books within user account e-library 124 can include a metadata set in addition to substantive digital text and image content portions.
  • the metadata set can include, for example, information such as the graphic representation of the e-book, such as including artwork- or image-based representation of a counterpart physical paper book cover, as well as summary information, author information, title, short synopsis or book review, publication date and language of the e-book, and book or volume series information.
  • the e-reading device 110 may be associated with the user account 123 , and multiple devices may be associated with the same account. As described in greater detail below, e-reading device 110 can locally store content items (e.g., e-books) that are purchased or otherwise made available to the user of the e-reading device 110 as well as to archive, in user account 124 , e-books and other digital content items that have been purchased for the user account 123 , but are not necessarily stored in local resident memory of computing device 110 .
  • content items e.g., e-books
  • e-reading device 110 can include a touchscreen display 116 .
  • the display screen 116 is touch-sensitive, to process touch inputs including gestures (e.g., swipes).
  • the display screen 116 may be integrated with one or more touch sensors 138 to provide a touch-sensing region on a surface of the display screen 116 .
  • the one or more touch sensors 138 may include capacitive sensors that can sense or detect a human body's capacitance as input.
  • the touch-sensing region coincides with a substantial surface area, if not all, of the display screen 116 .
  • the e-reading device 110 includes features for providing functionality related to displaying paginated content, including paginated content comprising an e-magazine or e-comic book.
  • the e-reading device 110 can include page transitioning logic, which enables the user to transition through paginated content.
  • the e-reading device 110 can display pages of e-books, e-magazines and e-comics, and enable the user to transition from one page state to another.
  • an e-book can provide content that is rendered sequentially in pages, and the e-book can display page states in the form of single pages, multiple pages or portions thereof. Accordingly, a given page state can coincide with, for example, a single page, or two or more pages displayed at once.
  • Page transitioning logic can operate to enable the user to transition from a given page state to another page state. In the specific example embodiment where a given page state coincides with a single page, each page state corresponds to one page of the digitally constructed, ordered sequence of pages paginated to comprise, in one embodiment, an e-book. In some implementations, the page transitioning logic enables single page transitions, chapter transitions, or cluster transitions (multiple pages at one time).
  • the e-reading device 110 includes display sensor logic 135 to detect and interpret user input or user input commands made through interaction with the touch sensors 138 .
  • display sensor logic 135 can detect a user making contact with the touch-sensing region of the display screen 116, otherwise known as a touch event. More specifically, display sensor logic 135 can detect touch events such as a tap; an initial tap held in contact with display screen 116 for longer than some pre-defined threshold duration of time (otherwise known as a “long press” or a “long touch”); multiple taps performed either sequentially or generally simultaneously; swiping gesture actions made through user interaction with the touch-sensing region of the display screen 116; or any combination of these gesture actions.
  • display sensor logic 135 can interpret such interactions in a variety of ways. For example, each such interaction may be interpreted as a particular type of user input associated with a respective input command, execution of which may trigger a change in state of display 116 .
  • display sensor logic 135 implements operations to monitor for the user contacting or superimposing upon, using a finger, thumb or stylus, a surface of display 116 coinciding with a placement of one or more touch sensor components 138 , that is, a touch event, and also detects and correlates a particular gesture action (e.g., pinching, swiping, tapping, etc.) as a particular type of input command.
  • Display sensor logic 135 is also responsive to the user's eye contact with various words or text, which may initiate presentation of visual enhancements that correspond with e-reading content.
  • Display sensor logic 135 may also sense directionality of a user gesture action so as to distinguish between, for example, leftward, rightward, upward, downward and diagonal swipes along a surface portion of display screen 116 for the purpose of associating respective user input commands therewith.
  • E-library view (or interface) logic 120 provides an interface, displayable via display screen 116 of computing device 110 , showing titles in a user's e-library collection of e-books, or from a user's home page in relation to an online content store 122 hosting e-books for commercial sale and downloading therefrom.
  • the e-library collection of e-books may be hosted via a remotely located computer server device associated with user account e-library 124, or may be locally resident within a memory at computing device 110.
  • the e-library view logic 120 can display iconic or other graphic representations of individual e-books in the user's e-library collection.
  • the e-library view logic 120 can use the metadata associated with the records of the e-books in the user's e-library account 124 to display lists, folders, or other virtual structures that include graphic representations and/or other identifiers of e-books in the user's collection.
  • the metadata set can include, for example, information such as the graphic representation of the e-book, such as including artwork- or image-based representation of a counterpart physical paper book cover, as well as summary information, author information, title, short synopsis or book review, publication date and language of the e-book, and book or volume series information.
  • the user's collection can include e-books that the user has on the particular device 110 (e.g., locally stored e-books), as well as e-books that are not locally stored, but rather are stored or archived at a remote computer server and associated with the user account e-library 124 .
  • Annotations interface logic module 125 provides an annotations and bookmarking scheme in conjunction with the interface rendered via e-library view logic 120 , providing an annotations interface page(s) to be deployed upon launch in lieu of a table of contents or a first page of an e-book for reading.
  • Launch of the e-book for reading in one embodiment, is triggered by a user enacting a touch event upon a graphical icon representing a specific e-book from an e-library collection, as will be described further in regard to FIGS. 2 and 3 .
  • E-library view logic module 120 and annotations interface logic module 125 can be implemented as software modules comprising instructions stored in a memory of mobile computing device 110 , as described in further detail below with regard to FIG. 2 .
  • e-library view logic module 120, display sensor logic 135, and annotations interface logic module 125 described herein may be implemented using programmatic modules or components.
  • a programmatic module or component may include a program, a subroutine, a portion of a program, or a software or a hardware component capable of performing one or more stated tasks or functions in conjunction with one or more processors.
  • a module or component can exist on a hardware component independently of other modules or components. Alternatively, a module or component can be a shared element or process of other modules, programs and hardware components.
  • e-library view logic module 120 may be implemented through instructions that are executable by one or more processors. These instructions may be stored on a computer-readable non-transitory medium.
  • the numerous computing and communication devices shown with embodiments of the invention include processor(s) and various forms of computer memory, including volatile and non-volatile forms, storing data and instructions.
  • Examples of computer-readable mediums include permanent memory storage devices, such as hard drives on personal computers or servers.
  • Other examples of computer storage mediums include portable storage units, flash or solid-state memory (such as included on many cell phones and consumer electronic devices) and magnetic memory.
  • Computers, terminals, and network-enabled devices are all examples of machines and devices that utilize processors, memory, and instructions stored on computer-readable mediums. Additionally, embodiments may be implemented in the form of computer programs, or a computer-usable storage medium capable of storing such a program.
  • FIG. 2 illustrates a schematic architecture of a computing device for configuring and launching an e-book reading interface, according to an embodiment.
  • E-reading device 110 further includes a processor 210 and a memory 250 storing instructions and logic pertaining at least to display sensor logic 135, e-library view logic module 120, and annotations interface logic 125.
  • Processor 210 can implement functionality using the logic and instructions stored in memory 250 . Additionally, in some implementations, processor 210 communicates with the network service 121 (see FIG. 1 ). More specifically, the e-reading device 110 can access the network service 121 to receive various kinds of resources (e.g., digital content items such as e-books, configuration files, account information), as well as to provide information (e.g., user account information, service requests etc.). For example, e-reading device 110 can receive application resources, such as e-books or media files, that the user elects to purchase or otherwise download via the network service 121 . The application resources that are downloaded onto the e-reading device 110 can be stored in memory 250 .
  • display 116 can correspond to, for example, a liquid crystal display (LCD) or light emitting diode (LED) display that illuminates in order to provide content generated from processor 210 .
  • display 116 can be touch-sensitive.
  • one or more of the touch sensor components 138 may be integrated with display 116 .
  • the touch sensor components 138 may be provided (e.g., as a layer) above or below display 116 such that individual touch sensor components 138 track different regions of display 116 .
  • display 116 can correspond to an electronic paper type display, which mimics conventional paper in the manner in which content is displayed. Examples of such display technologies include electrophoretic displays, electro-wetting displays, and electro-fluidic displays.
  • Processor 210 can receive input from various sources, including touch sensor components 138 , display 116 , keystroke input 208 such as from a virtual or rendered keyboard, and other input mechanisms 299 (e.g., buttons, mouse, microphone, etc.). With reference to examples described herein, processor 210 can respond to input detected at the touch sensor components 138 . In some embodiments, processor 210 responds to inputs from the touch sensor components 138 in order to facilitate or enhance e-book activities such as generating e-book content on display 116 , performing page transitions of the displayed e-book content, powering off the device 110 and/or display 116 , activating a screen saver, launching or closing an application, and/or otherwise altering a state of display 116 .
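  • As an illustration of the input handling described above, the following is a minimal Python sketch (not taken from the patent; all class names and command strings are assumptions) of routing interpreted input commands to e-book activities such as page transitions, powering off the display, or activating a screen saver.

```python
# Illustrative sketch only: map interpreted input commands to device activities.
from typing import Callable, Dict

class EReaderDevice:
    def __init__(self) -> None:
        self.page = 0
        self.display_on = True
        self.screensaver_active = False

    def next_page(self) -> None:
        self.page += 1

    def previous_page(self) -> None:
        self.page = max(0, self.page - 1)

    def power_off_display(self) -> None:
        self.display_on = False

    def activate_screensaver(self) -> None:
        self.screensaver_active = True

def build_dispatch_table(device: EReaderDevice) -> Dict[str, Callable[[], None]]:
    # Each interpreted input command triggers one e-book activity.
    return {
        "swipe_left": device.next_page,
        "swipe_right": device.previous_page,
        "long_press_power": device.power_off_display,
        "idle_timeout": device.activate_screensaver,
    }

device = EReaderDevice()
dispatch = build_dispatch_table(device)
dispatch["swipe_left"]()   # advance one page
print(device.page)         # -> 1
```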
  • memory 250 may store display sensor logic 135 that monitors for user interactions detected through the touch sensor components 138 , and further processes the user interactions as a particular input or type of input.
  • display sensor logic module 135 may be integrated with the touch sensor components 138 .
  • the touch sensor components 138 can be provided as a modular component that includes integrated circuits or other hardware logic, and such resources can provide some or all of display sensor logic 135 .
  • some or all of display sensor logic 135 may be implemented with processor 210 (which utilizes instructions stored in memory 250 ), or with an alternative processing resource.
  • E-reading device 110 further includes wireless connectivity subsystem 213 , comprising a wireless communication receiver, a transmitter, and associated components, such as one or more embedded or internal antenna elements, local oscillators, and a processing module such as a digital signal processor (DSP) (not shown).
  • E-library view logic module 120 can be implemented as a software module, comprising instructions stored in memory 250 , on mobile computing device 110 .
  • the local memory 250 can include records for each e-book in the user's e-library account 124, each record including metadata of the e-book therein.
  • the user may have the content portion of select e-books archived remotely at a computer server cloud system, so as not to reside in the local memory 250 , but be provided by the network service 121 upon request or as needed.
  • the e-library view logic module 120 can display the e-books of a user's collection in the form of a virtual bookshelf or bookcase feature showing graphical icons representing the e-books.
  • the e-books are displayed as icons that include imagery, title information, etc.
  • the e-library view module 120 can display representations of e-books in the user's collection as icons, or as icons with associated text.
  • folders can be used to provide a panel view of the graphic representations (e.g., icons and/or text) of the e-books in the user's e-library collection 124 , corresponding to a side view of a bookshelf showing book spines with titles printed thereon for identifying individual books.
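  • To make the bookshelf concept concrete, below is a minimal, illustrative Python sketch (not from the patent; field names and sample data are assumptions) of e-library records carrying the metadata described above and a listing that distinguishes locally stored e-books from titles archived remotely.

```python
# Illustrative sketch only: e-library records and a simple bookshelf listing.
from dataclasses import dataclass
from typing import List

@dataclass
class EBookRecord:
    title: str
    author: str
    cover_image: str       # artwork- or image-based representation of the cover
    synopsis: str
    publication_date: str
    language: str
    stored_locally: bool   # False => archived at the remote user account e-library

def bookshelf_lines(records: List[EBookRecord]) -> List[str]:
    # Render one "spine" per record, marking archived titles.
    lines = []
    for rec in records:
        marker = "" if rec.stored_locally else " [archived]"
        lines.append(f"{rec.title} - {rec.author}{marker}")
    return lines

library = [
    EBookRecord("Sea Story", "A. Author", "cover1.png", "...", "2014", "en", True),
    EBookRecord("Horror Tale", "B. Writer", "cover2.png", "...", "2013", "en", False),
]
for line in bookshelf_lines(library):
    print(line)
```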
  • Annotations interface logic 125 can be implemented as a software module comprising instructions stored in memory 250 of computing device 110. Annotations interface logic module 125 provides an annotations and bookmarking interface scheme in conjunction with e-library view logic 120, configuring an annotations interface page(s), which can be deployed upon a subsequent launch of an e-book for reading.
  • the annotations interface page can be presented in lieu of a typical table of contents or a first substantive reading page.
  • Launch of the e-book for reading may be triggered by a user enacting a touch event upon a graphical icon representing a specific e-book from e-library collection 124 as displayed on display screen 116 via e-library view logic 120 .
  • FIG. 3 illustrates embodiments of providing visually-biased sensory-enhanced e-reading.
  • Embodiments include eye tracking while e-reading and providing visual enhancements based on the eye tracking.
  • sight is used to enhance the e-reading experience of a user, and in one embodiment, visual enhancements are provided to the user that are related to particular words or phrases on the page the user is reading.
  • the visual enhancements may be specific to a particular story, genre, or e-reading setting.
  • booklovers will be able to select an immersive reading experience based on visual sensory enhancements. For example, when reaching the climax of a horror novel (end of chapter or end of book), or when triggering a specific word such as “murder” or “blood”, a faint red light, or blotches of red light, could begin pulsating behind the text.
  • a bullet hole 315 may appear as if a bullet had been shot through the e-reader.
  • FIG. 3 shows blood marks 304 that might drip down the side of the screen.
  • the visual enhancement could be a still image, an animated image, a video, or any other visual enhancement. It is appreciated that the visual enhancements may be accessed as a stored file and may be accessed from a remote location.
  • a dramatic visual enhancement such as a lightning bolt 310 may appear in response to what is happening in the story presented on the e-reader.
  • the lightning 310 may illuminate the background of the display, as if the lightning were occurring in the distance.
  • the red blood 304 or claw marks could appear in the background, in the margins of the page, or as a translucent overlay.
  • the just-read word “ocean” could trigger blue illumination in the background or subtle ripples behind the text like waves on the surface of the sea.
  • Embodiments include a multi-layered sensory-driven reading experience for sight that includes an extensive electronic depository of words that trigger corresponding images or other visual enhancements such as the examples above.
  • the feature could also be customizable, allowing users to program certain words to trigger particular images or image types.
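  • A minimal Python sketch of such a customizable trigger-word depository follows; the word-to-enhancement mappings and function names are illustrative assumptions, not part of the disclosure.

```python
# Illustrative sketch only: a depository of trigger words mapped to visual
# enhancements, with a user-programmable layer that overrides the defaults.
DEFAULT_DEPOSITORY = {
    "murder": "faint_red_pulse",
    "blood":  "red_blotches",
    "ocean":  "blue_background_ripples",
}

def build_trigger_map(user_overrides: dict) -> dict:
    """User-programmed words take precedence over the shipped depository."""
    merged = dict(DEFAULT_DEPOSITORY)
    merged.update(user_overrides)
    return merged

# A reader who prefers claw marks for "blood" and adds a new trigger word:
triggers = build_trigger_map({"blood": "claw_marks", "storm": "lightning_flash"})
print(triggers["blood"])   # -> "claw_marks"
print(triggers["storm"])   # -> "lightning_flash"
```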
  • With reference to FIG. 4, illustrated is a method for providing visual enhancement to an e-reading experience, according to an embodiment.
  • method 400 includes tracking eye movement of a user of an e-reader.
  • Co-pending U.S. patent application Ser. No. 14/533,700, filed on Nov. 5, 2014, entitled “OPERATING AN ELECTRONIC PERSONAL DISPLAY USING EYE MOVEMENT TRACKING,” by Liu, having Attorney Docket No. KOBO-3013, and assigned to the assignee of the present application and hereby incorporated by reference in its entirety provides details for tracking eye movement according to embodiments described herein.
  • method 400 includes providing a pre-defined visual indicator embedded with a portion of a story presented on the e-reader.
  • a library containing visual enhancements and corresponding words is accessed when e-book content is loaded and when a user views particular words or phrases, corresponding visual enhancements from the library can be accessed and presented to the user.
  • method 400 includes, in response to the eye movement of the user being correlated with the pre-defined visual indicator, displaying an image which is associated with the portion of the story presented on the e-reader.
  • the predefined visual indicator is a word or phrase on the page that is displayed on the e-reading device.
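  • The three steps of method 400 can be pictured with the following minimal Python sketch, under the assumption that eye-movement tracking resolves to the word currently under the user's gaze; the gaze function and trigger table are placeholder assumptions, not part of the disclosure.

```python
# Illustrative sketch only: track eye movement, correlate the gaze with a
# pre-defined visual indicator (a trigger word), and return the image to show.
from typing import Dict, Optional

TRIGGERS: Dict[str, str] = {
    # pre-defined visual indicators embedded with the story -> image asset
    "blood": "red_pulse.png",
    "ocean": "blue_ripples.png",
}

def word_under_gaze() -> Optional[str]:
    # Placeholder for eye-movement tracking (see the co-pending application
    # referenced above); returns the word the user just read.
    return "ocean"

def visual_enhancement_for_gaze() -> Optional[str]:
    word = word_under_gaze()                   # track eye movement
    if word and word.lower() in TRIGGERS:      # correlate with the indicator
        return TRIGGERS[word.lower()]          # image associated with the story portion
    return None

print(visual_enhancement_for_gaze())   # -> "blue_ripples.png"
```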

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method of synchronizing visual enhancement with e-reading content is provided. The method includes tracking eye movement of a user of an e-reader, providing a pre-defined visual indicator embedded within a portion of a story presented on the e-reader and responsive to the eye movement of the user being correlated with the pre-defined visual indicator, displaying an image which is associated with the portion of the story presented on the e-reader.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is related to co-pending U.S. patent application Ser. No. 14/533,890, filed on Nov. 5, 2014, entitled “PROVIDING A SCENT WHILE A USER INTERACTS WITH AN ELECTRONIC MEDIA PROVIDING DEVICE,” by Liu et al., having Attorney Docket No. KOBO-3012, and assigned to the assignee of the present application and hereby incorporated by reference in its entirety.
  • This application is related to co-pending U.S. patent application Ser. No. 14/533,700, filed on Nov. 5, 2014, entitled “OPERATING AN ELECTRONIC PERSONAL DISPLAY USING EYE MOVEMENT TRACKING,” by Liu, having Attorney Docket No. KOBO-3013, and assigned to the assignee of the present application and hereby incorporated by reference in its entirety.
  • This application is related to co-pending U.S. patent application Ser. No. 14/553,522, filed on Nov. 25, 2014, entitled “AUDIO IN SYNCHRONIZED OPERATION WITH E-READING CONTENT,” by Flawn et al., having Attorney Docket No. KOBO-3030, and assigned to the assignee of the present application and hereby incorporated by reference in its entirety.
  • This application is related to co-pending U.S. patent application Ser. No. ______, filed on ______, entitled “METHOD AND SYSTEM FOR TACTILE-BIASED SENSORY-ENHANCED E-READING,” by Landau et al., having Attorney Docket No. KOBO-3040, and assigned to the assignee of the present application and hereby incorporated by reference in its entirety.
  • This application is related to co-pending U.S. patent application Ser. No. ______, filed on ______, entitled “METHOD AND SYSTEM FOR CUSTOMIZABLE MULTI-LAYERED SENSORY-ENHANCED E-READING INTERFACE,” by Flawn et al., having Attorney Docket No. KOBO-3042, and assigned to the assignee of the present application and hereby incorporated by reference in its entirety.
  • TECHNICAL FIELD
  • Examples described herein relate to a system and method for visually-biased sensory-enhanced e-Reading.
  • BACKGROUND
  • An electronic personal display is a mobile computing device that displays information to a user. While an electronic personal display may be capable of many of the functions of a personal computer, a user can typically interact directly with an electronic personal display without the use of a keyboard that is separate from, or coupled to but distinct from, the electronic personal display itself. Some examples of electronic personal displays include mobile digital devices/tablet computers (e.g., Apple iPad®, Microsoft® Surface™, Samsung Galaxy Tab® and the like), handheld multimedia smartphones (e.g., Apple iPhone®, Samsung Galaxy S®, and the like), and handheld electronic readers (e-readers) (e.g., Amazon Kindle®, Barnes and Noble Nook®, Kobo Aura HD, Kobo Aura H2O, Kobo GLO and the like).
  • Some electronic personal display devices are purpose-built devices designed to perform especially well at displaying digitally stored content for reading or viewing thereon. For example, a purpose-built device may include a display that reduces glare, performs well in high lighting conditions, and/or mimics the look of text as presented via actual discrete pages of paper. While such purpose-built devices may excel at displaying content for a user to read, they may also perform other functions, such as displaying images, emitting audio, recording audio, and web surfing, among others.
  • Electronic personal displays are among numerous kinds of consumer devices that can receive services and utilize resources across a network service. Such devices can operate applications or provide other functionality that links a device to a particular account of a specific service. For example, the electronic reader (e-reader) devices typically link to an online bookstore, and media playback devices often include applications that enable the user to access an online media electronic library (or e-library). In this context, the user accounts can enable the user to receive the full benefit and functionality of the device.
  • Yet further, such devices may incorporate a touch screen display having integrated touch sensors and touch sensing functionality, whereby user input commands via touch-based gestures are received thereon.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings, which are incorporated in and form a part of this specification, illustrate various embodiments and, together with the Description of Embodiments, serve to explain principles discussed below. The drawings referred to in this brief description of the drawings should not be understood as being drawn to scale unless specifically noted.
  • FIG. 1 illustrates a system utilizing applications and providing e-book services on a computing device configured for operation of an e-book reading launch interface, in an embodiment.
  • FIG. 2 illustrates a schematic architecture of a computing device for configuring and launching an e-book reading interface, according to an embodiment.
  • FIG. 3 illustrates example embodiments for visually-biased sensory-enhanced e-reading.
  • FIG. 4 illustrates a method of operation for configuring and launching an e-book reading interface on a computing device having a touchscreen display, according to an embodiment.
  • DETAILED DESCRIPTION
  • Embodiments include eye tracking while e-reading and providing visual enhancements based on the eye tracking. In one embodiment, booklovers will be able to select an immersive reading experience based on visual sensory enhancements. For example, when reaching the climax of a horror novel (end of chapter or end of book), or when triggering a specific word such as “murder” or “blood”, a faint red light, or blotches of red light, could begin pulsating behind the text.
  • In one embodiment, red blood or claw marks could appear in the background, in the margins of the page, or as a translucent overlay. In a book about the sea, the just-read word “ocean” could trigger blue illumination in the background or subtle ripples behind the text like waves on the surface of the sea. Embodiments include a multi-layered sensory-driven reading experience for sight that includes an extensive electronic depository of words that trigger corresponding images or other visual enhancements such as the examples above. The feature could also be customizable, allowing users to program certain words to trigger particular images or image types.
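  • Because the enhancements described above may be still images, animations, or video, and may be stored locally or provided remotely, the following minimal Python sketch shows one way such an asset could be resolved; the class, fields, and fetch path are assumptions for illustration only.

```python
# Illustrative sketch only: resolve a visual enhancement asset from a local
# file if present, otherwise from a remote location.
from dataclasses import dataclass
from typing import Optional

@dataclass
class VisualEnhancement:
    name: str                      # e.g. "blood_drips", "lightning_flash"
    media_type: str                # "still" | "animated" | "video"
    local_path: Optional[str] = None
    remote_url: Optional[str] = None

def load_enhancement(effect: VisualEnhancement) -> bytes:
    """Return the raw asset bytes, preferring the local copy."""
    if effect.local_path:
        with open(effect.local_path, "rb") as f:
            return f.read()
    if effect.remote_url:
        import urllib.request
        with urllib.request.urlopen(effect.remote_url) as resp:
            return resp.read()
    raise ValueError(f"no source for enhancement {effect.name!r}")
```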
  • “E-books” are a form of electronic publication content stored in digital format in a computer non-transitory memory, viewable on a computing device having display functionality. An e-book can correspond to, or mimic, the paginated format of a printed publication for viewing, such as provided by printed literary works (e.g., novels) and periodicals (e.g., magazines, comic books, journals, etc.). Optionally, some e-books may have chapter designations, as well as content that corresponds to graphics or images (e.g., such as in the case of magazines or comic books). Multi-function devices, such as cellular-telephony or messaging devices, can utilize specialized applications (e.g., specialized e-reading application software) to view e-books in a format that mimics the paginated printed publication. Still further, some devices (sometimes labeled as “e-readers”) can display digitally-stored content in a more reading-centric manner, while also providing, via a user input interface, the ability to manipulate that content for viewing, such as via discrete pages arranged sequentially (that is, pagination) corresponding to an intended or natural reading progression, or flow, of the content therein.
  • An “e-reading device”, variously referred to herein as an electronic personal display or mobile computing device, can refer to any computing device that can display or otherwise render an e-book. By way of example, an e-reading device can include a mobile computing device on which an e-reading application can be executed to render content that includes e-books (e.g., comic books, magazines, etc.). Such mobile computing devices can include, for example, a multi-functional computing device for cellular telephony/messaging (e.g., feature phone or smart phone), a tablet computer device, an ultra-mobile computing device, or a wearable computing device with a form factor of a wearable accessory device (e.g., smart watch or bracelet, glass-wear integrated with a computing device, etc.). As another example, an e-reading device can include an e-reader device, such as a purpose-built device that is optimized for an e-reading experience (e.g., with E-ink displays).
  • While conventional physical paper books typically include a fixedly-configured table of contents page(s) intended to assist a user or observer to locate a desired portion or page of the book for reading, a digitally rendered e-book may be configured in other, more fluid arrangements that allow alternative ways for a user to conveniently access a particular content portion or page of the e-book.
  • FIG. 1 illustrates a system 100 for utilizing applications and providing e-book services on a computing device configured for operation of an e-book reading launch interface, according to an embodiment. In an example of FIG. 1, system 100 includes an electronic personal display device, shown by way of example as an e-reading device 110, and a network service 121. The network service 121 can include multiple servers and other computing resources that provide various services in connection with one or more applications that are installed on the e-reading device 110. For example, in one embodiment, the network service 121 may provide visual enhancements that correspond with e-reading content. By way of example, in one implementation, the network service 121 can provide e-book services that communicate with the e-reading device 110. The e-book services provided through network service 121 can, for example, include services in which e-books are sold, shared, downloaded and/or stored. More generally, the network service 121 can provide various other content services, including content rendering services (e.g., streaming media) or other network application environments or services.
  • The e-reading device 110 can correspond to any electronic personal display device on which applications and application resources (e.g., e-books, media files, documents) can be rendered and consumed. For example, the e-reading device 110 can correspond to a tablet or a telephony/messaging device (e.g., smart phone). In one implementation, for example, e-reading device 110 can run an e-reader application that links the device to the network service 121 and enables e-books provided through the service to be viewed and consumed by way of e-reading. In another implementation, the e-reading device 110 can run a media playback or streaming application that receives files or streaming data from the network service 121. By way of example, the e-reading device 110 can be equipped with hardware and software to optimize certain application activities, such as reading electronic content (e.g., e-books). For example, the e-reading device 110 can have a tablet-like form factor, although variations are possible. In some cases, the e-reading device 110 can also have an E-ink display.
  • In additional detail, the network service 121 can include a device interface 128, a content store 122 and a user account electronic library (e-library) 124 storing e-books or digital content items. Content store 122 may be an online store for purchasing of digital content items for download therefrom onto a resident memory of e-reading device 110 and/or user account e-library 124. User account e-library 124 associates the e-reading device 110 with a user having an account 123. The account 123 can also be associated with ownership of, and/or accessibility to, one or more digital content items stored in content store 122. In one embodiment, the digital content items are e-books, and the content store 122 is an online store having e-books for purchase or other licensed use. The device interface 128 can handle requests from the e-reading device 110 with regard to services and functionality of the network service 121. The device interface 128 can utilize information provided with user account 123 in order to enable services, such as purchasing and downloading of e-books into user account e-library 124, and determining what e-books and content items providable via content store 122 are associated with, and accessible to, user account 123. Additionally, the device interface 128 can provide the e-reading device 110 with access to the on-line content store 122. The device interface 128 can handle input to identify content items (e.g., e-books), and further to link content items to the account 123 of the user.
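  • As a rough illustration of how the device interface could link purchased content to an account, here is a minimal Python sketch; the class names, identifiers, and in-memory structures are assumptions and do not reflect the actual network service 121.

```python
# Illustrative sketch only: a purchase request links an e-book in the content
# store to the requesting user's account e-library.
class NetworkService:
    def __init__(self) -> None:
        self.content_store = {"book-42": "Sea Story"}   # content id -> title
        self.account_library: dict = {}                 # account id -> set of content ids

    def purchase(self, account_id: str, book_id: str) -> str:
        if book_id not in self.content_store:
            raise KeyError(f"unknown content item {book_id!r}")
        self.account_library.setdefault(account_id, set()).add(book_id)
        return self.content_store[book_id]

service = NetworkService()
title = service.purchase("account-123", "book-42")
print(title, service.account_library["account-123"])
```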
  • Yet further, the user account e-library 124 can retain metadata for individual accounts 123 to identify e-books or other digital content items that have been purchased or made available for consumption for a given account. Thus, information relating to e-books within user account e-library 124 can include a metadata set in addition to substantive digital text and image content portions. The metadata set can include, for example, information such as the graphic representation of the e-book, such as including artwork- or image-based representation of a counterpart physical paper book cover, as well as summary information, author information, title, short synopsis or book review, publication date and language of the e-book, and book or volume series information.
  • The e-reading device 110 may be associated with the user account 123, and multiple devices may be associated with the same account. As described in greater detail below, e-reading device 110 can locally store content items (e.g., e-books) that are purchased or otherwise made available to the user of the e-reading device 110, as well as archive, in user account e-library 124, e-books and other digital content items that have been purchased for the user account 123, but are not necessarily stored in local resident memory of computing device 110.
  • With reference to an example of FIG. 1, e-reading device 110 can include a touchscreen display 116. In an embodiment, the display screen 116 is touch-sensitive, to process touch inputs including gestures (e.g., swipes). For example, the display screen 116 may be integrated with one or more touch sensors 138 to provide a touch-sensing region on a surface of the display screen 116. For some embodiments, the one or more touch sensors 138 may include capacitive sensors that can sense or detect a human body's capacitance as input. In the example of FIG. 1, the touch-sensing region coincides with a substantial surface area, if not all, of the display screen 116.
  • In some embodiments, the e-reading device 110 includes features for providing functionality related to displaying paginated content, including paginated content comprising an e-magazine or e-comic book. The e-reading device 110 can include page transitioning logic, which enables the user to transition through paginated content. The e-reading device 110 can display pages of e-books, e-magazines and e-comics, and enable the user to transition from one page state to another. In particular, an e-book can provide content that is rendered sequentially in pages, and the e-book can display page states in the form of single pages, multiple pages or portions thereof. Accordingly, a given page state can coincide with, for example, a single page, or two or more pages displayed at once. Page transitioning logic can operate to enable the user to transition from a given page state to another page state. In the specific example embodiment where a given page state coincides with a single page, each page state corresponds to one page of the digitally constructed, ordered sequence of pages paginated to comprise, in one embodiment, an e-book. In some implementations, the page transitioning logic enables single page transitions, chapter transitions, or cluster transitions (multiple pages at one time).
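  • A minimal Python sketch of page transitioning logic in this spirit follows; the transition modes, cluster size, and chapter positions are illustrative assumptions.

```python
# Illustrative sketch only: advance a page state by a single page, by a
# cluster of pages, or to the next chapter boundary.
def next_page_state(current: int, mode: str, total_pages: int,
                    chapter_starts: list, cluster: int = 5) -> int:
    if mode == "single":
        return min(current + 1, total_pages - 1)
    if mode == "cluster":
        return min(current + cluster, total_pages - 1)
    if mode == "chapter":
        for start in chapter_starts:
            if start > current:
                return start
        return total_pages - 1
    raise ValueError(f"unknown transition mode {mode!r}")

print(next_page_state(3, "single", 200, [0, 25, 60]))    # -> 4
print(next_page_state(3, "chapter", 200, [0, 25, 60]))   # -> 25
```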
  • According to some embodiments, the e-reading device 110 includes display sensor logic 135 to detect and interpret user input or user input commands made through interaction with the touch sensors 138. By way of example, display sensor logic 135 can detect a user making contact with the touch-sensing region of the display screen 116, otherwise known as a touch event. More specifically, display sensor logic 135 can detect touch events such as a tap; an initial tap held in contact with display screen 116 for longer than some pre-defined threshold duration of time (otherwise known as a “long press” or a “long touch”); multiple taps performed either sequentially or generally simultaneously; swiping gesture actions made through user interaction with the touch-sensing region of the display screen 116; or any combination of these gesture actions. Although referred to herein as a “touch” or a tap, it should be appreciated that in some design implementations, sufficient proximity to the screen surface, even without actual physical contact, may register a “contact” or a “touch event”. Furthermore, display sensor logic 135 can interpret such interactions in a variety of ways. For example, each such interaction may be interpreted as a particular type of user input associated with a respective input command, execution of which may trigger a change in state of display 116.
  • In one implementation, display sensor logic 135 implements operations to monitor for the user contacting or superimposing upon, using a finger, thumb or stylus, a surface of display 116 coinciding with a placement of one or more touch sensor components 138, that is, a touch event, and also detects and correlates a particular gesture action (e.g., pinching, swiping, tapping, etc.) as a particular type of input command. Display sensor logic 135 is also responsive to the user's eye contact with various words or text, which may initiate presentation of visual enhancements that correspond with e-reading content. Display sensor logic 135 may also sense directionality of a user gesture action so as to distinguish between, for example, leftward, rightward, upward, downward and diagonal swipes along a surface portion of display screen 116 for the purpose of associating respective user input commands therewith.
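  • A minimal Python sketch of classifying a touch event from its duration and displacement, including swipe directionality, is shown below; the thresholds and labels are assumptions, not values taken from the patent.

```python
# Illustrative sketch only: classify a touch event as tap, long press, or a
# directional swipe from its duration and displacement.
LONG_PRESS_SECONDS = 0.8
SWIPE_MIN_PIXELS = 30

def classify_touch(duration_s: float, dx: float, dy: float) -> str:
    if abs(dx) < SWIPE_MIN_PIXELS and abs(dy) < SWIPE_MIN_PIXELS:
        return "long_press" if duration_s >= LONG_PRESS_SECONDS else "tap"
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"

print(classify_touch(0.1, 0, 0))      # -> "tap"
print(classify_touch(1.2, 5, -3))     # -> "long_press"
print(classify_touch(0.2, -120, 10))  # -> "swipe_left"
```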
  • E-library view (or interface) logic 120 provides an interface, displayable via display screen 116 of computing device 110, showing titles in a user's e-library collection of e-books, or from a user's home page in relation to an online content store 122 hosting e-books for commercial sale and downloading therefrom. The e-library collection of e-books may be hosted via a remotely located computer server device associated with user account e-library 124, or may be locally resident within a memory at computing device 110. The e-library view logic 120 can display iconic or other graphic representations of individual e-books in the user's e-library collection. For example, the e-library view logic 120 can use the metadata associated with the records of the e-books in the user's e-library account 124 to display lists, folders, or other virtual structures that include graphic representations and/or other identifiers of e-books in the user's collection. The metadata set can include, for example, information such as the graphic representation of the e-book, such as including artwork- or image-based representation of a counterpart physical paper book cover, as well as summary information, author information, title, short synopsis or book review, publication date and language of the e-book, and book or volume series information. The user's collection can include e-books that the user has on the particular device 110 (e.g., locally stored e-books), as well as e-books that are not locally stored, but rather are stored or archived at a remote computer server and associated with the user account e-library 124.
  • Annotations interface logic module 125 provides an annotations and bookmarking scheme in conjunction with the interface rendered via e-library view logic 120, providing an annotations interface page(s) to be deployed upon launch in lieu of a table of contents or a first page of an e-book for reading. Launch of the e-book for reading, in one embodiment, is triggered by a user enacting a touch event upon a graphical icon representing a specific e-book from an e-library collection, as will be described further in regard to FIGS. 2 and 3.
  • E-library view logic module 120 and annotations interface logic module 125 can be implemented as software modules comprising instructions stored in a memory of mobile computing device 110, as described in further detail below with regard to FIG. 2.
  • In one or more embodiments of e-library view logic module 120, display sensor logic 135 and annotations interface logic module 125 described herein may be implemented using programmatic modules or components. A programmatic module or component may include a program, a subroutine, a portion of a program, or a software or a hardware component capable of performing one or more stated tasks or functions in conjunction with one or more processors. As used herein, a module or component can exist on a hardware component independently of other modules or components. Alternatively, a module or component can be a shared element or process of other modules, programs and hardware components.
  • Furthermore, the one or more embodiments of e-library view logic module 120, display sensor logic 135 and annotations interface logic module 125 described herein may be implemented through instructions that are executable by one or more processors. These instructions may be stored on a computer-readable non-transitory medium. In particular, the numerous computing and communication devices shown with embodiments of the invention include processor(s) and various forms of computer memory, including volatile and non-volatile forms, storing data and instructions. Examples of computer-readable mediums include permanent memory storage devices, such as hard drives on personal computers or servers. Other examples of computer storage mediums include portable storage units, flash or solid-state memory (such as included on many cell phones and consumer electronic devices) and magnetic memory. Computers, terminals, and network-enabled devices (e.g., mobile devices such as cell phones and wearable computers) are all examples of machines and devices that utilize processors, memory, and instructions stored on computer-readable mediums. Additionally, embodiments may be implemented in the form of computer programs, or a computer-usable storage medium capable of storing such a program.
  • FIG. 2 illustrates a schematic architecture of a computing device for configuring and launching an e-book reading interface, according to an embodiment.
  • E-reading device 110 further includes a processor 210 and a memory 250 storing instructions and logic pertaining at least to display sensor logic 135, e-library view logic module 120 and annotations interface logic 125.
  • Processor 210 can implement functionality using the logic and instructions stored in memory 250. Additionally, in some implementations, processor 210 communicates with the network service 121 (see FIG. 1). More specifically, the e-reading device 110 can access the network service 121 to receive various kinds of resources (e.g., digital content items such as e-books, configuration files, account information), as well as to provide information (e.g., user account information, service requests etc.). For example, e-reading device 110 can receive application resources, such as e-books or media files, that the user elects to purchase or otherwise download via the network service 121. The application resources that are downloaded onto the e-reading device 110 can be stored in memory 250.
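For illustration only (not part of the original disclosure), the sketch below shows one way the download-and-store flow described above could be modeled. The class names `NetworkService` and `DeviceMemory` are hypothetical stand-ins for network service 121 and memory 250; the disclosure does not specify a transport protocol or storage layout.

```python
from typing import Dict

class NetworkService:
    """Stand-in for network service 121; hands back placeholder bytes for a
    requested resource (e.g., an e-book, media file, or configuration file)."""
    def fetch_resource(self, resource_id: str) -> bytes:
        return f"contents of {resource_id}".encode("utf-8")

class DeviceMemory:
    """Stand-in for memory 250 on the e-reading device."""
    def __init__(self) -> None:
        self._store: Dict[str, bytes] = {}

    def save(self, resource_id: str, payload: bytes) -> None:
        self._store[resource_id] = payload

    def has(self, resource_id: str) -> bool:
        return resource_id in self._store

def download_if_missing(service: NetworkService, memory: DeviceMemory, ebook_id: str) -> None:
    """Fetch a resource the user elected to download and keep it in device
    memory, mirroring the flow described in the text."""
    if not memory.has(ebook_id):
        memory.save(ebook_id, service.fetch_resource(ebook_id))
```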
  • In some implementations, display 116 can correspond to, for example, a liquid crystal display (LCD) or light emitting diode (LED) display that illuminates in order to provide content generated from processor 210. In some implementations, display 116 can be touch-sensitive. For example, in some embodiments, one or more of the touch sensor components 138 may be integrated with display 116. In other embodiments, the touch sensor components 138 may be provided (e.g., as a layer) above or below display 116 such that individual touch sensor components 138 track different regions of display 116. Further, in some variations, display 116 can correspond to an electronic paper type display, which mimics conventional paper in the manner in which content is displayed. Examples of such display technologies include electrophoretic displays, electro-wetting displays, and electro-fluidic displays.
  • Processor 210 can receive input from various sources, including touch sensor components 138, display 116, keystroke input 208 such as from a virtual or rendered keyboard, and other input mechanisms 299 (e.g., buttons, mouse, microphone, etc.). With reference to examples described herein, processor 210 can respond to input detected at the touch sensor components 138. In some embodiments, processor 210 responds to inputs from the touch sensor components 138 in order to facilitate or enhance e-book activities such as generating e-book content on display 116, performing page transitions of the displayed e-book content, powering off the device 110 and/or display 116, activating a screen saver, launching or closing an application, and/or otherwise altering a state of display 116.
  • In some embodiments, memory 250 may store display sensor logic 135 that monitors for user interactions detected through the touch sensor components 138, and further processes the user interactions as a particular input or type of input. In an alternative embodiment, display sensor logic module 135 may be integrated with the touch sensor components 138. For example, the touch sensor components 138 can be provided as a modular component that includes integrated circuits or other hardware logic, and such resources can provide some or all of display sensor logic 135. In variations, some or all of display sensor logic 135 may be implemented with processor 210 (which utilizes instructions stored in memory 250), or with an alternative processing resource.
  • E-reading device 110 further includes wireless connectivity subsystem 213, comprising a wireless communication receiver, a transmitter, and associated components, such as one or more embedded or internal antenna elements, local oscillators, and a processing module such as a digital signal processor (DSP) (not shown). As will be apparent to those skilled in the field of communications, the particular design of wireless connectivity subsystem 213 depends on the communication network in which computing device 110 is intended to operate, such as in accordance with Wi-Fi, Bluetooth, Near Field Communication (NFC) communication protocols, and the like.
  • E-library view logic module 120 can be implemented as a software module, comprising instructions stored in memory 250, on mobile computing device 110. In one implementation, the local memory 250 can include records for each e-book in the user's e-library account 124, each record including metadata of the e-book. The user may have the content portion of select e-books archived remotely at a computer server cloud system, so as not to reside in the local memory 250, but instead be provided by the network service 121 upon request or as needed. By way of example, the e-library view logic module 120 can display the e-books of a user's collection in the form of a virtual bookshelf or bookcase feature showing graphical icons representing the e-books. In such an implementation, the e-books are displayed as icons that include imagery, title information, etc. In a variation, the e-library view module 120 can display representations of e-books in the user's collection as icons, or as icons with associated text. Still further, folders can be used to provide a panel view of the graphic representations (e.g., icons and/or text) of the e-books in the user's e-library collection 124, corresponding to a side view of a bookshelf showing book spines with titles printed thereon for identifying individual books.
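As a brief editorial sketch (not part of the original disclosure), the two presentation modes described above, a cover-icon bookshelf view and a spine-style panel view, could be reduced to something like the following; the function and field names are assumptions.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ShelfItem:
    title: str
    cover_uri: str

def bookshelf_labels(items: List[ShelfItem], spine_view: bool = False) -> List[str]:
    """Return display labels for the virtual bookshelf.

    spine_view=False: icon view pairing cover imagery with title text.
    spine_view=True: panel view showing only the "spines" (titles), as on the
    side of a physical bookshelf.
    """
    if spine_view:
        return [item.title for item in items]
    return [f"[{item.cover_uri}] {item.title}" for item in items]
```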
  • Annotations interface logic 125 can be implemented as a software module comprising instructions stored in memory 250 of computing device 110. Annotations interface logic module 125 provides an annotations and bookmarking interface scheme in conjunction with e-library view logic 120, configuring an annotations interface page(s), which can be deployed upon a subsequent launch of an e-book for reading. In one embodiment, upon e-book launch for reading, the annotations interface page can be presented in lieu of a typical table of contents or a first substantive reading page. Launch of the e-book for reading may be triggered by a user enacting a touch event upon a graphical icon representing a specific e-book from e-library collection 124 as displayed on display screen 116 via e-library view logic 120.
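For illustration only (not part of the original disclosure), one possible reading of the launch behavior described above is sketched below: present the annotations interface page when the title has annotations or bookmarks, and otherwise fall back to the table of contents or the first reading page. The condition and the page identifiers are assumptions.

```python
def opening_view(has_annotations: bool, has_table_of_contents: bool) -> str:
    """Pick the first screen shown when a touch event on the e-book's icon
    launches it for reading (illustrative logic only)."""
    if has_annotations:
        return "annotations_interface_page"
    if has_table_of_contents:
        return "table_of_contents"
    return "first_reading_page"

# Example: a previously annotated title opens onto the annotations page.
assert opening_view(has_annotations=True, has_table_of_contents=True) == "annotations_interface_page"
```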
  • FIG. 3 illustrates embodiments of providing visually-biased sensory-enhanced e-reading. Embodiments include eye tracking while e-reading and providing visual enhancements based on the eye tracking. In one embodiment, sight is used to enhance the e-reading experience of a user, and in one embodiment, visual enhancements are provided to the user that are related to particular words or phrases on the page the user is reading. The visual enhancements may be specific to a particular story, genre, or e-reading setting.
  • In one embodiment, book lovers will be able to select an immersive reading experience based on visual sensory enhancements. For example, when reaching the climax of a horror novel (end of chapter or end of book), or when a specific word such as “murder” or “blood” is triggered, a faint red light, or blotches of red light, could begin pulsating behind the text. In another embodiment, when a user is reading a thriller or mystery book, a bullet hole 315 may appear as if a bullet had been shot through the e-reader.
  • FIG. 3 shows blood marks 304 that might drip down the side of the screen. It is appreciated that the visual enhancement could be a still image, an animated image, a video, or any other visual enhancement. It is also appreciated that the visual enhancements may be accessed as a stored file, which may be accessed from a remote location.
  • In one embodiment, a dramatic visual enhancement, such as a lightning bolt 310 may appear in response to what is happening in the story presented on the e-reader. In one embodiment, the lightning 310 may illuminate the background of the display, as if the lightning were occurring in the distance.
  • In one embodiment, the red blood 304 or claw marks could appear in the background, in the margins of the page, or as a translucency. In a book about the sea, the just-read word “ocean” could trigger blue illumination in the background or subtle ripples behind the text, like waves on the surface of the sea. Embodiments include a multi-layered sensory-driven reading experience for sight that includes an extensive electronic depository of words that trigger corresponding images or other visual enhancements, such as the examples above. The feature could also be customizable, allowing users to program certain words to trigger particular images or image types.
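For illustration only (not part of the original disclosure), a minimal sketch of such a trigger-word depository with user-programmable overrides is shown below; the entries and enhancement file names are purely illustrative.

```python
from typing import Dict, Optional

# Illustrative default depository of trigger words -> visual enhancements,
# echoing the examples in the text.
DEFAULT_TRIGGERS: Dict[str, str] = {
    "murder": "red_pulse_background.anim",
    "blood": "blood_drip_margin.anim",
    "ocean": "blue_ripple_background.anim",
}

class TriggerDepository:
    """Word-to-enhancement lookup with user-programmable customization."""

    def __init__(self, defaults: Dict[str, str] = DEFAULT_TRIGGERS) -> None:
        self._triggers = dict(defaults)  # copy so customization stays per-user

    def customize(self, word: str, enhancement_file: str) -> None:
        # Users may program certain words to trigger particular images or image types.
        self._triggers[word.lower()] = enhancement_file

    def lookup(self, just_read_word: str) -> Optional[str]:
        return self._triggers.get(just_read_word.lower())

# Example: a user maps "lightning" to a background illumination effect.
depository = TriggerDepository()
depository.customize("lightning", "lightning_background.anim")
```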
  • Next, with reference to FIG. 4, illustrated is a method for providing visual enhancement to an e-reading experience, according to an embodiment. In describing the example of FIG. 4, reference will be made to components such as those described with regard to FIGS. 1 through 3 for purposes of illustrating components for performing a step or sub-step as described.
  • At step 402, method 400 includes tracking eye movement of a user of an e-reader. Co-pending U.S. patent application Ser. No. 14/533,700, filed on Nov. 5, 2014, entitled “OPERATING AN ELECTRONIC PERSONAL DISPLAY USING EYE MOVEMENT TRACKING,” by Liu, having Attorney Docket No. KOBO-3013, and assigned to the assignee of the present application and hereby incorporated by reference in its entirety provides details for tracking eye movement according to embodiments described herein.
  • At step 404, method 400 includes providing a pre-defined visual indicator embedded within a portion of a story presented on the e-reader. In one embodiment, a library containing visual enhancements and corresponding words is accessed when e-book content is loaded; when a user views particular words or phrases, corresponding visual enhancements from the library can be accessed and presented to the user.
  • At step 406, method 400 includes, in response to the eye movement of the user being correlated with the pre-defined visual indicator, displaying an image which is associated with the portion of the story presented on the e-reader. In one embodiment, the pre-defined visual indicator is a word or phrase on the page that is displayed on the e-reading device.
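For illustration only (not part of the original disclosure), the sketch below strings steps 402 through 406 together: the trigger words on the displayed page act as pre-defined visual indicators, and when the reported gaze reaches one of them, the associated image is presented. All function names, the word-level regular expression, and the rendering hook are assumptions; the eye-tracking source itself is treated as an opaque stream of gazed words.

```python
import re
from typing import Dict, Iterable

def page_indicators(page_text: str, triggers: Dict[str, str]) -> Dict[str, str]:
    """Step 404: collect the pre-defined visual indicators (trigger words)
    embedded in the portion of the story currently displayed."""
    found: Dict[str, str] = {}
    for word in re.findall(r"[A-Za-z']+", page_text.lower()):
        if word in triggers:
            found[word] = triggers[word]
    return found

def enhance_while_reading(gazed_words: Iterable[str], indicators: Dict[str, str]) -> None:
    """Steps 402 and 406: as eye tracking reports each word the gaze reaches,
    display the associated image when the gaze correlates with an indicator."""
    for word in gazed_words:
        enhancement = indicators.get(word.lower())
        if enhancement is not None:
            present_visual_enhancement(enhancement)

def present_visual_enhancement(enhancement_file: str) -> None:
    # Hypothetical rendering hook; a real device would draw the still image,
    # animation, or video behind or alongside the text on display 116.
    print(f"presenting {enhancement_file}")

if __name__ == "__main__":
    triggers = {"blood": "blood_drip_margin.anim", "ocean": "blue_ripple_background.anim"}
    page = "The detective stared at the blood on the deck as the ocean churned."
    indicators = page_indicators(page, triggers)
    enhance_while_reading(["the", "blood", "ocean"], indicators)
```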
  • Although illustrative embodiments have been described in detail herein with reference to the accompanying drawings, variations to specific embodiments and details are contemplated and encompassed by this disclosure. It is intended that the scope of embodiments described herein be defined by claims and their equivalents. Furthermore, it is contemplated that a particular feature described, either individually or as part of an embodiment, can be combined with other individually described features, or parts of other embodiments. Thus, absence of describing combinations should not preclude the inventor(s) from claiming rights to such combinations.

Claims (20)

What is claimed is:
1. A method of synchronizing visual enhancement with e-reading content, the method comprising:
tracking eye movement of a user of an e-reader;
providing a pre-defined visual indicator embedded within a portion of a story presented on the e-reader; and
responsive to the eye movement of the user being correlated with the pre-defined visual indicator, displaying an image which is associated with the portion of the story presented on the e-reader.
2. The method as recited by claim 1, further comprising:
providing a pre-defined background visual indicator embedded within a portion of a story presented on the e-reader; and
responsive to the eye movement of the user being correlated with the pre-defined background visual indicator, broadcasting a background image file which is associated with the portion of the story presented on the e-reader.
3. The method as recited by claim 1, further comprising:
providing a pre-defined momentary visual indicator embedded within a portion of the story presented on the e-reader; and
responsive to the eye movement of the user being correlated with the pre-defined momentary visual indicator, broadcasting a momentary audio file associated with the portion of the story presented on the e-reader.
4. The method as recited by claim 1, further comprising:
tracking the eye movement of the user at a line-by-line granularity.
5. The method as recited by claim 1, further comprising:
tracking the eye movement of the user at a word-by-word granularity.
6. The method as recited by claim 1, further comprising:
providing a pre-defined new setting indicator embedded within a portion of a story presented on the e-reader; and
responsive to the eye movement of the user being correlated with the pre-defined new setting indicator, ceasing broadcast of the image.
7. The method as recited by claim 1, further comprising:
providing a pre-defined fade out indicator embedded within a portion of a story presented on the e-reader; and
responsive to the eye movement of the user being correlated with the pre-defined fade out indicator, fading out the display of the image.
8. A system that synchronizes visual enhancement with e-reader content on an e-reader, the system comprising:
a camera that tracks an eye movement of a user of the e-reader;
a gaze to pre-defined visual enhancement indicator region correlation logic that correlates a gaze of the user with a visual enhancement file embedded within a portion of a story presented on the e-reader; and
an operation implementation responsive to gaze logic that implements presentation of the visual enhancement file in response to the gaze being correlated with the pre-defined visual enhancement indicator region.
9. The system of claim 8, wherein the visual enhancement file embedded within the portion of the story is an animation.
10. The system of claim 8, wherein the visual enhancement file embedded within the portion of the story is a picture file.
11. The system of claim 8, wherein the camera tracks the eye movement of the user of the e-reader at a line-by-line granularity.
12. The system of claim 8, wherein the camera tracks the eye movement of the user of the e-reader at a word-by-word granularity.
13. The system of claim 8, wherein the gaze to pre-defined visual enhancement indicator region correlation logic correlates the gaze of the user with a new setting indicator region embedded within another portion of the story presented on the e-reader; and
the operation implementation responsive to gaze logic ceases the presentation of the visual enhancement file in response to the gaze being correlated with the new setting indicator region.
14. The system of claim 8, wherein the gaze to pre-defined visual enhancement indicator region correlation logic correlates the gaze of the user with a pre-defined fade out indicator region embedded within another portion of the story presented on the e-reader; and
the operation implementation responsive to gaze logic fades out the presentation of the visual enhancement file in response to the gaze being correlated with the pre-defined fade out indicator region.
15. The system of claim 8, wherein the visual enhancement file embedded within a portion of a story presented on the e-reader is a stand-alone add on file for a pre-existing e-book file.
16. A non-transitory computer-readable storage medium storing instructions that, when executed by a hardware processor of a computing device, cause the hardware processor to perform a method of synchronizing visual enhancement with e-reading content, the method comprising:
tracking eye movement of a user of an e-reader with a camera of the e-reader;
providing a pre-defined visual enhancement indicator embedded within a portion of a story presented on the e-reader; and
responsive to the eye movement of the user being correlated with the pre-defined visual enhancement indicator, presenting a visual enhancement file associated with the portion of the story presented on the e-reader.
17. The non-transitory computer-readable storage medium as recited by claim 16, further comprising:
providing a pre-defined new setting indicator embedded within a portion of a story presented on the e-reader; and
responsive to the eye movement of the user being correlated with the pre-defined new setting indicator, ceasing presentation of the visual enhancement file.
18. The non-transitory computer-readable storage medium as recited by claim 16, further comprising:
providing a pre-defined fade out indicator embedded within a portion of a story presented on the e-reader; and
responsive to the eye movement of the user being correlated with the pre-defined fade out indicator, fading out the presentation of the visual enhancement file.
19. The non-transitory computer-readable storage medium as recited by claim 16, further comprising:
tracking the eye movement of the user at a line-by-line granularity.
20. The non-transitory computer-readable storage medium as recited by claim 16, further comprising:
tracking the eye movement of the user at a word-by-word granularity.
US14/570,609 2014-11-05 2014-12-15 Method and system for visually-biased sensory-enhanced e-reading Abandoned US20160171277A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/570,609 US20160171277A1 (en) 2014-11-05 2014-12-15 Method and system for visually-biased sensory-enhanced e-reading

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US14/533,700 US20160124505A1 (en) 2014-11-05 2014-11-05 Operating an electronic personal display using eye movement tracking
US14/533,890 US20160121348A1 (en) 2014-11-05 2014-11-05 Providing a scent while a user interacts with an electronic media providing device
US201414553522A 2014-11-25 2014-11-25
US14/570,609 US20160171277A1 (en) 2014-11-05 2014-12-15 Method and system for visually-biased sensory-enhanced e-reading

Publications (1)

Publication Number Publication Date
US20160171277A1 true US20160171277A1 (en) 2016-06-16

Family

ID=56111131

Family Applications (3)

Application Number Title Priority Date Filing Date
US14/570,609 Abandoned US20160171277A1 (en) 2014-11-05 2014-12-15 Method and system for visually-biased sensory-enhanced e-reading
US14/570,772 Abandoned US20160170483A1 (en) 2014-11-05 2014-12-15 Method and system for tactile-biased sensory-enhanced e-reading
US14/570,832 Active US9939892B2 (en) 2014-11-05 2014-12-15 Method and system for customizable multi-layered sensory-enhanced E-reading interface

Family Applications After (2)

Application Number Title Priority Date Filing Date
US14/570,772 Abandoned US20160170483A1 (en) 2014-11-05 2014-12-15 Method and system for tactile-biased sensory-enhanced e-reading
US14/570,832 Active US9939892B2 (en) 2014-11-05 2014-12-15 Method and system for customizable multi-layered sensory-enhanced E-reading interface

Country Status (1)

Country Link
US (3) US20160171277A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160357253A1 (en) * 2015-06-05 2016-12-08 International Business Machines Corporation Initiating actions responsive to user expressions of a user while reading media content
CN110989882A (en) * 2019-11-28 2020-04-10 维沃移动通信有限公司 Control method, electronic device and computer readable storage medium
CN111143007A (en) * 2019-12-26 2020-05-12 珠海格力电器股份有限公司 Page control method, device, equipment and medium
US11775060B2 (en) 2021-02-16 2023-10-03 Athena Accessible Technology, Inc. Systems and methods for hands-free scrolling

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100078294A (en) * 2008-12-30 2010-07-08 삼성전자주식회사 Method for generating vibration and mobile terminal using the same
US8370151B2 (en) * 2009-01-15 2013-02-05 K-Nfb Reading Technology, Inc. Systems and methods for multiple voice document narration
US20110126119A1 (en) * 2009-11-20 2011-05-26 Young Daniel J Contextual presentation of information
JP5785015B2 (en) * 2011-07-25 2015-09-24 京セラ株式会社 Electronic device, electronic document control program, and electronic document control method
US20130209981A1 (en) * 2012-02-15 2013-08-15 Google Inc. Triggered Sounds in eBooks
US9141257B1 (en) * 2012-06-18 2015-09-22 Audible, Inc. Selecting and conveying supplemental content
US9047784B2 (en) 2012-08-02 2015-06-02 International Business Machines Corporation Automatic eBook reader augmentation

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160357253A1 (en) * 2015-06-05 2016-12-08 International Business Machines Corporation Initiating actions responsive to user expressions of a user while reading media content
US10317994B2 (en) * 2015-06-05 2019-06-11 International Business Machines Corporation Initiating actions responsive to user expressions of a user while reading media content
US10656709B2 (en) 2015-06-05 2020-05-19 International Business Machines Corporation Initiating actions responsive to user expressions of a user while reading media content
US10656708B2 (en) 2015-06-05 2020-05-19 International Business Machines Corporation Initiating actions responsive to user expressions of a user while reading media content
CN110989882A (en) * 2019-11-28 2020-04-10 维沃移动通信有限公司 Control method, electronic device and computer readable storage medium
CN111143007A (en) * 2019-12-26 2020-05-12 珠海格力电器股份有限公司 Page control method, device, equipment and medium
US11775060B2 (en) 2021-02-16 2023-10-03 Athena Accessible Technology, Inc. Systems and methods for hands-free scrolling

Also Published As

Publication number Publication date
US20160170483A1 (en) 2016-06-16
US9939892B2 (en) 2018-04-10
US20160170484A1 (en) 2016-06-16

Similar Documents

Publication Publication Date Title
US9733803B2 (en) Point of interest collaborative e-reading
US20160261590A1 (en) Method and system of shelving digital content items for multi-user shared e-book accessing
US20160171277A1 (en) Method and system for visually-biased sensory-enhanced e-reading
US20160275192A1 (en) Personalizing an e-book search query
US20160140085A1 (en) System and method for previewing e-reading content
US20160224302A1 (en) Method and system for device display screen transition related to device power monitoring
US20160217108A1 (en) Bifurcated presentation of e-content on an e-reading device
US20160140249A1 (en) System and method for e-book reading progress indicator and invocation thereof
US20150169503A1 (en) E-reader device and system for altering an e-book using captured content items
US20160210267A1 (en) Deploying mobile device display screen in relation to e-book signature
US20160188539A1 (en) Method and system for apportioned content excerpting interface and operation thereof
US20160170591A1 (en) Method and system for e-book annotations navigation and interface therefor
US20160140086A1 (en) System and method for content repagination providing a page continuity indicium while e-reading
US20160149864A1 (en) Method and system for e-reading collective progress indicator interface
US20160231921A1 (en) Method and system for reading progress indicator with page resume demarcation
US20160034575A1 (en) Vocabulary-effected e-content discovery
US9916064B2 (en) System and method for toggle interface
US9875016B2 (en) Method and system for persistent ancillary display screen rendering
US20160173565A1 (en) Method and system for time-release e-book gifting and interface therefor
US9898450B2 (en) System and method for repagination of display content
US20160239161A1 (en) Method and system for term-occurrence-based navigation of apportioned e-book content
US20160154551A1 (en) System and method for comparative time-to-completion display view for queued e-reading content items
US20160210098A1 (en) Short range sharing of e-reader content
US20160240182A1 (en) Automatic white noise generation for device e-reading mode
US20160147395A1 (en) Method and system for series-based digital reading content queue and interface

Legal Events

Date Code Title Description
AS Assignment

Owner name: KOBO INCORPORATED, CANADA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FLAWN, SARAH;LANDAU, BENJAMIN;REEL/FRAME:034509/0290

Effective date: 20141215

AS Assignment

Owner name: RAKUTEN KOBO INC., CANADA

Free format text: CHANGE OF NAME;ASSIGNOR:KOBO INC.;REEL/FRAME:037753/0780

Effective date: 20140610

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION