WO2022075990A1 - Augmented reality documents - Google Patents

Augmented reality documents

Info

Publication number
WO2022075990A1
Authority
WO
WIPO (PCT)
Prior art keywords
document
media
physical media
augmented reality
digital document
Prior art date
Application number
PCT/US2020/054727
Other languages
French (fr)
Inventor
Wayne J. Schmidt
Original Assignee
Hewlett-Packard Development Company, L.P.
Priority date
Filing date
Publication date
Application filed by Hewlett-Packard Development Company, L.P.
Priority to PCT/US2020/054727
Publication of WO2022075990A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/60 Editing figures and text; Combining figures or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/147 Digital output to display device; Cooperation and interconnection of the display device with other functional units using display panels
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00 Aspects of interface with display user

Definitions

  • Computing devices can be utilized to perform particular functions.
  • Computing devices can be utilized to generate digital documents utilizing a plurality of applications.
  • the computing devices can be utilized to display the digital documents through a physical display, such as a monitor.
  • the computing device can be communicatively coupled to a printing device to generate printed images on a print medium (e.g., paper, plastic, etc.).
  • Figure 1 illustrates an example of a device for generating augmented reality documents, in accordance with the present disclosure.
  • Figure 2 illustrates an example of a memory resource for generating augmented reality documents, in accordance with the present disclosure.
  • Figure 3 illustrates an example of a system for generating augmented reality documents, in accordance with the present disclosure.
  • Figure 4 illustrates an example of a system for generating augmented reality documents, in accordance with the present disclosure.
  • a user may utilize a computing device for various purposes, such as for business and/or recreational use.
  • the term “computing device” refers to an electronic system having a processing resource and a memory resource. Examples of computing devices can include, for instance, a laptop computer, a notebook computer, a desktop computer, networking device, and/or a mobile device, among other types of computing devices.
  • a mobile device refers to devices that are (or can be) carried and/or worn by a user.
  • a mobile device can be a phone (e.g., a smart phone), a tablet, a personal digital assistant (PDA), smart glasses, and/or a wrist-worn device (e.g., a smart watch), among other types of mobile devices.
  • the portable computing device can be a wearable device that can include an augmented reality display.
  • an augmented reality display can include a display that can superimpose a computer-generated image on a view of the real world.
  • the augmented reality device can display a portion of real world objects with a superimposed object that is generated by a computing device. In some examples, the augmented reality display can display a portion that includes real world objects and another portion that includes computer-generated images that are combined as a composite view. In this way, computer-generated images can appear to be part of the real world or implanted within the real world when viewing the augmented reality display.
  • a physical document can provide a different user experience that may be preferred.
  • the physical document can be a paper book, piece of paper, or other physical media that can be utilized to generate print media.
  • print media can include physical media that can receive a print substance, such as ink or toner, to generate an image on the physical media.
  • print media may not be accessible or provide security from individuals that are in the same physical area as a user.
  • the present disclosure relates to augmented reality documents that can be generated to appear as though they are printed on a physical medium.
  • the physical medium can be utilized to alter the appearance of the documents displayed on the augmented reality display.
  • a digital document can be displayed through the augmented reality display to appear within the boundaries of the physical medium.
  • the physical medium can be flipped to a second side and the digital document can be switched to a different page. In this way, the physical medium can be manipulated as if the digital document was printed on the physical medium.
  • a stylus can be utilized with the physical medium to generate new content within the digital document.
  • Figure 1 illustrates an example of a device 100 for generating augmented reality documents, in accordance with the present disclosure.
  • the device can include a wearable device 102 communicatively coupled to a computing device 110.
  • the wearable device 102 can include the computing device 110 as an embedded portion within the wearable device 102.
  • the computing device 110 can be a remote computing device, such as a cloud resource, that can be connected to the wearable device 102 through a communication path 108 or network connection.
  • a communication path 108 can include wired or wireless pathways that can be utilized to transfer information and/or data.
  • the computing device 110 can include a processor resource 112 communicatively coupled to a memory resource 114.
  • the memory resource 114 can include instructions 116, 118 that can be executed by the processor resource 112 to perform particular functions.
  • the processor resource 112 may be a central processing unit (CPU), a semiconductor-based microprocessor, and/or other hardware devices suitable for retrieval and execution of non-transitory machine-readable instructions.
  • the computing device 110 can be associated with a plurality of components. For example, the computing device 110 can be utilized to determine dimensions of a piece of media 106 and display a digital document through an augmented reality display 104. In some examples, the computing device 110 can be local or remote to the plurality of components.
  • the computing device 110 can include instructions 116, 118 stored on a machine-readable medium (e.g., memory resource 114, non-transitory computer-readable medium, etc.) and executable by a processor resource 112.
  • the computing device 110 can utilize a non-transitory computer- readable medium storing instructions 116, 118 that, when executed, cause the processor resource 112 to perform corresponding functions.
  • the memory resource 114 may be an electronic, magnetic, optical, or other physical storage device that stores executable instructions.
  • the non-transitory machine readable medium (e.g., a memory resource 114) may be, for example, a non-transitory machine readable medium (MRM) comprising Random-Access Memory (RAM), read only memory (ROM), an Electrically-Erasable Programmable ROM (EEPROM), a storage drive, an optical disc, and the like.
  • the non-transitory machine readable medium (e.g., a memory resource 114) may be disposed within the computing device 110. In this example, the executable instructions 116, 118 can be “installed” on the device.
  • the non-transitory machine readable medium (e.g., a memory resource 114) can be a portable, external or remote storage medium, for example, that allows a computing system to download the instructions 116, 118 from the portable/external/remote storage medium.
  • the executable instructions may be part of an “installation package”.
  • the non-transitory machine readable medium (e.g., a memory resource 114) can be encoded with executable instructions for displaying and allowing interaction with a digital document through an interaction with a physical medium 106.
  • the computing device 110 can include instructions 116 that can be executed by a processor resource 112 to determine dimensions of a piece of media 106 positioned in view of the augmented reality display 104.
  • the wearable device 102 can include an imaging device that can be utilized to determine the edges and/or corners of the piece of media 106.
  • the imaging device can track the edges and/or corners of the piece of media 106 to identify the boundaries of the piece of media 106.
  • the piece of media 106 can include a sheet of paper or other type of media that can be utilized to receive a print substance from a printing device. In this way, the piece of media 106 can appear to be a piece of print media when utilizing the augmented reality display 104.
  • identifying the edges and/or corners of the piece of media 106 can include distinguishing portions of a surface that are within the boundaries of the piece of media 106 from other objects or surfaces surrounding the piece of media 106. In this way, the computing device 110 can alter dimensions of a digital document to fit within the boundaries of the piece of media 106.
  • the computing device 110 can include instructions 118 that can be executed by a processor resource 112 to generate an image of a digital document utilizing the determined dimensions on the augmented reality display 104 such that the image is affixed with the piece of media 106.
  • the image of the document is affixed at a particular location within the piece of media 106 such that the image is positioned at the particular location even when the augmented reality display 104 moves.
  • a user can utilize the augmented reality display 104 to view a digital document that is affixed within the boundaries of the piece of media 106.
  • the augmented reality display 104 can be moved in a plurality of directions and the digital document can remain within the boundaries of the piece of media 106. In this way, viewing the piece of media 106 through the augmented reality display 104 can resemble a printed document even when the piece of media 106 is a blank piece of media 106.
  • the computing device 110 can include instructions that can be executed by a processor resource 112 to alter a position of the image with the piece of media 106 in response to an alteration of a position of the piece of media 106.
  • the image of the document or a portion of the image of the document can be displayed on the augmented reality display 104 to appear affixed to the piece of media 106.
  • the position of the image within the boundaries of the piece of media 106 can follow the piece of media 106 when the piece of media is moved.
  • the piece of media 106 can appear to have a printed document or portion of a printed document on the piece of media 106.
  • a portion of the piece of media 106 can be moved or altered to a different position.
  • the image displayed within the boundaries can be altered within the same portion of the piece of media 106 to appear within the boundaries of the piece of media 106.
  • the computing device 110 can include instructions that can be executed by a processor resource 112 to separate the digital document into a plurality of pages based on the dimensions.
  • the digital document can be separated into a plurality of pages based on the dimensions or boundaries of the piece of media 106.
  • the digital document can be formatted to fit within the boundaries of the piece of media 106 and separated into a plurality of pages such that each of the plurality of pages fit within the boundaries of the piece of media 106.
  • the computing device 110 can include instructions to determine a page flip of the piece of media 106.
  • a page flip of the piece of media 106 can include altering the piece of media 106 from a first side to a second side.
  • the piece of media 106 can be positioned on a first side and the piece of media 106 can be physically moved to a second side of the piece of media 106.
  • the physical movement of the piece of media 106 can be utilized as an input for the computing device 110.
  • the page flip of the piece of media 106 can be an input for the computing device 110 to display a different page of the digital document on the augmented reality display 104.
  • the computing device 110 can include instructions to update the image to a proximate page of the separated digital document.
  • physical movement of the piece of media 106 can be utilized as an input to alter the image displayed on the augmented reality display 104.
  • a page flip that starts from a first edge of the piece of media 106 can be a first input for the computing device 110 to alter to a previous page of the plurality of pages and a page flip that starts from a second edge of the piece of media 106 can be a second input for the computing device 110 to alter to a subsequent page of the plurality of pages.
  • the digital document can be affixed to a particular location within the boundaries of the piece of media 106.
  • the digital document can be affixed to a particular orientation within the boundaries of the piece of media 106.
  • the piece of media 106 may be rotated from a portrait to a landscape orientation.
  • the digital document can stay affixed to a portrait orientation when rotated to a landscape orientation. In this way, the digital document still appears affixed or printed on the piece of media 106.
  • the piece of media 106 can be rotated 180 degrees. In this example, the digital document may appear upside down since the digital document is affixed at the particular orientation within the boundaries of the piece of media 106.
  • the computing device 110 can include instructions that can be executed by a processor resource 112 to identify a user of the augmented reality display 104 and alter the image based on a profile associated with the user.
  • the wearable device 102 and/or computing device 110 can include an identification device.
  • the wearable device 102 and/or computing device 110 can utilize an identification device such as, but not limited to: a biometric scanning device (e.g., fingerprint scanner, iris scanner, etc.), a login and password combination, and/or other type of device that can identify or authenticate a particular user of the wearable device 102.
  • the digital document can include particular display settings based on the profile associated with the user.
  • the font size of a plurality of letters can be altered based on a profile of the user.
  • other display settings such as brightness, contrast, or other features can be utilized based on the profile of the user.
  • a single digital document can be provided to a plurality of users and each of the plurality of users can include altered display settings based on the corresponding user profiles of the plurality of users.
  • the computing device 110 can include instructions that can be executed by a processor resource 112 to block a portion of the image based on security settings of the profile associated with the user.
  • the user profile can include security settings.
  • a digital document can include portions that can be identified as sensitive information or secure information that is to be restricted from particular users within a group.
  • the sensitive information can be privileged information that may not be allowed to be shared with non-privileged members of a group.
  • a user profile can be utilized to determine when a user is allowed to view the sensitive information or if the user is not allowed to view the sensitive information.
  • the computing device 110 can block the sensitive information from users that have a user profile that does not allow the user to view the sensitive information. In some examples, the computing device 110 can block portions of the image to ensure the sensitive information is protected. In this way, a single digital document can be sent to a plurality of users without having to individually secure digital documents for each of the plurality of users prior to sending the digital document.
  • the computing device 110 can include instructions that can be executed by a processor resource 112 to track a stylus that is in contact with the piece of media 106 and alter the image displayed on the augmented reality display 104 based on the contact of the stylus with the piece of media 106.
  • a stylus can include a physical device that can interact with the piece of media 106.
  • the stylus can be an electrical device or computing stylus that can also interact with electronic devices.
  • the stylus can be a non-electrical device that may be utilized with the piece of media 106.
  • the stylus can be an ink pen or a pencil that is normally utilized with the piece of media 106 to make physical marks on the piece of media 106.
  • the computing device 110 can track a physical location of the stylus and movements of the stylus to allow the interactions between the stylus and piece of media 106 to be utilized as inputs for the digital document.
  • the stylus can be utilized to write letters or shapes on the piece of media 106 and the computing device 110 can generate corresponding letters or shapes on the digital document. In this way, a user can utilize a stylus with the piece of media 106 in a similar way as if the piece of media 106 included a printed document and the stylus is a pen or pencil, and also update a digital document (a sketch of this stroke capture follows this list).
  • the computing device 110 can include instructions that can be executed by a processor resource 112 to generate a permanent document of the altered image displayed on the augmented reality display.
  • a user can update the digital document utilizing a stylus or other physical interaction with the piece of media 106.
  • the computing device 110 can generate a permanent document of the updated digital document.
  • a permanent document can include a printed document of the updated digital document.
  • the permanent document can include a print document that is generated by a printing device on a physical print medium.
  • the permanent document can be a digital print job, such as a portable document format (PDF).
  • the wearable device 102 can utilize an augmented reality display 104 with a computing device 110 to execute physical interactions of a piece of media 106 with digital interactions of a digital document.
  • the digital document displayed through the augmented reality display 104 can be altered based on a user profile of the user of the wearable device 102. In this way, the digital document can remain secure from unauthorized users while still providing the physical interactions between the user and the piece of media 106.
  • Figure 2 illustrates an example of a memory resource 214 for generating augmented reality documents, in accordance with the present disclosure.
  • the memory resource 214 can be utilized by a computing device 110 as described in Figure 1.
  • the memory resource 214 can be communicatively coupled to a wearable device that includes an augmented reality display.
  • an augmented reality display can utilize real world objects with implemented digital images to generate a combined image that includes the real world objects and digital images.
  • the real world objects can include a piece of paper and the digital images can include text or pictures that appear to be printed on the piece of paper.
  • the memory resource 214 may be electronic, magnetic, optical, or other physical storage device that stores executable instructions.
  • the non-transitory machine readable medium (e.g., a memory resource 214) may be, for example, a non-transitory machine readable medium (MRM) comprising Random-Access Memory (RAM), read only memory (ROM), an Electrically-Erasable Programmable ROM (EEPROM), a storage drive, an optical disc, and the like.
  • the non-transitory machine readable medium (e.g., a memory resource 214) may be disposed within a computing device.
  • the executable instructions 222, 224, 226, 228, 230, 232 can be “installed” on the wearable device.
  • the non-transitory machine readable medium (e.g., a memory resource 214) can be a portable, external or remote storage medium, for example, that allows a computing system to download the instructions 222, 224, 226, 228, 230, 232 from the portable/external/remote storage medium.
  • the executable instructions may be part of an “installation package”.
  • the non-transitory machine readable medium (e.g., a memory resource 214) can be encoded with executable instructions for displaying a digital document on a physical medium.
  • the memory resource 214 can include instructions 222 that can be executed by a processor resource to determine boundaries of a physical media in a view path of an augmented reality display.
  • the physical media can include a physical object that can be utilized to display images or text.
  • the physical media can include print media such as paper.
  • a view path of an augmented reality display includes an optical pathway between a user and the piece of physical media.
  • the physical media can be in the view path of the augmented reality display when the user of the augmented reality display can see the physical media through the augmented reality display.
  • the boundaries of the physical media can be determined based on determining edge marks of the physical media.
  • edge marks can include physical edges of the physical media and/or markings on the physical media to identify an edge or boundary of the physical media.
  • the memory resource 214 can be communicatively coupled to an imaging device that can provide images of the physical media to the memory resource 214.
  • the imaging device can include a sensor that can be utilized to determine a difference between the physical media and a surrounding area.
  • the imaging device can include a sensor to detect a marking or image on the physical media and the corresponding boundaries of the physical media with respect to the marking or image.
  • the memory resource 214 can include instructions 224 that can be executed by a processor resource to generate a portion of a document that is presented on the physical media through the augmented reality display such that the portion of the document is fitted to the boundaries of the physical media.
  • the portion of the document can include a page of the document.
  • the document can be a digital representation of a plurality of images and/or text.
  • the document can be separated into a plurality of pages based on the boundaries of the physical media such that each of the plurality of pages comprise a particular portion of the document.
  • the portion or a particular page of the document can be displayed on the augmented reality display to appear as though the portion or particular page is affixed to the physical media.
  • the memory resource 214 can include instructions 226 that can be executed by a processor resource to generate images on the document in response to a stylus interacting with the physical media.
  • the generated images can be markings, edits, and/or highlights that are defined by the movement of the stylus on the physical media.
  • the stylus can be utilized to move from a first location to a second location to cross out text displayed in a particular location of the physical media through the augmented reality display.
  • the cross out can be generated within the digital document and appear on the physical media through the augmented reality display.
  • the memory resource 214 can include instructions 228 that can be executed by a processor resource to generate images on the document in response to a stylus interacting with the physical media.
  • the images outlined by tracking the interactions of the stylus with the physical media can be implemented into the digital document and appear in a corresponding location on the physical media when displayed by the augmented reality display.
  • the stylus can be tracked while writing text on the physical media.
  • the written text can be digitized (e.g., made into a digital format, etc.) and implemented into the digital document at the location of the stylus writing the text on the physical media.
  • the stylus may not make a physical mark on the physical media. That is, the stylus may make physical contact with the physical media without making marks on the physical media.
  • the memory resource 214 can include instructions 230 that can be executed by a processor resource to incorporate the generated images into the document.
  • the processor can incorporate the images generated by the interaction between the stylus and the physical media into the digital document at a corresponding location on the physical media when viewed through the augmented reality display.
  • the memory resource 214 can include instructions 232 that can be executed by a processor resource to alter the portion of the document that is presented on the physical media in response to a movement of the physical media.
  • the document or portion of the document can be affixed to a particular location within the boundaries of the physical media.
  • movement of the physical media can be utilized as an input to perform a particular function associated with the digital document. For example, a page flip of the physical media can be utilized as an input to perform a page change of the digital document.
  • the memory resource 214 can include instructions that can be executed by a processor resource to alter the document from a first page to a second page of the document in response to the page flip movement.
  • a page flip movement can include a physical movement of the physical media from a first side to a second side.
  • the page flip movement can be performed by holding a right edge of the physical media and flipping the physical media in a leftward direction to a second side of the physical media.
  • the second page can be a subsequent page of the first page.
  • the page flip movement can be performed by holding a left edge of the physical media and flipping the physical media in a rightward direction to a second side of the physical media.
  • the second page can be a previous page of the first page.
  • the memory resource 214 can include instructions that can be executed by a processor resource to alter a viewing size of the portion of the document and update the portion of the document to fit within the boundaries of the physical media utilizing the altered viewing size.
  • the size of text or images of the digital document can be altered. For example, the size of text within the digital document can be increased. In this example, the same quantity of text may not fit within the boundaries of the physical media due to the increased size of text. In this example, the digital document can be updated with a greater quantity of pages to allow the increased text size to fit within the boundaries of the physical media.
  • the memory resource 214 can include instructions that can be executed by a processor resource to alter a presentation of the document based on settings associated with a profile of a user and update the document to fit within the boundaries of the physical media utilizing the altered presentation.
  • the profile of the user can include an orientation of the physical media to allow for larger or smaller font sizes.
  • the user profile can define that the digital document be presented in a portrait orientation to allow for a greater quantity of words or images to be provided within the boundaries of the document.
  • the user profile can define that the digital document be presented in a landscape orientation to allow for a larger font size to be utilized.
  • a wearable device with an augmented reality display can be associated with a particular user.
  • the wearable device can be utilized to identify a particular user and/or a particular user profile of a wearer of the wearable device.
  • the profile of the user can have stored settings for displaying digital documents.
  • the settings can include a particular font size or image size to allow the corresponding user to more easily view the digital documents.
  • the digital document can be altered or updated to fit within the boundaries of the physical media.
  • the digital document can be affixed within the boundaries of the physical media at a size that is readable or viewable by the particular user.
  • Figure 3 illustrates an example of a system 301 for generating augmented reality documents, in accordance with the present disclosure.
  • the system 301 can include the same or similar elements as illustrated in Figure 1.
  • the system 301 can include a wearable device 302, print media 306-1, 306-2 with a binding mechanism 350, and/or a computing device 310 that can be communicatively coupled to the wearable device 302 through a communication path 308.
  • the computing device 310 can include instructions 342, 344, 346, 348 stored on a machine-readable medium (e.g., memory resource 314, non- transitory computer-readable medium, etc.) and executable by a processor resource 312.
  • the computing device 310 can utilize a non-transitory computer-readable medium storing instructions 342, 344, 346, 348 that, when executed, cause the processor resource 312 to perform corresponding functions.
  • the computing device 310 can include instructions 342 that can be executed by a processor resource 312 to determine a boundary of the physical media 306-1, 306-2, wherein the boundary includes a first set of exterior edges of a first side of the binding mechanism 350 and a second set of exterior edges of a second side of the binding mechanism 350.
  • the physical media 306-1, 306-2 can be a physical book that includes a binding mechanism 350.
  • the physical media 306-1, 306-2 can include a first page 306-1 and a second page 306-2 that is divided by the binding mechanism 350.
  • the physical media 306-1, 306-2 can be blank pages or pages that include markings to help identify the edges and/or boundaries of the physical media 306-1, 306-2.
  • the computing device 310 can include instructions 344 that can be executed by a processor resource 312 to format a digital document to fit within the boundary of the first side (e.g., first page 306-1) and the second side (e.g., second page 306-2) of the binding mechanism 350, wherein a first page of the digital document is positioned within the boundary of the first side and a second page of the digital document is positioned within the boundary of the second side.
  • the boundaries of the first side of the physical media 306-1 and the boundaries of the second side of the physical media 306-2 can be identified utilizing a sensor or imaging device.
  • the computing device 310 can include instructions 346 that can be executed by a processor resource 312 to display the digital document on the wearable augmented reality display 304 to correspond within the boundaries of the first side and the second side of the binding mechanism 350.
  • the digital document can be separated into a plurality of pages and a first page of the plurality of pages can be displayed on the first side of the physical media 306-1 and a second page of the plurality of pages can be displayed on a second side of the physical media 306-2.
  • the computing device 310 can include instructions 348 that can be executed by a processor resource 312 to alter the digital document in response to a page turn of the physical media 306-1, 306-2, wherein the altered digital document displays a third page of the digital document positioned within the boundary of the first side and a fourth page of the digital document positioned within the boundary of the second side.
  • a page of the physical media 306-1, 306-2 can be flipped to a different page and the digital document can be altered to display subsequent or previous pages based on the direction of the page flip movement (a sketch of this two-page layout follows this list).
  • the page flip movement can be along the binding mechanism 350.
  • the page flip movement can include moving the second side of the physical media 306-2 in the direction of the first side of the physical media 306-1 along the binding mechanism 350.
  • the digital document can be updated to display a subsequent page of the digital document and a back side of the second side of the physical media 306-2.
  • the physical media 306-1 , 306-2 can be physically manipulated in the same way as a book to provide inputs to the computing device 310 and update the digital document in response to the provided inputs.
  • the computing device 310 can include instructions that can be executed by a processor resource 312 to alter a position of the digital document in response to an alteration of a position of the physical media.
  • the altered position of the digital document maintains the digital document within the boundary of the physical media 306-1, 306-2.
  • the digital document can be affixed within the boundary of the physical media 306-1, 306-2 even when the physical media 306-1, 306-2 is repositioned. For example, a corner of the first side of the physical media 306-1 can be lifted and the digital document can be altered to be affixed within the boundary of the lifted first side of the physical media 306-1.
  • the system 301 can include a stylus 352.
  • the stylus 352 can be utilized to physically interact with the physical media 306-1, 306-2.
  • the computing device 310 can include instructions to generate images on the digital document in response to the stylus 352 physically interacting with the physical media 306-1, 306-2.
  • the wearable device 302 can include a sensor or imaging device to track the location of the stylus and track when the stylus is physically interacting with the physical media 306-1, 306-2.
  • digital images can be generated in locations of the digital document corresponding to the physical interactions between the stylus and the physical media 306-1, 306-2. In this way, a user can utilize the stylus to make digital images within the digital document by providing physical contact on the physical media 306-1, 306-2 with the stylus.
  • Figure 4 illustrates an example of a system 401 for generating augmented reality documents, in accordance with the present disclosure.
  • the system 401 can include the same or similar elements as system 301 as illustrated in Figure 3.
  • the system 401 can include a wearable device 402 that includes an augmented reality display 404.
  • the wearable device 402 can include a computing device and/or be communicatively coupled to a computing device.
  • communicatively coupled can include a communication path to communicate between devices.
  • the system 401 can include a physical booklet that includes a first side of physical media 406-1 and a second side of physical media 406-2 that can be connected by a binding mechanism 450.
  • the system 401 can include a stylus 452.
  • the stylus 452 can include a pen- or pencil-shaped device that can physically interact with the physical media 406-1, 406-2.
  • the stylus 452 can include a tracking sensor that can be utilized to provide a location of the stylus 452 to the wearable device 402 and/or computing device communicatively coupled to the wearable device 402.
  • the wearable device 402 can include a tracking mechanism to track a location of the stylus 452 and/or physical interaction between the stylus 452 and the physical media 406-1, 406-2. In this way, the wearable device 402 can alter or update a digital document displayed on the augmented reality display 404 to reflect the interaction between the stylus 452 and the physical media 406-1, 406-2.
  • the wearable device 402 can utilize the augmented reality display 404 to display digital content 454 such that the digital content 454 appears affixed to the physical media 406-1, 406-2.
  • the digital content 454 can be formatted to fit within the boundaries of the physical media 406-1, 406-2 to appear as though the digital content 454 is printed on the physical media 406-1, 406-2.
  • the wearable device 402 can include a sensor to determine the boundaries of the physical media 406-1, 406-2. In some examples, the wearable device 402 can determine the edges of the physical media 406-1, 406-2 to identify the boundaries of the physical media 406-1, 406-2.
  • the digital content 454 can be digital content associated with an audio/visual presentation.
  • an audio/visual presentation can include a combination of visual content and audio content.
  • an audio/visual presentation can include a real time or recorded presentation of an audio description along with digital content 454.
  • a specific example of an audio/visual presentation can include a speaker who is describing a PowerPoint presentation.
  • the audio can be the spoken words of the speaker and the digital content 454 can include the PowerPoint presentation.
  • the user or wearer of the wearable device 402 can listen to the audio portion of the audio/visual presentation while viewing the digital content 454 of the audio/visual presentation as if the digital content 454 was printed on the physical media 406-1, 406-2.
  • the stylus 452 can be utilized to insert writing 456 on or within the digital content 454 of the audio/visual presentation during the audio/visual presentation. That is, the wearable device 402 can be generating the audio through a speaker or similar device and displaying the digital content 454 to allow interaction with the digital content 454 during the audio/visual presentation.
  • the physical media 406-1, 406-2 can be manipulated in similar ways as described herein to make it appear as though the digital content 454 is affixed within the boundaries of the physical media 406-1, 406-2. In this way, the digital content 454 can be altered to different pages of the digital content 454 during the audio/visual presentation.
  • the augmented reality display 404 can display a page or slide number of the digital content 454 that is currently being described by the audio of the audio/visual presentation.
  • the stylus 452 can be utilized to generate writing 456, highlighting 458, or other marks within the digital content 454 of the audio/visual presentation at different times than what is currently being described by the audio of the audio/visual presentation.
  • the wearable device 402 and/or stylus 452 can be utilized to manipulate the audio/visual presentation through input selections as described herein.
  • inputs can be displayed by the augmented reality display 404 for pausing, stopping, rewinding, fast forwarding, among other features that can be selected by the stylus 452.
  • the selectable inputs can be present on the wearable device 402 and/or the stylus 452 to manipulate the audio/visual presentation.
  • the wearable device 402 can determine dimensions 462-1, 462-2, 462-3 of the physical media 406-1, 406-2. For example, the wearable device 402 can determine a width dimension 462-1 of the physical media 406-2, a height dimension 462-2 of the physical media 406-1, 406-2, and/or a width dimension 462-3 of the physical media 406-1.
  • the width dimension 462-1 can be a distance between an edge of the physical media 406-2 and the binding mechanism 450.
  • the length dimension 462-2 of the physical media 406-2 can be a distance between a top edge of the physical media 406-2 and a bottom edge of the physical media 406-2 as illustrated in Figure 4.
  • the length dimension 462-2 of the physical media 406-2 can be the same as the length dimension of the physical media 406-1 .
  • the dimensions 462-1, 462-2, 462-3 of the physical media 406-1, 406-2 can be utilized to determine the boundaries of the physical media 406-1, 406-2.
  • the digital content 454 displayed on the augmented reality display 404 can be affixed to be within the dimensions 462-1, 462-2, 462-3 of the physical media 406-1, 406-2.
  • the digital content 454 can be positioned within selected margins within the dimensions 462-1, 462-2, 462-3 of the physical media 406-1, 406-2 such that the binding mechanism 450 is not interfering with the digital content 454. In this way, the digital content 454 can appear to be printed on the physical media 406-1, 406-2 when viewing the augmented reality display 404.
  • the stylus 452 can be utilized to physically interact with the physical media 406-1 , 406-2 to insert writing 456.
  • the writing 456 can be text or other images.
  • the stylus 452 can be utilized to physically interact with the physical media 406-1 , 406-2 to make notes that appear within the digital document displayed by the augmented reality display 404. In this way, a user can utilize the stylus 452 with the physical media 406-1 , 406-2 as if the digital content 454 was printed on the physical media 406-1 , 406-2 and update the digital document displayed on the augmented reality display 404.
  • the physical interaction between the stylus 452 and the physical media 406-1 , 406-2 may not make physical marks on the physical media 406-1 , 406-2.
  • a physical stylus 452 may not deposit ink or carbon on the physical media 406-1 , 406-2 when physically interacting with the physical media 406-1 , 406-2.
  • the stylus 452 can provide multiple functions that can be utilized as inputs to be implemented into the digital document or digital content 454 displayed on the augmented reality display 404.
  • the stylus 452 can include a first mode to generate writing 456 and a second mode to generate highlighting 458.
  • the different modes can be selected utilizing the augmented reality display 404.
  • the augmented reality display 404 can provide a menu of selections for a plurality of different modes.
  • the different modes can be selected utilizing the stylus 452.
  • the stylus 452 can be a digital stylus that can include selectable inputs that can be provided to the augmented reality display 404 through a communication path.
  • the augmented reality display 404 can display selectable inputs that correspond to the different modes on the physical media 406-1 , 406-2 and the stylus 452 can be utilized to select the different modes by physically interacting with the corresponding areas of physical media 406-1 , 406-2.
  • although writing 456 and highlighting 458 are illustrated in Figure 4, the stylus 452 could utilize a plurality of additional modes that can generate different inputs that can be implemented into the digital document or affect the digital content 454 in different ways.
  • the system 401 illustrates how a wearable device 402 with an augmented reality display 404 can allow a user to utilize traditional interactions with physical media 406-1 , 406-2 while simultaneously updating or altering digital content 454 presented on the augmented reality display 404.
  • the digital content 454 can be affixed to a particular location within the dimensions 462-1 , 462-2, 462-3 of the physical media 406-1 , 406-2 to make it appear as though the digital content 454, writing 456, highlighting 458, and/or other edits are printed on the physical media 406-1 , 406-2. This can provide the experience of utilizing a printed physical media, such as a piece of paper or book, while simultaneously updating a digital document, providing security, and utilizing user profile settings that can improve a user experience.

Abstract

Example implementations relate to augmented reality documents. For example, implementations can include a device that includes an augmented reality display and a computing device comprising a non-transitory memory resource storing machine-readable instructions that, when executed, cause a processor resource to: determine dimensions of a piece of media positioned in view of the augmented reality display, and generate an image of a digital document utilizing the determined dimensions on the augmented reality display such that the image is affixed with the piece of media.

Description

AUGMENTED REALITY DOCUMENTS
Background
[0001] Computing devices can be utilized to perform particular functions.
Computing devices can be utilized to generate digital documents utilizing a plurality of applications. In some examples, the computing devices can be utilized to display the digital documents through a physical display, such as a monitor. In some examples, the computing device can be communicatively coupled to a printing device to generate printed images on a print medium (e.g., paper, plastic, etc.).
Brief Description of the Drawings
[0002] Figure 1 illustrates an example of a device for generating augmented reality documents, in accordance with the present disclosure.
[0003] Figure 2 illustrates an example of a memory resource for generating augmented reality documents, in accordance with the present disclosure.
[0004] Figure 3 illustrates an example of a system for generating augmented reality documents, in accordance with the present disclosure.
[0005] Figure 4 illustrates an example of a system for generating augmented reality documents, in accordance with the present disclosure.
Detailed Description
[0006] A user may utilize a computing device for various purposes, such as for business and/or recreational use. As used herein, the term “computing device” refers to an electronic system having a processing resource and a memory resource. Examples of computing devices can include, for instance, a laptop computer, a notebook computer, a desktop computer, networking device, and/or a mobile device, among other types of computing devices. As used herein, a mobile device refers to devices that are (or can be) carried and/or worn by a user. For example, a mobile device can be a phone (e.g., a smart phone), a tablet, a personal digital assistant (PDA), smart glasses, and/or a wrist-worn device (e.g., a smart watch), among other types of mobile devices.
[0007] In some examples, the portable computing device can be a wearable device that can include an augmented reality display. As used herein, an augmented reality display can include a display that can superimpose a computer-generated image on a view of the real world. For example, the augmented reality device can display a portion of real world objects with a superimposed object that is generated by a computing device. In some examples, the augmented reality display can display a portion that includes real world objects and another portion that includes computer-generated images that are combined as a composite view. In this way, computer-generated images can appear to be part of the real world or implanted within the real world when viewing the augmented reality display.
[0008] Although digital versions of a document can be utilized to make edits or updates, a physical document can provide a different user experience that may be preferred. In some examples, the physical document can be a paper book, piece of paper, or other physical media that can be utilized to generate print media. As used herein, print media can include physical media that can receive a print substance, such as ink or toner, to generate an image on the physical media. In some examples, print media may not be accessible or provide security from individuals that are in the same physical area as a user.
[0009] The present disclosure relates to augmented reality documents that can be generated to appear as though they are printed on a physical medium. In some examples, the physical medium can be utilized to alter the appearance of the documents displayed on the augmented reality display. For example, a digital document can be displayed through the augmented reality display to appear within the boundaries of the physical medium. In this example, the physical medium can be flipped to a second side and the digital document can be switched to a different page. In this way, the physical medium can be manipulated as if the digital document was printed on the physical medium. In a similar way, a stylus can be utilized with the physical medium to generate new content within the digital document.
[0010] Figure 1 illustrates an example of a device 100 for generating augmented reality documents, in accordance with the present disclosure. In some examples, the device can include a wearable device 102 communicatively coupled to a computing device 110. In some examples, the wearable device 102 can include the computing device 110 as an embedded portion within the wearable device 102. In other examples, the computing device 110 can be a remote computing device, such as a cloud resource, that can be connected to the wearable device 102 through a communication path 108 or network connection. As used herein, a communication path 108 can include wired or wireless pathways that can be utilized to transfer information and/or data.
[0011] In some examples, the computing device 110 can include a processor resource 112 communicatively coupled to a memory resource 114. As described further herein, the memory resource 114 can include instructions 116, 118 that can be executed by the processor resource 112 to perform particular functions. In some examples, the processor resource 112 may be a central processing unit (CPU), a semiconductor-based microprocessor, and/or other hardware devices suitable for retrieval and execution of non-transitory machine-readable instructions. In some examples, the computing device 110 can be associated with a plurality of components. For example, the computing device 110 can be utilized to determine dimensions of a piece of media 106 and display a digital document through an augmented reality display 104. In some examples, the computing device 110 can be local or remote to the plurality of components.
[0012] In some examples, the computing device 110 can include instructions 116, 118 stored on a machine-readable medium (e.g., memory resource 114, non-transitory computer-readable medium, etc.) and executable by a processor resource 112. In a specific example, the computing device 110 can utilize a non-transitory computer-readable medium storing instructions 116, 118 that, when executed, cause the processor resource 112 to perform corresponding functions.
[0013] The memory resource 114 may be an electronic, magnetic, optical, or other physical storage device that stores executable instructions. Thus, the non-transitory machine readable medium (e.g., a memory resource 114) may be, for example, a non-transitory machine readable medium (MRM) comprising Random-Access Memory (RAM), read only memory (ROM), an Electrically-Erasable Programmable ROM (EEPROM), a storage drive, an optical disc, and the like. The non-transitory machine readable medium (e.g., a memory resource 114) may be disposed within the computing device 110. In this example, the executable instructions 116, 118 can be “installed” on the device. Additionally, and/or alternatively, the non-transitory machine readable medium (e.g., a memory resource 114) can be a portable, external or remote storage medium, for example, that allows a computing system to download the instructions 116, 118 from the portable/external/remote storage medium. In this situation, the executable instructions may be part of an “installation package”. As described herein, the non-transitory machine readable medium (e.g., a memory resource 114) can be encoded with executable instructions for displaying and allowing interaction with a digital document through an interaction with a physical medium 106.
[0014] In some examples, the computing device 110 can include instructions 116 that can be executed by a processor resource 112 to determine dimensions of a piece of media 106 positioned in view of the augmented reality display 104. In some examples, the wearable device 102 can include an imaging device that can be utilized to determine the edges and/or corners of the piece of media 106. In some examples, the imaging device can track the edges and/or corners of the piece of media 106 to identify the boundaries of the piece of media 106. In some examples, the piece of media 106 can include a sheet of paper or other type of media that can be utilized to receive a print substance from a printing device. In this way, the piece of media 106 can appear to be a piece of print media when utilizing the augmented reality display 104.
[0015] As described herein, identifying the edges and/or corners of the piece of media 106 can include distinguishing portions of a surface that are within the boundaries of the piece of media 106 from other objects or surfaces surrounding the piece of media 106. In this way, the computing device 110 can alter dimensions of a digital document to fit within the boundaries of the piece of media 106.
[0016] In some examples, the computing device 110 can include instructions 118 that can be executed by a processor resource 112 to generate an image of a digital document utilizing the determined dimensions on the augmented reality display 104 such that the image is affixed with the piece of media 106. As used herein, the image of the document is affixed at a particular location within the piece of media 106 such that the image is positioned at the particular location even when the augmented reality display 104 moves. In this way, a user can utilize the augmented reality display 104 to view a digital document that is affixed within the boundaries of the piece of media 106. For example, the augmented reality display 104 can be moved in a plurality of directions and the digital document can remain within the boundaries of the piece of media 106. In this way, viewing the piece of media 106 through the augmented reality display 104 can resemble a printed document even when the piece of media 106 is a blank piece of media 106.
[0017] In some examples, the computing device 110 can include instructions that can be executed by a processor resource 112 to alter a position of the image with the piece of media 106 in response to an alteration of a position of the piece of media 106. As described herein, the image of the document or a portion of the image of the document can be displayed on the augmented reality display 104 to appear affixed to the piece of media 106. In this way, the position of the image within the boundaries of the piece of media 106 can follow the piece of media 106 when the piece of media is moved. In this way, the piece of media 106 can appear to have a printed document or portion of a printed document on the piece of media 106. In some examples, a portion of the piece of media 106 can be moved or altered to a different position. In these examples, the image displayed within the boundaries can be altered within the same portion of the piece of media 106 to appear within the boundaries of the piece of media 106.
[0018] In some examples, the computing device 110 can include instructions that can be executed by a processor resource 112 to separate the digital document into a plurality of pages based on the dimensions. In some examples, the digital document can be separated into a plurality of pages based on the dimensions or boundaries of the piece of media 106. For example, the digital document can be formatted to fit within the boundaries of the piece of media 106 and separated into a plurality of pages such that each of the plurality of pages fit within the boundaries of the piece of media 106.
[0019] In some examples, the computing device 110 can include instructions to determine a page flip of the piece of media 106. As used herein, a page flip of the piece of media 106 can include altering the piece of media 106 from a first side to a second side. For example, the piece of media 106 can be positioned with a first side facing a user and physically flipped such that a second side of the piece of media 106 faces the user. In these examples, the physical movement of the piece of media 106 can be utilized as an input for the computing device 110. For example, the page flip of the piece of media 106 can be an input for the computing device 110 to display a different page of the digital document on the augmented reality display 104.
[0020] In some examples, the computing device 110 can include instructions to update the image to a proximate page of the separated digital document. As described herein, physical movement of the piece of media 106 can be utilized as an input to alter the image displayed on the augmented reality display 104. For example, a page flip that starts from a first edge of the piece of media 106 can be a first input for the computing device 110 to alter to a previous page of the plurality of pages and a page flip that starts from a second edge of the piece of media 106 can be a second input for the computing device 110 to alter to a subsequent page of the plurality of pages.
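A hedged sketch of how the starting edge of a detected flip might be mapped to page navigation (the edge convention follows the later description of leftward and rightward flips and is otherwise an assumption):

```python
# Illustrative sketch: map the edge at which a page flip starts to a page change.
# The convention (right edge advances, left edge goes back) is an assumption.
def page_after_flip(current_page, flip_start_edge, total_pages):
    if flip_start_edge == "right":
        return min(total_pages - 1, current_page + 1)  # subsequent page
    if flip_start_edge == "left":
        return max(0, current_page - 1)                # previous page
    return current_page
```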
[0021] As described herein, the digital document can be affixed to a particular location within the boundaries of the piece of media 106. In some examples, the digital document can be affixed to a particular orientation within the boundaries of the piece of media 106. In some examples, the piece of media 106 may be rotated from a portrait to a landscape orientation. In these examples, the digital document can stay affixed to a portrait orientation when rotated to a landscape orientation. In this way, the digital document still appears affixed or printed on the piece of media 106. In a similar way, the piece of media 106 can be rotated 180 degrees. In this example, the digital document may appear upside down since the digital document is affixed at the particular orientation within the boundaries of the piece of media 106.
[0022] In some examples, the computing device 110 can include instructions that can be executed by a processor resource 112 to identify a user of the augmented reality display 104 and alter the image based on a profile associated with the user. In some examples, the wearable device 102 and/or computing device 110 can include an identification device. In some examples, the wearable device 102 and/or computing device 110 can utilize an identification device such as, but not limited to: a biometric scanning device (e.g., fingerprint scanner, iris scanner, etc.), a login and password combination, and/or other type of device that can identify or authenticate a particular user of the wearable device 102. In some examples, the digital document can include particular display settings based on the profile associated with the user. For example, the font size of a plurality of letters can be altered based on a profile of the user. In some examples, other display settings, such as brightness, contrast, or other features can be utilized based on the profile of the user. In this way, a single digital document can be provided to a plurality of users and each of the plurality of users can view the digital document with display settings altered based on the corresponding user profile.
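For illustration, such profile-based alteration might reduce to merging per-user overrides into default rendering parameters before the page image is generated; the setting names below are assumptions.

```python
# Illustrative sketch: apply per-user display settings before rendering.
# Setting names and defaults are assumptions, not part of the disclosure.
DEFAULT_RENDER_SETTINGS = {"font_size_pt": 11, "brightness": 1.0, "contrast": 1.0}

def render_settings_for(user_profile):
    settings = dict(DEFAULT_RENDER_SETTINGS)
    settings.update(user_profile.get("display_settings", {}))
    return settings

# The same document, rendered differently for two hypothetical users:
alice = {"display_settings": {"font_size_pt": 16}}
bob = {"display_settings": {"contrast": 1.3}}
print(render_settings_for(alice))
print(render_settings_for(bob))
```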
[0023] In some examples, the computing device 110 can include instructions that can be executed by a processor resource 112 to block a portion of the image based on security settings of the profile associated with the user. In some examples, the user profile can include security settings. In some examples, a digital document can include portions that can be identified as sensitive information or secure information that is to be restricted from particular users within a group. For example, the sensitive information can be privileged information that may not be allowed to be shared with non-privileged members of a group. In this example, a user profile can be utilized to determine when a user is allowed to view the sensitive information or if the user is not allowed to view the sensitive information. In this example, the computing device 110 can block the sensitive information from users that have a user profile that does not allow the user to view the sensitive information. In some examples, the computing device 110 can block portions of the image to ensure the sensitive information is protected. In this way, a single digital document can be sent to a plurality of users without having to individually secure digital documents for each of the plurality of users prior to sending the digital document.
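As an illustration, blocking could be implemented by filtering document regions tagged as sensitive against the viewer's clearances before the image is composed; the tag and clearance fields below are invented for the example.

```python
# Illustrative sketch: mask regions marked sensitive unless the viewer's
# profile clears them. Field names are assumptions for illustration.
def visible_regions(document_regions, user_profile):
    clearances = set(user_profile.get("clearances", []))
    result = []
    for region in document_regions:
        required = region.get("required_clearance")
        if required is None or required in clearances:
            result.append(region)
        else:
            result.append({**region, "text": "[REDACTED]"})  # block the portion
    return result
```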
[0024] In some examples, the computing device 110 can include instructions that can be executed by a processor resource 112 to track a stylus that is in contact with the piece of media 106 and alter the image displayed on the augmented reality display 104 based on the contact of the stylus with the piece of media 106. As used herein, a stylus can include a physical device that can interact with the piece of media 106. In some examples, the stylus can be an electrical device or computing stylus that can also interact with electronic devices. In other examples, the stylus can be a non-electrical device that may be utilized with the piece of media 106.
[0025] For example, the stylus can be an ink pen or a pencil that is normally utilized with the piece of media 106 to make physical marks on the piece of media 106. In some examples, the computing device 110 can track a physical location of the stylus and movements of the stylus to allow the interactions between the stylus and piece of media 106 to be utilized as inputs for the digital document. In some examples, the stylus can be utilized to write letters or shapes on the piece of media 106 and the computing device 110 can generate corresponding letters or shapes on the digital document. In this way, a user can utilize a stylus with the piece of media 106 in a similar way as if the piece of media 106 included a printed document and the stylus were a pen or pencil, while also updating a digital document.
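A minimal sketch of how tracked stylus contact might be accumulated into strokes in the page's own coordinate system (the normalization is an assumption for illustration):

```python
# Illustrative sketch: accumulate stylus contact points into strokes expressed
# as fractions of the media size, so they can be placed on the digital document.
class StrokeRecorder:
    def __init__(self, media_width, media_height):
        self.media_width = media_width
        self.media_height = media_height
        self.strokes = []      # finished strokes
        self._current = None   # stroke currently being drawn

    def stylus_down(self, x, y):
        self._current = [self._normalize(x, y)]

    def stylus_move(self, x, y):
        if self._current is not None:
            self._current.append(self._normalize(x, y))

    def stylus_up(self):
        if self._current:
            self.strokes.append(self._current)
        self._current = None

    def _normalize(self, x, y):
        return (x / self.media_width, y / self.media_height)
```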
[0026] In some examples, the computing device 110 can include instructions that can be executed by a processor resource 112 to generate a permanent document of the altered image displayed on the augmented reality display. In some examples, a user can update the digital document utilizing a stylus or other physical interaction with the piece of media 106. In these examples, the computing device 110 can generate a permanent document of the updated digital document. As used herein, a permanent document can include a printed document of the updated digital document. For example, the permanent document can include a print document that is generated by a printing device on a physical print medium. In other examples, the permanent document can be a digital print job, such as a portable document format (PDF).
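For illustration, generating the permanent document could amount to writing the updated pages to a print-ready file; the sketch below assumes the reportlab package and a pages-of-text representation, neither of which is specified in the disclosure.

```python
# Illustrative sketch: export the updated pages as a PDF "permanent document".
# Assumes the reportlab package; pages are lists of plain text lines here.
from reportlab.lib.pagesizes import letter
from reportlab.pdfgen import canvas

def export_permanent_document(pages, path="permanent_document.pdf"):
    pdf = canvas.Canvas(path, pagesize=letter)
    _, height = letter
    for page_lines in pages:
        y = height - 72  # one-inch top margin
        for line in page_lines:
            pdf.drawString(72, y, line)
            y -= 14
        pdf.showPage()
    pdf.save()
```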
[0027] As described herein, the wearable device 102 can utilize an augmented reality display 104 with a computing device 110 to translate physical interactions with a piece of media 106 into digital interactions with a digital document. As described herein, the digital document displayed through the augmented reality display 104 can be altered based on a user profile of the user of the wearable device 102. In this way, the digital document can remain secure from unauthorized users while still providing the physical interactions between the user and the piece of media 106.
[0028] Figure 2 illustrates an example of a memory resource 214 for generating augmented reality documents, in accordance with the present disclosure. In some examples, the memory resource 214 can be utilized by a computing device 110 as described in Figure 1. In some examples, the memory resource 214 can be communicatively coupled to a wearable device that includes an augmented reality display. As described herein, an augmented reality display can utilize real world objects with implemented digital images to generate a combined image that includes the real world objects and digital images. For example, the real world objects can include a piece of paper and the digital images can include text or pictures that appear to be printed on the piece of paper.
[0029] The memory resource 214 may be electronic, magnetic, optical, or other physical storage device that stores executable instructions. Thus, the non-transitory machine readable medium (e.g., a memory resource 214) may be, for example, a non-transitory machine readable medium (MRM) comprising Random-Access Memory (RAM), an Electrically-Erasable Programmable ROM (EEPROM), a storage drive, an optical disc, and the like. The non-transitory machine readable medium (e.g., a memory resource 214) may be disposed within a computing device. In this example, the executable instructions 222, 224, 226, 228, 230, 232 can be “installed” on the wearable device. Additionally, and/or alternatively, the non-transitory machine readable medium (e.g., a memory resource 214) can be a portable, external or remote storage medium, for example, that allows a computing system to download the instructions 222, 224, 226, 228, 230, 232 from the portable/external/remote storage medium. In this situation, the executable instructions may be part of an “installation package”. As described herein, the non-transitory machine readable medium (e.g., a memory resource 214) can be encoded with executable instructions for displaying a digital document on a physical medium.
[0030] In some examples, the memory resource 214 can include instructions 222 that can be executed by a processor resource to determine boundaries of a physical media in a view path of an augmented reality display. As described herein, the physical media can include a physical object that can be utilized to display images or text. For example, the physical media can include print media such as paper. As used herein, a view path of an augmented reality display includes an optical pathway between a user and the piece of physical media. For example, the physical media can be in the view path of the augmented reality display when the user of the augmented reality display can see the physical media through the augmented reality display.
[0031] In some examples, the boundaries of the physical media can be determined based on determining edge marks of the physical media. As used herein, edge marks can include physical edges of the physical media and/or markings on the physical media to identify an edge or boundary of the physical media. In some examples, the memory resource 214 can be communicatively coupled to an imaging device that can provide images of the physical media to the memory resource 214. In some examples, the imaging device can include a sensor that can be utilized to determine a difference between the physical media and a surrounding area. In other examples, the imaging device can include a sensor to detect a marking or image on the physical media and corresponding boundaries of the physical media with respect to the marking or image.
[0032] In some examples, the memory resource 214 can include instructions 224 that can be executed by a processor resource to generate a portion of a document that is presented on the physical media through the augmented reality display such that the portion of the document is fitted to the boundaries of the physical media. In some examples, the portion of the document can include a page of the document. For example, the document can be a digital representation of a plurality of images and/or text. In this example, the document can be separated into a plurality of pages based on the boundaries of the physical media such that each of the plurality of pages comprise a particular portion of the document. In this example, the portion or a particular page of the document can be displayed on the augmented reality display to appear as though the portion or particular page is affixed to the physical media.
[0033] In some examples, the memory resource 214 can include instructions 226 that can be executed by a processor resource to generate images on the document in response to a stylus interacting with the physical media. In some examples, the generated images can be markings, edits, and/or highlights that are defined by the movement of the stylus on the physical media. For example, the stylus can be utilized to move from a first location to a second location to cross out text displayed in a particular location of the physical media through the augmented reality display. In this example, the cross out can be generated within the digital document and appear on the physical media through the augmented reality display.
[0034] In some examples, the memory resource 214 can include instructions 228 that can be executed by a processor resource to generate images on the document in response to a stylus interacting with the physical media. As described herein, the images outlined by tracking the interactions of the stylus with the physical media can be implemented into the digital document and appear in a corresponding location on the physical media when displayed by the augmented reality display. For example, the stylus can be tracked while writing text on the physical media. In this example, the written text can be digitized (e.g., made into a digital format, etc.) and implemented into the digital document at the location of the stylus writing the text on the physical media. In this example, the stylus may not make a physical mark on the physical media. That is, the stylus may make physical contact with the physical media without making marks on the physical media.
[0035] In some examples, the memory resource 214 can include instructions 230 that can be executed by a processor resource to incorporate the generated images into the document. As described herein, the processor can incorporate the images generated by the interaction between the stylus and the physical media into the digital document at a corresponding location on the physical media when viewed through the augmented reality display.
[0036] In some examples, the memory resource 214 can include instructions 232 that can be executed by a processor resource to alter the portion of the document that is presented on the physical media in response to a movement of the physical media. As described herein, the document or portion of the document can be affixed to a particular location within the boundaries of the physical media. Thus, when the physical media is moved, the location of the digital document within the augmented reality display is moved with the physical media to keep the digital document within the boundaries of the physical media. In other examples, movement of the physical media can be utilized as an input to perform a particular function associated with the digital document. For example, a page flip of the physical media can be utilized as an input to perform a page change of the digital document.
[0037] In some examples, the memory resource 214 can include instructions that can be executed by a processor resource to alter the document from a first page to a second page of the document in response to the page flip movement. As described herein, a page flip movement can include a physical movement of the physical media from a first side to a second side. In some examples, the page flip movement can be performed by holding a right edge of the physical media and flipping the physical media in a leftward direction to a second side of the physical media. In these examples, the second page can be a subsequent page of the first page. In a different example, the page flip movement can be performed by holding a left edge of the physical media and flipping the physical media in a rightward direction to a second side of the physical media. In these examples, the second page can be a previous page of the first page.
[0038] In some examples, the memory resource 214 can include instructions that can be executed by a processor resource to alter a viewing size of the portion of the document and update the portion of the document to fit within the boundaries of the physical media utilizing the altered viewing size. As described herein, the size of text or images of the digital document can be altered. For example, the size of text within the digital document can be increased. In this example, the same quantity of text may not fit within the boundaries of the physical media due to the increased size of text. In this example, the digital document can be updated with a greater quantity of pages to allow the increased text size to fit within the boundaries of the physical media.
[0039] In some examples, the memory resource 214 can include instructions that can be executed by a processor resource to alter a presentation of the document based on settings associated with a profile of a user and update the document to fit within the boundaries of the physical media utilizing the altered presentation. In some examples, the profile of the user can include an orientation of the physical media to allow for larger or smaller font sizes. For example, the user profile can define that the digital document be presented in a portrait orientation to allow for a greater quantity of words or images to be provided within the boundaries of the document. In other examples, the user profile can define that the digital document be presented in a landscape orientation to allow for a larger font size to be utilized.
[0040] As described herein, a wearable device with an augmented reality display can be associated with a particular user. In some examples, the wearable device can be utilized to identify a particular user and/or a particular user profile of a wearer of the wearable device. In some examples, the profile of the user can have stored settings for displaying digital documents. For example, the settings can include a particular font size or image size to allow the corresponding user to more easily view the digital documents. In this way, the digital document can be altered or updated to fit within the boundaries of the physical media. In this way, the digital document can be affixed within the boundaries of the physical media at a size that is readable or viewable by the particular user.
[0041] Figure 3 illustrates an example of a system 301 for generating augmented reality documents, in accordance with the present disclosure. In some examples, the system 301 can include the same or similar elements as illustrated in Figure 1. For example, the system 301 can include a wearable device 302, a print media 306-1, 306-2 with a binding mechanism 350, and/or a computing device 310 that can be communicatively coupled to the wearable device 302 through a communication path 308.
[0042] In some examples, the computing device 310 can include instructions 342, 344, 346, 348 stored on a machine-readable medium (e.g., memory resource 314, non-transitory computer-readable medium, etc.) and executable by a processor resource 312. In a specific example, the computing device 310 can utilize a non-transitory computer-readable medium storing instructions 342, 344, 346, 348 that, when executed, cause the processor resource 312 to perform corresponding functions.
[0043] In some examples, the computing device 310 can include instructions 342 that can be executed by a processor resource 312 to determine a boundary of the physical media 306-1, 306-2, wherein the boundary includes a first set of exterior edges of a first side of the binding mechanism 350 and a second set of exterior edges of a second side of the binding mechanism 350. In some examples, the physical media 306-1, 306-2 can be a physical book that includes a binding mechanism 350. In some examples, the physical media 306-1, 306-2 can include a first page 306-1 and a second page 306-2 that is divided by the binding mechanism 350. In some examples, the physical media 306-1, 306-2 can be blank pages or pages that include markings to help identify the edges and/or boundaries of the physical media 306-1, 306-2.
[0044] In some examples, the computing device 310 can include instructions 344 that can be executed by a processor resource 312 to format a digital document to fit within the boundary of the first side (e.g., first page 306-1) and the second side (e.g., second page 306-2) of the binding mechanism 350, wherein a first page of the digital document is positioned within the boundary of the first side and a second page of the digital document is positioned within the boundary of the second side. As described herein, the boundaries of the first side of the physical media 306-1 and the boundaries of the second side of the physical media 306-2 can be identified utilizing a sensor or imaging device.
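As a loose illustration, formatting for the two sides of a binding mechanism can be treated as paginating the document and then pairing consecutive pages onto the left and right sides of a spread; the pairing convention below is an assumption.

```python
# Illustrative sketch: pair consecutive pages of a paginated document onto the
# two sides of a bound spread. The left/right convention is an assumption.
def spread_for(pages, spread_index):
    """Return (left_page, right_page) for the given spread; None where absent."""
    left_index = 2 * spread_index
    right_index = left_index + 1
    left = pages[left_index] if left_index < len(pages) else None
    right = pages[right_index] if right_index < len(pages) else None
    return left, right

# Example: spread 0 shows the first and second pages; a page turn advances to
# spread 1, which shows the third and fourth pages.
```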
[0045] In some examples, the computing device 310 can include instructions 346 that can be executed by a processor resource 312 to display the digital document on the wearable augmented reality display 304 to correspond within the boundaries of the first side and the second side of the binding mechanism 350. In some examples, the digital document can be separated into a plurality of pages and a first page of the plurality of pages can be displayed on the first side of the physical media 306-1 and a second page of the plurality of pages can be displayed on a second side of the physical media 306-2.
[0046] In some examples, the computing device 310 can include instructions 348 that can be executed by a processor resource 312 to alter the digital document in response to a page turn of the physical media 306-1, 306-2, wherein the altered digital document displays a third page of the digital document positioned within the boundary of the first side and a fourth page of the digital document positioned within the boundary of the second side. As described herein, a page of the physical media 306-1, 306-2 can be flipped to a different page and the digital document can be altered to display subsequent or previous pages based on the direction of the page flip movement. In some examples, the page flip movement can be along the binding mechanism 350. For example, the page flip movement can include moving the second side of the physical media 306-2 in the direction of the first side of the physical media 306-1 along the binding mechanism 350. In this way, the digital document can be updated to display a subsequent page of the digital document on a back side of the second side of the physical media 306-2. In this way, the physical media 306-1, 306-2 can be physically manipulated in the same way as a book to provide inputs to the computing device 310 and update the digital document in response to the provided inputs.
[0047] In some examples, the computing device 310 can include instructions that can be executed by a processor resource 312 to alter a position of the digital document in response to an alteration of a position of the physical media. In these examples, the altered position of the digital document maintains the digital document within the boundary of the physical media 306-1, 306-2. As described herein, the digital document can be affixed within the boundary of the physical media 306-1, 306-2 even when the physical media 306-1, 306-2 is repositioned. For example, a corner of the first side of the physical media 306-1 can be lifted and the digital document can be altered to be affixed within the boundary of the lifted first side of the physical media 306-1.
[0048] In some examples, the system 301 can include a stylus 352. In some examples, the stylus 352 can be utilized to physically interact with the physical media 306-1, 306-2. In these examples, the computing device 310 can include instructions to generate images on the digital document in response to the stylus 352 physically interacting with the physical media 306-1, 306-2. As described herein, the wearable device 302 can include a sensor or imaging device to track the location of the stylus and track when the stylus is physically interacting with the physical media 306-1, 306-2. In some examples, digital images can be generated in locations of the digital document corresponding to the physical interactions between the stylus and physical media 306-1, 306-2. In this way, a user can utilize the stylus to make digital images within the digital document by providing physical contact on the physical media 306-1, 306-2 with the stylus.
[0049] Figure 4 illustrates an example of a system 401 for generating augmented reality documents, in accordance with the present disclosure. In some examples, the system 401 can include the same or similar elements as system 301 as illustrated in Figure 3. For example, the system 401 can include a wearable device 402 that includes an augmented reality display 404. In some examples, the wearable device 402 can include a computing device and/or be communicatively coupled to a computing device. As used herein, communicatively coupled can include a communication path to communicate between devices.
[0050] In some examples, the system 401 can include a physical booklet that includes a first side of physical media 406-1 and a second side of physical media 406-2 that can be connected by a binding mechanism 450. In some examples, the system 401 can include a stylus 452. As described herein, the stylus 452 can include a pen or pencil shaped device that can physically interact with the physical media 406-1, 406-2. In some examples, the stylus 452 can include a tracking sensor that can be utilized to provide a location of the stylus 452 to the wearable device 402 and/or computing device communicatively coupled to the wearable device 402. In other examples, the wearable device 402 can include a tracking mechanism to track a location of the stylus 452 and/or physical interaction between the stylus 452 and the physical media 406-1, 406-2. In this way, the wearable device 402 can alter or update a digital document displayed on the augmented reality display 404 to reflect the interaction between the stylus 452 and the physical media 406-1, 406-2.
[0051] As described herein, the wearable device 402 can utilize the augmented reality display 404 to display digital content 454 such that the digital content 454 appears affixed to the physical media 406-1, 406-2. In some examples, the digital content 454 can be formatted to fit within the boundaries of the physical media 406-1, 406-2 to appear as though the digital content 454 is printed on the physical media 406-1, 406-2. As described herein, the wearable device 402 can include a sensor to determine the boundaries of the physical media 406-1, 406-2. In some examples, the wearable device 402 can determine the edges of the physical media 406-1, 406-2 to identify the boundaries of the physical media 406-1, 406-2.
[0052] In some examples, the digital content 454 can be digital content associated with an audio/visual presentation. As used herein, an audio/visual presentation can include a combination of visual content and audio content. For example, an audio/visual presentation can include a real time or recorded presentation of an audio description along with digital content 454. A specific example of an audio/visual presentation can include a speaker that is describing a power point presentation. In this specific example, the audio can be spoken words of the speaker and the digital content 454 can include the power point presentation. In this way, a user or wearer of the wearable device 402 can listen to the audio portion of the audio/visual presentation while viewing the digital content 454 of the audio/visual presentation as if the digital content 454 was printed on the physical media 406-1, 406-2.
[0053] In some examples, the stylus 452 can be utilized to insert writing 456 on or within the digital content 454 of the audio/visual presentation during the audio/visual presentation. That is, the wearable device 402 can be generating the audio through a speaker or similar device and displaying the digital content 454 to allow interaction with the digital content 454 during the audio/visual presentation. In some examples, the physical media 406-1, 406-2 can be manipulated in similar ways as described herein to make it appear as though the digital content 454 is affixed within the boundaries of the physical media 406-1, 406-2. In this way, the digital content 454 can be altered to different pages of the digital content 454 during the audio/visual presentation.
[0054] In some examples, the augmented reality display 404 can display a page or slide number of the digital content 454 that is currently being described by the audio of the audio/visual presentation. In this way, the stylus 452 can be utilized to generate writing 456, highlighting 458, or other marks within the digital content 454 of the audio/visual presentation at different times than what is currently being described by the audio of the audio/visual presentation. When the audio/visual presentation is a recorded presentation, the wearable device 402 and/or stylus 452 can be utilized to manipulate the audio/visual presentation through input selections as described herein. For example, inputs can be displayed by the augmented reality display 404 for pausing, stopping, rewinding, fast forwarding, among other features that can be selected by the stylus 452. In addition, the selectable inputs can be present on the wearable device 402 and/or the stylus 452 to manipulate the audio/visual presentation.
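For a recorded presentation, such selectable inputs might drive a small playback state machine; the control names and the seek step in the sketch below are assumptions for illustration.

```python
# Illustrative sketch: map a selected playback input to presentation state.
# Control names and the ten-second seek step are assumptions.
def apply_playback_input(state, control):
    position, playing = state["position_s"], state["playing"]
    if control == "pause":
        playing = False
    elif control == "play":
        playing = True
    elif control == "stop":
        playing, position = False, 0.0
    elif control == "rewind":
        position = max(0.0, position - 10.0)
    elif control == "fast_forward":
        position = position + 10.0
    return {"position_s": position, "playing": playing}
```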
[0055] In some examples, the wearable device 402 can determine dimensions 462-1, 462-2, 462-3 of the physical media 406-1, 406-2. For example, the wearable device 402 can determine a width dimension 462-1 of the physical media 406-2, a length dimension 462-2 of the physical media 406-1, 406-2, and/or a width dimension 462-3 of the physical media 406-1. In some examples, the width dimension 462-1 can be a distance between an edge of the physical media 406-2 and the binding mechanism 450. In some examples, the length dimension 462-2 of the physical media 406-2 can be a distance between a top edge of the physical media 406-2 and a bottom edge of the physical media 406-2 as illustrated in Figure 4. In some examples, the length dimension 462-2 of the physical media 406-2 can be the same as the length dimension of the physical media 406-1.
[0056] In some examples, the dimensions 462-1, 462-2, 462-3 of the physical media 406-1, 406-2 can be utilized to determine the boundaries of the physical media 406-1, 406-2. As described herein, the digital content 454 displayed on the augmented reality display 404 can be affixed to be within the dimensions 462-1, 462-2, 462-3 of the physical media 406-1, 406-2. In these examples, the digital content 454 can be positioned within selected margins within the dimensions 462-1, 462-2, 462-3 of the physical media 406-1, 406-2 such that the binding mechanism 450 does not interfere with the digital content 454. In this way, the digital content 454 can appear to be printed on the physical media 406-1, 406-2 when viewing the augmented reality display 404.
[0057] In some examples, the stylus 452 can be utilized to physically interact with the physical media 406-1, 406-2 to insert writing 456. In some examples, the writing 456 can be text or other images. For example, the stylus 452 can be utilized to physically interact with the physical media 406-1, 406-2 to make notes that appear within the digital document displayed by the augmented reality display 404. In this way, a user can utilize the stylus 452 with the physical media 406-1, 406-2 as if the digital content 454 was printed on the physical media 406-1, 406-2 and update the digital document displayed on the augmented reality display 404. In some examples, the physical interaction between the stylus 452 and the physical media 406-1, 406-2 may not make physical marks on the physical media 406-1, 406-2. For example, a physical stylus 452 may not deposit ink or carbon on the physical media 406-1, 406-2 when physically interacting with the physical media 406-1, 406-2.
[0058] In some examples, the stylus 452 can provide multiple functions that can be utilized as inputs to be implemented into the digital document or digital content 454 displayed on the augmented reality display 404. For example, the stylus 452 can include a first mode to generate writing 456 and a second mode to generate highlighting 458. In some examples, the different modes can be selected utilizing the augmented reality display 404. For example, the augmented reality display 404 can provide a menu of selections for a plurality of different modes. In some examples, the different modes can be selected utilizing the stylus 452. For example, the stylus 452 can be a digital stylus that can include selectable inputs that can be provided to the augmented reality display 404 through a communication path. In other examples, the augmented reality display 404 can display selectable inputs that correspond to the different modes on the physical media 406-1 , 406-2 and the stylus 452 can be utilized to select the different modes by physically interacting with the corresponding areas of physical media 406-1 , 406-2. Although writing 456 and highlighting 458 are illustrated in Figure 4, the stylus 452 could utilize a plurality of additional modes that can generate different inputs that can be implemented into the digital document or affect the digital content 454 in different ways.
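Mode selection of this kind might be modeled as a small dispatch from the active stylus mode to the kind of annotation written into the digital document; the sketch below is illustrative and the mode names are assumptions.

```python
# Illustrative sketch: dispatch a stylus stroke to an annotation type based on
# the currently selected mode. Mode names are assumptions for illustration.
def annotation_from_stroke(mode, stroke_points):
    if mode == "write":
        return {"type": "writing", "points": stroke_points}
    if mode == "highlight":
        return {"type": "highlight", "points": stroke_points, "opacity": 0.4}
    if mode == "erase":
        return {"type": "erase", "points": stroke_points}
    return {"type": "ignored", "points": stroke_points}
```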
[0059] The system 401 illustrates how a wearable device 402 with an augmented reality display 404 can allow a user to utilize traditional interactions with physical media 406-1, 406-2 while simultaneously updating or altering digital content 454 presented on the augmented reality display 404. As described herein, the digital content 454 can be affixed to a particular location within the dimensions 462-1, 462-2, 462-3 of the physical media 406-1, 406-2 to make it appear as though the digital content 454, writing 456, highlighting 458, and/or other edits are printed on the physical media 406-1, 406-2. This can provide the experience of utilizing a printed physical media, such as a piece of paper or book, while simultaneously updating a digital document, providing security, and utilizing user profile settings that can improve a user experience.
[0060] The figures herein follow a numbering convention in which the first digit corresponds to the drawing figure number and the remaining digits identify an element or component in the drawing. Elements shown in the various figures herein can be added, exchanged, and/or eliminated so as to provide a number of additional examples of the present disclosure. In addition, the proportion and the relative scale of the elements provided in the figures are intended to illustrate the examples of the present disclosure and should not be taken in a limiting sense. As used herein, the designator “N”, particularly with respect to reference numerals in the drawings, indicates that a number of the particular feature so designated can be included with examples of the present disclosure. The designators can represent the same or different numbers of the particular features. Further, as used herein, “a number of” an element and/or feature can refer to one or more of such elements and/or features.
[0061] In the foregoing detailed description of the present disclosure, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration how examples of the disclosure may be practiced. These examples are described in sufficient detail to enable those of ordinary skill in the art to practice the examples of this disclosure, and it is to be understood that other examples may be utilized and that process, electrical, and/or structural changes may be made without departing from the scope of the present disclosure.

Claims

What is claimed:
1. A device, comprising: an augmented reality display; and a computing device, comprising a non-transitory memory resource storing machine-readable instructions stored thereon that, when executed, cause a processor resource to: determine dimensions of a piece of media positioned in view of the augmented reality display; and generate an image of a digital document utilizing the determined dimensions on the augmented reality display such that the image is affixed with the piece of media.
2. The device of claim 1 , wherein the processor resource is to alter a position of the image with the piece of media in response to an alteration of a position of the piece of media.
3. The device of claim 1 , wherein the processor resource is to: separate the digital document into a plurality of pages based on the dimensions; determine a page flip of the piece of media; and update the image to a proximate page of the separated digital document.
4. The device of claim 1 , wherein the processor resource is to: identify a user of the augmented reality display; and alter the image based on a profile associated with the user.
5. The device of claim 4, wherein the processor resource is to display, alter, or block a portion of the image based on security settings of the profile associated with the user.
6. The device of claim 1, wherein the processor resource is to: track a stylus that is in contact with the piece of media; and alter the image displayed on the augmented reality display based on the contact of the stylus with the piece of media.
7. The device of claim 6, wherein the processor resource is to generate a permanent document of the altered image displayed on the augmented reality display.
8. A non-transitory memory resource storing machine-readable instructions stored thereon that, when executed, cause a processor resource to: determine boundaries of a physical media in a view path of an augmented reality display; generate a portion of a document that is presented on the physical media through the augmented reality display such that the portion of the document is fitted to the boundaries of the physical media; generate images on the document in response to a stylus interacting with the physical media; incorporate the generated images into the document; and alter the portion of the document that is presented on the physical media in response to a movement of the physical media.
9. The memory resource of claim 8, wherein the movement of the physical media includes a page flip movement.
10. The memory resource of claim 9, wherein the instructions to alter the portion of the document includes instructions to alter the document from a first page to a second page of the document in response to the page flip movement.
11 . The memory resource of claim 8, comprising instructions to: alter a viewing size of the portion of the document; and update the portion of the document to fit within the boundaries of the physical media utilizing the altered viewing size.
12. The memory resource of claim 8, comprising instructions to: alter a presentation of the document based on settings associated with a profile of a user; and update the document to fit within the boundaries of the physical media utilizing the altered presentation.
13. A system, comprising: a physical media that includes a binding mechanism; a wearable augmented reality display; and a computing device, comprising instructions to: determine a boundary of the physical media, wherein the boundary includes a first set of exterior edges of a first side of the binding mechanism and a second set of exterior edges of a second side of the binding mechanism; format a digital document to fit within the boundary of the first side and the second side of the binding mechanism, wherein a first page of the digital document is positioned within the boundary of the first side and a second page of the digital document is positioned within the boundary of the second side; display the digital document on the wearable augmented reality display to correspond within the boundaries of the first side and the second side of the binding mechanism; and alter the digital document in response to a page turn of the physical media, wherein the altered digital document displays a third page of the digital document positioned within the boundary of the first side and a fourth page of the digital document positioned within the boundary of the second side.
14. The system of claim 13, wherein the computing device includes instructions to alter a position of the digital document in response to an alteration of a position of the physical media, wherein the altered position of the digital document maintains the digital document within the boundary of the physical media.
15. The system of claim 13, comprising a stylus to physically interact with the physical media, wherein the computing device includes instructions to generate images on the digital document in response to the stylus physically interacting with the physical media.
PCT/US2020/054727 2020-10-08 2020-10-08 Augmented reality documents WO2022075990A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2020/054727 WO2022075990A1 (en) 2020-10-08 2020-10-08 Augmented reality documents


Publications (1)

Publication Number Publication Date
WO2022075990A1 true WO2022075990A1 (en) 2022-04-14

Family

ID=81127047

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2020/054727 WO2022075990A1 (en) 2020-10-08 2020-10-08 Augmented reality documents

Country Status (1)

Country Link
WO (1) WO2022075990A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110018903A1 (en) * 2004-08-03 2011-01-27 Silverbrook Research Pty Ltd Augmented reality device for presenting virtual imagery registered to a viewed surface
US8687021B2 (en) * 2007-12-28 2014-04-01 Microsoft Corporation Augmented reality and filtering
US20150378535A1 (en) * 2010-06-01 2015-12-31 Intel Corporation Apparatus and method for digital content navigation
US20180081448A1 (en) * 2015-04-03 2018-03-22 Korea Advanced Institute Of Science And Technology Augmented-reality-based interactive authoring-service-providing system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20956889; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20956889; Country of ref document: EP; Kind code of ref document: A1)