US20100011281A1 - Systems and methods for annotating pages of a 3D electronic document
- Publication number
- US20100011281A1 (application US 12/505,262)
- Authority
- US
- United States
- Prior art keywords
- page
- annotation
- dimensional
- area
- layer
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/20—Perspective computation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/10—Text processing
- G06F40/166—Editing, e.g. inserting or deleting
- G06F40/169—Annotation, e.g. comment data or footnotes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/10—Text processing
- G06F40/166—Editing, e.g. inserting or deleting
- G06F40/171—Editing, e.g. inserting or deleting by use of digital ink
Definitions
- Page annotation of documents including books, magazines, journals, textbooks, photo albums, maps, periodicals, or the like, is a common technique performed by readers and viewers of these documents.
- Page annotation is highly desirable to the readers and the viewers because it provides the readers and the viewers with the ability to mark the documents with text notes, handwritten notes, bookmarks, highlights and/or the like, to, e.g., facilitate later review of the same material by the annotator or another reader.
- Schilit, Price, and Golovchinsky describe a research prototype called XLibris® used to display two-dimensional electronic document pages and support free-form annotations, which runs on a tablet computer and accepts pen input.
- the user can scribble notes, draw figures, and highlight text.
- the user also has the option of changing the color of the pen and/or selecting between a wide pen and a narrow pen.
- PCT Publication WO 01/42980 describes an annotation tool for annotating two-dimensional electronic documents.
- PCT Publication WO 01/42980 describes that “the annotations are stored separately from the viewed document pages but are correlated with the pages such that when a previously annotated page is revisited, annotations related to that page are retrieved and displayed on top of the page as an ‘ink’ layer.”
- the user can highlight certain parts of the two-dimensional document in translucent colors or mark opaque annotations on the page, in a way very similar to XLibris.
- the “pixel blending function blends pixels from a document page with corresponding pixels from an annotation or ‘ink’ layer mapped to that document page, and generates a blended pixel image that is displayed as an annotated document page.”
- PCT Publication No. WO 02/01339 also describes an annotation tool for annotating two-dimensional electronic documents, and describes a technique which “analyzes the ink for each annotated pixel and renders the color and brightness of each pixel based on the original pixel color and the added annotation color so as to appear as physical ink would typically appear if similarly applied to physical paper.”
- Zinio Reader® developed by Zinio Systems Inc., located at http://www.zinio.com, and Adobe Acrobat® are two examples of annotation tools.
- Adobe Acrobat® includes one example of a two-dimensional electronic annotation tool that allows selected portions of the electronic document to be highlighted. However, if the two-dimensional electronic highlighter annotation tool is applied to a three-dimensional electronic document, difficulties arise in defining the highlight area and in visualizing the highlighting ink.
- Hanrahan and Haeberli describe a three-dimensional electronic paint program that uses a technique to paint surfaces of three-dimensional electronic objects in “Direct WYSIWYG Painting and Texturing on 3D Shapes,” Proceedings of the ACM SIGGRAPH'90 Conference, pages 215-223.
- the user manipulates the parameters, e.g., diffuse color, specular color, and surface roughness, used to shade the surfaces of the three-dimensional object.
- the paint brush strokes specified by the user are transformed from the screen space to the texture space of the object to update the texture data. As a result, the appearance of the 3D surfaces is modified.
- Exemplary embodiments provide systems and methods that allow pages of three-dimensional electronic documents to be annotated in a manner that more accurately simulates annotating pages of an actual physical three-dimensional document.
- Exemplary embodiments provide systems and methods that allow pages of three-dimensional electronic documents to be annotated without producing noticeable artifacts.
- Exemplary embodiments provide systems and methods that provide a framework to support highlighting annotations, free-form annotations, text annotations and/or the like on one or more pages.
- Exemplary embodiments provide systems and methods that allow the user to annotate, e.g., highlight, a figure, a table, multiple lines of text and/or the like on one or more pages.
- Exemplary embodiments provide systems and methods that allow the reader or viewer to specify an area as the annotated area.
- Exemplary embodiments provide systems and methods that transform an annotated area from the coordinate system of the computer screen to the local coordinate system of the page, whereupon the annotated area is transformed from the local coordinate system of the page to a coordinate system of a texture corresponding to the page, and the resulting coordinates are stored as part of the annotation data.
- Exemplary embodiments provide systems and methods that use annotation data to display annotations on the page as the annotated area is gradually defined, and to recreate the annotation from the stored annotation data.
- Exemplary embodiments provide systems and methods that superimpose or place one or more transparent polylines over the page area which is to be annotated.
- Exemplary embodiments provide systems and methods that superimpose or place one or more transparent geometric shapes, e.g., polygons or polylines over the page area which is to be annotated.
- Exemplary embodiments provide systems and methods that re-evaluate the color of vertices as a function of vertex color, annotation color and/or ink density.
- Exemplary embodiments provide systems and methods that modify a texture pasted on a page geometry.
- Exemplary embodiments provide systems and methods that generate a new page texture based on the original page texture, annotation color and ink density.
- a reader, viewer, annotator, or user can annotate more than one portion of a page and/or more than one page of the three-dimensional document without turning the page.
- the annotation tools for three-dimensional electronic documents simulate user interaction with an actual physical three-dimensional document, e.g., a physical book, by providing users with the ability to annotate the three-dimensional electronic document in an intuitive manner.
- the stages include, but are not limited to, the specifying stage and the displaying stage.
- in the specifying stage, the user decides where to place an annotation and what annotation, e.g., a red highlight, a blue arrow or a free-form line stroke, to place on the electronic document.
- in the displaying stage, the annotation system displays the annotation in a visual format based on the data captured during the specifying stage.
- an annotation instrument, such as a mouse or stylus, is used as an electronic annotation tool to annotate the three-dimensional electronic document.
- a user defines a page area of the three-dimensional electronic document to be annotated in the specifying step.
- the annotating can be implemented in various ways including, but not limited to, displaying the annotations with a transparent polygon, vertex coloring, texture coloring, or a hybrid technique.
- a method for annotating a page of an electronic document includes selecting a page of the electronic document, the page having a first layer; providing an annotation tool to annotate a specified area of the selected page; specifying the area of the page to be annotated by the annotation tool; annotating a second layer, the second layer corresponding to the page, by marking the specified area of the page with the annotation tool; displaying an annotation corresponding to the specified area, wherein the annotation is displayed in a third layer other than the second layer and the first layer.
- the annotations are displayed by superimposing or placing a layer with an annotation over the page area that is specified to be annotated.
- the annotations are displayed using a texture coloring technique that modifies the texture pasted on the electronic page geometry.
- a part of an annotation may be represented by one of the transparent polygon (which may include, polyline), vertex coloring, and texture coloring annotation techniques during a display period, and another part of the annotation may be represented by a different annotation technique during the same display period.
- the annotations can be displayed in three dimensions to convey depth and/or a different shape than the underlying page.
- FIG. 1 illustrates an exemplary embodiment of an annotated page of a three-dimensional electronic document, which has been annotated using an electronic annotation tool to define an annotation area;
- FIG. 2 illustrates an exemplary embodiment of an annotated page of a three-dimensional electronic document displaying an annotation area using a transparent polygon technique
- FIG. 3 illustrates an exemplary embodiment of an annotated page of a three-dimensional electronic document displaying three annotation areas using a transparent polygon technique
- FIG. 4 illustrates an exemplary embodiment of an annotated page of a three-dimensional electronic document displaying an annotation area using a vertex coloring technique
- FIG. 5 illustrates an exemplary embodiment of an annotated page of a three-dimensional electronic document displaying an annotation area using a texture coloring technique
- FIG. 6 illustrates a flowchart outlining an exemplary embodiment of a method for annotating pages of a three-dimensional electronic document
- FIG. 7 illustrates a flowchart outlining an exemplary embodiment of a method for displaying annotations in the page area of a three-dimensional electronic document that has been annotated
- FIG. 8 illustrates an exemplary embodiment of an annotated page of a three-dimensional electronic document displaying an annotation area using a texture coloring technique
- FIG. 9 illustrates an exemplary embodiment of an annotated page of a three-dimensional electronic document displaying an annotation area using a hybrid technique
- FIG. 10 illustrates an exemplary embodiment of an annotated page of a three-dimensional electronic document displaying an annotation area using a hybrid technique
- FIG. 11 illustrates a cross-section of the annotated three-dimensional page shown in FIG. 10 displaying an annotation area using a hybrid technique
- FIG. 12 illustrates an exemplary embodiment of an annotated page of a three-dimensional electronic document displaying an annotation area using a hybrid technique
- FIG. 13 illustrates a flowchart outlining an exemplary embodiment of a method for annotating pages of three-dimensional electronic documents using a hybrid technique
- FIG. 14 is a block diagram outlining one exemplary embodiment of a system for annotating pages of three-dimensional electronic documents.
- Producing an annotation on an electronic document may include a specifying stage and a displaying stage.
- a reader, viewer, or user indicates to the system an area of a page at which to place an annotation, and instructs the system as to the type of annotation (e.g., a red highlight, a blue arrow, or a free-form line) to place at that area.
- the reader may specify an annotation using an input device such as, for example, a mouse or stylus.
- the system may capture the specification data, during the specification stage, as it is being input.
- the system displays the annotation in a visual format based on the data captured during the specifying stage.
- FIG. 1 illustrates a close-up view of an exemplary embodiment of a three-dimensional electronic document (e-book) 100 containing multiple document pages, such as the depicted document page 110 .
- the document page 110 is annotated with an annotation tool, e.g., a mouse (not shown), that marks a page area 120 of the document page 110 with a specific highlight color.
- annotation tool refers to any device or combination of devices that allows a user to produce a visual enhancement, such as underlining, coloring, text, graphics or the like, on an electronic document.
- for example, a computer mouse or keyboard that manipulates a cursor, combined with circuitry and/or software and/or other hardware, such as physical buttons and/or on-screen menus, can serve as an annotation tool.
- An electronic highlighter is one type of annotation tool. Operations of a highlighter will be described in examples below.
- “ink,” of course, refers not to liquid ink, but rather to virtual (e.g., electronically represented) ink used to produce a visual enhancement, such as underlining, coloring or the like, on an electronic page.
- the system may allow the user to modify the annotation color and/or the ink density if desired. For example, if the user wants to modify the highlighter color, the user may select the color from a drop-down menu. Then, the color of the highlighter is modified and new color data is stored when the highlighter is used on an electronic page. If the user wants to modify the ink density, the user may select the ink density from a drop-down menu. Then, the ink density of the highlighter is modified and new ink density data is stored when the highlighter is used on an electronic page.
- the user may annotate, e.g., highlight, an area of the page by marking a portion of the document page 110 with a highlighter.
- the portion to be highlighted is defined in this example as a rectangular page area 120 based on the user input.
- the page area 120 may be annotated in a specified color or ink density.
- to mark an area, e.g., the page area 120 of the document page 110 , the user inputs a command, for example, by operating a mouse to move a cursor to a starting position 122 and pressing a mouse button to anchor a corner of the mark at the starting position 122 .
- the user can then drag the mouse to move the cursor to select the desired area 120 to be marked.
- the size of the area 120 changes and the highlight area is updated dynamically, providing visual feedback to the user of the area to be highlighted.
- the user finalizes the highlighted rectangular page area 120 by releasing the mouse button to anchor another corner of the rectangular page area at an end position 124 .
- the mouse communicates with the system to indicate where the mouse action occurred on the display screen.
- the screen coordinates corresponding to the mouse action are mapped to the local coordinate system of the page to determine where on the three-dimensional document page the user pointed at when the mouse action occurred.
- the area specified by the user on the screen is transformed from the screen coordinate system to the world coordinate system, and further to the local coordinate system of the page. If necessary, the area is further transformed from the local coordinate system of the page to a texture coordinate system of the page. The resulting area is then stored as part of the highlight data.
- the world coordinate system is the principal coordinate system of a three-dimensional workspace and is independent of viewpoint and display. Individual objects, such as a page, are each defined in their own local coordinate systems within the world coordinate system. The local coordinate systems are each independent from each other and the world coordinate system. Light sources, the viewer, and virtual cameras are positioned in the world coordinate system. In some embodiments the viewer may be the virtual camera.
- the screen coordinate system may correspond to the coordinate system of a virtual camera.
- the location of the camera may correspond to the origin of the screen coordinate system.
- a reader's eye reviewing the screen corresponds to the origin of a screen coordinate system.
- the texture coordinate system defines a texture and is independent from the world coordinate system, screen coordinate system and the local coordinate system.
- Texture mapping is a process that maps a texture (for example, a high definition image, such as a picture or text) onto an object (e.g., a page).
- the shape of the object, e.g., a page, is represented with a polygon mesh.
- the polygon mesh is, generally, a set of vertices connected by edges.
- a vertex of the polygon mesh has a 3D coordinate (x,y,z) which defines where it is located in the local coordinate system of the page.
- Each vertex also has a 2D texture coordinate (u,v) for the purpose of texture mapping.
- the lower left corner of the page may be located at (0,0,0) of the local coordinate system and have a texture coordinate of (0,0).
- the upper right corner of the page may be located at (pageWidth,pageHeight,0) and have a texture coordinate of (1,1).
- the texture mapping may require some compressing or stretching of the texture.
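- A minimal sketch of this corner-to-texture mapping, assuming a rectangular page whose lower-left corner sits at the local origin, is shown below; the function name and the page dimensions are illustrative, not taken from the patent.

```python
def local_to_uv(x, y, page_width, page_height):
    """Map a page-local (x, y) position to (u, v) texture coordinates.
    Lower-left corner (0, 0, 0) -> (0, 0); upper-right corner
    (page_width, page_height, 0) -> (1, 1), as in the example above."""
    return (x / page_width, y / page_height)

print(local_to_uv(0.0, 0.0, 8.5, 11.0))    # (0.0, 0.0)
print(local_to_uv(8.5, 11.0, 8.5, 11.0))   # (1.0, 1.0)
```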
- scan conversion determines where the point is located in the local coordinate system, and determines the point's texture coordinate.
- the texture coordinate determines the point's texture color.
- Raycasting is one example of a technique that is used in mapping a screen coordinate to a local coordinate of the page.
- the raycasting technique includes shooting a ray from a virtual camera (corresponding to the eye of a user viewing the screen) through the screen coordinate position of the mouse (x_m, y_m) towards the three-dimensional document.
- the intersection position, i.e., the coordinate point (x_w, y_w, z_w), of the ray and the page of the three-dimensional document is then computed.
- this point is then mapped from the world coordinate system to the local coordinate system of the page. If necessary, the point represented in the local coordinate system of the page is further mapped to the texture coordinate system of the page.
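- Below is a minimal sketch of the raycasting mapping described above, assuming a flat page lying in the z = 0 plane of its local coordinate system; the camera position, matrices, and page dimensions are illustrative assumptions, not values from the patent.

```python
import numpy as np

def raycast_screen_to_page(camera_pos, ray_dir, page_to_world, page_width, page_height):
    """Shoot a ray from the virtual camera toward the page and return the hit
    point in world coordinates, page-local coordinates, and (u, v) texture
    coordinates, or None if the ray misses the page plane."""
    ray_dir = np.asarray(ray_dir, dtype=float)
    ray_dir /= np.linalg.norm(ray_dir)

    # Express the ray in the page's local coordinate system.
    world_to_page = np.linalg.inv(page_to_world)
    origin_local = (world_to_page @ np.append(camera_pos, 1.0))[:3]
    dir_local = (world_to_page @ np.append(ray_dir, 0.0))[:3]

    if abs(dir_local[2]) < 1e-9:        # ray parallel to the page plane: no hit
        return None
    t = -origin_local[2] / dir_local[2]
    if t <= 0:                          # page plane is behind the camera
        return None

    hit_local = origin_local + t * dir_local
    hit_world = (page_to_world @ np.append(hit_local, 1.0))[:3]
    u, v = hit_local[0] / page_width, hit_local[1] / page_height
    return hit_world, hit_local, (u, v)

# Illustrative use: page at the world origin, camera looking straight down the -z axis.
page_to_world = np.eye(4)
print(raycast_screen_to_page([4.0, 5.0, 10.0], [0.0, 0.0, -1.0], page_to_world, 8.5, 11.0))
```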
- the annotation specification technique described above may be applied to document pages facing the user straight on and/or at oblique angles; the technique may be applied to document pages represented as flat and/or curved three-dimensional surfaces; although the preceding description mentioned a rectangular area to be annotated, an area of any shape, including circles, oblong or oval shapes, and polygons with any number of sides, can be specified by the user in a similar manner; and, rather than a mouse, other input devices (e.g., a stylus or electronic pen) can be employed to specify the page area to be annotated.
- factors other than the location of the annotation area may be used to influence the annotation effect. For example, “covering up” the original contents of the page so that they cannot be viewed after annotating, e.g., highlighting, may be desirable, or allowing the original contents to still be viewable after highlighting may be desirable.
- the color of the highlight and the ink density of the highlight used will impact whether the original contents can be viewed. If the original contents of the page are to be viewed after highlighting, the original contents of the page that are located in the marked page area may be blended with the highlight color to produce a highlighting effect.
- the ink density of the highlight determines how much of the highlight color appears in the blending result.
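- The blending can be sketched as below; the linear blend is an illustrative assumption consistent with the description (the ink density controls how much of the highlight color appears), not the patent's exact formula.

```python
def blend_pixel(page_rgb, highlight_rgb, ink_density):
    """Blend one page pixel with the highlight color.
    ink_density = 0.0 leaves the original content fully visible;
    ink_density = 1.0 covers it completely with the highlight color."""
    return tuple((1.0 - ink_density) * p + ink_density * h
                 for p, h in zip(page_rgb, highlight_rgb))

# White paper with a 40%-density yellow highlighter: the text underneath remains legible.
print(blend_pixel((255, 255, 255), (255, 255, 0), 0.4))
```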
- a user interface (not shown) can be provided that allows the user to change the color and/or ink density of the highlight.
- annotation data pertaining to the annotated area including the area boundary, the color, the ink density and/or the like is stored in the system for annotating three-dimensional documents and correlated with the corresponding page.
- the system for annotating three-dimensional documents is intended to display the annotation on the page as the annotation area is gradually defined. Additionally, whenever the page is revisited, e.g., due to backward and/or forward page turning, the annotation is recreated from the stored annotation data and displayed on the corresponding page.
- FIG. 2 illustrates an embodiment of displaying an annotation 200 , e.g., highlight, by using a transparent and/or translucent polygon 220 (hereafter, referred to as “transparent polygon” for convenience) and superimposing or placing the transparent polygon 220 over the page area 230 , of a three-dimensional page 210 , that is to be highlighted.
- the highlight may be created by a highlighting specification technique described previously.
- the location and size of the transparent polygon 220 is preferably equal to the location and size of the page area 230 , which is determined by the stored highlight data.
- the color of the polygon 220 reflects the color of the highlighter, and the opacity of the polygon 220 models the ink density of the highlighter.
- the transparent polygon 220 is offset/elevated (for example, along a “z” axis, if the plane corresponding to the page is defined by “x” and “y” axes) from the page 210 towards the front side of the page by a distance D 1 .
- the minimum offset distance may depend, e.g., on the Z buffer of the graphics hardware.
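- A minimal sketch of this display technique is given below, assuming a flat page whose local z axis is its normal; the data layout, the rectangular area, and the offset value are illustrative assumptions.

```python
def make_highlight_quad(area_min, area_max, highlight_rgb, ink_density, offset=0.002):
    """Build a translucent quad covering the annotated page area, lifted off the
    page by `offset` along the page normal so it renders in front of the page
    without Z fighting. area_min/area_max are (x, y) corners in page-local
    coordinates; the quad's opacity models the ink density of the highlighter."""
    (x0, y0), (x1, y1) = area_min, area_max
    vertices = [(x0, y0, offset), (x1, y0, offset), (x1, y1, offset), (x0, y1, offset)]
    return {"vertices": vertices, "rgba": (*highlight_rgb, ink_density)}

print(make_highlight_quad((1.0, 2.0), (6.0, 2.5), (1.0, 1.0, 0.0), 0.4))
```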
- FIG. 3 illustrates an exemplary embodiment of the step of displaying annotations 300 by superimposing multiple transparent polygons 320 , 330 , 340 over three page areas that are to be annotated 350 , 360 , 370 of a three-dimensional page 310 .
- when the transparent polygons 320 , 330 , 340 overlap a common area 380 of the three-dimensional page 310 , the transparent polygons 320 , 330 , 340 are offset from each other to avoid the Z fighting problem.
- algorithms for determining the offset for new polygons created by additional annotations of the three-dimensional page 310 become increasingly complex.
- FIG. 4 illustrates an exemplary embodiment of displaying an annotation, e.g., highlights on a page 400 , using vertex coloring to color those vertices within the three-dimensional page area 410 that are to be annotated.
- the vertex coloring technique has advantages over the transparent polygon technique because it can avoid some of the system complications associated with the transparent polygon technique when surfaces which are to be viewed as curved are annotated. For example, in the transparent polygon technique, if a page is curved, e.g., when turned, after being highlighted, the transparent polygons associated with the highlight must also be curved.
- as in the transparent polygon technique, and as illustrated in FIG. 4 , the polygonal boundary specified by the user determines the page area 410 that will be annotated.
- the polygonal area 410 is additionally transformed from the local coordinate system of the page to the coordinate system of the texture corresponding to the page. This transformation takes place during the specification stage.
- the resulting texture coordinates are stored as part of the annotation data.
- the page 400 can be represented as a geometric object.
- the page geometry is represented as having a polygon mesh.
- the polygon mesh is a computer graphics technique which approximately represents a three-dimensional object, such as the three-dimensional page, using vertices 415 connected by edges 420 .
- the vertices 415 lying inside the user-specified polygonal area 410 can be identified.
- the colors of the identified vertices 415 are re-evaluated as a function of the annotation color and the ink density. Re-evaluation may change the colors of the vertices.
- the colors of the vertices inside and outside the polygonal area 410 of the polygon mesh may subsequently be interpolated and then blended with the page texture corresponding to the page to produce an annotating effect.
- the vertex coloring technique may produce noticeable artifacts, e.g., at the annotation boundary, resulting from the bi-linear interpolation of vertex colors occurring from the scan conversion of the polygon mesh.
- although a finer polygon mesh will, to a certain degree, ameliorate the annotation boundary artifacts, a finer mesh requires more vertices to be processed and somewhat impacts the processing speed of the system.
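- A minimal sketch of the vertex coloring step, assuming a rectangular annotated area and a linear blend of the original vertex color with the annotation color; the mesh, names, and numbers are illustrative. The GPU then interpolates vertex colors across each face, which is what produces the soft boundary discussed above.

```python
def recolor_vertices(vertices, colors, area_min, area_max, annot_rgb, ink_density):
    """Re-evaluate the colors of mesh vertices lying inside the annotated area
    as a function of the original vertex color, the annotation color, and the
    ink density; vertices outside the area keep their original colors."""
    (x0, y0), (x1, y1) = area_min, area_max
    new_colors = []
    for (x, y, _z), rgb in zip(vertices, colors):
        if x0 <= x <= x1 and y0 <= y <= y1:
            rgb = tuple((1.0 - ink_density) * c + ink_density * a
                        for c, a in zip(rgb, annot_rgb))
        new_colors.append(rgb)
    return new_colors

mesh_vertices = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]   # a tiny illustrative mesh
mesh_colors = [(1.0, 1.0, 1.0)] * 4
print(recolor_vertices(mesh_vertices, mesh_colors, (0.5, 0.5), (1.5, 1.5), (1.0, 1.0, 0.0), 0.5))
```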
- FIG. 5 illustrates an exemplary embodiment of displaying an annotation, e.g., highlight, using the texture coloring technique.
- the page texture is modified and/or updated to display the annotated data.
- the page texture is a layer corresponding to the page and used to represent high resolution images.
- the page is represented by a polygon mesh. Each vertex of the page has a texture coordinate in the texture coordinate system of the page.
- the texture coloring technique employs many of the same steps used by the transparent polygon and vertex coloring techniques. Like the vertex coloring technique, after determining the boundary of the polygon area 510 , e.g., the area to be highlighted, in the specification stage, the polygonal area 510 is additionally transformed from the local coordinate system of the page (i.e., polygon mesh) to the coordinate system of a texture corresponding to the page. The resulting texture coordinates are stored as part of the highlight data.
- the page texture and the associated highlight data are separate from the page during annotation specifying; however, after an annotation is specified, the associated annotation is blended into the page texture, which is then pasted onto the page geometry.
- FIG. 5 illustrates the page texture pasted on the page 500 (i.e., page geometry). The entire image shown on the page 500 corresponds to the texture.
- a blending step may also be performed to achieve the annotation effect.
- the blending step includes computing the color of each texture pixel within the annotated polygonal area 510 .
- the blending operation produces a new texture for the page geometry.
- the new texture is then applied to the page 500 by pasting the new texture onto the page geometry.
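- A minimal sketch of the texture coloring step is shown below, assuming the annotated area is a rectangle given in texture coordinates and that texels are blended linearly with the annotation color; the array shapes and names are illustrative assumptions.

```python
import numpy as np

def annotate_texture(texture, uv_min, uv_max, annot_rgb, ink_density):
    """Blend the texels inside the annotated area with the annotation color and
    return a new texture to be pasted onto the page geometry; the original page
    texture is left untouched. texture: H x W x 3 float array with values in [0, 1]."""
    h, w, _ = texture.shape
    x0, x1 = int(uv_min[0] * w), int(uv_max[0] * w)
    y0, y1 = int(uv_min[1] * h), int(uv_max[1] * h)
    new_texture = texture.copy()
    region = new_texture[y0:y1, x0:x1]
    new_texture[y0:y1, x0:x1] = (1.0 - ink_density) * region + ink_density * np.asarray(annot_rgb)
    return new_texture

page_texture = np.ones((1024, 768, 3))                # a blank white page, illustrative size
highlighted = annotate_texture(page_texture, (0.1, 0.40), (0.9, 0.45), (1.0, 1.0, 0.0), 0.4)
print(highlighted[430, 384])                          # a texel inside the highlight band
```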
- the texture coloring technique produces relatively well-defined boundaries for the annotated areas because, as discussed above, the page texture generally has a higher resolution than the polygon mesh. Therefore, the bi-linearly interpolated texture coordinates are generally more visually appealing than the result of the bi-linearly interpolated vertex colors.
- FIG. 6 is a flow chart illustrating an outline of an exemplary embodiment of a method for annotating three-dimensional documents. As shown in FIG. 6 , operation of the method begins in step S 100 , and continues to step S 105 , where a three-dimensional document is obtained. Then, in step S 110 , a three-dimensional page of a three-dimensional document to be annotated is turned to based, e.g., on a user input, and displayed. A user may use a drop-down menu, a mouse button, or other input device to select a page to be turned to.
- a page area of the three-dimensional page to be annotated is specified by marking an annotation on the screen.
- the annotation data, e.g., data corresponding to the movement of an annotation input device (e.g., a pen), the polygon boundary, annotation color, ink density, etc., is correlated with the page to be annotated.
- the correlated annotation data is then stored with other data (e.g., the page texture) of the page to be annotated.
- an annotation is displayed based on the annotation data.
- in step S 125 , it is determined whether the annotation of the current page is completed. This determination may be based on, e.g., a clock or whether the page has been turned. If so, operation continues to step S 130 . If not, operation returns to step S 115 .
- in step S 130 , it is determined whether the annotation of the current document is completed. This determination may be based on, e.g., a clock or whether the document has been closed. If so, operation continues to step S 135 where the method ends. If not, operation returns to step S 110 .
- annotation is not necessarily performed as a single activity. More likely, annotations are added as the user is reading through a document. For example, a user may read a page and find an area of the page to be annotated, e.g., a few interesting sentences on the page. The user can then mark the sentences with an annotation tool, e.g., a highlighter, and then continue to read through the document. In other words, the user can perform other activities between annotations such as reading, turning pages and/or the like.
- FIG. 7 is a flow chart outlining an exemplary embodiment of a method for displaying an annotation.
- the method of FIG. 7 may be considered a detailed method of performing step S 120 of FIG. 6 .
- operation of the method of FIG. 7 begins in step S 120 , and continues to step S 1201 , where it is determined whether the annotation is to be displayed with a transparent polygon technique. If so, operation continues to step S 1202 .
- in step S 1202 , a transparent polygon is superimposed over the page area to be highlighted. Operation then jumps to step S 1206 and returns to step S 125 of FIG. 6 where a determination is made whether the current document's page annotations are complete. If, in step S 1201 , the transparent polygon technique is not to be employed, operation jumps to step S 1203 .
- in step S 1203 , it is determined whether the annotation is to be displayed with a vertex coloring technique. If so, operation continues to step S 1204 . Otherwise, operation jumps to step S 1205 .
- in step S 1204 , the vertex coloring technique is used: all vertices within the page area to be annotated are colored. Operation then jumps to step S 1206 .
- in step S 1205 , those texture pixels within the page area to be annotated are modified using the texture coloring technique. Operation then continues to step S 1206 and returns to step S 125 of FIG. 6 where a determination is made whether the current document's page annotations are complete.
- the determinations in steps S 1201 and S 1203 may be based on, e.g., a user input, or be preprogrammed into the system and based on variables such as the size of the page and/or document, or the processing speed and memory of the system.
- the texture coloring technique has advantages over the transparent polygon and vertex coloring techniques and, as such, the texture coloring technique is a preferred method.
- when a reader is interactively involved with a document, the reader generally desires quick frame rates, i.e., quick updates of the screen information, as the information is input.
- the reader may be interested in quick frame rates during two scenarios, 1) when an annotation is being specified by a reader, and 2) when a page on which the annotation was specified is re-visited (e.g., due to page turning).
- the blending operation of the texture coloring technique needs to be carried out quickly when a reader is interactively involved with a document and a high processing speed/quick frame rate is desired.
- the computational speed of the blending operation depends on the resolution of the page texture to be pasted on the page, along with system constraints, such as, for example, available memory and/or processing speed. For example, as the resolution increases, the number of pixels to be processed by the blending operation also increases. As the number of pixels to be processed increases, the potential for a bottleneck and a failure of the blending operation to be performed quickly can develop.
- multi-resolution texture features, such as those used when turning a page of a three-dimensional document, can be used to lower the resolution of the texture for a specific time and, thus, reduce the possibility of a bottleneck forming in the system.
- the blending operation can be performed using a low-resolution texture for the page. This significantly reduces the overhead of the blending operation because far fewer pixels must be processed than for a high-resolution texture.
- the blending operation can use the higher resolution page texture.
- the system may, for example, initially display the low-resolution texture, and then automatically change to display a high-resolution texture after the page turning animation is complete.
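- One way to sketch the multi-resolution idea is shown below, assuming a simple block-average downsample and a selection rule keyed to the page-turn animation; both are illustrative assumptions rather than the patent's implementation.

```python
import numpy as np

def downsample(texture, factor):
    """Build a lower-resolution copy by averaging factor x factor texel blocks."""
    h, w, c = texture.shape
    return texture[:h - h % factor, :w - w % factor].reshape(
        h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))

def texture_for_blending(high_res, low_res, page_turn_in_progress):
    """Blend annotations into the low-resolution texture while the page is
    animating; switch back to the high-resolution texture once the turn is done."""
    return low_res if page_turn_in_progress else high_res

high = np.ones((1024, 768, 3))
low = downsample(high, 4)                      # 256 x 192: far fewer texels to blend
print(texture_for_blending(high, low, page_turn_in_progress=True).shape)   # (256, 192, 3)
```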
- FIG. 8 illustrates an exemplary method for annotating three-dimensional documents.
- pen-based annotations 610 , i.e., annotations made with a stylus, may be used as a method for annotating three-dimensional documents.
- pen-based annotations allow the user to sketch out free-form annotations.
- Handwritten annotations 610 can be decomposed and approximated with a set of strokes where each stroke has a trajectory, width, color, and ink density.
- the trajectory of a stroke is captured as the user annotates the page 600 by sliding the stylus (not shown) from one location of the three-dimensional page 600 to the next location of the three-dimensional page 600 .
- the width, color, and ink density of a stroke can be modified by the user with a user interface (not shown).
- the stroke trajectory, coupled with stroke width generally defines the footprint of the stroke on the display screen. This footprint can be approximately represented as a polygon or other shape. For convenience, “polygon” is used hereafter as an example.
- the annotation, i.e., the polygon, is transformed from the screen coordinate system to the local coordinate system of the three-dimensional page 600 .
- the polygon is further transformed from the local coordinate system of the page 600 to the coordinate system of the texture corresponding to the page 600 .
- the annotation data, e.g., the polygon boundary, the color, and the ink density, is then stored and correlated with the respective page.
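- The footprint construction can be sketched as below: each trajectory point is offset by half the stroke width on either side, perpendicular to the local stroke direction, and the two offset chains are joined into one closed polygon. This particular construction is an assumption for illustration; the patent does not prescribe one.

```python
import numpy as np

def stroke_footprint(trajectory, width):
    """Approximate the on-screen footprint of a stroke as a closed polygon.
    trajectory: list of (x, y) screen points along the stroke; width: stroke width."""
    pts = np.asarray(trajectory, dtype=float)
    left, right = [], []
    for i, p in enumerate(pts):
        a = pts[max(i - 1, 0)]
        b = pts[min(i + 1, len(pts) - 1)]
        d = b - a
        d = d / (np.linalg.norm(d) or 1.0)        # local stroke direction
        normal = np.array([-d[1], d[0]])           # perpendicular to the direction
        left.append(p + normal * width / 2.0)
        right.append(p - normal * width / 2.0)
    return left + right[::-1]                      # one closed footprint polygon

print(stroke_footprint([(0, 0), (10, 0), (20, 5)], width=2.0))
```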
- the texture coloring technique is the preferred display technique.
- annotation operations may be modified to support other types of annotations for three-dimensional documents.
- the user can indicate where on the three-dimensional page to place a text annotation. This location is transformed from the screen coordinate system to the coordinate system of the texture corresponding to the page.
- the user can input the content of the text annotation.
- the content of the text annotation can be specified in a handwritten manner using the stylus.
- the content of the text annotation can be rendered as an ink layer and blended with the original page texture to create a new texture, which is pasted on the page geometry.
- a transparent or opaque text annotation can be shown on top of the original content of the three-dimensional page.
- FIG. 8 illustrates an exemplary embodiment of an annotation of a document page in a virtual three-dimensional document.
- Virtual three-dimensional document systems often have a relatively low frame rate speed and display a page with a texture at a relatively high resolution.
- “PARC TAP” was written using a stylus or pen in a top margin of the document page and captured on a high-end desktop, such as a Dell Precision 340 equipped with an ATI 9700 PRO graphics card.
- the words “PARC TAP” were deliberately written slowly; however, the words “PARC TAP” were not captured adequately enough by the system for the words “PARC TAP” to be considered by a reader as naturally written.
- the curves on the letters “P”, “C” and “R” were not captured as curves (to the extent that a “curve” can be written by a reader) but, instead, captured as individual line segments.
- the low frame rates of the three-dimensional document system used in FIG. 8 limit the number of visual updates provided back to the reader as the reader annotates the document page.
- Pen events correspond to the reader's annotation of the page, such as, for example, the movement of a stylus on the page.
- a system with low frame rates is unable to capture enough pen events corresponding to the pen movement in real time.
- the latency (i.e., time) between two consecutive frames is one factor that determines the ability of the system to capture pen events and display the annotation as the reader annotates the page. In the example shown in FIG. 8 , due to the low frame rates, the latency between two consecutive frames was significant enough that the system only captured a few pen events corresponding to each of the curves. As such, the displayed curves include only a few line segments.
- Frame rates are impacted negatively by, at least, 1) the texture resolution of the page (e.g., texture resolution needs to be high enough to make the text on the page readable), and 2) the blending operation of the high-resolution texture of the page with the annotation layer and the other layers of visual augmentation, such as word/sentence underlining and highlighting, that were placed on the texture.
- these layers are usually rendered on top of the annotation layer.
- changes to the annotation layer entail the re-evaluation of the other layers that were placed on top of the annotation layer.
- an additional factor negatively impacting frame rate speed is that, after all of the layers are blended with the texture, the new texture is sent from the main memory to the texture memory of the graphics card and rendered.
- the following detailed description discloses a hybrid technique for annotating pages of three-dimensional documents and discloses exemplary methods and systems providing increased interactive rendering and display speed at the annotation specifying stage.
- the following exemplary embodiments provide a relatively quick blending operation when the annotation is being specified by the reader.
- a hybrid technique uses one annotation technique, e.g., the polyline technique, for one part of an annotation and another annotation technique, e.g., the texture coloring technique, for another part of an annotation such that the entire annotation can be displayed “on the fly” at interactive speeds acceptable to a reader viewing an annotation when, for example, the reader is specifying an annotation.
- FIG. 9 illustrates a free-form annotation of the letters “PAR” in the top margin of a document page. The word was annotated on the page by a reader using an electronic pen.
- a reader annotates the page by moving the pen on the page.
- the pen movements correspond to a set of strokes.
- the letter “A” may consist of three strokes: “/”, “\” and “-”.
- a reader uses a sequence of pen events to specify a stroke.
- a pen “down” event indicates the beginning of a new stroke. Moving the pen while holding the pen down sketches the trajectory of the stroke.
- a pen “up” event denotes the completion of a stroke.
- Each stroke can correspond to a group of connected lines, commonly referred to as a polyline.
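- A minimal sketch of capturing these pen events as polylines; the event names and data layout are illustrative assumptions.

```python
class StrokeRecorder:
    """Collects pen events into strokes: a pen 'down' event starts a new stroke,
    'move' events extend its trajectory, and an 'up' event completes it. Each
    finished stroke is kept as a polyline, i.e., a list of connected points."""

    def __init__(self):
        self.strokes = []        # completed strokes (polylines)
        self.current = None      # stroke currently being sketched, if any

    def pen_down(self, x, y):
        self.current = [(x, y)]

    def pen_move(self, x, y):
        if self.current is not None:
            self.current.append((x, y))

    def pen_up(self, x, y):
        if self.current is not None:
            self.current.append((x, y))
            self.strokes.append(self.current)
            self.current = None

recorder = StrokeRecorder()
recorder.pen_down(5, 5)
recorder.pen_move(6, 7)
recorder.pen_up(7, 9)
print(recorder.strokes)          # [[(5, 5), (6, 7), (7, 9)]]
```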
- FIGS. 9 , 10 and 11 illustrate an exemplary embodiment of a hybrid technique for annotating an area of a page of an electronic document.
- FIG. 10 illustrates a view of an electronic page 800 .
- FIG. 11 illustrates a cross-section view of the page 800 taken along line A-A of FIG. 10 , through the leg of the letter “P” that has been annotated on the page.
- as the user specifies an annotation, for example by sketching a stroke 810 between a beginning point 805 and an ending point 815 , 3D polyline segments 820 corresponding to the pen events of the stroke are created on an annotation layer 830 .
- Layer 835 is a page texture layer.
- Annotation layer 830 and page texture layer 835 are shown as separate layers from the page 800 ; however, these layers may be intrinsic therewith, i.e., the information of the separate layers may actually be formed and/or stored as parts of the same layer.
- the annotation layer 830 and texture layer 835 have coordinates that correspond to the page geometry (e.g., polygon mesh) of the page 800 .
- the stroke 810 is captured as a 3D polyline segment 820 by transforming the specified area marked on the screen from the screen coordinate system to the local coordinate system of the page and to the corresponding coordinates of the annotation layer.
- a raycasting technique can be used to perform the transformations.
- a ray 845 is shot from a camera 840 through the screen 850 position of a point, corresponding to the point marked by the mouse, towards the page 800 .
- points 860 corresponding to the 3D polyline 820 are computed slightly behind the near clipping plane 850 (i.e., the “lens”) and in front of the far clipping plane (not shown) of the camera frustum of the virtual camera, i.e., slightly behind the plane 850 in FIG. 11 .
- the local coordinates of the page corresponding to the intersection coordinates, i.e., the points 870 where the rays 845 strike the page 800 , and subsequently the intersection's texture coordinates, are calculated.
- FIG. 11 shows four layers: the page geometry layer 800 , the page texture layer 835 , the page annotation layer 830 , and the 3D polyline layer 880 .
- the page geometry layer 800 , the page texture layer 835 , and the page annotation layer 830 are all located in the same space. That is, in FIG. 11 , they should all be located at the same location.
- the layers 800 , 830 and 835 are shown as separated.
- although layers 800 , 830 and 835 are illustrated as generally parallel and flat, the layers may not be parallel or flat (e.g., the layers may be curved).
- the representations of the stroke are computed on both the page annotation layer 830 and the 3D polyline layer 880 . However, only one of these two representations is displayed at any moment.
- the representation on the 3D polyline layer 880 is used as the stroke is being sketched out. Once the stroke is specified, the representation on the 3D polyline layer 880 is removed. Once the representation is removed from the 3D polyline layer 880 , the representation on the page annotation layer 830 is displayed by the texture coloring technique.
- the representation on the page annotation layer 830 may be completed by using a raycasting technique.
- raycasting can be used to compute a corresponding point on the 3D polyline layer 880 , by shooting or starting a ray from a virtual camera 840 through a point on the screen 850 , and extending the ray a little bit beyond the point on the screen 850 .
- the extension of the ray provides point 860 which is slightly behind the near clipping plane 850 .
- the stroke 810 is represented during the displaying stage by connecting those points 860 located slightly behind the near clipping plane 850 to form three-dimensional (“3D”) polyline segments 880 .
- the 3D polyline segments 880 connecting those points 860 are rendered and displayed with the existing page texture and other annotations of the page 800 .
- the 3D polyline segments 880 correspond to the annotation, i.e., stroke 810 , that the reader placed on the page 800 in the specifying step.
- the 3D polyline segments are placed in layer 880 .
- Layer 880 is located slightly behind the near clipping plane 850 of the camera frustum to ensure that the annotation will be displayed, as viewed by a reader, as on the top of the page 800 , and thus, avoid a Z fighting problem.
- because the 3D polyline segments 880 are not part of the page texture, the 3D polyline segments 880 can be efficiently rendered and displayed by related art graphics cards.
- the 3D polyline segments 880 of the hybrid technique are independent of the page's texture resolution, and thus enable frame rate speed to increase.
- a reader may then move on to specify the next stroke.
- the 3D polyline segments 880 can be removed and the page texture layer updated and applied to the page geometry.
- the page texture can be updated with the texture coloring technique.
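- The stroke lifecycle of the hybrid technique can be sketched as follows: while a stroke is being sketched, its points are shown as a temporary 3D polyline placed just behind the near clipping plane; once the stroke is finished, the polyline is discarded and the stroke is baked into the page texture. The class, callback names, and near-plane handling here are illustrative assumptions.

```python
class HybridAnnotator:
    """Switches a stroke from a temporary 3D polyline representation (cheap to
    render, independent of the page texture resolution) to the texture coloring
    representation once the stroke is complete."""

    def __init__(self, blend_into_texture):
        self.blend_into_texture = blend_into_texture   # e.g. the texture coloring step
        self.polyline_points = []                      # temporary polyline shown while sketching
        self.texture_points = []                       # the same stroke in texture coordinates

    def on_stroke_point(self, near_plane_point, texture_point):
        # Displayed immediately as part of the 3D polyline, without touching the texture.
        self.polyline_points.append(near_plane_point)
        self.texture_points.append(texture_point)

    def on_stroke_complete(self):
        # Discard the temporary polyline and bake the stroke into the page texture.
        self.blend_into_texture(self.texture_points)
        self.polyline_points = []
        self.texture_points = []

annotator = HybridAnnotator(blend_into_texture=lambda pts: print("blending", len(pts), "points"))
annotator.on_stroke_point((0.20, 0.30, -0.101), (0.21, 0.31))
annotator.on_stroke_point((0.25, 0.33, -0.101), (0.26, 0.34))
annotator.on_stroke_complete()
```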
- FIG. 9 illustrates a snapshot taken as a user annotates a document page with the words “PARC TAP”.
- the letters “P” and “A”, as well as the first stroke of the letter “R”, were rendered and displayed with the texture coloring technique.
- the second stroke of the letter “R”, indicated by the darker color, is represented as 3D polyline segments.
- the 3D line segments can be used to represent the entire annotation as the annotation is being sketched out, and not removed between strokes.
- the 3D line segments can be removed and the texture can be updated to display the annotation with the texture coloring technique. If this exemplary embodiment is employed, the end of the annotation specifying step must be explicitly declared by the reader or implicitly detected by the system.
- the hybrid technique can be used to support highlighting annotations as well.
- a transparent 3D polygon may be created instead of a 3D polyline.
- the vertices of the polygon may be computed according to where on the computer screen the annotation input events occur.
- the vertices are preferably placed slightly behind the near clipping plane of the camera frustum to ensure that the transparent polygon is displayed on the top of the page and covers the area that the reader intends to highlight.
- instead of a transparent polygon, a reader may indicate, and a system may render, only the edges of the polygon.
- the geometric representation corresponding to the polygon is removed and the texture is updated to display the highlight with the texture coloring technique.
- FIG. 12 illustrates an annotation captured using the same desktop as that used to capture the annotation illustrated in FIG. 8 .
- FIG. 12 illustrates an annotation 910 of “PARC TAP” on electronic page 900 .
- the annotation was obtained using the hybrid annotation technique. Because the hybrid technique displays a part of the annotation slightly behind the near clipping plane and other parts of the annotation using the texture coloring technique, the annotation technique achieves quicker frame rates than the frame rates obtained using the texture coloring technique. Thus, unlike in FIG. 8 , where the user deliberately slowed pen movements down to allow the system to process the pen events, a user does not have to deliberately slow down pen movements when annotating a document page.
- strokes displayed with the texture coloring technique and strokes represented as 3D polyline segments differ slightly in visual appearance.
- the strokes displayed using 3D polyline segments are shown in a darker color.
- the appearance of parts of the annotation can be modified, for example to match other parts of the annotations, by adjusting the width and/or transparency of the line segments.
- the page texture is not directly modified when the user specifies an annotation because the page texture may be public data shared by multiple users.
- the page texture is not directly modified, because it is desired to preserve the original content of the page.
- Annotations are more likely to be private data created by one user.
- the annotations may be easily removed when necessary.
- the annotation data is stored separately from the corresponding page texture to allow flexibility in accessing the un-annotated three-dimensional document, the annotations alone, or the annotated three-dimensional document.
- the page texture data could be modified and/or updated permanently with annotation data.
- FIG. 13 is a flow chart illustrating an outline of an exemplary embodiment of a method for annotating three-dimensional documents using a hybrid technique.
- operation of the method begins in step S 200 , and continues to step S 205 , where a three-dimensional document is obtained.
- in step S 210 , a page of a three-dimensional document to be annotated is turned to and displayed.
- a user may use a drop-down menu, a mouse button, keyboard, and/or other input device to select a page to be turned to.
- in step S 215 , a page area of the three-dimensional page to be annotated is specified by a user as the user makes an initial stroke of the annotation.
- in step S 220 , a raycasting technique is applied to create points on the page annotation layer as well as points corresponding to the annotation behind the near clipping plane of the camera frustum as the stroke is created.
- the points located behind the near clipping plane are connected to form 3D polyline segments.
- the 3D polyline segments are displayed over the page.
- the annotation data relating to the specified page area is correlated with the page by mapping the boundaries of the annotation data from the screen coordinate system to the local coordinate system of the page and the texture coordinate system.
- the correlated annotation data is then stored with other data (e.g., the page texture) of the page to be annotated.
- in step S 240 , a decision is made as to whether the stroke has been completed. If so, the operation continues to step S 245 . Otherwise, the operation returns to step S 220 , and steps S 220 through S 240 are repeated until the stroke has been completed.
- in step S 245 , the page texture is updated with the annotation data created in step S 220 .
- the 3D polyline segments formed on the 3D polyline layer may be removed.
- in step S 250 , the updated page texture is displayed.
- subsequent annotations can be displayed as 3D polyline segments, while the previous annotation data is displayed with the page texture.
- in step S 255 , it is determined whether the annotation of the current page is completed. If so, operation continues to step S 260 . If not, operation returns to step S 215 .
- in step S 260 , it is determined whether the annotation of the current document is completed. If so, operation continues to step S 265 , where the operation ends. If not, operation returns to step S 210 .
- FIG. 14 is a functional block diagram outlining an exemplary embodiment of an annotation control system 1000 according to this invention.
- the annotation control system 1000 includes an input/output interface 1110 , a controller 1120 , a memory 1130 , a document identifying circuit, routine or application 1140 , a page area specifying circuit, routine or application 1150 , and an annotation displaying circuit, routine or application 1160 , each appropriately interconnected by one or more control and/or data buses.
- the input/output interface 1110 is linked to a document data source 1200 by a link 1210 , and to a display device 1300 by a link 1310 . Further, the input/output interface 1110 is linked to one or more user input devices 1400 by one or more links 1410 .
- Each of the links 1210 , 1310 and 1410 can be any known or later-developed connection system or structure usable to connect their respective devices to the annotation control system 1000 . It should also be understood that links 1210 , 1310 and 1410 do not need to be of the same type.
- the memory 1130 can be implemented using any appropriate combination of alterable, volatile or non-volatile memory or non-alterable, or fixed, memory.
- the alterable memory whether volatile or non-volatile, can be implemented by using any one or more of static or dynamic RAM, a floppy disk and disk drive, a writable or rewritable optical disk and disk drive, a hard drive, flash memory or the like.
- the non-alterable or fixed memory can be implemented using any one or more of ROM, PROM, EPROM, EEPROM, an optical ROM disk, such as a CD-ROM or DVD-ROM disk and disk drive, or the like.
- the input/output interface 1110 is connected to the user input devices 1400 over a link 1410 .
- the user input devices 1400 can be one or more of a touch pad, a touch screen, a track ball, a mouse, a keyboard, a stylus, an electronic pen or any known or later-developed user input devices 1400 for inputting data and/or control signals to the annotation control system for annotating pages of the three-dimensional electronic document.
- the input/output interface 1110 is connected to display device 1300 over link 1310 .
- the display device 1300 can be any device that is capable of outputting, i.e. displaying, a rendered image of the three-dimensional electronic document.
- the document identifying circuit, routine or application 1140 receives a user input, inputs a three-dimensional electronic document to be annotated and identifies the three-dimensional document page to be annotated. Then, the page area specifying circuit, routine or application 1150 receives a user input, inputs the three-dimensional electronic document, inputs the identified three-dimensional page to be annotated and specifies a portion of the page as the page area to be annotated. Finally, the annotation displaying circuit, routine or application 1160 inputs the three-dimensional electronic document, inputs the identified three-dimensional page to be annotated, inputs the specified page area to be annotated, and displays the annotated page using an annotation display technique.
- An exemplary embodiment of an annotation control system 1000 for annotating pages of a three-dimensional electronic document according to FIGS. 6 and 13 operates in the following manner.
- a user input is output from the user input devices 1400 over link 1410 to the input/output data interface 1110 of the annotation control system 1000 .
- the user input information is then stored in the memory 1130 under control of the controller 1120 .
- the three-dimensional document is output from the document data source 1200 over link 1210 to the input/output interface 1110 in accordance with the user input.
- the three-dimensional electronic document is then input into the document identifying circuit, routine or application 1140 under the control of the controller 1120 .
- the document identifying circuit, routine or application 1140 identifies the three-dimensional electronic document to be annotated based on user input and the controller stores the identified three-dimensional electronic document in the memory 1130 .
- the page area specifying circuit, routine or application 1150 allows the user to specify a portion of the three-dimensional document area to be annotated on the display device 1300 . Additionally, the page area specifying circuit, routine or application 1150 maps the boundary of the page area from the screen coordinate system to the local coordinate system of the page. The page area specifying circuit, routine or application 1150 also maps the boundary of the page area from the local coordinate system of the page to the texture coordinate system of the page when required (e.g., when the vertex coloring technique, texture coloring display technique, or hybrid technique is used to display the annotation). Further, the page area specifying circuit, routine or application 1150 may modify color or ink density of the annotation instrument based on user input. Finally, the page area specifying circuit, routine or application 1150 stores the annotation data in memory 1130 and correlates the annotation data with other data of the page via the controller 1120 .
- the annotation displaying circuit, routine or application 1160 displays the annotation in the specified page area according to the display technique selected by the user. Then, the annotation displaying circuit, routine or application 1160 superimposes 3D polyline segments or a transparent polygon over the page area to be annotated. Alternatively and/or subsequently, the annotation displaying circuit, routine or application 1160 re-evaluates and colors the vertices within the page area to be annotated or modifies texture pixels within the page area to be annotated based on the selected display technique.
- the systems and methods can be applied to a flat page surface and/or a curved page surface.
- the annotation is also required to deform along with the page area to be annotated.
Description
- This is a Division of application Ser. No. 11/012,902 filed Dec. 16, 2004, which copended with U.S. patent application Ser. No. 10/739,213, now U.S. Pat. No. 7,148,905. The disclosures of the prior applications are hereby incorporated by reference herein in their entirety.
- Page annotation of documents including books, magazines, journals, textbooks, photo albums, maps, periodicals, or the like, is a common technique performed by readers and viewers of these documents. Page annotation is highly desirable to the readers and the viewers because it provides the readers and the viewers with the ability to mark the documents with text notes, handwritten notes, bookmarks, highlights and/or the like, to, e.g., facilitate later review of the same material by the annotater or another reader.
- Although many of these documents have been traditionally presented in paper format, electronic formats of these documents have become widely available due to numerous developments in the computer related fields, e.g., the Internet. With the increasing growth of electronic documents, the readers and the viewers still find page annotation highly desirable. Therefore, some annotation tools for two-dimensional electronic documents have been provided.
- For example, Schilit, Price, and Golovchinsky describe a research prototype called XLibris® used to display two-dimensional electronic document pages and support free-form annotations, which runs on a tablet computer and accepts pen input. By using the pen, the user can scribble notes, draw figures, and highlight text. The user also has the option of changing the color of the pen and/or selecting between a wide pen and a narrow pen.
- PCT Publication WO 0,142,980 describes an annotation tool for annotating two-dimensional electronic documents. PCT Publication WO 0,142,980 describes that “the annotations are stored separately from the viewed document pages but are correlated with the pages such that when a previously annotated page is revisited, annotations related to that page are retrieved and displayed on top of the page as an ‘ink’ layer.” By using the stylus, the user can highlight certain parts of the two-dimensional document in translucent colors or mark opaque annotations on the page, in a way very similar to XLibris. To display the annotations, the “pixel blending function blends pixels from a document page with corresponding pixels from an annotation or ‘ink’ layer mapped to that document page, and generates a blended pixel image that is displayed as an annotated document page.”
- PCT Publication No. WO 0,201,339 also describes an annotation tool for annotating two-dimensional electronic documents, and describes a technique which “analyzes the ink for each annotated pixel and renders the color and brightness of each pixel based on the original pixel color and the added annotation color so as to appear as physical ink would typically appear if similarly applied to physical paper.”
- Although using two-dimensional electronic annotation tools in three-dimensional electronic documents is conceivable, visualization and technical implementation problems result when the annotation tools created for the two-dimensional electronic documents are applied to three-dimensional electronic documents. Zinio Reader®, developed by Zinio Systems Inc., located at http://www.zinio.com, and Adobe Acrobat® are two examples of annotation tools.
- Adobe Acrobat® includes one example of a two-dimensional electronic annotation tool that allows selected portions of the electronic document to be highlighted. However, if the two-dimensional electronic highlighter annotation tool is applied to a three-dimensional electronic document, difficulties arise in defining the highlight area and in visualizing the highlighting ink.
- For example, capturing and displaying pen-based annotations in three dimensions is different from capturing and displaying pen-based annotations in two dimensions. Specifically, in two dimensions, translating the user input from the computer screen to the page and updating the appearance of the page is relatively straightforward. In three dimensions, on the other hand, three-dimensional transformations must be employed to determine where on the page the user wants to place an annotation, and the three-dimensional parameters of the page must be modified in order to show the annotation in the rendered image. Therefore, it is desirable to create annotation tools specifically designed to annotate three-dimensional electronic documents.
- For example, Hanrahan and Haeberli describe a three-dimensional electronic paint program that uses a technique to paint surfaces of three-dimensional electronic objects in “Direct WYSIWYG Painting and Texturing on 3D Shapes,” Proceedings of the ACM SIGGRAPH'90 Conference, pages 215-223. Based on what is displayed on the computer screen, the user manipulates the parameters, e.g., diffuse color, specular color, and surface roughness, used to shade the surfaces of the three-dimensional object. The paint brush strokes specified by the user are transformed from the screen space to the texture space of the object to update the texture data. As a result, the appearance of the 3D surfaces is modified.
- It would therefore be desirable to implement annotation tools in three-dimensional electronic documents that better simulate annotation of actual physical books, magazines, journals, textbooks, photo albums, maps, periodicals, or the like.
- Exemplary embodiments provide systems and methods that allow pages of three-dimensional electronic documents to be annotated in a manner that more accurately simulates annotating pages of an actual physical three-dimensional document.
- Exemplary embodiments provide systems and methods that allow pages of three-dimensional electronic documents to be annotated without producing noticeable artifacts.
- Exemplary embodiments provide systems and methods that provide a framework to support highlighting annotations, free-form annotations, text annotations and/or the like on one or more pages.
- Exemplary embodiments provide systems and methods that allow the user to annotate, e.g., highlight, a figure, a table, multiple lines of text and/or the like on one or more pages.
- Exemplary embodiments provide systems and methods that allow the reader or viewer to specify an area as the annotated area.
- Exemplary embodiments provide systems and methods that transform an annotated area from the coordinate system of the computer screen to the local coordinate system of the page, whereupon the annotated area is transformed from the local coordinate system of the page to a coordinate system of a texture corresponding to the page, and the resulting coordinates are stored as part of the annotation data.
- Exemplary embodiments provide systems and methods that use annotation data to display annotations on the page as the annotated area is gradually defined, and to recreate the annotation from the stored annotation data.
- Exemplary embodiments provide systems and methods that superimpose or place one or more transparent polylines over the page area which is to be annotated.
- Exemplary embodiments provide systems and methods that superimpose or place one or more transparent geometric shapes, e.g., polygons or polylines over the page area which is to be annotated.
- Exemplary embodiments provide systems and methods that re-evaluate the color of vertices as a function of vertex color, annotation color and/or ink density.
- Exemplary embodiments provide systems and methods that modify a texture pasted on a page geometry.
- Exemplary embodiments provide systems and methods that generate a new page texture based on the original page texture, annotation color and ink density.
- In exemplary embodiments, a reader, viewer, annotater, or user can annotate more than one portion of a page and/or more than one page of the three-dimensional document without turning the page.
- In exemplary embodiments, the annotation tools for three-dimensional electronic documents simulate user interaction with an actual physical three-dimensional document, e.g., a physical book, by providing users with the ability to annotate the three-dimensional electronic document in an intuitive manner. In exemplary embodiments, there are multiple stages in producing an annotation. The stages include, but are not limited to, the specifying stage and the displaying stage. In the specifying stage, the user decides where to place an annotation and what annotation, e.g., a red highlight, a blue arrow or a free-form line stroke, to place on the electronic document. In the displaying stage, the annotation system displays the annotation in a visual format based on the data captured during the specifying stage.
- In exemplary embodiments, an annotation instrument, such as a mouse or stylus, is used as an electronic annotation tool to annotate the three-dimensional electronic document. In exemplary embodiments, a user defines a page area of the three-dimensional electronic document to be annotated in the specifying step. The annotating can be implemented in various ways including, but not limited to, displaying the annotations with a transparent polygon, vertex coloring, texture coloring, or a hybrid technique.
- In exemplary embodiments, a method for annotating a page of an electronic document includes selecting a page of the electronic document, the page having a first layer; providing an annotation tool to annotate a specified area of the selected page; specifying the area of the page to be annotated by the annotation tool; annotating a second layer, the second layer corresponding to the page, by marking the specified area of the page with the annotation tool; displaying an annotation corresponding to the specified area, wherein the annotation is displayed in a third layer other than the second layer and the first layer.
- In exemplary embodiments, the annotations are displayed by superimposing or placing a layer with an annotation over the page area that is specified to be annotated.
- In exemplary embodiments, the annotations are displayed using a texture coloring technique that modifies the texture pasted on the electronic page geometry.
- In exemplary embodiments, a part of an annotation may be represented by one of the transparent polygon (which may include, polyline), vertex coloring, and texture coloring annotation techniques during a display period, and another part of the annotation may be represented by a different annotation technique during the same display period.
- In exemplary embodiments, the annotations can be displayed in three dimensions to convey depth and/or a different shape than the underlying page.
- These and other features and advantages are described in or are apparent from the following detailed description of exemplary embodiments of the invention.
- Exemplary embodiments will be described in detail, with reference to the following figures in which like reference numerals refer to like elements and wherein:
- FIG. 1 illustrates an exemplary embodiment of an annotated page of a three-dimensional electronic document, which has been annotated using an electronic annotation tool to define an annotation area;
- FIG. 2 illustrates an exemplary embodiment of an annotated page of a three-dimensional electronic document displaying an annotation area using a transparent polygon technique;
- FIG. 3 illustrates an exemplary embodiment of an annotated page of a three-dimensional electronic document displaying three annotation areas using a transparent polygon technique;
- FIG. 4 illustrates an exemplary embodiment of an annotated page of a three-dimensional electronic document displaying an annotation area using a vertex coloring technique;
- FIG. 5 illustrates an exemplary embodiment of an annotated page of a three-dimensional electronic document displaying an annotation area using a texture coloring technique;
- FIG. 6 illustrates a flowchart outlining an exemplary embodiment of a method for annotating pages of a three-dimensional electronic document;
- FIG. 7 illustrates a flowchart outlining an exemplary embodiment of a method for displaying annotations in the page area of a three-dimensional electronic document that has been annotated;
- FIG. 8 illustrates an exemplary embodiment of an annotated page of a three-dimensional electronic document displaying an annotation area using a texture coloring technique;
- FIG. 9 illustrates an exemplary embodiment of an annotated page of a three-dimensional electronic document displaying an annotation area using a hybrid technique;
- FIG. 10 illustrates an exemplary embodiment of an annotated page of a three-dimensional electronic document displaying an annotation area using a hybrid technique;
- FIG. 11 illustrates a cross-section of the annotated three-dimensional page shown in FIG. 10 displaying an annotation area using a hybrid technique;
- FIG. 12 illustrates an exemplary embodiment of an annotated page of a three-dimensional electronic document displaying an annotation area using a hybrid technique;
- FIG. 13 illustrates a flowchart outlining an exemplary embodiment of a method for annotating pages of three-dimensional electronic documents using a hybrid technique; and
- FIG. 14 is a block diagram outlining one exemplary embodiment of a system for annotating pages of three-dimensional electronic documents.
- The following detailed description illustrates exemplary embodiments of methods and systems for annotating pages of a three-dimensional electronic document.
- Producing an annotation on an electronic document may include a specifying stage and a displaying stage. In the specifying stage a reader, viewer or user indicates to a system an area of a page at which to place an annotation, and instructs the system as to the type of annotation (e.g., a red highlight, a blue arrow, or a free-form line) to place at that area. The reader may specify an annotation using an input device such as, for example, a mouse or stylus. The system may capture the specification data, during the specification stage, as it is being input. In the displaying stage, the system displays the annotation in a visual format based on the data captured during the specifying stage.
- For example, as shown in
FIG. 1, a user uses a highlighting specification technique to specify an area to be annotated, e.g., highlighted. FIG. 1 illustrates a close-up view of an exemplary embodiment of a three-dimensional electronic document (e-book) 100 containing multiple document pages, such as the depicted document page 110. The document page 110 is annotated with an annotation tool, e.g., a mouse (not shown), that marks a page area 120 of the document page 110 with a specific highlight color.
- It will be appreciated that, as used herein, "annotation tool" refers to any device or combination of devices that allows a user to produce a visual enhancement, such as underlining, coloring, text, graphics or the like, on an electronic document. For example, a computer mouse or keyboard that manipulates a cursor, combined with circuitry and/or software and/or other hardware, such as physical buttons and/or menus on a screen, serves as an annotation tool. An electronic highlighter is one type of annotation tool. Operations of a highlighter will be described in examples below.
- Additionally, as used herein, “ink” of course refers not to liquid ink, but rather virtual (e.g., electronically represented) ink used to produce visual enhancement, such as underlining, coloring or the like, on an electronic page.
- The system may allow the user to modify the annotation color and/or the ink density if desired. For example, if the user wants to modify the highlighter color, the user may select the color from a drop-down menu. The color of the highlighter is then modified, and the new color data is stored when the highlighter is used on an electronic page. If the user wants to modify the ink density, the user may select the ink density from a drop-down menu. The ink density of the highlighter is then modified, and the new ink density data is stored when the highlighter is used on an electronic page.
- As the user reads through the e-book 100 shown in
FIG. 1, the user may annotate, e.g., highlight, an area of the page by marking a portion of the document page 110 with a highlighter. The portion to be highlighted is defined in this example as a rectangular page area 120 based on the user input. Depending on the annotation technique, the page area 120 may be annotated in a specified color or ink density. - To mark an area, e.g., the
page area 120 of the document page 110, the user inputs a command, for example by operating a mouse to move a cursor to a starting position 122 and pressing a mouse button to anchor a corner of the mark at the starting position 122. The user can then drag the mouse to move the cursor to select the desired area 120 to be marked. As the user drags the mouse, the size of the area 120 changes and the highlight area is updated dynamically, providing visual feedback to the user of the area to be highlighted. The user finalizes the highlighted rectangular page area 120 by releasing the mouse button to anchor another corner of the rectangular page area at an end position 124.
- During the mouse action (e.g., the mouse press action, the mouse drag action, or the mouse release action), the mouse communicates with the system to indicate where the mouse action occurred on the display screen. The screen coordinates corresponding to the mouse action are mapped to the local coordinate system of the page to determine where on the three-dimensional document page the user pointed when the mouse action occurred. In other words, the area specified by the user on the screen is transformed from the screen coordinate system to the world coordinate system, and further to the local coordinate system of the page. If necessary, the area is further transformed from the local coordinate system of the page to a texture coordinate system of the page. The resulting area is then stored as part of the highlight data.
- The world coordinate system, sometimes referred to as the global coordinate system, is the principal coordinate system of a three-dimensional workspace and is independent of viewpoint and display. Individual objects, such as a page, are each defined in their own local coordinate systems within the world coordinate system. The local coordinate systems are each independent from each other and the world coordinate system. Light sources, the viewer, and virtual cameras are positioned in the world coordinate system. In some embodiments the viewer may be the virtual camera.
- The screen coordinate system may correspond to the coordinate system of a virtual camera. The location of the camera may correspond to the origin of the screen coordinate system. For example, a reader's eye reviewing the screen corresponds to the origin of a screen coordinate system.
- The texture coordinate system defines a texture and is independent from the world coordinate system, screen coordinate system and the local coordinate system. Texture mapping is a process that maps a texture (for example, a high definition image, such as a picture or text) onto an object (e.g., a page). The shape of the object, e.g., a page, is represented with a polygon mesh. The polygon mesh is, generally, a set of vertices connected by edges. A vertex of the polygon mesh has a 3D coordinate (x,y,z) which defines where it is located in the local coordinate system of the page. Each vertex also has a 2D texture coordinate (u,v) for the purpose of texture mapping. For example, the lower left corner of the page may be located at (0,0,0) of the local coordinate system and have a texture coordinate of (0,0). The upper right corner of the page may be located at (pageWidth,pageHeight,0) and have a texture coordinate of (1,1). Note that in this example the texture mapping may require some compressing or stretching of the texture. For a point not lying at any vertex of the polygon mesh, scan conversion determines where the point is located in the local coordinate system, and determines the point's texture coordinate. The texture coordinate determines the point's texture color.
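- By way of illustration only, the mapping from the local coordinate system of a flat page to its texture coordinate system can be sketched as follows. This is a minimal sketch assuming a flat rectangular page whose lower left corner sits at (0,0,0) and whose upper right corner sits at (pageWidth, pageHeight, 0), as in the example above; the function name and the page dimensions used in the example are hypothetical.

```python
def local_to_texture(x, y, page_width, page_height):
    """Map a point in the page's local coordinate system to (u, v) texture
    coordinates, assuming a flat page spanning (0,0,0)-(pageWidth, pageHeight, 0)."""
    u = x / page_width
    v = y / page_height
    return (u, v)

# Example: the page corners map to the corners of the texture.
assert local_to_texture(0.0, 0.0, 8.5, 11.0) == (0.0, 0.0)
assert local_to_texture(8.5, 11.0, 8.5, 11.0) == (1.0, 1.0)
```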
- Raycasting is one example of a technique that is used in mapping a screen coordinate to a local coordinate of the page. The raycasting technique includes shooting a ray from a virtual camera (corresponding to the eye of a user viewing the screen) through/from the screen coordinate position of the mouse (xm, ym) towards the three-dimensional document. Next, the intersection position, i.e., coordinate points, (xw, yw, zw), of the ray and the page of the three-dimensional document, is calculated. Assuming that the intersection position is represented in the world coordinate system, this point is then mapped from the world coordinate system to the local coordinate system of the page. If necessary, the point represented in the local coordinate system of the page is further mapped to the texture coordinate system of the page.
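- A minimal raycasting sketch is shown below, assuming for simplicity that the page lies in a single plane given by a point and a normal in world coordinates; the function and helper names are hypothetical, and the returned world-space point would still have to be transformed into the page's local (and, if needed, texture) coordinate system as described above.

```python
def dot(a, b):   return sum(x * y for x, y in zip(a, b))
def sub(a, b):   return tuple(x - y for x, y in zip(a, b))
def add(a, b):   return tuple(x + y for x, y in zip(a, b))
def scale(a, s): return tuple(x * s for x in a)

def cast_ray_to_page(camera_pos, ray_dir, page_origin, page_normal):
    """Intersect a ray (shot from the virtual camera through the mouse position)
    with the plane containing a flat page.  Returns the world-space intersection
    point, or None if the ray misses the page plane."""
    denom = dot(ray_dir, page_normal)
    if abs(denom) < 1e-9:
        return None                      # ray is parallel to the page plane
    t = dot(sub(page_origin, camera_pos), page_normal) / denom
    if t < 0:
        return None                      # page plane is behind the camera
    return add(camera_pos, scale(ray_dir, t))
```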
- It should be appreciated that the annotation specification technique described above may be applied to document pages facing straight to the user and/or facing to the user at oblique angles; that the annotation specification technique may be applied to document pages represented as flat three-dimensional and/or curved three-dimensional surfaces; that, although the previous description mentioned a rectangular area to be annotated, an area of any shape, including circles, oblong or oval shapes, and polygons of any numbered sides can be specified by the user in a similar manner; and that, rather than a mouse, other input devices (e.g., a stylus or electronic pen) can be employed to specify the page area to be annotated.
- It should also be appreciated that factors other than the location of the annotation area may be used to influence the annotation effect. For example, “covering up” the original contents of the page so that they cannot be viewed after annotating, e.g., highlighting, may be desirable, or allowing the original contents to still be viewable after highlighting may be desirable. The color of the highlight and the ink density of the highlight used will impact whether the original contents can be viewed. If the original contents of the page are to be viewed after highlighting, the original contents of the page that are located in the marked page area may be blended with the highlight color to produce a highlighting effect. Generally, the ink density of the highlight determines how much of the highlight color appears in the blending result. The denser the highlight ink is, the more the highlight color shows in the blending result. To modify both the color and/or the ink density of the highlight, a user interface (not shown) can be provided that allows the user to change the color and/or ink density of the highlight.
- It should be appreciated that, as the annotation, e.g., highlight, is created, annotation data pertaining to the annotated area including the area boundary, the color, the ink density and/or the like is stored in the system for annotating three-dimensional documents and correlated with the corresponding page. The system for annotating three-dimensional documents is intended to display the annotation on the page as the annotation area is gradually defined. Additionally, whenever the page is revisited, e.g., due to backward and/or forward page turning, the annotation is recreated from the stored annotation data and displayed on the corresponding page.
- The following detailed description of methods and systems for annotating pages of three-dimensional electronic documents, such as e-books discloses exemplary embodiments of methods and systems of displaying a user specified annotation, such as highlights, using transparent geometric shapes, vertex coloring, texture coloring and/or a hybrid technique.
- Hereafter, a polygon will be used as a geometric shape.
FIG. 2 illustrates an embodiment of displaying an annotation 200, e.g., highlight, by using a transparent and/or translucent polygon 220 (hereafter referred to as a "transparent polygon" for convenience) and superimposing or placing the transparent polygon 220 over the page area 230, of a three-dimensional page 210, that is to be highlighted. The highlight may be created by a highlighting specification technique described previously. The location and size of the transparent polygon 220 is preferably equal to the location and size of the page area 230, which is determined by the stored highlight data. The color of the polygon 220 reflects the color of the highlighter, and the opacity of the polygon 220 models the ink density of the highlighter. - Superimposing the
transparent polygon 220 over the page area 230 that is to be highlighted creates a "Z fighting" problem. The problem of Z fighting arises when two overlapping, co-planar polygons P1 and P2 are displayed. During scan conversion, polygon P1 may have some pixels in front of polygon P2 and other pixels behind polygon P2. As a result, there is no clear separation between polygon P1 and polygon P2. As such, when a user views a page from the front side of the page, as viewed from the side of polygon 220 labeled "FRONT," part of the transparent polygon 220 may be in front of the page area 230 and part of the transparent polygon 220 may be behind the page area 230 (i.e., be blocked from the user's view by the page area 230). To avoid the Z fighting problem, the transparent polygon 220 is offset/elevated (for example, along a "z" axis, if the plane corresponding to the page is defined by "x" and "y" axes) from the page 210 towards the front side of the page by a distance D1. The minimum offset distance may depend, e.g., on the Z buffer of the graphics hardware.
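- A short sketch of the offsetting step is given below, assuming the highlight polygon is stored as a list of vertices in the page's coordinate frame; the function name and the choice of D1 are hypothetical.

```python
def offset_polygon(vertices, page_normal, d1):
    """Elevate a co-planar highlight polygon off the page surface by a small
    distance d1 along the page normal to avoid Z fighting.

    d1 only needs to exceed the depth-buffer resolution at the page's distance."""
    nx, ny, nz = page_normal
    return [(x + nx * d1, y + ny * d1, z + nz * d1) for (x, y, z) in vertices]
```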
FIG. 3 illustrates an exemplary embodiment of the step of displaying annotations 300 by superimposing multiple transparent polygons over a three-dimensional page 310. When multiple transparent polygons overlap over a common area 380 of the three-dimensional page 310, the display of the transparent polygons over the three-dimensional page 310 becomes increasingly complex.
FIG. 4 illustrates an exemplary embodiment of displaying an annotation, e.g., highlights on a page 400, using vertex coloring to color those vertices within the three-dimensional page area 410 that are to be annotated. The vertex coloring technique has advantages over the transparent polygon technique because it can avoid some of the system complications associated with the transparent polygon technique when surfaces which are to be viewed as curved are annotated. For example, in the transparent polygon technique, if a page is curved, e.g., when turned, after being highlighted, the transparent polygons associated with the highlight must also be curved. As in the transparent polygon technique, and as illustrated in FIG. 4, in the vertex coloring technique the polygonal boundary specified by the user determines the page area 410 that will be annotated. However, after determining the boundary of the polygon area 410, i.e., the area to be annotated, the polygonal area 410 is additionally transformed from the local coordinate system of the page to the coordinate system of the texture corresponding to the page. This transformation takes place during the specification stage. The resulting texture coordinates are stored as part of the annotation data. - As shown in
FIG. 4, the page 400 can be represented as a geometric object. In FIG. 4 the page geometry is represented as having a polygon mesh. The polygon mesh is a computer graphics technique which approximately represents a three-dimensional object, such as the three-dimensional page, using vertices 415 connected by edges 420. Once a user inputs the boundary of the polygon area 410 to be annotated, the vertices 415 lying inside the user-specified polygonal area 410 can be identified. After the vertices 415 are identified, the colors of the identified vertices 415 are re-evaluated as a function of the annotation color and the ink density. Re-evaluation may change the colors of the vertices. The colors of the vertices inside and outside the polygonal area 410 of the polygon mesh may subsequently be interpolated and then blended with the page texture corresponding to the page to produce an annotating effect.
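- A minimal sketch of the vertex coloring step is given below. It assumes an axis-aligned rectangular annotation area expressed in the page's local coordinates and a simple density-weighted blend of vertex color with annotation color; the function name and data layout are hypothetical.

```python
def apply_vertex_coloring(mesh_vertices, vertex_colors, area, annot_color, density):
    """Re-evaluate the colors of mesh vertices lying inside the annotated area.

    mesh_vertices : list of (x, y, z) positions in the page's local coordinates
    vertex_colors : parallel list of (r, g, b) colors, each channel in 0.0-1.0
    area          : (x_min, y_min, x_max, y_max) rectangle in local coordinates
    annot_color   : (r, g, b) color of the annotation, e.g. a highlight
    density       : ink density normalized to 0.0-1.0
    """
    x_min, y_min, x_max, y_max = area
    for i, (x, y, _z) in enumerate(mesh_vertices):
        if x_min <= x <= x_max and y_min <= y <= y_max:
            old = vertex_colors[i]
            vertex_colors[i] = tuple(
                (1.0 - density) * c_old + density * c_annot
                for c_old, c_annot in zip(old, annot_color)
            )
    return vertex_colors
```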
-
FIG. 5 illustrates an exemplary embodiment of displaying an annotation, e.g., highlight, using the texture coloring technique. In the texture coloring technique, the page texture is modified and/or updated to display the annotated data. As discussed above, the page texture is a layer corresponding to the page and used to represent high resolution images. Moreover, like the vertex coloring technique, in the texture coloring technique the page is represented by a polygon mesh. Each vertex of the page has a texture coordinate in the texture coordinate system of the page. - The texture coloring technique employs many of the same steps used by the transparent polygon and vertex coloring techniques. Like the vertex coloring technique, after determining the boundary of the
polygon area 510, e.g., the area to be highlighted, in the specification stage, thepolygonal area 510 is additionally transformed from the local coordinate system of the page (i.e., polygon mesh) to the coordinate system of a texture corresponding to the page. The resulting texture coordinates are stored as part of the highlight data. Thus, the page texture and the associated highlight data is separate from the page during annotation specifying; however, after specifying an annotation, the associated annotation is blended to the page texture, which is then pasted onto the page geometry.FIG. 5 illustrates the page texture pasted on the page 500 (i.e., page geometry). The entire image shown on thepage 500 corresponds to the texture. - In the texture coloring technique, a blending step may also be performed to achieve the annotation effect. The blending step includes computing the color of each texture pixel within the annotated
polygonal area 510. The color of the texture pixel may be determined by satisfying the following relationship: Ct=(1.0−density)*Ct+density*Ch, where Ct is the color of the pixel, Ch is the color of the annotation, and density is the ink density of the annotation normalized to be in the range of 0.0 to 1.0. The blending operation produces a new texture for the page geometry. The new texture is then applied to thepage 500 by pasting the new texture onto the page geometry. - The texture coloring technique produces relatively well-defined boundaries for the annotated areas because, as discussed above, the page texture generally has a higher resolution than the polygon mesh. Therefore, the bi-linearly interpolated texture coordinates are generally more visually appealing than the result of the bi-linearly interpolated vertex colors.
-
FIG. 6 is a flow chart illustrating an outline of an exemplary embodiment of a method for annotating three-dimensional documents. As shown in FIG. 6, operation of the method begins in step S100, and continues to step S105, where a three-dimensional document is obtained. Then, in step S110, a three-dimensional page of a three-dimensional document to be annotated is turned to based, e.g., on a user input, and displayed. A user may use a drop-down menu, a mouse button, or other input device to select a page to be turned to.
- Next, in step S115, a page area of the three-dimensional page to be annotated is specified by marking an annotation on the screen. The annotation data (e.g., data corresponding to the movement of an annotation input device (e.g., pen), the polygon boundary, annotation color, ink density, etc.) relating to the specified page area is correlated with the page by mapping the boundaries of the annotation data from the screen coordinate system to the page and texture coordinate systems. The correlated annotation data is then stored with other data (e.g., the page texture) of the page to be annotated. Then, in step S120, an annotation is displayed based on the annotation data. Next, in step S125, it is determined whether the annotation of the current page is completed. This determination may be based on, e.g., a clock or whether the page has been turned. If so, operation continues to step S130. If not, operation returns to step S115.
- It should be appreciated that annotation is not necessarily performed as a single activity. More likely, annotations are added as the user is reading through a document. For example, a user may read a page and find an area of the page to be annotated, e.g., a few interesting sentences on the page. The user can then mark the sentences with an annotation tool, e.g., a highlighter, and then continue to read through the document. In other words, the user can perform other activities between annotations such as reading, turning pages and/or the like.
-
FIG. 7 is a flow chart outlining an exemplary embodiment of a method for displaying an annotation. For convenience, the method of FIG. 7 may be considered a detailed method of performing step S120 of FIG. 6. Thus, operation of the method of FIG. 7 begins in step S120, and continues to step S1201, where it is determined whether the annotation is to be displayed with a transparent polygon technique. If so, operation continues to step S1202. In step S1202 a transparent polygon is superimposed over the page area to be highlighted. Operation then jumps to step S1206 and returns to step S125 of FIG. 6, where a determination is made whether the current document's page annotations are complete. If, in step S1201, the transparent polygon technique is not to be employed, operation jumps to step S1203.
- In the vertex coloring technique, in step S220, all vertices within the page area to be annotated are colored. Operation then jumps to step S230.
- If the vertex coloring technique is not employed, then, in step S225, those texture pixels within the page area to be annotated are modified using the texture coloring technique. Operation then continues to step S1206 and returns to step S125 of
FIG. 6 where a determination is made whether the current document's page annotations are complete. The determinations in steps S1201 and S1203 may be based on, e.g., a user input, or be preprogrammed into the system and based on variables such as the size of the page and/or document, or the processing speed and memory of the system. - As discussed above, the texture coloring technique has advantages over the transparent polygon and vertex coloring techniques and, as such, the texture coloring technique is a preferred method. Moreover, generally, when a reader is interactively involved with a document, the reader generally desires quick frame rates, i.e., quick updates of the screen information, as the information is input. For example, the reader may be interested in quick frame rates during two scenarios, 1) when an annotation is being specified by a reader, and 2) when a page on which the annotation was specified is re-visited (e.g., due to page turning). As such, the blending operation of the texture coloring technique needs to be carried out quickly when a reader is interactively involved with a document and a high processing speed/quick frame rate is desired. However, the computational speed of the blending operation depends on the resolution of the page texture to be pasted on the page, along with system constraints, such as, for example, available memory and/or processing speed. For example, as the resolution increases, the number of pixels to be processed by the blending operation also increases. As the number of pixels to be processed increases, the potential for a bottleneck and a failure of the blending operation to be performed quickly can develop.
- In the second scenario, where a reader re-visits the page on which the annotation was specified, multi-resolution texture features used in features for turning a page of a three-dimensional document can be used to lower the resolution of the texture for a specific time and, thus, reduce the possibility of a bottleneck forming in the system. For example, when user responsiveness is desired, e.g. to generate the first frame of a page turning animation, the blending operation can be performed using a low-resolution texture for the page. This significantly reduces the overhead of the blending operation because the number of pixels to be processed is reduced from the number of pixels processed for a high-resolution texture. When, on the other hand, a high image quality is desired, e.g., at the end of the page turning animation, the blending operation can use the higher resolution page texture. In other words, the system may, for example, initially display the low-resolution texture, and then automatically change to display a high-resolution texture after the page turning animation is complete.
- However, unlike the scenario when a page on which the annotation was specified is re-visited (e.g., due to page turning), a high resolution is generally desired before, during and after the annotation is initially specified by the reader. As such, multi-resolution page texture features, such as used with page turning, cannot be easily adapted to reduce the possibility of bottlenecks developing in the system. For example, if the resolution of the texture on a page is changed when the reader specifies an annotation on a page visual artifacts will be produced.
-
FIG. 8 illustrates an exemplary method for annotating three-dimensional documents. As illustrated in FIG. 8, pen-based annotations 610, i.e., annotations with a stylus, may be used as a method for annotating three-dimensional documents. Pen-based annotations allow the user to sketch out free-form annotations.
- Handwritten annotations 610 can be decomposed and approximated with a set of strokes where each stroke has a trajectory, width, color, and ink density. The trajectory of a stroke is captured as the user annotates the page 600 by sliding the stylus (not shown) from one location of the three-dimensional page 600 to the next location of the three-dimensional page 600. The width, color, and ink density of a stroke can be modified by the user with a user interface (not shown). The stroke trajectory, coupled with stroke width, generally defines the footprint of the stroke on the display screen. This footprint can be approximately represented as a polygon or other shape. For convenience, "polygon" is used hereafter as an example.
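- One possible in-memory representation of such strokes is sketched below; the class and field names are hypothetical and merely mirror the stroke attributes listed above.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Stroke:
    """One stroke of a handwritten annotation."""
    trajectory: List[Tuple[float, float]] = field(default_factory=list)  # points in page-local coordinates
    width: float = 1.0
    color: Tuple[float, float, float] = (1.0, 0.0, 0.0)
    ink_density: float = 1.0          # normalized to 0.0-1.0

@dataclass
class HandwrittenAnnotation:
    """A handwritten annotation approximated as a set of strokes."""
    strokes: List[Stroke] = field(default_factory=list)
```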
dimensional page 600. If necessary, the polygon is further transformed from the local coordinate system of thepage 600 to the coordinate system of the texture corresponding to thepage 600. The annotation data, e.g., the polygon boundary, the color, and the ink density, is then stored and correlated with the respective page. Although the handwritten annotation can be displayed using any of the display techniques previously discussed, the texture coloring technique is the preferred display technique. - It should be appreciated that the previously described annotation operations may be modified to support other types of annotations for three-dimensional documents. For example, at the specification stage, by using a mouse or a stylus the user can indicate where on the three-dimensional page to place a text annotation. This location is transformed from the screen coordinate system to the coordinate system of the texture corresponding to the page. Then, by using a keyboard, the user can input the content of the text annotation. Alternatively, the content of the text annotation can be specified in a handwritten manner using the stylus. At the display stage, the content of the text annotation can be rendered as an ink layer and blended with the original page texture to create a new texture, which is pasted on the page geometry. Thus a transparent or opaque text annotation can be shown on top of the original content of the three-dimensional page.
-
FIG. 8 illustrates an exemplary embodiment of an annotation of a document page in a virtual three-dimensional document. Virtual three-dimensional document systems often have a relatively low frame rate speed and display a page with a texture at a relatively high resolution. In FIG. 8, "PARC TAP" was written using a stylus or pen in a top margin of the document page and captured on a high-end desktop, such as a Dell Precision 340 equipped with an ATI 9700 PRO graphics card. In FIG. 8, the words "PARC TAP" were deliberately written slowly; however, the words "PARC TAP" were not captured adequately enough by the system for the words "PARC TAP" to be considered by a reader as naturally written. In particular, the curves on the letters "P", "C" and "R" were not captured as curves (to the extent that a "curve" can be written by a reader) but, instead, captured as individual line segments.
- One factor in the failure of the system to capture the curves of the letters is the low frame rates of the three-dimensional document system used in FIG. 8. The low frame rates limit the number of visual updates provided back to the reader as the reader annotates the document page. Pen events correspond to the reader's annotation of the page, such as, for example, the movement of a stylus on the page. A system with low frame rates is unable to capture enough pen events corresponding to the pen movement in real time. The latency (i.e., time) between two consecutive frames is one factor that determines the ability of the system to capture pen events and display the annotation as the reader annotates the page. In the example shown in FIG. 8, due to the low frame rates, the latency between two consecutive frames was significant enough that the system only captured a few pen events corresponding to each of the curves. As such, the displayed curves include only a few line segments.
- The problem of low frame rates impacting the visual quality of annotations on a page can be alleviated with faster/more advanced CPUs and graphic cards; however, as faster CPUs and more advanced graphic cards are used, it is also anticipated that three-dimensional document systems will use textures with higher resolutions (e.g., to support multi-monitors, i.e., systems comprising multiple monitors before a viewer) and, similarly, employ more layers of visual augmentation.
- The following detailed description discloses a hybrid technique for annotating pages of three dimensional documents and discloses exemplary methods and systems providing an increased interactive rendering and displaying speed at the annotation specifying stage. The following exemplary embodiments provide a relatively quick blending operation when the annotation is being specified by the reader.
- A hybrid technique uses one annotation technique, e.g., the polyline technique, for one part of an annotation and another annotation technique, e.g., the texture coloring technique, for another part of an annotation such that the entire annotation can be displayed “on the fly” at interactive speeds acceptable to a reader viewing an annotation when, for example, the reader is specifying an annotation. For example,
FIG. 9 illustrates a free-form annotation of the letters “PAR” in the top margin of a document page. The word was annotated on the page by a reader using an electronic pen. - As shown in
FIG. 9, a reader annotates the page by moving the pen on the page. The pen movements correspond to a set of strokes. For example, as shown in FIG. 9, the letter "A" may consist of three strokes "/", "\" and "−". In three-dimensional systems, a reader uses a sequence of pen events to specify a stroke. A pen "down" event indicates the beginning of a new stroke. Moving the pen while holding the pen down sketches the trajectory of the stroke. A pen "up" event denotes the completion of a stroke. Each stroke can correspond to a group of connected lines, commonly referred to as a polyline.
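- A minimal sketch of turning these pen events into polylines is shown below; the class and method names are hypothetical, and each captured point is assumed to have already been mapped to the desired coordinate system.

```python
class StrokeCapture:
    """Accumulate pen events into polylines, one polyline per stroke."""

    def __init__(self):
        self.strokes = []          # completed strokes, each a list of points
        self.current = None        # stroke currently being sketched

    def pen_down(self, point):     # begins a new stroke
        self.current = [point]

    def pen_move(self, point):     # extends the trajectory of the stroke
        if self.current is not None:
            self.current.append(point)

    def pen_up(self, point):       # completes the stroke
        if self.current is not None:
            self.current.append(point)
            self.strokes.append(self.current)
            self.current = None
```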
FIGS. 9, 10 and 11 illustrate an exemplary embodiment of a hybrid technique for annotating an area of a page of an electronic document. FIG. 10 illustrates a view of an electronic page 800. FIG. 11 illustrates a cross-section view of the page 800 taken along line A-A of FIG. 10, through the leg of the letter "P" that has been annotated on the page. In the illustrated hybrid technique, as the reader specifies an annotation, for example by sketching a stroke 810 between a beginning point 805 and an ending point, 3D polyline segments 820 corresponding to the pen events of the stroke are created on an annotation layer 830. Layer 835 is a page texture layer. Annotation layer 830 and page texture layer 835 are shown as separate layers from the page 800; however, these layers may be intrinsic therewith, i.e., the information of the separate layers may actually be formed and/or stored as parts of the same layer. The annotation layer 830 and texture layer 835 have coordinates that correspond to the page geometry (e.g., polygon mesh) of the page 800. The stroke 810 is captured as a 3D polyline segment 820 by transforming the specified area marked on the screen from the screen coordinate system to the local coordinate system of the page and to the corresponding coordinates of the annotation layer.
ray 845 is shot from acamera 840 through thescreen 850 position of a point, corresponding to the point marked by the mouse, towards thepage 800. By using the coordinates of the intersection of theray 845 with the3D polyline 820, on theannotation layer 830 and thepage 800, a point(s) 860 corresponding to the3D polyline 820 are computed slightly behind the near clipping plane 850 (i.e., “lens”) and in front of the far clipping plane (not shown) of the camera frustum of the virtual camera, i.e., slightly behind theplane 850 inFIG. 11 . At the same time, the local coordinates of the page corresponding to the intersection coordinates, i.e., thepoints 870 where therays 845 strike thepage 800, and subsequently, the intersection's texture coordinates, are calculated. -
FIG. 11 shows four layers: thepage geometry layer 800, thepage texture layer 835, thepage annotation layer 830, and the3D polyline layer 880. Thepage geometry layer 800, thepage texture layer 835, and thepage annotation layer 830 are all located in the same space. That is, inFIG. 11 , they should all be located at the same location. However, for the basic, conceptual, understanding of the hybrid technique, thelayers layers page annotation layer 830 and the3D polyline layer 830. However, only one of these two representations is displayed at any moment. The representation on the3D polyline layer 880 is used as the stroke is being sketched out. Once the stroke is specified, the representation on the3D polyline layer 880 is removed. Once the representation is removed from the3D polyline layer 880, the representation on thepage annotation layer 830 is displayed by the texture coloring technique. - As previously described, the representation on the
page annotation layer 830 may be completed by using a raycasting technique. As discussed above, given a point on the screen, raycasting can be used to compute a corresponding point on the3D polyline layer 880, by shooting or starting a ray from avirtual camera 840 through a point on thescreen 850, and extending the ray a little bit beyond the point on thescreen 850. The extension of the ray providespoint 860 which is slightly behind thenear clipping plane 850. - The
stroke 810 is represented during the displaying stage by connecting thosepoints 860 located slightly behind thenear clipping plane 850 to form three-dimensional (“3D”)polyline segments 880. The3D polyline segments 880 connecting thosepoints 860 are rendered and displayed with the existing page texture and other annotations of thepage 800. - The
3D polyline segments 880 correspond to the annotation, i.e.,stroke 810, that the reader placed on thepage 800 in the specifying step. The 3D polyline segments are placed inlayer 880.Layer 880 is located slightly behind thenear clipping plane 850 of the camera frustum to ensure that the annotation will be displayed, as viewed by a reader, as on the top of thepage 800, and thus, avoid a Z fighting problem. Moreover, because the3D polyline segments 880 are not part of the page texture, the3D polyline segments 880 can be efficiently rendered and displayed by related art graphic cards. As such, the3D polyline segments 880 of the hybrid technique are independent of the page's texture resolution, and thus enable frame rate speed to increase. - Once a specific stroke has been completed, a reader may move to specify the next stroke. During the intermittent time between strokes, the
3D polyline segments 880 can be removed and the page texture layer updated and applied to the page geometry. The page texture can be updated with the texture coloring technique. - For example,
FIG. 9 illustrates a snapshot taken as a user annotates a document page with the words "PARC TAP". As illustrated in FIG. 9, the letters "P" and "A", as well as the first stroke of the letter "R", were rendered and displayed with the texture coloring technique. The second stroke of the letter "R", indicated by the darker color, is represented as 3D polyline segments.
- While the exemplary embodiment illustrated in
FIG. 9 indicates a free-form annotation, the hybrid technique can be used to support highlighting annotations as well. For example, when a highlighted area is specified, a transparent 3D polygon may be created instead of a 3D polyline. The vertices of the polygon may be computed according to where on the computer screen the annotation input events occur. The vertices are preferably placed slightly behind the near clipping plane of the camera frustum to ensure that the transparent polygon is displayed on the top of the page and covers the area that the reader intends to highlight. Alternatively, instead of a transparent polygon, a reader may indicate and a system may render only the edges of the polygon. In contrast to the embodiment of a transparent polygon, if only the perimeter edges of the polygon are rendered the text under the highlighted area is unaffected and visible throughout the specifying stage. At the end of the highlight specifying, the geometric representation corresponding to the polygon is removed and the texture is updated to display the highlight with the texture coloring technique. -
FIG. 12 illustrates an annotation captured using the same desktop as that used to capture the annotation illustrated in FIG. 8. FIG. 12 illustrates an annotation 910 of "PARC TAP" on electronic page 900. The annotation was obtained using the hybrid annotation technique. Because the hybrid technique displays a part of the annotation slightly behind the near clipping plane and other parts of the annotation using the texture coloring technique, the annotation technique achieves quicker frame rates than the frame rates obtained using the texture coloring technique. Thus, unlike in FIG. 8, where the user deliberately slowed pen movements down to allow the system to process the pen events, a user does not have to deliberately slow down pen movements when annotating a document page. - However, as illustrated in
FIG. 9, strokes displayed with the texture coloring technique and strokes represented as 3D polyline segments differ slightly in visual appearance. For example, in FIG. 9 the strokes displayed using 3D polyline segments are shown in a darker color. The appearance of parts of the annotation can be modified, for example to match other parts of the annotations, by adjusting the width and/or transparency of the line segments.
-
FIG. 13 is a flow chart illustrating an outline of an exemplary embodiment of a method for annotating three-dimensional documents using a hybrid technique. As shown in FIG. 13, operation of the method begins in step S200, and continues to step S205, where a three-dimensional document is obtained. Then, in step S210, a page of a three-dimensional document to be annotated is turned to and displayed. It should be appreciated that a user may use a drop-down menu, a mouse button, keyboard, and/or other input device to select a page to be turned to.
- Then, in step S220, a raycasting technique is applied to create points on the page annotation layer as well as points corresponding to the annotation behind the near clipping plane of the camera frustum as the stroke is created. The points located behind the near clipping plane are connected to form 3D polyline segments. The 3D polyline segments are displayed over the page. The annotation data relating to the specified page area is correlated with the page by mapping the boundaries of the annotation data from the screen coordinate system to the local coordinate system of the page and the texture coordinate system. The correlated annotation data is then stored with other data (e.g., the page texture) of the page to be annotated.
- Next, in step S240, a decision is made as to whether the stroke has been completed. If so, the operation continues to step S245. Otherwise, the operation returns to step S220, and steps S220 through S240 are repeated until the stroke has been completed.
- After a stroke has been completed, operation continues to step S245, where the page texture is updated with the annotation data created in step S220, and the 3D polyline segments formed on the 3D polyline layer may be removed. - Next, in step S250, the updated page texture is displayed. Thus, once a stroke of an annotation has been completed and the page texture updated, the completed stroke is shown as part of the page texture, while subsequent strokes can again be displayed as 3D polyline segments.
- Next, in step S255, it is determined whether annotation of the current page is complete. If so, operation continues to step S260. If not, operation returns to step S215.
- In step S260, it is determined whether the annotation of the current document is completed. If so, operation continues to step S265, where the operation ends. If not, operation returns to step S210.
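Read as code, steps S205 through S265 amount to three nested iterations: per page, per stroke, and per pen event within a stroke. The sketch below is a control-flow outline only; every callable it takes (select_page, get_pen_events, map_pen_event, page_done, document_done) is a placeholder supplied by the surrounding application, and the dictionary fields on the page object are assumptions.

```python
def annotate_document_hybrid(document, select_page, get_pen_events,
                             map_pen_event, page_done, document_done):
    """Control flow of FIG. 13 (steps S205 through S265) as nested loops.
    `select_page` turns to and returns a page (S210); `get_pen_events` yields
    (screen_xy, pen_down) samples for one stroke (S215-S240); `map_pen_event`
    plays the role of the raycasting helper sketched earlier (S220); and the
    two predicates answer steps S255 and S260."""
    while True:
        page = select_page(document)                       # S210: turn to the page
        while True:
            live_polyline, stroke_uv = [], []
            for screen_xy, pen_down in get_pen_events():   # S215/S220: points of one stroke
                uv, near_point = map_pen_event(screen_xy, page)
                stroke_uv.append(uv)                       # annotation data, texture coords
                live_polyline.append(near_point)           # displayed as 3D polyline segments
                if not pen_down:                           # S240: stroke completed?
                    break
            page["strokes"].append(stroke_uv)              # S245: fold stroke into page data
            live_polyline.clear()                          # remove the temporary 3D segments
            page["texture_dirty"] = True                   # S250: display updated texture
            if page_done(page):                            # S255: page annotation finished?
                break
        if document_done(document):                        # S260: document finished?
            return                                         # S265: end
```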
-
FIG. 14 is a functional block diagram outlining an exemplary embodiment of an annotation control system 1000 according to this invention. As shown in FIG. 14, the annotation control system 1000 includes an input/output interface 1110, a controller 1120, a memory 1130, a document identifying circuit, routine or application 1140, a page area specifying circuit, routine or application 1150, and an annotation displaying circuit, routine or application 1160, each appropriately interconnected by one or more control and/or data buses. The input/output interface 1110 is linked to a document data source 1200 by a link 1210, and to a display device 1300 by a link 1310. Further, the input/output interface 1110 is linked to one or more user input devices 1400 by one or more links 1410. - Each of the
links 1210, 1310 and 1410 can be any known or later-developed device or system for connecting the document data source 1200, the display device 1300 and the user input devices 1400, respectively, to the annotation control system 1000. It should also be understood that the links 1210, 1310 and 1410 can be wired or wireless links. - The
memory 1130 can be implemented using any appropriate combination of alterable, volatile or non-volatile memory, or non-alterable, or fixed, memory. The alterable memory, whether volatile or non-volatile, can be implemented using any one or more of static or dynamic RAM, a floppy disk and disk drive, a writable or rewritable optical disk and disk drive, a hard drive, flash memory, or the like. Similarly, the non-alterable or fixed memory can be implemented using any one or more of ROM, PROM, EPROM, EEPROM, or an optical ROM disk, such as a CD-ROM or DVD-ROM disk, and disk drive, or the like. - The input/
output interface 1110 is connected to the user input devices 1400 over a link 1410. The user input devices 1400 can be one or more of a touch pad, a touch screen, a track ball, a mouse, a keyboard, a stylus, an electronic pen, or any known or later-developed user input devices 1400 for inputting data and/or control signals to the annotation control system for annotating pages of the three-dimensional electronic document. - Furthermore, the input/
output interface 1110 is connected to display device 1300 over link 1310. In general, the display device 1300 can be any device that is capable of outputting, i.e., displaying, a rendered image of the three-dimensional electronic document. - The document identifying circuit, routine or
application 1140 receives a user input, inputs a three-dimensional electronic document to be annotated, and identifies the three-dimensional document page to be annotated. Then, the page area specifying circuit, routine or application 1150 receives a user input, inputs the three-dimensional electronic document, inputs the identified three-dimensional page to be annotated, and specifies a portion of the page as the page area to be annotated. Finally, the annotation displaying circuit, routine or application 1160 inputs the three-dimensional electronic document, inputs the identified three-dimensional page to be annotated, inputs the specified page area to be annotated, and displays the annotated page using an annotation display technique.
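The three circuits/routines just described form a short pipeline driven by the controller. The sketch below models them as callables wired into one object; the class name, the dictionary-based memory, and the specific signatures are assumptions made for illustration, not the structure of the patented system.

```python
from typing import Any, Callable, Dict

Document = Dict[str, Any]        # representation of a 3D electronic document is assumed
PageArea = Dict[str, Any]        # e.g. boundary in page-local and texture coordinates


class AnnotationControlSystem:
    """Minimal analogue of FIG. 14: three stages wired together by a controller."""

    def __init__(self,
                 identify_document: Callable[[Any], Document],
                 specify_page_area: Callable[[Document, int, Any], PageArea],
                 display_annotation: Callable[[Document, int, PageArea], None]) -> None:
        self.identify_document = identify_document    # document identifying routine (1140)
        self.specify_page_area = specify_page_area    # page area specifying routine (1150)
        self.display_annotation = display_annotation  # annotation displaying routine (1160)
        self.memory: Dict[str, Any] = {}              # stands in for memory 1130

    def handle_user_input(self, user_input: Any, page_index: int) -> None:
        """Controller role (1120): route data between stages and keep intermediate results."""
        document = self.identify_document(user_input)
        self.memory["document"] = document
        area = self.specify_page_area(document, page_index, user_input)  # incl. coordinate mapping
        self.memory["page_area"] = area
        self.display_annotation(document, page_index, area)              # selected display technique
```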
- An exemplary embodiment of an annotation control system 1000 for annotating pages of a three-dimensional electronic document according to FIGS. 6 and 13 operates in the following manner. - In operation, a user input is output from the
user input devices 1400 over link 1410 to the input/output interface 1110 of the annotation control system 1000. The user input information is then stored in the memory 1130 under control of the controller 1120. Next, the three-dimensional document is output from the document data source 1200 over link 1210 to the input/output interface 1110 in accordance with the user input. The three-dimensional electronic document is then input into the document identifying circuit, routine or application 1140 under the control of the controller 1120. - The document identifying circuit, routine or
application 1140 identifies the three-dimensional electronic document to be annotated based on the user input, and the controller 1120 stores the identified three-dimensional electronic document in the memory 1130. - The page area specifying circuit, routine or
application 1150 allows the user to specify a portion of the three-dimensional document area to be annotated on the display device 1300. Additionally, the page area specifying circuit, routine or application 1150 maps the boundary of the page area from the screen coordinate system to the local coordinate system of the page. The page area specifying circuit, routine or application 1150 also maps the boundary of the page area from the local coordinate system of the page to the texture coordinate system of the page when required (e.g., when the vertex coloring technique, texture coloring display technique, or hybrid technique is used to display the annotation). Further, the page area specifying circuit, routine or application 1150 may modify color or ink density of the annotation instrument based on user input. Finally, the page area specifying circuit, routine or application 1150 stores the annotation data in memory 1130 and correlates the annotation data with other data of the page via the controller 1120. - The annotation displaying circuit, routine or
application 1160 displays the annotation in the specified page area according to the display technique selected by the user. Then, the annotation displaying circuit, routine or application 1160 superimposes 3D polyline segments or a transparent polygon over the page area to be annotated. Alternatively and/or subsequently, the annotation displaying circuit, routine or application 1160 re-evaluates and colors the vertices within the page area to be annotated, or modifies texture pixels within the page area to be annotated, based on the selected display technique. - In exemplary embodiments of the systems and methods for annotating three-dimensional electronic documents, it should be appreciated that the systems and methods can be applied to a flat page surface and/or a curved page surface. However, when the systems and methods are applied to a curved page surface, the annotation must also deform along with the page area to be annotated.
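Because annotation points are stored in the page's own (u, v) parameterization, making an annotation follow a curved page reduces to re-evaluating the page surface at those parameters whenever the page deforms. The sketch below assumes the curved page is available as a parametric surface function; the cylindrical bend and the small normal offset used to avoid z-fighting are illustrative choices, not the patent's method.

```python
import numpy as np


def bent_page_surface(u: float, v: float, width: float, height: float, bend_radius: float) -> np.ndarray:
    """Example parametric page: a sheet bent around a vertical cylinder.
    (u, v) in [0, 1] are the same texture coordinates used to store annotations."""
    angle = (u * width) / bend_radius
    x = bend_radius * np.sin(angle)
    z = bend_radius * (1.0 - np.cos(angle))
    y = v * height
    return np.array([x, y, z])


def deform_annotation(stroke_uv, surface, lift: float = 0.001):
    """Re-evaluate each stored (u, v) annotation point on the current page surface,
    so the stroke bends exactly as the page does; `lift` nudges the points slightly
    off the surface along an approximate normal to avoid z-fighting."""
    world_points = []
    for u, v in stroke_uv:
        p = surface(u, v)
        # approximate the surface normal with finite differences
        du = surface(min(u + 1e-3, 1.0), v) - p
        dv = surface(u, min(v + 1e-3, 1.0)) - p
        n = np.cross(du, dv)
        norm = np.linalg.norm(n)
        if norm > 0:
            p = p + lift * n / norm
        world_points.append(p)
    return world_points


def example_page(u: float, v: float) -> np.ndarray:
    # an A4-sized page (0.21 m x 0.28 m) bent to a 0.3 m radius, purely for demonstration
    return bent_page_surface(u, v, width=0.21, height=0.28, bend_radius=0.3)


polyline = deform_annotation([(0.20, 0.50), (0.25, 0.52), (0.30, 0.50)], example_page)
```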
- While this invention has been described in conjunction with the exemplary embodiments outlined above, these embodiments are intended to be illustrative, not limiting. Various changes, substitutions, improvements or the like may be made without departing from the spirit and scope of the invention.
Claims (15)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/505,262 US20100011281A1 (en) | 2004-12-16 | 2009-07-17 | Systems and mehtods for annotating pages of a 3d electronic document |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/012,902 US7577902B2 (en) | 2004-12-16 | 2004-12-16 | Systems and methods for annotating pages of a 3D electronic document |
US12/505,262 US20100011281A1 (en) | 2004-12-16 | 2009-07-17 | Systems and mehtods for annotating pages of a 3d electronic document |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/012,902 Division US7577902B2 (en) | 2004-12-16 | 2004-12-16 | Systems and methods for annotating pages of a 3D electronic document |
Publications (1)
Publication Number | Publication Date |
---|---|
US20100011281A1 true US20100011281A1 (en) | 2010-01-14 |
Family
ID=36218131
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/012,902 Expired - Fee Related US7577902B2 (en) | 2004-12-16 | 2004-12-16 | Systems and methods for annotating pages of a 3D electronic document |
US12/505,262 Abandoned US20100011281A1 (en) | 2004-12-16 | 2009-07-17 | Systems and mehtods for annotating pages of a 3d electronic document |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/012,902 Expired - Fee Related US7577902B2 (en) | 2004-12-16 | 2004-12-16 | Systems and methods for annotating pages of a 3D electronic document |
Country Status (3)
Country | Link |
---|---|
US (2) | US7577902B2 (en) |
EP (1) | EP1672529B1 (en) |
JP (1) | JP2006172460A (en) |
Families Citing this family (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9104358B2 (en) | 2004-12-01 | 2015-08-11 | Xerox Corporation | System and method for document production visualization |
JP3865141B2 (en) * | 2005-06-15 | 2007-01-10 | 任天堂株式会社 | Information processing program and information processing apparatus |
US7716574B2 (en) * | 2005-09-09 | 2010-05-11 | Microsoft Corporation | Methods and systems for providing direct style sheet editing |
US7877460B1 (en) | 2005-09-16 | 2011-01-25 | Sequoia International Limited | Methods and systems for facilitating the distribution, sharing, and commentary of electronically published materials |
CA2652986A1 (en) * | 2006-05-19 | 2007-11-29 | Sciencemedia Inc. | Interactive learning and assessment platform |
US8640023B1 (en) * | 2006-07-10 | 2014-01-28 | Oracle America, Inc. | Method and system for providing HTML page annotations using AJAX and JAVA enterprise edition |
JP4770614B2 (en) * | 2006-07-11 | 2011-09-14 | 株式会社日立製作所 | Document management system and document management method |
JP4861105B2 (en) * | 2006-09-15 | 2012-01-25 | 株式会社エヌ・ティ・ティ・ドコモ | Spatial bulletin board system |
US7559017B2 (en) * | 2006-12-22 | 2009-07-07 | Google Inc. | Annotation framework for video |
US8276060B2 (en) * | 2007-02-16 | 2012-09-25 | Palo Alto Research Center Incorporated | System and method for annotating documents using a viewer |
WO2008150919A1 (en) * | 2007-05-29 | 2008-12-11 | Livescribe, Inc. | Electronic annotation of documents with preexisting content |
US20090144654A1 (en) * | 2007-10-03 | 2009-06-04 | Robert Brouwer | Methods and apparatus for facilitating content consumption |
US8243067B2 (en) * | 2008-02-21 | 2012-08-14 | Sap America, Inc | PMI data visualization |
US8510646B1 (en) * | 2008-07-01 | 2013-08-13 | Google Inc. | Method and system for contextually placed chat-like annotations |
US8359302B2 (en) * | 2008-07-02 | 2013-01-22 | Adobe Systems Incorporated | Systems and methods for providing hi-fidelity contextual search results |
GB2462589B (en) * | 2008-08-04 | 2013-02-20 | Sony Comp Entertainment Europe | Apparatus and method of viewing electronic documents |
GB2462612A (en) | 2008-08-12 | 2010-02-17 | Tag Learning Ltd | A method of facilitating assessment of a coursework answer |
US9189147B2 (en) * | 2010-06-22 | 2015-11-17 | Microsoft Technology Licensing, Llc | Ink lag compensation techniques |
US20120159329A1 (en) * | 2010-12-16 | 2012-06-21 | Yahoo! Inc. | System for creating anchors for media content |
US8984402B2 (en) | 2010-12-20 | 2015-03-17 | Xerox Corporation | Visual indication of document size in a virtual rendering |
US8902220B2 (en) | 2010-12-27 | 2014-12-02 | Xerox Corporation | System architecture for virtual rendering of a print production piece |
US20120268772A1 (en) * | 2011-04-22 | 2012-10-25 | Xerox Corporation | Systems and methods for visually previewing finished printed document or package |
US8773428B2 (en) | 2011-06-08 | 2014-07-08 | Robert John Rolleston | Systems and methods for visually previewing variable information 3-D structural documents or packages |
US9105116B2 (en) | 2011-09-22 | 2015-08-11 | Xerox Corporation | System and method employing variable size binding elements in virtual rendering of a print production piece |
US9836868B2 (en) | 2011-09-22 | 2017-12-05 | Xerox Corporation | System and method employing segmented models of binding elements in virtual rendering of a print production piece |
WO2013152417A1 (en) * | 2012-04-11 | 2013-10-17 | Research In Motion Limited | Systems and methods for searching for analog notations and annotations |
JP2014174672A (en) * | 2013-03-07 | 2014-09-22 | Toshiba Corp | Information processor and information processing program |
KR102266869B1 (en) * | 2014-04-03 | 2021-06-21 | 삼성전자주식회사 | Electronic apparatus and dispalying method thereof |
WO2016133784A1 (en) * | 2015-02-16 | 2016-08-25 | Dimensions And Shapes, Llc | Methods and systems for interactive three-dimensional electronic book |
CN109542382B (en) * | 2017-12-26 | 2020-07-28 | 掌阅科技股份有限公司 | Display method of handwriting input content, electronic equipment and computer storage medium |
US11442591B2 (en) * | 2018-04-09 | 2022-09-13 | Lockheed Martin Corporation | System, method, computer readable medium, and viewer-interface for prioritized selection of mutually occluding objects in a virtual environment |
US11080939B1 (en) * | 2020-10-20 | 2021-08-03 | Charter Communications Operating, Llc | Generating test cases for augmented reality (AR) application testing |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS61267095A (en) * | 1985-05-22 | 1986-11-26 | 株式会社日立製作所 | Display indication system |
JP3387183B2 (en) * | 1993-12-24 | 2003-03-17 | 株式会社日立製作所 | Image display method and apparatus |
JPH07319899A (en) * | 1994-05-23 | 1995-12-08 | Hitachi Ltd | Controller for turning and displaying of page |
WO1997022109A1 (en) | 1995-12-14 | 1997-06-19 | Motorola Inc. | Electronic book and method of annotation therefor |
JPH1078963A (en) | 1996-08-01 | 1998-03-24 | Hewlett Packard Co <Hp> | Document comment method |
JP3443255B2 (en) * | 1996-10-18 | 2003-09-02 | 富士ゼロックス株式会社 | Electronic document management apparatus and method |
JPH1186017A (en) * | 1997-09-11 | 1999-03-30 | Canon Inc | Apparatus and method for information processing |
JP4309997B2 (en) * | 1998-06-17 | 2009-08-05 | ゼロックス コーポレイション | Annotation display method |
US7337389B1 (en) | 1999-12-07 | 2008-02-26 | Microsoft Corporation | System and method for annotating an electronic document independently of its content |
WO2002001339A1 (en) | 2000-06-26 | 2002-01-03 | Microsoft Corporation | Ink color rendering for electronic annotations |
JP2002157275A (en) * | 2000-11-22 | 2002-05-31 | Fuji Photo Film Co Ltd | Picture display device and storage medium |
JP3738720B2 (en) * | 2001-09-27 | 2006-01-25 | ヤマハ株式会社 | Information processing apparatus, control method therefor, control program, and recording medium |
- 2004-12-16: US application 11/012,902 filed (US 7,577,902 B2), status: Expired - Fee Related
- 2005-12-12: JP application 2005-357136 filed (JP 2006-172460 A), status: Pending
- 2005-12-13: EP application 05112051 filed (EP 1 672 529 B1), status: Active
- 2009-07-17: US application 12/505,262 filed (US 2010/0011281 A1), status: Abandoned
Patent Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5463725A (en) * | 1992-12-31 | 1995-10-31 | International Business Machines Corp. | Data processing system graphical user interface which emulates printed material |
US5900876A (en) * | 1995-04-14 | 1999-05-04 | Canon Kabushiki Kaisha | Information processing apparatus and method with display book page turning |
US6340980B1 (en) * | 1996-08-26 | 2002-01-22 | E-Book Systems Pte Ltd | Computer user interface system and method having book image features |
US5923324A (en) * | 1997-04-04 | 1999-07-13 | International Business Machines Corporation | Viewer interactive three-dimensional workspace with interactive three-dimensional objects and corresponding two-dimensional images of objects in an interactive two-dimensional workplane |
US20020031756A1 (en) * | 2000-04-12 | 2002-03-14 | Alex Holtz | Interactive tutorial method, system, and computer program product for real time media production |
US20020035697A1 (en) * | 2000-06-30 | 2002-03-21 | Mccurdy Kevin | Systems and methods for distributing and viewing electronic documents |
US20020113823A1 (en) * | 2000-12-21 | 2002-08-22 | Card Stuart Kent | Navigation methods, systems, and computer program products for virtual three-dimensional books |
US20030013073A1 (en) * | 2001-04-09 | 2003-01-16 | International Business Machines Corporation | Electronic book with multimode I/O |
US20020176636A1 (en) * | 2001-05-22 | 2002-11-28 | Yoav Shefi | Method and system for displaying visual content in a virtual three-dimensional space |
US20040205545A1 (en) * | 2002-04-10 | 2004-10-14 | Bargeron David M. | Common annotation framework |
US20040164975A1 (en) * | 2002-09-13 | 2004-08-26 | E-Book Systems Pte Ltd | Method, system, apparatus, and computer program product for controlling and browsing a virtual book |
US20050151742A1 (en) * | 2003-12-19 | 2005-07-14 | Palo Alto Research Center, Incorporated | Systems and method for turning pages in a three-dimensional electronic document |
US7148905B2 (en) * | 2003-12-19 | 2006-12-12 | Palo Alto Research Center Incorporated | Systems and method for annotating pages in a three-dimensional electronic document |
US20050289452A1 (en) * | 2004-06-24 | 2005-12-29 | Avaya Technology Corp. | Architecture for ink annotations on web documents |
Non-Patent Citations (2)
Title |
---|
Card et al., "A 3D Electronic Smart Book", May 25-28, 2004, PARC, pp. 303-307 *
Steele, Heidi, "Easy Microsoft® Office Word 2003", Que, 9/19/2003, 3 pages *
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080229180A1 (en) * | 2007-03-16 | 2008-09-18 | Chicago Winter Company Llc | System and method of providing a two-part graphic design and interactive document application |
US8161369B2 (en) | 2007-03-16 | 2012-04-17 | Branchfire, Llc | System and method of providing a two-part graphic design and interactive document application |
US9275021B2 (en) | 2007-03-16 | 2016-03-01 | Branchfire, Llc | System and method for providing a two-part graphic design and interactive document application |
US20110261083A1 (en) * | 2010-04-27 | 2011-10-27 | Microsoft Corporation | Grasp simulation of a virtual object |
US8576253B2 (en) * | 2010-04-27 | 2013-11-05 | Microsoft Corporation | Grasp simulation of a virtual object |
US20120084647A1 (en) * | 2010-10-04 | 2012-04-05 | Fuminori Homma | Information processing apparatus, information processing method, and program |
US20120084646A1 (en) * | 2010-10-04 | 2012-04-05 | Fuminori Homma | Information processing apparatus, information processing method, and program |
US8516368B2 (en) * | 2010-10-04 | 2013-08-20 | Sony Corporation | Information processing apparatus, information processing method, and program |
US9430139B2 (en) * | 2010-10-04 | 2016-08-30 | Sony Corporation | Information processing apparatus, information processing method, and program |
Also Published As
Publication number | Publication date |
---|---|
EP1672529B1 (en) | 2012-12-05 |
EP1672529A3 (en) | 2007-04-04 |
EP1672529A2 (en) | 2006-06-21 |
US20060136813A1 (en) | 2006-06-22 |
US7577902B2 (en) | 2009-08-18 |
JP2006172460A (en) | 2006-06-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7577902B2 (en) | Systems and methods for annotating pages of a 3D electronic document | |
US7148905B2 (en) | Systems and method for annotating pages in a three-dimensional electronic document | |
Blain | The complete guide to Blender graphics: computer modeling & animation | |
US7171630B2 (en) | Electronic simulation of interaction with printed matter | |
US7667703B2 (en) | Systems and method for turning pages in a three-dimensional electronic document | |
Fekete et al. | TicTacToon: A paperless system for professional 2D animation | |
EP1672474B1 (en) | Systems and methods for turning pages in a three-dimensional electronic document | |
US5729704A (en) | User-directed method for operating on an object-based model data structure through a second contextual image | |
US8005316B1 (en) | System and method for editing image data for media repurposing | |
US20140129990A1 (en) | Interactive input system having a 3d input space | |
Tolba et al. | Sketching with projective 2D strokes | |
DK2828831T3 (en) | Point and click lighting for image-based lighting surfaces | |
MXPA05007072A (en) | Common charting using shapes. | |
US20090100374A1 (en) | Method and system for referenced region manipulation | |
US7154511B2 (en) | Fast rendering of ink | |
CN109461215A (en) | Generation method, device, computer equipment and the storage medium of role's artistic illustration | |
Hong et al. | Annotating 3D electronic books | |
Roberts et al. | 3d visualisations should not be displayed alone-encouraging a need for multivocality in visualisation | |
JP3002972B2 (en) | 3D image processing device | |
US20150363960A1 (en) | Timeline tool for producing computer-generated animations | |
CN102592261A (en) | Vector diagram showing method and system | |
Krebs | Basics cad | |
Dowhal | A seven-dimensional approach to graphics | |
Zhang | Colouring the sculpture through corresponding area from 2D to 3D with augmented reality | |
Loviscach | A real-time production tool for animated hand sketches |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: BLUESTONE INNOVATIONS LLC, VIRGINIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PALO ALTO RESEARCH CENTER INCORPORATED;REEL/FRAME:029785/0623 Effective date: 20120622 Owner name: ETOME INNOVATIONS LLC, VIRGINIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BLUESTONE INNOVATIONS LLC;REEL/FRAME:029785/0627 Effective date: 20130125 |
|
AS | Assignment |
Owner name: AMAZON TECHNOLOGIES, INC., NEVADA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ETOME INNOVATIONS LLC;REEL/FRAME:029987/0932 Effective date: 20130219 |
|
AS | Assignment |
Owner name: JB PARTNERS, LLC, VIRGINIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BLUESTONE INNOVATIONS, LLC;REEL/FRAME:031841/0346 Effective date: 20131218 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |