WO2018085929A1 - Method and system for erasing an enclosed area on an interactive display device - Google Patents

Method and system for erasing an enclosed area on an interactive display device

Info

Publication number
WO2018085929A1
Authority
WO
WIPO (PCT)
Prior art keywords
stroke
stroke data
object group
eraser
digitizer
Prior art date
Application number
PCT/CA2017/051330
Other languages
English (en)
Inventor
Gorden Dean Elhard
Cheng Xu
Michael Howatt Mabey
Alfonso Fabian De La Fuente
Original Assignee
Quirklogic, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US15/347,481 (US20180046345A1)
Application filed by Quirklogic, Inc.
Publication of WO2018085929A1


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • the invention in general, in one aspect, relates to a method for updating an interactive display.
  • the method includes detecting, by the interactive display, an eraser stroke input from a digitizer, converting the eraser stroke input into eraser stroke data, extrapolating an enclosed area based on the eraser stroke data, identifying at least one object group using the enclosed area, and updating the interactive display using the at least one object group.
  • the invention in general, in one aspect, relates to a non-transitory computer readable medium (CRM) comprising instructions, which when executed by a processor, performs a method.
  • the method includes detecting, by an interactive display, an eraser stroke input from a digitizer, converting the eraser stroke input into eraser stroke data, extrapolating an enclosed area based on the eraser stroke data, identifying at least one object group using the enclosed area, and updating the interactive display using the at least one object group.
  • the invention in general, in one aspect, relates to a method for updating an interactive display.
  • the method includes detecting, by the interactive display, an eraser stroke input from a digitizer, converting the eraser stroke input into eraser stroke data, extrapolating an enclosed area based on the eraser stroke data, identifying at least one stroke using the enclosed area, and updating the interactive display using the at least one stroke.
  • FIG. 1 A shows a system in accordance with one or more embodiments of the invention.
  • FIG. 1B shows an apparatus in accordance with one or more embodiments of the invention.
  • FIGS. 2A-2C show varying perspectives of a digitizer for operating an electronic flipchart in accordance with one or more embodiments of the invention.
  • FIGS. 3A-3C show varying perspectives of a digitizer for operating an electronic flipchart in accordance with one or more embodiments of the invention.
  • FIG. 4 shows relationships in accordance with one or more embodiments of the invention.
  • FIG. 5 shows a flowchart describing a method for initializing a system in accordance with one or more embodiments of the invention.
  • FIG. 6 shows a flowchart describing a method for grouping stroke data in accordance with one or more embodiments of the invention.
  • FIG. 7 shows a flowchart describing a method for manipulating a grouping of stroke data in accordance with one or more embodiments of the invention.
  • FIGS. 8A-8E show examples of one or more of the method steps described in FIGS. 6-7 in accordance with one or more embodiments of the invention.
  • FIG. 9A shows a flowchart describing a method for updating an interactive display in accordance with one or more embodiments of the invention.
  • FIG. 9B shows a flowchart describing a method for identifying object groups relating to a set of stroke data in accordance with one or more embodiments of the invention.
  • FIG. 9C shows a flowchart describing a method for removing stroke inputs from an interactive display in accordance with one or more embodiments of the invention.
  • FIGS. 10A-10J show examples of one or more of the method steps described in FIGS. 9A-9C in accordance with one or more embodiments of the invention.
  • embodiments of the invention relate to a method and system for updating an interactive display on an interactive device. More specifically, embodiments of the invention are directed to the removal of identified stroke inputs from the interactive display. Each of the identified stroke inputs corresponds to stroke data that at least partially overlaps an enclosed area specified by a user via manipulation of a digitizer in erase mode.
  • FIG. 1 A shows a system in accordance with one or more embodiments of the invention.
  • the system includes an interactive device (102).
  • Each component of the interactive device (102) is described below.
  • the interactive device (102) is any physical system with an interactive display (104), a processor (106), local persistent storage (108), and volatile memory (110). Further, the interactive device (102) may be operatively connected to cloud (or remote) storage (112) in a cloud computing environment. In one or more embodiments of the invention, the interactive device (102) may be any interactive device capable of receiving input, such as a reflective display device, an interactive whiteboard, an electronic tablet, or any other suitable device. For example, the interactive device (102) may be an e-flipchart apparatus as described in FIG. 1B. The interactive device (102) includes functionality to receive at least one stroke input (not shown) on the interactive display (104).
  • the interactive device (102) also includes functionality to process, using the processor (106), the stroke input (described below) as stroke data (described below). Furthermore, the interactive device (102) is configured to categorize the stroke data based on an object type and to create object groups, using a proximity threshold and a time threshold, as further discussed below, and in accordance with the embodiments shown in FIG. 6. Additionally, the interactive device (102) is configured to store the stroke data in the volatile memory (110), local persistent storage (108), and/or cloud storage (112) associated with the interactive device (102).
  • the interactive display (104) includes a display screen. The display screen may be a reflective Liquid Crystal Display (LCD), a bi-stable or electrophoretic display (e.g., electronic paper and/or electronic ink displays), an electrochromic display, an electro-wetting or electro-fluidic display, an interferometric modulated display (e.g., a technology that creates color via the interference of reflected light), or an electromechanical modulated display (e.g., a video projector, a flap display, a flip disk display, a digital micro-mirror device (DMD), an interferometric modulator display (IMOD), a uni-pixel display (FTIR), or a telescopic pixel display).
  • the interactive display (104) includes at least a touch-sensitive portion that is capable of receiving and displaying stroke input.
  • the stroke input, displayed by the interactive display (104) may be any digital pixel or marking made by touch input on the touch-sensitive portion of the interactive display (104), or by input on the interactive display (104) via a digital marker.
  • the stroke input may be a dot, a line, a letter, a drawing, a word, or a series of words made on the interactive display (104) using a digital marker, stylus pen, or user touch input.
  • the stroke input is processed into stroke data, by the processor (106) and stored on the interactive device (102).
  • the stroke data may be initially stored in the volatile memory (110), in accordance with the embodiments shown in FIG. 6.
  • Volatile memory (110) may be any volatile memory including, but not limited to, Dynamic Random-Access Memory (DRAM), Synchronous DRAM, SDR SDRAM, and DDR SDRAM.
  • the interactive device (102) may store the stroke data in the local persistent storage (108) of the interactive device (102).
  • Local persistent storage (108) may be, for example, solid state memory, optical storage, magnetic storage, or any other medium capable of persistently storing data.
  • the stroke data may optionally be stored on remote persistent storage or in a cloud computing environment having cloud storage (112), such as a web-based storage environment.
  • the stroke data may include, but is not limited to, location data for the stroke (e.g., the x, y coordinates of the detected locations of the stroke input), optional stroke pressure data for the stroke (e.g., the amount of pressure that was detected at each location point), and stroke characteristics that can be used to render the stroke from the location data and optional pressure data (e.g., stroke line width, stroke type, etc.).
  • the stroke data may include the location of the pixels that are changed as a result of the stroke (e.g., the pixels that make up the line(s) and/or curve(s) that were created as a result of the stroke input).
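  • The following Python sketch is illustrative only and is not part of the original specification; it simply mirrors the stroke data fields described above. The record and field names (StrokeData, points, pressures, changed_pixels) are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class StrokeData:
    """Hypothetical record for the stroke data described above."""
    # (x, y) coordinates of the detected locations of the stroke input
    points: List[Tuple[float, float]]
    # optional pressure detected at each location point
    pressures: Optional[List[float]] = None
    # characteristics used to render the stroke from the location data
    line_width: float = 1.0
    stroke_type: str = "pen"
    # pixels changed on the display as a result of the stroke
    changed_pixels: List[Tuple[int, int]] = field(default_factory=list)
```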
  • the interactive device shown in FIG. 1A may include functionality to implement all or a portion of the methods shown in FIGS. 5-7 and 9A-9C. More specifically, the interactive device may include software instructions in the form of computer readable program code to perform all or a portion of the functionality described in FIGS. 5-7 and 9A-9C.
  • While FIG. 1A shows a configuration of components, system configurations other than those shown in FIG. 1A may be used without departing from the scope of the invention. For example, various components may be combined to create a single component. As another example, the functionality performed by a single component may be performed by two or more components.
  • FIG. 1B shows a schematic diagram of an e-flipchart apparatus (120).
  • While FIG. 1B shows a configuration of components, other configurations may be used without departing from the scope of the invention.
  • various components may be combined to create a single component.
  • the functionality performed by a single component may be performed by two or more components.
  • the apparatus (120) may include one or more hardware elements, each having specific functionality.
  • the main structure of the e-flipchart apparatus is formed between a back panel (142) and a front frame (122).
  • the front frame is a clear, opaque, or translucent material and includes an active area on which content may be displayed.
  • the back panel (142) is a rigid mechanical support structure made of a solid material, for example, plastic or metal.
  • In between the back panel (142) and the front frame (122) is a low-power, reflective display (126).
  • the reflective display (126) may be viewed as an output device that, through reflection, harnesses ambient light in order to present content.
  • the reflective display (126) may host slow refresh rates, monochromatic coloring (e.g., black and white, or gray scale shading), and the presentation of low contrasting definition.
  • the reflective display (126) may have one or more of the following features: (i) very low power consumption; (ii) the readability of content outdoors under sunlight; and (iii) the providing of strain relief on the eyes of a user.
  • fundamental static digital media, such as monochromatic text and still images, may be delegated to a reflective display (126) for presentation.
  • Examples of a reflective display include, but are not limited to, a reflective Liquid Crystal Display (LCD), a bi-stable or electrophoretic display (e.g., electronic paper and/or electronic ink displays), an electrochromic display, an electro-wetting or electro-fluidic display, an interferometric modulated display (e.g., a technology that creates color via the interference of reflected light), and an electromechanical modulated display (e.g., Flap Display, digital micro-mirror device).
  • At least one portion of the reflective display (126) of the e-flipchart apparatus may be bi-stable.
  • the reflective display may correspond to the reflective display described in United States Patent No. 5,930,026. The invention is not limited to the reflective display described in the above referenced patent.
  • a layer having at least one touch portion which may be a transparent rigid or semi-rigid board (124), or a frame that uses edge sensors, such as Infra-red or optical sensing technology.
  • the layer having at least one touch portion (124) may be a capacitive film layer.
  • the layer having at least one touch portion (124) may only cover a portion of the reflective display, with the remaining surface area of the reflective display (126) being covered by non-touch sensitive material which may or may not be clear, opaque, translucent, transparent and/or non-transparent.
  • an optional electromagnetic layer which may be an electromagnetic board (128).
  • touch input may include a finger(s) and/or a touch by a digital marker or digitizer.
  • the electromagnetic layer (128) is configured to generate an electromagnetic field capable of detecting a digital marker or digitizer (see e.g., FIGs. 2A-3C) when such a tool is used to provide an input to the e-flipchart.
  • the electromagnetic layer (128) includes wires (not shown) that allow the electromagnetic layer (128) to transmit and detect input signals.
  • the electromagnetic board (128) is configured to determine a position of the touch input (described above) on the e-flipchart by detecting pressure or changes in the generated electromagnetic field caused by a designated portion of the touch input, for example, by the tip of a digital marker (or digitizer), electronic eraser, and/or pressure applied by one or more fingers.
  • the front frame (122) includes an active area or region with an active display, and an active input method that includes at least two input capabilities: the ability to detect a digital marker or digitizer and the ability to accept touch input from one or more finger touch points.
  • the apparatus (120) is configured to respond to each detected input type. For example, detecting a digital marker input may result in a line being drawn on or erased from the reflective display, while touching the same area with a finger may pan or zoom the display area.
  • controller (132) includes hardware and software/firmware to control the overall operation of the e-flipchart. More specifically, the controller (132) may include one or more processors (CPUs), persistent storage, and/or volatile memory. Persistent storage may include, for example, magnetic storage, optical storage, solid state storage (e.g., NAND Flash, NOR Flash, etc.), or any combination thereof. Volatile memory may include RAM, DRAM, or any combination thereof. In one or more embodiments of the invention, all or a portion of the persistent storage and/or volatile memory may be removable. In one or more embodiments, the persistent storage may include software instructions for executing operations of the e-flipchart.
  • the persistent storage may be configured to store software and/or firmware specific to e-flipchart operations.
  • the built-in CPU/processors of the controller (132) may execute an operating system and the software which implements e-flipchart functionality.
  • the controller (including components therein) (132) is powered by a battery and/or a power supply (130).
  • controller (132) is configured to detect and process input signals. For example, when an object touches the layer having at least one touch portion (124), a signal is sent to the controller (132) for detection of the input type and processing of the input.
  • the controller is configured to store, e.g., in persistent storage and/or volatile memory, each stroke (in the form of touch input or digital marker input) after such an action is performed on the e-flipchart (120).
  • the controller (132) is configured to store each stroke or action as it is produced in the active area of the front frame (122) of the e-flipchart apparatus (120).
  • While the controller (132) has been described as a combination of hardware and software, the controller may be implemented entirely within hardware without departing from the scope of the invention.
  • the e-flipchart may include one or more external communication interfaces (138).
  • the communication interfaces permit the e-flipchart to interface with external components.
  • the communication interfaces may implement any communication protocol, for example, Bluetooth, IEEE 802.11, USB, etc. The invention is not limited to the aforementioned communication protocols.
  • The e-flipchart apparatus shown in FIG. 1B is a low-power reflective device that only draws power from the battery/power supply (130) when there is a screen refresh with new information displayed or when a user is drawing or inputting information in the apparatus.
  • While the apparatus (120) is "always on" and in a mode that is ready to detect an input, the apparatus is in a low power state.
  • the e-flipchart apparatus is configured to change from the low power state to an active state.
  • the e-flipchart apparatus may be deemed to be in an active state when some or all of the components on the e-flipchart apparatus are working: accepting pen, touch, keyboard, and LAN input, processing applications, and/or saving data (and/or metadata) to memory. In the active state, the components of the e-flipchart apparatus are drawing energy from the controller (132). In contrast, the e-flipchart apparatus may be deemed to be in a low power state (or ready-mode) when no pen, touch, keyboard, or LAN inputs are detected (for at least a pre-determined period of time), but the apparatus still shows the last content displayed on it (or displays no content).
  • the controller may include an energy management process configured to control the state of various components of the e-flipchart apparatus based on whether the e-flipchart apparatus is in ready-mode or in the active mode.
  • the polling for input occurs at a low frequency, for example, the apparatus may scan for input 2-10 times per second. However, once an input is detected by the apparatus, the apparatus may transition to an active state and increase polling to a higher frequency, e.g., 60-120 times per second, in order to capture all the input that may be occurring on the reflective display. Other polling frequencies may be used in the active state and/or in the ready-mode without departing from the invention.
  • the term "low power state" is intended to convey that the power consumption of the e-flipchart apparatus in this state is relatively lower (or less) than the power consumption of the e-flipchart apparatus in the active state.
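  • As a rough illustration of the ready-mode/active-state polling behavior described above, consider the Python sketch below. It is not part of the specification; the polling rates, the idle timeout, and the read_input/handle_input callables are all hypothetical stand-ins.

```python
import time

READY_MODE_HZ = 5       # hypothetical low-frequency scan rate (2-10 times per second)
ACTIVE_MODE_HZ = 100    # hypothetical high-frequency scan rate (60-120 times per second)
IDLE_TIMEOUT_S = 30.0   # hypothetical pre-determined period before returning to ready-mode

def polling_loop(read_input, handle_input):
    """Sketch of polling for input at a low frequency in ready-mode and at a
    higher frequency once input has been detected (active state)."""
    active = False
    last_input_time = 0.0
    while True:
        event = read_input()              # pen, touch, keyboard, or LAN input (or None)
        now = time.time()
        if event is not None:
            active = True                 # input detected: transition to the active state
            last_input_time = now
            handle_input(event)
        elif active and now - last_input_time > IDLE_TIMEOUT_S:
            active = False                # no input for a pre-determined period: ready-mode
        rate = ACTIVE_MODE_HZ if active else READY_MODE_HZ
        time.sleep(1.0 / rate)
```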
  • the e-flipchart apparatus may include a camera for detecting certain types of input, e.g., a gesture interpretation.
  • the e-flipchart is configured to enable a user to create, modify, store, and share an e-presentation.
  • The e-flipchart apparatus shown in FIG. 1B is approximately 42 inches in diagonal with a 3:4 aspect ratio.
  • the size of the e-flipchart apparatus is designed to mimic that of a typical paper flipchart; however, the dimensions and size of the reflective display apparatus of FIG. 1B may vary without departing from the scope of the invention.
  • additional dimensions may include 32" 4:3 aspect ratio for a personal sized flip chart, and 55" or 60" for larger collaborative surfaces.
  • Even larger surfaces may vary the aspect ratio to allow for more usable width, without adding unusable height, such as a 9:16 ratio for an 80" diagonal size.
  • While FIG. 1B describes an e-flipchart with a series of components organized in a particular manner, those skilled in the art will appreciate that the location of such various components in the e-flipchart, in particular, the reflective display (126), the layer having at least one touch portion (124), and the optional electromagnetic layer (128), may be arranged in a different order without departing from the invention.
  • FIGs. 2A-2C show the hardware for the digital marker or digitizer that may be used as one type of input capable of being detected by the e-flipchart apparatus described in FIG. 1B above.
  • FIGs. 2A-2C show a top view of the digital marker in the form of a cylinder (210).
  • the top of the digital marker has an electronic eraser (202) and at least one button (206).
  • the button (206) is software programmable and, when pressed or otherwise activated, is configured to send one or more signals to the e-flipchart.
  • the button (206) may send a wireless signal that is detected by the e-flipchart.
  • the button (206) may be hidden or built into the electronic eraser (202).
  • the digital marker may include more than one button, where each button is separately programmable.
  • When the electronic eraser (202) comes into contact with the e-flipchart, the e-flipchart is configured to remove or otherwise clear content from the corresponding locations on the reflective display (e.g., thereby enabling or entering an erase mode). Said another way, the electronic eraser (202) mimics the operation of a traditional eraser.
  • a button (206) on the digitizer may be programmed to enable the erase mode.
  • the corresponding locations on the reflective display from which content may be removed may be designated by eraser stroke inputs performed by a user handling the digitizer.
  • an eraser stroke input may be substantively similar to a stroke input (e.g., a dot, a line, a letter, a drawing, a word, or a series of words made on the interactive display (104) using a digital marker, stylus pen, or user touch input); however, instead of producing content on the reflective display, an eraser stroke input removes content at least partially overlapping the eraser stroke input.
  • the corresponding locations on the reflective display from which content may be removed may be referred to as an enclosed area.
  • FIG. 2B shows a different orientation (i.e., a side view with a zero degree rotation of the cylinder) of the digital marker or digitizer, in which the button (206) is located at the side of the electronic eraser (202) rather than at the bottom of the electronic eraser (202).
  • FIG. 2C shows another top view of the digital marker, in which the cylinder is rotated 90 degrees. In FIG. 2C, the button (not shown) is hidden from view.
  • FIGs. 3A-3C depict side views of the digital marker or digitizer in accordance with one or more embodiments of the invention.
  • the entire length of the cylinder (210) form of the digital marker can be seen.
  • the electronic eraser (202) is shown on a top end of the digital marker.
  • the tip (204) of the digital marker or digitizer is shown on a bottom end of the cylinder (210).
  • the tip (204) material may be selected such that the tip (204) slides easily over the writing surface.
  • Materials for the tip (204) may include, but are not limited to, high density polyoxyethylene, hard felt, elastomer, polyoxymethylene, polyacetal, or polyoxyethylene. Other materials may be used without departing from the invention.
  • the tip (204) of the digital marker may be used to draw or write directly on the active area of the front frame (122) of the e-flipchart apparatus.
  • the button (206) is shown on a side of the cylinder (210) of the digital marker.
  • the tip (204) of the digital marker is shown as being thinner and sharper in comparison with the tip of FIGs. 3A-3B.
  • the tip (204) of the digital marker is a hardware component that may be interchangeable and designed to mimic a pencil, pen, marker, stylus, or any other suitable writing tool having varying widths and sharpness.
  • the button is not shown.
  • While FIGs. 3A-3C show a cylinder shape for the digital marker, those skilled in the art will appreciate that the shape of the digital marker may take other forms without departing from the scope of the invention.
  • FIG. 4 shows relationships in accordance with one or more embodiments of the invention. Specifically, FIG. 4 shows the relationship between stroke data (402 A, 402N) and an object group (404) and the relationship between an object group (404) and an object type (406). Each of these relationships is described below.
  • stroke data (402) is data pertaining to the stroke input (e.g., a dot, a line, a letter, a drawing, a word, or a series of words) made on the interactive display (104), using a digital marker, stylus pen, or user touch input.
  • the object group (404) is a logical grouping of a particular set of stroke data.
  • an object group (404) may include all of the stroke data that make up a letter, a word, a phrase, a sentence, or a paragraph.
  • the stroke data that is associated with an object group (404) may be determined using, for example, time and proximity parameters as described below.
  • FIG. 4 shows an object group (404) that includes stroke data 1 (402A) through stroke data n (402N) ("stroke data 1-n"), which is related to stroke input 1 through stroke input n ("stroke inputs 1-n").
  • stroke inputs 1-n may be any range or group of associated stroke input that the interactive device (102) receives.
  • stroke inputs 1-4 may include four separate strokes (e.g., four parallel lines) or four groups of strokes (e.g., four letters).
  • the object group (404) encompassing stroke data 1-n may include a cumulative aggregation of all of the stroke data (402) of stroke inputs 1-n.
  • the object group (404) encompassing stroke data 1-n may only include a selective aggregation of the stroke data (402A, 402N) of stroke inputs 1-n, based on the continuous classification and re-classification of the strokes as particular object types (406) (discussed below). For example, if a user draws a letter, stroke data 1-n may relate to each and every individual stroke data for the letter. As another example, if a user is in the process of writing a sentence, stroke data 1-n may relate only to the last completed word or letter of the sentence.
  • the granularity of the stroke data in an object group may vary depending on the object group that the user wishes to define.
  • the object group (404) may be associated with an object type (406).
  • the object type (406) may be a letter, a word, a sentence or a paragraph.
  • each individual piece of stroke data (402 A, 402N) may be associated with one or more object groups, where each object group is associated with an object type. For example, consider a scenario in which a user wrote the phrase "Hello World" on an interactive display and that there are three object types: letter, word, and sentence.
  • the stroke data corresponding to the letter "H" is associated with: (i) an object group of object type letter that is associated with all stroke data for letter "H", (ii) an object group of object type word associated with all stroke data corresponding to the word "Hello"; and (iii) an object group of object type sentence associated with all stroke data corresponding to the words "Hello World."
  • the stroke data is associated with a set of nested object groups, where each object group has a different level of granularity.
  • each object group is associated with a state.
  • the object group may be in an "open" state or in a "closed" state.
  • while an object group is in an "open" state, additional stroke data may be associated with the object group.
  • once an object group is in a "closed" state, additional stroke data may not be associated with the object group.
  • each user may only have one open object group per object type at any given time.
  • a given user may only be associated with one open object group for each of the following types at any one time - letter, word, and sentence. Accordingly, if there are multiple users using the interactive device, then each user may have its own open set of object groups.
  • time and proximity thresholds allow the stroke input to be classified into object groups in a manner that is consistent with written language norms.
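  • The following Python sketch illustrates the object group and object type relationships of FIG. 4 and the nested "Hello World" example above. It is not part of the specification; the class and field names (ObjectGroup, object_type, state) are hypothetical, and the StrokeData record from the earlier sketch is assumed.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ObjectGroup:
    """Hypothetical logical grouping of stroke data (see FIG. 4)."""
    object_type: str                            # e.g., "letter", "word", "sentence"
    stroke_data: List["StrokeData"] = field(default_factory=list)
    state: str = "open"                         # "open" or "closed"
    last_stroke_time: float = 0.0               # used by the group's timer

    def add(self, stroke: "StrokeData") -> None:
        if self.state != "open":
            raise ValueError("cannot add stroke data to a closed object group")
        self.stroke_data.append(stroke)

# For the phrase "Hello World", the stroke data for the letter "H" would be
# associated with three nested groups of increasing granularity:
letter_H = ObjectGroup("letter")
word_hello = ObjectGroup("word")
sentence = ObjectGroup("sentence")
nested_groups_for_H = [letter_H, word_hello, sentence]
```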
  • FIGS. 5-7 show flowcharts in accordance with one or more embodiments of the invention. More specifically, FIGS. 5-7 describe methods for categorizing and grouping stroke data in accordance with one or more embodiments of the invention. While the various steps in each flowchart are presented and described sequentially, one of ordinary skill in the art will appreciate that some or all of the steps may be executed in different orders, may be combined or omitted, and/or may be executed in parallel. In one or more embodiments of the invention, one or more steps shown in FIGS. 5-7 may be performed in parallel with one or more other steps shown in FIGS. 5-7. Furthermore, the steps may be performed actively or passively. For example, some steps may be performed using polling or be interrupt driven in accordance with one or more embodiments of the invention.
  • FIG. 5 shows a flowchart for initial system setup in accordance with one or more embodiments of the invention.
  • the interactive device detects the selection of a particular language to be used by the system, such as English, Arabic, or Mandarin Chinese.
  • the language selection may be used to facilitate the classification and grouping of the stroke input into object groups, based on the stroke data and object type(s) of the stroke input.
  • the interactive device obtains certain parameters to be used by the system, based on the language selection. Those parameters may include, for example, various time and distance thresholds corresponding with the general written language conventions of the selected language.
  • the language selection may determine the spacing between characters (e.g., proportional to the average character size of the text written in the board), the direction that the text is read, etc.
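  • As an illustration of the initialization described in FIG. 5, the sketch below shows one way language-dependent parameters might be organized. The dictionary layout, field names, and all numeric values are hypothetical and not taken from the specification.

```python
# Hypothetical language-dependent parameters obtained during initialization (FIG. 5).
LANGUAGE_PARAMETERS = {
    "English": {
        "direction": "left-to-right",
        "letter_proximity_factor": 0.5,   # fraction of average character size
        "word_proximity_factor": 1.5,
        "letter_time_threshold_s": 1.0,
        "word_time_threshold_s": 2.5,
    },
    "Arabic": {
        "direction": "right-to-left",
        "letter_proximity_factor": 0.4,
        "word_proximity_factor": 1.2,
        "letter_time_threshold_s": 1.0,
        "word_time_threshold_s": 2.5,
    },
}

def load_parameters(selected_language: str) -> dict:
    """Return the time and proximity thresholds associated with the selected language."""
    return LANGUAGE_PARAMETERS[selected_language]
```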
  • FIG. 6 shows a flowchart for grouping stroke input in accordance with one or more embodiments of the invention. More specifically, FIG. 6 shows how certain stroke input is grouped into either a new or an existing object group(s).
  • In step 600, stroke input is detected on the interactive display. The stroke input is then converted into stroke data.
  • there may be one or more object groups that are currently in an "open" state, as discussed above.
  • the determination related to whether the stroke input (as defined by the stroke data) is within a proximity threshold may be performed on a per object group basis (if more than one object group is in an "open” state) or may be performed for only one object group of the set of object groups that are currently in an "open” state.
  • the object groups may be associated with a user. In such instances, the determination in step 602 is only performed on object groups associated with the user that provided the stroke input to generate the stroke data in step 600.
  • a proximity threshold is a requisite proximity value, based on a distance of the stroke input (as defined in stroke data) in relation to the distance of existing stroke input (as defined by corresponding stroke data) associated with an object group.
  • an object type is a logical subgrouping of a particular set of stroke input, which may be categorized as a marking, a stroke, a letter, a word, a sentence or a paragraph.
  • An example of multiple proximity thresholds based on object types may include a first requisite proximity value associated with letters, a second requisite proximity value associated with words, a third requisite proximity value associated with sentences, etc. Additionally, one or more proximity thresholds may be defined during the initialization phase based on the selected language, as discussed above and in accordance with the embodiments in FIG. 5. Further, one or more proximity thresholds may be dynamically defined based on certain user-dependent stroke data, such as the average size of the user's handwriting when drawing the strokes.
  • If, in step 602, the stroke input is determined to be within the proximity threshold of the existing object group(s), then step 604 is performed.
  • In step 604, a determination is made as to whether the stroke input (as defined by the stroke data) is within a time threshold of an existing object group(s).
  • the determination of whether stroke data is associated with a given object group may be based solely on the proximity threshold. In such instances, step 604 is not performed.
  • the determination related to whether the stroke input is within a time threshold may be performed on a per object group basis (if more than one object group is in an "open” state) or may be performed for only one object group of the set of object groups that are currently in an "open” state.
  • the object groups may be associated with a user. In such instances, the determination in step 604 is only performed on object groups associated with the user that provided the stroke input to generate the stroke data in step 600.
  • a time threshold is a requisite time value, based on an amount of time elapsed between when the current stroke input was drawn (as defined by stroke data) and when the existing stroke input was drawn, enabling the current stroke input to be grouped into the same object group as the existing stroke input.
  • As with the proximity threshold, there may be multiple time thresholds, which may be based on, among other things, the object type of the stroke input of the existing group or the selected language, as discussed above.
  • one or more time thresholds may be statically defined during the initialization phase, or dynamically defined based on certain user-dependent stroke data, such as the average time it takes a user to create certain stroke inputs.
  • If the stroke input is determined to be within the time threshold of the existing object group(s), after it has also been determined to be within the proximity threshold, then step 606 is performed.
  • In step 606, the stroke data is associated with the existing object group(s) that is currently in an "open" state.
  • An object group may remain in an "open” state as long as the requisite proximity and time thresholds of the object group are met. Otherwise, the object group will be transitioned to a "closed” state.
  • a closed object group may be reopened in accordance with the embodiments of FIG. 7.
  • the stroke data may be associated with multiple object groups. For example, consider a scenario in which the user has written the following on an interactive device "Wor" and then subsequently writes the stroke data corresponding to the letter "1".
  • the stroke data corresponding to letter "1" may be associated with an object group of object type letter (which may be created using step 610, discussed below), an object group of object type word (which also includes stroke data corresponding to the "W", "o", and “r”) and an object group of object type sentence (which also includes stroke data corresponding to "W", "o", and "r”).
  • the timer(s) for the existing open object group(s) is set.
  • the timer(s) for the existing object group is used in determining whether the existing object group should continue to remain open or be closed. To make this determination, the timer(s) takes into consideration the relevant stroke data, object type(s) and time and proximity thresholds associated with the object group. After the timer expires for an existing object group, the object group is closed. If there are multiple open object groups (i.e., object groups with a state of "open"), then there may be a separate timer for each of the object groups. Further, the duration of the timer for each of the object groups may be the same or different. For example, the object group of object type word may have a shorter timer than an object group of object type sentence.
  • Returning to step 602, where a determination is made as to whether the stroke input (as defined by the stroke data) is within a requisite proximity threshold of an existing object group(s): if the stroke input is not within the requisite proximity threshold, then the stroke input is not added to the existing group(s). That is, when the current stroke input is detected too far away from the existing stroke previously made, even if the current stroke input is made within the required time threshold, the current stroke input is not added to the existing open group. Instead, the existing group is closed and the process moves to step 610.
  • In step 602, if there are multiple open object groups, then the determination is performed on a per object group basis and only open object groups for which the proximity threshold is exceeded are closed.
  • In step 610, a new object group(s) is created to accommodate the current stroke input (or, more specifically, the stroke data associated with the current stroke input).
  • Step 610 may include creating one or more new object groups. For example, consider a scenario in which the user has written the following on an interactive device: "Hello" and the current stroke input is a first stroke corresponding to a portion of the letter "W.” Further assume that the distance between the first stroke corresponding to the letter "W" and the letter “o” is: (i) greater than a proximity threshold for an object group of object type letter, (ii) greater than a proximity threshold for an object group of object type word, and (iii) less than a proximity threshold for an object group of object type sentence. In this example, an object group of object type letter and the object group of object type word are closed but the object group of object type sentence remains open. Further, in step 610, a new object of object type letter and a new object of object type word are created.
  • In step 612, the stroke data is associated with the new object group(s).
  • the stroke data is associated with the new object of object type letter, the new object of object type word, and an existing object of object type sentence.
  • In step 614, the timer(s) for the new open object group(s) is set. The new object group(s) remains open as long as the timer does not expire. Further, if there are also existing object groups with which the stroke data is associated, then the timers associated with the existing object groups are also set. Continuing with the example described in step 612, a timer is set for the new object of object type letter, for the new object of object type word, and for an existing object of object type sentence.
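  • The sketch below is one possible rendering of the grouping decision in FIG. 6, reduced to a single proximity/time threshold pair per open group (the specification allows per-object-type and language-dependent thresholds). It reuses the hypothetical StrokeData and ObjectGroup records sketched earlier; the function names and the simple distance measure are assumptions, not the patented method itself.

```python
import time

def distance(stroke, group) -> float:
    """Distance from the new stroke's first point to the last point of the
    group's most recent stroke (a deliberate simplification)."""
    x0, y0 = stroke.points[0]
    x1, y1 = group.stroke_data[-1].points[-1]
    return ((x0 - x1) ** 2 + (y0 - y1) ** 2) ** 0.5

def handle_stroke(stroke, open_groups, proximity_threshold, time_threshold):
    """Sketch of the FIG. 6 flow: add to open groups that pass both thresholds,
    close groups that do not, and create a new group when none match."""
    now = time.time()
    matched = []
    for group in open_groups:
        within_proximity = distance(stroke, group) <= proximity_threshold
        within_time = (now - group.last_stroke_time) <= time_threshold
        if within_proximity and within_time:
            group.add(stroke)              # step 606: associate the stroke data
            group.last_stroke_time = now   # reset the group's timer
            matched.append(group)
        else:
            group.state = "closed"         # threshold exceeded: close the group
    if not matched:
        new_group = ObjectGroup("letter")  # step 610: create a new object group
        new_group.add(stroke)
        new_group.last_stroke_time = now   # step 614: set the new group's timer
        matched.append(new_group)
    return matched
```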
  • FIG. 7 shows a flowchart for manipulating a grouping of stroke data in accordance with one or more embodiments of the invention. More specifically, FIG. 7 shows how a closed object group may be reopened and modified to include new stroke input.
  • a closed object group is an object group that has been finalized and is closed to the inclusion of subsequent stroke input.
  • An object group may remain open as long as the requisite proximity and time thresholds of the object group are met, thereby preventing the expiration of the object group's timer. Upon the expiration of the object group's timer, the object group will be closed.
  • a request to modify a closed object group is detected.
  • the request may be sent as a result of a user selecting the closed object group, using the selection tools of the interactive device. For example, the user may select an area corresponding to the closed object group by using a finger (touch input) or a digital marker to draw a circle around the area.
  • the user can reactivate the closed object group by selecting the appropriate button or prompt on the interactive display.
  • In step 702, the closed object group is reopened in response to the modification request.
  • the closed object group is reopened and all other object groups are closed. The user may then request to modify certain elements of the reactivated group, as further discussed below.
  • a requested modification action may include reformatting the group, removing or modifying stroke input within the object group, or adding new stroke input to the object group.
  • reformatting the object group may include, among other things, resizing an area of the object group, and modifying the structure of the object group to re-accommodate the stroke input after the area is resized.
  • removing or modifying stroke input may include completely erasing certain stroke data within the object group, or making certain changes to the stroke input, such as changing the spelling of a word or adding punctuation marks.
  • In step 706, a determination is made as to whether the modified object group is complete.
  • the modified object group is complete when modification actions should no longer be executed within the modified object group. The determination of whether the modified object group is complete may be made in response to the user selecting a certain prompt to indicate that the modified object group is complete. Additionally, in the absence of the user indicating that the modified object group is complete, the determination may be made based on a timer of the modified object group. The timer is used to determine whether subsequent stroke input or modification actions are made within the requisite proximity and time thresholds associated with the modified object group.
  • In step 706, if the modified object group is not complete, then any subsequent modification actions will continue to be executed within the modified object group, until the modified object group is complete.
  • In step 708, upon the completion of the modified object group, the modified object group is closed and any subsequent modifications are not included within the modified object group.
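  • A minimal Python sketch of the FIG. 7 reopen-and-modify flow is given below. It is illustrative only; the function name, the apply_modification callable, and the way groups are closed are assumptions layered on the hypothetical ObjectGroup record from the earlier sketch.

```python
def reopen_and_modify(target_group, all_groups, apply_modification):
    """Sketch of FIG. 7: reopen a closed object group, apply a modification,
    and close it again once the modification is complete."""
    for group in all_groups:
        group.state = "closed"        # all other object groups are closed
    target_group.state = "open"       # step 702: reopen the selected group
    apply_modification(target_group)  # e.g., reformat, remove, or add stroke input
    target_group.state = "closed"     # step 708: close the modified group
    return target_group
```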
  • FIGS. 8A-E show examples of one or more of the method steps described in FIGS. 6-7, in accordance with one or more embodiments of the invention.
  • FIGS. 8A-8C illustrate how certain stroke data is derived from stroke input drawn on an interactive device.
  • FIG. 8A provides an illustration of the strokes comprising an uppercase letter "E", drawn on an interactive display using a digital marker (810), and the start/end points of the strokes.
  • the letter E is comprised of four separate strokes (802, 804, 806, 808), each of which has a corresponding starting point and ending point.
  • the first stroke (802) is a vertical line with starting point 812 and ending point 814.
  • the second stroke (804) is a horizontal line with starting point 816 and ending point 818.
  • the third stroke (806) is a horizontal line with starting point 820 and ending point 822.
  • the fourth stroke (808) is a horizontal line with starting point 824 and ending point 826.
  • FIG. 8B illustrates how the four strokes in FIG. 8A are grouped as a single object, based on certain predetermined parameters.
  • the predetermined parameters take into consideration the proximity of the start or end point of a stroke, in relation to subsequent strokes (requisite proximity) (830).
  • the predetermined parameters also take into consideration the amount of time elapsed between each stroke (requisite time) (828). If the stroke input meets both the proximity and time elements of the predetermined parameters, then it is grouped as a single object (832).
  • FIGs. 8D-8E further expand on the concept illustrated in FIGS. 8A-8C.
  • FIG. 8D shows an object group comprising the word "hello" (834) and the letter "w" (836).
  • the word “hello” (834) consists of a first group of letters that are drawn within the requisite time and proximity parameters to be associated with an object group of object type word. As such, the first group of letters is associated with an object group of object type word.
  • the letter "w" (836) is outside of the requisite proximity parameters associated with the object group of object type word, and is therefore not included as a part of the object group. However, the letter "w" (836) may still be included in the existing object group (834) if it is drawn within the requisite time and proximity parameters of the existing object group (834), as shown in FIG. 8E and discussed below.
  • FIG. 8E shows the existing object group from FIG. 8D, which now comprises the words “hello” and "world” (834).
  • "hello” consists of the first group of letters, which are associated with an object group of object type word.
  • "world” consists of a second group of letters, which are associated with a second object group of object type word.
  • the second word is drawn within the requisite time and proximity parameters of the existing object group, based on the relationship between the ending point of a stroke of the first word and the starting point of a subsequent stroke of the second word.
  • the stroke data associated with the object group for "world” and the stroke data associated with the object group for "hello” may also be associated with an object group of object type sentence (835).
  • FIG. 9A shows a flowchart describing a method for updating an interactive display in accordance with one or more embodiments of the invention.
  • a digitizer (e.g., an electronic pen, a digital marker, a stylus, etc.) may be in erase mode when the electronic eraser (see e.g., 202 in FIGs. 2A-3C) on the digitizer is touching or proximal to the interactive display.
  • the electronic eraser may be deemed to be proximal to the interactive display when the interactive display can detect the electronic eraser in order to perform the steps shown in FIGS. 9A-9C.
  • the specific distance between the electronic eraser and the interactive display for the electronic eraser to be considered proximal to the interactive display may vary based on the implementation of the digitizer and the interactive display.
  • the electronic eraser may be deemed to be proximal to the interactive board when the electronic eraser is closer to the surface of the interactive display than the tip of the digitizer.
  • a digitizer may be in erase mode when a button on the digitizer is activated, wherein the button may be programmed to enable or enter erase mode when engaged by a user handling the digitizer.
  • a user while in erase mode, a user may select corresponding locations (e.g., enclosed areas (discussed below)) on the interactive display from which content may be removed or cleared.
  • an eraser stroke input may be substantially similar to a stroke input except instead of generating content on an interactive (or reflective display), an eraser stroke input enables a user of the digitizer to select and remove or clear content from the interactive display.
  • eraser stroke data may be created and stored in a manner substantially similar to stroke data (discussed above); however, eraser stroke data references data associated with one or more eraser stroke inputs.
  • In Step 904, an enclosed area is extrapolated based on the eraser stroke data (converted from the one or more eraser stroke inputs detected in Step 902).
  • an enclosed area may correspond to an area selected, by a user, on the interactive display from which content is to be removed or cleared.
  • the enclosed area may be extrapolated based on, for example, the location data (e.g., the x, y coordinates of the detected locations of the eraser stroke input) of the corresponding eraser stroke data.
  • the enclosed area may be any free-form shape (e.g., a shape having any irregular contour).
  • the enclosed area may be any geometric shape (e.g., circle, square, rectangle, triangle, etc.).
  • the enclosed area may be incomplete, meaning that no points (or pixels) of the free-form shape intersect (i.e., the shape does not close).
  • a connection or extension may be extrapolated from, for example, a starting and ending point of an eraser stroke input (or corresponding eraser stroke data) to complete the free-form shape.
  • the extrapolation may employ any existing area and/or data extrapolation techniques.
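  • The following Python sketch shows one simple way an incomplete free-form shape might be closed into a polygon for use as the enclosed area of Step 904. It is an assumption-laden illustration (the function name, the Point alias, and the strategy of joining the ending point back to the starting point are all hypothetical), not the specific extrapolation technique of the specification.

```python
from typing import List, Tuple

Point = Tuple[float, float]

def extrapolate_enclosed_area(eraser_points: List[Point]) -> List[Point]:
    """Build a closed polygon from the location data of the eraser stroke data.
    If the free-form shape is incomplete (its start and end points do not
    intersect), a connecting segment is extrapolated by appending the starting
    point to the end of the point list."""
    polygon = list(eraser_points)
    if polygon and polygon[0] != polygon[-1]:
        polygon.append(polygon[0])    # connect the ending point back to the start
    return polygon
```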
  • one or more object groups are identified using the enclosed area. More specifically, in one or more embodiments of the invention, one or more object groups relating to stroke data at least partially overlapping the enclosed area on the interactive display are identified. Further details elaborating Step 906 are discussed below with respect to FIG. 9B.
  • one or more strokes not affiliated with an object group are identified using the enclosed area instead.
  • the one or more strokes may encompass a subset (e.g., a portion or all) of a collection of strokes grouped to represent, for example, a drawing, a doodle, or an image, rather than an object group.
  • one or more strokes relating to stroke data at least partially overlapping the enclosed area on the interactive display are identified.
  • In Step 908, the interactive display is updated using the one or more object groups (or one or more strokes not affiliated with an object group) identified in Step 906. More specifically, the interactive display is updated to remove or clear stroke inputs corresponding to the one or more object groups (or one or more strokes). Additional details describing Step 908 are discussed below with respect to FIG. 9C.
  • FIG. 9B shows a flowchart describing a method for identifying object groups (or strokes not affiliated with an object group) relating to a set of stroke data in accordance with one or more embodiments of the invention.
  • In Step 920, a set of stroke data fragments is identified.
  • stroke data may be converted from, and thus corresponds to, a single stroke input.
  • a stroke input (or stroke) may be portrayed as one or more points, and/or one or more lines connecting the one or more points, on an interactive display.
  • a stroke data fragment thus corresponds to a subset of the stroke data associated with a stroke input.
  • a stroke data fragment may further correspond to a subset of the one or more points, and/or one or more lines, that form the stroke input.
  • a set of stroke data fragments may be a set of one or more subsets of stroke data, wherein each subset corresponds to a subset of the points and/or lines for a different stroke input.
  • each stroke data fragment overlaps the enclosed area (extrapolated in Step 904). That is, the subset of points and/or lines forming a stroke input for which each stroke data fragment corresponds may be situated within or intersect the enclosed area.
  • all stroke data for every stroke input presented on an interactive display may be compared to eraser stroke data corresponding to the eraser stroke input(s), and subsequently, the enclosed area.
  • the location data of the stroke data and the location data of the eraser stroke data may be compared to identify at least subsets of stroke data (e.g., stroke data fragments) that lie inside and/or intersect the enclosed area.
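  • One way the Step 920 comparison might be realized is sketched below: each stroke's location data is tested against the enclosed-area polygon, and the overlapping subset of points forms the stroke data fragment. The ray-casting point-in-polygon test is a standard technique chosen here for illustration; the function names and the use of the StrokeData record from the earlier sketch are assumptions.

```python
from typing import List, Tuple

Point = Tuple[float, float]

def point_in_polygon(p: Point, polygon: List[Point]) -> bool:
    """Standard ray-casting test: does point p lie inside the enclosed area?"""
    x, y = p
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def identify_fragments(all_stroke_data, enclosed_area: List[Point]):
    """Sketch of Step 920: for each stroke, keep the subset of its points that
    overlap the enclosed area (i.e., the stroke data fragment)."""
    fragments = {}
    for stroke in all_stroke_data:
        overlap = [p for p in stroke.points if point_in_polygon(p, enclosed_area)]
        if overlap:
            fragments[id(stroke)] = overlap
    return fragments
```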
  • In Step 922, for each stroke data fragment of the set of stroke data fragments identified in Step 920, contiguously associated stroke data is identified to obtain a set of stroke data.
  • Because a stroke data fragment corresponds to a subset of the stroke data for a stroke input (discussed above), the contiguously associated stroke data for a stroke data fragment refers to the remainder of the stroke data for the stroke input, i.e., the stroke data excluded from the stroke data fragment and/or residing outside the enclosed area.
  • the addition of a stroke data fragment and corresponding contiguously associated stroke data equates to the totality of stroke data for a single stroke input.
  • one or more points may be contiguously associated between a starting point and an ending point, thus producing a contiguous (or continuous) stroke.
  • stroke data for the various points of a stroke input may be contiguously associated.
  • the remainder of stroke data contiguously associated with each stroke data fragment overlapping the enclosed area is identified, thereby obtaining stroke data (in entirety) for each stroke input with which each stroke data fragment is associated.
  • In Step 924, the erase mode settings are looked up. The erase mode settings may be a set of one or more settings associated with the erasing functionality (or feature) of the interactive display and/or digitizer.
  • the settings may be representative of default values preset during the manufacturing of the interactive display and/or digitizer.
  • the settings may be representative of dynamically changing preferences of a user throughout the operation of the interactive display and/or digitizer.
  • Examples of erase mode settings may include, but are not limited to: (i) a setting for indicating the event that triggers the enablement of the erase mode (see e.g., Step 900); (ii) a setting for whether an enclosed area is to be interpreted as a free-form or a geometric shape; (iii) a setting for whether eraser stroke inputs are to be visibly or invisibly portrayed on an interactive display prior to the disengagement (e.g., lifting) of the digitizer with the interactive display by a user; (iv) a setting for indicating which granularity of stroke data overlapping an enclosed area is removed or cleared from the interactive display, etc.
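  • Purely as an illustration of the settings enumerated above, they might be carried in a small configuration record such as the Python sketch below. The class name, field names, and default values are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class EraseModeSettings:
    """Hypothetical container for the erase mode settings listed above."""
    trigger: str = "eraser_proximity"   # event that enables erase mode (Step 900)
    area_shape: str = "free-form"       # interpret the enclosed area as free-form or geometric
    show_eraser_strokes: bool = False   # visibly portray eraser strokes before lift-off or not
    removal_granularity: str = "word"   # "stroke", "letter", "word", "sentence", ...
```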
  • In Step 926, a determination is made based on the lookup of Step 924. Specifically, a determination is made as to whether an erase mode setting indicates that the granularity of stroke data to be removed pertains to merely a stroke (or stroke input). If it is determined that the erase mode is set to remove just the stroke input(s) (e.g., the lowest object types) associated with the set of stroke data (identified in Step 922), the process proceeds to Step 932. On the other hand, if it is determined that the erase mode is set to remove object group(s) relating to the set of stroke data of another object type (e.g., a letter, a word, a sentence, etc.), the process proceeds to Step 928.
  • the erase mode is set to remove just the stroke input(s) (e.g., the lowest object types) associated with the set of stroke data (identified in Step 922)
  • the process proceeds to Step 932.
  • object group(s) relating to the set of stroke data of another object type e.g., a letter, a word, a sentence, etc.
  • In Step 928, the selected (non-stroke) object type is identified.
  • the selected object type may be a default erase mode setting preset during manufacturing of the interactive display and/or digitizer.
  • the selected object type may be an erase mode setting set by preferences exhibited by a user throughout operation of the interactive display and/or digitizer.
  • examples of the selected object type may include, but are not limited to, a letter, a word, a sentence, etc.
  • In Step 930, one or more object groups of the selected object type, relating to the set of stroke data obtained in Step 922, are identified.
  • each stroke data of the set of stroke data corresponds to a different stroke input.
  • the collective stroke data for one or more stroke inputs may be logically grouped into an object group or a set of nested object groups based on, for example, time and proximity parameters.
  • Each object group may subsequently be associated with an object type, thereby associating the collective stroke data with a letter, a word, a sentence, and/or any other combination of stroke inputs.
  • the stroke data corresponding to the letter "H" is associated with: (i) an object group of object type letter that is associated with all stroke data for the letter "H"; (ii) an object group of object type word associated with all stroke data corresponding to the word "Hello"; and (iii) an object group of object type sentence associated with all stroke data corresponding to the words "Hello World."
  • the stroke data is associated with a set of nested object groups, where each object group has a different level of granularity.
  • Step 930 may be realized by scaling the set of nested object groups, originating at the object group of object type stroke corresponding to a stroke input forming the letter "H", until the object group of the selected object type relating to the stroke data is identified.
  • In Step 930, an object group of the selected object type may be identified, as described above, for each stroke data of the set of stroke data (identified in Step 922).
  • In Step 932, upon determining that the erase mode is set to remove just stroke inputs, one or more object groups of the lowest object type (e.g., object type stroke) relating to the set of stroke data are identified.
  • one or more strokes not affiliated with an object group, but associated with a collection of strokes grouped to form, for example, a drawing, a doodle, or an image, which relate to the set of stroke data, are identified.
  • an object group associated with stroke data for a single stroke input for each stroke data identified in Step 922 may thus be identified based on the relationships specified in FIG. 4.
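  • The nested object group relationships and the scaling described for Steps 926-932 may be sketched as follows; ObjectGroup, scale_to_type, and groups_to_remove are hypothetical names, the parent link is an assumed representation of the nesting, and the sketch reuses the EraseModeSettings record above.

class ObjectGroup:
    def __init__(self, object_type, parent=None):
        self.object_type = object_type   # e.g., "stroke", "letter", "word", "sentence"
        self.parent = parent             # next, coarser group in the nested set
        self.stroke_data = []            # collective stroke data for this group

def scale_to_type(stroke_group, selected_type):
    # Walk up the nested object groups from a stroke-level group until the
    # group of the selected object type is reached (Steps 928 and 930 sketch).
    group = stroke_group
    while group is not None and group.object_type != selected_type:
        group = group.parent
    return group if group is not None else stroke_group

def groups_to_remove(stroke_groups, settings):
    # Steps 926 and 932 sketch: at "stroke" granularity the stroke-level groups
    # are kept; otherwise each group is scaled up to the selected object type.
    if settings.removal_granularity == "stroke":
        return set(stroke_groups)
    return {scale_to_type(g, settings.removal_granularity) for g in stroke_groups}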
  • FIG. 9C shows a flowchart describing a method for removing stroke inputs from an interactive display in accordance with one or more embodiments of the invention.
  • In Step 940, following the identification of one or more object groups (or one or more strokes not affiliated with an object group) relating to the set of stroke data (see e.g., FIG. 9B), the collective stroke data encompassing the set of stroke data is deleted.
  • deletion of the set of stroke data may include deleting the set of stroke data from volatile memory, local persistent storage, and/or cloud storage associated with the interactive device.
  • In Step 942, in response to the deletion of the set of stroke data, the one or more stroke inputs to which the set of stroke data corresponded are removed from the interactive display. Subsequently, in one or more embodiments of the invention, the corresponding stroke input(s) would no longer be visible to, nor manipulable by, a user of the interactive device.
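  • Steps 940 and 942 may be sketched as a deletion followed by a display refresh; the storage and display objects below are placeholder interfaces assumed for illustration, not APIs named in the disclosure.

def erase_object_groups(groups, storage, display):
    # Step 940 sketch: delete the collective stroke data for each identified
    # object group (e.g., from volatile memory, local persistent storage,
    # and/or cloud storage associated with the interactive device).
    for group in groups:
        for stroke_data in group.stroke_data:
            storage.delete(stroke_data)
    # Step 942 sketch: refresh the display so that the corresponding stroke
    # inputs are no longer rendered.
    display.refresh()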
  • FIGS. 10A-10J show examples of one or more of the method steps described in FIGS. 9A-9C in accordance with one or more embodiments of the invention.
  • FIG. 10A illustrates the performing of stroke inputs (1006) to derive a word drawn on an interactive device (1000).
  • the stroke inputs (1006) may, in one or more embodiments of the invention, be drawn using the tip (1004) of a digitizer (1002).
  • the word “hello” is drawn using a series of stroke inputs, each of which may correspond to a letter of the aforementioned word.
  • the word "hello" may be formed by five stroke inputs corresponding to the letters "h", "e", "l", "l", and "o".
  • each of the aforementioned letters may be considered a stroke input because a user would perform no more than a single, contiguous (or continuous) stroke to draw each of the above letters.
  • the word "turtle" may encompass eight stroke inputs, which may include: (i) two separate (e.g., discontinuous) strokes for each letter "t"; and (ii) one stroke for each of the remaining four letters "u", "r", "l", and "e".
  • FIG. 10B illustrates the intent of a user to evoke the erase mode of the digitizer (1002).
  • the erase mode may be triggered by (as shown in FIG. 10B) the flipping of the digitizer (1002) so that the electronic eraser (1008), rather than the tip (1004), is proximal to the interactive display.
  • an eraser stroke input (1012A) performed by the user may be detected.
  • the eraser stroke input (1012A) may be presented (as a visible or invisible element to the user) on the interactive display, and may be converted into stroke data (or more specifically, eraser stroke data) in substantially the same manner as stroke data for the stroke inputs (1006). Further, using the eraser stroke data, an enclosed area (1010A) from which content may be removed is extrapolated. In the illustration of FIG. 10C, the enclosed area encircles the entirety of the word "hello".
  • one or more object groups are subsequently identified using the eraser stroke data for the eraser stroke input (1012A) and thus, the extrapolated enclosed area (1010A). More specifically, a set of stroke data corresponding to stroke inputs that at least partially overlap the enclosed area is obtained. In the instant case, the enclosed area encircles the entirety of the word "hello"; thus, the stroke inputs that at least partially overlap the enclosed area are identified to be all the stroke inputs (1006) forming the word "hello".
  • a lookup of the current erase mode settings may be performed to determine whether the erase mode is set to remove just the stroke inputs or object groups of a selected object type. In either case, since the enclosed area envelops the entirety of the word "hello", all of the earlier identified stroke inputs of the word "hello” are removed (1014) from the interactive display as portrayed in FIG. 10D.
  • the erase mode may be set so that identified object groups overlapping the enclosed area, which are associated with the object type stroke (e.g., the lowest object type), are to be removed.
  • the set of nested object groups with which the stroke data for the letter "e" is associated may not need to be scaled because the currently obtained stroke data already corresponds to an object group associated with the object type stroke.
  • the removed stroke input (1014B) solely corresponds to the stroke input for the stroke (or letter) "e”.
  • the erase mode may be set so that identified object groups overlapping the enclosed area, which are associated with a selected object type (e.g., a letter, a word, a sentence, etc.), are to be removed.
  • in this case, the set of nested object groups with which the stroke data for the letter "e" is associated may be scaled until the object group associated with the selected object type (here, the object type word) is identified.
  • the collective stroke data for all stroke inputs forming the word with which the letter "e" is associated (e.g., the word "hello") is then obtained.
  • all stroke inputs (1006) pertaining to the word "hello” are removed (1014C) from the interactive display, and the associated collective stroke data is deleted (or disassociated) from the interactive device.
  • a set of stroke data fragments may be identified first.
  • a stroke data fragment may refer to a subset of the stroke data corresponding to a contiguous (or continuous) stroke or stroke input.
  • although the entirety of the letter "h" may not be enveloped by the enclosed area in this version of the example, the entirety of a fragment of the letter "h" is.
  • the stroke data fragment corresponding to that fragment of the letter "h", alongside the stroke data fragment corresponding to the fragment of the letter "e" overlapping the enclosed area, are subsequently identified.
  • any contiguously associated stroke data is further identified.
  • contiguously associated stroke data associated with a stroke data fragment may refer to the remainder of the stroke data for the stroke input excluded from the stroke data fragment and/or residing outside the enclosed area.
  • FIG. 10I illustrates the differentiation between a fragment of a stroke (e.g., associated with a stroke data fragment (1020)) corresponding to the letter "h" and the remainder of the stroke, which may reside outside the enclosed area (1012C) (e.g., associated with the contiguously associated stroke data (1022)).
  • stroke data for the stroke inputs corresponding to the letters "h" and "e” are obtained as a set of stroke data.
  • a lookup of the current erase mode settings on the interactive device may be performed. Further, in one or more embodiments of the invention, one of the erase mode settings may specify that the removal of object groups associated with the object type letter is set. Subsequently, for each stroke input (e.g., associated with each stroke data of the set of stroke data), the set of nested object groups for the stroke input may be scaled until the object group associated with the object type letter is identified. From here, the collective stroke data for the aforementioned object group may be obtained (for each stroke input). Finally, the collective stroke data, for each stroke input (e.g., the letters "h" and "e"), is subsequently deleted and, as illustrated in FIG. 10J, the corresponding stroke inputs (1014D) associated with those collective stroke data are removed from the interactive display.
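  • Tying the FIG. 10 walkthrough together, a hypothetical end-to-end use of the sketches above might look as follows; all coordinates, the enclosed area, and the variable names are invented solely to mirror the "hello" example and are not taken from the disclosure.

# Five single-stroke letters of the word "hello", each a list of (x, y) points.
strokes = {
    "h": [(0, 0), (0, 10), (5, 5), (5, 0)],
    "e": [(10, 0), (15, 5), (10, 8), (15, 10)],
    "l1": [(20, 0), (20, 10)],
    "l2": [(25, 0), (25, 10)],
    "o": [(30, 5), (33, 10), (36, 5), (33, 0)],
}

# Enclosed area extrapolated from an eraser stroke drawn around the letter "e".
enclosed_area = [(8, -2), (17, -2), (17, 12), (8, 12)]

overlapping = strokes_overlapping_area(list(strokes.values()), enclosed_area)
# Only the "e" stroke overlaps the area: with removal_granularity == "stroke",
# just that stroke would be erased; with removal_granularity == "word", the
# collective stroke data for the entire word "hello" would be deleted.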
  • Software instructions in the form of computer readable program code to perform embodiments of the invention may be stored, in whole or in part, temporarily or permanently, on a non-transitory computer readable medium such as a CD, DVD, storage device, a diskette, a tape, flash memory, physical memory, or any other computer readable storage medium.
  • the software instructions may correspond to computer readable program code that, when executed by a processor(s), is configured to perform one or more embodiments of the invention.

Abstract

The invention relates to a method and a system for updating an interactive display on an interactive device. In particular, the method is directed at removing identified stroke inputs from the interactive display. Each of the identified stroke inputs corresponds to stroke data that at least partially overlaps an enclosed area specified by a user by manipulating a digitizer in an erase mode.
PCT/CA2017/051330 2016-11-09 2017-11-08 Procédé et système d'effacement d'un espace clos sur un dispositif d'affichage interactif WO2018085929A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US15/347,481 US20180046345A1 (en) 2016-01-05 2016-11-09 Method and system for erasing an enclosed area on an interactive display
US15/347,481 2016-11-09

Publications (1)

Publication Number Publication Date
WO2018085929A1 true WO2018085929A1 (fr) 2018-05-17

Family

ID=62110168

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2017/051330 WO2018085929A1 (fr) 2016-11-09 2017-11-08 Procédé et système d'effacement d'un espace clos sur un dispositif d'affichage interactif

Country Status (1)

Country Link
WO (1) WO2018085929A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020157880A1 (en) * 1996-10-01 2002-10-31 Stephen Atwood Electronic whiteboard system eraser
US20030179214A1 (en) * 2002-03-22 2003-09-25 Xerox Corporation System and method for editing electronic images
US20080231635A1 (en) * 2007-03-23 2008-09-25 Palo Alto Research Center Incorporated Methods and processes for recognition of electronic ink strokes
US20090273585A1 (en) * 2008-04-30 2009-11-05 Sony Ericsson Mobile Communications Ab Digital pen with switch function
US20140258901A1 (en) * 2013-03-11 2014-09-11 Samsung Electronics Co., Ltd. Apparatus and method for deleting an item on a touch screen display

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114115603A (zh) * 2021-11-26 2022-03-01 哈尔滨地图出版社有限公司 遥控云挂图

Similar Documents

Publication Publication Date Title
US7486282B2 (en) Size variant pressure eraser
US8479110B2 (en) System and method for summoning user interface objects
TWI433029B (zh) 用於呼叫電子手寫或書寫界面之系統、方法和電腦可讀媒體
US20180046345A1 (en) Method and system for erasing an enclosed area on an interactive display
US11460988B2 (en) Method of styling content and touch screen device for styling content
US20050015731A1 (en) Handling data across different portions or regions of a desktop
US20140189593A1 (en) Electronic device and input method
US9256588B1 (en) Transferring content to a substantially similar location in a virtual notebook using a stylus enabled device
CN103729055A (zh) 多显示设备、输入笔、多显示控制方法和多显示系统
JP2004030632A (ja) ドキュメントの上に電子インクを重ねる方法
US10129335B2 (en) Method and system for dynamic group creation in a collaboration framework
CN109032463B (zh) 笔记擦除方法、电子设备及计算机存储介质
CA3010449A1 (fr) Procede et systeme de representation d'un canevas "absolu" virtuel numerique partage
US20190332762A1 (en) Method for recording stroke data made on a touch sensitive interactive device
US10324618B1 (en) System and method for formatting and manipulating digital ink
US20180074775A1 (en) Method and system for restoring an action between multiple devices
WO2018085929A1 (fr) Procédé et système d'effacement d'un espace clos sur un dispositif d'affichage interactif
US10755029B1 (en) Evaluating and formatting handwritten input in a cell of a virtual canvas
US10430924B2 (en) Resizable, open editable thumbnails in a computing device
WO2017117658A1 (fr) Procédé et système de portage de données d'espace de travail multi-dispositifs
CN204303184U (zh) 基于光学原理的书法笔迹呈现系统
JP5852876B2 (ja) 表示システムおよび表示プログラム
EP3128412B1 (fr) Détection d'écriture manuscrite naturel sur une surface tactile
US10430053B1 (en) Edge navigation mechanism that mimics the use of a flipchart
JP2015064805A (ja) 表示装置およびプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17870267

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17870267

Country of ref document: EP

Kind code of ref document: A1