WO2013188890A1 - Method and apparatus for initiating an interactive learning experience - Google Patents

Method and apparatus for initiating an interactive learning experience

Info

Publication number
WO2013188890A1
WO2013188890A1 (PCT/US2013/046204)
Authority
WO
WIPO (PCT)
Prior art keywords
user
display
border
square
landscape
Prior art date
Application number
PCT/US2013/046204
Other languages
French (fr)
Inventor
Alexander Kay
Tinsley A. GAYLEAN
Carl H. POPPER
Original Assignee
Alexander Kay
Gaylean Tinsley A
Popper Carl H
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alexander Kay, Gaylean Tinsley A, Popper Carl H
Publication of WO2013188890A1

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 19/00 Teaching not covered by other main groups of this subclass
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • The present invention relates to an interactive learning program designed principally for children, in which a computer display displays images that create a "story world" and in which the user draws a closed geometric figure, such as a square or circle, to interact with the story world.
  • The closed geometric figure creates a "portal" through which the user enters the story world and interactively explores the story world by finding items, connecting items, creating items, and navigating.
  • The present invention fills one such need in that it provides a simple method of interaction, relying on the creation of closed geometric figures as a means for navigating a "story world," and is intended to be easily accessible for children.
  • A method for interacting with interactive media on a computer having a display which changes in response to a user action is presented.
  • The display has a display border, and the method performs an act of displaying a landscape display having thereon an object having a closed geometric figure as an object border on the display, the border having a length.
  • The method performs an act of detecting the user action as an action selected from a group consisting of detecting motion of a user's touch in a direction of the object border within a predetermined distance and along a predetermined percentage of the length of the object border; and detecting a user's touch within the object border.
  • The present invention further includes an act of displaying a landscape display in which the object is a square.
  • The act of detecting is completed within a predetermined time duration.
  • the object border remains displayed.
  • The invention further comprises an act of detecting when the user's touch points at the object border and moves toward the center of the display, and contracting the object border to follow the touch.
  • The object border automatically contracts after the object border reaches a first predetermined size.
  • The invention performs a still further act of replacing the landscape display with a different display.
  • Selection of the second object causes the landscape display to be replaced with the different display.
  • A prompt is presented to the user if the predetermined task is not completed within a predetermined time.
  • the act of expanding the size of the closed geometric figure comprises fading the landscape display into the different display.
  • the different display has a new object having a closed geometric figure as an object border displayed therein and where the acts of using the mechanism to detect the motion of the user's touch, expanding the size of the closed geometric figure, and replacing the landscape display are repeated for the new object.
  • The object border is delineated on the display by predetermined display indicia.
  • The predetermined display indicia include at least one of a glowing line and a sparkling line.
  • The act of detecting the user's touch comprises generating a prompt presentation to the user when the user's finger has not moved along a predetermined percentage of the object border length within a predetermined time.
  • The acts described above are in the form of computer-readable instructions operated by a data processing system comprising an interactive data processing device.
  • The acts described above are in the form of computer-readable instructions stored on a computer-readable medium for operation by a data processing system comprising an interactive data processing device.
  • FIG. 1 is an illustration of a data processing system used to facilitate the present invention;
  • FIG. 2 is an illustration of a computer program product used to facilitate the present invention;
  • FIG. 3A is an illustration of a landscape with an episode icon thereon;
  • FIG. 3B is an illustration of a user interacting with the icon on the landscape, with the icon being highlighted as a result;
  • FIG. 4A is an illustration of a landscape with an episode icon and a group of inter-episode game squares;
  • FIG. 4B is an illustration of a landscape with an episode icon and a group of inter-episode segment squares;
  • FIGs. 5A through 5D are an illustrative sequence depicting the user opening a square and, alternatively, the square closing when the user fails to complete the opening process;
  • FIGs. 6A and 6B illustrate squares that are highlighted to indicate the opportunity to draw a square;
  • FIGs. 6C and 6D illustrate the same squares shown in FIGs. 6A and 6B, where a user has drawn around a portion of the perimeter of the square;
  • FIG. 7 is an illustration of an "ideal" square along with a region of tolerance within which a user can draw, with the figure still being considered sufficient as a square;
  • FIG. 8 is an illustration depicting a discretized version of the "ideal" square and region of tolerance from FIG. 7;
  • FIGs. 9A and 9B are a sequence illustrating a user panning across a scene according to the present invention;
  • FIG. 10 is an illustration depicting the user's frame of view with respect to a three-dimensional scene within a story world;
  • FIGs. 11A through 11C are a sequence of illustrations wherein a user finds a letter "b" and moves it from an obscured location to an unobscured location where the letter is highlighted and a square can be drawn thereabout;
  • FIGs. 12A and 12B illustrate the user moving the letter "b" to complete the word "bat," and the subsequent formation of a square around the word to indicate that a square can be drawn thereabout;
  • FIGs. 13A and 13B illustrate the user moving the final base into position to finish the formation of a baseball diamond, and the subsequent formation of a square around the base to indicate that a square can be drawn thereabout;
  • FIGs. 14A through 14C illustrate a user shrinking out of a scene or a game, where the user stops and the square retains its size for a predetermined amount of time;
  • FIGs. 15A and 15B illustrate a user closing a square using a two-finger "pinching" method;
  • FIG. 16 is an illustration of major components of a software system according to the present invention;
  • FIG. 17 is a state information flow diagram for the components presented in FIG. 16;
  • FIG. 18 is a set of tables showing class definitions for the state tracker and state objects;
  • FIG. 19 is a flow chart presenting the interactions in the system available through drawing a square;
  • FIG. 20 is an illustration of the components of the landscape according to the present invention;
  • FIG. 21 is a set of tables showing class definitions and objects used to implement the landscape;
  • FIG. 22 is an illustration of the components of the feedback and help system;
  • FIG. 23 is a set of tables showing class definitions and objects used to implement the feedback and help system;
  • FIG. 24 is an illustration of the components of the square drawing controller;
  • FIG. 25 is a set of tables showing class definitions and objects used to implement the square drawing controller;
  • FIG. 26 is an illustration of the components of the square transition controller;
  • FIG. 27 is a set of tables showing class definitions and objects used to implement the square transition controller;
  • FIG. 28 is an illustration of the components of the video playback controller;
  • FIG. 29 is a set of tables showing class definitions and objects used to implement the video playback controller;
  • FIG. 30 is an illustration of the components of the story world controller;
  • FIG. 31 is a set of tables showing class definitions and objects used to implement the story world controller;
  • FIG. 32 is an illustration of the components of the game controller; and
  • FIG. 33 is a set of tables showing class definitions and objects used to implement the game controller.
  • The present invention relates to an interactive user interface with which a user can interact and navigate through an interactive storyline by drawing closed geometric objects, such as squares, on a display. Such interaction may take the form of touch in the case where the display is a touch screen.
  • Other non-limiting examples include interaction through acoustics, pointing, and motion sensing.
  • The following description is presented to enable one of ordinary skill in the art to make and use the invention and to incorporate it in the context of particular applications. Various modifications, as well as a variety of uses in different applications, will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to a wide range of embodiments. Thus, the present invention is not intended to be limited to the embodiments presented, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
  • Terms such as clockwise and counter-clockwise have been used for convenience only and are not intended to imply any particular fixed direction. Instead, they are used to reflect relative locations and/or directions between various portions of an object. As such, as the device is turned around and/or over, the above labels may change their relative configurations.
  • The present invention relates to an interactive learning program designed principally for children.
  • The program runs on a computer/computing device having a touch screen or other interactive display.
  • The screen displays images that create a "story world."
  • The user draws a closed geometric figure, such as a square, in order to open a portal to a story world.
  • A square is used as an example in the discussion below.
  • Other figures such as circles and stars could also be used. Further, combinations of different figures may be used to cause different actions.
  • The user can enter the story world through the portal and can interactively explore the story world, find things, put things together, build something (a variation of putting things together), watch video segments, and play mini-games.
  • Some of the interactive activities create opportunities to draw more figures and open additional portals into additional episodes or story worlds.
  • The user's experience accumulates a collection of "play squares": badges of their successes and shortcuts back to places they have already been.
  • The present invention has three "principal" aspects.
  • the first is a story world user interface system.
  • The story world user interface system is typically in the form of a data processor having a computer system operating software or in the form of a "hard-coded" instruction set. This system may be incorporated into a wide variety of devices that provide different functionalities.
  • The second principal aspect is a method, typically in the form of software, operated using a data processing system (computer).
  • the third principal aspect is a computer program product.
  • The computer program product generally represents computer-readable instructions stored on a computer-readable medium such as an optical storage device, e.g., a compact disc (CD) or digital versatile disc (DVD), or a magnetic storage device such as a floppy disk or magnetic tape.
  • Other examples of computer-readable media include hard disks, read-only memory (ROM), and flash-type memories.
  • The interface system comprises a user interface system 100 having an input 102 for receiving user input, principally from a touch screen.
  • the input 102 may include multiple "ports.”
  • An output 104 is connected with the processor for providing information to the user, typically both visual and audio. Output may also be provided to other devices or other programs, e.g., to other software modules, for use therein.
  • the input 102 and the output 104 are both coupled with a processor 106, which may be a general-purpose computer processor or a specialized processor designed specifically for use with the present invention.
  • The processor 106 is coupled with a memory 108 to permit storage of data and software to be manipulated by commands to the processor.
  • An illustrative diagram of a computer program product embodying the present invention is depicted in FIG. 2.
  • the computer program product 200 is depicted as an optical disk such as a CD or DVD.
  • The computer program product generally represents computer-readable instructions stored on any compatible computer-readable medium.
  • Episode: A collection of video segments.
  • Extended game: A more fully developed mini-game, or a collection of related mini-games. Extended games may be based on the property but not on the specific episode from which it is accessible. "Extended games" and "mini-games" may simply be referred to as "games" in this document when the context applies to both.
  • Interactive episode: A story-based episode, which is an interactive experience that is built around a given media segment and comprises a number of interactive tasks.
  • Interactive segment: A segment within an interactive episode where a user can interact with the content of the interactive episode.
  • Landscape: A part of the interface in which a user begins the learning experience. This is a point from which the user can navigate through all of the content they have unlocked. It is typically a location where all of the geometric figures (i.e., squares) a user has unlocked are located.
  • Story world: A location or set of locations in which the user's interactive experience takes place.
  • Video segment: A portion of an interactive episode that comprises non-interactive video content, non-limiting examples of which include text, graphics, images, and motion pictures.
  • interactive episodes are interactive stories that are based on either original concepts or properties that have been licensed from other content generators.
  • the present invention is designed as an interface for navigating within and interacting with an interactive episode's story world.
  • Interactive episodes are derived from existing media such as books, television shows, and movies.
  • An example of a property is Word World™, a series of television episodes produced for the Public Broadcasting System.
  • Some properties, such as television shows, will have episodes, each of which is repurposed to build an interactive learning experience.
  • The discussion herein is built around the non-limiting example of an interactive episode, which is an interactive experience that is built around a given episode and comprises a number of interactive tasks.
  • each task by the user will unlock either a video segment to be viewed or a game to be played.
  • Each episode will be divided into a series of smaller video segments. For example, an eleven-minute to fourteen-minute television episode may be divided into seven to ten segments.
  • the landscape (touch screen or portion thereof) has a single episode icon 300.
  • When an appendage 302 of the user touches the episode icon 300, a glowing square outline 304 appears around it as shown in FIG. 3B.
  • The "appendage" may be part of the user's body or an accessory such as a stylus or
  • Although the square outline 304 is depicted as a glowing outline, many other variations are possible, non-limiting examples of which include highlighting, lowlighting, color-changing, flashing, etc.
  • Note that the landscape may be made up of various scenes and that the portal may be represented by an icon or may
  • The landscape is also the place where users will return between episodes. It is the launching place for activating and entering episodes, and it is where a user aggregates earned squares for later navigation.
  • the user will trace the outline of the square 304 to cause the square to activate.
  • Other interactions with the square outline 304 are also possible, though, such as simply touching the square or touching certain parts of the square, as will be appreciated by one of skill in the art.
  • Inter-episode squares 400 will appear, adding to the square collection, as depicted in the example landscapes represented by FIGs. 4A and 4B.
  • Inter-episode squares 400 represent the video segments and games that the user has unlocked in the course of the experience. In the example shown in FIGs. 4A and 4B, the resulting landscape will have an episode square 402 on it that represents the episode, and other (typically smaller) inter-episode squares 400 that represent both segment and game squares that have been unlocked.
  • a total of nine segment and game squares appear in two different lists, with the inter-episode squares 400 shown in FIG. 4A representing two game squares and those in FIG. 4B representing seven segment squares.
  • Squares or icons representing content may be placed in various landscape forms.
  • resetting the episode typically does not remove the inter-episode squares 400 from the landscape; it simply allows the user to replay the story world tasks in order again.
  • Drawing a square creates a portal into a new piece of content.
  • One situation is in the play square landscape around an icon for a particular episode.
  • The second situation is, once the user has opened that episode and entered the story world, the user can complete a task and be rewarded with glowing squares and the opportunity to draw a new square and open a portal to an episode segment.
  • Glowing, transparent squares 600 and 602 are shown surrounding the words "Word World" and "bat" in FIGs. 6A and 6B, respectively.
  • The glowing, transparent squares 600 and 602 appear when it is time to draw a square. In the example shown, they appear behind the icon or object around which a square is to be drawn and serve as guides for where the user should draw the square.
  • An effect such as a light sparkle (lens flare) travels around the edge of the square, following the path that the user traverses as they draw the square.
  • Such paths are indicated by the highlights for "Word World" 604 and "bat" 606 in FIGs. 6C and 6D, respectively.
  • The paths may be highlighted either ahead of the user's touch to assist them in tracing the square or behind the user's touch as an indication that they are tracing correctly (or with different highlights ahead and behind).
  • FIG. 7 depicts an ideal square 700 with an inner tolerance figure 702 and an outer tolerance figure 704.
  • As a user traces the square, as long as the path of their touch is between the inner tolerance figure 702 and the outer tolerance figure 704 (within the area of tolerance), their trace is considered valid and will continue toward completion of the square. However, if it deviates from the area of tolerance, the trace will be reset and the user will be required to start the trace again.
  • some embodiments may allow a user to deviate from the area of tolerance for a certain time interval or to a certain geometrical degree before being required to start the trace again.
  • the area of tolerance may involve other factors.
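The border-tracing validity test described above can be sketched as a simple membership check between the inner and outer tolerance squares of FIG. 7. This is an illustrative Python sketch under assumed names and geometry (an axis-aligned ideal square); the patent does not prescribe an implementation.

```python
def in_tolerance_band(x, y, cx, cy, half_side, tol):
    """Return True if touch point (x, y) lies in the area of tolerance
    around an ideal square centered at (cx, cy) (cf. FIG. 7).

    The ideal square's border is the set of points where
    max(|x - cx|, |y - cy|) == half_side; the inner and outer tolerance
    figures are the same square shrunk/grown by `tol`. All parameter
    names are assumptions for illustration."""
    ring = max(abs(x - cx), abs(y - cy))  # Chebyshev distance from center
    return half_side - tol <= ring <= half_side + tol
```

A trace controller would reset the trace (or, per the embodiment that tolerates brief deviations, start a deviation timer) whenever this check fails.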
  • The percentage drawn may be simply the linear percentage of the square that has been traced (neglecting deviations from the ideal square 700), or it may be determined by a more complex mechanism, such as by dividing the drawing area into a number of smaller, more discretized areas 800 as shown in FIG. 8 (with only a few representative areas 800 being numbered for clarity).
  • The discretized areas 800 will then be examined to determine if they were included in the user's trace of the square. Once the predetermined percentage of the discretized areas 800 have been included in the user's trace, the square will be considered complete and the portal will open. This completion evaluation only occurs when the user is no longer touching the screen (or has no longer touched the screen for a predetermined time interval). This way the user's trace is not interrupted; they are allowed to finish tracing to the degree they intend, lift their finger, and watch for the reward of the square opening into a portal.
  • This implementation approach does not necessarily require the square to be drawn in any particular order (particularly if the user is afforded a time interval to stop touching and resume). They can keep drawing parts of the square until they have filled in a predetermined percentage of the perimeter.
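The discretized completion test of FIG. 8 can be sketched as follows: the perimeter band is divided into cells, touches anywhere in the band mark the corresponding cell in any order, and completion is evaluated against a predetermined coverage fraction once the finger lifts. All names, the cell count, and the 85% threshold are illustrative assumptions, not values from the patent.

```python
import math

class SquareTrace:
    """Tracks coverage of a square's perimeter, discretized into cells
    (cf. FIG. 8). Cells may be filled in any order; the square counts
    as complete once a predetermined fraction is covered."""

    def __init__(self, cx, cy, half_side, tol, n_cells=32, required=0.85):
        self.cx, self.cy = cx, cy
        self.half_side, self.tol = half_side, tol
        self.n_cells, self.required = n_cells, required
        self.covered = set()

    def add_touch(self, x, y):
        dx, dy = x - self.cx, y - self.cy
        ring = max(abs(dx), abs(dy))
        if abs(ring - self.half_side) > self.tol:
            return  # outside the area of tolerance; ignore this sample
        # Map the touch's angle around the center onto a perimeter cell.
        angle = math.atan2(dy, dx) % (2 * math.pi)
        self.covered.add(int(angle / (2 * math.pi) * self.n_cells))

    def is_complete(self):
        # Evaluated only once the user is no longer touching the screen.
        return len(self.covered) / self.n_cells >= self.required
```

Because completion is checked only on release, a partial trace is never interrupted, matching the behavior described above.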
  • If the user is idle, a guide "character" may enter the frame and prompt for where to draw to complete the square as an aid to the user. If there is still longer idle time, the character may come in and explain that the drawing is fading out and the user will need to start over.
  • Completion of the square may be noted by visual and audio feedback. For example, a sound and visual effect may appear, with the square then growing to fill the screen, stretching and morphing from a square to the aspect ratio of the computer display. As the square expands to fill the screen, the content of the square cross-dissolves to the new content: video, game, or story world location.
  • The user then enters the story world location.
  • the drawn line of the square may appear around the edge of the screen in order to remind the user that they are in a "square."
  • Such an outline on the edge of the screen can also serve as an interface object for exiting the square, as will be described further below.
  • Navigating or looking around the story world comprises touching the background and dragging the user's touch from side to side. This will pan and scroll the background with movements of the user's touch, as depicted in FIGs. 9A and 9B, where FIG. 9A depicts a user moving their touch to the left 900, and FIG. 9B depicts the resulting scene.
  • Each interactive segment has a single panoramic view across which a user can pan.
  • The panoramic view may include an image of a bird's nest in front of a lake and, as the user pans around, a baseball field comes into view.
  • The panoramic views may loop back on themselves, creating the effect of a 360-degree landscape. This is meant to give the user the feeling of being in a particular location in the story world.
  • An example of this concept is depicted in FIG. 10, where a 360-degree landscape is depicted by the larger outline 1000.
  • The view afforded to the user is depicted by the smaller outline 1002, representing the user's frame of view.
  • The larger outline 1000 shown may allow for 360-degree panning in both horizontal and vertical directions, such that the larger outline 1000 takes the form of a globe across which the user may scan, with their field of view being that of the smaller outline 1002.
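The looping panoramic pan can be reduced to wrapping a horizontal offset modulo the panorama width, so that panning past one edge seamlessly re-enters from the other. A minimal sketch with assumed names:

```python
def pan(offset, dx, panorama_width):
    """Advance the pan offset by the drag delta dx, wrapping so the
    panoramic background loops back on itself (the 360-degree effect
    of FIG. 10). Parameter names are illustrative assumptions."""
    return (offset + dx) % panorama_width
```

A globe-style embodiment would apply the same wrap to a vertical offset as well.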
  • Each story world is populated with objects. As the user pans and scrolls, they see these objects, many of which they are able to find, drag, and discover.
  • As a non-limiting example, the task is to find the letter "b" in the story world. As shown in FIG. 11A, the letter is partially occluded behind a tree. This provides a clue to the user. The user may then touch and/or drag the letter "b" from behind the tree as shown in FIG. 11B. The act of touching and/or dragging the object will bring a translucent square behind the object, with the result being shown in FIG. 11C. This translucent square is the indicator that the user has discovered another square. There may also be a special sound as the discovery is made. The user is given an opportunity to draw a square around the letter "b" and open the portal to the episode segment or mini-game to which the square links.
  • Plot objects can also be found and then placed into an appropriate location in the story world.
  • For example, the bases for the baseball field can be placed into their appropriate locations in a baseball field.
  • The bases may initially be located in a pile in the story world.
  • The act of touching one of the bases 1300 activates a glowing indicator 1302 at the location where that object needs to be placed, as shown in FIG. 13A.
  • The user can then move the base 1300 to the glowing indicator 1302. This is repeated until all of the objects have been placed at their appropriate locations.
  • a glowing square 1304 appears, allowing a portal to be opened as shown in FIG. 13B.
  • the square that appears after completion may not encompass all the objects placed but only the last object placed.
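The find-and-place task of FIGs. 13A and 13B amounts to checking each dropped object against its target location and detecting when every object is in place, at which point the glowing square can appear. The following Python sketch uses assumed names and an assumed snap radius:

```python
class PlacementTask:
    """Tracks a set of objects that must each be moved to a target
    location (e.g., the bases of the baseball diamond). An object
    snaps into place when dropped within `snap_radius` of its target;
    once all objects are placed, the task is complete and a glowing
    square may be shown. Names and the radius are illustrative."""

    def __init__(self, targets, snap_radius=20.0):
        self.targets = dict(targets)   # object id -> (x, y) target
        self.snap_radius = snap_radius
        self.placed = set()

    def drop(self, obj, x, y):
        tx, ty = self.targets[obj]
        if (x - tx) ** 2 + (y - ty) ** 2 <= self.snap_radius ** 2:
            self.placed.add(obj)       # close enough: snap into place
        return self.all_placed()

    def all_placed(self):
        return self.placed == set(self.targets)
```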
  • Mini-game squares are unlocked by actions in the story world and also expand like the segment squares to fill the screen. However, rather than opening a video player, a mini-game square cross-dissolves to a new screen that is the setting for a mini-game. These mini-games will be in their own environment and may be a rendering of a particular location in the story world from a vantage point not seen in the panoramic view. Using the example from FIGs. 13A and 13B, by completing the baseball diamond and subsequently opening the resulting glowing square 1304, the user may enter a baseball game.
  • If the user drags the frame far enough inward, the square continues shrinking automatically and collapses, and the user is returned to a previous location. If the user drags the frame past the point of a minimum size for the square, but not enough to trigger a full collapse and return to a previous location, the square stops shrinking and maintains its size until either the finger is released or the user moves the finger back out, at which point the square will start tracking the finger again. This operation is shown in the sequence depicted by FIGs. 14A through 14C.
  • Two-finger "pinching" may be used as a way of closing a square. In this case, if two fingers are placed on the background and moved toward each other, the square starts to shrink. As with the previous example, the user also needs to cross a frame-size threshold by a predetermined amount before the square will close on release; otherwise the square will spring back. This operation is shown in the sequence depicted in FIGs. 15A and 15B.
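Both closing gestures above (dragging the frame inward and two-finger pinching) reduce to the same threshold logic: while the fingers are down the square tracks them but stops shrinking at a minimum size, and on release the square closes only if it was shrunk past a close threshold, otherwise springing back. This is a simplified sketch; the state names and threshold parameters are assumptions.

```python
def square_close_state(current_size, close_threshold, min_size, released):
    """Decide a shrinking square's behavior (cf. FIGs. 14A-14C, 15A-15B).

    While touching: the square tracks the fingers but holds at a
    minimum size rather than shrinking further. On release: it closes
    (returning the user to the previous location) only if shrunk past
    `close_threshold`; otherwise it springs back."""
    if released:
        return "close" if current_size <= close_threshold else "spring_back"
    if current_size <= min_size:
        return "hold_at_minimum"   # stops shrinking, maintains its size
    return "track_finger"
```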
  • The system of the present invention may be implemented in software running on a touch screen-based computing platform, examples of which include desktop computer systems with touch screens, tablet computers, laptop computers with touch screens, and mobile phones.
  • The major components of such a software system according to the present invention are presented in FIG. 16. The state information flow for these components is presented in FIG. 17. The remainder of this portion of the description focuses on class definitions for the objects used to implement this system.
  • The PSStateTracker 1800 is a singleton object that has methods to receive state changes and notifications of touch events.
  • The state tracker 1800 also owns the PSFeedbackController 1802 and passes state information to it.
  • The state information is represented by state objects 1804, which are cached in a dictionary.
  • State information that needs to be gathered includes all touches (for timeout purposes), state transitions (enter and exit may either be explicit or implied), current location (landscape, story world, video player, game), drawing a square, current goal in story world, current episode and scene in story world, and the first time in the current location.
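The role of the PSStateTracker can be illustrated with a short Python sketch: a singleton that records state changes and touch notifications and forwards state information to an owned feedback controller. Only the singleton role, the touch/state methods, and the forwarding are from the text; every name and signature below is otherwise an assumption.

```python
class StateTrackerSketch:
    """Illustrative stand-in for the patent's PSStateTracker 1800:
    a singleton receiving state changes and touch notifications and
    passing state information to the feedback controller."""
    _instance = None

    def __new__(cls):
        if cls._instance is None:            # enforce the singleton
            cls._instance = super().__new__(cls)
            cls._instance.state = {}
            cls._instance.listeners = []     # e.g., the feedback controller
        return cls._instance

    def set_state(self, key, value):
        self.state[key] = value
        for callback in self.listeners:      # pass state information on
            callback(key, value)

    def notify_touch(self, x, y, timestamp):
        # All touches are recorded, e.g., for timeout-based triggers.
        self.set_state("last_touch", (x, y, timestamp))
```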
  • A flow chart illustrating interactions in the system that are available through the drawing of a square is depicted in FIG. 19.
  • The components of the landscape are illustrated in FIG. 20, with class definitions for the objects used to implement them shown in FIG. 21.
  • The PSPlaySquareLandscape object 2100 is a singleton object that is a CCLayer with a background sprite (and may have multiple background tiles for scrolling) and multiple PSLandscapeIcon objects.
  • Each PSLandscapeIcon object 2102 acts as a link to another location.
  • Subclasses are customized to particular types of locations: PSStoryWorldIcon 2104, PSVideoIcon 2106, and PSGameIcon 2108.
  • Each icon responds to touches, drags, and long presses in the same way. Touches start the square drawing controller if the location has not been previously visited. Drags drag the icon around the landscape. Long presses will start to expand the icon to fill the screen and transition to the location using a cross-dissolve once the size passes a certain threshold.
  • The main customization the subclasses will do is to provide the correct PSSquareDestination object to the
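The per-icon gesture handling described above can be sketched as a small dispatch function. The touch/drag/long-press behaviors are from the text; the return labels, the scale threshold value, and the behavior of a touch on an already-visited location are assumptions.

```python
def icon_gesture(gesture, visited, scale=0.0, dissolve_threshold=0.6):
    """Dispatch a gesture on a landscape icon (cf. PSLandscapeIcon):
    touches start the square drawing controller only for unvisited
    locations, drags move the icon, and long presses expand the icon,
    cross-dissolving once its scale passes a threshold."""
    if gesture == "touch":
        return "start_square_drawing" if not visited else "ignore"
    if gesture == "drag":
        return "move_icon"
    if gesture == "long_press":
        return "cross_dissolve" if scale >= dissolve_threshold else "expand"
    return "ignore"
```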
  • The components of the feedback and help system are illustrated in FIG. 22, with class definitions for the objects used to implement the system shown in FIG. 23.
  • The PSFeedbackController object 2300 is a subsystem that monitors the user's progress and provides helpful tips and feedback.
  • the feedback is desirably in the form of an animated character that appears on the screen and says a short phrase.
  • When a trigger situation occurs, the current state is examined and a feedback action is performed. The mapping of triggers to actions will be loaded from a file.
  • the PSFeedbackAetion object 2302 handles feedback
  • a feedback action may take the form of either a
  • the animation uses image sprites and a file listing
  • Each character has an arm whose position may be predetermined from the animation frame file or determined dynamically at run time.
  • The action may also call a named function instead of or in addition to an animation.
  • The file for the animation will specify the position and orientation of each visible sprite at each frame of the animation. It will need to reference the audio file and the sprite image files (or offsets in a single sprite sheet image).
  • Some possible actions include help with square drawing, help with video playing, help with square zoom out (back to the previous environment), erasing a partly drawn square (after much inactivity), additional interface help or hints, episode-specific help or hints, and help with a game.
  • The PSFeedbackTrigger object 2304 handles trigger situations.
  • The feedback system will keep track of the current state of the application and the user's actions to determine when to perform a feedback action.
  • The application has a number of possible states and a few time-based triggers that will be combined to create unique trigger situations. Each trigger situation may be mapped to a feedback action.
  • The triggers correspond to the following questions: "Where are you?," "Have you been here before?," "What are you trying to do?," "How long have you been trying to do that?," and "When did you last do something?"
  • Possible example trigger states include: the time since the last touch is above some threshold, the time since the last state transition (indicating progress in the current state), the current environment (square, episode, scene), progress in the current episode/scene, performing video playback, drawing a square, and first visit or return visit to a square.
  • The trigger states are combined to generate trigger situations which prompt an action from the system. Two examples are provided below.
  • Example A: A trigger situation with no touch for 30 seconds AND during square drawing prompts the help character to appear and prompt the user to keep drawing the square.
  • Example B: A trigger situation in the "Duck at Bat"
  • Triggers may be defined externally and loaded from files.
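The combination of trigger states into trigger situations can be sketched as a list of predicate conjunctions mapped to feedback actions, following Example A above. This Python sketch is illustrative only; the predicate helpers, the action label, and the state dictionary layout are all hypothetical:

```python
import time

# A trigger situation is a conjunction of trigger-state predicates
# mapped to a feedback action (cf. Example A above).
def no_touch_for(seconds):
    return lambda state: time.time() - state["last_touch"] >= seconds

def in_state(key, value):
    return lambda state: state.get(key) == value

TRIGGERS = [
    # Example A: no touch for 30 seconds AND during square drawing.
    ([no_touch_for(30), in_state("activity", "drawing_square")],
     "prompt_keep_drawing"),
]

def fire_triggers(state):
    """Return the actions whose predicates all hold for this state."""
    return [action for predicates, action in TRIGGERS
            if all(p(state) for p in predicates)]

state = {"last_touch": time.time() - 31, "activity": "drawing_square"}
assert fire_triggers(state) == ["prompt_keep_drawing"]
```

In a full implementation the TRIGGERS table would be loaded from a file, as the text notes, rather than hard-coded.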
  • The components of the square drawing controller are shown in FIG. 24, with class definitions for the objects used for its implementation shown in FIG. 25.
  • The Square Drawing Controller class is the PSSquareDrawingLayer 2500 subclass. It is given the current CCScene as a parent and a frame rectangle in which to draw a square. There will be some sprites in the current scene that the background square should appear behind, so there is a convention of which objects appear in which Z-order.
  • The actual square frame will be divided into n sections (20) with the struct PSSquareSection (CGRect, ...).
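The division of the square frame into sections can be sketched as a set of tolerance rectangles around the perimeter, against which each touch point is tested. This Python sketch is illustrative only; the corner-anchored geometry, five sections per side, and the 10-unit tolerance band are hypothetical readings of the PSSquareSection idea:

```python
def square_sections(x, y, size, n_per_side=5, tol=10):
    """Return tolerance rectangles (x0, y0, x1, y1) covering the square's
    perimeter, n_per_side sections per side (4 * n_per_side total)."""
    step = size / n_per_side
    rects = []
    for i in range(n_per_side):
        rects.append((x + i*step, y - tol, x + (i+1)*step, y + tol))                # top edge
        rects.append((x + i*step, y + size - tol, x + (i+1)*step, y + size + tol))  # bottom edge
        rects.append((x - tol, y + i*step, x + tol, y + (i+1)*step))                # left edge
        rects.append((x + size - tol, y + i*step, x + size + tol, y + (i+1)*step))  # right edge
    return rects

def section_hit(rects, px, py):
    """Index of the first section containing the touch, or -1."""
    for i, (x0, y0, x1, y1) in enumerate(rects):
        if x0 <= px <= x1 and y0 <= py <= y1:
            return i
    return -1

rects = square_sections(0, 0, 100)
assert len(rects) == 20   # n sections (20), as in the text
```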
  • The PSSquareEffects object 2504 contains two CCParticleSystem objects for displaying an initial shower of teaser particles (following the outline of the square) and drawing particles to provide visual feedback while actually drawing.
  • The PSSquareEffects object 2504 will also be responsible for playing the feedback sounds.
  • The square transition controller components are shown in FIG. 26, with the class definitions for the objects used to implement them shown in FIG. 27.
  • The Square Drawing Controller object creates a PSSquareTransitionController object 2700 and gives it a square frame image (as a CCSprite), a PSSquareOrigin object 2702, and a PSSquareDestination object 2704.
  • The transition controller then creates a CCLayer with a CCSprite for the frame, the origin image, and the destination image (which starts hidden).
  • The layer is then zoomed via an action to fill the screen. During the zoom, the origin sprite fades out and the destination sprite fades in.
  • The Origin object 2702 and the Destination object 2704 get messages notifying them when the transition begins and ends.
  • The Controller 2700 will ask the Destination object 2704 for the UIView that will be shown. This allows the Controller 2700 to attach its UIGestureRecognizers to the view to listen for touches that would signal a reverse transition (destination back to origin). Keeping the touch handling in the transition controller is cleaner than having a variety of destination objects handle it.
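The zoom-with-cross-dissolve can be sketched as a simple opacity schedule driven by zoom progress. This Python sketch is illustrative only; a real implementation would use cocos2d fade and scale actions rather than this hypothetical helper:

```python
def crossfade(progress):
    """Return (origin_opacity, destination_opacity) for zoom progress in [0, 1].
    The origin sprite fades out as the destination sprite fades in."""
    progress = max(0.0, min(1.0, progress))
    return 1.0 - progress, progress

assert crossfade(0.0) == (1.0, 0.0)   # transition start: origin fully visible
assert crossfade(1.0) == (0.0, 1.0)   # transition end: destination fully visible
```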
  • Video playback controller components are illustrated in FIG. 28, with the class definitions for objects used for their implementation shown in FIG. 29.
  • Video playback will use a customized UIKit movie player to load and play movie files full screen. There is an overlay view showing controls and the square frame bordering the screen.
  • The story world controller components are illustrated in FIG. 30, with objects used to implement them shown in FIG. 31.
  • The PSStoryWorld object 3100 is a container for the parts of the world.
  • The PSTileScroller object 3102 handles continuous scrolling.
  • The story world comprises a background sprite with a variety of (possibly interactive) story objects placed in it.
  • The PSTileScroller object 3102 handles scrolling of the story world.
  • The world is divided into four layer tiles (each represented by a PSStoryTile object 3106).
  • Each tile contains a background sprite and some collection of object sprites.
  • The object sprites are children of the tile layer and thus positioned relative to it.
  • The PSTileScroller object 3102 is responsible for moving the tiles to new positions as necessary to maintain the illusion of a continuously scrolling scene.
  • During each panGesture movement there is an extra offscreen tile in place opposite the direction of the movement.
  • The tile layers will be moved during the scrolling, not the parent story world layer.
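The tile-recycling idea above can be sketched in one dimension: a tile that scrolls entirely out of view is jumped ahead of the others, so an extra tile is always waiting opposite the motion. The four-tile count comes from the text; this Python sketch is otherwise illustrative, and the tile width is a hypothetical value:

```python
TILE_WIDTH = 512   # hypothetical tile size in points

def recycle_tiles(tile_positions, view_left, n_tiles=4):
    """Move any tile that has scrolled fully off the left edge of the
    view to the right end of the four-tile strip."""
    span = TILE_WIDTH * n_tiles
    return [x + span if x + TILE_WIDTH < view_left else x
            for x in tile_positions]

tiles = [0, 512, 1024, 1536]
tiles = recycle_tiles(tiles, view_left=600)   # first tile is fully offscreen
assert tiles == [2048, 512, 1024, 1536]       # it jumps ahead of the strip
```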
  • The PSObjectAssembler object 3108 implements a controller that handles assembling story objects on a floating "shelf." The shelf does not scroll with the rest of the world but remains fixed on the screen as the user scrolls for more objects.
  • The controller handles animating an object to the correct position on the shelf when the object is touched.
  • The controller knows which objects are needed and their relative positions on the shelf.
  • The assembler acts as a delegate for the story objects to intercept their standard touch handling. After the objects are assembled, the interface moves to square drawing around the objects.
  • Each scene in a given plot will be described by data objects (in some embodiments, these objects may be loaded from a file).
  • The PSStoryScene object 3104 contains an initial position for the view, a list of objects present in the scene, as well as their initial positions in the world. Subclasses will be customized with the specific objects and goals of each scene in the episode. In other embodiments, the data from these custom files will be abstracted and loaded from a file.
  • The PSStoryObjectCache implements an object store.
  • Story objects will get reused between scenes, and an NSDictionary is used as a cache to hold the story objects rather than recreating them each time. If memory runs low, the cache will need to be cleared of unused story objects.
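The cache-and-purge behavior can be sketched as follows. This Python sketch is illustrative only (the text describes an NSDictionary-based cache in Objective-C); the class and method names are hypothetical:

```python
class StoryObjectCache:
    """Illustrative object store: reuse story objects between scenes and
    purge unused entries when memory runs low."""
    def __init__(self, factory):
        self._factory = factory     # creates a story object from its name
        self._cache = {}
        self._in_use = set()

    def get(self, name):
        # Reuse the cached object rather than recreating it each time.
        if name not in self._cache:
            self._cache[name] = self._factory(name)
        self._in_use.add(name)
        return self._cache[name]

    def release(self, name):
        self._in_use.discard(name)

    def purge_unused(self):
        """Drop cached objects not currently in use (low-memory path)."""
        self._cache = {k: v for k, v in self._cache.items()
                       if k in self._in_use}

cache = StoryObjectCache(factory=lambda name: {"sprite": name})
duck = cache.get("duck")
assert cache.get("duck") is duck   # reused, not recreated
```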
  • The PSStoryObject object 3110 is a CCSprite that has been customized to be a particular character or interactive object in the world. Each story object gets the option to respond to touches and drags that affect it. In some embodiments, story objects may be coded directly into the application, while in other versions the story objects may be loaded from files. There is a protocol for a delegate to the story object that, if attached, will receive the touch events instead of the object.
  • The game controller components are illustrated in FIG. 32, with class definitions for objects used for their implementation shown in FIG. 33. Games will largely be custom coded classes.
  • PSObjectFindingGame 3302 will be a game in which the user looks for an object, and the PSTargetHittingGame will be a game in which the user tries to hit a target, such as a baseball or archery game. Games will use PSStoryObjects 3306 as their assets (possibly shared with the PSStoryWorld 3100).

Abstract

A mechanism for navigating through and interacting with touch screen-based media, where a user touches the screen and forms closed geometric figures to cause actions within the media. The display shows a landscape, and when the user interacts with the landscape a geometric figure appears, which the user can trace to open an episode, play a scene or game, or complete a task. Users can find or assemble objects and solve puzzles; on completion, a geometric figure is presented for the user to trace. After completing the trace, the user is allowed to access new scenery or play a game. For example, where the geometric figure is a square, a user traces the square to expand the square and access content. The user can exit their current location by touching and dragging an edge of a square to shrink the square and go back to their previous location.

Description

[0001 ] METHOD AND APPARATUS FOR INITIATING AN INTERACTIVE
LEARNING EXPERIENCE
[0002] PRIORITY CLAIM
[0003] This application is a non-provisional application, claiming the benefit of priority to provisional application number 61/660,051, filed in the United States on June 15, 2012, titled "Method and Apparatus for Initiating an Interactive Learning Experience."
[0004] BACKGROUND OF THE INVENTION
[0005] (1. ) Field of the invention
[0006] The present invention relates to an interactive learning program designed principally for children, in which a computer display displays images that create a "story world" and in which the user draws a closed geometric figure, such as a square or circle, to interact with the story world. Typically, the closed geometric figure creates a "portal" through which the user enters the story world and interactively explores the story world by finding items, connecting items, creating items, and navigating.
[0007] (2) Description of Related Art
[0008] Methods for interacting with computer systems are currently commonplace, with devices that use tactile input such as through a keyboard, mouse, and/or touch screen, as well as more exotic inputs such as motion sensing and sound. These inputs are commonly used for navigating through an environment such as those provided by Windows-type operating systems. Most common interactive methods are general purpose and are not suited to a specific demographic.
[0009] A continuing need exists for interactive methods tailored to a specific demographic. The present invention fills one such need in that it provides a simple method of interaction, relying on the creation of closed geometric figures as a means for navigating a "story world," and is intended to be easily accessible for children.
[00010] SUMMARY OF INVENTION
[00011] A method for interacting with interactive media on a computer having a display which changes in response to a user action is presented. The display has a display border, and the method performs an act of displaying a landscape display having thereon an object having a closed geometric figure as an object border on the display, the border having a length. Next, the method performs an act of detecting the user action as an action selected from a group consisting of detecting motion of a user's touch in a direction of the object border within a predetermined distance and along a predetermined percentage of the length of the object border; and detecting a user's touch within the object border. After the detecting act has been performed, an act selected from a group consisting of expanding a size of the closed geometric figure until the object border expands to at least the size of the display border; and expanding a size of the closed geometric figure until the object border expands to at least the size of the display border and dissolving the geometric figure as it expands, is performed. Finally, the landscape display is replaced with a different display.
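The detecting act above (motion within a predetermined distance of the object border, along a predetermined percentage of its length) can be sketched as a geometric check. This Python sketch is illustrative only; the distance and percentage thresholds, and the point-sampled approximation of the traced length, are hypothetical choices:

```python
def distance_to_square_border(px, py, x, y, s):
    """Distance from a point to the nearest edge of a square border with
    corner (x, y) and side length s."""
    dx = max(x - px, 0, px - (x + s))
    dy = max(y - py, 0, py - (y + s))
    if dx > 0 or dy > 0:
        return (dx*dx + dy*dy) ** 0.5                       # point is outside
    return min(px - x, x + s - px, py - y, y + s - py)      # inside: nearest edge

def trace_accepted(path, x, y, s, max_dist=8.0, min_fraction=0.8):
    """Accept the trace if the touch path stays within max_dist of the
    border and covers at least min_fraction of the border's length
    (approximated by the path length of the near points)."""
    near = [p for p in path if distance_to_square_border(*p, x, y, s) <= max_dist]
    covered = sum(((ax - bx)**2 + (ay - by)**2) ** 0.5
                  for (ax, ay), (bx, by) in zip(near, near[1:]))
    return covered >= min_fraction * 4 * s

# A full trace of the perimeter of a 10-unit square is accepted.
full_trace = [(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]
assert trace_accepted(full_trace, 0, 0, 10) is True
```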
[00012] In another aspect, in the act of displaying a landscape display, the object is a square.
[00013] In a still further aspect, the act of detecting is completed within a predetermined time duration.
[00014] In yet another aspect, after the act of expanding a size of the closed geometric figure, the object border remains displayed.
[00015] In another aspect, the invention further comprises an act of detecting when the user's touch points at the object border and moves toward the center of the display, and contracting the object border to follow the touch.
[00016] In a still further aspect, the object border automatically contracts after the object border reaches a first predetermined size.
[00017] In another aspect, the invention performs a still further act of replacing the different display with the landscape display when the object border contracts to a second predetermined size.
[00018] In a yet further aspect, responsive to the user completing a predetermined task in conjunction with the different display, an act of displaying on the landscape display a second object indicative of a completion of the predetermined task is performed.
[00019] In a further aspect, selection of the second object causes the landscape display to be replaced with the different display.
[00020] In still another aspect, responsive to the user failing to complete a predetermined task in conjunction with the different display, an act of fading the different display into the landscape display and redisplaying the object on the display, thereby resetting the mechanism to detect the motion of the user's touch again, is performed.
[00021] In another aspect, a prompt is presented to the user if the predetermined task is not completed within a predetermined time.
[00022] In yet another aspect, the act of expanding the size of the closed geometric figure comprises fading the landscape display into the different display.
[00023] In a further aspect, the different display has a new object having a closed geometric figure as an object border displayed therein, and the acts of using the mechanism to detect the motion of the user's touch, expanding the size of the closed geometric figure, and replacing the landscape display are repeated for the new object.
[00024] In still another aspect, in the act of displaying a landscape display, the object border is delineated on the display by predetermined display indicia.
[00025] In another aspect, the predetermined display indicia include at least one of a glowing line and a sparkling line.
[00026] In a further aspect, the act of detecting the user's touch comprises generating a prompt presentation to the user when the user's finger has not moved along a predetermined percentage of the object border length within a predetermined time.
[00027] In a yet further aspect, the acts described above are in the form of computer-readable instructions operated by a data processing system comprising an interactive data processing device.
[00028] In a still further aspect, the acts described above are in the form of computer-readable instructions stored on a computer-readable medium for operation by a data processing system comprising an interactive data processing device.
[00029] BRIEF DESCRIPTION OF THE DRAWINGS
[00030] The objects, features and advantages of the present invention will be apparent from the following detailed descriptions of the various aspects of the invention in conjunction with reference to the following drawings, where:
[00031] FIG. 1 is an illustration of a data processing system used to facilitate the present invention;
[00032] FIG. 2 is an illustration of a computer program product used to facilitate the present invention;
[00033] FIG. 3A is an illustration of a landscape with an episode icon thereon;
[00034] FIG. 3B is an illustration of a user interacting with the icon on the landscape, with the icon being highlighted as a result;
[00035] FIG. 4A is an ii lustration of a landscape with an episode icon and a group of inter-episode game squares;
[00036] FIG. 4B is an illustration of a landscape with an episode icon and a group of inter-episode segment squares;
[00037] FIG. 5A through 5D is an illustrative sequence depicting the user opening a square and, alternatively, the square closing when the user fails to complete the opening process;
[00038] FIGs. 6A and 6B illustrate squares that are highlighted to indicate the perimeter around which a user should draw a square;
[00039] FIGs. 6B and 6C illustrate the same squares shown in FIGs. 6A and 6B, where a user has drawn around a portion of the perimeter of the square;
[00040] FIG. 7 is an illustration of an "ideal" square along with a region of tolerance within which a user can draw, with the figure still being considered sufficient as a square;
[00041] FIG. 8 is an illustration depicting a discretized version of the "ideal" square and region of tolerance from FIG. 7;
[00042] FIGs. 9A and 9B are a sequence illustrating a user panning across a scene according to the present invention;
[00043] FIG. 10 is an illustration depicting the user's frame of view with respect to a three-dimensional scene within a story world;
[00044] FIG. 11A through FIG. 11C is a sequence of illustrations wherein a user finds a letter "b" and moves it from an obscured location to an unobscured location where the letter is highlighted and a square can be drawn thereabout;
[00045] FIGs. 12A and 12B illustrate the user moving the letter "b" to complete the word "bat," and the subsequent formation of a square around the word to indicate that a square can be drawn thereabout;
[00046] FIGs. 13A and 13B illustrate the user moving the final base in position to finish the formation of a baseball diamond, and the subsequent formation of a square around the base to indicate that a square can be drawn thereabout;
[00047] FIGs. 14A through 14C illustrate a user shrinking out of a scene or a game, where the user stops and the square retains its size for a predetermined amount of time;
[00048] FIGs. 15A and 15B illustrate a user closing a square using a two-finger "pinching" method;
[00049] FIG. 16 is an illustration of major components of a software system according to the present invention;
[00050] FIG. 17 is a state information flow diagram for the components presented in FIG. 16;
[00051] FIG. 18 is a set of tables showing class definitions for the state tracker and state objects;
[00052] FIG. 19 is a flow chart presenting the interactions in the system available through drawing a square;
[00053] FIG. 20 is an illustration of the components of the landscape according to the present invention;
[00054] FIG. 21 is a set of tables showing class definitions and objects used to
implement the components presented in FIG. 20;
[00055] FIG. 22 is an illustration of the components of the feedback and help system;
[00056] FIG. 23 is a set of tables showing class definitions and objects used to
implement the components presented in FIG. 22;
[00057] FIG. 24 is an illustration of the components of the square drawing controller;
[00058] FIG. 25 is a set of tables showing class definitions and objects used to
implement the components presented in FIG. 24;
[00059] FIG. 26 is an illustration of the components of the square transition controller;
[00060] FIG. 27 is a set of tables showing class definitions and objects used to
implement the components presented in FIG. 26;
[00061] FIG. 28 is an illustration of the components of the video playback controller;
[00062] FIG. 29 is a set of tables showing class definitions and objects used to
implement the components presented in FIG. 28;
[00063] FIG. 30 is an illustration of the components of the story world controller;
[00064] FIG. 31 is a set of tables showing class definitions and objects used to
implement the components presented in FIG. 30;
[00065] FIG. 32 is an illustration of the components of the game controller; and
[00066] FIG. 33 is a set of tables showing class definitions and objects used to
implement the components presented in FIG. 32.
[00067] DETAILED DESCRIPTION
[00068] The present invention relates to an interactive user interface with which a user can interact and navigate through an interactive storyline by drawing closed geometric objects such as squares on a display. Such interaction may take the form of touch in the case where the display is a touch screen. Other non-limiting examples include interaction through acoustics, pointing, and motion sensing. The following description is presented to enable one of ordinary skill in the art to make and use the invention and to incorporate it in the context of particular applications. Various modifications, as well as a variety of uses in different applications, will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to a wide range of embodiments. Thus, the present invention is not intended to be limited to the embodiments presented, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
[00069] In the following detailed description, numerous specific details are set forth in order to provide a more thorough understanding of the present invention. However, it will be apparent to one skilled in the art that the present invention may be practiced without necessarily being limited to these specific details. In other instances, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring the present invention.
[00070] The reader's attention is directed to all papers and documents which are filed concurrently with this specification and which are open to public inspection with this specification, and the contents of all such papers and documents are incorporated herein by reference. All the features disclosed in this specification (including any accompanying claims, abstract, and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features.
[00071] Furthermore, any element in a claim that does not explicitly state "means for" performing a specified function, or "step for" performing a specific function, is not to be interpreted as a "means" or "step" clause as specified in 35 U.S.C. Section 112, Paragraph 6. In particular, the use of "step of" or "act of" in the claims herein is not intended to invoke the provisions of 35 U.S.C. 112, Paragraph 6.
[00072] Note, the labels left, right, front, back, top, bottom, forward, reverse, clockwise and counter-clockwise have been used for convenience only and are not intended to imply any particular fixed direction. Instead, they are used to reflect relative locations and/or directions between various portions of an object. As such, as the remotely controlled vehicle is turned around and/or over, the above labels may change their relative configurations.
[00073] Before describing the invention in detail, an introduction is provided to give the reader a general understanding of the present invention. Next, a description of the principal aspects of the invention is provided. Then, a description of various interface functions of the present invention is provided. Finally, software implementation components are described to assist in a better technical understanding of the invention.
[00074] (1) Introduction
[00075] The present invention relates to an interactive learning program designed principally for children, and desirably for users in the 2 - 8 year old range. The program runs on a computer/computing device having a touch screen or other interactive display. The screen displays images that create a "story world." The user draws a closed geometric figure, such as a square, in order to open a portal to a story world. Although a square is used as an example in the discussion below, and as will be appreciated by those of skill in the art, other figures such as circles and stars could also be used. Further, combinations of different figures may be used to cause different actions. The user can enter the story world through the portal and can interactively explore the story world, find things, put things together, build something (a variation of putting things together), watch video segments and play mini-games. Some of the interactive activities create opportunities to draw more figures and open additional portals into additional episodes or story worlds. The user's experience accumulates a collection of "play squares": badges of their successes and shortcuts back to places they have already been.
[00076] (2) Principal Aspects
[00077] The present invention has three "principal" aspects. The first is a story world user interface system. The story world user interface system is typically in the form of a data processor having a computer system operating software or in the form of a "hard-coded" instruction set. This system may be incorporated into a wide variety of devices that provide different functionalities. The second principal aspect is a method, typically in the form of software, operated using a data processing system (computer). The third principal aspect is a computer program product. The computer program product generally represents computer-readable instructions stored on a computer-readable medium such as an optical storage device, e.g., a compact disc (CD) or digital versatile disc (DVD), or a magnetic storage device such as a floppy disk or magnetic tape. Other, non-limiting examples of computer-readable media include hard disks, read-only memory (ROM), and flash-type memories. These aspects will be described in more detail below.
[00078] A block diagram depicting the components of an interactive media user interface system of the present invention is provided in FIG. 1. The interface system comprises a user interface system 100 having an input 102 for receiving user input, principally from a touch screen. Note that the input 102 may include multiple "ports." An output 104 is connected with the processor for providing information to the user, typically both visual and audio. Output may also be provided to other devices or other programs, e.g., to other software modules, for use therein. The input 102 and the output 104 are both coupled with a processor 106, which may be a general-purpose computer processor or a specialized processor designed specifically for use with the present invention. The processor 106 is coupled with a memory 108 to permit storage of data and software to be manipulated by commands to the processor.
[00079] An illustrative diagram of a computer program product embodying the present invention is depicted in FIG. 2. The computer program product 200 is depicted as an optical disk such as a CD or DVD. However, as mentioned previously, the computer program product generally represents computer-readable instructions stored on any compatible computer-readable medium.
[00080] (3) Glossary
[00081] Before describing the specific details of the present invention, a glossary is provided in which various terms used herein and in the claims are defined. The glossary provided is intended to provide the reader with a general understanding of the intended meaning of the terms, but is not intended to convey the entire scope of each term. Rather, the glossary is intended to supplement the rest of the specification in more accurately explaining the terms used.
[00082] Episode: A collection of video segments.
[00083] Extended game: A more fully developed mini-game, or a collection of related mini-games. Extended games may be based on the property but not on the specific episode from which it is accessible. "Extended games" and "mini-games" may simply be referred to as "games" in this document when the context applies to both.
[00084] Interactive episode: A story-based episode, which is an interactive experience that is built around a given media segment and comprises a number of interactive tasks.
[00085] Interactive segment: A segment within an interactive episode where a user can interact with the content of the interactive episode.
[00086] Landscape: A part of the interface in which a user begins the learning experience. This is a point from which the user can navigate through all of the content they have unlocked. It is typically a location where all of the geometric figures (i.e., squares) a user has unlocked are located.
[00087] Mini-game: Goal-oriented interactive activities that take place within an interactive segment of a story-based episode.
[00088] Story world: A location or set of locations in which the user's interactions with the story-based episode take place.
[00089] Video segment: A portion of an interactive episode that comprises non-interactive video content; non-limiting examples of which include text, graphics, images, and motion pictures.
[00090] (4.1) Interface Functions
[00091] Interactive episodes are interactive stories that are based on either original concepts or properties that have been licensed from other content generators. The present invention is designed as an interface for navigating within and interacting with an interactive episode's story world. Interactive episodes are derived from existing media such as books, television shows, and movies. An example of a property is Word World™, a series of television episodes produced for the Public Broadcasting System. Some properties such as television shows will have episodes, each of which is repurposed to build an interactive learning experience. The discussion herein is built around the non-limiting example of an interactive episode, which is an interactive experience that is built around a given episode and comprises a number of interactive tasks. The completion of each task by the user will unlock either a video segment to be viewed or a game to be played. Each episode will be divided into a series of smaller video segments. For example, an eleven-minute to fourteen-minute television episode may be divided into seven to ten segments.
[00092] The geometric figures, e.g., squares, are a unique and defining element of the interface. Within the interface users will be presented with times when they can draw a square. The act of drawing a square will allow the square itself to open as a portal to new content. The more squares a user discovers and creates, the more content the user unlocks. In this example, there are three types of squares to start with. An episode square lets the user "enter" and play an episode. A segment square unlocks and provides a portal to an episode segment, and a game square unlocks and provides a portal to a game. Note that the geometric figures may or may not have an apparent or visible border.
[00093] The interactive experience starts with a play square "landscape," which is the part of the interface at which the user begins the learning experience. In one embodiment, as shown in FIG. 3A, the landscape (touch screen or portion thereof) has a single episode icon 300. When an appendage 302 of the user touches the episode icon 300, a glowing square outline 304 appears around it as shown in FIG. 3B. Note that the term "appendage" may be part of the user's body or an accessory such as a stylus or even a pointing device that does not require physical contact with the screen. Also note that although the square outline 304 is depicted as a glowing outline, many other variations are possible, non-limiting examples of which include highlighting, lowlighting, color-changing, flashing, etc. Finally, note that the landscape may be made up of various scenes and that the portal may be represented by an icon or may take the form of one of many different shapes. The landscape is also the place where users will return between episodes. It is the launching place for activating and entering episodes and it is where a user aggregates earned squares for later navigation. Typically, the user will trace the outline of the square 304 to cause the square to activate. Other interactions with the square outline 304 are also possible, though, such as simply touching the square or touching certain parts of the square, as will be appreciated by one of skill in the art.
[00094] As a user plays an episode, additional, typically smaller, inter-episode squares 400 will appear, adding to the square collection, as depicted in the example landscapes represented by FIGs. 4A and 4B. Inter-episode squares 400 represent the video segments and games that the user has unlocked in the course of the experience. In the example shown in FIGs. 4A and 4B, the resulting landscape will have an episode square 402 on it that represents the episode, and other (typically smaller) inter-episode squares 400 that represent both segment and game squares that have been unlocked. In the example shown, a total of nine segment and game squares appear in two different lists, with the inter-episode squares 400 shown in FIG. 4A representing two game squares and those in FIG. 4B representing seven segment squares. Squares or icons representing content may be placed in various landscape forms.
[00095] When in the landscape, a user can re-enter any of the squares. Re-entering the episode square 402 allows the user to pick up
where they left off and continue their journey. In another aspect, when a user re-enters an episode square 402 after having completed and unlocked all of the inter-episode (segment and game) squares 400, the user may be
presented with an option to reset the episode and start the story at the
beginning. Unless otherwise desired, resetting the episode typically does not remove the inter-episode squares 400 from the landscape; it simply allows the user to replay the story world tasks in order again.
[00096] From the landscape a user can re-enter a square that they have already created by, for example, touching and holding the square. As the user maintains their hold on the square, the square expands and fills the screen as progressively shown in FIGs. 5A, 5B, and 5C. The square 500 is shown before being touched in FIG. 5A. When the user touches the square 500 as shown in FIG. 5B, the square 500 expands to fill the landscape with the resulting expansion shown in FIG. 5C. On the other hand, if the user releases their hold on the square before the square expands beyond a predetermined size as shown in FIG. 5D, the square 500 contracts to its smaller, unopened state. Once the square 500 has expanded beyond the predetermined size, it will continue to expand (open) even if the user's hold is released. In addition, after the size threshold is reached, the image in the square 500 cross-dissolves into an image of the next environment.
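The release behavior described above amounts to a simple size-threshold test. The following is a minimal illustrative sketch (the function name and pixel values are hypothetical, not part of the specification):

```python
def square_size_on_release(size_at_release, unopened_size, full_size, threshold):
    """Final size of a held square when the user releases their touch.

    Below the predetermined threshold the square contracts back to its
    unopened state; at or beyond it, the square continues opening to
    fill the screen even though the hold was released.
    """
    if size_at_release < threshold:
        return unopened_size
    return full_size
```

For example, with a hypothetical threshold of 300 pixels, a release at 120 pixels snaps the square shut, while a release at 400 pixels lets it finish opening.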
[00097] When a user first opens an "episode square" he/she opens a portal into a location within the story world, which is the world of the underlying media (licensed property). Each story world has a variety of different locations therein. When the user opens an interactive episode he/she will enter a particular location in the story world. It is from this starting location that the user will begin to explore and unlock all the episode segments and games. Story worlds may have several unique story world locations per episode.
[00098] Once in a story world location, the user will be prompted (or discover) how to unlock the segment squares and the game squares. Although variations may be created without deviating from the present invention, the ability to unlock segment and game squares is desirably presented in a linear sequence. Therefore, in most cases, at any given time there is only one square to be discovered and created. It is only after unlocking the presently available square that the next square becomes available to unlock. This allows for a simple and natural progression and helps to ensure that the user experiences the segments and games in a desired order.
[00099] The drawing of a square and the unlocking of a portal is the signature interface for the inventive system. A translucent square in the background framing a logo or an object signifies that there is the opportunity to draw a square.
Drawing a square creates a portal into a new piece of content. There are generally two situations where the transparent square appears and signifies that there is an opportunity to draw a square and open a portal. One situation is in the play square landscape around an icon for a particular episode. The second situation is, once the user has opened that episode and entered the story world, the user can complete a task and be rewarded with glowing squares and the opportunity to draw a new square and open a portal to an episode segment.
[000100] The user draws a square with their finger and receives feedback as they do so, requiring them to stay relatively close to drawing something that looks like a square and allowing the application to judge when it is complete enough to be considered a square. This is illustrated in FIGs. 6A through 6D. Glowing, transparent squares 600 and 602 are shown surrounding the words "Word World" and "bat" in FIGs. 6A and 6B, respectively. The glowing, transparent squares 600 and 602 appear when it is time to draw a square. In the example shown, they appear behind the icon or object around which a square is to be drawn and serve as guides for where the user should draw the square. As the user draws the square, and to emphasize the action of tracing the outline of the square, an effect such as a light sparkle (lens flare) travels around the edge of the square following the path that the user traverses as they draw the square. Such paths are indicated by the highlights for "Word World" 604 and "bat" 606 in FIGs. 6C and 6D, respectively. Note that the paths may be highlighted either ahead of the user's touch to assist them in tracing the square or behind the user's touch as an indication that they are tracing correctly (or with different highlights ahead and behind).
[000101] As a user traces a square, an area of "tolerance" is provided so that the user can deviate to a desired degree. FIG. 7 depicts an ideal square 700 with an inner tolerance figure 702 and an outer tolerance figure 704. As a user traces the square, as long as the path of their touch is between the inner tolerance figure 702 and the outer tolerance figure 704 (within the area of tolerance), their trace is considered valid and will continue toward completion of the square. However, if it deviates from the area of tolerance, the trace will be reset and the user will be required to start the trace again. In order to make the system more forgiving, in some aspects, it may be desirable to allow for time- or geometry-based deviations from the area of tolerance. In other words, some embodiments may allow a user to deviate from the area of tolerance for a certain time interval or to a certain geometrical degree before being required to start the trace again. Thus, although being depicted as a simple geometric construct, the area of tolerance may involve other factors.
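One way to realize the area of tolerance is to compare each touch point's Chebyshev ("square") distance from the center against the inner and outer tolerance figures. This is an illustrative sketch only; the specification leaves the exact test open, and the function name is invented:

```python
def trace_point_valid(x, y, center, inner_half, outer_half):
    """Return True if a touch point lies within the tolerance band
    between the inner tolerance square and the outer tolerance square
    (both centered on the ideal square, with the given half-widths)."""
    cx, cy = center
    # The Chebyshev distance from the center is constant along the
    # perimeter of an axis-aligned square, so it measures how far the
    # point sits from the ideal outline.
    d = max(abs(x - cx), abs(y - cy))
    return inner_half <= d <= outer_half
```

A point too close to the center (inside figure 702) or too far outside (beyond figure 704) fails the test and would reset the trace.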
[000102] Additionally, the figure drawn by the user should match closely with the edge of the transparent square that floats and glows behind the icon or object. Around the ideal location for drawing (represented in FIG. 7 by the ideal square 700) is a zone of "drawable" area 706, falling in the area of tolerance. As long as the user draws within this area, the drawing will appear on the screen. The square will be considered complete once a pre-determined percentage (for example ninety percent) of the square is drawn. The percentage drawn may be simply the linear percentage of the square that has been traced (neglecting deviations from the ideal square 700), or it may be determined by a more complex mechanism such as by dividing the drawing area into a number of smaller, discretized areas 800 as shown in FIG. 8 (with only a few representative areas 800 being numbered for clarity).
[000103] The discretized areas 800 will then be examined to determine if they were included in the user's trace of the square. Once the predetermined percentage of the discretized areas 800 have been included in the user's trace, the square will be considered complete and the portal will open. This completion evaluation only occurs when the user is no longer touching the screen (or has no longer touched the screen for a predetermined time interval). This way the user's trace is not interrupted - they are allowed to finish tracing to the degree they intend, lift their finger, and watch for the reward of the square opening into a portal. This implementation approach does not necessarily require the square to be drawn in any particular order (particularly if the user is afforded a time interval to stop touching and resume). They can keep drawing parts of the square until they have filled in a predetermined percentage of the perimeter.
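The discretized completion check lends itself to a straightforward implementation: mark each perimeter area the trace passes through, then test the covered fraction on touch release. A hedged sketch follows (the 90% default mirrors the example above; the helper names are invented):

```python
def mark_covered(covered_flags, areas, point):
    """Mark every discretized perimeter area (given as (x0, y0, x1, y1)
    rectangles) that contains the touch point."""
    x, y = point
    for i, (x0, y0, x1, y1) in enumerate(areas):
        if x0 <= x <= x1 and y0 <= y <= y1:
            covered_flags[i] = True
    return covered_flags

def square_complete(covered_flags, required_fraction=0.9):
    """Evaluate completion once the user lifts their finger: the square
    counts as drawn when the covered fraction of areas meets the
    predetermined percentage, regardless of drawing order."""
    if not covered_flags:
        return False
    return sum(covered_flags) / len(covered_flags) >= required_fraction
```

Because only the covered fraction matters, the user may draw the sides in any order, as the paragraph above notes.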
[000104] As a further aspect, if the square is not complete and the system is idle for a predetermined period of time, a guide "character" may enter the frame and prompt for where to draw to complete the square as an aid to the user. If there is still longer idle time, the character may come in and explain that the drawing is fading out and the user will need to start over.
[000105] Completion of the square may be noted by visual and audio feedback. For example, a sound and visual effect may appear, with the square then growing to fill the screen, stretching and morphing from a square to the aspect ratio of the computer display. As the square expands to fill the screen, the content of the square cross-dissolves to the new content: video, game, or story world location.
[000106] When an episode square is opened, the user enters the story world location. In a further aspect, the drawn line of the square may appear around the edge of the screen in order to remind the user that they are in a "square." Such an outline on the edge of the screen can also serve as an interface object for exiting the square, as will be described further below. Navigating (or looking around the story world) comprises touching the background and dragging the user's touch from side to side. This will pan and scroll the background with movements of the user's touch, as depicted in FIGs. 9A and 9B, where FIG. 9A depicts a user moving their touch to the left 900, and FIG. 9B depicts the resulting scene.
[000107] Each interactive segment has a single panoramic view across which a user can pan. For example, the panoramic view may include an image of a bird's nest in front of a lake and, as the user pans around, a baseball field comes into view. The panoramic views may loop back on themselves, creating the effect of a 360 degree landscape. This is meant to give the user the feeling of being in a particular location in the story world. An example of this concept is depicted in FIG. 10, where a 360 degree landscape is depicted by the larger outline 1000. The view afforded to the user is depicted by the smaller outline 1002, representing the "landscape" that the user can view at any given time. Note that the larger outline 1000 shown may allow for 360 degree panning in both horizontal and vertical directions such that the larger outline 1000 takes the form of a globe across which the user may scan, with their field of view being that of the smaller outline 1002.
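A looping panorama of this kind reduces to wrapping the pan offset modulo the panorama's width. A one-line sketch (the width used in the example is arbitrary, not from the specification):

```python
def wrap_pan_offset(offset, panorama_width):
    """Wrap a horizontal pan offset so the view loops seamlessly,
    producing the effect of a 360 degree landscape."""
    # Python's % always returns a non-negative result for a positive
    # modulus, so panning left past zero wraps to the far edge.
    return offset % panorama_width
```

The same wrap applied independently to a vertical offset would approximate the globe-like two-axis panning variant.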
[000108] Each story world is populated with objects. As the user pans and scrolls, they see these objects, many of which they are able to interact with. Interaction with these objects may unlock the next square. At this level, there are two important types of objects: simple objects and plot objects. Interacting with plot objects facilitates the discovery of a new square and moves the user along in the episode. Example interactions with both simple objects and plot objects are described below.
[000109] Simple objects are objects that do something very simple - perform an animation or sound when touched. They are not part of the larger progression of interactivity of the game play. They are just fun, entertaining and/or educational surprises that the user will benefit by discovering. Within most story worlds there will be at least a few of these when the user enters a location for the first time. Over time, and with subsequent updates, more objects will be added to the story world location. This will add to the feeling that the user is expanding the story world and that the story is becoming richer and fuller (more involving). Interacting or not interacting with these objects has little or no bearing on the progression of the episode. They are just entertaining diversions.

[000110] The user can also interact with plot objects, as shown in FIGs. 11A through 11C. The simplest interaction with a plot object is just
finding/discovering it. As a non-limiting example, the task is to find the letter "b" in the story world. As shown in FIG. 11A, the letter is partially occluded behind a tree. This provides a clue to the user. The user may then touch and/or drag the letter "b" from behind the tree as shown in FIG. 11B. The act of touching and/or dragging the object will bring a translucent square behind the object, with the result being shown in FIG. 11C. This translucent square is the indicator that the user has discovered another square. There may also be a special sound as the discovery is made. The user is given an opportunity to draw a square around the letter "b" and open the portal to the episode segment or mini-game to which the square links.
[000111] A more complicated interaction with plot objects would involve collecting a few plot objects and putting them together to create a synergistic, meaningful/different whole. As an example, putting the letters "b," "a," and "t" together in order to make the word "bat," which has a meaning apart from its components. In this case, once the word is completed, as a reward, a new square may be unlocked. When a new square is unlocked, the system may alert the user by playing a special sound and highlighting the new square. As the user finds plot objects and drags them to, at, or near their appropriate respective locations, they will snap into place, awaiting the rest of the objects.
[000112] As an example, in the case where the user is to form the word "bat," when the user first finds and touches the letter "a," it will move to a location near the top center of the screen and a grayed out translucent square (large enough to contain all three letters) will appear behind it. The user can then pan or scroll by dragging the background. The letter "a" and the grayed out square stay in view, maintaining their top center location. When the letters "b" and "t" are found and dragged to the gray square they snap into place. Once the word is completed, the square glows and the reward sound plays. In FIG. 12A, the user has collected the letters "a" and "t" and is shown in the process of dragging "b" to its location. After the user has completed this action, a square is highlighted around the word "bat" as shown in FIG. 12B, and the reward sound plays.
[000113] Plot objects can also be found and then placed into an appropriate location in the story world. For example, the bases for the baseball field can be placed into their appropriate locations in a baseball field. The bases may initially be located in a pile in the story world. The act of touching one of the bases 1300 activates a glowing indicator 1302 at the location where that object needs to be placed, as shown in FIG. 13A. The user can then move the base 1300 to the glowing indicator 1302. This is repeated until all of the objects have been placed at their appropriate locations. Once this occurs, a glowing square 1304 appears, allowing a portal to be opened as shown in FIG. 13B. For larger tasks, such as this example of placing the bases on the baseball diamond, the square that appears after completion may not encompass all the objects placed but only the last object placed.
[000114] This same process can be used to build any object in the context of the story world. For example, the user might place the letters "B," "A," "R," and "N" into positions that form the word "BARN" in order to unlock a segment about a pig and his barn. Then when the user returns to the story world location after seeing the segment, the barn is now visible. Non-limiting examples of items to be moved are: moving letters to form words, moving a set of sub-objects into their proper geometrical relationship such as forming a baseball field from a set of bases, moving puzzle pieces into their proper configurations, and moving numbers into their logical relationship to form an equation. As will be appreciated by one of skill in the art, items to be moved and their final configuration/relationships can be used to teach a wide variety of concepts.

[000115] For video playback, after the square expands to fill the screen and cross-dissolves to the first frame of the video, the video controls fade in. At this point, the drawn frame of the square remains around the edge of the screen. The video will automatically begin playing. As the video starts playing, both the video controls and the square frame fade out. If the screen is touched at any point, the video controls and the frame fade back in. The customized controls include a play/pause button, a rewind to beginning button, and a volume slider. The square frame and the controls fade in when the video comes to the end and stay visible until an action is taken (replay or zoom out). As previously discussed, the square frame allows a user to exit the video and close the current square, thus navigating back to a previous location.
[000116] Just like the segment squares, mini-game squares are unlocked by actions in the story world and also expand like the segment squares to fill the screen. However, rather than opening a video player, a mini-game square cross-dissolves to a new screen that is the setting for a mini-game. These mini-games will be in their own environment and may be a rendering of a particular location in the story world from a vantage point not seen in the panoramic view. Using the example from FIGs. 13A and 13B, by completing the baseball diamond and subsequently opening the resulting glowing square 1304, the user may enter a baseball game. Once the user unlocks and enters this game square, the user finds him/herself on the pitcher's mound ready to throw the ball. Structuring the system so that squares are the way to enter these mini-games allows views and items to be created that are best for the games - instead of trying to incorporate them into the story world panoramic. It also makes it clear that the games are collected items that the user has earned.
[000117] Once a user has entered a square of any type, the user must exit the square in order to return to a higher level. In any square, after the square expands, the frame (drawn square line) will be visible along the edge of the screen. Touching this frame and dragging toward the center will shrink the square such that the frame stays under the user's finger. The frame follows the finger movement until the finger is released. If the finger is released with less than a predetermined amount of movement (e.g., twenty percent of the distance toward the center) then the square springs back to full size when released. Otherwise, the square continues shrinking automatically and collapses, and the user is returned to a previous location. If the user drags the frame past the point of a minimum size for the square, but not enough to trigger a full collapse and return to a previous location, the square stops shrinking and maintains its size until either the finger is released or the user moves the finger back out, at which point the square will start tracking the finger again. This operation is shown in the sequence depicted by FIGs. 14A through 14C.
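The release decision for the drag-to-exit gesture can be expressed as a fraction-of-distance test. An illustrative sketch (the twenty percent figure comes from the example above; the function and parameter names are invented):

```python
def square_closes_on_release(drag_distance, distance_to_center,
                             close_fraction=0.2):
    """True if a dragged square frame should collapse on release.

    The frame must have moved at least close_fraction (e.g. twenty
    percent) of the way toward the center; otherwise the square
    springs back to full size.
    """
    return drag_distance >= close_fraction * distance_to_center
```

For instance, with 400 points from edge to center, releasing after an 80-point drag is just enough to trigger the collapse.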
[000118] When the user returns to the landscape, the square comes back to its location in the landscape and cross-dissolves to the original icon. The cross-dissolve does not start until the frame has shrunk beyond a threshold size. When returning to a story world location, the square simply shrinks down to nothing, returning the user to the story world. In some versions, the square may continue to have a presence in the story world location after it has been viewed. This presence would allow a user to return to a square already discovered without going back out to the starting landscape.
[000119] In another example, two finger "pinching" may be used as a way of closing a square. In this case, if two fingers are placed on the background and moved toward each other, the square starts to shrink. As with the previous example, the user also needs to cross a frame size threshold by a predetermined amount before the square will close on release; otherwise the square will spring back. This operation is shown in the sequence depicted in FIGs. 15A and 15B.
[000120] (4.2) Software Implementation Considerations

[000121] The system of the present invention may be implemented in software running on a touch screen-based computing platform, examples of which include desktop computer systems with touch screens, tablet computers, laptop computers with touch screens, and mobile phones. Whereas the previous portion of the description presented the functionality of the invention from a user experience perspective, this portion is intended to provide technical details for an exemplary implementation of software according to the invention.
[000122] The major components of such a software system according to the present invention are presented in FIG. 16. The state information flow for these components is presented in FIG. 17. The remainder of this portion of the description focuses on class definitions for the objects used to implement this system.
[000123] Class definitions for the state tracker and state objects are presented in FIG. 18. The PSStateTracker 1800 is a singleton object that has methods to receive state changes and notifications of touch events. The state tracker 1800 also owns the PSFeedbackController 1802 and passes state information to it. The state information is represented by PSState objects 1804. State objects 1804 are cached in a dictionary (keyed by name) and updated with current time information when they are revisited. All other parts of the application will need to pass any state changes to the state tracker 1800. State objects (such as the current location and current story world) can be accessed by other parts of the application but should not be modified by anything other than the state tracker 1800.
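The state tracker pattern described here - a singleton caching state entries by name and refreshing their time information on revisit - can be sketched as follows (the original is an Objective-C PSStateTracker class; this Python rendering is purely illustrative):

```python
import time

class StateTracker:
    """Singleton that caches state entries by name and updates their
    current-time information when they are revisited."""
    _instance = None

    def __new__(cls):
        # All callers share one instance, mirroring the singleton
        # PSStateTracker that every part of the app reports to.
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            cls._instance._states = {}
        return cls._instance

    def record(self, name):
        # Revisiting a cached state simply refreshes its timestamp.
        self._states[name] = time.time()

    def last_seen(self, name):
        return self._states.get(name)
```

The singleton discipline matches the rule above that other components report state changes to the tracker but do not modify state objects themselves.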
[000124] State information that needs to be gathered includes all touches (for timeout purposes), state transitions (enter, exit may either be explicit or implied), current location (landscape, story world, video player, game), drawing a square, current goal in story world, current episode and scene in story world, and the first time in current location.
[000125] A flow chart illustrating interactions in the system that are available through the drawing of squares is depicted in FIG. 19.
[000126] The components of the landscape are illustrated in FIG. 20, with class definitions for the objects used to implement them shown in FIG. 21. The PSPlaySquareLandscape object 2100 is a singleton object that is a CCLayer with a background sprite (and may have multiple background tiles for scrolling) and multiple PSLandscapeIcon objects 2102. Each PSLandscapeIcon object 2102 acts as a link to another location. Subclasses are customized to particular types of locations: PSStoryWorldIcon 2104, PSVideoIcon 2106, and PSGameIcon 2108. Each icon responds to touches, drags, and long presses in the same way. Touches start the square drawing controller if the location has not been previously visited. Drags drag the icon around the landscape. Long presses will start to expand the icon to fill the screen and transition to the location using a cross-dissolve once the size passes a certain threshold. The main customization the subclasses will do is to provide the correct PSSquareDestination object to the PSSquareTransitionController.
[000127] The components of the feedback and help system are illustrated in FIG. 22, with class definitions for the objects used to implement the system shown in FIG. 23. The PSFeedbackController object 2300 is a subsystem that monitors the user's progress and provides helpful tips and feedback. The feedback is desirably in the form of an animated character that appears on the screen and says a short phrase. At every pass through the event loop, the current state is examined. When certain combinations of trigger events happen, a feedback action is performed. The mapping of triggers to actions will be loaded from a file at runtime.
[000128] The PSFeedbackAction object 2302 handles feedback actions. A feedback action, for example, may take the form of either a sound voiceover or a short animation of a character (possibly with an accompanying sound). The animation uses image sprites and a file listing their positions and orientations over time. The character's mouth is animated to lip sync to the speech in the sound file. Each character has a variety of body sprites and face sprites showing different positions, expressions, and mouth shapes. Additionally, linked to the body sprite, there are sprites for the character's arm that are independently animated to point to things on the screen. The position of the arm may be predetermined from the animation frame file or it may be determined dynamically at run time. The action may also call a named function instead of or in addition to an animation.
[000129] The file for the animation will specify the position and orientation of each visible sprite at each frame of the animation. It will need to reference the audio file and the sprite image files (or offsets in a single sprite sheet image). Some possible actions include help with square drawing, help with video playing, help with square zoom out (back to previous environment), erase partly drawn square (after much inactivity), additional interface help or hints, episode-specific help or hints, and help with a game.
[000130] The PSFeedbackTrigger object 2304 handles trigger situations. The feedback system will keep track of the current state of the application and the user's actions to determine when to perform a feedback action. The application has a number of possible states and a few time-based triggers that will be combined to create unique trigger situations. Each trigger situation may be mapped to a feedback action. Generally, the triggers correspond to the following questions: "where are you?," "have you been here before?," "what are you trying to do?," "how long have you been trying to do that?," and "when did you last do something?"
[000131] Possible example trigger states include the time since last touch is above some threshold, the time since last state transition (indicating progress in the current state), current environment/square/episode/scene, progress in current episode/scene, performing video playback, drawing a square, and first visit or return visit to a square.
[000132] The trigger states are combined to generate trigger situations which prompt an action from the system. Two examples are provided below.
[000133] Example A: A trigger situation with no touch for 30 seconds AND during square drawing prompts the help character to appear and prompt the user to keep drawing a square.
[000134] Example B: A trigger situation in the "Duck at Bat" episode, scene 1, AND the elapsed time since last transition is 300 seconds AND no touch for 10 seconds prompts the help character to appear and say "look for the letter B."
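Trigger situations like Examples A and B are conjunctions of per-state conditions, which suggests a simple table-driven matcher. A sketch under assumptions (the dictionary-of-predicates representation is invented, not the patent's file format):

```python
def trigger_matches(trigger, state):
    """A trigger situation fires when every one of its conditions
    holds in the current state (conditions are ANDed together)."""
    return all(
        key in state and predicate(state[key])
        for key, predicate in trigger.items()
    )

# Example B from the text: "Duck at Bat" episode, scene 1, at least
# 300 seconds since the last transition AND 10 seconds with no touch.
example_b = {
    "episode": lambda e: e == "Duck at Bat",
    "scene": lambda s: s == 1,
    "seconds_since_transition": lambda t: t >= 300,
    "seconds_since_touch": lambda t: t >= 10,
}
```

Representing triggers as data rather than code also fits the externally loaded triggers contemplated in the next paragraph.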
[000135] The triggers may be coded directly into the application in some embodiments, while in other embodiments new triggers may be defined externally and loaded from files.

[000136] The components of the square drawing controller are shown in FIG. 24, with class definitions for the objects used for its implementation presented in FIG. 25. The Square Drawing Controller class is a subclass of CCLayer called PSSquareDrawingLayer 2500. It is given the current CCScene as a parent and a frame rectangle in which to draw a square. There will be some sprites in the current scene that the background square should appear behind, so there is a convention of which objects appear in which Z-order. The actual drawing will take place using a CCRenderTexture and a CCSprite as a brush stamp that gets painted to the texture following touches. The square frame will be divided into n sections (20) with the struct PSSquareSection {CGRect, BOOL} 2502. Touch handling will be done without using gesture recognizers since none of them map particularly well to this situation.
[000137] The PSSquareEffects object 2504 contains two CCParticleSystem objects for displaying an initial shower of teaser particles (following the outline of the square) and drawing particles to provide visual feedback while actually drawing. The PSSquareEffects object 2504 will also be responsible for playing the feedback sounds.
[000138] The square transition controller components are shown in FIG. 26, with the class definitions for the objects used to implement them shown in FIG. 27. The Square Drawing Controller object creates a PSSquareTransitionController object 2700 and gives it a square frame image (as a CCSprite), a PSSquareOrigin object 2702, and a PSSquareDestination object 2704. The transition controller then creates a CCLayer with a CCSprite for the frame, the origin image, and the destination image (which starts hidden). The layer is then zoomed via an action to fill the screen. During the zoom, the origin sprite fades out and the destination sprite fades in. During and after the transition, the Origin object 2702 and the Destination object 2704 get messages notifying them when the transition begins and ends. After the transition is over, the Controller 2700 will ask the Destination object 2704 for the UIView that will be shown. This allows the Controller 2700 to attach its UIGestureRecognizers to the view to listen for touches that would signal a reverse transition (destination back to origin). Keeping the touch handling in the transition controller is cleaner than having a variety of destination objects handle it.
[000139] The video playback controller components are illustrated in FIG. 28, with the class definitions for objects used for their implementation shown in FIG. 29. Video playback will use a customized UIKit movie player to load and play movie files full screen. There is an overlay view showing controls and the square frame bordering the screen.
[000140] The story world controller components are illustrated in FIG. 30, with objects used to implement them shown in FIG. 31. The PSStoryWorld object 3100 is a container for the parts of the world. The PSTileScroller object 3102 handles continuous scrolling. There are a number of PSStoryScene objects 3104 that encapsulate the logic of the goal of each scene and the reward (video or game) to be gained from it. Visually, the story world comprises a background sprite with a variety of (possibly interactive) story objects placed in it.
[000141] The PSTileScroller object 3102 handles scrolling of the story world. The world is divided into four layer tiles (each represented by a PSStoryTile object 3106). Each tile contains a background sprite and some collection of object sprites. The object sprites are children of the tile layer and thus positioned relative to it. The PSTileScroller object 3102 is responsible for moving the tiles to new positions as necessary to maintain the illusion of a continuously scrolling scene. During each panGesture movement, there is an extra offscreen tile in place opposite the direction of the movement. The tile layers will be moved during the scrolling, not the parent story world layer. Each tile's background should be at least as wide as the screen, if not a bit bigger. This may impose some design constraints on the artwork of the background.

[000142] The PSObjectAssembler object 3108 implements a controller that handles assembling story objects on a floating "shelf." The shelf does not scroll with the rest of the world but remains fixed on the screen as the user scrolls for more objects. The controller handles animating an object to the correct position on the shelf when the object is touched. The controller knows which objects are needed and their relative positions on the shelf. The assembler acts as a delegate for the story objects to intercept their standard touch handling. After the objects are assembled, the experience moves to square drawing around the objects.
[000143] Each scene in a given plot will be described by data objects (in some embodiments, these objects may be loaded from a file). The PSStoryScene object 3104 contains an initial position for the view, a list of objects present in the scene as well as their initial positions in the world. Subclasses will be customized with the specific objects and goals of each scene in the episode. In other embodiments, the data from these custom classes will be abstracted and loaded from a file.
[000144] The PSStoryObjectCache 3110 implements an object store. Story objects will get reused between scenes, and an NSDictionary is used as a cache to hold the story objects rather than recreating them each time. If memory runs low, the cache will need to be cleared of unused story objects.
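A minimal sketch of that caching scheme follows, with a plain dict standing in for the NSDictionary; the factory callback and the in-use bookkeeping are assumptions added for illustration.

```python
# Sketch of the PSStoryObjectCache idea: story objects are created once,
# reused between scenes, and purged only when memory pressure demands it.
# All names are hypothetical stand-ins for the objects in FIG. 31.

class StoryObjectCache:
    def __init__(self, factory):
        self._factory = factory   # builds a story object from its name
        self._cache = {}          # stands in for the NSDictionary
        self.in_use = set()       # names referenced by the current scene

    def get(self, name):
        # Reuse a cached object rather than recreating it each time.
        if name not in self._cache:
            self._cache[name] = self._factory(name)
        return self._cache[name]

    def purge_unused(self):
        """On a low-memory warning, drop objects no scene is using."""
        self._cache = {n: obj for n, obj in self._cache.items()
                       if n in self.in_use}

created = []
cache = StoryObjectCache(lambda name: created.append(name) or object())
owl = cache.get("owl")
cache.in_use = {"owl"}
cache.get("tree")          # created, but never marked in use
cache.purge_unused()       # "tree" is evicted, "owl" survives
```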
[000145] The PSStoryObject object 3110 is a CCSprite that has been customized to be a particular character or interactive object in the world. Each story object gets the option to respond to touches and drags that affect it. In some embodiments, story objects may be coded directly into the application, while in other versions the story objects may be loaded from files. There is a protocol for a delegate to the story object that, if attached, will receive the touch events instead of the object.

[000146] The game controller components are illustrated in FIG. 32, with class definitions for objects used for their implementation shown in FIG. 33. Games will largely be custom-coded classes. There is some common functionality, and some variables will be abstracted out into a PSGame class 3300; two more specific example game type classes are PSObjectFindingGame 3302 and PSTargetHittingGame 3304. The PSObjectFindingGame will be a game in which the user looks for an object, and the PSTargetHittingGame will be a game in which the user tries to hit a target, such as a baseball or archery game. Games will use PSStoryObjects 3306 as their assets (possibly shared with the PSStoryWorld 3100).
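The story-object delegate protocol amounts to a simple routing rule, sketched here with hypothetical names; the real objects are Objective-C CCSprite subclasses, and the shelf logic is one plausible reading of how the assembler uses the hook.

```python
# Sketch of the delegate protocol for story objects: if a delegate is
# attached, it receives touch events instead of the object's own handler.
# An object assembler can use exactly this hook to intercept touches and
# collect objects on the shelf. All names are illustrative assumptions.

class StoryObject:
    def __init__(self, name):
        self.name = name
        self.delegate = None                 # e.g. an object assembler

    def on_touch(self, x, y):
        if self.delegate is not None:        # delegate intercepts the event
            return self.delegate.handle_touch(self, x, y)
        return f"{self.name} handled touch at ({x}, {y})"

class ShelfAssembler:
    """Stands in for PSObjectAssembler: collects touched objects on a shelf."""
    def __init__(self):
        self.shelf = []                      # names in shelf-slot order

    def handle_touch(self, obj, x, y):
        if obj.name not in self.shelf:
            self.shelf.append(obj.name)      # animate to its shelf slot
        return f"{obj.name} moved to shelf slot {self.shelf.index(obj.name)}"

owl = StoryObject("owl")
free_response = owl.on_touch(10, 20)     # no delegate: standard handling
owl.delegate = ShelfAssembler()
shelf_response = owl.on_touch(10, 20)    # delegate intercepts instead
```

Detaching the delegate restores the object's standard touch handling, which is why the same PSStoryObject assets can serve both the story world and the games.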

Claims

What is claimed is:
1. A method for interacting with an interactive media on a computer having a display which changes in response to a user action, the display having a display border, the method comprising acts of:
displaying a landscape display having thereon an object having a closed geometric figure as an object border on the display, the border having a length;
detecting the user action as an action selected from a group consisting of detecting motion of a user's touch in a direction of the object border within a predetermined distance and along a predetermined percentage of the length of the object border; and detecting a user's touch within the object border;
after the detecting act has been performed, performing an act selected from a group consisting of expanding a size of the closed geometric figure until the object border expands to at least the size of the display border; and expanding a size of the closed geometric figure until the object border expands to at least the size of the display border and dissolving the geometric figure as it expands; and
replacing the landscape display with a different display.
2. The method of Claim 1, wherein, in the act of displaying a landscape display, the object is a square.
3. The method of Claim 1, wherein the act of detecting is completed within a predetermined time duration.
4. The method of Claim 1, wherein, after the act of expanding a size of the closed geometric figure, the object border remains displayed.
5. The method of Claim 1, further comprising an act of detecting when the user's touch points at the object border and moves toward the center of the display, and contracting the object border to follow the touch.
6. The method of Claim 5, wherein the object border automatically contracts after the object border reaches a first predetermined size.
7. The method of Claim 6, further comprising replacing the different display with the landscape display when the object border contracts to a second predetermined size.
8. The method of Claim 7, further comprising, responsive to the user completing a predetermined task in conjunction with the different display, displaying on the landscape display a second object indicative of a completion of the predetermined task.
9. The method of Claim 8, wherein selection of the second object causes the landscape display to be replaced with the different display.
10. The method of Claim 7, further comprising, responsive to the user failing to complete a predetermined task in conjunction with the different display, fading the different display into the landscape display and redisplaying the object on the display, thereby resetting the mechanism to detect the motion of the user's touch again.
11. The method of Claim 10, wherein a prompt is presented to the user if the predetermined task is not completed within a predetermined time.
12. The method of Claim 1, wherein the act of expanding the size of the closed geometric figure comprises fading the landscape display into the different display.
13. The method of Claim 1, wherein the different display has a new object having a closed geometric figure as an object border displayed therein, and where the acts of using the mechanism to detect the motion of the user's touch, expanding the size of the closed geometric figure, and replacing the landscape display are repeated for the new object.
14. The method of Claim 1, wherein, in the act of displaying a landscape display, the object border is delineated on the display by predetermined display indicia.
15. The method of Claim 14, wherein the predetermined display indicia include at least one of a glowing line and a sparkling line.

16. The method of Claim 1, wherein the act of detecting the user's touch comprises generating a prompt presentation to the user when the user's finger has not moved along a predetermined percentage of the object border length within a predetermined time.
17. A system for interacting with an interactive media, the system comprising a data processing system having a display which changes in response to a user action, the display having a display border, and where the system performs operations of:
displaying a landscape display having thereon an object having a closed geometric figure as an object border on the display, the border having a length;
detecting the user action as an action selected from a group consisting of detecting motion of a user's touch in a direction of the object border within a predetermined distance and along a predetermined percentage of the length of the object border; and detecting a user's touch within the object border;
after the detecting act has been performed, performing an act selected from a group consisting of expanding a size of the closed geometric figure until the object border expands to at least the size of the display border; and expanding a size of the closed geometric figure until the object border expands to at least the size of the display border and dissolving the geometric figure as it expands; and
replacing the landscape display with a different display.

18. A computer program product for facilitating interaction with an interactive media on a data processing system comprising a computer having a display which changes in response to a user action, the display having a display border, the computer program product having computer-readable instructions encoded therein for causing the system to perform operations of:
displaying a landscape display having thereon an object having a closed geometric figure as an object border on the display, the border having a length;
detecting the user action as an action selected from a group consisting of detecting motion of a user's touch in a direction of the object border within a predetermined distance and along a predetermined percentage of the length of the object border; and detecting a user's touch within the object border;
after the detecting act has been performed, performing an act selected from a group consisting of expanding a size of the closed geometric figure until the object border expands to at least the size of the display border; and expanding a size of the closed geometric figure until the object border expands to at least the size of the display border and dissolving the geometric figure as it expands; and
replacing the landscape display with a different display.
PCT/US2013/046204 2012-06-15 2013-06-17 Method and apparatus for initiating an interactive learning experience WO2013188890A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261660051P 2012-06-15 2012-06-15
US61/660,051 2012-06-15

Publications (1)

Publication Number Publication Date
WO2013188890A1 true WO2013188890A1 (en) 2013-12-19

Family

ID=49758782

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2013/046204 WO2013188890A1 (en) 2012-06-15 2013-06-17 Method and apparatus for initiating an interactive learning experience

Country Status (1)

Country Link
WO (1) WO2013188890A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3056972A1 (en) * 2015-02-11 2016-08-17 Volkswagen Aktiengesellschaft Method for operating a user interface in a vehicle

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080092081A1 (en) * 2006-10-11 2008-04-17 Samsung Electronics Co., Ltd. Mobile terminal and idle screen display method for the same
US20080094368A1 (en) * 2006-09-06 2008-04-24 Bas Ording Portable Electronic Device, Method, And Graphical User Interface For Displaying Structured Electronic Documents
US20100123737A1 (en) * 2008-11-19 2010-05-20 Apple Inc. Techniques for manipulating panoramas
US20100146436A1 (en) * 2008-02-01 2010-06-10 Gabriel Jakobson Displaying content associated with electronic mapping systems
US20110316884A1 (en) * 2010-06-25 2011-12-29 Microsoft Corporation Alternative semantics for zoom operations in a zoomable scene

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13803932

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC, EPO FORM 1205A DATED 29/05/2015

122 Ep: pct application non-entry in european phase

Ref document number: 13803932

Country of ref document: EP

Kind code of ref document: A1