WO2015113440A1 - System and method for changing the state of user interface element marked on physical objects - Google Patents


Info

Publication number
WO2015113440A1
WO2015113440A1 PCT/CN2014/091918 CN2014091918W
Authority
WO
WIPO (PCT)
Prior art keywords
interactive surface
computer program
state
touch action
marked
Prior art date
Application number
PCT/CN2014/091918
Other languages
French (fr)
Inventor
Zheng Shi
Xingyi GONG
Original Assignee
Zheng Shi
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from PCT/CN2014/071850 external-priority patent/WO2014139349A1/en
Priority claimed from PCT/CN2014/086745 external-priority patent/WO2015113404A1/en
Application filed by Zheng Shi filed Critical Zheng Shi
Priority to CN201480061857.2A priority Critical patent/CN105723306B/en
Priority to EP14880849.6A priority patent/EP3100148A1/en
Priority to JP2016548113A priority patent/JP2017507349A/en
Priority to KR1020167020190A priority patent/KR101813557B1/en
Priority to CN201580000167.0A priority patent/CN105027192B/en
Priority to EP15743177.6A priority patent/EP3100258A4/en
Priority to PCT/CN2015/070162 priority patent/WO2015113457A1/en
Priority to PCT/CN2015/074570 priority patent/WO2015188643A1/en
Priority to US14/737,514 priority patent/US9299330B2/en
Publication of WO2015113440A1 publication Critical patent/WO2015113440A1/en
Priority to US15/057,092 priority patent/US9690473B2/en

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 3/00 Board games; Raffle games
    • A63F 3/00643 Electric board games; Electric features of board games
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F 3/03543 Mice or pucks
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 3/00 Board games; Raffle games
    • A63F 3/00643 Electric board games; Electric features of board games
    • A63F 2003/00662 Electric board games; Electric features of board games with an electric sensor for playing pieces
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 3/00 Board games; Raffle games
    • A63F 3/00643 Electric board games; Electric features of board games
    • A63F 2003/00662 Electric board games; Electric features of board games with an electric sensor for playing pieces
    • A63F 2003/00665 Electric board games; Electric features of board games with an electric sensor for playing pieces using inductance
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 3/00 Board games; Raffle games
    • A63F 3/00643 Electric board games; Electric features of board games
    • A63F 2003/00678 Electric board games; Electric features of board games with circuits closed by mechanical means
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 9/00 Games not otherwise provided for
    • A63F 9/24 Electric games; Games using electronic circuits not otherwise provided for
    • A63F 2009/2483 Other characteristics
    • A63F 2009/2485 Other characteristics using a general-purpose personal computer
    • A63F 2009/2486 Other characteristics using a general-purpose personal computer the computer being an accessory to a board game
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304 Detection arrangements using opto-electronic means
    • G06F 3/0317 Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
    • G06F 3/0321 Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface by optically sensing the absolute position with respect to a regularly patterned surface forming a passive digitiser, e.g. pen optically detecting position indicative tags printed on a paper sheet

Definitions

  • the present invention relates to operation of a computer program by end-users using physical objects, by touching the physical objects to effect a change in the state of the user interface element marked on the physical objects.
  • Computer systems use a combination of screens and input devices such as keyboards and mouse devices in order for a user to operate computer programs.
  • GUI (Graphical User Interface)
  • WIMP (window, icon, menu and pointing device)
  • the present invention discloses a system and method to operate a computer program by an end-user, through the touch actions by the end-user upon a plurality of physical objects that have been placed on an interactive surface. More specifically, this present invention discloses a system and method to manage the state of a user interface element marked on an object, in order to offer enhanced interactivity between the end-user and computer program and mediated by a plurality of physical objects.
  • a physical object has embedded within it a unique identification code (UID), in the form of an RFID tag or one or more capacitive sensor tags, and is visually marked with a user interface element of the computer program.
  • UID (unique identification code)
  • An interactive surface has embedded within it an array of RF antennas to read the UID of an object, and an array of capacitive sensors of much higher density than the RF antenna array to allow more precise detection of the location and orientation of an object placed on the interactive surface.
  • one or more sensory accessories have been embedded either within the object or within the interactive surface.
  • the interactive surface controls the sensory pattern produced by the sensory accessory.
  • by directing a sensory accessory to produce a sensory pattern, the interactive surface generates an indication of the “state” of the user interface element marked on an object.
  • the interactive surface is configured to detect such touch action upon the object through an embedded capacitance sensor positioned under the object, and direct a sensory accessory to indicate a new state of the UI element marked on an object by exhibiting a new sensory pattern.
  • the present invention offers enhanced interactivity between an end-user and a computer program mediated by physical objects.
  • Fig. 1 is an exemplary schematic diagram illustrating the system process flow in accordance with one embodiment of the present invention.
  • Fig. 2 is an exemplary schematic diagram of the interactive surface in accordance with an embodiment of the invention.
  • Fig. 3 is an exemplary schematic diagram of the interactive surface detecting a touch action in accordance with an embodiment of the invention.
  • Fig. 4 is an exemplary schematic diagram illustrating the system for a music application in accordance with one embodiment of the present invention.
  • Fig. 5 is an exemplary schematic diagram illustrating the system for a word game in accordance with one embodiment of the present invention.
  • Fig. 6 is an exemplary schematic diagram illustrating the method process flow in accordance with one embodiment of the present invention.
  • the embodiments of the present invention provide a system and method to operate a computer program by an end-user using a plurality of physical objects.
  • Fig. 1 is an exemplary schematic diagram illustrating the system process flow in accordance with one embodiment of the present invention.
  • the system includes a plurality of physical objects 101, each comprising a unique identification code (UID) and visually marked with a UI element of a computer program.
  • the UID can be encoded with a radio frequency identification (RFID) chip, a pattern of capacitance tabs, or a pattern of magnetic tabs.
  • RFID (radio frequency identification) chip
  • the system also includes an interactive surface 102 that has no display screen and is configured to detect the UID of an object 101 and to derive the location and orientation of the object 101 once the object 101 is placed on it.
  • the system further includes a sensory accessory 104 that is operatively controlled by the interactive surface 102 and indicates the current state of the UI element by exhibiting a sensory pattern.
  • a computer system 105 is operatively linked to the interactive surface 102 and is configured to execute the computer program and direct the various electronic components of the interactive surface 102.
  • the interactive surface 102 is configured to detect the touch action 103 and direct the sensory accessory 104 to indicate a new state of the UI element marked on the object 101 by exhibiting a new sensory pattern
  • the computer system 105 is configured to incorporate such state change as an input in running the computer program.
  • the computer system 105 stores the current state and a range of valid states of the UI element marked on an object 101, and the relationship between the touch action 103 and a change of the state of the UI element marked on the object 101 according to the computer program.
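As a rough illustration of this bookkeeping, the current state, the range of valid states, and the touch-action-to-state-change relationship for a UI element could be modeled as follows. This is a hypothetical sketch; the class, method, and action names are not part of the disclosure:

```python
# Hypothetical model of per-object UI-element state tracking. The patent
# specifies WHAT is stored (current state, valid states, touch-action/state
# relationship) but not HOW; this representation is assumed.

class UIElement:
    def __init__(self, label, valid_states, initial_state, transitions):
        self.label = label                 # text marked on the physical object
        self.valid_states = valid_states   # range of valid states
        self.state = initial_state         # current state
        self.transitions = transitions     # (state, touch action) -> new state

    def apply_touch(self, touch_action):
        """Apply the program-defined state change for a touch action."""
        new_state = self.transitions.get((self.state, touch_action))
        if new_state in self.valid_states:
            self.state = new_state
        return self.state

# A "record" card that toggles between non-active and active on a single touch.
record = UIElement(
    label="record",
    valid_states={"active", "non-active"},
    initial_state="non-active",
    transitions={
        ("non-active", "single_touch"): "active",
        ("active", "single_touch"): "non-active",
    },
)
record.apply_touch("single_touch")  # record.state is now "active"
```

Unrecognized touch actions simply leave the state unchanged, which matches the idea that only program-defined transitions are valid.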
  • the object can be a card, a button, a block, an icon, a sheet, or a figurine.
  • the sensory accessory could be an LED light, an audio device, a video device, or a vibration generator device, and could provide end-users with physical feedback in the form of audio or visual effects.
  • the sensory accessory can be embedded within the interactive surface or within the object itself.
  • Fig. 2 is an exemplary schematic diagram of the interactive surface 201, whereby each layer of the interactive surface 201 has been separated for illustration's sake, in accordance with one embodiment of the present invention.
  • the interactive surface 201 can be separated into four layers.
  • the bottom layer 202 is the substrate or base of the interactive surface 201.
  • above the bottom layer is the second layer, which consists of an array of RF antennas 203 whose purpose is to wirelessly communicate with the RFID tags of objects 101 in order for the computer system of the interactive surface 201 to determine the unique ID of the objects placed upon it.
  • on top of the second layer is the third layer, which consists of an array of capacitance sensors 204 whose purpose is to detect, through capacitive coupling with the object's structure made of material with a high dielectric constant (i.e., a capacitance tab), the location and orientation of objects placed upon the interactive surface 201 and transmit that information to the computer system.
  • the top layer consists of an array of sensory accessories 205 such as LED lights whose purpose is to provide user feedback by, for example, lighting up the areas surrounding specifically relevant objects whenever instructed by the computer system.
  • Fig. 3 is an exemplary schematic diagram further illustrating the process of detection of a touch action by the interactive surface using the system design described in Fig. 2.
  • a plurality of cards 302, each visually marked with a UI element of a computer program, are placed on the interactive surface 301.
  • the UI element in this particular embodiment can be: start, stop, save, delete, okay, cancel, play, replay, record, complete, copy, duplicate, export, import, a letter of an alphabet, a word of a language, and an icon representing a musical symbol, etc.
  • Each of the cards 302 further has an identifier that contains the ID information of the card.
  • the interactive surface 301 further includes an array of capacitance sensors 303 and an RF antenna array 304 that are capable of detecting the ID, location and orientation of the cards 302 placed upon it and operatively linked to a computer system.
  • the cards 302 are each embedded with a material of high dielectric constant (i.e., capacitance tab) 306 so as to allow for capacitive coupling between the card 302 and the interactive surface’s 301 capacitance sensors located under the card 302.
  • the design of this material of high dielectric constant 306 is made so as to allow for further capacitive coupling between an end-user finger touch 308 on the card 302 and the interactive surface’s 301 capacitance sensors.
  • the cards 302 are also each embedded with an RFID chip 307 so as to allow the interactive surface's 301 computer system to read the UID of the card through wireless communication between the card's 302 RFID chip 307 and the RF antenna of the RF antenna array 304 that is located closest to the card 302.
  • the interactive surface 301 is configured to detect the UID of the card 302 and derive the location and orientation of the card 302. Afterwards, a sensory accessory 305, 309 operatively controlled by the interactive surface 301 is directed by the computer system to indicate the current state of the UI element marked on the card 302.
  • the state of a UI element can be: active, non-active, in use, not in use, enabled, not enabled, selected, not-selected, not-executing, executing, completed, correct, incorrect, succeeded, failed, etc.
  • the initial state of the UI elements marked on the three cards 302 placed on the interactive surface 301 are non-active or not enabled so none of the LED lights are lit up in this case.
  • the sensory accessories consist of an array of LED lights 305 embedded on the surface of the interactive surface 301 as well as a speaker system 309 also embedded in the interactive surface 301.
  • each card 302 is embedded with a structure made with material of high dielectric constant 306 so that, after the card 302 is placed upon the interactive surface 301 and a change in capacitance has been detected and measured by the capacitance sensors of the interactive surface 301, a further change in capacitance is detected and measured by the capacitance sensors whenever a human finger 308 touches the card 302.
  • a signal representing this change in capacitance is sent to and recorded by the computer system and determined to be caused by a touch action, which enables the computer program to interpret the touch action into a change of the current state of the UI element marked on the card 302.
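The two-stage capacitance change described above (a first shift when the card is placed, a further shift when a finger touches it) can be approximated by a simple two-threshold reading. The numeric values below are invented for illustration and are not from the patent:

```python
# Assumed, simplified sensor interpretation: placing a card shifts the
# capacitance reading by one step, and a finger touching the card shifts
# it further. All values are arbitrary illustrative units.

BASELINE = 1.0      # reading with nothing on the sensor
CARD_DELTA = 0.5    # shift from the card's high-dielectric-constant tab
TOUCH_DELTA = 0.3   # additional shift from a finger touch on the card

def interpret(reading):
    """Map a raw capacitance reading to what the surface infers."""
    if reading >= BASELINE + CARD_DELTA + TOUCH_DELTA:
        return "touch"   # card present and a finger is touching it
    if reading >= BASELINE + CARD_DELTA:
        return "card"    # card present, no touch
    return "empty"

interpret(1.9)  # -> "touch"
```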
  • An instruction is then provided to the computer system according to the computer program regarding changing the current state of the UI element marked on the card 302 acted upon by the touch action, and for the computer system to direct those LED lights of the LED light array 305 located around the card 302 to light up to indicate the new state of the UI element marked on the card 302 acted upon by the touch action.
  • the computer system also simultaneously directs the speaker system 309 to broadcast an audio recording in order to further indicate the new state of the UI element marked on the card 302 acted upon by the touch action. For example, the LED lights are lit up once the card “PLAY” is touched by the finger of an end-user.
  • the touch action could also include touching a card 302 twice or more in rapid succession, touching the card 302 once but for a longer duration of time, changing the orientation of a card 302 with finger touch, or changing the location of a card 302 with finger touch.
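The touch-action variants listed above (a single touch, touches in rapid succession, a longer press) could be distinguished from timestamped press/release events along these lines. The timing thresholds are assumptions, not values given in the patent:

```python
# Hypothetical touch-action classifier. Each event is a (press_time,
# release_time) pair in seconds, sorted by press_time; thresholds are invented.

def classify_touches(events, double_gap=0.4, long_press=1.0):
    actions = []
    i = 0
    while i < len(events):
        press, release = events[i]
        if release - press >= long_press:
            actions.append("long_press")    # touch held for a longer duration
            i += 1
        elif i + 1 < len(events) and events[i + 1][0] - release <= double_gap:
            actions.append("double_touch")  # two touches in rapid succession
            i += 2
        else:
            actions.append("single_touch")
            i += 1
    return actions

classify_touches([(0.0, 0.1), (0.3, 0.4)])  # -> ["double_touch"]
```

Orientation and location changes would be detected separately, by comparing successive position readings from the capacitance sensor array rather than press/release timing.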
  • Fig. 4 is an exemplary schematic diagram illustrating the system for a music application in accordance with one embodiment of the present invention.
  • a number of cards 402 are placed upon the interactive surface 401 initially causing the computer system of the interactive surface 401 to determine the location, orientation and UID of each of these cards 402 through the same process described previously in Fig. 3.
  • Each of the cards 402 is visually marked with a UI element of a music application.
  • the cards can be marked with the number of a music saved file 403, an accompanying melody 404, a musical instrument 405, or basic user functions such as save 406, record 407, play 408, and stop 409.
  • a sensory accessory is directed by the computer system to indicate the current state of the UI element marked on the card 402.
  • the computer system directs those LED lights of the LED light array located around the card 402 to either turn on or off in order to indicate the state of the UI element marked on the card 402. Referring to the cards 402 that have been placed on the interactive surface 401, one can see that a number of cards 402 have their LEDs on, indicating an “active” state, whereas all other cards 402 are deemed “non-active”. It will be noted that the LEDs of these “active” cards 402 could either be continuously on or flashing.
  • An end-user may start the music application by creating a piece of music themselves by playing the electronic keyboard 410. Whenever the end-user wants to record this piece of music, they can press the “record” card 407 with their finger. As elaborated previously, a change in capacitance caused by the finger touch is then detected and measured by the capacitance sensors of the interactive surface 401 that are located underneath the “record” card 407, and a signal representing this change in capacitance is sent to and processed by the computer system. The computer system determines that this change in capacitance was caused by a touch action, and the computer program interprets the touch action as a change of the current state of the UI element marked on the “record” card 407.
  • Activating the “record” card 407 will cause the computer program to record the music being created by the end-user.
  • An instruction is provided by the computer system to the interactive surface 401 to direct the LED lights located around the card to light up or turn off so as to visually indicate to the end-user the new state of the UI element.
  • whenever the end-user wants to stop the recording of the music piece they are creating, they proceed by pressing the “stop” card 409. This “stop” action will lead the computer program to stop recording and the computer software to simultaneously light up the “stop” card 409 and switch off the “record” card 407.
  • a range of valid states can exist for the UI element marked on a card, and the relationship between a touch action and a change of the state of the UI element is established and stored by the computer program.
  • the state of the UI element marked on the “record” card 407 is “non-active” or “not in use” .
  • the state of the UI element is changed to “active” or “in use” , which triggers the execution of the computer program by recording the music.
  • the state of the UI element marked on the “stop” card 409 is changed from “not enabled” to “enabled” with a single touch by the end-user.
  • the state of the UI element marked on the “record” card 407 is also changed back to “non-active” or “not in use” .
  • the UI element marked on the “play” card 408 is changed from “not-executing” to “executing” by a double touch in quick succession by the end-user, which triggers the music to be played by the computer system.
  • the state of the “stop” card 409 is changed back to “not enabled” from “enabled” .
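The record/stop/play transitions just described form a small interlocked state table. The dictionary model and helper below are illustrative assumptions, though the card names and state labels follow the embodiment's wording:

```python
# Assumed model of the interlock among the "record", "stop" and "play" cards.
# State labels ("active", "enabled", "executing", ...) follow the description.

states = {"record": "non-active", "stop": "not enabled", "play": "not-executing"}

def touch(card, kind="single"):
    if card == "record" and kind == "single":
        states["record"] = "active"        # starts recording
    elif card == "stop" and kind == "single":
        states["stop"] = "enabled"         # stops recording...
        states["record"] = "non-active"    # ...and switches "record" back off
    elif card == "play" and kind == "double":
        states["play"] = "executing"       # double touch starts playback
        states["stop"] = "not enabled"     # "stop" reverts to not enabled
    return dict(states)

touch("record")
touch("stop")  # "stop" is now enabled and "record" is back to non-active
```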
  • whenever an end-user wants to save the music piece they have created, they proceed by placing a “save file” card 403 with a specific number assigned to it upon the interactive surface 401, and then save the music piece by pressing on the card for a determined amount of time (i.e. usually long enough for the interactive surface to confirm with the end-user through audio feedback broadcast via the speaker system 411).
  • whenever the end-user wants to hear a saved music piece, they proceed by pressing twice in rapid succession on the specific “save file” card 403, which triggers the interactive surface 401 to begin broadcasting the audio file via the speaker system 411 as well as have the LEDs surrounding the specific “save file” card 403 start blinking (thus indicating to the end-user that the music being played is attributed to that “save file” card 403).
  • the computer system will also direct the LED lights surrounding the “play” card 408 to light up so as to illustrate the change in “state” for that particular UI element as well (i.e. the music is currently being played).
  • An end-user can have the music piece they have created to be accompanied by a preset melody.
  • the “accompanying melody” card 404 named “Bach” is placed on the interactive surface 401.
  • the end-user can select the playing of this piece at any time by activating this “accompanying melody” card 404 by pressing on it with their finger.
  • Such an action will cause the computer system to broadcast via the speaker system 411 a piece of the melody which could either be played by itself or as a complement to a piece of music created by the end-user.
  • visual feedback in the form of LED lights lighting on or off is used to indicate to the end-user the state of that particular “accompanying music” card 404 at any given time.
  • An end-user can add, subtract or change the type of musical instrument used throughout the broadcasting of music pieces by selecting or deselecting “musical instrument” cards 405 with their finger. Referring back to Fig. 4, the surrounding LED lights of the “violin” card 405 are switched on, indicating to the end-user that the music piece is being played using the acoustics of a violin.
  • FIG. 5 is an exemplary schematic diagram illustrating the system for a word spelling game in accordance with one embodiment of the present invention.
  • the embodiment described in Fig. 5 is similar to that of the embodiment illustrated in Fig. 3, with a few notable differences.
  • the embodiment consists of an interactive surface 501 consisting of a base layer, a second layer comprising an RF antenna array and a third layer comprising an array of capacitance sensors.
  • the interactive surface 501 further comprises a computer system which itself consists of a processor and a memory unit.
  • the cards 502 of the embodiment illustrated in Fig. 5 consist of cards 502 with capacitance tabs embedded into them so as to allow for capacitive coupling with the interactive surface’s 501 capacitance sensors as well as an end-user’s finger touch.
  • Each card 502 further has an RFID chip containing the UID information of the card, which can be wirelessly transmitted to the interactive surface's 501 computer system when directed to do so by one of the interactive surface's RF antennas.
  • the interactive surface 501 does not have a fourth layer comprising an array of LED lights. Instead, each card is embedded with one or more LED lights and the accompanying components that allow the LED light(s) to be powered on and directed by the RF antenna of the interactive surface 501 through RF-to-DC energy conversion technology.
  • Each card has an identifier imprinted upon its surface wherein the identifier is either a letter of the alphabet 502 or a general user interface command 503 (e.g. “Difficulty” , “Verb” , “Animals” , “French” ) .
  • the computer system of the interactive surface 501 has access to a database (located locally or remotely) whereby each card's UID matches the identifier imprinted upon the surface of the card and is related to a particular “state” or function of the electronic program that can be altered by physically pressing on the card 502.
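Such a database could be as simple as a UID-keyed table. The UIDs and record layout below are invented for illustration; only the identifiers and state labels come from the embodiment:

```python
# Hypothetical UID-to-card database for the word game. Each entry ties a
# card's UID to the identifier printed on it and the states it can take.

card_db = {
    "uid-001": {"identifier": "Z", "kind": "letter",
                "states": ["non-active", "active"]},
    "uid-002": {"identifier": "Difficulty", "kind": "command",
                "states": ["easy", "intermediate", "difficult"]},
    "uid-003": {"identifier": "Verb", "kind": "command",
                "states": ["non-active", "active"]},
}

def lookup(uid):
    """Resolve a UID read via the RF antenna array to its card record."""
    return card_db.get(uid)

lookup("uid-002")["identifier"]  # -> "Difficulty"
```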
  • An end-user begins a game by placing a number of cards 502, with letters of the alphabet imprinted upon them, on the interactive surface 501, as illustrated in Fig. 5. This leads to the automatic detection and recognition of the location and UID of each card 502 by the interactive surface through the same process described in the previous embodiments.
  • the end-user then initiates the language spelling game by pressing on the “Go” button 504. This causes the computer program of the computer system to broadcast an audio recording via the speakers 507, prompting the end-user to correctly spell out a word based on the letters of the alphabet in play (i.e. the cards 502 already placed upon the interactive surface 501).
  • the end-user can then proceed to respond to the challenge by selecting through a finger touch, in the correct sequence, the cards 502 whose letters correspond to the spelling of the word that has been broadcast to the end-user.
  • by doing so, the end-user is effectively changing the state of each card's user interface function.
  • the computer system directs the RF antenna located closest to that card to light up the card 502 in question through RF to DC energy conversion technology.
  • the end-user may also choose to replay the audio broadcast of the word to spell by pressing on the “replay” button 505, or they can press the “example” button 506 to hear the word in the context of a sentence.
  • the end-user can choose to alter the language spelling game by placing general user interface command cards 503 upon the interface surface and proceed by touching them.
  • the end-user may choose to alter the level of spelling difficulty by pressing once on the “Difficulty” card 503 to cause the computer program to go into an easy mode (thus challenging the player to spell out relatively easy words).
  • the end-user may instead choose intermediate or difficult modes by tapping on the same card 503 twice or thrice, respectively, in rapid succession. In effect, this card 503 will cycle between three different states (i.e. easy, intermediate and difficult) depending on the type of touch the end-user applies to it.
  • the end-user may also choose to change the form of the word the computer program will challenge them with by picking any of the general user interface command cards 503 that correspond to this description and changing their relative state by pressing on them. Referring to Fig. 5, one can see that in that particular scenario, two cards 503 with the words “Verb” and “Noun” have been placed upon the interactive surface 501, but only the “Verb” card 503 is in an active state as its LED light is activated. This indicates to the end-user that all words the computer program will challenge them with at that time will be verbs.
  • the end-user may also choose to select words that correspond to a certain category of elements. For example, referring back to Fig. 5, one can see that the general user interface card 503 with “animal” has been placed upon the interactive surface 501 (it will be noted that an alternative design would be to have the image of an animal on the card instead of the actual word “animal”). If the end-user presses on that card, leading to its activation (e.g. changing its state) and the lighting up of its LED light, the computer program will then proceed to challenge the end-user with the spelling of words comprising animal names (e.g. the word “Zebra” is a possibility in the scenario depicted in Fig. 5).
  • the end-user may select the language that the game will proceed in by placing and touching the general user interface command cards 503 with the language name imprinted upon the card 503. Referring to Fig. 5, if the end-user presses on the card 503 “Francais” , then the word challenges will be made in French.
  • The game design described above can be used in a form of language spelling “free form” game whereby the end-user randomly picks and places cards with letters printed upon them upon the interactive surface 501.
  • This free form language spelling game design is particularly well suited for young children who have just started or are about to start learning spelling, as it teaches them how to match certain letters or words with their corresponding sounds in a free and unrestrictive manner.
  • Fig. 6 is an exemplary schematic diagram illustrating the method process flow in accordance with one embodiment of the present invention. As shown in Fig. 6, the system process includes the following steps.
  • Step 601: placing a plurality of objects on an interactive surface;
  • Step 602: indicating, by exhibiting a sensory pattern, the current state of the UI element marked on an object, by a sensory accessory;
  • Step 603: acting upon the object by a touch action;
  • Step 604: detecting the touch action by the interactive surface;
  • Step 605: notifying the computer system of the touch action by the interactive surface;
  • Step 606: receiving an instruction according to the computer program regarding changing the current state of the UI element marked on the object acted upon by the touch action;
  • Step 607: directing, by the interactive surface, the sensory accessory to indicate a new state of the UI element marked on the object by exhibiting a new sensory pattern for the object.
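The seven steps above can be condensed into a single interaction cycle. `Surface` and `Program` here are minimal hypothetical stand-ins for the interactive surface hardware and the running computer program; the real system would back them with capacitance sensors, RF antennas, and sensory accessories:

```python
# Illustrative event loop for steps 601-607 (an assumed structure, not the
# patent's own implementation).

class Surface:
    def __init__(self, objects):
        self.objects = objects  # objects placed on the surface (step 601)
        self.shown = {}         # object -> state last indicated via sensory pattern

    def detect_objects(self):
        return self.objects

    def show_state(self, obj, state):
        self.shown[obj] = state  # e.g. light the LEDs surrounding the object

class Program:
    def __init__(self):
        self.states = {}

    def state_of(self, obj):
        return self.states.get(obj, "non-active")

    def handle_touch(self, obj):
        # Steps 605-606: notified of the touch, return the instructed new state.
        new = "active" if self.state_of(obj) == "non-active" else "non-active"
        self.states[obj] = new
        return new

def run_once(surface, program, touched_obj):
    for obj in surface.detect_objects():                # step 601
        surface.show_state(obj, program.state_of(obj))  # step 602
    new_state = program.handle_touch(touched_obj)       # steps 603-606
    surface.show_state(touched_obj, new_state)          # step 607
    return new_state

surface = Surface(["play", "stop"])
program = Program()
run_once(surface, program, "play")  # "play" is now indicated as "active"
```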

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)
  • Toys (AREA)
  • Position Input By Displaying (AREA)
  • Near-Field Transmission Systems (AREA)

Abstract

The present invention discloses a system and method for changing the state of a user interface (UI) element of a computer program visually marked on a physical object, through a touch action upon the object, among a plurality of objects on the interactive surface, each comprising a capacitance tab and a unique identification code. During runtime execution of the computer program, and in response to a touch action upon an object by an end-user, the interactive surface is configured to detect such touch action upon the object through an embedded capacitance sensor positioned under the object, and direct a sensory accessory to indicate a new state of the UI element marked on an object by exhibiting a new sensory pattern. The present invention offers enhanced interactivity between an end-user and a computer program mediated by physical objects.

Description

[Corrected under Rule 26, 21.01.2015] SYSTEM AND METHOD FOR CHANGING THE STATE OF USER INTERFACE ELEMENT MARKED ON PHYSICAL OBJECTS

TECHNICAL FIELD
The present invention relates to operation of a computer program by end-users using physical objects, by touching the physical objects to effect a change in the state of the user interface element marked on the physical objects.
BACKGROUND
Computer systems use a combination of screens and input devices such as keyboards and mice to let users operate computer programs. The GUI (Graphical User Interface), which follows the WIMP (window, icon, menu and pointing device) principle, was invented at Xerox PARC in the 1970s. It became the template to which all commercial computer systems would adhere. Indeed, all commercial systems developed by Apple, Microsoft and Sun Microsystems to this day use some form of GUI to allow users to interact naturally with computer programs.
However, depending on the application, it is at times desirable to allow interaction with a computer program through the use of physical objects. This is particularly true for young children, who have a natural affinity for physically manipulating objects. It also holds true whenever an open platform for group discussion and group play is used, as the manipulation of physical objects naturally accommodates the interaction of several people using the same computer program. There is a clear contrast between the user experience derived from using physical objects and that of using the traditional screen-based method; a screen serves as a window of information, whereas physical manipulation is a medium for personal use.
In order to enhance the experience of operating a computer program through physical object manipulation, there is a need to enable the “state” of a user interface element represented by a physical object to be managed. That is to say, the “state” should be visually presented to an end-user, and the “state” needs to change in response to actions by an end-user and according to the logic of the computer program.
The use of physical objects placed on the surface of smart screens or electronic pads is known. A popular example is Mattel’s Apptivity toy series, whereby players interact with a video game by moving a toy figurine across an iPad surface. Although this product is visually impressive, actual interaction between the object and the software is limited to tracking the location and orientation of the object relative to the surface of the iPad. Enhanced interactions between the player and the object, such as touch-sensitivity of the object itself, would therefore open a whole new interactive dimension between the user and the software.
SUMMARY OF INVENTION
The present invention discloses a system and method to operate a computer program by an end-user, through touch actions by the end-user upon a plurality of physical objects that have been placed on an interactive surface. More specifically, the present invention discloses a system and method to manage the state of a user interface element marked on an object, in order to offer enhanced interactivity between the end-user and the computer program, mediated by a plurality of physical objects.
In one embodiment of the present invention, a physical object has embedded within it a unique identification code (UID) in the form of an RFID tag, together with one or more capacitive sensor tags, and is visually marked with an element of the user interface of the computer program. An interactive surface has embedded within it an array of RF antennas to read the UID of an object, and an array of capacitive sensors of much higher density than the RF antenna array to allow more precise detection of the location and orientation of an object that has been placed on the interactive surface.
In one embodiment of the present invention, one or more sensory accessories have been embedded either within the object or within the interactive surface. In both cases, the interactive surface controls the sensory pattern produced by the sensory accessory. By directing a sensory accessory to produce a sensory pattern, the interactive surface generates an indication of the “state” of the user interface element marked on an object.
In one embodiment of the present invention, during runtime execution of the computer program, and in response to a touch action upon an object by an end-user, the interactive surface is configured to detect such touch action upon the object through an embedded capacitance sensor positioned under the object, and direct a sensory accessory to indicate a new state of the UI element marked on an object by exhibiting a new sensory pattern. The present invention offers enhanced interactivity between an end-user and a computer program mediated by physical objects.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 is an exemplary schematic diagram illustrating the system process flow in accordance with one embodiment of the present invention.
Fig. 2 is an exemplary schematic diagram of the interactive surface in accordance with an embodiment of the invention.
Fig. 3 is an exemplary schematic diagram of the interactive surface detecting a touch action in accordance with an embodiment of the invention.
Fig. 4 is an exemplary schematic diagram illustrating the system for a music application in accordance with one embodiment of the present invention.
Fig. 5 is an exemplary schematic diagram illustrating the system for a word game in accordance with one embodiment of the present invention.
Fig. 6 is an exemplary schematic diagram illustrating the method process flow in accordance with one embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
While the present invention will be described using specific embodiments, the invention is not limited to these embodiments. People skilled in the art will recognize that the system and method of the present invention may be used in many other applications. The present invention is intended to cover all alternatives, modifications and equivalents within the spirit and scope of the invention, which is defined by the appended claims.
Furthermore, in the detailed description of the present invention, specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be obvious to one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits are not described in detail to avoid unnecessarily obscuring a clear understanding of the present invention.
The present invention will be better understood and its numerous objects and advantages will become apparent to those skilled in the art by reference to the accompanying drawings.
The embodiments of the present invention provide a system and method to operate a computer program by an end-user using a plurality of physical objects.
Fig. 1 is an exemplary schematic diagram illustrating the system process flow in accordance with one embodiment of the present invention. As shown in Fig. 1, the system includes a plurality of physical objects 101, each comprising a unique identification code (UID) and visually marked with a UI element of a computer program. The UID can be encoded with a radio frequency identification chip (RFID) , a pattern of capacitance tabs, or a pattern of magnetic tabs. The system also includes an interactive surface 102 that has no display screen and is configured to detect the UID of an object 101 and to derive the location and orientation of the object 101 once the object 101 is placed on it. The system further includes a sensory accessory 104 that is operatively controlled by the interactive surface 102 and indicates the current state of the UI element by exhibiting a sensory pattern. A computer system 105 is operatively linked to the interactive surface 102 and is configured to execute the  computer program and direct the various electronic components of the interactive surface 102.
During runtime execution of the computer program, and in response to a touch action 103 by an end-user upon an object 101, the interactive surface 102 is configured to detect the touch action 103 and direct the sensory accessory 104 to indicate a new state of the UI element marked on the object 101 by exhibiting a new sensory pattern, and the computer system 105 is configured to incorporate such state change as an input in running the computer program. In detail, the computer system 105 stores the current state and a range of valid states of the UI element marked on an object 101, and the relationship between the touch action 103 and a change of the state of the UI element marked on the object 101 according to the computer program.
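The state bookkeeping described above — a current state, a range of valid states, and a stored relationship between a touch action and a state change — can be sketched as follows. This is an illustrative sketch only; the class, method, and state names are stand-ins and not part of the disclosed embodiment.

```python
# Hypothetical sketch of per-object state storage in the computer system (105).
class UIElement:
    def __init__(self, label, valid_states, initial_state, transitions):
        self.label = label
        self.valid_states = set(valid_states)
        self.state = initial_state
        # transitions maps (current_state, touch_action) -> new_state
        self.transitions = transitions

    def apply_touch(self, touch_action):
        # Look up the new state; an unrecognized touch leaves the state unchanged.
        new_state = self.transitions.get((self.state, touch_action), self.state)
        assert new_state in self.valid_states
        self.state = new_state
        return self.state

# Example: a "record" element that toggles on a single long touch.
record = UIElement(
    label="record",
    valid_states=["non-active", "active"],
    initial_state="non-active",
    transitions={("non-active", "long_touch"): "active",
                 ("active", "long_touch"): "non-active"},
)
```

In this sketch an unrecognized touch action is simply ignored, which matches the notion that only touch actions with a stored relationship to the UI element cause a state change.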
The object can be a card, a button, a block, an icon, a sheet, or a figurine. The sensory accessory could be an LED light, an audio device, a video device, or a vibration generator device, and could provide end-users with physical feedback in the form of audio or visual effects.
The sensory accessory can be embedded within the interactive surface or within the object itself.
Fig. 2 is an exemplary schematic diagram of the interactive surface 201, with each layer of the interactive surface 201 separated for illustration’s sake, in accordance with one embodiment of the present invention. Within the confines of this embodiment, the interactive surface 201 can be separated into four layers. The bottom layer 202 is the substrate or base of the interactive surface 201. On top of the bottom layer is the second layer, which consists of an array of RF antennas 203 whose purpose is to wirelessly communicate with the RFID tags of objects 101 so that the computer system of the interactive surface 201 can determine the unique ID of the objects placed upon it. On top of the second layer is the third layer, which consists of an array of capacitance sensors 204 whose purpose is to detect, through capacitive coupling with the object’s structure made of a material of high dielectric constant (i.e., a capacitance tab), the location and orientation of objects placed upon the interactive surface 201 and transmit that information to the computer system. Finally, the top layer consists of an array of sensory accessories 205, such as LED lights, whose purpose is to provide user feedback by, for example, lighting up the areas surrounding specifically relevant objects whenever instructed by the computer system.
Fig. 3 is an exemplary schematic diagram further illustrating the process of detection of a touch action by the interactive surface using the system design described in Fig. 2. In this embodiment, a plurality of cards 302, each visually marked with a UI element of a computer program, are placed on the interactive surface 301. The UI element in this particular embodiment can be: start, stop, save, delete, okay, cancel, play, replay, record, complete, copy, duplicate, export, import, a letter of an alphabet, a word of a language, and an icon representing a musical symbol, etc. Each of the cards 302 further has an identifier that contains the ID information of the card. The interactive surface 301 further includes an array of capacitance sensors 303 and an RF antenna array 304 that are capable of detecting the ID, location and orientation of the cards 302 placed upon it and operatively linked to a computer system.
The cards 302 are each embedded with a material of high dielectric constant (i.e., a capacitance tab) 306 so as to allow for capacitive coupling between the card 302 and the interactive surface’s 301 capacitance sensors located under the card 302. This material of high dielectric constant 306 is designed so as to also allow for further capacitive coupling between an end-user’s finger touch 308 on the card 302 and the interactive surface’s 301 capacitance sensors. The cards 302 are also each embedded with an RFID chip 307 so as to allow the interactive surface’s 301 computer system to read the UID of the card through wireless communication between the card’s 302 RFID chip 307 and the RF antenna of the RF antenna array 304 that is located closest to the card 302.
Each time a card 302 is placed on the interactive surface 301 during runtime execution of a computer program, the interactive surface 301 is configured to detect the UID of the card 302 and derive the location and orientation of the card 302. Afterwards, a sensory accessory 305, 309 operatively controlled by the interactive surface 301 is directed by the computer system to indicate the current state of the UI element marked on the card 302. The state of a UI element can be: active, non-active, in use, not in use, enabled, not enabled, selected, not-selected, not-executing, executing, completed, correct, incorrect, succeeded, failed, etc. In the example illustrated in Fig. 3, the initial states of the UI elements marked on the three cards 302 placed on the interactive surface 301 are non-active or not enabled, so none of the LED lights are lit up in this case.
In Fig. 3, the sensory accessories consist of an array of LED lights 305 embedded on the surface of the interactive surface 301 as well as a speaker system 309 also embedded in the interactive surface 301.
The method of the present invention can be described as follows. Each card 302 is embedded with a structure made of a material of high dielectric constant 306 so that, after the card 302 is placed upon the interactive surface 301 and a change in capacitance has been detected and measured by the capacitance sensors of the interactive surface 301, a further change in capacitance is detected and measured by the capacitance sensors whenever a human finger 308 touches the card 302. A signal representing this change in capacitance is sent to and recorded by the computer system and determined to be caused by a touch action, which enables the computer program to interpret the touch action into a change of the current state of the UI element marked on the card 302. An instruction is then provided to the computer system according to the computer program regarding changing the current state of the UI element marked on the card 302 acted upon by the touch action, and the computer system directs those LED lights of the LED light array 305 located around the card 302 to light up to indicate the new state of the UI element marked on the card 302 acted upon by the touch action. The computer system also simultaneously directs the speaker system 309 to broadcast an audio recording in order to further indicate the new state of the UI element marked on the card 302 acted upon by the touch action. For example, the LED lights are lit up once the card “PLAY” is touched by the finger of an end-user. Besides touching a card 302 once, the touch action could also include touching a card 302 twice or more in rapid succession, touching the card 302 once but for a longer duration of time, changing the orientation of a card 302 with a finger touch, or changing the location of a card 302 with a finger touch.
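The two-step capacitance detection described above — a first change when a card’s capacitance tab couples with a sensor, and a further change when a finger touches the card — can be sketched as a simple classifier. The numeric thresholds below are invented for illustration and do not appear in the disclosure.

```python
# Illustrative classifier for the two-level capacitance change described in
# the method: card placement produces one delta, a finger touch a further one.
BASELINE = 0.0
CARD_DELTA = 1.0    # assumed change when a capacitance tab couples to a sensor
TOUCH_DELTA = 2.0   # assumed further change when a finger touches the card

def classify(reading, tolerance=0.25):
    """Classify a capacitance sensor reading as empty, card present, or touch."""
    if abs(reading - BASELINE) <= tolerance:
        return "empty"
    if abs(reading - (BASELINE + CARD_DELTA)) <= tolerance:
        return "card_present"
    if reading >= BASELINE + CARD_DELTA + TOUCH_DELTA - tolerance:
        return "touch_action"
    return "unknown"
```

A reading near the card-only level reports the card as present; a reading at or above the combined card-plus-finger level is interpreted as a touch action, which the computer system would then translate into a state change.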
Fig. 4 is an exemplary schematic diagram illustrating the system for a music application in accordance with one embodiment of the present invention. As shown in Fig. 4, during runtime execution of the music application, a number of cards 402 are placed upon the interactive surface 401 initially causing the computer system of the interactive surface 401 to determine the location, orientation and UID of each of these cards 402 through the same process described previously in Fig. 3.
Each of the cards 402 is visually marked with a UI element of a music application. For example, the cards can be marked with the number of a music saved file 403, an accompanying melody 404, a musical instrument 405, or basic user functions such as save 406, record 407, play 408, and stop 409.
As an end-user places cards 402 upon the interactive surface 401, these automatically become part of the user interface of the music application.
Once the location, orientation and UID of each of these cards 402 have been determined, a sensory accessory is directed by the computer system to indicate the current state of the UI element marked on the card 402. In this embodiment, the computer system directs those LED lights of the LED light array located around the card 402 to either turn on or off in order to indicate the state of the UI element marked on the card 402. Referring to the cards 402 that have been placed on the interactive surface 401, one can see that a number of cards 402 have their LEDs on, indicating an “active” state, whereas all other cards 402 are deemed “non-active”. It will be noted that the LEDs of these “active” cards 402 could either be continuously on or flashing.
An end-user may start the music application by creating a piece of music themselves by playing the electronic keyboard 410. Whenever the end-user wants to record this piece of music, they can press the “record” card 407 with their finger. As elaborated previously, a change in capacitance caused by the finger touch is then detected and measured by the capacitance sensors of the interactive surface 401 that are located underneath the “record” card 407, and a signal representing this change in capacitance is sent to and processed by the computer system. The computer system determines that this change in capacitance was caused by a touch action, and the computer program interprets the touch action as a change of the current state of the UI element marked on the “record” card 407. Activating the “record” card 407 will cause the computer program to record the music being created by the end-user. An instruction is provided by the computer system to the interactive surface 401 to direct the LED lights located around the card to light up or turn off so as to visually indicate to the end-user the new state of the UI element. Whenever the end-user wants to stop the recording of the music piece they are creating, they proceed by pressing the “stop” card 409. This “stop” action will lead the computer program to stop recording and simultaneously light up the “stop” card 409 and switch off the “record” card 407.
As seen above, a range of valid states can exist for the UI element marked on a card, and the relationship between a touch action and a change of the state of the UI element is established and stored by the computer program.
Before being touched, the state of the UI element marked on the “record” card 407 is “non-active” or “not in use” . With a single long touch by the end-user on the “record” card 407, the state of the UI element is changed to “active” or “in use” , which triggers the execution of the computer program by recording the music.
Similarly, the state of the UI element marked on the “stop” card 409 is changed from “not enabled” to “enabled” with a single touch by the end-user. As a consequence of this touch action, the state of the UI element marked on the “record” card 407 is also changed back to “non-active” or “not in use” .
Lastly, the state of the UI element marked on the “play” card 408 is changed from “not-executing” to “executing” by a double touch in quick succession by the end-user, which triggers the music to be played by the computer system. As a consequence of this touch action, the state of the “stop” card 409 is changed back to “not enabled” from “enabled”.
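The coupled card states described for the music application — where touching one card also changes the state of a sibling card — can be sketched as a small state table. The touch-action names are illustrative stand-ins for the gestures described above.

```python
# Sketch of the record / stop / play interplay: a touch on one card may also
# change a sibling card's state, per the relationships described in the text.
states = {"record": "non-active", "stop": "not enabled", "play": "not-executing"}

def touch(card, kind):
    if card == "record" and kind == "single_long_touch":
        states["record"] = "active"
    elif card == "stop" and kind == "single_touch":
        states["stop"] = "enabled"
        states["record"] = "non-active"   # stopping also deactivates recording
    elif card == "play" and kind == "double_touch":
        states["play"] = "executing"
        states["stop"] = "not enabled"    # playing resets the stop card
    return dict(states)
```

Each call returns a snapshot of all card states, mirroring how the computer system would then direct the LED lights of every affected card, not only the one that was touched.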
Whenever an end-user wants to save the music piece they have created, they proceed by placing a “save file” card 403 with a specific number assigned to it upon the interactive surface 401, and then save the music piece by pressing on the card for a determined amount of time (i.e., usually long enough for the interactive surface to confirm with the end-user through audio feedback broadcast via the speaker system 411).
Whenever the end-user wants to hear a saved music piece, they proceed by pressing twice in rapid succession on the specific “save file” card 403, which triggers the interactive surface 401 to begin broadcasting the audio file via the speaker system 411 and causes the LEDs surrounding the specific “save file” card 403 to start blinking (thus indicating to the end-user that the music being played is attributed to that “save file” card 403). It will be noted that the computer system will also direct the LED lights surrounding the “play” card 408 to light up so as to illustrate the change in “state” for that particular user interface element as well (i.e., the music is currently being played).
A multitude of potential design alternatives exists for the card elements belonging to the embodiment described above but, for the sake of simplicity, the present document will only describe some basic features.
An end-user can have the music piece they have created accompanied by a preset melody. For example, referring back to Fig. 4, the “accompanying melody” card 404 named “Bach” is placed on the interactive surface 401. The end-user can select the playing of this piece at any time by activating this “accompanying melody” card 404 by pressing on it with their finger. Such an action will cause the computer system to broadcast via the speaker system 411 a piece of the melody, which could either be played by itself or as a complement to a piece of music created by the end-user. As with the previously disclosed cases, visual feedback in the form of LED lights turning on or off is used to indicate to the end-user the state of that particular “accompanying melody” card 404 at any given time.
An end-user can add, subtract or change the type of musical instrument that is used throughout the broadcasting of music pieces by selecting or deselecting “musical instrument” cards 405 with their finger. Referring back to Fig. 4, the surrounding LED lights of the “violin” card 405 are switched on, indicating to the end-user that the music piece is being played using the acoustics of a violin.
Fig. 5 is an exemplary schematic diagram illustrating the system for a word spelling game in accordance with one embodiment of the present invention.
The embodiment described in Fig. 5 is similar to the embodiment illustrated in Fig. 3, with a few notable differences. As previously, the embodiment consists of an interactive surface 501 comprising a base layer, a second layer comprising an RF antenna array, and a third layer comprising an array of capacitance sensors. The interactive surface 501 further comprises a computer system, which itself consists of a processor and a memory unit.
The cards 502 of the embodiment illustrated in Fig. 5 have capacitance tabs embedded into them so as to allow for capacitive coupling with the interactive surface’s 501 capacitance sensors as well as with an end-user’s finger touch. Each card 502 further has an RFID chip containing the UID information of the card, which can be wirelessly transmitted to the interactive surface’s 501 computer system when directed to do so by one of the interactive surface’s RF antennas.
Unlike the interactive surface described in Fig. 3, the interactive surface 501 here does not have a fourth layer comprising an array of LED lights. Instead, each card is embedded with one or more LED lights and the accompanying components that allow the LED light(s) to be powered on and directed by the RF antenna of the interactive surface 501 through RF to DC energy conversion technology.
One embodiment of a language spelling game for the abovementioned system is the following. Each card has an identifier imprinted upon its surface, wherein the identifier is either a letter of the alphabet 502 or a general user interface command 503 (e.g. “Difficulty”, “Verb”, “Animals”, “French”). Furthermore, the computer system of the interactive surface 501 has access to a database (located locally or remotely) whereby each card’s UID matches the identifier imprinted upon the surface of the card and is related to a particular “state” or function of the electronic program that can be altered by physically pressing on the card 502.
An end-user begins a game by placing a number of cards 502 with letters of the alphabet imprinted on them upon the interactive surface 501, as illustrated in Fig. 5. This leads to the automatic detection and recognition of the location and UID of each card 502 by the interactive surface through the same process described in the previous embodiments. The end-user then initiates the language spelling game by pressing on the “Go” button 504. This causes the computer program of the computer system to broadcast an audio recording via the speakers 507, prompting the end-user to correctly spell out a word based on the letters of the alphabet in play (i.e., cards 502 already placed upon the interactive surface 501). The end-user can then proceed to respond to the challenge by selecting through a finger touch, in the correct sequence, the cards 502 whose letters correspond to the spelling of the word that has been broadcast to the end-user. By pressing on a card 502, the end-user is effectively changing the state of the card’s user interface function. There are two potential outcomes from selecting a non-activated letter card 502 during the course of a word spelling challenge. The first is for the state of the selected card 502 to change from “non-active” to “active and correct”, and the second is for the state of the card to change from “non-active” to “active and incorrect”. Whether the end-user’s selection was correct or not will yield different feedback to the end-user.
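The two spelling outcomes described above can be sketched as a small selection function. The target word and the progress counter are assumptions introduced for illustration; the disclosure only specifies the two resulting states.

```python
# Sketch of the spelling-outcome logic: a selected letter card becomes
# "active and correct" when it matches the next expected letter of the
# target word, otherwise "active and incorrect".
def select_card(letter, target_word, progress):
    """progress = number of letters of target_word correctly selected so far."""
    if progress < len(target_word) and letter == target_word[progress]:
        return "active and correct", progress + 1
    return "active and incorrect", progress
```

The returned state would then drive the per-card feedback described next (lighting the card via RF to DC energy conversion, plus an indicative sound).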
Each time the end-user presses upon the surface of a card 502, and depending on whether the selection was correct, the computer system directs the RF antenna located closest to that card to light up the card 502 in question through RF to DC energy conversion technology.
The end-user may also choose to replay the audio broadcast of the word to spell by pressing on the “replay” button 505, or they can press the “example” button 506 to hear the word in the context of a sentence.
Referring back to Fig. 5, one can see a specific example of the game design described above. In this case, the end-user was challenged to spell correctly the word “Apple”, and they have proceeded to correctly press the cards whose letters spell out the word “Apple”. More specifically, as the end-user selected all 5 cards, the state of each of these is changed from “non-active” to “active and correct”. This has caused the LED lights of each card 502 to light up in order to indicate the correct selection by the end-user. In a preferred embodiment, a correct or incorrect selection is also followed by an indicative sound broadcast by the speakers 507 of the interactive surface 501. The use of more than one LED light of different colors per card 502 in order to differentiate a correct from an incorrect selection is another alternative feedback design.
The end-user can choose to alter the language spelling game by placing general user interface command cards 503 upon the interactive surface and proceeding to touch them. Thus, the end-user may choose to alter the level of spelling difficulty by pressing once on the “Difficulty” card 503 to cause the computer program to go into an easy mode (thus challenging the player to spell out relatively easy words). The end-user may instead choose intermediate or difficult modes by tapping on the same card 503 twice or thrice, respectively, in rapid succession. In effect, this card 503 will change between three different states (i.e., easy, intermediate and difficult) depending on the type of touch the end-user applies to it.
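The three-state “Difficulty” card can be sketched as a simple tap-count mapping. The fallback to easy mode for an unrecognized tap count is an assumption for illustration, not part of the disclosure.

```python
# Sketch of the tap-count -> difficulty mapping for the "Difficulty" card:
# one, two, or three taps in rapid succession select the three states.
DIFFICULTY = {1: "easy", 2: "intermediate", 3: "difficult"}

def difficulty_from_taps(tap_count):
    # Assumed behavior: an unrecognized tap count falls back to easy mode.
    return DIFFICULTY.get(tap_count, "easy")
```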
The end-user may also choose to change the form of the word the computer program will challenge them with by picking any of the general user interface command cards 503 that correspond to this description and changing their relative state by pressing on them. Referring to Fig. 5, one can see that in that particular scenario, two cards 503 with the words “Verb” and “Noun” have been placed upon the interactive surface 501, but only the “Verb” card 503 is in an active state, as its LED light is activated. This indicates to the end-user that all words the computer program will challenge them with at that time will be verbs.
The end-user may also choose to select words that correspond to a certain category of elements. For example, referring back to Fig. 5, one can see that the general user interface card 503 with “animal” has been placed upon the interactive surface 501 (it will be noted that an alternative design would be to have the image of an animal on the card instead of the actual word “animal”). If the end-user presses on that card, leading to its activation (i.e., changing its state) and the lighting up of its LED light, the computer program will then proceed to challenge the end-user with the spelling of words comprising animal names (e.g., the word “Zebra” is a possibility in the scenario depicted in Fig. 5).
Finally, the end-user may select the language that the game will proceed in by placing and touching the general user interface command cards 503 with the language name imprinted upon the card 503. Referring to Fig. 5, if the end-user presses on the card 503 “Francais” , then the word challenges will be made in French.
The game design described above can be used in a form of language spelling “free form” game whereby the end-user randomly picks and places cards with letters printed upon them upon the interactive surface 501. This free-form language spelling game design is particularly well suited for young children who have just started or are about to start learning spelling, as it teaches them how to match certain letters or words with their corresponding sounds in a free and unrestrictive manner.
Fig. 6 is an exemplary schematic diagram illustrating the method process flow in accordance with one embodiment of the present invention. As shown in Fig. 6, the system process includes the following steps.
Step 601: placing a plurality of objects on an interactive surface;
Step 602: indicating, by exhibiting a sensory pattern, the current state of the UI element marked on an object, by a sensory accessory;
Step 603: acting upon the object by a touch action;
Step 604: detecting a touch action by the interactive surface;
Step 605: notifying the computer system of the touch action by the interactive surface;
Step 606: receiving an instruction according to the computer program regarding changing the current state of the UI element marked on the object acted upon by the touch action;
Step 607: directing, by the interactive surface, the sensory accessory to indicate a new state of the UI element marked on the object by exhibiting a new sensory pattern for the object.
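Steps 604 through 607 above can be sketched as a single event-handling routine. The function and parameter names are illustrative stand-ins for the interactive surface, the computer system, and the sensory accessory of the disclosed system.

```python
# Sketch of the per-touch pipeline: detect (604), notify (605), receive the
# program's instruction (606), and direct the sensory accessory (607).
def handle_touch(obj_uid, program_state, transitions, sensory):
    # Steps 604-605: the surface detects the touch and notifies the computer
    # system, which looks up the touched object's current state.
    current = program_state[obj_uid]
    # Step 606: the computer program returns an instruction for the new state;
    # an unknown transition leaves the state unchanged (an assumption here).
    new_state = transitions.get((obj_uid, current), current)
    program_state[obj_uid] = new_state
    # Step 607: the surface directs the sensory accessory to exhibit a new
    # sensory pattern for the object.
    sensory.append((obj_uid, new_state))
    return new_state
```

A usage example: with a “play” object in the “not-executing” state and a stored transition to “executing”, a touch updates the program state and queues one new sensory pattern for that object.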

Claims (20)

  1. A system for changing the state of a user interface (UI) element visually marked on physical objects, comprising
    -a plurality of physical objects, each comprising a unique identification code (UID) and each visually marked with a UI element of a computer program,
    -an interactive surface that has no display screen, and is configured to detect the UID of an object, to derive the location and orientation of the object, and to measure capacitance change caused by the object and subsequently by a touch action upon the object, upon the object being placed on the interactive surface,
    -a sensory accessory that is operatively controlled by the interactive surface, and indicates the current state of the UI element by exhibiting a sensory pattern,
    wherein, during runtime execution of the computer program, and in response to a touch action upon a first object, the interactive surface is configured to detect the touch action and direct the sensory accessory to indicate a new state of the UI element of a second object by exhibiting a new sensory pattern for the second object.
  2. The system of claim 1, wherein the UI element is selected from a group comprising start, stop, save, delete, okay, cancel, play, replay, record, complete, copy, duplicate, export, import, a letter of an alphabet, a word of a language, and an icon representing a musical symbol.
  3. The system of claim 1, wherein the state of a UI element is selected from a group comprising active, non-active, in use, not in use, enabled, not enabled, selected, not-selected, not-executing, executing, completed, correct, incorrect, succeeded, and failed.
  4. The system of claim 1, wherein the touch action is selected from a group comprising touching an object once, touching an object twice in rapid succession, touching an object once but for a duration of time, changing the orientation of an object with finger touch, and changing the location of an object with finger touch.
  5. The system of claim 1, wherein the second object is either the same or different from the first object.
  6. The system of claim 1, further comprising a computer system that is operatively linked to the interactive surface and is configured to execute the computer program.
  7. The system of claim 6, wherein the computer system stores the current state and a range of valid states of the UI element of the computer program marked on an object, and the relationship between a touch action and a change of the state of the UI element.
  8. The system of claim 6, wherein the interactive surface notifies the computer system of the touch action and receives an instruction according to the computer program regarding changing the current state of the UI element marked on an object.
  9. The system of claim 1, wherein an object is selected from a group comprising a card, a button, a block, an icon, a sheet, and a figurine.
  10. The system of claim 1, wherein the sensory accessory is selected from a group comprising an LED light, an audio device, a video device, and a vibration generator device.
  11. A method for changing the state of a user interface (UI) element visually marked on physical objects, comprising
    -placing, during runtime execution of a computer program, a plurality of objects on an interactive surface,
    wherein each object comprises a unique identification code (UID) and is visually marked with a UI element of the computer program, and
    wherein the interactive surface has no display screen, and is configured to detect the UID of the object, derive the location and orientation of the object, and measure capacitance change caused by the object and subsequently by a touch action upon the object, upon the object being placed on the interactive surface,
    -indicating, by exhibiting a sensory pattern, the current state of the UI element marked on a first object, by a sensory accessory that is operatively controlled by the interactive surface,
    -acting upon the first object by a touch action,
    -detecting the touch action by the interactive surface,
    -directing, by the interactive surface, the sensory accessory to indicate a new state of the UI element marked on a second object by exhibiting a new sensory pattern for the second object.
  12. The method of claim 11, wherein the UI element is selected from a group comprising start, stop, save, delete, okay, cancel, play, replay, record, complete, copy, duplicate, export, import, a letter of an alphabet, a word of a language, and an icon representing a musical symbol.
  13. The method of claim 11, wherein the state of a UI element is selected from a group comprising active, non-active, in use, not in use, enabled, not enabled, selected, not-selected, not-executing, executing, completed, correct, incorrect, succeeded, and failed.
  14. The method of claim 11, wherein the touch action is selected from a group comprising touching an object once, touching an object twice in rapid succession, touching an object once but for a short duration of time, changing the orientation of an object with finger touch, and changing the location of an object with finger touch.
  15. The method of claim 11, wherein the second object is either the same or different from the first object.
  16. The method of claim 11, further comprising executing the computer program by a computer system that is operatively linked to the interactive surface.
  17. The method of claim 16, further comprising storing, by the computer system, the current state and a range of valid states of the UI element of the computer program marked on an object, and the relationship between a touch action and a change in the state of the UI element.
  18. The method of claim 16, further comprising notifying the computer system of the touch action by the interactive surface and receiving an instruction according to the computer program regarding changing the current state of the UI element marked on an object.
  19. The method of claim 11, wherein an object is selected from a group comprising a card, a button, a block, an icon, a sheet, and a figurine.
  20. The method of claim 11, wherein the sensory accessory is selected from a group comprising an LED light, an audio device, a video device, and a vibration generator device.
PCT/CN2014/091918 2013-03-12 2014-11-21 System and method for changing the state of user interface element marked on physical objects WO2015113440A1 (en)

Priority Applications (10)

Application Number Priority Date Filing Date Title
CN201480061857.2A CN105723306B (en) 2014-01-30 2014-11-21 System and method for changing the state of a user interface element marked on an object
EP14880849.6A EP3100148A1 (en) 2014-01-30 2014-11-21 System and method for changing the state of user interface element marked on physical objects
PCT/CN2015/070162 WO2015113457A1 (en) 2014-01-30 2015-01-06 Apparatus and method to enhance expressive qualities of digital music
EP15743177.6A EP3100258A4 (en) 2014-01-30 2015-01-06 Apparatus and method to enhance expressive qualities of digital music
KR1020167020190A KR101813557B1 (en) 2014-01-30 2015-01-06 Apparatus and method to enhance expressive qualities of digital music
JP2016548113A JP2017507349A (en) 2013-03-12 2015-01-06 Apparatus and method for enhancing the expression quality of digital music
CN201580000167.0A CN105027192B (en) 2014-01-30 2015-01-06 Enhance the device and method of digital music expressive force
PCT/CN2015/074570 WO2015188643A1 (en) 2014-06-13 2015-03-19 System and method for interactive game
US14/737,514 US9299330B2 (en) 2014-01-30 2015-06-12 Apparatus and method to enhance the expressive qualities of digital music
US15/057,092 US9690473B2 (en) 2014-06-13 2016-02-29 System and method for changing the state of user interface element marked on physical objects

Applications Claiming Priority (12)

Application Number Priority Date Filing Date Title
CNPCT/CN2014/071850 2014-01-30
PCT/CN2014/071850 WO2014139349A1 (en) 2013-03-12 2014-01-30 System and method for identifying an object's id and location relative to an interactive surface
CNPCT/CN2014/072961 2014-03-06
PCT/CN2014/072961 WO2014139369A1 (en) 2013-03-12 2014-03-06 System and method for identifying object's id and location relative to interactive surface
PCT/CN2014/079892 WO2015113359A1 (en) 2013-03-12 2014-06-13 System and method for identifying an object's id and location relative to an interactive surface
CNPCT/CN2014/079892 2014-06-13
CNPCT/CN2014/080495 2014-06-23
PCT/CN2014/080495 WO2015113365A1 (en) 2014-01-30 2014-06-23 System and method to recognize object's id, orientation and location relative to interactive surface
PCT/CN2014/086745 WO2015113404A1 (en) 2014-01-30 2014-09-17 System and method for directing small scale object to generate sensory output to user powered by rf energy harvesting
CNPCT/CN2014/086745 2014-09-17
CNPCT/CN2014/091143 2014-11-14
PCT/CN2014/091143 WO2015113433A1 (en) 2014-01-30 2014-11-14 System and method to interact with elements of a language using physical objects

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
PCT/CN2014/086745 Continuation-In-Part WO2015113404A1 (en) 2014-01-30 2014-09-17 System and method for directing small scale object to generate sensory output to user powered by rf energy harvesting
PCT/CN2014/090890 Continuation-In-Part WO2015113431A1 (en) 2013-03-12 2014-11-12 System and method for recognizing objects with continuous capacitance sensing

Related Child Applications (2)

Application Number Title Priority Date Filing Date
PCT/CN2015/070162 Continuation-In-Part WO2015113457A1 (en) 2013-03-12 2015-01-06 Apparatus and method to enhance expressive qualities of digital music
US15/057,092 Continuation-In-Part US9690473B2 (en) 2014-06-13 2016-02-29 System and method for changing the state of user interface element marked on physical objects

Publications (1)

Publication Number Publication Date
WO2015113440A1 true WO2015113440A1 (en) 2015-08-06

Family

ID=53756211

Family Applications (4)

Application Number Title Priority Date Filing Date
PCT/CN2014/080495 WO2015113365A1 (en) 2013-03-12 2014-06-23 System and method to recognize object's id, orientation and location relative to interactive surface
PCT/CN2014/090890 WO2015113431A1 (en) 2013-03-12 2014-11-12 System and method for recognizing objects with continuous capacitance sensing
PCT/CN2014/091918 WO2015113440A1 (en) 2013-03-12 2014-11-21 System and method for changing the state of user interface element marked on physical objects
PCT/CN2014/093763 WO2015113446A1 (en) 2013-03-12 2014-12-12 Apparatus and method for eliminating blind spot in an rf antenna array


Country Status (2)

Country Link
EP (2) EP3100142A1 (en)
WO (4) WO2015113365A1 (en)


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017031764A1 (en) * 2015-08-27 2017-03-02 Zheng Shi Apparatus and method for an rfid touch panel
CN107305444A (en) * 2016-04-19 2017-10-31 施政 Detection system and method
US9589161B1 (en) * 2016-05-13 2017-03-07 Kazoo Technology (Hong Kong) Limited Substrate with electrically conductive patterns that are readable
TW202343320A (en) * 2022-04-20 2023-11-01 美商阿泰訊控股公司 Personal Protection Equipment Digital Management (PPE-DM)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050275635A1 (en) * 2004-06-15 2005-12-15 Microsoft Corporation Manipulating association of data with a physical object
US7397464B1 (en) * 2004-04-30 2008-07-08 Microsoft Corporation Associating application states with a physical object
US8584029B1 (en) * 2008-05-23 2013-11-12 Intuit Inc. Surface computer system and method for integrating display of user interface with physical objects

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6443796B1 (en) * 2000-06-19 2002-09-03 Judith Ann Shackelford Smart blocks
FR2860985B1 (en) * 2003-10-20 2005-12-30 Numicom ELECTRONIC LUDO-EDUCATIONAL ASSEMBLY WITH COMMUNICATING ELEMENTS WITH RADIO FREQUENCY LABEL
CN101111295B (en) * 2005-02-02 2010-06-23 皇家飞利浦电子股份有限公司 Pawn with triggerable sub parts
US7623081B2 (en) * 2008-01-25 2009-11-24 Mitsubishi Electric Research Laboratories, Inc. Wireless UWB connection for rotating RF antenna array
US10265609B2 (en) * 2008-06-03 2019-04-23 Tweedletech, Llc Intelligent game system for putting intelligence into board and tabletop games including miniatures
EP2209158A1 (en) * 2009-01-16 2010-07-21 Serious Toys B.V. System for detecting a position of an object in a plane
CN101637655A (en) * 2009-02-27 2010-02-03 黄煜能 Radio frequency (RF) entertainment or instructional system
CN102039045A (en) * 2009-10-12 2011-05-04 朱立圣 Electronic chessboard type game system
US20120249430A1 (en) * 2011-03-31 2012-10-04 Oster David Phillip Multi-Touch Screen Recognition of Interactive Objects, and Application Thereof
EP2657717A1 (en) * 2012-04-26 2013-10-30 Koninklijke Philips N.V. Magnetic resonance imaging (MRI) radio frequency (RF) antenna array with Gysel power splitter
CN104303133A (en) * 2013-03-12 2015-01-21 施政 System and method for interactive board
CN203264242U (en) * 2013-05-22 2013-11-06 王宇 Sound production chess


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017088735A1 (en) * 2015-11-27 2017-06-01 Zheng Shi Interactive system and method for learning a language
CN106422297A (en) * 2016-10-20 2017-02-22 中南大学 Automatic chessman feeding mechanism of go chess
CN106422297B (en) * 2016-10-20 2022-09-13 中南大学 Automatic go piece feeding mechanism

Also Published As

Publication number Publication date
EP3100148A1 (en) 2016-12-07
EP3100142A1 (en) 2016-12-07
WO2015113431A1 (en) 2015-08-06
WO2015113365A1 (en) 2015-08-06
WO2015113446A1 (en) 2015-08-06

Similar Documents

Publication Publication Date Title
US9690473B2 (en) System and method for changing the state of user interface element marked on physical objects
WO2015113440A1 (en) System and method for changing the state of user interface element marked on physical objects
US20160180734A1 (en) System and method to interact with elements of a language using physical objects
WO2015113358A1 (en) System and method for operating computer program with physical objects
US20160162040A1 (en) System and method for operating a computer program with physical objects
CN104102376B (en) Touch input device touch feedback
AU2017262857B2 (en) Touch screen overlay for the visually impaired and computer program
US20120007870A1 (en) Method of changing processor-generated individual characterizations presented on multiple interacting processor-controlled objects
US20160162036A1 (en) System and accompanying method for interacting with a card on an interactive surface
US8952887B1 (en) Interactive references to related application
US20040008182A1 (en) Methods and systems for providing programmable computerized interactors
US20210004211A1 (en) System and method for user created object, property, method, or event with physical manipulatives
JP2006039508A (en) Apparatus, method, and program for typing practice
US20120212427A1 (en) Driving device for interacting with touch screen panel assembly and method for interacting same with touch screen panel assembly
KR101789057B1 (en) Automatic audio book system for blind people and operation method thereof
US20180315334A1 (en) System and method for developing sense of rhythm
CN105917293B (en) System and method to interact with elements of a language using physical objects
CN109375768A (en) Interactive bootstrap technique, device, equipment and storage medium
KR101246919B1 (en) Voice output system using by mat printed oid code and the control method thereof
US20160175698A1 (en) System and method for directing a targeted object on an interactive surface to produce a response
KR20140015672A (en) Apparatus and method for providing language learning service using character
TWM491889U (en) Smart electronic audio book
EP3986585B1 (en) Toys with capacitive touch features
CN105637465A (en) System and method for operating computer program with physical objects
CN107393355A (en) A kind of electronic sound convenient for collecting knows figure product

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 14880849; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
REEP Request for entry into the european phase (Ref document number: 2014880849; Country of ref document: EP)
WWE Wipo information: entry into national phase (Ref document number: 2014880849; Country of ref document: EP)