WO2016202270A1 - System and method for physical programming on interactive surface - Google Patents

System and method for physical programming on interactive surface

Info

Publication number
WO2016202270A1
Authority
WO
WIPO (PCT)
Prior art keywords
programming
physical
objects
interactive surface
processor
Prior art date
Application number
PCT/CN2016/085973
Other languages
French (fr)
Inventor
Zheng Shi
Huihui Wang
Yaohong DU
Original Assignee
Zheng Shi
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zheng Shi filed Critical Zheng Shi
Publication of WO2016202270A1 publication Critical patent/WO2016202270A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00 Arrangements for software engineering
    • G06F 8/30 Creation or generation of source code
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs

Definitions

  • the present invention relates to the field of human computer interaction (HCI) and, particularly, to a system and method for physical programming on an interactive surface.
  • Graphical programming provides children with an attractive alternative. It converts a variety of programming concepts into graphics that can be displayed on a screen, so children only need to drag the graphics to complete the whole process of programming.
  • Physical programming (or physical object programming) is considered a branch of graphical programming. The difference is that physical programming is not restricted by computer screens: the user programs by interacting with physical objects through techniques such as physical touch and physical perception, converting physical logic into programming logic. Compared with manipulating a computer, manipulating physical objects keeps children engaged throughout the whole process. Through a physical programming system, children can understand the logic of programming languages more intuitively. Programs are no longer lines of boring code, but combinations of physical objects. By combining physical objects, children can accomplish work that would otherwise require typing on a keyboard in conventional programming languages. These characteristics make physical programming particularly well suited to children.
  • the present invention provides a method and system for physical programming on an interactive surface, using finger touch gestures to express the programming logic of programming symbols for the physical programming objects.
  • the system for physical programming includes multiple physical programming objects, with each visually marked with a programming symbol; an interactive surface comprising an array of electrodes; a processor operatively linked to the interactive surface, configured to detect the magnitude of the capacitive coupling between physical programming objects placed on the interactive surface and the array of electrodes of the interactive surface, and further configured to identify the location of the physical programming objects as well as finger touch gestures made upon the programmable physical objects, and further configured to recognize the IDs of the physical programming objects; a first memory unit operatively linked to the processor and configured to store correlation relationships among and between IDs, programming symbols and programming codes; and a second memory unit operatively linked to the processor and configured to store correlation relationships between the finger touch gestures and the programming logic of the programming symbols.
  • the processor is configured to produce a programming code based on the finger touch gesture and the programming symbols.
  • the processor is configured to recognize the IDs of the physical programming objects placed on the interactive surface through wireless communication between an array of RF antennas embedded in the interactive surface and the RFID tags embedded in the physical programming objects.
  • the finger touch made upon the programmable physical objects can be a light finger touch, a double tap, or a pressing finger touch.
  • the system includes a sensory accessory that can be an LED light, an audio device, a video device, or a vibration generator device.
  • the processor is configured to direct the sensory accessory to produce an output to indicate the progress of execution of the programming code.
  • the interactive surface is configured to display the programming logic of the programming symbols, and wherein the display device can be LED lights, touch screens, or e-ink screens.
  • the method of the present invention includes the following steps:
  • the method includes recognizing by the processor the IDs of the physical programming objects placed on the interactive surface through wireless communication between an array of RF antennas embedded in the interactive surface and the RFID tags embedded in the physical programming objects.
  • the method includes directing by the processor a sensory accessory to produce an output to indicate the progress of execution of the programming code, and the sensory accessory can be an LED light, an audio device, a video device, or a vibration generator device.
  • the method includes displaying by the interactive surface the programming logic of the programming symbols, and the display device can be LED lights, touch screens, or e-ink screens.
  • the present invention provides a method and system for physical programming on an interactive surface that uses finger touch gestures to represent the programming logic of the programming symbols of the physical programming objects.
  • This physical programming system is more convenient to use and makes it easier for children and newcomers to learn to program.
  • FIG. 1 is an exemplary schematic diagram illustrating the physical programming system in accordance with one embodiment of the present invention.
  • FIG. 2A is an exemplary schematic diagram illustrating a physical programming object whose programming symbol represents subroutine functions in accordance with one embodiment of the present invention.
  • FIG. 2B is an exemplary schematic diagram illustrating a physical programming object whose programming symbol represents movement modes in accordance with one embodiment of the present invention.
  • FIG. 2C is an exemplary schematic diagram illustrating a physical programming object whose programming symbol represents actions in accordance with one embodiment of the present invention.
  • FIG. 2D is an exemplary schematic diagram illustrating a physical programming object whose programming symbol represents grid positions in accordance with one embodiment of the present invention.
  • FIG. 3 is an exemplary schematic diagram illustrating lighting up LED lights placed in the third row on the interactive surface in accordance with one embodiment of the present invention.
  • FIG. 4 is an exemplary schematic diagram illustrating lighting up LED lights placed in the first and the third rows on the interactive surface in accordance with one embodiment of the present invention.
  • FIG. 5 is an exemplary schematic diagram illustrating lighting up LED lights placed in the second row on the interactive surface in accordance with one embodiment of the present invention.
  • FIG. 6 is an exemplary schematic diagram illustrating the process flow of the physical programming method in accordance with one embodiment of the present invention.
  • FIG. 1 is an exemplary schematic diagram illustrating the physical programming system in accordance with one embodiment of the present invention.
  • the physical programming system includes multiple physical programming objects 1, an interactive surface 2, a processor 3, a first memory unit 4, and a second memory unit 5.
  • Each physical programming object 1 is visually marked with a distinct programming symbol, and each further includes an RFID tag.
  • the physical programming object 1 can be a card, a button, an icon, a sheet, or a statue.
  • the interactive surface 2 in FIG. 1 includes an array of electrodes and an RF antenna array.
  • the electrodes can be made of metals such as aluminum or copper. The materials making up physical programming object 1 can capacitively couple with the electrodes.
  • Processor 3 is operatively linked to the interactive surface 2 and can detect the magnitude of the capacitive coupling between the physical programming objects 1 placed on the interactive surface and the array of electrodes of interactive surface 2, and further derive the location of the physical programming objects 1.
  • the processor 3 can recognize the IDs of the physical programming objects through wireless communication between RFID tags of the physical programming objects 1 on the interactive surface and the RF antenna array of the interactive surface 2.
  • when a finger touch gesture acts on a physical programming object 1, the magnitude of capacitive coupling between the physical programming object 1 and the interactive surface 2 changes; the processor 3 detects such changes and then recognizes the finger touch gesture.
  • the finger touch gesture made upon the programmable physical objects can be a light finger touch, a double tap, or a pressing finger touch.
  • the light finger touch refers to a brief, rapid touch of a physical programming object.
  • the double tap refers to tapping a physical programming object twice in rapid succession.
  • the pressing finger touch refers to holding a finger on a physical programming object for a period of time.
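The three gestures can, for example, be told apart by the duration and repetition of the capacitive-coupling change. The following is a minimal sketch of such a classifier; the event format and all timing thresholds are illustrative assumptions, not values from the patent:

```python
# Sketch: classifying the three finger touch gestures from touch events.
# touch_events holds (start_time, end_time) pairs, in seconds, for touches
# detected on one physical programming object. Thresholds are assumptions.

def classify_gesture(touch_events, now):
    """Return "press", "double_tap", "light_touch", or None if undecided."""
    if not touch_events:
        return None
    start, end = touch_events[-1]
    if end - start >= 0.8:                   # finger held down: pressing touch
        return "press"
    if len(touch_events) >= 2:
        prev_end = touch_events[-2][1]
        if start - prev_end <= 0.3:          # two rapid touches: double tap
            return "double_tap"
    if now - end >= 0.3:                     # short touch, no second tap came
        return "light_touch"
    return None                              # still waiting for a possible second tap
```

A real implementation would feed this from the coupling-change events of Step 604 and debounce the raw electrode readings first.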
  • the first memory unit 4 is operatively linked to the processor 3 and stores the relationships among and between physical programming objects’ IDs, programming symbols and programming codes.
  • the second memory unit 5 is operatively linked to the processor 3 and stores relationships between the finger touch gestures and the programming logic of the programming symbols. Users can customize which programming logic corresponds to which finger touch gesture according to their own preferences.
  • the second memory unit 5 has an initial value (i.e., the default relationship between finger touch gestures and the programming logic of the programming symbols) that can be restored by users.
  • a finger touch gesture is made on the Nth physical programming object 1, where N is an integer greater than 1.
  • processor 3 can generate a program code based on the programming symbols of the physical programming objects 1 and the finger touch gestures.
  • the interactive surface 2 is configured to display the programming logic of the programming symbols, and the display device can be LED lights, touch screens, or e-ink screens.
  • the physical programming system shown in FIG. 1 further includes a sensory accessory 11.
  • the sensory accessory 11 can be an LED light, an audio device, a video device, or a vibration generator device.
  • the processor 3 is configured to direct the sensory accessory to produce an output to indicate the progress of execution of the programming code.
  • Embodiment two provides an example of physical programming using the physical programming system in the present invention.
  • as shown in FIGs. 2A, 2B, 2C and 2D, multiple physical programming objects are used in embodiments two, three and four.
  • the programming symbols represent movement modes that include moving upward, moving downward, moving left, and moving right.
  • the programming symbols represent actions such as lighting up and not lighting up.
  • the programming symbols represent locations of grid points. For example, “odd” is used to represent a grid point with an odd column number.
  • the programming task in embodiment two is to light up the LED lights 10 located at the intersections of the third row and all odd-numbered columns (note that only one LED light is labeled 10 in FIG. 3; the other LED lights are unlabeled).
  • the first grid point in the first row is the starting position 9.
  • the interactive surface 2 is divided into a display area and a programming area. There are five rows of grid points in the display area, with each row having nine grid points.
  • a light finger touch corresponds to the sequence structure, which means that the programming symbols of each physical programming object will be executed in sequence.
  • the processor 3 operatively linked to the interactive surface 2 is configured to recognize the IDs and location information of the physical programming objects 1 on the interactive surface 2.
  • the logical button 8 is pressed, and light finger touches are made on the physical programming objects 1, which are recognized by the processor 3.
  • the interactive surface 2 can display the programming logic, and the display device can be LED lights, touch screens, or e-ink screens.
  • the physical programming objects 1 on the interactive surface are connected with straight or curved arrow lines, and the direction of the arrows indicates the sequence of the finger touch gestures.
  • the processor 3 derives the programming symbols based on the correlation relationships among and between physical programming objects’ IDs, programming symbols and programming codes stored in the first memory unit operatively linked to the processor, and derives the programming logic of the programming symbols based on the correlation relationships between the finger touch gestures and the programming logic of the programming symbols stored in the second memory unit operatively linked to the processor. The processor 3 thus generates a programming code based on the physical programming objects and the finger touch gestures.
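The two table lookups described above can be sketched as follows. Every ID, symbol, code fragment, and gesture name here is a hypothetical example chosen for illustration; the patent does not specify concrete table contents:

```python
# Sketch of the first and second memory-unit lookups and code generation.
# All IDs, symbols, and code fragments below are hypothetical examples.

FIRST_MEMORY = {            # object ID -> (programming symbol, code fragment)
    0x01: ("move_right", "robot.move(+1, 0)"),
    0x02: ("light_up",   "grid.light(robot.pos)"),
}
SECOND_MEMORY = {           # finger touch gesture -> programming logic
    "light_touch": "sequence",
    "press":       "branch_point",
    "double_tap":  "branch_not_met",
}

def generate_code(touched_objects):
    """touched_objects: (tag_id, gesture) pairs in the order touched."""
    lines = []
    for tag_id, gesture in touched_objects:
        symbol, fragment = FIRST_MEMORY[tag_id]      # first memory unit lookup
        logic = SECOND_MEMORY[gesture]               # second memory unit lookup
        lines.append(f"# {symbol} ({logic})")
        lines.append(fragment)
    return "\n".join(lines)
```

Customizing the gesture-to-logic mapping, as the second memory unit allows, amounts to rewriting entries of `SECOND_MEMORY`.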
  • a user can execute a computer program he or she has created and see the execution result of the programming code through a display device (in embodiment two, the display device is an LED light).
  • if the user wants to debug the computer program, he or she can press the stop button 7 and adjust the physical programming objects 1 and the finger touch gestures.
  • the physical programming system further includes a sensory accessory, and the processor is configured to direct the sensory accessory to produce an output to indicate the progress of execution of the programming code.
  • each physical programming object 1 is connected to an LED light. Once the programming code represented by a physical programming object 1 is executed, the corresponding LED light is lit up. The user can thus observe the progress of execution of the programming code, which makes debugging easier.
  • physical programming objects 1 and finger touch gestures are used to program on the interactive surface 2 to light up the LED lights 10 located at the intersections of the third row and all columns with odd numbers.
  • the programming logic of the physical programming objects used in subroutine function P1 is as follows: lighting up, moving right, moving right.
  • the programming logic of the physical programming objects used in the main function is: moving downward, moving downward, calling subroutine function P1, calling subroutine function P1, calling subroutine function P1, calling subroutine function P1, calling subroutine function P1, and lighting up.
  • the programming code generated based on the physical programming objects 1 and the finger touch gestures in FIG. 3 is as follows:
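The generated code itself is not reproduced in this extract. As an illustration only, the stated logic of embodiment two can be simulated in Python; the 5x9 grid follows the description above, while clamping movement at the grid edge is an assumption the patent does not spell out:

```python
# Sketch of embodiment two: light the LEDs at row 3, odd columns, on a
# 5x9 grid. Edge clamping on movement is an assumed behaviour.

ROWS, COLS = 5, 9
lit = set()
pos = [1, 1]                      # starting position 9: first row, first column

def light_up():
    lit.add(tuple(pos))

def move_down():
    pos[0] = min(pos[0] + 1, ROWS)

def move_right():
    pos[1] = min(pos[1] + 1, COLS)

def subroutine_p1():              # P1: lighting up, moving right, moving right
    light_up()
    move_right()
    move_right()

# main function: moving downward twice, calling P1 five times, lighting up
move_down()
move_down()
for _ in range(5):
    subroutine_p1()
light_up()
```

Running the sketch leaves `lit` holding the five grid points at the intersections of the third row and the odd-numbered columns, matching the stated programming task.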
  • Embodiment three provides another example of physical programming using the physical programming system of the present invention.
  • a light finger touch corresponds to the sequence structure, which means that the programming symbols of each physical programming object will be executed in sequence.
  • the locations of the physical programming objects 1 used in embodiment three are the same as those in embodiment two, but the sequence of the finger touch gestures made on the physical programming objects 1 has been changed. As a result, LED lights at different intersectional grid points are lit up.
  • physical programming objects 1 and finger touch gestures are used to program on the interactive surface 2 to light up three LED lights 10 in the first row and two LED lights in the third row.
  • the programming logic of the physical programming objects used in subroutine function P1 is as follows: lighting up, moving right, moving right.
  • the programming logic of the physical programming objects used in the main function is: calling subroutine function P1, calling subroutine function P1, lighting up, moving downward, moving downward, calling subroutine function P1, and calling subroutine function P1.
  • the programming code generated based on the physical programming objects 1 and the finger touch gestures in FIG. 4 is as follows.
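Again the code itself is not reproduced here. A hypothetical simulation of the stated logic shows how merely reordering the touches changes the result, with the same edge-clamping assumption as before:

```python
# Sketch of embodiment three: the same objects as embodiment two, touched
# in a different order. Movement is clamped at the grid edge (an assumption).

ROWS, COLS = 5, 9
lit = set()
pos = [1, 1]                          # first row, first column

def subroutine_p1():                  # P1: lighting up, then moving right twice
    lit.add(tuple(pos))
    pos[1] = min(pos[1] + 2, COLS)

# main: P1, P1, lighting up, moving downward twice, P1, P1
subroutine_p1()
subroutine_p1()
lit.add(tuple(pos))                   # lighting up
pos[0] = min(pos[0] + 2, ROWS)        # moving downward twice
subroutine_p1()
subroutine_p1()
```

The sketch ends with three lit points in the first row and two in the third row, consistent with FIG. 4 as described above.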
  • Embodiment four provides another example of physical programming using the physical programming system of the present invention.
  • a light finger touch corresponds to the sequence structure as well as the execution in sequence of the programming symbols of each physical programming object on a branch where the branch condition is met
  • a pressing finger touch represents a programming symbol of a physical programming object that acts as a branch point
  • a double tap corresponds to the execution in sequence of the programming symbols of each physical programming object on a branch where the branch condition is not met.
  • the sequence of the finger touch gestures is defined as follows: pressing finger touches are made on physical programming objects 1 representing branch points; light finger touches are made on physical programming objects 1 on branches where the branch conditions are met; and double taps are made on physical programming objects 1 on branches where the branch conditions are not met.
  • physical programming objects 1 and finger touch gestures are used to program on the interactive surface 2 to light up the LED lights 10 located at the intersections of the second row and all columns with odd numbers.
  • users can determine the directions of the objects based on their own preferences.
  • the physical programming objects 1 that represent branch points are placed along a first direction and the other physical programming objects along a second direction, which makes the programming logic of the programming symbols of the physical programming objects 1 easier to follow.
  • the programming logic of the physical programming objects used in subroutine function P2 is as follows: lighting up if an LED light is located at the intersection of the Nth row and a column with an odd number; moving right; not lighting up if an LED light is located at the intersection of the Nth row and a column with an even number; moving right.
  • the programming logic of the physical programming objects used in the main function is as follows: moving downward, followed by twelve calls to subroutine function P2.
  • the programming code generated by the physical programming objects 1 and the finger touch gestures in FIG. 5 is as follows:
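The branch structure expressed by the three gestures can be sketched as follows. The mapping of gestures to branches follows the description above; edge clamping at the last column is again an assumption:

```python
# Sketch of embodiment four: light the LEDs at row 2, odd columns.
# A pressing touch marks the branch point (odd column?), a light touch the
# branch where the condition is met, and a double tap the branch where it
# is not. Edge clamping on movement is an assumed behaviour.

ROWS, COLS = 5, 9
lit = set()
pos = [1, 1]                          # first row, first column

def subroutine_p2():
    if pos[1] % 2 == 1:               # branch point: odd column number?
        lit.add(tuple(pos))           # condition met: lighting up
    # condition not met: not lighting up, so nothing to do
    pos[1] = min(pos[1] + 1, COLS)    # both branches end with moving right

pos[0] = min(pos[0] + 1, ROWS)        # main: moving downward
for _ in range(12):                   # then the twelve calls of P2
    subroutine_p2()
```

After the twelve calls, `lit` holds the second-row grid points in the odd-numbered columns, matching the stated task of embodiment four.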
  • Embodiment five provides a method for generating and executing computer programs using physical programming objects and finger touch gestures.
  • FIG. 6 is an exemplary schematic diagram illustrating the process flow of the method. As shown in FIG. 6, the method includes the following steps.
  • Step 601 placing multiple physical programming objects upon an interactive surface.
  • the interactive surface includes an array of electrodes and an array of RF antennas.
  • Each physical programming object is visually marked with a programming symbol and embedded with an RFID tag.
  • Step 602 detecting by a processor the magnitude of the capacitive coupling between physical programming objects placed on the interactive surface and the array of electrodes of the interactive surface, and deriving the locations of the physical programming objects.
  • the processor recognizes the IDs of the physical programming objects through wireless communication between the array of RF antennas embedded in the interactive surface and the RFID tags embedded in the physical programming objects.
  • Step 603 touching the physical programming objects with a finger gesture.
  • Step 604 detecting by the processor the changes in the magnitude of the capacitive coupling between the physical programming objects and the array of electrodes as a result of the finger touch gesture acted on the physical programming objects, and further recognizing the finger touch gesture.
  • Step 605 deriving the programming symbols and their programming logic.
  • the processor derives the programming symbols based on the correlation relationships among and between physical programming objects’ IDs, programming symbols and programming codes stored in the first memory unit operatively linked to the processor.
  • the processor derives the programming logic of the programming symbols based on the correlation relationships between the finger touch gestures and the programming logic of the programming symbols stored in the second memory unit operatively linked to the processor.
  • the interactive surface is configured to display the programming logic of the programming symbols, and the display device can be LED lights, touch screens, or e-ink screens.
  • Step 606 producing by the processor a program code based on the programming symbols and the finger touch gesture.
  • Step 607 directing by the processor a sensory accessory to produce an output to indicate the progress of execution of the programming code.
  • the sensory accessory can be an LED light, an audio device, a video device, or a vibration generator device.
  • Step 608 stopping execution of the programming code if the result is wrong.
  • Step 609 debugging the programming code by changing the programming symbols and the finger touch gesture.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

A system and method are provided for physical programming on an interactive surface. The system includes multiple physical programming objects (1) with each object (1) visually marked with a programming symbol; an interactive surface (2) comprising an array of electrodes; a processor (3) operatively linked to the interactive surface (2), configured to detect the magnitude of the capacitive coupling between the physical programming objects (1) placed on the interactive surface (2) and the array of electrodes of the interactive surface (2), and further configured to identify the location information of the physical programming objects (1) as well as finger touch gestures made upon the programmable physical objects (1), and further configured to recognize the IDs of the physical programming objects (1); a first memory unit (4) operatively linked to the processor (3), configured to store correlation relationships among and between the objects' IDs, programming symbols and programming codes; a second memory unit (5) operatively linked to the processor (3), configured to store correlation relationships between the finger touch gestures and the programming logic of the programming symbols. Once the physical programming objects (1) are placed on the interactive surface (2) and a finger touch gesture is made upon the physical programming objects (1), the processor (3) is configured to produce a programming code based on the finger touch gesture and the programming symbols.

Description

SYSTEM AND METHOD FOR PHYSICAL PROGRAMMING ON AN INTERACTIVE SURFACE

TECHNICAL FIELD
The present invention relates to the field of human computer interaction (HCI) and, particularly, to a system and method for physical programming on an interactive surface.
BACKGROUND
With the rapid development of computer technology, more and more researchers are paying attention to computational thinking. Professor Jeannette Wing of Carnegie Mellon University has proposed that computational thinking will become a fundamental skill for everyone in the world in the 21st century. She further points out that computational thinking will become an ability each child should have, as essential as reading, writing and counting. Computational thinking helps develop children’s analytical skills and greatly benefits them in STEM (Science, Technology, Engineering, and Mathematics) as well as other areas, even daily life. Programming allows children to explore creative problems and teaches them the skills to solve them. Although computational thinking encompasses more than programming, it underlies the programming ability required in computer science and technology, and the two abilities reinforce each other.
Traditional programming is usually done by typing textual language through a keyboard. This type of programming is inconvenient for children to understand and use, primarily because of the language syntax, complex commands, and large amount of typing that traditional programming languages require. It is hard for children to memorize and understand professional knowledge such as language syntax, logical relationships and program architecture. Furthermore, for children not yet fluent in written characters, the text-editing mode of programming lacks intuition, so they cannot create their own programs the traditional way.
Graphical programming provides children with an attractive alternative. It converts a variety of programming concepts into graphics that can be displayed on a screen, so children only need to drag the graphics to complete the whole process of programming. Physical programming (or physical object programming) is considered a branch of graphical programming. The difference is that physical programming is not restricted by computer screens: the user programs by interacting with physical objects through techniques such as physical touch and physical perception, converting physical logic into programming logic. Compared with manipulating a computer, manipulating physical objects keeps children engaged throughout the whole process. Through a physical programming system, children can understand the logic of programming languages more intuitively. Programs are no longer lines of boring code, but combinations of physical objects. By combining physical objects, children can accomplish work that would otherwise require typing on a keyboard in conventional programming languages. These characteristics make physical programming particularly well suited to children.
At present, there has been some physical programming work in which users place physical programming objects according to a selected physical programming task, or the execution rules of a task, to form a sequence of physical programming objects. This approach imposes many restrictions on the positions of the physical programming objects.
SUMMARY OF INVENTION
To solve the problem above, the present invention provides a method and system for physical programming on an interactive surface, using finger touch gestures to express the programming logic of programming symbols for the physical programming objects.
The system for physical programming provided in the present invention includes multiple physical programming objects, with each visually marked with a programming symbol; an interactive surface comprising an array of electrodes; a processor operatively linked to the interactive surface, configured to detect the magnitude of the capacitive coupling between physical programming objects placed on the interactive surface and the array of electrodes of the interactive surface, and further configured to identify the location of the physical programming objects as well as finger touch gestures made upon the programmable physical objects, and further configured to recognize the IDs of the physical programming objects; a first memory unit operatively linked to the processor and configured to store correlation relationships among and between IDs, programming symbols and programming codes; and a second memory unit operatively linked to the processor and configured to store correlation relationships between the finger touch gestures and the programming logic of the programming symbols.
Once the programming physical objects are placed on the interactive surface and a finger touch gesture is made upon the physical programming objects, the processor is configured to produce a programming code based on the finger touch gesture and the programming symbols.
Further, the processor is configured to recognize the IDs of the physical programming objects placed on the interactive surface through wireless communication between an array of RF antennas embedded in the interactive surface and the RFID tags embedded in the physical programming objects.
Further, the finger touch gesture made upon the physical programming objects can be a light finger touch, a double tap, or a pressing finger touch.
Further, the system includes a sensory accessory that can be an LED light, an audio device, a video device, or a vibration generator device. The processor is configured to direct the sensory accessory to produce an output to indicate the progress of execution of the programming code.
Further, the interactive surface is configured to display the programming logic of the programming symbols, and the display device can be LED lights, touch screens, or e-ink screens.
The method of the present invention includes the following steps:
1) placing multiple physical programming objects upon an interactive surface, and each physical programming object is visually marked with a programming symbol, and the interactive surface includes an array of electrodes;
2) detecting, by a processor operatively linked to the interactive surface, the magnitude of the capacitive coupling between the physical programming objects placed on the interactive surface and the array of electrodes of the interactive surface, and identifying the location information of the physical programming objects, and further recognizing the IDs of the physical programming objects;
3) touching with a finger gesture the physical programming objects;
4) detecting, by the processor, the changes in the magnitude of the capacitive coupling between the physical programming objects and the array of electrodes, and further recognizing the finger touch gesture;
5) producing, by the processor, a programming code based on the finger touch gesture and the programming symbols, in accordance with the correlation relationships among and between physical programming objects’ IDs, programming symbols and programming codes stored in a first memory unit operatively linked to the processor and the correlation relationships between the finger touch gestures and the programming logic of the programming symbols stored in a second memory unit operatively linked to the processor.
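Step 5 above can be sketched in miniature. In this hypothetical sketch, a sequence of recognized (object ID, gesture) pairs is resolved against an assumed first-memory-unit table, and a light finger touch is assumed to denote the sequence structure per the second memory unit; all names are illustrative, not part of the invention:

```python
SYMBOL_TABLE = {                  # first memory unit (assumed contents)
    "obj-down": "down ()",
    "obj-light": "light ()",
}
SEQUENCE_GESTURE = "light finger touch"  # second memory unit: sequence logic

def produce_code(touches):
    """touches: [(object_id, gesture), ...] in the order gestures were made.
    Emits one code line per object, wrapped in a main () body."""
    lines = ["main ()", "{"]
    for object_id, gesture in touches:
        if gesture == SEQUENCE_GESTURE:   # sequence structure: emit in order
            lines.append(SYMBOL_TABLE[object_id])
    lines.append("}")
    return "\n".join(lines)
```

The key point the sketch shows is that the gesture, not the placement order, determines how each object's stored code is assembled.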
Further, the method includes recognizing by the processor the IDs of the physical programming objects placed on the interactive surface through wireless communication between an array of RF antennas embedded in the interactive surface and the RFID tags embedded in the physical programming objects.
Further, the method includes directing by the processor a sensory accessory to produce an output to indicate the progress of execution of the programming code, and the sensory accessory can be an LED light, an audio device, a video device, or a vibration generator device.
Further, the method includes displaying by the interactive surface the programming logic of the programming symbols, and the display device can be LED lights, touch screens, or e-ink screens.
The present invention provides a method and system for physical programming on an interactive surface that uses finger touch gestures to represent the programming logic of the programming symbols of the physical programming objects. This physical programming system is more convenient to use and makes it easier for children and newcomers to learn to program.
BRIEF DESCRIPTION OF THE DRAWINGS
To better illustrate the technical features of the embodiments of the present invention, various embodiments of the present invention will be briefly described in conjunction with the accompanying drawings. It should be obvious that the drawings  are only for exemplary embodiments of the present invention, and that a person of ordinary skill in the art may derive additional drawings without deviating from the principles of the present invention.
FIG. 1 is an exemplary schematic diagram illustrating the physical programming system in accordance with one embodiment of the present invention.
FIG. 2A is an exemplary schematic diagram illustrating a physical programming object whose programming symbol represents subroutine functions in accordance with one embodiment of the present invention.
FIG. 2B is an exemplary schematic diagram illustrating a physical programming object whose programming symbol represents movement modes in accordance with one embodiment of the present invention.
FIG. 2C is an exemplary schematic diagram illustrating a physical programming object whose programming symbol represents actions in accordance with one embodiment of the present invention.
FIG. 2D is an exemplary schematic diagram illustrating a physical programming object whose programming symbol represents grid positions in accordance with one embodiment of the present invention.
FIG. 3 is an exemplary schematic diagram illustrating lighting up LED lights placed in the third row on the interactive surface in accordance with one embodiment of the present invention.
FIG. 4 is an exemplary schematic diagram illustrating lighting up LED lights placed in the first and the third rows on the interactive surface in accordance with one embodiment of the present invention.
FIG. 5 is an exemplary schematic diagram illustrating lighting up LED lights placed in the second row on the interactive surface in accordance with one embodiment of the present invention.
FIG. 6 is an exemplary schematic diagram illustrating the process flow of the physical programming method in accordance with one embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
Reference will now be made in detail to various embodiments of the invention, examples of which are illustrated in the accompanying drawings. While the invention will be described in conjunction with the embodiments, it will be understood that this is not intended to limit the scope of the invention to these specific embodiments. The invention is intended to cover all alternatives, modifications and equivalents within the spirit and scope of the invention, which is defined by the appended claims.
Furthermore, in the detailed description of the present invention, specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be obvious to one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well known methods, procedures, components, and circuits are not described in detail to avoid unnecessarily obscuring a clear understanding of the present invention.
Embodiment One
Embodiment one provides a system for generating and executing computer programs using physical programming objects and finger touch gestures. FIG. 1 is an exemplary schematic diagram illustrating the physical programming system in accordance with one embodiment of the present invention. The physical programming system includes multiple physical programming objects 1, an interactive surface 2, a processor 3, a first memory unit 4, and a second memory unit 5.
Each physical programming object 1 is visually marked with a programming symbol bearing a distinct pattern, and each physical programming object 1 further includes an RFID tag. A physical programming object 1 can be a card, a button, an icon, a sheet, or a statue. The interactive surface 2 in FIG. 1 includes an array of electrodes and an RF antenna array. In this embodiment, the electrodes can be made of metal materials, such as aluminum or copper. The materials that make up a physical programming object 1 can capacitively couple with the electrodes.
Processor 3 is operatively linked to the interactive surface 2 and can detect the magnitude of the capacitive coupling between the physical programming objects 1 placed on the interactive surface and the array of electrodes of interactive surface 2, and further derive the location of the physical programming objects 1. The processor 3 can recognize the IDs of the physical programming objects through wireless communication between the RFID tags of the physical programming objects 1 on the interactive surface and the RF antenna array of the interactive surface 2. Once a finger touch gesture is applied to a physical programming object 1, the magnitude of the capacitive coupling between the physical programming object 1 and the interactive surface 2 changes; the processor 3 detects such changes and then recognizes the finger touch gesture. The finger touch gesture made upon a physical programming object can be a light finger touch, a double tap, or a pressing finger touch. A light finger touch refers to rapidly touching a physical programming object once; a double tap refers to rapidly tapping a physical programming object twice in succession; a pressing finger touch refers to resting a finger on a physical programming object for a period of time.
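The three gestures differ only in touch duration and tap count, so a recognizer can classify them from the timing of the capacitance changes. A minimal sketch follows; the threshold values are assumptions, since the disclosure does not specify them:

```python
LIGHT_TOUCH_MAX = 0.3   # seconds; assumed upper bound for a "rapid" touch
DOUBLE_TAP_GAP = 0.4    # seconds; assumed max gap between two taps

def classify_gesture(intervals):
    """intervals: list of (start, end) times during which the capacitive
    coupling on one object was altered by a finger, in chronological order."""
    if len(intervals) >= 2:
        (s1, e1), (s2, e2) = intervals[-2], intervals[-1]
        # Two short touches in quick succession -> double tap.
        if (e1 - s1) <= LIGHT_TOUCH_MAX and (e2 - s2) <= LIGHT_TOUCH_MAX \
                and (s2 - e1) <= DOUBLE_TAP_GAP:
            return "double tap"
    start, end = intervals[-1]
    if end - start <= LIGHT_TOUCH_MAX:
        return "light finger touch"
    return "pressing finger touch"   # finger rested for a period of time
```

In practice the thresholds would be tuned to the electrode scan rate of the surface.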
The first memory unit 4 is operatively linked to the processor 3 and stores the relationships among and between the physical programming objects’ IDs, programming symbols and programming codes. The second memory unit 5 is operatively linked to the processor 3 and stores the relationships between the finger touch gestures and the programming logic of the programming symbols. Users can define the programming logic associated with each finger touch gesture according to their own preferences; the second memory unit 5 also holds an initial value (i.e., the default relationship between finger touch gestures and the programming logic of the programming symbols) that users can restore. To program with the first physical programming object 1, the user applies a finger touch gesture to it; likewise, to program with the Nth physical programming object 1 (where N is an integer greater than 1), the user applies a finger touch gesture to that Nth object. Once multiple physical programming objects 1 are placed on the interactive surface and finger touch gestures are made on them, processor 3 can generate a programming code based on the programming symbols of the physical programming objects 1 and the finger touch gestures. The interactive surface 2 is configured to display the programming logic of the programming symbols, and the display device can be LED lights, touch screens, or e-ink screens.
The physical programming system shown in FIG. 1 further includes a sensory accessory 11. The sensory accessory 11 can be an LED light, an audio device, a video device, or a vibration generator device. The processor 3 is configured to direct the sensory accessory to produce an output to indicate the progress of execution of the programming code.
Embodiment Two
Embodiment two provides an example of physical programming using the physical programming system of the present invention.
FIGs. 2A, 2B, 2C and 2D show the physical programming objects used in embodiments two, three and four. As shown in FIG. 2A, programming symbols such as “P1” and “P2” are used to represent subroutine functions, and “=” is used for defining a subroutine function. As shown in FIG. 2B, the programming symbols represent movement modes that include moving upward, moving downward, moving left, and moving right. As shown in FIG. 2C, the programming symbols represent actions such as lighting up and not lighting up. As shown in FIG. 2D, the programming symbols represent locations of grid points; for example, “odd” represents a grid point with an odd column number.
The programming task in embodiment two is to light up the LED lights 10 located at the intersections of the third row and all columns with odd numbers (note that only one LED light is marked with reference number 10 in FIG. 3; the other LED lights are not marked). The first grid point in the first row is the starting position 9. As shown in FIG. 3, the interactive surface 2 is divided into a display area and a programming area. There are five rows of grid points in the display area, with each row having nine grid points. In embodiment two, as stored in the second memory unit, a light finger touch corresponds to the sequence structure, which means that the programming symbols of the physical programming objects will be executed in sequence.
Once multiple physical programming objects 1 are placed in the programming area, the processor 3 operatively linked to the interactive surface 2 recognizes the IDs and location information of the physical programming objects 1 on the interactive surface 2. In accordance with the programming task, the logical button 8 is pressed, and light finger touches are applied to the physical programming objects 1 and recognized by the processor 3. The interactive surface 2 can display the programming logic, and the display device can be LED lights, touch screens, or e-ink screens. As shown in FIG. 3, the physical programming objects 1 on the interactive surface are connected with straight or curved arrow lines, and the direction of the arrows indicates the sequence of the finger touch gestures.
The processor 3 derives the programming symbols based on the correlation relationships among and between the physical programming objects’ IDs, programming symbols and programming codes stored in the first memory unit operatively linked to the processor, and derives the programming logic of the programming symbols based on the correlation relationships between the finger touch gestures and the programming logic of the programming symbols stored in the second memory unit operatively linked to the processor. The processor 3 thus generates a programming code based on the physical programming objects and the finger touch gestures.
By pressing the start button 6, a user can execute the computer program he or she created and see the execution result of the programming code through a display device (in embodiment two, the display device is an LED light). If the user wants to debug the computer program, he or she can press the stop button 7 and adjust the physical programming objects 1 and the finger touch gestures. The physical programming system further includes a sensory accessory, and the processor is configured to direct the sensory accessory to produce an output to indicate the progress of execution of the programming code. For example, each physical programming object 1 is connected to an LED light; once the programming code represented by a physical programming object 1 is executed, the corresponding LED light is lit up. The user can thus observe the execution of the programming code, which makes debugging easier.
As shown in FIG. 3, physical programming objects 1 and finger touch gestures are used to program on the interactive surface 2 to light up the LED lights 10 located at the intersections of the third row and all columns with odd numbers. The programming logic of the physical programming objects used in subroutine function P1 is as follows: lighting up, moving right, moving right. The programming logic of the physical programming objects used in the main function is: moving downward, moving downward, calling subroutine function P1, calling subroutine function P1, calling subroutine function P1, calling subroutine function P1, and lighting up. The programming code generated based on the physical programming objects 1 and the finger touch gestures in FIG. 3 is as follows:
main ()
{
down ()
down ()
proc1 ()
proc1 ()
proc1 ()
proc1 ()
light ()
}
proc1 ()
{
light ()
right ()
right ()
}
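The generated code above can be checked by executing it. The following runnable sketch simulates it on the 5-row, 9-column display grid and confirms that exactly the third-row, odd-column LEDs are lit; the function names mirror the generated pseudo-code, while the grid model itself is an assumption made for the sketch:

```python
lit = set()     # grid points whose LEDs have been lit
pos = [1, 1]    # [row, column]; starting position: first grid point, first row

def light():
    lit.add(tuple(pos))       # light the LED at the current grid point

def down():
    pos[0] += 1               # move one row downward

def right():
    pos[1] += 1               # move one column to the right

def proc1():                  # subroutine P1: lighting up, right, right
    light(); right(); right()

def main():                   # mirrors the generated main () listing
    down(); down()
    proc1(); proc1(); proc1(); proc1()
    light()

main()
```

After `main()` runs, `lit` holds the five grid points (3, 1), (3, 3), (3, 5), (3, 7) and (3, 9), matching the stated task.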
Embodiment Three
Embodiment three provides another example of physical programming using the physical programming system of the present invention. In the embodiment, as stored in the second memory unit, a light finger touch corresponds to the sequence structure, which means that the programming symbols of each physical programming object will be executed in sequence.
The locations of all physical programming objects 1 used in embodiment three are the same as those in embodiment two, but the sequence of the finger touch gestures made on the physical programming objects 1 has been changed. As a result, LED lights at different intersection grid points are lit up. As shown in FIG. 4, physical programming objects 1 and finger touch gestures are used to program on the interactive surface 2 to light up three LED lights 10 in the first row and two LED lights in the third row. The programming logic of the physical programming objects used in subroutine function P1 is as follows: lighting up, moving right, moving right. The programming logic of the physical programming objects used in the main function is: calling subroutine function P1, calling subroutine function P1, lighting up, moving downward, moving downward, calling subroutine function P1, and calling subroutine function P1. The programming code generated based on the physical programming objects 1 and the finger touch gestures in FIG. 4 is as follows.
main ()
{
proc1 ()
proc1 ()
light ()
down ()
down ()
proc1 ()
proc1 ()
}
proc1 ()
{
light ()
right ()
right ()
}
Embodiment Four
Embodiment four provides another example of physical programming using the physical programming system of the present invention. In this embodiment, as stored in the second memory unit, a light finger touch corresponds to the sequence structure as well as to the execution, in sequence, of the programming symbols of the physical programming objects on a branch where the branch condition is met; a pressing finger touch marks the programming symbol of a physical programming object that acts as a branch point; and a double tap corresponds to the execution, in sequence, of the programming symbols of the physical programming objects on a branch where the branch condition is not met. When a branch structure is created, the sequence of the finger touch gestures is as follows: a pressing finger touch applied to the physical programming object 1 representing the branch point; light finger touches applied to the physical programming objects 1 on the branch where the branch condition is met; a pressing finger touch applied again to the physical programming object 1 representing the branch point; and double taps applied to the physical programming objects 1 on the branch where the branch condition is not met.
As shown in FIG. 5, physical programming objects 1 and finger touch gestures are used to program on the interactive surface 2 to light up the LED lights 10 located at the intersections of the second row and all columns with odd numbers. When placing the physical programming objects 1, users can choose the orientation of the objects based on their own preferences. The physical programming objects 1 that represent branch points are placed along a first direction and the other physical programming objects are placed along a second direction, which makes the programming logic of the programming symbols of the physical programming objects 1 easier to follow.
When the subroutine function P2 is defined, the sequence of the finger touch gestures is as follows: a light finger touch applied to the physical programming object with the programming symbol “P2”; a light finger touch applied to the physical programming object with the programming symbol “=”; a pressing finger touch applied to the physical programming object with the programming symbol “odd”; a light finger touch applied to the physical programming object with the programming symbol “lighting up”; a light finger touch applied to the physical programming object with the programming symbol “moving right”; a pressing finger touch applied to the physical programming object with the programming symbol “odd”; a double tap applied to the physical programming object with the programming symbol “not lighting up”; and a double tap applied to the physical programming object with the programming symbol “moving right”. The programming logic of the physical programming objects used in subroutine function P2 is as follows: lighting up if an LED light is located at the intersection of the Nth row and a column with an odd number; moving right; not lighting up if an LED light is located at the intersection of the Nth row and a column with an even number; moving right. The programming logic of the physical programming objects used in the main function is as follows: moving downward, followed by calling subroutine function P2 nine times. The programming code generated from the physical programming objects 1 and the finger touch gestures in FIG. 5 is as follows:
main ()
{
down ()
proc2 ()
proc2 ()
proc2 ()
proc2 ()
proc2 ()
proc2 ()
proc2 ()
proc2 ()
proc2 ()
}
proc2 ()
{
if (type==odd)
{
light ()
right ()
}
else
{
no-light ()
right ()
}
}
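As with embodiment two, the branch-structured code above can be checked by executing an equivalent sketch. Here the “odd” branch condition is modeled as a parity test on the current column number; the grid model and function names are assumptions that mirror the generated pseudo-code:

```python
lit = set()     # grid points whose LEDs have been lit
pos = [1, 1]    # [row, column]; starting at the first grid point, first row

def down():
    pos[0] += 1

def right():
    pos[1] += 1

def light():
    lit.add(tuple(pos))

def no_light():
    pass                      # explicitly leave the current LED off

def proc2():                  # subroutine P2 with its branch structure
    if pos[1] % 2 == 1:       # "odd": branch condition met (light-touch branch)
        light(); right()
    else:                     # branch condition not met (double-tap branch)
        no_light(); right()

def main():                   # mirrors the generated main () listing
    down()
    for _ in range(9):        # the nine proc2 () calls
        proc2()

main()
```

Running it lights exactly the second-row LEDs in columns 1, 3, 5, 7 and 9, as the task requires.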
Embodiment Five
Embodiment five provides a method for generating and executing computer programs using physical programming objects and finger touch gestures. FIG. 6 is an exemplary schematic diagram illustrating the process flow of the method. As shown in FIG. 6, the method includes the following steps.
Step 601: placing multiple physical programming objects upon an interactive surface. The interactive surface includes an array of electrodes and an array of RF antennas. Each physical programming object is visually marked with a programming symbol and embedded with an RFID tag.
Step 602: detecting by a processor the magnitude of the capacitive coupling between physical programming objects placed on the interactive surface and the array of electrodes of the interactive surface, and deriving the locations of the physical programming objects. The processor recognizes the IDs of the physical programming objects through wireless communication between the array of RF antennas embedded in the interactive surface and the RFID tags embedded in the physical programming objects.
Step 603: touching the physical programming objects with a finger gesture.
Step 604: detecting by the processor the changes in the magnitude of the capacitive coupling between the physical programming objects and the array of electrodes as a result of the finger touch gesture acted on the physical programming objects, and further recognizing the finger touch gesture.
Step 605: deriving the programming symbols and their programming logic. The processor derives the programming symbols based on the correlation  relationships among and between physical programming objects’IDs, programming symbols and programming codes stored in the first memory unit operatively linked to the processor. And the processor derives the programming logic of the programming symbols based on the correlation relationships between the finger touch gestures and the programming logic of the programming symbols stored in the second memory unit operatively linked to the processor. The interactive surface is configured to display the programming logic of the programming symbols, and the display device can be LED lights, touch screens, or e-ink screens.
Step 606: producing by the processor a program code based on the programming symbols and the finger touch gesture.
Step 607: directing by the processor a sensory accessory to produce an output to indicate the progress of execution of the programming code. The sensory accessory can be an LED light, an audio device, a video device, or a vibration generator device.
Step 608: stopping execution of the programming code if the result is wrong.
Step 609: debugging the programming code by changing the programming symbols and the finger touch gesture.

Claims (10)

  1. A system for physical programming on an interactive surface, comprising:
    -a plurality of physical programming objects with each physical programming object visually marked with a programming symbol;
    -an interactive surface comprising an array of electrodes;
    -a processor operatively linked to the interactive surface, configured to detect the magnitude of the capacitive coupling between physical programming objects placed on the interactive surface and the array of electrodes of the interactive surface, and further configured to identify the location of the physical programming objects as well as finger touch gestures made upon the programmable physical objects, and further configured to recognize the IDs of the physical programming objects;
    -a first memory unit operatively linked to the processor, configured to store correlation relationships among and between IDs, programming symbols and programming codes;
    -a second memory unit operatively linked to the processor, configured to store correlation relationships between the finger touch gestures and the programming logic of the programming symbols;
    wherein, upon a plurality of physical programming objects being placed on the interactive surface and a finger touch gesture being made upon the physical programming objects, the processor is configured to produce a programming code based on the finger touch gesture and the programming symbols.
  2. The system of claim 1, wherein the processor is further configured to recognize the IDs of the physical programming objects placed on the interactive surface through wireless communication between an array of RF antennas embedded in the interactive surface and the RFID tags embedded in the physical programming objects.
  3. The system of claim 1, wherein the finger touch gesture made upon the programmable physical objects is selected from a group consisting of a light finger touch, a double tap and a pressing finger touch.
  4. The system of claim 1, further comprising a sensory accessory selected from a group consisting of an LED light, an audio device, a video device, and a vibration  generator device, wherein the processor is configured to direct the sensory accessory to produce an output to indicate the progress of execution of the programming code.
  5. The system of claim 1, wherein the interactive surface is configured to display the programming logic of the programming symbols, and wherein the display device is selected from a group consisting of LED lights, touch screens and e-ink screens.
  6. A method for physical programming on an interactive surface, comprising:
    -placing a plurality of physical programming objects upon an interactive surface, wherein each physical programming object is visually marked with a programming symbol and the interactive surface comprises an array of electrodes;
    -detecting, by a processor operatively linked to the interactive surface, the magnitude of the capacitive coupling between the physical programming objects placed on the interactive surface and the array of electrodes of the interactive surface, and identifying the location information of the physical programming objects, and further recognizing the IDs of the physical programming objects;
    -touching with a finger gesture the plurality of physical programming objects;
    -detecting, by the processor, the changes in the magnitude of the capacitive coupling between the physical programming objects and the array of electrodes and further recognizing the finger touch gesture;
    -producing, by the processor, a programming code based on the finger touch gesture and the programming symbols, in accordance with the correlation relationships among and between physical programming objects’ IDs, programming symbols and programming codes stored in a first memory unit operatively linked to the processor and the correlation relationships between the finger touch gestures and the programming logic of the programming symbols stored in a second memory unit operatively linked to the processor.
  7. The method of claim 6, further comprising, recognizing by the processor the IDs of the physical programming objects placed on the interactive surface through wireless communication between an array of RF antennas embedded in the interactive surface and the RFID tags embedded in the physical programming objects.
  8. The method of claim 6, wherein the finger touch gesture made upon the programmable physical objects is selected from a group consisting of a light finger touch, a double tap and a pressing finger touch.
  9. The method of claim 6, further comprising, directing by the processor a sensory accessory to produce an output to indicate the progress of execution of the programming code, and wherein the sensory accessory is selected from a group consisting of an LED light, an audio device, a video device, and a vibration generator device.
  10. The method of claim 6, further comprising, displaying by the interactive surface the programming logic of the programming symbols, and wherein the display device is selected from a group consisting of LED lights, touch screens and e-ink screens.
PCT/CN2016/085973 2015-06-17 2016-06-16 System and method for physical programming on interactive surface WO2016202270A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510335980.5A CN104991640B (en) 2015-06-17 2015-06-17 Programing system in kind and method on interactive interface
CN2015103359805 2015-06-17



Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080084271A1 (en) * 2006-10-06 2008-04-10 Denny Jaeger Continuous variable wireless data input to RFID reader
US7770136B2 (en) * 2007-01-24 2010-08-03 Microsoft Corporation Gesture recognition interactive feedback
CN102136208A (en) * 2011-03-30 2011-07-27 中国科学院软件研究所 Material object programming method and system
CN102800223A (en) * 2012-07-19 2012-11-28 中国科学院软件研究所 Collaborative entity programming method
CN103197929A (en) * 2013-03-25 2013-07-10 中国科学院软件研究所 System and method for graphical programming facing children
CN103456203A (en) * 2013-09-12 2013-12-18 中国科学院软件研究所 Portable physical programming method and system
CN103793176A (en) * 2014-02-27 2014-05-14 朱印 Method and device for fast switching between application programs
CN104216646A (en) * 2013-05-30 2014-12-17 华为软件技术有限公司 Method and device for creating an application program based on gestures
CN104991640A (en) * 2015-06-17 2015-10-21 施政 Physical programming system and method on an interactive interface

Also Published As

Publication number Publication date
CN104991640A (en) 2015-10-21
CN104991640B (en) 2018-03-27

Similar Documents

Publication Publication Date Title
WO2016202270A1 (en) System and method for physical programming on interactive surface
KR102241618B1 (en) Device operating according to the pressure state of a touch input, and method thereof
Foley et al. The art of natural man-machine conversation
US8125440B2 (en) Method and device for controlling and inputting data
US9250738B2 (en) Method and system for assigning the position of a touchpad device
US20160202903A1 (en) Human-Computer Interface for Graph Navigation
US20110209087A1 (en) Method and device for controlling an inputting data
US20120280927A1 (en) Simple touch interface and hdtp grammars for rapid operation of physical computer aided design (cad) systems
CN104704451A (en) Provision of haptic feedback for localization and data input
JP2013527539A5 (en)
CN109074224A (en) For the method for insertion character and corresponding digital device in character string
US20160124633A1 (en) Electronic apparatus and interaction method for the same
CN104866097A (en) Hand-held signal output apparatus and method for outputting signals from hand-held apparatus
KR20190065746A (en) Electronic apparatus, method for controlling thereof and the computer readable recording medium
Lee et al. From seen to unseen: Designing keyboard-less interfaces for text entry on the constrained screen real estate of Augmented Reality headsets
Goguey et al. Leveraging finger identification to integrate multi-touch command selection and parameter manipulation
US20140173522A1 (en) Novel Character Specification System and Method that Uses Remote Selection Menu and Touch Screen Movements
Tada et al. Tangible programming environment using paper cards as command objects
KR20110049617A (en) Method and medium for inputting korean characters for touch screen
KR101269842B1 (en) Input method of letter in touch screen
CN204740560U (en) Handheld signal output device
KR101568716B1 (en) Korean language input device using using drag type
TW201447733A (en) Braille input method based on touch track
KR20100045617A (en) Korean alphabet input method utilizing a multi-touch sensing touch screen
KR20150131662A (en) Enlarging the condition of combination of characters when inputting text

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 16811011; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: PCT application non-entry in European phase (Ref document number: 16811011; Country of ref document: EP; Kind code of ref document: A1)