GB2395345A - Sentence teaching system and display control. - Google Patents
- Publication number
- GB2395345A (application GB0226880A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- text
- store
- bigedit
- window
- display
- Prior art date
- Legal status
- Withdrawn
Classifications
- G09B5/02 (Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip)
- G09B13/00 (Teaching typing)
- G09B19/0053 (Computers, e.g. programming; under G09B19/00, teaching not covered by other main groups of this subclass)
Abstract
A teaching system comprises a display system for displaying text and an input for selecting and manipulating text. Text is stored as passages of text for display, and a processor is arranged to allow portions of text to be selected, placed in a buffer, manipulated and transferred back to the positions from which they were extracted. The system is particularly suited to teaching sentence construction.
Description
TEACHING SYSTEM AND DISPLAY CONTROL
FIELD OF THE INVENTION
The present invention relates to teaching systems, and in particular to a system to facilitate the teaching of sentence editing.
BACKGROUND OF THE INVENTION
In order to rework a sentence within a passage of text, a teacher would traditionally copy the sentence onto another board and then have to rewrite the sentence each time they wanted to create a new iteration. The final iteration would then need to be copied back into the original text, which could be difficult, or impossible, if the length of the text had changed.
This process is time consuming and can disrupt the flow of a lesson, as much of the teacher's time is spent rewriting the original sentence.
It is, of course, known to edit text for document printing using any one of a plethora of word processing systems.
However, these are not suitable in a teaching environment, and so do not constitute teaching systems. In particular, such word processing systems can only conduct operations such as copying and pasting text.
SUMMARY OF THE INVENTION
We have appreciated the need for a better user interface and storage and display arrangement to assist in the teaching of sentence structure.
The invention is defined in the claims to which reference is directed. Preferred features are set out in the dependent claims.
The embodiment of the invention provides a technical advantage over known systems: a store and display area for selected text, from which a selected iteration can be simply transferred back to the passage of text from which the selected text was taken.
The invention is designed for use with ICT technologies such as electronic whiteboards.
The teacher selects part of a passage of text and clicks the 'BigEdit' tool. A window is displayed which shows a larger version of the text and a copy of the text in an editable area. The teacher can then edit the text in the editable area to create a new iteration of the text.
For example, 'the cat sat on the mat' might be changed to 'the big cat sat on the mat' for the first iteration.
The teacher can then create any number of new iterations, which appear in the editable area. In this way pupils can see how an initially simple sentence can be worked through and improved.
BRIEF DESCRIPTION OF THE FIGURES
An embodiment of the invention will now be described by way of example only and with reference to the accompanying figures, in which:

Figure 1: is a diagram showing how the user transfers the text to be BigEdited.
Figure 2: shows the BigEdit window and an example of a set of iterations on some example text.
Figure 3: is a flow diagram showing what happens when the BigEdit button is pressed.
Figure 4: is a flow diagram showing what happens when a BigEdit class is constructed. This is triggered at the end of Figure 3.
Figure 5: is a flow diagram showing the events that occur when the BigEdit window receives a repaint event from the Operating System.
Figure 6: is a flow diagram showing what happens when the BigEdit window receives a mouse click event from the Operating System.
Figure 7: is a flow diagram showing specifically what happens when the final text iteration is accepted.
Figure 8: is a block diagram of the main functional components.

DESCRIPTION OF AN EMBODIMENT
The embodiment is implemented on a Microsoft Windows platform, but could equally be implemented on any event-driven graphical operating system.
A system embodying the invention is shown in Figure 8 and comprises a display screen 2, an input device 4 and a store 6, which can be logically split into a text store and a buffer store. A processor 8 runs a subroutine to process text and perform operations as will be described.
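The split between a text store and a buffer store can be sketched as follows. This is an illustrative Python sketch only: the class and method names are hypothetical, and the embodiment itself is an event-driven Windows program, not Python.

```python
# A minimal sketch of the store (6) of Figure 8, logically split into a
# text store and a buffer store. All names here are hypothetical.

class TextStore:
    """Holds the passage of text displayed on the main document."""
    def __init__(self, passage):
        self.passage = passage

    def replace(self, selected, latest):
        # Swap the originally selected portion for the latest iteration.
        self.passage = self.passage.replace(selected, latest, 1)

class BufferStore:
    """Holds the selected portion and its successive iterations."""
    def __init__(self, selected):
        self.iterations = [selected]  # the selection is the first iteration

    def latest(self):
        return self.iterations[-1]
```

Using the example from the summary: selecting 'the cat' from 'the cat sat on the mat', appending the iteration 'the big cat' and transferring the latest iteration back yields 'the big cat sat on the mat' in the text store.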
Figure 1 shows how a section of a piece of text can be selected and displayed in the BigEdit window. 1.1 shows the text which the user has selected. This selection is stored as a property of the text object on the document.
The user presses 1.2, which copies the original text into the BigEdit window for refinement (see Figure 3 and accompanying description below).
The BigEdit window is shown in Figure 2. The selected text is displayed at the top of the window (2.1). The BigEdit document (2.2) is displayed on the BigEdit window.
The first iteration is automatically copied into the top of the document window (2.3), where it can be edited.
Clicking the 'New' button (2.4) adds a new iteration (2.5) which is initially a copy of the previous iteration (2.3).
The new iteration can then be edited. Clicking 'Accept' (2.7) replaces the selected text on the main document (1.1) with the last iteration.
Figure 3 shows what happens when a mouse click event is sent to the application window from the operating system, and specifically what happens when the BigEdit button is clicked. At step 3.1, a mouse click event is posted from the operating system to the main application. At step 3.2, the position of the mouse is interrogated and used to determine which button on the application has been pressed. At step 3.3, the BigEdit button event handler is called; if another button had been clicked, the appropriate event handler for that button would be called here.

At step 3.4, a pointer to the selected text on the main document is obtained. At step 3.5, this text is copied into a temporary storage buffer, and at step 3.6 a BigEdit object is created and the temporary storage buffer is passed to the object's constructor.
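The click-dispatch flow of Figure 3 can be sketched as below. This is a hedged illustration: the function names, the button-rectangle representation and the dictionary document are all invented for this sketch, and the real embodiment receives these events as operating-system messages rather than direct function calls.

```python
# Illustrative sketch of Figure 3's click dispatch. All names are
# hypothetical; the patent's embodiment receives OS event messages.

class BigEdit:
    """Steps 3.6 and 4.3: the constructor copies the text buffer into
    the object's own data structure."""
    def __init__(self, text_buffer):
        self.text = text_buffer

def big_edit_handler(document):
    # Step 3.4: obtain the selected text on the main document.
    # Step 3.5: copy it into a temporary storage buffer.
    buffer = document["selected"]
    # Step 3.6: construct the BigEdit object from the buffer.
    return BigEdit(buffer)

def on_mouse_click(pos, buttons, document):
    # Steps 3.1-3.2: the OS posts the click; the mouse position is
    # compared against each button's rectangle to find the target.
    x, y = pos
    for name, (x0, y0, x1, y1), handler in buttons:
        if x0 <= x <= x1 and y0 <= y <= y1:
            # Step 3.3: call the matching button's event handler.
            return handler(document)
    return None  # the click landed on no button
```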
Figure 4 shows the flow of events that occur when a BigEdit object is created. At step 4.1, the text buffer is passed into the object's constructor. At step 4.3, the text buffer parameter is copied into the object's data structure. Step 4.5 creates a new document and attaches it onto the BigEdit window. A pointer to the new document is stored in the object's data structure.
At step 4.6, a text object is constructed and placed at the top of the BigEdit document. The text object is set to display the text in the BigEdit text buffer. At step 4.7, a message is posted to the operating system to redraw the BigEdit window.
Figure 5 shows the steps which occur when a repaint event is sent from the operating system to the BigEdit window.
At step 5.1 the message is posted to the window. At step 5.2, the background image which sits behind the BigEdit window is rendered onto its Device Context (DC).

At step 5.3, the text to be displayed at the top of the window (2.1) is extracted from the object's data structure. At step 5.4, the text is rendered onto the window's DC.
Figure 6 shows the process that happens when a mouse click event is posted to the BigEdit window. At step 6.1 the operating system posts the click event to the BigEdit window. At step 6.2 the mouse position is interrogated and compared against the positions of each of the buttons on the window. At step 6.3, the 'Add iteration' button event handler is called. Temporary stores for an initial position and a text object pointer are initialised to 0 and NULL respectively at step 6.4. At step 6.5 we start to iterate through all the text objects on the BigEdit document. At step 6.6, the position of the text object we are currently looking at is compared with the position in the temporary store. If the y value in the temporary store is less than the position of the text object, we go to step 6.10, which updates the two stores with the new data.
At step 6.7 we check if there are any more text objects on the document. If there are, we go back to step 6.5.
At step 6.8, we create a new text object on the BigEdit document at a position below the position in the temporary store. At step 6.9, the text string associated with the text object pointed to by the temporary store is extracted. At step 6.11, this text is copied into the text buffer of the newly created text object. At step 6.12 the areas of the screen which need redrawing are flagged, and at step 6.13 a message is sent to the operating system to trigger the redraw of these areas.
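Under the assumption that each text object can be modelled as a (y, text) pair, with a larger y value meaning lower down the document, the 'New' iteration scan of steps 6.4 to 6.11 might look like the sketch below. The names and the LINE_SPACING constant are illustrative, not taken from the patent.

```python
# Illustrative sketch of the 'New' iteration flow (Figure 6, steps
# 6.4-6.13). Each text object is modelled as a (y, text) tuple;
# LINE_SPACING is an assumed vertical gap, not a value from the patent.

LINE_SPACING = 20

def add_iteration(text_objects):
    # Step 6.4: temporary stores start at position 0 and no object.
    lowest_y, lowest_text = 0, None
    # Steps 6.5-6.7: scan every text object on the BigEdit document.
    for y, text in text_objects:
        # Steps 6.6 and 6.10: if this object sits lower down than the
        # stored position, remember it instead.
        if y > lowest_y or lowest_text is None:
            lowest_y, lowest_text = y, text
    # Steps 6.8-6.11: create a new object below the lowest one, carrying
    # a copy of its text (a new iteration starts as a copy).
    text_objects.append((lowest_y + LINE_SPACING, lowest_text))
    # Steps 6.12-6.13 (flagging dirty regions and requesting a redraw)
    # are interactions with the operating system and are omitted here.
    return text_objects
```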
At step 6.14, the 'Delete iteration' button event handler is called. Temporary stores for an initial position and a text object pointer are initialised to 0 and NULL respectively at step 6.15. At step 6.16 we start to iterate through all the text objects on the BigEdit document. At step 6.17, the position of the text object we are currently looking at is compared with the position in the temporary store. If the y value in the temporary store is less than the position of the text object, we go to step 6.18, which updates the two stores with the new data. At step 6.19 we check if there are any more text objects on the document. If there are, we go back to step 6.16.
At step 6.20, we delete the object in the temporary store, if it exists. At steps 6.12 and 6.13, data is passed to the operating system to redraw the window in the appropriate places.

At step 6.21, the 'Accept' button event handler is called. Figure 7 shows the process that occurs in this event handler. At step 7.2, a pointer is obtained to the selected text on the main document. At step 7.3, a pointer to the final iteration on the BigEdit window is obtained by looping through all the text objects on the document until a pointer to the object lowest down the document is attained. At step 7.4, the selected text is replaced with the text pointed to by the last text object on the BigEdit window. At step 7.5, the BigEdit window is closed.
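The 'Accept' flow of Figure 7 reduces to finding the text object lowest down the window and substituting its text back into the main passage. A minimal sketch, again assuming the hypothetical (y, text) representation and a dictionary standing in for the main document:

```python
# Illustrative sketch of the 'Accept' flow (Figure 7). The document is
# modelled as a dict and text objects as (y, text) tuples; both are
# assumptions for this sketch, not structures named in the patent.

def accept(document, text_objects):
    # Step 7.3: the final iteration is the text object lowest down the
    # BigEdit window, found by scanning all objects.
    final_text = max(text_objects, key=lambda obj: obj[0])[1]
    # Step 7.4: replace the selected text on the main document with it.
    document["passage"] = document["passage"].replace(
        document["selected"], final_text, 1)
    # Step 7.5: close the BigEdit window.
    document["big_edit_open"] = False
    return document
```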
Claims (3)
1. A teaching system for demonstrating sentence construction, comprising:
   - a display system for displaying text;
   - an input device for selecting and manipulating text;
   - a text store for storing a passage of text for display and a buffer store for storing text iterations;
   - a processor for receiving input instructions and for controlling the display screen and arranged to run a subroutine comprising: transferring a portion of text from the text store to the buffer store on receipt of a selection user input from the input device, storing successive iterations of the portion of text in the buffer store, replacing the portion of text in the text store with the latest iteration in the buffer store on receipt of an accept user input from the input device, and controlling the display screen to display the text in the text store on a first area of the screen and the successive iterations from the buffer store in a second area of the screen.
2. A teaching system according to claim 1, wherein the subroutine comprises controlling the screen to create the second area on receipt of the selection user input.
3. A teaching system according to claim 1 or 2, wherein the subroutine comprises controlling the screen to show a copy of the selected text in the second area, and successive iterations of that text on following lines.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0226880A GB2395345A (en) | 2002-11-18 | 2002-11-18 | Sentence teaching system and display control. |
Publications (2)
Publication Number | Publication Date |
---|---|
GB0226880D0 (en) | 2002-12-24 |
GB2395345A (en) | 2004-05-19 |
Family
ID=9948052
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB0226880A (GB2395345A, withdrawn) | Sentence teaching system and display control. | 2002-11-18 | 2002-11-18 |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2395345A (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115909828B (en) * | 2023-02-03 | 2023-05-09 | 长春职业技术学院 | Basketball theory electronic simulation teaching equipment |
Non-Patent Citations (3)
Title |
---|
"Clicker 4", Crick Software, 2001. See http://www.cricksoftware.com/uk/clicker4/guide/12.htm * |
"Easiteach", Softease Ltd, 2001. See http://www.textease.com/easiteach * |
"SMART Board Notebook (v. 2.5)", Smart Technologies Inc., July 2002. See http://www.smarttech.com/sbsoftware/index.asp * |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |