CN114415924A - Multi-mode interaction method based on physical programming and related equipment - Google Patents
- Publication number: CN114415924A (application CN202111488416.9A)
- Authority
- CN
- China
- Prior art keywords
- instruction
- programming
- physical programming
- feedback
- interactive
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06F3/0488 — Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/0484 — Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842 — Selection of displayed objects or displayed text elements
- G06F3/04845 — Interaction techniques based on graphical user interfaces [GUI] for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G06F8/34 — Graphical or visual programming
- G09B19/0053 — Teaching computers, e.g. programming
Abstract
The invention relates to the technical field of children's programming education, and provides a multi-modal interaction method based on physical programming.
Description
Technical Field
The invention relates to the technical field of children's programming education, and in particular to a multi-modal interaction method based on physical programming and related equipment.
Background
Physical programming builds on a graphical programming language by expressing program logic with tangible blocks instead of code blocks on a screen.
At present, some physical programming devices on the market cannot serve multiple purposes with a single machine: each machine has its own dedicated play mode, and obtaining a new play mode requires purchasing a new programming machine, which is very uneconomical in terms of teaching or household costs.
Therefore, a multi-modal interaction method based on physical programming is needed, one that integrates the physical programming device with the smart hardware or programming blocks it connects to or recognizes, so that a single programming machine supports multiple play modes and can be recognized by other smart hardware, achieving a one-machine, multiple-purpose effect.
Disclosure of Invention
In order to achieve the purpose, the invention provides the following technical scheme:
the invention provides a multi-modal interaction method based on physical programming, which comprises the following steps:
acquiring or recognizing a multi-modal interaction instruction through an interactive interface of a physical programming device, wherein the multi-modal interaction instruction comprises a user touch instruction, a physical programming block execution instruction, or an interaction mode instruction;
the physical programming device parses the multi-modal interaction instruction, responds to it according to the current programming interaction entry, and outputs feedback, wherein the feedback comprises one or more of the following in combination: display feedback, physical programming block feedback, and recognition feedback.
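The claimed flow (acquire an instruction, parse it, route it by the current programming interaction entry, emit feedback) can be sketched as a simple dispatcher. This is a minimal illustrative model, not the claimed device; all names (`Entry`, `Instruction`, `respond`) are assumptions.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Entry(Enum):
    """The programming interaction entries named in the claims."""
    PHYSICAL_PROGRAMMING = auto()
    ROLE_CONFIRMATION = auto()
    COLOR_SELECTION = auto()
    FILM_RECOGNITION = auto()
    SMART_HARDWARE = auto()

@dataclass
class Instruction:
    kind: str      # "touch" | "block" | "mode"
    payload: dict

def respond(entry: Entry, instr: Instruction) -> list:
    """Parse a multi-modal instruction and return feedback items
    (display feedback, block feedback, recognition feedback)."""
    feedback = []
    if entry is Entry.ROLE_CONFIRMATION and instr.kind == "touch":
        feedback.append(f"display:role={instr.payload['role']}")
    elif entry is Entry.COLOR_SELECTION and instr.kind == "touch":
        feedback.append(f"display:color={instr.payload['color']}")
    elif entry is Entry.PHYSICAL_PROGRAMMING and instr.kind == "block":
        feedback.append(f"block:execute={instr.payload['program']}")
    else:
        feedback.append("display:unhandled")
    return feedback
```

For example, a touch on the role confirmation entry, `respond(Entry.ROLE_CONFIRMATION, Instruction("touch", {"role": "turtle"}))`, yields a display-feedback item for the matched role.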
Further, the programming interaction entries comprise: a physical programming entry, a role confirmation entry, a color selection entry, a film recognition entry, and a smart hardware recognition entry.
Further, a multi-modal interaction instruction is acquired through the interactive interface of the physical programming device; when the multi-modal interaction instruction is a user touch instruction, the current programming interaction entry is the role confirmation entry; the physical programming device parses the user touch instruction into a role confirmation instruction and outputs the role matching result as feedback.
Further, the method also comprises: when the programming interaction entry is the color confirmation entry, displaying color library data; the physical programming device parses the user touch instruction and outputs the color matching result as feedback.
Further, when the current programming interaction entry is the film recognition entry, the physical programming device parses the user touch instruction into a film confirmation instruction and outputs the film's scene matching result as feedback.
Further, when the programming interaction entry is the physical programming entry, a user touch instruction and/or a physical programming block execution instruction placed on the physical programming interactive interface is recognized; the physical programming device parses the user touch instruction, the physical programming block execution instruction, and/or the smart hardware recognition instruction, and outputs the programming execution result as feedback.
Further, the interaction mode instructions include, but are not limited to: a play special-effect instruction, a play speed instruction, a sound effect instruction, and a loop instruction.
The invention also provides a physical programming device, whose interactive interface acquires a multi-modal interaction instruction comprising a user touch instruction, a physical programming block execution instruction, or an interaction mode instruction; the device parses the multi-modal interaction instruction, responds to it according to the current programming interaction entry, and outputs feedback comprising one or more of the following in combination: display feedback, physical programming block feedback, and recognition feedback.
The invention also provides a computer device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the above multi-modal interaction method when executing the computer program.
The invention also provides a computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the above multi-modal interaction method.
The invention has the following beneficial effects:
(1) The invention provides multiple programming interaction entries, such as a physical programming entry, a role confirmation entry, a color selection entry, a film recognition entry, and a smart hardware recognition entry, realizing multi-modal interaction. Play modes and physical programming logic are designed as one integrated whole, so the programming machine supports multiple play modes and can be recognized by other smart hardware, achieving a one-machine, multiple-purpose effect; students can program through one or several of the different entries, which keeps them curious and enthusiastic;
(2) The invention provides both user touch instructions and physical programming block execution instructions: students can perform different programming operations by touching the interactive interface of the physical programming device or by moving and placing physical programming blocks, and this variety of operations reduces tedium during learning;
(3) The invention also provides a film recognition entry that imports a story scene by recognizing a film, so students no longer feel they are merely answering questions or completing tasks but develop a strong desire to learn and explore, learning programming while acquiring the related knowledge;
(4) The invention also provides a smart hardware recognition entry: by recognizing smart hardware, the physical programming device can communicate with external smart devices and integrate with them, using the physical programming device to further program the smart hardware; meanwhile, programming can change the scene, enriching the interaction between the smart hardware and the scene and raising students' interest in learning;
(5) The invention also provides interaction mode instructions, including a play special-effect instruction, a play speed instruction, a sound effect instruction, and a loop instruction; through these, the special effect, play speed, sound effect, and looping of a recognized role can be changed, fostering students' creativity while they learn programming.
Drawings
Fig. 1 is a flow chart of the multi-modal interaction method in embodiment 1 of the present invention.
Fig. 2 is a flow chart of the multi-modal interaction method for game programming in embodiment 2 of the present invention.
Fig. 3 is a schematic diagram of the multi-modal interaction interface for game programming in embodiment 2 of the present invention.
Fig. 4 is a flow chart of the multi-modal interaction method for animation programming in embodiment 3 of the present invention.
Fig. 5 is a schematic diagram of the multi-modal interaction interface for animation programming in embodiment 3 of the present invention.
Fig. 6 is a flow chart of the multi-modal interaction method for route-aware programming in embodiment 4 of the present invention.
Fig. 7 is a schematic diagram of the multi-modal interaction interface for route-aware programming in embodiment 4 of the present invention.
Fig. 8 is a flow chart of the multi-modal interaction method for hardware creative programming in embodiment 5 of the present invention.
Fig. 9 is a schematic diagram of the multi-modal interaction interface for hardware creative programming in embodiment 5 of the present invention.
Fig. 10 is a logic diagram of the physical programming device of the present invention.
Fig. 11 is a schematic structural diagram of the cloud server of the present invention.
Fig. 12 is a schematic structural diagram of the computer-readable storage medium of the present invention.
In the figures: physical programming entry 1, role confirmation entry 2, color selection entry 3, film recognition entry 4, smart hardware recognition entry 5.
Detailed Description
The following is a detailed description of embodiments of the present invention with reference to the accompanying drawings. It should be noted that the embodiments are merely illustrative and should not be considered limiting; their purpose is to help those skilled in the art understand and reproduce the technical solutions, and the scope of protection of the invention is defined by the claims.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of ordinary skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments without conflict.
Specific embodiments will be described in detail below with reference to the accompanying drawings.
Physical programming in this invention is a special form of programming, serving as a tangible programming language. It may work as follows: on the basis of a graphical programming language, the code blocks on the screen are materialized and packaged into physical modules similar to LEGO bricks, and these modules are given different attributes or functions, such as functions, variables, logic, and sensors. Stacked in different ways, they express program logic and respond to instructions. In this invention, physical programming may be embodied as a program, and such a program can be stored, interacted with, or executed on the electronic device.
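A stack of physical modules read off the board can be modeled as an ordered list of (category, value) blocks that the device interprets in placement order. The following is an illustrative sketch under that assumption; the block categories and the `interpret` function are hypothetical, not the patent's implementation.

```python
def interpret(stack, state=None):
    """Fold a stack of (category, value) physical blocks into a final
    state dict, in placement order."""
    state = dict(state or {"steps": 0, "heading": 0})
    for category, value in stack:
        if category == "move":
            state["steps"] += value
        elif category == "turn":            # degrees, left-positive
            state["heading"] = (state["heading"] + value) % 360
        elif category == "repeat":          # value = (times, sub_stack)
            times, sub = value
            for _ in range(times):
                state = interpret(sub, state)
    return state
```

Re-stacking the same blocks in a different order, or nesting them under a "repeat" block, yields a different program, which is how stacking expresses program logic.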
It should be noted that the physical programming device in this invention can be a smart learning desktop: a product that uses a touch stand, films, and a capacitive recognition board as hardware carriers, combines task scenarios built around subjects and real-world situations, and helps children of different age groups develop comprehensively through edutainment. It adopts a powerful combined software-and-hardware system, including an AIoT cloud platform, Internet of Things communication, AI algorithms, array touch-sensing interaction, and light-field imaging, together with ergonomics and IP/cartoon styling designed for children. It serves as a medium for programming, cognitive, and music education, and as a control panel it connects to smart toys for physical programming.
Example 1
As shown in fig. 1, the present invention provides a multi-modal interaction method based on physical programming, which includes:
S11, acquiring or recognizing a multi-modal interaction instruction through the interactive interface of the physical programming device;
It should be noted that the multi-modal interaction in this invention is a form of human-computer interaction integrating multiple senses. The smart device processes and outputs many kinds of data, such as text, speech, vision, motion, and environment, fully simulating interaction between people. In multi-modal interaction with a device, the signals output to a person include, but are not limited to, visual, auditory, tactile, and emotional ones.
In this embodiment, the multi-modal interaction instruction comprises a user touch instruction, a physical programming block execution instruction, or an interaction mode instruction;
The medium receiving a user touch instruction can be a touch panel, a film, or a reading pen scanning a paper surface. A physical programming block execution instruction can be read through communication between the physical programming control panel and programming blocks placed on its surface, or the control panel can transmit the instruction to the execution object. An interaction mode instruction mainly displays or outputs preset looping, volume, picture, and brightness on the multi-modal interactive interface for the selected display or playback object;
S12, the physical programming device parses the multi-modal interaction instruction, responds to it according to the current programming interaction entry, and outputs feedback;
The feedback comprises one or more of the following in combination: display feedback, physical programming block feedback, and recognition feedback.
In some preferred embodiments, referring to figs. 3-7, the programming interaction entries include: a physical programming entry 1, a role confirmation entry 2, a color selection entry 3, a film recognition entry 4, and a smart hardware recognition entry 5.
A programming interaction entry can be integrated into teaching, games, and other forms; it specifically refers to the event type, or the event itself, that needs to be confirmed in one interaction. The physical programming entry handles interactions with the programming language, rules, and logic for operating physical programming blocks; the role confirmation entry handles interactions that identify a role or animation; the color selection entry handles interactions that display colors and patterns; the film recognition entry handles interactions that recognize a film placed on the device; and the smart hardware recognition entry recognizes the smart hardware that serves as the physical programming execution object.
The invention provides multiple programming interaction entries, such as a physical programming entry, a role confirmation entry, a color selection entry, a film recognition entry, and a smart hardware recognition entry, realizing multi-modal interaction. Play modes and physical programming logic are designed as one integrated whole, so the programming machine supports multiple play modes and can be recognized by other smart hardware, achieving a one-machine, multiple-purpose effect; students can program through one or several of the different entries, which keeps them curious and enthusiastic.
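The one-machine, multiple-purpose idea can be sketched as a small handler registry: each programming interaction entry registers its own handler, and the device routes an interaction to whichever entry is currently active. All names here are illustrative assumptions.

```python
# Hypothetical registry: one handler per programming interaction entry.
HANDLERS = {}

def entry(name):
    """Decorator that registers a handler for an interaction entry."""
    def register(fn):
        HANDLERS[name] = fn
        return fn
    return register

@entry("role_confirmation")
def confirm_role(payload):
    return f"role matched: {payload}"

@entry("film_recognition")
def recognize_film(payload):
    return f"scene loaded for film: {payload}"

def interact(active_entry, payload):
    """Route an interaction to the active entry's handler; adding a new
    play mode only means registering one more handler."""
    return HANDLERS[active_entry](payload)
```

In this model, a new play mode is a new registered entry rather than a new machine, which is the claimed economy of one machine with multiple purposes.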
Example 2
A multi-modal interaction instruction is acquired through the interactive interface of the physical programming device; when the multi-modal interaction instruction is a user touch instruction, the current programming interaction entry is the role confirmation entry 2;
the physical programming device parses the user touch instruction into a role confirmation instruction and outputs the role matching result as feedback.
As shown in figs. 2-3, a multi-modal interaction method for game programming, combining the physical programming entry, the role confirmation entry, and the color selection entry into one embodiment, is as follows:
S21, the student displays all roles through the role confirmation entry;
In the game, the interactive interface of the physical programming device is displayed on the smart learning desktop, and the role confirmation entry displays several roles, such as a dinosaur, a turtle, a bee, and a rabbit, supporting game programming in a jigsaw-puzzle style;
S22, the user touch instruction confirms the role through the interactive interface of the physical programming device. The two columns on the left of the smart learning desktop form the code area, holding up to 20 code blocks, enough for the current physical program; the middle area is the coloring and display area, where student A can customize a role and finally display the game; the upper-right corner is the role recognition area, where different role NFC cards launch different game programming play modes. Student A selects a turtle as the role, as shown in fig. 3;
S23, the physical programming device parses the user touch instruction into a role confirmation instruction and outputs the role matching result as feedback;
The interface layout and pattern for the little turtle are displayed on the smart learning desktop as the feedback output.
S24, the student displays the color database through the color confirmation entry;
In the game, a color database is displayed in the middle-right area of the smart learning desktop for color selection; when student A customizes and edits the game role, the colors serve as a color wheel;
S25, the user touch instruction confirms the color through the interactive interface of the physical programming device;
S26, the physical programming device parses the user touch instruction into a color selection and confirmation instruction and outputs the color matching result as feedback;
A color-switching button in the lower-right area of the smart learning desktop switches between colors, ensuring students have enough choices to draw a unique role of their own. Student A selects a green-to-yellow gradient to draw with, and after steps S25-S26 are executed several times, the little turtle is fully assembled and displayed.
S27, the student issues an instruction to the physical programming blocks after entering the physical programming entry;
After the little turtle is assembled, its color gradually changes from green to yellow. Student A then clicks the physical programming blocks to issue an instruction, selecting the blocks "forward +3 steps + turn left" as the execution instruction;
S28, the interactive interface of the physical programming device responds to the physical programming block execution instruction;
S29, the physical programming device parses the physical execution instruction and outputs the programming execution result as feedback.
The smart learning desktop's processor recognizes that the programming instruction "forward +3 steps + turn left" can be executed, and the little turtle, as the programmed object, completes the walk on the desktop.
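Example 2's program "forward +3 steps + turn left" can be sketched with a minimal turtle model. The grid, heading convention, and `run_turtle` name are assumptions for illustration; the patent does not specify the desktop's internal execution model.

```python
def run_turtle(program, x=0, y=0, heading=0):
    """Execute a parsed block program on a grid turtle.
    Headings in degrees: 0=east, 90=north, 180=west, 270=south;
    a left turn adds 90 degrees."""
    moves = {0: (1, 0), 90: (0, 1), 180: (-1, 0), 270: (0, -1)}
    for op, arg in program:
        if op == "forward":
            dx, dy = moves[heading]
            x, y = x + arg * dx, y + arg * dy
        elif op == "turn_left":
            heading = (heading + 90) % 360
    return x, y, heading

# Student A's program: forward +3 steps, then turn left.
result = run_turtle([("forward", 3), ("turn_left", None)])
```

Starting at the origin facing east, the turtle ends three cells to the east, now facing north, which is the programming execution result the desktop would output as feedback.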
Example 3
As shown in figs. 4-5, a multi-modal interaction method for animation programming, taking the role confirmation entry as an example, is:
S31, the student displays the animated characters through the role confirmation entry;
S32, the user touch instruction confirms the animated character through the interactive interface of the physical programming device;
As with the game programming in embodiment 2, the two columns on the left of the smart learning desktop form the physical code area; through physical code blocks, parameters such as the animation's play special effect, play speed, sound effect, and looping can be set, so each student's animation is different. The middle area serves as both the animation creation/editing area and the animation display area, making better use of the space; the right area is the animated character recognition area and the palette.
S33, the physical programming device parses the user touch instruction into an animated character confirmation instruction and outputs the animated character matching result as feedback;
S34, the interactive interface of the physical programming device acquires confirmation of the interaction mode from the interaction mode instruction;
S35, the physical programming device parses the interaction mode instruction into an interaction mode selection and confirmation instruction and outputs the interaction mode matching result as feedback;
The interaction modes are the play special-effect mode, play speed mode, sound effect mode, and loop mode corresponding to the interaction mode instructions.
As shown in fig. 5, the animated character selected by student B is a red heart; student B selects interaction mode instructions to make it flash with a diffusion effect centered on the red heart and play on a 5-second loop, accompanied by music.
Selecting and executing interaction mode instructions is equivalent to the code compiling and debugging process in physical programming; of course, a physical programming block execution instruction such as "red heart jump" can also be issued in the code block area, with the programming execution result output as feedback.
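Student B's choices (diffusion flash, 5-second loop, music) are a bundle of interaction mode instructions, each setting one playback parameter. A hedged sketch of that bundle as a settings record follows; the field names are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class InteractionMode:
    """Settings controlled by interaction mode instructions."""
    special_effect: str = "none"   # play special-effect instruction
    speed: float = 1.0             # play speed instruction
    sound: str = "none"            # sound effect instruction
    loop_seconds: float = 0.0      # loop instruction (0 = no loop)

def apply_mode_instructions(mode, instructions):
    """Each interaction mode instruction updates exactly one setting."""
    for name, value in instructions:
        setattr(mode, name, value)
    return mode

# Student B's selection from Example 3: diffusion flash, music,
# 5-second loop; play speed is left at its default.
mode = apply_mode_instructions(
    InteractionMode(),
    [("special_effect", "diffusion_flash"),
     ("sound", "music"),
     ("loop_seconds", 5.0)],
)
```

Because each instruction touches only one setting, re-issuing an instruction overrides the earlier choice, which matches the select-and-debug loop the text compares to code compilation.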
Example 4
As shown in figs. 6-7, when the programming interaction entry is the film recognition entry 4:
the physical programming device parses the user touch instruction into a film confirmation instruction and outputs the film's scene matching result as feedback.
As shown in figs. 6-7, a multi-modal interaction method for route-aware programming, combining the physical programming entry and the film recognition entry into one embodiment, is as follows:
S41, the student displays the scene through the film recognition entry;
The film is a typical medium for scene and story interaction; a film story area is set on the smart learning desktop, and the programming process is driven by clicking within the film story area.
S42, the interactive interface of the physical programming device acquires a user touch instruction confirming the scene;
S43, the physical programming device parses the user touch instruction to obtain a scene confirmation instruction, and outputs the scene matching result as feedback;
S44, the interactive interface of the physical programming device recognizes the user touch instruction and/or the execution instruction of a physical programming block placed on the physical programming interactive interface;
as shown in fig. 7, while advancing along the route, the recognition of scenes such as a road or a rescue can be completed by placing the matching physical programming block, such as a fire extinguisher programming block or a life buoy programming block; alternatively, the film can be touched directly.
S45, the physical programming device parses the user touch instruction and the physical programming block execution instruction, and outputs the programming execution result as feedback.
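Steps S41 to S45 amount to a scene-matching dispatch: a touch confirms a scene, and a placed programming block either matches it or not. The sketch below is an assumption-laden illustration (scene names, block names, and the `handle_film_entry` function are all hypothetical), not the patent's actual recognition logic.

```python
from typing import Optional

# Hypothetical scene-to-block matching table, mirroring the fire
# extinguisher / life buoy example in the text.
SCENE_BLOCKS = {"road": "fire_extinguisher_block", "rescue": "life_buoy_block"}

def handle_film_entry(touch_scene: str, placed_block: Optional[str] = None) -> str:
    """Confirm a scene via touch (S42-S43), then check the physical
    programming block placed on the interface (S44-S45). Returns the
    feedback string the device would output."""
    if touch_scene not in SCENE_BLOCKS:
        return "feedback: no matching scene"
    if placed_block is None or placed_block == SCENE_BLOCKS[touch_scene]:
        return f"feedback: scene '{touch_scene}' matched"
    return "feedback: wrong block for this scene"
```

Direct touch (no block placed) and placing the correct block both complete the scene in this sketch, matching the "or, directly touch the film" alternative in the text.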
Route programming serves as the "path" of cognition, with each piece of knowledge as a node along that path; stories, programming, and cognition are deeply integrated, so students no longer feel that they are merely answering questions or completing tasks but develop a strong desire to learn and explore. While learning programming, they also grasp the related knowledge, deeply implementing the concept of STEAM education.
Example 5
When the programming interaction entry is the physical programming entry, the user touch instruction and/or the execution instruction of a physical programming block placed on the physical programming interactive interface is recognized; the physical programming device parses the user touch instruction, the physical programming block execution instruction, and/or the intelligent hardware identification instruction, and outputs the programming execution result as feedback.
As shown in figs. 8 to 9, the multi-modal interaction method of hardware creative programming, which combines the physical programming entry and the intelligent hardware identification entry in one embodiment, proceeds as follows:
S51, the student presents the intelligent hardware at the intelligent hardware identification entry;
as shown in fig. 9, in this embodiment the intelligent learning desktop serves as the controller and designer for hardware creative programming; it can control one or more pieces of intelligent hardware, such as a robot or an intelligent car, realizing a richer interaction system between the intelligent learning desktop and the programmed intelligent hardware.
S52, the interactive interface of the physical programming device acquires and identifies the intelligent hardware, completing identification of the current intelligent hardware;
in this embodiment, the identification of an intelligent physical airplane is taken as an example;
S53, the interactive interface of the physical programming device recognizes the physical programming block execution instruction;
the two columns on the left side of the intelligent learning desktop form the code area, in which 20 code blocks can be placed, meeting the number of code blocks required for the current physical programming task. Student C selects physical programming blocks instructing the airplane to back up, turn right, and light up.
S54, the physical programming device parses the physical programming block execution instruction and outputs the programming execution result as feedback.
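Steps S53 and S54 can be sketched as translating a sequence of placed blocks into commands for the identified hardware. The constant, function, and command names below are assumptions for illustration; only the 20-block code area limit and the airplane example come from the text.

```python
# The code area on the left of the intelligent learning desktop holds up
# to 20 code blocks, per the text.
MAX_CODE_BLOCKS = 20

def execute_blocks(hardware: str, blocks: list) -> list:
    """Parse a sequence of physical programming block execution
    instructions (S53) and return the programming execution result that
    would be output as feedback (S54). Names are hypothetical."""
    if len(blocks) > MAX_CODE_BLOCKS:
        raise ValueError("code area holds at most 20 blocks")
    # Each placed block maps one-to-one to a command for the hardware.
    return [f"{hardware}: {block}" for block in blocks]

# Student C's sequence from the example: back up, turn right, light up.
log = execute_blocks("airplane", ["back_up", "turn_right", "light_up"])
```

A real device would transmit each command to the airplane over its communication link rather than returning strings.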
As shown in fig. 10, the present invention further provides a physical programming device, whose interactive interface acquires a multi-modal interactive instruction, where the multi-modal interactive instruction includes a user touch instruction, a physical programming block execution instruction, or an interactive mode instruction;
the device parses the multi-modal interactive instruction, responds to the corresponding multi-modal interactive instruction according to the current programming interaction entry, and outputs feedback, where the feedback includes one or a combination of display feedback, physical programming block feedback, and identification feedback.
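The device's behavior is essentially a dispatch on the current programming interaction entry. The sketch below illustrates that routing under stated assumptions: entry keys, the `respond` function, and the feedback strings are all hypothetical names, not the patent's interfaces.

```python
# Minimal dispatch sketch: route a parsed multi-modal instruction by the
# current programming interaction entry and return feedback. All names
# are illustrative assumptions.
def respond(entry: str, instruction: dict) -> str:
    handlers = {
        "role_confirmation": lambda i: f"display feedback: role {i['target']} matched",
        "film_identification": lambda i: f"display feedback: scene {i['target']} matched",
        "physical_programming": lambda i: f"block feedback: executed {i['target']}",
        "intelligent_hardware": lambda i: f"identification feedback: {i['target']} recognized",
    }
    handler = handlers.get(entry)
    return handler(instruction) if handler else "no feedback"
```

Each handler here stands in for one of the feedback types named above (display feedback, physical programming block feedback, identification feedback).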
As shown in fig. 11, the present invention further provides a computer device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the above multi-modal interaction method.
As shown in fig. 12, the present invention further provides a computer-readable storage medium on which a computer program is stored, where the computer program, when executed by a processor, implements the above multi-modal interaction method.
While the preferred embodiments of the present application have been described, additional variations and modifications of those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, the appended claims are intended to be interpreted as including the preferred embodiments and all alterations and modifications that fall within the scope of the application.
Claims (10)
1. A multi-modal interaction method based on physical programming, characterized by comprising the following steps:
acquiring or recognizing a multi-modal interactive instruction through an interactive interface of a physical programming device;
the multi-modal interactive instruction comprises a user touch instruction, a physical programming block execution instruction, or an interactive mode instruction;
the physical programming device parses the multi-modal interactive instruction, responds to the corresponding multi-modal interactive instruction according to the current programming interaction entry, and outputs feedback;
the feedback comprises one or a combination of: display feedback, physical programming block feedback, and identification feedback.
2. The multi-modal interaction method based on physical programming according to claim 1, wherein the programming interaction entry comprises: a physical programming entry, a role confirmation entry, a color selection entry, a film identification entry, and an intelligent hardware identification entry.
3. The method according to claim 2, wherein a multi-modal interactive instruction is acquired through the interactive interface of the physical programming device, and when the multi-modal interactive instruction is a user touch instruction, the current programming interaction entry is the role confirmation entry;
the physical programming device parses the user touch instruction to obtain a role confirmation instruction, and outputs the role matching result as feedback.
4. The multi-modal interaction method based on physical programming according to claim 3, further comprising:
when the programming interaction entry is the color selection entry, displaying color library data;
the physical programming device parses the user touch instruction and outputs the color matching result as feedback.
5. The multi-modal interaction method based on physical programming according to claim 3, wherein the current programming interaction entry is the film identification entry; the physical programming device parses the user touch instruction to obtain a film confirmation instruction, and outputs the film scene matching result as feedback.
6. The multi-modal interaction method based on physical programming according to claim 2, wherein when the programming interaction entry is the physical programming entry, the user touch instruction and/or the execution instruction of a physical programming block placed on the physical programming interactive interface is recognized;
the physical programming device parses the user touch instruction, the physical programming block execution instruction, and/or the intelligent hardware identification instruction, and outputs the programming execution result as feedback.
7. The method of claim 1, wherein the interactive mode instruction includes but is not limited to: a play special effect instruction, a play speed instruction, a sound effect instruction, and a loop instruction.
8. A physical programming device, characterized in that its interactive interface acquires a multi-modal interactive instruction, wherein the multi-modal interactive instruction comprises a user touch instruction, a physical programming block execution instruction, or an interactive mode instruction;
the device parses the multi-modal interactive instruction, responds to the corresponding multi-modal interactive instruction according to the current programming interaction entry, and outputs feedback, wherein the feedback comprises one or a combination of: display feedback, physical programming block feedback, and identification feedback.
9. A computer device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the multi-modal interaction method of any one of claims 1-7.
10. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the multi-modal interaction method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111488416.9A CN114415924A (en) | 2021-12-07 | 2021-12-07 | Multi-mode interaction method based on physical programming and related equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114415924A true CN114415924A (en) | 2022-04-29 |
Family
ID=81265732
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111488416.9A Pending CN114415924A (en) | 2021-12-07 | 2021-12-07 | Multi-mode interaction method based on physical programming and related equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114415924A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115562640A (en) * | 2022-10-26 | 2023-01-03 | 江苏睿博启创人工智能科技有限公司 | Software platform compatible with various programming hardware |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104991640A (en) * | 2015-06-17 | 2015-10-21 | 施政 | Material object programming system on interactive interface and method |
US20170337841A1 (en) * | 2016-05-20 | 2017-11-23 | Creative Styles LLC | Interactive multimedia story creation application |
CN111515948A (en) * | 2020-04-16 | 2020-08-11 | 杭州大嘴鸟信息技术有限公司 | Control method and control system of programming robot |
CN113380113A (en) * | 2021-06-19 | 2021-09-10 | 中国海洋大学 | Material object programming children drawing toy, control method and computer equipment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Yu et al. | A survey of computational kits for young children | |
Yu et al. | A review of computational toys and kits for young children | |
Marji | Learn to program with Scratch: A visual introduction to programming with games, art, science, and math | |
Ishii | Tangible user interfaces | |
Payne et al. | Danceon: Culturally responsive creative computing | |
CN107103810A (en) | Situational programming teaching method | |
KR102095951B1 (en) | A system for coding education using generating an application for robot control | |
CN110969237B (en) | Man-machine virtual interaction construction method, equipment and medium under amphiprotic relation view angle | |
CN114415924A (en) | Multi-mode interaction method based on physical programming and related equipment | |
Patterson | Programming in the primary grades: beyond the hour of code | |
CN111984161A (en) | Control method and device of intelligent robot | |
Geiger et al. | HYUI: a visual framework for prototyping hybrid user interfaces | |
Sabuncuoglu et al. | Kart-on: An extensible paper programming strategy for affordable early programming education | |
Suvak | Learn Unity3D Programming with UnityScript | |
Vlieg | Scratch by Example: Programming for All Ages | |
Czakóová et al. | Interactive Programming Tools for Beginners in Elementary School Informatics | |
Riedenklau | Development of actuated tangible user interfaces: New interaction concepts and evaluation methods | |
Romero et al. | Embodied interaction in authoring environments | |
Lee et al. | ‘MoleBot’: An organic user-interface-based robot that provides users with richer kinetic interactions | |
Li | VR interactive game design based on unity3d engine | |
Im | Draw2Code: Low-Cost Tangible Programming for Young Children to Create Interactive AR Animations | |
Glenn | The Makar Project: Empowering Youth to Design, Build, and Play Through Interactions with Augmented Reality, Physical Prototyping, and the Internet of Things | |
Gyory | Building Beholder: An Approachable Computer Vision Toolkit for Physical Computing | |
Kotevski et al. | Augmented Reality Application for Improving Writing and Motoric Skills in Children With Disabilities | |
Horn | Tangible programming with Quetzal: Opportunities for education |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||