CN115390734A - Control method and device for intelligent interactive panel - Google Patents

Control method and device for intelligent interactive panel

Info

Publication number
CN115390734A
CN115390734A
Authority
CN
China
Prior art keywords
touch
media
operations
feedback information
interactive interface
Prior art date
Legal status
Pending
Application number
CN202110502493.9A
Other languages
Chinese (zh)
Inventor
林德熙
Current Assignee
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Guangzhou Shirui Electronics Co Ltd
Original Assignee
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Guangzhou Shirui Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Shiyuan Electronics Thecnology Co Ltd, Guangzhou Shirui Electronics Co Ltd filed Critical Guangzhou Shiyuan Electronics Thecnology Co Ltd
Priority to CN202110502493.9A priority Critical patent/CN115390734A/en
Priority to PCT/CN2022/091583 priority patent/WO2022237702A1/en
Publication of CN115390734A publication Critical patent/CN115390734A/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Abstract

The invention discloses a control method and device for an intelligent interactive tablet. The method comprises: displaying a multi-person interactive interface; receiving at least two touch operations generated by at least two touch media on the multi-person interactive interface, wherein each touch medium corresponds to a different touch area; and displaying feedback information corresponding to the at least two touch operations, wherein the at least two touch operations are distinguished by touch area, and the feedback information corresponding to each touch operation is determined separately. The invention solves the technical problem in the prior art that the screen must be split when multiple people operate the intelligent interactive tablet simultaneously.

Description

Control method and device for an intelligent interactive tablet
Technical Field
The invention relates to the field of computers, and in particular to a control method and device for an intelligent interactive tablet.
Background
The intelligent interactive tablet may be applied in a variety of fields, for example, education and conferencing, and in many scenarios it must be used by multiple people simultaneously. At present, when an intelligent interactive tablet faces a scenario in which multiple people use it at the same time, it can generally only perform split-screen control, providing each operation object with a portion of the usable area.
Taking an interactive teaching classroom based on an intelligent interactive tablet as an example, competitive classroom activities can be introduced, in which several students interact at the board simultaneously, thereby making the classroom more active. To distinguish the operations of different students, two interactive areas are generated, each area allows one student to operate, and the operations of students in different areas are scored separately, so that each student's score can be obtained. As shown in fig. 1, the current theme is "find multiples of 5", and left and right interactive areas are provided. The two interactive areas share a countdown marker and the same theme, but each has its own operation area and scoring area, thereby realizing competitive interaction between two students in a classroom.
At present, when multiple people operate on the intelligent interactive tablet simultaneously, different operation objects can be distinguished only by operation area, which limits the operation modes the intelligent interactive tablet can provide; no effective solution to this problem has yet been proposed.
Disclosure of Invention
The embodiment of the invention provides a control method and a control device for an intelligent interactive tablet, so as to at least solve the technical problem in the prior art that the screen must be split when multiple people operate the intelligent interactive tablet simultaneously.
In a first aspect, an embodiment of the present application provides a method for controlling an intelligent interactive tablet, including: displaying a multi-person interactive interface; receiving at least two touch operations generated by at least two touch media on the multi-person interactive interface, wherein each touch medium corresponds to a different touch area; and displaying feedback information corresponding to the at least two touch operations, wherein the at least two touch operations are distinguished by touch area, and the feedback information corresponding to each touch operation is determined separately.
In a second aspect, an embodiment of the present application provides a method for controlling an intelligent interactive tablet, including: receiving at least two touch operations generated by at least two touch media on a multi-person interactive interface, wherein each touch medium corresponds to a different touch area; and determining, based on the touch area of a touch operation, the touch medium that generated it.
In a third aspect, an embodiment of the present application provides a control apparatus for an intelligent interactive tablet, including: a first display module for displaying the multi-person interactive interface; a receiving module for receiving at least two touch operations generated by at least two touch media on the multi-person interactive interface, wherein each touch medium corresponds to a different touch area; and a second display module for displaying the feedback information corresponding to the at least two touch operations, wherein the at least two touch operations are distinguished by touch area, and the feedback information corresponding to each touch operation is determined separately.
In a fourth aspect, an embodiment of the present application provides a control device for an intelligent interactive tablet, including: a receiving module for receiving at least two touch operations generated by at least two touch media on the multi-person interactive interface, wherein each touch medium corresponds to a different touch area; and a determining module for determining, based on the touch area of a touch operation, the touch medium that generated it.
In a fifth aspect, embodiments of the present application provide a computer storage medium having stored thereon a plurality of instructions adapted to be loaded by a processor and to perform the above-mentioned method steps.
In a sixth aspect, an embodiment of the present application provides an intelligent interactive tablet, including: a processor and a memory; wherein the memory stores a computer program adapted to be loaded by the processor and to perform the above-mentioned method steps.
In the embodiment of the application, a multi-person interactive interface is displayed; at least two touch operations generated by at least two touch media on the multi-person interactive interface are received, wherein each touch medium corresponds to a different touch area; and feedback information corresponding to the at least two touch operations is displayed, wherein the at least two touch operations are distinguished by touch area and their feedback information is determined separately. This scheme distinguishes touch media by touch area, so that when multiple people operate on the interactive interface simultaneously with different touch media, the users to whom different operations belong can be distinguished without dividing the interactive interface into several areas. This solves the prior-art problem that the screen must be split when multiple people operate the intelligent interactive tablet simultaneously, provides more operation modes for the intelligent interactive tablet, and enables it to support more interaction modes such as multi-person quiz answering.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a schematic diagram of a multi-person interactive interface according to the prior art;
fig. 2 is a flowchart of a control method of an intelligent interactive tablet according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a multi-person interactive interface according to an embodiment of the present application;
FIG. 4 is a schematic diagram of another multi-person interaction interface, according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an object binding interface according to an embodiment of the present application;
fig. 6 is a flowchart of another control method of an intelligent interactive tablet according to an embodiment of the present application;
fig. 7 is a schematic diagram of a control device of an intelligent interactive tablet according to an embodiment of the present application;
fig. 8 is a schematic diagram of another control apparatus of an intelligent interactive tablet according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an intelligent interactive tablet according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
It should be understood that the described embodiments are only some embodiments of the invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The following description refers to the accompanying drawings in which the same numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the application, as detailed in the appended claims.
In the description of the present application, it is to be understood that the terms "first," "second," "third," and the like are used solely to distinguish similar elements; they do not necessarily describe a particular sequence or chronological order, nor should they be construed to indicate or imply relative importance. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art as appropriate. In addition, in the description of the present application, "a plurality" means two or more unless otherwise specified. "And/or" describes the association relationship of the associated objects, indicating that three relationships may exist; for example, A and/or B may indicate: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
The hardware of the intelligent interactive tablet comprises components such as a display module assembly and an intelligent processing system (including a controller), combined in an integral structure and supported by a dedicated software system. The display module assembly includes a display screen and a backlight module, and the display screen includes a transparent conductive layer, a liquid crystal layer, and the like.
The display screen, in the embodiments of the present specification, refers to a touch screen or touch panel: an inductive liquid crystal display device. When a graphical button on the screen is touched, the tactile feedback system on the screen can drive various connected devices according to a pre-programmed program, replacing a mechanical button panel and creating a vivid audio-visual effect through the liquid crystal display. By technical principle, touch screens can be divided into five basic categories: vector pressure sensing, resistive, capacitive, infrared, and surface acoustic wave touch screens. According to the working principle of the touch screen and the medium for transmitting information, touch screens can be divided into four categories: resistive, capacitive, infrared, and surface acoustic wave.
When a user touches the screen with a finger or a pen, the coordinates of the touch point are located, thereby realizing control of the intelligent processing system; different functional applications are then realized through the built-in software of the intelligent processing system.
The "screen" and "large screen" mentioned in this application refer to the display screen of the intelligent interactive tablet; when the intelligent interactive tablet displays a certain interface, it means that the display screen of the intelligent interactive tablet displays that interface.
Example 1
The embodiment of the application discloses a control method of an intelligent interactive tablet, which is applied to the intelligent interactive tablet. A control method of an intelligent interactive tablet according to an embodiment of the present application will be described in detail below with reference to fig. 2 to 5. Referring to fig. 2, fig. 2 is a flowchart illustrating a control method of an intelligent interactive tablet according to an embodiment of the present application. The method comprises the following steps:
and step S202, displaying the multi-person interactive interface.
Specifically, the multi-person interactive interface may be displayed on the display screen of the intelligent interactive tablet. The multi-person interactive interface can support multi-point touch, that is, when the touch screen receives a touch operation at one point, other positions of the touch screen do not lose focus and can still receive other touch operations.
The content displayed by the multi-person interactive interface is associated with the actual scenario. For example, in a group activity, a multiplayer game interface can be displayed in the multi-person interactive interface; for another example, in a classroom teaching scenario, a question for students to answer may be displayed in the multi-person interactive interface.
Step S204, receiving at least two touch operations generated by at least two touch media on the multi-person interactive interface, wherein the touch areas corresponding to the touch media are different.
Specifically, the at least two touch media are a plurality of touch media having different touch areas, for example, styluses with different touch areas or finger sleeves with different touch areas.
The touch area of a touch medium can be represented by the number of pixels occupied by one contact point of the touch medium on the touch screen; the number of pixels corresponding to each touch medium can be a fixed value or a range. The touch area can also be represented by the size information (height and width) of one contact point of the touch medium on the touch screen; the size information corresponding to each touch medium can likewise be a fixed value or a range.
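As an illustrative sketch of the range matching described above, the following hypothetical helper compares a reported touch area against per-medium area ranges; the function name and the numeric ranges are assumptions for illustration, not values from this disclosure.

```python
def identify_medium(touch_area, media_ranges):
    """Return the touch medium whose calibrated area range contains
    touch_area (e.g. a pixel count reported by the touch frame),
    or None when no configured medium matches."""
    for medium, (lo, hi) in media_ranges.items():
        if lo <= touch_area <= hi:
            return medium
    return None


# Illustrative (assumed) calibration: two styluses of different thickness.
ranges = {"thin_pen": (8, 15), "thick_pen": (25, 40)}
```

For example, `identify_medium(12, ranges)` would resolve to the thin stylus, while an area outside every range resolves to no medium.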
In this scheme, multiple users use different touch media to operate on the multi-person interactive interface, and the intelligent interactive tablet can receive the various touch operations generated by those touch media.
It should be noted that if users use bare fingers as touch media to operate on the multi-person interactive interface, although different fingers have different touch areas, a finger has a certain elasticity, so its touch area may vary from operation to operation; it is therefore difficult to distinguish the touch operations of different operation objects by the touch area of a finger. To address this problem, finger sleeves having different touch areas can be used as the touch media for different operation objects.
In an optional embodiment, taking a teaching quiz scenario as an example, a multi-person interactive interface displayed on the intelligent interactive tablet presents questions, and a plurality of students answer the questions on the touch screen using different touch media, so that the intelligent interactive tablet receives a plurality of touch operations generated by a plurality of touch media.
Step S206, displaying feedback information corresponding to at least two touch operations, wherein the at least two touch operations are distinguished through touch areas, and the feedback information corresponding to the at least two touch operations is respectively determined.
Specifically, the feedback information may be information related to the actual service. After the touch medium to which a touch operation belongs is determined according to the touch area, a subsequent preset service module may execute the corresponding service based on the touch operation, for example, scoring the touch operation. Still taking the teaching quiz scenario as an example, the touch operations generated by different touch media are the answer operations of different students; tallying whether each touch operation answered correctly yields the scores corresponding to the different touch operations, namely the feedback information.
In the above scheme, after receiving a plurality of touch operations, the intelligent interactive tablet may determine, according to the touch area, the touch medium that generated each touch operation, treat the touch operations generated by the same touch medium as the behavior of the same operation object, and thus obtain the score of each operation object by scoring each touch operation.
The intelligent interactive tablet can receive touch operations through a high-precision touch frame and detect the touch area of the touch medium through the same frame. When multiple people operate simultaneously, the touch medium is associated with its touch area, and the operation object is in turn associated with the touch medium, so that the operation object from which each touch operation comes can be identified.
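The association chain described above (touch area, then touch medium, then operation object) can be sketched as follows; every name, range, and event value here is a hypothetical illustration, not part of this disclosure.

```python
def tally_scores(events, media_ranges, medium_to_operator):
    """Group a stream of touch operations by operation object and
    tally one point per correct operation.

    events: list of (touch_area, is_correct) tuples, where touch_area
    is the value reported by the touch frame for one operation."""
    scores = {}
    for area, correct in events:
        # Resolve area -> medium via the calibrated ranges.
        medium = next((m for m, (lo, hi) in media_ranges.items()
                       if lo <= area <= hi), None)
        # Resolve medium -> operation object (e.g. a student).
        operator = medium_to_operator.get(medium)
        if operator is not None and correct:
            scores[operator] = scores.get(operator, 0) + 1
    return scores
```

With two students each bound to one stylus, all operations land in the shared interface yet are still attributed per student.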
In an alternative embodiment, still taking the teaching quiz scenario as an example, each student holds a different touch medium, such as a pen or finger sleeve of a different thickness, and when a student touches the screen through the touch medium, the touch frame detects the touch area of the touch medium. The interactive whiteboard software identifies the touch medium from which a touch operation comes according to the touch area received from the touch frame, and can then determine which student created the touch operation, so that business behaviors such as scoring can be realized in the software. The above example thus provides an interactive mode in which several students compete at the board simultaneously, and feedback information such as scoring information is provided for the operations of the several students. There is no need to divide the screen into several separate ranges with each student confined to a different range, so the interactive whiteboard software can support more students at the board and provide a quiz-race mode of interaction. Moreover, there is no requirement to install additional electronic modules (for example, a Bluetooth module) in the touch media; it is only necessary that the touch area of each touch medium be different, which avoids the inconvenience of having to charge touch media that would otherwise need to be powered.
Therefore, the embodiment of the application displays a multi-person interactive interface; receives at least two touch operations generated by at least two touch media on the multi-person interactive interface, wherein each touch medium corresponds to a different touch area; and displays feedback information corresponding to the at least two touch operations, wherein the at least two touch operations are distinguished by touch area and their feedback information is determined separately. This scheme distinguishes touch media by touch area, so that when multiple people operate on the interactive interface simultaneously with different touch media, the users to whom different operations belong can be distinguished without dividing the interactive interface into several areas. This solves the prior-art problem that the screen must be split when multiple people operate the intelligent interactive tablet simultaneously, provides more operation modes for the intelligent interactive tablet, and enables it to support more interaction modes such as multi-person quiz answering.
As an optional embodiment, the multi-person interactive interface includes a plurality of selectable elements, and after receiving the at least two touch operations generated by the at least two touch media on the multi-person interactive interface, the method further includes: detecting the element hit by any touch operation; and, in response to the touch operation, displaying an animation effect corresponding to the hit element.
In the above scheme, the animation effects for elements hit by touch operations of different touch media may be the same or different; likewise, the animation effects for different hit elements may be the same or different.
In an alternative embodiment, again using the teaching quiz scenario, selectable elements are used to present candidate answers: one animation effect may be displayed when a correct-answer element is hit, and another when a wrong-answer element is hit.
As an alternative embodiment, the element is displayed at a fixed position in the multi-person interactive interface, and the touch operation comprises selecting the element and dragging it to a preset position.
In the above scheme, the elements are displayed at fixed positions in the multi-user interactive interface, and the user operates the elements by selecting and dragging the elements. Such an approach is suitable for a race-to-answer scenario.
Fig. 3 is a schematic diagram of a multi-person interactive interface according to an embodiment of the present application. As shown in fig. 3, the multi-person interactive interface does not need to be divided into two separate ranges; instead, students interact in a common multi-person interactive interface. In this example, the interface includes a plurality of elements representing answers (China, USA, New York, Beijing), and answering consists of selecting an element and dragging it to the corresponding category (city name or country name). Several students can drag different answers to the corresponding categories, each relying on his or her own speed and reaction, and the whiteboard software automatically identifies the student corresponding to each dragged answer so as to award the corresponding score.
More specifically, the subsequent service module may set different data storage areas for different operation objects, so as to score the touch operations of different operation objects separately. In the above example, different scoreboards may be provided for different students. With two students as participants, when an answer is dragged, the student currently dragging it is determined according to the touch area of the touch operation, and when the student drags the answer into a category, the student is scored according to whether the dragged answer is correct. For example, if student A drags the answer "China" into the category "Country name", student A is awarded a point, because this is a correct answer.
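The scoring rule of this drag example can be sketched as follows; the answer key mirrors the Fig. 3 example, while the helper name and data layout are hypothetical illustrations.

```python
# Answer key matching the Fig. 3 example (category per answer element).
ANSWER_KEY = {"China": "Country name", "USA": "Country name",
              "New York": "City name", "Beijing": "City name"}


def score_drag(student, answer, target_category, scores):
    """Award the dragging student one point when the dragged answer
    lands in its correct category; otherwise leave the score as-is."""
    if ANSWER_KEY.get(answer) == target_category:
        scores[student] = scores.get(student, 0) + 1
    return scores
```

Each call corresponds to one drag operation already attributed to a student via the touch area.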
As an alternative embodiment, the elements move in the multi-person interactive interface along a preset path, and the touch operation includes a click operation to select an element or a slide operation to cut an element.
Fig. 4 is a schematic diagram of another multi-person interactive interface according to an embodiment of the present application. As shown in fig. 4, under the theme "find multiples of 5", elements representing answers fall from above (the dotted lines are the paths along which the elements fall). Students can click elements or slide to cut them using different touch media, and points are added to or subtracted from the corresponding student according to whether the element the student clicked or cut matches the theme.
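The add-or-subtract rule of this example can be sketched with a hypothetical helper, assuming one point added for an element matching the theme and one subtracted otherwise (the exact point values are not specified in this disclosure).

```python
def score_tap(student, value, scores, theme_multiple=5):
    """Score one click/cut operation under the 'find multiples of 5'
    theme: +1 for a matching element, -1 for a non-matching one."""
    delta = 1 if value % theme_multiple == 0 else -1
    scores[student] = scores.get(student, 0) + delta
    return scores
```

As with dragging, each operation is first attributed to a student by the touch area of the medium that produced it.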
As an alternative embodiment, before displaying the multi-person interactive interface, the method further includes: displaying an object binding interface, wherein the object binding interface is used for binding an operation object and a touch medium; receiving operation object identifications corresponding to at least two touch media, wherein the operation objects and the corresponding touch media are bound based on the operation object identifications.
Specifically, the object binding interface is used for binding a touch medium with an operation object, so that the operation object can be determined once the touch medium that generated a touch operation is identified. The operation object identifier may be information such as the user's name or number.
Fig. 5 is an object binding interface according to an embodiment of the present application. In an alternative embodiment, in the teaching quiz scenario, two students A and B, holding different touch media, first enter the object binding interface before answering. In the object binding interface, a prompt "please fill in the name of the first participant" is displayed; student A writes the name "A" in the designated area, and through this writing operation the whiteboard software detects and records the touch area of the touch medium used by student A. After student A finishes filling in the name and selects the "Done" control, "please ask the second participant to fill in the name" is displayed; student B, using a touch medium different from student A's, writes the name "B" in the designated area, and the whiteboard software likewise detects and records the touch area of student B's touch medium through the writing operation. In this way, the whiteboard software completes both the binding of touch area to touch medium and the binding of operation object to touch medium.
It should be noted that the object binding interface is only used for illustration. The correspondence between touch media and touch areas may also be established in advance, in which case only the relationship between operation objects and touch media needs to be established in the object binding interface, so that the binding between touch medium and touch area does not have to be repeated before each multi-person session.
It should be noted that multiple operation objects may also be bound at the same time in the same object binding interface.
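The binding flow described above can be sketched as follows; the class name is hypothetical, and the tolerance margin around the observed areas is an assumed calibration parameter, not specified in this disclosure.

```python
class BindingSession:
    """Sketch of the object-binding flow: while each participant writes
    a name, the touch areas observed during the writing operation are
    recorded, and the name is bound to a tolerance band around them."""

    def __init__(self, margin=2.0):
        self.margin = margin          # assumed calibration tolerance
        self.bindings = {}            # participant name -> (min, max) area

    def bind(self, name, observed_areas):
        lo, hi = min(observed_areas), max(observed_areas)
        self.bindings[name] = (lo - self.margin, hi + self.margin)

    def lookup(self, touch_area):
        """Return the bound participant for a later touch operation."""
        for name, (lo, hi) in self.bindings.items():
            if lo <= touch_area <= hi:
                return name
        return None
```

After both participants have written their names, later touch operations are attributed by `lookup` alone, with no split screen.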
As an alternative embodiment, displaying feedback information corresponding to at least two touch operations includes: respectively determining an operation object identifier corresponding to each touch operation; and correspondingly displaying the operation object identification and the feedback information corresponding to the touch operation.
In the above aspect, since the operation object and the touch medium are bound, the feedback information can be displayed together with the corresponding operation object identifier.
Still taking the teaching answer scenario as an example, after student A and student B complete a round of answering, their touch operations are attributed to each student according to the touch areas of the operations, and the scores of student A and student B are obtained. Finally, the scores of student A and student B can be displayed separately.
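The binding step above can be sketched as a small lookup table. This is a hypothetical illustration only: the names, the recorded areas, and the helper functions are assumptions for the example, not taken from the patent.

```python
# Hypothetical sketch of the object binding step: each participant's name
# (the operation object identifier) is associated with the touch area
# recorded while the name was written.
bindings = {}

def bind_object(name, recorded_area):
    """Bind an operation object identifier to the touch area of its medium."""
    bindings[recorded_area] = name

def object_for(touch_area):
    """Look up which participant produced a touch with this area."""
    return bindings.get(touch_area)

bind_object("A", 6)   # student A's stylus leaves a 6-pixel landing point
bind_object("B", 15)  # student B's stylus leaves a 15-pixel landing point
```

With these bindings in place, later touch operations can be attributed to a student by area alone, which is exactly what the scoring step relies on.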
Example 2
The embodiment of the application discloses a control method of an intelligent interactive panel, which is applied to the intelligent interactive panel. The following describes in detail a control method of an intelligent interactive tablet according to an embodiment of the present application with reference to fig. 6. Referring to fig. 6, fig. 6 is a flowchart illustrating another method for controlling an intelligent interactive tablet according to an embodiment of the present application. The method comprises the following steps:
step S602, receiving at least two touch operations generated by at least two touch media on a multi-user interaction interface, where the touch areas corresponding to each touch medium are different.
Specifically, the multi-user interactive interface may be displayed on the display screen of the smart interactive tablet. The multi-user interactive interface can support multi-point touch: when the touch screen receives a touch operation at one point, other positions on the touch screen do not lose focus and can still receive further touch operations. The at least two touch media are a plurality of touch media having different touch areas, for example, stylus pens with different touch areas or finger sleeves with different touch areas.
The touch area of the touch medium can be represented by the number of pixels occupied by one landing point of the touch medium on the touch screen, and the number of pixels corresponding to each touch medium can be a fixed value or a range. The touch area of the touch medium can also be represented by size information (height and width) of one landing point of the touch medium on the touch screen, and the size information corresponding to each touch medium can also be a fixed value or a range.
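The two representations above (pixel count, or width and height of a landing point) can be sketched together in a small data structure. This is a minimal illustration; the field names are assumptions, not from the patent.

```python
from dataclasses import dataclass

# Sketch of the two touch-area representations described above: a landing
# point described by its size, with the pixel-count representation derived
# from the width and height.
@dataclass
class TouchPoint:
    width_px: int   # width of the landing point on the touch screen
    height_px: int  # height of the landing point on the touch screen

    @property
    def area(self):
        # Pixel-count representation derived from the size representation.
        return self.width_px * self.height_px
```

Either the `area` value or the `(width_px, height_px)` pair could then be compared against a fixed value or a range per medium.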
In the scheme, a plurality of users use different touch media to operate on the multi-user interactive interface, and the intelligent interactive tablet can receive various touch operations generated by the various touch media.
It should be noted that if users use their fingers as touch media on the multi-user interactive interface, the touch area of a finger may vary from one operation to the next because the finger is elastic, making it difficult to distinguish the touch operations of different operation objects by touch area alone. To address this problem, finger sleeves with different touch areas can be chosen as the touch media for different operation objects.
In an optional embodiment, taking a teaching answer scene as an example, a multi-user interactive interface is displayed on the intelligent interactive panel, and is used for displaying questions, and a plurality of students use different touch media to answer the questions on the touch screen, so that the intelligent interactive panel receives a plurality of touch operations generated by a plurality of touch media.
Step S604, determining a touch medium for generating the touch operation based on the touch area of the touch operation.
The touch medium generating the touch operation refers to the touch medium corresponding to that operation, and this step can be executed by a high-precision touch frame. The touch frame stores the correspondence between touch areas and touch media in advance; after a touch operation is detected and its touch area determined, the touch medium that generated the operation can be determined from this correspondence.
The above scheme can be applied to an intelligent interactive device. The smart interactive tablet includes a touch frame, the main unit, and a PC module. The high-precision touch frame can accurately identify the touch area of a touch medium. After the touch frame collects touch data, it uploads the data to the PC module over USB using the standard HID protocol. A system driver layer in the PC module parses the touch data to obtain touch information, and the Windows system then dispatches the touch information to application software.
It should be noted that the interactive whiteboard software may receive touch information in the above manner, which is equivalent to the whiteboard software acquiring the touch information collected by the touch frame. The input layer module of the interactive whiteboard software reads the touch information, obtains the touch area (i.e., the width and height of the touch point), determines the current touch medium, and passes the result to the subsequent service module. The service module therefore never needs to know how it was decided which touch medium generated the current touch point, so it can be reused with other forms of interaction. For example, an electronic component (e.g., a Bluetooth module) may be built into the touch medium held by each student; the touch frame then need not be high-precision, because the touch medium actively sends an electronic signal to the smart interactive tablet. The whiteboard application software receives this information, determines the medium corresponding to the touch point itself, and passes the result to the subsequent service module. The service module is thus generic and supports multiple forms of software-hardware interaction.
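The layered design above can be sketched as follows: the input layer alone decides which medium produced each touch point, so the downstream service module is medium-agnostic and can be reused with other identification schemes (e.g., a Bluetooth signal instead of a high-precision touch frame). All names here are illustrative assumptions.

```python
# Sketch of the layered design: the input layer tags each raw touch point
# (represented here just by its area) with the medium that produced it,
# using a pluggable identification rule.
def input_layer(raw_areas, identify):
    """Tag each raw touch point with its medium via identify()."""
    return [(identify(area), area) for area in raw_areas]

def service_module(tagged_points):
    """Group events per medium without knowing how media were identified."""
    grouped = {}
    for medium, point in tagged_points:
        grouped.setdefault(medium, []).append(point)
    return grouped
```

Swapping the `identify` rule (area matching vs. an active electronic signal) changes nothing in `service_module`, which is the reuse property the passage describes.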
In view of the above, the embodiments of the present application display a multi-user interactive interface; receive at least two touch operations generated on it by at least two touch media, where the touch area corresponding to each touch medium is different; and display feedback information corresponding to the at least two touch operations, the operations being distinguished by touch area and their feedback information determined separately. By distinguishing touch media through their touch areas, this scheme lets multiple people operate on the interactive interface simultaneously with different touch media, and the users behind the different operations can be told apart without dividing the interface into separate regions. This solves the prior-art problem that a smart interactive tablet must split its screen when operated by several people at once, provides more operation modes for the smart interactive tablet, and enables it to support more activities such as multi-person race-to-answer games.
As an alternative embodiment, determining a touch medium generating the touch operation based on the touch area of the touch operation includes: acquiring a touch area of a target touch point in touch operation; and matching the touch area of the target touch point with the touch areas of the preset multiple touch media, and determining the touch media for generating the touch operation.
Specifically, the target touch point may be the first touch point of a touch operation, or another designated touch point. When the touch area corresponding to each touch medium is a preset range, the ranges of the different touch media do not overlap; if the touch area of the target touch point falls within the interval corresponding to a certain touch medium, the touch operation to which that point belongs is determined to have been generated by that touch medium. When the touch area corresponding to each touch medium is a preset value, the values of the different touch media differ; if the touch area of the target touch point equals the value corresponding to a certain touch medium, the touch operation to which that point belongs is determined to have been generated by that touch medium.
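The range-matching case above can be sketched as follows. The media names and area ranges are illustrative assumptions; the only property carried over from the text is that the ranges do not overlap.

```python
# Hedged sketch of matching a target touch point's area against preset,
# non-overlapping area ranges, one per touch medium.
PRESET_MEDIA = {
    "stylus_A": (4, 8),    # (min_area, max_area) of a landing point, in pixels
    "stylus_B": (12, 18),
}

def identify_medium(touch_area):
    """Return the medium whose range contains touch_area, or None."""
    for medium, (lo, hi) in PRESET_MEDIA.items():
        if lo <= touch_area <= hi:
            return medium
    return None  # area falls outside every preset range
```

The fixed-value case is the degenerate form of the same lookup, with `lo == hi` for each medium.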
As an optional embodiment, after determining the touch medium generating the touch operation based on the touch area of the touch operation, the method further includes: acquiring feedback information corresponding to at least two touch operations; and displaying feedback information corresponding to at least two touch operations.
Specifically, the feedback information may be information related to an actual service. After the touch medium to which a touch operation belongs is determined according to the touch area, a subsequent preset service module may execute the corresponding service based on the touch operation, for example, scoring it. Still taking the teaching answer scenario as an example, the touch operations generated by different touch media are the answer operations of different students; statistics are therefore performed on whether each touch operation answers correctly, and the scores corresponding to the different touch operations, i.e., the feedback information, can be obtained.
In the above scheme, after receiving a plurality of touch operations, the smart interactive tablet can determine, according to the touch area, the touch medium that generated each touch operation, and treat touch operations generated by the same touch medium as the behavior of the same operation object, so that scoring each touch operation yields the score of each operation object.
As an alternative embodiment, the multi-person interactive interface includes questions and selectable elements, where the elements include a first element for showing a correct answer and a second element for showing a wrong answer. Acquiring feedback information corresponding to at least two touch operations includes the following steps:
respectively acquiring the number of first elements hit by at least two touch operations;
and determining feedback information corresponding to the at least two touch operations according to the number of the first elements hit by the at least two touch operations.
The step of obtaining the feedback information may be performed by a service module in the whiteboard application software, which need not concern itself with how touch operations generated by different touch media are distinguished; it only determines the corresponding feedback information for each touch operation.
As also shown in fig. 4, the question asks for multiples of 5, and elements representing candidate answers fall from above (the dotted lines indicate the falling paths). Students can use different touch media to click an element or slide to cut it, and points are added to or subtracted from the corresponding student's score according to whether the element clicked or cut matches the answer to the question.
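The scoring described above can be sketched as a simple tally. This is an illustrative assumption: the patent does not specify point values, so adding one point for a correct (first) element and subtracting one for a wrong (second) element is a hypothetical rule, and the operations are assumed to already carry the medium that produced them.

```python
# Illustrative scoring sketch: each touch operation is a (medium, element_kind)
# pair; hitting a first (correct) element adds a point, hitting a second
# (wrong) element subtracts one. Point values are assumptions.
def score_operations(operations):
    """Return a {medium: score} tally over (medium, element_kind) pairs."""
    scores = {}
    for medium, kind in operations:
        delta = 1 if kind == "first" else -1
        scores[medium] = scores.get(medium, 0) + delta
    return scores
```

The per-medium totals are exactly the feedback information that is then displayed alongside each operation object identifier.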
Example 3
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, reference is made to the embodiments of the method of the present application.
Referring to fig. 7, a schematic diagram of a control device of an intelligent interactive tablet according to an exemplary embodiment of the present application is shown. The control device of the smart interactive tablet can be implemented by software, hardware or a combination of the two to form all or a part of the smart interactive tablet. The apparatus includes a first display module 71, a receiving module 72, and a second display module 73.
And the first display module 71 is used for displaying the multi-person interactive interface.
The receiving module 72 is configured to receive at least two touch operations generated by at least two touch media on the multi-user interactive interface, where a touch area corresponding to each touch medium is different.
The second display module 73 is configured to display feedback information corresponding to at least two touch operations, where the at least two touch operations are distinguished by touch area and the feedback information corresponding to the at least two touch operations is determined respectively.
As an alternative embodiment, the multi-person interactive interface includes a plurality of optional elements, and the apparatus further includes: the detection module is used for detecting any touch operation hit element after receiving at least two touch operations generated by at least two touch media on the multi-person interactive interface; and the third display module is used for responding to the touch operation and displaying the animation effect corresponding to the hit element.
As an alternative embodiment, the element is displayed at a fixed position in the multi-person interactive interface, and the touch operation comprises: and selecting the element and dragging the element to a preset position.
As an alternative embodiment, the element moves in the multi-person interactive interface according to a preset path, and the touch operation includes: a click operation to select an element or a slide operation to segment an element.
As an optional embodiment, the apparatus further comprises: the fourth display module is used for displaying an object binding interface before displaying the multi-user interaction interface, wherein the object binding interface is used for binding an operation object and a touch medium; and the second receiving module is used for receiving operation object identifications corresponding to at least two touch media, wherein the operation objects and the corresponding touch media are bound based on the operation object identifications.
As an alternative embodiment, the second display module includes: the determining submodule is used for respectively determining an operation object identifier corresponding to each touch operation; and the recognition submodule is used for correspondingly displaying the operation object identification and the feedback information corresponding to the touch operation.
It should be noted that, when the control device of the smart interactive tablet provided in the foregoing embodiment executes the control method of the smart interactive tablet, only the division of the functional modules is taken as an example, and in practical applications, the function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules, so as to complete all or part of the functions described above. In addition, the control device of the intelligent interactive tablet and the control method embodiment of the intelligent interactive tablet provided by the above embodiments belong to the same concept, and details of implementation processes are found in the method embodiments, which are not described herein again.
Example 4
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, reference is made to the embodiments of the method of the present application.
Please refer to fig. 8, which illustrates a schematic diagram of a control device of an intelligent interactive tablet according to an exemplary embodiment of the present application. The control device of the smart interactive tablet can be realized by software, hardware or a combination of the software and the hardware to form all or part of the smart interactive tablet. The apparatus comprises a receiving module 81 and a determining module 82.
The receiving module 81 is configured to receive at least two touch operations generated by at least two touch media on a multi-user interactive interface, where a touch area corresponding to each touch medium is different;
and a determining module 82, configured to determine a touch medium generating the touch operation based on the touch area of the touch operation.
As an alternative embodiment, the determining module includes: the first acquisition module is used for acquiring the touch area of a target touch point in touch operation; and the determining submodule is used for matching the touch area of the target touch point with the touch areas of the preset multiple touch media and determining the touch media for generating the touch operation.
As an optional embodiment, the apparatus further comprises: the acquisition module is used for acquiring feedback information corresponding to at least two touch operations after determining a touch medium for generating the touch operations based on the touch area of the touch operations; and the display module is used for displaying the feedback information corresponding to at least two touch operations.
As an alternative embodiment, the multi-person interactive interface comprises questions and optional elements, wherein the elements comprise: a first element for showing a correct answer and a second element for showing a wrong answer, the obtaining module comprising: the obtaining submodule is used for respectively obtaining the number of the first elements hit by at least two touch operations; and the determining submodule is used for determining the feedback information corresponding to the at least two touch operations according to the number of the first elements hit by the at least two touch operations.
Example 5
An embodiment of the present application further provides a computer storage medium, where the computer storage medium may store a plurality of instructions, where the instructions are suitable for being loaded by a processor and executing the method steps in the embodiments shown in fig. 1 to 6, and a specific execution process may refer to specific descriptions of the embodiments shown in fig. 1 to 6, which are not described herein again.
The device on which the storage medium is located may be a smart interactive tablet.
Example 6
Please refer to fig. 9, which provides a schematic structural diagram of an intelligent interactive tablet according to an embodiment of the present application. As shown in fig. 9, the smart interaction tablet 1000 may include: at least one processor 1001, at least one network interface 1004, a user interface 1003, memory 1005, at least one communication bus 1002.
Wherein a communication bus 1002 is used to enable connective communication between these components.
The user interface 1003 may include a Display screen (Display) and a Camera (Camera), and the optional user interface 1003 may also include a standard wired interface and a wireless interface.
The network interface 1004 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface), among others.
Processor 1001 may include one or more processing cores, among other things. The processor 1001 connects various parts throughout the smart interaction tablet 1000 using various interfaces and lines, and performs various functions of the smart interaction tablet 1000 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 1005 and invoking data stored in the memory 1005. Alternatively, the processor 1001 may be implemented in at least one hardware form of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 1001 may integrate one or a combination of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs, and the like; the GPU is used for rendering and drawing the content to be displayed on the display screen; the modem is used to handle wireless communications. It is understood that the modem may not be integrated into the processor 1001, but may instead be implemented by a separate chip.
The memory 1005 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). Optionally, the memory 1005 includes a non-transitory computer-readable medium. The memory 1005 may be used to store an instruction, a program, code, a set of codes, or a set of instructions. The memory 1005 may include a stored program area and a stored data area, wherein the stored program area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the various method embodiments described above, and the like; the stored data area may store the data referred to in the above method embodiments. The memory 1005 may optionally be at least one storage device located remotely from the processor 1001. As shown in fig. 9, the memory 1005, as a kind of computer storage medium, may include an operating system, a network communication module, a user interface module, and an operating application of the smart interactive tablet.
In the smart interactive tablet 1000 shown in fig. 9, the user interface 1003 is mainly used to provide an input interface for a user, and obtain data input by the user; and the processor 1001 may be configured to call an operation application of the smart interactive tablet stored in the memory 1005, and specifically perform the following operations:
displaying a multi-person interactive interface; receiving at least two touch operations generated by at least two touch media on the multi-user interactive interface, wherein the touch areas corresponding to the touch media are different; and displaying feedback information corresponding to the at least two touch operations, wherein the at least two touch operations are distinguished through touch areas, and the feedback information corresponding to the at least two touch operations is respectively determined.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the description of each embodiment has its own emphasis, and reference may be made to the related description of other embodiments for parts that are not described in detail in a certain embodiment.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and so forth) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, Phase-change Memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other memory technology, Compact Disc Read-Only Memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a/an ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (14)

1. A control method of an intelligent interactive tablet is characterized by comprising the following steps:
displaying a multi-person interactive interface;
receiving at least two touch operations generated by at least two touch media on the multi-user interactive interface, wherein the touch areas corresponding to the touch media are different;
and displaying feedback information corresponding to the at least two touch operations, wherein the at least two touch operations are distinguished through touch areas, and the feedback information corresponding to the at least two touch operations is respectively determined.
2. The method of claim 1, wherein the multi-user interactive interface comprises a plurality of selectable elements, and wherein after receiving at least two touch operations generated on the multi-user interactive interface by at least two touch media, the method further comprises:
detecting that any of the touch operations hits on the element;
and responding to the touch operation, and displaying an animation effect corresponding to the hit element.
3. The method of claim 2, wherein the element is displayed at a fixed location in the multi-person interactive interface, and wherein the touch operation comprises: and selecting the element and dragging the element to a preset position.
4. The method of claim 2, wherein the element moves in the multi-user interactive interface according to a preset path, and wherein the touch operation comprises: a click operation to select the element or a slide operation to segment the element.
5. The method of claim 1, wherein prior to displaying the multi-person interactive interface, the method further comprises:
displaying an object binding interface, wherein the object binding interface is used for binding an operation object and a touch medium;
receiving operation object identifications corresponding to the at least two touch media, wherein the operation objects and the corresponding touch media are bound based on the operation object identifications.
6. The method of claim 5, wherein displaying feedback information corresponding to the at least two touch operations comprises:
respectively determining an operation object identifier corresponding to each touch operation;
and correspondingly displaying the operation object identification and the feedback information corresponding to the touch operation.
7. A control method of an intelligent interactive tablet is characterized by comprising the following steps:
receiving at least two touch operations generated by at least two touch media on a multi-user interactive interface, wherein the touch areas corresponding to the touch media are different;
and determining a touch medium for generating the touch operation based on the touch area of the touch operation.
8. The method of claim 7, wherein determining a touch medium generating the touch operation based on a touch area of the touch operation comprises:
acquiring the touch area of a target touch point in the touch operation;
and matching the touch area of the target touch point with the touch areas of a plurality of preset touch media, and determining the touch media for generating the touch operation.
9. The method of claim 7, wherein after determining a touch medium that generated the touch operation based on a touch area of the touch operation, the method further comprises:
acquiring feedback information corresponding to the at least two touch operations;
and displaying feedback information corresponding to the at least two touch operations.
10. The method of claim 9, wherein the multi-user interactive interface comprises questions and selectable elements, wherein the elements comprise: a first element for showing a correct answer and a second element for showing a wrong answer, and wherein acquiring the feedback information corresponding to the at least two touch operations comprises:
respectively acquiring the number of first elements hit by the at least two touch operations;
and determining feedback information corresponding to the at least two touch operations according to the number of the first elements hit by the at least two touch operations.
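The hit-counting step of claim 10 can be sketched as follows; the element labels and the two media are illustrative assumptions:

```python
# Hypothetical sketch of claim 10: feedback for each touch operation is
# derived from how many "first elements" (e.g. correct options) it hit.
from typing import Iterable

def count_first_hits(hit_elements: Iterable[str], first_elements: set[str]) -> int:
    """Count how many elements hit by one touch operation belong to
    the set of first elements."""
    return sum(1 for e in hit_elements if e in first_elements)

first = {"A", "C"}  # assumed first elements for one question
scores = {
    "medium_1": count_first_hits(["A", "B"], first),  # hit one first element
    "medium_2": count_first_hits(["A", "C"], first),  # hit two first elements
}
print(scores)
```

The per-medium counts can then be mapped to whatever feedback the interface displays (a score, a ranking, a correct/incorrect mark) for each touch operation.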
11. A control device for an intelligent interactive tablet, characterized in that it comprises:
a first display module, configured to display a multi-user interactive interface;
a receiving module, configured to receive at least two touch operations generated by at least two touch media on the multi-user interactive interface, wherein the touch areas corresponding to the touch media are different;
and a second display module, configured to display feedback information corresponding to the at least two touch operations, wherein the at least two touch operations are distinguished by touch area, and the feedback information corresponding to each of the at least two touch operations is determined respectively.
12. A control device for an intelligent interactive tablet, characterized in that it comprises:
a receiving module, configured to receive at least two touch operations generated by at least two touch media on the multi-user interactive interface, wherein the touch areas corresponding to the touch media are different;
and a determining module, configured to determine, based on the touch area of a touch operation, the touch medium that generated the touch operation.
13. A computer storage medium, characterized in that it stores a plurality of instructions adapted to be loaded by a processor and to perform the method steps of any of claims 1 to 10.
14. An intelligent interactive tablet, comprising: a processor and a memory; wherein the memory stores a computer program adapted to be loaded by the processor and to perform the method steps of any of claims 1 to 10.
CN202110502493.9A 2021-05-08 2021-05-08 Control method and device for intelligent interactive panel Pending CN115390734A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110502493.9A CN115390734A (en) 2021-05-08 2021-05-08 Control method and device for intelligent interactive panel
PCT/CN2022/091583 WO2022237702A1 (en) 2021-05-08 2022-05-08 Control method and device for smart interactive board


Publications (1)

Publication Number Publication Date
CN115390734A 2022-11-25

Family

Family ID: 84028821


Country Status (2)

Country Link
CN (1) CN115390734A (en)
WO (1) WO2022237702A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013149045A (en) * 2012-01-18 2013-08-01 Sharp Corp Touch panel type input device
CN102760033A (en) * 2012-03-19 2012-10-31 联想(北京)有限公司 Electronic device and display processing method thereof
US20150153897A1 (en) * 2013-12-03 2015-06-04 Microsoft Corporation User interface adaptation from an input source identifier change
JP2017041283A (en) * 2016-11-29 2017-02-23 シャープ株式会社 Touch-panel input device
CN109947300A (en) * 2019-03-18 2019-06-28 深圳市康冠商用科技有限公司 A kind of method, apparatus and medium for adjusting infrared touch-control machine and writing color

Also Published As

Publication number Publication date
WO2022237702A1 (en) 2022-11-17

Similar Documents

Publication Publication Date Title
US11833426B2 (en) Virtual object control method and related apparatus
CN110568984A (en) Online teaching method and device, storage medium and electronic equipment
US20160142471A1 (en) Systems and methods for facilitating collaboration among multiple computing devices and an interactive display device
JP6205767B2 (en) Learning support device, learning support method, learning support program, learning support system, and server device
US20090094528A1 (en) User interfaces and uploading of usage information
CN104461318A (en) Touch read method and system based on augmented reality technology
CN109954276B (en) Information processing method, device, medium and electronic equipment in game
CN106843681A (en) The progress control method of touch-control application, device and electronic equipment
CN109697004A (en) Method, apparatus, equipment and storage medium for touch apparatus pen annotation
Zagermann et al. " It's in my other hand!"–Studying the Interplay of Interaction Techniques and Multi-Tablet Activities
CN105575198A (en) Method and device for demonstrating teaching video
CN108958731A (en) A kind of Application Program Interface generation method, device, equipment and storage medium
WO2018000606A1 (en) Virtual-reality interaction interface switching method and electronic device
WO2023241369A1 (en) Question answering method and apparatus, and electronic device
JP2015176079A (en) Learning support system and learning support method
CN115390734A (en) Control method and device for intelligent interactive panel
KR102013368B1 (en) Association mapping game
CN111651102B (en) Online teaching interaction method and device, storage medium and electronic equipment
CN109255997A (en) A kind of electronic teaching material is prepared lessons teaching methods and device
CN111796846B (en) Information updating method, device, terminal equipment and readable storage medium
CN111282264B (en) Virtual object control method and device
CN106371644B (en) Method and device for simultaneously writing by multiple persons on screen
CN113570227A (en) Online education quality evaluation method, system, terminal and storage medium
CN106648432A (en) Control method and control apparatus for large-screen display device
CN111930971A (en) Online teaching interaction method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination