CN114416008B - Multi-screen interaction system and operation method thereof - Google Patents


Info

Publication number
CN114416008B
CN114416008B (application CN202210309711.1A)
Authority
CN
China
Prior art keywords
touch
mobile terminal
parameter
user
determining
Prior art date
Legal status
Active
Application number
CN202210309711.1A
Other languages
Chinese (zh)
Other versions
CN114416008A (en)
Inventor
刘露
李�昊
Current Assignee
Shenzhen Zhangshi Mutual Entertainment Network Co ltd
Original Assignee
Shenzhen Zhangshi Mutual Entertainment Network Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Zhangshi Mutual Entertainment Network Co ltd
Priority to CN202210309711.1A
Publication of CN114416008A
Application granted
Publication of CN114416008B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454: Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The embodiment of the invention discloses a multi-screen interaction system and an operation method based on the system. The system comprises a first main mobile terminal, a second auxiliary mobile terminal and a display terminal which are in communication connection, and the method comprises the following steps: acquiring a first touch operation input by a user through the first main mobile terminal, and determining target course data; acquiring a touch parameter of a second touch operation input by a user through the second auxiliary mobile terminal; acquiring a second operation behavior characteristic from historical behavior data corresponding to the second auxiliary mobile terminal; judging whether the touch parameter matches the second operation behavior characteristic; if so, determining and executing an operation instruction and sending the execution result to the display terminal; if not, calculating the difference characteristic between the touch parameter of the second touch operation and the second operation behavior characteristic, performing difference behavior classification to obtain a target abnormality classification, and sending the target abnormality classification to the first main mobile terminal. By adopting the method and the system, the operation interactivity and accuracy of the multi-screen interaction system can be improved.

Description

Multi-screen interaction system and operation method thereof
Technical Field
The invention relates to the technical field of computers, in particular to a multi-screen interaction system and an operation method thereof.
Background
With the rapid development of computers and the internet, more and more courses are offered online through computer terminals such as smart phones and computers, which improves the convenience of course learning. In particular, with the development of internet-of-things technology and multi-terminal interconnection technology, more and more courses can be interconnected across multiple terminals such as smart phones and smart televisions, enabling interaction among the terminals and making online courses more engaging.
However, the main audience of online courses is students or other young users, who are prone to missed or abnormal operations while learning, and parents or other guardians cannot supervise the learning effect of the course at all times, which results in a poor learning effect.
Disclosure of Invention
Therefore, it is necessary to provide a multi-screen interactive system and an operating method thereof to solve the above problems.
In a first aspect of the present invention, an operation method of a multi-screen interaction system is provided, where the method is based on a multi-screen interaction system including a first main mobile terminal, a second auxiliary mobile terminal, and a display terminal, and the first main mobile terminal, the second auxiliary mobile terminal, and the display terminal are in communication connection;
the method comprises the following steps:
acquiring a first touch operation input by a user through a first main mobile terminal, determining target course data corresponding to the first touch operation, sending the target course data to a second auxiliary mobile terminal and a display terminal, and displaying the target course data on the second auxiliary mobile terminal and the display terminal;
in the process of displaying the target course data on the second secondary mobile terminal, acquiring a second touch operation input by a user through the second secondary mobile terminal, and determining a touch parameter of the second touch operation;
acquiring historical behavior data corresponding to the second secondary mobile terminal, and performing feature extraction on the historical behavior data to acquire corresponding second operation behavior features;
judging whether the touch parameters of the second touch operation are matched with the second operation behavior characteristics;
if so, determining and executing an operation instruction corresponding to the second touch operation, and sending an execution result to a display terminal so as to display the execution result on the display terminal;
if not, calculating the difference characteristic between the touch parameter of the second touch operation and the second operation behavior characteristic, performing difference behavior classification based on the calculated difference characteristic to determine a target abnormality classification corresponding to the second touch operation, and sending the target abnormality classification to the first main mobile terminal so as to remind the user of the first main mobile terminal of the abnormal condition of the current second auxiliary mobile terminal.
In a second aspect of the present invention, a multi-screen interaction system is provided, where the system includes a first primary mobile terminal, a second secondary mobile terminal, and a display terminal, and the first primary mobile terminal, the second secondary mobile terminal, and the display terminal are connected in a communication manner;
the method comprises the steps that a first main mobile terminal obtains a first touch operation input by a user, determines target course data corresponding to the first touch operation, sends the target course data to a second auxiliary mobile terminal and a display terminal, and displays the target course data on the second auxiliary mobile terminal and the display terminal;
in the process of displaying the target course data on the second secondary mobile terminal, the second secondary mobile terminal acquires a second touch operation input by the user through the second secondary mobile terminal and determines a touch parameter of the second touch operation; acquiring historical behavior data corresponding to the second secondary mobile terminal, and performing feature extraction on the historical behavior data to acquire corresponding second operation behavior features; judging whether the touch parameters of the second touch operation are matched with the second operation behavior characteristics; if so, determining and executing an operation instruction corresponding to the second touch operation, and sending an execution result to the display terminal;
displaying an execution result by a display terminal;
if not, the second auxiliary mobile terminal calculates the difference characteristic between the touch parameter of the second touch operation and the second operation behavior characteristic, performs difference behavior classification based on the calculated difference characteristic to determine a target abnormality classification corresponding to the second touch operation, and sends the target abnormality classification to the first main mobile terminal;
and the first main mobile terminal acquires the abnormal condition of the current second auxiliary mobile terminal according to the received target abnormal classification.
The embodiment of the invention has the following beneficial effects:
after the multi-screen interaction system and the operation method of the multi-screen interaction system are adopted, parents, teachers or other guardians can control and assist learning of children or students on the second auxiliary mobile terminal through the first main mobile terminal, and screen projection interaction can be carried out through the display terminal. Specifically, a first touch operation input by a user through a first main mobile terminal is obtained, target course data corresponding to the first touch operation are determined, the target course data are sent to a second auxiliary mobile terminal and a display terminal, and the target course data are displayed on the second auxiliary mobile terminal and the display terminal; in the process of displaying the target course data on the second secondary mobile terminal, acquiring a second touch operation input by a user through the second secondary mobile terminal, and determining a touch parameter of the second touch operation; acquiring historical behavior data corresponding to the second secondary mobile terminal, and performing feature extraction on the historical behavior data to acquire corresponding second operation behavior features; judging whether the touch parameters of the second touch operation are matched with the second operation behavior characteristics; if so, determining an operation instruction corresponding to the second touch operation and executing, and sending an execution result to a display terminal so as to display the execution result on the display terminal; if not, calculating whether the touch parameter of the second touch operation is different from the second operation behavior characteristic or not, and performing different behavior classification based on the calculated different characteristic to determine that a target abnormity classification corresponding to the second touch operation is sent to the first main mobile terminal so as to remind a user of the first main mobile terminal of the current abnormity condition of the second auxiliary mobile terminal. That is to say, for the user of the second secondary mobile terminal, the multi-screen interaction system and the operation method thereof can be used for assisting the touch operation, so as to improve the accuracy of input; for the user of the first main mobile terminal, the target course displayed on the second auxiliary mobile terminal can be controlled, and the corresponding situation can be timely known and intervened under the condition that the user of the second auxiliary mobile terminal is detected to be abnormal, so that the learning monitoring effectiveness of the multi-screen interaction system is improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the embodiments or the prior art descriptions will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
Wherein:
FIG. 1 is a schematic diagram illustrating an exemplary multi-screen interactive system;
FIG. 2 is a flowchart illustrating a method of operating a multi-screen interaction system according to an embodiment;
fig. 3 is a schematic structural diagram of a computer device for executing the operation method of the multi-screen interaction system in one embodiment.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without inventive step based on the embodiments of the present invention, are within the scope of protection of the present invention.
In this embodiment, a multi-screen interactive system and a control method for interactive operation based on the system are provided.
Specifically, referring to fig. 1, fig. 1 is a schematic diagram illustrating a multi-screen interaction system. The multi-screen interaction system 100 includes a first main mobile terminal 201, a second sub-mobile terminal 202, and a display terminal 300 connected to the first main mobile terminal 201 and the second sub-mobile terminal 202.
The first main mobile terminal 201, the second auxiliary mobile terminal 202 and the display terminal 300 can be in communication connection through bluetooth, WIFI or NFC, and can perform data communication with each other. For example, the first main mobile terminal 201 and the second sub mobile terminal 202 transmit the interactive lesson data displayed and played on the mobile terminals to the display terminal 300 for displaying, so as to enable multi-screen display and multi-screen interaction of the interactive lesson data. In the present embodiment, the control of the display terminal 300 is realized by the first main mobile terminal 201 and the second sub-mobile terminal 202. In a specific embodiment, the first primary mobile terminal 201 and the second secondary mobile terminal 202 are portable mobile devices, such as smart phones or tablet computers, and a user can hold the first primary mobile terminal 201 and the second secondary mobile terminal 202 and operate on the mobile terminals; the display terminal 300 may be a display terminal device such as a television, a display screen, a projector, and the like, and the display terminal 300 may receive data sent by the first main mobile terminal 201 and the second sub mobile terminal 202 and display the data on the display terminal 300.
In a specific embodiment, a user may perform course learning through the multi-screen interaction system, for example, an interactive course, the user starts the course through APP or an applet preset on the first main mobile terminal 201 and the second sub mobile terminal 202, and then sends interactive course data corresponding to the course to the display terminal 300 through a multi-screen sharing operation of the user, and then the course on the first main mobile terminal 201 and the second sub mobile terminal 202 may be played on the display terminal 300, and the user may operate on the first main mobile terminal 201 and the second sub mobile terminal 202 to implement interaction of the course.
In a specific embodiment, the first main mobile terminal 201 may be a mobile terminal used by a parent, and the second sub-mobile terminal 202 may be a mobile terminal used by a child, and the parent, a teacher or other guardian may control the content displayed on the second sub-mobile terminal 202 through the first main mobile terminal 201, for example, which lesson is specifically displayed, so that a user such as a child may learn a specified lesson on the second sub-mobile terminal 202 under the guidance of the parent, the teacher or the other guardian, thereby implementing effective monitoring on the lesson learning effect.
Further, referring to fig. 2, fig. 2 is a schematic diagram illustrating an operation method of the multi-screen interaction system.
Specifically, the operation method of the multi-screen interaction system includes the steps shown in fig. 2:
step S102: the method comprises the steps of obtaining a first touch operation input by a user through a first main mobile terminal, determining target course data corresponding to the first touch operation, sending the target course data to a second auxiliary mobile terminal and a display terminal, and displaying the target course data on the second auxiliary mobile terminal and the display terminal.
The user of the first main mobile terminal is a parent, teacher, or other guardian, who may monitor and control the online lessons of the user of the second secondary mobile terminal. In this embodiment, the user of the first main mobile terminal can determine, through the first main mobile terminal, which lesson (the target course data) should be learned on the second secondary mobile terminal. Specifically, the user selects one of the selectable online courses through the first main mobile terminal, that is, inputs a first touch operation for selecting the corresponding course; the course corresponding to the first touch operation is then determined as the target course, the target course data corresponding to the target course is acquired, and the target course data is sent to the second secondary mobile terminal, so that the user of the second secondary mobile terminal can learn the target course on that terminal. The target course data can further be sent to the display terminal, so that it is displayed there by screen projection and the user can enjoy the experience of screen projection and multi-screen interaction.
Step S104: and in the process of displaying the target course data on the second auxiliary mobile terminal, acquiring a second touch operation input by the user through the second auxiliary mobile terminal, and determining a touch parameter of the second touch operation.
The user can learn independently while studying the target course on the second auxiliary mobile terminal; the second auxiliary mobile terminal monitors the user's learning situation and informs the first main mobile terminal in case of an abnormality, so that the parent, teacher or other guardian using the first main mobile terminal can promptly learn about the situation of the user of the second auxiliary mobile terminal and intervene or correct it in time.
Specifically, under the condition that the user learns the courses through the multi-screen interactive system, in order to achieve the interaction of the user, the user can input through the second secondary mobile terminal to achieve the learning and interaction of the courses. Specifically, the user may input a second touch operation through the second secondary mobile terminal to implement an operation on the content displayed on the second secondary mobile terminal. In order to improve the accuracy of the user operation, it is first required to determine whether the current time is in the preset interactive operation time, and only under the condition that the current time is in the preset interactive operation time, the second touch operation input by the user is continuously analyzed and executed, otherwise, under the condition that the current time is not in the preset interactive operation time, the second touch operation input by the user is ignored. The preset interactive operation time is determined according to the corresponding courses, and each course is preset with the corresponding interactive operation time (for example, the answering time corresponding to the course). In this embodiment, each course determines, in advance, a corresponding interactive operation time according to a specific content of the course, where the interactive operation time is determined according to a playing progress of the course.
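The check against the preset interactive operation time is not spelled out in the patent; the following is only an illustrative Python sketch, in which the per-course interactive windows and the current playing progress are hypothetical inputs:
```python
from typing import List, Tuple

def in_interactive_window(progress_s: float,
                          windows: List[Tuple[float, float]]) -> bool:
    """Return True if the current playing progress (in seconds) falls inside
    any preset interactive operation window of the course."""
    return any(start <= progress_s <= end for start, end in windows)

# Hypothetical example: a course with two answering windows
course_windows = [(120.0, 180.0), (600.0, 660.0)]
if not in_interactive_window(progress_s=130.0, windows=course_windows):
    pass  # ignore the second touch operation, as described above
```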
Further, operation parameters corresponding to the second touch operation input by the user are acquired, where the operation parameters include touch parameters and a touch area. The touch parameters include touch strength, touch trajectory and touch duration, and characterize the input of the second touch operation; the touch area is the area covered by the touch trajectory on the current operation interface of the second secondary mobile terminal, determined according to the touch trajectory of the second touch operation.
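As a purely illustrative sketch of how such operation parameters could be represented (the field names are assumptions, and the touch area is approximated here by the bounding box of the trajectory):
```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) in screen coordinates

@dataclass
class TouchParameters:
    strength: float          # touch strength (e.g. normalized pressure)
    trajectory: List[Point]  # sampled touch trajectory
    duration_s: float        # touch duration in seconds

    def touch_area(self) -> Tuple[float, float, float, float]:
        """Approximate the touch area as the bounding box (x0, y0, x1, y1)
        of the trajectory on the current operation interface."""
        xs = [p[0] for p in self.trajectory]
        ys = [p[1] for p in self.trajectory]
        return (min(xs), min(ys), max(xs), max(ys))
```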
Step S106: and acquiring historical behavior data corresponding to the second secondary mobile terminal, performing feature extraction on the historical behavior data, and acquiring corresponding second operation behavior features.
Generally, touch parameters corresponding to touch operations input by the same user change within a certain range, and do not change greatly in a short time, so that behavior characteristics corresponding to the touch operations in the historical behavior data of the user can be determined according to the historical behavior data, specifically, the historical behavior data of the user of the second secondary mobile terminal is obtained, the touch parameters corresponding to the touch operations in the user behavior data are determined, and feature extraction is performed based on a specific value range of the touch parameters, so as to determine corresponding second operation behavior characteristics.
In a specific embodiment, the second operation behavior characteristic may be determined according to a value range corresponding to a touch parameter corresponding to the touch operation input by the user, for example, a reference range (touch parameter reference range) of a touch parameter value.
Further, in this embodiment, the user behavior data for determining the touch parameter reference range needs to be further determined, for example, the touch parameter reference range is determined only according to effective touch operations in the user behavior data, or the touch parameter reference range is determined according to touch operations in the user behavior data within a preset time period, or the touch parameter reference range is determined according to touch operations in the user behavior data with a matching degree within a preset range, which is further described later.
When the touch parameter of the second touch operation meets the preset touch parameter reference range, it is judged that the second touch operation currently input by the user is valid, and further operations are performed; otherwise, if the touch parameter does not meet the touch parameter reference range, it is judged that the currently input second touch operation is invalid, the corresponding second touch operation is ignored, prompt information indicating an invalid input is generated, and the prompt information is displayed on the display terminal and the second secondary mobile terminal to remind the user to input the second touch operation again.
Further, in an embodiment, according to the historical behavior data of the user, a plurality of second touch operations of the user in the historical behavior data are obtained, a touch parameter corresponding to each second touch operation is determined, and feature extraction is performed on each touch parameter to extract a feature value of each touch parameter. And then, based on the statistical learning model, inputting the characteristic values corresponding to the touch parameters corresponding to the plurality of second touch operations into the statistical learning model, and performing statistical analysis and clustering on the values or characteristic values corresponding to the touch parameters of each type to determine the distribution rule corresponding to the touch parameters corresponding to the valid/invalid second touch operations input by the user.
After the distribution rule of the characteristic values of the touch parameters of the second touch operations is determined, the distributions of the characteristic values for valid and invalid touch operations can be further distinguished, so that the touch parameter reference range corresponding to valid touch operations can be determined accordingly. Further, in an optional embodiment, the touch parameters corresponding to valid touch operations may overlap with those corresponding to invalid touch operations; in this embodiment, the validity boundary is therefore not used directly as the threshold of the touch parameter reference range, and further processing of the distribution rule of the characteristic values of the touch parameters of the second touch operations is needed. Specifically, the distribution rule corresponding to the characteristic values is input to a preset deep learning model to obtain the touch parameter reference range and thereby determine the corresponding second operation behavior characteristic.
In a specific embodiment, according to the distribution rule of the touch parameters, an area in which the distribution of the touch parameters is concentrated is selected, and the value of the touch parameter corresponding to the area is used as the reference range of the touch parameter.
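One possible, non-authoritative way to select such a concentrated region is to keep a central inter-quantile interval of the historical values; the keep fraction and the sample values below are assumptions:
```python
from typing import List, Tuple

def reference_range(values: List[float], keep: float = 0.8) -> Tuple[float, float]:
    """Take the most concentrated `keep` fraction of historical touch-parameter
    values as the reference range (here: a symmetric inter-quantile interval)."""
    ordered = sorted(values)
    n = len(ordered)
    cut = int(n * (1.0 - keep) / 2.0)
    return ordered[cut], ordered[max(cut, n - cut - 1)]

# Hypothetical historical touch-duration samples (seconds)
durations = [0.12, 0.15, 0.11, 0.40, 0.14, 0.13, 0.16, 0.90, 0.14, 0.15]
low, high = reference_range(durations)
is_valid = low <= 0.17 <= high  # validity check for a new second touch operation
```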
In another embodiment, the distribution rule corresponding to the feature value is input to a preset deep learning model for learning, and then, in this step, the feature value of the touch parameter of the currently received touch operation is directly input to the deep learning model, and the deep learning model can output a classification result (including yes or no) whether the touch parameter of the touch operation input by the user satisfies the touch parameter reference range.
In a case that the touch parameter of the second touch operation satisfies the touch parameter reference range, the current touch operation may be considered to be valid input of the user, and in this case, it needs to be further determined whether the touch operation is valid input of a control on an operation interface (a current operation interface displayed on the mobile terminal) corresponding to the course.
Further, it is judged whether the touch parameter of the second touch operation meets the parameter reference range in the second operation behavior characteristic; if so, it is determined that the touch parameter of the second touch operation matches the second operation behavior characteristic, and if not, it is determined that the touch parameter of the second touch operation does not match the second operation behavior characteristic.
Step S108: judging whether the touch parameters of the second touch operation are matched with the second operation behavior characteristics;
if yes, go to step S110: determining and executing an operation instruction corresponding to the second touch operation, and sending an execution result to a display terminal so as to display the execution result on the display terminal;
if not, go to step S112: and calculating the difference characteristic between the touch parameter of the second touch operation and the second operation behavior characteristic, and performing difference behavior classification based on the calculated difference characteristic to determine that the target abnormity classification corresponding to the second touch operation is sent to the first main mobile terminal so as to remind the user of the first main mobile terminal of the abnormal condition of the current second auxiliary mobile terminal.
Determining at least one operable control in a current operation interface in the target course data displayed on the second secondary mobile terminal, and determining at least one to-be-selected control corresponding to the second touch operation in the at least one operable control; determining a first touch offset parameter of the user according to the second operation behavior data; determining the matching degree between the second touch operation and each control to be selected according to the first touch offset parameter and a second touch offset parameter corresponding to each control to be selected set by the system, and determining a target control in at least one control to be selected according to the matching degree; and determining an operation instruction corresponding to the target control as an operation instruction corresponding to the second touch operation.
In this step, in the case of detecting the second touch operation input by the user, one or more operable controls are determined in the current operation interface, and then, a control to be selected is determined in the one or more operable controls according to the second touch operation input by the user. In a specific embodiment, the control to be selected may be determined according to the touch area of the second touch operation, where a distance value between the touch range corresponding to the control to be selected and the touch area of the second touch operation is smaller than a preset value, or an area of the overlap area is greater than or equal to the preset value. That is to say, in this step, one or more candidate controls are determined in the one or more operable controls according to a distance between a touchable range (where, the touchable range may be a region where the control icon is located) corresponding to the one or more operable controls and a touch region of the second touch operation or an area of an overlapping region, where the one or more candidate controls are controls that may correspond to the second touch operation input by the user.
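Under assumed rectangular representations of both the control touch ranges and the touch area (a modelling choice, not mandated by the patent), the candidate-control filtering could be sketched as:
```python
from typing import List, Tuple

Rect = Tuple[float, float, float, float]  # (x0, y0, x1, y1)

def overlap_area(a: Rect, b: Rect) -> float:
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(w, 0.0) * max(h, 0.0)

def center_distance(a: Rect, b: Rect) -> float:
    ax, ay = (a[0] + a[2]) / 2, (a[1] + a[3]) / 2
    bx, by = (b[0] + b[2]) / 2, (b[1] + b[3]) / 2
    return ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5

def candidate_controls(operable: List[Tuple[str, Rect]], touch: Rect,
                       max_dist: float, min_overlap: float) -> List[str]:
    """Keep controls whose touch range is close enough to, or overlaps enough
    with, the touch area of the second touch operation."""
    return [name for name, rect in operable
            if center_distance(rect, touch) < max_dist
            or overlap_area(rect, touch) >= min_overlap]
```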
In this embodiment, because the user corresponding to the course may be a pupil with a small age, the accuracy of the input touch operation may be insufficient, and the touch area has a certain deviation, in this embodiment, it is further required to further determine the target control in one or more candidate controls that may correspond to the second touch operation.
Specifically, according to the historical behavior data of the user, the offset condition corresponding to the second touch operation input by the user before is determined, and the first touch offset parameter is determined based on the offset condition.
In addition, in this embodiment, for each control, the system determines, according to the position, the area, and the like set by the control, a possible offset condition corresponding to the control, that is, the second touch offset parameter.
And then, according to the first touch offset parameter and the second touch offset parameter, determining a matching degree between a second touch operation input by the user and each control to be selected, wherein the matching degree is used for identifying the possibility that the second touch operation corresponds to the corresponding control to be selected. Further, a target control can be further determined in the multiple controls to be selected according to the matching degree.
In a specific embodiment, the matching degree between the second touch operation and the target control is greater than or equal to the matching degree between the second touch operation and the other candidate controls, that is, the candidate control with the highest matching degree is taken as the target control.
Further, in this embodiment, the matching degree of the target control must be greater than a preset matching degree value, and the second touch operation is considered valid only in that case. Otherwise, if the matching degree between every control to be selected and the second touch operation is lower than the preset matching degree value, it is determined that the currently input second touch operation is invalid, prompt information indicating an invalid input is generated, the prompt information is displayed on the second secondary mobile terminal and the display terminal, and the user is reminded to input again.
In a specific embodiment, the first touch offset parameter represents an offset degree between a center of a touch area of a second touch operation input by the user in the historical behavior data of the user and a center of a touch area of a control corresponding to the corresponding target control, and specifically, the first touch offset parameter of the user may be determined according to offset data of the second touch operation input by the user to the control in the historical behavior data. Determining all/effective second touch operations input by the user in the historical behavior data of the user, determining touch parameters corresponding to the second touch operations and control touch areas corresponding to target controls, and calculating the offset condition corresponding to each second touch operation based on the touch parameters and the control touch areas, wherein the offset distance between the center corresponding to the touch area of the second touch operation and the center of the control touch area corresponding to the target control is included; in addition, it is also necessary to calculate the distribution of touch parameters including parameters such as touch pressure, and then determine the offset condition corresponding to each touch parameter based on the distribution of touch parameters, so as to determine the corresponding offset parameter (first touch offset parameter).
In a specific embodiment, the touch parameters corresponding to all the second touch operations and the touch areas corresponding to the corresponding target controls are input into a preset deep learning model, and the offset parameters corresponding to the touch areas in the touch parameters are determined according to the offset condition between the touch areas corresponding to the second touch operations and the touch areas corresponding to the target controls by the deep learning model; learning other touch parameters except the touch area in the touch parameters through a deep learning model to determine offset parameters corresponding to the touch parameters; therefore, the first touch offset parameter corresponding to each touch parameter can be determined.
In a specific embodiment, for the touch area included in the touch parameters, the touch area in the touch parameter of the second touch operation and the control touch area corresponding to the target control of that second touch operation are determined, the offset characteristic values between the two touch areas are calculated, and the parameter distribution corresponding to the offset characteristic values of the touch area is then determined based on a preset statistical learning model; the offset characteristic values are divided into a plurality of parameter intervals based on the determined parameter distribution, a reference value and a corresponding distribution probability value are determined for each parameter interval, and the first touch offset parameter corresponding to the touch area is then calculated based on the reference value and the corresponding distribution probability value of each parameter interval. The first touch offset parameter corresponding to the touch area may be obtained through a weighted calculation based on each interval's reference value (here, the average value of the interval or the parameter value with the largest probability) and a corresponding weighting value (which may be calculated from the distribution probability value of the parameter interval). The weighting value of a parameter interval may be directly equal to its distribution probability value, or may be inversely related to it, for example the reciprocal of the distribution probability value.
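The weighted calculation described above might look like the following sketch; the parameter intervals, their reference values and distribution probabilities are hypothetical, and the patent leaves the weighting open (probability weights or inverse-probability weights):
```python
from typing import List, Tuple

def first_touch_offset(intervals: List[Tuple[float, float]],
                       inverse: bool = False) -> float:
    """Weighted combination of per-interval reference offsets.

    Each entry is (reference_value, distribution_probability). With
    inverse=False the weight equals the probability; with inverse=True it is
    the reciprocal of the probability, as the description allows.
    """
    weights = [1.0 / p if inverse else p for _, p in intervals]
    total = sum(weights)
    return sum(ref * w for (ref, _), w in zip(intervals, weights)) / total

# Hypothetical per-interval offsets: (reference offset in px, probability)
offset_intervals = [(2.0, 0.6), (6.0, 0.3), (12.0, 0.1)]
first_offset = first_touch_offset(offset_intervals)  # 4.2 px with probability weights
```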
In another embodiment, for other touch parameters, a similar method may be used to determine the first touch offset parameter, but here, the parameter distribution may be calculated by directly using the characteristic value corresponding to the touch parameter, and the remaining calculation method is the same as the calculation method of the characteristic value corresponding to the touch interval, which is not described herein again.
Further, in an embodiment, a second touch offset parameter corresponding to the current operation interface is determined according to the allowed offset degree of the second touch operation preset by the system, where the second touch offset parameter represents the allowed offset degree of the touch areas corresponding to the at least one operable control in the current operation interface. The second touch offset parameters corresponding to the individual operable controls may be the same or different. For example, in an optional embodiment, the second touch offset parameter corresponding to each control is determined according to the shape and size of the control touch area corresponding to the control and its position in the current operation interface, also taking into account how frequently it is clicked.
Specifically, characteristic values are determined for the shape and size of the control's touch area, its position in the current operation interface, and the frequency with which it is touched; the second touch offset parameter corresponding to each control is then determined according to these characteristic values and the preset weighting coefficient corresponding to each characteristic value. In a specific embodiment, the characteristic values are input into a preset deep learning model, and the second touch offset parameter is determined by the deep learning model.
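A minimal sketch of the weighted-coefficient variant (the deep-learning variant is omitted); the feature values and weighting coefficients are assumptions, not values given by the patent:
```python
def second_touch_offset(shape_size: float, position: float,
                        click_frequency: float,
                        weights=(0.5, 0.3, 0.2)) -> float:
    """Combine the control's shape/size, its position on the current operation
    interface, and how often it is clicked into one allowed-offset value."""
    features = (shape_size, position, click_frequency)
    return sum(f * w for f, w in zip(features, weights))

# Hypothetical normalized feature values for one candidate control
allowed_offset = second_touch_offset(shape_size=0.8, position=0.6, click_frequency=0.4)
```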
Further, in an embodiment, in order to determine which control corresponds to the second touch operation input by the user, in this embodiment, a matching degree between each of the controls to be selected and the second touch operation input by the user needs to be calculated, and then a target control is determined in the multiple controls to be selected according to the calculated matching degree.
Specifically, the first touch offset parameter and the second touch offset parameter represent, respectively, the offset exhibited in the historical behavior data of the user and the offset allowed by the system for the second touch operation; together with the touch area corresponding to each control to be selected, they are used to calculate the matching degree between the touch area corresponding to the second touch operation input by the user and the control touch area corresponding to each control to be selected. In a specific embodiment, the matching degree is the degree of coincidence between the two areas.
In specific execution, an extended touch area corresponding to each control to be selected is calculated according to the first touch offset parameter and the second touch offset parameter, where the extended touch area is obtained by extending the control touch area corresponding to the control to be selected according to the first touch offset parameter and the second touch offset parameter, the extension being the offset area allowed for that control touch area. The overlapping area between the extended touch area and the touch area corresponding to the input second touch operation is then calculated, and the matching degree between the second touch operation and the control to be selected is determined according to the ratio of the area of the overlapping region to the area of the touch area, where the matching degree is a value in the interval [0, 1].
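A sketch of this computation under the assumed rectangle representation; the names and the clamping to [0, 1] are illustrative choices, not prescribed by the patent:
```python
from typing import Tuple

Rect = Tuple[float, float, float, float]  # (x0, y0, x1, y1)

def _overlap(a: Rect, b: Rect) -> float:
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(w, 0.0) * max(h, 0.0)

def matching_degree(control_rect: Rect, touch_rect: Rect,
                    offset1: float, offset2: float) -> float:
    """Extend the control touch area by the allowed offsets, then take the
    ratio of its overlap with the touch area to the area of the touch area."""
    pad = offset1 + offset2
    extended = (control_rect[0] - pad, control_rect[1] - pad,
                control_rect[2] + pad, control_rect[3] + pad)
    touch_area = (touch_rect[2] - touch_rect[0]) * (touch_rect[3] - touch_rect[1])
    if touch_area <= 0.0:
        return 0.0
    return min(_overlap(extended, touch_rect) / touch_area, 1.0)
```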
Further, in another embodiment, the overlapping area between the touch area corresponding to each control to be selected and the touch area corresponding to the input second touch operation is calculated first; the ratio between the area of the overlapping region and the area of the touch area is then corrected according to a preset first correction function based on the offset allowed by the first touch offset parameter, and a second correction is applied to the once-corrected ratio according to a preset second correction function based on the second touch offset parameter, so as to obtain the matching degree between the control to be selected and the touch area corresponding to the input second touch operation.
In a specific embodiment, according to the first touch offset parameter and the first correction function, the ratio is used as the independent variable to calculate a corresponding corrected value, completing the first correction of the ratio; then, according to the second touch offset parameter and the second correction function, the once-corrected ratio is used as the independent variable to calculate a corresponding corrected value, completing the second correction and yielding the matching degree between the control to be selected and the touch area corresponding to the input second touch operation.
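The concrete correction functions are not given in the patent; the sketch below assumes simple damping forms purely for illustration:
```python
import math

def first_correction(ratio: float, first_offset: float) -> float:
    """First correction: relax the raw overlap ratio in proportion to the
    offset historically exhibited by this user (assumed exponential form)."""
    return 1.0 - (1.0 - ratio) * math.exp(-first_offset)

def second_correction(ratio: float, second_offset: float) -> float:
    """Second correction: apply the offset allowed by the system for this
    control, then clamp the result to [0, 1] as the final matching degree."""
    return max(0.0, min(1.0, ratio + second_offset * (1.0 - ratio)))

raw_ratio = 0.55  # overlap area / touch area (hypothetical)
final_matching_degree = second_correction(first_correction(raw_ratio, 0.3), 0.2)
```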
The matching degree indicates the likelihood that the second touch operation input by the user is directed at the corresponding control to be selected: the higher the matching degree, the more likely the second touch operation is directed at that control, and the lower the matching degree, the less likely it is. Generally speaking, a higher matching degree between the second touch operation and a control is desirable, since it indicates higher input accuracy and a lower possibility of misoperation. Further, in this embodiment, the current input accuracy corresponding to the input second touch operation may be determined according to the matching degree between each control to be selected and the input second touch operation; meanwhile, the historical input accuracy corresponding to the historical behavior data of the user is determined, and the difference between the current input accuracy and the historical input accuracy is taken as the accuracy difference value. The larger the accuracy difference value, the more likely it is that the currently input second touch operation is a misoperation. Specifically, it is further judged whether the accuracy difference value is less than or equal to a preset difference threshold; if so, the second touch operation currently input by the user is consistent with the historical behavior data of the user, and the step of taking the operation instruction corresponding to the target control as the touch operation request corresponding to the second touch operation can be performed; otherwise, the second touch operation input by the user is inconsistent with the historical behavior data and may be an erroneous operation. In that case, the difference interval corresponding to the accuracy difference value is determined, an accuracy reminder message is determined according to the difference interval and sent to the display terminal, and the reminder message is displayed on the display terminal together with a preset animation effect. Different difference intervals correspond to different reminder messages: a large difference interval indicates that the user probably only made a misoperation, while a small difference interval indicates that the user's operation accuracy is merely insufficient and can be adjusted under the prompt. The user is thus reminded to input again and can know whether the input interactive operation took effect, which improves the operation effectiveness of the multi-screen interaction system.
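An illustrative sketch of the accuracy-difference check; the threshold, the interval boundary and the reminder messages are hypothetical:
```python
def precision_check(current_accuracy: float, historical_accuracy: float,
                    diff_threshold: float = 0.15) -> str:
    """Compare the current input accuracy with the historical one and decide
    whether to execute the operation or to generate a reminder message."""
    diff = abs(current_accuracy - historical_accuracy)
    if diff <= diff_threshold:
        return "execute"                                  # consistent with history
    if diff > 0.5:
        return "remind: possible misoperation"            # large difference interval
    return "remind: insufficient operation accuracy"      # small difference interval
```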
After determining which target control corresponds to the second touch operation input by the user, the operation instruction corresponding to the target control can be determined according to the second touch operation, the operation instruction is used as a touch operation request corresponding to the second touch operation, and then the touch operation request is executed. In this step, the touch operation request is executed on the second secondary mobile terminal, and then the execution result is sent to the display terminal and displayed. Or sending the second touch operation request to a display terminal so that the multi-screen interaction system executes the touch operation request, and then sending the execution result to the display terminal for displaying on the display terminal.
Further, if the touch parameter of the second touch operation input by the user does not match the second operation behavior characteristic, the user is currently performing a wrong or abnormal operation. It then needs to be further determined which of these cases the user's input falls into.
Specifically, the difference characteristic between the touch parameter of the second touch operation and the second operation behavior characteristic is calculated, and the difference behavior classification is performed based on the calculated difference characteristic to determine a target abnormality classification corresponding to the second touch operation, where the target abnormality classification identifies that the current operation of the user is abnormal, and further a prompt message related to the target abnormality classification is obtained to determine whether the prompt message is sent to the first primary mobile terminal to remind the user of the first primary mobile terminal of the abnormal condition of the current secondary mobile terminal or only remind the user of re-inputting on the second secondary mobile terminal.
In a specific embodiment, the difference characteristic may be classified based on a preset artificial intelligence classification network to obtain the target abnormality classification corresponding to the difference characteristic. Here, the difference characteristic between the touch parameter of the second touch operation and the second operation behavior characteristic is input into the preset artificial intelligence classification network, and the network classifies the difference characteristic; specifically, the confidence of the difference characteristic under each abnormality classification label is determined based on the preset artificial intelligence classification network, and the target abnormality classification corresponding to the difference characteristic is obtained from these confidences, for example by selecting the abnormality classification label with the highest confidence as the target abnormality classification.
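Illustratively, once the classification network has produced a confidence per abnormality label (the network itself is not specified here), selecting the target abnormality classification reduces to an argmax; the labels below are hypothetical:
```python
from typing import Dict

def target_abnormality(confidences: Dict[str, float]) -> str:
    """Select the abnormality classification label with the highest confidence
    output by the preset classification network."""
    return max(confidences, key=confidences.get)

# Hypothetical confidences produced for one difference-characteristic vector
label = target_abnormality({"misoperation": 0.12,
                            "random tapping": 0.71,
                            "device handed to another user": 0.17})
```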
Further, in one embodiment, in order to monitor the user of the second secondary mobile terminal, the user identity needs to be verified and authenticated in real time or periodically during online course learning. Specifically, image information of the user is acquired through a camera device on the second secondary mobile terminal, the image information including the face information of the user; it is then judged, according to the acquired image information, whether the user corresponding to the second secondary mobile terminal has the operation authority. That is, the face information of the user is obtained and matched against a preset face sample to determine whether the user is the expected user and has the corresponding operation authority.
In another embodiment, the acquired image information may also be sent to the first primary mobile terminal, so that the user of the first primary mobile terminal determines whether the user corresponding to the second secondary mobile terminal has the operation authority, inputs the authentication result through the first primary mobile terminal, and sends the authentication result to the second secondary mobile terminal; the second secondary mobile terminal then determines whether the user has the operation authority according to the received authentication result. That is to say, the determination of the user identity is performed here by the first primary mobile terminal, so that its user can keep track of the learning state at any time.
Further, in the process of determining the user's authority, because the first primary mobile terminal and the second secondary mobile terminal may be allowed to operate on the online course differently, in this embodiment the authentication of the user of the second secondary mobile terminal further requires determining the operation authority level of that user and determining, according to the operation authority level, the at least one operable control in the current operation interface of the target course data displayed on the second secondary mobile terminal; this also reduces the amount of computation involving controls in the target-control determination process.
In the normal learning state, the time a user spends on any learning node is limited, so in this embodiment the user's learning situation at each learning node can be monitored to determine whether the user is in a normal learning state. Specifically, the allowed reaction duration corresponding to the current course progress is determined according to the target course data; the waiting duration of the user's input on the second auxiliary mobile terminal is determined, the waiting duration being the time elapsed since the user last input a touch operation through the second auxiliary mobile terminal; it is then judged whether the waiting duration is greater than or equal to the preset allowed reaction duration, and if so, preset prompt information is generated and sent to the first main mobile terminal to prompt the user of the first main mobile terminal that the second auxiliary mobile terminal may be in an abnormal condition. That is, if the user stays on a certain node for too long, the user may be slack or inattentive, and the user of the first main mobile terminal needs to intervene or pay attention, which improves the effectiveness of monitoring a child's online course learning.
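A minimal sketch of the waiting-duration check, with hypothetical timestamps and allowed reaction duration:
```python
import time
from typing import Optional

def check_waiting_time(last_touch_ts: float, allowed_reaction_s: float,
                       now: Optional[float] = None) -> bool:
    """Return True (i.e. notify the first main mobile terminal) when the time
    since the last touch operation reaches the allowed reaction duration for
    the current course progress."""
    now = time.time() if now is None else now
    return (now - last_touch_ts) >= allowed_reaction_s

if check_waiting_time(last_touch_ts=1_000.0, allowed_reaction_s=45.0, now=1_050.0):
    pass  # send the preset prompt information to the first main mobile terminal
```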
After the multi-screen interaction system and the operation method of the multi-screen interaction system are adopted, parents, teachers or other guardians can control and assist the learning of children or students on the second auxiliary mobile terminal through the first main mobile terminal, and screen-projection interaction can be carried out through the display terminal. Specifically, a first touch operation input by a user through the first main mobile terminal is acquired, target course data corresponding to the first touch operation is determined, the target course data is sent to the second auxiliary mobile terminal and the display terminal, and the target course data is displayed on the second auxiliary mobile terminal and the display terminal; in the process of displaying the target course data on the second auxiliary mobile terminal, a second touch operation input by a user through the second auxiliary mobile terminal is acquired, and a touch parameter of the second touch operation is determined; historical behavior data corresponding to the second auxiliary mobile terminal is acquired, and feature extraction is performed on the historical behavior data to acquire a corresponding second operation behavior characteristic; whether the touch parameter of the second touch operation matches the second operation behavior characteristic is judged; if so, an operation instruction corresponding to the second touch operation is determined and executed, and the execution result is sent to the display terminal so as to display the execution result on the display terminal; if not, the difference characteristic between the touch parameter of the second touch operation and the second operation behavior characteristic is calculated, and difference behavior classification is performed based on the calculated difference characteristic to determine a target abnormality classification corresponding to the second touch operation, which is sent to the first main mobile terminal so as to remind the user of the first main mobile terminal of the current abnormal condition of the second auxiliary mobile terminal. That is to say, for the user of the second auxiliary mobile terminal, the multi-screen interaction system and the operation method thereof can assist the touch operation, thereby improving the accuracy of input; for the user of the first main mobile terminal, the target course displayed on the second auxiliary mobile terminal can be controlled, and when an abnormality of the user of the second auxiliary mobile terminal is detected, the corresponding situation can be learned of and intervened in promptly, thereby improving the learning-monitoring effectiveness of the multi-screen interaction system.
Fig. 3 illustrates an internal structural diagram of a computer device implementing the operation method of the multi-screen interaction system in one embodiment. As shown in fig. 3, the computer device includes a processor, a memory, and a network interface connected by a system bus. Wherein the memory includes a non-volatile storage medium and an internal memory. The non-volatile storage medium of the computer device stores an operating system and may also store a computer program which, when executed by the processor, causes the processor to carry out the above method. The internal memory may also have stored therein a computer program which, when executed by the processor, causes the processor to perform the method described above. Those skilled in the art will appreciate that the architecture shown in fig. 3 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computing devices to which the disclosed aspects apply, as particular computing devices may include more or less components than those shown, or may combine certain components, or have a different arrangement of components.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, database or other medium used in the embodiments provided herein can include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), Rambus Direct RAM (RDRAM), Direct Rambus Dynamic RAM (DRDRAM), and Rambus Dynamic RAM (RDRAM).
The technical features of the above embodiments can be combined arbitrarily. For the sake of brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction in the combination of these technical features, such combinations should be considered within the scope of this specification.
The above-mentioned embodiments merely express several embodiments of the present application, and their description is specific and detailed, but they should not therefore be construed as limiting the scope of the present application. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, and all of these fall within the protection scope of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (8)

1. An operation method of a multi-screen interaction system is characterized in that the method is based on the multi-screen interaction system comprising a first main mobile terminal, a second auxiliary mobile terminal and a display terminal, wherein the first main mobile terminal, the second auxiliary mobile terminal and the display terminal are in communication connection;
the method comprises the following steps:
acquiring a first touch operation input by a user through a first main mobile terminal, determining target course data corresponding to the first touch operation, sending the target course data to a second auxiliary mobile terminal and a display terminal, and displaying the target course data on the second auxiliary mobile terminal and the display terminal;
in the process of displaying the target course data on the second secondary mobile terminal, acquiring a second touch operation input by a user through the second secondary mobile terminal, and determining a touch parameter of the second touch operation;
acquiring historical behavior data corresponding to the second secondary mobile terminal, and performing feature extraction on the historical behavior data to obtain a corresponding second operation behavior characteristic; wherein the historical behavior data comprises touch parameters of touch operations input by the user through the second secondary mobile terminal; extracting a characteristic value of each touch parameter according to a preset feature extraction algorithm; determining a parameter distribution corresponding to the characteristic values of the touch parameters based on a statistical learning model, and determining the second operation behavior characteristic based on the parameter distribution, wherein the second operation behavior characteristic comprises a parameter reference range corresponding to the touch parameter;
judging whether the touch parameter of the second touch operation matches the second operation behavior characteristic; specifically, judging whether the touch parameter of the second touch operation falls within the parameter reference range in the second operation behavior characteristic; if so, judging that the touch parameter of the second touch operation matches the second operation behavior characteristic, and if not, judging that the touch parameter of the second touch operation does not match the second operation behavior characteristic;
if so, determining and executing an operation instruction corresponding to the second touch operation, and sending an execution result to the display terminal so as to display the execution result on the display terminal; wherein the step of determining the operation instruction corresponding to the second touch operation further comprises: determining at least one operable control in a current operation interface in the target course data displayed on the second secondary mobile terminal, and determining, among the at least one operable control, at least one control to be selected corresponding to the second touch operation; determining a first touch offset parameter of the user according to the second operation behavior data; determining a matching degree between the second touch operation and each control to be selected according to the first touch offset parameter and a second touch offset parameter, set by the system, corresponding to each control to be selected, and determining a target control among the at least one control to be selected according to the matching degree; and determining an operation instruction corresponding to the target control as the operation instruction corresponding to the second touch operation;
if not, calculating difference characteristics between the touch parameter of the second touch operation and the second operation behavior characteristic, performing difference behavior classification based on the calculated difference characteristics to determine a target abnormality classification corresponding to the second touch operation, and sending the target abnormality classification to the first main mobile terminal to remind a user of the first main mobile terminal that the current second auxiliary mobile terminal is in an abnormal condition.
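A minimal sketch of the feature-extraction and matching steps recited in claim 1, assuming a simple mean-and-deviation statistic as the statistical learning model and per-parameter reference ranges; the claim itself does not fix a particular model, so the choices below are assumptions.

```python
from statistics import mean, pstdev

def second_operation_behavior_characteristic(history, width=3.0):
    """Derive a per-parameter reference range from historical touch parameters.

    history: list of dicts, one per historical touch operation,
             e.g. {"duration": 0.2, "pressure": 0.4}
    Returns {parameter: (low, high)} reference ranges.
    """
    ranges = {}
    for name in history[0]:
        values = [record[name] for record in history]
        center, spread = mean(values), pstdev(values)
        ranges[name] = (center - width * spread, center + width * spread)
    return ranges

def touch_matches(touch, ranges):
    """Matching test of claim 1: every touch parameter lies in its reference range."""
    return all(low <= touch[name] <= high for name, (low, high) in ranges.items())

history = [
    {"duration": 0.18, "pressure": 0.42},
    {"duration": 0.22, "pressure": 0.47},
    {"duration": 0.20, "pressure": 0.44},
]
profile = second_operation_behavior_characteristic(history)
print(touch_matches({"duration": 0.21, "pressure": 0.45}, profile))  # expected: True
print(touch_matches({"duration": 0.90, "pressure": 0.10}, profile))  # expected: False
```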
2. A method for operating a multi-screen interaction system according to claim 1, wherein the step of determining a first touch offset parameter of a user according to the second operation behavior data further includes:
determining a first touch offset parameter of the user according to offset data of touch operations input by the user to controls in the historical behavior data of the user of the second secondary mobile terminal, wherein the first touch offset parameter represents the degree of offset between the center of the control area and the center of the touch area in the touch operations input by the user in the historical behavior data;
determining a second touch offset parameter corresponding to the current operation interface according to an offset degree, preset by the system, that is allowed for a second touch operation, wherein the second touch offset parameter represents the allowed offset degree between touch areas corresponding to the at least one operable control in the current operation interface;
calculating the matching degree between each control to be selected and the touch area corresponding to the input second touch operation according to the first touch offset parameter, the second touch offset parameter and the touch area corresponding to each control to be selected, wherein the expanded touch area corresponding to each control to be selected is calculated according to the first touch offset parameter and the second touch offset parameter, the overlapping area between the expanded touch area and the touch area corresponding to the input second touch operation is calculated, and the matching degree between the touch operation and the control to be selected is determined according to the ratio of the area of the overlapping area to the area of the touch area;
or calculating an overlapping area between the touch area corresponding to each control to be selected and the touch area corresponding to the input second touch operation, calculating the ratio between the area of the overlapping area and the area of the touch area, and performing a first correction on the calculated ratio based on the first touch offset parameter; and, according to a preset second correction function, performing a second correction on the ratio after the first correction based on the second touch offset parameter, so as to obtain the matching degree between the control to be selected and the touch area corresponding to the input second touch operation.
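A sketch of the first matching-degree variant recited in claim 2, assuming touch areas are represented as axis-aligned rectangles and that the two offset parameters expand the control's touch area additively; both representations are assumptions made for illustration.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def area(self):
        return self.w * self.h

    def expand(self, margin):
        """Grow the rectangle outwards by `margin` on every side."""
        return Rect(self.x - margin, self.y - margin, self.w + 2 * margin, self.h + 2 * margin)

def overlap_area(a, b):
    dx = min(a.x + a.w, b.x + b.w) - max(a.x, b.x)
    dy = min(a.y + a.h, b.y + b.h) - max(a.y, b.y)
    return max(dx, 0.0) * max(dy, 0.0)

def matching_degree(control_area, touch_area, first_offset, second_offset):
    """Expand the control's touch area by the two offset parameters, then score
    the overlap with the input touch area relative to the touch area itself."""
    expanded = control_area.expand(first_offset + second_offset)
    return overlap_area(expanded, touch_area) / touch_area.area()

control = Rect(100, 100, 80, 40)   # candidate control's touch area
touch = Rect(170, 110, 30, 30)     # touch area of the input second touch operation
print(matching_degree(control, touch, first_offset=6.0, second_offset=4.0))
```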
3. A method for operating a multi-screen interaction system according to claim 2, wherein the step of determining a first touch offset parameter of a user according to the second operation behavior data further includes:
determining a plurality of touch operations in the historical behavior data, acquiring touch parameters corresponding to the plurality of touch operations, determining offset data between the touch parameters of each touch operation and the second operation behavior data, and determining parameter distribution corresponding to the offset data of each touch parameter;
dividing the parameter distribution into a plurality of parameter intervals, and calculating the corresponding first touch offset parameter based on a parameter reference value and a weighted value corresponding to each parameter interval, wherein the weighted value is calculated according to a distribution probability value corresponding to each parameter interval.
4. A method for operating a multi-screen interaction system according to claim 1, wherein the step of performing difference behavior classification based on the calculated difference characteristics to determine a target abnormality classification corresponding to the second touch operation further includes:
classifying the difference characteristics based on a preset artificial intelligence classification network to obtain the target abnormality classification corresponding to the difference characteristics;
wherein the step of classifying the difference characteristics based on the preset artificial intelligence classification network to obtain the target abnormality classification corresponding to the difference characteristics further includes:
determining the confidence of the difference characteristics under each classification abnormality label based on the preset artificial intelligence classification network; and obtaining the target abnormality classification corresponding to the difference characteristics according to the confidence of the difference characteristics under each classification abnormality label.
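A sketch of the confidence-based selection recited in claim 4, with a softmax over linear scores standing in for the preset artificial intelligence classification network, which the claim leaves unspecified; the anomaly labels, weights and bias values are hypothetical.

```python
import math

ANOMALY_LABELS = ["unfamiliar_user", "erratic_input", "device_fault"]   # hypothetical labels

def classify_difference(diff_vector, weights, bias):
    """Score the difference characteristics under each label and pick the most confident one."""
    scores = [sum(w * x for w, x in zip(row, diff_vector)) + b
              for row, b in zip(weights, bias)]
    # Softmax turns the raw scores into per-label confidences.
    peak = max(scores)
    exps = [math.exp(s - peak) for s in scores]
    total = sum(exps)
    confidences = [e / total for e in exps]
    best = max(range(len(ANOMALY_LABELS)), key=lambda i: confidences[i])
    return ANOMALY_LABELS[best], confidences

weights = [[0.9, -0.2], [0.1, 1.2], [-0.5, 0.3]]
bias = [0.0, -0.1, 0.2]
label, conf = classify_difference([1.4, 0.3], weights, bias)
print(label, [round(c, 3) for c in conf])
```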
5. A method for operating a multi-screen interaction system according to claim 1, further comprising:
acquiring image information of a user through a camera device on a second secondary mobile terminal;
judging whether a user corresponding to the second secondary mobile terminal has an operation authority or not according to the acquired image information;
wherein, the step of judging whether the user corresponding to the second secondary mobile terminal has the operation authority according to the acquired image information further comprises:
sending the acquired image information to a first main mobile terminal so as to enable a user of the first main mobile terminal to judge whether a user corresponding to a second auxiliary mobile terminal has an operation authority, inputting an authentication result through the first main mobile terminal, and sending the authentication result to the second auxiliary mobile terminal;
the second secondary mobile terminal determines whether the user has the operation authority or not according to the received authentication result;
after the step of determining whether the user corresponding to the second secondary mobile terminal has the operation right according to the acquired image information, the method further includes:
and determining an operation authority level corresponding to the user corresponding to the second secondary mobile terminal, and determining at least one operable control in the current operation interface in the target course data displayed on the second secondary mobile terminal according to the operation authority level.
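A sketch of the authorization round trip recited in claim 5, with the communication between terminals reduced to plain function calls and an assumed two-level authority scheme; the levels and the control lists per level are illustrative only.

```python
# Authorization round trip of claim 5, with the network reduced to function calls.
# The authority levels and the control lists per level are illustrative assumptions.

CONTROLS_BY_LEVEL = {
    "full":    ["answer", "skip", "replay", "exit"],
    "limited": ["answer", "replay"],
}

def primary_terminal_review(image_bytes):
    """On the first main mobile terminal: a guardian inspects the image and replies."""
    # In practice this is a human decision; a fixed result is returned here for illustration.
    return {"authorized": True, "level": "limited"}

def secondary_terminal_authorize(capture_image, review):
    """On the second secondary mobile terminal: capture, send, and apply the result."""
    image = capture_image()
    verdict = review(image)                     # sent to, and answered by, the primary terminal
    if not verdict["authorized"]:
        return []
    return CONTROLS_BY_LEVEL[verdict["level"]]  # operable controls for this authority level

operable = secondary_terminal_authorize(lambda: b"<jpeg bytes>", primary_terminal_review)
print(operable)   # ['answer', 'replay']
```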
6. A method of operating a multi-screen interaction system as recited in claim 1, the method further comprising:
determining the allowable response time length corresponding to the current course progress according to the target course data;
acquiring a waiting time length of user input on the second auxiliary mobile terminal, wherein the waiting time length is the time length elapsed since the last touch operation input by the user through the second auxiliary mobile terminal;
and generating preset prompt information and sending the preset prompt information to the first main mobile terminal under the condition that the waiting time length is greater than or equal to the allowable response time length, wherein the preset prompt information is used for prompting the user of the first main mobile terminal that the second auxiliary mobile terminal may be in an abnormal condition.
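A sketch of the inactivity check recited in claim 6, assuming the allowable response time length is looked up per course step and wall-clock time is used for the waiting time length; the lookup table and message format are assumptions.

```python
import time

ALLOWED_RESPONSE_SECONDS = {"lesson-3/step-7": 90.0}    # per-step allowance, assumed lookup

def check_waiting_time(course_step, last_touch_timestamp, notify_primary, now=None):
    """Prompt the first main mobile terminal when the learner has been idle too long."""
    now = time.time() if now is None else now
    waiting = now - last_touch_timestamp
    allowed = ALLOWED_RESPONSE_SECONDS.get(course_step, 60.0)
    if waiting >= allowed:
        notify_primary(f"No input on the secondary terminal for {waiting:.0f}s "
                       f"(allowed {allowed:.0f}s) at {course_step}.")

check_waiting_time("lesson-3/step-7",
                   last_touch_timestamp=1000.0,
                   notify_primary=print,
                   now=1120.0)
```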
7. A multi-screen interaction system is characterized by comprising a first main mobile terminal, a second auxiliary mobile terminal and a display terminal, wherein the first main mobile terminal, the second auxiliary mobile terminal and the display terminal are in communication connection;
the method comprises the steps that a first main mobile terminal obtains a first touch operation input by a user, determines target course data corresponding to the first touch operation, sends the target course data to a second auxiliary mobile terminal and a display terminal, and displays the target course data on the second auxiliary mobile terminal and the display terminal;
in the process of displaying the target course data on the second secondary mobile terminal, the second secondary mobile terminal acquires a second touch operation input by the user through the second secondary mobile terminal and determines a touch parameter of the second touch operation; acquires historical behavior data corresponding to the second secondary mobile terminal, and performs feature extraction on the historical behavior data to obtain a corresponding second operation behavior characteristic, wherein the historical behavior data comprises touch parameters of touch operations input by the user through the second secondary mobile terminal; a characteristic value of each touch parameter is extracted according to a preset feature extraction algorithm; a parameter distribution corresponding to the characteristic values of the touch parameters is determined based on a statistical learning model, and the second operation behavior characteristic is determined based on the parameter distribution, wherein the second operation behavior characteristic comprises a parameter reference range corresponding to the touch parameter; the second secondary mobile terminal judges whether the touch parameter of the second touch operation matches the second operation behavior characteristic, namely judges whether the touch parameter of the second touch operation falls within the parameter reference range in the second operation behavior characteristic; if so, it is judged that the touch parameter of the second touch operation matches the second operation behavior characteristic, and if not, it is judged that the touch parameter of the second touch operation does not match the second operation behavior characteristic; if so, the second secondary mobile terminal determines an operation instruction corresponding to the second touch operation, executes the operation instruction, and sends an execution result to the display terminal; wherein the step of determining the operation instruction corresponding to the second touch operation further comprises: determining at least one operable control in a current operation interface in the target course data displayed on the second secondary mobile terminal, and determining, among the at least one operable control, at least one control to be selected corresponding to the second touch operation; determining a first touch offset parameter of the user according to the second operation behavior data; determining a matching degree between the second touch operation and each control to be selected according to the first touch offset parameter and a second touch offset parameter, set by the system, corresponding to each control to be selected, and determining a target control among the at least one control to be selected according to the matching degree; and determining an operation instruction corresponding to the target control as the operation instruction corresponding to the second touch operation;
the display terminal displays the execution result;
if not, the second secondary mobile terminal calculates difference characteristics between the touch parameter of the second touch operation and the second operation behavior characteristic, performs difference behavior classification based on the calculated difference characteristics to determine a target abnormality classification corresponding to the second touch operation, and sends the target abnormality classification to the first main mobile terminal;
and the first main mobile terminal acquires the abnormal condition of the current second auxiliary mobile terminal according to the received target abnormal classification.
8. A multi-screen interaction system as claimed in claim 7, wherein the second secondary mobile terminal determines an allowable response time length corresponding to the current course progress according to the target course data; acquires a waiting time length of user input on the second auxiliary mobile terminal, wherein the waiting time length is the time length elapsed since the last touch operation input by the user through the second auxiliary mobile terminal; and generates preset prompt information and sends the preset prompt information to the first main mobile terminal under the condition that the waiting time length is greater than or equal to the allowable response time length;
and the first main mobile terminal determines, according to the received prompt information, that the second auxiliary mobile terminal may be in an abnormal condition.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210309711.1A CN114416008B (en) 2022-03-28 2022-03-28 Multi-screen interaction system and operation method thereof

Publications (2)

Publication Number Publication Date
CN114416008A CN114416008A (en) 2022-04-29
CN114416008B true CN114416008B (en) 2022-07-26

Family

ID=81262943

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210309711.1A Active CN114416008B (en) 2022-03-28 2022-03-28 Multi-screen interaction system and operation method thereof

Country Status (1)

Country Link
CN (1) CN114416008B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116562923B (en) * 2023-05-26 2023-12-22 深圳般若海科技有限公司 Big data analysis method, system and medium based on electronic commerce behaviors
CN116909465B (en) * 2023-07-13 2024-03-15 广州昊旻晟电子技术有限公司 Interactive system and method of intelligent touch integrated machine

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110427141A (en) * 2019-07-09 2019-11-08 彼乐智慧科技(北京)有限公司 A kind of method and system of multi-screen interactive

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101968932A (en) * 2010-09-19 2011-02-09 深圳市摩拓触摸科技有限公司 Intelligent interactive multimedia teaching system and implementation method thereof
US9336373B2 (en) * 2014-04-15 2016-05-10 Verizon Patent And Licensing Inc. User biometric pattern learning and prediction
CN104050840A (en) * 2014-05-30 2014-09-17 深圳市浪涛科技有限公司 Interactive type electronic whiteboard teaching method and system
US9977505B2 (en) * 2014-06-06 2018-05-22 International Business Machines Corporation Controlling inadvertent inputs to a mobile device
US11301550B2 (en) * 2016-09-07 2022-04-12 Cylance Inc. Computer user authentication using machine learning
CN113176847A (en) * 2021-04-07 2021-07-27 上海墨案智能科技有限公司 Method and equipment for preventing mistaken touch of ink screen
CN113760123A (en) * 2021-07-26 2021-12-07 杭州逗酷软件科技有限公司 Screen touch optimization method and device, terminal device and storage medium

Also Published As

Publication number Publication date
CN114416008A (en) 2022-04-29

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant