CN110413187B - Method and device for processing annotations of interactive intelligent equipment - Google Patents

Publication number
CN110413187B
CN110413187B (application CN201810387694.7A)
Authority
CN
China
Prior art keywords
touch event
annotation
type
application
touch
Prior art date
Legal status
Active
Application number
CN201810387694.7A
Other languages
Chinese (zh)
Other versions
CN110413187A (en)
Inventor
向淘
Current Assignee
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Guangzhou Shizhen Information Technology Co Ltd
Original Assignee
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Guangzhou Shizhen Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Shiyuan Electronics Thecnology Co Ltd and Guangzhou Shizhen Information Technology Co Ltd
Priority to CN201810387694.7A
Publication of CN110413187A
Application granted
Publication of CN110413187B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a method and a device for processing annotations of an interactive smart device. The method comprises the following steps: starting the annotation application and entering an interface-free annotation mode; receiving a touch event; detecting whether the touch event is a touch event of a predetermined type, wherein a touch event of the predetermined type is used for generating an annotation track; if the touch event is determined to be of the predetermined type, generating an annotation track according to the touch event; and if the touch event is determined not to be of the predetermined type, distributing the touch event to the application corresponding to the touch event. The invention solves the technical problem in the prior art that other applications cannot be operated while a smart interactive tablet is annotating in annotation mode.

Description

Method and device for processing annotations of interactive intelligent equipment
Technical Field
The invention relates to the field of touch screens, and in particular to a method and a device for processing annotations of an interactive smart device.
Background
Interactive smart tablets and conference tablets, such as electromagnetic touch devices, typically include an annotation function, i.e., the ability to annotate the content currently displayed on the tablet.
When the tablet enters annotation mode, the annotation interface (a transparent layer) sits above all application software. Because the page corresponding to the annotation is transparent, the user can still see the interfaces of the other applications beneath the annotation, but cannot operate them: since the annotation layer is the topmost layer of all application software, the system by default distributes touch data to the topmost application, so the annotation application naturally acquires all touch data. To operate another application, the user must exit annotation mode; to annotate again, the annotation function must be restarted so that the tablet re-enters annotation mode.
Taking a smart interactive tablet displaying a PPT as an example: when the first page of the PPT is displayed and needs to be annotated, the user starts the annotation function and the tablet enters annotation mode. After the user finishes annotating, the page needs to be turned; but if the page-turning operation is performed directly (for example, sliding leftwards on the screen), the smart interactive tablet cannot respond to it and only draws a corresponding track for the gesture. The user can therefore only quit annotation mode to perform the page-turning operation, and must start the annotation function again to annotate after turning the page.
Therefore, in the prior art, annotating on a smart interactive tablet requires repeatedly starting and quitting the annotation function; the operation is cumbersome, and impromptu annotation cannot be achieved.
For the problem in the prior art that other applications cannot be operated while the smart interactive tablet is annotating in annotation mode, no effective solution has yet been proposed.
Disclosure of Invention
The embodiments of the invention provide a method and a device for processing annotations of an interactive smart device, to at least solve the technical problem in the prior art that other applications cannot be operated while a smart interactive tablet is annotating in annotation mode.
According to an aspect of the embodiments of the present invention, a method for processing annotations of an interactive smart device is provided, including: starting the annotation application and entering an interface-free annotation mode; receiving a touch event; detecting whether the touch event is a touch event of a predetermined type, wherein a touch event of the predetermined type is used for generating an annotation track; if the touch event is determined to be of the predetermined type, generating an annotation track according to the touch event; and if the touch event is determined not to be of the predetermined type, distributing the touch event to the application corresponding to the touch event.
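The dispatch decision described above can be sketched in Python (a hypothetical sketch: the event dictionary layout and the `AnnotationApp`/`TargetApp` classes are illustrative assumptions, not the patent's actual implementation):

```python
# Hypothetical sketch of the dispatch branch described above.
PREDETERMINED_TYPE = "electromagnetic"  # events of this type draw annotations

class AnnotationApp:
    def __init__(self):
        self.strokes = []

    def draw(self, event):
        # Generate an annotation trace from the touch event's points.
        self.strokes.append(event["points"])

class TargetApp:
    def __init__(self):
        self.handled = []

    def handle(self, event):
        # The underlying application responds to the event normally.
        self.handled.append(event)

def dispatch(event, annotation_app, target_app):
    """Route a touch event per the interface-free annotation mode."""
    if event["type"] == PREDETERMINED_TYPE:
        annotation_app.draw(event)   # predetermined type -> annotation trace
    else:
        target_app.handle(event)     # anything else passes through

ann, ppt = AnnotationApp(), TargetApp()
dispatch({"type": "electromagnetic", "points": [(0, 0), (5, 5)]}, ann, ppt)
dispatch({"type": "capacitive", "points": [(1, 1)]}, ann, ppt)
```

Here the stylus stroke lands in the annotation application while the finger event falls through to the underlying app, without the user ever leaving annotation mode.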
Further, starting the annotation application and entering the interface-free annotation mode comprises: covering the uppermost layer of the current display interface with a transparent layer, wherein the transparent layer is used for displaying the annotation track.
Further, the transparent layer is a mouse hardware display layer.
Further, distributing the touch event to an annotation application; the annotation application generates an annotation trace based on the touch event.
Further, detecting a target application below the transparent layer; the touch event is distributed to a target application, wherein the target application responds to the touch event.
Further, a floating frame is displayed on the uppermost layer of the current display interface, wherein the floating frame is used for displaying the functionality controls of the annotation application; whether the touch event is used for triggering a functionality control of the annotation application is detected; if it is determined that the touch event is used for triggering a functionality control of the annotation application, the touch event is distributed to the annotation application.
Further, determining that the touch event is not used for triggering a function control of the annotation application; the touch event is distributed to a target application below the transparent layer.
Further, whether the touch event is generated by a predetermined operation is detected, wherein the predetermined operation comprises any one or more of the following: a preset multi-finger operation; and an operation for triggering a functionality control of the annotation application.
Further, the type of touch event includes any one or more of: electromagnetic pen touch events, capacitive touch events, and infrared touch events.
Further, after generating the annotation trace according to the touch event, displaying the annotation trace through a mouse hardware display layer.
According to an aspect of the embodiments of the present invention, a method for processing annotations of an interactive smart device is provided, including: starting the annotation application and entering an interface-free annotation mode; receiving a first type of touch event; displaying an annotation track corresponding to a first type of touch event, wherein the first type of touch event is used for generating the annotation track; receiving a second type of touch event; and displaying a response result obtained by the interactive intelligent equipment executing the second type of touch event.
Further, in a case where the second type of touch event is used to control the annotation application, the annotation trajectory is updated according to the second type of touch event.
Further, if a control triggered by the second type of touch event changes the content of the current display interface, the annotation track is cleared or saved; and if the second type of touch event is generated by a preset multi-finger operation, the annotation track is cleared or saved according to the instruction corresponding to the preset multi-finger operation.
Further, after receiving the first type of touch event, displaying a floating frame on the uppermost layer of the current display interface, wherein the floating frame displays a function control of the annotation application.
Further, after the floating frame is displayed on the uppermost layer of the interactive smart device, whether a touch event of the first type continues to be received within a preset time is detected; if no touch event of the first type is received within the preset time, the floating frame disappears.
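The auto-hide behavior might look like the following sketch (the class, method names, and default timeout are illustrative assumptions; the patent does not specify the timer mechanism):

```python
import time

class FloatingFrame:
    """Hypothetical model of the floating frame: it disappears if no
    first-type (writing) touch event arrives within `timeout` seconds."""
    def __init__(self, timeout=5.0):
        self.timeout = timeout
        self.visible = False
        self._last_write = None

    def on_writing_event(self, now=None):
        # A first-type touch event shows the frame and resets the clock.
        self.visible = True
        self._last_write = now if now is not None else time.monotonic()

    def tick(self, now=None):
        # Called periodically; hides the frame once the timeout elapses.
        now = now if now is not None else time.monotonic()
        if self.visible and now - self._last_write >= self.timeout:
            self.visible = False

frame = FloatingFrame(timeout=5.0)
frame.on_writing_event(now=0.0)   # writing starts, frame shown
frame.tick(now=3.0)               # still within the preset time
frame.tick(now=6.0)               # preset time elapsed, frame hidden
```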
Further, if the touch event of the second type triggers a functionality control, the annotation trajectory is updated according to the triggered functionality control.
Further, the functionality control comprises any one or more of: an annotation handwriting color selection control, an annotation handwriting thickness selection control, an annotation handwriting erasing control, an annotation track saving control, and an annotation track clearing control.
Further, after displaying the annotation track corresponding to the first type of touch event, receiving an instruction for changing the annotation track sent by the electromagnetic pen, wherein the electromagnetic pen comprises a preset button corresponding to the instruction, and when the button is triggered, the electromagnetic pen sends the instruction corresponding to the button to the interactive intelligent device; and changing the annotation track according to the instruction.
Further, the first type of touch event and the second type of touch event are generated by different touch subjects.
According to another aspect of the embodiments of the present invention, there is also provided a processing apparatus for annotation of an interactive smart device, including: the first starting module is used for starting the annotation application and entering an interface-free annotation mode; a first receiving module for receiving a touch event; the detection module is used for detecting whether the touch event is a touch event of a preset type, wherein the touch event of the preset type is used for generating an annotation track; the generating module is used for generating an annotation track according to the touch event if the touch event is determined to be the touch event of the preset type; the distribution module is used for distributing the touch event to an application corresponding to the touch event if the touch event is determined not to be the touch event of the preset type.
According to another aspect of the embodiments of the present invention, there is also provided a processing apparatus for annotation of an interactive smart device, including: the second starting module is used for starting the annotation application and entering an interface-free annotation mode; a second receiving module for receiving a first type of touch event; the display device comprises a first display module, a second display module and a display module, wherein the first display module is used for displaying an annotation track corresponding to a first type of touch event, and the first type of touch event is used for generating the annotation track; a third receiving module, configured to receive a second type of touch event; and the second display module is used for displaying a response result obtained by the interactive intelligent equipment executing the second type of touch event.
According to another aspect of the embodiments of the present invention, there is also provided an interactive intelligent device, including a processor and a storage medium, where the storage medium includes a stored program, and the interactive intelligent device is characterized in that the processor is configured to execute the program, where the program executes the following steps: starting the annotation application and entering an interface-free annotation mode; receiving a touch event; detecting whether the touch event is a touch event of a predetermined type, wherein the touch event of the predetermined type is used for generating an annotation track; if the touch event is determined to be a touch event of a preset type, generating an annotation track according to the touch event; if it is determined that the touch event is not a predetermined type of touch event, the touch event is distributed to an application corresponding to the touch event.
In the embodiment of the invention, the annotation application is started and an interface-free annotation mode is entered; a touch event is received and it is detected whether the touch event is of a predetermined type; if so, an annotation track is generated according to the touch event, and if not, the touch event is distributed to the application corresponding to it. In this scheme, annotation writing is separated from normal operation by detecting the type of the touch event; annotation writing does not affect the normal display logic of the device system, and GPU usage is reduced because the annotation track is not drawn through the device system's normal rendering pipeline.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a flow chart of a method of processing annotations for an interactive smart device according to an embodiment of the invention;
FIG. 2 is a schematic diagram of the annotation method according to an embodiment of the invention;
FIG. 3 is a schematic diagram of controlling an annotation application according to an embodiment of the invention;
FIG. 4 is a schematic illustration of the control of other applications according to an embodiment of the present invention;
FIG. 5 is a flowchart of a smart interactive tablet performing the annotation processing method according to an embodiment of the present invention;
FIG. 6 is a flow chart of a method of processing annotations by an interactive smart device according to an embodiment of the invention;
FIG. 7 is a flow chart of a method of processing annotations by an interactive smart device according to an embodiment of the invention; and
FIG. 8 is a flowchart of a method for processing annotations of an interactive smart device according to an embodiment of the invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
In accordance with an embodiment of the present invention, an embodiment of a method for processing annotations of an interactive smart device is provided. It should be noted that the steps illustrated in the flowcharts of the drawings may be performed in a computer system, such as one executing a set of computer-executable instructions, and that although a logical order is illustrated in the flowcharts, in some cases the steps illustrated or described may be performed in a different order than herein.
Fig. 1 is a flowchart of a processing method of annotations of an interactive smart device according to an embodiment of the present invention, as shown in fig. 1, the method includes the following steps:
and step S11, starting the annotation application and entering an interface-free annotation mode.
The above step S11 is executed by the device itself, as are the other steps in this embodiment. The device performing the steps in Embodiment 1 may be a smart interactive tablet, a conference tablet, or the like.
In the above scheme, an interface (view) would intercept and consume all touch events, so other applications could not receive them; this step therefore does not generate a view for displaying the annotation track after the annotation application is started. Since the annotation track needs to exist in a layer above the content, starting the annotation application and entering the interface-free annotation mode comprises: covering the uppermost layer of the current display interface with a transparent layer, wherein the transparent layer is used for displaying the annotation track; further, the transparent layer is a mouse hardware display layer.
In step S13, a touch event is received.
The touch event may be generated by a user operating a touch frame of the device. After receiving the touch event, the device sends the touch data corresponding to the touch event to a system of the device, and the system distributes the touch data. In an alternative embodiment, taking the smart interactive tablet as an example, the touch frame of the smart interactive tablet may include an electromagnetic touch function, a capacitive touch function, and an infrared touch function, in which case, the touch event may be a touch event generated by a finger or a touch event generated by an electromagnetic pen.
Step S15, detecting whether the touch event is a predetermined type of touch event, where the predetermined type of touch event is used to generate an annotation trace.
The type of the touch event is determined according to the touch data, and specifically, the touch events generating the same touch data may be the same type of touch event. For example, still taking a touch screen including an electromagnetic touch function, a capacitive touch function, and an infrared touch function as an example, for an electromagnetic touch event, the generated touch data is an electromagnetic signal, and for an infrared touch event, the generated touch data is an infrared signal; for a capacitive touch event, the generated touch data is a capacitive signal, and thus, the touch event can be classified into several types, i.e., an electromagnetic touch event, an infrared touch event, and a capacitive touch event.
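The signal-based classification just described can be sketched as follows (the enum and function names are assumptions for illustration; real drivers would report the signal kind from the touch frame hardware):

```python
from enum import Enum

class TouchType(Enum):
    ELECTROMAGNETIC = "electromagnetic"  # e.g. electromagnetic pen signal
    CAPACITIVE = "capacitive"            # e.g. finger on a capacitive sensor
    INFRARED = "infrared"                # e.g. infrared grid interruption

def classify(signal: str) -> TouchType:
    """Classify a touch event by the kind of signal the touch frame
    reported; events producing the same kind of signal share a type."""
    return TouchType(signal)

def is_predetermined(signal: str,
                     predetermined: TouchType = TouchType.ELECTROMAGNETIC) -> bool:
    # The predetermined type (here assumed electromagnetic) generates
    # annotation tracks; all other types fall through to dispatch.
    return classify(signal) is predetermined
```

For example, `is_predetermined("electromagnetic")` is true while `is_predetermined("infrared")` is false, matching the stylus-writes / finger-operates split described in the embodiment.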
Specifically, the annotation trace is used to represent a trace generated by the annotation application according to the touch event. In an alternative embodiment, the predetermined type of touch event may be an electromagnetic touch event. When a system of the device detects that the received touch event is an electromagnetic touch event, an annotation trace can be generated from the touch event.
In step S17, if it is determined that the touch event is a predetermined type of touch event, an annotation trace is generated according to the touch event.
In the above step S17, the annotation trace is generated from the touch event only when the touch event is of the predetermined type; that is, in this scheme the annotation application does not intercept all touch data.
Fig. 2 is a schematic diagram of the annotation method according to an embodiment of the present invention. In an alternative embodiment, in conjunction with fig. 2, take a smart interactive tablet as an example, with the predetermined type of touch event being an electromagnetic touch event. The smart interactive tablet displays a PPT, and the annotation application may be started at any time before annotating. When the current page needs to be annotated, the user can write on the touch screen of the smart interactive tablet with an electromagnetic pen; the smart interactive tablet detects the electromagnetic touch event and generates an annotation track according to the touch event generated by the electromagnetic pen.
In step S19, if it is determined that the touch event is not a predetermined type of touch event, the touch event is distributed to an application corresponding to the touch event.
In the above steps, the touch event is distributed by the device's system: whichever application the system distributes the touch event to is the application that processes it. The application corresponding to the touch event can be obtained from a preset correspondence; when the touch event is not of the predetermined type, the corresponding application may be an application other than the annotation application, or the annotation application itself.
In the step S19, when the received touch event is not of the predetermined type, the device's system does not distribute it directly to the annotation application to generate an annotation track; instead, the system's event distribution flow distributes the touch event to the corresponding application. Specifically, if it is determined that the touch event is not of the predetermined type, distributing the touch event to the application corresponding to the touch event may include the following steps: detecting the target application below the transparent layer; and distributing the touch event to the target application, wherein the target application responds to the touch event.
The target application below the transparent layer is the application displayed on the current display interface. In an optional embodiment, the smart interactive tablet displays a PPT; when the annotation application is started at any time before annotating, the smart interactive tablet enters the interface-free annotation mode and covers the PPT with a transparent layer, in which case the PPT application is the target application below the transparent layer.
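Finding the target application below the transparent layer can be sketched as a top-to-bottom search of the layer stack (a hypothetical model; the actual window-manager query is platform-specific):

```python
# Hypothetical layer-stack lookup: the transparent annotation overlay
# sits on top, and a non-predetermined event is routed to the first
# application layer beneath it.
def target_below_overlay(layers):
    """`layers` is ordered top to bottom; each entry is a
    (name, is_overlay) pair. Returns the first application layer
    found under the transparent overlay, or None if there is none."""
    below = False
    for name, is_overlay in layers:
        if is_overlay:
            below = True
            continue
        if below:
            return name
    return None

# Example stack matching the PPT embodiment above.
stack = [("annotation-overlay", True), ("ppt", False), ("launcher", False)]
```

With this stack, the lookup returns the PPT application, which is exactly the "target application below the transparent layer" described in the embodiment.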
As can be seen from the above, the foregoing embodiment of the present application starts an annotation application, receives a touch event, detects whether the touch event is a predetermined type of touch event, generates an annotation track according to the touch event if it is determined that the touch event is the predetermined type of touch event, and distributes the touch event to an application corresponding to the touch event if it is determined that the touch event is not the predetermined type of touch event. In the scheme, the annotation writing is separated from the operation by detecting the type of the touch event, the annotation writing does not influence the normal display logic of the equipment system, and the GPU utilization rate can be saved because the annotation track is not drawn by the normal drawing process of the application equipment system.
Optionally, according to the foregoing embodiment of the present application, if it is determined that the touch event is the predetermined touch type, generating an annotation track according to the touch event includes:
in step S171, the touch event is distributed to the annotation application.
After the system of the device receives the touch event, the touch event is distributed. In the above step S171, the touch event is distributed to the annotation application, so that the annotation application can draw an annotation trajectory.
In step S173, the annotation application generates an annotation trace according to the touch event.
In the above step S173, since the touch event is a predetermined type of touch event, the annotation application generates an annotation trajectory according to the touch event.
Optionally, according to the foregoing embodiment of the present application, a floating frame is displayed on the uppermost layer of the current display interface, wherein the floating frame is used to display the functionality controls of the annotation application; if it is determined that the touch event is not of the predetermined type, distributing the touch event to the application corresponding to the touch event includes:
step S191, detecting whether the touch event is used for triggering a function control of the annotation application.
Specifically, the functionality controls of the annotation application can be used to control the annotation as follows: screen clearing, screen capturing, saving, and the like; they may also include: an eraser, handwriting color selection, handwriting thickness selection, and the like. A functionality control of the annotation application can be triggered either by touching the control in the annotation application or by a predetermined operation.
In step S193, it is determined that the touch event is used to trigger a function control of the annotation application.
In step S195, the touch event is distributed to the annotation application.
The touch event of the user controlling the annotation application is distributed to the annotation application, so that the annotation application is controlled.
Fig. 3 is a schematic diagram of controlling the annotation application according to an embodiment of the present invention. In an alternative embodiment, in conjunction with fig. 3, still taking the smart interactive tablet as an example with the predetermined type of touch event being an electromagnetic touch event: the smart interactive tablet displays a PPT, and after the user finishes annotating, the touch screen is operated with a designated gesture (two fingers slide downwards) to save the current annotation. When the device's system detects that the user's operation is the save-annotation operation, it distributes the touch event to the annotation application, which executes the save operation.
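One simple way to decide whether a touch triggers a functionality control is a hit-test against the floating frame's bounds. This is a hypothetical sketch; the patent does not specify the hit-test geometry:

```python
def route_event(point, frame_rect):
    """Hit-test sketch: a touch inside the floating frame's bounds is
    treated as triggering an annotation functionality control and goes
    to the annotation application; any other touch goes to the target
    application below the transparent layer. `frame_rect` is assumed
    to be (x, y, width, height)."""
    x, y = point
    rx, ry, rw, rh = frame_rect
    inside = rx <= x < rx + rw and ry <= y < ry + rh
    return "annotation" if inside else "target"
```

For a floating frame at (0, 0) with size 100 x 50, a tap at (10, 10) routes to the annotation application and a tap at (200, 10) routes to the target application.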
Optionally, according to the foregoing embodiment of the present application, after detecting whether the touch event is generated by a predetermined operation, the method further includes:
step S197, it is determined that the touch event is not used to trigger a functionality control of the annotation application.
Step S199, distribute the touch event to the target application below the transparent layer.
Fig. 4 is a schematic diagram of controlling other applications according to an embodiment of the present invention. In an alternative embodiment, in conjunction with fig. 4, still taking the smart interactive tablet as an example with the predetermined type of touch event being an electromagnetic touch event: the smart interactive tablet displays a PPT, and after the user finishes annotating, the page needs to be turned. When the user performs a page-turning operation (sliding a finger to the left), the device detects that the touch event of the page-turning operation is not used for controlling the annotation application but for controlling the PPT application, and therefore distributes the touch event to the PPT application.
Still in the foregoing embodiment, the annotations on the device correspond to the device's current display content. For the PPT, after a page turn, the annotations on the previous page no longer correspond to the new content. Therefore, after the device system distributes a touch event to an application other than the annotation application, the system may further determine whether the current display interface changes once that application responds to the touch event; if it changes, the previous annotations are cleared from the screen. The user may also be prompted to choose whether to save the existing annotation track before the screen is cleared.
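The page-turn handling just described can be modeled as below (hypothetical helpers; the embodiment only specifies the behavior, not the code):

```python
# After another application responds, stale annotations are cleared if the
# display changed, optionally saving them first when the user chooses to.

saved_tracks = []

def on_display_updated(before, after, tracks, user_wants_save):
    if before == after:
        return tracks                        # display unchanged: keep annotations
    if tracks and user_wants_save:
        saved_tracks.append(list(tracks))    # save before clearing
    return []                                # screen-clean the previous annotations
```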
Optionally, according to the above embodiment of the present application, detecting whether the touch event is used for controlling the annotation application includes: detecting whether the touch event is generated by a predetermined operation, wherein the predetermined operation comprises any one or more of the following operations: performing preset multi-finger operation; and an operation for triggering a functionality control of the annotation application.
In an alternative embodiment, the predetermined operation may be a preset multi-finger operation. In this example, a corresponding multi-finger gesture can be set for each operation; the system intercepts multi-finger events and performs the corresponding handwriting saving or clearing action when five or more fingers swipe up, down, left, or right on the interface.
In an optional embodiment, the predetermined operation may also be an operation for triggering a functionality control of the annotation application. During handwriting writing (the writing state), a floating window (added via the Android WindowManager) is placed on the system interface, and corresponding functionality controls are added in the floating window; the floating window is displayed while written handwriting exists (the writing mode) and hidden otherwise. The predetermined operation may be an operation that triggers a control in the floating window.
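The floating window's visibility rule reduces to a small state machine; the sketch below models only the state transitions, with the Android WindowManager details omitted:

```python
# Shown while written handwriting exists (writing mode), hidden otherwise.

class FloatingToolbar:
    def __init__(self):
        self.visible = False

    def on_handwriting_changed(self, has_handwriting):
        self.visible = bool(has_handwriting)
        return self.visible
```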
Besides touch events, the annotation application can be controlled in other ways; for example, corresponding operation buttons may be provided on the electromagnetic pen (writing pen), where each button corresponds to an operation, and when a button is pressed, the electromagnetic pen sends an instruction signal to the smart interactive device over Bluetooth.
Optionally, according to the above embodiments of the present application, the type of the touch event includes any one or more of the following: electromagnetic pen touch events, capacitive touch events, and infrared touch events.
Fig. 5 is a flowchart of a method for executing the annotation processing by an intelligent interactive tablet according to an embodiment of the present invention, and the solution of the present application is fully described below with reference to fig. 5.
S51: start the interface-free annotation mode.
Specifically, in the interface-free annotation mode, no view for displaying the annotation track is generated after the annotation application is started. After the interface-free annotation mode is enabled, if the user does not write, the entire system picture and operation are the same as when the mode is not enabled. The annotation software runs in the background and adds no controls to the system, so the user cannot perceive that the device is in the interface-free annotation mode; optionally, the smart interactive tablet may actively pop up an information box informing the user that the interface-free annotation mode has been started.
S52: electromagnetic stylus/infrared/capacitive hardware devices generate touch events.
It should be noted that the generator of the touch event may also include all hardware devices such as a mouse and a roller ball that can generate the touch event.
S53: it is determined whether the touch event is of a predetermined type. When the touch event is a predetermined type of touch event, the process proceeds to step S54, otherwise, the process proceeds to step S56.
In the above step S53, since different touch events are generated by different devices, the system can determine which hardware device generated each touch event. If events generated by the electromagnetic pen are taken as the predetermined type of touch event, touch events generated by the electromagnetic pen can be distributed directly to the annotation application, while touch events generated by other hardware devices are distributed through the normal system event-processing flow.
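The routing decision in step S53 can be sketched as follows, assuming (as the example above does) that the electromagnetic pen's events are the predetermined type; the labels are illustrative:

```python
PREDETERMINED_TYPE = "electromagnetic"

def route(event_type):
    if event_type == PREDETERMINED_TYPE:
        return "annotation_app"          # distributed directly to the annotation application
    return "system_normal_dispatch"      # normal event-processing flow
```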
S54: the annotation application obtains a drawn handwriting of the touch event.
Specifically, after the annotation application obtains the writing touch event, a handwriting curve is drawn according to the touch data point.
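A handwriting curve can be approximated by connecting successive touch data points with straight segments, as sketched below; real renderers may apply smoothing (e.g. Bezier interpolation) instead:

```python
# Turn an ordered list of touch data points into line segments forming a stroke.

def build_stroke(points):
    return [(points[i], points[i + 1]) for i in range(len(points) - 1)]
```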
S55: update the displayed handwriting.
Because the annotation application sits above all other applications in the system, if it used a control (view) to display the handwriting, that control would capture all touch events (a top-level interface application captures all events by default) and other software could not respond to them. The handwriting therefore needs to be displayed in a way that does not intercept touch events distributed through the system's normal flow; for example, the written handwriting may be displayed on the mouse hardware display layer (hwcursor).
In the Android system, the hardware display layers can be divided into three layers: the mouse hardware display layer, the normal view display layer, and the video media layer. The mouse hardware display layer is at the top and the video media layer at the bottom, and the system's interface windows are all displayed on the normal view display layer. Content on the mouse hardware display layer is therefore always topmost, so handwriting drawn on it appears above all interfaces. In this way the annotation track can be displayed above all content while preventing the annotation application from intercepting all touch events.
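The fixed top-down layer order described above can be modeled as below; the names are descriptive, not actual Android identifiers:

```python
# Whatever is drawn on the mouse hardware display layer is composited above
# everything else; the normal view layer sits above the video media layer.

LAYER_ORDER = ["mouse_hw_layer", "normal_view_layer", "video_media_layer"]

def visible_content(content_by_layer):
    for layer in LAYER_ORDER:
        if layer in content_by_layer:
            return content_by_layer[layer]
    return None
```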
S56: events are distributed normally through the system.
Specifically, based on the normal distribution flow of the smart interactive tablet, the system distributes the touch event to the topmost application at the touch position. For example, if the coordinates of the touch event are (100, 200), the system determines which applications are located at x = 100, y = 200 and distributes the event to the topmost application at that coordinate point. It should be noted that the touch events (100, 200) and (101, 200) may be distributed to two different applications, since a different application may be topmost at each location.
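This coordinate-based hit test can be sketched as follows (window geometry and names are illustrative):

```python
# The event goes to the topmost application whose window contains the touch
# coordinate; the window list is ordered top-down.

def dispatch_at(x, y, windows):
    """windows: list of (app, (left, top, right, bottom)) tuples."""
    for app, (l, t, r, b) in windows:
        if l <= x < r and t <= y < b:
            return app
    return None
```

With a narrow window above a full-screen one, two adjacent coordinates can indeed resolve to different applications, as the step notes.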
S57: and judging whether the touch event is an annotation operation. If the touch event is the annotating operation, the process proceeds to step S59, otherwise, the process proceeds to step S58.
Although the annotation application can acquire writing events, specific operations such as clearing, saving, switching the handwriting color, and switching the handwriting thickness are still required. The annotation application can be operated in several alternative ways. In the first method, corresponding operation buttons are provided on the electromagnetic pen (writing pen), and each button click corresponds to a respective operation (the electromagnetic pen establishes communication with the smart interactive tablet over Bluetooth and transmits control instructions). In the second method, when the system interface is operated (by non-writing events such as infrared or capacitive touches) and the content of the system interface changes, the handwriting is saved and the screen is cleared. In the third method, a corresponding multi-finger gesture is set for each operation; the system intercepts multi-finger events and performs the corresponding handwriting saving or clearing action when five or more fingers swipe up, down, left, or right on the interface. In the fourth method, during handwriting writing (the writing state), a floating window (added via the Android WindowManager) is placed on the system interface and corresponding functionality controls are added in it; the floating window is displayed while written handwriting exists (the writing mode) and hidden otherwise.
S58: other software responds to the touch event.
In the above step S58, events not intercepted by the annotation operation (e.g., at positions not covered by the annotation operation buttons) are distributed normally to the corresponding applications through the operating system, and those applications respond with their respective operations (e.g., button clicks, sliding, etc.).
S59: exit the interface-free annotation mode.
After exiting the interface-free annotation mode, touch events generated by all hardware devices are distributed normally to each application through the operating system, and annotation writing events are no longer distinguished.
Example 2
In accordance with an embodiment of the present invention, there is provided an embodiment of a method for processing annotations of an interactive smart device, it should be noted that the steps illustrated in the flowchart of the drawings may be executed in a computer system such as a set of computer executable instructions, and that although a logical order is illustrated in the flowchart, in some cases, the steps illustrated or described may be executed in an order different from that herein.
Fig. 6 is a flowchart of a processing method of annotations of an interactive smart device according to an embodiment of the present invention, as shown in fig. 6, the method includes the following steps:
Step S61: starting the annotation application and entering the interface-free annotation mode.
Specifically, the annotation application is used to display annotations on the currently displayed content at the uppermost layer of the display interface of the current smart interactive tablet.
Step S63, receiving a first type of touch event, where the first type of touch event is used to generate an annotation trace.
Specifically, the first type of touch event may be a predetermined type of touch event in step S15 in embodiment 1, different types of touch events have different touch data, and the first type of touch event may be an electromagnetic touch event.
Step S65, displaying an annotation track corresponding to a first type of touch event, where the first type of touch event is used to generate the annotation track.
In the above step S65, the smart interactive tablet distributes the first type of touch event to the annotation application, the annotation application draws an annotation track according to the first type of touch event, and the smart interactive tablet displays the drawn annotation track.
In an alternative embodiment, taking the smart interactive tablet running the PPT as an example, the first type of touch event may be an electromagnetic touch event, and as shown in fig. 2, the pen in the drawing is an electromagnetic pen, and the smart interactive tablet displays an annotation track according to the touch event generated by the electromagnetic pen.
In step S67, a second type of touch event is received.
Specifically, the second type of touch event may be an infrared touch event or a capacitive touch event.
It should be noted that there is no fixed order between step S67 and step S63: if the smart interactive tablet receives a first type of touch event, step S65 is executed, and if it receives a second type of touch event, step S69 is executed.
Step S69: displaying a response result obtained by the interactive smart device executing the second type of touch event.
Specifically, in the case that the touch event is not a predetermined type of touch event, the application executing the second type of touch event may be an application other than the annotation application, or may be the annotation application itself.
In the above step, the smart interactive tablet distributes the touch event of the second type to the corresponding application, and the corresponding application responds to the touch event of the second type and displays the response result.
In an optional embodiment, taking the smart interaction tablet as an example of displaying the PPT, if the second type of touch event is used for turning pages of the PPT, the smart interaction tablet distributes the touch event to the PPT application, the PPT application executes a page turning action in response to the touch event, and the interactive smart device displays a page turning result of the PPT, that is, displays a next page of the PPT.
In another optional embodiment, still taking the smart interactive tablet to show the PPT as an example, if the touch event of the second type is used to erase the annotation trace, the smart interactive tablet distributes the touch event to the annotation application, the annotation application performs erasing of the annotation trace in response to the touch event, and the interactive smart device displays an erasing result of the annotation trace.
Optionally, according to the foregoing embodiment of the present application, in a case that the second type of touch event is used to control the annotation application, the displaying, by the smart interactive tablet, a response result corresponding to the second type of touch event includes: and updating the annotation trajectory according to the second type of touch event.
Optionally, according to the foregoing embodiment of the present application, updating the annotation trajectory according to the second type of touch event includes: if the control triggered by the second type of touch event changes the content of the current display interface, clearing or storing the annotation track; and if the touch event of the second type is generated by the preset multi-finger operation, clearing or saving the annotation track according to the instruction corresponding to the preset multi-finger operation.
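The two update rules just stated can be sketched together (the event encoding and helper names are hypothetical):

```python
# A triggered control that changes the display clears the track; a preset
# multi-finger gesture maps to its save or clear instruction.

saved = []

def update_track(event, tracks, display_changed):
    if event["kind"] == "control" and display_changed:
        return []                                 # clear stale annotations
    if event["kind"] == "multi_finger":
        if event["gesture"] == "five_finger_pull_down":
            saved.append(list(tracks))            # save instruction
            return tracks
        if event["gesture"] == "five_finger_pull_up":
            return []                             # clear instruction
    return tracks
```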
Still in the foregoing embodiment, the annotations of the device correspond to the current display content of the device, and for the PPT, after the PPT pages, the annotations of the previous page do not correspond to the content after the page is turned, so that after the system of the device distributes the touch event to other applications except the annotation application, the system may further determine whether the current display interface changes after the other applications respond to the touch event, and if the current display interface changes, screen cleaning is performed on the previous annotations. The user may also be prompted to choose whether to save an existing annotation track before the screen is cleared.
In an alternative embodiment, corresponding multi-finger gestures can be set for the clearing and saving instructions; the system intercepts multi-finger events and executes the corresponding instruction according to the multi-finger operation. For example, the multi-finger operation may be five fingers swiping up, down, left, or right on the interface.
Optionally, according to the above embodiment of the present application, after receiving the first type of touch event, the method further includes: and displaying a suspension frame on the uppermost layer of the current display interface, wherein the suspension frame displays the functional control of the annotation application.
Specifically, the suspension frame contains functionality controls for controlling the annotation, such as changing the handwriting color and changing the handwriting thickness, and the suspension frame may disappear when no annotation track exists on the interface.
In an optional embodiment, the predetermined operation may also be an operation for triggering a functionality control of the annotation application. During handwriting writing (the writing state), a floating window (added via the Android WindowManager) is placed on the system interface, and corresponding functionality controls are added in the floating window; the floating window is displayed while written handwriting exists (the writing mode) and hidden otherwise.
Optionally, according to the above embodiment of the present application, after the floating frame is displayed on the uppermost layer of the interactive smart device, the method further includes: detecting whether a first type of touch event continues to be received within a preset time; if the first type of touch event is not received within the preset time, the floating frame disappears.
This scheme is used to clear the floating frame after the user stops annotating for the preset time. Still in the above embodiment, if the annotation application no longer detects a writing event of the electromagnetic pen within the preset time, the floating frame is controlled to disappear. When the user writes with the electromagnetic pen again, the floating frame reappears.
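The timeout behavior can be modeled as a small state machine; time units and names below are illustrative:

```python
# The floating frame hides when no writing event arrives within the preset
# time and reappears on the next writing event.

class FloatingFrame:
    def __init__(self, timeout):
        self.timeout = timeout
        self.last_write = None
        self.visible = False

    def on_write_event(self, now):
        self.last_write = now
        self.visible = True

    def tick(self, now):
        if self.visible and now - self.last_write >= self.timeout:
            self.visible = False
        return self.visible
```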
Optionally, according to the foregoing embodiment of the present application, updating the annotation trajectory according to the second type of touch event further includes: and if the touch event of the second type triggers the function control, updating the annotation track according to the triggered function control.
Specifically, the functional control is a control in the suspension frame. In an optional embodiment, when the user triggers an annotation track clearing control in the function control, the annotation application clears the current annotation. And when the user triggers a storage control in the function control, the annotation application stores the current annotation.
Optionally, according to the above embodiments of the present application, the functionality control includes any one or more of the following: the device comprises a commenting handwriting color selecting control, a commenting handwriting thickness selecting control, a commenting handwriting erasing control, a commenting track storing control and a commenting track clearing control.
Optionally, according to the foregoing embodiment of the present application, after displaying the annotation track corresponding to the first type of touch event, the method further includes: receiving an instruction for changing the annotation track sent by an electromagnetic pen, wherein the electromagnetic pen comprises a preset button corresponding to the instruction, and when the button is triggered, the electromagnetic pen sends the instruction corresponding to the button to the interactive intelligent device; and changing the annotation track according to the instruction.
Besides touch events, the above scheme also provides other operation modes for controlling the annotation application, for example, corresponding operation buttons are arranged on an electromagnetic pen (writing pen), each button has a corresponding operation, and when the button is pressed, the electromagnetic pen sends an instruction signal to the interactive intelligent device through bluetooth.
Optionally, according to the above embodiments of the present application, the first type of touch event and the second type of touch event are generated by different touch subjects.
In an alternative embodiment, the touch subject generating the first type of touch event may include an electromagnetic pen, the first type of touch event being generated by the electromagnetic pen and an electromagnetic screen contact; the touch subject generating the second type of touch event may include a hand, the second type of touch event being generated by a finger in contact with a capacitive screen or an infrared screen.
Example 3
According to an embodiment of the present invention, an embodiment of a processing apparatus for annotations of an interactive smart device is provided, fig. 7 is a flowchart of a processing method for annotations of an interactive smart device according to an embodiment of the present invention, and as shown in fig. 7, the apparatus includes:
the first starting module 70 is configured to start the annotation application and enter the interface-less annotation mode.
In the above scheme, an interface (view) would intercept all touch events, so that other applications could not receive them; this step therefore does not generate a view for displaying the annotation track after the annotation application is started. Since the annotation track needs to exist above the displayed content, starting the annotation application and entering the interface-free annotation mode includes covering a transparent layer on the uppermost layer of the current display interface, where the transparent layer is used to display the annotation track; further, the transparent layer is the mouse hardware display layer.
A first receiving module 72 is configured to receive a touch event.
The touch event may be generated by a user operating a touch frame of the device. After receiving the touch event, the device sends the touch data corresponding to the touch event to a system of the device, and the system distributes the touch data. In an alternative embodiment, taking the smart interactive tablet as an example, the touch frame of the smart interactive tablet may include an electromagnetic touch function, a capacitive touch function, and an infrared touch function, in which case, the touch event may be a touch event generated by a finger or a touch event generated by an electromagnetic pen.
And a detecting module 74, configured to detect whether the touch event is a predetermined type of touch event, where the predetermined type of touch event is used to generate an annotation trace.
The type of the touch event is determined according to the touch data, and specifically, the touch events generating the same touch data may be the same type of touch event. For example, still taking a touch screen including an electromagnetic touch function, a capacitive touch function, and an infrared touch function as an example, for an electromagnetic touch event, the generated touch data is an electromagnetic signal, and for an infrared touch event, the generated touch data is an infrared signal; for a capacitive touch event, the generated touch data is a capacitive signal, and thus, the touch event can be classified into several types, i.e., an electromagnetic touch event, an infrared touch event, and a capacitive touch event.
Specifically, the annotation trace is used to represent a trace generated by the annotation application according to the touch event. In an alternative embodiment, the predetermined type of touch event may be an electromagnetic touch event. When a system of the device detects that the received touch event is an electromagnetic touch event, an annotation trace can be generated from the touch event.
And a generating module 76, configured to generate an annotation trace according to the touch event if the touch event is determined to be a predetermined type of touch event.
In the above scheme, only when the touch event is a predetermined type of touch event, the annotation track is generated according to the touch event, that is, in the above scheme, the annotation application does not intercept all touch data.
In an alternative embodiment, as shown in fig. 2, taking the smart interactive tablet as an example, the predetermined type of touch event is an electromagnetic touch event. The smart interactive tablet displays the PPT, and the annotation application is started at any time before annotating. When the current page needs to be annotated, the user can write on the touch screen of the smart interactive tablet with an electromagnetic pen; the smart interactive tablet detects the electromagnetic touch event and generates an annotation track according to the touch event generated by the electromagnetic pen.
A distributing module 78, configured to distribute the touch event to an application corresponding to the touch event if it is determined that the touch event is not the predetermined type of touch event.
In the above solution, the touch event is distributed by the system of the device, and the touch event is distributed to which application by the system, and the touch event is processed by which application. The application corresponding to the touch event can be obtained according to a preset corresponding relation, and under the condition that the touch event is not a touch event of a predetermined type, the application corresponding to the touch event can be other applications except for the annotation application or the annotation application.
As can be seen from the above, the foregoing embodiment of the present application starts an annotation application, receives a touch event, detects whether the touch event is a predetermined type of touch event, generates an annotation track according to the touch event if it is determined that the touch event is of the predetermined type, and distributes the touch event to the application corresponding to the touch event if it is determined that the touch event is not of the predetermined type. In this scheme, annotation writing is separated from other operations by detecting the type of the touch event, so annotation writing does not affect the normal display logic of the device system; and because the annotation track is not drawn through the device system's normal drawing process, GPU usage can be reduced.
Optionally, according to the foregoing embodiment of the present application, the first starting module includes: and the covering submodule is used for covering a transparent layer on the uppermost layer of the current display interface, wherein the transparent layer is used for displaying the annotation track.
Optionally, according to the above embodiments of the present application, the transparent layer is a hardware display layer of the mouse.
Optionally, according to the foregoing embodiment of the present application, the generating module includes: the first distribution submodule is used for distributing the touch event to the annotation application; and the generation submodule is used for generating an annotation track according to the touch event by the annotation application.
After the system of the device receives the touch event, the touch event is distributed. In the above scheme, the touch event is distributed to the annotation application, so that the annotation application can draw an annotation track, and the annotation application generates the annotation track according to the touch event because the touch event is a touch event of a predetermined type.
Optionally, according to the above embodiment of the present application, the distribution module includes: the first detection submodule is used for detecting a target application below the transparent layer; and the second distribution submodule is used for distributing the touch event to the target application, wherein the target application responds to the touch event.
And the target application below the transparent layer is the application displayed on the current display interface. In an optional embodiment, the intelligent interactive tablet displays the PPT, and when an annotation application is started at any time before annotation, the intelligent interactive tablet enters an interface-free annotation mode, and a transparent layer is covered on the PPT, wherein the PPT application is a target application below the transparent layer.
Optionally, according to the above embodiment of the present application, a floating frame is displayed on an uppermost layer of the current display interface, where the floating frame is used to display a function control of an annotation application, and the distribution module includes: the second detection submodule is used for detecting whether the touch event is used for triggering a functional control of the annotation application; the first determining submodule is used for determining that the touch event is used for triggering a functional control of the annotation application; and the third distributing submodule is used for distributing the touch event to the annotation application.
Specifically, the functionality control of the annotation application can be used to control the annotation as follows: screen clearing, screen capturing, saving and the like, and can also comprise: eraser, selecting handwriting color, selecting handwriting thickness, etc.
According to the scheme, the touch event of the annotation application controlled by the user is distributed to the annotation application, so that the annotation application is controlled.
Fig. 3 is a schematic diagram of controlling an annotation application according to an embodiment of the present invention. In an alternative embodiment, with reference to fig. 3 and still taking the smart interactive tablet as an example, the predetermined type of touch event is an electromagnetic touch event. The smart interactive tablet displays the PPT, and after finishing an annotation, the user operates the touch screen with a designated gesture (a two-finger downward swipe) to save the current annotation. When the device system detects that the user's operation is a save-annotation operation, it distributes the touch event to the annotation application, and the annotation application performs the save operation.
Optionally, according to the above embodiment of the present application, the apparatus further includes: a second determining sub-module, configured to determine, after detecting whether the touch event is generated by a predetermined operation, that the touch event is not used to trigger a functionality control of the annotation application; and a fourth distribution sub-module, configured to distribute the touch event to the target application below the transparent layer.
Fig. 4 is a schematic diagram of controlling other applications according to an embodiment of the present invention. In an alternative embodiment, with reference to fig. 4 and still taking the smart interactive tablet as an example, the predetermined type of touch event is an electromagnetic touch event. The smart interactive tablet displays the PPT, and after the user finishes annotating, the page needs to be turned. When the user performs a page-turning operation (a leftward finger swipe), the device detects that the touch event of the page-turning operation is used to control the PPT application rather than the annotation application, so the touch event is distributed to the PPT application.
Still in the foregoing embodiment, the annotations on the device correspond to the device's current display content. For the PPT, after a page turn, the annotations on the previous page no longer correspond to the new content. Therefore, after the device system distributes a touch event to an application other than the annotation application, the system may further determine whether the current display interface changes once that application responds to the touch event; if it changes, the previous annotations are cleared from the screen. The user may also be prompted to choose whether to save the existing annotation track before the screen is cleared.
Optionally, according to the foregoing embodiment of the present application, the second detection sub-module includes: a detection unit, configured to detect whether the touch event is generated by a predetermined operation, where the predetermined operation includes any one or more of the following: a preset multi-finger operation; and an operation for triggering a functionality control of the annotation application.
In an alternative embodiment, the predetermined operation may be a preset multi-finger operation. In this example, a corresponding multi-finger gesture can be set for each operation; the system intercepts the multi-finger event and performs the corresponding handwriting-saving or handwriting-clearing action when five or more fingers swipe up, down, left, or right on the interface.
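A gesture-to-command mapping of this kind could be sketched as below. Note that the patent does not specify which swipe direction maps to which action, so the particular assignment here is an assumption for illustration.

```python
# Hypothetical mapping from preset multi-finger gestures to annotation
# commands: five or more fingers swiping in a direction triggers the
# corresponding save or clear action. The direction-to-command assignment
# is an assumption, not specified by the patent.

GESTURE_COMMANDS = {
    "swipe_up": "save",
    "swipe_down": "save",
    "swipe_left": "clear",
    "swipe_right": "clear",
}

def interpret_gesture(finger_count, direction):
    """Return the annotation command for a multi-finger gesture, or None."""
    if finger_count < 5:
        return None  # Not a preset multi-finger operation; do not intercept.
    return GESTURE_COMMANDS.get(direction)
```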
In an optional embodiment, the predetermined operation may also be an operation for triggering a functionality control of the annotation application. When handwriting is being written (in the writing state), a floating window (added in the Android manner) is added on the interface of the system, and corresponding functionality controls are added in the floating window. The floating window is displayed while written handwriting exists (in the writing mode) and is hidden otherwise. The predetermined operation may be an operation that triggers a control in the floating window.
Besides touch events, the annotation application can also be controlled in other ways. For example, corresponding operation buttons may be arranged on an electromagnetic pen (writing pen), each button having a corresponding operation; when a button is pressed, the electromagnetic pen sends an instruction signal to the smart interactive device via Bluetooth.
Optionally, according to the above embodiments of the present application, the type of the touch event includes any one or more of the following: electromagnetic pen touch events, capacitive touch events, and infrared touch events.
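Determining the event type from the type of touch signal, and checking it against the predetermined type, can be sketched as follows. The signal-source names and the choice of the electromagnetic type as the predetermined annotation type are illustrative assumptions consistent with the embodiments above.

```python
# Sketch of classifying a touch event by its signal source and checking
# whether it is the predetermined type used to generate annotation tracks.
# Signal-source names are illustrative.

PREDETERMINED_TYPE = "electromagnetic"

def classify_touch_event(signal_source):
    """Map a touch signal source to one of the touch event types."""
    mapping = {
        "electromagnetic_pen": "electromagnetic",
        "capacitive_screen": "capacitive",
        "infrared_screen": "infrared",
    }
    return mapping.get(signal_source, "unknown")

def is_annotation_event(signal_source):
    """True if the event should be routed to the annotation application."""
    return classify_touch_event(signal_source) == PREDETERMINED_TYPE
```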
Example 4
According to an embodiment of the present invention, an embodiment of a processing apparatus for annotations of an interactive smart device is provided. Fig. 8 is a schematic diagram of a processing apparatus for annotations of an interactive smart device according to an embodiment of the present invention; as shown in Fig. 8, the apparatus includes:
a second starting module 80, configured to start the annotation application and enter the interface-free annotation mode.
Specifically, the annotation application is used for displaying an annotation over the currently displayed content on the uppermost layer of the display interface of the smart interactive tablet.
A second receiving module 82 is configured to receive a touch event of the first type.
Specifically, the first type of touch event may be the predetermined type of touch event in step S15 of Embodiment 1; different types of touch events carry different touch data, and the first type of touch event may be an electromagnetic touch event.
The first display module 84 is configured to display an annotation track corresponding to a first type of touch event, where the first type of touch event is used to generate the annotation track.
In the above scheme, the smart interactive tablet distributes the touch event of the first type to the annotation application, the annotation application draws an annotation track according to the touch event of the first type, and the smart interactive tablet displays the drawn annotation track.
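The way the annotation application might accumulate an annotation track from a stream of first-type touch events can be sketched as below; the class name and the event fields (`x`, `y`) are hypothetical, introduced only for illustration.

```python
# Minimal sketch: the annotation application builds an annotation track by
# appending the sampled coordinates of each first-type (e.g. electromagnetic)
# touch event; the device then renders the resulting polyline.

class AnnotationTrack:
    def __init__(self):
        self.points = []

    def add_event(self, event):
        # Each touch event contributes one sampled point to the track.
        self.points.append((event["x"], event["y"]))

    def as_polyline(self):
        # The drawn annotation is the polyline through the sampled points.
        return list(self.points)

track = AnnotationTrack()
for ev in [{"x": 0, "y": 0}, {"x": 5, "y": 3}, {"x": 9, "y": 7}]:
    track.add_event(ev)
```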
In an alternative embodiment, taking the smart interactive tablet running the PPT as an example, the first type of touch event may be an electromagnetic touch event, and as shown in fig. 2, the pen in the drawing is an electromagnetic pen, and the smart interactive tablet displays an annotation track according to the touch event generated by the electromagnetic pen.
A third receiving module 86 for receiving a second type of touch event.
Specifically, the second type of touch event may be an infrared touch event or a capacitive touch event.
It should be noted that the steps executed by the second receiving module and the steps executed by the third receiving module have no fixed chronological order.
And a second display module 88, configured to display a response result obtained by the interactive smart device executing the second type of touch event.
Specifically, in the case that the touch event is not a predetermined type of touch event, the application executing the second type of touch event may be an application other than the annotation application, or it may be the annotation application itself.
In the above step, the smart interactive tablet distributes the touch event of the second type to the corresponding application, and the corresponding application responds to the touch event of the second type and displays the response result.
In an optional embodiment, taking the smart interactive tablet displaying a PPT as an example, if the second type of touch event is used for turning a page of the PPT, the smart interactive tablet distributes the touch event to the PPT application, the PPT application executes the page-turning action in response to the touch event, and the interactive smart device displays the page-turning result, that is, displays the next page of the PPT.
In another optional embodiment, still taking the smart interactive tablet to show the PPT as an example, if the touch event of the second type is used to erase the annotation trace, the smart interactive tablet distributes the touch event to the annotation application, the annotation application performs erasing of the annotation trace in response to the touch event, and the interactive smart device displays an erasing result of the annotation trace.
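The two routing cases just described can be sketched together as follows. The `intent` field and the result strings are illustrative assumptions; in practice the device would infer the intent from the touch data.

```python
# Sketch of routing a second-type (e.g. infrared or capacitive) touch event:
# an erase gesture goes to the annotation application, while other events
# (such as a page turn) go to the presentation application. The "intent"
# field is an assumption used to keep the example self-contained.

def route_second_type_event(event):
    if event.get("intent") == "erase_annotation":
        return "annotation_app"
    return "presentation_app"

def respond(event):
    """Return the displayed response result for a second-type touch event."""
    app = route_second_type_event(event)
    if app == "annotation_app":
        return "annotation trace erased"
    if event.get("intent") == "page_turn":
        return "next page displayed"
    return "no-op"
```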
Optionally, according to the foregoing embodiment of the present application, in a case that the second type of touch event is used to control the annotation application, the second display module includes: and the updating submodule is used for updating the annotation track according to the touch event of the second type.
Optionally, according to the foregoing embodiment of the present application, the update sub-module includes: the first updating unit is used for clearing or storing the annotation track if the content of the current display interface is changed by the control triggered by the second type of touch event; and the second updating unit is used for clearing or saving the annotation track according to an instruction corresponding to the preset multi-finger operation if the touch event of the second type is generated by the preset multi-finger operation.
Still in the foregoing embodiment, the annotations correspond to the content currently displayed by the device. For a PPT, after a page turn, the annotations made on the previous page no longer correspond to the newly displayed content. Therefore, after the system of the device distributes the touch event to an application other than the annotation application, it may further determine whether the current display interface changes once that application responds to the touch event; if the display interface changes, the previous annotations are cleared from the screen. The user may also be prompted to choose whether to save the existing annotation track before the screen is cleared.
In an alternative embodiment, corresponding multi-finger gestures can be set for the clearing and saving instructions; the system intercepts multi-finger events and executes the corresponding instruction according to the multi-finger operation. For example, the multi-finger operation may be five fingers swiping up, down, left, or right on the interface.
Optionally, according to the above embodiment of the present application, the apparatus further includes: a third display module, used for displaying, after the first type of touch event is received, a floating frame on the uppermost layer of the current display interface, wherein the floating frame displays a functionality control of the annotation application.
Specifically, the floating frame includes functionality controls for controlling the annotation, such as changing the handwriting color and changing the handwriting thickness; the floating frame may disappear when there is no annotation track on the interface.
Optionally, according to the above embodiment of the present application, the apparatus further includes: a detection module, used for detecting, after the floating frame is displayed on the uppermost layer of the interactive smart device, whether a first type of touch event continues to be received within a preset time; and a hiding module, used for making the floating frame disappear if no first type of touch event is received within the preset time.
This scheme clears the floating frame after the user has stopped annotating for the preset time. Still in the above embodiment, if the annotation application no longer detects a writing event from the electromagnetic pen within the preset time, the floating frame is controlled to disappear. When the user writes with the electromagnetic pen again, the floating frame reappears.
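The auto-hide behaviour can be sketched as a small state machine. Timestamps are passed in explicitly so the logic is easy to follow and test; in a real system they would come from the device clock. The class and method names are hypothetical.

```python
# Sketch of the floating-frame auto-hide behaviour: the frame is shown while
# first-type touch events keep arriving, and hidden once no such event has
# been received for a preset idle time.

class FloatingFrame:
    def __init__(self, timeout_s=5.0):
        self.timeout_s = timeout_s
        self.visible = False
        self.last_event_time = None

    def on_first_type_event(self, now):
        # Writing (re)started: show the frame and refresh the idle timer.
        self.visible = True
        self.last_event_time = now

    def tick(self, now):
        # Called periodically; hides the frame after the preset idle time.
        if self.visible and now - self.last_event_time >= self.timeout_s:
            self.visible = False
```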
Optionally, according to the foregoing embodiment of the present application, the update sub-module further includes: and the third updating unit is used for updating the annotation track according to the triggered function control if the touch event of the second type triggers the function control.
Specifically, the functionality control is a control in the floating frame. In an optional embodiment, when the user triggers the annotation-track clearing control among the functionality controls, the annotation application clears the current annotation; when the user triggers the saving control, the annotation application saves the current annotation.
Optionally, according to the above embodiments of the present application, the functionality control includes any one or more of the following: the device comprises a commenting handwriting color selecting control, a commenting handwriting thickness selecting control, a commenting handwriting erasing control, a commenting track storing control and a commenting track clearing control.
Optionally, according to the above embodiment of the present application, the apparatus further includes: a receiving module, used for receiving, after the annotation track corresponding to the first type of touch event is displayed, an instruction for changing the annotation track sent by an electromagnetic pen, wherein the electromagnetic pen includes a preset button corresponding to the instruction, and when the button is triggered, the electromagnetic pen sends the instruction corresponding to the button to the interactive smart device; and a changing module, used for changing the annotation track according to the instruction.
Besides touch events, the above scheme also provides other ways of controlling the annotation application. For example, corresponding operation buttons may be arranged on an electromagnetic pen (writing pen), each button having a corresponding operation; when a button is pressed, the electromagnetic pen sends an instruction signal to the interactive smart device via Bluetooth.
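The pen-button path can be sketched as a simple instruction dispatch. The button identifiers and command names here are assumptions for illustration; the patent does not specify which button performs which operation.

```python
# Hypothetical mapping of electromagnetic-pen buttons to annotation commands.
# When a button is pressed, the pen sends the corresponding instruction (over
# Bluetooth in the embodiment above); the device applies it to the track.
# Button ids and command names are assumptions.

PEN_BUTTON_COMMANDS = {1: "save_track", 2: "clear_track", 3: "change_color"}

def handle_pen_instruction(button_id, track_state):
    """Apply the instruction for a pressed pen button to the track state."""
    command = PEN_BUTTON_COMMANDS.get(button_id)
    if command == "save_track":
        track_state["saved"] = True
    elif command == "clear_track":
        track_state["points"] = []
    return command
```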
Optionally, according to the above embodiments of the present application, the first type of touch event and the second type of touch event are generated by different touch subjects.
In an alternative embodiment, the touch subject generating the first type of touch event may include an electromagnetic pen, the first type of touch event being generated by the electromagnetic pen contacting an electromagnetic screen; the touch subject generating the second type of touch event may include a hand, the second type of touch event being generated by a finger contacting a capacitive screen or an infrared screen.
Example 5
According to an embodiment of the present invention, there is provided an interactive smart device including a processor and a storage medium, where the storage medium includes a stored program and the processor is configured to run the program, and where the program, when executed, performs the following steps: starting the annotation application and entering an interface-free annotation mode; receiving a touch event; detecting whether the touch event is a touch event of a predetermined type, wherein the touch event of the predetermined type is used for generating an annotation track; if the touch event is determined to be a touch event of the predetermined type, generating an annotation track according to the touch event; and if it is determined that the touch event is not a touch event of the predetermined type, distributing the touch event to an application corresponding to the touch event.
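The program steps listed above can be sketched end to end as follows; all names are illustrative, and the transparent annotation layer is modeled only as a mode flag.

```python
# End-to-end sketch of the program steps: start annotation mode, receive a
# touch event, check whether it is of the predetermined type, and either
# extend the annotation track or dispatch the event to another application.

PREDETERMINED_TYPE = "electromagnetic"

class Device:
    def __init__(self):
        self.annotation_mode = False
        self.track = []        # points drawn on the transparent layer
        self.dispatched = []   # events forwarded to other applications

    def start_annotation_mode(self):
        # Corresponds to covering the display with the transparent layer.
        self.annotation_mode = True

    def receive(self, event):
        if event["type"] == PREDETERMINED_TYPE:
            # Predetermined type: generate the annotation track.
            self.track.append((event["x"], event["y"]))
        else:
            # Otherwise: distribute to the corresponding application.
            self.dispatched.append(event)
```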
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present invention. It should be noted that, for those skilled in the art, various modifications and refinements can be made without departing from the principle of the present invention, and these modifications and refinements should also be regarded as falling within the protection scope of the present invention.

Claims (17)

1. A processing method for annotation of an interactive intelligent device is characterized by comprising the following steps:
starting the annotation application and entering an interface-free annotation mode;
receiving a touch event;
detecting whether the touch event is a touch event of a preset type, wherein the touch event of the preset type is used for generating an annotation track, and the type of the touch event is determined according to the type of the touch signal;
if the touch event is determined to be the touch event of the preset type, generating an annotation track according to the touch event;
distributing the touch event to an application corresponding to the touch event if it is determined that the touch event is not the predetermined type of touch event,
starting the annotation application and entering an interface-free annotation mode, wherein the interface-free annotation mode comprises the following steps: covering a transparent layer on the uppermost layer of the current display interface, wherein the transparent layer is used for displaying the annotation track,
distributing the touch event to an application corresponding to the touch event if it is determined that the touch event is not the predetermined type of touch event, including:
detecting a target application below the transparent layer;
distributing the touch event to the target application, wherein the target application responds to the touch event,
besides the touch event, the method also comprises the steps of controlling the annotation application through other operation modes, setting corresponding operation buttons, wherein each button has corresponding operation, and sending an instruction signal to the intelligent interaction equipment when the button is pressed.
2. The method of claim 1, wherein the transparent layer is a mouse hardware display layer.
3. The method of claim 1 or 2, wherein if the touch event is determined to be the predetermined type of touch event, generating an annotation trace according to the touch event comprises:
distributing the touch event to the annotation application;
the annotation application generates the annotation trace according to the touch event.
4. The method according to claim 1 or 2, wherein a floating frame is displayed on the uppermost layer of the current display interface, wherein the floating frame is used for displaying a function control of the annotation application, and if the touch event is determined not to be the predetermined type of touch event, distributing the touch event to an application corresponding to the touch event comprises:
detecting whether the touch event is used for triggering a function control of the annotation application;
if it is determined that the touch event is used for triggering the functionality control of the annotation application,
distributing the touch event to the annotation application.
5. The method of claim 4, wherein after detecting whether the touch event is used to trigger a functionality control of the annotation application, the method further comprises:
if it is determined that the touch event is not used for triggering the functionality control of the annotation application,
distributing the touch event to a target application below the transparent layer.
6. The method of claim 4, wherein detecting whether the touch event is used to trigger a functionality control of the annotation application comprises: detecting whether the touch event is generated by a predetermined operation, wherein the predetermined operation comprises any one or more of the following operations:
performing preset multi-finger operation;
the operation of the function control used for triggering the annotation application is triggered.
7. The method of claim 1, wherein the type of touch event comprises any one or more of: electromagnetic pen touch events, capacitive touch events, and infrared touch events.
8. A processing method for annotation of an interactive intelligent device is characterized by comprising the following steps:
starting the annotation application and entering an interface-free annotation mode;
receiving a first type of touch event, wherein the type of the touch event is determined according to the type of the touch signal;
displaying an annotation track corresponding to the first type of touch event, wherein the first type of touch event is used for generating the annotation track;
receiving a second type of touch event;
displaying a response result from the interactive smart device executing the second type of touch event,
displaying a response result obtained by the interactive smart device executing the second type of touch event under the condition that the second type of touch event is used for controlling the annotation application, wherein the response result comprises: updating the annotation trajectory in accordance with the second type of touch event,
updating the annotation trajectory according to the second type of touch event, including:
if the control triggered by the second type of touch event changes the content of the current display interface, clearing or storing the annotation track;
if the touch event of the second type is generated by a preset multi-finger operation, clearing or saving the annotation track according to an instruction corresponding to the preset multi-finger operation,
besides the touch event, the method also comprises the steps of controlling the annotation application through other operation modes, setting corresponding operation buttons, wherein each button has corresponding operation, and sending an instruction signal to the intelligent interaction equipment when the button is pressed.
9. The method of claim 8, wherein after receiving the first type of touch event, the method further comprises:
displaying a floating frame on the uppermost layer of the current display interface, wherein the floating frame displays a functionality control of the annotation application.
10. The method of claim 9, wherein after the interactive smart device displays a floating frame on a top level, the method further comprises:
detecting whether the first type of touch event is continuously received within a preset time;
and if the first type of touch event is not received within the preset time, the floating frame disappears.
11. The method of claim 10, wherein updating the annotation trace in accordance with the second type of touch event further comprises:
and if the touch event of the second type triggers the functional control, updating the annotation track according to the triggered functional control.
12. The method of claim 9, wherein the functionality control comprises any one or more of: the device comprises a commenting handwriting color selecting control, a commenting handwriting thickness selecting control, a commenting handwriting erasing control, a commenting track storing control and a commenting track clearing control.
13. The method of claim 8, wherein after displaying an annotation trace corresponding to the first type of touch event, the method further comprises:
receiving an instruction for changing the annotation track sent by an electromagnetic pen, wherein the electromagnetic pen comprises a preset button corresponding to the instruction, and when the button is triggered, the electromagnetic pen sends the instruction corresponding to the button to the interactive intelligent device;
and changing the annotation track according to the instruction.
14. The method of claim 8, wherein the first type of touch event and the second type of touch event are generated by different touch subjects.
15. A processing apparatus of annotations of an interactive smart device, comprising:
the first starting module is used for starting the annotation application and entering an interface-free annotation mode;
a first receiving module for receiving a touch event;
the detection module is used for detecting whether the touch event is a touch event of a preset type, wherein the touch event of the preset type is used for generating an annotation track, and the type of the touch event is determined according to the type of the touch signal;
the generating module is used for generating an annotation track according to the touch event if the touch event is determined to be the touch event of the preset type;
a distribution module to distribute the touch event to an application corresponding to the touch event if it is determined that the touch event is not the predetermined type of touch event,
the first initiating module includes: an overlay sub-module for overlaying a transparent layer on an uppermost layer of the current display interface, wherein the transparent layer is used for displaying the annotation track,
the distribution module includes: a first detection sub-module for detecting a target application below the transparent layer; a second distribution submodule for distributing the touch event to the target application, wherein the target application responds to the touch event,
besides the touch event, the method also comprises the steps of controlling the annotation application through other operation modes, setting corresponding operation buttons, wherein each button has corresponding operation, and sending an instruction signal to the intelligent interaction equipment when the button is pressed.
16. A processing apparatus of annotations of an interactive smart device, comprising:
the second starting module is used for starting the annotation application and entering an interface-free annotation mode;
the second receiving module is used for receiving the touch event of the first type, and the type of the touch event is determined according to the type of the touch signal;
the first display module is used for displaying an annotation track corresponding to the first type of touch event, wherein the first type of touch event is used for generating the annotation track;
a third receiving module, configured to receive a second type of touch event;
a second display module for displaying a response result obtained by the interactive smart device executing the second type of touch event,
in a case where the second type of touch event is used to control the annotation application, the second display module includes: an update submodule for updating the annotation trajectory in accordance with the second type of touch event,
the update submodule includes: the first updating unit is used for clearing or storing the annotation track if the content of the current display interface is changed by the control triggered by the second type of touch event; and the second updating unit is used for clearing or saving the annotation track according to an instruction corresponding to the preset multi-finger operation if the touch event of the second type is generated by the preset multi-finger operation, controlling the annotation application in other operation modes except the touch event, setting corresponding operation buttons, wherein each button has corresponding respective operation, and sending an instruction signal to the intelligent interaction equipment when the button is pressed.
17. An interactive smart device comprising a processor and a storage medium, the storage medium comprising a stored program, wherein the processor is configured to execute the program, wherein the program when executed performs the steps of: starting the annotation application and entering an interface-free annotation mode; receiving a touch event; detecting whether the touch event is a touch event of a preset type, wherein the touch event of the preset type is used for generating an annotation track, and the type of the touch event is determined according to the type of the touch signal; if the touch event is determined to be the touch event of the preset type, generating an annotation track according to the touch event; if the touch event is determined not to be the touch event of the preset type, distributing the touch event to an application corresponding to the touch event, starting an annotation application, and entering an interface-free annotation mode, wherein the method comprises the following steps: covering a transparent layer on the uppermost layer of the current display interface, wherein the transparent layer is used for displaying the annotation track, and if the touch event is determined not to be the touch event of the predetermined type, distributing the touch event to an application corresponding to the touch event, and the method comprises the following steps: detecting a target application below the transparent layer; and distributing the touch event to the target application, wherein the target application responds to the touch event, controls the annotation application in other operation modes except the touch event, sets corresponding operation buttons, each button has corresponding respective operation, and sends an instruction signal to the intelligent interaction device when the button is pressed.
CN201810387694.7A 2018-04-26 2018-04-26 Method and device for processing annotations of interactive intelligent equipment Active CN110413187B (en)

Publications (2)

Publication Number Publication Date
CN110413187A CN110413187A (en) 2019-11-05
CN110413187B true CN110413187B (en) 2021-12-03




Also Published As

Publication number Publication date
CN110413187A (en) 2019-11-05

Similar Documents

Publication Publication Date Title
CN110413187B (en) Method and device for processing annotations of interactive intelligent equipment
CN108829327B (en) Writing method and device of interactive intelligent equipment
US7880720B2 (en) Gesture recognition method and touch system incorporating the same
KR101814391B1 (en) Edge gesture
EP2715491B1 (en) Edge gesture
US20160004373A1 (en) Method for providing auxiliary information and touch control display apparatus using the same
US20140210797A1 (en) Dynamic stylus palette
US9317171B2 (en) Systems and methods for implementing and using gesture based user interface widgets with camera input
EP3491506B1 (en) Systems and methods for a touchscreen user interface for a collaborative editing tool
EP1942399A1 (en) Multi-event input system
JP7233109B2 (en) Touch-sensitive surface-display input method, electronic device, input control method and system with tactile-visual technology
CN110968227B (en) Control method and device of intelligent interactive panel
US20220221970A1 (en) User interface modification
CN108762657A (en) Operating method, device and the intelligent interaction tablet of intelligent interaction tablet
CN114501108A (en) Display device and split-screen display method
US10222866B2 (en) Information processing method and electronic device
US9323431B2 (en) User interface for drawing with electronic devices
CN110888581A (en) Element transfer method, device, equipment and storage medium
KR20210023434A (en) Display apparatus and control method thereof
CN110647268A (en) Control method and control device for display window in game
CN115562544A (en) Display device and revocation method
CN103777885B (en) Drawing platform, touch panel device, and computer-implemented method
CN114442849B (en) Display equipment and display method
WO2023273462A1 (en) Display device and color filling method
CN113918069A (en) Information interaction method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant