CN110968227A - Control method and device of intelligent interactive panel

Info

Publication number: CN110968227A
Application number: CN201910955725.9A
Authority: CN (China)
Other languages: Chinese (zh)
Other versions: CN110968227B (granted publication)
Inventor: 吴佳宝
Current Assignee: Guangzhou Shiyuan Electronics Thecnology Co Ltd; Guangzhou Shizhen Information Technology Co Ltd
Original Assignee: Guangzhou Shiyuan Electronics Thecnology Co Ltd; Guangzhou Shizhen Information Technology Co Ltd
Application filed by Guangzhou Shiyuan Electronics Thecnology Co Ltd and Guangzhou Shizhen Information Technology Co Ltd
Priority: CN201910955725.9A
Legal status: Granted; Active
Prior art keywords: layer, writing, application interface, touch operation, whiteboard application


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 - Selection of displayed objects or displayed text elements
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Abstract

The application discloses a control method and device for an intelligent interactive tablet. The method comprises: receiving a touch operation on a whiteboard application interface and acquiring the type of the touch operation, where the type at least includes a writing operation for writing on the whiteboard application interface and a selection operation for selecting an element on the whiteboard application interface; when the touch operation is a writing operation, generating writing handwriting according to the touch operation; and when the touch operation is a selection operation, adjusting the hierarchy, in the whiteboard application interface, of the element selected by the selection operation. The method and device solve the technical problem in the prior art that, when other types of elements are inserted into the whiteboard, control operations on those elements conflict with the user's handwriting touch operations.

Description

Control method and device of intelligent interactive panel
Technical Field
The application relates to the field of interactive intelligent equipment, in particular to a control method and device of an intelligent interactive tablet.
Background
Existing whiteboard software provides a writing plane and typically works in three modes: a writing mode, a selection mode and an erasing mode. The writing mode is mainly used for handwriting; the basic operable elements on the whiteboard are strokes formed by points and lines, handwriting is generated at the touch points the user leaves on the whiteboard, and operations such as changing stroke thickness or color and undoing strokes are available. In the selection mode, handwriting that the user's touch track passes through, or that lies inside a circled region, is selected and can then be moved, zoomed and so on. In the erasing mode, an eraser appears where the user touches, and handwriting along the eraser's track is erased.
In the process of using the whiteboard, other types of elements such as pictures may need to be displayed. A user may open a picture browser or other software to display pictures and other element types, but when several applications are open, only one of them is displayed at the front of the screen and interacts with the user while the others are hidden, so pictures cannot be displayed while writing at the same time, and the user has to switch between the two applications for handwriting and picture operations, which is tedious.
Some existing whiteboard software supports inserting other elements, but an inserted element can only be operated after switching from the writing mode to the selection mode, and operating the element conflicts with the action of generating handwriting from the user's touch; that is, the whiteboard cannot easily tell whether a touch is meant to write or to control an inserted element. For example, fig. 1 is a schematic diagram of a picture inserted on the whiteboard: when the user's touch point falls on the picture and moves to the left, the whiteboard cannot easily determine whether the user wants to move the picture or to write.
No effective solution has yet been proposed for the problem in the prior art that, when other types of elements are inserted into the whiteboard, control operations on those elements conflict with the operation of generating handwriting from the user's touch.
Disclosure of Invention
The embodiments of the application provide a control method and device for an intelligent interactive tablet, solving the technical problem in the prior art that, when other types of elements are inserted into the whiteboard, control operations on those elements conflict with the user's handwriting touch operations.
According to an aspect of an embodiment of the present application, there is provided a method for controlling an intelligent interactive tablet, including: receiving a touch operation on a whiteboard application interface, and acquiring the type of the touch operation, wherein the type at least comprises: a writing operation for writing on the whiteboard application interface and a selecting operation for selecting an element on the whiteboard application interface; when the touch operation is writing operation, generating writing handwriting according to the touch operation; and when the touch operation is the selection operation, adjusting the hierarchy of the element selected by the selection operation in the whiteboard application interface according to the selection operation.
Further, a plurality of elements are opened on the whiteboard application interface, the whiteboard application interface includes a writing layer for bearing writing content and a plurality of element layers for bearing elements, and the type of the touch operation is obtained, including: when the touch operation is generated on the writing layer, acquiring the touch operation with the stay time length at the initial position being greater than or equal to the preset time length as a selection operation, and acquiring the touch operation with the stay time length at the initial position being less than the preset time length as a writing operation; and when the touch operation is generated on the element layer, acquiring the touch operation as a selection operation.
Further, when the touch operation is a writing operation, generating writing according to the touch operation, including: when the writing layer is positioned on the top layer of the whiteboard application interface, generating writing handwriting according to the writing operation; and when the element layer is positioned on the top layer of the whiteboard application interface, displaying the writing layer on the top layer, and generating writing handwriting according to the writing operation.
Further, when the touch operation is a selection operation, according to the selection operation, adjusting the hierarchy of the element selected by the selection operation in the whiteboard application interface, including: and displaying the element layer selected by the selection operation on the top layer of the whiteboard application interface.
Further, after the element layer selected by the selection operation is displayed on the top layer of the whiteboard application interface, receiving a dragging operation, and moving the element selected by the dragging operation according to a dragging track of the dragging operation; receiving a zooming operation, and zooming the element selected by the zooming operation according to the zooming operation; and receiving a deleting operation, and clearing elements selected by the deleting operation from the whiteboard application interface according to the deleting operation.
According to an aspect of an embodiment of the present application, there is provided a method for controlling an intelligent interactive tablet, including: detecting a touch event on a whiteboard application interface; analyzing and obtaining a touch operation for generating a touch event, and judging the type of the touch operation, wherein the type at least comprises: a writing operation for writing on the whiteboard application interface and a selecting operation for selecting an element on the whiteboard application interface; when the touch operation is writing operation, generating writing handwriting according to the touch operation; and when the touch operation is the selection operation, adjusting the hierarchy of the element selected by the selection operation in the whiteboard application interface according to the selection operation.
Further, a plurality of elements are opened on the whiteboard application interface, the whiteboard application interface includes a writing layer for bearing writing content and a plurality of element layers for bearing elements, and the determining of the type of the touch operation includes: judging a subject acted by the touch operation; when the main body acted by the touch operation is the element layer, determining the touch operation as a selection operation; and when the main body acted by the touch operation is the writing layer, determining the type of the touch operation according to the stay time of the touch operation at the initial position.
Further, when the main body receiving the touch operation is the writing layer, determining the type of the touch operation according to the time length of the touch operation staying at the initial position, including: if the time length of the touch operation staying at the initial position is greater than or equal to the preset time length, determining the touch operation as a selection operation; and if the time length of the touch operation staying at the initial position is less than the preset time length, determining that the touch operation is writing operation.
Further, when the touch operation is a writing operation, generating writing according to the touch operation, including: when the writing layer is positioned on the top layer of the whiteboard application interface, generating writing handwriting according to the writing operation; and when the element layer is positioned at the top layer of the whiteboard application interface, adjusting the writing layer to the top layer of the whiteboard application interface, and generating writing handwriting according to the writing operation.
Further, when the touch operation is a selection operation, according to the selection operation, adjusting the hierarchy of the element selected by the selection operation in the whiteboard application interface, including: when the selection operation falls on any element graph layer on the upper layer of the writing graph layer, determining the element graph layer in which the selection operation falls as a target graph layer; placing the target graph layer on the top layer of the whiteboard application interface; and reducing the element layers with the layer levels higher than the target layer by one layer.
Further, when the touch operation is a selection operation, adjusting, according to the selection operation, the hierarchy of the element selected by the selection operation in the whiteboard application interface includes: when an element layer below the writing layer is selected by the selection operation, obtaining the first element layers hit by the selection operation and the second element layers higher than the first element layers; arranging the levels of the first element layers in ascending order to obtain a first array, and arranging the levels of the second element layers in ascending order to obtain a second array; and raising the levels represented by the first array by an equal step until the maximum value in the first array equals the number of layers opened on the whiteboard application interface, and lowering the levels in the second array by an equal step until the minimum value in the second array equals the minimum level among the first element layers.
Further, a plurality of elements are opened on the whiteboard application interface, the whiteboard application interface comprises a writing layer for bearing writing content and a plurality of element layers for bearing the elements, and each element layer is provided with a float marker displayed on the top layer of the whiteboard application interface. The step of judging the type of the touch operation further comprises: if the touch operation falls within the float marker of any layer, determining that the touch operation is a selection operation. Adjusting, according to the selection operation, the hierarchy of the element selected by the selection operation in the whiteboard application interface comprises: adjusting the layer corresponding to the float marker indicated by the touch operation to the top layer of the whiteboard.
Furthermore, a plurality of elements are opened on the whiteboard application interface, and the whiteboard application interface includes a writing layer for bearing writing content and a plurality of element layers for bearing elements. The step of determining the type of the touch operation further includes: if the touch operation is used to call out a layer relation table of the whiteboard application interface, determining that the touch operation is a selection operation, where the layer relation table includes the elements opened by the whiteboard application interface and the levels of their layers. Adjusting, according to the selection operation, the hierarchy of the element selected by the selection operation in the whiteboard application interface includes: detecting a dragging operation on any element in the layer relation table, where the dragging operation is used to drag the identifier of that element from a first position in the layer relation table to a second position in the layer relation table; and exchanging the hierarchy of the elements represented by the element identifier at the first position and the element identifier at the second position.
Further, after the target image layer is placed on the top layer of the whiteboard application interface, a dragging event acting on the target image layer is received, and elements in the target image layer are moved according to a dragging track of the dragging event; receiving a zooming instruction acting on a target layer, and zooming the elements in the target layer according to the zooming instruction; and receiving a deleting instruction acting on the target layer, and clearing the elements in the target layer from the whiteboard application interface according to the deleting instruction.
According to an aspect of an embodiment of the present application, there is provided a control apparatus of an intelligent interactive tablet, including: the receiving module is used for receiving touch operation on the whiteboard application interface and acquiring the type of the touch operation, wherein the type at least comprises: a writing operation for writing on the whiteboard application interface and a selecting operation for selecting an element on the whiteboard application interface; the generating module is used for generating writing handwriting according to the touch operation when the touch operation is the writing operation; and the adjusting module is used for adjusting the hierarchy of the element selected by the selection operation in the whiteboard application interface according to the selection operation when the touch operation is the selection operation.
According to an aspect of the embodiments of the present application, there is provided a computer storage medium storing a plurality of instructions, the instructions being adapted to be loaded by a processor and to execute the above-mentioned control method of the smart interactive tablet.
According to an aspect of an embodiment of the present application, there is provided an intelligent interactive tablet, including: a processor and a memory; the memory stores a computer program, and the computer program is suitable for being loaded by the processor and executing the control method of the intelligent interactive tablet.
In an embodiment of the present application, in the above embodiment of the present application, a touch operation on a whiteboard application interface is received, and a type of the touch operation is obtained, where the type at least includes: a writing operation for writing on the whiteboard application interface and a selecting operation for selecting an element on the whiteboard application interface; when the touch operation is writing operation, generating writing handwriting according to the touch operation; and when the touch operation is the selection operation, adjusting the hierarchy of the element selected by the selection operation in the whiteboard application interface according to the selection operation. According to the scheme, the type of the touch operation is acquired after the touch operation is received, so that the corresponding action can be executed according to the type of the touch operation, the technical problem that in the prior art, when the whiteboard inserts other types of elements, the control operation of the elements conflicts with the operation of generating handwriting by touching of a user is solved, and the elements in any layer can be controlled by the user through adjustment of the layer, so that the effect of convenience and quickness in operation is achieved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 is a schematic illustration of a whiteboard with a picture inserted;
fig. 2 is a flowchart of a control method of an intelligent interactive tablet according to an embodiment of the present application;
FIG. 3a is a schematic diagram of layers in a whiteboard application interface;
FIG. 3b is a display effect diagram of FIG. 3a;
FIG. 4a is a schematic diagram of layers in another whiteboard application interface;
FIG. 4b is a display effect diagram of FIG. 4a;
FIG. 5a is a schematic diagram of multiple element layers located above the writing layer according to an embodiment of the present application;
FIG. 5b is a display effect diagram of FIG. 5a;
FIG. 5c is a schematic view of the hierarchy after clicking picture 1 in the example shown in FIG. 5a;
FIG. 5d is a display effect diagram of FIG. 5c;
FIG. 6a is a schematic diagram illustrating an operation of modifying the hierarchy of layers according to an embodiment of the present application;
FIG. 6b is a display effect diagram of FIG. 6a;
FIG. 7a is a diagram illustrating a plurality of element layers all disposed below the writing layer according to an embodiment of the present application;
FIG. 7b is a display effect diagram of FIG. 7a;
FIG. 7c is a schematic diagram of a long press operation in the example shown in FIG. 7a according to an embodiment of the present application;
FIG. 7d is a schematic diagram of the layer hierarchy after the long press operation shown in FIG. 7c;
FIG. 7e is the display effect diagram of FIG. 7d;
FIG. 8a is a schematic diagram of a drag element according to an embodiment of the present application;
FIG. 8b is a schematic diagram of a zoom element according to an embodiment of the present application;
FIG. 8c is a schematic diagram of a delete element according to an embodiment of the present application;
fig. 9 is a flowchart of another control method of an intelligent interactive tablet according to an embodiment of the present application;
FIG. 10 is a schematic diagram illustrating adjustment of layer levels by floats according to an embodiment of the present disclosure;
FIG. 11a is a diagram illustrating a layer relationship table according to an embodiment of the present application;
FIG. 11b is a diagram illustrating an adjustment of layers through a layer relation table according to an embodiment of the present application;
fig. 12 is a schematic diagram of a control device of an intelligent interactive tablet according to an embodiment of the application; and
fig. 13 is a schematic diagram of another control device of an intelligent interactive tablet according to an embodiment of the application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
It should be understood that the embodiments described are only a few embodiments of the present application, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the application, as detailed in the appended claims.
In the description of the present application, it is to be understood that the terms "first," "second," "third," and the like are used solely for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order, nor should be construed to indicate or imply relative importance. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art as appropriate. Further, in the description of the present application, "a plurality" means two or more unless otherwise specified. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
The scheme provided by the application can be applied to an intelligent interactive tablet. The hardware of the intelligent interactive tablet comprises components such as a display module and an intelligent processing system (including a controller), combined into an integral structure and supported by a dedicated software system. The display module includes a display screen and a backlight assembly, and the display screen includes a transparent conductive layer, a liquid crystal layer and the like.
The display screen, in the embodiments of this specification, refers to a touch screen or touch panel, i.e. an inductive liquid crystal display device: when a graphic button on the screen is touched, the tactile feedback system on the screen drives various connected devices according to a pre-programmed program, so that it can replace a mechanical button panel, and vivid video effects can be produced by the liquid crystal display. By technical principle, touch screens can be divided into five basic categories: vector pressure sensing touch screens, resistive touch screens, capacitive touch screens, infrared touch screens, and surface acoustic wave touch screens. According to the working principle of the touch screen and the medium used to transmit information, touch screens can also be divided into four categories: resistive, capacitive, infrared, and surface acoustic wave.
When a user touches the screen with a finger or a pen, the point coordinates are positioned, so that the control of the intelligent processing system is realized, and then different functional applications are realized along with the built-in software of the intelligent processing system.
Example 1
In accordance with an embodiment of the present application, an embodiment of a control method for an intelligent interactive tablet is provided. It should be noted that the steps illustrated in the flowchart of the drawings may be performed in a computer system, such as a set of computer executable instructions, and that, although a logical order is shown in the flowchart, in some cases the steps illustrated or described may be performed in an order different from that described here.
Fig. 2 is a flowchart of a control method of an intelligent interactive tablet according to an embodiment of the present application, and as shown in fig. 2, the method includes the following steps:
step S202, receiving a touch operation on the whiteboard application interface, and acquiring the type of the touch operation, wherein the type at least comprises: a writing operation for writing on the whiteboard application interface and a selection operation for selecting an element on the whiteboard application interface.
Specifically, the whiteboard application interface is the operation interface provided by the whiteboard application, and can be operated in full screen or reduced to a window of a certain size. The touch operation on the whiteboard application interface may be an operation performed by touching the touch screen with a finger, or a touch operation generated on the whiteboard application interface by an electromagnetic pen or other writing equipment.
After a touch operation is received, its type is determined first, which decides how to respond to it. In the above scheme, it is necessary to determine whether the received touch operation is a writing operation or a selection operation, so as to decide whether to generate handwriting on the whiteboard application interface or to select another element on the whiteboard application interface.
As an optional embodiment, a plurality of elements are opened on a whiteboard application interface, the whiteboard application interface includes a writing layer for bearing writing content and a plurality of element layers for bearing elements, and obtaining the type of the touch operation includes:
step S2021, when the touch operation is generated on the writing layer, acquiring a touch operation with a time length of the initial position being greater than or equal to a preset time length as a selection operation, and acquiring a touch operation with a time length of the initial position being less than the preset time length as a writing operation.
Step S2023 acquires the touch operation as the selection operation when the touch operation is generated on the element layer.
Specifically, a touch operation whose dwell time at the initial position is greater than or equal to the preset duration may be a long-press operation, and a touch operation whose dwell time at the initial position is less than the preset duration may be a click, tap, slide and the like.
When the touch operation is generated on the writing layer and is a long-press operation, it is a selection operation used to select another element below the writing layer. When the touch operation is not a long press: if the writing layer is on the top layer of the whiteboard application interface, writing is performed directly according to the touch operation; if the writing layer is not on the top layer, the writing layer is adjusted to the top layer of the whiteboard application interface and writing proceeds directly, without the user having to first select the writing layer and then write.
And when the touch operation is generated on the element layer, the touch operation is a selection operation, the element layer is selected according to the selection operation, and the element layer is adjusted to the top layer of the whiteboard application interface.
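The decision rule described in the steps above can be sketched in code. The following is a minimal sketch only: the Layer model, the TouchType names and the 500 ms threshold are illustrative assumptions, since the patent does not fix a concrete data model or preset duration.

```kotlin
// Minimal sketch of the classification rule: a touch generated on an element
// layer is a selection; a touch generated on the writing layer is a selection
// only if it dwells at its initial position for at least the preset duration,
// otherwise it is a writing operation.
enum class TouchType { WRITING, SELECTION }

data class Layer(val id: Int, val isWritingLayer: Boolean, var level: Int)

const val PRESET_DWELL_MS = 500L  // assumed value; the patent only says "preset time length"

fun classifyTouch(targetLayer: Layer, dwellAtInitialPositionMs: Long): TouchType = when {
    !targetLayer.isWritingLayer -> TouchType.SELECTION                  // generated on an element layer
    dwellAtInitialPositionMs >= PRESET_DWELL_MS -> TouchType.SELECTION  // long press on the writing layer
    else -> TouchType.WRITING                                           // click/tap/slide on the writing layer
}
```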
The element on the whiteboard application interface may be an element inserted in the whiteboard application interface. In an alternative embodiment, at least one layer can be inserted into the current whiteboard application interface through the detected insertion instruction. Specifically, the element to be inserted may be a picture, a text, or the like, or may also be an audio file, and if the element is an audio file, the displayed element may be a control icon for controlling audio playing or stopping.
Taking a whiteboard application as an example, fig. 3a is a schematic diagram of layers in a whiteboard application interface. As shown in fig. 3a, when no other element is inserted into the whiteboard application interface, the whiteboard includes two layers: the whiteboard background on layer 0 and the writing plane on layer 1. The writing plane is located above the whiteboard background; the whiteboard background mainly determines the background color of the whiteboard, while the writing plane mainly bears the user's writing operations, and touch traces on the writing plane generate handwriting. At this point, drawing a straight line on the whiteboard produces the effect shown in fig. 3b.
Still taking a whiteboard application as an example, fig. 4a is a schematic diagram of layers in another whiteboard application interface; fig. 4a can be obtained by inserting picture 1 into fig. 3a, and the corresponding display effect diagram is shown in fig. 4b. After picture 1 is inserted on the writing plane, the whiteboard contains three layers: the whiteboard background on layer 0, the writing plane on layer 1, and the picture on layer 2.
And S204, when the touch operation is writing operation, generating writing handwriting according to the touch operation.
As an alternative embodiment, when the touch operation is a writing operation, generating handwriting according to the touch operation includes:
and step S2041, when the writing layer is located on the top layer of the whiteboard application interface, generating writing handwriting according to the writing operation.
And step S2043, when the element layer is positioned on the top layer of the whiteboard application interface, displaying the writing layer on the top layer, and generating writing handwriting according to the writing operation.
Specifically, the touch operation is generated in the writing layer, and is used for indicating that a first point (namely a down point) falling from the touch operation is in the writing layer, and when the touch operation falls, if the writing layer is positioned on the top layer of the whiteboard application interface, writing handwriting is directly generated according to the writing operation; and if the writing layer is not positioned at the top layer of the whiteboard application interface, namely the element layer is positioned at the top layer of the whiteboard application interface, displaying the writing layer at the top layer, and generating writing handwriting according to the writing operation.
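As an illustration of generating handwriting from the touch track, the sketch below strokes the track onto a canvas using the standard Android graphics classes (Path, Paint, Canvas). The helper name, pen width and colour are assumptions standing in for the adjustable pen options mentioned in the background; the patent itself does not prescribe this API.

```kotlin
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Paint
import android.graphics.Path

// Turn the touch track of a writing operation into a stroked path on the
// writing layer's canvas.
fun drawHandwriting(canvas: Canvas, track: List<Pair<Float, Float>>) {
    if (track.isEmpty()) return
    val path = Path().apply {
        moveTo(track.first().first, track.first().second)
        track.drop(1).forEach { (x, y) -> lineTo(x, y) }
    }
    val pen = Paint(Paint.ANTI_ALIAS_FLAG).apply {
        style = Paint.Style.STROKE
        strokeWidth = 6f         // pen thickness, user-adjustable in the whiteboard
        color = Color.BLACK      // pen colour, user-adjustable in the whiteboard
    }
    canvas.drawPath(path, pen)
}
```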
And step S206, when the touch operation is a selection operation, adjusting the hierarchy of the element selected by the selection operation in the whiteboard application interface according to the selection operation.
As an alternative embodiment, when the touch operation is a selection operation, adjusting, according to the selection operation, a hierarchy of an element selected by the selection operation in the whiteboard application interface includes: and displaying the element layer selected by the selection operation on the top layer of the whiteboard application interface.
When the touch operation is generated in the element layer, the touch operation is a selection operation. Next, the hierarchical adjustment of the element layers is described according to different selection operations, where the element layer selected by the selection operation is used as a target layer.
In the first case, the selection operation is a tap operation, and the selected target layer is located on the writing layer. In this case, the target layer is directly adjusted to the top layer of the whiteboard application interface.
This is explained below with reference to several figures. Fig. 5a is a schematic diagram of multiple element layers located above the writing layer according to an embodiment of the present application, and fig. 5b is a display effect diagram of fig. 5a, where the uppermost element is picture 3; a selection operation falling on any element layer other than picture 3 triggers a hierarchy change, as described in detail below.
Fig. 5c is a schematic view of a hierarchy of the clicked picture 1 in the example shown in fig. 5a, fig. 5d is a display effect diagram of fig. 5c, and in combination with fig. 5c and fig. 5d, after the user clicks the picture 1, a selection operation falls within the picture 1, the picture 1 is switched to the top layer, and other layers descend in sequence.
With this scheme, after other elements are inserted on the writing plane and a previously inserted element layer is covered by a new element layer, the earlier layer can be brought back to the top through this operation, so the user can continue to display a previously inserted element layer after inserting a new one. In use, when the element layer to be displayed is not on the top layer, the user only needs to touch or slide on that layer to switch it to the current top layer, which is very convenient.
In the second case, the selection operation is a long-press operation, and the selected target layer is positioned below the writing layer. In this case, the target layer is still adjusted to the top layer of the whiteboard application interface.
Fig. 6a is an operation diagram illustrating an operation of changing a layer hierarchy according to an embodiment of the present application, and fig. 6b is a display effect diagram corresponding to fig. 6 a. With reference to fig. 6a and 6b, the writing plane is located above the picture 1, and in order to adjust the picture 1 to the top layer, the user can press for a long time (for example, at the position 60 shown in fig. 6 a) in an arbitrary area where the picture 1 is located, so as to determine that the layer of the picture 1 is the target layer, and further, the layer where the picture 1 is located can be adjusted to the top layer, so as to obtain the display effect in fig. 4b, where the changed layers include the whiteboard background of the 0 th layer, the writing plane of the 1 st layer, and the picture 1 of the 2 nd layer.
Fig. 7a is a schematic diagram of another modified layer hierarchy according to an embodiment of the present application, and fig. 7b is the display effect diagram of fig. 7a; as shown in fig. 7a and 7b, in this example all three element layers are below the writing layer. Fig. 7c is a schematic diagram of a long press operation performed on the basis of fig. 7b, and fig. 7e is the display effect diagram of fig. 7d. The user performs a long press operation at position 70 shown in fig. 7c; after detecting the operation, the device determines that the target layers are layer 1 where picture 1 is located and layer 2 where picture 2 is located, and switches layer 1 and layer 2 as a whole to the top, obtaining the hierarchical relationship shown in fig. 7d (the display effect diagram is shown in fig. 7e). The hierarchical relationship between picture 1 and picture 2 is unchanged, i.e. picture 1 is still one layer below picture 2, but both picture 1 and picture 2 are now above picture 3 and the writing plane.
As an optional embodiment, after displaying the element layer selected by the selection operation on the top layer of the whiteboard application interface, the method further includes:
in step S2061, the drag operation is received, and the element selected by the drag operation is moved according to the drag trajectory of the drag operation.
Specifically, the dragging operation may be a sliding operation performed on the element layer on the top layer, and the element on the element layer selected by the dragging operation is the element selected by the dragging operation.
Fig. 8a is a schematic diagram of a dragging element according to an embodiment of the present application, as shown in fig. 8a, a dragging operation selects a picture 3, and the dragging operation slides to the left, so as to move the picture 3 to the left.
In step S2062, a zoom operation is received, and the element selected by the zoom operation is zoomed according to the zoom operation.
Specifically, the scaling operation is used to enlarge or reduce the element, and four corners of the top element layer may be selected to scale the element.
Fig. 8b is a schematic diagram of a zoom element, as shown in fig. 8b, a picture 3 is selected by a selection operation, and the zoom operation enlarges the selected picture 3 at the upper right corner.
Step S2063 receives the deletion operation, and removes the element selected by the deletion operation from the whiteboard application interface according to the deletion operation.
Specifically, the deletion operation is used to delete the element from the whiteboard application interface. The menu corresponding to the element may be triggered to delete the element, or the element may be dragged to a preset position of the whiteboard application interface to delete the element.
Fig. 8c is a schematic diagram of deleting an element. As shown in fig. 8c, the whiteboard application interface includes a deletion area; after picture 3 is selected, it can be deleted by dragging it to the deletion area in the whiteboard application interface.
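For illustration, the three follow-up operations can be sketched as follows. The Element geometry, the Region bounds and the drag-into-delete-area check are assumptions for illustration, not the patent's implementation.

```kotlin
// Moving, scaling and deleting the element on the top element layer.
data class Region(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

data class Element(val id: Int, var x: Float, var y: Float, var width: Float, var height: Float)

class TopElementController(
    private val elements: MutableList<Element>,
    private val deleteArea: Region
) {
    // Move the selected element along the drag trajectory.
    fun onDrag(element: Element, dx: Float, dy: Float) {
        element.x += dx
        element.y += dy
        // Dragging the element into the delete area clears it from the interface.
        if (deleteArea.contains(element.x, element.y)) elements.remove(element)
    }

    // Enlarge or shrink the selected element, e.g. from a corner handle.
    fun onScale(element: Element, factor: Float) {
        element.width *= factor
        element.height *= factor
    }

    // Explicit delete operation: clear the element from the whiteboard interface.
    fun onDelete(element: Element) {
        elements.remove(element)
    }
}
```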
As can be seen from the above, in the above embodiments of the present application, a touch operation on a whiteboard application interface is received, and a type of the touch operation is obtained, where the type at least includes: a writing operation for writing on the whiteboard application interface and a selecting operation for selecting an element on the whiteboard application interface; when the touch operation is writing operation, generating writing handwriting according to the touch operation; and when the touch operation is the selection operation, adjusting the hierarchy of the element selected by the selection operation in the whiteboard application interface according to the selection operation. According to the scheme, the type of the touch operation is acquired after the touch operation is received, so that the corresponding action can be executed according to the type of the touch operation, the technical problem that the control operation of the element conflicts with the operation of handwriting generated by the touch of a user when the whiteboard inserts other types of elements in the prior art is solved, and the element on any layer can be controlled by the user through adjustment of the layer, so that the effect of convenience in operation is achieved.
Example 2
According to an embodiment of the present application, there is further provided an embodiment of a method for controlling an intelligent interactive tablet, and fig. 9 is a flowchart of the method for controlling the intelligent interactive tablet according to the embodiment of the present application, as shown in fig. 9, the method includes the following steps:
step S902, a touch event on the whiteboard application interface is detected.
Specifically, the whiteboard application interface is the operation interface provided by the whiteboard application, and can be operated in full screen or reduced to a window of a certain size. The touch operation on the whiteboard application interface may be an operation performed by a user touching the touch screen with a finger, or a touch operation generated on the whiteboard application interface by an electromagnetic pen or other writing equipment.
Step S904, analyzing and obtaining the touch operation generating the touch event, and determining a type of the touch operation, where the type at least includes: a writing operation for writing on the whiteboard application interface and a selection operation for selecting an element on the whiteboard application interface.
After receiving the touch operation, the type of the operation is determined first, and how to respond to the touch operation is determined. In the above scheme, it is necessary to determine whether the received touch operation is a writing operation or a selection operation, so as to determine whether to generate writing on the whiteboard or to select other elements on the whiteboard application interface according to the selection operation.
As an optional embodiment, a plurality of elements are opened on a whiteboard application interface, the whiteboard application interface includes a writing layer for bearing writing content and a plurality of element layers for bearing elements, and determining the type of the touch operation includes:
and step S9041, judging a main body acted by the touch operation.
In step S9043, when the body on which the touch operation acts is an element layer, it is determined that the touch operation is a selection operation.
And step S9045, when the main body acted by the touch operation is the writing layer, determining the type of the touch operation according to the stay time of the touch operation at the initial position.
Specifically, the body on which the touch operation acts refers to the layer in which the first point of the touch operation, i.e. the down point, falls. When the down point falls within an element layer, the body acted on by the touch operation is determined to be that element layer; when the down point falls within the writing layer, the body acted on by the touch operation is determined to be the writing layer.
And when the touch operation is generated on the element layer, the touch operation is a selection operation, the element layer is selected according to the selection operation, and the element layer is adjusted to the top layer of the whiteboard application interface. When the touch operation is applied to the written image layer, further judgment is carried out according to the stay time of the touch operation at the initial position.
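A rough sketch of resolving the body the down point acts on is given below, assuming each layer exposes axis-aligned bounds and that the writing layer's bounds cover the whole interface; the search walks layers from the highest level downwards. The model names are illustrative.

```kotlin
// Hit test for the down point: the first layer (highest level first) whose
// bounds contain the point is taken as the body acted on by the touch operation.
data class LayerBounds(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

data class BoardLayer(val id: Int, val isWritingLayer: Boolean, val level: Int, val bounds: LayerBounds)

fun resolveBody(layers: List<BoardLayer>, downX: Float, downY: Float): BoardLayer? =
    layers.sortedByDescending { it.level }
        .firstOrNull { it.bounds.contains(downX, downY) }
```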
As an optional embodiment, when the main body receiving the touch operation is the writing layer, determining the type of the touch operation according to the time length of the touch operation staying at the initial position includes:
in step S90451, if the length of time that the touch operation stays at the initial position is greater than or equal to a preset length of time, it is determined that the touch operation is a selection operation.
In step S90453, if the length of time that the touch operation stays at the initial position is less than the preset length of time, it is determined that the touch operation is a writing operation.
Specifically, a touch operation whose dwell time at the initial position is greater than or equal to the preset duration may be a long-press operation, and a touch operation whose dwell time at the initial position is less than the preset duration may be a click, tap, slide and the like.
When the touch operation is generated on the writing layer and is a long-press operation, it is a selection operation used to select another element below the writing layer. When the touch operation is not a long press: if the writing layer is on the top layer of the whiteboard application interface, writing is performed directly according to the touch operation; if the writing layer is not on the top layer, the whiteboard application adjusts the writing layer to the top layer of the whiteboard application interface and writing proceeds directly, without the user having to first select the writing layer and then write.
And step S906, when the touch operation is writing operation, generating writing handwriting according to the touch operation.
As an alternative embodiment, when the touch operation is a writing operation, generating handwriting according to the touch operation includes:
and step S9061, when the writing layer is located on the top layer of the whiteboard application interface, generating writing according to the writing operation.
And step S9063, when the element layer is located on the top layer of the whiteboard application interface, after the writing layer is adjusted to the top layer of the whiteboard application interface, generating writing handwriting according to the writing operation.
Specifically, the touch operation is generated in the writing layer, and is used for indicating that a first point (namely a down point) falling from the touch operation is in the writing layer, and when the touch operation falls, if the writing layer is positioned on the top layer of the whiteboard application interface, writing handwriting is directly generated according to the writing operation; and if the writing layer is not positioned at the top layer of the whiteboard application interface, namely the element image layer is positioned at the top layer of the whiteboard application interface, displaying the writing layer at the top layer, and generating writing handwriting according to the writing operation.
Step S908, when the touch operation is a selection operation, adjusting a level of an element selected by the selection operation in the whiteboard application interface according to the selection operation.
It should be noted that the above hierarchical relationship is embodied as the vertical height of a view, which differs between systems. For example, in an Android system, the vertical height of a view (i.e., its Z-axis height) can be set to determine the display relationship between views: a view with a greater vertical height blocks a view with a smaller vertical height and preferentially receives touch event responses. Altering the hierarchy means modifying the vertical height of the view.
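For concreteness, on Android the "vertical height" discussed above corresponds to a view's Z value. View.setZ() and View.bringToFront() are standard framework calls; the container and layer-view names below are assumptions, and this is only one possible mapping of the patent's hierarchy onto views.

```kotlin
import android.view.View
import android.view.ViewGroup

// Promote one layer view above its siblings by raising its Z height; a view
// with a larger Z is drawn on top and receives touch events first.
fun promoteLayerView(container: ViewGroup, layerView: View) {
    val maxZ = (0 until container.childCount)
        .map { container.getChildAt(it).z }
        .maxOrNull() ?: 0f
    layerView.z = maxZ + 1f

    // Alternatively, reorder the child list so the view is drawn last (topmost):
    // layerView.bringToFront()
}
```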
As an alternative embodiment, when the touch operation is a selection operation, adjusting, according to the selection operation, a hierarchy of an element selected by the selection operation in the whiteboard application interface includes:
and step S9081, when the selection operation falls on any element layer on the upper layer of the writing layer, determining the element layer in which the selection operation falls as a target layer.
And step S9083, the target graph layer is placed on the top layer of the whiteboard application interface.
And step S9085, reducing the element layer with the layer level higher than the target layer by one layer.
When the touch operation is generated in the element layer, the touch operation is a selection operation. In the above scheme, the selection operation falls in any element layer on the upper layer of the writing layer, the selection operation may be a tap operation, and the selected target layer is on the writing layer. In this case, the target layer is directly adjusted to the top layer of the whiteboard application interface, and the element layer higher than the target layer is reduced by one layer.
This is explained below with reference to several figures. Fig. 5a is a schematic diagram of multiple element layers located above the writing layer according to an embodiment of the present application, and fig. 5b is a display effect diagram of fig. 5a, where the uppermost element is picture 3; a selection operation falling on any element layer other than picture 3 triggers a hierarchy change, as described in detail below.
Fig. 5c is a schematic view of a hierarchy of the clicked picture 1 in the example shown in fig. 5a according to an embodiment of the present application, and fig. 5d is a display effect diagram of fig. 5c, and with reference to fig. 5c and fig. 5d, after the user clicks the picture 1, a selection operation falls within the picture 1, the picture 1 is switched to a top layer, and layers of other elements fall down in sequence.
With this scheme, after other elements are inserted on the writing plane and a previously inserted element layer is covered by a new element layer, the earlier layer can be brought back to the top through this operation, so the user can continue to display a previously inserted element layer after inserting a new one. In use, when the element layer to be displayed is not on the top layer, the user only needs to touch or slide on that layer to switch it to the current top layer, which is very convenient.
For example, if the highest hierarchy is t, the hierarchy of the layer hit by the first type of operation is i (0< i < t), and there are m elements with hierarchies greater than i, the hierarchy of the layer hit by the first type of operation is set to t, 1 is subtracted from each of the hierarchies of the m element layers with hierarchies greater than i, and no processing is performed on the element layers with hierarchies less than i.
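That rule can be transcribed directly into code; LayerInfo is an assumed minimal model with a mutable level field.

```kotlin
data class LayerInfo(val id: Int, var level: Int)

// The hit layer (level i) is given the top level t, the m layers above it each
// drop by one level, and layers below level i are left untouched.
fun bringTappedLayerToTop(layers: List<LayerInfo>, hit: LayerInfo) {
    val t = layers.maxOf { it.level }   // highest hierarchy
    val i = hit.level                   // hierarchy of the layer hit by the selection operation
    layers.filter { it.level > i }      // the m element layers above the hit layer
        .forEach { it.level -= 1 }      // each is lowered by one
    hit.level = t                       // the hit layer becomes the top layer
}
```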
As an alternative embodiment, when the touch operation is a selection operation, adjusting, according to the selection operation, a hierarchy of an element selected by the selection operation in the whiteboard application interface includes:
step S9087, when the element layer below the writing layer is selected by the selecting operation, acquiring a first element layer hit by the selecting operation and a second element layer higher than the first element layer.
And step S9089, arranging the levels of the first element layers in an ascending order to obtain a first array, and arranging the levels of the second element layers in an ascending order to obtain a second array.
And step S9091, increasing the equal steps of the levels represented by the first array until the maximum value in the first array is equal to the number of layers of the layer opened by the whiteboard application interface, and decreasing the equal steps of the levels in the second array until the minimum value in the second array is equal to the minimum level in the first element layer.
In the scheme, the selection operation is a long-time pressing operation, and the selected target layer is located below the writing layer. In this case, the target layer is still adjusted to the top layer of the whiteboard application interface.
Fig. 6a is an operation diagram illustrating an operation of changing the layer hierarchy according to an embodiment of the present application, and fig. 6b is the corresponding display effect diagram. With reference to fig. 6a and 6b, the writing plane is located above picture 1. To adjust picture 1 to the top layer, the user can long-press anywhere in the area where picture 1 is located (for example, at the position 50 shown in fig. 6a), so that the layer of picture 1 is determined as the target layer and can then be adjusted to the top layer, giving the display effect in fig. 4b; the adjusted layers are the whiteboard background on layer 0, the writing plane on layer 1, and picture 1 on layer 2.
Fig. 7a is a schematic diagram of a plurality of element layers all located below the writing layer according to an embodiment of the present application, and fig. 7b is the display effect diagram of fig. 7a. Referring to fig. 7a and fig. 7b, in this example three element layers are all below the writing layer.
Fig. 7c is a schematic diagram of performing a long-press operation on the basis of fig. 7b. The user long-presses at the position 70 shown in fig. 7c; after detecting the operation, the device determines that the target layers are the 1st layer, where picture 1 is located, and the 2nd layer, where picture 2 is located, and switches the 1st and 2nd layers as a whole to the top, obtaining the hierarchical relationship shown in fig. 7d (the corresponding display effect diagram is fig. 7e). The hierarchical relationship between picture 1 and picture 2 is unchanged, that is, picture 1 remains one layer below picture 2, but both picture 1 and picture 2 are now above picture 3 and the writing plane.
To illustrate the manner of layer switching, in an alternative embodiment, assume the writing plane has level t and the element layers that need to be moved above the writing layer have a minimum level of i. The element layers from level i to level t are divided into two groups in ascending order of level: one group contains the element layers hit by the long-press coordinate point, denoted [j1, j2, ..., jn]; the other group contains the remaining layers from i to t (i.e., the layers above the target layers, including the writing layer), denoted [k1, k2, ..., km]. The two groups are then merged as [k1, k2, ..., km, j1, j2, ..., jn], and the levels of the layers in the merged group are set to i, i+1, i+2, ..., t from left to right, thereby moving the long-pressed element layers above the writing layer.
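The same adjustment can be written as the regrouping just described, operating on whole layer objects. Again this is only an illustrative Kotlin sketch with hypothetical names, not the implementation described in the original disclosure.

    data class Layer(val name: String, var level: Int)   // same hypothetical model as above

    // Split the layers from level i up to t into the long-pressed group [j1..jn] and the
    // remaining group [k1..km], concatenate them as [k1..km, j1..jn], and renumber the
    // result i, i+1, ..., t so the long-pressed layers end up above the writing layer.
    fun regroupAbove(layers: List<Layer>, pressedNames: Set<String>) {
        val i = layers.filter { it.name in pressedNames }.minOf { it.level }
        val affected = layers.filter { it.level >= i }.sortedBy { it.level }
        val others = affected.filter { it.name !in pressedNames }   // [k1, ..., km]
        val hit = affected.filter { it.name in pressedNames }       // [j1, ..., jn]
        (others + hit).forEachIndexed { index, layer -> layer.level = i + index }
    }

For the example of fig. 7b (picture 1 = level 1, picture 2 = level 2, picture 3 = level 3, writing plane = level 4), calling regroupAbove with the names of pictures 1 and 2 yields picture 3 = 1, writing plane = 2, picture 1 = 3 and picture 2 = 4, which matches fig. 7d.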
The layer level is implemented by changing the height of the elements in the vertical (z) direction. Through this level transformation, the clicked element becomes the uppermost layer while the relative levels among the other elements are not affected.
As an optional embodiment, a plurality of elements are opened on the whiteboard application interface, the whiteboard application interface includes a writing layer for bearing writing content and a plurality of element layers for bearing the elements, and each element layer is provided with a buoy displayed on the top layer of the whiteboard application interface. The step of determining the type of the touch operation further includes: if the touch operation falls within the buoy of any layer, determining that the touch operation is a selection operation. Adjusting, according to the selection operation, the hierarchy of the element selected by the selection operation in the whiteboard application interface then includes: adjusting the layer corresponding to the buoy indicated by the touch operation to the top layer of the whiteboard.
This embodiment also provides a scheme for adjusting the layer level through buoys. Each layer is bound to a corresponding buoy, which can be displayed at a designated position of the layer. The buoys may be hidden in normal use and displayed when a preset instruction is triggered, or they may be displayed at all times.
In this scheme, no matter which level a layer belongs to, its buoy is displayed on the top layer, so that it can receive the user's selection operation.
Fig. 10 is a schematic diagram of adjusting layer levels by using buoys according to an embodiment of the present application. Referring to fig. 10, after entering the level-adjustment mode from the toolbar, a buoy is displayed at the upper left corner of each layer except the writing plane, and the buoys are located above the top layer.
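By way of illustration, a small Kotlin sketch of the buoy hit test is given below; the Rect, Buoy and Layer types are assumptions made for the example and do not come from the original text.

    data class Layer(val name: String, var level: Int)
    data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
        fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
    }
    // Each element layer is bound to a buoy drawn on the top layer of the interface.
    data class Buoy(val bounds: Rect, val boundLayer: Layer)

    // Returns the layer whose buoy was touched, or null if the touch did not fall on a buoy.
    fun buoyHit(x: Float, y: Float, buoys: List<Buoy>): Layer? =
        buoys.firstOrNull { it.bounds.contains(x, y) }?.boundLayer

If buoyHit returns a layer, the touch is treated as a selection operation and the returned layer is adjusted to the top layer of the whiteboard, for example with the bringToTop sketch shown earlier.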
As an optional embodiment, a plurality of elements are opened on the whiteboard application interface, and the whiteboard application interface includes a writing layer for bearing writing content and a plurality of element layers for bearing the elements. The step of determining the type of the touch operation further includes: if the touch operation is used to call out a layer relation table of the whiteboard application interface, determining that the touch operation is a selection operation, where the layer relation table includes the elements opened by the whiteboard application interface and the levels of the layers where the elements are located. Adjusting, according to the selection operation, the hierarchy of the element selected by the selection operation in the whiteboard application interface includes: detecting a dragging operation on any element in the layer relation table, where the dragging operation drags the identifier of that element from a first position in the layer relation table to a second position in the layer relation table; and exchanging the hierarchies of the elements represented by the element identifier at the first position and the element identifier at the second position.
Specifically, the layer relation table records the layer relationships in the application in a list manner; the table is arranged in hierarchical order, and each row or each column corresponds to one hierarchy.
Fig. 11a is a schematic diagram of a layer relation table according to an embodiment of the present application. As shown in fig. 11a, the layer relation table includes, from bottom to top, the writing plane (layer 1), picture 1 (layer 2), picture 2 (layer 3), and picture 3 (layer 4); since the whiteboard background of layer 0 is a fixed layer, it may be omitted from the layer relation table.
Fig. 11b is a schematic diagram of adjusting a layer through the layer relation table according to an embodiment of the present application. As shown in fig. 11b, the dragging operation in this example is indicated by the arrow pointing from picture 1 to picture 3; that is, the user drags picture 1 onto picture 3, so the first position in this example is the position of picture 1 and the second position is the position of picture 3. The hierarchy of the layer where picture 1 is located and the hierarchy of the layer where picture 3 is located are exchanged, that is, after the operation picture 1 is on the 4th layer and picture 3 is on the 2nd layer.
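A brief Kotlin sketch of the swap triggered by dragging one identifier onto another in the layer relation table; the names are hypothetical and only illustrate the level exchange described above.

    data class Layer(val name: String, var level: Int)

    // Dragging the identifier at the first position onto the identifier at the second
    // position exchanges the levels of the two layers.
    fun swapLevels(table: List<Layer>, draggedName: String, droppedOnName: String) {
        val dragged = table.first { it.name == draggedName }
        val target = table.first { it.name == droppedOnName }
        val tmp = dragged.level
        dragged.level = target.level
        target.level = tmp
    }

In the example of fig. 11b, swapLevels(table, "picture 1", "picture 3") leaves picture 1 on the 4th layer and picture 3 on the 2nd layer.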
As an alternative embodiment, after placing the target graph layer on the top layer of the whiteboard application interface, the method further comprises:
and step S9093, receiving the dragging event acting on the target layer, and moving the elements in the target layer according to the dragging track of the dragging event.
Specifically, the dragging operation may be a sliding operation after the touch falls on the target layer. As shown in fig. 8a, the layer where picture 3 is located is the target layer, and the dragging operation slides to the left, driving picture 3 to move to the left.
And step S9095, receiving a zooming instruction acting on the target layer, and zooming the elements in the target layer according to the zooming instruction.
Specifically, the scaling operation is used to enlarge or reduce the elements in the target layer, and any of the four corners of an element in the target layer may be selected to scale it. As shown in fig. 8b, the layer where picture 3 is located is the target layer, and the scaling operation selects the upper right corner of picture 3 to enlarge it.
And step S9097, receiving a deleting instruction acting on the target layer, and clearing the elements in the target layer from the whiteboard application interface according to the deleting instruction.
Specifically, the deletion operation is used to delete an element from the whiteboard application interface. The element may be deleted by triggering the menu corresponding to the element, or by dragging the element to a preset position of the whiteboard application interface. As shown in fig. 8c, the whiteboard application interface includes a deletion area, the layer where picture 3 is located is the target layer, and picture 3 can be deleted by dragging the target layer into the deletion area of the whiteboard application interface.
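For illustration, the three operations on the element carried by the target layer can be sketched in Kotlin as follows; the Element and Rect types and the way the deletion area is tested are assumptions made for the example.

    data class Element(var x: Float, var y: Float, var width: Float, var height: Float)
    data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
        fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
    }

    // Drag: move the element along the drag track.
    fun move(e: Element, dx: Float, dy: Float) { e.x += dx; e.y += dy }

    // Zoom: enlarge or reduce the element, e.g. when a corner handle is dragged.
    fun scale(e: Element, factor: Float) { e.width *= factor; e.height *= factor }

    // Delete: clear the element when it is dropped inside the deletion area.
    fun deleteIfDropped(e: Element, deleteArea: Rect, elements: MutableList<Element>) {
        if (deleteArea.contains(e.x, e.y)) elements.remove(e)
    }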
As can be seen from the above, the above embodiments of the present application detect a touch event on the whiteboard application interface; parse out the touch operation that generated the touch event and determine the type of the touch operation, where the type at least includes a writing operation for writing on the whiteboard application interface and a selection operation for selecting an element on the whiteboard application interface; generate writing handwriting according to the touch operation when the touch operation is the writing operation; and adjust, according to the selection operation, the hierarchy of the element selected by the selection operation in the whiteboard application interface when the touch operation is the selection operation. Because the type of the touch operation is determined after the touch operation is received, the corresponding action can be executed according to that type. This solves the technical problem in the prior art that, when other types of elements are inserted into the whiteboard, the control operations on those elements conflict with the generation of handwriting from the user's touch; the user can control the elements on any layer by adjusting the layers, achieving a convenient and quick operation.
Example 3
According to an embodiment of the present application, there is provided an embodiment of a control apparatus of an intelligent interactive tablet. Fig. 12 is a schematic diagram of a control apparatus of an intelligent interactive tablet according to an embodiment of the present application; as shown in fig. 12, the apparatus includes:
a receiving module 120, configured to receive a touch operation on the whiteboard application interface and obtain a type of the touch operation, where the type at least includes: a writing operation for writing on the whiteboard application interface and a selecting operation for selecting an element on the whiteboard application interface.
And the generating module 122 is configured to generate writing handwriting according to the touch operation when the touch operation is the writing operation.
And the adjusting module 124 is used for adjusting the hierarchy of the element selected by the selection operation in the whiteboard application interface according to the selection operation when the touch operation is the selection operation.
As an optional embodiment, a plurality of elements are opened on the whiteboard application interface, and the whiteboard application interface includes a writing layer for bearing writing content and a plurality of element layers for bearing the elements. The receiving module includes: a first obtaining sub-module, configured to, when the touch operation is generated on the writing layer, obtain a touch operation whose dwell time at the initial position is greater than or equal to a preset duration as a selection operation, and obtain a touch operation whose dwell time at the initial position is less than the preset duration as a writing operation; and a second obtaining sub-module, configured to obtain the touch operation as a selection operation when the touch operation is generated on an element layer.
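A minimal Kotlin sketch of this classification rule; the 500 ms default is only an assumed value for the preset duration, which the text does not specify.

    enum class TouchType { WRITING, SELECTION }

    // On the writing layer, a dwell time at the initial position at or above the preset
    // duration means selection and a shorter dwell means writing; on an element layer
    // every touch operation is obtained as a selection operation.
    fun classify(onWritingLayer: Boolean, dwellMillis: Long, presetMillis: Long = 500L): TouchType =
        when {
            !onWritingLayer -> TouchType.SELECTION
            dwellMillis >= presetMillis -> TouchType.SELECTION
            else -> TouchType.WRITING
        }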
As an alternative embodiment, the generating module includes: a first generating sub-module, configured to generate writing handwriting according to the writing operation when the writing layer is located on the top layer of the whiteboard application interface; and a second generating sub-module, configured to display the writing layer on the top layer and generate writing handwriting according to the writing operation when an element layer is located on the top layer of the whiteboard application interface.
As an alternative embodiment, when the touch operation is the selection operation, the adjusting module includes: a display sub-module, configured to display the element layer selected by the selection operation on the top layer of the whiteboard application interface.
As an alternative embodiment, the apparatus further includes: a dragging module, configured to receive a dragging operation after the element layer selected by the selection operation is displayed on the top layer of the whiteboard application interface, and move the element selected by the dragging operation according to the dragging track of the dragging operation; a zooming module, configured to receive a zooming operation and zoom the element selected by the zooming operation according to the zooming operation; and a deleting module, configured to receive a deleting operation and clear the element selected by the deleting operation from the whiteboard application interface according to the deleting operation.
Example 4
According to an embodiment of the present application, there is provided another embodiment of a control apparatus of an intelligent interactive tablet. Fig. 13 is a schematic diagram of another control apparatus of an intelligent interactive tablet according to an embodiment of the present application; as shown in fig. 13, the apparatus includes:
the detecting module 130 is configured to detect a touch event on the whiteboard application interface.
The parsing module 132 is configured to parse out the touch operation that generated the touch event and determine the type of the touch operation, where the type at least includes: a writing operation for writing on the whiteboard application interface and a selection operation for selecting an element on the whiteboard application interface.
And the generating module 134 is configured to generate writing handwriting according to the touch operation when the touch operation is the writing operation.
And the adjusting module 136 is configured to, when the touch operation is a selection operation, adjust, according to the selection operation, a hierarchy of the element selected by the selection operation in the whiteboard application interface.
As an optional embodiment, a plurality of elements are opened on the whiteboard application interface, and the whiteboard application interface includes a writing layer for bearing writing content and a plurality of element layers for bearing the elements. The parsing module includes: a judging sub-module, configured to judge the main body acted on by the touch operation; a first determining sub-module, configured to determine that the touch operation is the selection operation when the main body acted on by the touch operation is an element layer; and a second determining sub-module, configured to determine the type of the touch operation according to the dwell time of the touch operation at the initial position when the main body acted on by the touch operation is the writing layer.
As an alternative embodiment, the second determining sub-module includes: a first determining unit, configured to determine that the touch operation is the selection operation if the dwell time of the touch operation at the initial position is greater than or equal to the preset duration; and a second determining unit, configured to determine that the touch operation is the writing operation if the dwell time of the touch operation at the initial position is less than the preset duration.
As an alternative embodiment, the generating module includes: a first generating sub-module, configured to generate writing handwriting according to the writing operation when the writing layer is located on the top layer of the whiteboard application interface; and a second generating sub-module, configured to adjust the writing layer to the top layer of the whiteboard application interface and then generate writing handwriting according to the writing operation when an element layer is located on the top layer of the whiteboard application interface.
As an alternative embodiment, when the touch operation is the selection operation, the adjusting module includes: a third determining sub-module, configured to determine the element layer in which the selection operation falls as the target layer when the selection operation falls on any element layer above the writing layer; a setting sub-module, configured to place the target layer on the top layer of the whiteboard application interface; and a lowering sub-module, configured to lower the element layers whose levels are higher than the target layer by one level.
As an alternative embodiment, the adjusting module includes: an obtaining sub-module, configured to obtain a first element layer hit by the selection operation and a second element layer higher than the first element layer when an element layer below the writing layer is selected by the selection operation; an arranging sub-module, configured to arrange the levels of the first element layers in ascending order to obtain a first array and arrange the levels of the second element layers in ascending order to obtain a second array; and an adjusting sub-module, configured to raise the levels represented by the first array by an equal step until the maximum value in the first array equals the number of layers opened by the whiteboard application interface, and lower the levels in the second array by an equal step until the minimum value in the second array equals the minimum level of the first element layers.
As an optional embodiment, a plurality of elements are opened on the whiteboard application interface, the whiteboard application interface includes a writing layer for bearing writing content and a plurality of element layers for bearing the elements, and each element layer is provided with a buoy displayed on the top layer of the whiteboard application interface. The parsing module further includes: a fourth determining sub-module, configured to determine that the touch operation is the selection operation if the touch operation falls within the buoy of any layer. The adjusting module includes: an adjusting sub-module, configured to adjust the layer corresponding to the buoy indicated by the touch operation to the top layer of the whiteboard.
As an optional embodiment, a plurality of elements are opened on the whiteboard application interface, and the whiteboard application interface includes a writing layer for bearing writing content and a plurality of element layers for bearing the elements. The parsing module is further configured to determine that the touch operation is the selection operation if the touch operation is used to call out a layer relation table of the whiteboard application interface, where the layer relation table includes the elements opened by the whiteboard application interface and the levels of the layers where the elements are located. The adjusting module includes: a detection sub-module, configured to detect a dragging operation on any element in the layer relation table, where the dragging operation drags the identifier of that element from a first position in the layer relation table to a second position in the layer relation table; and a replacing sub-module, configured to exchange the hierarchies of the elements represented by the element identifier at the first position and the element identifier at the second position.
As an alternative embodiment, the apparatus further includes: a dragging module, configured to receive a dragging event acting on the target layer after the target layer is placed on the top layer of the whiteboard application interface, and move the elements in the target layer according to the dragging track of the dragging event; a scaling module, configured to receive a scaling instruction acting on the target layer and scale the elements in the target layer according to the scaling instruction; and a deleting module, configured to receive a deleting instruction acting on the target layer and clear the elements in the target layer from the whiteboard application interface according to the deleting instruction.
Example 5
According to an embodiment of the present application, there is provided a computer storage medium storing a plurality of instructions, the instructions being adapted to be loaded by a processor and to execute the method for controlling an intelligent interactive tablet according to any one of embodiment 1 or embodiment 2.
Example 6
According to an embodiment of the present application, there is provided an intelligent interactive tablet, including: a processor and a memory; wherein the memory stores a computer program adapted to be loaded by the processor and to perform the method of controlling an intelligent interactive tablet as described in any of embodiment 1 or embodiment 2.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present application, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, and the software product includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is only a preferred embodiment of the present application and it should be noted that those skilled in the art can make several improvements and modifications without departing from the principle of the present application, and these improvements and modifications should also be considered as the protection scope of the present application.

Claims (17)

1. A control method of an intelligent interactive tablet is characterized by comprising the following steps:
receiving a touch operation on a whiteboard application interface, and acquiring the type of the touch operation, wherein the type at least comprises: a writing operation for writing on a whiteboard application interface and a selecting operation for selecting an element on the whiteboard application interface;
when the touch operation is the writing operation, generating writing handwriting according to the touch operation;
and when the touch operation is the selection operation, adjusting the hierarchy of the element selected by the selection operation in the whiteboard application interface according to the selection operation.
2. The method according to claim 1, wherein a plurality of elements are opened on the whiteboard application interface, the whiteboard application interface comprises a writing layer for carrying writing contents and a plurality of element layers for carrying the elements, and the obtaining of the type of the touch operation comprises:
when the touch operation is generated on the writing layer, acquiring the touch operation with the time length of staying at the initial position being greater than or equal to a preset time length as the selection operation, and acquiring the touch operation with the time length of staying at the initial position being less than the preset time length as the writing operation;
and when the touch operation is generated on the element layer, acquiring the touch operation as the selection operation.
3. The method of claim 2, wherein generating a writing script according to the touch operation when the touch operation is the writing operation comprises:
when the writing layer is positioned on the top layer of the whiteboard application interface, generating writing handwriting according to the writing operation;
and when the element layer is positioned on the top layer of the whiteboard application interface, displaying the writing layer on the top layer, and generating writing handwriting according to the writing operation.
4. The method according to claim 2, wherein when the touch operation is the selection operation, adjusting the hierarchy of the element selected by the selection operation in the whiteboard application interface according to the selection operation comprises:
and displaying the element layer selected by the selection operation on the top layer of the whiteboard application interface.
5. The method according to claim 4, wherein after the element layer selected by the selection operation is displayed on the top layer of the whiteboard application interface, the method further comprises:
receiving a dragging operation, and moving an element selected by the dragging operation according to a dragging track of the dragging operation;
receiving a zooming operation, and zooming the element selected by the zooming operation according to the zooming operation;
and receiving a deleting operation, and clearing elements selected by the deleting operation from the whiteboard application interface according to the deleting operation.
6. A control method of an intelligent interactive tablet is characterized by comprising the following steps:
detecting a touch event on a whiteboard application interface;
analyzing and obtaining a touch operation for generating the touch event, and judging the type of the touch operation, wherein the type at least comprises: a writing operation for writing on a whiteboard application interface and a selecting operation for selecting an element on the whiteboard application interface;
when the touch operation is the writing operation, generating writing handwriting according to the touch operation;
and when the touch operation is the selection operation, adjusting the hierarchy of the element selected by the selection operation in the whiteboard application interface according to the selection operation.
7. The method according to claim 6, wherein a plurality of elements are opened on the whiteboard application interface, the whiteboard application interface comprises a writing layer for carrying writing contents and a plurality of element layers for carrying the elements, and the determining the type of the touch operation comprises:
judging a main body acted on by the touch operation;
when the main body acted on by the touch operation is an element layer, determining the touch operation as the selection operation;
and when the main body acted on by the touch operation is a writing layer, determining the type of the touch operation according to the stay time of the touch operation at the initial position.
8. The method according to claim 7, wherein when the main body acted on by the touch operation is the writing layer, determining the type of the touch operation according to the time length of the touch operation staying at the initial position comprises:
if the time length of the touch operation staying at the initial position is greater than or equal to the preset time length, determining that the touch operation is a selection operation;
and if the time length of the touch operation staying at the initial position is less than the preset time length, determining that the touch operation is a writing operation.
9. The method according to claim 7, wherein when the touch operation is the writing operation, generating handwriting according to the touch operation comprises:
when the writing layer is positioned on the top layer of the whiteboard application interface, generating writing handwriting according to the writing operation;
and when the element layer is positioned at the top layer of the whiteboard application interface, adjusting the writing layer to the top layer of the whiteboard application interface, and then generating writing handwriting according to the writing operation.
10. The method according to claim 7, wherein when the touch operation is the selection operation, adjusting the level of the element selected by the selection operation in the whiteboard application interface according to the selection operation comprises:
when the selection operation falls on any element layer above the writing layer, determining the element layer in which the selection operation falls as a target layer;
placing the target graph layer on the top layer of the whiteboard application interface;
and lowering the element layers whose levels are higher than the target layer by one level.
11. The method according to claim 7, wherein when the touch operation is the selection operation, adjusting the level of the element selected by the selection operation in the whiteboard application interface according to the selection operation comprises:
when the element layer below the writing layer is selected by the selection operation, a first element layer hit by the selection operation and a second element layer higher than the first element layer are obtained;
arranging the levels of the first element layers in an ascending order to obtain a first array, and arranging the levels of the second element layers in an ascending order to obtain a second array;
and raising the levels represented by the first array by an equal step until the maximum value in the first array is equal to the number of layers opened by the whiteboard application interface, and lowering the levels in the second array by an equal step until the minimum value in the second array is equal to the minimum level of the first element layers.
12. The method according to claim 6, wherein a plurality of elements are opened on the whiteboard application interface, the whiteboard application interface comprises a writing layer for carrying writing contents and a plurality of element layers for carrying the elements, the element layers are provided with a buoy displayed on a top layer of the whiteboard application interface, and the step of determining the type of the touch operation further comprises:
if the touch operation falls into a buoy of any layer, determining that the touch operation is the selection operation;
according to the selection operation, adjusting the hierarchy of the element selected by the selection operation in the whiteboard application interface, including: and adjusting the layer corresponding to the buoy indicated by the touch operation to the top layer of the whiteboard.
13. The method according to claim 6, wherein a plurality of elements are opened on the whiteboard application interface, the whiteboard application interface comprises a writing layer for carrying writing contents and a plurality of element layers for carrying the elements, and the step of determining the type of the touch operation further comprises:
if the touch operation is used for calling out a layer relation table of the whiteboard application interface, determining that the touch operation is the selection operation, wherein the layer relation table comprises: the whiteboard application interface comprises elements opened by the whiteboard application interface and the levels of the layers of the elements;
according to the selection operation, adjusting the hierarchy of the element selected by the selection operation in the whiteboard application interface, including:
detecting a dragging operation on any one element in the layer relation table, wherein the dragging operation is used for dragging an identifier of any one element in the layer relation table from a first position in the layer relation table to a second position in the layer relation table;
replacing the hierarchy of elements represented by the element identification of the first location and the element identification of the second location.
14. The method of claim 10, wherein after placing the target graph layer on top of the whiteboard application interface, the method further comprises:
receiving a dragging event acting on the target layer, and moving elements in the target layer according to a dragging track of the dragging event;
receiving a zooming instruction acting on the target layer, and zooming the elements in the target layer according to the zooming instruction;
and receiving a deleting instruction acting on the target layer, and clearing elements in the target layer from the whiteboard application interface according to the deleting instruction.
15. A control apparatus of an intelligent interactive tablet, characterized by comprising:
the receiving module is used for receiving touch operation on a whiteboard application interface and acquiring the type of the touch operation, wherein the type at least comprises: a writing operation for writing on a whiteboard application interface and a selecting operation for selecting an element on the whiteboard application interface;
the generating module is used for generating writing handwriting according to the touch operation when the touch operation is the writing operation;
and the adjusting module is used for adjusting the hierarchy of the element selected by the selection operation in the whiteboard application interface according to the selection operation when the touch operation is the selection operation.
16. A computer storage medium storing a plurality of instructions adapted to be loaded by a processor and to perform the method of controlling an intelligent interactive tablet of any one of claims 1 to 14.
17. An intelligent interactive tablet, comprising: a processor and a memory; wherein the memory stores a computer program adapted to be loaded by the processor and to perform the method of controlling a smart interactive tablet according to any of claims 1 to 14.
CN201910955725.9A 2019-10-09 2019-10-09 Control method and device of intelligent interactive panel Active CN110968227B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910955725.9A CN110968227B (en) 2019-10-09 2019-10-09 Control method and device of intelligent interactive panel

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910955725.9A CN110968227B (en) 2019-10-09 2019-10-09 Control method and device of intelligent interactive panel

Publications (2)

Publication Number Publication Date
CN110968227A true CN110968227A (en) 2020-04-07
CN110968227B CN110968227B (en) 2022-03-08

Family

ID=70029703

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910955725.9A Active CN110968227B (en) 2019-10-09 2019-10-09 Control method and device of intelligent interactive panel

Country Status (1)

Country Link
CN (1) CN110968227B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112379822A (en) * 2020-11-25 2021-02-19 武汉市人机科技有限公司 Method for removing whiteboard cards
CN112462991A (en) * 2020-11-27 2021-03-09 广州视源电子科技股份有限公司 Control method of intelligent interactive tablet, storage medium and related equipment
CN113434106A (en) * 2021-08-30 2021-09-24 广州市保伦电子有限公司 Online electronic whiteboard content synchronous sharing system
CN113703652A (en) * 2021-08-31 2021-11-26 重庆杰夫与友文化创意有限公司 Image processing method, system and storage medium

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103226445A (en) * 2013-05-10 2013-07-31 广东国笔科技股份有限公司 Handwriting input method, system and terminal
CN105446647A (en) * 2015-12-25 2016-03-30 智慧方舟科技有限公司 E-ink book and reading note implementation method and device thereof
US20170024359A1 (en) * 2015-07-20 2017-01-26 Sas Institute Inc. Techniques to provide processing enhancements for a text editor in a computing environment
CN106933328A (en) * 2017-03-10 2017-07-07 广东欧珀移动通信有限公司 A kind of control method of mobile terminal frame per second, device and mobile terminal
CN107040749A (en) * 2016-02-04 2017-08-11 株式会社理光 Share the method for screen hand-written image and the terminal for carrying out video conference in video conference
CN107918549A (en) * 2017-11-27 2018-04-17 广州视睿电子科技有限公司 Labeling method, device, computer equipment and the storage medium of stereo unfolding drawing
CN108710460A (en) * 2018-05-15 2018-10-26 广州视源电子科技股份有限公司 element control method, device, equipment and storage medium
CN108958630A (en) * 2018-07-06 2018-12-07 广州视源电子科技股份有限公司 Written contents display methods, device and electronic equipment
CN109696971A (en) * 2017-10-20 2019-04-30 夏普株式会社 Display device, display methods and recording medium
CN109697004A (en) * 2018-12-28 2019-04-30 广州视源电子科技股份有限公司 Method, apparatus, equipment and storage medium for touch apparatus pen annotation
CN110058755A (en) * 2019-04-15 2019-07-26 广州视源电子科技股份有限公司 A kind of method, apparatus, terminal device and the storage medium of PowerPoint interaction


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112379822A (en) * 2020-11-25 2021-02-19 武汉市人机科技有限公司 Method for removing whiteboard cards
CN112462991A (en) * 2020-11-27 2021-03-09 广州视源电子科技股份有限公司 Control method of intelligent interactive tablet, storage medium and related equipment
CN113434106A (en) * 2021-08-30 2021-09-24 广州市保伦电子有限公司 Online electronic whiteboard content synchronous sharing system
CN113434106B (en) * 2021-08-30 2021-11-23 广州市保伦电子有限公司 Online electronic whiteboard content synchronous sharing system
CN113703652A (en) * 2021-08-31 2021-11-26 重庆杰夫与友文化创意有限公司 Image processing method, system and storage medium

Also Published As

Publication number Publication date
CN110968227B (en) 2022-03-08

Similar Documents

Publication Publication Date Title
CN110968227B (en) Control method and device of intelligent interactive panel
CN110716680B (en) Control method and device of intelligent interactive panel
US9996176B2 (en) Multi-touch uses, gestures, and implementation
EP3180687B1 (en) Hover-based interaction with rendered content
CN104205098B (en) It navigates using between the content item of array pattern in a browser
RU2609070C2 (en) Context menu launcher
EP2192477B1 (en) Portable terminal with touch screen and method for displaying tags in the portable terminal
US9223471B2 (en) Touch screen control
US10684751B2 (en) Display apparatus, display method, and program
US8468460B2 (en) System and method for displaying, navigating and selecting electronically stored content on a multifunction handheld device
US20150067582A1 (en) Content navigation structure and transition mechanism
US20140210797A1 (en) Dynamic stylus palette
CN108829327B (en) Writing method and device of interactive intelligent equipment
US20140189593A1 (en) Electronic device and input method
CN111625158B (en) Electronic interaction panel, menu display method and writing tool attribute control method
CN110928475B (en) Page interaction method, device, equipment and storage medium of intelligent interaction panel
CN111475097A (en) Handwriting selection method and device, computer equipment and storage medium
KR20080066416A (en) User interface methods in mobile terminal having touch screen
CN108762657B (en) Operation method and device of intelligent interaction panel and intelligent interaction panel
CN110413187B (en) Method and device for processing annotations of interactive intelligent equipment
KR100795590B1 (en) Method of navigating, electronic device, user interface and computer program product
US20240004532A1 (en) Interactions between an input device and an electronic device
CN104281383A (en) Information display apparatus
CN108255558B (en) Writing software calling method, device, equipment and computer readable storage medium
KR20090017828A (en) Method for controling interface, and apparatus for implementing the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant