CN110727383A - Touch interaction method and device based on small program, electronic equipment and storage medium

Touch interaction method and device based on small program, electronic equipment and storage medium

Info

Publication number
CN110727383A
Authority
CN
China
Prior art keywords
graph
touch
user
applet
drawn
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910882703.4A
Other languages
Chinese (zh)
Other versions
CN110727383B (en)
Inventor
邬一平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201910882703.4A priority Critical patent/CN110727383B/en
Publication of CN110727383A publication Critical patent/CN110727383A/en
Application granted granted Critical
Publication of CN110727383B publication Critical patent/CN110727383B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817: Interaction techniques using icons
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842: Selection of displayed objects or displayed text elements
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an applet-based touch interaction method and apparatus, an electronic device and a storage medium, relating to the technical field of applet applications. The specific implementation scheme is as follows: if it is detected that a user touches the display interface of an applet embedded in another application client, the touch position of the user on the display interface of the applet is acquired; based on the touch position, the pre-acquired position of the drawing component in the display interface of the applet, and the vertices of each graph drawn by the drawing component, it is detected whether the user's touch falls within at least one graph drawn by the drawing component; and if so, an interactive response is performed based on a target graph among the at least one graph, the gesture information of the user's touch, and a preset interaction configuration. With this technical solution, the application can remedy the deficiencies of the prior art, support touch interaction in applets, and enrich the usable functions of applets.

Description

Touch interaction method and device based on small program, electronic equipment and storage medium
Technical Field
The present application relates to computer technologies, in particular to applet application technology, and more particularly to an applet-based touch interaction method and apparatus, an electronic device, and a storage medium.
Background
An applet is a lightweight application that runs inside another application on a mobile terminal. It requires no download or installation and can be opened simply by scanning a code, which makes it very convenient to use. For example, many existing instant messaging applications have applets embedded in them, which greatly benefits users.
Existing applets provide a drawing component, which in the common case may be implemented by a Canvas. The Canvas provides a way of drawing graphics through JavaScript and HTML elements and can be used for animation, game graphics, data visualization, picture editing and real-time video processing; for example, applets use the Canvas for map-related rendering. However, the Canvas itself has only drawing and animation capabilities and no interaction capability at all. The Canvas used on the traditional applet side is likewise only used to show animations; there is no way for the user to interact with the applet, let alone achieve deeper levels of interaction. Therefore, it is desirable to provide an applet-based touch interaction scheme.
Disclosure of Invention
The application provides an applet-based touch interaction method and apparatus, an electronic device and a storage medium, which remedy the deficiencies of the prior art by providing an applet-based touch interaction scheme.
The application provides a touch interaction method based on an applet, which comprises the following steps:
if it is detected that a user touches the display interface of an applet embedded in another application client, acquiring the touch position of the user on the display interface of the applet;
detecting whether the touch of the user falls within at least one graph drawn by the drawing component, based on the touch position, the pre-acquired position of the drawing component in the display interface of the applet, and the vertices of each graph drawn by the drawing component;
and if so, performing an interactive response based on a target graph among the at least one graph, the gesture information of the user's touch, and a preset interaction configuration.
Further optionally, in the method as described above, detecting whether the touch of the user falls within at least one graph drawn by the drawing component based on the touch position, the position of the drawing component in the display interface of the applet, and vertices of the graphs drawn by the drawing component, includes:
acquiring the relative position of the touch position with respect to the drawing component, based on the touch position and the position of the drawing component;
acquiring the boundary information of each graph based on the vertices of each graph drawn by the drawing component;
and detecting whether the touch of the user falls in the at least one graph or not according to the relative position, the vertex of each graph and the boundary information of each graph.
Further optionally, in the method, detecting whether the touch of the user falls within the at least one graph according to the relative position, a vertex of each graph, and boundary information of each graph includes:
for each graph, analyzing whether the touch of the user is suspected to fall in the corresponding graph according to the relative position and the boundary information of the graph;
and if so, further detecting whether the touch of the user falls within the corresponding graph according to the relative position and the vertices of the corresponding graph.
Further optionally, in the method described above, if it is detected that the touch of the user falls within at least two graphs drawn by the drawing component, before the interactive response is performed based on a target graph among the at least one graph, the gesture information of the user's touch and a preset interaction configuration, the method further includes:
acquiring the graph at the uppermost level from the at least two graphs as the target graph, according to the level information of each graph among the at least two graphs.
Further optionally, in the method, before performing an interactive response based on a target graphic in the at least one graphic, the gesture information touched by the user, and a preset interactive configuration, the method further includes:
and generating the preset interaction configuration based on the target graph, the gesture information for touching the target graph, and the response information of the target graph under that gesture information.
The application also provides a touch interaction device based on the applet, the device includes:
the acquisition module is used for acquiring, if it is detected that a user touches the display interface of an applet embedded in another application client, the touch position of the user on the display interface of the applet;
the detection module is used for detecting whether the touch of the user falls within at least one graph drawn by the drawing component, based on the touch position, the pre-acquired position of the drawing component in the display interface of the applet, and the vertices of each graph drawn by the drawing component;
and the interaction module is used for performing an interactive response based on a target graph among the at least one graph, the gesture information of the user's touch and a preset interaction configuration, when it is detected that the touch of the user falls within the at least one graph drawn by the drawing component.
The present application further provides an electronic device, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method described in any one of the above.
The present application also provides a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the method described in any one of the above.
One embodiment of the above application has the following advantages or benefits: by adopting the above technical solution, the deficiencies of the prior art can be remedied, touch interaction in applets is supported, and the usable functions of applets are enriched.
Further, in the application, whether the touch of the user falls within at least one graph drawn by the drawing component is detected based on the touch position, the position of the drawing component in the display interface of the applet, and the vertices of each graph drawn by the drawing component. In this way it can be detected very accurately whether any graph in the Canvas component is touched, which guarantees detection accuracy and thus the performance of touch interaction.
Moreover, in the application, the applet only needs to render all graphs in a single Canvas component, and no large number of external View components or Canvas components need to be created for assistance. The Canvas component is thereby used to its full extent without consuming excessive applet performance, so touch interaction in the applet has a low implementation cost and good runtime performance.
Moreover, the applet touch interaction realized by the application has a strong interaction capability: it can be used for click interaction as well as animation interaction, and can greatly improve the interaction capability of the Canvas component in the applet.
Other effects of the above-described alternative will be described below with reference to specific embodiments.
Drawings
The drawings are included to provide a better understanding of the present solution and are not intended to limit the present application. Wherein:
FIG. 1 is a schematic diagram according to a first embodiment of the present application;
FIG. 2 is a detection schematic diagram according to an embodiment of the present application;
FIG. 3 is a schematic touch diagram according to an embodiment of the present application;
FIG. 4 is a schematic illustration according to a second embodiment of the present application;
FIG. 5 is a block diagram of an electronic device for implementing the applet-based touch interaction method in an embodiment of the present application.
Detailed Description
The following description of exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of the embodiments to aid understanding, and these details are to be considered exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present application. Likewise, descriptions of well-known functions and constructions are omitted below for clarity and conciseness.
Fig. 1 is a flowchart of an embodiment of the applet-based touch interaction method provided by the present application. As shown in fig. 1, the applet-based touch interaction method of this embodiment may specifically include the following steps:
s101, if a user touches and is embedded into the display interfaces of the applets of other application clients, acquiring the touch position executed by the user on the display interfaces of the applets;
s102, detecting whether the touch of a user falls into at least one graph drawn by a drawing component or not based on the touch position, the position of the drawing component in a display interface of the small program obtained in advance and the vertex of each graph drawn by the drawing component; if yes, go to step S102; otherwise, determining that the touch of the user does not fall into any drawn graph of the drawing assembly, and ending the small program without interacting with the user.
S103, interactive response is carried out based on a target graph in at least one graph, gesture information of user touch and preset interactive configuration.
The execution subject of the applet-based touch interaction method according to this embodiment may be the applet-based touch interaction device according to this embodiment, and the applet-based touch interaction device may be configured in an applet to implement touch interaction between the applet and a user.
The technical solution of this embodiment is applied to applets. An applet is a quick application that is embedded directly into another application and used without downloading; this convenient mode of use is favored by more and more users.
An existing applet is provided with a drawing component, Canvas, which can show some simple animations, such as displaying the process of drawing a graph or making a drawn graph perform some simple preset actions. However, when a user touches a graph on the applet's interface, the applet cannot interact with the user in any way. In this embodiment, in order to implement graph-based interaction between the applet and the user, the applet-based touch interaction apparatus detects the user's touch: for example, it may detect that the user touches the display interface of the applet and that interaction is required at that moment, and it further detects whether the user's touch falls within a graph drawn by the drawing component. If the touch falls within a graph drawn by the drawing component, it can be determined that the user is requesting interaction and a corresponding interactive response needs to be made; if not, no response may be made.
In this embodiment, when it is detected that the user touches the display interface of an applet embedded in another application client, the touch position of the user on the display interface of the applet is first acquired; it may be represented by the page coordinates pageX and pageY. The touch of the user may be a single-point touch, such as a click performed with one finger or a drag performed after such a click; it may also be a multi-point touch, such as a zoom gesture performed with two fingers, or a grab or spread gesture performed with several fingers. For a multi-point touch, the touch position of each point needs to be acquired.
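For example, the acquisition of these touch positions might look like the following minimal sketch; the pageX/pageY field names follow the description above, while the handler name and the returned structure are illustrative assumptions rather than code from this application:

```javascript
// Hypothetical touchstart handler for the applet page hosting the Canvas.
// It collects the page coordinates of every touch point, so that both
// single-point touches (click, drag) and multi-point touches (zoom, grab,
// spread) can be analyzed afterwards.
function onTouchStart(event) {
  // event.touches holds one entry per finger currently on the screen,
  // so a single tap yields one position and a two-finger zoom yields two.
  const positions = event.touches.map(function (t) {
    return { x: t.pageX, y: t.pageY };
  });
  return positions;
}
```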
In this embodiment, the applet-based touch interaction apparatus may also obtain, in advance, the position of the drawing component, such as a Canvas, in the display interface of the applet from the component information recorded by the applet. In practical applications, the drawing component in the applet may be a rectangle whose length and width are preset; for convenience of marking, in this embodiment the vertex at the upper-left corner of the drawing component may be used to identify the position of the drawing component.
In this embodiment, the applet-based touch interaction apparatus may also obtain, in advance, the vertices of each graph drawn by the drawing component; whether a graph is regular or irregular, its vertices may be recorded at the time the drawing component draws it. Thus, the vertices of each graph drawn by the drawing component can be obtained from the pre-recorded information.
Then, based on the touch position, the position of the drawing component in the display interface of the applet, and the vertices of each graph drawn by the drawing component, it is detected whether the touch of the user falls within at least one graph drawn by the drawing component. For example, this touch detection may specifically include the following steps:
(a1) acquiring the relative position of the touch position with respect to the drawing component, based on the touch position and the position of the drawing component;
For example, in this embodiment, the position of the drawing component may be subtracted from the touch position, and the result taken as the relative position of the touch position with respect to the drawing component, thereby associating the touch position with the position of the drawing component.
(b1) acquiring the boundary information of each graph based on the vertices of each graph drawn by the drawing component;
In this embodiment, for each graph, the smallest rectangle that just encloses the graph may be determined, such that each vertex on the outermost periphery of the graph falls exactly on an edge of the rectangle; the boundary information of this rectangle is then taken as the boundary information of the graph. For example, the boundary information of a graph may be represented by a set of parameters x1, y1, w and h, where (x1, y1) are the coordinates of the upper-left corner of the rectangle, w is the width of the rectangle, and h is its height.
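For instance, the parameters x1, y1, w and h can be derived from a graph's recorded vertices as in the following sketch (the helper name and data layout are assumptions for illustration, not an interface defined by this application):

```javascript
// Compute the smallest axis-aligned rectangle that just encloses a graph.
// vertices: array of {x, y} points recorded when the graph was drawn.
function boundingBox(vertices) {
  const xs = vertices.map(v => v.x);
  const ys = vertices.map(v => v.y);
  const x1 = Math.min(...xs); // upper-left corner x
  const y1 = Math.min(...ys); // upper-left corner y
  return {
    x1: x1,
    y1: y1,
    w: Math.max(...xs) - x1, // width of the enclosing rectangle
    h: Math.max(...ys) - y1  // height of the enclosing rectangle
  };
}
```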
(c1) detecting whether the touch of the user falls within the at least one graph according to the relative position, the vertices of each graph, and the boundary information of each graph.
Detecting whether the touch of the user falls within the at least one graph in this embodiment may comprise two detection steps: rough detection and fine detection.
For example, in the rough detection, for each graph, whether the touch of the user is suspected to fall within the corresponding graph may be analyzed according to the relative position and the boundary information of the graph. If the relative position falls within the region enclosed by the boundary information of the graph, the touch of the user is considered suspected to fall within the corresponding graph; otherwise, if the relative position does not fall within that region, the touch of the user is considered not to fall within the corresponding graph. During detection, if the relative position of the click is denoted (x, y), it can be checked whether the relations x1 < x < x1 + w and y1 < y < y1 + h hold; if so, the relative position falls within the region enclosed by the boundary information of the graph, otherwise it does not.
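Combining the relative-position computation of step (a1) with this bounding-box check, the rough detection could be sketched as follows (illustrative names, not the application's own code):

```javascript
// Rough detection: translate the touch into the drawing component's
// coordinate system, then test it against a graph's bounding box.
function roughHit(touch, componentPos, box) {
  // Relative position = touch position minus drawing-component position.
  const x = touch.x - componentPos.x;
  const y = touch.y - componentPos.y;
  // Suspected hit if the point lies strictly inside the rectangle.
  return box.x1 < x && x < box.x1 + box.w &&
         box.y1 < y && y < box.y1 + box.h;
}
```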
However, since there may still be a gap between the graph itself and its boundary rectangle, further fine detection is necessary when the touch of the user is suspected to fall within the corresponding graph. Specifically, whether the touch of the user falls within the corresponding graph may be detected according to the relative position and the vertices of the corresponding graph.
In this embodiment, whether the touch of the user falls within the corresponding graph may be detected based on the following mathematical rule: take the relative position as the judgment point, draw a ray from the judgment point along the x axis, and count the intersections of the ray with the edges of the graph. If the number of intersections is odd, the judgment point is inside the polygon; if the number of intersections is even, the judgment point is outside the polygon. Based on this principle, whether the judgment point is within the graph can be accurately detected, and it can thus be determined whether the touch of the user falls within the corresponding graph.
For example, the above rule can be implemented in code; in the published document the listing appears only as an image (Figure BDA0002206358260000071) and cannot be reproduced verbatim here.
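The following JavaScript sketch reconstructs a standard even-odd ray-casting test consistent with the rule described above; it is an assumption based on that description, not the application's original listing:

```javascript
// Even-odd ray casting: cast a ray from the judgment point (x, y) along
// the x axis and count its intersections with the polygon's edges.
// An odd count means the point is inside; an even count means outside.
function pointInPolygon(x, y, vertices) {
  let inside = false;
  for (let i = 0, j = vertices.length - 1; i < vertices.length; j = i++) {
    const a = vertices[i];
    const b = vertices[j];
    // Edge (a, b) is crossed if it straddles the horizontal line through y
    // and the intersection lies to the right of x; each crossing toggles
    // the inside/outside state.
    const crosses = (a.y > y) !== (b.y > y) &&
      x < ((b.x - a.x) * (y - a.y)) / (b.y - a.y) + a.x;
    if (crosses) inside = !inside;
  }
  return inside;
}
```

Because only the parity of the crossings matters, the same test works for convex and concave polygons alike, which matches the fig. 2 discussion below.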
Fig. 2 is a detection schematic diagram according to an embodiment of the present application. As shown in fig. 2, taking an irregular graph drawn by the drawing component as an example and taking O in the figure as the judgment point, a ray can be drawn from the judgment point along the X axis, either to the left or to the right; fig. 2 shows rays drawn in both directions. The ray drawn to the right has 3 intersections with the graph and the ray drawn to the left has 5; since the count in either direction is odd, the touch point is determined to fall within the graph. Therefore, even for irregular graphs, whether convex polygons or concave polygons, this mathematical rule can be used to detect whether the touch of the user falls within the corresponding graph.
In this embodiment, for each graph drawn by the drawing component, whether the touch of the user falls within the graph may be detected in the manner of the above embodiment. By analyzing each graph in turn, the number of graphs the touch of the user falls within can be determined. If the touch of the user falls within only one graph, that graph is the target graph. If the touch of the user falls within at least two graphs, then, since the display interface of the applet shows the user the graph on the uppermost level, the user can be considered to have touched the uppermost graph; in that case the graph on the uppermost level is acquired from the at least two graphs as the target graph, according to the level information of each graph among the at least two graphs.
That is to say, in practical applications the number of graphs drawn by the drawing component may be greater than 1. For example, fig. 3 is a schematic touch diagram provided by an embodiment of the present application. As shown in fig. 3, in the user's touch scene, from top to bottom, the top layer is an application and the second layer is an applet embedded in that application. The remaining two layers can be understood as the Canvas component layers in the applet; in the schematic shown in fig. 3, the Canvas can be considered to have drawn two graphs, a circle on the upper layer and an irregular graph on the lower layer. In this scenario, following the manner of the above embodiment, since the two graphs do not completely overlap, the touch of the user may fall within only one of the graphs or within both. In the latter case, the uppermost graph needs to be acquired as the target graph according to the hierarchy of the two graphs, as sketched below.
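A minimal sketch of this topmost-graph selection, assuming each drawn graph records a layer value (its level in the drawing order); the names are illustrative assumptions:

```javascript
// Among all graphs the touch falls within, pick the uppermost one,
// since that is the graph the user actually sees and touches.
function pickTargetGraph(hitGraphs) {
  if (hitGraphs.length === 0) return null; // touch hit no graph at all
  return hitGraphs.reduce(function (top, g) {
    return g.layer > top.layer ? g : top;
  });
}
```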
It should be noted that, before the interactive response is performed in step S103 of the above embodiment based on the target graph among the at least one graph, the gesture information of the user's touch and the preset interaction configuration, the method may further include: generating the preset interaction configuration based on the target graph, the gesture information for touching the target graph, and the response information of the target graph under that gesture information.
That is, the interaction configuration must be set first; only then can an interactive response be made based on it when an interaction is required. The interaction configuration of this embodiment comprises the target graph, the gesture information for touching the target graph, and the response information of the target graph under that gesture information. The target graph indicates which graph the interaction is based on. The gesture information for touching the target graph may include various gesture operations such as clicking, zooming in, zooming out and dragging, and the operating characteristics of each kind of gesture information may be defined in the interaction configuration in advance, so that when the touch of the user is detected, it can be analyzed which gesture information the touch belongs to. The response information defines how the target graph responds to the defined gesture information: for example, when the gesture information is a click, the target graph may be highlighted; when the gesture information is zooming in, the target graph may be displayed enlarged; when the gesture information is zooming out, the target graph is correspondingly reduced; and when the gesture information is a drag, the target graph may be moved correspondingly along the drag direction. The response information of this embodiment may also be a response in other animation forms, and so on. In this embodiment, an interaction configuration may be set for each target graph as required, which is not described in detail here by way of example.
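For illustration only, such an interaction configuration might be declared as a mapping from gesture names to response callbacks; every name below is an assumption for the sketch rather than an interface defined by this application:

```javascript
// Hypothetical interaction configuration for one target graph. Each
// gesture defined here is matched against the analyzed touch, and the
// corresponding response is executed on the target graph.
const interactionConfig = {
  graphId: 'circle-1',
  gestures: {
    tap:      function (graph) { graph.highlight(); },           // click: highlight
    pinchOut: function (graph, scale) { graph.scaleBy(scale); }, // zoom in: enlarge
    pinchIn:  function (graph, scale) { graph.scaleBy(scale); }, // zoom out: shrink
    drag:     function (graph, dx, dy) { graph.moveBy(dx, dy); } // drag: move along
  }
};
```

At response time, the apparatus would look up the analyzed gesture in such a table and invoke the corresponding callback on the target graph.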
By adopting the above technical solution, the applet-based touch interaction method of this embodiment can remedy the deficiencies of the prior art, support touch interaction in applets, and enrich the usable functions of applets.
Further, in this embodiment, whether the touch of the user falls within at least one graph drawn by the drawing component is detected based on the touch position, the position of the drawing component in the display interface of the applet, and the vertices of each graph drawn by the drawing component. In this way it can be detected very accurately whether any graph in the Canvas component is touched, which guarantees detection accuracy and thus the performance of touch interaction.
In addition, in this embodiment, the applet only needs to render all graphs in a single Canvas component, and no large number of external View components or Canvas components need to be created for assistance. The Canvas component is thereby used to its full extent without consuming excessive applet performance, so touch interaction in the applet has a low implementation cost and good runtime performance.
Moreover, the applet touch interaction realized by this embodiment has a strong interaction capability: it can be used for click interaction as well as animation interaction, and can greatly improve the interaction capability of the Canvas component in the applet.
Fig. 4 is a structural diagram of an embodiment of an applet-based touch interaction device provided in the present application. As shown in fig. 4, the applet-based touch interaction device 400 of the present embodiment may specifically include:
the obtaining module 401 is configured to acquire, if it is detected that a user touches the display interface of an applet embedded in another application client, the touch position of the user on the display interface of the applet;
the detection module 402 is configured to detect whether the touch of the user falls within at least one graph drawn by the drawing component, based on the touch position acquired by the obtaining module 401, the pre-acquired position of the drawing component in the display interface of the applet, and the vertices of each graph drawn by the drawing component;
the interaction module 403 is configured to perform an interactive response based on a target graph among the at least one graph, the gesture information of the user's touch, and a preset interaction configuration, if the detection module 402 detects that the touch of the user falls within the at least one graph drawn by the drawing component.
Further optionally, in the applet-based touch interaction device of this embodiment, the detection module 402 is specifically configured to:
acquiring the relative position of the touch position with respect to the drawing component, based on the touch position acquired by the obtaining module 401 and the position of the drawing component;
acquiring the boundary information of each graph based on the vertices of each graph drawn by the drawing component;
and detecting whether the touch of the user falls within the at least one graph according to the relative position, the vertices of each graph, and the boundary information of each graph.
Further optionally, in the applet-based touch interaction device of this embodiment, the detection module 402 is specifically configured to:
For each graph, analyzing whether the touch of the user is suspected to fall in the corresponding graph or not according to the relative position and the boundary information of the graph;
and if so, further detecting whether the touch of the user falls within the corresponding graph according to the relative position and the vertices of the corresponding graph.
Further optionally, in the applet-based touch interaction device of this embodiment, the obtaining module 401 is further configured to:
if the detection module 402 detects that the touch of the user falls within at least two graphs drawn by the drawing component, acquiring the graph at the uppermost level from the at least two graphs as the target graph, according to the level information of each graph among the at least two graphs.
Further optionally, the applet-based touch interaction device of this embodiment further includes:
the generating module 404 is configured to generate the preset interaction configuration based on the target graph, the gesture information for touching the target graph, and the response information of the target graph under that gesture information.
Correspondingly, the interaction module 403 is configured to perform an interactive response based on the target graph, the gesture information of the user's touch, and the preset interaction configuration generated by the generating module 404, if the detection module 402 detects that the touch of the user falls within at least one graph drawn by the drawing component.
The implementation principle and technical effect of the applet-based touch interaction device in this embodiment are the same as those of the related method embodiments, and reference may be made to the description of the related method embodiments in detail, which is not repeated herein.
According to an embodiment of the present application, an electronic device and a readable storage medium are also provided.
Fig. 5 is a block diagram of an electronic device for the applet-based touch interaction method according to an embodiment of the application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be examples only and are not meant to limit implementations of the present application described and/or claimed herein.
As shown in fig. 5, the electronic device includes: one or more processors 501, a memory 502, and interfaces for connecting the various components, including high-speed interfaces and low-speed interfaces. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions executed within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output apparatus (such as a display device coupled to the interface). In other embodiments, multiple processors and/or multiple buses may be used together with multiple memories, if desired. Also, multiple electronic devices may be connected, with each device providing part of the necessary operations (e.g., as a server array, a group of blade servers, or a multi-processor system).
The memory 502 is a non-transitory computer-readable storage medium as provided herein, wherein the memory stores instructions executable by at least one processor to cause the at least one processor to perform the applet-based touch interaction method provided herein. The non-transitory computer-readable storage medium of the present application stores computer instructions for causing a computer to perform the applet-based touch interaction method provided herein.
The memory 502, as a non-transitory computer-readable storage medium, may be used to store non-transitory software programs, non-transitory computer-executable programs, and modules, such as the program instructions/modules corresponding to the applet-based touch interaction method in the embodiments of the present application (e.g., the obtaining module 401, the detection module 402, the interaction module 403 and the generating module 404 shown in fig. 4). The processor 501 executes the various functional applications of the server and data processing by running the non-transitory software programs, instructions and modules stored in the memory 502, that is, implements the applet-based touch interaction method in the above method embodiments.
The memory 502 may include a program storage area and a data storage area, wherein the program storage area may store an operating system and an application program required for at least one function, and the data storage area may store data created according to the use of the electronic device for applet-based touch interaction, and the like. Further, the memory 502 may include a high-speed random access memory, and may also include a non-transitory memory, such as at least one magnetic disk storage device, a flash memory device, or another non-transitory solid-state storage device. In some embodiments, the memory 502 optionally includes memory located remotely from the processor 501, and such remote memory may be connected over a network to the electronic device for applet-based touch interaction. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device for the applet-based touch interaction method may further include: an input device 503 and an output device 504. The processor 501, the memory 502, the input device 503 and the output device 504 may be connected by a bus or in other manners; fig. 5 takes connection by a bus as an example.
The input device 503 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device for applet-based touch interaction, and may be, for example, a touch screen, a keypad, a mouse, a track pad, a touch pad, a pointing stick, one or more mouse buttons, a track ball, a joystick, or another input device. The output device 504 may include a display device, auxiliary lighting devices (e.g., LEDs), haptic feedback devices (e.g., vibration motors), and the like. The display device may include, but is not limited to, a liquid crystal display (LCD), a light-emitting diode (LED) display, and a plasma display. In some implementations, the display device may be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application-specific integrated circuits (ASICs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special-purpose or general-purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
According to the technical solution of the embodiments of the application, the deficiencies of the prior art can be remedied, touch interaction in applets is supported, and the usable functions of applets are enriched.
Further, in this embodiment, whether the touch of the user falls within at least one graph drawn by the drawing component is detected based on the touch position, the position of the drawing component in the display interface of the applet, and the vertices of each graph drawn by the drawing component. In this way it can be detected very accurately whether any graph in the Canvas component is touched, which guarantees detection accuracy and thus the performance of touch interaction.
In addition, in this embodiment, the applet only needs to render all graphs in a single Canvas component, and no large number of external View components or Canvas components need to be created for assistance. The Canvas component is thereby used to its full extent without consuming excessive applet performance, so touch interaction in the applet has a low implementation cost and good runtime performance.
Moreover, the applet touch interaction realized by this embodiment has a strong interaction capability: it can be used for click interaction as well as animation interaction, and can greatly improve the interaction capability of the Canvas component in the applet.
It should be understood that the various forms of flows shown above may be used, with steps reordered, added or deleted. For example, the steps described in the present application may be executed in parallel, sequentially or in different orders, and no limitation is imposed herein as long as the desired results of the technical solutions disclosed in the present application can be achieved.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (12)

1. An applet-based touch interaction method, the method comprising:
if it is detected that a user touches the display interface of an applet embedded in another application client, acquiring the touch position of the user on the display interface of the applet;
detecting whether the touch of the user falls within at least one graph drawn by the drawing component, based on the touch position, the pre-acquired position of the drawing component in the display interface of the applet, and the vertices of each graph drawn by the drawing component;
and if so, performing an interactive response based on a target graph among the at least one graph, the gesture information of the user's touch, and a preset interaction configuration.
2. The method of claim 1, wherein detecting whether the user's touch falls within at least one of the graphics drawn by the drawing component based on the touch location, a location of the drawing component in a display interface of the applet, and vertices of the graphics drawn by the drawing component comprises:
acquiring the relative position of the touch position with respect to the drawing component, based on the touch position and the position of the drawing component;
acquiring the boundary information of each graph based on the vertices of each graph drawn by the drawing component;
and detecting whether the touch of the user falls in the at least one graph or not according to the relative position, the vertex of each graph and the boundary information of each graph.
3. The method of claim 2, wherein detecting whether the user's touch falls within the at least one graph according to the relative position, the vertex of each graph, and the boundary information of each graph comprises:
for each graph, analyzing whether the touch of the user is suspected to fall in the corresponding graph according to the relative position and the boundary information of the graph;
and if so, further detecting whether the touch of the user falls within the corresponding graph according to the relative position and the vertices of the corresponding graph.
4. The method according to claim 1, wherein, if it is detected that the touch of the user falls within at least two graphs drawn by the drawing component, before performing the interactive response based on a target graph in the at least one graph, the gesture information of the user's touch and a preset interaction configuration, the method further comprises:
acquiring the graph at the uppermost level from the at least two graphs as the target graph, according to the level information of each graph among the at least two graphs.
5. The method according to any one of claims 1 to 4, wherein before performing an interactive response based on a target graph in the at least one graph, the gesture information of the user's touch, and a preset interaction configuration, the method further comprises:
generating the preset interaction configuration based on the target graph, the gesture information for touching the target graph, and the response information of the target graph under that gesture information.
6. An applet-based touch interaction apparatus, the apparatus comprising:
the acquisition module is used for acquiring, if it is detected that a user touches the display interface of an applet embedded in another application client, the touch position of the user on the display interface of the applet;
the detection module is used for detecting whether the touch of the user falls within at least one graph drawn by the drawing component, based on the touch position, the pre-acquired position of the drawing component in the display interface of the applet, and the vertices of each graph drawn by the drawing component;
and the interaction module is used for performing an interactive response based on a target graph among the at least one graph, the gesture information of the user's touch and a preset interaction configuration, when it is detected that the touch of the user falls within the at least one graph drawn by the drawing component.
7. The apparatus of claim 6, wherein the detection module is configured to:
acquiring the relative position of the touch position with respect to the drawing component, based on the touch position and the position of the drawing component;
acquiring the boundary information of each graph based on the vertices of each graph drawn by the drawing component;
and detecting whether the touch of the user falls within the at least one graph according to the relative position, the vertices of each graph, and the boundary information of each graph.
8. The apparatus of claim 7, wherein the detection module is configured to:
for each graph, analyzing whether the touch of the user is suspected to fall in the corresponding graph according to the relative position and the boundary information of the graph;
and if so, further detecting whether the touch of the user falls within the corresponding graph according to the relative position and the vertices of the corresponding graph.
9. The apparatus of claim 6, wherein the obtaining module is further configured to:
when it is detected that the touch of the user falls within at least two graphs drawn by the drawing component, acquiring the graph at the uppermost level from the at least two graphs as the target graph, according to the level information of each graph among the at least two graphs.
10. The apparatus of any of claims 6-9, further comprising:
and the generating module is used for generating the preset interaction configuration based on the target graph, the gesture information for touching the target graph, and the response information of the target graph under that gesture information.
11. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-5.
12. A non-transitory computer-readable storage medium having stored thereon computer instructions for causing a computer to perform the method of any one of claims 1-5.
CN201910882703.4A 2019-09-18 2019-09-18 Touch interaction method and device based on small program, electronic equipment and storage medium Active CN110727383B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910882703.4A CN110727383B (en) 2019-09-18 2019-09-18 Touch interaction method and device based on small program, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910882703.4A CN110727383B (en) 2019-09-18 2019-09-18 Touch interaction method and device based on small program, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN110727383A (en) 2020-01-24
CN110727383B CN110727383B (en) 2024-05-28

Family

ID=69219164

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910882703.4A Active CN110727383B (en) 2019-09-18 2019-09-18 Touch interaction method and device based on small program, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110727383B (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9459794B1 (en) * 2014-03-24 2016-10-04 Amazon Technologies, Inc. Interactions based on multiple stylus inputs
US20170199748A1 (en) * 2016-01-13 2017-07-13 International Business Machines Corporation Preventing accidental interaction when rendering user interface components
CN106952316A (en) * 2017-03-22 2017-07-14 福建中金在线信息科技有限公司 The display methods and device of share stock time-sharing map in a kind of wechat small routine
CN109634603A (en) * 2018-11-28 2019-04-16 广东智合创享营销策划有限公司 A kind of H5 page animation method and apparatus based on Canvas painting canvas
CN109783102A (en) * 2019-01-18 2019-05-21 北京城市网邻信息技术有限公司 Method, apparatus, equipment and the storage medium that Canvas painting canvas generates in a kind of small routine
CN110109598A (en) * 2019-05-06 2019-08-09 北京奇艺世纪科技有限公司 A kind of animation interaction implementation method, device and electronic equipment

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
JINTAE KIM; HYUNSOO SONG; DONG-SOO KWON: "Behavioral Analysis of a Touch-Based Interaction between Humans and an Egg-shaped Robot According to Protrusions", IEEE, 22 November 2018 (2018-11-22) *
OU Nan; CHEN Xiang: "Application of 3D Scenes in Mobile Games", Science and Technology of West China, no. 14, 15 May 2008 (2008-05-15)
XIN Yizhong; JIANG Xinhui; LI Yan; LI Yang: "Pen + Touch Input Oriented to the Dominant Hand", Journal of Computer-Aided Design & Computer Graphics, no. 09, 15 September 2017 (2017-09-15)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111524210A (en) * 2020-04-10 2020-08-11 北京百度网讯科技有限公司 Method and apparatus for generating drawings
CN111857488A (en) * 2020-06-30 2020-10-30 北京百度网讯科技有限公司 Method and device for popping up menu in applet, electronic equipment and storage medium
CN111857488B (en) * 2020-06-30 2022-06-28 北京百度网讯科技有限公司 Method and device for popping up menu in applet, electronic equipment and storage medium
CN115857786A (en) * 2023-02-27 2023-03-28 蔚来汽车科技(安徽)有限公司 Method for realizing touch interaction and touch interaction equipment
CN115857786B (en) * 2023-02-27 2023-07-07 蔚来汽车科技(安徽)有限公司 Method for realizing touch interaction and touch interaction device

Also Published As

Publication number Publication date
CN110727383B (en) 2024-05-28

Similar Documents

Publication Publication Date Title
JP6264293B2 (en) Display control apparatus, display control method, and program
EP3843031A2 (en) Face super-resolution realization method and apparatus, electronic device and storage medium
US20140282269A1 (en) Non-occluded display for hover interactions
US9870144B2 (en) Graph display apparatus, graph display method and storage medium
KR20170041219A (en) Hover-based interaction with rendered content
US10275910B2 (en) Ink space coordinate system for a digital ink stroke
CN110727383B (en) Touch interaction method and device based on small program, electronic equipment and storage medium
US20130191714A1 (en) Fill by example animation and visuals
KR102205283B1 (en) Electro device executing at least one application and method for controlling thereof
US20140285507A1 (en) Display control device, display control method, and computer-readable storage medium
KR20160003683A (en) Automatically manipulating visualized data based on interactivity
CN110663017B (en) Multi-stroke intelligent ink gesture language
US11068119B2 (en) Optimizing an arrangement of content on a display of a user device based on user focus
US20160239186A1 (en) Systems and methods for automated generation of graphical user interfaces
CN110889056A (en) Page marking method and device
JP2016110518A (en) Information processing equipment, control method thereof, program, and storage medium
US10855481B2 (en) Live ink presence for real-time collaboration
CN112036315A (en) Character recognition method, character recognition device, electronic equipment and storage medium
CN109766034B (en) Method, device and equipment for quickly starting application program and storage medium
CN108885556B (en) Controlling digital input
US20150370468A1 (en) Graphical interface for editing an interactive dynamic illustration
CN107615229B (en) User interface device and screen display method of user interface device
CN112581589A (en) View list layout method, device, equipment and storage medium
US20140365955A1 (en) Window reshaping by selective edge revisions
CN104951223A (en) Method and device for achieving magnifying lens on touch screen and host

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant