CN110727383B - Touch interaction method and device based on small program, electronic equipment and storage medium - Google Patents

Touch interaction method and device based on small program, electronic equipment and storage medium

Info

Publication number
CN110727383B
CN110727383B (Application CN201910882703.4A)
Authority
CN
China
Prior art keywords
graph
touch
user
drawing component
applet
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910882703.4A
Other languages
Chinese (zh)
Other versions
CN110727383A (en)
Inventor
邬一平 (Wu Yiping)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201910882703.4A priority Critical patent/CN110727383B/en
Publication of CN110727383A publication Critical patent/CN110727383A/en
Application granted granted Critical
Publication of CN110727383B publication Critical patent/CN110727383B/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 - Interaction techniques using icons
    • G06F 3/0484 - Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 - Selection of displayed objects or displayed text elements
    • G06F 3/0487 - Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 - Interaction techniques for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an applet-based touch interaction method and apparatus, an electronic device and a storage medium, and relates to the technical field of applet applications. The specific implementation scheme is as follows: if it is detected that a user touches the display interface of an applet embedded in another application client, the touch position of the user on the display interface of the applet is acquired; whether the user's touch falls within at least one graphic drawn by the drawing component is then detected, based on the touch position, the pre-acquired position of the drawing component in the display interface of the applet, and the vertices of each graphic drawn by the drawing component; if yes, an interactive response is performed based on a target graphic among the at least one graphic, gesture information of the user's touch, and a preset interaction configuration. With this technical solution, the application remedies the defects of the prior art, supports touch interaction for applets, and enriches the usable functions of applets.

Description

Touch interaction method and device based on small program, electronic equipment and storage medium
Technical Field
The present application relates to computer technology, in particular to applet technology, and more particularly to an applet-based touch interaction method and apparatus, an electronic device and a storage medium.
Background
An applet is a lightweight application that runs inside another application on a mobile terminal. It can be used without downloading and installing, which makes it very convenient; for example, many existing instant messaging applications have embedded applets, which greatly facilitates users.
Existing applets provide a drawing component, typically implemented through a Canvas. Canvas offers a way to draw graphics through JavaScript and HTML elements and can be used for animation, game graphics, data visualization, picture editing and real-time video processing; for example, map rendering in applets uses Canvas. However, Canvas itself has only drawing capability, with no interaction capability and no built-in animation capability. On the traditional applet side, Canvas is used only to present drawings and animations; there is no way to realize interaction between the user and the applet through it, let alone deeper interactions. Therefore, there is a need for an applet-based touch interaction scheme.
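For instance, under the assumption of a WeChat-style applet runtime (the component id 'myCanvas' and the page structure here are illustrative, not taken from the patent), drawing a graphic on such a component might look as follows:

```javascript
// Page logic of an applet: draw a triangle on a Canvas component.
// Assumes a <canvas canvas-id="myCanvas"> declared in the page layout.
Page({
  onReady() {
    // createCanvasContext is the classic applet Canvas API; the id is illustrative.
    const ctx = wx.createCanvasContext('myCanvas');
    ctx.beginPath();
    ctx.moveTo(40, 20);   // vertices of the graphic, in Canvas coordinates
    ctx.lineTo(120, 60);
    ctx.lineTo(30, 100);
    ctx.closePath();
    ctx.setFillStyle('#3388ff');
    ctx.fill();
    ctx.draw();           // flush the drawing commands to the component
  }
});
```

The component renders the graphic but ships no hit testing of its own, which is exactly the gap the scheme below addresses.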
Disclosure of Invention
The application provides an applet-based touch interaction method and apparatus, an electronic device and a storage medium, which remedy the defects of the prior art by providing an applet-based touch interaction scheme.
The application provides a touch interaction method based on an applet, which comprises the following steps:
if it is detected that a user touches the display interface of an applet embedded in another application client, acquiring the touch position of the user on the display interface of the applet;
detecting whether the user's touch falls within at least one graphic drawn by the drawing component, based on the touch position, the pre-acquired position of the drawing component in the display interface of the applet, and the vertices of each graphic drawn by the drawing component;
if yes, performing an interactive response based on a target graphic among the at least one graphic, gesture information of the user's touch, and a preset interaction configuration.
Further optionally, in the method as described above, detecting whether the user's touch falls within at least one graphic drawn by the drawing component, based on the touch position, the position of the drawing component in the display interface of the applet and the vertices of each graphic drawn by the drawing component, includes:
acquiring, based on the touch position and the position of the drawing component, the relative position of the touch position with respect to the drawing component;
acquiring boundary information of each corresponding graphic based on the vertices of the graphics drawn by the drawing component;
and detecting, according to the relative position, the vertices of each graphic and the boundary information of each graphic, whether the user's touch falls within the at least one graphic.
Further optionally, in the method as described above, detecting whether the user's touch falls within the at least one graphic according to the relative position, the vertices of each graphic and the boundary information of each graphic includes:
for each graphic, analyzing, according to the relative position and the boundary information of the graphic, whether the user's touch is suspected of falling within the corresponding graphic;
if so, detecting, according to the relative position and the vertices of the corresponding graphic, whether the user's touch falls within the corresponding graphic.
Further optionally, in the method as described above, if it is detected that the user's touch falls within at least two graphics drawn by the drawing component, before performing the interactive response based on the target graphic, the gesture information of the user's touch and the preset interaction configuration, the method includes:
acquiring, according to the level information of each of the at least two graphics, the graphic at the uppermost level from the at least two graphics as the target graphic.
Further optionally, in the method as described above, before performing the interactive response based on the target graphic, the gesture information of the user's touch and the preset interaction configuration, the method further includes:
generating the preset interaction configuration based on the target graphic, gesture information for touching the target graphic, and response information of the target graphic under that gesture information.
The application also provides an applet-based touch interaction apparatus, which includes:
an acquisition module, configured to acquire the touch position of a user on the display interface of an applet if it is detected that the user touches the display interface of the applet embedded in another application client;
a detection module, configured to detect whether the user's touch falls within at least one graphic drawn by the drawing component, based on the touch position, the pre-acquired position of the drawing component in the display interface of the applet, and the vertices of each graphic drawn by the drawing component;
and an interaction module, configured to perform an interactive response based on a target graphic among the at least one graphic, gesture information of the user's touch and a preset interaction configuration when it is detected that the user's touch falls within the at least one graphic drawn by the drawing component.
The application also provides an electronic device, comprising:
At least one processor; and
A memory communicatively coupled to the at least one processor; wherein,
The memory stores instructions executable by the at least one processor, to enable the at least one processor to perform any of the methods described above.
The application also provides a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform any of the methods described above.
One embodiment of the above application has the following advantages or benefits: the technical solution remedies the defects of the prior art, supports touch interaction for applets, and enriches the usable functions of applets.
Further, in the application, whether the user's touch falls within at least one graphic drawn by the drawing component is detected based on the touch position, the position of the drawing component in the display interface of the applet and the vertices of each graphic drawn by the drawing component. This allows touches on any graphic in the Canvas component to be detected very accurately, which guarantees detection accuracy and thus the quality of the touch interaction.
In addition, in the application the applet only needs to render all graphics in a single Canvas component, without creating a large number of auxiliary external View components or Canvas components. The Canvas component is therefore used to its full extent without excessively consuming the applet's performance, so touch interaction in the applet is cheaper to implement and performs better.
Moreover, the applet touch interaction realized by the application has strong interaction capability: it can be used for click interaction as well as animation interaction, and can greatly improve the interaction capability of Canvas components in applets.
Other effects of the above alternatives will be described below in conjunction with specific embodiments.
Drawings
The drawings are included to provide a better understanding of the present application and are not to be construed as limiting the application. Wherein:
FIG. 1 is a schematic diagram of a first embodiment according to the present application;
FIG. 2 is a schematic diagram of detection according to an embodiment of the present application;
FIG. 3 is a schematic touch diagram according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a second embodiment according to the present application;
FIG. 5 is a block diagram of an electronic device for implementing an applet-based touch interaction method according to an embodiment of the application.
Detailed Description
Exemplary embodiments of the present application will now be described with reference to the accompanying drawings, in which various details of the embodiments are included to facilitate understanding and are to be considered merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the application. Likewise, descriptions of well-known functions and constructions are omitted below for clarity and conciseness.
Fig. 1 is a flowchart of an embodiment of an applet-based touch interaction method provided by the present application. As shown in fig. 1, the touch interaction method based on the applet in the embodiment may specifically include the following steps:
s101, acquiring a touch position executed by a user on a display interface of an applet if the user touches the display interface of the applet embedded in other application clients;
S102, detecting whether the user's touch falls within at least one graphic drawn by the drawing component, based on the touch position, the pre-acquired position of the drawing component in the display interface of the applet, and the vertices of each graphic drawn by the drawing component; if yes, executing step S103; otherwise, determining that the user's touch does not fall within any graphic drawn by the drawing component, and ending the process without interacting with the user.
S103, performing an interactive response based on a target graphic among the at least one graphic, gesture information of the user's touch, and a preset interaction configuration.
The execution subject of the applet-based touch interaction method of this embodiment may be the applet-based touch interaction apparatus of this embodiment, which may be disposed in the applet to realize touch interaction between the applet and the user.
The technical solution of this embodiment is applied to applets. An applet is a fast application embedded directly in another application and used without downloading; its convenience has made it popular with more and more users. For example, many instant messaging applications host various applets, and a user can enter the corresponding applet directly from the instant messaging application, avoiding the trouble of downloading and installing a separate application.
An existing applet provides a drawing component, Canvas, which can show some simple animations, such as presenting the process of drawing a graphic or making a drawn graphic perform some simple preset actions. However, when the user touches a graphic on the applet's interface, the applet cannot interact with the user in any way. In this embodiment, to realize graphics-based interaction between the applet and the user, the applet-based touch interaction apparatus may detect that the user has touched the applet's display interface and determine whether interaction is being requested, by further detecting whether the user's touch falls within a graphic drawn by the drawing component. If so, the user can be considered to be requesting interaction and a corresponding interactive response needs to be made; if not, no response is required.
In this embodiment, when it is detected that the user touches the display interface of an applet embedded in another application client, the touch position of the user on the display interface of the applet is first acquired, which may be represented as pageX and pageY. The user's touch may be a single-point touch, such as a click performed with one finger or a drag performed after a one-finger click; it may also be a multi-point touch, such as a two-finger zoom gesture, or a grabbing or spreading action performed with several fingers, and so on. For a multi-point touch, the touch position of each point needs to be acquired.
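As an illustrative sketch, assuming a WeChat-style touch event whose e.touches entries expose pageX/pageY (the handler name and the detectHit follow-up are hypothetical), the touch positions of all points could be collected like this:

```javascript
// Bound to the Canvas component via bindtouchstart in the page layout.
// e.touches holds one entry per finger, so both single- and multi-point
// touches are covered; each entry carries pageX/pageY in page coordinates.
handleTouchStart(e) {
  const touchPoints = e.touches.map(t => ({
    pageX: t.pageX,
    pageY: t.pageY,
  }));
  this.detectHit(touchPoints); // hypothetical follow-up: hit testing below
}
```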
In this embodiment, the applet-based touch interaction apparatus may also obtain in advance the position of the drawing component, such as the Canvas, in the display interface of the applet from the information about the drawing component recorded by the applet. In practice, the drawing component in an applet is a rectangle whose width and height are preset; for convenience of marking, the top-left corner of the drawing component can therefore be used to identify its position, and since the width and height are known, the rectangle's area can be derived from that corner.
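A sketch of obtaining that position ahead of time, assuming a WeChat-style selector query API and an illustrative '#myCanvas' selector:

```javascript
// Query the top-left corner of the Canvas component in the page.
// boundingClientRect yields an object with left, top, width and height.
wx.createSelectorQuery()
  .select('#myCanvas')
  .boundingClientRect(rect => {
    // Cache for later hit testing: the component's area follows from
    // its top-left corner plus the known width and height.
    this.canvasRect = { left: rect.left, top: rect.top };
  })
  .exec();
```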
In this embodiment, the applet-based touch interaction apparatus may also obtain in advance the vertices of each graphic drawn by the drawing component. Whether a graphic is regular or irregular, its vertices can be recorded by the drawing component at drawing time, so the vertices of the drawn graphics can be read back from this pre-recorded information.
Then, based on the touch position, the position of the drawing component in the display interface of the applet and the vertices of each graphic drawn by the drawing component, it is detected whether the user's touch falls within at least one graphic drawn by the drawing component. The implementation may specifically include the following steps:
(a1) acquiring, based on the touch position and the position of the drawing component, the relative position of the touch position with respect to the drawing component;
for example, in this embodiment the position of the drawing component may be subtracted from the touch position, and the result used as the relative position of the touch position with respect to the drawing component, thereby relating the touch position to the position of the drawing component.
(b1) obtaining boundary information of each corresponding graphic based on the vertices of the graphics drawn by the drawing component;
in this embodiment, for each graphic a minimal rectangle that just encloses the graphic may be constructed, so that every outermost vertex of the graphic lies exactly on an edge of the rectangle; the boundary information of this rectangle is then taken as the boundary information of the graphic. For example, the boundary information of a graphic may be represented by a set of parameters x1, y1, w and h, where x1 and y1 are the coordinates of the top-left corner of the rectangle, w is its width and h is its height.
(c1) detecting, according to the relative position, the vertices of each graphic and the boundary information of each graphic, whether the user's touch falls within at least one graphic.
In this embodiment, detecting whether the user's touch falls within at least one graphic may involve two steps: coarse detection and fine detection.
For example, in the coarse detection, for each graphic it may be analyzed, according to the relative position and the boundary information of the graphic, whether the user's touch is suspected of falling within the corresponding graphic. If the relative position falls within the rectangle enclosed by the boundary information of the graphic, the user's touch is considered suspected of falling within the corresponding graphic; otherwise, it is considered not to fall within it. Concretely, if the relative position of the touch is denoted x and y, it can be checked whether the relations x1 < x < x1 + w and y1 < y < y1 + h both hold; if so, the relative position falls within the rectangle enclosed by the boundary information of the graphic, otherwise it does not.
However, since there are gaps between a graphic and its bounding rectangle, when the user's touch is suspected of falling within the corresponding graphic, further fine detection is required. Specifically, whether the user's touch actually falls within the corresponding graphic can be detected according to the relative position and the vertices of the corresponding graphic.
In this embodiment, whether the user's touch falls within the corresponding graphic may be detected using the following mathematical rule: taking the relative position as the judgment point, draw a ray from the judgment point along the x-axis and count the intersections between the ray and the edges of the graphic; if the number of intersections is odd, the judgment point is inside the polygon, and if it is even, the point is outside. Based on this principle it can be accurately detected whether the judgment point is inside the graphic, and hence whether the user's touch falls within the corresponding graphic.
For example, the mathematical rule described above may be implemented by the following code:
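A minimal JavaScript sketch of this rule, preceded by the coarse bounding-box check of the previous step (the function and field names here are illustrative, not taken from the original listing):

```javascript
// Coarse check: is the point inside the graphic's bounding box
// {x1, y1, w, h} (top-left corner, width, height)?
function inBoundingBox(x, y, box) {
  return box.x1 < x && x < box.x1 + box.w &&
         box.y1 < y && y < box.y1 + box.h;
}

// Fine check: cast a ray from (x, y) along the +x axis and count
// crossings of the polygon's edges; an odd count means "inside".
function inPolygon(x, y, vertices) {
  let inside = false;
  for (let i = 0, j = vertices.length - 1; i < vertices.length; j = i++) {
    const { x: xi, y: yi } = vertices[i];
    const { x: xj, y: yj } = vertices[j];
    // Edge (j, i) straddles the horizontal line through y, and the
    // crossing lies to the right of x: one more intersection.
    const crosses = (yi > y) !== (yj > y) &&
      x < ((xj - xi) * (y - yi)) / (yj - yi) + xi;
    if (crosses) inside = !inside; // flip parity per crossing
  }
  return inside;
}

// Combined test for one graphic, given the touch position expressed
// relative to the drawing component.
function hitTest(relX, relY, graphic) {
  return inBoundingBox(relX, relY, graphic.box) &&
         inPolygon(relX, relY, graphic.vertices);
}
```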
FIG. 2 is a schematic diagram of detection according to an embodiment of the present application. As shown in fig. 2, an irregular graphic is drawn by the drawing component. Taking the judgment point O in the figure as the starting point, rays can be drawn to the left or to the right along the x-axis; fig. 2 draws rays in both directions. The ray to the right has 3 intersections with the graphic and the ray to the left has 5; since the count is odd in either direction, the touch point is determined to fall within the graphic. Therefore, even for irregular graphics, whether convex or concave, this mathematical rule can still detect whether the user's touch falls within the corresponding graphic.
In this embodiment, for each graphic drawn by the drawing component, whether the user's touch falls within the graphic may be detected in the manner of the embodiment described above. By analyzing graphic by graphic, the number of graphics the user's touch falls into can be determined. If the touch falls within only one graphic, that graphic is the target graphic. If the touch falls within at least two graphics, the display interface of the applet shows the user the graphic at the uppermost layer, so that graphic can be considered the one the user touched; according to the level information of each of the at least two graphics, the graphic at the uppermost level is acquired from them as the target graphic.
That is, in practical applications the number of graphics drawn by the drawing component may be greater than 1. For example, FIG. 3 is a schematic touch diagram provided by an embodiment of the present application. As shown in fig. 3, in the user's touch scene, from top to bottom, the uppermost layer is an application and the second layer is an applet embedded in that application. The two layers below can be understood as Canvas component layers in the applet; in the diagram of fig. 3, the Canvas can be considered to have drawn two graphics, a circle on the upper layer and an irregular graphic on the lower layer. In this scene, following the above embodiment, since the two graphics do not completely overlap, the user's touch may fall into only one of them or into both at the same time. In the latter case, the uppermost graphic must be acquired as the target graphic according to the hierarchy of the two graphics.
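Once the hit graphics are known, picking the target reduces to a comparison on level information; a sketch, with zIndex as an assumed representation of the layer hierarchy:

```javascript
// Among all graphics the touch falls into, take the one displayed on
// the uppermost layer as the target of the interaction.
// Assumes hitGraphics is non-empty (the touch hit at least one graphic).
function pickTargetGraphic(hitGraphics) {
  return hitGraphics.reduce((top, g) =>
    g.zIndex > top.zIndex ? g : top
  );
}
```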
It should be noted that, before performing the interactive response of step S103 based on the target graphic among the at least one graphic, the gesture information of the user's touch and the preset interaction configuration, the method may further include: generating the preset interaction configuration based on the target graphic, gesture information for touching the target graphic, and response information of the target graphic under that gesture information.
That is, the interaction configuration must be set up first so that an interactive response can be made from it whenever interaction is requested. The interaction configuration of this embodiment comprises the target graphic, gesture information for touching the target graphic, and response information of the target graphic under that gesture information. The target graphic indicates which graphic the interaction is based on; the gesture information may cover various gesture operations such as clicking, zooming and dragging, and the operating characteristics of each gesture can be defined in the interaction configuration in advance, so that when a touch is detected it can be classified as one of the gestures. The response information specifies how the target graphic responds to the defined gesture: for example, when the gesture is a click, the target graphic may be highlighted; when the gesture is zoom-in, the target graphic may be enlarged; when the gesture is zoom-out, the graphic is correspondingly shrunk; when the gesture is a drag, the target graphic may be moved along the drag direction; the response may also take other animated forms, and so on. In this embodiment the interaction configuration can be set for each target graphic as required, which is not described here in detail.
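One possible shape for such a configuration, sketched here with hypothetical helper functions (highlight, resize, moveBy) standing in for the concrete response logic; the patent does not fix a concrete format:

```javascript
// Per-graphic interaction configuration: gesture name -> response.
// Classifying the gesture itself (tap vs. drag vs. pinch) from the
// touch points is assumed to happen elsewhere.
const interactionConfig = {
  circle: {
    tap:   g => highlight(g),                // e.g. highlight on click
    pinch: (g, scale) => resize(g, scale),   // enlarge/shrink with two fingers
    drag:  (g, dx, dy) => moveBy(g, dx, dy), // follow the drag direction
  },
};

// Look up and perform the configured response for a target graphic.
function respond(graphic, gesture, ...args) {
  const handler = interactionConfig[graphic.name]?.[gesture];
  if (handler) handler(graphic, ...args);
}
```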
With the above technical solution, the applet-based touch interaction method remedies the defects of the prior art, supports touch interaction for applets, and enriches the usable functions of applets.
Further, in this embodiment, whether the user's touch falls within at least one graphic drawn by the drawing component is detected based on the touch position, the position of the drawing component in the display interface of the applet and the vertices of each graphic drawn by the drawing component. This allows touches on any graphic in the Canvas component to be detected very accurately, which guarantees detection accuracy and thus the quality of the touch interaction.
In addition, in this embodiment the applet only needs to render all graphics in a single Canvas component, without creating a large number of auxiliary external View components or Canvas components. The Canvas component is therefore used to its full extent without excessively consuming the applet's performance, so touch interaction in the applet is cheaper to implement and performs better.
Moreover, the applet touch interaction realized by this embodiment has strong interaction capability: it can be used for click interaction as well as animation interaction, and can greatly improve the interaction capability of Canvas components in applets.
FIG. 4 is a block diagram of an embodiment of an applet-based touch interaction apparatus according to the present application. As shown in fig. 4, the applet-based touch interaction apparatus 400 of this embodiment may specifically include:
an obtaining module 401, configured to acquire the touch position of a user on the display interface of an applet if it is detected that the user touches the display interface of the applet embedded in another application client;
a detection module 402, configured to detect whether the user's touch falls within at least one graphic drawn by the drawing component, based on the touch position acquired by the obtaining module 401, the pre-acquired position of the drawing component in the display interface of the applet, and the vertices of each graphic drawn by the drawing component;
an interaction module 403, configured to perform an interactive response based on a target graphic among the at least one graphic, gesture information of the user's touch and a preset interaction configuration if the detection module 402 detects that the user's touch falls within the at least one graphic drawn by the drawing component.
Further optionally, in the applet-based touch interaction apparatus of this embodiment, the detection module 402 is specifically configured to:
acquire, based on the touch position acquired by the obtaining module 401 and the position of the drawing component, the relative position of the touch position with respect to the drawing component;
obtain boundary information of each corresponding graphic based on the vertices of the graphics drawn by the drawing component;
and detect, according to the relative position, the vertices of each graphic and the boundary information of each graphic, whether the user's touch falls within at least one graphic.
Further optionally, in the applet-based touch interaction apparatus of this embodiment, the detection module 402 is further configured to:
for each graphic, analyze, according to the relative position and the boundary information of the graphic, whether the user's touch is suspected of falling within the corresponding graphic;
if so, detect, according to the relative position and the vertices of the corresponding graphic, whether the user's touch falls within the corresponding graphic.
Further optionally, in the applet-based touch interaction apparatus of this embodiment, the obtaining module 401 is further configured to:
if the detection module 402 detects that the user's touch falls within at least two graphics drawn by the drawing component, acquire, according to the level information of each of the at least two graphics, the graphic at the uppermost level from the at least two graphics as the target graphic.
Further optionally, the applet-based touch interaction apparatus of this embodiment further includes:
a generating module 404, configured to generate the preset interaction configuration based on the target graphic, gesture information for touching the target graphic, and response information of the target graphic under that gesture information.
Correspondingly, the interaction module 403 is configured to perform an interactive response based on the target graphic, the gesture information of the user's touch and the preset interaction configuration generated by the generating module 404, if the detection module 402 detects that the user's touch falls within at least one graphic drawn by the drawing component.
The principle and technical effects of the modules in the applet-based touch interaction apparatus of this embodiment are the same as those of the related method embodiments; for details, refer to the descriptions of the related method embodiments, which are not repeated here.
According to an embodiment of the present application, the present application also provides an electronic device and a readable storage medium.
FIG. 5 is a block diagram of an electronic device according to an embodiment of the application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown here, their connections and relationships, and their functions are meant to be examples only, and are not meant to limit implementations of the application described and/or claimed herein.
As shown in fig. 5, the electronic device includes: one or more processors 501, a memory 502, and interfaces for connecting the components, including high-speed and low-speed interfaces. The components are interconnected by different buses and may be mounted on a common motherboard or in other ways as required. The processor may process instructions executed within the electronic device, including instructions stored in or on the memory to display graphical information for a GUI on an external input/output device, such as a display device coupled to an interface. In other embodiments, multiple processors and/or multiple buses may be used together with multiple memories, if desired. Likewise, multiple electronic devices may be connected, each providing part of the necessary operations (e.g., as a server array, a group of blade servers, or a multiprocessor system).
The memory 502 is a non-transitory computer-readable storage medium provided by the application. It stores instructions executable by at least one processor, so that the at least one processor performs the applet-based touch interaction method provided by the application. The non-transitory computer-readable storage medium of the application stores computer instructions for causing a computer to perform the applet-based touch interaction method provided by the application.
As a non-transitory computer-readable storage medium, the memory 502 may be used to store non-transitory software programs, non-transitory computer-executable programs and modules, such as the program instructions/modules corresponding to the applet-based touch interaction method in the embodiments of the application (e.g., the acquisition module 401, the detection module 402, the interaction module 403 and the generation module 404 shown in fig. 4). By running the non-transitory software programs, instructions and modules stored in the memory 502, the processor 501 executes the various functional applications and data processing of the server, i.e., implements the applet-based touch interaction method of the above method embodiments.
The memory 502 may include a program storage area and a data storage area; the program storage area may store an operating system and at least one application required for its functions, and the data storage area may store data created by the use of the applet-based touch interaction electronic device, and the like. In addition, the memory 502 may include high-speed random access memory and non-transitory memory, such as at least one magnetic disk storage device, flash memory device or other non-transitory solid-state storage device. In some embodiments, the memory 502 optionally includes memory remotely located with respect to the processor 501, which may be connected to the applet-based touch interaction electronic device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks and combinations thereof.
The electronic device for the applet-based touch interaction method may further include: an input device 503 and an output device 504. The processor 501, the memory 502, the input device 503 and the output device 504 may be connected by a bus or in other ways; connection by bus is taken as the example in fig. 5.
The input device 503 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device for applet-based touch interactions, such as a touch screen, keypad, mouse, trackpad, touch pad, pointer stick, one or more mouse buttons, trackball, joystick, and the like. The output devices 504 may include a display device, auxiliary lighting devices (e.g., LEDs), and haptic feedback devices (e.g., vibration motors), among others. The display device may include, but is not limited to, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, and a plasma display. In some implementations, the display device may be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application-specific integrated circuits (ASICs), computer hardware, firmware, software and/or combinations thereof. These various embodiments may include implementation in one or more computer programs, which may be executed and/or interpreted on a programmable system including at least one programmable processor; the programmable processor may be special-purpose or general-purpose, and may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device and at least one output device.
These computer programs (also referred to as programs, software, software applications or code) include machine instructions for a programmable processor, and may be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus and/or device (e.g., magnetic discs, optical disks, memory, programmable logic devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or a middleware component (e.g., an application server), or a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local area networks (LANs), wide area networks (WANs) and the internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
According to the technical solution provided by the embodiments of the application, the defects of the prior art are remedied, touch interaction for applets is supported, and the usable functions of applets are enriched.
Further, in this embodiment, whether the user's touch falls within at least one graphic drawn by the drawing component is detected based on the touch position, the position of the drawing component in the display interface of the applet and the vertices of each graphic drawn by the drawing component. This allows touches on any graphic in the Canvas component to be detected very accurately, which guarantees detection accuracy and thus the quality of the touch interaction.
In addition, in this embodiment the applet only needs to render all graphics in a single Canvas component, without creating a large number of auxiliary external View components or Canvas components. The Canvas component is therefore used to its full extent without excessively consuming the applet's performance, so touch interaction in the applet is cheaper to implement and performs better.
Moreover, the applet touch interaction realized by this embodiment has strong interaction capability: it can be used for click interaction as well as animation interaction, and can greatly improve the interaction capability of Canvas components in applets.
It should be appreciated that steps may be reordered, added or deleted using the various forms of flow shown above. For example, the steps described in the application may be performed in parallel, sequentially or in a different order, as long as the desired results of the disclosed technical solution can be achieved; no limitation is imposed here.
The above embodiments do not limit the scope of the present application. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present application should be included in the scope of the present application.

Claims (10)

1. An applet-based touch interaction method, characterized by comprising the following steps:
if it is detected that a user touches the display interface of an applet embedded in another application client, acquiring the touch position of the user on the display interface of the applet;
detecting whether the user's touch falls within at least one graphic drawn by the drawing component, based on the touch position, the pre-acquired position of the drawing component in the display interface of the applet, and the vertices of each graphic drawn by the drawing component;
if yes, performing an interactive response based on a target graphic among the at least one graphic, gesture information of the user's touch, and a preset interaction configuration;
wherein detecting whether the user's touch falls within at least one graphic drawn by the drawing component, based on the touch position, the position of the drawing component in the display interface of the applet and the vertices of each graphic drawn by the drawing component, includes:
acquiring, based on the touch position and the position of the drawing component, the relative position of the touch position with respect to the drawing component;
obtaining boundary information of each corresponding graphic based on the vertices of the graphics drawn by the drawing component, including: constructing, for each graphic, a minimal rectangle that just encloses the graphic, so that every outermost vertex of the graphic lies exactly on an edge of the rectangle, and taking the boundary information of the rectangle as the boundary information of the graphic;
detecting, according to the relative position, the vertices of each graphic and the boundary information of each graphic, whether the user's touch falls within the at least one graphic, in the following two steps: coarse detection and fine detection;
in the coarse detection, analyzing, for each graphic, according to the relative position and the boundary information of the graphic, whether the user's touch is suspected of falling within the corresponding graphic;
in the fine detection, detecting, for each graphic, according to the relative position and the vertices of the corresponding graphic, whether the user's touch falls within the corresponding graphic.
2. The method according to claim 1, wherein detecting whether the user's touch falls within the at least one graphic according to the relative position, the vertices of each graphic and the boundary information of each graphic comprises:
for each graphic, analyzing, according to the relative position and the boundary information of the graphic, whether the user's touch is suspected of falling within the corresponding graphic;
if so, detecting, according to the relative position and the vertices of the corresponding graphic, whether the user's touch falls within the corresponding graphic.
3. The method according to claim 1, wherein, if it is detected that the user's touch falls within at least two graphics drawn by the drawing component, before performing the interactive response based on the target graphic among the at least one graphic, the gesture information of the user's touch and the preset interaction configuration, the method comprises:
acquiring, according to the level information of each of the at least two graphics, the graphic at the uppermost level from the at least two graphics as the target graphic.
4. The method according to any one of claims 1-3, wherein, before performing the interactive response based on the target graphic among the at least one graphic, the gesture information of the user's touch and the preset interaction configuration, the method further comprises:
generating the preset interaction configuration based on the target graphic, gesture information for touching the target graphic, and response information of the target graphic under that gesture information.
5. An applet-based touch interaction apparatus, characterized in that the apparatus comprises:
an acquisition module, configured to acquire the touch position of a user on the display interface of an applet if it is detected that the user touches the display interface of the applet embedded in another application client;
a detection module, configured to detect whether the user's touch falls within at least one graphic drawn by the drawing component, based on the touch position, the pre-acquired position of the drawing component in the display interface of the applet, and the vertices of each graphic drawn by the drawing component;
an interaction module, configured to perform an interactive response based on a target graphic among the at least one graphic, gesture information of the user's touch and a preset interaction configuration when it is detected that the user's touch falls within at least one graphic drawn by the drawing component;
wherein the detection module is configured to:
acquire, based on the touch position and the position of the drawing component, the relative position of the touch position with respect to the drawing component;
obtain boundary information of each corresponding graphic based on the vertices of the graphics drawn by the drawing component, including: constructing, for each graphic, a minimal rectangle that just encloses the graphic, so that every outermost vertex of the graphic lies exactly on an edge of the rectangle, and taking the boundary information of the rectangle as the boundary information of the graphic;
detect, according to the relative position, the vertices of each graphic and the boundary information of each graphic, whether the user's touch falls within the at least one graphic, in the following two steps: coarse detection and fine detection;
in the coarse detection, analyze, for each graphic, according to the relative position and the boundary information of the graphic, whether the user's touch is suspected of falling within the corresponding graphic;
in the fine detection, detect, for each graphic, according to the relative position and the vertices of the corresponding graphic, whether the user's touch falls within the corresponding graphic.
6. The apparatus according to claim 5, wherein the detection module is configured to:
for each graphic, analyze, according to the relative position and the boundary information of the graphic, whether the user's touch is suspected of falling within the corresponding graphic;
if so, detect, according to the relative position and the vertices of the corresponding graphic, whether the user's touch falls within the corresponding graphic.
7. The apparatus according to claim 5, wherein the acquisition module is further configured to:
when it is detected that the user's touch falls within at least two graphics drawn by the drawing component, acquire, according to the level information of each of the at least two graphics, the graphic at the uppermost level from the at least two graphics as the target graphic.
8. The apparatus according to any one of claims 5-7, characterized in that the apparatus further comprises:
a generating module, configured to generate the preset interaction configuration based on the target graphic, gesture information for touching the target graphic, and response information of the target graphic under that gesture information.
9. An electronic device, comprising:
At least one processor; and
A memory communicatively coupled to the at least one processor; wherein,
The memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-4.
10. A non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1-4.
CN201910882703.4A 2019-09-18 2019-09-18 Touch interaction method and device based on small program, electronic equipment and storage medium Active CN110727383B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910882703.4A CN110727383B (en) 2019-09-18 2019-09-18 Touch interaction method and device based on small program, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910882703.4A CN110727383B (en) 2019-09-18 2019-09-18 Touch interaction method and device based on small program, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN110727383A CN110727383A (en) 2020-01-24
CN110727383B true CN110727383B (en) 2024-05-28

Family

Family ID: 69219164

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910882703.4A Active CN110727383B (en) 2019-09-18 2019-09-18 Touch interaction method and device based on small program, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110727383B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111524210A (en) * 2020-04-10 2020-08-11 北京百度网讯科技有限公司 Method and apparatus for generating drawings
CN111857488B (en) * 2020-06-30 2022-06-28 北京百度网讯科技有限公司 Method and device for popping up menu in applet, electronic equipment and storage medium
CN115857786B (en) * 2023-02-27 2023-07-07 蔚来汽车科技(安徽)有限公司 Method for realizing touch interaction and touch interaction device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9459794B1 (en) * 2014-03-24 2016-10-04 Amazon Technologies, Inc. Interactions based on multiple stylus inputs
CN106952316A (en) * 2017-03-22 2017-07-14 福建中金在线信息科技有限公司 The display methods and device of share stock time-sharing map in a kind of wechat small routine
CN109634603A (en) * 2018-11-28 2019-04-16 广东智合创享营销策划有限公司 A kind of H5 page animation method and apparatus based on Canvas painting canvas
CN109783102A (en) * 2019-01-18 2019-05-21 北京城市网邻信息技术有限公司 Method, apparatus, equipment and the storage medium that Canvas painting canvas generates in a kind of small routine
CN110109598A (en) * 2019-05-06 2019-08-09 北京奇艺世纪科技有限公司 A kind of animation interaction implementation method, device and electronic equipment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170199748A1 (en) * 2016-01-13 2017-07-13 International Business Machines Corporation Preventing accidental interaction when rendering user interface components

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9459794B1 (en) * 2014-03-24 2016-10-04 Amazon Technologies, Inc. Interactions based on multiple stylus inputs
CN106952316A (en) * 2017-03-22 2017-07-14 福建中金在线信息科技有限公司 The display methods and device of share stock time-sharing map in a kind of wechat small routine
CN109634603A (en) * 2018-11-28 2019-04-16 广东智合创享营销策划有限公司 A kind of H5 page animation method and apparatus based on Canvas painting canvas
CN109783102A (en) * 2019-01-18 2019-05-21 北京城市网邻信息技术有限公司 Method, apparatus, equipment and the storage medium that Canvas painting canvas generates in a kind of small routine
CN110109598A (en) * 2019-05-06 2019-08-09 北京奇艺世纪科技有限公司 A kind of animation interaction implementation method, device and electronic equipment

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Jintae Kim; Hyunsoo Song; Dong-Soo Kwon. Behavioral Analysis of a Touch-Based Interaction between Humans and an Egg-shaped Robot According to Protrusions. IEEE, 2018, full text. *
Application of 3D scenes in mobile games; Ou Nan; Chen Xiang; Science and Technology of West China; 2008-05-15 (No. 14); full text. *
Pen + touch input for the dominant hand; Xin Yizhong; Jiang Xinhui; Li Yan; Li Yang; Journal of Computer-Aided Design & Computer Graphics; 2017-09-15 (No. 09); full text. *

Also Published As

Publication number Publication date
CN110727383A (en) 2020-01-24

Similar Documents

Publication Publication Date Title
CN111709878B (en) Face super-resolution implementation method and device, electronic equipment and storage medium
CN110727383B (en) Touch interaction method and device based on small program, electronic equipment and storage medium
JP2017523515A (en) Change icon size
US9870144B2 (en) Graph display apparatus, graph display method and storage medium
KR20160003683A (en) Automatically manipulating visualized data based on interactivity
US20160070460A1 (en) In situ assignment of image asset attributes
US20140285507A1 (en) Display control device, display control method, and computer-readable storage medium
KR102205283B1 (en) Electro device executing at least one application and method for controlling thereof
CN107479818B (en) Information interaction method and mobile terminal
US20160239186A1 (en) Systems and methods for automated generation of graphical user interfaces
US20150370447A1 (en) Computerized systems and methods for cascading user interface element animations
CN112926000A (en) Display area rendering method, device and equipment, readable storage medium and product
US11169652B2 (en) GUI configuration
JP2016110518A (en) Information processing equipment, control method thereof, program, and storage medium
RU2768526C2 (en) Real handwriting presence for real-time collaboration
CN113918260A (en) Application program display method and device and electronic equipment
CN110471700B (en) Graphic processing method, apparatus, storage medium and electronic device
US10193959B2 (en) Graphical interface for editing an interactive dynamic illustration
CN112634401B (en) Plane track drawing method, device, equipment and storage medium
US10970476B2 (en) Augmenting digital ink strokes
CN112581589A (en) View list layout method, device, equipment and storage medium
CN107615229B (en) User interface device and screen display method of user interface device
CN109766530B (en) Method and device for generating chart frame, storage medium and electronic equipment
US20140365955A1 (en) Window reshaping by selective edge revisions
CN112035210B (en) Method, apparatus, device and medium for outputting color information

Legal Events

Code - Description
PB01 - Publication
SE01 - Entry into force of request for substantive examination
GR01 - Patent grant