CN111208929B - Response method, device and equipment of multi-level interface and storage medium - Google Patents
- Publication number: CN111208929B
- Application number: CN202010006608.0A
- Authority
- CN
- China
- Prior art keywords
- view component
- determining
- touch
- interface
- touch operation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Software Systems (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The embodiment of the invention discloses a response method, apparatus, device, and storage medium for a multi-level interface. The method comprises the following steps: when a touch operation of a user is detected, determining the touch type of the touch operation, where the touch type comprises a sliding operation and a clicking operation; determining first view components arranged in each hierarchical interface according to the touch type; and determining a second view component from the first view components, and controlling the second view component to respond to the touch operation corresponding to the touch type. By determining the first view components arranged in each hierarchical interface according to the touch type and selecting from them the second view component that responds, the method solves the problem of delivering touch events through a multi-level interface, ensures that each interface layer responds to touch events in time, and improves the accuracy and reliability of interface response.
Description
Technical Field
The embodiments of the invention relate to the technical field of application program interfaces, and in particular to a response method, apparatus, device, and storage medium for a multi-level interface.
Background
At present, the interface structure of an application program is generally complex, typically with two or three levels. Taking a live-streaming application as an example, the upper interface carries the basic functions of the live room, while the lower interface shows the streamer's video or mic-linking information. To realize left-right sliding of the upper interface, a system-provided functional component is generally adopted; this component intercepts touch events to acquire the coordinates of the user's touch points, thereby realizing sliding in different directions. Because the application's interface has multiple levels, the lower-layer interface cannot respond to touch events if no event pass-through processing is performed.
Disclosure of Invention
The embodiments of the invention provide a response method, apparatus, device, and storage medium for a multi-level interface, which solve the problem of delivering touch events through a multi-level interface, ensure that each interface layer responds to touch events in time, and improve the accuracy and reliability of interface response.
In a first aspect, an embodiment of the present invention provides a response method for a multi-level interface, including:
when the touch operation of a user is detected, determining the touch type of the touch operation; the touch type comprises a sliding operation and a clicking operation;
determining a first view component arranged in each hierarchy interface according to the touch type;
and determining a second view component from the first view component, and controlling the second view component to respond to the touch operation corresponding to the touch type.
Further, determining a touch type of the touch operation includes:
acquiring a starting point coordinate and an end point coordinate of the touch operation;
calculating the sliding distance of the touch operation according to the starting point coordinate and the end point coordinate;
and if the sliding distance exceeds a set value, the touch operation is a sliding operation; otherwise, it is a clicking operation.
Further, determining a first view component disposed in each hierarchical interface according to the touch type includes:
if the touch type is a sliding operation, acquiring a registered view component in an upper-layer interface;
and determining the current position of each registered view component in the interface, and determining the registered view component falling into the starting point coordinate of the touch operation as a first view component.
Further, determining a second view component from the first view component, and controlling the second view component to respond to the touch operation corresponding to the touch type includes:
and determining the first view component as a second view component, and controlling the second view component to respond to the sliding operation.
Further, determining a first view component disposed in each hierarchical interface according to the touch type includes:
if the touch type is click operation, acquiring a registered view component in each level of interface;
and determining the current position of each registered view component in the interface, and determining the registered view component falling into the starting point coordinates of the touch operation as a first view component.
Further, determining a second view component from the first view component, and controlling the second view component to respond to the touch operation corresponding to the touch type includes:
if the first view components comprise a plurality of first view components, acquiring the priority of each first view component, and determining the first view component with the highest priority as the second view component;
and controlling the second view component to respond to the clicking operation.
Further, the method further includes: registering or unregistering the view components arranged in the interfaces of all levels to obtain registered view components and unregistered view components, where a registered view component responds to the touch operation of the user and an unregistered view component does not respond to the touch operation of the user.
In a second aspect, an embodiment of the present invention further provides a response apparatus for a multi-level interface, including:
the touch type determining module is used for determining the touch type of the touch operation when the touch operation of a user is detected; the touch type comprises a sliding operation and a clicking operation;
the first view component determining module is used for determining a first view component arranged in each hierarchy interface according to the touch type;
and the response module is used for determining a second view component from the first view component and controlling the second view component to respond to the touch operation corresponding to the touch type.
In a third aspect, an embodiment of the present invention further provides a computer device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor executes the computer program to implement the response method for the multi-level interface according to the embodiment of the present invention.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the response method of the multi-level interface according to the embodiment of the present invention.
According to the embodiment of the invention, when a touch operation of a user is detected, the touch type of the touch operation is determined; first view components arranged in each hierarchical interface are determined according to the touch type; and a second view component is determined from the first view components and controlled to respond to the touch operation corresponding to the touch type. By determining the first view components according to the touch type and selecting from them the second view component that responds, the method solves the problem of delivering touch events through a multi-level interface, ensures that each interface layer responds to touch events in time, and improves the accuracy and reliability of interface response.
Drawings
FIG. 1 is a flow chart of a response method of a multi-level interface according to a first embodiment of the present invention;
fig. 2 is a schematic structural diagram of a response device of a multi-level interface according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of a computer device in a third embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a flowchart of a method for responding to a multi-level interface according to the first embodiment of the present invention. The embodiment is applicable to the case where an interface in an application responds to a touch operation. The method may be executed by a response apparatus of a multi-level interface, which may be composed of hardware and/or software and may generally be integrated in a device having the response function of a multi-level interface, such as a server or a server cluster. As shown in fig. 1, the method specifically includes the following steps:
step 110, when the touch operation of the user is detected, determining the touch type of the touch operation.
The touch type includes a sliding operation and a clicking operation. Specifically, when a user performs a touch operation on an interface of an application program, the mobile terminal can recognize the touch operation and further determine whether it is a click operation or a slide operation.
In this embodiment, the touch type of the touch operation may be determined as follows: acquire the start-point and end-point coordinates of the touch operation; calculate the sliding distance of the touch operation from the start-point and end-point coordinates; and if the sliding distance exceeds a set value, the touch operation is a sliding operation; otherwise, it is a clicking operation.
The position of the user's touch operation may be identified through the function interface provided by the application framework (MotionEvent and its action constants). The start-point coordinates of the touch operation may be acquired by calling getters such as MotionEvent.getX() and MotionEvent.getY() when the down event is received, and the end-point coordinates may be acquired in the same way when the up event is received. Both the start-point and end-point coordinates are two-dimensional coordinates. The distance formula is then applied to the start-point and end-point coordinates to obtain the sliding distance of the touch operation. When the sliding distance is greater than the set value, the touch type of the touch operation is the sliding operation; when the sliding distance is less than or equal to the set value, the touch type is the clicking operation.
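The classification step above can be sketched in plain Java. This is a minimal illustration, not the patent's implementation: the class name, the `TouchType` enum, and the threshold constant `TOUCH_SLOP_PX` are all assumptions standing in for the "set value" mentioned in the text.

```java
// Hedged sketch of the tap-vs-swipe decision: compute the Euclidean
// distance between the touch-down and touch-up points and compare it
// against a threshold. All names and the threshold value are illustrative.
public class TouchClassifier {
    public static final float TOUCH_SLOP_PX = 24f; // assumed "set value"

    public enum TouchType { CLICK, SLIDE }

    // Distance formula applied to the start-point and end-point coordinates.
    public static TouchType classify(float startX, float startY,
                                     float endX, float endY) {
        double dx = endX - startX;
        double dy = endY - startY;
        double distance = Math.sqrt(dx * dx + dy * dy);
        // Exceeding the set value => sliding operation; otherwise => click.
        return distance > TOUCH_SLOP_PX ? TouchType.SLIDE : TouchType.CLICK;
    }
}
```

On Android, the threshold would more likely come from `ViewConfiguration.getScaledTouchSlop()` than a hard-coded constant.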
And step 120, determining a first view component arranged in each level interface according to the touch type.
The first view component may be a registered view component whose position in the interface contains the start-point coordinates of the touch operation. In this embodiment, the display interface of the application is provided with view components having different functions, and the view components arranged in the interfaces of different levels are registered or unregistered to obtain registered view components and unregistered view components. A registered view component responds to the touch operation of the user, while an unregistered view component does not. Specifically, a view component may be registered and unregistered by using the registration and unregistration functions provided by the application, for example: a registerView function to register the view component, and an unregisterView function to unregister it.
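The registration step can be sketched as a small registry keyed by view identifier. The text names `registerView` and `unregisterView` functions; the registry structure, string keys, and `respondsToTouch` helper below are assumptions made for illustration.

```java
import java.util.LinkedHashSet;
import java.util.Set;

// Hedged sketch: only registered view components take part in touch
// dispatch; unregistered ones are ignored. The string-keyed set is a
// stand-in for whatever bookkeeping the real system uses.
public class ViewRegistry {
    private final Set<String> registered = new LinkedHashSet<>();

    public void registerView(String viewId)   { registered.add(viewId); }
    public void unregisterView(String viewId) { registered.remove(viewId); }

    // A registered component responds to the user's touch; an
    // unregistered one does not.
    public boolean respondsToTouch(String viewId) {
        return registered.contains(viewId);
    }
}
```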
In this embodiment, the user interface of the application includes at least two levels, and each level is provided with a view component with a different function.
Specifically, when the touch type is a sliding operation, the first view component arranged in each hierarchical interface may be determined as follows: acquire the registered view components in the upper-layer interface; then determine the current position of each registered view component in the interface, and determine the registered view component whose position contains the start-point coordinates of the touch operation as the first view component.
The current position of each registered view component in the interface may be determined by using a position-obtaining function, for example, the getLocationOnScreen function. When the touch operation is a sliding operation, only the first view component in the upper-layer interface whose position contains the start-point coordinates of the touch operation is acquired. That is, for a sliding operation, only the upper-layer interface needs to respond, and the lower-layer interface does not. The advantage of this is that the sliding operation of the upper-layer interface is prevented from conflicting with the click operations of the lower layer.
Specifically, when the touch type is a clicking operation, the first view components arranged in each hierarchical interface may be determined as follows: acquire the registered view components in each level of the interface; then determine the current position of each registered view component in the interface, and determine the registered view components whose positions contain the start-point coordinates of the touch operation as the first view components.
The getLocationOnScreen function is used to acquire the position of the registered view components in each hierarchical interface, and the components whose positions contain the start-point coordinates of the touch operation are determined as the first view components. The number of first view components may be one or more. For example, if the user interface comprises three levels (an upper-layer interface, a middle-layer interface, and a lower-layer interface) and both the upper-layer and middle-layer interfaces have registered view components containing the start-point coordinates of the touch operation, then there are two first view components.
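The hit-testing step above can be sketched as follows. The `Bounds` class is an assumed stand-in for the rectangle one would derive on Android from `getLocationOnScreen` plus the view's width and height; the `level` field and all names are illustrative, not from the patent.

```java
import java.util.ArrayList;
import java.util.List;

// Hedged sketch: every registered component whose on-screen bounds
// contain the touch start point becomes a "first view component".
public class HitTester {
    public static class Bounds {
        public final String viewId;
        public final int level; // which interface layer the view sits in
        public final float left, top, right, bottom;

        public Bounds(String viewId, int level,
                      float left, float top, float right, float bottom) {
            this.viewId = viewId;
            this.level = level;
            this.left = left; this.top = top;
            this.right = right; this.bottom = bottom;
        }

        public boolean contains(float x, float y) {
            return x >= left && x < right && y >= top && y < bottom;
        }
    }

    // Collects hits across all levels; for a slide, the caller would
    // first filter the list down to upper-layer components only.
    public static List<Bounds> firstViewComponents(List<Bounds> registered,
                                                   float startX, float startY) {
        List<Bounds> hits = new ArrayList<>();
        for (Bounds b : registered) {
            if (b.contains(startX, startY)) {
                hits.add(b);
            }
        }
        return hits;
    }
}
```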
And step 130, determining a second view component from the first view component, and controlling the second view component to respond to the touch operation corresponding to the touch type.
Specifically, if the touch type is a sliding operation, determining a second view component from the first view component, and controlling the second view component to respond to the touch operation corresponding to the touch type in the following manner: and determining the first view component as a second view component, and controlling the second view component to respond to the sliding operation.
Specifically, if the touch type is a click operation, determining the second view component from the first view component, and controlling the second view component to respond to the touch operation corresponding to the touch type may be: if the first view components comprise a plurality of first view components, acquiring the priority of each first view component, and determining the first view component with the highest priority as the second view component; and controlling the second view component to respond to the clicking operation.
The priority of a view component can be set at registration time. For example, assume the user interface includes three levels: an upper-layer interface, a middle-layer interface, and a lower-layer interface. If both the upper-layer and middle-layer interfaces have registered view components containing the start-point coordinates of the touch operation, there are two first view components; if the first view component in the middle-layer interface has the highest priority, it is determined as the second view component, and the second view component of the middle-layer interface is controlled to respond to the click operation, thereby executing the function that the second view component implements, such as playing a video or jumping to another interface.
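The priority-based selection above can be sketched in a few lines. The `Candidate` class and method names are assumptions; only the rule itself (highest priority wins among the first view components) comes from the text.

```java
import java.util.Comparator;
import java.util.List;
import java.util.Optional;

// Hedged sketch: among several first view components hit by a click,
// the one with the highest priority becomes the second view component.
public class ResponderSelector {
    public static class Candidate {
        public final String viewId;
        public final int priority; // assumed to be assigned at registration

        public Candidate(String viewId, int priority) {
            this.viewId = viewId;
            this.priority = priority;
        }
    }

    public static Optional<Candidate> selectSecondViewComponent(
            List<Candidate> firstComponents) {
        return firstComponents.stream()
                .max(Comparator.comparingInt((Candidate c) -> c.priority));
    }
}
```

Note the sketch does not define a tie-breaking rule for equal priorities; the patent text is silent on that case.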
According to the technical scheme of this embodiment, when a touch operation of a user is detected, the touch type of the touch operation is determined; first view components arranged in each hierarchical interface are determined according to the touch type; and a second view component is determined from the first view components and controlled to respond to the touch operation corresponding to the touch type. By determining the first view components according to the touch type and selecting from them the second view component that responds, the method solves the problem of delivering touch events through a multi-level interface, ensures that each interface layer responds to touch events in time, and improves the accuracy and reliability of interface response.
Example two
Fig. 2 is a schematic structural diagram of a response apparatus of a multi-level interface according to a second embodiment of the present invention. As shown in fig. 2, the apparatus includes: a touch type determination module 210, a first view component determination module 220, and a response module 230.
A touch type determining module 210, configured to determine a touch type of a touch operation when the touch operation of a user is detected; the touch type comprises a sliding operation and a clicking operation;
a first view component determining module 220, configured to determine a first view component arranged in each hierarchical interface according to the touch type;
a response module 230, configured to determine a second view component from the first view components, and control the second view component to respond to the touch operation corresponding to the touch type.
Optionally, the touch type determining module 210 is further configured to:
acquiring a starting point coordinate and an end point coordinate of the touch operation;
calculating the sliding distance of the touch operation according to the starting point coordinate and the end point coordinate;
and if the sliding distance exceeds a set value, the touch operation is a sliding operation; otherwise, it is a clicking operation.
Optionally, the first view component determining module 220 is further configured to:
if the touch type is a sliding operation, acquiring a registered view component in an upper-layer interface;
and determining the current position of each registered view component in the interface, and determining the registered view component falling into the starting point coordinate of the touch operation as a first view component.
Optionally, the response module 230 is further configured to:
and determining the first view component as a second view component, and controlling the second view component to respond to the sliding operation.
Optionally, the first view component determining module 220 is further configured to:
if the touch type is click operation, acquiring a registered view component in each level of interface;
and determining the current position of each registered view component in the interface, and determining the registered view component falling into the starting point coordinates of the touch operation as a first view component.
Optionally, the response module 230 is further configured to:
if the first view components comprise a plurality of first view components, acquiring the priority of each first view component, and determining the first view component with the highest priority as the second view component;
and controlling the second view component to respond to the clicking operation.
Optionally, the apparatus is further configured to register or unregister the view components arranged in the interfaces of all levels to obtain registered view components and unregistered view components, where a registered view component responds to the touch operation of the user and an unregistered view component does not respond to the touch operation of the user.
The device can execute the methods provided by all the embodiments of the invention, and has corresponding functional modules and beneficial effects for executing the methods. For details not described in detail in this embodiment, reference may be made to the methods provided in all the foregoing embodiments of the present invention.
EXAMPLE III
Fig. 3 is a schematic structural diagram of a computer device according to the third embodiment of the present invention. Fig. 3 illustrates a block diagram of a computer device 312 suitable for implementing embodiments of the present invention. The computer device 312 shown in fig. 3 is only an example and should not impose any limitation on the functionality or scope of use of embodiments of the present invention. The device 312 is a typical computing device having the response function of a multi-level interface.
As shown in FIG. 3, computer device 312 is in the form of a general purpose computing device. The components of computer device 312 may include, but are not limited to: one or more processors 316, a storage device 328, and a bus 318 that couples the various system components including the storage device 328 and the processors 316.
The computer device 312 may also communicate with one or more external devices 314 (e.g., a keyboard, a pointing device, a camera, a display 324), with one or more devices that enable a user to interact with the computer device 312, and/or with any devices (e.g., a network card or a modem) that enable the computer device 312 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 322. Also, the computer device 312 may communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) via the network adapter 320. As shown, the network adapter 320 communicates with the other modules of the computer device 312 via the bus 318. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the computer device 312, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems.
Example four
The fourth embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the response method of the multi-level interface provided in the fourth embodiment of the present invention.
Of course, the computer program stored on the computer-readable storage medium provided by the embodiments of the present invention is not limited to the method operations described above, and may also perform related operations in the response method of the multi-level interface provided by any embodiments of the present invention.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.
Claims (9)
1. A method for responding to a multi-level interface, comprising:
when a touch operation of a user is detected, determining a touch type of the touch operation, wherein the touch type comprises a sliding operation and a clicking operation;
determining a first view component arranged in each hierarchy interface according to the touch type, which comprises: if the touch type is a clicking operation, acquiring the registered view components in each level of interface; and determining a current position of each registered view component in the interface, and determining a registered view component into whose area the start-point coordinate of the touch operation falls as a first view component;
and determining a second view component from the first view components, and controlling the second view component to respond to the touch operation corresponding to the touch type.
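The hit test of claim 1 can be sketched in Python as follows. This is an illustrative sketch only: the `ViewComponent` class, its field names, and the rectangle representation are assumptions made here for demonstration, not part of the patent.

```python
# Illustrative sketch of the hit test in claim 1: gather the registered view
# components of every interface level and keep those whose current on-screen
# rectangle contains the start point of the touch operation.
# ViewComponent and its fields are hypothetical names, not from the patent.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ViewComponent:
    name: str
    x: float       # left edge of the component's current position
    y: float       # top edge
    width: float
    height: float

    def contains(self, point: Tuple[float, float]) -> bool:
        px, py = point
        return (self.x <= px <= self.x + self.width
                and self.y <= py <= self.y + self.height)

def first_view_components(levels: List[List[ViewComponent]],
                          start: Tuple[float, float]) -> List[ViewComponent]:
    """Return every registered component, from any interface level, into
    whose area the start-point coordinate of the touch falls."""
    return [c for level in levels for c in level if c.contains(start)]
```

For a touch starting at (15, 15) over a full-screen background and a 20x20 button at (10, 10), both components are returned as first view components; a later step (claim 5) selects one of them by priority.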
2. The method of claim 1, wherein determining the touch type of the touch operation comprises:
acquiring a starting point coordinate and an end point coordinate of the touch operation;
calculating the sliding distance of the touch operation according to the starting point coordinate and the end point coordinate;
and if the sliding distance exceeds a set value, determining that the touch operation is a sliding operation; otherwise, determining that the touch operation is a clicking operation.
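The classification in claim 2 can be sketched as a distance comparison. The patent only speaks of "a set value"; the threshold of 10.0 below is an assumed number for illustration.

```python
# Sketch of claim 2: a touch whose start-to-end sliding distance exceeds a
# set value is a sliding operation, otherwise a clicking operation.
# SLIDE_THRESHOLD is a hypothetical value, in the coordinate units.
import math

SLIDE_THRESHOLD = 10.0

def touch_type(start, end, threshold=SLIDE_THRESHOLD):
    distance = math.hypot(end[0] - start[0], end[1] - start[1])
    return "slide" if distance > threshold else "click"
```

A 30-unit vertical drag is classified as a slide, while a 5-unit jitter stays a click.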
3. The method of claim 2, wherein determining the first view component to be disposed in the hierarchy of interfaces based on the touch type comprises:
if the touch type is a sliding operation, acquiring a registered view component in an upper-layer interface;
and determining the current position of each registered view component in the interface, and determining a registered view component into whose area the start-point coordinate of the touch operation falls as a first view component.
4. The method according to claim 3, wherein determining a second view component from the first view components and controlling the second view component to respond to the touch operation corresponding to the touch type comprises:
and determining the first view component as a second view component, and controlling the second view component to respond to the sliding operation.
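Claims 3 and 4 can be combined into one sketch: for a sliding operation only the upper-layer interface's registered components are hit-tested, and the component containing the start point responds directly, with no priority comparison. The `(name, rect)` pair representation below is a hypothetical simplification, not from the patent.

```python
# Sketch of claims 3-4: hit-test only the upper-layer interface's registered
# components for a slide; the matching first view component is directly the
# second view component and responds to the sliding operation.
# rect is a hypothetical (x, y, width, height) tuple.
def slide_responder(upper_layer_components, start):
    px, py = start
    for name, (x, y, w, h) in upper_layer_components:
        if x <= px <= x + w and y <= py <= y + h:
            return name  # the first view component doubles as the second
    return None  # no registered component contains the start point
```

With a toolbar across the top and a canvas below it, a slide starting in the canvas area is routed to the canvas.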
5. The method according to claim 1, wherein determining a second view component from the first view components and controlling the second view component to respond to the touch operation corresponding to the touch type comprises:
if a plurality of first view components are determined, acquiring the priority of each first view component, and determining the first view component with the highest priority as the second view component;
and controlling the second view component to respond to the clicking operation.
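The priority selection of claim 5 reduces to a maximum over the candidates. Representing a component as a `(name, priority)` pair is an assumption made here for brevity.

```python
# Sketch of claim 5: when several first view components contain the touch
# point, the one with the highest priority becomes the second view component
# and responds to the clicking operation.
def second_view_component(first_components):
    """Pick the highest-priority candidate; a single candidate wins trivially."""
    return max(first_components, key=lambda component: component[1])
```

For candidates `("panel", 1)`, `("button", 5)`, `("overlay", 3)`, the button is selected.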
6. The method of any of claims 1-5, further comprising: registering or deregistering the view components arranged in the interfaces of all levels to obtain registered view components and deregistered view components, wherein a registered view component responds to the touch operation of the user and a deregistered view component does not respond to the touch operation of the user.
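The register/deregister mechanism of claim 6 can be sketched as a simple registry that gates touch dispatch. The `TouchRegistry` class and its method names are illustrative assumptions, not from the patent.

```python
# Sketch of claim 6: only registered view components take part in touch
# dispatch; deregistered components are skipped when handling a touch.
class TouchRegistry:
    def __init__(self):
        self._registered = set()

    def register(self, component_id: str) -> None:
        """Make the component eligible to respond to touch operations."""
        self._registered.add(component_id)

    def deregister(self, component_id: str) -> None:
        """Stop the component from responding to touch operations."""
        self._registered.discard(component_id)

    def responds_to_touch(self, component_id: str) -> bool:
        return component_id in self._registered
```

A component that is registered responds to touches; after deregistration it is ignored by the dispatch steps of claim 1.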
7. A response device for a multi-level interface, comprising:
the touch type determining module is used for determining the touch type of the touch operation when the touch operation of a user is detected; the touch type comprises a sliding operation and a clicking operation;
the first view component determining module is used for determining a first view component arranged in each hierarchy interface according to the touch type;
the response module is used for determining a second view component from the first view component and controlling the second view component to respond to the touch operation corresponding to the touch type;
the first view component determining module is further configured to:
if the touch type is a clicking operation, acquire the registered view components in each level of interface;
and determine the current position of each registered view component in the interface, and determine a registered view component into whose area the start-point coordinate of the touch operation falls as a first view component.
8. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of responding to a multi-level interface of any of claims 1-6 when executing the program.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out a method of responding to a multi-level interface according to any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010006608.0A CN111208929B (en) | 2020-01-03 | 2020-01-03 | Response method, device and equipment of multi-level interface and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111208929A (en) | 2020-05-29
CN111208929B (en) | 2021-11-02
Family
ID=70786588
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010006608.0A Active CN111208929B (en) | 2020-01-03 | 2020-01-03 | Response method, device and equipment of multi-level interface and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111208929B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113918067A (en) * | 2020-11-20 | 2022-01-11 | 完美世界(北京)软件科技发展有限公司 | Interface logic execution method and device, electronic equipment and medium |
CN114356194A (en) * | 2022-03-07 | 2022-04-15 | 北京搜狐新媒体信息技术有限公司 | Method and device for processing native advertisement |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103500070A (en) * | 2013-10-23 | 2014-01-08 | 北京三星通信技术研究有限公司 | Touch operation method and device and terminal unit |
CN103677518A (en) * | 2013-12-02 | 2014-03-26 | 北京像素软件科技股份有限公司 | Method and device for responding to touch messages on mobile terminal |
CN105335084A (en) * | 2014-08-08 | 2016-02-17 | 富泰华工业(深圳)有限公司 | Electronic apparatus and method for processing multilayer stacking interface operation |
CN106325668A (en) * | 2016-08-11 | 2017-01-11 | 网易(杭州)网络有限公司 | Touch event response processing method and system |
CN106873874A (en) * | 2017-01-20 | 2017-06-20 | 维沃移动通信有限公司 | A kind of application program open method and mobile terminal |
CN108803968A (en) * | 2018-06-29 | 2018-11-13 | 掌阅科技股份有限公司 | Multiview linkage method, computing device and the storage medium of user's display interface |
CN109358801A (en) * | 2018-09-27 | 2019-02-19 | 武汉华中时讯科技有限责任公司 | Detect device, method and the storage medium of multi-level view element touch event |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9703477B2 (en) * | 2013-02-19 | 2017-07-11 | Facebook, Inc. | Handling overloaded gestures |
US9213472B2 (en) * | 2013-03-12 | 2015-12-15 | Sap Se | User interface for providing supplemental information |
CN105335038B (en) * | 2014-07-30 | 2019-05-07 | 联想企业解决方案(新加坡)有限公司 | Method and system for prompting touch input operation |
CN105786395B (en) * | 2016-04-07 | 2019-04-09 | 广州华多网络科技有限公司 | A kind of live streaming method for switching between, apparatus and system based on mobile terminal |
CN107728868B (en) * | 2016-08-11 | 2021-03-09 | 阿里巴巴集团控股有限公司 | Method and device for synchronizing components in mobile page and mobile terminal |
CN108009078B (en) * | 2016-11-01 | 2021-04-27 | 腾讯科技(深圳)有限公司 | Application interface traversal method, system and test equipment |
CN107193479B (en) * | 2017-05-26 | 2018-07-10 | 网易(杭州)网络有限公司 | Information processing method, device, electronic equipment and storage medium |
CN110069182A (en) * | 2019-04-28 | 2019-07-30 | 努比亚技术有限公司 | Wallpaper control method, mobile terminal and computer readable storage medium |
CN110262749B (en) * | 2019-06-27 | 2021-05-28 | 北京思维造物信息科技股份有限公司 | Webpage operation method, device, container, equipment and medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||