CN110554892A - Information acquisition method and device - Google Patents

Information acquisition method and device

Info

Publication number
CN110554892A
CN110554892A (application CN201810550451.0A)
Authority
CN
China
Prior art keywords
view control
page
response
type
determining
Prior art date
Legal status
Pending
Application number
CN201810550451.0A
Other languages
Chinese (zh)
Inventor
戴旭
Current Assignee
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Original Assignee
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Jingdong Century Trading Co Ltd, Beijing Jingdong Shangke Information Technology Co Ltd filed Critical Beijing Jingdong Century Trading Co Ltd
Priority to CN201810550451.0A priority Critical patent/CN110554892A/en
Publication of CN110554892A publication Critical patent/CN110554892A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application discloses an information acquisition method and device. One embodiment of the method comprises: in response to detecting a trigger operation of a user, determining the view control corresponding to the trigger operation; determining the page corresponding to the view control; determining, based on a preset configuration table, whether the type of the buried point corresponding to the view control is a first preset type, wherein the preset configuration table comprises a plurality of view control identifiers and the buried point types corresponding to the view control identifiers; and in response to determining that the type of the buried point corresponding to the view control is the first preset type, acquiring page data information corresponding to the view control in the page based on a data acquisition mode corresponding to the first preset type. This embodiment reduces the workload of setting up the view controls corresponding to buried points and improves the efficiency of setting buried points.

Description

Information acquisition method and device
Technical Field
The embodiments of the application relate to the field of computer technology, in particular to the field of internet technology, and more particularly to an information acquisition method and device.
Background
In the era of the mobile internet, the mobile client is the most important type of user terminal. These user terminals typically need to interact with a remote server to provide functions and data to the user. In order to provide better service, it is generally necessary to collect statistics on how frequently users access data and on which data they access.
At present, a user's access to network data can be counted by means of buried points. One existing buried-point scheme is to generate a configuration file for the buried points in advance and then send the configuration file to the user terminal. The configuration file sets the identifier corresponding to each buried point, the data fields that the buried point needs to report, and so on. This approach requires manually binding the reported data fields to the view control of the page corresponding to each buried point.
Disclosure of Invention
The embodiment of the application provides an information acquisition method and device.
In a first aspect, an embodiment of the present application provides an information acquisition method, the method including: in response to detecting a trigger operation of a user, determining a view control corresponding to the trigger operation; determining a page corresponding to the view control; determining, based on a preset configuration table, whether the type of a buried point corresponding to the view control is a first preset type, wherein the preset configuration table includes a plurality of view control identifiers and buried point types corresponding to the view control identifiers; and in response to determining that the type of the buried point corresponding to the view control is the first preset type, acquiring page data information corresponding to the view control in the page based on a data acquisition mode corresponding to the first preset type.
In some embodiments, determining the page corresponding to the view control includes: determining the page corresponding to the view control based on a response chain, wherein the response chain is a multi-level responder chain formed in response to the trigger operation, the view control is the first-level responder in the response chain, the parent view of the view control is the second-level responder, and the response chain includes at least the first-level responder and the second-level responder.
In some embodiments, determining the page corresponding to the view control further includes: in response to determining that the page corresponding to the view control cannot be determined based on the response chain, determining the page corresponding to the view control triggered by the trigger operation based on the top-of-stack element of the page navigation controller.
In some embodiments, the type corresponding to each responder in the response chain is detected step by step in the direction from the first-level responder toward the second-level responder, until a responder of the page type is detected.
In some embodiments, the method further includes: in response to determining that the type of the buried point corresponding to the view control is not the first preset type, acquiring page data information corresponding to the view control based on a preset keyword corresponding to the view control identifier of the view control in the preset configuration table.
In a second aspect, an embodiment of the present application provides an information acquisition apparatus, including: a view control determining unit configured to determine, in response to detecting a trigger operation of a user, a view control corresponding to the trigger operation; a page determining unit configured to determine a page corresponding to the view control; a type determining unit configured to determine, based on a preset configuration table, whether the type of a buried point corresponding to the view control is a first preset type, wherein the preset configuration table includes a plurality of view control identifiers and buried point types corresponding to the view control identifiers; and an obtaining unit configured to obtain, in response to determining that the type of the buried point corresponding to the view control is the first preset type, page data information corresponding to the view control in the page based on a data acquisition mode corresponding to the first preset type.
In some embodiments, the page determining unit is further configured to: determine the page corresponding to the view control based on a response chain, wherein the response chain is a multi-level responder chain formed in response to the trigger operation, the view control is the first-level responder in the response chain, the parent view of the view control is the second-level responder, and the response chain includes at least the first-level responder and the second-level responder.
In some embodiments, the page determining unit is further configured to: in response to determining that the page corresponding to the view control cannot be determined based on the response chain, determine the page corresponding to the view control triggered by the trigger operation based on the top-of-stack element of the page navigation controller.
In some embodiments, the page determining unit is further configured to: detect the type corresponding to each responder in the response chain step by step in the direction from the first-level responder toward the second-level responder, until a responder of the page type is detected.
In some embodiments, the obtaining unit is further configured to: in response to determining that the type of the buried point corresponding to the view control is not the first preset type, acquire page data information corresponding to the view control based on a preset keyword corresponding to the view control identifier of the view control in the preset configuration table.
In a third aspect, an embodiment of the present application provides a terminal device, where the terminal device includes: one or more processors; a storage device, on which one or more programs are stored, which, when executed by the one or more processors, cause the one or more processors to implement the method as described in any implementation manner of the first aspect.
In a fourth aspect, the present application provides a computer-readable medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the method as described in any implementation manner of the first aspect.
According to the information acquisition method and device provided by the embodiments of the application, in response to detecting a user's trigger operation, the view control corresponding to the trigger operation is determined; the page corresponding to the view control is then determined; whether the type of the buried point corresponding to the view control is the first preset type is determined based on the preset configuration table; and finally, in response to determining that the type is the first preset type, the page data information corresponding to the view control in the page is acquired based on the data acquisition mode corresponding to the first preset type.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of an information acquisition method according to the present application;
FIG. 3 is a schematic diagram of an application scenario of an information acquisition method according to the present application;
FIG. 4 is a flow diagram of yet another embodiment of an information acquisition method according to the present application;
FIG. 5 is a schematic structural diagram of one embodiment of an information acquisition apparatus according to the present application;
FIG. 6 is a schematic structural diagram of a computer system suitable for implementing a terminal device according to an embodiment of the present application.
Detailed Description
The present application will be described in further detail below with reference to the drawings and embodiments. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and are not restrictive of it. It should also be noted that, for convenience of description, only the portions related to the relevant invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 illustrates an exemplary system architecture 100 to which embodiments of the information acquisition method or information acquisition apparatus of the present application may be applied.
As shown in Fig. 1, the system architecture 100 may include terminal devices 101, 102 and 103, a network 104, and a server 105. The network 104 serves as a medium providing communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired links, wireless communication links, or fiber optic cables.
The user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or send messages or the like. The terminal devices 101, 102, 103 may have various client applications installed thereon, such as a web browser application, a shopping-type application, a search-type application, and the like.
The terminal devices 101, 102, and 103 may be hardware or software. When they are hardware, they may be various electronic devices having a display screen and supporting web browsing, including but not limited to smartphones, tablet computers, laptop portable computers, desktop computers, and the like. When they are software, they may be installed in the electronic devices listed above and may be implemented as multiple pieces of software or software modules (for example, software or software modules used to provide distributed services) or as a single piece of software or software module. No specific limitation is imposed here.
The terminal devices 101, 102, and 103 may obtain the data corresponding to a view control operated by the user in a page, and send the user's operation information on the view control together with the obtained data to the server 105.
The server 105 may provide various services. For example, it may be a background server that receives, from the terminal devices 101, 102, and 103, the user's operations on view controls and the data obtained for them, and analyzes and processes the received operations and data.
It should be noted that the information acquisition method provided in the embodiment of the present application is generally executed by the terminal devices 101, 102, and 103, and accordingly, the information acquisition apparatus is generally disposed in the terminal devices 101, 102, and 103.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of an information acquisition method according to the present application is shown. The information acquisition method comprises the following steps:
Step 201, in response to detecting a trigger operation of a user, determining a view control corresponding to the trigger operation.
In this embodiment, the execution body of the information acquisition method (for example, the terminal device shown in Fig. 1) may detect the user's trigger operation in real time. The trigger operation may include, for example, a click, a touch, or a cursor hover operation by the user.
After detecting the user's trigger operation, the execution body may determine the view control triggered by the trigger operation, that is, the view control corresponding to the trigger operation.
In general, an operating system and various applications running on it are installed on the execution body. The operating system may be, for example, Android or iOS. The following describes the process of determining the view control corresponding to a trigger operation, taking iOS as an example. Any application usually includes a main window (KeyWindow) and multiple pages (a page can usually be treated as a view controller), and a page contains multiple view controls (Views). After the application is started, when the user performs a trigger operation (e.g., a touch operation) on the display screen of the terminal device, the terminal device generates a trigger event (e.g., a touch event) and distributes it. The trigger event is usually first distributed to the main window (KeyWindow) of the application. The main window judges whether the touch point lies within the current view; if so, all view controls of the main window (its sub-views) are traversed to find the view control corresponding to the trigger operation.
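As an illustration only, the following Swift sketch mirrors the traversal just described. It is a minimal, hypothetical example rather than the implementation of the present application; in practice UIKit's own hitTest(_:with:) performs this search, and the helper name viewControl(at:in:) is invented for the sketch.

```swift
import UIKit

// Minimal sketch: locate the view control (UIView) under a touch point by
// traversing the key window's sub-views, as described above. UIKit's
// hitTest(_:with:) already does this internally; the manual walk is shown
// only to make the traversal explicit.
func viewControl(at point: CGPoint, in root: UIView) -> UIView? {
    // Skip views that cannot receive the touch at all.
    guard root.isUserInteractionEnabled, !root.isHidden, root.alpha > 0.01,
          root.point(inside: point, with: nil) else {
        return nil
    }
    // Check sub-views front to back; the deepest hit wins.
    for subview in root.subviews.reversed() {
        let converted = root.convert(point, to: subview)
        if let hit = viewControl(at: converted, in: subview) {
            return hit
        }
    }
    return root
}

// Usage (assuming `location` is expressed in the key window's coordinates):
// let touchedControl = viewControl(at: location, in: keyWindow)
```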
It should be noted that the process of determining the view control corresponding to a trigger event is a well-known technology that has been widely studied and applied, and is not described in detail here.
Step 202, determining a page corresponding to the view control.
Any application can adopt the Model-View-Controller (MVC) architectural pattern. In the MVC pattern, the Model encapsulates the application's data and defines the logic and operations for manipulating and processing that data. For example, the model (here a data model) may be a list representing commodity data. A View is an object the user can see in the application. A view knows how to draw itself and can respond to the user's actions; its primary purpose is to display data from the application's model and to make that data editable. Views are typically bound to view controls. A view controller (generally, a page of the application) receives input from the user, calls the corresponding model, and constructs the view to be displayed, thereby completing the interaction and satisfying the user's request. The view controller does not store or process data itself: it simply receives a request, decides which model should handle it, and then decides which views should be composed to display the returned data. Operations in which the user creates or modifies data in a view are communicated through the view controller and ultimately create or update the model. When a model changes (for example, new data is received over a network connection), the model notifies the view controller, which updates the corresponding view accordingly.
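For illustration, a minimal Swift sketch of this MVC division is given below, assuming a commodity-list page like the example above. The type names are invented for the sketch and are not part of the present application.

```swift
import UIKit

struct Commodity {                       // Model: encapsulates application data
    let name: String
    let price: Double
}

final class CommodityListPage: UIViewController {   // Controller: the "page"
    private var commodities: [Commodity] = []       // model data held for the page
    private let listView = UITableView()             // View: what the user sees

    // Called when the model changes, e.g. new data arrives over the network;
    // the controller updates the corresponding view.
    func modelDidChange(_ newData: [Commodity]) {
        commodities = newData
        listView.reloadData()
    }
}
```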
In this embodiment, after obtaining the view control triggered by the trigger operation in step 201, the execution body (for example, the terminal device shown in Fig. 1) may determine the page corresponding to that view control by various analysis means.
In some optional implementations of this embodiment, step 202 may further include: determining the page corresponding to the view control based on a response chain, wherein the response chain is a multi-level responder chain formed in response to the trigger operation, the view control is the first-level responder in the response chain, the parent view of the view control is the second-level responder, and the response chain includes at least the first-level responder and the second-level responder.
After the user's trigger operation generates a trigger event (for example, a touch event produced by the user touching the screen of the terminal device), the execution body finds a suitable view control to handle the event through a series of event-delivery steps, and then calls the corresponding method to perform the specific event processing. That suitable view control can be regarded as the first-level responder (by default a first-level responder does not handle the event itself, but only passes it on). The first-level responder (i.e., the view control) passes the event to its parent view, which determines whether it can handle the event; this parent view can be regarded as the second-level responder. If the second-level responder still cannot handle the event, the event is passed further to its own parent view, which can be regarded as the third-level responder, and so on, until the event reaches the top-level view control. If the top-level view control still cannot handle the event, it passes the event to the view controller (i.e., the page containing the view controls). If the view controller cannot handle the event either, the event is passed to the application window (UIWindow), and if the window still cannot handle it, to the application object (UIApplication). The view controller, the application window, and the application object are also responders. The first-level responder, the second-level responder, the third-level responder, ..., the view controller, the application window, the application object, and so on form a responder chain in response to the trigger operation, and this responder chain can serve as the response chain. That is, the response chain is a multi-level responder chain formed in response to the trigger operation.
In these optional implementations, the execution body may check the type of each responder in the response chain step by step, in the direction from the first-level responder toward the second-level responder, until a responder of the page type is detected. The execution body may then take the responder of the page type as the page corresponding to the view control triggered by the trigger operation.
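A minimal Swift sketch of this step-by-step check is shown below, assuming UIKit's responder chain (UIResponder.next); the function name is invented for illustration and the sketch is not the implementation of the present application.

```swift
import UIKit

// Walk the response chain from the triggered view control toward the
// higher-level responders until a responder of the page type
// (UIViewController) is found.
func page(for control: UIView) -> UIViewController? {
    var responder: UIResponder? = control          // first-level responder
    while let current = responder {
        if let pageController = current as? UIViewController {
            return pageController                  // responder of the page type
        }
        responder = current.next                   // parent view, view controller, window, ...
    }
    return nil                                     // no page found on this chain
}
```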
Determining the page corresponding to the view control in step 202 may further include: in response to determining that the page corresponding to the view control cannot be determined based on the response chain, determining the page corresponding to the view control triggered by the trigger operation based on the top-of-stack element of the page navigation controller.
In these optional implementations, when the page corresponding to the view control cannot be determined from the response chain, the execution body may determine the page corresponding to the view control triggered by the trigger operation based on the top-of-stack element of the page navigation controller.
In general, an application may include multiple pages and may use a navigation controller to manage them. The navigation controller may manage the pages with a stack; the page currently presented to the user is the top-of-stack element of the navigation controller. The execution body may therefore determine the page corresponding to the view control triggered by the user's trigger operation by obtaining the top-of-stack element of the navigation controller.
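As a hedged illustration of this fallback, the Swift sketch below obtains the top-of-stack element of a navigation controller; it assumes the application's root view controller is a UINavigationController, which is an assumption of the sketch rather than a statement about the present application.

```swift
import UIKit

// Fallback: when no page is found on the response chain, use the
// top-of-stack element of the page navigation controller.
func topPage(of window: UIWindow) -> UIViewController? {
    guard let navigation = window.rootViewController as? UINavigationController else {
        return nil   // the app is not managed by a navigation controller
    }
    return navigation.topViewController   // the page currently shown to the user
}
```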
Step 203, determining whether the type of the buried point corresponding to the view control is a first preset type based on a preset configuration table.
After determining the page corresponding to the view control in step 202, the execution body may determine whether the type of the buried point corresponding to the view control is the first preset type based on the preset configuration table.
In this embodiment, the execution body of the information acquisition method may store a preset configuration table in advance. The preset configuration table may be preset by a developer, stored on the server, and then sent by the server to the execution body. The preset configuration table contains the view control identifier corresponding to each view control and the buried point type corresponding to each view control identifier. Here, the view control identifier corresponding to any view control may be generated by an encoding rule shared by the server and the terminal device. For example, a developer may generate the view control identifier corresponding to a view control on the server side using a preset encoding rule; on the terminal device side, when the view control is triggered by the user's trigger operation, the same preset encoding rule as on the server side is used to generate the view control identifier for that view control.
The type of the buried point corresponding to a view control may be, for example, a type pre-specified by a developer for that buried point. For example, the developer may preset an identifier for each buried point and regard buried points with a certain specific identifier as belonging to the first preset type. The identifier can be set according to the specific application scenario. Multiple buried points with the same identifier can be bound in advance to the same data-acquisition function. When the user performs a trigger operation on a view control corresponding to that identifier, the execution body may obtain the page data corresponding to the view control through that data-acquisition function.
The execution body can match the view control identifier generated by the terminal device against the preset configuration table. When the execution body finds that the view control identifier of the view control triggered by the user's trigger operation is the same as a view control identifier in the preset configuration table, it may further determine, according to the buried point type preset for that view control identifier in the table, whether the type of the buried point corresponding to the triggered view control is the first preset type.
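The matching described above can be sketched in Swift as follows. The identifier format (page class name, view control class name and accessibility identifier) is only an assumed stand-in for the preset encoding rule, and the type and property names are invented for the sketch; only the table-lookup logic follows the description.

```swift
import UIKit

enum BuriedPointType: String, Decodable {
    case firstPreset = "first_preset"   // handled by the shared data acquisition mode
    case custom      = "custom"         // handled via a preset keyword (see flow 400)
}

struct BuriedPointConfig: Decodable {
    let types: [String: BuriedPointType]   // view control identifier -> buried point type
    let keywords: [String: String]         // view control identifier -> preset keyword
}

func buriedPointType(for control: UIView, on page: UIViewController,
                     config: BuriedPointConfig) -> BuriedPointType? {
    // Hypothetical encoding rule; it must match the rule used on the server side.
    let identifier = "\(type(of: page)).\(type(of: control)).\(control.accessibilityIdentifier ?? "")"
    return config.types[identifier]
}
```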
Step 204, in response to determining that the type of the buried point corresponding to the view control is the first preset type, acquiring page data information corresponding to the view control in the page based on a data acquisition mode corresponding to the first preset type.
In this embodiment, in response to determining that the type of the buried point corresponding to the view control is the first preset type, the execution body may acquire, based on the data acquisition mode corresponding to the first preset type, the page data information corresponding to the view control triggered by the user's operation in the page.
Typically, the page data is sent by the server to the execution body. Within the execution body, the page view controller (the page, for short) manages the page data types and the view controls in a unified manner.
When the type of the buried point corresponding to any view control is the first preset type, the execution body may acquire the page data information corresponding to that view control in the page according to the data acquisition mode corresponding to the first preset type.
The data acquisition mode corresponding to the first preset type may be, for example, a call to a function.
Further optionally, the data acquisition mode may be encapsulated in the code corresponding to the page (view controller).
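One way such an encapsulated data acquisition mode could look is sketched below: a protocol implemented by the page (view controller) with a single function shared by every first-preset-type buried point. The protocol name and signature are assumptions made for the sketch, not the method of the present application.

```swift
import UIKit

protocol PageDataProviding {
    /// Returns the page data information associated with a given view control.
    func pageData(for control: UIView) -> [String: Any]
}

// One shared call path for every first-preset-type buried point; no data
// fields need to be bound manually into individual view controls.
func acquireFirstPresetTypeData(for control: UIView,
                                on page: UIViewController) -> [String: Any]? {
    guard let provider = page as? PageDataProviding else { return nil }
    return provider.pageData(for: control)
}
```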
In this way, when the buried points corresponding to multiple view controls in a page are all of the first preset type, the page data corresponding to each such view control can be acquired through the data acquisition mode corresponding to the first preset type, and the data fields to be returned by each view control's buried point do not have to be bound into the view control manually. The operation of setting buried points is therefore simplified, and the efficiency of setting buried points is improved.
With continued reference to Fig. 3, Fig. 3 is a schematic diagram of an application scenario of the information acquisition method according to this embodiment. In the application scenario of Fig. 3, a user 302 first performs a trigger operation 303 on the screen of the terminal device 301. After detecting the user's trigger operation, the terminal device 301 first determines the view control 304 corresponding to the trigger operation, and then determines the page 305 corresponding to the view control. Next, it determines, based on the preset configuration table 306, whether the type of the buried point corresponding to the view control is the first preset type, and then, in response to determining that it is, acquires the page data corresponding to the view control in the page based on the data acquisition mode 307 corresponding to the first preset type. Finally, the page data 308 corresponding to the view control is sent to the server 309.
In the method provided by this embodiment of the application, for a view control whose corresponding buried point is of the first preset type, the page data information corresponding to the view control in the page is acquired through the data acquisition mode corresponding to the first preset type. This reduces the workload of setting up the view controls corresponding to buried points and improves the efficiency of setting buried points.
With further reference to fig. 4, a flow 400 of yet another embodiment of an information acquisition method is shown. The process 400 of the information obtaining method includes the following steps:
Step 401, in response to detecting a trigger operation of a user, determining a view control corresponding to the trigger operation.
Step 401 is the same as step 201 shown in fig. 2, and is not described herein again.
Step 402, determining a page corresponding to the view control.
step 402 is the same as step 202 shown in fig. 2, and is not described herein.
Step 403, determining whether the type of the buried point corresponding to the view control is a first preset type based on a preset configuration table.
Step 403 is the same as step 203 shown in fig. 2, and is not described herein.
Step 404, in response to determining that the type of the buried point corresponding to the view control is the first preset type, acquiring page data information corresponding to the view control in the page based on a data acquisition mode corresponding to the first preset type.
Step 404 is the same as step 204 shown in FIG. 2 and will not be described here.
Step 405, in response to determining that the type of the buried point corresponding to the view control is not the first preset type, acquiring page data information corresponding to the view control based on a preset keyword corresponding to the view control identifier of the view control in a preset configuration table.
In this embodiment, in response to determining that the type of the buried point corresponding to the view control is not the first preset type (for example, the type of the buried point is a user-defined type), the execution body may obtain the page data information corresponding to the view control based on the keyword preset in the preset configuration table for the view control identifier of that view control.
For a view control whose buried point is not of the first preset type, the page data information corresponding to the view control can be acquired through Key-Value Coding (KVC). Specifically, the relevant developer may set a keyword for the data field to be returned by the view control: the keyword is set as a property of the view control, and the data field to be returned is set as the value of that property. In the preset configuration table, the developer may also preset the view control identifier and the keyword corresponding to that identifier. When the user triggers any view control, the terminal device may generate the view control identifier of that control according to the preset encoding rule and match it against the preset configuration table. When a matching view control identifier is found in the table, the keyword corresponding to that identifier is determined, and the value-taking method of KVC is then used to obtain the value of the keyword. It should be noted that key-value coding and the KVC value-taking method are well-known technologies that have been widely studied and applied, and are not described in detail here.
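A minimal Swift sketch of this value-taking step is given below; the function name is invented for illustration, and it assumes the keyword names an @objc-visible property of the view control (otherwise Foundation's KVC raises an exception).

```swift
import UIKit

// For a buried point that is not of the first preset type, read the data
// field attached to the view control through key-value coding, using the
// preset keyword from the configuration table as the key.
func acquireCustomTypeData(for control: UIView, keyword: String) -> Any? {
    return control.value(forKey: keyword)   // KVC value-taking method
}
```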
As can be seen from Fig. 4, compared with the embodiment corresponding to Fig. 2, the flow 400 of the information acquisition method in this embodiment highlights the step of acquiring, in response to determining that the type of the buried point corresponding to the view control is not the first preset type, the page data information corresponding to the view control based on a keyword preset in the configuration table. The scheme described in this embodiment can therefore introduce more types of buried points, so that the data fields to be reported by a buried point can be added or deleted according to the user's needs.
With further reference to Fig. 5, as an implementation of the method shown in the above figures, the present application provides an embodiment of an information acquisition apparatus, which corresponds to the method embodiment shown in Fig. 2 and can be applied to various electronic devices.
As shown in Fig. 5, the information acquisition apparatus 500 of this embodiment includes: a view control determining unit 501, a page determining unit 502, a type determining unit 503, and an obtaining unit 504. The view control determining unit 501 is configured to determine, in response to detecting a trigger operation of a user, the view control corresponding to the trigger operation; the page determining unit 502 is configured to determine the page corresponding to the view control; the type determining unit 503 is configured to determine, based on a preset configuration table, whether the type of the buried point corresponding to the view control is a first preset type, where the preset configuration table includes a plurality of view control identifiers and the buried point types corresponding to the view control identifiers; and the obtaining unit 504 is configured to obtain, in response to determining that the type of the buried point corresponding to the view control is the first preset type, the page data information corresponding to the view control in the page based on the data acquisition mode corresponding to the first preset type.
In this embodiment, specific processes of the view control determining unit 501, the page determining unit 502, the type determining unit 503, and the obtaining unit 504 of the information obtaining apparatus 500 and technical effects brought by the specific processes may refer to related descriptions of step 201, step 202, step 203, and step 204 in the corresponding embodiment of fig. 2, and are not repeated herein.
In some optional implementations of this embodiment, the page determining unit 502 is further configured to: determining a page corresponding to the view control based on the response chain; the response chain is a multi-level responder chain formed in response to the trigger operation, the view control in the response chain is a first-level responder, the parent view of the view control is a second-level responder, and the response chain at least comprises the first-level responder and the second-level responder.
In some optional implementations of this embodiment, the page determining unit 502 is further configured to: and in response to determining that the page corresponding to the view control is not determined based on the response chain, determining the page corresponding to the view control triggered by the trigger operation based on the top stack element of the page navigation controller.
In some optional implementations of this embodiment, the page determining unit 502 is further configured to: detect the type corresponding to each responder in the response chain step by step in the direction from the first-level responder toward the second-level responder, until a responder of the page type is detected.
In some optional implementations of this embodiment, the obtaining unit 504 is further configured to: in response to determining that the type of the buried point corresponding to the view control is not the first preset type, obtain page data information corresponding to the view control based on a preset keyword corresponding to the view control identifier of the view control in the preset configuration table.
Referring now to FIG. 6, shown is a block diagram of a computer system 600 suitable for use in implementing a terminal device of an embodiment of the present application. The terminal device shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in Fig. 6, the computer system 600 includes a Central Processing Unit (CPU) 601, which can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage section 608 into a Random Access Memory (RAM) 603. The RAM 603 also stores various programs and data necessary for the operation of the system 600. The CPU 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An Input/Output (I/O) interface 605 is also connected to the bus 604.
The following components are connected to the I/O interface 605: an input portion 606 including a keyboard and the like; an output portion 607 including a Liquid Crystal Display (LCD), a speaker, and the like; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN (Local Area Network) card, a modem, or the like. The communication section 609 performs communication processing via a network such as the Internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a semiconductor memory, is mounted on the drive 610 as necessary, so that the computer program read from it is installed into the storage section 608 as needed.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609, and/or installed from the removable medium 611. The computer program performs the above-described functions defined in the method of the present application when executed by a Central Processing Unit (CPU) 601. It should be noted that the computer readable medium described herein can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, or the like, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, and may be described as: a processor including a view control determining unit, a page determining unit, a type determining unit, and an obtaining unit. The names of these units do not, in some cases, constitute a limitation on the units themselves; for example, the view control determining unit may also be described as "a unit that determines the view control corresponding to a trigger operation in response to detecting the trigger operation of the user".
As another aspect, the present application also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments, or may exist separately without being assembled into the apparatus. The computer-readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: in response to detecting the trigger operation of a user, determine the view control corresponding to the trigger operation; determine the page corresponding to the view control; determine, based on a preset configuration table, whether the type of the buried point corresponding to the view control is a first preset type, wherein the preset configuration table comprises a plurality of view control identifiers and buried point types corresponding to the view control identifiers; and in response to determining that the type of the buried point corresponding to the view control is the first preset type, acquire page data information corresponding to the view control in the page based on a data acquisition mode corresponding to the first preset type.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (12)

1. An information acquisition method, comprising:
In response to the detection of the trigger operation of a user, determining a view control corresponding to the trigger operation;
Determining a page corresponding to the view control;
Determining whether the type of a buried point corresponding to the view control is a first preset type or not based on a preset configuration table, wherein the preset configuration table comprises a plurality of view control identifications and buried point types corresponding to the view control identifications;
And in response to determining that the type of the buried point corresponding to the view control is the first preset type, acquiring page data information corresponding to the view control in the page based on a data acquisition mode corresponding to the first preset type.
2. The method of claim 1, wherein determining the page corresponding to the view control comprises:
Determining a page corresponding to the view control based on the response chain; wherein the response chain is a multi-level responder chain formed in response to the trigger operation, the view control in the response chain is a first-level responder, the parent view of the view control is a second-level responder, and the response chain at least comprises the first-level responder and the second-level responder.
3. The method of claim 2, wherein determining the page corresponding to the view control further comprises:
And in response to determining that the page corresponding to the view control is not determined based on the response chain, determining the page corresponding to the view control triggered by the trigger operation based on a stack top element of a page navigation controller.
4. The method of claim 2, wherein determining the page corresponding to the view control based on the response chain comprises:
And detecting the type corresponding to each responder in the response chain step by step along the direction from the first-level responder to the second-level responder, until a responder corresponding to the page type is detected.
5. The method of claim 1, wherein the method further comprises:
And in response to determining that the type of the buried point corresponding to the view control is not the first preset type, acquiring page data information corresponding to the view control based on a preset keyword corresponding to the view control identification of the view control in the preset configuration table.
6. An information acquisition apparatus comprising:
A view control determining unit configured to determine, in response to detecting a trigger operation of a user, a view control corresponding to the trigger operation;
A page determining unit configured to determine a page corresponding to the view control;
A type determining unit configured to determine, based on a preset configuration table, whether the type of a buried point corresponding to the view control is a first preset type, wherein the preset configuration table comprises a plurality of view control identifications and buried point types corresponding to the view control identifications;
An obtaining unit configured to obtain, in response to determining that the type of the buried point corresponding to the view control is the first preset type, page data information corresponding to the view control in the page based on a data acquisition mode corresponding to the first preset type.
7. The apparatus of claim 6, wherein the page determining unit is further configured to:
Determining a page corresponding to the view control based on the response chain; wherein the response chain is a multi-level responder chain formed in response to the trigger operation, the view control in the response chain is a first-level responder, the parent view of the view control is a second-level responder, and the response chain at least comprises the first-level responder and the second-level responder.
8. The apparatus of claim 7, wherein the page determining unit is further configured to:
And in response to determining that the page corresponding to the view control is not determined based on the response chain, determining the page corresponding to the view control triggered by the trigger operation based on a stack top element of a page navigation controller.
9. The apparatus of claim 7, wherein the page determining unit is further configured to:
And detecting the type corresponding to each responder in the response chain step by step along the direction from the first-level responder to the second-level responder, until a responder corresponding to the page type is detected.
10. The apparatus of claim 6, wherein the obtaining unit is further configured to:
And in response to determining that the type of the buried point corresponding to the view control is not the first preset type, acquiring page data information corresponding to the view control based on a preset keyword corresponding to the view control identification of the view control in the preset configuration table.
11. A terminal device, comprising:
One or more processors;
A storage device having one or more programs stored thereon,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of claims 1-5.
12. A computer-readable medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-5.
CN201810550451.0A 2018-05-31 2018-05-31 Information acquisition method and device Pending CN110554892A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810550451.0A CN110554892A (en) 2018-05-31 2018-05-31 Information acquisition method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810550451.0A CN110554892A (en) 2018-05-31 2018-05-31 Information acquisition method and device

Publications (1)

Publication Number Publication Date
CN110554892A true CN110554892A (en) 2019-12-10

Family

ID=68733707

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810550451.0A Pending CN110554892A (en) 2018-05-31 2018-05-31 Information acquisition method and device

Country Status (1)

Country Link
CN (1) CN110554892A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017084508A1 (en) * 2015-11-17 2017-05-26 阿里巴巴集团控股有限公司 Method and device for automatically burying points
CN106844217A (en) * 2017-01-26 2017-06-13 网易(杭州)网络有限公司 Control to applying bury method and device, readable storage medium storing program for executing a little
CN107506291A (en) * 2017-06-30 2017-12-22 杭州大搜车汽车服务有限公司 A kind of analysis method and device based on data acquisition
CN107818163A (en) * 2017-11-01 2018-03-20 平安科技(深圳)有限公司 Page display method, device, computer equipment and storage medium
CN107861655A (en) * 2017-11-01 2018-03-30 平安科技(深圳)有限公司 Control matching process, device, computer equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
HOGGENWANG: "响应链总结梳理与应用" ("Summary and application of the responder chain"), pages 1-3, Retrieved from the Internet <URL:https://www.jianshu.com/p/7add2353b4aa> *
IBINGEWIN: "响应链详解" ("Detailed explanation of the responder chain"), pages 1-4, Retrieved from the Internet <URL:https://www.jianshu.com/p/e541244a8e9f> *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113448832A (en) * 2020-06-18 2021-09-28 北京新氧科技有限公司 Control exposure detection method and application program operation monitoring system
CN113448832B (en) * 2020-06-18 2024-03-12 北京新氧科技有限公司 Control exposure detection method and application program operation monitoring system
CN113835697A (en) * 2020-06-23 2021-12-24 北京字节跳动网络技术有限公司 Event response method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination