CN108829473B - Event response method, device and storage medium

Info

Publication number: CN108829473B
Application number: CN201810520265.2A
Authority: CN (China)
Prior art keywords: control, user interface, view control, drawable object, target view
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN108829473A (publication of the application)
Inventors: Chang Qun (常群), Long Hai (龙海)
Assignee (current and original): Beijing Xiaomi Mobile Software Co Ltd
Filing and priority date: 2018-05-28
Publication of application CN108829473A: 2018-11-16
Publication of grant CN108829473B: 2022-03-11

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 - Arrangements for executing specific programs
    • G06F 9/451 - Execution arrangements for user interfaces

Abstract

The disclosure relates to an event response method, device, and storage medium, and belongs to the technical field of terminals. The method comprises the following steps: acquiring a floating layer of a target view control in a target user interface; adding a drawable object to the floating layer of the target view control; receiving an operation event corresponding to the drawable object; and executing the operation corresponding to the drawable object in response to the operation event. In this method, the floating layer of the target view control in the target user interface is acquired and a drawable object is added to it; when an operation event corresponding to the drawable object is later received, the corresponding operation is executed in response. Because the drawable object is not a UI control, the structure of the control tree of the target user interface does not need to be adjusted, which avoids introducing bugs.

Description

Event response method, device and storage medium
Technical Field
The embodiment of the disclosure relates to the technical field of terminals, and in particular, to an event response method, an event response device and a storage medium.
Background
In the Android system, a UI (User Interface) control ("control" for short) can receive operation events. For example, a button control can receive a click operation event triggered by the user. When a control receives an operation event, it executes the corresponding operation to respond to that event.
The picture-in-picture function of the Android system means that while the user interface of a first application is displayed, the user interface of a second application is simultaneously superimposed on it as a floating window, so as to achieve multitasking. For example, with the picture-in-picture function, a user can chat with a friend in an instant messaging application while watching a video in a video application.
To implement the picture-in-picture function, a button is usually added to the user interface of the first application and used to trigger display of the user interface of the second application. In the related art, this button is implemented as a UI control, and a node corresponding to the button control is inserted into the control tree of the first application's user interface.
This approach can affect the structure of the control tree and the positions of its original nodes, and easily introduces bugs.
Disclosure of Invention
The embodiment of the disclosure provides an event response method, an event response device and a storage medium. The technical scheme is as follows:
according to a first aspect of embodiments of the present disclosure, there is provided an event response method, the method including:
acquiring a floating layer of a target view control in a target user interface;
adding a drawable object in the floating layer of the target view control;
receiving an operation event corresponding to the drawable object;
and executing the operation corresponding to the drawable object to respond to the operation event.
Optionally, the adding a drawable object in the floating layer of the target view control includes:
creating the drawable object;
setting attribute information of the drawable object, wherein the attribute information comprises a position and/or a size;
and adding the drawable object in the floating layer of the target view control according to the attribute information.
Optionally, the receiving an operation event corresponding to the drawable object includes:
when the target view control receives an operation event, acquiring the position information of the operation event;
detecting whether the trigger position of the operation event is located in the display area of the drawable object according to the position information of the operation event;
and if the trigger position of the operation event is located in the display area of the drawable object, determining that the operation event corresponding to the drawable object is received.
Optionally, the executing an operation corresponding to the drawable object includes:
and overlaying and displaying a floating window on the target user interface, wherein the floating window comprises display content corresponding to the drawable object.
Optionally, the obtaining a floating layer of a target view control in a target user interface includes:
acquiring a control tree of the target user interface, wherein the control tree comprises controls in the target user interface;
detecting whether the control tree contains the target view control or not;
and if the control tree contains the target view control, acquiring a floating layer of the target view control.
According to a second aspect of embodiments of the present disclosure, there is provided an event response apparatus, the apparatus comprising:
a floating layer acquisition module configured to acquire a floating layer of a target view control in a target user interface;
an object adding module configured to add a drawable object in a floating layer of the target view control;
an event receiving module configured to receive an operation event corresponding to the drawable object;
an event response module configured to perform an operation corresponding to the drawable object in response to the operation event.
Optionally, the object adding module is configured to:
creating the drawable object;
setting attribute information of the drawable object, wherein the attribute information comprises a position and/or a size;
and adding the drawable object in the floating layer of the target view control according to the attribute information.
Optionally, the event receiving module is configured to:
when the target view control receives an operation event, acquiring the position information of the operation event;
detecting whether the trigger position of the operation event is located in the display area of the drawable object according to the position information of the operation event;
and if the trigger position of the operation event is located in the display area of the drawable object, determining that the operation event corresponding to the drawable object is received.
Optionally, the event response module is configured to:
and overlaying and displaying a floating window on the target user interface, wherein the floating window comprises display content corresponding to the drawable object.
Optionally, the floating layer obtaining module is configured to:
acquiring a control tree of the target user interface, wherein the control tree comprises controls in the target user interface;
detecting whether the control tree contains the target view control or not;
and if the control tree contains the target view control, acquiring a floating layer of the target view control.
According to a third aspect of embodiments of the present disclosure, there is provided an event response apparatus, the apparatus comprising:
a processor;
a memory for storing executable instructions of the processor;
wherein the processor is configured to:
acquiring a floating layer of a target view control in a target user interface;
adding a drawable object in the floating layer of the target view control;
receiving an operation event corresponding to the drawable object;
and executing the operation corresponding to the drawable object to respond to the operation event.
According to a fourth aspect of embodiments of the present disclosure, there is provided a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method according to the first aspect.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
the floating layer of a target view control in a target user interface is acquired and a drawable object is added to it; when an operation event corresponding to the drawable object is received, the corresponding operation is executed in response. Because the drawable object is not a UI control, the structure of the control tree of the target user interface does not need to be adjusted, which avoids introducing bugs.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow chart illustrating an event response method according to an exemplary embodiment;
FIG. 2 is a schematic diagram illustrating the addition of a drawable object according to an exemplary embodiment;
FIG. 3 is an interface diagram illustrating an event response according to an exemplary embodiment;
FIG. 4 is a block diagram illustrating an event response device in accordance with an exemplary embodiment;
FIG. 5 is a block diagram illustrating an apparatus in accordance with an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
In the event response method provided by the embodiments of the present disclosure, the execution subject of each step is a terminal. For example, the terminal may be an electronic device such as a mobile phone, a tablet computer, an e-book reader, a multimedia playback device, a wearable device, or a PC (Personal Computer).
An Operating System (OS) may be installed in the terminal, and the execution subject of each step in the embodiments of the present disclosure may be the OS. The operating system is a computer program that manages and controls the hardware and software resources of the terminal. The technical solution provided by the disclosure is mainly directed at problems in the Android system and provides a corresponding solution. Of course, it is also applicable to similar problems in other operating systems (such as the Windows system, the iOS system, or other customized systems based on the Android system).
FIG. 1 is a flow chart illustrating an event response method according to an exemplary embodiment. The method may include the steps of:
in step 101, a floating layer of a target view control in a target user interface is obtained.
The target user interface is the user interface currently displayed by the first application. The first application may be any application installed and running in the terminal, either a third-party application or a system application. In the embodiments of the present disclosure, a system application refers to an application provided by the developer of the operating system; it is usually preinstalled in the terminal before the terminal leaves the factory, or may be installed when the operating system version is updated. A third-party application refers to an application provided by an application developer other than the developer of the operating system, and is usually downloaded and installed by the user after the terminal is shipped.
Optionally, when the first application displays the target user interface, the operating system acquires the floating layer of a target view control in the target user interface. The target user interface is composed of one or more views, and each view may include at least one control. In the embodiments of the present disclosure, a view control refers to a control included in a view. The target view control may be any view control in the target user interface, and it has the capability of receiving operation events. The floating layer of the target view control is located on top of (i.e., one layer above) the target view control and is used to add a drawable object on top of it. In the Android system, a view is called a View, a floating layer is called an overlay, and a drawable object is called a Drawable.
In one example, the step 101 includes the following sub-steps:
1. acquiring a control tree of a target user interface;
the control tree contains the controls in the target user interface. The control tree includes at least one node, each node corresponding to a control in the target user interface, and the hierarchical structure of the nodes in the control tree represents the hierarchical structure of the controls in the target user interface.
2. Detecting whether the control tree contains a target view control or not;
optionally, the target view control is a view control of a specified type. The view controls are divided according to their functions and may include different types, such as a video control for playing a video, a text control for displaying text content, and a button control for receiving a click or press operation. The specified type can be set according to actual product requirements, for example, the specified type is a video control, and therefore a drawable object is added to the video control.
3. And if the control tree contains the target view control, acquiring the floating layer of the target view control.
Optionally, the operating system traverses the controls in the control tree one by one: it determines whether the i-th control in the control tree is the target view control; if so, it stops traversing and obtains the floating layer of the i-th control; if not, it sets i = i + 1 and repeats the determination, until the target view control is found or the entire control tree has been traversed, where i is a positive integer.
Taking the Android system as an example, when the first application opens an activity, the operating system traverses the controls in the activity's control tree and determines whether each control is a SurfaceView-type control (i.e., a video control). The first SurfaceView-type control found is recorded as targetView, and its floating layer, denoted targetOverlay, is acquired. Optionally, targetOverlay is acquired as follows: targetOverlay = targetView.getOverlay(). In the Android system, getOverlay() may be employed to obtain the floating layer (ViewOverlay) of a view control.
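For illustration, the traversal and overlay acquisition described above could look as follows in Java on Android; this is a minimal sketch, and the class name OverlayFinder and the helper findTargetView are assumptions introduced here, not the patented implementation itself:

```java
import android.view.SurfaceView;
import android.view.View;
import android.view.ViewGroup;
import android.view.ViewOverlay;

public class OverlayFinder {

    // Traverse the control tree rooted at `root` depth-first and return the
    // first SurfaceView-type control (i.e., a video control), or null if none.
    static View findTargetView(View root) {
        if (root instanceof SurfaceView) {
            return root;
        }
        if (root instanceof ViewGroup) {
            ViewGroup group = (ViewGroup) root;
            for (int i = 0; i < group.getChildCount(); i++) {
                View found = findTargetView(group.getChildAt(i));
                if (found != null) {
                    return found;
                }
            }
        }
        return null;
    }

    // Step 101: record the first SurfaceView found as targetView and acquire
    // its floating layer via getOverlay() (available since API level 18).
    static ViewOverlay acquireTargetOverlay(View root) {
        View targetView = findTargetView(root);
        return targetView != null ? targetView.getOverlay() : null;
    }
}
```

The activity's decor view (activity.getWindow().getDecorView()) could serve as the traversal root, since it is the root of the activity's control tree.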
In step 102, a drawable object is added in the floating layer of the target view control.
And after the operating system acquires the floating layer of the target view control, adding a drawable object in the floating layer of the target view control.
In one example, the step 102 includes the following sub-steps:
1. creating a drawable object;
taking the Android system as an example, the operating system may create a drawable object in the standard Android manner, where the drawable object, denoted mBtnDrawable, serves as a button to be presented to the user. It should be noted that this button is not a control and cannot receive operation events; it is merely a graphic or icon.
2. Setting attribute information of a drawable object;
the attribute information is used to indicate a display attribute of the drawable object. Optionally, the attribute information comprises a position and/or a size.
Taking the Android system as an example, the position and size of the drawable object can be set in the following manner: setBounds(mBtnPositionX, mBtnPositionY, mBtnPositionX + mBtnWidth, mBtnPositionY + mBtnHeight). With reference to FIG. 2, taking the creation of the drawable object 21 as an example, mBtnPositionX represents the abscissa of the left edge of the drawable object 21 and mBtnPositionX + mBtnWidth the abscissa of its right edge, where mBtnWidth is the width of the drawable object 21; mBtnPositionY represents the ordinate of its top edge and mBtnPositionY + mBtnHeight the ordinate of its bottom edge, where mBtnHeight is the height of the drawable object 21.
3. And adding a drawable object in the floating layer of the target view control according to the attribute information.
After finishing setting the attribute information of the drawable object, the operating system can add the drawable object in the floating layer of the target view control according to the set attribute information.
Taking the Android system as an example, the following manner can be adopted to add mBtnDrawable to the floating layer targetOverlay of targetView: targetOverlay.add(icon); where icon represents the created drawable object mBtnDrawable. At this point, the drawable object can be presented to the user.
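Assembled into one routine, sub-steps 1 to 3 of step 102 might look like the following sketch; the names mBtnDrawable and iconResId, and the use of ContextCompat.getDrawable to create the drawable from a resource, are illustrative assumptions:

```java
import android.content.Context;
import android.graphics.drawable.Drawable;
import android.view.View;
import androidx.core.content.ContextCompat;

public class DrawableButton {

    // Step 102: create a drawable object, set its attribute information
    // (position and size), and add it to the target view control's overlay.
    static Drawable addToOverlay(Context context, View targetView, int iconResId,
                                 int mBtnPositionX, int mBtnPositionY,
                                 int mBtnWidth, int mBtnHeight) {
        // 1. Create the drawable object (here loaded from a drawable resource).
        Drawable mBtnDrawable = ContextCompat.getDrawable(context, iconResId);

        // 2. Set attribute information: setBounds(left, top, right, bottom)
        //    fixes both the position and the size of the drawable.
        mBtnDrawable.setBounds(mBtnPositionX, mBtnPositionY,
                mBtnPositionX + mBtnWidth, mBtnPositionY + mBtnHeight);

        // 3. Add the drawable to the floating layer. The overlay draws it on
        //    top of targetView without inserting a node into the control tree.
        targetView.getOverlay().add(mBtnDrawable);
        return mBtnDrawable;
    }
}
```

Because ViewOverlay.add(Drawable) only registers the drawable for drawing, the control tree of the target user interface is left untouched, which is the point of the technique.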
In step 103, an operational event corresponding to a drawable object is received.
The operating system receives an operation event corresponding to the drawable object, i.e., an operation event whose trigger position is located in the display area of the drawable object. An operation event is an event triggered by a user operation, such as a click, a long press, or a slide.
In one example, the step 103 includes the following sub-steps:
1. when the target view control receives an operation event, acquiring the position information of the operation event;
2. detecting whether the trigger position of the operation event is located in the display area of the drawable object according to the position information of the operation event;
3. and if the trigger position of the operation event is located in the display area of the drawable object, determining that the operation event corresponding to the drawable object is received.
Because the drawable object does not have the ability to receive operation events, the target view control is used to receive them. The target view control can receive any operation event whose trigger position is located in its own display area; since the display area of the drawable object lies within the display area of the target view control, operation events corresponding to the drawable object can be received by the target view control. When the target view control receives an operation event, the operating system acquires the position information of the event, which may be expressed as coordinates and indicates the trigger position of the event. The operating system then handles the operation event by comparing its trigger position with the display area of the drawable object.
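A sketch of this hit test is given below. It assumes the operation event is delivered as a touch event on the target view control and wires the test through an OnTouchListener; the patent does not prescribe how the view dispatches the event, so the listener-based wiring is an assumption for illustration:

```java
import android.graphics.drawable.Drawable;
import android.view.MotionEvent;
import android.view.View;

public class OverlayHitTest {

    // Steps 103/104: the target view control receives the operation event;
    // the event is attributed to the drawable only if its trigger position
    // lies within the drawable's display area (its bounds).
    static void install(View targetView, Drawable mBtnDrawable, Runnable action) {
        targetView.setOnTouchListener((view, event) -> {
            // Position information of the operation event, expressed as
            // coordinates in the target view control's coordinate space.
            int x = (int) event.getX();
            int y = (int) event.getY();
            boolean inside = mBtnDrawable.getBounds().contains(x, y);
            switch (event.getAction()) {
                case MotionEvent.ACTION_DOWN:
                    return inside; // claim the gesture if it starts on the drawable
                case MotionEvent.ACTION_UP:
                    if (inside) {
                        action.run(); // execute the operation for the drawable
                        return true;  // consume the event
                    }
                    return false;
                default:
                    return inside;    // leave other events to the view itself
            }
        });
    }
}
```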
In step 104, an operation corresponding to the drawable object is performed in response to the operation event described above.
If the trigger position of the operation event is located in the display area of the drawable object, the operating system performs the operation corresponding to the drawable object in response to the operation event. The embodiments of the present disclosure do not limit the specific content of this operation. Optionally, the operating system overlays and displays a floating window on the target user interface, where the floating window includes display content corresponding to the drawable object, so as to implement the picture-in-picture function. For example, the operating system overlays a floating window on top of the target user interface, and the floating window contains interface content of a second application. The second application is another application different from the first application; it may be a third-party application or a system application.
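The patent leaves the windowing mechanism open. One plausible sketch on Android uses WindowManager with TYPE_APPLICATION_OVERLAY (API 26 and above), which requires the SYSTEM_ALERT_WINDOW permission; the whole mechanism here is an assumption for illustration, not the patented method:

```java
import android.content.Context;
import android.graphics.PixelFormat;
import android.view.Gravity;
import android.view.View;
import android.view.WindowManager;

public class FloatingWindow {

    // Step 104 (one possibility): overlay and display a floating window on
    // top of the target user interface; `content` would hold the interface
    // content of the second application.
    static void show(Context context, View content, int width, int height) {
        WindowManager wm =
                (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
        WindowManager.LayoutParams lp = new WindowManager.LayoutParams(
                width,
                height,
                WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY, // API 26+
                WindowManager.LayoutParams.FLAG_NOT_TOUCH_MODAL,     // pass outside touches through
                PixelFormat.TRANSLUCENT);
        lp.gravity = Gravity.TOP | Gravity.END; // e.g., upper-right corner
        wm.addView(content, lp);                // superimposed over the target UI
    }
}
```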
In one example, referring to FIG. 3, the first application is a video application and the second application is an instant messaging application. A video playing interface 31 of the video application is displayed on the screen of the terminal, and the video application is playing a video. In the manner described above, the operating system adds an icon 32 to the floating layer of the video control of the video playing interface 31, as an entry point for triggering display of the instant messaging application. The user clicks the icon 32; correspondingly, the video control receives a click operation event, and the operating system determines that the trigger position of the event is located in the display area of the icon 32, so a floating window 33 is additionally displayed on the video playing interface 31, showing the interface content of the instant messaging application. Of course, the application scenario shown in FIG. 3 is only an example; besides the video playing interface, the picture-in-picture function may be implemented in any user interface, such as a game interface or a web page display interface, which is not limited by this disclosure.
To sum up, in the technical solution provided by the embodiments of the present disclosure, the floating layer of a target view control in a target user interface is acquired and a drawable object is added to it; when an operation event corresponding to the drawable object is received, the corresponding operation is executed in response. Because the drawable object is not a UI control, the structure of the control tree of the target user interface does not need to be adjusted, which avoids introducing bugs.
In addition, the target view control hosting the drawable object is used to receive operation events, and whether to execute the operation corresponding to the drawable object is decided based on the position of the event and the display area of the drawable object, so the functional requirement of event response is still met.
The following are embodiments of the disclosed apparatus that may be used to perform embodiments of the disclosed methods. For details not disclosed in the embodiments of the apparatus of the present disclosure, refer to the embodiments of the method of the present disclosure.
FIG. 4 is a block diagram illustrating an event response apparatus according to an exemplary embodiment. The apparatus has functions to implement the above method examples; the functions may be implemented by hardware, or by hardware executing corresponding software. The apparatus may include: a floating layer acquisition module 410, an object adding module 420, an event receiving module 430, and an event response module 440.
A floating layer acquisition module 410 configured to acquire a floating layer of a target view control in a target user interface.
An object adding module 420 configured to add a drawable object in the floating layer of the target view control.
An event receiving module 430 configured to receive an operation event corresponding to the drawable object.
An event response module 440 configured to perform an operation corresponding to the drawable object in response to the operation event.
To sum up, in the technical solution provided by the embodiments of the present disclosure, the floating layer of a target view control in a target user interface is acquired and a drawable object is added to it; when an operation event corresponding to the drawable object is received, the corresponding operation is executed in response. Because the drawable object is not a UI control, the structure of the control tree of the target user interface does not need to be adjusted, which avoids introducing bugs.
In an optional embodiment provided based on the embodiment of fig. 4, the object adding module 420 is configured to:
creating the drawable object;
setting attribute information of the drawable object, wherein the attribute information comprises a position and/or a size;
and adding the drawable object in the floating layer of the target view control according to the attribute information.
In another optional embodiment provided based on the embodiment of fig. 4, the event receiving module 430 is configured to:
when the target view control receives an operation event, acquiring the position information of the operation event;
detecting whether the trigger position of the operation event is located in the display area of the drawable object according to the position information of the operation event;
and if the trigger position of the operation event is located in the display area of the drawable object, determining that the operation event corresponding to the drawable object is received.
In another optional embodiment provided based on the embodiment of fig. 4, the event response module 440 is configured to:
and overlaying and displaying a floating window on the target user interface, wherein the floating window comprises display content corresponding to the drawable object.
In another optional embodiment provided based on the embodiment of fig. 4, the floating layer obtaining module 410 is configured to:
acquiring a control tree of the target user interface, wherein the control tree comprises controls in the target user interface;
detecting whether the control tree contains the target view control or not;
and if the control tree contains the target view control, acquiring a floating layer of the target view control.
It should be noted that, when the apparatus provided in the foregoing embodiments implements its functions, only the division into the above functional modules is illustrated as an example. In practical applications, the above functions may be distributed among different functional modules according to actual needs; that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
An exemplary embodiment of the present disclosure also provides an event response device, which can implement the event response method provided by the present disclosure. The device includes: a processor, and a memory for storing executable instructions for the processor. Wherein the processor is configured to:
acquiring a floating layer of a target view control in a target user interface;
adding a drawable object in the floating layer of the target view control;
receiving an operation event corresponding to the drawable object;
and executing the operation corresponding to the drawable object to respond to the operation event.
Optionally, the processor is further configured to:
creating the drawable object;
setting attribute information of the drawable object, wherein the attribute information comprises a position and/or a size;
and adding the drawable object in the floating layer of the target view control according to the attribute information.
Optionally, the processor is further configured to:
when the target view control receives an operation event, acquiring the position information of the operation event;
detecting whether the trigger position of the operation event is located in the display area of the drawable object according to the position information of the operation event;
and if the trigger position of the operation event is located in the display area of the drawable object, determining that the operation event corresponding to the drawable object is received.
Optionally, the processor is further configured to:
and overlaying and displaying a floating window on the target user interface, wherein the floating window comprises display content corresponding to the drawable object.
Optionally, the processor is further configured to:
acquiring a control tree of the target user interface, wherein the control tree comprises controls in the target user interface;
detecting whether the control tree contains the target view control or not;
and if the control tree contains the target view control, acquiring a floating layer of the target view control.
Fig. 5 is a block diagram illustrating an apparatus 500 for implementing the event response function described above according to an example embodiment. For example, the apparatus 500 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 5, the apparatus 500 may include one or more of the following components: processing component 502, memory 504, power component 506, multimedia component 508, audio component 510, input/output (I/O) interface 512, sensor component 514, and communication component 516.
The processing component 502 generally controls overall operation of the device 500, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 502 may include one or more processors 520 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 502 can include one or more modules that facilitate interaction between the processing component 502 and other components. For example, the processing component 502 can include a multimedia module to facilitate interaction between the multimedia component 508 and the processing component 502.
The memory 504 is configured to store various types of data to support operations at the apparatus 500. Examples of such data include instructions for any application or method operating on device 500, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 504 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 506 provides power to the various components of the device 500. The power components 506 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 500.
The multimedia component 508 includes a screen that provides an output interface between the device 500 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 508 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 500 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 510 is configured to output and/or input audio signals. For example, audio component 510 includes a Microphone (MIC) configured to receive external audio signals when apparatus 500 is in an operating mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 504 or transmitted via the communication component 516. In some embodiments, audio component 510 further includes a speaker for outputting audio signals.
The I/O interface 512 provides an interface between the processing component 502 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 514 includes one or more sensors for providing various aspects of status assessment for the device 500. For example, the sensor assembly 514 may detect an open/closed state of the apparatus 500 and the relative positioning of components, such as the display and keypad of the apparatus 500. The sensor assembly 514 may also detect a change in the position of the apparatus 500 or a component of the apparatus 500, the presence or absence of user contact with the apparatus 500, the orientation or acceleration/deceleration of the apparatus 500, and a change in the temperature of the apparatus 500. The sensor assembly 514 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 514 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 514 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 516 is configured to facilitate communication between the apparatus 500 and other devices in a wired or wireless manner. The device 500 may access a wireless network based on a communication standard, such as Wi-Fi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 516 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 516 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 500 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the event response methods described above.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, such as the memory 504 comprising instructions, executable by the processor 520 of the apparatus 500 to perform the event response method described above is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
A non-transitory computer readable storage medium, wherein instructions, when executed by a processor of the apparatus 500, enable the apparatus 500 to perform the event response method provided by the above embodiments.
It should be understood that reference to "a plurality" herein means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (8)

1. An event response method, the method comprising:
acquiring a control tree of a target user interface, wherein the target user interface is a user interface currently displayed by a first application program, the control tree comprises controls in the target user interface, and the control tree is used for representing the hierarchical structure of each control in the target user interface;
detecting whether the control tree contains a target view control or not;
if the control tree comprises the target view control, acquiring a floating layer of the target view control, wherein the floating layer of the target view control is positioned at the top of the target view control and is used for adding a drawable object at the top of the target view control;
adding the drawable object in a floating layer of the target view control;
receiving an operation event corresponding to the drawable object;
in response to the operation event, overlaying and displaying a floating window on the target user interface, wherein the floating window comprises display content corresponding to the drawable object, the display content comprises interface content of a second application program, and the second application program is another application program different from the first application program.
2. The method of claim 1, wherein adding a drawable object in the floating layer of the target view control comprises:
creating the drawable object;
setting attribute information of the drawable object, wherein the attribute information comprises a position and/or a size;
and adding the drawable object in the floating layer of the target view control according to the attribute information.
3. The method of claim 1, wherein receiving the operational event corresponding to the drawable object comprises:
when the target view control receives an operation event, acquiring the position information of the operation event;
detecting whether the trigger position of the operation event is located in the display area of the drawable object according to the position information of the operation event;
and if the trigger position of the operation event is located in the display area of the drawable object, determining that the operation event corresponding to the drawable object is received.
4. An event response device, the device comprising:
a floating layer acquisition module configured to acquire a control tree of a target user interface, wherein the target user interface is a user interface currently displayed by a first application program, the control tree comprises controls in the target user interface, and the control tree is used for representing the hierarchical structure of each control in the target user interface; to detect whether the control tree contains a target view control; and, if the control tree contains the target view control, to acquire a floating layer of the target view control, wherein the floating layer of the target view control is located on top of the target view control and is used for adding a drawable object on top of the target view control;
an object addition module configured to add the drawable object in a floating layer of the target view control;
an event receiving module configured to receive an operation event corresponding to the drawable object;
and an event response module configured to, in response to the operation event, overlay and display a floating window on the target user interface, wherein the floating window comprises display content corresponding to the drawable object, the display content comprises interface content of a second application program, and the second application program is another application program different from the first application program.
5. The apparatus of claim 4, wherein the object addition module is configured to:
creating the drawable object;
setting attribute information of the drawable object, wherein the attribute information comprises a position and/or a size;
and adding the drawable object in the floating layer of the target view control according to the attribute information.
6. The apparatus of claim 4, wherein the event receiving module is configured to:
when the target view control receives an operation event, acquiring the position information of the operation event;
detecting whether the trigger position of the operation event is located in the display area of the drawable object according to the position information of the operation event;
and if the trigger position of the operation event is located in the display area of the drawable object, determining that the operation event corresponding to the drawable object is received.
7. An event response device, the device comprising:
a processor;
a memory for storing executable instructions of the processor;
wherein the processor is configured to:
acquiring a floating layer of a target view control in a target user interface, wherein the target user interface is a user interface currently displayed by a first application program;
adding a drawable object in the floating layer of the target view control;
receiving an operation event corresponding to the drawable object;
in response to the operation event, overlaying and displaying a floating window on the target user interface, wherein the floating window comprises display content corresponding to the drawable object, the display content comprises interface content of a second application program, and the second application program is another application program different from the first application program.
8. A non-transitory computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 3.
Application CN201810520265.2A (filed 2018-05-28, priority date 2018-05-28): Event response method, device and storage medium; granted as CN108829473B (Active).

Priority Applications (1)

CN201810520265.2A (priority date 2018-05-28, filing date 2018-05-28): Event response method, device and storage medium

Publications (2)

Publication Number | Publication Date
CN108829473A      | 2018-11-16
CN108829473B      | 2022-03-11

Family ID: 64145767

Family Applications (1)

CN201810520265.2A (Active, granted as CN108829473B); priority date 2018-05-28; filing date 2018-05-28; title: Event response method, device and storage medium

Country Status (1): CN

Families Citing this family (2)

CN111324275B (priority 2018-12-17, published 2022-02-22), Tencent Technology (Shenzhen) Co., Ltd.: Broadcasting method and device for elements in display picture
CN111459598B (priority 2020-04-02, published 2021-05-14), Shanghai Jilian Network Technology Co., Ltd.: Information display method and device, electronic equipment and storage medium

Citations (3)

CN104049847A (priority 2014-06-30, published 2014-09-17), Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd.: Information prompt method and system of mobile terminal
CN104793929A (priority 2015-02-15, published 2015-07-22), Shenzhen ZTE Mobile Telecom Co., Ltd.: User-defined method and device for application interface display information
CN107656671A (priority 2017-09-29, published 2018-02-02), Zhuhai Meizu Technology Co., Ltd.: Floating small-window control method and device, terminal device, and computer-readable storage medium

Family Cites Families (5)

CN104346085A (priority 2013-07-25, published 2015-02-11), Beijing Samsung Telecommunication Technology Research Co., Ltd.: Control object operation method and device and terminal device
CN103677527B (priority 2013-12-24, published 2017-10-24), Beijing Qili Software Technology Co., Ltd.: Floating-window interaction control display method and device suitable for mobile terminals
CN104836906A (priority 2015-04-13, published 2015-08-12), Huizhou TCL Mobile Communication Co., Ltd.: Mobile terminal and method for acquiring images from short message operation interface in real time
CN106168869B (priority 2016-06-24, published 2019-06-21), Beijing Qihoo Technology Co., Ltd.: Desktop view processing method, device and terminal based on suspended window
CN107193542B (priority 2017-03-30, published 2022-06-14), Tencent Technology (Shenzhen) Co., Ltd.: Information display method and device




Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant