CN108829473A - Event response method, device and storage medium - Google Patents
- Publication number: CN108829473A (application CN201810520265.2A)
- Authority: CN (China)
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
Abstract
The present disclosure relates to an event response method, device and storage medium, belonging to the field of terminal technology. The method includes: obtaining the floating layer of a target view control in a target user interface; adding a drawable object to the floating layer of the target view control; receiving an operation event corresponding to the drawable object; and executing an operation corresponding to the drawable object in response to the operation event. Because the drawable object is not a UI control, the structure of the control tree of the target user interface does not need to be adjusted, which avoids introducing bugs.
Description
Technical field
Embodiments of the present disclosure relate to the field of terminal technology, and in particular to an event response method, device and storage medium.
Background technique
In the Android system, a UI (User Interface) control (hereinafter simply "control") has the ability to receive operation events. For example, a button control can receive a click operation event triggered by the user. When a control receives an operation event, it executes a corresponding operation in response to that event.
The picture-in-picture function of the Android system refers to displaying the user interface of a second application as a floating window overlaid on the upper layer of the user interface of a first application while the first application's interface is still displayed, so as to achieve multitasking. For example, with picture-in-picture a user can watch a video in a video application while chatting with friends in an instant messaging application.
To implement picture-in-picture, a button usually needs to be added to the user interface of the first application; tapping the button triggers display of the second application's user interface. In the related art, this button is implemented as a UI control: a node corresponding to the button control is inserted into the control tree of the first application's user interface. This approach affects the structure of the control tree and the positions of its original nodes, and therefore easily introduces bugs (defects).
Summary of the invention
Embodiments of the present disclosure provide an event response method, device and storage medium. The technical solution is as follows.
According to a first aspect of the embodiments of the present disclosure, an event response method is provided. The method includes:
obtaining the floating layer of a target view control in a target user interface;
adding a drawable object to the floating layer of the target view control;
receiving an operation event corresponding to the drawable object; and
executing an operation corresponding to the drawable object in response to the operation event.
Optionally, adding the drawable object to the floating layer of the target view control includes:
creating the drawable object;
setting attribute information of the drawable object, the attribute information including a position and/or a size; and
adding the drawable object to the floating layer of the target view control according to the attribute information.
Optionally, receiving the operation event corresponding to the drawable object includes:
when the target view control receives an operation event, obtaining location information of the operation event;
detecting, according to the location information of the operation event, whether a trigger position of the operation event is located within a display area of the drawable object; and
if the trigger position of the operation event is located within the display area of the drawable object, determining that an operation event corresponding to the drawable object has been received.
Optionally, executing the operation corresponding to the drawable object includes:
overlaying a floating window on an upper layer of the target user interface, the floating window containing display content corresponding to the drawable object.
Optionally, obtaining the floating layer of the target view control in the target user interface includes:
obtaining a control tree of the target user interface, the control tree containing the controls in the target user interface;
detecting whether the control tree contains the target view control; and
if the control tree contains the target view control, obtaining the floating layer of the target view control.
According to a second aspect of the embodiments of the present disclosure, an event response device is provided. The device includes:
a floating-layer obtaining module, configured to obtain the floating layer of a target view control in a target user interface;
an object adding module, configured to add a drawable object to the floating layer of the target view control;
an event receiving module, configured to receive an operation event corresponding to the drawable object; and
an event response module, configured to execute an operation corresponding to the drawable object in response to the operation event.
Optionally, the object adding module is configured to: create the drawable object; set attribute information of the drawable object, the attribute information including a position and/or a size; and add the drawable object to the floating layer of the target view control according to the attribute information.
Optionally, the event receiving module is configured to: when the target view control receives an operation event, obtain location information of the operation event; detect, according to the location information, whether a trigger position of the operation event is located within a display area of the drawable object; and if the trigger position is located within the display area of the drawable object, determine that an operation event corresponding to the drawable object has been received.
Optionally, the event response module is configured to overlay a floating window on an upper layer of the target user interface, the floating window containing display content corresponding to the drawable object.
Optionally, the floating-layer obtaining module is configured to: obtain a control tree of the target user interface, the control tree containing the controls in the target user interface; detect whether the control tree contains the target view control; and if the control tree contains the target view control, obtain the floating layer of the target view control.
According to a third aspect of the embodiments of the present disclosure, an event response device is provided. The device includes:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
obtain the floating layer of a target view control in a target user interface;
add a drawable object to the floating layer of the target view control;
receive an operation event corresponding to the drawable object; and
execute an operation corresponding to the drawable object in response to the operation event.
According to a fourth aspect of the embodiments of the present disclosure, a non-transitory computer-readable storage medium is provided, on which a computer program is stored; when the computer program is executed by a processor, the steps of the method described in the first aspect are implemented.
The technical solutions provided by the embodiments of the present disclosure may include the following beneficial effects.
By obtaining the floating layer of the target view control in the target user interface, adding a drawable object to the floating layer of the target view control, and then, when an operation event corresponding to the drawable object is received, executing the corresponding operation in response to that event, the structure of the control tree of the target user interface does not need to be adjusted, because the drawable object is not a UI control; this avoids introducing bugs.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Brief description of the drawings
The drawings herein are incorporated into and form part of this specification, illustrate embodiments consistent with the present disclosure, and together with the specification serve to explain the principles of the present disclosure.
Fig. 1 is a flowchart of an event response method according to an exemplary embodiment;
Fig. 2 is a schematic diagram of adding a drawable object according to an exemplary embodiment;
Fig. 3 is a schematic interface diagram of an event response manner according to an exemplary embodiment;
Fig. 4 is a block diagram of an event response device according to an exemplary embodiment;
Fig. 5 is a block diagram of a device according to an exemplary embodiment.
Detailed description of the embodiments
Exemplary embodiments are described in detail here, with examples illustrated in the accompanying drawings. In the following description, when drawings are referred to, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of devices and methods consistent with some aspects of the present disclosure as detailed in the appended claims.
In the event response method provided by the embodiments of the present disclosure, the execution subject of each step is a terminal. For example, the terminal may be an electronic device such as a mobile phone, tablet computer, e-book reader, multimedia playback device, wearable device, or PC (Personal Computer).
An operating system (OS) may be installed and run in the terminal, and the execution subject of each step in the embodiments of the present disclosure may be the operating system. The operating system is a computer program that manages and controls the hardware and software resources of the terminal. The technical solutions provided by the present disclosure mainly address the above-mentioned problem as it exists in the Android system. Of course, the technical solutions provided by the present disclosure are equally applicable to solving similar problems in other operating systems (such as Windows, iOS, or custom systems based on Android).
Fig. 1 is a flowchart of an event response method according to an exemplary embodiment. The method may include the following steps.
In step 101, the floating layer of a target view control in a target user interface is obtained.
The target user interface is the user interface currently displayed by a first application. The first application may be any application installed and running in the terminal, and may be a third-party application or a system application. In the embodiments of the present disclosure, a system application refers to an application provided by the developer of the operating system; system applications are usually pre-installed in the terminal before it leaves the factory, or may be installed into the terminal when the operating system performs a version update. A third-party application refers to an application provided by an application developer other than the developer of the operating system; third-party applications are usually downloaded and installed into the terminal by the user after the terminal leaves the factory.
Optionally, when the first application displays the target user interface, the operating system obtains the floating layer of the target view control in the target user interface. The target user interface is composed of one or more views, and each view may contain at least one control. In the embodiments of the present disclosure, a view control refers to a control contained in a view. The target view control may be any control in the target user interface, and it has the ability to receive operation events. The floating layer of the target view control is located on top of (that is, above) the target view control, and is used to add drawable objects on top of the target view control. In the Android system, a view is called a View, the floating layer is called an overlay, and a drawable object is called a Drawable object.
In one example, step 101 includes the following sub-steps.
1. Obtain the control tree of the target user interface. The control tree contains the controls in the target user interface: it contains at least one node, and each node corresponds to a control in the target user interface. The hierarchy of the nodes in the control tree characterizes the hierarchy of the controls in the target user interface.
2. Detect whether the control tree contains the target view control. Optionally, the target view control is a view control of a specified type. View controls can be divided by function into different types, such as video controls, text controls, and button controls: a video control is used to play video, a text control is used to display text content, and a button control is used to receive click or press operations. The specified type can be set according to actual product requirements; for example, the specified type may be the video control type, so that drawable objects are added on video controls.
3. If the control tree contains the target view control, obtain the floating layer of the target view control. Optionally, the operating system traverses the controls in the control tree one by one, judging whether the i-th control in the control tree is the target view control. If the i-th control is the target view control, the traversal stops and the floating layer of the i-th control is obtained; if the i-th control is not the target view control, let i = i + 1 and repeat the judging step, until the target view control is found or the traversal of the entire control tree is completed, where i is a positive integer.
Taking the Android system as an example, when the first application opens an Activity, the operating system traverses the controls in the control tree of the Activity and judges whether each control is a control of the SurfaceView type (that is, a video control). The first control of the SurfaceView type that is found is denoted targetView, and its floating layer, denoted targetOverlay, is obtained. Optionally, targetOverlay is obtained as follows: targetOverlay = targetView.getOverlay(). In the Android system, getOverlay() can be used to obtain the floating layer of a view control.
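The traversal described above can be sketched in plain Java. The Node class below is a hypothetical stand-in for a real Android view hierarchy (it is not an Android API), and findFirstOfType mimics the depth-first search for the first control of the specified type:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of the control-tree traversal in step 101.
// Node is a hypothetical stand-in for an Android View tree.
class ControlTreeSearch {
    static class Node {
        final String type;          // e.g. "SurfaceView", "Button"
        final List<Node> children = new ArrayList<>();
        Node(String type) { this.type = type; }
        Node add(Node child) { children.add(child); return this; }
    }

    // Depth-first search: return the first node whose type matches, or null.
    static Node findFirstOfType(Node root, String type) {
        if (root.type.equals(type)) return root;   // stop at the first match
        for (Node child : root.children) {
            Node found = findFirstOfType(child, type);
            if (found != null) return found;
        }
        return null;
    }

    public static void main(String[] args) {
        Node root = new Node("FrameLayout")
                .add(new Node("TextView"))
                .add(new Node("FrameLayout").add(new Node("SurfaceView")));
        Node target = findFirstOfType(root, "SurfaceView");
        System.out.println(target != null ? target.type : "not found"); // prints "SurfaceView"
    }
}
```

In a real implementation the search would stop at the first SurfaceView found, exactly as the recursion above returns on the first match.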
In step 102, a drawable object is added to the floating layer of the target view control.
After obtaining the floating layer of the target view control, the operating system adds a drawable object to the floating layer of the target view control.
In one example, step 102 includes the following sub-steps.
1. Create the drawable object. Taking the Android system as an example, the operating system can create a Drawable object in the standard Android way; this Drawable object, denoted mBtnDrawable, serves as a button presented to the user. It should be noted that this button is not a control and cannot receive operation events; it is merely a figure or icon.
2. Set attribute information of the drawable object. The attribute information indicates the display properties of the drawable object. Optionally, the attribute information includes a position and/or a size.
Taking the Android system as an example, the position and size of the drawable object can be set as follows: mBtnDrawable.setBounds(mBtnPositionX, mBtnPositionY, mBtnPositionX + mBtnWidth, mBtnPositionY + mBtnHeight). With reference to Fig. 2, for the created drawable object 21: mBtnPositionX is the abscissa of the left edge of drawable object 21, mBtnPositionX + mBtnWidth is the abscissa of its right edge, mBtnWidth is its width, mBtnPositionY is the ordinate of its top edge, mBtnPositionY + mBtnHeight is the ordinate of its bottom edge, and mBtnHeight is its height.
3. Add the drawable object to the floating layer of the target view control according to the attribute information. After finishing setting the attribute information of the drawable object, the operating system adds the drawable object to the floating layer of the target view control according to the attribute information that has been set.
Taking the Android system as an example, mBtnDrawable can be added to the floating layer targetOverlay of targetView as follows: targetOverlay.add(icon), where icon denotes the drawable object mBtnDrawable created above. At this point, the drawable object can be presented to the user.
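The three sub-steps of step 102 can be modeled in plain Java. The Drawable and Overlay classes below are simplified hypothetical stand-ins for Android's android.graphics.drawable.Drawable and ViewOverlay, kept just rich enough to show the create → setBounds → add sequence; the coordinate values are illustrative only:

```java
import java.util.ArrayList;
import java.util.List;

// Simplified model of step 102: create a drawable, set its bounds,
// and add it to a view's floating layer (overlay).
class OverlayAddDemo {
    static class Drawable {
        int left, top, right, bottom;
        // Mirrors the shape of Drawable#setBounds(left, top, right, bottom).
        void setBounds(int l, int t, int r, int b) {
            left = l; top = t; right = r; bottom = b;
        }
    }

    // Hypothetical stand-in for Android's ViewOverlay.
    static class Overlay {
        final List<Drawable> drawables = new ArrayList<>();
        void add(Drawable d) { drawables.add(d); }
    }

    public static void main(String[] args) {
        int mBtnPositionX = 40, mBtnPositionY = 80;
        int mBtnWidth = 96, mBtnHeight = 96;

        Drawable mBtnDrawable = new Drawable();                  // sub-step 1: create
        mBtnDrawable.setBounds(mBtnPositionX, mBtnPositionY,     // sub-step 2: attributes
                mBtnPositionX + mBtnWidth, mBtnPositionY + mBtnHeight);
        Overlay targetOverlay = new Overlay();
        targetOverlay.add(mBtnDrawable);                         // sub-step 3: add

        System.out.println(targetOverlay.drawables.size());      // prints 1
    }
}
```

Note that nothing here inserts a node into any control tree: the drawable lives only in the overlay list, which is the point of the technique.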
In step 103, an operation event corresponding to the drawable object is received.
The operating system receives an operation event corresponding to the drawable object. An operation event corresponding to the drawable object is an operation event whose trigger position is located within the display area of the drawable object. An operation event is an event triggered by a user operation, such as a click, long-press, or slide operation event.
In one example, step 103 includes the following sub-steps.
1. When the target view control receives an operation event, obtain location information of the operation event.
2. According to the location information of the operation event, detect whether the trigger position of the operation event is located within the display area of the drawable object.
3. If the trigger position of the operation event is located within the display area of the drawable object, determine that an operation event corresponding to the drawable object has been received.
Since the drawable object does not have the ability to receive operation events, the target view control is used to receive them. The target view control can receive operation events whose trigger positions are located within its own display area; because the display area of the drawable object lies within the display area of the target view control, operation events corresponding to the drawable object can be received by the target view control. When the target view control receives an operation event, the operating system obtains the location information of the event; the location information, which may be expressed as coordinates, indicates the trigger position of the operation event. The operating system then compares the trigger position of the operation event with the display area of the drawable object, so as to judge and handle the operation event.
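The comparison in sub-steps 2 and 3 is a simple bounds check. The sketch below, in plain Java with hypothetical names, tests whether an event's trigger coordinates fall inside the display area that was set via setBounds:

```java
// Hit test for step 103: does the operation event's trigger position (x, y)
// fall inside the drawable's display area [left, right) x [top, bottom)?
class HitTest {
    static boolean insideDrawable(int x, int y,
                                  int left, int top, int right, int bottom) {
        return x >= left && x < right && y >= top && y < bottom;
    }

    public static void main(String[] args) {
        // Drawable bounds as set by setBounds(40, 80, 136, 176).
        int left = 40, top = 80, right = 136, bottom = 176;
        System.out.println(insideDrawable(100, 100, left, top, right, bottom)); // true
        System.out.println(insideDrawable(10, 10, left, top, right, bottom));   // false
    }
}
```

If the check succeeds, the event is treated as "corresponding to the drawable object" and step 104 runs; otherwise the target view control handles the event as it normally would.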
In step 104, an operation corresponding to the drawable object is executed in response to the operation event.
If the trigger position of the operation event is located within the display area of the drawable object, the operating system executes the operation corresponding to the drawable object, in response to the operation event. The embodiments of the present disclosure do not limit the specific content of the operation corresponding to the drawable object. Optionally, the operating system overlays a floating window on the upper layer of the target user interface; the floating window contains display content corresponding to the drawable object, so as to implement the picture-in-picture function. For example, the operating system overlays a floating window on the upper layer of the target user interface, and the floating window contains the interface content of a second application. The second application is another application different from the first application, and may be a third-party application or a system application.
In one example, with reference to Fig. 3, take the first application to be a video application and the second application to be an instant messaging application. The video playback interface 31 of the video application is displayed on the screen of the terminal, and the video application is playing a video. In the manner described above, the operating system adds an icon 32 to the floating layer of the video control of the video playback interface 31, as an entry for triggering display of the instant messaging application. The user clicks icon 32. Accordingly, the video control receives the click operation event, judges that the trigger position of the click operation event is located within the display area of icon 32, and then overlays a floating window 33 on the upper layer of the video playback interface 31; the floating window 33 displays the interface content of the instant messaging application. Of course, the application scenario shown in Fig. 3 is merely exemplary; besides implementing picture-in-picture on a video playback interface, the picture-in-picture function can also be implemented on any other user interface, such as a web page display interface, and the present disclosure does not limit this.
In summary, in the technical solutions provided by the embodiments of the present disclosure, the floating layer of the target view control in the target user interface is obtained, a drawable object is added to the floating layer, and then, when an operation event corresponding to the drawable object is received, the corresponding operation is executed in response to that event. Because the drawable object is not a UI control, the structure of the control tree of the target user interface does not need to be adjusted, which avoids introducing bugs.
In addition, the target view control where the drawable object is located is used to receive operation events, and whether to execute the operation corresponding to the drawable object is determined based on the position of the operation event and the display area of the drawable object, thereby likewise fulfilling the functional requirement of event response.
The following are device embodiments of the present disclosure, which can be used to execute the method embodiments of the present disclosure. For details not disclosed in the device embodiments, please refer to the method embodiments of the present disclosure.
Fig. 4 is a block diagram of an event response device according to an exemplary embodiment. The device has the functions of implementing the above method examples; the functions may be implemented by hardware, or by hardware executing corresponding software. The device may include: a floating-layer obtaining module 410, an object adding module 420, an event receiving module 430, and an event response module 440.
The floating-layer obtaining module 410 is configured to obtain the floating layer of a target view control in a target user interface.
The object adding module 420 is configured to add a drawable object to the floating layer of the target view control.
The event receiving module 430 is configured to receive an operation event corresponding to the drawable object.
The event response module 440 is configured to execute an operation corresponding to the drawable object in response to the operation event.
In summary, in the technical solutions provided by the embodiments of the present disclosure, the floating layer of the target view control in the target user interface is obtained, a drawable object is added to the floating layer, and then, when an operation event corresponding to the drawable object is received, the corresponding operation is executed in response to that event. Because the drawable object is not a UI control, the structure of the control tree of the target user interface does not need to be adjusted, which avoids introducing bugs.
In an alternative embodiment based on the embodiment of Fig. 4, the object adding module 420 is configured to: create the drawable object; set attribute information of the drawable object, the attribute information including a position and/or a size; and add the drawable object to the floating layer of the target view control according to the attribute information.
In another alternative embodiment based on the embodiment of Fig. 4, the event receiving module 430 is configured to: when the target view control receives an operation event, obtain location information of the operation event; detect, according to the location information, whether a trigger position of the operation event is located within a display area of the drawable object; and if the trigger position is located within the display area of the drawable object, determine that an operation event corresponding to the drawable object has been received.
In another alternative embodiment based on the embodiment of Fig. 4, the event response module 440 is configured to overlay a floating window on an upper layer of the target user interface, the floating window containing display content corresponding to the drawable object.
In another alternative embodiment based on the embodiment of Fig. 4, the floating-layer obtaining module 410 is configured to: obtain a control tree of the target user interface, the control tree containing the controls in the target user interface; detect whether the control tree contains the target view control; and if the control tree contains the target view control, obtain the floating layer of the target view control.
It should be noted that when the device provided by the above embodiment implements its functions, the division into the above functional modules is merely illustrative. In practical applications, the above functions may be allocated to different functional modules according to actual needs; that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above.
Regarding the device in the above embodiment, the specific manner in which each module performs its operations has been described in detail in the embodiments of the related method, and will not be elaborated here.
An exemplary embodiment of the present disclosure further provides an event response device, which can implement the event response method provided by the present disclosure. The device includes a processor and a memory for storing instructions executable by the processor, wherein the processor is configured to:
obtain the floating layer of a target view control in a target user interface;
add a drawable object to the floating layer of the target view control;
receive an operation event corresponding to the drawable object; and
execute an operation corresponding to the drawable object in response to the operation event.
Optionally, the processor is further configured to: create the drawable object; set attribute information of the drawable object, the attribute information including a position and/or a size; and add the drawable object to the floating layer of the target view control according to the attribute information.
Optionally, the processor is further configured to: when the target view control receives an operation event, obtain location information of the operation event; detect, according to the location information, whether a trigger position of the operation event is located within a display area of the drawable object; and if the trigger position is located within the display area of the drawable object, determine that an operation event corresponding to the drawable object has been received.
Optionally, the processor is further configured to overlay a floating window on an upper layer of the target user interface, the floating window containing display content corresponding to the drawable object.
Optionally, the processor is further configured to: obtain a control tree of the target user interface, the control tree containing the controls in the target user interface; detect whether the control tree contains the target view control; and if the control tree contains the target view control, obtain the floating layer of the target view control.
Fig. 5 is a block diagram of a device 500 for implementing the above event response functions according to an exemplary embodiment. For example, the device 500 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, fitness equipment, a personal digital assistant, or the like.
Referring to Fig. 5, the device 500 may include one or more of the following components: a processing component 502, a memory 504, a power component 506, a multimedia component 508, an audio component 510, an input/output (I/O) interface 512, a sensor component 514, and a communication component 516.
The processing component 502 generally controls the overall operation of the device 500, such as operations associated with display, telephone calls, data communication, camera operation, and recording. The processing component 502 may include one or more processors 520 to execute instructions so as to perform all or part of the steps of the methods described above. In addition, the processing component 502 may include one or more modules to facilitate interaction between the processing component 502 and other components. For example, the processing component 502 may include a multimedia module to facilitate interaction between the multimedia component 508 and the processing component 502.
The memory 504 is configured to store various types of data to support operation of the device 500. Examples of such data include instructions of any application or method operated on the device 500, contact data, phonebook data, messages, pictures, videos, and the like. The memory 504 may be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as a static random access memory (SRAM), an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a read-only memory (ROM), a magnetic memory, a flash memory, a magnetic disk, or an optical disc.
The power supply component 506 provides power to the various components of the device 500. The power supply component 506 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 500.
The multimedia component 508 includes a screen that provides an output interface between the device 500 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 508 includes a front camera and/or a rear camera. When the device 500 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or may have focusing and optical zoom capabilities.
The audio component 510 is configured to output and/or input audio signals. For example, the audio component 510 includes a microphone (MIC) configured to receive external audio signals when the device 500 is in an operation mode, such as a call mode, a recording mode, or a voice recognition mode. The received audio signal may be further stored in the memory 504 or sent via the communication component 516. In some embodiments, the audio component 510 further includes a speaker for outputting audio signals.
The I/O interface 512 provides an interface between the processing component 502 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, and the like. These buttons may include, but are not limited to: a home button, volume buttons, a start button, and a lock button.
The sensor component 514 includes one or more sensors for providing state assessments of various aspects of the device 500. For example, the sensor component 514 can detect the open/closed state of the device 500 and the relative positioning of components, for example, the display and keypad of the device 500. The sensor component 514 can also detect a change in position of the device 500 or of a component of the device 500, the presence or absence of user contact with the device 500, the orientation or acceleration/deceleration of the device 500, and a change in temperature of the device 500. The sensor component 514 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 514 may also include an optical sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 514 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 516 is configured to facilitate wired or wireless communication between the device 500 and other devices. The device 500 can access a wireless network based on a communication standard, such as Wi-Fi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 516 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 516 further includes a near-field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the device 500 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for executing the above event response method.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium including instructions, such as the memory 504 including instructions, executable by the processor 520 of the device 500 to complete the above event response method. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
There is also provided a non-transitory computer-readable storage medium: when the instructions in the storage medium are executed by the processor of the device 500, the device 500 is enabled to perform the event response method provided by the above embodiments.
It should be understood that "multiple" as referenced herein refers to two or more. "And/or" describes the association relationship of associated objects and indicates that three kinds of relationships may exist; for example, "A and/or B" may indicate three situations: A exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates an "or" relationship between the preceding and following associated objects.
Other embodiments of the disclosure will be readily apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure that follow the general principles of the disclosure and include common knowledge or conventional techniques in the art not disclosed by the disclosure. The description and examples are to be considered illustrative only, with the true scope and spirit of the disclosure being indicated by the following claims.
It should be understood that the disclosure is not limited to the precise structures that have been described above and shown in the drawings, and that various modifications and changes may be made without departing from the scope thereof. The scope of the disclosure is limited only by the appended claims.
Claims (12)
1. An event response method, characterized in that the method comprises:
obtaining a floating layer of a target view control in a target user interface;
adding a drawable object in the floating layer of the target view control;
receiving an operation event corresponding to the drawable object; and
executing an operation corresponding to the drawable object, in response to the operation event.
2. The method according to claim 1, wherein the adding a drawable object in the floating layer of the target view control comprises:
creating the drawable object;
setting attribute information of the drawable object, the attribute information including a position and/or a size; and
adding the drawable object to the floating layer of the target view control according to the attribute information.
3. The method according to claim 1, wherein the receiving an operation event corresponding to the drawable object comprises:
when the target view control receives an operation event, obtaining location information of the operation event;
according to the location information of the operation event, detecting whether the trigger position of the operation event is located in the display area of the drawable object; and
if the trigger position of the operation event is located in the display area of the drawable object, determining that an operation event corresponding to the drawable object has been received.
4. The method according to claim 1, wherein the executing an operation corresponding to the drawable object comprises:
displaying a floating window overlaid on the upper layer of the target user interface, the floating window containing display content corresponding to the drawable object.
5. The method according to claim 1, wherein the obtaining a floating layer of a target view control in a target user interface comprises:
obtaining the control tree of the target user interface, the control tree containing the controls in the target user interface;
detecting whether the control tree contains the target view control; and
if the control tree contains the target view control, obtaining the floating layer of the target view control.
6. An event response device, characterized in that the device comprises:
a floating layer obtaining module, configured to obtain a floating layer of a target view control in a target user interface;
an object adding module, configured to add a drawable object in the floating layer of the target view control;
an event receiving module, configured to receive an operation event corresponding to the drawable object; and
an event response module, configured to execute an operation corresponding to the drawable object, in response to the operation event.
7. The device according to claim 6, wherein the object adding module is configured to:
create the drawable object;
set attribute information of the drawable object, the attribute information including a position and/or a size; and
add the drawable object to the floating layer of the target view control according to the attribute information.
8. The device according to claim 6, wherein the event receiving module is configured to:
when the target view control receives an operation event, obtain location information of the operation event;
according to the location information of the operation event, detect whether the trigger position of the operation event is located in the display area of the drawable object; and
if the trigger position of the operation event is located in the display area of the drawable object, determine that an operation event corresponding to the drawable object has been received.
9. The device according to claim 6, wherein the event response module is configured to:
display a floating window overlaid on the upper layer of the target user interface, the floating window containing display content corresponding to the drawable object.
10. The device according to claim 6, wherein the floating layer obtaining module is configured to:
obtain the control tree of the target user interface, the control tree containing the controls in the target user interface;
detect whether the control tree contains the target view control; and
if the control tree contains the target view control, obtain the floating layer of the target view control.
11. An event response device, characterized in that the device comprises:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
obtain a floating layer of a target view control in a target user interface;
add a drawable object in the floating layer of the target view control;
receive an operation event corresponding to the drawable object; and
execute an operation corresponding to the drawable object, in response to the operation event.
12. A non-transitory computer-readable storage medium having a computer program stored thereon, characterized in that the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810520265.2A CN108829473B (en) | 2018-05-28 | 2018-05-28 | Event response method, device and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108829473A true CN108829473A (en) | 2018-11-16 |
CN108829473B CN108829473B (en) | 2022-03-11 |
Family
ID=64145767
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810520265.2A Active CN108829473B (en) | 2018-05-28 | 2018-05-28 | Event response method, device and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108829473B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111324275A (en) * | 2018-12-17 | 2020-06-23 | 腾讯科技(深圳)有限公司 | Broadcasting method and device for elements in display picture |
CN111459598A (en) * | 2020-04-02 | 2020-07-28 | 上海极链网络科技有限公司 | Information display method and device, electronic equipment and storage medium |
CN113805750A (en) * | 2021-09-23 | 2021-12-17 | 闻泰通讯股份有限公司 | Application program display method and device, mobile device and storage medium |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103677527A (en) * | 2013-12-24 | 2014-03-26 | 北京奇虎科技有限公司 | Suspension problem interaction control display method and device suitable for mobile terminal |
CN104049847A (en) * | 2014-06-30 | 2014-09-17 | 宇龙计算机通信科技(深圳)有限公司 | Information prompt method and system of mobile terminal |
CN104346085A (en) * | 2013-07-25 | 2015-02-11 | 北京三星通信技术研究有限公司 | Control object operation method and device and terminal device |
CN104793929A (en) * | 2015-02-15 | 2015-07-22 | 深圳市中兴移动通信有限公司 | User-defined method and device for application interface display information |
CN104836906A (en) * | 2015-04-13 | 2015-08-12 | 惠州Tcl移动通信有限公司 | Mobile terminal and method for acquiring images from short message operation interface in real time thereof |
CN106168869A (en) * | 2016-06-24 | 2016-11-30 | 北京奇虎科技有限公司 | Desktop view processing method based on suspended window, device and terminal |
CN107193542A (en) * | 2017-03-30 | 2017-09-22 | 腾讯科技(深圳)有限公司 | Method for information display and device |
CN107656671A (en) * | 2017-09-29 | 2018-02-02 | 珠海市魅族科技有限公司 | Suspend small window control method and device, terminal installation and computer-readable recording medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||