WO2022143094A1 - Window page interaction method and apparatus, electronic device, and readable storage medium - Google Patents

Window page interaction method and apparatus, electronic device, and readable storage medium Download PDF

Info

Publication number
WO2022143094A1
Authority
WO
WIPO (PCT)
Prior art keywords
interactive
window
operation type
interactive operation
electronic device
Prior art date
Application number
PCT/CN2021/136899
Other languages
English (en)
French (fr)
Inventor
张晓波
黄豫
王小凯
Original Assignee
Huawei Technologies Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN202011644145.7A (external priority; CN114764300B)
Application filed by Huawei Technologies Co., Ltd.
Priority to EP21913835.1A (published as EP4250078A1)
Publication of WO2022143094A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72469User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • G06F3/04855Interaction with scrollbars
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions

Definitions

  • the present application belongs to the technical field of device interaction, and in particular, relates to a window page interaction method, apparatus, electronic device, and readable storage medium.
  • operable controls can be added to the window page of the electronic device, such as a list control that can scroll up and down to switch the displayed content, or an album control that can turn pages left and right.
  • the control is attached to the window page for display, so when an interactive operation is initiated, the electronic device often cannot determine whether the target object of the interactive operation is the operable control or the window page, which reduces the accuracy of the interactive operation and degrades the user experience.
  • the embodiments of the present application provide a window page interaction method, device, electronic device, and computer-readable storage medium, which can solve the problem in existing window page interaction technology that, when a window page contains operable controls, the operation object of an interactive operation cannot be determined, resulting in low response accuracy.
  • an embodiment of the present application provides an interaction method for a window page, including:
  • in response to an interactive operation initiated by a user in a target window, determining the operation type of the interactive operation; the target window contains at least one operable control;
  • if the operation type is a first operation type, identifying the operable control as the operation object of the interactive operation, and controlling the operable control based on the interactive operation; the first operation type is an operation type associated with the operable control;
  • if the operation type is a second operation type, identifying the target window as the operation object of the interactive operation, and controlling the target window based on the interactive operation.
  • Implementing the embodiments of the present application has the following beneficial effects: when the electronic device receives an interactive operation initiated by the user, the operation type of the interactive operation is identified, and the operation object corresponding to the interactive operation is determined based on that operation type. For example, if the operation type of the interactive operation is the first operation type, it can be determined that the operation object is an operable control, and the operable control on the target window is controlled based on the interactive operation, such as by moving it in a specified direction or turning pages; otherwise, if the operation type of the interactive operation is the second operation type, the operation object can be determined to be the target window, and the target window is controlled based on the interactive operation.
  • the embodiment of the present application can distinguish, according to the operation type of the interactive operation, whether the operation object is the underlying target window or the operable control attached to the window. Even if the display area of the operable control overlaps with that of the target window, the corresponding operation object can still be distinguished and controlled, thereby improving the accuracy of the interactive operation and improving the user experience.
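For illustration only, the dispatch described above can be rendered as a minimal Java sketch; every name in it (WindowPageDispatcher, OperableControl, and so on) is a hypothetical stand-in for the claim language, not the patent's implementation:

```java
// Minimal sketch of the claimed dispatch, assuming hypothetical types;
// the patent itself does not define this code.
public class WindowPageDispatcher {

    enum OperationType { FIRST, SECOND }

    /** An operable control attached to the window, e.g. a scrollable list. */
    interface OperableControl { void handle(InteractiveOperation op); }

    /** The underlying target window, e.g. movable or resizable. */
    interface TargetWindow { void handle(InteractiveOperation op); }

    static class InteractiveOperation {
        final OperationType type;
        InteractiveOperation(OperationType type) { this.type = type; }
    }

    private final OperableControl control;
    private final TargetWindow window;

    WindowPageDispatcher(OperableControl control, TargetWindow window) {
        this.control = control;
        this.window = window;
    }

    /** Routes the operation to the control or the window by operation type. */
    void dispatch(InteractiveOperation op) {
        if (op.type == OperationType.FIRST) {
            control.handle(op);  // first type: associated with the operable control
        } else {
            window.handle(op);   // second type: the target window is the object
        }
    }
}
```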
  • before the determining of the operation type of the interactive operation in response to the interactive operation initiated by the user in the target window, the method further includes:
  • determining the interactable information of the operable control; the interactable information includes the interactable range and/or the interactable direction;
  • determining the first operation type associated with the operable control based on the interactable information.
  • the second operation type is an operation type other than the first operation type.
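As a hedged sketch of how interactable information might determine the first operation type, the following Java fragment treats a control's interactable directions as a set and classifies an operation by whether its slide direction falls inside that set; the direction model and method names are assumptions made for illustration:

```java
import java.util.EnumSet;
import java.util.Set;

// Hypothetical sketch: an operation is the first type when its slide
// direction is one of the control's interactable directions.
public class OperationClassifier {

    enum Direction { UP, DOWN, LEFT, RIGHT }

    /** Interactable info of a control: the directions it can respond to. */
    static Set<Direction> interactableDirections(boolean verticalList, boolean horizontalPager) {
        EnumSet<Direction> dirs = EnumSet.noneOf(Direction.class);
        if (verticalList)    { dirs.add(Direction.UP);   dirs.add(Direction.DOWN); }
        if (horizontalPager) { dirs.add(Direction.LEFT); dirs.add(Direction.RIGHT); }
        return dirs;
    }

    /** First type iff the slide direction is interactable; otherwise second type. */
    static boolean isFirstOperationType(Direction slideDirection, Set<Direction> interactable) {
        return interactable.contains(slideDirection);
    }

    public static void main(String[] args) {
        Set<Direction> dirs = interactableDirections(true, false); // a vertical list control
        System.out.println(isFirstOperationType(Direction.UP, dirs));   // true  -> control
        System.out.println(isFirstOperationType(Direction.LEFT, dirs)); // false -> window
    }
}
```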
  • the determining of the operation type of the interactive operation in response to the interactive operation initiated by the user in the target window includes:
  • identifying the pressing duration corresponding to the interactive operation;
  • determining the operation type of the interactive operation according to the pressing duration.
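A minimal sketch of classification by pressing duration follows. It assumes, purely for illustration, that a short press maps to the first operation type (control) and a long press to the second (window); the 500 ms threshold is an invented example value, not specified by the patent:

```java
// Hypothetical sketch: classify the operation type by pressing duration.
public class PressDurationClassifier {

    static final long LONG_PRESS_THRESHOLD_MS = 500; // illustrative value only

    enum OperationType { FIRST, SECOND }

    /** Short press -> first type (operable control); long press -> second type (window). */
    static OperationType classify(long pressDownMs, long pressUpMs) {
        long duration = pressUpMs - pressDownMs;
        return duration < LONG_PRESS_THRESHOLD_MS ? OperationType.FIRST : OperationType.SECOND;
    }

    public static void main(String[] args) {
        System.out.println(classify(0, 120)); // FIRST  -> routed to the control
        System.out.println(classify(0, 800)); // SECOND -> routed to the window
    }
}
```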
  • the controlling of the target window based on the interactive operation further includes:
  • configuring the window event identifier corresponding to the target window as a preset valid bit value.
  • the determining of the operation type of the interactive operation in response to the interactive operation initiated by the user in the target window includes:
  • acquiring the window event identifier corresponding to the target window in response to the interactive operation;
  • if the window event identifier is the valid bit value, controlling the target window based on the interactive operation.
  • the method further includes:
  • if the window event identifier is a preset invalid bit value, performing the determining of the operation type of the interactive operation.
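The window event identifier described above can be pictured as a per-gesture flag that short-circuits later events to the window; this Java sketch is a hypothetical rendering of the valid/invalid bit values, not the patent's code:

```java
// Hypothetical sketch: once the window has been identified as the operation
// object, later events in the same gesture go straight to the window.
public class WindowEventFlag {

    private boolean windowEventValid = false; // preset invalid bit value

    void onWindowIdentified()  { windowEventValid = true;  } // preset valid bit value
    void onGestureFinished()   { windowEventValid = false; } // reset for the next gesture

    /** Returns true if the event was short-circuited to the window. */
    boolean routeEvent(Runnable controlWindow, Runnable determineOperationType) {
        if (windowEventValid) {
            controlWindow.run();          // valid bit: control the window directly
            return true;
        }
        determineOperationType.run();     // invalid bit: classify the operation first
        return false;
    }
}
```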
  • if the operation type is the first operation type, the identifying of the operable control as the operation object of the interactive operation and the controlling based on the interactive operation further include:
  • configuring the control event identifier corresponding to the operable control as a preset first bit value.
  • the controlling of the target window based on the interactive operation includes:
  • if the operation type is the second operation type, determining the control event identifier corresponding to the operable control;
  • if the control event identifier is the preset first bit value, ending the response to the interactive operation;
  • if the control event identifier is a preset second bit value, performing the identifying of the target window as the operation object of the interactive operation, and controlling the target window based on the interactive operation.
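Similarly, the control event identifier can be sketched as a flag consulted when a second-type reading arrives, so that a gesture already consumed by the control does not also move the window; the bit values and names below are illustrative assumptions:

```java
// Hypothetical sketch of the control event identifier described above.
public class ControlEventFlag {

    private static final int FIRST_BIT_VALUE  = 1; // control has already responded
    private static final int SECOND_BIT_VALUE = 0; // control has not responded

    private int controlEventId = SECOND_BIT_VALUE;

    void onControlResponded() { controlEventId = FIRST_BIT_VALUE; }

    void handleSecondType(Runnable controlWindow) {
        if (controlEventId == FIRST_BIT_VALUE) {
            return;              // end responding: avoid double-handling the gesture
        }
        controlWindow.run();     // second bit value: the window is the operation object
    }
}
```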
  • an embodiment of the present application provides an interaction device for a window page, including:
  • an operation type determination unit used for determining the operation type of the interactive operation in response to the interactive operation initiated by the user in the target window; at least one operable control is included in the target window;
  • a first operation type response unit, configured to identify the operable control as the operation object of the interactive operation if the operation type is the first operation type, and control the operable control based on the interactive operation; the first operation type is the operation type associated with the operable control;
  • the second operation type response unit is configured to identify the target window as the operation object of the interactive operation if the operation type is the second operation type, and control the target window based on the interactive operation.
  • the device for interacting with a window page further includes:
  • an interactable information determining unit configured to determine the interactable information of the operable control; the interactable information includes the interactable range and/or the interactable direction;
  • a first operation type determination unit configured to determine the first operation type associated with the operable control based on the interactable information.
  • the second operation type is an operation type other than the first operation type.
  • the operation type determination unit includes:
  • a pressing duration acquiring unit, configured to identify the pressing duration corresponding to the interactive operation;
  • a pressing duration comparison unit, configured to determine the operation type of the interactive operation according to the pressing duration.
  • the second operation type response unit further includes:
  • the window event identifier configuration unit is configured to configure the window event identifier corresponding to the target window as a preset valid bit value.
  • the operation type determination unit includes:
  • a window event identifier acquisition unit configured to acquire the window event identifier corresponding to the target window in response to the interactive operation
  • a valid bit value response unit configured to control the target window based on the interactive operation if the window event identifier is the valid bit value.
  • the device for interacting with a window page further includes:
  • An invalid bit value response unit configured to perform the determining of the operation type of the interactive operation if the window event identifier is a preset invalid bit value.
  • the device for interacting with a window page further includes:
  • the control event identifier configuration unit is configured to configure the control event identifier corresponding to the operable control as a preset first bit value.
  • the second operation type response unit includes:
  • a control event identifier identification unit, configured to determine the control event identifier corresponding to the operable control if the operation type is the second operation type;
  • a first bit value response unit, configured to end responding to the interactive operation if the control event identifier is a preset first bit value;
  • a second bit value response unit, configured to perform the identifying of the target window as the operation object of the interactive operation if the control event identifier is a preset second bit value, and control the target window based on the interactive operation.
  • embodiments of the present application provide an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the window page interaction method according to any one of the above-mentioned first aspects.
  • an embodiment of the present application provides a computer-readable storage medium in which a computer program is stored, wherein, when the computer program is executed by a processor, the window page interaction method according to any one of the above-mentioned first aspects is implemented.
  • an embodiment of the present application provides a computer program product that, when the computer program product runs on an electronic device, enables the electronic device to execute the window page interaction method according to any one of the first aspect above.
  • an embodiment of the present application provides a chip system, including a processor, where the processor is coupled to a memory, and the processor executes a computer program stored in the memory to implement the window page according to any one of the first aspect interactive method.
  • FIG. 1 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 2 is a block diagram of a software structure of an electronic device according to an embodiment of the present application.
  • FIG. 3 is a schematic diagram of an existing window page;
  • FIG. 4 is a schematic diagram of an electronic device receiving an interactive operation initiated by a user
  • FIG. 5 is a schematic diagram of controlling an operable control on a window page based on an exclusive operation area;
  • Fig. 6 is the realization flow chart of the interaction method of the window page provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of a display of a multi-window page provided by an embodiment of the present application.
  • FIG. 8 is a display interface of a split-screen display provided by an embodiment of the present application.
  • FIG. 9 is a schematic diagram of an operable control provided by an embodiment of the present application.
  • FIG. 10 is a specific implementation flowchart of a window page interaction method provided by another embodiment of the present application.
  • FIG. 11 is a schematic diagram of an interactable range provided by an embodiment of the present application.
  • FIG. 12 is a schematic diagram of an interactive direction provided by an embodiment of the present application.
  • FIG. 13 is a specific implementation flowchart of a window page interaction method S601 provided by another embodiment of the present application.
  • FIG. 14 is a schematic diagram of a sliding operation provided by an embodiment of the present application.
  • FIG. 15 is a schematic diagram of an implementation of determining an operable control provided by an embodiment of the present application.
  • FIG. 16 is a schematic diagram of a jitter range provided by an embodiment of the present application;
  • FIG. 17 is a schematic flowchart of a window page provided by the second embodiment of the present application;
  • FIG. 18 is a schematic diagram of the division of an interactive operation provided by an embodiment of the present application.
  • FIG. 19 is a schematic flowchart of a window page provided by the third embodiment of the present application.
  • FIG. 21 is a response flowchart of a window page interaction method provided by an embodiment of the present application when the anti-shake condition is not met;
  • FIG. 22 is a response flowchart of a window page interaction method provided by an embodiment of the present application after the interactive operation of an operable control has been responded to;
  • FIG. 23 is a response flowchart of a window page interaction method provided by an embodiment of the present application when responding to the target window for the first time;
  • FIG. 24 is a response flowchart of a window page interaction method provided by an embodiment of the present application after the interactive operation of the target window has been responded to;
  • FIG. 25 is a structural block diagram of a display device for a page provided by an embodiment of the present application;
  • FIG. 26 is a structural block diagram of an electronic device provided by an embodiment of the present application.
  • the term "if" may be contextually interpreted as "when", "once", "in response to determining", or "in response to detecting".
  • the phrases "if it is determined" or "if the [described condition or event] is detected" may be interpreted, depending on the context, to mean "once it is determined", "in response to the determination", "once the [described condition or event] is detected", or "in response to detection of the [described condition or event]".
  • references in this specification to "one embodiment” or “some embodiments” and the like mean that a particular feature, structure or characteristic described in connection with the embodiment is included in one or more embodiments of the present application.
  • appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," "in some other embodiments," etc. in various places in this specification do not necessarily all refer to the same embodiment, but mean "one or more but not all embodiments" unless specifically emphasized otherwise.
  • the terms "including", "comprising", "having" and their variants mean "including but not limited to" unless specifically emphasized otherwise.
  • the window page interaction method provided in the embodiments of the present application can be applied to electronic devices such as mobile phones, tablet computers, wearable devices, vehicle-mounted devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPC), netbooks, and personal digital assistants (PDA).
  • the electronic device may be a station (STATION, ST) in a WLAN, a cellular phone, a cordless phone, a Session Initiation Protocol (SIP) phone, a Wireless Local Loop (WLL) station, a personal digital assistant (PDA) device, a handheld device with wireless communication capabilities, a computing device or other processing device connected to a wireless modem, a computer, a laptop computer, a handheld communication device, a handheld computing device, and/or another device for communicating on wireless systems and next-generation communication systems, for example, a mobile terminal in a 5G network or a mobile terminal in a future evolved Public Land Mobile Network (PLMN), etc.
  • FIG. 1 shows a schematic structural diagram of an electronic device 100 .
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2 , mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone jack 170D, sensor module 180, buttons 190, motor 191, indicator 192, camera 193, display screen 194, and Subscriber identification module (subscriber identification module, SIM) card interface 195 and so on.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example, the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), controller, video codec, digital signal processor (digital signal processor, DSP), baseband processor, and/or neural-network processing unit (neural-network processing unit, NPU), etc. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby increasing the efficiency of the system.
  • the processor 110 may include one or more interfaces.
  • the interface may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous transceiver (universal asynchronous transmitter) receiver/transmitter, UART) interface, mobile industry processor interface (MIPI), general-purpose input/output (GPIO) interface, subscriber identity module (SIM) interface, and / or universal serial bus (universal serial bus, USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may contain multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flash, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate with each other through the I2C bus interface, so as to realize the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • the processor 110 may contain multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communication, to sample, quantize, and encode analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is typically used to connect the processor 110 with the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 110 communicates with the camera 193 through a CSI interface, so as to realize the photographing function of the electronic device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to implement the display function of the electronic device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • the GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiments of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 charges the battery 142 , it can also supply power to the electronic device through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110.
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide a wireless communication solution including 2G/3G/4G/5G etc. applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110, and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area network (WLAN) (such as wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technology.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2 .
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
  • the electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on.
  • the electronic device 100 may include one or N display screens 194 , where N is a positive integer greater than one.
  • Display 194 may include a touch panel as well as other input devices.
  • the electronic device 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193 .
  • when the shutter is opened, light is transmitted to the photosensitive element of the camera through the lens, the optical signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object is projected through the lens to generate an optical image onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • a digital signal processor is used to process digital signals, in addition to processing digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy and so on.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos in various encoding formats, such as Moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example, to save files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area may store data (such as audio data, phone book, etc.) created during the use of the electronic device 100 and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playback, recording, etc.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • the speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals.
  • the voice can be answered by placing the receiver 170B close to the human ear.
  • the microphone 170C, also called a "mike" or a "mic", is used to convert sound signals into electrical signals.
  • the user can make a sound near the microphone 170C with the mouth to input the sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the earphone jack 170D is used to connect wired earphones.
  • the earphone interface 170D may be the USB interface 130, or may be a 3.5mm open mobile terminal platform (OMTP) standard interface, a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • the capacitive pressure sensor may be comprised of at least two parallel plates of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation with a touch operation intensity less than the first pressure threshold acts on the short message application icon, the instruction to view the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, the instruction to create a new short message is executed.
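As a hedged illustration of the pressure-threshold behavior just described, the following sketch returns a different instruction for the same touch position depending on touch intensity; the normalized threshold value and the function names are invented for the example:

```java
// Hypothetical sketch of pressure-dependent instructions on the short
// message application icon; the threshold is illustrative only.
public class PressureDispatch {

    static final float FIRST_PRESSURE_THRESHOLD = 0.6f; // assumed normalized pressure

    static String onMessageIconTouched(float pressure) {
        return pressure < FIRST_PRESSURE_THRESHOLD
                ? "view short message"        // lighter touch: view instruction
                : "create new short message"; // firmer touch: create instruction
    }

    public static void main(String[] args) {
        System.out.println(onMessageIconTouched(0.3f));
        System.out.println(onMessageIconTouched(0.9f));
    }
}
```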
  • the gyro sensor 180B may be used to determine the motion attitude of the electronic device 100 .
  • the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyro sensor 180B detects the shaking angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to offset the shaking of the electronic device 100 through reverse motion to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenarios.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist in positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 can detect the opening and closing of the flip holster using the magnetic sensor 180D.
  • the electronic device 100 can detect the opening and closing of the flip according to the magnetic sensor 180D. Further, according to the detected opening and closing state of the leather case or the opening and closing state of the flip cover, characteristics such as automatic unlocking of the flip cover are set.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally three axes).
  • the magnitude and direction of gravity can be detected when the electronic device 100 is stationary. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc.
  • the electronic device 100 can measure the distance through infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 can use the distance sensor 180F to measure the distance to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the electronic device 100 emits infrared light to the outside through the light emitting diode.
  • Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100 . When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100 .
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket, so as to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking pictures with fingerprints, answering incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect the temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the electronic device 100 reduces the performance of the processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection.
  • in some other embodiments, when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid abnormal shutdown of the electronic device 100 caused by the low temperature.
  • in some other embodiments, when the temperature is lower than still another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
  • the touch sensor 180K is also called a "touch device".
  • the touch sensor 180K may be disposed on the display screen 194 , and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the location where the display screen 194 is located.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human vocal part.
  • the bone conduction sensor 180M can also contact the pulse of the human body and receive the blood pressure beating signal.
  • the bone conduction sensor 180M can also be disposed in the earphone, combined with the bone conduction earphone.
  • the audio module 170 can analyze the voice signal based on the vibration signal of the vocal vibration bone block obtained by the bone conduction sensor 180M, so as to realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 180M, and realize the function of heart rate detection.
  • the keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys, or may be touch keys.
  • the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
  • Motor 191 can generate vibrating cues.
  • the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the motor 191 can also correspond to different vibration feedback effects for touch operations on different areas of the display screen 194 .
  • Different application scenarios for example: time reminder, receiving information, alarm clock, games, etc.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, which can be used to indicate the charging state, the change of the power, and can also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be contacted and separated from the electronic device 100 by inserting into the SIM card interface 195 or pulling out from the SIM card interface 195 .
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the plurality of cards may be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as call and data communication.
  • the electronic device 100 may employ an eSIM, i.e., an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiments of the present application take an Android system with a layered architecture as an example to exemplarily describe the software structure of the electronic device 100 .
  • FIG. 2 is a block diagram of a software structure of an electronic device according to an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime (Android runtime) and a system library, and a kernel layer.
  • the application layer can include a series of application packages.
  • the application package may include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message and so on.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include window managers, content providers, view systems, telephony managers, resource managers, notification managers, and the like.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
  • Content providers are used to store and retrieve data and make these data accessible to applications.
  • the data may include video, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications.
  • a display interface can consist of one or more views.
  • the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide the communication function of the electronic device. For example, the management of call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for the application, such as localization strings, icons, pictures, layout files, video files and so on.
  • the notification manager enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a brief pause without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of charts or scrolling text, such as notifications of applications running in the background, and can display notifications on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is issued, the electronic device vibrates, or the indicator light flashes.
  • Android Runtime includes core libraries and a virtual machine. Android runtime is responsible for scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • a system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display drivers, camera drivers, audio drivers, and sensor drivers.
  • when the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is sent to the kernel layer.
  • the kernel layer processes touch operations into raw input events (including touch coordinates, timestamps of touch operations, etc.). Raw input events are stored at the kernel layer.
  • the application framework layer obtains the original input event from the kernel layer and identifies the control corresponding to the input event. Taking, for example, a touch click operation whose corresponding control is the camera application icon: the camera application calls the interface of the application framework layer to start the camera application, which in turn starts the camera driver by calling the kernel layer.
  • the camera 193 then captures still images or video.
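  • as a rough illustration of this dispatch path, the following Java sketch models how a raw input event produced at the kernel layer could be hit-tested against registered controls at the framework layer; all class and method names here are illustrative assumptions, not actual Android framework APIs.

```java
// Hypothetical model of the kernel-to-framework dispatch described above;
// none of these classes are real Android framework APIs.
final class RawInputEvent {
    final float x, y;        // touch coordinates reported by the kernel layer
    final long timestampMs;  // timestamp of the touch operation

    RawInputEvent(float x, float y, long timestampMs) {
        this.x = x;
        this.y = y;
        this.timestampMs = timestampMs;
    }
}

interface Control {
    boolean contains(float x, float y);  // hit test against the control's bounds
    void onClick();                      // e.g. start the camera application
}

final class FrameworkDispatcher {
    private final java.util.List<Control> controls = new java.util.ArrayList<>();

    void register(Control control) {
        controls.add(control);
    }

    // The framework layer reads the raw input event and identifies the control
    // corresponding to it; the matched control's handler is then invoked.
    void dispatch(RawInputEvent event) {
        for (Control control : controls) {
            if (control.contains(event.x, event.y)) {
                control.onClick();  // e.g. the camera icon starts the camera app
                return;
            }
        }
    }
}
```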
  • an electronic device can display various types of display content through a display module, and such electronic devices include smart phones, computers, notebook computers, tablet computers, and the like.
  • the display content includes but is not limited to: static pictures, dynamic pictures, videos, interactive controls, web pages, articles, short messages, prompt messages, etc. The display content in the electronic device can be displayed in the form of window pages.
  • the window page specifically refers to the basic unit set in the graphical user interface by the application program installed in the electronic device to use the data.
  • the application program of the electronic device and the data stored locally on the electronic device are displayed in an integrated manner in the window page.
  • the electronic device can manage, generate and edit the application program or data displayed in the window page according to the operation initiated by the user.
  • menus and icons may be provided around the window page, and the data to be displayed may be shown in the central area of the window page, wherein the window page specifically refers to an operable page that displays data, content, and the like in the form of a window.
  • the displayed content of window pages has evolved from the initial non-interactive display content such as text and pictures, as shown in (a) in Figure 3, to today's highly interactive display content such as embedded video playback controls, list controls, and interactive mini-game programs, as shown in (b) in Figure 3. The forms of displayed content are increasingly rich, the interaction with users is increasingly strong, and the content can be adapted to various application scenarios.
  • each display content in the window page occupies a part of the window page for display; that is, each display content is displayed on top of the window page, so the display level of the window page is lower than that of the display content, and the display content overlays part of the window page's display area. Based on this, if the display content in the window page is an operable interactive control, when the electronic device receives an interactive operation initiated by the user on the window page, because the display area of the window page overlaps the display area of the display content, the electronic device cannot determine whether the operation object of the interactive operation is the operable control in the window page or the window page itself.
  • FIG. 4 shows a schematic diagram of an electronic device receiving an interaction operation initiated by a user.
  • the electronic device receives an upward swipe operation initiated by the user at the position 401 shown in FIG. 4. The position 401 is located on the operable control 402 in the window page displayed in the foreground of the electronic device. The operable control 402 is specifically a list control that can display a preset data list; the data list can contain N lines of data, but because the list control is limited by the display area of the window page, it can only display 3 lines of data, and the data displayed in the list control can be changed by sliding up and down.
  • when the electronic device receives the upward swipe operation on the operable control 402, it cannot determine whether the interactive operation is intended to change the data displayed in the list control or to move the window page upward. As a result, the response to the interactive operation is inaccurate and the user's operation intention cannot be recognized, which reduces the accuracy of the operation and degrades the user experience.
  • the existing interactive operation response technology can configure a corresponding exclusive operation area for the operable control: an interactive operation initiated within the exclusive operation area of the operable control has the operable control as its operation object, whereas an interactive operation initiated outside the exclusive operation area has the window page as its operation object, so as to distinguish the operation object of the interactive operation.
  • FIG. 5 shows a schematic diagram of implementing control of operable controls on a window page based on a dedicated operation area.
  • a window page is displayed in the electronic device. The window page includes a list control 501, the list control 501 includes an area 502 for displaying a data list, and the list control 501 corresponds to the exclusive operation areas 503 (next page) and 504 (previous page). If the user needs to operate the list control 501, an interactive operation can be initiated in the exclusive operation area 503. For example, when the above-mentioned exclusive operation area 503 is clicked, the data column in the list control 501 switches its display content: the originally displayed data columns 1-3 are switched to the list content of the next page, and the data columns displayed in the list control 501 change to 4-6, as shown in (b) in Figure 5. If the user initiates an interactive operation at a position outside the exclusive operation area 503, the operation object of the interactive operation is identified as the window page.
  • in this way, a corresponding dedicated operation area is configured for different operable controls, so as to determine the operation object corresponding to an interactive operation initiated on the window page.
  • this method needs to divide out a dedicated area in the window page for collecting the interactive operations of the operable controls. If the area of the exclusive operation area is large, it will greatly affect the page layout of each control and the displayed content in the window page and reduce the amount of information that the window page can accommodate; if the area of the exclusive operation area is small, the convenience of user operation will be reduced and invalid operations such as mistaken clicks will increase, thus affecting the user experience.
  • existing window pages therefore cannot simultaneously distinguish the operation objects of interactive operations and keep the window page layout compact, which affects the application of operable controls on window pages and in turn restricts the development of diverse display content on window pages.
  • the execution body of the window page interaction method may specifically be an electronic device, such as a smart phone, a tablet computer, a computer, or a smart watch. The electronic device includes a display module and an interaction module; the window page is displayed through the display module, and the user's interactive operation is obtained through the interaction module. The display module may be a module with a display function such as a display or a projector; the interaction module can be a module with a human-computer interaction function such as a mouse, a keyboard, or a controller.
  • the above-mentioned display module may also be a touch screen with a touch function, through which the interactive operations initiated by the user can be obtained.
  • FIG. 6 shows a flowchart of an implementation of a window page interaction method provided by an embodiment of the present application, which is described in detail as follows:
  • in S601, in response to the interactive operation initiated by the user in the target window, the operation type of the interactive operation is determined; the target window contains at least one operable control.
  • the electronic device may display at least one window page.
  • the window page can be generated based on an application program installed in the electronic device, such as the window page corresponding to the operation interface of the application program, or it can be generated based on the device system of the electronic device, such as a pop-up window when the electronic device receives a system notification or short message. It should be noted that the electronic device can display multiple window pages simultaneously in the display interface, and the specific number of displayable window pages can be determined based on factors such as the device type of the electronic device, the size of the display module, and the device system.
  • the electronic device displays multiple window pages at the same time, that is, the multiple window pages displayed at the same time are all in the foreground running state.
  • the electronic device can determine the above-mentioned target window according to the window page corresponding to the previously responded interactive operation. Since multiple window pages are running in the foreground, according to the operation sequence of each window page, the most recently operated window page can be used as the window page being operated, that is, the above-mentioned target window, while the other window pages can be used as window pages to be operated.
  • if the interactive operation is not a window selection operation, the window page to be operated by the interactive operation is the window page being operated; on the contrary, if the interactive operation is a window selection operation, the specified window page can be determined from the window pages to be operated according to the interactive operation, activated to the top layer of the display interface for display, and identified as the window page being operated.
  • FIG. 7 shows a schematic diagram of displaying a multi-window page provided by an embodiment of the present application.
  • a plurality of window pages may be displayed simultaneously on the display interface of the electronic device, namely the first window 701, the second window 702, and the third window 703. Among them, the window page displayed at the top layer is the first window 701; the first window 701 is the most recently operated window page, that is, the window page being operated. The second window 702 and the third window 703 are displayed at a lower level than the first window 701 and are the window pages to be operated.
  • when the electronic device receives the interactive operation initiated by the user, it determines whether the interactive operation is a window selection operation, such as clicking on the display area of the second window 702 or clicking on the icon corresponding to the second window 702. If so, the window page selected by the user is determined according to the window selection operation. For example, in this embodiment, it is detected that the user clicks on the display area of the second window 702; it is therefore determined that the user needs to operate the second window 702, the first window 701 is used as a window page to be operated, and the second window 702 is switched to the top layer for display, as shown in (b) in FIG. 7, and used as the window page being operated.
  • the electronic device may display multiple independent sub-screens.
  • FIG. 8 shows a display interface of split-screen display provided by an embodiment of the present application.
  • each sub-screen may be displayed with a corresponding window page.
  • the electronic device can determine the sub-screen to be operated by the interactive operation according to the click position of the interactive operation, and identify the window page displayed in the sub-screen as the target window.
  • the electronic device may determine the sub-screen corresponding to the interactive operation according to the coordinates of the interactive operation collected for the first time.
  • the interactive operation can be a sliding operation, and the electronic device can identify the coordinates of the first touch point corresponding to the sliding operation.
  • the sub-screen corresponding to the touch point is identified as the sub-screen to be operated by the interactive operation, and the window page displayed on that sub-screen is used as the target window.
  • the electronic device may acquire coordinate information corresponding to the above-mentioned interactive operation, and if the coordinate information falls into the display area corresponding to a certain window page, the window page is identified as the target window corresponding to the above-mentioned interactive operation.
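  • a minimal sketch of this coordinate-based resolution follows; the Rect and WindowPage helper types are assumptions, and choosing the top-most (highest z-order) page when several display areas contain the point is an assumption consistent with the multi-window description above.

```java
// Illustrative resolution of the target window from an operation's coordinates;
// Rect, WindowPage, and the z-order tie-break are assumptions.
final class Rect {
    final float left, top, right, bottom;

    Rect(float left, float top, float right, float bottom) {
        this.left = left; this.top = top; this.right = right; this.bottom = bottom;
    }

    boolean contains(float x, float y) {
        return x >= left && x < right && y >= top && y < bottom;
    }
}

final class WindowPage {
    final String name;
    final Rect displayArea;
    final int zOrder;  // larger means closer to the top layer

    WindowPage(String name, Rect displayArea, int zOrder) {
        this.name = name; this.displayArea = displayArea; this.zOrder = zOrder;
    }
}

final class TargetWindowResolver {
    // Returns the top-most window page whose display area contains the first
    // touch point of the interactive operation, or null if none contains it.
    static WindowPage resolve(java.util.List<WindowPage> pages, float x, float y) {
        WindowPage best = null;
        for (WindowPage page : pages) {
            if (page.displayArea.contains(x, y)
                    && (best == null || page.zOrder > best.zOrder)) {
                best = page;
            }
        }
        return best;
    }
}
```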
  • the target window may display operable controls.
  • the target window may also display other display contents, such as pictures, text, and videos.
  • the operable controls in the target window are specifically controls that can perform corresponding operations based on interactive operations.
  • FIG. 9 shows a schematic diagram of an operable control provided by an embodiment of the present application.
  • the target window contains a list control
  • the list control contains a list progress slider 901 for displaying the position of the currently displayed data in the list.
  • the list control can be slid up and down to change the currently displayed content;
  • the target window contains an album control, which can display a plurality of images, and is configured with an image progress slider 902 , which is used to display the position of the currently displayed image in the album.
  • the album control can change the currently displayed image according to the left and right sliding operation initiated by the user to realize image switching.
  • the operable control can also respond to various types of interactive operations, and the operation types of the interactive operations are not limited here.
  • the target window may contain one operable control, or may contain two or more operable controls.
  • if the target window does not contain any operable controls, it is not necessary to determine the operation type of the interactive operation, and the target window can be controlled directly based on the interactive operation. Based on this, before executing S601, the electronic device can first determine the number of operable controls contained in the target window; if the number of operable controls contained in the target window is 0, the target window is controlled based on the interactive operation; on the contrary, if the number of operable controls contained in the target window is not 0, the operation of determining the operation type of the interactive operation is performed.
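  • the following sketch summarizes this branching under the assumption of simple illustrative types: if the control count is 0 the window is operated directly; otherwise the operation type decides between the S602 and S603 paths.

```java
// Hedged sketch of the pre-check described above: if the target window holds no
// operable controls, the operation-type step is skipped and the window is
// controlled directly. All names are illustrative assumptions.
final class InteractionRouter {
    enum OperationType { FIRST, SECOND }

    interface TargetWindow {
        int operableControlCount();
        void applyOperation(String op);     // control the target window
        void dispatchToControl(String op);  // control the operable control
    }

    void onInteraction(TargetWindow window, String op, OperationType type) {
        if (window.operableControlCount() == 0) {
            window.applyOperation(op);      // no controls: act on the window
        } else if (type == OperationType.FIRST) {
            window.dispatchToControl(op);   // S602: operable control is the object
        } else {
            window.applyOperation(op);      // S603: the target window is the object
        }
    }
}
```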
  • the electronic device can receive the interactive operation initiated by the user through the interactive module.
  • if the interactive module is specifically a touch screen, the above-mentioned interactive operation may include touch operations such as a pressing operation and a sliding operation; if the interactive module is specifically a mouse, the above-mentioned interactive operations may include a single-click operation, a double-click operation, a scrolling operation of the mouse wheel, a frame selection operation, and the like.
  • the electronic device can collect the interactive operation initiated by the user through the interaction module, wherein the above-mentioned interaction module can be built into the electronic device.
  • for example, the interaction module can be a built-in touch screen of the electronic device; it can also be an external device connected to the electronic device through a serial interface so as to obtain the user's interactive operations, such as a mouse, a keyboard, or a controller.
  • the electronic device can acquire the interactive operation initiated by the user through the interactive module, and the interactive module can send the collected interactive operation to the processor of the electronic device. After the processor of the electronic device receives the above interactive operation, the operation type of the interactive operation will be identified first, and the corresponding response process will be executed based on the operation type.
  • the electronic device may pre-record the operation type associated with the operable control, and identify the operation type associated with the operable control as the first operation type. Similarly, the electronic device may also pre-record the operation type associated with the target window, and identify the operation type associated with the target window as the second operation type.
  • the electronic device may identify other operation types other than the first operation type as the second operation type. That is, the electronic device only needs to record the operation type associated with the operable controls, that is, the first operation type, and does not need to record the second operation type associated with the target window.
  • in this way, whether the operation type of the interactive operation is the first operation type or the second operation type may be determined.
  • the first operation types corresponding to all operable controls may be the same.
  • the electronic device may be preset with multiple standard gestures, and different standard gestures may control the operable controls of the target window to perform specified operations. Based on this, if the electronic device detects that the interactive operation initiated by the user matches any of the standard gestures, the operation type of the interactive operation is identified as the first operation type, and the operation of S602 is performed; otherwise, if the interactive operation matches none of the standard gestures, the operation type of the interactive operation is identified as the second operation type, and the operation of S603 is performed.
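  • a hedged sketch of this gesture-based classification, with the standard gestures represented as simple predicates over a sampled trajectory (a deliberate simplification; real gesture matching would be richer):

```java
import java.util.List;
import java.util.function.Predicate;

// Minimal sketch, assuming the device stores a set of preset "standard
// gestures" and classifies an operation as the first operation type only
// when it matches one of them. All names are illustrative.
final class GestureClassifier {
    enum OperationType { FIRST, SECOND }

    private final List<Predicate<float[]>> standardGestures;

    GestureClassifier(List<Predicate<float[]>> standardGestures) {
        this.standardGestures = standardGestures;
    }

    // trace: sampled coordinates of the interactive operation {x0,y0,x1,y1,...}
    OperationType classify(float[] trace) {
        for (Predicate<float[]> gesture : standardGestures) {
            if (gesture.test(trace)) {
                return OperationType.FIRST;  // matches a standard gesture -> S602
            }
        }
        return OperationType.SECOND;         // matches none -> S603
    }
}
```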
  • the first operation types corresponding to different operable controls may be different.
  • the first operation type and the second operation type are two mutually exclusive operation types, that is, if the interactive operation is not the first operation type, it must belong to the second operation type.
  • the electronic device can determine whether the interactive operation belongs to the first operation type, and if so, execute the operation of S602; on the contrary, if the interactive operation is not the first operation type, identify the interactive operation as the second operation type, and execute the operation of S603.
  • the electronic device can also determine whether the interactive operation belongs to the second operation type, and if so, execute the operation of S603; on the contrary, if the interactive operation is not the second operation type, identify the interactive operation as the first operation type, and execute S602 operation.
  • in this way, the operation type of the interactive operation can be determined only by judging whether the interactive operation matches a certain operation type, thereby improving the identification efficiency of the operation type and the response speed of the interaction.
  • the electronic device may be configured with a preset anti-shake condition, and the anti-shake condition may specifically be a jitter range.
  • the electronic device can obtain the actually generated interactive trajectory of the above-mentioned interactive operation, and use the first touch point of the interactive operation as a reference point to obtain the jitter range corresponding to the interactive operation.
  • for example, the above-mentioned jitter range is a circular area; taking the first touch point of the above interactive operation as the center of the circle, a circular area can be generated as the jitter range of the interactive operation.
  • if the interactive trajectory of the interactive operation is within the above jitter range, the interactive operation is identified as an invalid interactive operation, and there is no need to respond to it; otherwise, if the interactive trajectory of the interactive operation extends outside the above jitter range, the interactive operation is identified as a valid interactive operation, and the operation type of the interactive operation is determined.
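  • the circular anti-shake check described above can be sketched as follows; the radius value and all names are assumptions:

```java
// Hedged sketch of the circular anti-shake check: the first touch point is the
// circle center, and a trajectory that never leaves the circle is treated as
// jitter (an invalid operation). The radius value is an assumption.
final class AntiShakeFilter {
    private final float radius;  // jitter radius in pixels, e.g. 10f (assumed)

    AntiShakeFilter(float radius) {
        this.radius = radius;
    }

    // points: sampled trajectory as {x0, y0, x1, y1, ...}; the first pair is
    // the first touch point and serves as the center of the jitter range.
    boolean isValid(float[] points) {
        float cx = points[0], cy = points[1];
        for (int i = 2; i + 1 < points.length; i += 2) {
            float dx = points[i] - cx, dy = points[i + 1] - cy;
            if (dx * dx + dy * dy > radius * radius) {
                return true;  // trajectory left the jitter range: valid operation
            }
        }
        return false;         // stayed inside the jitter range: treat as shake
    }
}
```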
  • FIG. 10 shows a specific implementation flowchart of a window page interaction method provided by another embodiment of the present application.
  • the interaction method of the window page provided by the embodiment of the present application may further include: S1001 to S1002 before S601, and the specific description is as follows:
  • the method further includes:
  • the interactable information of the operable control is determined; the interactable information includes the interactable range and/or the interactable direction.
  • the electronic device may determine the first operation type according to the interactable information of the operable controls.
  • the interactable information of the operable control can be preset, and the interactable information can be recorded in the data packet corresponding to the operable control.
  • in this case, the electronic device can read the configuration segment corresponding to the data packet, so as to obtain the interactable information of the operable control.
  • when the electronic device loads the target window, the data packet of the operable control will be parsed and cached in the cache area of the memory of the electronic device. In this case, the electronic device can obtain the interactable information of the operable control from the cache area.
  • the interactable information may include an interactable range and/or an interactable direction.
  • FIG. 11 shows a schematic diagram of an interactable range provided by an embodiment of the present application.
  • for example, the operable control can be moved within the target window, and the target window can configure a corresponding movable area for the operable control; the above-mentioned movable area is the interactable range corresponding to the operable control.
  • as another example, the operable control is configured with a virtual joystick; the virtual joystick has a corresponding touch area, and the touchable area of the virtual joystick is the interactable range corresponding to the operable control.
  • FIG. 12 shows a schematic diagram of an interactable direction provided by an embodiment of the present application.
  • the operable control can move in a preset direction.
  • for example, the operable control is a list control, and the data columns displayed in the list can be changed by sliding up and down; that is, the interactable direction corresponding to the list control is the vertical direction. Because the sliding direction of an interactive operation initiated by the user cannot be guaranteed to be strictly vertical, directions within a certain deviation angle of the vertical are all recognized as movable directions. Referring to (b) in FIG. 12,
  • the operable control is specifically an album control
  • the images displayed in the album control can be switched by sliding left and right, that is, the interactive direction corresponding to the album control is the horizontal direction.
  • the electronic device can recognize a sliding operation within a certain deviation angle of the horizontal direction as a horizontal-movement sliding operation.
  • the above-mentioned interactable information specifically includes the invalid interaction direction and/or the invalid interaction range of the operable controls.
  • the electronic device may use other ranges except the invalid interaction direction and/or invalid interaction range of the operable control as the interactable direction and/or the interactable range of the operable control.
  • the operable control is specifically a photo album control, which can be moved in the left and right directions. Therefore, the movement in the vertical direction is invalid for the operable control. Based on this, the moving direction in the vertical direction can be used as the invalid direction corresponding to the operable control.
  • other directions other than the invalid direction can be used as the interactable direction of the operable control.
  • determining the interactable range of the interactable control based on the invalid interactive range can also be implemented with reference to the above manner.
  • the first operation type associated with the operable control is determined based on the interactable information.
  • the electronic device may determine the first operation type associated with the operable control according to the interactable information of the operable control. Specifically, if the interactive operation initiated by the user and collected by the electronic device matches the interactable information, for example, the operation area of the interactive operation is within the interactable range of the operable control and/or the operation direction of the interactive operation is within the coverage of the interactable direction of the operable control, the operation type of the interactive operation is identified as the first operation type.
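  • the following sketch shows one way such matching could be implemented, assuming the interactable information carries a rectangular range and an axis with a tolerance angle (the tolerance value is an assumption, echoing the "certain deviation angle" mentioned above):

```java
// Illustrative matching of an interactive operation against an operable
// control's interactable information (range and/or direction); the field
// names and the tolerance-angle scheme are assumptions.
final class InteractableInfo {
    final float left, top, right, bottom;  // interactable range
    final boolean horizontal;              // true: left-right; false: up-down
    final double toleranceDeg;             // allowed deviation from the axis

    InteractableInfo(float left, float top, float right, float bottom,
                     boolean horizontal, double toleranceDeg) {
        this.left = left; this.top = top; this.right = right; this.bottom = bottom;
        this.horizontal = horizontal; this.toleranceDeg = toleranceDeg;
    }

    boolean matches(float startX, float startY, float endX, float endY) {
        boolean inRange = startX >= left && startX < right
                       && startY >= top && startY < bottom;
        // angle near 0 deg means horizontal sliding, near 90 deg vertical sliding
        double angle = Math.toDegrees(Math.atan2(Math.abs(endY - startY),
                                                 Math.abs(endX - startX)));
        boolean inDirection = horizontal ? angle <= toleranceDeg
                                         : angle >= 90.0 - toleranceDeg;
        return inRange && inDirection;  // match -> first operation type
    }
}
```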
  • the electronic device can also obtain the interactable information corresponding to the target window.
  • the interactable information of the target window is hereinafter referred to as the second interaction information.
  • the manner of acquiring the second interaction information may be the same as the manner of determining the interactable information of the operable controls, which will not be repeated here.
  • the above-mentioned second interaction information may also include the interactable range and/or the interactable direction of the target window.
  • the electronic device may match the interaction operation with the second interaction information, and if the interaction operation matches the second interaction information, identify the interaction operation as the second operation type; On the contrary, if the interaction operation does not match the second interaction information, it is identified that the interaction operation is not of the second operation type.
  • in some embodiments, the second operation type is simply any operation type other than the first operation type. Since the first operation type corresponds to interactive operations that are effective for the operable control, an interactive operation that is invalid for the operable control must be an interactive operation directed at the target window; this ensures that control of the operable control is not affected while still distinguishing the operation objects.
  • in this way, an interactive operation that is effective for the operable control controls the operable control, while an interactive operation that is invalid for the operable control controls the target window, thereby realizing the identification of the operation object of the interactive operation.
  • FIG. 13 shows a specific implementation flowchart of a window page interaction method S601 provided by another embodiment of the present application.
  • S601 may include: S1301 to S1302, and the specific description is as follows:
  • determining the operation type of the interactive operation in response to the interactive operation initiated by the user in the target window includes:
  • the electronic device can determine the operation type corresponding to the interactive operation according to the pressing duration corresponding to the interactive operation; that is, the electronic device can identify the operation object the user intends to operate according to the pressing duration of the interactive operation initiated by the user, so as to realize intent recognition of the operation object.
  • the pressing duration corresponding to the above-mentioned interactive operation specifically refers to the pressing time corresponding to the first operation point of the interactive operation.
  • for example, if the interactive operation is a click operation, the electronic device determines the first operation point of the interactive operation, that is, the click position corresponding to the above click operation, identifies the dwell time of the click operation at the click position, and identifies that dwell time as the pressing duration corresponding to the above-mentioned interactive operation.
  • if the interactive operation is a sliding operation, the time interval between when the interactive operation starts and when the sliding starts is identified as the pressing duration corresponding to the interactive operation.
  • that is, the electronic device can determine the dwell time corresponding to the first touch point, and identify that dwell time as the pressing duration corresponding to the interactive operation.
  • the sliding operation can be divided into at least a short-press sliding type and a long-press sliding type according to the dwell time corresponding to the above-mentioned first touch point.
  • FIG. 14 shows a schematic diagram of a sliding operation provided by an embodiment of the present application.
  • for example, the first touch point is point A; sliding starts from point A and generates a sliding track 1401, in which the user's dwell time at point A is 0.3 s. The dwell time of the first touch point is less than the preset time threshold, so the sliding track 1401 is of the short-press sliding type.
  • as another example, the first touch point is point B; sliding starts from point B and generates a sliding track 1402, in which the dwell time at point B is 1 s. The dwell time of the first touch point is greater than or equal to the preset time threshold, so the sliding track 1402 is of the long-press sliding type.
  • the electronic device may identify the interactive operation whose dwell time at the first touch point is less than the preset time threshold as the first operation type, that is, the operation object of the interactive operation is an operable control.
  • the electronic device can identify an interactive operation whose dwell time at the first touch point is greater than or equal to the preset time threshold as the second operation type, that is, the operation object of the interactive operation is the target window.
  • in other embodiments, the mapping can be reversed: the electronic device may identify an interactive operation whose dwell time at the first touch point is less than the preset time threshold as the second operation type, that is, the operation object of the interactive operation is the target window, and correspondingly identify an interactive operation whose dwell time at the first touch point is greater than or equal to the preset time threshold as the first operation type, that is, the operation object of the interactive operation is an operable control.
  • the operation type of the interactive operation is determined according to the pressing duration.
  • the electronic device may determine the operation type corresponding to the interactive operation according to the different pressing durations.
  • the electronic device may be configured with at least one time threshold, so that the pressing duration is divided into at least two time intervals according to the time threshold, and different time intervals may correspond to different operation types. The electronic device can identify the time interval into which the pressing duration corresponding to the interactive operation falls, and use the operation type associated with that time interval as the operation type corresponding to the interactive operation. For example, an interactive operation whose pressing duration is less than a preset time threshold is identified as the second operation type, that is, the operation object of the interactive operation is the target window; and an interactive operation whose pressing duration is greater than or equal to the preset time threshold is identified as the first operation type, that is, the operation object of the interactive operation is an operable control.
  • the operation type corresponding to the interactive operation is determined by identifying the pressing duration of the interactive operation, so as to realize the control of different operation objects through different operation modes without additionally adding corresponding operation controls or operation areas. Therefore, the accuracy of the interactive operation can be improved, and the influence on the layout of the window page can also be avoided.
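  • a minimal sketch of the press-duration rule, assuming a single preset time threshold; whether the short press maps to the operable control or to the target window is left configurable, since both mappings appear in the embodiments above:

```java
// Hedged sketch of press-duration classification; threshold and names are
// assumptions. pressDurationMs is the dwell time at the first touch point.
final class PressDurationClassifier {
    enum OperationType { FIRST, SECOND }

    private final long thresholdMs;            // preset time threshold (assumed)
    private final boolean shortPressIsControl; // which mapping is configured

    PressDurationClassifier(long thresholdMs, boolean shortPressIsControl) {
        this.thresholdMs = thresholdMs;
        this.shortPressIsControl = shortPressIsControl;
    }

    OperationType classify(long pressDurationMs) {
        boolean isShortPress = pressDurationMs < thresholdMs;
        if (isShortPress == shortPressIsControl) {
            return OperationType.FIRST;  // operation object: operable control
        }
        return OperationType.SECOND;     // operation object: target window
    }
}
```

  • for example, with an assumed 500 ms threshold and shortPressIsControl set to true, a 0.3 s dwell maps the operation to the operable control and a 1 s dwell maps it to the target window, matching the short-press and long-press sliding tracks described above.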
  • if the operation type is the first operation type, the operable control is identified as the operation object of the interactive operation, and the operable control is controlled based on the interactive operation; the first operation type is the operation type associated with the operable control.
  • when the electronic device detects that the interactive operation matches the first operation type associated with the operable control, it indicates that the operation object corresponding to the interactive operation is the operable control. Therefore, the electronic device can control the operable control. For example, if the interactive operation is to slide up 50 pixels, the operable control can be moved up by 50 pixels; of course, if the interactive operation is a close operation, the operable control can be removed from the target window. The control operation of the operable control is thus determined according to the actual operation content of the interactive operation; the operation content and the control effect of the operable control are not limited here.
  • the target window may also contain two or more operable controls.
  • different operable controls correspond to different operation types.
  • the operation types for the operable controls are collectively referred to as the first operation type
  • the operation types for the target window are collectively referred to as the second operation type. Therefore, if the target window contains multiple operable controls, when it is determined that the operation object of the interactive operation is an operable control, that is, the operation type of the interactive operation is the first operation type, the operable control corresponding to the interactive operation can be further identified.
  • FIG. 15 shows a schematic diagram of an implementation of determining an operable control provided by an embodiment of the present application.
  • the manner in which the electronic device determines the operable control corresponding to the interactive operation may be as follows:
  • the electronic device can obtain the characteristic coordinates of each operable control, and calculate a position deviation value according to the characteristic coordinates and the coordinate information of the interactive operation; of course, the coordinates of the boundary point on the operable control closest to the first touch point of the interactive operation can also be obtained, and the position deviation value calculated according to the coordinates of the boundary point and the coordinates of the first touch point. The operable control corresponding to the interactive operation is then determined according to the position deviation value.
  • alternatively, the electronic device can calculate the display ratio of each operable control and select the operable control with the highest display ratio as the operation object corresponding to the interactive operation; if there are multiple operable controls with a display ratio of 100%, that is, in a fully displayed state, the distance between each fully displayed operable control and the center coordinates of the display interface is calculated, and the operable control with the smallest distance is selected as the operation object of the interactive operation.
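  • the display-ratio heuristic with the distance tie-break can be sketched as below; the Candidate fields are assumed stand-ins for the control metadata the device would actually hold:

```java
import java.util.List;

// Hedged sketch of the selection heuristic described above: prefer the control
// with the highest display ratio; among equally visible controls, prefer the
// one closest to the display-interface center. Names are assumptions.
final class ControlPicker {
    static final class Candidate {
        final String id;
        final double displayRatio;  // visible fraction, 1.0 = fully displayed
        final float cx, cy;         // characteristic (e.g. center) coordinates

        Candidate(String id, double displayRatio, float cx, float cy) {
            this.id = id; this.displayRatio = displayRatio;
            this.cx = cx; this.cy = cy;
        }
    }

    static Candidate pick(List<Candidate> candidates, float screenCx, float screenCy) {
        Candidate best = null;
        for (Candidate c : candidates) {
            if (best == null
                    || c.displayRatio > best.displayRatio
                    || (c.displayRatio == best.displayRatio
                        && dist2(c, screenCx, screenCy) < dist2(best, screenCx, screenCy))) {
                best = c;
            }
        }
        return best;
    }

    private static double dist2(Candidate c, float x, float y) {
        double dx = c.cx - x, dy = c.cy - y;
        return dx * dx + dy * dy;
    }
}
```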
  • different operable controls may correspond to different operation types.
  • the operation type corresponding to operable control A is the first operation type A
  • the operation type corresponding to operable control B is the first operation type B.
  • the electronic device can determine whether the operation type of the interactive operation is the second operation type, the first operation type A, or the first operation type B, and thereby determine which operable control corresponds to the interactive operation. That is, when the operation type of the interactive operation is determined, the operable control to be operated can be determined from among the plurality of operable controls.
  • FIG. 16 shows a schematic diagram of a jitter range provided by an embodiment of the present application.
  • the electronic device may be preset with a jitter range. If the interactive operation is within the above-mentioned jitter range, the interactive operation is identified as an invalid operation, such as interactive operation 1; otherwise, if the interactive operation extends outside the jitter range, the interactive operation is identified as a valid operation, such as interactive operation 2. The electronic device may not respond to an invalid interactive operation, but responds to a valid interactive operation.
  • the electronic device can control the operable controls based on the acquired interactive operations while collecting the user's interactive operations. Based on this, the electronic device can be set with multiple collection periods. In order to realize real-time response, the period of the above collection period can be short, for example, it can be 0.1s.
  • the operation time of the interactive operation initiated by the user can include N collection periods.
  • after collecting the interactive operation corresponding to the nth (n is a positive integer less than N) collection period, the device can identify the operation type of the interactive operation and control the corresponding operation object based on the interactive operation.
  • the operation type of the interactive operation is the first operation type
  • the operation object of the interactive operation is an operable control.
  • when entering the (n+1)th collection period, the electronic device can control the operable control based on the interactive operation already acquired, and at the same time collect the interactive operation initiated by the user in the (n+1)th collection period. When all the interactive operations have been collected, the electronic device controls the operable control based on the interactive operation obtained in the Nth collection period, thereby achieving the purpose of responding to the operable control in real time.
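  • a rough sketch of this periodic collection loop, assuming a 0.1 s period as in the example above; each period applies the movement collected so far, so the operable control responds before the gesture completes:

```java
// Hedged sketch of periodic collection: the gesture is consumed in short
// collection periods and applied to the operation object each period.
// The 100 ms period and all names are illustrative assumptions.
final class PeriodicCollector {
    interface OperationObject {
        void apply(float dx, float dy);  // respond to this period's movement
    }

    void run(java.util.Iterator<float[]> samples, OperationObject target)
            throws InterruptedException {
        while (samples.hasNext()) {
            float[] delta = samples.next();    // movement collected this period
            target.apply(delta[0], delta[1]);  // respond before the gesture ends
            Thread.sleep(100);                 // next collection period (~0.1 s)
        }
    }
}
```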
  • the target window is identified as the operation object of the interactive operation, and the target window is controlled based on the interactive operation.
  • when the electronic device detects that the interactive operation is not the first operation type associated with the operable control, or determines that the interactive operation is the second operation type associated with the target window, it indicates that the operation object corresponding to the interactive operation is the target window. Therefore, the electronic device can control the target window based on the interactive operation. For example, if the interactive operation is to slide up 50 pixels, the target window can be moved up by 50 pixels; of course, if the interactive operation is a closing operation, the target window can be closed. The control operation of the target window is thus determined according to the actual operation content of the interactive operation; the operation content and the control effect are not limited here.
  • the method of controlling the target window based on the interactive operation is the same as the method of controlling the operable controls.
  • it can be seen that the window page interaction method can identify the operation type of the interactive operation when the electronic device receives the interactive operation initiated by the user, and determine the operation object corresponding to the interactive operation based on the operation type.
  • for example, if the operation type of the interactive operation is the first operation type, it can be determined that the operation object corresponding to the interactive operation is an operable control, and the operable control on the target window is controlled based on the above interactive operation, for example, moved in a specified direction.
  • if the operation type of the interactive operation is the second operation type, the operation object corresponding to the interactive operation can be determined to be the target window, and the target window can be controlled based on the above-mentioned interactive operation.
  • the embodiment of the present application can distinguish, according to the operation type of the interactive operation, whether the operation object is the underlying target window or an operable control attached to the window. Even if the display area of the operable control overlaps the display area of the target window, the corresponding operation objects can be distinguished and controlled, thereby improving the accuracy of the interactive operation and improving the user experience.
  • since the embodiment of the present application does not need to configure a corresponding dedicated operation area in the operable control when determining the operation object of the interactive operation, and can distinguish different operation objects according to the operation type of the interactive operation, it does not affect the layout of the operable controls on the window page, reduces the layout difficulty of the window page, and improves the utilization efficiency of the area within the window page. Moreover, the above-mentioned interactive operation is not limited to a certain designated area, which further improves the flexibility of the interactive operation.
  • Embodiment 2:
  • FIG. 17 shows a schematic flowchart of a window page interaction method provided by the second embodiment of the present application.
  • a window event identifier is configured in the interaction method of the window page provided by the second embodiment.
  • the window event identifier of the target window is configured after responding to an interactive operation whose operation object is the target window; before identifying the operation type of a subsequent interactive operation, the bit value of the window event identifier is checked, and different response processes are executed based on the bit value of the window event identifier.
  • the interaction method of the window page provided by the second embodiment of the present application includes: S1701-S1706:
  • the window event identifier corresponding to the target window is acquired.
  • the electronic device may acquire the window event identifier of the target window displayed on the current display interface.
  • the window event identifier is used to determine whether the operation object of the last associated interactive operation is the target window.
  • the electronic device may determine whether there is an association between the interactive operation and the previous interactive operation. For example, when the user needs to drag the target window, he can first drag to the left and then drag upward; that is, the above two sliding operations are associated interactive operations. In this case, the electronic device will recognize that the operation objects of the two interactive operations are the same.
  • the electronic device may be configured with an association time threshold. If the electronic device detects that the time interval between any two interactive operations is within the above association time threshold, it identifies the two interactive operations as associated interactive operations; if the time interval between the above-mentioned two interactive operations exceeds the association time threshold, the two interactive operations are identified as non-associated interactive operations, in which case the operation type of the interactive operation is obtained again, and the operation object corresponding to the interactive operation is determined based on the operation type.
  • the electronic device may be configured with multiple collection periods, the interactive operation initiated by the user may be a continuous interactive operation, and the electronic device may, based on the above collection period, divide the continuous interactive operation initiated by the user into For multiple associated interaction operations, the operation objects of the associated interaction operations are identified as the same operation object.
  • if the electronic device recognizes that the user's interactive operation is interrupted, for example, the finger leaves the touch screen or the click button of the mouse is released, the continuous interactive operation is recognized as complete; at this time, the above-mentioned window event identifier can be reset.
  • that is, the window event identifier is configured as a preset invalid bit value, so that the electronic device can re-determine the operation object of the next interactive operation.
  • FIG. 18 shows a schematic diagram of the division of an interaction operation provided by an embodiment of the present application.
  • the interactive operation is specifically a sliding operation, which will generate a touch track on the touch screen.
  • the electronic device can collect the above touch track according to the collection period or the preset effective moving distance during the process of collecting the touch track.
  • the above continuous interactive operation will be identified as multiple interactive operations of short duration or short moving distance, which are called sub-interactions here. That is, when the user initiates a continuous interactive operation, the electronic device will obtain a plurality of sub-interactions and perform the operation of S1701 on each sub-interaction.
  • all of the above sub-interactions belonging to the same continuous interactive operation are identified as associated interactive operations, and when the continuous interactive operation is completed, the electronic device can reset the above-mentioned window event identifier.
  • the electronic device may allocate different threads to the target window and the operable controls to process corresponding operations.
  • the electronic device will first hand over the event of the interactive operation to the thread of the target window for processing.
  • the thread of the target window will obtain the window event identifier and, based on the bit value of the window event identifier, determine whether to continue processing through the thread of the target window (for example, when the window event identifier has a valid bit value) or to hand the event over to the thread of the operable control (for example, when the window event identifier has an invalid bit value).
  • when the electronic device recognizes that the window event identifier has an invalid bit value, it means that at the previous moment there was no associated operation controlling the target window. In this case, the operation object corresponding to the interactive operation needs to be re-identified; based on this, the operation type of the interactive operation is determined, and the operation object of the interactive operation is determined according to the operation type. For a specific description, refer to the relevant description of S601, which will not be repeated here.
  • if the operation type is the first operation type, the operable control is identified as the operation object of the interactive operation, and the operable control is controlled based on the interactive operation; the first operation type is the operation type associated with the operable control.
  • the target window is identified as the operation object of the interactive operation, and the target window is controlled based on the interactive operation.
  • since S1703 is identical to S602 and S1704 is identical to S603, for the specific implementation process of S1703 and S1704, reference can be made to the relevant descriptions of S602 and S603, which will not be repeated here.
  • the window event identifier corresponding to the target window is configured as a preset valid bit value.
  • after controlling the target window based on the interactive operation, the electronic device can configure the window event identifier corresponding to the target window to the preset valid bit value. When the electronic device subsequently collects another interactive operation associated with the interactive operation, it can continue to use the target window as the operation object of the other interactive operation and control the target window without re-identifying the operation object, thereby maintaining the continuity of interactive operations.
  • the method further includes: if the electronic device does not receive a new interactive operation within a preset valid time, resetting the window event identifier to a preset invalid bit value .
  • the method further includes: if the electronic device recognizes that the above-mentioned interactive operation has been completed (for example, the sliding operation ends, the finger leaves the touch screen, or the drag operation ends, releasing the click button, etc. ), the window event identifier is reset to a preset invalid bit value.
  • when the electronic device controls the target window based on the interactive operation, it will determine whether the interactive operation is a valid operation for the target window, that is, whether the target window can respond based on the interactive operation. Some interactive operations are invalid for the target window; for example, if the target window is already located at the far left of the screen and a left-shift operation is initiated, the target window cannot respond based on the interactive operation, so the interactive operation is an invalid operation for the target window.
  • if the interactive operation is an invalid operation, the window event identifier is configured as the invalid bit value; otherwise, if the above-mentioned interactive operation is a valid operation, the operation of S1705 is performed.
  • the target window is controlled based on the interactive operation.
  • the electronic device determines that the window event identifier is a valid bit value, it indicates that the currently collected interactive operation has a last associated interactive operation, and the operation object of the last associated interactive operation is the target window , in this case, it is not necessary to determine the operation type of the interactive operation, but the target window can be controlled according to the interactive operation.
  • the window event tag by configuring the window event tag, it is possible to identify the type of the interactive operation collected this time when there is an associated last interactive operation, and use the operation object of the last interactive operation as the operation object.
  • the operation object of this operation improves the continuity of the interactive operation, avoids the occurrence of wrong switching of the operation object, and improves the accuracy of the operation.
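The S1701-S1706 flow can be condensed into a small state machine. The following is a minimal sketch only, assuming invented names (TargetWindowHandler, windowEventFlag, InteractiveOperation, and so on) and arbitrary bit values; it is not an implementation taken from the patent.

```java
// Minimal sketch of the embodiment-2 flow; all identifiers are assumptions.
interface InteractiveOperation {
    boolean isFinished();  // e.g. finger lifted or mouse button released
}

public class TargetWindowHandler {
    private static final int VALID_BIT = 1;    // assumed "valid bit value"
    private static final int INVALID_BIT = 0;  // assumed "invalid bit value"
    private int windowEventFlag = INVALID_BIT;

    /** S1701: entry point for every collected interactive operation. */
    public void onInteractiveOperation(InteractiveOperation op) {
        if (windowEventFlag == VALID_BIT) {
            // S1706: an associated earlier operation already targeted the
            // window, so control it directly without re-identification.
            controlTargetWindow(op);
        } else if (isFirstOperationType(op)) {
            controlOperableControl(op);                   // S1702 + S1703
        } else {
            boolean responded = controlTargetWindow(op);  // S1704
            // S1705: mark the window only if the operation was valid for it.
            windowEventFlag = responded ? VALID_BIT : INVALID_BIT;
        }
        if (op.isFinished()) {
            windowEventFlag = INVALID_BIT;  // reset once the gesture ends
        }
    }

    // Stubs standing in for the behavior described in S602/S603; window
    // control returns false when the window cannot respond (invalid operation).
    private boolean isFirstOperationType(InteractiveOperation op) { return false; }
    private void controlOperableControl(InteractiveOperation op) { }
    private boolean controlTargetWindow(InteractiveOperation op) { return true; }
}
```

The timeout-based reset described above would additionally clear windowEventFlag when no new operation arrives within the preset valid time, for example via a delayed task.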
Embodiment 3:

FIG. 19 shows a schematic flowchart of the window page method provided by the third embodiment of this application. Compared with Embodiment 1, the interaction method of the third embodiment configures a control event identifier: after an interactive operation whose operation object is an operable control has been responded to, the control event identifier of that control is configured; and after a later interactive operation is recognized as not being of the first operation type of the operable control, the bit value of the control event identifier is read, and a different response flow is executed depending on that value. Specifically, the interaction method of the window page provided by the third embodiment of this application includes S1901-S1906:

In S1901, in response to an interactive operation initiated by the user in the target window, the operation type of the interactive operation is determined; the target window contains at least one operable control.

In S1902, if the operation type is the first operation type, the operable control is identified as the operation object of the interactive operation, and the operable control is controlled based on the interactive operation; the first operation type is the operation type associated with the operable control. The implementation of S1902 is exactly the same as that of S602; for details, reference may be made to the relevant description of S602, which is not repeated here.
In S1903, the control event identifier corresponding to the operable control is configured as a preset first value. After identifying the operable control as the operation object and responding to the interactive operation, the electronic device can configure the control event identifier of the operable control to the preset first value; when the identifier holds the first value, it is in a valid state. If the electronic device subsequently collects another interactive operation associated with this one, it can continue to take the operable control as the operation object of the other operation and control the control without re-identifying the operation object, so the continuity of the interaction is maintained. The control event identifier is thus used to determine whether the operation object of the previous associated interactive operation was the operable control.

In one possible implementation, after S1903 the method further includes: if the electronic device does not receive a new interactive operation within a preset valid time, the control event identifier is reset to a preset second value. In another possible implementation, after S1903 the method further includes: if the electronic device recognizes that the interactive operation has been completed (for example, a slide ends and the finger leaves the touchscreen, or a drag ends and the click button is released), the control event flag is reset to the preset second value.

In S1904, if the operation type is the second operation type, the control event identifier corresponding to the operable control is determined. When the electronic device determines that the interactive operation is of the second operation type, it can obtain the value of the control event identifier and perform a different response operation depending on that value: if the identifier is the first value, S1905 is performed; otherwise, if it is the second value, S1906 is performed. In one possible implementation, the first operation type is determined based on the interactable information of the operable control, that is, the first operation type is an operation type the operable control can respond to; correspondingly, the second operation type is an operation type the operable control cannot respond to.
In S1905, if the control event identifier is the preset first value, the response to the interactive operation is ended. If the control event identifier holds the preset first value, the currently collected interactive operation is associated with the previously collected one, and the operation object of both is the operable control. Because the operation collected this time is an invalid interactive operation for the operable control, the response to it is directly terminated.

In S1906, if the control event identifier is the preset second value, the target window is identified as the operation object of the interactive operation, and the target window is controlled based on the interactive operation. If the control event identifier holds the preset second value, there is no association between the currently collected interactive operation and the previously collected one, and the operation type of the interactive operation is not the first operation type associated with the operable control; in that case, the target window can be identified as the operation object and controlled through the interactive operation.

In one possible implementation, if the electronic device has configured a corresponding window event identifier for the target window, the window event identifier can also be configured to the preset valid bit value here, so that subsequently collected interactive operations identify the target window as the operation object.

In this embodiment of the application, by configuring the control event flag, the operation object of the previous interactive operation can be taken as the operation object of the current one whenever an associated previous operation exists, which improves the continuity of the interaction, avoids wrong switching of the operation object, and improves the accuracy of the operation.
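For symmetry with the window-event sketch above, the S1901-S1906 flow might look like the following; again, every name and value is an illustrative assumption rather than the patent's own code, and InteractiveOperation reuses the illustrative type from the earlier sketch.

```java
// Minimal sketch of the embodiment-3 flow; all identifiers are assumptions.
public class OperableControlHandler {
    private static final int FIRST_VALUE = 1;   // assumed "first value"
    private static final int SECOND_VALUE = 0;  // assumed "second value"
    private int controlEventFlag = SECOND_VALUE;

    public void onInteractiveOperation(InteractiveOperation op) {
        if (isFirstOperationType(op)) {          // S1901 + S1902
            controlOperableControl(op);
            controlEventFlag = FIRST_VALUE;      // S1903
        } else if (controlEventFlag == SECOND_VALUE) {
            controlTargetWindow(op);             // S1904 + S1906
        }
        // S1905: a second-type operation while the flag holds FIRST_VALUE
        // falls through here without any response.
        if (op.isFinished()) {
            controlEventFlag = SECOND_VALUE;     // reset once the gesture ends
        }
    }

    private boolean isFirstOperationType(InteractiveOperation op) { return false; }
    private void controlOperableControl(InteractiveOperation op) { }
    private void controlTargetWindow(InteractiveOperation op) { }
}
```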
Embodiment 4:

FIG. 20 shows a schematic flowchart of the window page method provided by the fourth embodiment of this application. Compared with the previous embodiments, the interaction method of the fourth embodiment is configured with both a control event identifier and a window event identifier; for the functions of the two identifiers, reference may be made to the descriptions of the second and third embodiments, which are not repeated here. Specifically, the interaction method of the window page provided by the fourth embodiment of this application includes S2001-S2008:

In S2001, in response to the interactive operation, the window event identifier corresponding to the target window is acquired.

In S2002, if the window event identifier is the preset invalid bit value, then before determining the operation type of the interactive operation, it can be determined whether the operation satisfies a preset anti-shake condition, that is, whether it moves beyond a preset jitter range. If so, the interactive operation is identified as a valid operation and the determining of its operation type is performed; otherwise, if the operation does not leave the jitter range, it is identified as an invalid operation and the response to it is ended, as sketched below.
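The anti-shake condition reduces to a distance test against the first touch point. Below is a minimal sketch, assuming a circular jitter region and an invented 24-pixel radius; the patent does not fix a threshold.

```java
// Debounce ("anti-shake") check sketched for S2002; the radius is assumed.
public final class AntiShake {
    private static final float JITTER_RADIUS_PX = 24f;  // illustrative value

    /** Returns true once the trajectory leaves the preset jitter range. */
    public static boolean exceedsJitterRange(float downX, float downY,
                                             float curX, float curY) {
        return Math.hypot(curX - downX, curY - downY) > JITTER_RADIUS_PX;
    }
}
```

An operation whose whole trajectory stays inside this circle would be treated as an invalid operation and its response ended, exactly as S2002 describes.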
In S2003, if the operation type is the first operation type, the control event identifier corresponding to the operable control is configured as the preset first value; the first operation type is the operation type associated with the operable control.

In S2004, the operable control is identified as the operation object of the interactive operation, and the operable control is controlled based on the interactive operation.

In S2005, if the operation type is the second operation type, the control event identifier corresponding to the operable control is determined; if the control event identifier is the preset first value, the response to the interactive operation is ended.

In S2006, if the control event identifier is the preset second value, S2007 is performed.

In S2007, the target window is identified as the operation object of the interactive operation, and the target window is controlled based on the interactive operation. The implementation of S2007 is exactly the same as that of S1906; for details, reference may be made to the relevant description of S1906, which is not repeated here. In one possible implementation, S2007 may include: if the interactive operation is a valid operation for the target window, the window event identifier corresponding to the target window is configured as the preset valid bit value and the corresponding response operation is performed; otherwise, the response operation is ended.

In S2008, if the window event identifier is the valid bit value, the target window is controlled based on the interactive operation. The implementation of S2008 is exactly the same as that of S1706; for details, reference may be made to the relevant description of S1706, which is not repeated here. In one possible implementation, S2008 may likewise include: the window event identifier corresponding to the target window is kept at the preset valid bit value and the corresponding response operation is performed; if the operation is invalid for the target window, the response operation is ended.
FIG. 21 shows the response flow of the window page interaction method provided by an embodiment of this application when the anti-shake condition is not met. Referring to FIG. 21, the flow specifically includes the following steps:
1. When the electronic device receives the interactive operation and obtains the invalid bit value of the window event identifier, it performs anti-shake identification on the interactive operation.
2. The electronic device determines that the operation trajectory of the interactive operation does not leave the preset anti-shake range, that is, the operation does not meet the preset anti-shake condition, so the response to it is ended.
FIG. 22 shows the response flow of the window page interaction method provided by an embodiment of this application after an interactive operation targeting the operable control has been responded to, that is, after the control event identifier has been configured as the first value. Referring to FIG. 22, the flow specifically includes the following steps:
1. After receiving the interactive operation, the electronic device hands it to the thread of the target window for processing. On receiving the dispatched event, the thread of the target window obtains the window event identifier and determines that it holds the invalid bit value; it therefore hands the event of the interactive operation to the thread of the operable control, which performs anti-shake identification on the operation.
2. Through the thread of the operable control, the electronic device judges that the operation trajectory leaves the preset anti-shake range, that is, the operation satisfies the anti-shake condition, so the thread of the operable control further determines the operation type of the interactive operation.
3. If the operation type is recognized as the first operation type, the control event identifier can be configured as the first value, and the operable control is controlled based on the interactive operation.
4. While the control event identifier remains at the first value, subsequent associated interactive operations received by the electronic device are all handed to the thread of the operable control for processing, with the operable control identified as the operation object, until the event of the interactive operation is completed.
5. If the thread of the operable control recognizes that a collected interactive operation is of the second operation type, it reads the control event identifier and, on recognizing that it holds the first value, does not respond to this interactive operation.
FIG. 23 shows the response flow of the window page interaction method provided by an embodiment of this application when the target window is responded to for the first time, that is, when the window event identifier is configured as the invalid bit value and the control event identifier is configured as the second value. Referring to FIG. 23, the flow specifically includes the following steps:
1. After receiving the interactive operation, the electronic device hands it to the thread of the target window for processing. On receiving the dispatched event, the thread of the target window obtains the window event identifier and determines that it holds the invalid bit value; it therefore hands the event of the interactive operation to the thread of the operable control, which performs anti-shake identification on the operation.
2. Through the thread of the operable control, the electronic device judges that the operation trajectory leaves the preset anti-shake range, that is, the operation satisfies the anti-shake condition, so the thread of the operable control further determines the operation type of the interactive operation.
3. When the thread of the operable control recognizes that the operation type of the interactive operation is the second operation type, it reads the control event identifier and, on recognizing that it holds the second value, hands the event of the interactive operation over to the thread of the target window for processing.
4. Through the thread of the target window, the electronic device can configure the window event identifier as the valid bit value and control the target window based on the interactive operation.
FIG. 24 shows the response flow of the window page interaction method provided by an embodiment of this application after an interactive operation targeting the target window has been responded to, that is, after the window event identifier has been configured as the valid bit value. Referring to FIG. 24, the flow specifically includes the following steps:
1. After receiving the interactive operation, the electronic device hands it to the thread of the target window for processing. On receiving the dispatched event, the thread of the target window obtains the window event identifier, determines that it holds the valid bit value, and continues processing the event itself.
2. The electronic device controls the target window through the thread of the target window until the event of the interactive operation ends.
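The four flowcharts of FIGS. 21-24 can be combined into one dispatcher over the two flags. The sketch below models the window-thread/control-thread hand-off as plain method calls for brevity; in a real Android implementation each side would own its own Looper/Handler, and every identifier here is an illustrative assumption, reusing the InteractiveOperation and AntiShake types from the earlier sketches.

```java
// Combined sketch of the FIG. 21-24 flows; names and values are assumed.
public class DualFlagDispatcher {
    private int windowEventFlag = 0;   // 0 = invalid bit, 1 = valid bit
    private int controlEventFlag = 0;  // 0 = second value, 1 = first value

    public void dispatch(InteractiveOperation op,
                         float downX, float downY, float curX, float curY) {
        if (windowEventFlag == 1) {                       // FIG. 24 path
            controlTargetWindow(op);                      // window thread
            return;
        }
        // Handed to the control thread: debounce first (FIG. 21 path).
        if (!AntiShake.exceedsJitterRange(downX, downY, curX, curY)) {
            return;                                       // invalid operation
        }
        if (isFirstOperationType(op)) {                   // FIG. 22 path
            controlEventFlag = 1;
            controlOperableControl(op);
        } else if (controlEventFlag == 0) {               // FIG. 23 path
            windowEventFlag = 1;                          // back to window thread
            controlTargetWindow(op);
        }
        // else: second type while controlEventFlag == 1 -> no response.
    }

    private boolean isFirstOperationType(InteractiveOperation op) { return false; }
    private void controlOperableControl(InteractiveOperation op) { }
    private void controlTargetWindow(InteractiveOperation op) { }
}
```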
Embodiment 5:

FIG. 25 shows a structural block diagram of the window page interaction apparatus provided by an embodiment of this application. For convenience of description, only the parts related to this embodiment are shown. Referring to FIG. 25, the window page interaction apparatus includes:
an operation type determination unit 251, configured to determine, in response to an interactive operation initiated by the user in the target window, the operation type of the interactive operation, the target window containing at least one operable control;
a first operation type response unit 252, configured to, if the operation type is the first operation type, identify the operable control as the operation object of the interactive operation and control the operable control based on the interactive operation, the first operation type being the operation type associated with the operable control; and
a second operation type response unit 253, configured to, if the operation type is the second operation type, identify the target window as the operation object of the interactive operation and control the target window based on the interactive operation.
Optionally, the window page interaction apparatus further includes:
an interactable information determination unit, configured to determine the interactable information of the operable control, the interactable information including an interactable range and/or an interactable direction; and
a first operation type determination unit, configured to determine, based on the interactable information, the first operation type associated with the operable control.
Optionally, the second operation type is any operation type other than the first operation type.
Optionally, the operation type determination unit 251 includes:
a pressing duration acquisition unit, configured to identify the pressing duration corresponding to the interactive operation; and
a pressing duration comparison unit, configured to determine the operation type of the interactive operation according to the pressing duration.
Optionally, the second operation type response unit 253 further includes:
a window event identifier configuration unit, configured to configure the window event identifier corresponding to the target window as the preset valid bit value.
Optionally, the operation type determination unit 251 includes:
a window event identifier acquisition unit, configured to acquire, in response to the interactive operation, the window event identifier corresponding to the target window; and
a valid bit value response unit, configured to control the target window based on the interactive operation if the window event identifier is the valid bit value.
Optionally, the window page interaction apparatus further includes:
an invalid bit value response unit, configured to perform the determining of the operation type of the interactive operation if the window event identifier is the preset invalid bit value.
Optionally, the window page interaction apparatus further includes:
a control event identifier configuration unit, configured to configure the control event identifier corresponding to the operable control as the preset first value.
Optionally, the second operation type response unit 253 includes:
a control event identifier identification unit, configured to determine the control event identifier corresponding to the operable control if the operation type is the second operation type;
a first value response unit, configured to end the response to the interactive operation if the control event identifier is the preset first value; and
a second value response unit, configured to, if the control event identifier is the preset second value, identify the target window as the operation object of the interactive operation and control the target window based on the interactive operation.
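The unit division of FIG. 25 maps naturally onto ordinary object composition. The following is only one hypothetical rendering; the interface and type names are invented for illustration and are not the apparatus's actual code.

```java
// Hypothetical rendering of the FIG. 25 units as plain Java collaborators.
enum OperationType { FIRST, SECOND }

interface OperationTypeDeterminationUnit {       // unit 251
    OperationType determine(InteractiveOperation op);
}
interface FirstOperationTypeResponseUnit {       // unit 252
    void controlOperableControl(InteractiveOperation op);
}
interface SecondOperationTypeResponseUnit {      // unit 253
    void controlTargetWindow(InteractiveOperation op);
}

final class WindowPageInteractionApparatus {
    private final OperationTypeDeterminationUnit determiner;
    private final FirstOperationTypeResponseUnit firstResponder;
    private final SecondOperationTypeResponseUnit secondResponder;

    WindowPageInteractionApparatus(OperationTypeDeterminationUnit d,
                                   FirstOperationTypeResponseUnit f,
                                   SecondOperationTypeResponseUnit s) {
        determiner = d;
        firstResponder = f;
        secondResponder = s;
    }

    void onInteractiveOperation(InteractiveOperation op) {
        if (determiner.determine(op) == OperationType.FIRST) {
            firstResponder.controlOperableControl(op);
        } else {
            secondResponder.controlTargetWindow(op);
        }
    }
}
```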
The window page interaction apparatus provided by this embodiment of the application can likewise identify the operation type of an interactive operation when the electronic device receives one initiated by the user, and determine the corresponding operation object from that type. If the operation type is the first operation type, the operation object is determined to be the operable control, and the operable control on the target window is controlled based on the interactive operation, for example moved in a specified direction or paged; conversely, if the operation type is the second operation type, the operation object is determined to be the target window, which is controlled based on the interactive operation. Compared with existing interaction response technology, this embodiment can distinguish, from the operation type of the interactive operation, whether the operation object is the underlying target window or an operable control attached to the window. Even if the display areas of the operable control and the target window overlap, the corresponding operation object can still be distinguished and controlled, which improves the accuracy of the interaction and the user experience.
FIG. 26 is a schematic structural diagram of an electronic device according to an embodiment of this application. As shown in FIG. 26, the electronic device 26 of this embodiment includes at least one processor 260 (only one is shown in FIG. 26), a memory 261, and a computer program 262 stored in the memory 261 and runnable on the at least one processor 260; when the processor 260 executes the computer program 262, the steps in any of the above embodiments of the window page interaction method are implemented.
The electronic device 26 may be a computing device such as a desktop computer, a notebook, a palmtop computer, or a cloud server. The electronic device may include, but is not limited to, the processor 260 and the memory 261. Those skilled in the art will understand that FIG. 26 is only an example of the electronic device 26 and does not constitute a limitation on it; the device may include more or fewer components than shown, combine certain components, or use different components, and may for example also include input and output devices, network access devices, and the like.
The processor 260 may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
In some embodiments, the memory 261 may be an internal storage unit of the electronic device 26, such as a hard disk or memory of the electronic device 26; in other embodiments it may be an external storage device of the electronic device 26, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card. The memory 261 may also include both an internal storage unit and an external storage device of the electronic device 26. The memory 261 is used to store the operating system, application programs, a boot loader, data, and other programs, such as the program code of the computer program, and may also be used to temporarily store data that has been output or will be output.
Embodiments of this application further provide an electronic device comprising at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor, the processor implementing the steps in any of the foregoing method embodiments when executing the computer program.
Embodiments of this application further provide a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps in the foregoing method embodiments.
Embodiments of this application further provide a computer program product which, when run on a mobile terminal, causes the mobile terminal to implement the steps in the foregoing method embodiments when executing it.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. On this understanding, this application implements all or part of the processes of the above method embodiments by instructing the relevant hardware through a computer program, which may be stored in a computer-readable storage medium; when executed by a processor, the computer program implements the steps of the above method embodiments. The computer program includes computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable medium may include at least: any entity or apparatus capable of carrying the computer program code to the photographing device/electronic device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, and a software distribution medium, for example a USB flash drive, a removable hard disk, a magnetic disk, or an optical disc. In some jurisdictions, under legislation and patent practice, computer-readable media may not be electrical carrier signals and telecommunication signals.
In the embodiments described above, the disclosed apparatus/network device and method may be implemented in other ways. The apparatus/network device embodiments described above are merely illustrative; for example, the division into modules or units is only a division by logical function, and in actual implementation there may be other divisions: multiple units or components may be combined or integrated into another system, and some features may be omitted or not implemented. The mutual coupling, direct coupling, or communication connection shown or discussed may be through some interfaces, or an indirect coupling or communication connection of apparatuses or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.

Abstract

A window page interaction method and apparatus, an electronic device, and a readable storage medium, applicable to the field of device interaction technology. The method includes: in response to an interactive operation initiated by a user in a target window, determining the operation type of the interactive operation, the target window containing at least one operable control (S601); if the operation type is a first operation type, identifying the operable control as the operation object of the interactive operation and controlling the operable control based on the interactive operation, the first operation type being the operation type associated with the operable control (S602); and if the operation type is a second operation type, identifying the target window as the operation object of the interactive operation and controlling the target window based on the interactive operation (S603). The above technical solution solves the problem that existing window page interaction technology cannot determine the operation object of an interactive operation when the window page contains an operable control, resulting in a low response accuracy rate.

Description

Window page interaction method and apparatus, electronic device, and readable storage medium
This application claims priority to the Chinese patent application No. 202011644145.7, entitled "Window page interaction method and apparatus, electronic device, and readable storage medium", filed with the State Intellectual Property Office on December 30, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
This application belongs to the field of device interaction technology, and in particular relates to a window page interaction method and apparatus, an electronic device, and a readable storage medium.
Background
With the continuous development of electronic device technology, the kinds and forms of content displayed on electronic devices have steadily grown richer, and the window page, as one of the main carriers of display content on an electronic device, is used ever more widely in the field of page display.
To meet current display requirements, operable controls can be added inside a window page, such as a list control that scrolls up and down to switch the displayed content, or an album control that pages left and right. However, because such operable controls are displayed attached on top of the window page, when an interactive operation is initiated the electronic device often cannot decide whether the target object of the operation is the operable control or the window page, which reduces the accuracy of the interaction and degrades the user experience.
Summary
Embodiments of this application provide a window page interaction method and apparatus, an electronic device, and a computer-readable storage medium, which can solve the problem that existing window page interaction technology cannot determine the operation object of an interactive operation when the window page contains an operable control, resulting in a low response accuracy rate.
According to a first aspect, an embodiment of this application provides a window page interaction method, including:
in response to an interactive operation initiated by a user in a target window, determining the operation type of the interactive operation, the target window containing at least one operable control;
if the operation type is a first operation type, identifying the operable control as the operation object of the interactive operation and controlling the operable control based on the interactive operation, the first operation type being the operation type associated with the operable control; and
if the operation type is a second operation type, identifying the target window as the operation object of the interactive operation and controlling the target window based on the interactive operation.
Implementing the embodiments of this application has the following beneficial effects: when the electronic device receives an interactive operation initiated by the user, it identifies the operation type of the interactive operation and determines the corresponding operation object from that type. For example, if the operation type is the first operation type, the operation object is determined to be the operable control, and the operable control on the target window is controlled based on the interactive operation, for example moved in a specified direction or paged; conversely, if the operation type is the second operation type, the operation object is determined to be the target window, which is controlled based on the interactive operation. Compared with existing interaction response technology, the embodiments of this application can distinguish, from the operation type of the interactive operation, whether the operation object is the underlying target window or an operable control attached to the window. Even if the display areas of the operable control and the target window overlap, the corresponding operation object can still be distinguished and controlled, which improves the accuracy of the interaction and the user experience.
In a possible implementation of the first aspect, before the determining of the operation type of the interactive operation in response to the interactive operation initiated by the user in the target window, the method further includes:
determining interactable information of the operable control, the interactable information including an interactable range and/or an interactable direction; and
determining, based on the interactable information, the first operation type associated with the operable control.
In a possible implementation of the first aspect, the second operation type is any operation type other than the first operation type.
In a possible implementation of the first aspect, the determining of the operation type of the interactive operation in response to the interactive operation initiated by the user in the target window includes:
identifying a pressing duration corresponding to the interactive operation; and
determining the operation type of the interactive operation according to the pressing duration.
In a possible implementation of the first aspect, the identifying of the target window as the operation object of the interactive operation and the controlling of the target window based on the interactive operation if the operation type is the second operation type further include:
configuring a window event identifier corresponding to the target window as a preset valid bit value.
In a possible implementation of the first aspect, the determining of the operation type of the interactive operation in response to the interactive operation initiated by the user in the target window includes:
acquiring, in response to the interactive operation, the window event identifier corresponding to the target window; and
if the window event identifier is the valid bit value, controlling the target window based on the interactive operation.
In a possible implementation of the first aspect, after the acquiring of the window event identifier corresponding to the target window in response to the interactive operation, the method further includes:
if the window event identifier is a preset invalid bit value, performing the determining of the operation type of the interactive operation.
In a possible implementation of the first aspect, after the identifying of the operable control as the operation object of the interactive operation and the controlling of the operable control based on the interactive operation if the operation type is the first operation type, the method further includes:
configuring a control event identifier corresponding to the operable control as a preset first value.
In a possible implementation of the first aspect, the identifying of the target window as the operation object of the interactive operation and the controlling of the target window based on the interactive operation if the operation type is the second operation type include:
if the operation type is the second operation type, determining the control event identifier corresponding to the operable control;
if the control event identifier is the preset first value, ending the response to the interactive operation; and
if the control event identifier is a preset second value, performing the identifying of the target window as the operation object of the interactive operation and controlling the target window based on the interactive operation.
According to a second aspect, an embodiment of this application provides a window page interaction apparatus, including:
an operation type determination unit, configured to determine, in response to an interactive operation initiated by a user in a target window, the operation type of the interactive operation, the target window containing at least one operable control;
a first operation type response unit, configured to, if the operation type is a first operation type, identify the operable control as the operation object of the interactive operation and control the operable control based on the interactive operation, the first operation type being the operation type associated with the operable control; and
a second operation type response unit, configured to, if the operation type is a second operation type, identify the target window as the operation object of the interactive operation and control the target window based on the interactive operation.
In a possible implementation of the second aspect, the window page interaction apparatus further includes:
an interactable information determination unit, configured to determine the interactable information of the operable control, the interactable information including an interactable range and/or an interactable direction; and
a first operation type determination unit, configured to determine, based on the interactable information, the first operation type associated with the operable control.
In a possible implementation of the second aspect, the second operation type is any operation type other than the first operation type.
In a possible implementation of the second aspect, the operation type determination unit includes:
a pressing duration acquisition unit, configured to identify the pressing duration corresponding to the interactive operation; and
a pressing duration comparison unit, configured to determine the operation type of the interactive operation according to the pressing duration.
In a possible implementation of the second aspect, the second operation type response unit further includes:
a window event identifier configuration unit, configured to configure the window event identifier corresponding to the target window as a preset valid bit value.
In a possible implementation of the second aspect, the operation type determination unit includes:
a window event identifier acquisition unit, configured to acquire, in response to the interactive operation, the window event identifier corresponding to the target window; and
a valid bit value response unit, configured to control the target window based on the interactive operation if the window event identifier is the valid bit value.
In a possible implementation of the second aspect, the window page interaction apparatus further includes:
an invalid bit value response unit, configured to perform the determining of the operation type of the interactive operation if the window event identifier is a preset invalid bit value.
In a possible implementation of the second aspect, the window page interaction apparatus further includes:
a control event identifier configuration unit, configured to configure the control event identifier corresponding to the operable control as a preset first value.
In a possible implementation of the second aspect, the second operation type response unit includes:
a control event identifier identification unit, configured to determine the control event identifier corresponding to the operable control if the operation type is the second operation type;
a first value response unit, configured to end the response to the interactive operation if the control event identifier is the preset first value; and
a second value response unit, configured to, if the control event identifier is a preset second value, identify the target window as the operation object of the interactive operation and control the target window based on the interactive operation.
According to a third aspect, an embodiment of this application provides an electronic device including a memory, a processor, and a computer program stored in the memory and runnable on the processor, wherein the processor, when executing the computer program, implements the window page interaction method of any item of the first aspect.
According to a fourth aspect, an embodiment of this application provides a computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the window page interaction method of any item of the first aspect.
According to a fifth aspect, an embodiment of this application provides a computer program product which, when run on an electronic device, causes the electronic device to perform the window page interaction method of any item of the first aspect.
According to a sixth aspect, an embodiment of this application provides a chip system including a processor coupled to a memory, the processor executing a computer program stored in the memory to implement the window page interaction method of any item of the first aspect.
It can be understood that, for the beneficial effects of the second to sixth aspects, reference may be made to the relevant description in the first aspect, which is not repeated here.
Brief Description of the Drawings
FIG. 1 is a schematic structural diagram of an electronic device provided by an embodiment of this application;
FIG. 2 is a block diagram of the software structure of an electronic device according to an embodiment of this application;
FIG. 3 is a schematic diagram of an existing window page;
FIG. 4 is a schematic diagram of an electronic device receiving an interactive operation initiated by a user;
FIG. 5 is a schematic diagram of controlling an operable control of a window page through a dedicated operation area;
FIG. 6 is an implementation flowchart of a window page interaction method provided by an embodiment of this application;
FIG. 7 is a schematic diagram of the display of multiple window pages provided by an embodiment of this application;
FIG. 8 is a split-screen display interface provided by an embodiment of this application;
FIG. 9 is a schematic diagram of operable controls provided by an embodiment of this application;
FIG. 10 is a specific implementation flowchart of a window page interaction method provided by another embodiment of this application;
FIG. 11 is a schematic diagram of interactable ranges provided by an embodiment of this application;
FIG. 12 is a schematic diagram of interactable directions provided by an embodiment of this application;
FIG. 13 is a specific implementation flowchart of step S601 of the window page interaction method provided by a further embodiment of this application;
FIG. 14 is a schematic diagram of slide operations provided by an embodiment of this application;
FIG. 15 is a schematic diagram of determining the operable control provided by an embodiment of this application;
FIG. 16 is a schematic diagram of a jitter range provided by an embodiment of this application;
FIG. 17 is a schematic flowchart of the window page method provided by the second embodiment of this application;
FIG. 18 is a schematic diagram of the division of an interactive operation provided by an embodiment of this application;
FIG. 19 is a schematic flowchart of the window page method provided by the third embodiment of this application;
FIG. 20 is a schematic flowchart of the window page method provided by the fourth embodiment of this application;
FIG. 21 is a response flowchart of the window page interaction method when the anti-shake condition is not met, provided by an embodiment of this application;
FIG. 22 is a response flowchart of the window page interaction method after an interactive operation targeting the operable control has been responded to, provided by an embodiment of this application;
FIG. 23 is a response flowchart of the window page interaction method when the target window is responded to for the first time, provided by an embodiment of this application;
FIG. 24 is a response flowchart of the window page interaction method after an interactive operation targeting the target window has been responded to, provided by an embodiment of this application;
FIG. 25 is a structural block diagram of the page display apparatus provided by an embodiment of this application;
FIG. 26 is a structural block diagram of an electronic device provided by an embodiment of this application.
Detailed Description
In the following description, for the purpose of illustration rather than limitation, specific details such as particular system structures and techniques are set forth so that the embodiments of this application can be thoroughly understood. However, it should be clear to those skilled in the art that this application can also be implemented in other embodiments without these specific details. In other cases, detailed descriptions of well-known systems, apparatuses, circuits, and methods are omitted so that unnecessary detail does not obscure the description of this application.
It should be understood that, when used in the specification and the appended claims of this application, the term "comprise" indicates the presence of the described features, wholes, steps, operations, elements, and/or components, but does not exclude the presence or addition of one or more other features, wholes, steps, operations, elements, components, and/or collections thereof.
It should also be understood that the term "and/or" used in the specification and the appended claims of this application refers to any and all possible combinations of one or more of the associated listed items, and includes these combinations.
As used in the specification and the appended claims of this application, the term "if" may, depending on the context, be interpreted as "when" or "once" or "in response to determining" or "in response to detecting". Similarly, the phrase "if it is determined" or "if [the described condition or event] is detected" may, depending on the context, be interpreted to mean "once it is determined" or "in response to determining" or "once [the described condition or event] is detected" or "in response to detecting [the described condition or event]".
In addition, in the description of the specification and the appended claims of this application, the terms "first", "second", "third", and the like are used only to distinguish descriptions and cannot be understood as indicating or implying relative importance.
References in this specification to "one embodiment" or "some embodiments" and the like mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of this application. Thus, the phrases "in one embodiment", "in some embodiments", "in some other embodiments", "in still other embodiments", and the like appearing in different places in this specification do not necessarily all refer to the same embodiment, but mean "one or more but not all embodiments", unless otherwise specifically emphasized. The terms "comprising", "including", "having", and their variants all mean "including but not limited to", unless otherwise specifically emphasized.
The window page interaction method provided by the embodiments of this application can be applied to electronic devices such as mobile phones, tablet computers, wearable devices, in-vehicle devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPC), netbooks, and personal digital assistants (PDA); the embodiments of this application place no restriction on the specific type of the electronic device.
For example, the electronic device may be a station (ST) in a WLAN, a cellular phone, a cordless phone, a Session Initiation Protocol (SIP) phone, a wireless local loop (WLL) station, a personal digital assistant (PDA) device, a handheld device with wireless communication capability, a computing device or another processing device connected to a wireless modem, a computer, a laptop computer, a handheld communication device, a handheld computing device, and/or another device for communicating over a wireless system, as well as a device of a next-generation communication system, for example a mobile terminal in a 5G network or a mobile terminal in a future evolved public land mobile network (PLMN).
FIG. 1 shows a schematic structural diagram of the electronic device 100. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It can be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the electronic device 100. In other embodiments of this application, the electronic device 100 may include more or fewer components than shown, combine or split certain components, or arrange the components differently; the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, for example an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU); different processing units may be independent devices or integrated in one or more processors. The controller can generate operation control signals according to instruction operation codes and timing signals, controlling instruction fetching and execution.
A memory may further be provided in the processor 110 to store instructions and data. In some embodiments this memory is a cache that holds instructions or data the processor 110 has just used or uses cyclically; if the processor 110 needs them again, they can be called directly from this memory, avoiding repeated access, reducing the processor's waiting time, and improving system efficiency.
In some embodiments, the processor 110 may include one or more interfaces, such as an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface.
The I2C interface is a bidirectional synchronous serial bus with a serial data line (SDA) and a serial clock line (SCL). In some embodiments the processor 110 may contain multiple groups of I2C buses and may couple the touch sensor 180K, a charger, a flash, the camera 193, and the like through different I2C bus interfaces; for example, by coupling the touch sensor 180K through an I2C interface, the processor 110 and the touch sensor 180K communicate over the I2C bus to implement the touch function of the electronic device 100.
The I2S interface can be used for audio communication. In some embodiments the processor 110 may contain multiple groups of I2S buses and may be coupled with the audio module 170 through an I2S bus to communicate with it; the audio module 170 may also pass audio signals to the wireless communication module 160 through the I2S interface, enabling answering calls through a Bluetooth headset.
The PCM interface can likewise be used for audio communication, sampling, quantizing, and encoding analog signals. In some embodiments the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface, and the audio module 170 may also pass audio signals to the wireless communication module 160 through the PCM interface, again enabling answering calls through a Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communication; it may be a bidirectional bus and converts the data to be transmitted between serial and parallel communication. In some embodiments the UART interface is typically used to connect the processor 110 with the wireless communication module 160, for example to communicate with the Bluetooth module in it to implement the Bluetooth function; the audio module 170 may also pass audio signals to the wireless communication module 160 through the UART interface, enabling playing music through a Bluetooth headset.
The MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193, and includes a camera serial interface (CSI), a display serial interface (DSI), and the like. In some embodiments the processor 110 communicates with the camera 193 through the CSI interface to implement the shooting function of the electronic device 100, and with the display screen 194 through the DSI interface to implement the display function.
The GPIO interface can be configured by software as a control signal or a data signal. In some embodiments it can be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like, and it can also be configured as an I2C, I2S, UART, or MIPI interface.
The USB interface 130 conforms to the USB standard specification and may specifically be a Mini USB, Micro USB, or USB Type-C interface. It can be used to connect a charger to charge the electronic device 100, to transfer data between the electronic device 100 and peripheral devices, to connect headphones and play audio through them, or to connect other electronic devices such as AR devices.
It can be understood that the interface connection relationships between the modules illustrated in this embodiment are only schematic and do not constitute a structural limitation on the electronic device 100; in other embodiments the electronic device 100 may also adopt interface connection manners different from those of the above embodiment, or a combination of several of them.
The charging management module 140 receives charging input from a charger, which may be wireless or wired. In some wired-charging embodiments it receives the input of a wired charger through the USB interface 130; in some wireless-charging embodiments it receives wireless charging input through a wireless charging coil of the electronic device 100. While charging the battery 142, it can also supply power to the electronic device through the power management module 141.
The power management module 141 connects the battery 142 and the charging management module 140 to the processor 110, receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, the wireless communication module 160, and the like. It can also monitor parameters such as battery capacity, battery cycle count, and battery health (leakage, impedance). In some other embodiments the power management module 141 may be provided in the processor 110; in still others the power management module 141 and the charging management module 140 may be provided in the same device.
The wireless communication function of the electronic device 100 can be implemented through antenna 1, antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and so on.
Antenna 1 and antenna 2 transmit and receive electromagnetic wave signals. Each antenna in the electronic device 100 can cover one or more communication frequency bands, and different antennas can be multiplexed to improve utilization, for example multiplexing antenna 1 as a diversity antenna of the wireless local area network; in other embodiments an antenna can be used in combination with a tuning switch.
The mobile communication module 150 can provide wireless communication solutions applied on the electronic device 100, including 2G/3G/4G/5G, and may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like. It can receive electromagnetic waves from antenna 1, filter and amplify them, and send them to the modem processor for demodulation; it can also amplify signals modulated by the modem processor and radiate them out through antenna 1. In some embodiments at least some of its functional modules may be provided in the processor 110, or in the same device as at least some modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator modulates a low-frequency baseband signal to be sent into a medium/high-frequency signal; the demodulator demodulates a received electromagnetic wave signal into a low-frequency baseband signal and passes it to the baseband processor, after whose processing it is passed to the application processor, which outputs sound signals through audio devices (not limited to the speaker 170A and receiver 170B) or displays images or video through the display screen 194. In some embodiments the modem processor may be an independent device; in others it may be independent of the processor 110 and provided in the same device as the mobile communication module 150 or other functional modules.
The wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area networks (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), the global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technology. It may be one or more devices integrating at least one communication processing module; it receives electromagnetic waves via antenna 2, frequency-modulates and filters the signals, and sends the processed signals to the processor 110, and it can also receive signals to be sent from the processor 110, frequency-modulate and amplify them, and radiate them out through antenna 2.
In some embodiments, antenna 1 of the electronic device 100 is coupled with the mobile communication module 150 and antenna 2 with the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technology, which may include GSM, GPRS, CDMA, WCDMA, TD-SCDMA, LTE, BT, GNSS, WLAN, NFC, FM, and/or IR technology; the GNSS may include the global positioning system (GPS), GLONASS, the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
The electronic device 100 implements the display function through the GPU, the display screen 194, the application processor, and so on. The GPU is a microprocessor for image processing, connects the display screen 194 and the application processor, and performs mathematical and geometric calculations for graphics rendering; the processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 displays images, video, and the like, and includes a display panel, which may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), MiniLED, MicroLED, Micro-OLED, quantum dot light-emitting diodes (QLED), and so on. In some embodiments the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1; the display screen 194 may include a touch panel and other input devices.
The electronic device 100 can implement the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and so on. The ISP processes data fed back by the camera 193: when a photo is taken, the shutter opens, light is passed through the lens to the camera's photosensitive element, the optical signal is converted into an electrical signal, and the photosensitive element passes it to the ISP to be converted into an image visible to the naked eye; the ISP can also algorithmically optimize the noise, brightness, and skin tone of the image, and optimize parameters such as exposure and color temperature of the shooting scene. In some embodiments the ISP may be provided in the camera 193.
The camera 193 captures static images or video: an optical image of an object is generated through the lens and projected onto the photosensitive element, which may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor; the element converts the optical signal into an electrical signal and passes it to the ISP, which converts it into a digital image signal and outputs it to the DSP for processing, which converts it into an image signal in a standard format such as RGB or YUV. In some embodiments the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor processes digital signals, not only digital image signals but others as well; for example, when the electronic device 100 selects a frequency point, the DSP performs a Fourier transform on the frequency point energy. The video codec compresses or decompresses digital video; the electronic device 100 may support one or more video codecs and can thus play or record video in multiple encoding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, and MPEG4.
The NPU is a neural-network (NN) computing processor that processes input information quickly by drawing on the structure of biological neural networks, for example the transfer mode between neurons of the human brain, and can also continuously self-learn; applications such as intelligent cognition of the electronic device 100, for example image recognition, face recognition, speech recognition, and text understanding, can be implemented through the NPU.
The external memory interface 120 can connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100; the external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example saving files such as music and video in the external memory card.
The internal memory 121 can store computer-executable program code, which includes instructions. It may include a program storage area, which can store the operating system and the application programs required by at least one function (such as a sound playing function and an image playing function), and a data storage area, which can store data created during the use of the electronic device 100 (such as audio data and a phone book). In addition, the internal memory 121 may include a high-speed random access memory and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS). By running instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor, the processor 110 executes the various functional applications and data processing of the electronic device 100.
The electronic device 100 can implement audio functions, such as music playing and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset jack 170D, the application processor, and so on.
The audio module 170 converts digital audio information into an analog audio signal output and converts analog audio input into a digital audio signal, and can also encode and decode audio signals. In some embodiments it may be provided in the processor 110, or some of its functional modules may be provided in the processor 110.
The speaker 170A, also called a "horn", converts an audio electrical signal into a sound signal; the electronic device 100 can play music or take a hands-free call through the speaker 170A. The receiver 170B, also called an "earpiece", likewise converts an audio electrical signal into a sound signal; when the electronic device 100 answers a call or a voice message, the voice can be heard by placing the receiver 170B close to the ear. The microphone 170C, also called a "mouthpiece" or "voice tube", converts a sound signal into an electrical signal; when making a call or sending a voice message, the user can speak with the mouth close to the microphone 170C to input the sound signal. The electronic device 100 may be provided with at least one microphone 170C; in other embodiments two microphones 170C can be provided to additionally implement noise reduction, and in still others three, four, or more to also identify sound sources and implement directional recording.
The headset jack 170D connects wired headphones and may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The pressure sensor 180A senses pressure signals and can convert them into electrical signals; in some embodiments it is provided on the display screen 194. There are many kinds of pressure sensors, such as resistive, inductive, and capacitive; a capacitive pressure sensor may include at least two parallel plates with conductive material, and when force acts on it the capacitance between the electrodes changes, from which the electronic device 100 determines the intensity of the pressure. When a touch operation acts on the display screen 194, the electronic device 100 detects its intensity through the pressure sensor 180A and can also calculate the touch position from the sensor's detection signal. In some embodiments, touch operations at the same position but with different intensities may correspond to different instructions: for example, a touch on the short message application icon with intensity below a first pressure threshold executes an instruction to view the message, while one with intensity at or above the threshold executes an instruction to create a new message.
The gyroscope sensor 180B can determine the motion posture of the electronic device 100, in some embodiments its angular velocity around three axes (x, y, and z), and can be used for image stabilization while shooting: when the shutter is pressed it detects the shaking angle of the device, calculates the distance the lens module needs to compensate, and lets the lens counteract the shake through reverse motion. It can also be used for navigation and somatosensory game scenarios.
The barometric pressure sensor 180C measures air pressure; in some embodiments the electronic device 100 calculates altitude from its measurements to assist positioning and navigation. The magnetic sensor 180D includes a Hall sensor; the electronic device 100 can use it to detect the opening and closing of a flip holster or, on a flip phone, of the cover itself, and set features such as automatic unlocking on flip accordingly. The acceleration sensor 180E detects the magnitude of acceleration in all directions (generally three axes), can detect the magnitude and direction of gravity when the device is stationary, and can be used to identify the device posture for landscape/portrait switching, pedometers, and similar applications.
The distance sensor 180F measures distance by infrared or laser; in some shooting scenarios the electronic device 100 can use it for fast focusing. The proximity light sensor 180G may include, for example, a light-emitting diode (LED), which may be infrared, and a light detector such as a photodiode: the device emits infrared light outward through the LED and detects reflected light from nearby objects with the photodiode, determining that there is an object near the electronic device 100 when sufficient reflected light is detected and that there is none when it is insufficient. It can thereby detect that the user is holding the device close to the ear during a call and automatically turn off the screen to save power, and it is also used for automatic unlock/lock in holster mode and pocket mode.
The ambient light sensor 180L senses ambient brightness; the electronic device 100 can adaptively adjust the brightness of the display screen 194 accordingly, automatically adjust the white balance when taking photos, and cooperate with the proximity light sensor 180G to detect whether the device is in a pocket to prevent accidental touches. The fingerprint sensor 180H collects fingerprints, with whose characteristics the electronic device 100 can implement fingerprint unlocking, application-lock access, fingerprint photographing, fingerprint call answering, and so on.
The temperature sensor 180J detects temperature. In some embodiments the electronic device 100 executes a temperature processing strategy using its readings: for example, when a reported temperature exceeds a threshold, the device reduces the performance of a processor near the sensor to lower power consumption and implement thermal protection; when the temperature is below another threshold, it heats the battery 142 to avoid an abnormal low-temperature shutdown; and when the temperature is below yet another threshold, it boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
The touch sensor 180K, also called a "touch device", may be provided on the display screen 194, together with which it forms a touchscreen, also called a "touch screen". It detects touch operations acting on or near it and can pass them to the application processor to determine the touch event type; visual output related to the touch operation can be provided through the display screen 194. In other embodiments the touch sensor 180K may be provided on the surface of the electronic device 100 at a position different from the display screen 194.
The bone conduction sensor 180M can acquire vibration signals, in some embodiments the vibration signal of the vibrating bone of the human vocal part, and can also contact the human pulse to receive blood pressure beat signals; it may also be provided in a headset, combined into a bone conduction headset. The audio module 170 can parse out a voice signal from the vibration signal acquired by the bone conduction sensor 180M to implement the voice function, and the application processor can parse heart rate information from the blood pressure beat signal to implement heart rate detection.
The keys 190 include a power key, volume keys, and so on, and may be mechanical keys or touch keys; the electronic device 100 can receive key input and generate key signal input related to the user settings and function control of the device.
The motor 191 can generate vibration prompts and can be used for incoming-call vibration and touch vibration feedback. Touch operations acting on different applications (for example photographing or audio playing) can correspond to different vibration feedback effects, as can touch operations acting on different areas of the display screen 194 and different application scenarios (for example time reminders, receiving messages, alarm clocks, and games); the touch vibration feedback effect can also be customized.
The indicator 192 may be an indicator light, used to indicate the charging status and battery changes, or messages, missed calls, notifications, and the like.
The SIM card interface 195 connects a SIM card, which can be inserted into or pulled out of the interface to contact or separate from the electronic device 100. The electronic device 100 can support 1 or N SIM card interfaces, N being a positive integer greater than 1; the interface can support Nano SIM, Micro SIM, and SIM cards, multiple cards of the same or different types can be inserted into the same interface at the same time, and the interface is also compatible with different types of SIM cards and with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments the electronic device 100 uses an eSIM, that is, an embedded SIM card, which can be embedded in the electronic device 100 and cannot be separated from it.
The software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. The embodiments of this application take the Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100.
FIG. 2 is a block diagram of the software structure of the electronic device according to an embodiment of this application. The layered architecture divides the software into several layers, each with a clear role and division of labor, communicating through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
The application layer may include a series of application packages; as shown in FIG. 2, these may include camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message, and other applications.
The application framework layer provides an application programming interface (API) and a programming framework for the applications of the application layer and includes some predefined functions. As shown in FIG. 2, it may include a window manager, content providers, a view system, a telephony manager, a resource manager, a notification manager, and the like. The window manager manages window programs and can obtain the display size, determine whether there is a status bar, lock the screen, capture the screen, and so on. Content providers store and retrieve data and make it accessible to applications; the data may include video, images, audio, calls made and received, browsing history and bookmarks, the phone book, and so on. The view system includes visual controls, such as controls that display text and controls that display pictures, and can be used to build applications; a display interface may consist of one or more views, for example an interface including a short-message notification icon may include a view displaying text and a view displaying a picture. The telephony manager provides the communication functions of the electronic device, for example management of call status (connected, hung up, and so on). The resource manager provides applications with various resources, such as localized strings, icons, pictures, layout files, and video files. The notification manager lets applications display notification information in the status bar and can convey notification-type messages that disappear automatically after a short stay without user interaction, for example notifying that a download is complete or reminding about a message; it can also present notifications in the status bar at the top of the system in the form of charts or scrolling-bar text, for example notifications of applications running in the background, or on the screen in the form of dialog windows, for example prompting text information in the status bar, sounding a prompt tone, vibrating the device, or flashing the indicator light.
The Android runtime includes core libraries and a virtual machine and is responsible for the scheduling and management of the Android system. The core libraries contain two parts: the function functions that the Java language needs to call, and the core libraries of Android. The application layer and the application framework layer run in the virtual machine, which executes their Java files as binary files and performs functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system libraries may include multiple functional modules, for example a surface manager, media libraries, a three-dimensional graphics processing library (such as OpenGL ES), and a 2D graphics engine (such as SGL). The surface manager manages the display subsystem and provides the fusion of 2D and 3D layers for multiple applications. The media libraries support playback and recording of many common audio and video formats as well as static image files, and can support multiple audio/video encoding formats such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG. The three-dimensional graphics processing library implements three-dimensional graphics drawing, image rendering, composition, layer processing, and so on; the 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is the layer between hardware and software and contains at least the display driver, the camera driver, the audio driver, and the sensor driver.
The workflow of the software and hardware of the electronic device 100 is illustrated below with a photographing scenario. When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is sent to the kernel layer, which processes the touch operation into a raw input event (including information such as touch coordinates and the timestamp of the touch operation) and stores it in the kernel layer. The application framework layer obtains the raw input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation being a tap whose corresponding control is the camera application icon as an example, the camera application calls the interface of the application framework layer to start the camera application, which in turn starts the camera driver by calling the kernel layer and captures a static image or video through the camera 193.
In the prior art, an electronic device can display various types of display content through a display module. Electronic devices include smartphones, computers, notebook computers, tablet computers, and other types; display content includes but is not limited to static pictures, dynamic pictures, video, interactive controls, web pages, articles, short messages, and prompt information, and the display content in an electronic device can be presented in the form of window pages. A window page specifically refers to a basic unit that an application installed on the electronic device sets up in the graphical user interface in order to use data. The applications of the electronic device and the data stored locally on it are displayed in an integrated way inside the window page, where the electronic device can manage, generate, and edit the displayed applications or data according to operations initiated by the user.
In one possible implementation, menus and icons may be arranged around the window page, and the data to be shown can be displayed in its central area; a window page here specifically means an operable page that displays data, content, and the like in the form of a window. With the continuous development of electronic technology, the types of content shown in window pages have become richer and richer. Illustratively, FIG. 3 shows a schematic diagram of an existing window page: its display content has evolved from the non-interactive content of the early days, such as text and pictures, as in (a) of FIG. 3, to today's highly interactive content such as embedded video playback controls, list controls, and interactive mini-game programs, as in (b) of FIG. 3, so that the forms of displayed content grow ever richer, the interactivity with the user ever stronger, and the window page can adapt to all kinds of application scenarios.
However, as the content displayed in window pages becomes richer and more interactive, a new problem has gradually emerged. Each piece of display content occupies part of the area of the window page, that is, it is displayed attached on top of the window page; the display level of the window page is therefore lower than that of the display content, and the content covers part of the window's display area. Consequently, if the display content in the window page is an operable interactive control, then when the electronic device receives an interactive operation initiated by the user inside the window page, it cannot determine whether the operation object of the operation is the operable control in the window page or the window page itself, because their display areas overlap.
Illustratively, FIG. 4 shows a schematic diagram of an electronic device receiving an interactive operation initiated by the user. As shown in FIG. 4, the electronic device receives an upward slide initiated at position 401, which lies on an operable control 402 in the window page displayed in the device's foreground. The operable control 402 is specifically a list control that can display a preset data list of N rows; constrained by the display area of the window page, the list control may show only 3 rows, and the data displayed in it can be changed by sliding up and down. When the electronic device receives the upward slide on the operable control 402, it cannot determine whether the operation is meant to change the data displayed in the list control or to move the window page upward, so the response to the interactive operation is inaccurate, the user's operation intent cannot be identified, and both the accuracy of the operation and the user experience are reduced.
To determine the user's operation intent, existing interaction response technology can configure a corresponding dedicated operation area inside the operable control: an interactive operation initiated within the dedicated operation area of the control has the operable control as its operation object, while an operation initiated outside it has the window page as its operation object, thereby distinguishing the operation objects of interactive operations.
Illustratively, FIG. 5 shows how an operable control of a window page is controlled through a dedicated operation area. As shown in (a) of FIG. 5, the electronic device displays a window page containing a list control 501, which contains an area 502 displaying the data list and the control's dedicated operation areas 503 ("next page") and 504 ("previous page"). If the user wants to operate the list control 501, he can initiate an interactive operation in the dedicated area 503, for example tap it, whereupon the data columns shown in the list control 501 switch their display content from the original columns 1-3 to the next page of the list, changing the displayed columns to 4-6, as shown in (b) of FIG. 5. If the user initiates an interactive operation at any position outside the dedicated area 503, the operation object of that operation is identified as the window page.
This approach configures corresponding dedicated operation areas for different operable controls so that the operation object of an operation initiated on the interactive window can be determined. However, it requires dedicated areas to be carved out of the window page just to collect interactive operations on the controls: if the dedicated operation area is large, it greatly affects the layout of the controls and display content in the window page and reduces the amount of information the page can hold; if it is small, it reduces the convenience of operation and increases invalid operations such as wrong and missed taps, harming the user experience.
It can thus be seen that existing window page interaction technology cannot simultaneously distinguish the operation object of an interactive operation and keep the impact on the window page layout low, which hampers the use of operable controls in window pages and in turn the development of diversity in window page content.
Embodiment 1:
Therefore, to overcome the defects of existing window page interaction technology, this application provides a window page interaction method. Its execution subject may specifically be an electronic device such as a smartphone, tablet computer, computer, or smart watch that contains a display module and an interaction module: the display module displays the window page, and the interaction module obtains the user's interactive operations. The display module may specifically be a module with a display function such as a display or projector; the interaction module may be a module with a human-computer interaction function such as a mouse, keyboard, or controller. Preferably, the display module is a touchscreen with a touch function, which can obtain the interactive operations initiated by the user while displaying the window page. FIG. 6 shows the implementation flowchart of the window page interaction method provided by an embodiment of this application, detailed as follows:
In S601, in response to an interactive operation initiated by the user in a target window, the operation type of the interactive operation is determined; the target window contains at least one operable control.
In this embodiment, the electronic device may display at least one window page. The window page may be generated by an application installed on the electronic device, such as the window page of an application's operation interface, or by the device system, such as a pop-up window generated when the electronic device receives a system notification or short message. Note that the electronic device may display multiple window pages in the display interface at the same time; how many window pages can be displayed depends on several factors such as the device type of the electronic device, the size of the display module, and the device system.
In one possible implementation, if the electronic device displays multiple window pages at the same time, that is, all of them are running in the foreground, the electronic device can determine the target window from the window page corresponding to the previously responded interactive operation. Although the multiple window pages all run in the foreground, according to their operation order the most recently operated page can be taken as the page currently being operated, that is, the target window, and the other pages as pages awaiting operation. Therefore, when the user initiates an interactive operation that is not a window page selection operation, the window page that the operation targets is identified as the page currently being operated; conversely, if it is a window page selection operation, the designated page is determined from the pages awaiting operation according to the operation, activated to the top layer of the display interface for display, and identified as the page currently being operated.
Illustratively, FIG. 7 shows the display of multiple window pages provided by an embodiment of this application. As shown in (a) of FIG. 7, the display interface of the electronic device can show a first window 701, a second window 702, and a third window 703 at the same time; the topmost window page is the first window 701, the most recently operated page, that is, the page being operated, while the second window 702 and third window 703, whose display levels are lower than that of the first window 701, are pages awaiting operation. If the electronic device now receives a user-initiated interactive operation, it judges whether it is a window selection operation, for example tapping the display area of the second window 702 or tapping the icon corresponding to 702. If so, it determines the user-selected page from the selection operation: in this example, on detecting that the user taps the display area of the second window 702, it judges that the user wants to operate the second window 702, takes the first window 701 as a page awaiting operation, switches the second window 702 to the top layer for display, as in (b) of FIG. 7, and takes the second window 702 as the page being operated.
In one possible implementation, if the electronic device supports a split-screen function, it can display multiple independent sub-screens; FIG. 8 shows such a split-screen display interface provided by an embodiment of this application. In that case each sub-screen can display a corresponding window page, and the electronic device can determine, from the tap position of the interactive operation, the sub-screen the operation targets and identify the window page shown in that sub-screen as the target window.
Optionally, the electronic device can determine the sub-screen corresponding to the interactive operation from the coordinates at which the operation is first collected. For example, if the operation is a slide, the device can identify the coordinates of the slide's first touch point; if those coordinates fall within one of the split-screen sub-screens, the sub-screen corresponding to the first touch point is identified as the sub-screen the operation targets, and the window page displayed in it as the target page.
Optionally, the electronic device can obtain the coordinate information of the interactive operation; if it falls within the display area corresponding to a window page, that window page is identified as the target window corresponding to the operation.
In this embodiment, the target window may display operable controls and, of course, other display content as well, such as pictures, text, and video. An operable control in the target window is specifically a control that can execute a corresponding operation based on an interactive operation.
Illustratively, FIG. 9 shows operable controls provided by an embodiment of this application. As shown in (a) of FIG. 9, the target window contains a list control with a list progress slider 901 indicating the position of the currently displayed data columns within the list; the list control can change the currently displayed content according to up/down slides. As shown in (b) of FIG. 9, the target window contains an album control that can display multiple images and is configured with an image progress slider 902 indicating the position of the currently displayed image within the album; the album control changes the currently displayed image according to left/right slides initiated by the user, achieving image switching. Of course, besides responding to user-initiated slides, operable controls can respond to all types of interactive operations; the operation type of the interactive operation is not limited here.
In one possible implementation, the target window may contain one operable control, or two or more.
In one possible implementation, if the target window contains no operable control, there is no need to determine the operation type of the interactive operation; the target window can be controlled directly based on the operation. Accordingly, before performing S601 the electronic device can first determine the number of operable controls contained in the target window: if the number is 0, it controls the target window based on the interactive operation; otherwise, if the number is non-zero, it performs the operation of determining the operation type of the interactive operation.
In this embodiment, the electronic device can receive user-initiated interactive operations through the interaction module. If the interaction module is specifically a touchscreen, the operations may include touch operations such as presses and slides; if it is specifically a mouse, they may include single-click operations, double-click operations, mouse-wheel scrolling, box-selection operations, and so on. The interaction module may be built into the electronic device, such as the built-in touchscreen of a smartphone, or be an external device connected to the electronic device through a serial interface to obtain the user's interactive operations, such as a mouse, keyboard, or controller.
In this embodiment, the interaction module collects the user-initiated interactive operation and sends it to the processor of the electronic device, which on receiving it first identifies its operation type and executes the corresponding response flow based on that type.
In one possible implementation, the electronic device can record in advance the operation types associated with the operable control and identify them as the first operation type; likewise, it can record in advance the operation types associated with the target window and identify them as the second operation type.
Optionally, the electronic device can identify all operation types other than the first operation type as the second operation type. The device then only needs to record the operation types associated with the operable control, that is, the first operation type, rather than the second operation type associated with the target window: whenever it identifies that the operation type of an interactive operation is not the first operation type, it can determine that the operation is of the second operation type.
In one possible implementation, the first operation type corresponding to all operable controls may be the same. For example, the electronic device may preset several standard gestures, different standard gestures controlling the operable control of the target window to execute specified operations. On this basis, if the device detects that a user-initiated interactive operation matches any standard gesture, it identifies the operation's type as the first operation type and performs S602; conversely, if the operation matches none of the standard gestures, it identifies the type as the second operation type and performs S603. In one possible implementation, different operable controls may correspond to different first operation types.
In one possible implementation, the first operation type and the second operation type are mutually exclusive: an interactive operation that is not of the first operation type necessarily belongs to the second. In that case the electronic device can judge whether the operation belongs to the first operation type, performing S602 if it does and, identifying it as the second operation type, performing S603 if it does not; equally, it can judge whether the operation belongs to the second operation type, performing S603 if so and, identifying it as the first operation type, performing S602 otherwise. By configuring two mutually exclusive operation types, the type of an interactive operation is determined by judging whether it matches only one of them, which improves the efficiency of operation type identification and hence the response speed of the interaction, as the sketch below illustrates.
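Because the two types are mutually exclusive, the classification collapses to a single predicate. A minimal sketch, reusing the illustrative OperationType and InteractiveOperation types from the earlier sketches and assuming an invented matchesStandardGesture() helper for the comparison against the preset standard gestures:

```java
// One predicate suffices when the two operation types are mutually exclusive.
OperationType classify(InteractiveOperation op) {
    return matchesStandardGesture(op) ? OperationType.FIRST   // -> S602
                                      : OperationType.SECOND; // -> S603
}

// Assumed stand-in for comparing the operation with the preset gestures.
boolean matchesStandardGesture(InteractiveOperation op) { return false; }
```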
In one possible implementation, the electronic device may be configured with a preset anti-shake condition, which may specifically be a jitter range. The electronic device can obtain the actually generated interaction trajectory of the interactive operation and, taking the operation's first touch point as the reference point, derive the jitter range corresponding to the operation; for example, if the jitter range is a circular region, a circular region centered on the first touch point of the operation can be generated as the operation's jitter range. If the actually generated trajectory stays within the jitter range, the operation is identified as an invalid interactive operation that requires no response; conversely, if the trajectory leaves the jitter range, the operation is identified as a valid interactive operation and its operation type is determined.
Further, as another embodiment of this application, FIG. 10 shows the specific implementation flowchart of the window page interaction method provided by another embodiment. As shown in FIG. 10, compared with the previous embodiment, the window page interaction method provided by this embodiment may further include S1001-S1002 before S601, described as follows:
Further, before the determining of the operation type of the interactive operation in response to the interactive operation initiated by the user in the target window, the method further includes:
In S1001, the interactable information of the operable control is determined; the interactable information includes an interactable range and/or an interactable direction.
In this embodiment, the electronic device can determine the first operation type from the interactable information of the operable control. When an operable control is configured into a window page, its interactable information can be set in advance and recorded in the data package corresponding to the control; in that case, the electronic device can obtain the control's interactable information by reading the configuration passage corresponding to that data package.
In one possible implementation, when the electronic device loads the target window, it parses the data package of the operable control and caches it in a cache area of the device's memory, from which the interactable information of the control can then be obtained.
In this embodiment, the interactable information may include an interactable range and/or an interactable direction.
The interactable range refers specifically to the range within which the operable control and/or the content displayed in it can be interacted with. Illustratively, FIG. 11 shows interactable ranges provided by an embodiment of this application. As shown in (a) of FIG. 11, the operable control can be moved within the target window, and the target window can configure a corresponding movable area for the control; that movable area is the control's interactable range. As shown in (b) of FIG. 11, the control is configured with a virtual joystick with a corresponding touch area; the touchable area of the virtual joystick is the control's interactable range.
The interactable direction refers specifically to the direction in which the operable control and/or the content displayed in it can be interacted with. Illustratively, FIG. 12 shows interactable directions provided by an embodiment of this application. As shown in (a) of FIG. 12, the control can move in preset directions: for example, a list control whose displayed data columns are changed by sliding up and down has the vertical direction as its interactable direction, and since the user cannot guarantee that the direction of a slide is exactly vertical when initiating an interactive operation, all directions within a certain deviation angle of the vertical can be identified as movable directions. As shown in (b) of FIG. 12, the control is specifically an album control whose displayed image is switched by sliding left and right, so its interactable direction is horizontal; likewise, the electronic device can identify slides within a certain deviation angle of the horizontal as horizontal slide operations.
In one possible implementation, the interactable information specifically includes the invalid interaction direction and/or invalid interaction range of the operable control; correspondingly, the electronic device can take the ranges other than the control's invalid interaction direction and/or invalid interaction range as the control's interactable direction and/or interactable range. For example, continuing with (b) of FIG. 12, the control is an album control that can move in the left-right direction, so vertical movement is an invalid movement direction for it; on this basis, the vertical movement direction can be taken as the control's invalid direction, as shown in (c) of FIG. 12, and the directions other than the invalid direction as the control's interactable directions. Correspondingly, determining the interactable range of the control from the invalid interaction range can be implemented with reference to the above manner.
In S1002, the first operation type associated with the operable control is determined based on the interactable information.
In this embodiment, the electronic device determines the first operation type associated with the control from the control's interaction information. Specifically, if the electronic device collects a user-initiated interactive operation that matches the interactable information, for example the operation area of the operation lies within the control's operable range and/or the operation direction lies within the coverage of the control's operable directions, the operation's type is identified as the first operation type; a direction check of this kind is sketched after this passage.
In one possible implementation, the electronic device can likewise obtain the interactable information corresponding to the target window, hereafter called the second interaction information to distinguish it from the interactable information of the operable control. The second interaction information can be obtained in the same way as the control's interactable information, which is not repeated here, and may likewise include the interactable range and/or interactable direction of the target window. Correspondingly, after obtaining a user-initiated interactive operation, the electronic device matches it against the second interaction information: if they match, the operation is identified as the second operation type; if they do not, the operation is identified as not being the second operation type.
Further, as another embodiment of this application, the second operation type is any operation type other than the first operation type. Since the first operation type consists of the interactable operations of the operable control, an interactive operation that is invalid for the control is necessarily one initiated toward the target window, which ensures that the operation objects can be distinguished without affecting the control of the operable control.
In this embodiment of the application, by determining the interactable information of the operable control and determining its corresponding first operation type based on that information, every operation of the first type is guaranteed to be an operation type that is valid for the control; interactive operations valid for the control therefore control the control, while those invalid for it control the target window, achieving the identification of the operation object of the interactive operation.
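Matching a slide against an interactable direction with an angular tolerance, as in FIG. 12, amounts to a small trigonometric test. The sketch below assumes a vertically scrolling list control and an invented 30-degree tolerance; the patent does not fix the deviation angle.

```java
// A swipe counts as "vertical" if its angle to the vertical axis stays
// within the deviation bound; (dx, dy) is the slide displacement vector.
static boolean matchesVerticalDirection(float dx, float dy) {
    final double TOLERANCE_DEG = 30.0;  // assumed deviation angle
    double angleFromVertical =
            Math.toDegrees(Math.atan2(Math.abs(dx), Math.abs(dy)));
    return angleFromVertical <= TOLERANCE_DEG;
}
```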
Further, as another embodiment of this application, FIG. 13 shows the specific implementation flowchart of step S601 of the window page interaction method provided by a further embodiment. As shown in FIG. 13, compared with the previous embodiment, S601 of the method provided by this embodiment may include S1301-S1302, described as follows:
Further, the determining of the operation type of the interactive operation in response to the interactive operation initiated by the user in the target window includes:
In S1301, the pressing duration corresponding to the interactive operation is identified.
In this embodiment, the electronic device can determine the operation type of the interactive operation from the pressing duration corresponding to the operation; that is, from the pressing duration of the user-initiated interactive operation, the device identifies the operation object the user wants to operate, achieving intent identification of the operation object.
In one possible implementation, the pressing duration corresponding to the interactive operation refers specifically to the pressing time corresponding to the operation's first operation point.
For a tap operation, the first operation point of the operation, that is, the tap position corresponding to the tap, is determined; the dwell time of the tap at that position is identified as the pressing duration corresponding to the interactive operation.
For a slide operation, the time interval from the start of the interactive operation to the start of the slide is identified as the pressing duration corresponding to the operation. For example, when the user initiates a slide, a continuous slide path is generated, and the first touch point corresponding to that path is the touch point between "the start of the interactive operation and the start of sliding"; the electronic device can determine the dwell time of the first touch point and identify it as the pressing duration corresponding to the interactive operation. Through the dwell time of the first touch point, slide operations can be divided at least into a short-press slide type and a long-press slide type.
Illustratively, FIG. 14 shows slide operations provided by an embodiment of this application. As shown in (a) of FIG. 14, when the user initiates the slide, the first touch point is point A, from which the slide starts, generating a slide trajectory 1401; the dwell time at point A is 0.3 s, which is less than the preset time threshold, so trajectory 1401 is of the short-press slide type. As shown in (b) of FIG. 14, when the user initiates the slide, the first touch point is point B, from which the slide starts, generating a slide trajectory 1402; the dwell time at point B is 1 s, which is greater than or equal to the preset time threshold, so trajectory 1402 is of the long-press slide type.
Optionally, the electronic device can identify interactive operations whose dwell time at the first touch point is less than the preset time threshold as the first operation type, that is, the operation object of the operation is the operable control; correspondingly, it can identify operations whose dwell time at the first touch point is greater than or equal to the threshold as the second operation type, that is, the operation object is the target window. Or, conversely, the electronic device can identify operations whose dwell time at the first touch point is less than the threshold as the second operation type, that is, the operation object is the target window, and correspondingly identify operations whose dwell time is greater than or equal to the threshold as the first operation type, that is, the operation object is the operable control.
In S1302, the operation type of the interactive operation is determined according to the pressing duration.
In this embodiment, after determining the pressing duration corresponding to the interactive operation, the electronic device can determine the operation's type according to the different durations.
In one possible implementation, the electronic device may be configured with at least one time threshold, which divides pressing durations into at least two time intervals, different intervals corresponding to different operation types; the device can identify the interval into which the pressing duration of the current operation falls and take the operation type associated with that interval as the type of the operation. For example, operations whose pressing duration is less than the preset time threshold are identified as the second operation type, that is, the operation object is the target window; and operations whose pressing duration is greater than or equal to the threshold are identified as the first operation type, that is, the operation object is the operable control. A sketch of this classification follows.
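The S1301-S1302 classification can be sketched as below, reusing the illustrative OperationType enum from the earlier sketches. The 500 ms threshold is an assumption, and the mapping follows the example just given (short press to the target window, long press to the operable control); as noted above, the description allows the opposite assignment as well.

```java
// Classify by the dwell time of the first touch point (S1301-S1302).
OperationType classifyByPressDuration(long downTimeMs, long moveStartTimeMs) {
    final long THRESHOLD_MS = 500;                  // assumed threshold
    long dwell = moveStartTimeMs - downTimeMs;      // dwell on first point
    return dwell < THRESHOLD_MS
            ? OperationType.SECOND   // short press: target window (example)
            : OperationType.FIRST;   // long press: operable control
}
```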
In this embodiment of the application, by identifying the pressing duration of the interactive operation and determining the operation's type from it, different operation objects are controlled through different manners of operation without adding extra operation controls or operation areas, which improves the accuracy of the interaction while also avoiding any impact on the layout of the window page.
In S602, if the operation type is the first operation type, the operable control is identified as the operation object of the interactive operation, and the operable control is controlled based on the interactive operation; the first operation type is the operation type associated with the operable control.
In this embodiment, when the electronic device detects that the interactive operation matches the first operation type associated with the operable control, the operation object corresponding to the operation is the operable control, and the device can therefore control the control based on the operation. For example, if the operation is an upward slide of 50 pixels, the control can be moved up 50 pixels; of course, if the operation is a close operation, the control can be removed from the target window. The control action on the operable control thus depends on the actual content of the interactive operation, and neither the operation content nor the control effect on the operable control is limited here.
In one possible implementation, the target window may contain two or more operable controls, in which case different operable controls correspond to different operation types. For ease of distinction, the operation types for operable controls are collectively called the first operation type and those for the target window the second operation type. Therefore, if the target window contains multiple operable controls, then when the operation object of the interactive operation is determined to be an operable control, that is, the operation's type is the first operation type, the operable control corresponding to the operation can be identified further.
Optionally and illustratively, FIG. 15 shows how the operable control is determined according to an embodiment of this application. The electronic device can determine the operable control corresponding to the interactive operation as follows:
1. As shown in (a) of FIG. 15, from the coordinate information of the interactive operation, compute the positional deviation from each of the two operable controls, and take the control with the smaller deviation as the one corresponding to the operation. The coordinate information of the operation may specifically be the coordinates of its first touch point; when computing the positional deviation between the operation and a control, the coordinates of the control's center can be obtained as the control's characteristic coordinates and the deviation computed from them and the operation's coordinate information; of course, the coordinates of the boundary point of the control nearest to the operation's first touch point can also be obtained, and the deviation computed from that boundary point's coordinates and the first touch point's coordinates. The operable control corresponding to the operation is determined through the positional deviation (see the sketch after this passage).
2. As shown in (b) of FIG. 15, if a target window contains multiple operable controls and, owing to the limits of display space, they cannot all be fully shown on the display interface at once (for example, operable control 1501 is in a fully shown state while part of operable control 1502 is occluded, and of course, when there are many controls, some may be in a hidden state), the electronic device can compute the display ratio of each displayable control and select the control with the highest display ratio as the operation object of the interactive operation; if several controls all have a display ratio of 100%, that is, are in a fully shown state, it computes the distance between each fully shown control and the center coordinates of the display interface and selects the control with the smallest distance as the operation object.
Optionally, different operable controls may correspond to different operation types: for example, operable control A corresponds to first operation type A and operable control B to first operation type B. In that case, when determining the type of the interaction, the electronic device can judge whether the interactive operation's type is the second operation type, first operation type A, or first operation type B, and thereby determine which operable control the operation specifically corresponds to. That is, in determining the operation type of the interactive operation, the operable control to be operated can be determined from among the multiple operable controls.
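The distance rule of method 1 in FIG. 15(a) is a nearest-center search. A minimal sketch with invented types, using the control centers as the characteristic coordinates:

```java
import java.util.List;

interface OperableControl { float centerX(); float centerY(); }

final class ControlPicker {
    /** Pick the control whose center deviates least from the first touch point. */
    static OperableControl pick(List<OperableControl> controls,
                                float touchX, float touchY) {
        OperableControl best = null;
        double bestDist = Double.MAX_VALUE;
        for (OperableControl c : controls) {
            double d = Math.hypot(c.centerX() - touchX, c.centerY() - touchY);
            if (d < bestDist) { bestDist = d; best = c; }
        }
        return best;
    }
}
```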
Further, as another embodiment of this application, after the interactive operation is determined to be of the first operation type and before the operable control is controlled based on it, whether the operation is a valid interactive operation can be identified, that is, the operation is compared with a preset jitter range. Illustratively, FIG. 16 shows a jitter range provided by an embodiment of this application. As shown in FIG. 16, the electronic device may be preset with a jitter range: if the operation stays within that range, it is identified as an invalid operation, such as interactive operation 1; conversely, if it leaves the range, it is identified as a valid operation, such as interactive operation 2. Invalid interactive operations need not be responded to, while valid ones can be responded to.
In an embodiment of this application, since a user-initiated interactive operation takes operation time, and the corresponding duration is long when the operation is a slide, the electronic device can, in order to respond to the interaction in real time, control the operable control based on the portion of the operation already obtained while still collecting the user's operation. To this end, the device can be configured with multiple collection periods; for real-time response, the period length can be short, for example 0.1 s. The operation time of a user-initiated interactive operation can span N collection periods: after collecting the part of the operation corresponding to the n-th collection period (n being a positive integer smaller than N), the device can identify the operation's type and control the corresponding operation object based on the operation. For example, if the operation's type is the first operation type, its operation object is the operable control; in that case, on entering the (n+1)-th collection period, the device can control the control based on the part of the operation obtained in the n-th period while simultaneously obtaining the user's operation in the (n+1)-th period, until the operation has been fully collected, whereupon in the (N+1)-th collection period the device controls the control based on the part obtained in the N-th period, thereby achieving the purpose of responding to the operable control in real time.
In S603, if the operation type is the second operation type, the target window is identified as the operation object of the interactive operation, and the target window is controlled based on the interactive operation.
In this embodiment, when the electronic device detects that the interactive operation is not of the first operation type associated with the operable control, or determines that it is of the second operation type associated with the target window, the operation object corresponding to the operation is the target window, and the device can therefore control the window based on the operation. For example, if the operation is an upward slide of 50 pixels, the target window can be moved up 50 pixels; of course, if the operation is a close operation, the target window can be closed. The control action on the target window thus depends on the actual content of the interactive operation, and neither the operation content nor the control effect is limited here.
In this embodiment, the manner of controlling the target window based on the interactive operation is the same as that of controlling the operable control; for details, refer to the relevant description of S602, which is not repeated here.
As can be seen from the above, the window page interaction method provided by the embodiments of this application can, when the electronic device receives a user-initiated interactive operation, identify the operation type of that operation and determine the corresponding operation object from the type. For example, if the operation type is the first operation type, the operation object can be determined to be the operable control, which is controlled on the target window based on the operation, for example moved in a specified direction or paged; conversely, if the operation type is the second operation type, the operation object can be determined to be the target window, which is controlled based on the operation. Compared with existing interaction response technology, the embodiments of this application can distinguish from the operation type whether the operation object is the underlying target window or an operable control attached to the window; even if the display areas of the control and the target window overlap, the corresponding operation object can be distinguished and controlled, which improves the accuracy of the interaction and the user experience.
On the other hand, since the embodiments of this application do not require a corresponding dedicated operation area to be configured in the operable control in order to determine the operation object of the interactive operation, but can distinguish different operation objects according to the operation's type, the layout of the operable controls on the window page is unaffected, which lowers the difficulty of laying out the page and raises the utilization efficiency of the page's area. Moreover, the interactive operation is not confined to being completed within a designated area, which further improves the flexibility of the interaction.
Embodiment 2:
FIG. 17 is a schematic flowchart of the window page interaction method provided by the second embodiment of this application. As shown in FIG. 17, compared with the previous embodiment, the method of Embodiment 2 is configured with a window event flag: after an interactive operation whose operation object is the target window has been responded to, the window event flag of that target window is configured; before the operation type of a subsequent interactive operation is identified, the value of the window event flag is checked, and different response flows are executed according to that value. Specifically, the window page interaction method provided by the second embodiment of this application includes S1701–S1706:
In S1701, in response to the interactive operation, the window event flag corresponding to the target window is obtained.
In this embodiment, after receiving the user's interactive operation, the electronic device can obtain the window event flag of the target window currently shown on the display interface. The window event flag is used to determine whether the operation object of the previous associated interactive operation was the target window.
In a possible implementation, when the user initiates an interactive operation, the electronic device can judge whether it is associated with the previous interactive operation. For example, when the user needs to drag the target window, the user may first drag it leftwards and then drag it upwards; these two slide operations are associated interactive operations, and in that case the electronic device identifies them as having the same operation object.
In a possible implementation, the electronic device may be configured with an association time threshold. If the electronic device detects that the time interval between any two interactive operations is within this threshold, the two operations are identified as associated, and the operation object of the previous associated operation is taken as that of the current one; conversely, if the interval falls outside the threshold, the two operations are identified as not associated, the operation type of the current operation is obtained anew, and its operation object is determined from that type.
In a possible implementation, the electronic device may be configured with multiple collection periods. The interactive operation initiated by the user may be continuous; based on the collection periods, the electronic device can divide the continuous interactive operation into multiple associated interactive operations and identify their operation objects as the same. In that case, if the electronic device detects that the interactive operation has been interrupted, for example the finger leaves the touchscreen or the mouse button is released, the continuous interactive operation is identified as completed; the window event flag can then be reset, e.g. configured to the preset invalid value, so that the electronic device re-determines the operation object of the next interactive operation.
Illustratively, FIG. 18 is a schematic diagram of dividing an interactive operation provided by an embodiment of this application. As shown in FIG. 18, the interactive operation is specifically a slide that generates a touch track on the touchscreen. While collecting the track, the electronic device can, according to the collection period or a preset effective movement distance, identify the continuous interactive operation as multiple interactive operations of shorter duration or shorter movement distance, here called sub-operations; that is, while the user performs one continuous interactive operation, the electronic device obtains multiple sub-operations and performs the operation of S1701 on each of them. All sub-operations belonging to the same continuous interactive operation are identified as associated interactive operations, and when the continuous interactive operation completes, the electronic device can reset the window event flag.
In a possible implementation, the electronic device may assign separate threads to the target window and the operable control to process their respective operations. In that case, after receiving an interactive operation, the electronic device first hands the event of the interactive operation to the target window's thread; that thread obtains the window event flag and, based on its value, decides whether to continue processing in the target window's thread (e.g. when the flag holds the valid value) or to hand the event to the operable control's thread (e.g. when the flag holds the invalid value).
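A hedged sketch of this flag-guarded dispatch follows: every event reaches the window's handler first and is forwarded to the control's handler while the window flag is invalid. All names here are assumptions, not the disclosed API, and the two lambdas stand in for whatever the window and control threads actually do.

```kotlin
enum class WindowFlag { VALID, INVALID }

data class Event(val x: Float, val y: Float)

class WindowDispatcher(
    private val handleOnWindowThread: (Event) -> Unit,
    private val handleOnControlThread: (Event) -> Unit
) {
    var windowFlag: WindowFlag = WindowFlag.INVALID

    fun dispatch(event: Event) {
        when (windowFlag) {
            // A previous associated operation targeted the window: keep going there.
            WindowFlag.VALID -> handleOnWindowThread(event)
            // Otherwise hand the event over so the operation type is re-identified.
            WindowFlag.INVALID -> handleOnControlThread(event)
        }
    }
}
```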
In S1702, if the window event flag holds the preset invalid value, the step of determining the operation type of the interactive operation is performed.
In this embodiment, when the electronic device identifies that the window event flag holds the invalid value, no operation associated with the current interactive operation controlled the target window at the previous moment; in that case the operation object of the interactive operation must be identified anew, so its operation type is determined and its operation object derived from that type. For details, see the description of S601, which is not repeated here.
In S1703, if the operation type is the first operation type, the operable control is identified as the operation object of the interactive operation, and the operable control is controlled based on the interactive operation; the first operation type is the operation type associated with the operable control.
In S1704, if the operation type is the second operation type, the target window is identified as the operation object of the interactive operation, and the target window is controlled based on the interactive operation.
In this embodiment, since S1703 is identical to S602 and S1704 is identical to S603, for the specific implementation of S1703 and S1704 see the descriptions of S602 and S603, which are not repeated here.
In S1705, the window event flag corresponding to the target window is configured to the preset valid value.
In this embodiment, after identifying the target window as the operation object of the interactive operation and completing the control of the target window based on the interactive operation, the electronic device can configure the window event flag of the target window to the preset valid value. When the electronic device later collects another interactive operation associated with this one, it can keep the target window as the operation object of that other operation and control the target window without identifying the operation object again, preserving the continuity of interactive operations.
In a possible implementation, after S1705 the method further includes: if the electronic device receives no new interactive operation within a preset validity period, resetting the window event flag to the preset invalid value.
In a possible implementation, after S1705 the method further includes: if the electronic device identifies that the interactive operation has completed (e.g. the slide ends and the finger leaves the touchscreen, or the drag ends and the mouse button is released), resetting the window event flag to the preset invalid value.
In a possible implementation, before controlling the target window based on the interactive operation, the electronic device judges whether the operation is effective for the target window, i.e. whether the target window can respond based on it. Some interactive operations are invalid for the target window; for example, if the target window is already at the far left of the screen and a leftward-move operation is initiated, the window cannot respond to that operation, so the operation is invalid for the target window. Accordingly, if the interactive operation is invalid, S1705 is not performed and the window event flag is configured to the invalid value; conversely, if the operation is effective, S1705 is performed.
In S1706, if the window event flag holds the valid value, the target window is controlled based on the interactive operation.
In this embodiment, when the electronic device determines that the window event flag holds the valid value, the currently collected interactive operation has a previous associated operation whose operation object was the target window; in that case the operation type of the current operation need not be determined, and the target window can be controlled directly according to the interactive operation.
In this embodiment of this application, configuring the window event flag makes it unnecessary to identify the type of the currently collected interactive operation when an associated previous operation exists; the operation object of the previous operation is taken as that of the current one, improving the continuity of interactive operations, preventing erroneous switches of operation object, and improving operation accuracy.
Embodiment 3:
FIG. 19 is a schematic flowchart of the window page interaction method provided by the third embodiment of this application. As shown in FIG. 19, compared with Embodiment 1, the method of Embodiment 3 is configured with a control event flag: after an interactive operation whose operation object is an operable control has been responded to, the control event flag of that control is configured; after an interactive operation has been identified as not being of the first operation type of the operable control, the value of the control event flag is checked, and different response flows are executed according to that value. Specifically, the window page interaction method provided by the third embodiment of this application includes S1901–S1906:
In S1901, in response to an interactive operation initiated by the user within the target window, the operation type of the interactive operation is determined; the target window contains at least one operable control.
In this embodiment, S1901 is implemented in exactly the same way as S601; for details, see the description of S601, which is not repeated here.
In S1902, if the operation type is the first operation type, the operable control is identified as the operation object of the interactive operation, and the operable control is controlled based on the interactive operation; the first operation type is the operation type associated with the operable control.
In this embodiment, S1902 is implemented in exactly the same way as S602; for details, see the description of S602, which is not repeated here.
In S1903, the control event flag corresponding to the operable control is configured to the preset first value.
In this embodiment, after finishing the response for the operable control, the electronic device can configure the control event flag of that control to the preset first value. With the flag at the first value, the control event flag is in the valid state: when the electronic device later collects another interactive operation associated with this one, it can keep the operable control as the operation object of that other operation and control the operable control without identifying the operation object again, preserving the continuity of interactive operations. The control event flag is used to determine whether the operation object of the previous associated interactive operation was the operable control.
In this embodiment, for judging whether the currently collected interactive operation and the previously collected one are associated, see the description of S1701, which is not repeated here.
In a possible implementation, after S1903 the method further includes: if the electronic device receives no new interactive operation within a preset validity period, resetting the control event flag to the preset second value.
In a possible implementation, after S1903 the method further includes: if the electronic device identifies that the interactive operation has completed (e.g. the slide ends and the finger leaves the touchscreen, or the drag ends and the mouse button is released), resetting the control event flag to the preset second value.
In S1904, if the operation type is the second operation type, the control event flag corresponding to the operable control is determined.
In this embodiment, when the electronic device judges the interactive operation to be of the second operation type, it can obtain the value of the control event flag and execute different response operations according to that value. Specifically, if the control event flag holds the first value, S1905 is performed; conversely, if it holds the second value, S1906 is performed.
In a possible implementation, if the first operation type is determined based on the interactable information of the operable control, i.e. the first operation type is the operation type the operable control can respond to, then the second operation type is correspondingly the operation type the operable control cannot respond to; in that case, upon determining that the collected interactive operation is one the operable control cannot respond to, the control event flag of that control is determined.
In S1905, if the control event flag holds the preset first value, the response to the interactive operation is ended.
In this embodiment, if the control event flag holds the preset first value, the currently collected interactive operation is associated with the previously collected one, and both have the operable control as their operation object. Since the currently collected operation is an invalid interactive operation for the operable control, the response to it is ended directly.
In S1906, if the control event flag holds the preset second value, the step of identifying the target window as the operation object of the interactive operation and controlling the target window based on the interactive operation is performed.
In this embodiment, if the control event flag holds the preset second value, the currently collected interactive operation is not associated with the previously collected one, and its operation type is not the first operation type associated with the operable control; in that case the target window can be identified as the operation object of the interactive operation, and the target window can be controlled through the interactive operation.
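The S1904–S1906 branch can be sketched as follows, reusing the `Event` type from the dispatcher sketch above; the enum and function names are illustrative assumptions only.

```kotlin
enum class ControlFlag { FIRST, SECOND }  // FIRST = previous associated op targeted the control

// For an operation already identified as second-type, consult the control flag.
fun onSecondTypeOperation(flag: ControlFlag, event: Event, controlTheWindow: (Event) -> Unit) {
    when (flag) {
        // S1905: associated with a control-targeting operation -> end the response.
        ControlFlag.FIRST -> Unit
        // S1906: no association -> the target window is the operation object.
        ControlFlag.SECOND -> controlTheWindow(event)
    }
}
```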
In a possible implementation, if the electronic device has configured a window event flag for the target window, then when controlling the target window based on the interactive operation, the window event flag can be configured to the preset valid value so that subsequently collected interactive operations can identify the target window as the operation object.
In this embodiment of this application, configuring the control event flag allows, when an associated previous interactive operation exists, the operation object of that previous operation to be taken as the operation object of the current one, improving the continuity of interactive operations, preventing erroneous switches of operation object, and improving operation accuracy.
Embodiment 4:
FIG. 20 is a schematic flowchart of the window page interaction method provided by the fourth embodiment of this application. As shown in FIG. 20, compared with Embodiment 1, the method of Embodiment 4 is configured with both a control event flag and a window event flag; for their roles, see the descriptions in Embodiment 2 and Embodiment 3, which are not repeated here. Specifically, the window page interaction method provided by the fourth embodiment of this application includes S2001–S2008:
In S2001, in response to the interactive operation, the window event flag corresponding to the target window is obtained.
In this embodiment, S2001 is implemented in exactly the same way as S1701; for details, see the description of S1701, which is not repeated here.
In S2002, if the window event flag holds the preset invalid value, the step of determining the operation type of the interactive operation is performed.
In this embodiment, S2002 is implemented in exactly the same way as S1702; for details, see the description of S1702, which is not repeated here.
Further, before the operation type of the interactive operation is determined, whether the operation satisfies the preset anti-jitter condition, i.e. whether it exceeds the preset jitter range, can be judged. If it does, the operation is identified as effective and the step of determining its operation type is performed; conversely, if it does not exceed the jitter range, it is identified as invalid and the response to it is ended.
In S2003, if the operation type is the first operation type, the control event flag corresponding to the operable control is configured to the preset first value; the first operation type is the operation type associated with the operable control.
In this embodiment, S2003 is implemented in exactly the same way as S1903; for details, see the description of S1903, which is not repeated here.
In S2004, the operable control is identified as the operation object of the interactive operation, and the operable control is controlled based on the interactive operation.
In this embodiment, S2004 is implemented in exactly the same way as S602; for details, see the description of S602, which is not repeated here.
In S2005, if the operation type is the second operation type, the control event flag corresponding to the operable control is determined.
In this embodiment, S2005 is implemented in exactly the same way as S1904; for details, see the description of S1904, which is not repeated here.
In S2006, if the control event flag holds the preset first value, the response to the interactive operation is ended.
In this embodiment, S2006 is implemented in exactly the same way as S1905; for details, see the description of S1905, which is not repeated here.
In S2007, if the control event flag holds the preset second value, the step of identifying the target window as the operation object of the interactive operation and controlling the target window based on the interactive operation is performed.
In this embodiment, S2007 is implemented in exactly the same way as S1906; for details, see the description of S1906, which is not repeated here. A combined sketch of this two-flag flow is given after the sub-steps below.
Further, S2007 may include:
judging whether the target window can perform the corresponding response operation based on the interactive operation;
if the target window can respond based on the interactive operation, configuring the window event flag corresponding to the target window to the preset valid value, and performing the corresponding response operation;
if the target window cannot respond based on the interactive operation, ending the response operation.
In S2008, if the window event flag holds the valid value, the target window is controlled based on the interactive operation.
In this embodiment, S2008 is implemented in exactly the same way as S1706; for details, see the description of S1706, which is not repeated here.
Further, S2008 may include:
judging whether the target window can perform the corresponding response operation based on the interactive operation;
if the target window can respond based on the interactive operation, configuring the window event flag corresponding to the target window to the preset valid value, and performing the corresponding response operation;
if the target window cannot respond based on the interactive operation, ending the response operation.
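The following non-authoritative sketch stitches S2001–S2008 together, reusing `WindowFlag`, `ControlFlag`, and `Event` from the sketches above. The helper lambdas (`passesJitterCheck`, `isFirstType`, `windowCanRespond`, and the two apply callbacks) are assumptions standing in for platform-specific checks.

```kotlin
class FlagState(
    var windowFlag: WindowFlag = WindowFlag.INVALID,
    var controlFlag: ControlFlag = ControlFlag.SECOND
)

fun handleInteraction(
    event: Event,
    state: FlagState,
    passesJitterCheck: (Event) -> Boolean,
    isFirstType: (Event) -> Boolean,
    windowCanRespond: (Event) -> Boolean,
    applyToControl: (Event) -> Unit,
    applyToWindow: (Event) -> Unit
) {
    // S2008: an associated operation already targeted the window.
    if (state.windowFlag == WindowFlag.VALID) {
        if (windowCanRespond(event)) applyToWindow(event)
        return
    }
    // S2002: flag invalid -> debounce first, then determine the operation type.
    if (!passesJitterCheck(event)) return
    if (isFirstType(event)) {
        // S2003–S2004: mark the control flag and control the operable control.
        state.controlFlag = ControlFlag.FIRST
        applyToControl(event)
    } else when (state.controlFlag) {
        // S2006: invalid for the control that owns the gesture -> end the response.
        ControlFlag.FIRST -> Unit
        // S2007: the window becomes the operation object if it can respond.
        ControlFlag.SECOND -> if (windowCanRespond(event)) {
            state.windowFlag = WindowFlag.VALID
            applyToWindow(event)
        }
    }
}
```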
Illustratively, FIG. 21 is a response flowchart of the window page interaction method when the anti-jitter condition is not satisfied, provided by an embodiment of this application. As shown in FIG. 21, it specifically includes the following steps:
1. The electronic device receives the interactive operation and obtains the window event flag, which holds the invalid value; the electronic device can then perform anti-jitter identification on the interactive operation.
2. The electronic device judges that the operation track of the interactive operation does not exceed the preset jitter range, i.e. the operation does not satisfy the preset anti-jitter condition.
3. Since the preset anti-jitter condition is not satisfied, the response to the interactive operation is ended.
Illustratively, FIG. 22 is a response flowchart of the window page interaction method after an interactive operation targeting the operable control has been responded to, i.e. with the control event flag configured to the first value, provided by an embodiment of this application. As shown in FIG. 22, it specifically includes the following steps:
1. After receiving the interactive operation, the electronic device can hand it to the target window's thread; on receiving the dispatched event, that thread obtains the window event flag and determines that it holds the invalid value, whereupon the target window's thread hands the event of the interactive operation to the operable control's thread, which performs anti-jitter identification on the operation.
2. Through the operable control's thread, the electronic device judges that the operation track of the interactive operation exceeds the preset jitter range, i.e. the operation satisfies the preset anti-jitter condition; the operable control's thread then further determines the operation type of the interactive operation.
3. If the operable control's thread identifies the operation type as the first operation type, it can configure the control event flag to the first value and control the operable control based on the interactive operation. Once the control event flag holds the first value, all subsequent associated interactive operations received by the electronic device are handed to the operable control's thread and their operation object identified as the operable control, until the event of the interactive operation completes.
4. If the operable control's thread identifies the operation type as the second operation type, it checks the control event flag and, on finding it at the first value, does not respond to the current interactive operation.
Illustratively, FIG. 23 is a response flowchart of the window page interaction method when the target window is responded to for the first time, i.e. with the window event flag configured to the invalid value and the control event flag configured to the second value, provided by an embodiment of this application. As shown in FIG. 23, it specifically includes the following steps:
1. After receiving the interactive operation, the electronic device can hand it to the target window's thread; on receiving the dispatched event, that thread obtains the window event flag and determines that it holds the invalid value, whereupon the target window's thread hands the event of the interactive operation to the operable control's thread, which performs anti-jitter identification on the operation.
2. Through the operable control's thread, the electronic device judges that the operation track of the interactive operation exceeds the preset jitter range, i.e. the operation satisfies the preset anti-jitter condition; the operable control's thread then further determines the operation type of the interactive operation.
3. The operable control's thread identifies the operation type as the second operation type; it then checks the control event flag and, on finding it at the second value, hands the event of the current interactive operation to the target window's thread.
4. Through the target window's thread, the electronic device can configure the window event flag to the valid value and control the target window based on the interactive operation.
Illustratively, FIG. 24 is a response flowchart of the window page interaction method after an interactive operation targeting the target window has been responded to, i.e. with the window event flag configured to the valid value, provided by an embodiment of this application. As shown in FIG. 24, it specifically includes the following steps:
1. After receiving the interactive operation, the electronic device can hand it to the target window's thread; on receiving the dispatched event, that thread obtains the window event flag, determines that it holds the valid value, and continues processing in the target window's thread.
2. The electronic device can control the target window through the target window's thread until the interaction event ends.
Embodiment 5:
Corresponding to the window page interaction method described in the embodiments above, FIG. 25 is a structural block diagram of the window page interaction apparatus provided by an embodiment of this application; for ease of description, only the parts relevant to the embodiments of this application are shown.
Referring to FIG. 25, the window page interaction apparatus includes:
an operation type determination unit 251, configured to determine, in response to an interactive operation initiated by the user within a target window, the operation type of the interactive operation, the target window containing at least one operable control;
a first operation type response unit 252, configured to, if the operation type is the first operation type, identify the operable control as the operation object of the interactive operation and control the operable control based on the interactive operation, the first operation type being the operation type associated with the operable control;
a second operation type response unit 253, configured to, if the operation type is the second operation type, identify the target window as the operation object of the interactive operation and control the target window based on the interactive operation.
Optionally, the window page interaction apparatus further includes:
an interactable information determination unit, configured to determine the interactable information of the operable control, the interactable information including an interactable range and/or an interactable direction;
a first operation type determination unit, configured to determine, based on the interactable information, the first operation type associated with the operable control.
Optionally, the second operation type is an operation type other than the first operation type.
Optionally, the operation type determination unit 251 includes:
a press duration acquisition unit, configured to identify the press duration corresponding to the interactive operation;
a press duration comparison unit, configured to determine the operation type of the interactive operation according to the press duration.
Optionally, the second operation type response unit 253 further includes:
a window event flag configuration unit, configured to set the window event flag corresponding to the target window to the preset valid value.
Optionally, the operation type determination unit 251 includes:
a window event flag acquisition unit, configured to obtain, in response to the interactive operation, the window event flag corresponding to the target window;
a valid value response unit, configured to control, if the window event flag holds the valid value, the target window based on the interactive operation.
Optionally, the window page interaction apparatus further includes:
an invalid value response unit, configured to perform, if the window event flag holds the preset invalid value, the step of determining the operation type of the interactive operation.
Optionally, the window page interaction apparatus further includes:
a control event flag configuration unit, configured to set the control event flag corresponding to the operable control to the preset first value.
Optionally, the second operation type response unit 253 includes:
a control event flag identification unit, configured to determine, if the operation type is the second operation type, the control event flag corresponding to the operable control;
a first value response unit, configured to end the response to the interactive operation if the control event flag holds the preset first value;
a second value response unit, configured to perform, if the control event flag holds the preset second value, the step of identifying the target window as the operation object of the interactive operation and controlling the target window based on the interactive operation.
Therefore, the window page interaction apparatus provided by the embodiments of this application can likewise identify the operation type of an interactive operation when the electronic device receives it from the user, and determine the corresponding operation object from that type. For example, if the operation type is the first operation type, the operation object is determined to be the operable control, which is then controlled on the target window based on the interactive operation, e.g. moved in a specified direction or paged; conversely, if the operation type is the second operation type, the operation object is determined to be the target window, which is then controlled based on the interactive operation. Compared with existing interaction-response techniques, the embodiments of this application can distinguish, from the operation type, whether the operation object is the underlying target window or an operable control attached to the window; even when the display areas of the operable control and the target window overlap, the corresponding operation object can still be distinguished and controlled, improving the accuracy of interactive operations and the user experience.
FIG. 26 is a schematic structural diagram of an electronic device provided by an embodiment of this application. As shown in FIG. 26, the electronic device 26 of this embodiment includes: at least one processor 260 (only one is shown in FIG. 26), a memory 261, and a computer program 262 stored in the memory 261 and runnable on the at least one processor 260; when the processor 260 executes the computer program 262, the steps in any of the above window page interaction method embodiments are implemented.
The electronic device 26 may be a computing device such as a desktop computer, a notebook, a palmtop computer, or a cloud server. The electronic device may include, but is not limited to, the processor 260 and the memory 261. Those skilled in the art will understand that FIG. 26 is merely an example of the electronic device 26 and does not limit it; the device may include more or fewer components than shown, or combine certain components, or use different components, and may, for example, further include input/output devices, network access devices, and the like.
The processor 260 may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or any conventional processor.
In some embodiments, the memory 261 may be an internal storage unit of the electronic device 26, such as the hard disk or internal memory of the electronic device 26. In other embodiments, the memory 261 may also be an external storage device of the electronic device 26, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card provided on the electronic device 26. Further, the memory 261 may include both an internal storage unit and an external storage device of the electronic device 26. The memory 261 is used to store the operating system, application programs, a boot loader (BootLoader), data, and other programs, such as the program code of the computer program; it may also be used to temporarily store data that has been or will be output.
It should be noted that, since the information exchange between and execution processes of the above apparatuses/units are based on the same conception as the method embodiments of this application, their specific functions and technical effects can be found in the method embodiment section and are not repeated here.
Those skilled in the art will clearly understand that, for convenience and brevity of description, only the division of the above functional units and modules is used as an example; in practical applications, the above functions can be allocated to different functional units and modules as needed, i.e., the internal structure of the apparatus can be divided into different functional units or modules to complete all or part of the functions described above. The functional units and modules in the embodiments may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for ease of mutual distinction and are not used to limit the protection scope of this application. For the specific working processes of the units and modules in the above system, reference may be made to the corresponding processes in the aforementioned method embodiments, which are not repeated here.
An embodiment of this application further provides an electronic device, comprising: at least one processor, a memory, and a computer program stored in the memory and runnable on the at least one processor; when the processor executes the computer program, the steps in any of the above method embodiments are implemented.
An embodiment of this application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps in each of the above method embodiments.
An embodiment of this application provides a computer program product which, when run on a mobile terminal, causes the mobile terminal to implement the steps in each of the above method embodiments.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. With this understanding, this application implements all or part of the processes in the above method embodiments by instructing the relevant hardware through a computer program, which may be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of each of the above method embodiments. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, or some intermediate form. The computer-readable medium may include at least: any entity or apparatus capable of carrying the computer program code to the photographing apparatus/electronic device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, and a software distribution medium, for example a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In some jurisdictions, according to legislation and patent practice, computer-readable media may not be electrical carrier signals or telecommunication signals.
In the above embodiments, the description of each embodiment has its own emphasis; for parts not detailed or recorded in one embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the particular application and design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each particular application, but such implementations should not be considered beyond the scope of this application.
In the embodiments provided in this application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the apparatus/network device embodiments described above are merely illustrative; for example, the division of the modules or units is only a logical functional division, and in actual implementation there may be other divisions, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Furthermore, the mutual coupling or direct coupling or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units, i.e., they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solutions of the embodiments.
The above embodiments are only intended to illustrate the technical solutions of this application, not to limit them. Although this application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions recorded in the foregoing embodiments or make equivalent replacements of some of the technical features therein; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of this application, and shall all be included within the protection scope of this application.

Claims (12)

  1. A window page interaction method, characterized by comprising:
    in response to an interactive operation initiated by a user within a target window, determining an operation type of the interactive operation, wherein the target window contains at least one operable control;
    if the operation type is a first operation type, identifying the operable control as an operation object of the interactive operation, and controlling the operable control based on the interactive operation, wherein the first operation type is an operation type associated with the operable control;
    if the operation type is a second operation type, identifying the target window as the operation object of the interactive operation, and controlling the target window based on the interactive operation.
  2. The interaction method according to claim 1, characterized in that, before the determining, in response to the interactive operation initiated by the user within the target window, the operation type of the interactive operation, the method further comprises:
    determining interactable information of the operable control, wherein the interactable information comprises an interactable range and/or an interactable direction;
    determining, based on the interactable information, the first operation type associated with the operable control.
  3. The interaction method according to claim 2, characterized in that the second operation type is an operation type other than the first operation type.
  4. The interaction method according to claim 1, characterized in that the determining, in response to the interactive operation initiated by the user within the target window, the operation type of the interactive operation comprises:
    identifying a press duration corresponding to the interactive operation;
    determining the operation type of the interactive operation according to the press duration.
  5. The interaction method according to claim 1, characterized in that the identifying, if the operation type is the second operation type, the target window as the operation object of the interactive operation, and controlling the target window based on the interactive operation further comprises:
    configuring a window event flag corresponding to the target window to a preset valid value.
  6. The interaction method according to claim 5, characterized in that the determining, in response to the interactive operation initiated by the user within the target window, the operation type of the interactive operation comprises:
    obtaining, in response to the interactive operation, the window event flag corresponding to the target window;
    if the window event flag holds the valid value, controlling the target window based on the interactive operation.
  7. The interaction method according to claim 6, characterized in that, after the obtaining, in response to the interactive operation, the window event flag corresponding to the target window, the method further comprises:
    if the window event flag holds a preset invalid value, performing the determining of the operation type of the interactive operation.
  8. The interaction method according to any one of claims 1 to 7, characterized in that, after the identifying, if the operation type is the first operation type, the operable control as the operation object of the interactive operation, and controlling the operable control based on the interactive operation, the method further comprises:
    configuring a control event flag corresponding to the operable control to a preset first value.
  9. The interaction method according to claim 8, characterized in that the identifying, if the operation type is the second operation type, the target window as the operation object of the interactive operation, and controlling the target window based on the interactive operation comprises:
    if the operation type is the second operation type, determining the control event flag corresponding to the operable control;
    if the control event flag holds the preset first value, ending the response to the interactive operation;
    if the control event flag holds a preset second value, performing the identifying of the target window as the operation object of the interactive operation and controlling the target window based on the interactive operation.
  10. A window page interaction apparatus, characterized by comprising:
    an operation type determination unit, configured to determine, in response to an interactive operation initiated by a user within a target window, an operation type of the interactive operation, wherein the target window contains at least one operable control;
    a first operation type response unit, configured to, if the operation type is a first operation type, identify the operable control as an operation object of the interactive operation and control the operable control based on the interactive operation, wherein the first operation type is an operation type associated with the operable control;
    a second operation type response unit, configured to, if the operation type is a second operation type, identify the target window as the operation object of the interactive operation and control the target window based on the interactive operation.
  11. An electronic device, comprising a memory, a processor, and a computer program stored in the memory and runnable on the processor, characterized in that the processor, when executing the computer program, implements the method according to any one of claims 1 to 9.
  12. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the method according to any one of claims 1 to 9.