WO2023116012A1 - Screen display method and electronic device (屏幕显示方法和电子设备) - Google Patents

Screen display method and electronic device

Info

Publication number
WO2023116012A1
Authority
WO
WIPO (PCT)
Prior art keywords
screen
app
interface
user
electronic device
Prior art date
Application number
PCT/CN2022/114897
Other languages
English (en)
French (fr)
Other versions
WO2023116012A9 (zh)
Inventor
杨柳
孙祺
Original Assignee
荣耀终端有限公司 (Honor Device Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 荣耀终端有限公司 (Honor Device Co., Ltd.)
Priority to EP22862333.6A (published as EP4224298A4)
Priority to US18/026,548 (published as US20240295905A1)
Publication of WO2023116012A1
Publication of WO2023116012A9

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1637 Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F1/1647 Details related to the display arrangement, including those related to the mounting of the display in the housing including at least an additional display
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04803 Split screen, i.e. subdividing the display area or the window area into separate subareas

Definitions

  • Some folding screen devices are equipped with two screens, one of which is a foldable inner screen, that is, a folding screen. Users can unfold the inner screen when watching videos or playing games. When the inner screen is not in use, the folding screen device can be folded for easy portability, and the inner screen is folded at this time.
  • the other screen is the outer screen, which is usually smaller than the inner screen.
  • when the device is folded, the inner screen is folded inward and the outer screen is exposed, so that the user can operate the outer screen directly to use the folding screen device without unfolding the inner screen. For example, the user can operate the outer screen to answer or make calls, check messages, and so on.
  • at this time, the folding screen device assumes that the user needs to use the inner screen, so the outer screen automatically turns off.
  • the user may perform the first operation on the internal screen, for example, click the icon of the shortcut service on the desktop to open the window of the shortcut service.
  • the user may also perform the first operation on the external screen, such as double-clicking the external screen, or tapping the external screen with two fingers to open the shortcut service window.
  • the external screen may display a shortcut service window.
  • the shortcut service window may include one or more service controls, and each service control corresponds to an application (APP) or to a settings page of an APP. For example, a payment service control may correspond to a payment APP, or to the payment code page of the corresponding payment APP.
  • the service controls in the shortcut service window may include controls corresponding to the APP that the user needs to call quickly.
  • the electronic device receives a second operation performed by the user, where the second operation is clicking, among the at least one service control in the shortcut service window, a first control whose service code is to be displayed.
  • the operation authority of the external screen may be disabled to avoid privacy leakage or information loss caused by misoperation by others, and the user may perform the second operation on the internal screen.
  • alternatively, the operation authority of the external screen can be enabled, and other people can perform the second operation on the external screen to select the first control, so that, with the owner's permission, the service code to be displayed can be selected as needed, which makes use more convenient.
  • the electronic device displays the service code corresponding to the above-mentioned first control on the external screen.
  • when the first control selected by the user is the control of a payment APP, the corresponding service code can be the payment code; when the first control selected by the user is the control of a personal health APP, the corresponding service code can be the personal health code; when the first control selected by the user is the control of a ride APP, the corresponding service code can be the ride code.
  • the service code here may be in the form of a barcode, a two-dimensional (QR) code, or the like; this application does not limit the specific form of the service code.
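The bullets above describe the core interaction: a first operation opens the shortcut service window, a second operation selects a first control, and the corresponding service code is then shown on the second (outer) screen. The Java sketch below illustrates that mapping only; all class, method, and APP names are hypothetical and not taken from the patent.

```java
// Minimal sketch (not the patent's implementation): mapping each service control in the
// shortcut service window to the service code it should display on the outer screen.
import java.util.LinkedHashMap;
import java.util.Map;

public class ShortcutServiceWindowSketch {

    enum CodeForm { BARCODE, QR_CODE }

    // A service control corresponds to an APP (or a page of an APP) and yields a service code.
    record ServiceControl(String label, String targetApp, CodeForm form) {}

    private final Map<String, ServiceControl> controls = new LinkedHashMap<>();

    void register(ServiceControl c) { controls.put(c.label(), c); }

    // Second operation: the user clicks the "first control"; the corresponding
    // service code is then rendered on the second (outer) screen only.
    String onControlClicked(String label) {
        ServiceControl c = controls.get(label);
        if (c == null) throw new IllegalArgumentException("unknown control: " + label);
        // On a real device this would ask the target APP to draw its code page
        // and route the result to the outer screen via the display manager.
        return "show " + c.form() + " from " + c.targetApp() + " on outer screen";
    }

    public static void main(String[] args) {
        ShortcutServiceWindowSketch w = new ShortcutServiceWindowSketch();
        w.register(new ServiceControl("Payment code", "payment APP", CodeForm.QR_CODE));
        w.register(new ServiceControl("Ride code", "ride APP", CodeForm.QR_CODE));
        w.register(new ServiceControl("Health code", "personal health APP", CodeForm.QR_CODE));
        System.out.println(w.onControlClicked("Payment code"));
    }
}
```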
  • the receiving of the first operation performed by the user includes: receiving the first operation when a third interface is displayed on the first screen, where the third interface is one of the following: the lock screen interface, the always-on display (AOD) interface, the desktop, or the interface of the first APP.
  • the first operation can be performed to open the shortcut service window, and the second operation can then be performed in the shortcut service window so that the outer screen displays the service code to be presented, allowing the user to scan the code with the outer screen without turning over the electronic device. Since the external screen displays only the service code, there is no need to show the first interface of the first APP to other people or devices, which protects the user's information from leakage; the user can also scan the code while using the first APP without interruption, which protects the user's privacy and improves the scanning experience at the same time.
  • the first APP is any one of a browser APP, a video APP, a game APP, a conference APP, a document APP, and a video call APP.
  • the user can display the service code on the external screen by operating the electronic device without affecting the user's use of any of the above-mentioned browser APP, video APP, game APP, conference APP, document APP, or video call APP on the internal screen, which improves the user's scanning experience.
  • the first operation is an operation of sliding down to open the drop-down task bar on the first screen and clicking the shortcut service control in the drop-down task bar.
  • the user can swipe down from the top of the inner screen to open the drop-down task bar, and click the shortcut service control in the drop-down task bar to open the shortcut service window.
  • the first operation is an operation of double-clicking the second screen.
  • the user can also double-click the external screen to light it up and open the first interface (for example, the shortcut service window) on the external screen, which makes the operation more flexible and convenient.
  • the first interface includes: at least one of a payment code control, a ride code control, and a personal health code control.
  • Payment codes, ride codes, and personal health codes are frequently used service codes. Storing one or more of these service codes in the first interface (that is, the shortcut service window) can significantly reduce the number of times the user has to manually open an APP to display a service code, which improves the user experience.
  • the above-mentioned second control may be a menu control or an editing control.
  • the user can open the APP management list by operating the menu control in the shortcut service window, and then manage the service controls in the shortcut service window by operating the status button corresponding to the icon of the APP that supports the shortcut service in the APP management list .
  • the method can realize the user's self-management of the types of service controls in the shortcut service window, so that the operation of invoking the service code through the shortcut service window can meet the personalized needs of different users and improve user experience.
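As a companion to the APP management list described above, the following minimal Java sketch (hypothetical names, not the patent's implementation) shows how per-APP status switches could decide which service controls appear in the shortcut service window.

```java
// Each APP that supports the shortcut service has a status switch; only APPs whose switch
// is on contribute a service control to the shortcut service window.
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class AppManagementListSketch {
    private final Map<String, Boolean> shortcutEnabled = new LinkedHashMap<>();

    void addSupportedApp(String app) { shortcutEnabled.putIfAbsent(app, false); }

    // Toggling the status button next to an APP icon adds or removes its control.
    void setEnabled(String app, boolean enabled) { shortcutEnabled.put(app, enabled); }

    List<String> controlsInShortcutWindow() {
        return shortcutEnabled.entrySet().stream()
                .filter(Map.Entry::getValue)
                .map(Map.Entry::getKey)
                .toList();
    }

    public static void main(String[] args) {
        AppManagementListSketch list = new AppManagementListSketch();
        list.addSupportedApp("payment APP");
        list.addSupportedApp("ride APP");
        list.setEnabled("payment APP", true);   // user turns the payment control on
        System.out.println(list.controlsInShortcutWindow()); // [payment APP]
    }
}
```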
  • a method for displaying a screen is provided, which is applied to an electronic device, and the electronic device includes a first screen and a second screen, and the first screen and the second screen are respectively arranged on two opposite sides of the electronic device.
  • the method includes: displaying a third interface on the first screen, the third interface being the interface of the first application program APP; receiving a fifth operation performed by the user, the fifth operation being to open the first screen and the second The operation of the collaborative function of the screen; in response to the fifth operation, displaying the interface of the first APP displayed on the first screen on the second screen.
  • the electronic device may display the third interface of the first APP on the inner screen.
  • the application does not limit the type of the first APP, which may include but not limited to any one of browser APP, video APP, game APP, conference APP, document APP and video call APP.
  • the first interface of the above-mentioned first APP may be an interface displayed by a game APP, may also be a video playback interface displayed by a video APP, or may be an interface displayed by a document APP, which is not limited here.
  • the fifth operation of opening the collaborative function of the first screen and the second screen can be performed in several ways; for example, the user clicks the start button of the collaborative function on the desktop, or swipes down from the top of the inner screen to open the drop-down task bar and clicks the external screen collaboration control in the drop-down task bar.
  • the electronic device may display the first interface displayed on the inner screen on the outer screen under the trigger of the fifth operation of the user.
  • the outer screen may continue to follow the inner screen to display the interface of the first APP displayed on the inner screen.
  • when the user performs the fifth operation, the electronic device can activate the collaborative function of the first screen and the second screen, that is, the outer screen displays the content shown on the inner screen, so that the interface of the first APP can be shown to other users without flipping the electronic device.
  • this avoids the inconvenience caused by flipping the electronic device to share the screen, makes screen sharing more convenient, and improves the user's screen-sharing experience.
  • in the asynchronous collaboration mode, the outer screen may display content different from the inner screen. That is, if the user activates the asynchronous collaboration mode while the interface of the first APP is displayed on the inner screen, and then operates the inner screen to open the second APP (the sixth operation), the interface of the second APP is displayed on the inner screen. At this time, the outer screen still displays the interface of the first APP, while the inner screen can show the interface of the first APP in a floating window to remind the user of the content displayed on the outer screen.
  • in this way, the user can keep the interface of the first APP on the outer screen while using the second APP on the inner screen.
  • the floating window of the first APP lets the user monitor what is being presented on the outer screen, which enriches the forms of screen sharing, enriches the functions of the electronic device, facilitates use, and improves the user experience.
  • the fifth operation is an operation of swiping down on the first screen to open the drop-down task bar, clicking the external screen collaboration control in the drop-down task bar, and selecting mirror collaboration in the pop-up external screen collaboration selection window.
  • the second APP is a camera APP.
  • for example, parents who want to take pictures of their children, but worry that the child's lack of concentration will affect the shooting result, can open the video APP, turn on the asynchronous collaboration mode, and share the interface of the video APP to the external screen to attract the child's attention. The parent then opens the camera APP to take pictures, which keeps the child's attention directed toward the electronic device doing the shooting, makes it convenient to photograph the child, and ensures the shooting result.
  • a fifth aspect provides a chip, including a processor; the processor is used to read and execute a computer program stored in a memory, so as to execute any one of the methods in the technical solutions described in the first aspect or the second aspect.
  • a computer-readable storage medium is provided.
  • a computer program is stored in the computer-readable storage medium.
  • when the computer program is executed by a processor, the processor performs any one of the methods in the technical solutions described in the first aspect or the second aspect.
  • Fig. 4 is a schematic diagram of an interface displayed on the inner screen and the outer screen provided by the embodiment of the present application;
  • Fig. 12 is another example of the interface displayed on the inner screen and the outer screen provided by the embodiment of the present application.
  • Fig. 17 is an interaction flow diagram of another example of a screen display method provided by the embodiment of the present application.
  • Fig. 18 is another example of the interface displayed on the internal screen provided by the embodiment of the present application.
  • FIG. 19 is a flow chart of another example of a screen display method provided by an embodiment of the present application.
  • Fig. 20 is a flow chart of another example of a screen display method provided by the embodiment of the present application.
  • the controller may be the nerve center and command center of the terminal device 100 .
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or uses repeatedly. If the processor 110 needs to use the instructions or data again, they can be fetched directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and thus improves system efficiency.
  • the PCM interface can also be used for audio communication, sampling, quantizing and encoding the analog signal.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is generally used to connect the processor 110 and the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to realize the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interface includes camera serial interface (camera serial interface, CSI), display serial interface (display serial interface, DSI), etc.
  • the processor 110 communicates with the camera 193 through a CSI interface to realize the shooting function of the terminal device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to realize the display function of the terminal device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193 , the display screen 194 , the wireless communication module 160 , the audio module 170 , the sensor module 180 and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface conforming to the USB standard specification, specifically, it can be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the terminal device 100, and can also be used to transmit data between the terminal device 100 and peripheral devices. It can also be used to connect headphones and play audio through them. This interface can also be used to connect other terminal devices, such as AR devices.
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives the input from the battery 142 and/or the charging management module 140 to provide power for the processor 110 , the internal memory 121 , the external memory, the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be disposed in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be set in the same device.
  • the wireless communication function of the terminal device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • the structures of antenna 1 and antenna 2 in FIG. 1 are just an example.
  • Each antenna in the terminal device 100 can be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied on the terminal device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves and radiate them through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be set in the same device.
  • the wireless communication module 160 can provide wireless communication solutions applied to the terminal device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), the global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the terminal device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the terminal device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, and the like.
  • the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
  • the terminal device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light-emitting diodes (QLED), or the like.
  • the terminal device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • when the angle between the first display unit 301 and the second display unit 302 is 0 degrees, the folding screen is in a fully folded state; when the angle between the first display unit 301 and the second display unit 302 is 180 degrees, the folding screen is in a fully unfolded state; when the angle between the first display unit 301 and the second display unit 302 is greater than 0 degrees and less than 180 degrees, the folding screen is in a half-folded state.
  • the half-folded state of the folding screen can also be called the half-folded state of the folding screen mobile phone, and the same is true for the fully folded state and the fully unfolded state.
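The angle thresholds described above (0 degrees, between 0 and 180 degrees, and 180 degrees) can be illustrated with a small classification helper. This is an illustrative sketch only; the patent does not specify such a function.

```java
// Classifies the folding state from the angle between the two display units,
// following the thresholds given in the description above.
public class FoldStateSketch {
    enum FoldState { FULLY_FOLDED, HALF_FOLDED, FULLY_UNFOLDED }

    static FoldState classify(double angleDegrees) {
        if (angleDegrees <= 0) return FoldState.FULLY_FOLDED;
        if (angleDegrees >= 180) return FoldState.FULLY_UNFOLDED;
        return FoldState.HALF_FOLDED;   // strictly between 0 and 180 degrees
    }

    public static void main(String[] args) {
        System.out.println(classify(0));    // FULLY_FOLDED
        System.out.println(classify(95));   // HALF_FOLDED
        System.out.println(classify(180));  // FULLY_UNFOLDED
    }
}
```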
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
  • the terminal device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals, not only digital image signals, but also other digital signals. For example, when the terminal device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the terminal device 100 may support one or more video codecs.
  • the terminal device 100 can play or record videos in various encoding formats, for example: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the terminal device 100 can be realized through the NPU, such as: image recognition, face recognition, speech recognition, text understanding, etc.
  • the external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the terminal device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. Such as saving music, video and other files in the external memory card.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the processor 110 executes various functional applications and data processing of the terminal device 100 by executing instructions stored in the internal memory 121 .
  • the internal memory 121 may include an area for storing programs and an area for storing data.
  • the stored program area can store an operating system, at least one application program required by a function (such as a sound playing function, an image playing function, etc.) and the like.
  • the storage data area can store data created during the use of the terminal device 100 (such as audio data, phonebook, etc.) and the like.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • the audio module 170 may also be used to encode and decode audio signals.
  • the audio module 170 may be set in the processor 110 , or some functional modules of the audio module 170 may be set in the processor 110 .
  • Speaker 170A also referred to as a "horn" is used to convert audio electrical signals into sound signals.
  • the terminal device 100 can listen to music through the speaker 170A, or listen to hands-free calls.
  • the microphone 170C, also called a "mic" or "voice tube", is used to convert sound signals into electrical signals. When making a phone call or sending a voice message, the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C.
  • the terminal device 100 may be provided with at least one microphone 170C. In some other embodiments, the terminal device 100 may be provided with two microphones 170C, which may also implement a noise reduction function in addition to collecting sound signals. In some other embodiments, the terminal device 100 can also be provided with three, four or more microphones 170C to realize sound signal collection, noise reduction, identify sound sources, and realize directional recording functions, etc.
  • the pressure sensor 180A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • pressure sensor 180A may be disposed on display screen 194 .
  • there are many types of pressure sensor 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors.
  • a capacitive pressure sensor may consist of at least two parallel plates made of conductive material.
  • the terminal device 100 determines the intensity of pressure according to the change in capacitance.
  • the terminal device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the terminal device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • the magnetic sensor 180D includes a Hall sensor.
  • the terminal device 100 may use the magnetic sensor 180D to detect the opening and closing of the flip holster.
  • the terminal device 100 may detect opening and closing of the clamshell according to the magnetic sensor 180D.
  • further, features such as automatic unlocking when the flip cover is opened can be set according to the detected opening and closing state.
  • the acceleration sensor 180E can detect the acceleration of the terminal device 100 in various directions (generally three axes). When the terminal device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to recognize the posture of terminal equipment, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the terminal device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, access to application locks, take pictures with fingerprints, answer incoming calls with fingerprints, and so on.
  • the bone conduction sensor 180M can acquire vibration signals. In some embodiments, the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice. The bone conduction sensor 180M can also contact the human pulse and receive the blood pressure beating signal. In some embodiments, the bone conduction sensor 180M can also be disposed in the earphone, combined into a bone conduction earphone.
  • the audio module 170 can analyze the voice signal based on the vibration signal of the vibrating bone mass of the vocal part acquired by the bone conduction sensor 180M, so as to realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
  • the keys 190 include a power key, a volume key and the like.
  • the key 190 may be a mechanical key. It can also be a touch button.
  • the terminal device 100 may receive key input and generate key signal input related to user settings and function control of the terminal device 100 .
  • the motor 191 can generate a vibrating reminder.
  • the motor 191 can be used for incoming call vibration prompts, and can also be used for touch vibration feedback.
  • touch operations applied to different applications may correspond to different vibration feedback effects.
  • the motor 191 may also correspond to different vibration feedback effects for touch operations acting on different areas of the display screen 194 .
  • touch operations in different application scenarios (for example, time reminders, receiving messages, alarm clocks, and games) may also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the SIM card interface 195 is used for connecting a SIM card.
  • the SIM card can be connected and separated from the terminal device 100 by inserting it into the SIM card interface 195 or pulling it out from the SIM card interface 195 .
  • the terminal device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the multiple cards may be the same or different.
  • the SIM card interface 195 is also compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the terminal device 100 interacts with the network through the SIM card to implement functions such as calling and data communication.
  • the terminal device 100 adopts an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the terminal device 100 and cannot be separated from the terminal device 100 .
  • Diagram c in FIG. 2 is a software structural block diagram of the terminal device 100 in the embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
  • the application layer can consist of a series of application packages.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the APIs may include API1 and API2: API1 is an interface for adding a window for display, and API2 is an interface for registering collaborative services.
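The patent only names API1 (adding a window for display) and API2 (registering collaborative services) without giving signatures, so the following Java interfaces are an assumed sketch of what those two framework-layer APIs could look like.

```java
// Assumed shapes for the two framework-layer APIs named in the description;
// these signatures are illustrative, not taken from the patent or from Android.
public interface FrameworkApisSketch {

    // API1: lets an APP (or the system UI) ask the window manager to add a window for
    // display, and lets the two sides report "window added" and "drawing completed".
    interface Api1WindowDisplay {
        void requestAddWindow(String owner, String windowName);
        void onWindowAdded(String owner, String windowName);       // window manager -> APP
        void onDrawingCompleted(String owner, String windowName);  // APP -> window manager
    }

    // API2: lets a third-party APP register its collaborative (outer-screen) service
    // with the window manager, e.g. at first launch after installation.
    interface Api2CollaborationRegistry {
        void registerCollaborativeService(String appPackage, String serviceDescription);
    }
}
```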
  • the application framework layer can include window manager, content provider, view system, phone manager, resource manager, notification manager, input manager And display manager (display manager, that is, screen manager), etc.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make it accessible to applications.
  • Said data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebook, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on.
  • the view system can be used to build applications.
  • a display interface can consist of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide the communication function of the terminal device 100 .
  • for example, the management of call status (including connected, hung up, and so on).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the notification manager enables the application to display notification information in the status bar, which can be used to convey notification-type messages, and can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify the download completion, message reminder, etc.
  • the notification manager can also be a notification that appears on the top status bar of the system in the form of a chart or scroll bar text, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window. For example, a text message is displayed in the status bar, a prompt sound is issued, the terminal device vibrates, and the indicator light flashes, etc.
  • the input manager is used to handle the user's touch operations, such as click, swipe, double-tap, and so on.
  • the display manager is used to manage the inner screen and the outer screen, and control the display content to be displayed on the inner screen and/or the outer screen.
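To illustrate the display manager's role of routing drawn content to the inner screen, the outer screen, or both, here is a minimal hypothetical sketch; it is not the Android DisplayManager API, and the names are assumptions.

```java
// Routes a drawn window to the inner screen, the outer screen, or both.
import java.util.EnumSet;

public class DisplayManagerSketch {
    enum Screen { INNER, OUTER }

    void show(String windowName, EnumSet<Screen> targets) {
        for (Screen s : targets) {
            System.out.println("display " + windowName + " on " + s + " screen");
        }
    }

    public static void main(String[] args) {
        DisplayManagerSketch dm = new DisplayManagerSketch();
        dm.show("browser page", EnumSet.of(Screen.INNER));                // normal use
        dm.show("payment code", EnumSet.of(Screen.INNER, Screen.OUTER));  // code shown on both screens
    }
}
```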
  • Android runtime includes core library and virtual machine. The Android runtime is responsible for the scheduling and management of the Android system.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application program layer and the application program framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • a system library can include multiple function modules. For example: surface manager (surface manager), media library (media libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of various commonly used audio and video formats, as well as still image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least a display driver, a camera driver, an audio driver, a sensor driver, an inner screen driver and an outer screen driver.
  • the above-mentioned terminal device may be an electronic device with an inner screen and an outer screen, or may be a folding screen device with an outer screen.
  • for a folding screen device with an external screen, one of the screens is an internal screen that can be folded, that is, a folding screen. Users can unfold the internal screen for use when watching videos or playing games. When the user does not use the inner screen, the folding screen device can be folded for easy portability.
  • Another screen is an outer screen, and the outer screen is arranged on the side opposite to the inner screen. When the user folds the folding screen device, the inner screen will be folded and the outer screen will be exposed, which is convenient for the user to directly operate the outer screen to use the folding screen device without unfolding the inner screen.
  • in the technical solution of this application, when the user needs to show the screen to other people or other devices, for example when presenting a payment code for payment, the external screen can be lit up through a quick operation, and the payment code to be displayed can be projected onto the external screen of the electronic device 301 for display.
  • when the user needs to scan a code to pay, the payment code can be shown to the code scanning gun 302 through the external screen to complete the scan without turning over the electronic device 301, which makes code scanning more convenient.
  • this improves the convenience of scanning codes with the electronic device 301, does not affect the user's viewing of the inner screen, and improves the user's experience.
  • the user can also double-click the back of the electronic device (for example, double-click the folded external screen) to wake up the external screen (that is, light it up), and the shortcut service window shown in diagram d of Figure 4 is displayed on the external screen.
  • the inner screen may also display a shortcut service window in the form of a floating window as shown in FIG. 4 c .
  • as an example, the shortcut service window includes four controls: a local health code control, payment code A, payment code B, and a ride code.
  • the external screen can display the health code carrying personal health information.
  • the health code can identify the user's health status, which is convenient for the user to scan the code to pass or record personal health information, etc. .
  • the outer screen can display the ride code, which is convenient for the user to swipe the code to board the bus or pass through the gate.
  • a third-party APP can also send its registration information to the window manager through API2 to register with the system when it runs for the first time after installation; alternatively, the third-party APP can store the registration information at a fixed path during installation, and the electronic device obtains the registration information from that fixed path in order to determine which APPs can provide the external screen collaboration function.
  • the electronic device can also preset the registration information of some third-party APPs in the system. For example, the payment APP and the car ride APP are often the APPs that people need to use on the external screen, so the electronic device can preset the registration information of these APPs.
  • the above-mentioned API2 is an interface different from API1.
  • API1 is an interface used by third-party APPs to add windows for display.
  • API2 is an interface used by third-party APPs to register collaborative services with the window manager.
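The two registration paths described above (calling API2 at first launch, or placing registration information at a fixed path read by the system) could be sketched as follows. The package name and directory path are purely illustrative.

```java
// Sketch of the two registration paths for APPs that support outer-screen collaboration.
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class CollaborationRegistrationSketch {
    private final List<String> registeredApps = new ArrayList<>();

    // Path 1: the APP calls API2 on its first run after installation.
    void registerViaApi2(String appPackage) { registeredApps.add(appPackage); }

    // Path 2: the system scans a fixed path for registration files written at install time.
    void loadFromFixedPath(Path registrationDir) throws Exception {
        if (!Files.isDirectory(registrationDir)) return;
        try (var files = Files.list(registrationDir)) {
            files.forEach(f -> registeredApps.add(f.getFileName().toString()));
        }
    }

    List<String> appsSupportingOuterScreen() { return List.copyOf(registeredApps); }

    public static void main(String[] args) throws Exception {
        CollaborationRegistrationSketch reg = new CollaborationRegistrationSketch();
        reg.registerViaApi2("com.example.payment");               // hypothetical package name
        reg.loadFromFixedPath(Path.of("/data/collab-registry"));  // hypothetical fixed path
        System.out.println(reg.appsSupportingOuterScreen());
    }
}
```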
  • Fig. 7 takes the scene where the user opens the external screen to scan the code to pay when reading a novel through the browser as an example, and describes the display process of the external screen of the electronic device in detail.
  • the payment APP calls API2 to send the registration information of the payment APP to the window manager for registration.
  • the window manager adds the window of the browser in response to the recognition result of the user operation, and sends an instruction that the window is added to the browser through the API1.
  • the browser responds to the instruction that the window has been added sent by the window manager, draws the browser to display the corresponding page of the novel content, and returns a message of the completion of drawing to the window manager through API1 after the drawing is completed.
  • the window manager, in response to the drawing-completion message returned by the browser, sends a display message to the display manager; the display message includes information about displaying the page of the novel drawn by the browser on the internal screen.
  • the display manager controls the internal screen to display pages of novels.
  • the window management module adds a shortcut service window in response to the recognition result of the user operation, and sends an instruction that the window has been added to the system user interface (system UI) through API1.
  • the window manager, in response to the drawing-completion message returned by the system user interface, sends a display message to the display manager; the display message includes information about the shortcut service windows to be displayed on the inner screen and the outer screen.
  • the display manager controls the inner screen and the outer screen to display the shortcut service window.
  • the window manager adds a payment code window in response to the recognition result of the user operation, and sends an instruction that the window has been added to the payment APP through API1.
  • after the window manager receives the drawing-completion message returned by the payment APP, it sends a display message to the display manager; the display message includes information about the payment code page to be displayed on the inner screen and the outer screen.
  • the display manager controls the inner screen and the outer screen to display the payment code interface.
  • the user can scan the code to pay through the external screen without flipping the electronic device, making the scan code payment more convenient.
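The Fig. 7 flow above can be condensed into a small message-sequence sketch: the window manager adds the payment-code window via API1, the payment APP draws it and reports completion, and the display manager shows the result on both screens. All classes below are illustrative stand-ins, not the patent's components.

```java
// Condensed, illustrative version of the window manager / payment APP / display manager
// message sequence described for the scan-to-pay scenario.
public class PaymentCodeFlowSketch {

    static class WindowManagerStub {
        void onControlClicked(PaymentAppStub app, DisplayManagerStub dm) {
            System.out.println("window manager: add payment code window (via API1)");
            app.drawPaymentCode(this);   // APP draws the page, then reports back
            System.out.println("window manager: send display message for inner + outer screen");
            dm.show("payment code", true, true);
        }
        void onDrawingCompleted() {
            System.out.println("window manager: drawing-completion message received");
        }
    }

    static class PaymentAppStub {
        void drawPaymentCode(WindowManagerStub wm) {
            System.out.println("payment APP: draw payment code page");
            wm.onDrawingCompleted();     // completion returned through API1
        }
    }

    static class DisplayManagerStub {
        void show(String window, boolean inner, boolean outer) {
            System.out.println("display manager: show '" + window + "' inner=" + inner + " outer=" + outer);
        }
    }

    public static void main(String[] args) {
        new WindowManagerStub().onControlClicked(new PaymentAppStub(), new DisplayManagerStub());
    }
}
```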
  • if the user wants to share the screen with other people, he can also turn on the external screen collaboration function to project the interface of the internal screen to the external screen for display, and other people can watch the interface the user wants to show through the external screen, without the electronic device having to be turned over to share the screen, which makes screen sharing more convenient.
  • the inner screen can display the interface of the gallery as shown in Figure a in Figure 8, and the interface of the gallery includes multiple photos.
  • the user can perform the operation shown in figure a in FIG. 8 , perform a pull-down operation from the top of the inner screen, and the top of the inner screen displays the drop-down task bar as shown in figure b in FIG. 8 .
  • the drop-down task bar includes a screen projection control, and in FIG. 8 , the screen projection control is displayed as an "external screen collaboration" control as an example for illustration.
  • the external screen can display the interface of the gallery as shown in diagram c of Figure 8. If the user performs the operation of clicking one of the pictures as shown in diagram b of FIG. 8, the inner screen expands and displays that picture. At this time, the external screen is in the external screen collaboration mode and can also expand and display the picture selected by the user, as shown in diagram d of FIG. 8, so that others can view the picture selected by the user through the external screen.
  • the outer screen can display the content displayed on the inner screen synchronously, so that others can watch the content that the user wants to share through the outer screen.
  • when the external screen collaboration function is turned on, the electronic device can disable the operation authority of the external screen to prevent other people from operating the external screen to switch the displayed content, which could leak the user's privacy.
  • alternatively, when the external screen collaboration function is enabled, the electronic device can enable the operation authority of the external screen, so that other users can operate the external screen while viewing it, for example sliding left and right to switch pictures, clicking the edit control on the external screen to edit a picture, clicking the favorites control to bookmark a picture, or clicking the sharing control to share a picture with other applications. Enabling the operation authority of the external screen improves interactivity during collaboration between the inner and outer screens and enriches the collaboration functions of the electronic device.
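A hedged sketch of the outer-screen operation-authority switch described above: with collaboration active, touch events on the outer screen are either ignored (authority disabled, protecting privacy) or handled (authority enabled, allowing swiping, editing, bookmarking, and sharing). The API shown is hypothetical.

```java
// Gate outer-screen gestures on an operation-authority flag during collaboration.
public class OuterScreenPermissionSketch {
    private boolean collaborationOn = false;
    private boolean outerScreenOperable = false;

    void startCollaboration(boolean allowOuterScreenOperation) {
        collaborationOn = true;
        outerScreenOperable = allowOuterScreenOperation;
    }

    void onOuterScreenTouch(String gesture) {
        if (!collaborationOn || !outerScreenOperable) {
            System.out.println("ignored outer-screen gesture: " + gesture);
            return;
        }
        System.out.println("handled outer-screen gesture: " + gesture);
    }

    public static void main(String[] args) {
        OuterScreenPermissionSketch s = new OuterScreenPermissionSketch();
        s.startCollaboration(false);
        s.onOuterScreenTouch("swipe-left");   // ignored: authority disabled
        s.startCollaboration(true);
        s.onOuterScreenTouch("swipe-left");   // handled: e.g. switch to the next picture
    }
}
```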
  • when the user enables the external screen collaboration function, the user may also select the mirror collaboration or asynchronous collaboration mode as required.
  • mirror collaboration is the mode in which the external screen and the internal screen display the same content.
  • asynchronous collaboration is the mode in which the content displayed on the external screen and the internal screen is different or not exactly the same.
  • for example, the external screen displays the interface of APP1 while the internal screen displays APP2.
  • in that case, the outer screen displays the APP1 interface.
  • the inner screen displays the APP2 interface and also displays the APP1 interface in the form of a floating window.
  • the electronic device can project the interface of the APP running in the foreground to the external screen for display, and the interfaces of other APPs can only be displayed on the internal screen.
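The difference between mirror collaboration and asynchronous collaboration described above can be summarized in a small state sketch: in mirror mode the outer screen follows the inner screen, while in asynchronous mode the outer screen keeps the projected foreground APP and the inner screen adds a floating window of it. Names are assumptions, not the patent's.

```java
// Tracks which APP each screen shows under the two collaboration modes described above.
public class CollaborationModeSketch {
    enum Mode { MIRROR, ASYNC }

    private final Mode mode;
    private String outerScreenApp;   // APP projected to the outer screen
    private String innerScreenApp;
    private String floatingWindowApp;

    CollaborationModeSketch(Mode mode, String projectedApp) {
        this.mode = mode;
        this.outerScreenApp = projectedApp;
        this.innerScreenApp = projectedApp;
    }

    // Sixth operation: the user opens another APP on the inner screen.
    void openOnInnerScreen(String app) {
        innerScreenApp = app;
        if (mode == Mode.MIRROR) {
            outerScreenApp = app;                 // outer screen follows the inner screen
            floatingWindowApp = null;
        } else {
            floatingWindowApp = outerScreenApp;   // remind the user what the outer screen shows
        }
    }

    @Override public String toString() {
        return "outer=" + outerScreenApp + ", inner=" + innerScreenApp
                + (floatingWindowApp != null ? " (+floating " + floatingWindowApp + ")" : "");
    }

    public static void main(String[] args) {
        CollaborationModeSketch async = new CollaborationModeSketch(Mode.ASYNC, "video APP");
        async.openOnInnerScreen("camera APP");
        System.out.println(async);   // outer=video APP, inner=camera APP (+floating video APP)
    }
}
```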
  • the electronic device defaults to the asynchronous collaboration mode: the browser interface is displayed on the inner screen with the shortcut service window shown as a floating window, while the outer screen displays only the shortcut service window and does not display the browser interface.
  • when the interface of the gallery is displayed on the inner screen, the user can perform the pull-down operation from the top of the inner screen, as shown in diagram a of FIG. 9, to open the drop-down task bar; the inner screen then displays the drop-down task bar as shown in diagram b of FIG. 9.
  • the selection window of the collaboration mode includes mirror collaboration and asynchronous collaboration, and the user can choose a suitable collaboration mode as needed.
  • the outer screen displays the same content as the inner screen, for example, the outer screen displays the gallery interface as shown in Figure 9 d.
  • the external screen also expands to display this picture.
  • the outer screen also switches pictures accordingly.
  • the user can also click the "cancel" control in the selection window of the collaboration mode to exit the startup of the external screen collaboration function.
  • a default external screen collaboration mode can be set for some applications. For example, in the embodiment shown in FIG. 8 above, when the user starts the external screen collaboration function for the gallery, the gallery application can default to the mirror collaboration mode, without requiring the user to choose.
  • the flow of the embodiment shown in Figure 9 can also be used.
  • the external screen displays the same video content as the internal screen, to achieve video sharing.
  • users can also operate the video playback interface on the inner screen to control video playback. For example, if the user clicks the start, pause, or stop button on the video playback interface on the inner screen, the outer screen also synchronously displays the video playback interface of the inner screen in response to the user's operations.
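As described above, in mirror collaboration the playback controls operated on the inner screen (start, pause, stop) are reflected on the outer screen because both screens render the same playback state. The following is an illustrative sketch under that assumption.

```java
// Both screens render the same playback state, so inner-screen controls appear on the outer screen.
public class PlaybackMirrorSketch {
    enum PlaybackState { PLAYING, PAUSED, STOPPED }

    private PlaybackState state = PlaybackState.STOPPED;

    // The user taps a playback control on the inner screen.
    void onInnerScreenControl(PlaybackState requested) {
        state = requested;
        render("inner screen");
        render("outer screen");   // mirror mode: outer screen shows the same interface
    }

    private void render(String screen) {
        System.out.println(screen + " shows video player in state " + state);
    }

    public static void main(String[] args) {
        PlaybackMirrorSketch p = new PlaybackMirrorSketch();
        p.onInnerScreenControl(PlaybackState.PLAYING);
        p.onInnerScreenControl(PlaybackState.PAUSED);
    }
}
```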
  • FIG. 10 takes a scene where a user shares a picture as an example to describe in detail the process of displaying a picture on an external screen of an electronic device.
  • the gallery calls API2 to send the registration information of the gallery to the window manager for registration.
  • the input manager in the electronic device recognizes the user's operation, and sends the recognition result to the window manager.
  • the gallery draws the page of the gallery in response to the window addition completion instruction sent by the window manager, and returns a drawing completion message to the window manager through API1 after the drawing is completed.
  • the window manager, in response to the drawing-completion message returned by the gallery, sends a display message to the display manager; the display message includes information about displaying the page of the gallery on the internal screen.
  • S1001 is not necessarily executed before S1002, and may also be executed after S1002, S1003, S1004, S1005 or S1006, as long as it is executed before S1007.
  • the input processing module recognizes the user's operation, and sends the recognition result to the window management module.
  • the window management module adds a window in the pull-down taskbar in response to the recognition result of the user operation, and sends an instruction indicating that the window has been added to the system user interface.
  • the system user interface draws the window of the drop-down taskbar in response to the window addition completion instruction sent by the window manager, and returns a drawing completion message to the window manager after the drawing is completed.
  • the window manager sends a display message to the display manager in response to the drawing completion message returned by the system user interface, and the display message includes information indicating that the window is to be displayed on the internal screen.
  • the display manager controls the inner screen to display the window of the gallery.
  • the window manager sends a display message to the display manager in response to the recognition result of the user operation, and the display message includes information about displaying pages of the gallery by the inner screen and the outer screen.
  • the display manager controls the inner screen and the outer screen to display pages of the gallery.
  • the window manager responds to the recognition result of the user operation, adds the window of the picture, and sends an instruction that the window has been added to the gallery through the API1.
  • the gallery draws the page of the picture in response to the window addition completion instruction sent by the window manager, and returns a message of drawing completion to the window manager through API1 after the drawing is completed.
  • the window manager sends a display message to the display manager in response to the drawing completion message returned by the gallery, and the display message includes information indicating that the picture page is to be displayed on both the inner screen and the outer screen.
  • the display manager controls the inner screen and the outer screen to display the picture page.
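  • The flow above (an APP registering over API2, the window manager adding a window and notifying the APP over API1, the APP drawing its page and returning a draw-complete message, and the window manager then asking the display manager to show the page on the inner and/or outer screen) can be summarized with a small sketch. The sketch below only illustrates that message sequence; the class and method names are invented and are not the actual interfaces of the device described in this publication.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of the message flow around FIG. 10; every name here is
// hypothetical, the publication does not disclose concrete interfaces.
enum Screen { INNER, OUTER, BOTH }

interface CollaborationApp {
    String name();
    // API1 direction: window manager -> APP; the APP draws and reports completion.
    void onWindowAdded(WindowManagerSketch wm, String page);
}

class DisplayManagerSketch {
    void show(Screen target, String appName, String page) {
        // In the real device this would drive the inner/outer screen drivers.
        System.out.println(target + " shows " + appName + "/" + page);
    }
}

class WindowManagerSketch {
    private final List<String> registered = new ArrayList<>(); // API2 registrations
    private final DisplayManagerSketch displayManager = new DisplayManagerSketch();

    // API2: an APP registers so the system knows it supports outer-screen collaboration.
    void register(CollaborationApp app) {
        registered.add(app.name());
    }

    // The input manager reports a recognized user operation; a window is added and
    // the APP is asked (over API1) to draw the corresponding page.
    void onUserOperation(CollaborationApp app, String page) {
        app.onWindowAdded(this, page);
    }

    // API1: the APP reports that drawing finished; the window manager then tells
    // the display manager which screen(s) should show the page.
    void onDrawComplete(CollaborationApp app, String page, Screen target) {
        displayManager.show(target, app.name(), page);
    }
}

public class GalleryFlowDemo {
    public static void main(String[] args) {
        WindowManagerSketch wm = new WindowManagerSketch();
        CollaborationApp gallery = new CollaborationApp() {
            public String name() { return "Gallery"; }
            public void onWindowAdded(WindowManagerSketch w, String page) {
                // Draw the page, then return the draw-complete message (S1004/S1017).
                w.onDrawComplete(this, page,
                        page.equals("thumbnails") ? Screen.INNER : Screen.BOTH);
            }
        };
        wm.register(gallery);                      // S1001: registration over API2
        wm.onUserOperation(gallery, "thumbnails"); // S1002-S1006: shown on the inner screen
        wm.onUserOperation(gallery, "picture");    // S1015-S1019: shown on inner and outer screen
    }
}
```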
  • the user as a parent can open a cartoon stored locally or open another video APP to play a video, and at this time, the inner screen can display the video playback interface as shown in Figure 12a.
  • the inner screen displays a pull-down task bar as shown in figure b of FIG. 12 .
  • the user performs the operation shown in diagram b of FIG. 12, that is, clicks the "External Screen Collaboration" control in the pull-down task bar, and the inner screen pops up the collaboration mode selection window shown in diagram c of FIG. 12.
  • the collaboration mode selection window includes controls for mirror collaboration and asynchronous collaboration.
  • the external screen can display the video playback interface as shown in diagram d of FIG. 12.
  • the user can open the camera APP to take pictures or record videos.
  • the user can slide from the side of the inner screen to the middle to exit the video playback interface as shown in Figure a in Figure 14.
  • the video playback interface will not be closed; instead, as shown in diagram b of FIG. 14, it is displayed in a corner of the inner screen in the form of a reduced floating window.
  • the user can click the icon of the camera on the desktop as shown in figure b in FIG. 14 to open the camera for shooting.
  • the outer screen can always play cartoons to attract children's attention, and the inner screen displays the shooting preview interface as shown in Figure 14 c.
  • the user can slide from the side of the inner screen to the middle to exit the video playback interface as shown in Figure a in Figure 15.
  • the video playback interface can be displayed in a corner of the inner screen in the form of a reduced floating ball, as shown in diagram b of FIG. 15. The user can then click the camera icon on the desktop, as shown in diagram b of FIG. 15, to open the camera for shooting.
  • the outer screen can always play cartoons to attract children's attention, and the inner screen displays the shooting preview interface as shown in Figure 15 c.
  • the input manager in the electronic device recognizes the user's operation, and sends the recognition result to the window manager.
  • the window manager adds the window of the video player, and sends an instruction indicating that the window has been added to the video player through API1.
  • the video player draws the video page in response to the window addition completion instruction sent by the window manager, and returns a drawing completion message to the window manager through API1 after the drawing is completed.
  • the window manager sends a display message to the display manager in response to the drawing completion message returned by the video player, and the display message includes information indicating that the video player page is to be displayed on the inner screen.
  • the input manager recognizes the user's operation and sends the recognition result to the window manager.
  • the input processing module recognizes the user's operation, and sends the recognition result to the window management module.
  • the window management module adds the desktop window and the floating window of the video player in response to the recognition result of the user operation, and sends an instruction of completion of adding the floating window to the video player.
  • the video player draws its floating window in response to the floating-window addition completion instruction sent by the window manager, and returns a drawing completion message to the window manager through API1 after the drawing is completed.
  • the window manager sends a display message to the display manager in response to the drawing completion message returned by the video player, and the display message includes information indicating that the desktop and the floating window of the video player are to be displayed on the internal screen.
  • the display manager controls the internal screen to display the desktop and the floating window of the video player.
  • the window manager adds the window of the camera in response to the recognition result of the user operation, and sends an instruction of completion of window addition to the camera through API1.
  • the camera draws the interface of the camera in response to the window addition completion instruction sent by the window manager, and returns a drawing completion message to the window manager through API1 after the drawing is completed.
  • the window manager sends a display message to the display manager in response to the drawing completion message returned by the camera, and the display message includes information indicating that the camera interface is to be displayed on the internal screen.
  • the display manager controls the internal screen to display the page of the camera, and displays a video playback interface in the form of a floating window.
  • the user can play a video for the child on the external screen to attract the child's attention while using the shooting preview interface on the internal screen to shoot. This does not interfere with the parent's shooting and keeps the child's attention on the device so that the child cooperates with the shooting, improving the shooting effect and shooting speed.
  • the inner screen can display a sidebar as shown in Figure 18 b.
  • the user can open external screen collaboration in the pull-down task bar and select asynchronous collaboration; the internal screen can then display the interface shown in diagram c of FIG. 14 or diagram c of FIG. 15, that is, the camera preview interface together with the reduced floating window or floating ball of the video player, while the outer screen displays the video player interface as shown in diagram d of FIG. 12, diagram a of FIG. 13, or diagram b of FIG. 13.
  • when the user finishes sharing, the user can click the external screen collaboration control in the pull-down task bar to stop the external screen collaboration. At this time, the external screen can be turned off, or it can display the external screen desktop instead of the previously displayed interface.
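  • On a standard Android software stack, one possible way to put content on a physically separate second screen such as the outer screen is the public DisplayManager/Presentation mechanism. The sketch below shows that approach for the video-sharing case and for stopping the collaboration; it is an assumption for illustration only, and the publication does not state that the device uses the Presentation API (the outer screen may instead be driven by the vendor's own window and display managers as described above).

```java
import android.app.Activity;
import android.app.Presentation;
import android.hardware.display.DisplayManager;
import android.net.Uri;
import android.os.Bundle;
import android.view.Display;
import android.widget.VideoView;

// Illustrative sketch only: drives a secondary display with the public Android
// Presentation API. The publication does not disclose this implementation.
public class OuterScreenVideoShare {
    private Presentation outerPresentation;

    /** Start showing a video on the first available presentation display. */
    public void start(Activity activity, Uri videoUri) {
        DisplayManager dm = activity.getSystemService(DisplayManager.class);
        Display[] displays = dm.getDisplays(DisplayManager.DISPLAY_CATEGORY_PRESENTATION);
        if (displays.length == 0) {
            return; // no secondary display reported in this category
        }
        outerPresentation = new Presentation(activity, displays[0]) {
            @Override
            protected void onCreate(Bundle savedInstanceState) {
                super.onCreate(savedInstanceState);
                VideoView video = new VideoView(getContext());
                setContentView(video);
                video.setVideoURI(videoUri); // cartoon shown to the child on the outer screen
                video.start();
            }
        };
        outerPresentation.show();
    }

    /** Stop external screen collaboration: dismiss the outer-screen content so the
     *  system can turn the outer screen off or fall back to its desktop. */
    public void stop() {
        if (outerPresentation != null) {
            outerPresentation.dismiss();
            outerPresentation = null;
        }
    }
}
```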
  • Figure 19 is a flow chart of another screen display method provided by the embodiment of the present application.
  • the method is applied to an electronic device including a first screen and a second screen.
  • taking the first screen as the inner screen and the second screen as the outer screen as an example, where the outer screen and the inner screen are arranged on opposite sides of the electronic device, the method includes:
  • the user may perform the first operation on the internal screen, for example, click the shortcut service icon on the desktop to open the shortcut service window.
  • the user may also perform the first operation on the external screen, such as double-clicking the external screen, or tapping the external screen with two fingers to open the shortcut service window.
  • the external screen may display a first interface.
  • here, the first interface is illustrated by taking a shortcut service window as an example.
  • the first interface may be a shortcut service window as shown in diagram d of FIG. 4 .
  • the shortcut service window can include one or more service controls, and each service control corresponds to an APP or an APP setting page.
  • a payment service control can correspond to a payment APP, or can correspond to the payment code page of that payment APP.
  • the service controls in the window of the above-mentioned shortcut service may include the first control corresponding to the APP that the user needs to call quickly.
  • the electronic device receives a second operation performed by the user to select the first control.
  • the second operation is an operation of clicking the first control to be displayed from at least one service control in the shortcut service window.
  • the electronic device can display the first interface, such as the shortcut service window, on the external screen based on the user's first operation, and then, triggered by the second operation, display on the external screen the second interface corresponding to the first control selected by the user, such as the service code interface. This can avoid the inconvenience caused by the user having to turn the electronic device over to scan the code, making the code scanning operation more convenient and faster.
  • the electronic device itself is in a folded state, that is, the inner screen is in a folded state, and the inner screen is turned off at this time.
  • the operation authority of the external screen is enabled, and the user can perform a second operation on the external screen to open the corresponding second interface. In this way, the user can scan the code without unfolding the inner screen, which improves the convenience of the code scanning operation.
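  • The two operations of this method can be captured in a few lines: the first operation brings up the shortcut service window (the first interface) on the external screen, and the second operation selects one service control so that its service code (the second interface) is rendered on the external screen. The sketch below is hypothetical; the control names, the ScreenSurface interface, and the mapping to APP pages are invented for illustration.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of the S1901/S1902 logic; none of these names come from the publication.
interface ScreenSurface {
    void show(String content);
}

class ShortcutServiceController {
    // Each service control maps to an APP (or an APP page) that renders a service code.
    private final Map<String, String> serviceControls = new LinkedHashMap<>();
    private final ScreenSurface outerScreen;

    ShortcutServiceController(ScreenSurface outerScreen) {
        this.outerScreen = outerScreen;
        serviceControls.put("A payment code", "payment APP / payment code page");
        serviceControls.put("Ride code", "transit APP / ride code page");
        serviceControls.put("Health code", "health APP / health code page");
    }

    /** First operation: pull-down task bar tap, or double tap on the outer screen. */
    void onFirstOperation() {
        outerScreen.show("shortcut service window: " + serviceControls.keySet());
    }

    /** Second operation: the user selects one control; its service code goes to the outer screen. */
    void onSecondOperation(String selectedControl) {
        String target = serviceControls.get(selectedControl);
        if (target != null) {
            outerScreen.show("service code rendered by " + target);
        }
    }
}
```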
  • a possible implementation of the above step S1901 may include: receiving the first operation when the first screen displays a third interface, the third interface being one of the following interfaces: a lock screen interface, an AOD interface, the desktop, or the interface of the first APP.
  • the user can perform the first operation to open the shortcut service window (that is, the first interface), and then perform the second operation in the shortcut service window so that the external screen displays the service code that needs to be shown (that is, the second interface). In this way, the user can use the external screen to scan the code without turning over the electronic device.
  • the above-mentioned first APP may be any one of a browser APP, a video APP, a game APP, a conference APP, a document APP, and a video call APP.
  • the user can display the required service code on the external screen by operating the electronic device without affecting the user's use of any of the above-mentioned browser APP, video APP, game APP, conference APP, document APP, or video call APP on the internal screen, which improves the user's code scanning experience.
  • the first operation is an operation of double-clicking the external screen (ie, the second screen).
  • the user can also double-click the external screen to light it up and open the shortcut service window on the external screen.
  • the shortcut service window can be opened by double-clicking the external screen without precise manipulation of the internal screen.
  • the operation mode is more flexible and convenient.
  • the first interface includes: at least one of a payment code control, a ride code control, and a personal health code control.
  • Payment codes, ride codes, and personal health codes are frequently used service codes. Storing one or more of these service codes in the shortcut service window can significantly reduce the number of times the user has to manually open an APP to bring up a service code, improving the user experience.
  • the user can also manage, for example add or delete, the service controls in the window of the shortcut service.
  • the shortcut service window in Fig. 4 is shown as an example with controls including local health treasure, A payment code, B payment code and ride code.
  • the user may perform a third operation, and for the third operation, refer to the operation of clicking the second control in the shortcut service window of the inner screen shown in diagram a of FIG. 6 .
  • the second control may be a menu control.
  • the electronic device receives the third operation and responds, and displays the APP management list shown in Figure b in Figure 6 in the shortcut service window displayed on the inner screen.
  • the APP management list includes icons of multiple APPs that support the shortcut service and multiple status buttons, and the icons of the multiple APPs supporting the shortcut service are in one-to-one correspondence with the multiple status buttons; that is, the icon of each shortcut service APP corresponds to a status button representing its own state.
  • Each status button has two states, on and off. The on state means that the shortcut service window contains the control of the shortcut service APP corresponding to the status button; if the user wants to remove the icon of that APP from the shortcut service window, the user can manually switch the status button to the off state. The off state means that the shortcut service window does not contain the corresponding APP control.
  • the electronic device may add or delete the icon of the APP supporting the shortcut service corresponding to the target status button in the shortcut service window. That is to say: if the target state button is in the closed state, when the user performs the fourth operation of clicking the target state button, the closed state can be switched to the open state.
  • the original shortcut service window does not contain the icon of the APP corresponding to the target state button.
  • if the target status button is in the on state, the shortcut service window contains the icon of the APP corresponding to the target status button; in this case, performing the fourth operation switches the button to the off state, and the icon of that APP is deleted from the shortcut service window.
  • the user can open the APP management list by operating the menu control in the shortcut service window, and then operate the status button corresponding to the icon of an APP that supports the shortcut service in the APP management list to manage the service controls in the shortcut service window.
  • this method allows users to manage the types of service controls in the shortcut service window themselves, so that invoking a service code through the shortcut service window can meet the personalized needs of different users and improve the user experience.
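  • A minimal sketch of the APP management list logic described here, assuming it can be modeled as a set of per-APP status buttons whose toggling adds or removes the corresponding control from the shortcut service window. The class and method names are invented; the publication does not give an implementation.

```java
import java.util.LinkedHashMap;
import java.util.LinkedHashSet;
import java.util.Map;
import java.util.Set;

// Hypothetical sketch of the fourth-operation handling (status button toggles).
class ShortcutServiceManager {
    // true = on: the APP's control is present in the shortcut service window.
    private final Map<String, Boolean> statusButtons = new LinkedHashMap<>();
    private final Set<String> shortcutWindowControls = new LinkedHashSet<>();

    void registerSupportedApp(String appName, boolean enabledByDefault) {
        statusButtons.put(appName, enabledByDefault);
        if (enabledByDefault) {
            shortcutWindowControls.add(appName);
        }
    }

    /** Fourth operation: tapping the status button of appName flips its state and
     *  adds or removes the APP's control in the shortcut service window. */
    void onStatusButtonTapped(String appName) {
        Boolean on = statusButtons.get(appName);
        if (on == null) {
            return; // not an APP that supports the shortcut service
        }
        boolean nowOn = !on;
        statusButtons.put(appName, nowOn);
        if (nowOn) {
            shortcutWindowControls.add(appName);    // off -> on: add the control
        } else {
            shortcutWindowControls.remove(appName); // on -> off: delete the control
        }
    }

    Set<String> currentControls() {
        return shortcutWindowControls;
    }
}
```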
  • the electronic device may display the third interface of the first APP on the inner screen.
  • the embodiment of the present application does not limit the type of the first APP, which may include but not limited to any one of browser APP, video APP, game APP, conference APP, document APP and video call APP.
  • the first interface of the above-mentioned first APP may be an interface displayed by a game APP, may also be a video playback interface displayed by a video APP, or may be an interface displayed by a document APP, which is not limited in this embodiment of the present application.
  • S2002 Receive a fifth operation performed by the user, where the fifth operation is an operation to enable the coordinated function of the first screen and the second screen (for example, the coordinated function of the inner screen and the outer screen).
  • when the user needs to scan a code, the user can perform the fifth operation of enabling the collaboration function of the inner screen and the outer screen, for example, by clicking the start button of the collaboration function on the desktop, or by sliding down from the top of the inner screen to open the pull-down task bar and clicking the external screen collaboration control in the pull-down task bar.
  • the electronic device may display the interface of the first APP displayed on the inner screen on the outer screen under the trigger of the fifth operation of the user.
  • the outer screen may continue to follow the inner screen to display the interface of the first APP displayed on the inner screen. For example, when the inner screen displays the first interface of picture thumbnails in the gallery, if the user performs the fifth operation, the outer screen displays the picture thumbnails in the gallery displayed on the inner screen. At this time, if the user clicks one of the pictures on the inner screen to display the enlarged picture of the picture, the outer screen will synchronously display the enlarged picture of the picture displayed on the inner screen.
  • when the user performs the fifth operation, the electronic device can activate the collaboration function of the inner screen and the outer screen, that is, the outer screen can display the content displayed on the inner screen, so that the interface of the first APP can be shown to other users without turning over the electronic device.
  • this avoids the inconvenience caused by flipping the electronic device to share the screen, makes sharing the screen more convenient, and improves the user's screen sharing experience.
  • the fifth operation may also be an operation of sliding down on the first screen to open the pull-down task bar, clicking the external screen collaboration control in the pull-down task bar, and selecting asynchronous collaboration in the pop-up external screen collaboration selection window.
  • the pull-down task bar includes a control for external screen collaboration.
  • the internal screen may pop up an external screen collaboration selection window.
  • the external screen collaboration selection window may also include a control for mirror collaboration. If the user clicks the mirror collaboration control, the electronic device starts the mirror collaboration mode of the collaboration function. Users can choose different collaboration function modes as needed to suit the current usage scenario.
  • the electronic device may also receive a sixth operation performed by the user to open the second APP on the inner screen.
  • the electronic device displays the interface of the second APP on the inner screen, and simultaneously displays the interface of the first APP on the inner screen in the form of a floating window.
  • the outer screen may display content different from that of the inner screen. That is, if the user activates the asynchronous collaboration mode while the interface of the first APP is displayed on the inner screen, then when the user operates the inner screen to open the second APP (that is, performs the sixth operation), the inner screen displays the interface of the second APP. At this time, the outer screen still displays the interface of the first APP, and the inner screen can display the interface of the first APP in a floating window to remind the user of the content displayed on the outer screen.
  • after enabling the asynchronous collaboration mode of the collaboration function, the user can keep the interface of the first APP displayed on the outer screen while using the second APP on the inner screen. At the same time, the user can monitor the presentation of the first APP through its interface displayed in the floating window on the inner screen, which enriches the forms of screen sharing, enriches the functions of the electronic device, facilitates the user's use, and improves the user experience.
  • the sixth operation by which the user opens the second APP may be an operation performed on the inner screen of first tapping the "Home" button to exit the interface of the first APP and return to the desktop, and then tapping the icon of the second APP on the desktop to open the second APP; it may also be an operation of opening the sidebar by sliding a certain distance from the side of the inner screen toward the middle on the interface of the first APP, and then tapping the icon of the second APP in the sidebar to open the second APP.
  • after turning on the mirror collaboration mode of the collaboration function, the user can keep the external screen and the internal screen showing the same interface to realize screen sharing, avoiding the inconvenience of having to adjust and flip the electronic device when multiple people watch the same screen.
  • this method makes screen sharing more convenient and improves the user experience.
  • the above-mentioned first APP is any one of a gallery APP, a video APP, and a document APP.
  • Users can share the gallery APP, video APP, document APP and other interfaces to the external screen by opening the collaboration function of the internal screen and external screen, which is convenient for users to share pictures, videos and documents.
  • when the first APP is a video APP, the second APP may be a camera APP.
  • when parents want to take pictures of their children but worry that the child's lack of concentration will affect the shooting effect, they can open the video APP, turn on the asynchronous collaboration mode, and share the interface of the video APP to the external screen to attract the child's attention. The parent then opens the camera APP to take pictures, which keeps the child's attention directed toward the electronic device doing the shooting, makes it convenient for the user to photograph the child, and ensures the shooting effect.
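  • The difference between the two collaboration modes amounts to a routing decision: in mirror collaboration both screens show the same interface, while in asynchronous collaboration the outer screen keeps the first APP and the inner screen shows the second APP plus a floating-window reminder of the first APP. The sketch below only illustrates that decision; the enum and class names are not from the publication.

```java
// Hypothetical sketch of how displayed content could be routed per collaboration mode.
enum CollaborationMode { MIRROR, ASYNC }

class ScreenContent {
    final String innerScreen;
    final String outerScreen;
    ScreenContent(String innerScreen, String outerScreen) {
        this.innerScreen = innerScreen;
        this.outerScreen = outerScreen;
    }
}

class CollaborationRouter {
    ScreenContent route(CollaborationMode mode, String firstAppUi, String secondAppUi) {
        switch (mode) {
            case MIRROR:
                // Mirror collaboration: both screens show the same interface.
                return new ScreenContent(firstAppUi, firstAppUi);
            case ASYNC:
            default:
                // Asynchronous collaboration: the outer screen keeps the first APP
                // (e.g. the video APP playing a cartoon), the inner screen shows the
                // second APP (e.g. the camera preview) plus a floating window of the
                // first APP so the user knows what the outer screen is showing.
                String inner = (secondAppUi == null)
                        ? firstAppUi
                        : secondAppUi + " + floating window(" + firstAppUi + ")";
                return new ScreenContent(inner, firstAppUi);
        }
    }
}
```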
  • the corresponding device includes a corresponding hardware structure and/or software module for performing each function.
  • the present application can be implemented in the form of hardware or a combination of hardware and computer software in combination with the units and algorithm steps of each example described in the embodiments disclosed herein. Whether a certain function is executed by hardware or computer software drives hardware depends on the specific application and design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each specific application, but such implementation should not be regarded as exceeding the scope of the present application.
  • the present application may divide the screen display device into function modules according to the above method examples; for example, each function may be divided into a corresponding function module, or two or more functions may be integrated into one module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or in the form of software function modules. It should be noted that the division of modules in this application is schematic, and is only a logical function division, and there may be other division methods in actual implementation.
  • FIG. 21 shows a schematic structural diagram of a screen display device provided by the present application.
  • the device 2100 is applied to an electronic device, and the electronic device includes a first screen and a second screen, the first screen and the second screen being respectively arranged on opposite sides of the electronic device; for example, the electronic device includes an inner screen and an outer screen. The device includes:
  • the first display module 2102 is configured to display a first interface on the first screen in response to the first operation, and the first interface includes a first control.
  • it further includes a second display module 2103, configured to display a third interface on the first screen, and the third interface is the interface of the first APP.
  • the first receiving module 2101 is specifically configured to receive the first operation when the third interface is displayed on the first screen.
  • the first operation is an operation of sliding down on the first screen to start the drop-down task bar and clicking the control of the shortcut service in the drop-down task bar.
  • the first operation is an operation of double-clicking the second screen.
  • the first interface is a shortcut service window
  • the first receiving module 2101 is further configured to receive a third operation performed by the user, and the third operation is an operation of clicking a menu control in the first interface.
  • the second display module 2103 is further configured to display an APP management list on the first screen in response to the third operation, where the APP management list includes a plurality of icons of APPs supporting shortcut services and a plurality of status buttons, and The icons of the plurality of APPs supporting shortcut services are in one-to-one correspondence with the plurality of status buttons.
  • the first receiving module 2101 is further configured to receive a fourth operation performed by the user, the fourth operation is an operation performed on a target state button, and the target state button is any one of the state buttons.
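  • Read as code, the module split of device 2100 is a set of cooperating interfaces: a receiving module that accepts user operations and display modules that put the corresponding interfaces on the first screen. The sketch below is only a schematic restatement under that assumption; all signatures are invented and do not come from the publication.

```java
// Schematic restatement of the module division of device 2100; every name and
// signature here is invented for illustration.
interface UserOperation {
    String kind(); // e.g. "first", "third" or "fourth" operation
}

interface FirstReceivingModule {            // corresponds to module 2101
    void onUserOperation(UserOperation op); // receives the first, third and fourth operations
}

interface FirstDisplayModule {              // corresponds to module 2102
    void showFirstInterface();              // shortcut service window containing the first control
}

interface SecondDisplayModule {             // corresponds to module 2103
    void showThirdInterface();              // interface of the first APP on the first screen
    void showAppManagementList();           // shown in response to the third operation
}
```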
  • FIG. 22 shows a schematic structural diagram of a screen display device provided by the present application.
  • the apparatus 2200 is applied to an electronic device, and the electronic device includes a first screen and a second screen, the first screen and the second screen being respectively arranged on opposite sides of the electronic device; for example, the electronic device includes an inner screen and an outer screen. The apparatus includes:
  • the fourth display module 2202 is configured to display the interface of the first APP on the first screen on the second screen in response to the fifth operation.
  • the second receiving module 2202 is further configured to receive a sixth operation performed by the user, and the sixth operation is an operation of opening the second APP on the first screen.
  • the third display module 2201 is further configured to display the interface of the second APP on the first screen in response to the sixth operation, and to display the interface of the first APP on the first screen in the form of a floating window.
  • the fifth operation is an operation of sliding down on the first screen to open the pull-down task bar, clicking the external screen collaboration control in the pull-down task bar, and selecting asynchronous collaboration in the pop-up external screen collaboration selection window.
  • the second receiving module 2202 is further configured to receive a seventh operation performed by the user, and the seventh operation is an operation of operating the first APP on the first screen.
  • the third display module 2201 is further configured to, in response to the seventh operation, display content indicated by the seventh operation on the first screen.
  • the fourth display module 2202 is further configured to, in response to the seventh operation, display the content indicated by the seventh operation on the second screen.
  • the first APP is any one of a gallery APP, a video APP, and a document APP.
  • when the first APP is a video APP, the second APP may be a camera APP.
  • An embodiment of the present application also provides an electronic device, including the above-mentioned processor.
  • the electronic device provided in this embodiment may be the terminal device 100 shown in FIG. 1 , and is configured to execute the above screen display method.
  • the terminal device may include a processing module, a storage module and a communication module.
  • the processing module can be used to control and manage the actions of the terminal device, for example, can be used to support the terminal device to execute the steps performed by the display unit, the detection unit and the processing unit.
  • the storage module can be used to support the terminal device to execute and store program codes and data.
  • the communication module can be used to support the communication between terminal equipment and other equipment.
  • the processing module may be a processor or a controller. It can implement or execute the various illustrative logical blocks, modules and circuits described in connection with the present disclosure.
  • the processor can also be a combination of computing functions, such as a combination of one or more microprocessors, a combination of digital signal processing (digital signal processing, DSP) and a microprocessor, and so on.
  • the storage module may be a memory.
  • the communication module may be a device that interacts with other terminal devices, such as a radio frequency circuit, a Bluetooth chip, or a Wi-Fi chip.
  • the terminal device involved in this embodiment may be a device having the structure shown in FIG. 1 .
  • the embodiment of the present application also provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the processor is caused to perform the screen display method described in any of the above-mentioned embodiments.
  • the disclosed devices and methods may be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of modules or units is only a logical function division. In actual implementation, there may be other division methods.
  • multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented through some interfaces as indirect coupling or communication connection between devices or units. Units described as separate components may or may not be physically separate, and a component shown as a unit may be one physical unit or multiple physical units, which may be located in one place or distributed in multiple different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware or in the form of software functional units.
  • if an integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a readable storage medium.
  • the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods in the various embodiments of the present application.
  • the aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application relates to the field of electronic technologies and provides a screen display method and an electronic device. The method is applied to an electronic device that includes a first screen and a second screen, and includes: receiving a first operation performed by a user; in response to the first operation, displaying a first interface on the first screen, the first interface including a first control; receiving a second operation performed by the user; and in response to the second operation, displaying, on the second screen, a service code corresponding to the first control. The method can improve the convenience of code scanning operations.

Description

屏幕显示方法和电子设备
本申请要求于2021年12月25日提交国家知识产权局、申请号为202111606004.0、申请名称为“屏幕显示方法和电子设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及电子技术领域,具体涉及一种屏幕显示方法和电子设备。
背景技术
随着电子技术的快速发展,终端设备的形态也越来越丰富。具有折叠屏的终端设备,即折叠屏设备,有着屏幕大以及显示效果好等优势得到众多用户的青睐。
为了满足用户的多种使用场景的需要,有些折叠屏设备会设置两个屏幕,其中一个是能够折叠的内屏,即折叠屏,用户在观看视频或者打游戏的时候可以展开内屏进行使用。当用户不使用内屏的时候,可以将折叠屏设备进行折叠以方便携带,此时内屏会折叠起来。另外一个屏幕是外屏,外屏通常尺寸比内屏小。当用户将折叠屏设备进行折叠后,内屏会折合起来,而外屏裸露在外,方便用户在不展开内屏的情况下可以直接操作外屏来使用折叠屏设备,例如用户可以操作外屏来接听或者拨打电话、查看信息等。通常用户在展开折叠屏设备的内屏时,折叠屏设备会认为用户此时需要使用内屏,因此外屏会自动熄灭。
然而,当用户需要展示屏幕,例如展示支付码进行支付、或者向他人分享照片或视频时,则需要将翻转折叠屏设备使得内屏朝向展示对象,这对于体积较大的折叠屏设备来说,操作不便,影响用户体验。
发明内容
本申请提供了一种屏幕显示方法、装置、芯片、电子设备、计算机可读存储介质和计算机程序产品,能够提高使用的便携性。
第一方面,提供了一种屏幕显示方法,应用于电子设备,所述电子设备包括第一屏幕和第二屏幕,所述第一屏幕和所述第二屏幕分别设置在电子设备上相对的两侧,包括:接收用户执行的第一操作;响应于所述第一操作,在所述第一屏幕显示第一界面,所述第一界面中包括第一控件;接收用户对第一控件的第二操作;响应于所述第二操作,在所述第二屏幕显示所述第一控件对应的第二界面。
以第一屏幕为内屏、第二屏幕为外屏、第一界面为快捷服务的窗口、第一控件为快捷服务的窗口中的服务控件为例,进行说明。用户可以在内屏执行第一操作,例如在桌面点击快捷服务的图标来打开快捷服务的窗口。可选地,用户也可以在外屏执行第一操作,例如双击外屏、或者双指敲击外屏来打开快捷服务的窗口。响应于上述第一操作,外屏可以显示快捷服务的窗口。其中,快捷服务的窗口中可以包括一个或多个服务控件,每个服务控件均对应一个应用程序(application,APP)或一个APP的设定的页面,例如一个支付的服务控件可以对应支付APP,也可以对应该支付APP的支付码的页面。可选地,上述快捷服务的窗口中的服务控件可以包括用户需要快捷调 用的APP对应的控件。通过快捷服务的窗口,电子设备接收用户执行的第二操作,该第二操作为从快捷服务的窗口中的至少一个服务控件中点击所需要展示的第一控件的操作。在一些实现方式中,外屏的操作权限可以是关闭状态避免其他人误操作导致隐私泄露或者信息丢失,用户可以是在内屏执行第二操作。在一些实现方式中,外屏的操作权限可以是打开状态,其他人可以在外屏执行第二操作来选择第一控件,实现在机主允许的情况下根据需要选择要展示的服务码,因此使用更方便。响应于上述第二操作,电子设备在外屏显示上述第一控件对应的服务码。例如,当用户所选择的第一控件为支付APP的控件,则对应的服务码可以为支付码;当用户所选择的第一控件为个人健康APP的控件,则对应的服务码还可以是个人健康码;当用户所选择的第一控件为乘车APP的控件时,则对应的服务码可以是乘车码。需要说明的是,此处的服务码可以是条码、二维码等形式,本申请对服务码的具体形式不做限定。
上述方法中,电子设备能够基于用户的第一操作在外屏显示快捷服务的界面,然后在第二操作的触发下,通过外屏显示出用户所选择的第一控件对应的服务码,可以避免用户翻转电子设备进行扫码带来的不便,使得扫码操作更为方便和快捷。
在一种可能的实现方式中,所述接收用户执行的第一操作,包括:在所述第一屏幕显示第三界面时,接收所述第一操作,所述第三界面为以下界面中的一种:锁屏界面、息屏(always on display,AOD)界面、桌面、第一APP的界面。
当用户观看内屏所显示的第一APP的界面时,例如用户采用浏览器显示的小说的内容时,此时需要用户进行扫码支付或者扫码乘车等操作时,则可以执行第一操作来打开快捷服务的窗口,在快捷服务的窗口执行第二操作使得外屏显示需要展示的服务码,以便在用户不翻转电子设备的情况下使用外屏进行扫码操作。由于外屏只展示服务码,这样也就无需将第一APP的第一界面展示给其他人或设备,保护了用户的信息不外泄,用户还可以一边不间断的使用第一APP一边进行扫码,保护了用户的隐私的同时提升了用户的扫码体验。
在一种可能的实现方式中,所述第一APP为浏览器APP、视频APP、游戏APP、会议APP、文档APP和视频通话APP中的任意一个。用户通过操作电子设备将需要展示的服务码显示在外屏可以不影响用户在内屏使用上述浏览器APP、视频APP、游戏APP、会议APP、文档APP和视频通话APP中的任意一个,提高了用户的扫码体验。
在一种可能的实现方式中,所述第一操作为在第一屏幕下滑启动下拉任务栏并点击所述下拉任务栏中快捷服务的控件的操作。当用户正在使用内屏时,用户可以在内屏的顶部向下滑动来启动下拉任务栏,并点击下拉任务栏中的快捷服务的控件来打开快捷服务的窗口,这样的方式执行起来更为精准,不会出现误操作。
在一种可能的实现方式中,所述第一操作为双击第二屏幕的操作。用户还可以在外屏执行双击外屏的操作来点亮外屏并在外屏打开第一界面(例如,快捷服务的窗口),这样则无需精准操作内屏就可以通过双击外屏这样的快捷操作来打开第一界面,使得操作方式更灵活,也更方便。
在一种可能的实现方式中,所述第一界面中包括:支付码的控件、乘车码的控件和个人健康码的控件中的至少一个。支付码、乘车码和个人健康码为使用频次较高的 服务码,将这些服务码中的一个或多个收纳在第一界面(即,快捷服务的窗口)中,可以明显减少用户手动通过打开APP来打开服务码的次数,提升了用户体验。
在一种可能的实现方式中,所述第一界面为快捷服务的窗口,所述方法还包括:接收用户执行的第三操作,所述第三操作为点击所述第一界面中的第二控件的操作;响应于所述第三操作,在所述第一屏幕中显示APP管理列表,所述APP管理列表中包括多个支持快捷服务的APP的图标和多个状态按钮,且所述多个支持快捷服务的APP的图标和所多个述状态按钮一一对应;接收用户执行的第四操作,所述第四操作为对目标状态按钮执行的操作,所述目标状态按钮为任意一个所述状态按钮;响应于所述第四操作,在所述快捷服务的窗口中增加或删除所述目标状态按钮对应的支持快捷服务的APP的图标。
上述第二控件可以是菜单控件或者编辑控件。用户可以通过操作快捷服务的窗口中的菜单控件来打开APP管理列表,然后通过操作APP管理列表中支持快捷服务的APP的图标所对应的状态按钮,来对快捷服务的窗口中的服务控件进行管理。该方法可以实现用户自行管理快捷服务的窗口中的服务控件的种类,使得通过快捷服务的窗口调用服务码的操作能够满足不同用户的个性化的需求,提升了用户体验。
第二方面,提供了一种屏幕显示方法,应用于电子设备,所述电子设备包括第一屏幕和第二屏幕,所述第一屏幕和所述第二屏幕分别设置在电子设备上相对的两侧,包括:在所述第一屏幕显示第三界面,所述第三界面为第一应用程序APP的界面;接收用户执行的第五操作,所述第五操作为打开第一屏幕和第二屏幕的协同功能的操作;响应于所述第五操作,在所述第二屏幕显示所述第一屏幕所显示的第一APP的界面。
以第一屏幕为内屏、第二屏幕为外屏为例,当用户使用第一APP时,电子设备可以在内屏显示第一APP的第三界面。本申请对第一APP的种类不做限定,可以包括但不限于浏览器APP、视频APP、游戏APP、会议APP、文档APP和视频通话APP中的任意一个。上述第一APP的第一界面可以是游戏APP显示的界面,还可以是视频APP显示的视频播放界面,还可以是文档APP显示的文档内容的界面,此处不做限定。当用户需要进行扫码时,可以执行打开第一屏幕和第二屏幕的协同功能(例如内屏和外屏的协同功能)的第五操作,例如用户点击桌面上的协同功能的启动按钮,或者从内屏下滑下拉任务栏并点击下拉任务栏中的外屏协同的控件。此时,电子设备可以在用户的第五操作的触发下,在外屏显示内屏所显示的第一界面。当用户在内屏继续操作该第一APP时,外屏可以继续跟随内屏显示内屏所显示的第一APP的界面。例如,当内屏显示图库的中的图片缩略图这个第一界面时,如果用户执行第五操作,则外屏显示内屏所显示的图库的图片缩略图。此时,如果用户在内屏点击其中一张图片来显示这个图片的放大图时,则外屏同步显示内屏所显示的这个图片的放大图。
上述方法中,电子设备可以在用户执行第五操作时,启动第一屏幕和第二屏幕的协同功能,即使得外屏显示内屏所显示的内容,从而可以无需翻转电子设备就能够实现将第一APP的界面展示给其他的用户查看,避免了翻转电子设备来分享屏幕所带来的不便,使得分享屏幕更为方便,提升了用户分享屏幕的体验。
在一种可能的实现方式中,当所述协同功能的模式为异步协同的模式时,所述方法还包括:接收用户执行的第六操作,所述第六操作为在所述第一屏幕打开第二APP 的操作;响应于所述第六操作,在所述第一屏幕显示所述第二APP的界面,并将所述第一APP的界面以悬浮窗的形式显示在所述第一屏幕中。
当用户打开协同功能时,如果选择了异步协同的模式,则外屏可以显示与内屏不完全一样的内容。即,用户在内屏显示第一APP的界面时启动了异步协同的模式,则当用户操作内屏打开第二APP,即执行第六操作,内屏则显示第二APP的界面。此时外屏依旧显示第一APP的界面,而内屏中可以将第一APP的界面显示在悬浮窗中,以提示用户外屏所显示的内容。该实现方式中,用户可以在打开异步协同模式的协同功能后,保持外屏显示第一APP的界面展示,而自己可以在内屏使用第二APP,同时用户还可以通过内屏的悬浮窗所显示的第一APP的界面来监控第一APP的演示情况,这使得屏幕分享的形式更为丰富,在丰富电子设备的功能的同时,方便了用户的使用,提升了用户体验。
在一种可能的实现方式中,所述第五操作为在所述第一屏幕中下滑启动下拉任务栏并在所述下拉任务栏中点选外屏协同的控件,以及在弹出的外屏协同选择窗口中选择异步协同的操作。
即当用户在内屏的顶部下滑时,可以拉出下拉任务栏。该下拉任务栏中包括外屏协同的控件。当用户点击该外屏协同的控件或者点击外屏协同的控件的菜单按键时,内屏可以弹出外屏协同选择窗口。该外屏协同选择窗口中可以包括异步协同的控件。当用户点击异步协同的控件,则电子设备启动协同功能的异步协同的模式。可选地,外屏协同选择窗口中还可以包括镜像协同的控件,如果用户点击镜像协同的控件,则电子设备启动协同功能的镜像协同的模式。用户可以根据需要选择不同的协同功能的模式来适配当前的使用场景。
在一种可能的实现方式中,所述第六操作为在所述第一屏幕操作的退出第一APP的界面并从桌面打开所述第二APP的操作;或者,在所述第一屏幕操作的调用侧边栏并点选所述侧边栏中的第二APP的图标的操作。
在一种可能的实现方式中,当所述协同功能的模式为镜像协同的模式时,所述方法还包括:接收用户执行的第七操作,当所述第七操作为在所述第一屏幕操作所述第一APP的操作;响应于所述第七操作,在所述第一屏幕和所述第二屏幕同步显示所述第七操作所指示的内容。
当用户打开协同功能时,如果选择了镜像协同的模式,则外屏可以显示与内屏一样的内容。该方法中,用户可以在打开镜像协同模式的协同功能后,保持外屏和内屏相同的界面来实现屏幕分享,无需多人观看同一屏幕导致的需要调整翻转电子设备导致的操作不便,该方法使得屏幕分享更为方便,提升了用户体验。
在一种可能的实现方式中,所述第五操作为在所述第一屏幕中下滑启动下拉任务栏并在所述下拉任务栏中点选外屏协同的控件,以及在弹出的外屏协同选择窗口中选择镜像协同的操作。
在一种可能的实现方式中,所述第一APP为图库APP、视频APP、文档APP中的任意一个。用户可以通过打开内屏和外屏的协同功能实现将图库APP、视频APP、文档APP等界面分享至外屏,方便了用户分享图片、视频和文档等。
在一种可能的实现方式中,当所述第一APP为视频APP时,所述第二APP为相 机APP。当家长想要给孩子拍照,又怕孩子注意力不集中影响拍摄效果时,则可以打开视频APP,并开启异步协同的模式,将视频APP的界面分享至外屏来吸引孩子的注意力。然后家长打开相机APP进行拍摄,可以保证孩子注意力朝向正在拍摄的电子设备,方便用户对孩子进行拍摄,确保了拍摄效果。
第三方面,提供了一种屏幕显示装置,包括由软件和/或硬件组成的单元,该单元用于执行第一方面或第二方面所述的技术方案中任意一种方法。
第四方面,提供了一种电子设备,所述电子设备包括第一屏幕和第二屏幕,所述第一屏幕和所述第二屏幕分别设置在电子设备上相对的两侧,还包括:处理器、存储器和接口;所述处理器、所述存储器和所述接口相互配合,使得所述电子设备执行如第一方面或第二方面所述的技术方案中任意一种方法。
第五方面,提供了一种芯片,包括处理器;处理器用于读取并执行存储器中存储的计算机程序,以执行第一方面或第二方面所述的技术方案中任意一种方法。
可选地,所述芯片还包括存储器,存储器与处理器通过电路或电线连接。
进一步可选地,所述芯片还包括通信接口。
第六方面,提供了一种计算机可读存储介质,所述计算机可读存储介质中存储了计算机程序,当所述计算机程序被处理器执行时,使得该处理器执行第一方面或第二方面所述的技术方案中任意一种方法。
第七方面,提供了一种计算机程序产品,所述计算机程序产品包括:计算机程序代码,当所述计算机程序代码在电子设备上运行时,使得该电子设备执行第一方面或第二方面所述的技术方案中任意一种方法。
附图说明
图1是本申请实施例提供的一例终端设备100的结构示意图;
图2是本申请实施例提供的终端设备100的软件结构框图;
图3是本申请实施例提供的一例使用外屏进行扫码的场景示意图;
图4是本申请实施例提供的一例内屏和外屏所显示的界面示意图;
图5是本申请实施例提供的又一例内屏和外屏所显示的界面示意图;
图6是本申请实施例提供的又一例内屏和外屏所显示的界面示意图;
图7是本申请实施例提供的又一例屏幕显示方法流程交互图;
图8是本申请实施例提供的又一例内屏和外屏所显示的界面示意图;
图9是本申请实施例提供的又一例内屏和外屏所显示的界面示意图;
图10是本申请实施例提供的又一例屏幕显示方法的流程交互图;
图11是本申请实施例提供的又一例屏幕显示方法的场景示意图;
图12是本申请实施例提供的又一例内屏和外屏所显示的界面示意图;
图13是本申请实施例提供的又一例内屏和外屏所显示的界面示意图;
图14是本申请实施例提供的又一例内屏和外屏所显示的界面示意图;
图15是本申请实施例提供的又一例内屏和外屏所显示的界面示意图;
图16是本申请实施例提供的又一例内屏和外屏所显示的界面示意图;
图17是本申请实施例提供的又一例屏幕显示方法的流程交互图;
图18是本申请实施例提供的又一例内屏所显示的界面示意图;
图19是本申请实施例提供的又一例屏幕显示方法的流程图;
图20是本申请实施例提供的又一例屏幕显示方法的流程图;
图21是本申请实施例提供的一例屏幕显示装置的结构示意图;
图22是本申请实施例提供的又一例屏幕显示装置的结构示意。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行描述。其中,在本申请实施例的描述中,除非另有说明,“/”表示或的意思,例如,A/B可以表示A或B;本文中的“和/或”仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。另外,在本申请实施例的描述中,“多个”是指两个或多于两个。
以下,术语“第一”、“第二”、“第三”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”、“第三”的特征可以明示或者隐含地包括一个或者更多个该特征。
本申请实施例提供的电子设备的屏幕显示方法可以应用于手机、平板电脑、可穿戴设备、车载设备、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本、个人数字助理(personal digital assistant,PDA)等终端设备上,这些终端设备为具有分别设置在终端设备上相对的两侧的第一屏幕和第二屏幕的终端设备。本申请实施例对终端设备的具体类型不作任何限制。
示例性的,图1是本申请实施例提供的一例终端设备100的结构示意图。终端设备100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
可以理解的是,本申请实施例示意的结构并不构成对终端设备100的具体限定。在本申请另一些实施例中,终端设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
其中,控制器可以是终端设备100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDA)和一根串行时钟线(derail clock line,SCL)。在一些实施例中,处理器110可以包含多组I2C总线。处理器110可以通过不同的I2C总线接口分别耦合触摸传感器180K,充电器,闪光灯,摄像头193等。例如:处理器110可以通过I2C接口耦合触摸传感器180K,使处理器110与触摸传感器180K通过I2C总线接口通信,实现终端设备100的触摸功能。
I2S接口可以用于音频通信。在一些实施例中,处理器110可以包含多组I2S总线。处理器110可以通过I2S总线与音频模块170耦合,实现处理器110与音频模块170之间的通信。在一些实施例中,音频模块170可以通过I2S接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。
PCM接口也可以用于音频通信,将模拟信号抽样,量化和编码。在一些实施例中,音频模块170与无线通信模块160可以通过PCM总线接口耦合。在一些实施例中,音频模块170也可以通过PCM接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。所述I2S接口和所述PCM接口都可以用于音频通信。
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。在一些实施例中,UART接口通常被用于连接处理器110与无线通信模块160。例如:处理器110通过UART接口与无线通信模块160中的蓝牙模块通信,实现蓝牙功能。在一些实施例中,音频模块170可以通过UART接口向无线通信模块160传递音频信号,实现通过蓝牙耳机播放音乐的功能。
MIPI接口可以被用于连接处理器110与显示屏194,摄像头193等外围器件。MIPI接口包括摄像头串行接口(camera serial interface,CSI),显示屏串行接口(display serial interface,DSI)等。在一些实施例中,处理器110和摄像头193通过CSI接口通信,实现终端设备100的拍摄功能。处理器110和显示屏194通过DSI接口通信,实现终端设备100的显示功能。
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为 数据信号。在一些实施例中,GPIO接口可以用于连接处理器110与摄像头193,显示屏194,无线通信模块160,音频模块170,传感器模块180等。GPIO接口还可以被配置为I2C接口,I2S接口,UART接口,MIPI接口等。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为终端设备100充电,也可以用于终端设备100与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其它终端设备,例如AR设备等。
可以理解的是,本申请实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对终端设备100的结构限定。在本申请另一些实施例中,终端设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电的实施例中,充电管理模块140可以通过USB接口130接收有线充电器的充电输入。在一些无线充电的实施例中,充电管理模块140可以通过终端设备100的无线充电线圈接收无线充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为终端设备供电。
电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110,内部存储器121,外部存储器,显示屏194,摄像头193,和无线通信模块160等供电。电源管理模块141还可以用于监测电池容量,电池循环次数,电池健康状态(漏电,阻抗)等参数。在其它一些实施例中,电源管理模块141也可以设置于处理器110中。在另一些实施例中,电源管理模块141和充电管理模块140也可以设置于同一个器件中。
终端设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。图1中的天线1和天线2的结构仅为一种示例。终端设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在终端设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A, 受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块150或其它功能模块设置在同一个器件中。
无线通信模块160可以提供应用在终端设备100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,终端设备100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得终端设备100可以通过无线通信技术与网络以及其它设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
终端设备100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode的,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,终端设备100可以包括1个或N个显示屏194,N为大于1的正整数。
当终端设备100为折叠屏设备时,终端设备100的内屏可以设置为能够折叠的显示屏,简称折叠屏。其中,折叠屏可以采用一个一体成型的柔性显示屏,也可以采用多个柔性显示屏以及位于每两个柔性显示屏之间的铰链组成的拼接显示屏,也可以采用多个刚性屏以及位于每两个刚性屏之间的铰链组成的拼接显示屏。本申请实施例对此不做限定。
示例性的,终端设备100可以为折叠屏手机。图2示例性的示出了一种折叠屏手 机的结构示意图。参照图2,折叠屏手机的显示屏包括第一屏幕和第二屏幕,其中第一屏幕可以是内屏,第二屏幕可以是外屏,内屏为折叠屏,且内屏尺寸大于外屏尺寸。图2中的a图为折叠屏手机内屏展开状态下的外侧示意图。参照图2中的a图,折叠屏手机外侧的第一部分201可以设置为折叠屏手机的外屏,第二部分202可以设置为手机外壳。其中,折叠屏手机外侧的第一部分201和第二部分202均可以设置有摄像头。在其它折叠屏手机示例中,折叠屏手机外侧的第一部分201和第二部分202可以均设置为折叠屏手机的外屏,折叠屏手机外侧的第一部分201和第二部分202也可以均设置为折叠屏手机外壳。图2中的b图为折叠屏手机内屏展开状态下的内侧示意图。参照图2中的b图,折叠屏手机的内屏(也即折叠屏)包括第一显示单元301和第二显示单元302。第一显示单元301和第二显示单元302可以分别显示不同的显示界面,也可以共同显示一个显示界面。根据第一显示单元301和第二显示单元302之间的夹角可以确定折叠屏的姿态或形态。当第一显示单元301和第二显示单元302之间的夹角为0度时,折叠屏为全折叠态;当第一显示单元301和第二显示单元302之间的夹角为180度时,折叠屏为全展开态;当第一显示单元301和第二显示单元302之间的夹角大于0度且小于180度时,折叠屏为半折叠态。需要指出的是,折叠屏为半折叠态也可以称之为折叠屏手机处于半折叠态,全折叠态以及全展开态亦是如此。
终端设备100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,终端设备100可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其它数字信号。例如,当终端设备100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。终端设备100可以支持一种或多种视频编解码器。这样,终端设备100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现终端设备100的智能认知等应用,例如:图像识别,人脸识别,语音 识别,文本理解等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展终端设备100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行终端设备100的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储终端设备100使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
终端设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。终端设备100可以通过扬声器170A收听音乐,或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当终端设备100接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风170C发声,将声音信号输入到麦克风170C。终端设备100可以设置至少一个麦克风170C。在另一些实施例中,终端设备100可以设置两个麦克风170C,除了采集声音信号,还可以实现降噪功能。在另一些实施例中,终端设备100还可以设置三个,四个或更多麦克风170C,实现采集声音信号,降噪,还可以识别声音来源,实现定向录音功能等。
耳机接口170D用于连接有线耳机。耳机接口170D可以是USB接口130,也可以是3.5mm的开放移动终端设备平台(open mobile terminal platform,OMTP)标准接口,美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。压力传感器180A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器180A,电极之间的电容改变。终端设备100根据电容的变化确定压力的强度。当有触摸操作作用于显示屏194,终端设备100根据压力传感器180A检测所述触摸操作强度。终端设备100也可以根据压力传感器180A的检测信号计算触摸的位置。在一些实施例中,作用于 相同触摸位置,但不同触摸操作强度的触摸操作,可以对应不同的操作指令。例如:当有触摸操作强度小于第一压力阈值的触摸操作作用于短消息应用图标时,执行查看短消息的指令。当有触摸操作强度大于或等于第一压力阈值的触摸操作作用于短消息应用图标时,执行新建短消息的指令。
陀螺仪传感器180B可以用于确定终端设备100的运动姿态。在一些实施例中,可以通过陀螺仪传感器180B确定终端设备100围绕三个轴(即,x,y和z轴)的角速度。陀螺仪传感器180B可以用于拍摄防抖。示例性的,当按下快门,陀螺仪传感器180B检测终端设备100抖动的角度,根据角度计算出镜头模组需要补偿的距离,让镜头通过反向运动抵消终端设备100的抖动,实现防抖。陀螺仪传感器180B还可以用于导航,体感游戏场景。
气压传感器180C用于测量气压。在一些实施例中,终端设备100通过气压传感器180C测得的气压值计算海拔高度,辅助定位和导航。
磁传感器180D包括霍尔传感器。终端设备100可以利用磁传感器180D检测翻盖皮套的开合。在一些实施例中,当终端设备100是翻盖机时,终端设备100可以根据磁传感器180D检测翻盖的开合。进而根据检测到的皮套的开合状态或翻盖的开合状态,设置翻盖自动解锁等特性。
加速度传感器180E可检测终端设备100在各个方向上(一般为三轴)加速度的大小。当终端设备100静止时可检测出重力的大小及方向。还可以用于识别终端设备姿态,应用于横竖屏切换,计步器等应用。
距离传感器180F,用于测量距离。终端设备100可以通过红外或激光测量距离。在一些实施例中,拍摄场景,终端设备100可以利用距离传感器180F测距以实现快速对焦。
接近光传感器180G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。发光二极管可以是红外发光二极管。终端设备100通过发光二极管向外发射红外光。终端设备100使用光电二极管检测来自附近物体的红外反射光。当检测到充分的反射光时,可以确定终端设备100附近有物体。当检测到不充分的反射光时,终端设备100可以确定终端设备100附近没有物体。终端设备100可以利用接近光传感器180G检测用户手持终端设备100贴近耳朵通话,以便自动熄灭屏幕达到省电的目的。接近光传感器180G也可用于皮套模式,口袋模式自动解锁与锁屏。
环境光传感器180L用于感知环境光亮度。终端设备100可以根据感知的环境光亮度自适应调节显示屏194亮度。环境光传感器180L也可用于拍照时自动调节白平衡。环境光传感器180L还可以与接近光传感器180G配合,检测终端设备100是否在口袋里,以防误触。
指纹传感器180H用于采集指纹。终端设备100可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。
温度传感器180J用于检测温度。在一些实施例中,终端设备100利用温度传感器180J检测的温度,执行温度处理策略。例如,当温度传感器180J上报的温度超过阈值,终端设备100执行降低位于温度传感器180J附近的处理器的性能,以便降低功耗实施热保护。在另一些实施例中,当温度低于另一阈值时,终端设备100对电池142加热, 以避免低温导致终端设备100异常关机。在其它一些实施例中,当温度低于又一阈值时,终端设备100对电池142的输出电压执行升压,以避免低温导致的异常关机。
触摸传感器180K,也称“触控面板”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于终端设备100的表面,与显示屏194所处的位置不同。
骨传导传感器180M可以获取振动信号。在一些实施例中,骨传导传感器180M可以获取人体声部振动骨块的振动信号。骨传导传感器180M也可以接触人体脉搏,接收血压跳动信号。在一些实施例中,骨传导传感器180M也可以设置于耳机中,结合成骨传导耳机。音频模块170可以基于所述骨传导传感器180M获取的声部振动骨块的振动信号,解析出语音信号,实现语音功能。应用处理器可以基于所述骨传导传感器180M获取的血压跳动信号解析心率信息,实现心率检测功能。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。终端设备100可以接收按键输入,产生与终端设备100的用户设置以及功能控制有关的键信号输入。
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照,音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于显示屏194不同区域的触摸操作,马达191也可对应不同的振动反馈效果。不同的应用场景(例如:时间提醒,接收信息,闹钟,游戏等)也可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。
指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和终端设备100的接触和分离。终端设备100可以支持1个或N个SIM卡接口,N为大于1的正整数。SIM卡接口195可以支持Nano SIM卡,Micro SIM卡,SIM卡等。同一个SIM卡接口195可以同时插入多张卡。所述多张卡的类型可以相同,也可以不同。SIM卡接口195也可以兼容不同类型的SIM卡。SIM卡接口195也可以兼容外部存储卡。终端设备100通过SIM卡和网络交互,实现通话以及数据通信等功能。在一些实施例中,终端设备100采用eSIM,即:嵌入式SIM卡。eSIM卡可以嵌在终端设备100中,不能和终端设备100分离。
终端设备100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本申请实施例以分层架构的Android系统为例,示例性说明终端设备100的软件结构。
图2中的c图是本申请实施例的终端设备100的软件结构框图。分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。应用程序层可以包括一系列 应用程序包。
如图2中的c图所示,应用程序包可以包括相机,图库,日历,通话,地图,导航,WLAN,蓝牙,音乐,视频,短信息、系统用户界面(system UI)以及其他三方应用程序(application,APP)等应用程序。
系统用户界面,用于提供快捷服务的界面以及下拉任务栏的显示。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。其中API可以包括API1和API2,API1为添加窗口显示的接口,API2为注册协同服务的接口。
如图2中的c图所示,应用程序框架层可以包括窗口管理器(window manager),内容提供器,视图系统,电话管理器,资源管理器,通知管理器,输入管理器(input manager)和显示管理器(display manager,即屏幕管理器)等。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成的。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
电话管理器用于提供终端设备100的通信功能。例如通话状态的管理(包括接通,挂断等)。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,终端设备振动,指示灯闪烁等。
输入管理器用于处理用户的触摸操作,例如单击操作、滑动操作、双击操作等等。
显示管理器用于对内屏和外屏进行管理,控制显示内容显示在内屏和/或外屏。
Android runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(media libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。
2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动、内屏驱动和外屏驱动。
上述终端设备可以为具有内屏和外屏的电子设备,也可以是具有外屏的折叠屏设备。以具有外屏的折叠屏设备为例,其中一个屏幕是能够折叠的内屏,即折叠屏,用户在观看视频或者打游戏的时候可以展开内屏进行使用。当用户不使用内屏的时候,可以将折叠屏设备进行折叠以方便携带。另外一个屏幕是外屏,外屏设置在与内屏相对的一面。当用户将折叠屏设备进行折叠后,内屏会折合起来,而外屏会裸露在外,方便用户在不展开内屏的情况下直接操作外屏来使用折叠屏设备,例如用户可以操作外屏来接听或者拨打电话、或者关闭闹钟等。通常用户在展开折叠屏设备时,内屏会被展开,折叠屏设备会认为用户此时需要使用内屏,外屏会自动熄灭。然而,当用户在使用内屏的过程中,如果遇到需要扫码支付的情况,则需要在内屏打开支付码并翻转折叠屏设备使得内屏朝向扫码枪来进行扫码,这对于体积较大的折叠屏设备来说,操作不便,影响了用户体验。
本申请的技术方案是,当用户需要向其他人或者其他设备展示屏幕的时候,例如出示支付码进行支付的时候,则可以通过快捷操作点亮外屏,并将需要展示的支付码投射到电子设备301的外屏进行显示。例如图3所示,当用户需要进行扫码支付时,无需翻转电子设备301就可以通过外屏向扫码枪302出示支付码来完成扫码支付的操作,因此使得扫码更为方便,提升了电子设备301进行扫码的便携性,也不影响用户观看内屏,提高了用户的体验。
为了便于理解,本申请以下实施例将以具有图1和图2所示结构的终端设备为例,结合附图和应用场景,对本申请实施例提供的电子设备的屏幕显示方法进行具体阐述。
当用户在使用内屏的时候,例如用户通过观看内屏阅读小说时,内屏可以显示如图4中的a图所示的小说内容的界面,此时外屏为熄灭状态。如果这时恰逢用户需要对购买的物品进行扫码支付,这时用户可以操作内屏来打开支付码,并将支付码投射到外屏进行显示,以便扫码。例如,当用户需要出示支付码时,可以在内屏执行如图4中的a图所示操作,从内屏的顶部下拉来启动下拉任务栏,此时内屏显示如图4中的b图所示的下拉任务栏,该下拉任务栏中包括“快捷服务”的图标。用户可以如4中的b图所示的操作,点击“快捷服务”的图标来打开如图4中的c图所示的快捷服务的窗口。该快捷服务的窗口中可以包括至少一个控件,每个控件均对应一个APP,用户点击控件就可以打开相应的APP的相应页面。可选地,该快捷服务的窗口可以如图4中的c图所示以悬浮窗的形式显示在内屏,同时外屏也同步显示如图4中的d图所示的快捷服务的窗口。图4中的d图为电子设备展开时背面的示意图,包括外屏和带有 摄像头的后盖。
当用户如图5中的a图所示的操作,点击A支付码这个控件时,响应于用户的操作,折叠屏设备将快捷服务的窗口切换为如图5中的b图所示的A支付码(付款码),同时,将外屏中的快捷服务的窗口也切换为如图5中的c图所示的A支付码。用户可以将外屏对准扫码枪以便于进行扫码支付,而无需反转电子设备进行扫码。
在另一些实施例中,用户还可以双击电子设备的背部(例如双击折叠屏外屏)来唤醒外屏(即点亮外屏),并在外屏显示如图4中d图所示的快捷服务的窗口。可选地,响应于背部的双击操作,内屏也可以如图4中的c图所示以悬浮窗的形式显示快捷服务的窗口。
可选地,在展开态下,电子设备可以关闭外屏的操作权限。当外屏的操作权限关闭时,用户无法点击外屏进行操作,只能是手持电子设备的机主用户通过点选内屏中的支付码的控件来打开支付码,这样可以避免其他人从外屏打开支付码造成的信息泄露。可选地,在折叠态下,电子设备可以开启外屏的操作权限,当外屏的操作权限开启时,用户也可以通过在外屏的快捷服务的窗口中点选支付码的控件来打开支付码进行扫码支付。
上述图4中以快捷服务的窗口中包括本地健康宝、A支付码(付款码)、B支付码和乘车码这四个控件为例示出。当用户在快捷服务的窗口中点击本地健康宝的控件时,外屏则可以显示携带个人健康信息的健康码,该健康码能够标识用户的健康状况,方便用户扫码通行或者记录个人健康信息等。当用户在快捷服务的窗口中点击乘车码的控件时,外屏则可以显示乘车码,便于用户刷码乘车或者通过闸机。
在一些实施例中,快捷服务的窗口中的控件可以包括默认的几个APP的对应的控件,也可以由用户根据需要对控件的种类进行编辑,包括添加控件或者删除控件。例如,用户可以执行如图6中的a图所示的操作,点击快捷服务的窗口中的设置按钮,则快捷服务的窗口可以显示如图6中的b图所示的快捷应用设置界面。该快捷应用设置界面中包括多个支持快捷服务的APP的列表,列表中每个条目中均包括该APP的控制键。用户可以点击APP的控制键将已经添加到快捷服务中的APP从快捷服务中删除。例如用户在如图6中的b图所示的快捷应用设置界面中点击A支付码的控制键,则A支付码的快捷服务功能被关闭,快捷服务的窗口可以如图6中的c图所示,不再出现A支付码的控件。用户还可以通过点击控制键来将未开启快捷服务功能的APP添加到快捷服务中。例如用户在如图6中的b图所示的快捷应用设置界面中点击C应用的控制键,则C应用的快捷服务功能被开启,快捷服务的窗口可以如图6中的d图所示新增了C应用的控件。如果当前的快捷应用设置界面中没有需要编辑的APP,用户还可以上下拖动列表来查找APP,也可以在搜索栏中输入APP的名称进行查找,方便了用户对快捷服务中的APP进行编辑,使得快捷服务的使用更为灵活。
在上述实施例中,用户可以通过操作电子设备,在外屏中显示需要出示的扫码界面,包括支付码、健康码和乘车码等。因此用户无需翻转电子设备来通过内屏出示扫码界面,只需出示外屏即可实现扫码,方便了扫码操作。同时,也无需打断用户观看内屏就可以进行扫码操作,提高了用户体验。
在一些实施例中,三方APP在安装的过程中可以通过应用程序接口(application  programming interface,API)2向窗口管理器发送注册信息以便在系统进行注册,该注册信息用于表征三方APP能够提供外屏协同的服务,例如包括该三方APP的图标、服务名称、回调方法。没有在系统注册的三方APP则可以认为不能提供外屏协同的服务。可选地,三方APP也可以在安装完成后首次运行时通过API2向窗口管理器发送注册信息以便在系统进行注册,或者是三方APP在安装时将注册信息存储在固定路径下,电子设备在开机时按照固定路径获取注册信息以便获取有哪些APP能够提供外屏协同功能。可选地,电子设备还可以在系统内预置一些三方APP的注册信息,例如支付APP、乘车APP常常是人们需要在外屏使用的APP,因此电子设备可以预置这些APP的注册信息。需要说明的是,上述API2是不同于API1的接口,API1是三方APP用于添加窗口进行显示的接口,API2是三方APP向窗口管理器进行注册协同的接口。
图7以在用户通过浏览器阅读小说时打开外屏进行扫码支付的场景为例,对电子设备的外屏显示过程进行详细说明。包括:
S701、支付APP调用API2向窗口管理器发送支付APP的注册信息进行注册。
S702、当用户操作电子设备的内屏来打开浏览器浏览小说内容时,电子设备中的输入管理器识别用户的操作,并将识别结果发送至窗口管理器。
S703、窗口管理器响应于用户操作的识别结果,添加浏览器的窗口,并通过API1向浏览器发送窗口添加完毕的指令。
S704、浏览器响应于窗口管理器发送的窗口添加完毕的指令,则绘制浏览器显示小说内容的相应的页面,并且在绘制完成之后通过API1向窗口管理器回执绘制完成的消息。
S705、窗口管理器响应于浏览器回执的绘制完成的消息,向显示管理器发送显示消息,该显示消息中包括由内屏显示浏览器绘制的小说的页面的信息。
S706、显示管理器控制内屏显示小说的页面。
需要说明的是,上述S701并不一定在S702之前执行,还可以在S702、S703、S704、S705或者S706之后执行,只要是在S707之前执行即可。
S707、当用户点击外屏协同的控件或者双击外屏时,输入处理模块识别用户的操作,并将识别结果发送至窗口管理模块。
S708、窗口管理模块响应于用户操作的识别结果,添加快捷服务的窗口,并通过API1向系统用户界面(system UI)发送窗口添加完毕的指令。
S709、系统用户界面响应于窗口管理器发送的窗口添加完毕的指令,绘制快捷服务的窗口,并且在绘制完成之后通过API1向窗口管理器回执绘制完成的消息。
S710、窗口管理器响应于系统用户界面回执的绘制完成的消息,向显示管理器发送显示消息,该显示消息中包括由内屏和外屏显示快捷服务的窗口的信息。
S711、显示管理器控制内屏和外屏显示快捷服务的窗口。
S712、当用户点选快捷服务的窗口中的支付APP的控件时,输入管理器识别用户的操作并通过API1将识别结果发送至窗口管理器。
S713、窗口管理器响应于用户操作的识别结果,添加支付码的窗口,并通过API1向支付APP发送窗口添加完毕的指令。
S714、支付APP响应于窗口管理器发送的窗口添加完毕的指令,绘制支付码的页 面,并且在绘制完成之后通过API1向窗口管理器回执绘制完成的消息。
S715、窗口管理器响应于支付APP回执的绘制完成的消息后,向显示管理器发送显示消息,该显示消息中包括由内屏和外屏显示支付码的页面的信息。
S716、显示管理器控制内屏和外屏显示支付码的界面。
此时,用户就可以通过外屏进行扫码支付,无需翻转电子设备,使得扫码支付更方便。
如果用户想要和其他人分享屏幕时,也可以通过打开外屏协同的功能,将内屏的界面投射到外屏进行显示,其他人就可以通过外屏来观看用户想要展示的界面,用户无需反转电子设备来和其他用户共享屏幕,使得屏幕共享更为方便。
以作为机主的用户向其他人展示图片为例,当用户打开图库后,内屏可以显示如图8中的a图所示的图库的界面,图库的界面中包括多张照片。用户可以执行如图8中的a图所示的操作,从内屏的顶部执行下拉操作,内屏顶部则显示如图8中的b图所示的下拉任务栏。该下拉任务栏中包括投屏的控件,图8中以投屏的控件显示为“外屏协同”的控件为例示出。
当用户执行如图8中的b图所示的点击下拉任务栏中的“外屏协同”的控件的操作时,外屏则可以如图8中的c图所示的图库的界面。如果用户执行如图8中的b图所示的点选其中一张图片的操作,内屏则展开显示该图片。此时外屏处于外屏协同的模式,则外屏可以如图8中的d图所示展开显示用户点选的图片。其他人就可以通过外屏来观看用户点选的图片。当用户在内屏左右滑动切换图片时,外屏可以同步显示内屏所显示的内容,以便其他人能够通过外屏观看用户想要分享的内容。
在一些实施例中,在开启外屏协同功能时,电子设备可以关闭外屏的操作权限,避免他人对外屏进行操作来切换显示的内容,导致用户的隐私泄露。在一些实施例中,在开启外屏协同功能时,电子设备也可以开启外屏的操作权限,使得其他用户在观察外屏的过程中操作外屏进行使用,例如左右滑动外屏来切换图片、点击外屏中的编辑控件来对图片进行编辑、点击外屏中的搜藏控件来收藏图片或者点击外屏中的分享控件将图片分享至其他应用。电子设备开启外屏的操作权限能够提高内屏和外屏协同过程中的交互性,丰富了电子设备的协同功能。
在一些实施例中,用户打开外屏协同的功能时还可以根据需要选择镜像协同或者异步协同的模式。其中,镜像协同的模式为外屏和内屏显示相同内容的模式;异步协同为外屏所显示的内容和内屏不同或者不完全相同的模式,例如外屏显示APP1的界面而内屏显示APP2的界面,再如外屏显示APP1的界面,内屏显示APP2的界面的同时还以悬浮窗的形式显示APP1的界面。
异步协同的模式下,电子设备可以将最前端运行的APP的界面投射到外屏显示,其他APP的界面可以只在内屏显示。例如用户在观看浏览器的内容的同时调用快捷服务,此时电子设备默认为异步协同的模式,内屏中显示浏览器的界面的同时将快捷服务的窗口以悬浮窗的形式显示,外屏则只显示快捷服务的窗口而不会显示浏览器的界面。
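作为参考，在通用Android框架中可以通过DisplayManager枚举副屏，并借助Presentation在副屏显示与主屏不同的内容，与上述异步协同的效果类似。下面给出一段简化示意代码（假设性示例，不代表本申请的实际实现方式，窗口内容仅以文本占位）。

```kotlin
import android.app.Presentation
import android.content.Context
import android.hardware.display.DisplayManager
import android.os.Bundle
import android.widget.TextView

// 假设性示意：异步协同时在外屏（副屏）显示与内屏不同的内容。
// 这里用标准Android的 Presentation 机制近似说明。
class QuickServicePresentation(
    outerContext: Context,
    display: android.view.Display
) : Presentation(outerContext, display) {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // 实际实现中应加载快捷服务窗口的布局，此处仅用文本占位
        setContentView(TextView(context).apply { text = "快捷服务" })
    }
}

fun showQuickServiceOnOuterScreen(context: Context) {
    val dm = context.getSystemService(Context.DISPLAY_SERVICE) as DisplayManager
    // 取出除默认显示（内屏）之外的显示设备，近似视为外屏
    val outer = dm.getDisplays(DisplayManager.DISPLAY_CATEGORY_PRESENTATION).firstOrNull()
        ?: return   // 没有可用的副屏时不做处理
    QuickServicePresentation(context, outer).show()
}
```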
在一些实施例中，当用户需要向他人展示图片的时候，可以在内屏显示的图库的界面执行如图9中的a图所示的从内屏的顶部下拉的操作来启动下拉任务栏，则内屏显示如图9中的b图所示的下拉任务栏。用户在下拉任务栏中点击“外屏协同”的控件，内屏的界面则弹出如图9中的c图所示的协同模式的选择窗口，该协同模式的选择窗口中包括镜像协同和异步协同的控件，用户可以根据需要选择合理的协同模式。
当用户选择镜像协同的模式时,外屏则显示和内屏相同的内容,例如外屏显示如图9中的d图所示的图库的界面。当用户打开其中一张图片时,外屏也展开显示这张图片。当用户在内屏来回滑动切换图片时,外屏也随之切换图片。
可选地，用户还可以点击协同模式的选择窗口中的“取消”控件来取消外屏协同功能的启动。在一些实施例中，一些应用程序可以设定为默认的外屏协同的模式，例如上述图8所示的实施例中，用户启动图库的外屏协同功能时，图库这个应用程序可以设定为默认采用镜像协同的模式，而无需用户进行选择。
在一些实施例中，当用户需要向他人展示视频的时候，也可以按照如图9所示的实施例的流程，当用户启动外屏协同功能的时候，外屏显示和内屏相同的视频的内容，实现视频分享。用户还可以操作内屏中的视频播放界面来控制视频播放，例如用户在内屏的视频播放界面中点击开始、暂停、停止等按键，外屏也可以响应用户的操作，同步显示内屏中的视频播放的界面。
图10以在用户分享图片的场景为例,对电子设备的外屏显示图片的过程进行详细说明。包括:
S1001、图库调用API2向窗口管理器发送图库的注册信息进行注册。
S1002、当用户操作电子设备的内屏来打开图库时,电子设备中的输入管理器识别用户的操作,并将识别结果发送至窗口管理器。
S1003、窗口管理器响应于用户操作的识别结果,添加图库的窗口,并通过API1向图库发送窗口添加完毕的指令。
S1004、图库响应于窗口管理器发送的窗口添加完毕的指令,则绘制图库的页面,并且在绘制完成之后通过API1向窗口管理器回执绘制完成的消息。
S1005、窗口管理器响应于图库回执的绘制完成的消息,向显示管理器发送显示消息,该显示消息中包括由内屏显示图库的页面的信息。
S1006、显示管理器控制内屏显示图库的页面。
需要说明的是,上述S1001并不一定在S1002之前执行,还可以在S1002、S1003、S1004、S1005或者S1006之后执行,只要是在S1007之前执行即可。
S1007、当用户从内屏顶部下滑启动下拉任务栏时，输入管理器识别用户的操作，并将识别结果发送至窗口管理器。
S1008、窗口管理器响应于用户操作的识别结果，添加下拉任务栏的窗口，并向系统用户界面发送窗口添加完毕的指令。
S1009、系统用户界面响应于窗口管理器发送的窗口添加完毕的指令,绘制下拉任务栏的窗口,并且在绘制完成之后向窗口管理器回执绘制完成的消息。
S1010、窗口管理器响应于系统用户界面回执的绘制完成的消息，向显示管理器发送显示消息，该显示消息中包括由内屏显示下拉任务栏的窗口的信息。
S1011、显示管理器控制内屏显示下拉任务栏的窗口。
S1012、当用户操作内屏打开外屏协同时，输入管理器识别用户的操作并将识别结果发送至窗口管理器。
S1013、窗口管理器响应于用户操作的识别结果,向显示管理器发送显示消息,该显示消息中包括由内屏和外屏显示图库的页面的信息。
S1014、显示管理器控制内屏和外屏显示图库的页面。
S1015、当用户操作内屏打开图片时,输入管理器识别用户的操作并将识别结果发送至窗口管理器。
S1016、窗口管理器响应于用户操作的识别结果,添加图片的窗口,并通过API1向图库发送窗口添加完毕的指令。
S1017、图库响应于窗口管理器发送的窗口添加完毕的指令,则绘制图片的页面,并且在绘制完成之后通过API1向窗口管理器回执绘制完成的消息。
S1018、窗口管理器响应于图库回执的绘制完成的消息,向显示管理器发送显示消息,该显示消息中包括由内屏和外屏显示图片的页面的信息。
S1019、显示管理器控制内屏和外屏显示图片的页面。
此时,用户就可以通过外屏向他人展示图片,无需翻转电子设备,使得图片的分享更方便。
当家长想要使用电子设备给孩子拍照时,如图11所示的场景,如果担心孩子无法集中注意力导致拍摄的效果不好的话,则可以使用外屏播放一段动画片来吸引孩子注意力,然后进行拍照,这时内屏可以显示拍照预览的界面,既不影响家长拍摄还可以吸引孩子注意力,提高拍摄效果和拍摄速度。
具体为,作为家长的用户可以打开本地存储的动画片或者打开其他视频APP来播放一段视频,此时内屏可以如图12中的a图所示显示视频播放的界面。当用户执行如图12中的a图所示的从内屏的顶部下拉的操作时,内屏则如图12中的b图所示显示下拉任务栏。用户执行如图12中的b图所示的操作,即在下拉任务栏中点击“外屏协同”的控件,内屏的界面则弹出如图12中的c图所示的协同模式的选择窗口,该协同模式的选择窗口中包括镜像协同和异步协同的控件。当用户在图12中的c图所示界面中点击“异步协同”的控件时,外屏则可以显示如图12中的d图所示的视频播放的界面。
可选地,为了让孩子观看更大的视频的界面,还可以将电子设备横向持握,即外屏处于横屏模式,此时外屏则可以如图13中的a图或者如图13中的b图显示横屏的视频播放的界面。
这时候,电子设备的外屏权限可以是关闭的状态,孩子无法操作外屏,只有观看的权限,这样可以避免孩子误操作影响拍摄。用户可以操作内屏来控制视频播放,例如点击播放、暂停、下一集等控件来控制视频的播放。
孩子通过外屏观看动画片时，用户则可以打开摄像APP进行拍照或者录像。可选地，用户可以执行如图14中的a图所示的操作，从内屏的侧边向中间滑动退出视频播放界面，此时由于电子设备处于外屏协同的状态，视频播放界面不会关闭，而是如图14中的b图所示以缩小的悬浮窗的形式显示在内屏的一角。然后用户可以如图14中的b图所示，在桌面上点击相机的图标来打开相机进行拍摄。在用户进行拍摄过程中，外屏可以一直播放动画片吸引孩子注意，内屏则如图14中的c图所示显示拍摄的预览界面。
可选地，用户可以执行如图15中的a图所示的操作，从内屏的侧边向中间滑动退出视频播放界面，此时由于电子设备处于外屏协同的状态，视频播放界面可以如图15中的b图所示以缩小的悬浮球的形式显示在内屏的一角。然后用户可以如图15中的b图所示，在桌面上点击相机的图标来打开相机进行拍摄。在用户进行拍摄过程中，外屏可以一直播放动画片吸引孩子注意，内屏则如图15中的c图所示显示拍摄的预览界面。
可选地，在异步协同的模式下，用户还可以执行如图16中的a图所示的操作，从内屏的侧边向中间滑动并停顿来打开如图16中的b图所示的侧边栏，并从侧边栏中点选相机的图标来打开相机进行拍摄。此时内屏则可以如图16中的c图所示显示拍摄的预览界面，同步将外屏所显示的视频播放的界面以悬浮球的形式显示在内屏中。用户还可以在内屏拖动悬浮窗或者悬浮球来移动位置从而避开操作的区域。用户可以在拍摄的预览界面中操作内屏来设置相机拍摄的参数。
图17以在用户通过外屏给孩子播放视频,同时给孩子拍照的场景为例,对电子设备的外屏显示视频的过程进行详细说明。包括:
S1701、视频播放器调用API2向窗口管理器发送视频播放器的注册信息进行注册。
S1702、当用户操作电子设备的内屏来打开视频进行播放时,电子设备中的输入管理器识别用户的操作,并将识别结果发送至窗口管理器。
S1703、窗口管理器响应于用户操作的识别结果,添加视频播放器的窗口,并通过API1向视频播放器发送窗口添加完毕的指令。
S1704、视频播放器响应于窗口管理器发送的窗口添加完毕的指令,则绘制视频的页面,并且在绘制完成之后通过API1向窗口管理器回执绘制完成的消息。
S1705、窗口管理器响应于视频播放器回执的绘制完成的消息,向显示管理器发送显示消息,该显示消息中包括由内屏显示视频播放器的页面的信息。
S1706、显示管理器控制内屏显示视频播放器的页面。
需要说明的是,上述S1701并不一定在S1702之前执行,还可以在S1702、S1703、S1704、S1705或者S1706之后执行,只要是在S1707之前执行即可。
S1707、当用户操作内屏打开外屏协同时,输入管理器识别用户的操作并将识别结果发送至窗口管理器。
S1708、窗口管理器响应于用户操作的识别结果,向显示管理器发送显示消息,该显示消息中包括由内屏和外屏显示视频播放的界面的信息。
S1709、显示管理器控制内屏和外屏显示视频播放的界面。
S1710、当用户从内屏侧边向内滑动时，输入管理器识别用户的操作，并将识别结果发送至窗口管理器。
S1711、窗口管理器响应于用户操作的识别结果，添加桌面的窗口和视频播放器的悬浮窗的窗口，并向视频播放器发送悬浮窗的窗口添加完毕的指令。
S1712、视频播放器响应于窗口管理器发送的悬浮窗的窗口添加完毕的指令,绘制视频播放器的悬浮窗,并且在绘制完成之后通过API1向窗口管理器回执绘制完成的消息。
S1713、窗口管理器响应于视频播放器回执的绘制完成的消息，向显示管理器发送显示消息，该显示消息中包括由内屏显示桌面和视频播放器的悬浮窗的信息。
S1714、显示管理器控制内屏显示桌面和视频播放器的悬浮窗。
S1715、当用户操作内屏打开相机时,输入管理器识别用户的操作并将识别结果发送至窗口管理器。
S1716、窗口管理器响应于用户操作的识别结果,添加相机的窗口,并通过API1向相机发送窗口添加完毕的指令。
S1717、相机响应于窗口管理器发送的窗口添加完毕的指令,则绘制相机的界面,并且在绘制完成之后通过API1向窗口管理器回执绘制完成的消息。
S1718、窗口管理器响应于相机回执的绘制完成的消息,向显示管理器发送显示消息,该显示消息中包括由内屏显示相机的界面的信息。
S1719、显示管理器控制内屏显示相机的页面,并以悬浮窗的形式显示视频播放的界面。
这样用户就可以通过外屏向孩子播放视频来吸引孩子注意力,同时使用内屏预览拍摄的界面进行拍摄,既不影响家长拍摄还可以吸引孩子注意力使得孩子配合拍摄,提高拍摄效果和拍摄速度。
当用户先打开了相机对孩子进行拍摄，发现孩子注意力不集中，则可以在如图18中的a图所示的拍摄预览的界面中，通过从屏幕侧边向中间滑动并停顿的方式打开侧边栏，则内屏可以如图18中的b图所示显示侧边栏。用户在图18中的b图所示的界面中点击视频播放器来播放视频，此时视频播放器的界面则可以如图18中的c图所示以悬浮窗的形式显示在相机的预览界面的上方。此时用户可以在下拉任务栏中打开外屏协同并选择异步协同，则内屏可以显示如图14中的c图或者图15中的c图所示的界面，显示相机的预览界面以及缩小的视频播放器的悬浮窗或悬浮球；而外屏则可以如图12中的d图、图13中的a图或者图13中的b图所示，显示视频播放器的界面。
当用户结束展示时，可以在下拉任务栏中点击外屏协同的控件来停止外屏协同，这时，外屏可以熄灭或者显示外屏的桌面，而不再显示之前展示的界面。
图19为本申请实施例提供的又一例屏幕显示方法的流程图,该方法应用于包括第一屏幕和第二屏幕的电子设备,本实施例以第一屏幕为内屏,第二屏幕为外屏为例进行说明,其中,外屏和内屏设置在电子设备上相对的两侧,方法包括:
S1901、接收用户执行的第一操作。
可选地,用户可以在内屏执行第一操作,例如在桌面点击快捷服务的图标来打开快捷服务的窗口。可选地,用户也可以在外屏执行第一操作,例如双击外屏、或者双指敲击外屏来打开快捷服务的窗口。
S1902、响应于所述第一操作,在所述第一屏幕显示第一界面,所述第一界面中包括第一控件。
响应于上述第一操作，外屏可以显示第一界面，本实施例中以第一界面为快捷服务的窗口为例来进行说明。第一界面可以为如图4中的d图所示的快捷服务的窗口。其中，快捷服务的窗口中可以包括一个或多个服务控件，每个服务控件均对应一个APP或一个APP的设定的页面，例如一个支付的服务控件可以对应支付APP，也可以对应该支付APP的支付码的页面。可选地，上述快捷服务的窗口中的服务控件可以包括用户需要快捷调用的APP对应的第一控件。
S1903、接收用户对第一控件的第二操作。
在快捷服务的窗口中，电子设备接收用户执行的选择第一控件的第二操作。例如该第二操作为从快捷服务的窗口中的至少一个服务控件中点击所需要展示的第一控件的操作。
在一些实施例中，外屏的操作权限可以是关闭状态，以避免其他人误操作导致隐私泄露或者信息丢失，此时用户可以在内屏执行第二操作。在一些实施例中，外屏的操作权限可以是打开状态，其他人可以在外屏执行第二操作来选择第一控件，实现在机主允许的情况下根据需要选择要展示的服务码，因此使用更方便。
S1904、响应于所述第二操作,在所述第二屏幕显示所述第一控件对应的第二界面。
响应于上述第二操作,电子设备在外屏显示上述第一控件对应的第二界面,例如服务码。例如,当用户所选择的第一控件为支付APP的控件,则对应的服务码可以为支付码;当用户所选择的第一控件为个人健康APP的控件,则对应的服务码还可以是个人健康码;当用户所选择的第一控件为乘车APP的控件时,则对应的服务码可以是乘车码。需要说明的是,此处的服务码可以是条码、二维码等形式,本申请实施例对服务码的具体形式不做限定。
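服务码的具体生成方式不属于本申请限定的内容。作为示意，下面给出一段使用开源ZXing库将服务码内容编码为二维码位图的代码（假设性示例），生成的位图可用于在外屏的第二界面中展示。

```kotlin
import android.graphics.Bitmap
import android.graphics.Color
import com.google.zxing.BarcodeFormat
import com.google.zxing.MultiFormatWriter

// 假设性示意：将支付码/乘车码等服务码内容编码为二维码位图，
// 以便在外屏的第二界面中显示。size 为二维码边长像素数。
fun encodeServiceCode(content: String, size: Int = 600): Bitmap {
    val matrix = MultiFormatWriter().encode(content, BarcodeFormat.QR_CODE, size, size)
    val bitmap = Bitmap.createBitmap(size, size, Bitmap.Config.ARGB_8888)
    for (x in 0 until size) {
        for (y in 0 until size) {
            // 按位图矩阵逐点填充黑白像素
            bitmap.setPixel(x, y, if (matrix.get(x, y)) Color.BLACK else Color.WHITE)
        }
    }
    return bitmap
}
```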
上述实施例中，电子设备能够基于用户的第一操作在外屏显示第一界面，例如快捷服务的窗口，然后在第二操作的触发下，通过外屏显示出用户所选择的第一控件对应的第二界面，例如服务码的界面，可以避免用户翻转电子设备进行扫码带来的不便，使得扫码操作更为方便和快捷。当电子设备本身为折叠状态时，即内屏为折叠状态，此时内屏熄灭。用户双击外屏可以唤醒外屏，同时外屏的操作权限为开启状态，用户则可以在外屏执行第二操作来打开对应的第二界面。这样用户则无需展开内屏即可实现扫码，提高了扫码操作的便捷性。
在一些实施例中，上述步骤S1901的一种可能的实现方式可以包括：在所述第一屏幕显示第三界面时，接收所述第一操作，所述第三界面为以下界面中的一种：锁屏界面、AOD界面、桌面、第一APP的界面。当用户观看内屏所显示的第一APP的界面时，例如用户通过浏览器阅读小说的内容时，如果此时需要进行扫码支付或者扫码乘车等操作，则可以执行第一操作来打开快捷服务的窗口(即第一界面)，在快捷服务的窗口执行第二操作使得外屏显示需要展示的服务码(即第二界面)，以便在用户不翻转电子设备的情况下使用外屏进行扫码操作。由于外屏只展示服务码，这样也就无需将第一APP的界面展示给其他人或设备，保护了用户的信息不外泄，用户还可以一边不间断地使用第一APP一边进行扫码，保护了用户的隐私的同时提升了用户的扫码体验。
可选地,上述第一APP可以为浏览器APP、视频APP、游戏APP、会议APP、文档APP和视频通话APP中的任意一个。用户通过操作电子设备将需要展示的服务码显示在外屏可以不影响用户在内屏使用上述浏览器APP、视频APP、游戏APP、会议APP、文档APP和视频通话APP中的任意一个,提高了用户的扫码体验。
在一些实施例中，所述第一操作为在内屏(即第一屏幕)向下滑动启动下拉任务栏并点击所述下拉任务栏中快捷服务的控件的操作。当用户正在使用内屏时，用户可以在内屏的顶部向下滑动来启动下拉任务栏，并点击下拉任务栏中的快捷服务的控件来打开快捷服务的窗口，这样的方式执行起来更为精准，不会出现误操作。
在一些实施例中,所述第一操作为双击外屏(即第二屏幕)的操作。用户还可以在外屏执行双击外屏的操作来点亮外屏并在外屏打开快捷服务的窗口,这样则无需精准操作内屏就可以通过双击外屏这样的快捷操作来打开快捷服务的窗口,使得操作方式更灵活,也更方便。
在一些实施例中,所述第一界面中包括:支付码的控件、乘车码的控件和个人健康码的控件中的至少一个。支付码、乘车码和个人健康码为使用频次较高的服务码,将这些服务码中的一个或多个收纳在快捷服务的窗口中,可以明显减少用户手动通过打开APP来打开服务码的次数,提升了用户体验。
在一些实施例中,用户还可以对上述快捷服务的窗口中的服务控件进行管理,例如增加或者删除。
图4中的快捷服务的窗口中以包括：本地健康宝、A支付码、B支付码和乘车码的控件为例示出。用户可以执行第三操作，该第三操作可以参见图6中的a图所示的点击内屏的快捷服务的窗口中的第二控件的操作。该第二控件可以为菜单控件。电子设备接收到该第三操作并进行响应，在内屏所显示的快捷服务的窗口中显示如图6中的b图所示的APP管理列表，该APP管理列表中包括多个支持快捷服务的APP的图标和多个状态按钮，且多个支持快捷服务的APP的图标和多个状态按钮一一对应，即每个快捷服务的APP的图标均对应一个代表自身状态的状态按钮。每个状态按钮可以包括打开和关闭两个状态。其中，打开状态表示快捷服务的窗口中包含了这个状态按钮所对应的快捷服务的APP的控件，如果用户需要将这个状态按钮对应的APP的图标从快捷服务的窗口删除，则手动切换该状态按钮为关闭状态即可；关闭状态表示快捷服务的窗口中不包含对应的APP的控件，如果用户需要将这个状态按钮对应的APP的图标添加至快捷服务的窗口，则手动切换该状态按钮为打开状态即可。电子设备可以接收用户对目标状态按钮执行的第四操作，其中，目标状态按钮为多个状态按钮中的任意一个。响应于该第四操作，电子设备可以在快捷服务的窗口中增加或删除目标状态按钮对应的支持快捷服务的APP的图标。也就是说：如果目标状态按钮是关闭状态，用户执行点击该目标状态按钮的第四操作时，可以将关闭状态切换为打开状态，原本快捷服务的窗口中没有包含目标状态按钮对应的APP的图标，此时则可以在快捷服务的窗口中增加目标状态按钮对应的APP的图标；如果目标状态按钮是打开的状态，用户执行点击该目标状态按钮的第四操作时，可以将打开状态切换为关闭状态，原本快捷服务的窗口中包含目标状态按钮对应的APP的图标，此时则可以在快捷服务的窗口中删除目标状态按钮对应的APP的图标。本实施例中，用户可以通过操作快捷服务的窗口中的菜单控件来打开APP管理列表，然后通过操作APP管理列表中支持快捷服务的APP的图标所对应的状态按钮，来对快捷服务的窗口中的服务控件进行管理。该方法可以实现用户自行管理快捷服务的窗口中的服务控件的种类，使得通过快捷服务的窗口调用服务码的操作能够满足不同用户的个性化的需求，提升了用户体验。
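下面以一段简化的示意代码（假设性示例，数据结构与方法名均为假设）说明上述状态按钮与快捷服务的窗口中服务控件之间的对应管理逻辑：切换目标状态按钮即在窗口的控件列表中增加或删除对应APP的控件。

```kotlin
// 假设性示意：快捷服务窗口中服务控件的管理逻辑。
// 每个支持快捷服务的APP对应一个状态按钮，切换按钮即增删对应控件。
data class QuickServiceEntry(
    val appName: String,       // 例如“A支付码”“乘车码”
    var enabled: Boolean       // 状态按钮：true=打开（窗口中包含该控件），false=关闭
)

class QuickServiceManager(private val entries: MutableList<QuickServiceEntry>) {

    // 对应第四操作：点击目标状态按钮，切换其打开/关闭状态
    fun toggle(appName: String) {
        entries.find { it.appName == appName }?.let { it.enabled = !it.enabled }
    }

    // 快捷服务窗口中实际展示的控件列表：仅包含状态为“打开”的APP
    fun visibleControls(): List<String> =
        entries.filter { it.enabled }.map { it.appName }
}

// 使用示意：关闭“A支付码”、开启“C应用”后，窗口中的控件随之变化
fun demo() {
    val manager = QuickServiceManager(
        mutableListOf(
            QuickServiceEntry("本地健康宝", true),
            QuickServiceEntry("A支付码", true),
            QuickServiceEntry("乘车码", true),
            QuickServiceEntry("C应用", false)
        )
    )
    manager.toggle("A支付码")   // 删除A支付码的控件
    manager.toggle("C应用")     // 新增C应用的控件
    println(manager.visibleControls())  // [本地健康宝, 乘车码, C应用]
}
```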
图20为本申请实施例提供的又一例屏幕显示方法的流程图，该方法应用于电子设备，所述电子设备包括第一屏幕和第二屏幕，本实施例以第一屏幕为内屏，第二屏幕为外屏为例进行说明，其中，所述第一屏幕和所述第二屏幕分别设置在电子设备上相对的两侧，包括：
S2001、在所述第一屏幕显示第三界面,所述第三界面为第一APP的界面。
当用户使用第一APP时，电子设备可以在内屏显示第一APP的第三界面。本申请实施例对第一APP的种类不做限定，可以包括但不限于浏览器APP、视频APP、游戏APP、会议APP、文档APP和视频通话APP中的任意一个。上述第一APP的第三界面可以是游戏APP显示的界面，还可以是视频APP显示的视频播放界面，还可以是文档APP显示的文档内容的界面，本申请实施例对此不做限定。
S2002、接收用户执行的第五操作,所述第五操作为打开第一屏幕和第二屏幕的协同功能(例如内屏和外屏的协同功能)的操作。
当用户需要进行扫码时,可以执行打开内屏和外屏的协同功能的第五操作,例如用户点击桌面上的协同功能的启动按钮,或者从内屏下滑下拉任务栏并点击下拉任务栏中的外屏协同的控件。
S2003、响应于所述第五操作,在所述第二屏幕显示所述第一屏幕所显示的第一APP的界面。
此时，电子设备可以在用户的第五操作的触发下，在外屏显示内屏所显示的第一APP的界面。当用户在内屏继续操作该第一APP时，外屏可以继续跟随内屏显示内屏所显示的第一APP的界面。例如，当内屏显示图库中的图片缩略图的界面时，如果用户执行第五操作，则外屏显示内屏所显示的图库的图片缩略图。此时，如果用户在内屏点击其中一张图片来显示这个图片的放大图时，则外屏同步显示内屏所显示的这个图片的放大图。
上述图20所示的实施例中,电子设备可以在用户执行第五操作时,启动内屏和外屏的协同功能,即使得外屏显示内屏所显示的内容,从而可以无需翻转电子设备就能够实现将第一APP的界面展示给其他的用户查看,避免了翻转电子设备来分享屏幕所带来的不便,使得分享屏幕更为方便,提升了用户分享屏幕的体验。
在一些实施例中，所述第五操作为在所述第一屏幕中下滑启动下拉任务栏并点选下拉任务栏中的外屏协同的控件，以及在弹出的外屏协同选择窗口中选择异步协同的操作。即当用户在内屏的顶部下滑时，可以拉出下拉任务栏。该下拉任务栏中包括外屏协同的控件。当用户点击该外屏协同的控件或者点击外屏协同的控件的菜单按键时，内屏可以弹出外屏协同选择窗口。该外屏协同选择窗口中可以包括异步协同的控件。当用户点击异步协同的控件，则电子设备启动协同功能的异步协同的模式。可选地，所述第五操作还可以是在第一屏幕中下滑启动下拉任务栏并点选下拉任务栏中的外屏协同的控件，以及在弹出的外屏协同选择窗口中选择镜像协同的操作。即当用户在内屏的顶部下滑时，可以拉出下拉任务栏。该下拉任务栏中包括外屏协同的控件。当用户点击该外屏协同的控件或者点击外屏协同的控件的菜单按键时，内屏可以弹出外屏协同选择窗口。该外屏协同选择窗口中还可以包括镜像协同的控件。如果用户点击镜像协同的控件，则电子设备启动协同功能的镜像协同的模式。用户可以根据需要选择不同的协同功能的模式来适配当前的使用场景。
在一些实施例中，当协同功能的模式为异步协同的模式时，电子设备还可以接收用户执行的在内屏打开第二APP的第六操作。响应于第六操作，电子设备在内屏显示第二APP的界面，同时将第一APP的界面以悬浮窗的形式显示在内屏中。当用户打开协同功能时，如果选择了异步协同的模式，则外屏可以显示与内屏不完全一样的内容。即，用户在内屏显示第一APP的界面时启动了异步协同的模式，则当用户操作内屏打开第二APP，即执行第六操作，内屏则显示第二APP的界面。此时外屏依旧显示第一APP的界面，而内屏中可以将第一APP的界面显示在悬浮窗中，以提示用户外屏所显示的内容。本实施例中，用户可以在打开异步协同模式的协同功能后，保持外屏展示第一APP的界面，而自己可以在内屏使用第二APP，同时用户还可以通过内屏的悬浮窗所显示的第一APP的界面来监控第一APP的演示情况，这使得屏幕分享的形式更为丰富，在丰富电子设备的功能的同时，方便了用户的使用，提升了用户体验。
上述用户打开第二APP的第六操作可以是在内屏进行操作,先点击“Home”按键来退出第一APP的界面回到桌面,然后从桌面点击第二APP的图标来打开第二APP的操作;还可以是在第一APP的界面下,从内屏的侧边向内屏的中间滑动一段距离并停顿来打开侧边栏,然后点击侧边栏中的第二APP的图标来打开第二APP的操作。
在一些实施例中，当协同功能的模式为镜像协同的模式时，电子设备还可以接收用户执行的在内屏操作第一APP的第七操作。响应于第七操作，电子设备在内屏切换第一APP的界面来显示第七操作所指示的内容，同时将第七操作所指示的内容同步显示在外屏中，使得内外屏显示相同的内容。该第七操作可以是对第一APP的界面进行的任意切换界面的操作。即，当用户打开协同功能时，如果选择了镜像协同的模式，则外屏可以显示与内屏一样的内容。本实施例中，用户可以在打开镜像协同模式的协同功能后，保持外屏和内屏相同的界面来实现屏幕分享，避免了多人观看同一屏幕时需要调整或翻转电子设备所导致的操作不便，该方法使得屏幕分享更为方便，提升了用户体验。
在一些实施例中,上述第一APP为图库APP、视频APP、文档APP中的任意一个。用户可以通过打开内屏和外屏的协同功能实现将图库APP、视频APP、文档APP等界面分享至外屏,方便了用户分享图片、视频和文档等。
在一些实施例中,当所述第一APP为视频APP时,所述第二APP为相机APP。当家长想要给孩子拍照,又怕孩子注意力不集中影响拍摄效果时,则可以打开视频APP,并开启异步协同的模式,将视频APP的界面分享至外屏来吸引孩子的注意力。然后家长打开相机APP进行拍摄,可以保证孩子注意力朝向正在拍摄的电子设备,方便用户对孩子进行拍摄,确保了拍摄效果。
上文详细介绍了本申请提供的屏幕显示方法的示例。可以理解的是,相应的装置为了实现上述功能,其包含了执行各个功能相应的硬件结构和/或软件模块。本领域技术人员应该很容易意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,本申请能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
本申请可以根据上述方法示例对屏幕显示装置进行功能模块的划分,例如,可以将各个功能划分为各个功能模块,也可以将两个或两个以上的功能集成在一个模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。需要说明的是,本申请中对模块的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式。
图21示出了本申请提供的一种屏幕显示装置的结构示意图。装置2100应用于电子设备,所述电子设备包括第一屏幕和第二屏幕,所述第一屏幕和所述第二屏幕分别设置在电子设备上相对的两侧,例如所述电子设备包括内屏和外屏,包括:
第一接收模块2101,用于接收用户执行的第一操作。
第一显示模块2102,用于响应于所述第一操作,在所述第一屏幕显示第一界面,所述第一界面中包括第一控件。
第一接收模块2101,还用于接收用户对第一控件的第二操作。
第一显示模块2102,还用于响应于所述第二操作,在所述第二屏幕显示所述第一控件对应的第二界面。
在一些实施例中,还包括第二显示模块2103,用于在所述第一屏幕显示第三界面,所述第三界面为第一APP的界面。
第一接收模块2101,具体用于在所述第一屏幕显示第三界面时,接收所述第一操作。
在一些实施例中,所述第一APP为浏览器APP、视频APP、游戏APP、会议APP、文档APP和视频通话APP中的任意一个。
在一些实施例中,所述第一操作为在第一屏幕下滑启动下拉任务栏并点击所述下拉任务栏中快捷服务的控件的操作。
在一些实施例中,所述第一操作为双击第二屏幕的操作。
在一些实施例中,所述第一界面中包括:支付码的控件、乘车码的控件和个人健康码的控件中的至少一个。
在一些实施例中,所述第一界面为快捷服务的窗口,第一接收模块2101,还用于接收用户执行的第三操作,所述第三操作为点击所述第一界面中的菜单控件的操作。
第二显示模块2103，还用于响应于所述第三操作，在第一屏幕中显示APP管理列表，所述APP管理列表中包括多个支持快捷服务的APP的图标和多个状态按钮，且所述多个支持快捷服务的APP的图标和所述多个状态按钮一一对应。
第一接收模块2101,还用于接收用户执行的第四操作,所述第四操作为对目标状态按钮执行的操作,所述目标状态按钮为任意一个所述状态按钮。
第二显示模块2103,还用于响应于所述第四操作,在所述快捷服务的窗口中增加或删除所述目标状态按钮对应的支持快捷服务的APP的图标。
图22示出了本申请提供的一种屏幕显示装置的结构示意图。装置2200应用于电子设备,所述电子设备包括第一屏幕和第二屏幕,所述第一屏幕和所述第二屏幕分别设置在电子设备上相对的两侧,例如所述电子设备包括内屏和外屏,包括:
第三显示模块2201，用于在第一屏幕显示第三界面，所述第三界面为以下界面中的一种：锁屏界面、AOD界面、桌面、第一APP的界面。
第二接收模块2202，用于接收用户执行的第五操作，所述第五操作为打开第一屏幕和第二屏幕的协同功能的操作。
第四显示模块2203，用于响应于所述第五操作，在所述第二屏幕显示所述第一屏幕所显示的第一APP的界面。
在一些实施例中,当所述协同功能的模式为异步协同的模式时,第二接收模块2202,还用于接收用户执行的第六操作,所述第六操作为在所述第一屏幕打开第二APP的操作。
第三显示模块2201,还用于响应于所述第六操作,在所述第一屏幕显示所述第二APP的界面,并将所述第一APP的界面以悬浮窗的形式显示在所述第一屏幕中。
在一些实施例中,所述第五操作为在所述第一屏幕中下滑启动下拉任务栏并在所述下拉任务栏中点选外屏协同的控件,以及在弹出的外屏协同选择窗口中选择异步协同的操作。
在一些实施例中,所述第六操作为在所述第一屏幕操作的退出第一APP的界面并从桌面打开所述第二APP的操作;或者,在所述第一屏幕操作的调用侧边栏并点选所述侧边栏中的第二APP的图标的操作。
在一些实施例中，当所述协同功能的模式为镜像协同的模式时，第二接收模块2202，还用于接收用户执行的第七操作，所述第七操作为在所述第一屏幕操作所述第一APP的操作。
第三显示模块2201,还用于响应于所述第七操作,在所述第一屏幕显示所述第七操作所指示的内容。
第四显示模块2203，还用于响应于所述第七操作，在所述第二屏幕显示所述第七操作所指示的内容。
在一些实施例中,所述第五操作为在所述第一屏幕中下滑启动下拉任务栏并在所述下拉任务栏中点选外屏协同的控件,以及在弹出的外屏协同选择窗口中选择镜像协同的操作。
在一些实施例中,所述第一APP为图库APP、视频APP、文档APP中的任意一个。
在一些实施例中,当所述第一APP为视频APP时,所述第二APP为相机APP。
装置2100和装置2200执行屏幕显示方法的具体方式以及产生的有益效果可以参见方法实施例中的相关描述,此处不再赘述。
本申请实施例还提供了一种电子设备,包括上述处理器。本实施例提供的电子设备可以是图1所示的终端设备100,用于执行上述屏幕显示方法。在采用集成的单元的情况下,终端设备可以包括处理模块、存储模块和通信模块。其中,处理模块可以用于对终端设备的动作进行控制管理,例如,可以用于支持终端设备执行显示单元、检测单元和处理单元执行的步骤。存储模块可以用于支持终端设备执行存储程序代码和数据等。通信模块,可以用于支持终端设备与其它设备的通信。
其中，处理模块可以是处理器或控制器。其可以实现或执行结合本申请公开内容所描述的各种示例性的逻辑方框，模块和电路。处理器也可以是实现计算功能的组合，例如包含一个或多个微处理器组合，数字信号处理(digital signal processing，DSP)和微处理器的组合等等。存储模块可以是存储器。通信模块具体可以为射频电路、蓝牙芯片、Wi-Fi芯片等与其它终端设备交互的设备。
在一个实施例中,当处理模块为处理器,存储模块为存储器时,本实施例所涉及的终端设备可以为具有图1所示结构的设备。
本申请实施例还提供了一种计算机可读存储介质,所述计算机可读存储介质中存储了计算机程序,当所述计算机程序被处理器执行时,使得处理器执行上述任一实施例所述的屏幕显示方法。
本申请实施例还提供了一种计算机程序产品,当该计算机程序产品在计算机上运行时,使得计算机执行上述相关步骤,以实现上述实施例中的屏幕显示方法。
其中,本实施例提供的电子设备、计算机可读存储介质、计算机程序产品或芯片均用于执行上文所提供的对应的方法,因此,其所能达到的有益效果可参考上文所提供的对应的方法中的有益效果,此处不再赘述。
在本申请所提供的几个实施例中，应该理解到，所揭露的装置和方法，可以通过其它的方式实现。例如，以上所描述的装置实施例仅仅是示意性的，例如，模块或单元的划分，仅仅为一种逻辑功能划分，实际实现时可以有另外的划分方式，例如多个单元或组件可以结合或者可以集成到另一个装置，或一些特征可以忽略，或不执行。另一点，所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口，装置或单元的间接耦合或通信连接。作为分离部件说明的单元可以是或者也可以不是物理上分开的，作为单元显示的部件可以是一个物理单元或多个物理单元，即可以位于一个地方，或者也可以分布到多个不同地方。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该软件产品存储在一个存储介质中,包括若干指令用以使得一个设备(可以是单片机,芯片等)或处理器(processor)执行本申请各个实施例方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(read only memory,ROM)、随机存取存储器(random access memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
以上内容,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以权利要求的保护范围为准。

Claims (18)

  1. 一种屏幕显示方法,应用于电子设备,所述电子设备包括第一屏幕和第二屏幕,所述第一屏幕和所述第二屏幕分别设置在电子设备上相对的两侧,其特征在于,包括:
    接收用户执行的第一操作;
    响应于所述第一操作,在所述第一屏幕显示第一界面,所述第一界面中包括第一控件;
    接收用户对第一控件的第二操作;
    响应于所述第二操作,在所述第二屏幕显示所述第一控件对应的第二界面。
  2. 根据权利要求1所述的方法,其特征在于,接收用户执行的第一操作,包括:
    在所述第一屏幕显示第三界面时,接收所述第一操作,所述第三界面为以下界面中的一种:锁屏界面、息屏AOD界面、桌面、第一应用程序APP的界面。
  3. 根据权利要求2所述的方法,其特征在于,所述第一APP为浏览器APP、视频APP、游戏APP、会议APP、文档APP和视频通话APP中的任意一个。
  4. 根据权利要求1至3中任一项所述的方法,其特征在于,所述第一操作为在所述第一屏幕下滑启动下拉任务栏并点击所述下拉任务栏中快捷服务的控件的操作。
  5. 根据权利要求1至3中任一项所述的方法,其特征在于,所述第一操作为双击所述第二屏幕的操作。
  6. 根据权利要求1至5中任一项所述的方法,其特征在于,所述第一界面中包括:支付码的控件、乘车码的控件和个人健康码的控件中的至少一个。
  7. 根据权利要求1至6中任一项所述的方法,其特征在于,所述第一界面为快捷服务的窗口,所述方法还包括:
    接收用户执行的第三操作,所述第三操作为点击所述第一界面中的第二控件的操作;
    响应于所述第三操作,在所述第一屏幕中显示APP管理列表,所述APP管理列表中包括多个支持快捷服务的APP的图标和多个状态按钮,且所述多个支持快捷服务的APP的图标和多个所述状态按钮一一对应;
    接收用户执行的第四操作,所述第四操作为对目标状态按钮执行的操作,所述目标状态按钮为任意一个所述状态按钮;
    响应于所述第四操作,在所述快捷服务的窗口中增加或删除所述目标状态按钮对应的支持快捷服务的APP的图标。
  8. 一种屏幕显示方法,应用于电子设备,所述电子设备包括第一屏幕和第二屏幕,所述第一屏幕和所述第二屏幕分别设置在电子设备上相对的两侧,其特征在于,包括:
    在所述第一屏幕显示第三界面,所述第三界面为以下界面中的一种:锁屏界面、息屏AOD界面、桌面、第一应用程序APP的界面;
    接收用户执行的第五操作,所述第五操作为打开第一屏幕和第二屏幕的协同功能的操作;
    响应于所述第五操作,在所述第二屏幕显示所述第一屏幕所显示的第一APP的界面。
  9. 根据权利要求8所述的方法,其特征在于,当所述协同功能的模式为异步协同的模式时,所述方法还包括:
    接收用户执行的第六操作,所述第六操作为在所述第一屏幕打开第二APP的操作;
    响应于所述第六操作,在所述第一屏幕显示所述第二APP的界面,并将所述第一APP的界面以悬浮窗的形式显示在所述第一屏幕中。
  10. 根据权利要求9所述的方法,其特征在于,所述第五操作为在所述第一屏幕中下滑启动下拉任务栏并在所述下拉任务栏中点选外屏协同的控件,以及在弹出的外屏协同选择窗口中选择异步协同的操作。
  11. 根据权利要求9或10所述的方法,其特征在于,所述第六操作为在所述第一屏幕操作的退出第一APP的界面并从桌面打开所述第二APP的操作;或者,在所述第一屏幕操作的调用侧边栏并点选所述侧边栏中的第二APP的图标的操作。
  12. 根据权利要求8所述的方法,其特征在于,当所述协同功能的模式为镜像协同的模式时,所述方法还包括:
    接收用户执行的第七操作，所述第七操作为在所述第一屏幕操作所述第一APP的操作；
    响应于所述第七操作,在所述第一屏幕和所述第二屏幕同步显示所述第七操作所指示的内容。
  13. 根据权利要求12所述的方法,其特征在于,所述第五操作为在所述第一屏幕中下滑启动下拉任务栏并在所述下拉任务栏中点选外屏协同的控件,以及在弹出的外屏协同选择窗口中选择镜像协同的操作。
  14. 根据权利要求12或13所述的方法,其特征在于,所述第一APP为图库APP、视频APP、文档APP中的任意一个。
  15. 根据权利要求14所述的方法,其特征在于,当所述第一APP为视频APP时,所述第二APP为相机APP。
  16. 一种电子设备,所述电子设备包括第一屏幕和第二屏幕,所述第一屏幕和所述第二屏幕分别设置在电子设备上相对的两侧,其特征在于,包括:处理器、存储器和接口;
    所述处理器、所述存储器和所述接口相互配合,使得所述电子设备执行如权利要求1至15中任一项所述的方法。
  17. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质中存储了计算机程序,当所述计算机程序被处理器执行时,使得所述处理器执行权利要求1至15中任一项所述的方法。
  18. 一种计算机程序产品,其特征在于,所述计算机程序产品包括:计算机程序代码,当所述计算机程序代码在电子设备上运行时,使得所述电子设备执行权利要求1至15中任一项所述的方法。
PCT/CN2022/114897 2021-12-25 2022-08-25 屏幕显示方法和电子设备 WO2023116012A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP22862333.6A EP4224298A4 (en) 2021-12-25 2022-08-25 SCREEN DISPLAY METHOD AND ELECTRONIC DEVICE
US18/026,548 US20240295905A1 (en) 2021-12-25 2022-08-25 Screen display method and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111606004.0A CN116339568A (zh) 2021-12-25 2021-12-25 屏幕显示方法和电子设备
CN202111606004.0 2021-12-25

Publications (2)

Publication Number Publication Date
WO2023116012A1 true WO2023116012A1 (zh) 2023-06-29
WO2023116012A9 WO2023116012A9 (zh) 2023-08-24

Family

ID=86875177

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/114897 WO2023116012A1 (zh) 2021-12-25 2022-08-25 屏幕显示方法和电子设备

Country Status (4)

Country Link
US (1) US20240295905A1 (zh)
EP (1) EP4224298A4 (zh)
CN (1) CN116339568A (zh)
WO (1) WO2023116012A1 (zh)


Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101802760B1 (ko) * 2011-06-27 2017-12-28 엘지전자 주식회사 이동 단말기 및 그 제어방법
CN108323197B (zh) * 2016-12-27 2021-01-15 华为技术有限公司 一种多屏显示的方法和设备
US20190138205A1 (en) * 2017-11-09 2019-05-09 Hisense Mobile Communications Technology Co., Ltd. Image display method of a dual-screen device, dual-screen device, and non-transitory storage medium
CN107943380A (zh) * 2017-11-30 2018-04-20 努比亚技术有限公司 双面屏备注操作方法、双面屏终端及计算机可读存储介质
CN110007844A (zh) * 2018-01-05 2019-07-12 中兴通讯股份有限公司 一种屏幕切换的方法及双屏的终端
CN108829304A (zh) * 2018-05-29 2018-11-16 维沃移动通信有限公司 一种显示控制方法及终端
CN109547628A (zh) * 2018-10-30 2019-03-29 努比亚技术有限公司 镜像显示方法、装置及计算机可读存储介质
CN110134476A (zh) * 2019-04-24 2019-08-16 珠海格力电器股份有限公司 一种应用显示的方法及设备
CN113535054B (zh) * 2020-04-16 2024-06-07 深圳市腾讯计算机系统有限公司 显示内容切换方法、装置、存储介质及电子设备
CN112667074A (zh) * 2020-12-23 2021-04-16 北京小米移动软件有限公司 显示方法、显示装置及存储介质

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106210307A (zh) * 2016-07-08 2016-12-07 努比亚技术有限公司 移动终端及屏幕切换方法
WO2018034402A1 (en) * 2016-08-16 2018-02-22 Lg Electronics Inc. Mobile terminal and method for controlling the same
CN107645611A (zh) * 2017-10-17 2018-01-30 维沃移动通信有限公司 一种支付方法及移动终端
CN109710132A (zh) * 2018-12-28 2019-05-03 维沃移动通信有限公司 操作控制方法及终端
CN109739407A (zh) * 2019-01-25 2019-05-10 维沃移动通信有限公司 信息处理方法及终端设备
CN109917956A (zh) * 2019-02-22 2019-06-21 华为技术有限公司 一种控制屏幕显示的方法和电子设备

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4224298A4

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117615055A (zh) * 2023-09-19 2024-02-27 荣耀终端有限公司 一种显示控制方法及电子设备

Also Published As

Publication number Publication date
EP4224298A4 (en) 2024-06-05
EP4224298A1 (en) 2023-08-09
US20240295905A1 (en) 2024-09-05
WO2023116012A9 (zh) 2023-08-24
CN116339568A (zh) 2023-06-27

Similar Documents

Publication Publication Date Title
WO2021103981A1 (zh) 分屏显示的处理方法、装置及电子设备
CN113794800B (zh) 一种语音控制方法及电子设备
WO2021027747A1 (zh) 一种界面显示方法及设备
WO2021213164A1 (zh) 应用界面交互方法、电子设备和计算机可读存储介质
WO2021000881A1 (zh) 一种分屏方法及电子设备
WO2020062294A1 (zh) 系统导航栏的显示控制方法、图形用户界面及电子设备
WO2021082835A1 (zh) 启动功能的方法及电子设备
CN111176506A (zh) 一种屏幕显示方法及电子设备
CN111669459B (zh) 键盘显示方法、电子设备和计算机可读存储介质
CN111443836B (zh) 一种暂存应用界面的方法及电子设备
WO2021036770A1 (zh) 一种分屏处理方法及终端设备
CN110750317A (zh) 一种桌面的编辑方法及电子设备
WO2021190524A1 (zh) 截屏处理的方法、图形用户接口及终端
CN111970401B (zh) 一种通话内容处理方法、电子设备和存储介质
WO2021249281A1 (zh) 一种用于电子设备的交互方法和电子设备
WO2021218429A1 (zh) 应用窗口的管理方法、终端设备及计算机可读存储介质
WO2020155875A1 (zh) 电子设备的显示方法、图形用户界面及电子设备
WO2022012418A1 (zh) 拍照方法及电子设备
WO2021057203A1 (zh) 一种操作方法和电子设备
WO2021196980A1 (zh) 多屏交互方法、电子设备及计算机可读存储介质
CN112068907A (zh) 一种界面显示方法和电子设备
CN113746961A (zh) 显示控制方法、电子设备和计算机可读存储介质
WO2023116012A1 (zh) 屏幕显示方法和电子设备
WO2024067551A1 (zh) 界面显示方法及电子设备
WO2021042878A1 (zh) 一种拍摄方法及电子设备

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 18026548

Country of ref document: US

ENP Entry into the national phase

Ref document number: 2022862333

Country of ref document: EP

Effective date: 20230307

NENP Non-entry into the national phase

Ref country code: DE