CN115328592B - Display method and related device - Google Patents

Display method and related device

Info

Publication number
CN115328592B
CN115328592B (application CN202210801489.7A)
Authority
CN
China
Prior art keywords
area
user
target device
message
machine interaction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210801489.7A
Other languages
Chinese (zh)
Other versions
CN115328592A (en)
Inventor
万潇潇
祁云飞
彭琦
余洋
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202410119217.8A (published as CN117931354A)
Priority to CN202210801489.7A (published as CN115328592B)
Publication of CN115328592A
Application granted
Publication of CN115328592B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the application discloses a display method and a related device. In the method, an electronic device monitors the invocation state of a target device. The target device is arranged in the area covered by a black opaque area of the electronic device's display screen, and the target device provides its service through that black opaque area. When it is monitored that the target device is invoked within a preset duration from the current moment, the electronic device displays a notification message in a message notification area of the display screen. The message notification area surrounds part or all of the black opaque area, and the preset duration is greater than or equal to 0. By implementing this technical scheme, the negative experience that the black opaque area brings to the user can be reduced.

Description

Display method and related device
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a display method and a related device.
Background
A notch screen or punch-hole screen is an electronic screen in which a black opaque area is provided within the display of the electronic device; this area can be used to house a camera and similar components. It is a screen solution adopted in pursuit of extreme bezels, such as a full-face screen. However, the black opaque area blocks content displayed by the screen, resulting in a poor user experience.
Disclosure of Invention
The application provides a display method and a related device, which can improve user experience.
In a first aspect, the present application provides a display method, the method comprising:
the electronic equipment monitors the calling condition of a target device; the target device is arranged in an area covered by a black opaque area of the display screen of the electronic equipment, and the target device provides service through the black opaque area;
when it is monitored that the target device is to be invoked within a preset duration from the current moment, the electronic device displays a notification message in a message notification area of the display screen; the message notification area surrounds part or all of the black opaque area, and the preset duration is greater than or equal to 0. A preset duration equal to 0 indicates that the target device is currently being invoked, while a duration greater than 0 indicates that the target device is about to be invoked. In other words, the notification message may be displayed either when the target device is invoked or before it is invoked.
In the present application, the notification message is displayed around the black opaque area, so that after seeing the notification message the user can overlook the negative experience brought by the black opaque area; combining the notification message with the black opaque area turns them into something interesting and practical to display, improving the user experience.
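The monitoring-and-display flow of the first aspect can be sketched as a small simulation. This is an illustrative sketch only; the class and method names (`NotchNotifier`, `on_device_call`) are hypothetical and not defined by the patent:

```python
class NotchNotifier:
    """Illustrative sketch: show a notification around the black opaque
    (notch) area when a device housed under it is being invoked now, or
    will be invoked within a preset duration (>= 0) from the current moment."""

    def __init__(self, preset_duration_s):
        if preset_duration_s < 0:
            raise ValueError("preset duration must be >= 0")
        self.preset_duration_s = preset_duration_s
        self.displayed = []  # messages shown in the message notification area

    def on_device_call(self, device_name, seconds_until_call):
        # seconds_until_call == 0: the device is being invoked right now.
        # 0 < seconds_until_call <= preset: the device is about to be invoked.
        if 0 <= seconds_until_call <= self.preset_duration_s:
            self.displayed.append(f"{device_name} in use")
            return True
        return False

notifier = NotchNotifier(preset_duration_s=2.0)
notifier.on_device_call("camera", 0.0)       # invoked now -> notify
notifier.on_device_call("camera", 1.5)       # about to be invoked -> notify
notifier.on_device_call("microphone", 10.0)  # outside the window -> no message
print(notifier.displayed)
```

With a preset duration of 0, only the first call would produce a notification, matching the "displayed when the target device is invoked" case.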
In one possible implementation, the notification message is a prompt message about a service provided by the target device.
The present application emphasizes that the notification message relates to the service provided by the target device under the opaque area; that is, the notification message appears around the hardware strongly related to the current notification content, so that both the timing and the position of the message meet or exceed the user's expectations, bringing the user a distinctive experience.
In one possible embodiment, the message notification area and the black opaque area are in the same color tone.
Because the message notification area and the black opaque area share the same color tone, the two areas form an immersive, integrated design that removes the sense of a boundary between them, further reducing the negative experience brought by the black opaque area.
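One simple way to realize an area that surrounds part or all of the notch is to expand the notch's bounding rectangle by a margin, clamped to the screen edge. The helper below is a hypothetical geometric sketch; the patent does not prescribe a specific layout algorithm:

```python
def message_notification_rect(notch_rect, margin):
    """Return an (x, y, w, h) rectangle surrounding the black opaque
    (notch) area, expanded by `margin` pixels on each side and clamped
    so it never extends past the top-left screen edge."""
    x, y, w, h = notch_rect
    return (max(0, x - margin), max(0, y - margin),
            w + 2 * margin, h + 2 * margin)

# A 200x80 notch at (440, 0), with a 24 px surrounding band:
print(message_notification_rect((440, 0, 200, 80), 24))  # (416, 0, 248, 128)
```

Rendering this band in the same color tone as the notch itself would give the immersive, boundary-free appearance described above.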
In one possible embodiment, the foregoing method further includes: and under the condition that the target device is monitored to be called in a preset time length from the current moment, the electronic equipment further displays a first man-machine interaction control in a message notification area of the display screen.
Optionally, the first man-machine interaction control prompts the user to respond to the notification message or to the content notified by the notification message.
Displaying a man-machine interaction control in the message notification area increases user participation and gives the user more choices, bringing unexpected delight and a distinctive experience, so that the negative experience brought by the black opaque area is overlooked.
In one possible embodiment, the foregoing method further includes:
under the condition that the target device is monitored to be called within a preset time length from the current moment, the electronic equipment displays human-computer interaction elements on the display screen; the man-machine interaction element is used for providing an expansion service for the user after man-machine interaction with the user, and the expansion service is a service other than the service provided by the target device.
Optionally, the notification message includes indication information for indicating a user to perform a man-machine interaction operation on the man-machine interaction element.
By extending more services through the human-computer interaction element, the user has more choices, which brings unexpected delight and a distinctive experience, so that the negative experience brought by the black opaque area is overlooked.
In one possible implementation manner, the notification message includes a hyperlink text, where the hyperlink text is used to provide an extended service for the user after performing man-machine interaction with the user, and the extended service is a service other than the service provided by the target device.
Optionally, the notification message includes indication information for indicating a user to perform man-machine interaction on the hyperlink text.
By extending more services through the hyperlink text, the user has more choices, which brings unexpected delight and a distinctive experience, so that the negative experience brought by the black opaque area is overlooked.
In one possible embodiment, the foregoing method further includes:
under the condition that the target device is monitored to be called in a preset time length from the current moment, the electronic equipment further displays a second man-machine interaction control on the display screen; the display area of the second man-machine interaction control is intersected with the message notification area; the second man-machine interaction control is used for providing an expansion service for the user after man-machine interaction with the user, and the expansion service is a service other than the service provided by the target device.
Optionally, the notification message includes indication information for indicating that the user clicks, pulls down, or drags the second human-computer interaction control.
By extending more services through the man-machine interaction control, the user has more choices, which brings unexpected delight and a distinctive experience, so that the negative experience brought by the black opaque area is overlooked.
In one possible implementation, the display screen includes a visual effect presentation area, and the visual effect presentation area includes the black opaque area and a message notification area; the method further comprises the steps of:
the electronic device provides a preset visual effect in the visual effect presentation area during the display of the notification message in the message notification area.
Optionally, the aforementioned preset visual effects include visual effects presented by one or more of color, brightness, transparency, dynamic effects, gradient effects, shadows, images, luminescence, soft edges, 3D effects, or three-dimensional rotation.
By presenting rich visual effects to the user in the visual effect presentation area that includes the black opaque area, the method and the device enhance the user's visual experience, so that the negative experience brought by the black opaque area is overlooked.
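The gating of the preset visual effect on notification display can be sketched minimally as follows. The effect names mirror the options listed above, but this catalog and the function are illustrative, not an API defined by the patent:

```python
# Effect names mirror the options listed in the text; this catalog and the
# function below are illustrative, not an interface defined by the patent.
PRESET_EFFECTS = {"color", "brightness", "transparency", "dynamic",
                  "gradient", "shadow", "image", "luminescence",
                  "soft_edge", "3d", "rotation"}

def effects_to_render(notification_visible, chosen):
    """Render the chosen preset effects in the visual effect presentation
    area only while a notification is displayed in the message area."""
    unsupported = set(chosen) - PRESET_EFFECTS
    if unsupported:
        raise ValueError(f"unsupported effects: {sorted(unsupported)}")
    return list(chosen) if notification_visible else []

print(effects_to_render(True, ["gradient", "luminescence"]))
print(effects_to_render(False, ["gradient"]))
```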
In a second aspect, the present application provides an electronic device comprising:
the monitoring unit is used for monitoring the calling condition of the target device; the target device is arranged in an area covered by a black opaque area of the display screen of the electronic equipment, and the target device provides service through the black opaque area;
the display unit is used for displaying a notification message in a message notification area of the display screen when it is monitored that the target device is to be invoked within a preset duration from the current moment; the message notification area surrounds part or all of the black opaque area, and the preset duration is greater than or equal to 0.
In one possible implementation, the notification message is a prompt message about a service provided by the target device.
In one possible embodiment, the message notification area and the black opaque area are in the same color tone.
In one possible implementation manner, the display unit is further configured to:
and displaying a first man-machine interaction control in a message notification area of the display screen under the condition that the target device is monitored to be called in a preset time length from the current moment.
In one possible implementation, the first man-machine interaction control prompts the user to respond to the notification message or to the content notified by the notification message.
In one possible implementation manner, the display unit is further configured to:
under the condition that the target device is monitored to be called in a preset time length from the current moment, displaying a human-computer interaction element on the display screen; the man-machine interaction element is used for providing an expansion service for the user after man-machine interaction with the user, and the expansion service is a service other than the service provided by the target device.
In one possible implementation manner, the notification message includes indication information for indicating that the user performs man-machine interaction operation on the man-machine interaction element.
In one possible implementation manner, the notification message includes a hyperlink text, where the hyperlink text is used to provide an extended service for the user after performing man-machine interaction with the user, and the extended service is a service other than the service provided by the target device.
In one possible implementation manner, the notification message includes indication information for indicating that the user performs man-machine interaction on the hyperlink text.
In one possible implementation manner, the display unit is further configured to:
under the condition that the target device is monitored to be called in a preset time length from the current moment, displaying a second man-machine interaction control on the display screen; the display area of the second man-machine interaction control is intersected with the message notification area; the second man-machine interaction control is used for providing an expansion service for the user after man-machine interaction with the user, and the expansion service is a service other than the service provided by the target device.
In one possible implementation manner, the notification message includes indication information for indicating that the user clicks, pulls down or drags the second human-computer interaction control.
In one possible implementation, the display screen includes a visual effect presentation area, and the visual effect presentation area includes the black opaque area and a message notification area; the aforementioned display unit is further configured to:
and providing a preset visual effect in the visual effect presentation area during the display of the notification message in the message notification area.
In one possible implementation, the aforementioned preset visual effects include visual effects presented by one or more of color, brightness, transparency, dynamic effects, gradient effects, shadows, images, lighting, soft edges, 3D effects, or three-dimensional rotation.
In a third aspect, the present application provides an electronic device, comprising: one or more processors, a memory, and a display screen. The display screen is configured to display information as instructed by the one or more processors. The memory is coupled to the one or more processors and is used for storing a computer program comprising computer instructions; the one or more processors invoke the computer instructions to cause the electronic device to perform the method of any one of the first aspect.
Optionally, the electronic device may also include a communication interface for the device to communicate with other devices, which may be, for example, a transceiver, circuit, bus, module, or other type of communication interface.
In one possible implementation, the electronic device may include:
a memory for storing a computer program or computer instructions;
a processor for:
monitoring the calling condition of a target device; the target device is arranged in an area covered by a black opaque area of the display screen of the electronic equipment, and the target device provides service through the black opaque area;
under the condition that the target device is monitored to be called in a preset time length from the current moment, displaying a notification message in a message notification area of the display screen; the message notification area surrounds a part or all of the black opaque area, and the preset time period is greater than or equal to 0.
Note that the computer program or computer instructions in the memory may be stored in advance, or may be downloaded from the internet and then stored when the device is used; the present application does not specifically limit the source of the computer program or computer instructions in the memory. The coupling in the embodiments of the present application is an indirect coupling or connection between devices, units, or modules, which may be electrical, mechanical, or of another form, and is used for information interaction between the devices, units, or modules.
In a fourth aspect, the present application provides a computer readable storage medium storing a computer program for execution by a processor to implement the method of any one of the first aspects.
In a fifth aspect, embodiments of the present application provide a chip, which includes a processor and a memory, where the memory is configured to store a computer program or computer instructions, and the processor is configured to execute the computer program or computer instructions stored in the memory, so that the chip performs the method according to any one of the first aspect.
In a sixth aspect, the present application provides a computer program product which, when executed by a processor, performs the method of any one of the first aspects above.
The solutions provided in the second to sixth aspects are used to implement, or cooperate to implement, the methods provided in the first aspect, and can therefore achieve the same or corresponding beneficial effects as the corresponding methods of the first aspect; details are not repeated here.
Drawings
The drawings used in the embodiments of the present application are described below.
Fig. 1 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present application;
Fig. 2 is a software architecture block diagram of the electronic device 100 provided in an embodiment of the present application;
Figs. 3A and 3B are schematic diagrams of a display screen of an electronic device according to an embodiment of the present application;
Figs. 4A and 4B are schematic diagrams of a display screen of an electronic device according to an embodiment of the present application;
Figs. 5A to 5C are schematic diagrams illustrating positions of message notification areas in a display screen according to an embodiment of the present application;
Figs. 6A and 6B are schematic diagrams illustrating notification message display provided in an embodiment of the present application;
Fig. 6C is a schematic flow chart of displaying a notification message according to an embodiment of the present application;
Figs. 7A and 7B are schematic diagrams illustrating notification message display provided in an embodiment of the present application;
Figs. 8A and 8B are schematic diagrams illustrating notification message display provided in an embodiment of the present application;
Figs. 9A to 9C are schematic diagrams illustrating notification message display provided in embodiments of the present application;
Figs. 10A and 10B are schematic diagrams illustrating notification message display provided in an embodiment of the present application;
Figs. 11A and 11B are schematic diagrams illustrating notification message display provided in embodiments of the present application;
Figs. 12A to 12D are schematic diagrams illustrating notification message display provided in embodiments of the present application;
Figs. 13A to 13K are schematic views showing visual effects according to embodiments of the present application;
Fig. 14 is a flow chart of a display method according to an embodiment of the present application;
Fig. 15 is a schematic logic structure diagram of an electronic device according to an embodiment of the present application;
Fig. 16 is a schematic structural diagram of a chip according to an embodiment of the present application.
Detailed Description
The terminology used in the following embodiments of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of embodiments of the application. As used in the specification of the embodiments of the present application and the appended claims, the singular forms "a," "an," "the," and "said" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in the embodiments of the present application refers to and encompasses any and all possible combinations of one or more of the listed items.
The following describes a terminal according to an embodiment of the present application.
Fig. 1 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present disclosure.
The embodiment will be specifically described below taking the electronic device 100 as an example. It should be understood that electronic device 100 may have more or fewer components than shown, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The electronic device 100 may include: processor 110, external memory interface 120, internal memory 121, universal serial bus (universal serial bus, USB) interface 130, charge management module 140, power management module 141, battery 142, antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headset interface 170D, sensor module 180, keys 190, motor 191, indicator 192, camera 193, display 194, and subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the electronic device 100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, charger, flash, camera 193, etc., respectively, through different I2C bus interfaces. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, such that the processor 110 communicates with the touch sensor 180K through an I2C bus interface to implement the touch function of the electronic device 100.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the photographing functions of electronic device 100. The processor 110 and the display 194 communicate via a DSI interface to implement the display functionality of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The SIM interface may be used to communicate with the SIM card interface 195 to perform functions of transferring data to or reading data from the SIM card.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices, etc.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the global system for mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time-division synchronous code division multiple access (time-division synchronous code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global positioning system (global positioning system, GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (beidou navigation satellite system, BDS), a quasi-zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
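As a hedged illustration of the DSP's format-conversion step described above, the following sketch converts a single full-range BT.601 YUV sample to 8-bit RGB. The coefficients are the standard BT.601 ones; the function name and the full-range assumption are the author's own and are not taken from the patent.

```python
def yuv_to_rgb(y, u, v):
    """Convert one full-range BT.601 YUV sample to 8-bit RGB.

    A simplified model of the DSP step that turns a digital image
    signal into a standard RGB image signal.
    """
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    clamp = lambda c: max(0, min(255, round(c)))  # keep values in [0, 255]
    return clamp(r), clamp(g), clamp(b)
```

With the chroma channels at their midpoint (U = V = 128), a gray sample maps straight through, e.g. Y = 128 yields (128, 128, 128).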
The digital signal processor is used to process digital signals, and can process other digital signals in addition to digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving picture experts group (moving picture experts group, MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an operating system, an application required for at least one function (such as a face recognition function, a fingerprint recognition function, a mobile payment function, etc.), and the like. The storage data area may store data created during use of the electronic device 100 (e.g., face information template data, fingerprint information templates, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
A receiver 170B, also referred to as an "earpiece", is used to convert the audio electrical signal into a sound signal. When the electronic device 100 is answering a telephone call or playing a voice message, voice may be heard by placing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "mic" or "mouthpiece", is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can speak with the mouth close to the microphone 170C, inputting a sound signal into the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to enable collection of sound signals, noise reduction, identification of sound sources, directional recording functions, etc.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (open mobile terminal platform, OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are various types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of conductive material. The capacitance between the electrodes changes when a force is applied to the pressure sensor 180A. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic device 100 detects the touch operation intensity according to the pressure sensor 180A. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch location but at different touch operation intensities may correspond to different operation instructions. For example: when a touch operation whose intensity is smaller than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed. When a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
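The pressure-dependent behavior described above can be sketched as a simple dispatch on touch intensity. The threshold value and the action names below are illustrative assumptions; the patent fixes no concrete values.

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # assumed normalized intensity; not a value from the patent

def dispatch_sms_icon_touch(intensity):
    """Map a touch on the short message app icon to an instruction.

    Below the first pressure threshold: view the short message.
    At or above the threshold: create a new short message.
    """
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_short_message"
    return "new_short_message"
```

Note that the boundary case (intensity exactly equal to the threshold) falls into the "new short message" branch, matching the "greater than or equal to" wording above.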
The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., x, y, and z axes) may be determined by gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance to be compensated by the lens module according to the angle, and makes the lens counteract the shake of the electronic device 100 through the reverse motion, so as to realize anti-shake. The gyro sensor 180B may also be used for navigating, somatosensory game scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation.
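One conventional way to compute altitude from a barometric reading, consistent with the positioning-and-navigation aid described above, is the international barometric formula. The sea-level reference pressure below is the standard-atmosphere assumption, not a value from the patent.

```python
SEA_LEVEL_HPA = 1013.25  # standard-atmosphere reference pressure, an assumption

def altitude_from_pressure(pressure_hpa):
    """Estimate altitude in meters from barometric pressure (hPa)
    using the international barometric formula."""
    return 44330.0 * (1.0 - (pressure_hpa / SEA_LEVEL_HPA) ** (1.0 / 5.255))
```

At the reference pressure the estimate is 0 m; lower readings yield higher altitudes.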
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip cover using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D. Features such as automatic unlocking upon flipping open may then be set according to the detected open or closed state of the leather case or of the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. The acceleration sensor 180E may also be used to recognize the posture of the electronic device, and is applied to landscape/portrait screen switching, pedometers, and other applications.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, the electronic device 100 may range using the distance sensor 180F to achieve quick focus.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode. The electronic device 100 detects infrared reflected light from nearby objects using a photodiode. When sufficient reflected light is detected, it may be determined that there is an object in the vicinity of the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object in the vicinity of the electronic device 100. The electronic device 100 can detect that the user holds the electronic device 100 close to the ear by using the proximity light sensor 180G, so as to automatically extinguish the screen for the purpose of saving power. The proximity light sensor 180G may also be used in holster mode, pocket mode to automatically unlock and lock the screen.
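The near/far decision and the screen-off behavior described above can be modeled as follows. The reflection threshold and the rule that the screen turns off only during a call are illustrative assumptions, not specifics from the patent.

```python
REFLECTION_THRESHOLD = 100  # assumed sensor counts for "sufficient reflected light"

def object_nearby(reflected_light):
    """Sufficient reflected light implies an object near the device;
    insufficient reflected light implies no object nearby."""
    return reflected_light >= REFLECTION_THRESHOLD

def screen_should_turn_off(in_call, reflected_light):
    """Turn the screen off to save power when the device is held
    to the ear during a call."""
    return in_call and object_nearby(reflected_light)
```

The same predicate could back the holster-mode and pocket-mode lock/unlock behavior mentioned above, gated on different device states.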
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 executes a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to prevent the low temperature from causing the electronic device 100 to shut down abnormally. In other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
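The three-threshold temperature processing strategy above can be sketched as follows. The concrete temperatures are the author's placeholder assumptions, since the patent names no values.

```python
HIGH_TEMP_C = 45       # assumed throttling threshold
LOW_TEMP_C = 0         # assumed battery-heating threshold
VERY_LOW_TEMP_C = -10  # assumed voltage-boost threshold

def thermal_actions(temp_c):
    """Return the list of protective actions for a reported temperature."""
    actions = []
    if temp_c > HIGH_TEMP_C:
        actions.append("reduce_processor_performance")
    if temp_c < LOW_TEMP_C:
        actions.append("heat_battery")
    if temp_c < VERY_LOW_TEMP_C:
        actions.append("boost_battery_output_voltage")
    return actions
```

In the normal operating band no action is taken; at very low temperatures both low-temperature measures apply together.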
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 together form a touch screen, also called a "touchscreen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a location different from that of the display 194.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195, or removed from the SIM card interface 195 to enable contact and separation with the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, micro SIM cards, and the like. The same SIM card interface 195 may be used to insert multiple cards simultaneously. The types of the plurality of cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to realize functions such as communication and data communication.
In this embodiment, the electronic device 100 may execute the display method provided in the embodiment of the present application through the processor 110, and the display screen 194 displays a message or a prompt box obtained by the display method.
By way of example, the electronic device 100 may include, but is not limited to, any electronic product based on an intelligent operating system that can interact with a user via an input device such as a keyboard, a virtual keyboard, a touch pad, a touch screen, or a voice control device, such as smartphones, tablet computers (tablet personal computer, Tablet PC), handheld computers, wearable electronic devices, personal computers (personal computer, PC), desktop computers, and the like. The electronic device may also include, but is not limited to, any internet of things (internet of things, IoT) device. The IoT device may be, for example, a smart speaker, a television (television, TV), an in-vehicle display of an automobile, and so forth. Here, intelligent operating systems include, but are not limited to, any operating system that enriches device functionality by providing various applications to the device, such as the Android, iOS, Windows, macOS, or HarmonyOS (Harmony OS) operating systems.
The software system of the electronic device 100 shown in fig. 1 may employ a layered architecture, an event driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application exemplifies a software system of a hierarchical architecture, and exemplifies a software structure of the electronic device 100. Fig. 2 illustrates a software architecture block diagram of the electronic device 100 according to an embodiment of the present application.
The layered architecture divides the software into several layers, each with distinct roles and branches. The layers communicate with each other through a software interface. In some embodiments, the system is divided into four layers, from top to bottom, an application layer, an application framework layer, runtime (run time) and system libraries, and a kernel layer, respectively.
The application layer may include a series of application packages.
As shown in fig. 2, the application package may include applications (also referred to as applications) such as cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used to manage window programs. The window manager can acquire the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100, such as management of call status (including connected, hung up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar or at a preset designated location (for example, the message notification area described in the embodiments of the present application, which will be detailed below and is not elaborated here). It may be used to convey a notification-type message that automatically disappears after a short stay without user interaction. For example, the notification manager is used to notify that a download is complete, to give message alerts, and the like. The notification manager may also present a notification in the form of a chart or scroll-bar text in the system top status bar, such as a notification of an application running in the background, or a notification presented on the screen in the form of a dialog interface. For example, a text message is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, or an indicator light blinks.
The runtime (Runtime) includes core libraries and virtual machines. The runtime is responsible for scheduling and management of the system.
The core library consists of two parts: one part is the function that the programming language (e.g., the java language) needs to call, and the other part is the core library of the system.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes the programming files (e.g., java files) of the application layer and the application framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), two-dimensional graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of two-Dimensional (2D) and three-Dimensional (3D) layers for multiple applications.
The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files, and the like. The media libraries may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing 3D graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, a sensor driver and a virtual card driver.
The workflow of the electronic device 100 in combination with the software and hardware is illustrated below in connection with capturing a photo scene.
When touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into the original input event (including information such as touch coordinates, time stamp of touch operation, etc.). The original input event is stored at the kernel layer. The application framework layer acquires an original input event from the kernel layer, and identifies a control corresponding to the input event. Taking the touch operation as a touch click operation, taking a control corresponding to the click operation as an example of a control of a camera application icon, the camera application calls an interface of an application framework layer, starts the camera application, further starts a camera driver by calling a kernel layer, and captures a still image or video by the camera 193. This example is merely illustrative of the flow of the electronic device 100 software and hardware interoperating with each other and is not limiting of the present application.
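The raw-input-event flow above can be sketched as a minimal hit test and dispatch. The event fields, the control table, and the coordinates below are hypothetical; they only model the kernel-layer event and the framework-layer control identification described in this example.

```python
from dataclasses import dataclass

@dataclass
class RawInputEvent:
    """A kernel-layer raw input event: touch coordinates and a timestamp."""
    x: float
    y: float
    timestamp_ms: int

# Hypothetical framework-layer hit-test table: control -> (x0, y0, x1, y1).
CONTROLS = {
    "camera_app_icon": (0, 0, 100, 100),
    "gallery_app_icon": (100, 0, 200, 100),
}

def identify_control(event):
    """Framework layer: find the control under the touch coordinates."""
    for name, (x0, y0, x1, y1) in CONTROLS.items():
        if x0 <= event.x < x1 and y0 <= event.y < y1:
            return name
    return None

def handle_click(event):
    """Dispatch a click: a touch on the camera icon starts the camera app,
    which in turn would start the camera driver via the kernel layer."""
    if identify_control(event) == "camera_app_icon":
        return "start_camera_app"
    return "ignore"
```

The real pipeline additionally stores the raw event in the kernel layer and routes it through framework interfaces; the sketch keeps only the coordinate-to-control step.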
In order to solve the problem that a black opaque area in a display screen of an electronic device affects user experience, a display method is provided.
First, to facilitate understanding of black opaque regions in a display screen of an electronic device, reference may be made exemplarily to fig. 3A, 3B, 4A, and 4B. In the display screens shown in fig. 3A and 3B, the black opaque region may be circular. This form of display screen may also be referred to as a hole-punch screen. In the display screens shown in fig. 4A and 4B, the black opaque region may be a rounded rectangle. This form of display screen may also be referred to as a notch screen.
As can also be seen in fig. 3A, 3B, 4A, and 4B described above, the black opaque region in the display screen may be disposed directly above the display screen (see, e.g., fig. 3A and 4A). Alternatively, the black opaque region in the display screen may be disposed at the upper left of the display screen (see, e.g., fig. 3B and 4B). Alternatively, the black opaque region in the display screen may be disposed at the upper right of the display screen. Alternatively, the black opaque region in the display screen may be disposed at any position in the display screen, which is not limited by the embodiments of the present application. In the embodiments of the present application, orientations with respect to the display screen, such as directly above, the upper left, or the upper right, refer to orientations determined from the user's point of view when the user faces the display screen.
In a possible implementation manner, a hardware device is arranged in the area covered by the black opaque region in the display screen, and the hardware device provides services for users through the black opaque region. The black opaque region in the display screen may, for example, be where the camera of the electronic device is located, i.e., it may be a camera hole. In another possible embodiment, the black opaque region in the display screen may be, for example, where an earpiece of the electronic device is located. In another possible embodiment, the black opaque region in the display screen may be, for example, where a sensor of the electronic device is located. These are merely examples; embodiments of the present application do not limit the use of the black opaque region.
In a possible embodiment, the black opaque region of the display screen is not limited to the circular or rounded rectangle. Illustratively, the black opaque region may be any shape such as square, drop-shaped, rounded trapezoid, or oval, and the shape of the black opaque region is not limited in the embodiments of the present application.
In a possible embodiment, the display screens shown in fig. 3A, 3B, 4A, and 4B may be rectangular. In a specific implementation, the shape of the display screen is not limited to a rectangle; it may also be a rounded rectangle, a foldable screen, a curved screen, or any other form, which is not limited in the embodiments of the present application.
Because the black opaque region stands out obtrusively in the display screen, it affects the viewing experience of the user when the user views a user interface on the display screen. For this reason, the embodiments of the present application provide a display method, a specific implementation of which is described by way of example below.
In a specific implementation, during the user's use of the display screen, some notification messages may be displayed around the black opaque region, and the region of the display screen where these notification messages are displayed may be referred to as a message notification area. These notification messages may be prompt messages, human-machine interaction indication messages, or the like. Such notification messages may be pleasantly unexpected to the user and may enhance the user's interest and experience, so the negative experience caused by the black opaque region may be counteracted by the positive effect of these notification messages.
In one possible embodiment, reference may be made exemplarily to fig. 5A and 5B. Fig. 5A illustrates a black opaque region directly above the display screen, and fig. 5B illustrates a black opaque region at the upper left of the display screen. It can be seen that in the display screen, the message notification area may enclose a black opaque area.
In one possible embodiment, reference may be made to fig. 5C, which illustrates an example in which the black opaque region is directly above the display screen. As can be seen, the message notification area may be located to the left of the black opaque region in the display screen. In the embodiments of the present application, directions such as up, down, left, and right are determined from the user's perspective when the user faces the display screen; the same applies throughout and is not repeated below. In another possible embodiment, the message notification area may also be located to the right of the black opaque region.
The message notification area is located around the black opaque region, and the two areas may or may not intersect. The embodiments of the present application do not limit the size of the display-screen area occupied by the message notification area, which may be preset; nor do they limit the shape of the message notification area, or visual attributes such as its color, lines, or brightness. To facilitate an understanding of how the message notification area is presented, the following examples take the black opaque region directly above the display screen.
Referring to fig. 6A, specific notification messages may be displayed in a message notification area of a display screen, and the notification messages may be displayed in text form, for example.
In one possible implementation, the notification message may be associated with a hardware device disposed in the black opaque region. For ease of understanding, reference may be made to fig. 6B. In fig. 6B, it is assumed that the hardware device disposed in the black opaque region includes a camera, and the black opaque region includes a camera hole. Then, when the camera is about to be invoked or has been invoked, information related to the camera may be displayed around the black opaque region, that is, in the message notification area. For example, taking the scenario in fig. 6B in which the camera is about to be invoked, a prompt message of "12 o'clock conference about to start, please pre-test the camera" is displayed in the message notification area. After seeing the prompt, the user overlooks the negative experience of the black opaque region and instead finds it interesting and practical that the notification message is displayed in combination with the black opaque region. In other words, in the embodiments of the present application, the notification message appears around the hardware strongly related to the current notification content, so that both the timing and the position of the message meet or exceed the user's expectations, bringing the user a different experience.
The application scenario in which the camera is about to be invoked or has been invoked is not limited to a video conference invoking the camera; it may also be, for example, one of the following application scenarios:
For example, it may be an application scenario that invokes the camera to take a photo or record a video. For example, after the electronic device starts the camera, a prompt message such as "record the good moments through the lens" may be displayed in the message notification area.
For example, it may also be an application scenario that invokes the camera to make a video call. For example, while the electronic device initiates a video call and waits for the counterpart to answer, a prompt message such as "let the camera bring you a pleasant online meeting" may be displayed in the message notification area. For example, after the electronic device receives a video call request, a prompt message such as "quickly start a pleasant online meeting through the camera" may be displayed in the message notification area.
For example, it may be an application scenario that invokes the camera of the electronic device to perform face recognition. For example, after the electronic device activates the camera, a prompt message such as "looking forward to knowing the unique you" may be displayed in the message notification area.
It should be noted that the application scenarios described above and the information displayed in the message notification area are only examples. In a specific implementation, the embodiments of the present application can be applied to any scenario that invokes the camera. In different application scenarios, the information displayed in the message notification area may be the same or different. The embodiments of the present application do not limit the specific content of the information displayed in the message notification area.
In another possible implementation, the hardware device disposed in the black opaque region is not limited to a camera, and may be another device, such as an earpiece. Taking the earpiece as an example, when the earpiece is about to be invoked or has been invoked, information related to the earpiece may be displayed around the black opaque region, that is, in the message notification area. For example, a prompt message such as "please bring the phone close to your ear" may be displayed in the message notification area. The embodiments of the present application do not limit the hardware devices disposed in the black opaque region, nor the specific content of the hardware-related information displayed in the message notification area.
For example, a specific flow of displaying a notification message in the above message notification area may be seen in fig. 6C. In a specific implementation, after the electronic device receives a preset operation or monitors a preset event, a message may be sent to trigger a processing operation of a notification manager in the electronic device (such as the notification manager shown in fig. 2). The preset operation may be an operation of invoking the hardware device disposed in the black opaque region; the preset event may be an event in which that hardware device is about to be invoked. The notification manager then immediately matches the content of the notification message to be displayed, based on the corresponding keyword in the operation or event. The notification message is processed by a graphics engine (for example, the 2D graphics engine and/or 3D graphics engine shown in fig. 2) and then displayed in a designated area, that is, the message notification area. The designated area may be, for example, an area determined in advance during user interface (UI) design.
Referring to fig. 6C, on receiving an operation of invoking the hardware device disposed in the black opaque region, the electronic device may also send a message to the hardware device, which starts in response to the message. Optionally, in response to the message, an indicator light of the hardware device may flash or remain constantly lit.
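The flow above, in which a preset operation or monitored event triggers the notification manager, the manager matches message content by keyword, and the result is rendered in the designated area, can be sketched in Python. This is a minimal illustrative sketch only; the class name, the keyword-to-message table, and the `render` callback (standing in for the graphics engine) are all hypothetical and not taken from the patent or any real framework.

```python
# Hypothetical keyword-to-message table; in practice this would be
# determined during UI design (the "designated area" mentioned above).
KEYWORD_MESSAGES = {
    "camera": "12 o'clock conference about to start, please pre-test the camera",
    "earpiece": "Please bring the phone close to your ear",
}

class NotificationManager:
    def __init__(self, render):
        # `render` stands in for the 2D/3D graphics engine of fig. 2.
        self.render = render

    def on_trigger(self, keyword):
        # Match the notification content by the keyword carried in the
        # preset operation or monitored event.
        message = KEYWORD_MESSAGES.get(keyword)
        if message is not None:
            # Display in the designated area, i.e. the message
            # notification area around the black opaque region.
            self.render("message_notification_area", message)
            return message
        return None
```

For example, triggering with the keyword `"camera"` would render the conference prompt in the message notification area, while an unknown keyword displays nothing.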
Taking the camera as an example of the hardware device disposed in the black opaque region, the operation of invoking that hardware device may be an operation of invoking the camera. The operation of invoking the camera may be, for example, an operation, received by the electronic device, of touching (or clicking) a camera application icon. Alternatively, it may be an operation of touching (or clicking) a shooting control, whether in the camera application or in a user interface of a third-party application (for example, an application with a shooting function such as WeChat, Douyin, or Kuaishou). Alternatively, it may be an operation of touching (or clicking) a control for accepting a video call. Alternatively, it may be an operation of touching (or clicking) a control for joining a video conference. Alternatively, it may be an operation of touching (or clicking) a face recognition control, or a confirmation control for face recognition verification.
Again taking the camera as an example of the hardware device disposed in the black opaque region, the event in which that hardware device is about to be invoked may be an event in which the camera is about to be invoked. For example, an event monitoring unit in the electronic device may be used to monitor such events. The event may be, for example, that a scheduled video conference is about to start: the event monitoring unit detects that the conference is about to start, senses that the conference requires the camera, and triggers a processing operation of the notification manager by a message. The event may also be, for example, a scheduled (timed) shooting event: the event monitoring unit detects the scheduled shooting event and then triggers a processing operation of the notification manager by a message.
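The event-monitoring check described above can be sketched as a simple scan over scheduled events: any event starting within a lead window that requires the camera should trigger the notification manager. This is an illustrative sketch under assumed data shapes; the field names (`start`, `needs_camera`) and the ten-minute lead window are hypothetical, not specified by the patent.

```python
import datetime

def events_needing_camera(events, now, lead=datetime.timedelta(minutes=10)):
    """Return the events starting within `lead` of `now` that require the camera.

    Each event is assumed to be a dict with a `start` datetime and an
    optional `needs_camera` flag (hypothetical fields for illustration).
    """
    upcoming = []
    for ev in events:
        if ev.get("needs_camera") and now <= ev["start"] <= now + lead:
            upcoming.append(ev)
    return upcoming
```

An event monitoring unit could run such a check periodically and, for each hit, send the trigger message to the notification manager.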
It should be understood that the notification message implemented by the notification manager in the above embodiment is merely an example, and in other embodiments, other similar software functional modules may also be used to implement operations such as generating and displaying a message. For example, the software module may be a software module in an operating system, a software module of an application, or a combination of the software module of the operating system and the software module of the application.
Referring to fig. 7A, the difference from fig. 6A is that in fig. 7A the black opaque region and the message notification area are fused, that is, the two follow an immersive integrated design. This design reduces the sense of a boundary between the black opaque region and the message notification area, making it easier for the user to ignore the negative experience of the black opaque region when reading the information in the message notification area. See fig. 7B for an example; compared with fig. 6B, the message notification area in fig. 7B follows the immersive integrated design with the black opaque region. After seeing the prompt, the user overlooks the negative experience of the black opaque region and instead finds it interesting and practical that the notification message is displayed in combination with the black opaque region. Apart from these differences, the description of fig. 7A may refer to the corresponding description of fig. 6A, and that of fig. 7B to the corresponding description of fig. 6B, which are not repeated here.
Referring to fig. 8A, specific notification messages may be displayed in the message notification area of the display screen, for example in text form. In addition, human-machine interaction controls may be displayed in the message notification area, through which the electronic device can receive specific operation instructions from the user. For ease of understanding, reference may be made to fig. 8B.
In fig. 8B, it is assumed that the hardware device disposed in the black opaque region includes a camera, and the black opaque region includes a camera hole. Then, when the camera is about to be invoked or has been invoked, information related to the camera may be displayed around the black opaque region, that is, in the message notification area, and controls for camera-related human-machine interaction may be provided there as well. For example, taking the scenario in fig. 8B in which the camera is about to be invoked, a prompt message of "ready to start the camera" is displayed in the message notification area, together with "disable" and "allow" human-machine interaction controls. The "disable" control is used to forbid starting the camera; the "allow" control is used to permit starting it. In the user interface shown in fig. 8B, in response to a user touch (or click) operation on the "disable" control, the electronic device does not activate the camera; in response to a user touch (or click) operation on the "allow" control, the electronic device activates the camera.
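The disable/allow decision above reduces to a small handler: activate the camera only when the "allow" control is touched. The following is a minimal sketch; the function name and the `start_camera` callback are illustrative stand-ins, not part of the patent.

```python
def handle_camera_permission(control, start_camera):
    """Respond to a touch (or click) on the 'disable' or 'allow' control.

    `control` is the identifier of the control the user touched;
    `start_camera` is a callback that activates the camera hardware.
    Returns True if the camera was started.
    """
    if control == "allow":
        start_camera()  # "allow" permits starting the camera
        return True
    # "disable" (or any other input): do not activate the camera
    return False
```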
In other embodiments, the message notification area may also display application notification messages, such as a message from a WeChat friend; the controls may then be "reply" or "close", for quickly replying to the message, dismissing it, or other operations.
The message prompt and human-machine interaction modes provided by the embodiments of the present application can increase user engagement and give the user more choices, bringing unexpected delight and a different experience, so that the user overlooks the negative experience caused by the black opaque region.
Fig. 8A and 8B described above illustrate an implementation in which the message notification area and the black opaque area are of an immersive integrated design. In another possible implementation, the design of the message notification area and the black opaque area in fig. 8A and 8B described above may also be a design in which there is a distinct boundary between the two areas as shown in fig. 6A. The embodiments of the present application are not limited in this regard.
Referring to fig. 9A, specific notification messages may be displayed in the message notification area of the display screen, for example in text form. In addition, human-machine interaction elements matching the notification message may be displayed on the display screen; for example, the notification message may prompt the user to operate on the human-machine interaction element. The electronic device can receive the user's specific operation instruction through the human-machine interaction element. For ease of understanding, reference may be made to fig. 9B.
In fig. 9B, it is assumed that the hardware device disposed in the black opaque region includes a camera, and the black opaque region includes a camera hole. Then, when the camera is about to be invoked or has been invoked, information related to the camera may be displayed around the black opaque region, that is, in the message notification area, and human-machine interaction elements matching that information may be displayed on the display screen. For example, taking the scenario in fig. 9B of waiting for a video conference to start or having already entered one, the message notification area shows the prompts "replace virtual avatar" and "enter the metaverse conference with one tap". Selectable avatars may be displayed at any location on the display screen, such as below the message notification area, and the user can touch (or click) an avatar to replace their own. For example, after receiving a touch (or click) operation on an avatar, the electronic device may directly replace the user's avatar with it. Alternatively, the electronic device may pop up a prompt box after receiving the touch (or click) operation; the prompt box asks the user whether to replace the avatar, and after receiving the user's touch (or click) operation on the control in the prompt box that indicates replacement, the electronic device replaces the user's avatar accordingly. These are examples only; the embodiments of the present application do not limit the specific operations performed after the electronic device receives a touch (or click) operation on an avatar.
In another possible implementation, in the user interface shown in fig. 9B, a human-machine interaction element hyperlinked to an avatar database may also be displayed. Illustratively, this element may be a text hyperlink or an icon hyperlink, which is not limited by the embodiments of the present application. For ease of understanding, reference may be made to fig. 9C, which illustrates the element as a text hyperlink whose text reads "more avatars…". On receiving a touch (or click) operation on the "more avatars…" hyperlink, the electronic device may jump the user interface into the avatar database or pop up a window displaying more avatars, so that the user can select among them.
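The two avatar-replacement paths described above, replacing directly on tap or replacing only after the user confirms in a prompt box, can be sketched as one small function. This is a hedged illustration: the function name, the `profile` dict, and the `confirm` callback (standing in for the prompt box) are all hypothetical.

```python
def on_avatar_tapped(avatar_id, profile, confirm=None):
    """Replace the user's avatar after a touch (or click) on `avatar_id`.

    If `confirm` is None, replace directly (first path above); otherwise
    `confirm` stands in for the prompt box and must return True before
    the replacement is applied (second path). Returns True on replacement.
    """
    if confirm is None or confirm(avatar_id):
        profile["avatar"] = avatar_id
        return True
    return False
```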
The message prompt and human-machine interaction modes provided by the embodiments of the present application can increase user engagement and give the user more choices, bringing unexpected delight and a different experience, so that the user overlooks the negative experience caused by the black opaque region.
Fig. 9A and 9B described above are exemplified by the design in which there is a clear boundary between the message notification area and the black opaque area. In another possible implementation, the design of the message notification area and the black opaque area in fig. 9A and 9B described above may also be an immersive integrated design of the two areas as shown in fig. 7A. The embodiments of the present application are not limited in this regard.
In a specific implementation, the human-machine interaction element shown in fig. 9A is not limited to an avatar. It may also be, for example, the wallpaper (i.e., background) of a video or conference scene. Through this wallpaper element, the electronic device can receive the user's indication to change or blur the background picture, and display the background effect selected by the user based on that indication. For specific implementation, refer to the corresponding description of fig. 9B, which is not repeated here.
The man-machine interaction element shown in fig. 9A may also be a decoration element (such as a sticker or an emoticon (Emoji) or the like) of the user avatar display frame. The electronic device may receive, via the human-machine interaction element of the decorative element, an indication that the user altered the decorative element, and display an effect of the decorative element selected by the user based on the indication. Specific implementation may refer to the corresponding description of fig. 9B, which is not repeated herein.
The man-machine interaction element shown in fig. 9A may also be a sound effect (e.g. slow, fast, variable frequency, etc. effects). The electronic device may receive, via the human-machine interaction element of the sound effect, an indication that a user alters the effect of the sound played by the electronic device, and display the sound effect selected by the user based on the indication. Specific implementation may refer to the corresponding description of fig. 9B, which is not repeated herein.
The man-machine interaction elements shown in fig. 9A described above are merely examples, and do not limit the embodiments of the present application.
It should be understood that, in the present application, the content of the messages displayed in the message notification area, the human-machine interaction elements in the above embodiments, and the like may be configured in advance, or may be displayed adaptively by the terminal according to the user's current usage scenario or the terminal's state. For example, in the same camera-invocation scenario, the message notification area may display different message content and/or different human-machine interaction elements depending on the meeting history with a friend or other history information.
Referring to fig. 10A, fig. 10A differs from fig. 9A described above in that in fig. 10A, the black opaque region is a circular region, and the message notification region is a region surrounded by an arc line and a display screen boundary closest to the black opaque region. As shown in fig. 10A, this design may optimize the aesthetics of the user interface, provide a more pleasing interface for the user, and enhance the user experience. For ease of understanding, reference may be made to fig. 10B for exemplary purposes. The difference in fig. 10B compared to fig. 9B is that in fig. 10B, the black opaque region is a circular region in which a camera hole is provided. In addition, the message notification area is an area surrounded by an arc line and the boundary of the display screen closest to the camera hole. In addition to the differences described above, the relevant description of fig. 10A may refer to the corresponding description of fig. 9A, and the relevant description of fig. 10B may refer to the corresponding description of fig. 9B, which is not repeated herein.
Referring to fig. 11A, in fig. 11A the black opaque region is a circular region, and the message notification area is an area enclosed by an arc and the display-screen boundary closest to the black opaque region. The message notification area contains the specific content of a notification message that includes hyperlink text, through which the user can be linked to other, more extended application functions. This broadens the application functions available to the user and lets the electronic device invoke another function more conveniently or quickly for display, improving the user experience and helping the user overlook the negative experience caused by the black opaque region. For ease of understanding, reference may be made to fig. 11B.
In fig. 11B, it is assumed that the hardware device disposed in the black opaque region includes a camera, and the black opaque region includes a camera hole. Then, when the camera is about to be invoked or has been invoked, information related to the camera may be displayed around the black opaque region, that is, in the message notification area. For example, taking the scenario in fig. 11B in which the camera is about to be invoked, a prompt message of "the meeting is about to start, turn on beautification with one tap" is displayed in the message notification area, and the words "turn on" in that prompt are hyperlink text. For example, the electronic device may receive the user's touch (or click) operation on the "turn on" hyperlink text and, in response, directly invoke and enable the beautification function. Alternatively, in response to the operation, the electronic device may pop up a settings window, which may be used to set and save various beautification effects (for example, skin whitening, face slimming, eye brightening, or skin smoothing). After the settings are saved through the settings window, the electronic device receives the user's touch (or click) operation on a confirmation control in the window for enabling beautification, and then invokes and enables the beautification function. These are examples only; the embodiments of the present application do not limit the specific response after the electronic device receives a touch (or click) operation on the "turn on" hyperlink text. In the embodiments of the present application, the notification message appears around the hardware strongly related to the current notification content and can provide extended application functions, bringing the user a different experience beyond expectations.
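The two response paths after the hyperlink is tapped, enabling beautification directly with defaults, or routing through a settings window first, can be sketched as follows. This is an illustrative sketch only: the effect names, default values, and the `settings_window` callback (standing in for the pop-up window) are hypothetical, not defined by the patent.

```python
# Hypothetical default beautification effects and strengths.
DEFAULT_EFFECTS = {"whitening": 50, "face_slimming": 30, "eye_brightening": 20}

def on_beauty_link_tapped(direct=True, settings_window=None):
    """Return the beautification state after the hyperlink text is tapped.

    Path 1 (direct=True): invoke and enable beautification immediately.
    Path 2: `settings_window` lets the user adjust and confirm the
    effects before beautification is enabled.
    """
    if direct or settings_window is None:
        return {"enabled": True, "effects": dict(DEFAULT_EFFECTS)}
    # Settings-window path: the user adjusts, saves, then confirms.
    confirmed = settings_window(dict(DEFAULT_EFFECTS))
    return {"enabled": True, "effects": confirmed}
```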
Fig. 11B is merely an example; in a specific implementation, the extended application function that can be hyperlinked in the message notification area is not limited to the beautification function described above. For example, it may also be a sound optimization function. Illustratively, if the hardware device disposed in the black opaque region is an earpiece, then when the earpiece is about to be invoked or has been invoked, a prompt message of "optimize the sound you hear with one tap" may be displayed in the message notification area. The word "optimize", or the phrase "optimize the sound you hear", may be the hyperlink text. The electronic device may then invoke a sound optimization function to optimize the sound output from the earpiece in response to the user's touch (or click) operation on the hyperlink text. The sound optimization may be, for example, filtering noise or providing the user with various optional sound effects (for example, 3D audio, subwoofer, or spatial sound effects), to which the embodiments of the present application are not limited.
The extended application function that can be hyperlinked in the message notification area may also be, for example, a function of optimizing the display background or optimizing a display area (for example, the message notification area or the entire display screen); in general, all camera-related setting behaviors may be presented in graphical form. For specific implementation, refer to the foregoing description of fig. 11B, which is not repeated here.
See fig. 12A. Fig. 12A differs from fig. 11A in the manner of human-machine interaction: whereas fig. 11A links to other, more extended application functions through hyperlink text, fig. 12A has no hyperlink text and instead receives user instructions through additional human-machine interaction controls to extend more application functions.
In one possible implementation, other application functions may be extended by clicking, pulling down, or dragging the human-machine interaction control with a cursor. In another possible implementation, other application functions may be extended by means of a gesture of the user touching, clicking, pulling down or dragging.
For operating the above human-machine interaction controls with a cursor, reference may be made to fig. 12B, 12C, and 12D.
By way of example, see fig. 12B. It is assumed that the hardware device disposed in the black opaque region includes a camera, and the black opaque region includes a camera hole. Then, when the camera is about to be invoked or has been invoked, information related to the camera may be displayed around the black opaque region, that is, in the message notification area. For example, taking the scenario in fig. 12B in which the camera is about to be invoked, a prompt message of "the meeting is about to start, pull down to turn on beautification" is displayed in the message notification area, indicating that the user can turn on the beautification function through a pull-down operation. Here the human-machine interaction control is a pull-down control, which may be, for example, a ring passing through the message notification area; the pull-down control may be of any other shape, and the embodiments of the present application are not limited in this regard. Likewise, the pull-down operation is not limited to a vertically downward pull: any operation that tends downward may serve as the pull-down operation.
In a specific implementation, when the arrow-shaped cursor approaches the pull-down control, the cursor changes into the shape of a pointing index-finger gesture, as shown in fig. 12B. For example, the electronic device may receive the user's pull-down operation on the pull-down control through the gesture cursor and, in response, directly invoke and enable the beautification function. Alternatively, in response to the operation, the electronic device may pop up a settings window, which may be used to set and save various beautification effects (for example, skin whitening, face slimming, eye brightening, or skin smoothing). After the settings are saved through the settings window, the electronic device receives the user's touch (or click) operation on a confirmation control in the window for enabling beautification, and then invokes and enables the beautification function. These are examples only; the embodiments of the present application do not limit the specific response after the electronic device receives the user's pull-down operation on the pull-down control through the gesture cursor. In the embodiments of the present application, the notification message appears around the hardware strongly related to the current notification content and can provide extended application functions, bringing the user a different experience beyond expectations.
By way of example, see fig. 12C. Compared with fig. 12B, the human-machine interaction control in fig. 12C is changed to a drag control. In fig. 12C, taking the scenario in which the camera is about to be invoked, a prompt message of "the meeting is about to start, drag to turn on beautification" is displayed in the message notification area, indicating that the user can turn on the beautification function through a drag operation. The drag control may be, for example, a control composed of a plurality of dots, or a control composed of three parallel line segments; the embodiments of the present application do not limit the specific representation of the drag control. The drag may be in any direction, which is likewise not limited.
In a specific implementation, when the arrow-shaped cursor approaches the drag control, the cursor changes into the shape of a fist gesture, as shown in fig. 12C. For example, the electronic device may receive the user's drag operation on the drag control through the gesture cursor and, in response, directly invoke and enable the beautification function. Alternatively, in response to the operation, the electronic device may pop up a settings window, which may be used to set and save various beautification effects (for example, skin whitening, face slimming, eye brightening, or skin smoothing). After the settings are saved through the settings window, the electronic device receives the user's touch (or click) operation on a confirmation control in the window for enabling beautification, and then invokes and enables the beautification function. These are examples only; the embodiments of the present application do not limit the specific response after the electronic device receives the user's drag operation on the drag control through the gesture cursor. In the embodiments of the present application, the notification message appears around the hardware strongly related to the current notification content and can provide extended application functions, bringing the user a different experience beyond expectations.
By way of example, see fig. 12D, which is similar to fig. 12C except that the fist gesture is changed to a palm gesture. Specifically, when the arrow-shaped cursor approaches the drag control, the cursor changes into the shape of a palm gesture. The drag operation can likewise be performed through the palm-shaped cursor; for specific implementation, refer to the description of fig. 12C, which is not repeated here.
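The cursor-shape behavior of figs. 12B to 12D, where the arrow cursor becomes a gesture shape when it approaches a pull-down or drag control, can be sketched as a simple hit test. The rectangle representation, the shape names, and the mapping table below are all hypothetical stand-ins for illustration.

```python
# Hypothetical mapping from control type to the gesture-shaped cursor.
CONTROL_CURSORS = {
    "pulldown": "index_finger",  # fig. 12B: pointing index-finger gesture
    "drag": "fist",              # fig. 12C: fist gesture (fig. 12D uses a palm)
}

def cursor_shape(pos, control_rect, control_type):
    """Return the cursor shape for position `pos` near a control.

    `control_rect` is (left, top, right, bottom); outside it, the
    cursor stays an arrow.
    """
    x, y = pos
    left, top, right, bottom = control_rect
    if left <= x <= right and top <= y <= bottom:
        return CONTROL_CURSORS.get(control_type, "arrow")
    return "arrow"
```

A variant implementing fig. 12D would simply map `"drag"` to `"palm"` instead of `"fist"`.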
In a possible embodiment, the display screen of the electronic device may further include a visual effect presentation area, i.e., an area including the above-described black opaque area and the message notification area. The electronic device can present rich visual effects to the user in the visual effect presentation area, enhancing the user's visual experience. Illustratively, a visual effect includes a combined presentation of one or more of color, brightness, transparency, dynamic effect, fade effect, shading, imaging, lighting, softened edges, 3D effect, or three-dimensional rotation. For ease of understanding, refer exemplarily to fig. 13A to 13K.
Fig. 13A to 13K illustrate examples in which the above-described black opaque region is a camera hole. For example, when the camera is invoked or is about to be invoked, a change in light may occur at the camera hole, e.g., the light at the camera hole stays lit. Taking this as an example, referring to the visual effect presentation area of fig. 13A, it can be seen that a visual effect is presented in which the surrounding area is illuminated with the camera hole as a light source.
Referring to the visual effect presentation area of fig. 13B, it can be seen that a visual effect of lighting and color gradation around the message notification area is presented.
Referring to the visual effect presentation area of fig. 13C, it can be seen that a visual effect of a color-gradient bright edge below the message notification area is presented.
Referring to the visual effect presentation area of fig. 13D, it can be seen that a visual effect in which the message notification area lights up and a color gradation appears is presented.
Referring to the visual effect presentation area of fig. 13E, it can be seen that a visual effect of a lunar eclipse is presented, or a visual effect of a color-gradient light edge is presented in the message notification area.
Referring to the visual effect presentation area of fig. 13F, it can be seen that a visual effect of water ripples spreading around the message notification area is presented.
Referring to the visual effect presentation area of fig. 13G, it can be seen that a visual effect of sunlight is presented.
Referring to the visual effect presentation area of fig. 13H, it can be seen that a visual effect of water ripples spreading around the message notification area is likewise presented. However, the ripples of fig. 13H differ from those of fig. 13F: the ripples of fig. 13H are rendered as dots, like a dotted line.
Referring to the visual effect presentation area of fig. 13I, it can be seen that a visual effect of planets moving through the universe is presented.
Referring to the visual effect presentation area of fig. 13J, it can be seen that a visual effect of a lunar eclipse is presented.
Referring to the visual effect presentation area of fig. 13K, it can be seen that a visual effect of a gradual glow appearing around the message notification area is presented.
In one possible implementation, the visual effects presented in fig. 13A to 13K above may all be dynamic visual effects. For example, as shown in fig. 13G, an effect of the lines around the message notification area subtly animating may be presented. For example, the planets in fig. 13I may be shown in motion. The embodiments of the present application do not limit the specific dynamic visual effect.
In one possible implementation, the visual effects presented in fig. 13A to 13K described above may all be 3D visual effects. For example, the planets in fig. 13I may be rendered with a 3D visual effect. The embodiments of the present application do not limit the specific 3D visual effect.
It should be noted that the visual effects described above are merely examples, and do not limit the embodiments of the present application. Any visual effect may be presented in the visual effect presentation area described above, which is not limited by the embodiment of the present application.
In addition, it should be noted that, the visual effect presented in the visual effect presenting area may be implemented by using an existing method for designing and presenting a visual effect, which is not limited in the embodiment of the present application.
Based on the above description, fig. 14 shows an interactive flow diagram of a display method according to an embodiment of the present application, where the display method may include the following steps:
S1401, the electronic device monitors the calling condition of a target device; the target device is disposed in an area covered by a black opaque area of the electronic device's display screen, and the target device provides services through the black opaque area.
The target device may be, for example, a hardware device disposed in an area covered by a black opaque area in the display screen described in the foregoing embodiment, and may be, for example, a camera or a receiver, which is not limited by the embodiment of the present application.
S1402, when it is monitored that the target device is to be called within a preset time period from the current moment, the electronic device displays a notification message in a message notification area of the display screen; the message notification area surrounds part or all of the black opaque area, and the preset time period is greater than or equal to 0 nanoseconds. Illustratively, the time unit in this embodiment is nanoseconds.
When the preset time period equals 0 nanoseconds, this refers to the case in the above embodiments where the hardware device has already been called. When the preset time period is greater than 0 nanoseconds, this refers to the case in the above embodiments where the hardware device is about to be invoked. The about-to-be-invoked case may be, for example, invocation 1 second, 5 seconds, or 10 seconds after the current moment, i.e., a preset time period of 1 second, 5 seconds, or 10 seconds. In a specific implementation, the embodiments of the present application do not limit the specific value of the preset time period.
Specific implementations of S1401 and S1402 may refer to the descriptions of the foregoing embodiments, and are not repeated here.
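Steps S1401-S1402 above can be sketched as a single check, under assumed names: the device reports the time (in nanoseconds, per this embodiment) until the target device will be invoked, and the notification is shown when that time falls within the preset duration, where 0 ns corresponds to "already invoked". The threshold value and message text below are illustrative, not prescribed by the embodiment.

```python
# Minimal sketch of S1401-S1402 (names assumed). A value of 0 for
# ns_until_call models the "already invoked" case; values up to the preset
# duration model the "about to be invoked" case.

PRESET_DURATION_NS = 5_000_000_000  # assumed preset duration: 5 seconds

def on_invocation_monitored(ns_until_call: int, show_notification) -> bool:
    """S1402: show a notification in the message notification area if the
    target device is (about to be) called within the preset window."""
    if 0 <= ns_until_call <= PRESET_DURATION_NS:
        show_notification("Meeting about to start")  # illustrative text
        return True
    return False
```

A call further in the future than the preset duration produces no notification, matching the condition in S1402.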
In one possible implementation, the notification message is a reminder message regarding the service provided by the target device.
The service provided by the target device may be, for example, the video conference service (in which case the target device is a camera), the video call service (in which case the target device is a camera), or the voice playing service (in which case the target device is a receiver) described in the above embodiments; the embodiments of the present application are not limited thereto.
In one possible embodiment, the message notification area and the black opaque area are of the same hue.
Tone consistency means that the two areas adopt an immersive, integrated design to eliminate the sense of a boundary between them; for an example, refer to the description related to fig. 7A above, which is not repeated here.
In one possible embodiment, the method further comprises: when it is monitored that the target device is to be called within a preset time period from the current moment, the electronic device further displays a first man-machine interaction control in the message notification area of the display screen. Optionally, the first man-machine interaction control instructs the user to respond to the notification message or to the content notified by the notification message. For an example of this embodiment, refer to the description of fig. 8A above, which is not repeated here.
In one possible embodiment, the method further comprises: when it is monitored that the target device is to be called within a preset time period from the current moment, the electronic device further displays a man-machine interaction element on the display screen; the man-machine interaction element is used to provide an extended service for the user after man-machine interaction with the user, the extended service being a service other than the service provided by the target device. Optionally, the notification message includes indication information indicating that the user should perform a man-machine interaction operation on the man-machine interaction element. For an example of this embodiment, refer to the descriptions of fig. 9A and fig. 10A above, which are not repeated here.
In one possible implementation, the notification message includes hyperlink text, where the hyperlink text is used to provide an extended service for the user after man-machine interaction with the user, the extended service being a service other than the service provided by the target device. Optionally, the notification message includes indication information indicating that the user should perform man-machine interaction on the hyperlink text. For an example of this embodiment, refer to the description of fig. 11A above, which is not repeated here.
In one possible embodiment, the method further comprises: when it is monitored that the target device is to be called within a preset time period from the current moment, the electronic device further displays a second man-machine interaction control on the display screen; the display area of the second man-machine interaction control intersects the message notification area; the second man-machine interaction control is used to provide an extended service for the user after man-machine interaction with the user, the extended service being a service other than the service provided by the target device. Optionally, the notification message includes indication information indicating that the user should click, pull down, or drag the second man-machine interaction control. For an example of this embodiment, refer to the description of fig. 12A above, which is not repeated here.
In one possible implementation, the display screen includes a visual effect presentation area, the visual effect presentation area including the black opaque area and the message notification area; the method further comprises: the electronic device provides a preset visual effect in the visual effect presentation area while the notification message is displayed in the message notification area. Optionally, the preset visual effect includes one or more of color, brightness, transparency, dynamic effect, gradient effect, shading, image, lighting, softened edge, 3D effect, or three-dimensional rotation. For an example of this embodiment, refer to the description related to fig. 13A to 13K above, which is not repeated here.
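The coupling just described — the preset visual effect is active exactly while the notification message is displayed — can be sketched as follows. The class and effect names are assumed for illustration; the embodiment does not fix a concrete interface.

```python
# Hypothetical sketch: the visual effect in the visual effect presentation
# area is tied to the lifetime of the notification in the message notification
# area. Effect names (glow, water ripple, etc.) are illustrative.

class VisualEffectPresentationArea:
    def __init__(self):
        self.notification = None
        self.effect = None
        self.effect_active = False

    def show_notification(self, text: str, effect: str = "glow") -> None:
        """Display the notification and start the preset visual effect."""
        self.notification = text
        self.effect = effect          # e.g. gradient, water ripple, halo
        self.effect_active = True

    def dismiss_notification(self) -> None:
        """Remove the notification; the visual effect ends with it."""
        self.notification = None
        self.effect_active = False
```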
The foregoing mainly describes the display method provided in the embodiments of the present application. It will be appreciated that the electronic device, in order to implement the corresponding functions described above, includes corresponding hardware structures and/or software modules that perform the respective functions. The elements and steps of the examples described in connection with the embodiments disclosed herein may be embodied in hardware or a combination of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the application may divide the functional modules of the device according to the above method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, the division of the modules in the embodiments of the present application is merely a logic function division, and other division manners may be actually implemented.
In the case of dividing functional modules according to the respective functions, fig. 15 shows a schematic diagram of one possible logical structure of the above-described electronic device. The electronic device 1500 includes a monitoring unit 1501 and a display unit 1502. Wherein:
a monitoring unit 1501, configured to monitor a call condition of a target device; the target device is arranged in an area covered by a black opaque area of the display screen of the electronic equipment, and the target device provides service through the black opaque area;
a display unit 1502, configured to display a notification message in a message notification area of the display screen when the target device is monitored to be invoked within a preset time period from a current time; the message notification area surrounds a part or all of the black opaque area, and the preset time period is greater than or equal to 0 nanoseconds.
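The functional-module division above can be sketched as two cooperating objects, under assumed names: a monitoring unit that records the target device's call condition, and a display unit that renders the notification when the call falls within the preset duration. This is an illustrative sketch, not the patented implementation.

```python
# Sketch of the fig. 15 module division (names assumed): monitoring unit 1501
# watches the target device's call condition; display unit 1502 shows the
# notification message in the message notification area.

class MonitoringUnit:
    def __init__(self):
        self.calls = []

    def report(self, device: str, ns_until_call: int) -> None:
        self.calls.append((device, ns_until_call))

class DisplayUnit:
    def __init__(self):
        self.shown = []

    def show(self, message: str) -> None:
        self.shown.append(message)

class ElectronicDevice1500:
    def __init__(self, preset_ns: int):
        self.preset_ns = preset_ns
        self.monitor = MonitoringUnit()    # monitoring unit 1501
        self.display = DisplayUnit()       # display unit 1502

    def tick(self, device: str, ns_until_call: int) -> None:
        """Route a monitored call condition to the display unit if it falls
        within the preset duration (>= 0 ns, 0 meaning already called)."""
        self.monitor.report(device, ns_until_call)
        if 0 <= ns_until_call <= self.preset_ns:
            self.display.show(f"{device} is about to be invoked")
```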
In one possible implementation, the notification message is a reminder message regarding the service provided by the target device.
In one possible embodiment, the message notification area and the black opaque area are of the same hue.
In one possible implementation, the display unit 1502 is further configured to:
and under the condition that the target device is called in a preset time length from the current moment, displaying a first man-machine interaction control in a message notification area of the display screen.
In one possible implementation, the first human interaction control instructs the user to respond to the notification message or the content of the notification message notification.
In one possible implementation, the display unit 1502 is further configured to:
under the condition that the target device is called in a preset time length from the current moment, displaying a human-computer interaction element on the display screen; the man-machine interaction element is used for providing an expansion service for the user after man-machine interaction with the user, and the expansion service is a service other than the service provided by the target device.
In one possible implementation manner, the notification message includes indication information for indicating that the user performs man-machine interaction operation on the man-machine interaction element.
In one possible implementation manner, the notification message includes a hyperlink text, where the hyperlink text is used to provide an extended service for the user after performing man-machine interaction with the user, and the extended service is a service other than the service provided by the target device.
In one possible implementation, the notification message includes indication information for indicating that the user performs man-machine interaction on the hyperlink text.
In one possible implementation, the display unit 1502 is further configured to:
under the condition that the target device is called in a preset time length from the current moment, displaying a second man-machine interaction control on the display screen; the display area of the second man-machine interaction control is intersected with the message notification area; the second man-machine interaction control is used for providing an expansion service for the user after man-machine interaction with the user, and the expansion service is a service other than the service provided by the target device.
In one possible implementation, the notification message includes indication information for indicating that the user clicks, pulls down or drags the second human-computer interaction control.
In one possible implementation, the display screen includes a visual effect presentation area including the black opaque area and a message notification area; the display unit 1502 is further configured to:
A preset visual effect is provided in the visual effect presentation area during the display of the notification message in the message notification area.
In one possible embodiment, the preset visual effect includes a visual effect presented by one or more of color, brightness, transparency, dynamic effect, fade effect, shading, image, lighting, softening edge, 3D effect, or three-dimensional rotation.
The specific operation and advantages of each unit in the electronic device 1500 shown in fig. 15 may be referred to the corresponding descriptions in fig. 3A to 14 and the possible embodiments thereof, and will not be repeated here.
The embodiment of the application also provides a chip; see fig. 16 for an example. The chip 1600 includes a processor 1601 and a memory 1602. The memory 1602 is configured to store a computer program or computer instructions, and the processor 1601 is configured to execute the computer program or computer instructions stored in the memory 1602, so that the chip 1600 performs the method described in any of fig. 14 above and its possible embodiments.
The present application also provides a computer-readable storage medium storing a computer program that is executed by a processor to implement operations performed by the electronic device of any of the above-described various embodiments and possible embodiments thereof.
The present application also provides a computer program product, which when read and executed by a computer, performs the operations of the electronic device of any of the above embodiments and possible embodiments thereof.
In the above-described embodiments, all or part of the functions may be implemented by software, hardware, or a combination of software and hardware. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When loaded and executed on a computer, produces a flow or function in accordance with embodiments of the present application, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer readable storage medium. The computer readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that contains an integration of one or more available media. The usable medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)), or the like.
Those of ordinary skill in the art will appreciate that implementing all or part of the above-described method embodiments may be accomplished by a computer program to instruct related hardware, the program may be stored in a computer readable storage medium, and the program may include the above-described method embodiments when executed. And the aforementioned storage medium includes: ROM or random access memory RAM, magnetic or optical disk, etc.
The terms "first," "second," and the like in this application are used to distinguish between identical or similar items having substantially the same function and effect. It should be understood that there is no logical or chronological dependency among "first," "second," and "nth," and that these terms impose no limitation on number or order of execution. It will be further understood that, although the terms first, second, etc. may be used below to describe various elements, these elements should not be limited by the terms; the terms are only used to distinguish one element from another.
It should also be understood that, in the embodiments of the present application, the sequence numbers of the processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.
It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should be further understood that reference throughout this specification to "one embodiment," "an embodiment," "one possible implementation," means that a particular feature, structure, or characteristic described in connection with the embodiment or implementation is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment," "one possible implementation" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced by equivalents, and that such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.

Claims (28)

1. A display method, the method comprising:
the electronic equipment monitors the calling condition of a target device; the target device is arranged in an area covered by a black opaque area of the display screen of the electronic equipment, and the target device provides service through the black opaque area;
when the target device is called or within a preset time period before the target device is called, the electronic equipment displays a notification message in a message notification area of the display screen; the message notification area surrounds part or all of the black opaque area, and the preset duration is more than 0;
the message notification area is also used for displaying a man-machine interaction control or hyperlink text.
2. The method of claim 1, wherein the notification message is a reminder message regarding a service provided by the target device.
3. The method according to claim 1 or 2, wherein the message notification area and the black opaque area are of a consistent hue.
4. A method according to any one of claims 1-3, wherein the method further comprises:
and when the target device is called or within a preset time period before the target device is called, the electronic equipment further displays a first man-machine interaction control in a message notification area of the display screen.
5. The method of claim 4, wherein the first human interaction control instructs a user to respond to the notification message or the content of the notification message notification.
6. A method according to any one of claims 1-3, wherein the method further comprises:
when the target device is called or within a preset time period before the target device is called, the electronic equipment further displays a human-computer interaction element on the display screen; the man-machine interaction element is used for providing an expansion service for the user after man-machine interaction with the user, and the expansion service is a service other than the service provided by the target device.
7. The method of claim 6, wherein the notification message includes indication information for indicating a user to perform a human-machine interaction operation on the human-machine interaction element.
8. A method according to any one of claims 1-3, wherein the notification message includes a hyperlink text, the hyperlink text being used to provide an extended service for the user after the man-machine interaction with the user, the extended service being a service other than the service provided by the target device.
9. The method of claim 8, wherein the notification message includes indication information for indicating a user to perform man-machine interaction with the hyperlink text.
10. A method according to any one of claims 1-3, wherein the method further comprises:
when the target device is called or within a preset time period before the target device is called, the electronic equipment further displays a second man-machine interaction control on the display screen; the display area of the second man-machine interaction control is intersected with the message notification area; the second man-machine interaction control is used for providing an expansion service for the user after man-machine interaction with the user, and the expansion service is a service other than the service provided by the target device.
11. The method of claim 10, wherein the notification message includes indication information for indicating a click, drop, or drag operation of the second human-machine interaction control by the user.
12. The method of any of claims 1-11, wherein the display screen comprises a visual effect presentation area comprising the black opaque area and a message notification area; the method further comprises the steps of:
The electronic device provides a preset visual effect in the visual effect presentation area during the notification message is displayed in the message notification area.
13. The method of claim 12, wherein the pre-set visual effect comprises a visual effect presented in combination with one or more of color, brightness, transparency, dynamic effect, fade effect, shading, image, lighting, softened edge, 3D effect, or three-dimensional rotation.
14. An electronic device, the electronic device comprising:
the monitoring unit is used for monitoring the calling condition of the target device; the target device is arranged in an area covered by a black opaque area of the display screen of the electronic equipment, and the target device provides service through the black opaque area;
the display unit is used for displaying a notification message in a message notification area of the display screen when the target device is called or within a preset time period before the target device is called; the message notification area surrounds part or all of the black opaque area, and the preset duration is more than 0;
the message notification area is also used for displaying a man-machine interaction control or hyperlink text.
15. The electronic device of claim 14, wherein the notification message is a reminder message regarding a service provided by the target device.
16. The electronic device of claim 14 or 15, wherein the message notification area and the black opaque area are of a consistent hue.
17. The electronic device of any of claims 14-16, wherein the display unit is further configured to:
and displaying a first man-machine interaction control in a message notification area of the display screen when the target device is called or within a preset time period before the target device is called.
18. The electronic device of claim 17, wherein the first human interaction control instructs a user to respond to the notification message or the content of the notification message notification.
19. The electronic device of any of claims 14-16, wherein the display unit is further configured to:
displaying man-machine interaction elements on the display screen when the target device is called or within a preset time period before the target device is called; the man-machine interaction element is used for providing an expansion service for the user after man-machine interaction with the user, and the expansion service is a service other than the service provided by the target device.
20. The electronic device of claim 19, wherein the notification message includes indication information for indicating a user to perform a human-machine interaction operation with the human-machine interaction element.
21. The electronic device of any of claims 14-16, wherein the notification message includes a hyperlink text, the hyperlink text being configured to provide an extended service to the user after the human-machine interaction with the user, the extended service being a service other than the service provided by the target device.
22. The electronic device of claim 21, wherein the notification message includes indication information for indicating a user to perform a human-machine interaction with the hyperlink text.
23. The electronic device of any of claims 14-16, wherein the display unit is further configured to:
displaying a second man-machine interaction control on the display screen when the target device is called or within a preset time period before the target device is called; the display area of the second man-machine interaction control is intersected with the message notification area; the second man-machine interaction control is used for providing an expansion service for the user after man-machine interaction with the user, and the expansion service is a service other than the service provided by the target device.
24. The electronic device of claim 23, wherein the notification message includes indication information for indicating a click, drop, or drag operation of the second human-machine interaction control by a user.
25. The electronic device of any of claims 14-24, wherein the display screen comprises a visual effect presentation area comprising the black opaque area and a message notification area; the display unit is further configured to:
and providing a preset visual effect in the visual effect presentation area during the display of the notification message in the message notification area.
26. The electronic device of claim 25, wherein the pre-set visual effects comprise visual effects presented in combination with one or more of color, brightness, transparency, dynamic effects, fade effects, shadows, images, lighting, softened edges, 3D effects, or three-dimensional rotations.
27. An electronic device, the electronic device comprising: one or more processors, memory, and a display screen; the memory is coupled to the one or more processors, the memory for storing a computer program comprising computer instructions for invoking the computer instructions to cause the electronic device to perform the method of any of claims 1-13.
28. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program, which is executed by a processor to implement the method of any one of claims 1 to 13.
CN202210801489.7A 2022-07-08 2022-07-08 Display method and related device Active CN115328592B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202410119217.8A CN117931354A (en) 2022-07-08 2022-07-08 Display method and related device
CN202210801489.7A CN115328592B (en) 2022-07-08 2022-07-08 Display method and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210801489.7A CN115328592B (en) 2022-07-08 2022-07-08 Display method and related device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202410119217.8A Division CN117931354A (en) 2022-07-08 2022-07-08 Display method and related device

Publications (2)

Publication Number Publication Date
CN115328592A (en) 2022-11-11
CN115328592B (en) 2023-12-29

Family

ID=83917968

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202410119217.8A Pending CN117931354A (en) 2022-07-08 2022-07-08 Display method and related device
CN202210801489.7A Active CN115328592B (en) 2022-07-08 2022-07-08 Display method and related device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202410119217.8A Pending CN117931354A (en) 2022-07-08 2022-07-08 Display method and related device

Country Status (1)

Country Link
CN (2) CN117931354A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108494964A (en) * 2018-03-28 2018-09-04 Nubia Technology Co., Ltd. Mobile terminal status bar display method, terminal, and computer readable storage medium
CN108881629A (en) * 2018-06-08 2018-11-23 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Notification message reminding method, apparatus, terminal, and storage medium
CN109885373A (en) * 2019-02-27 2019-06-14 Tencent Technology (Shenzhen) Co., Ltd. User interface rendering method and apparatus
CN111125696A (en) * 2019-12-31 2020-05-08 Vivo Mobile Communication Co., Ltd. Information prompting method and electronic device
CN112887454A (en) * 2019-11-29 2021-06-01 Beijing Xiaomi Mobile Software Co., Ltd. Electronic device, display method and apparatus

Also Published As

Publication number Publication date
CN117931354A (en) 2024-04-26
CN115328592A (en) 2022-11-11

Similar Documents

Publication Publication Date Title
CN114467297B (en) Video call display method and related device applied to electronic equipment
US11669242B2 (en) Screenshot method and electronic device
AU2018430381B2 (en) Flexible screen display method and terminal
EP4057135A1 (en) Display method for electronic device having foldable screen, and electronic device
CN113645351B (en) Application interface interaction method, electronic device and computer-readable storage medium
WO2021000881A1 (en) Screen splitting method and electronic device
CN114397979A (en) Application display method and electronic equipment
WO2021036770A1 (en) Split-screen processing method and terminal device
CN114040242B (en) Screen projection method, electronic equipment and storage medium
WO2022037726A1 (en) Split-screen display method and electronic device
EP4199499A1 (en) Image capture method, graphical user interface, and electronic device
CN113935898A (en) Image processing method, system, electronic device and computer readable storage medium
CN113986070B (en) Quick viewing method for application card and electronic equipment
CN112068907A (en) Interface display method and electronic equipment
WO2022143180A1 (en) Collaborative display method, terminal device, and computer readable storage medium
WO2021042878A1 (en) Photography method and electronic device
CN113723397B (en) Screen capturing method and electronic equipment
CN110609650B (en) Application state switching method and terminal equipment
EP4390643A1 (en) Preview method, electronic device, and system
EP4239464A1 (en) Method for invoking capabilities of other devices, electronic device, and system
CN117784991A (en) Display method of latest task list and electronic equipment
CN115328592B (en) Display method and related device
CN115268737A (en) Information processing method and device
CN113986406B (en) Method, device, electronic equipment and storage medium for generating doodle pattern
WO2024139257A1 (en) Method for displaying interfaces of application programs and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant