CN115712368A - Volume display method, electronic device and storage medium - Google Patents



Publication number
CN115712368A
Authority
CN
China
Prior art keywords
volume
media
played
displaying
electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110958713.9A
Other languages
Chinese (zh)
Inventor
唐吴全
王斌
李艳明
燕瑞
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202110958713.9A
Priority to PCT/CN2022/112437 (published as WO2023020420A1)
Publication of CN115712368A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance, using icons
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16: Sound input; Sound output
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M 1/00: Substation equipment, e.g. for use by subscribers
    • H04M 1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/725: Cordless telephones

Abstract

Embodiments of the present application provide a volume display method, an electronic device and a storage medium, relating to the field of information technology. The method includes: in response to detecting a first operation of a user, displaying a display interface of media to be played, where the media to be played is in a to-be-played state; and acquiring a system volume and displaying the system volume on the display interface of the media to be played, where the system volume represents the current media volume of the media to be played. The method provided by the embodiments of the present application displays the volume before the media file is played, thereby improving the user experience.

Description

Volume display method, electronic device and storage medium
Technical Field
Embodiments of the present application relate to the field of information technology, and in particular, to a volume display method, an electronic device and a storage medium.
Background
With the development of information technology, the capabilities of smart terminals such as mobile phones have become increasingly diverse. When playing a media file such as a video on a smart terminal, a user can adjust the volume through a mechanical button on the terminal or a touch control on the media interface. Before the media file is played, however, the user cannot intuitively know the current volume, so playback may start at an excessively high volume, degrading the user experience.
Disclosure of Invention
Embodiments of the present application provide a volume display method, an electronic device and a storage medium that display the current volume before a media file is played. The user can then adjust the volume based on the displayed value, avoiding the impact of an excessively high volume and improving the user experience.
In a first aspect, an embodiment of the present application provides a volume display method, including:
in response to detecting a first operation of a user, displaying a display interface of media to be played, where the media to be played is in a to-be-played state;
acquiring a system volume and displaying the system volume on the display interface of the media to be played, where the system volume represents the current media volume of the media to be played.
In the embodiments of the present application, the current volume is displayed before the media is played, which avoids the impact of an excessively high volume and improves the user experience.
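The first-aspect flow can be illustrated with a short, platform-independent sketch. The class and the 0..15 volume range below are illustrative assumptions (a typical media-stream range on Android-like systems), not values taken from the patent:

```python
def volume_percent(level: int, max_level: int) -> int:
    """Convert a raw system volume level to a 0-100 percentage for display."""
    if max_level <= 0:
        raise ValueError("max_level must be positive")
    level = max(0, min(level, max_level))  # clamp to the valid range
    return round(100 * level / max_level)

class PreviewInterface:
    """Hypothetical preview screen shown in response to the first operation.

    The media stays in the to-be-played state; the system volume is queried
    and shown before anything is audible.
    """
    def __init__(self, system_volume: int, max_volume: int = 15):
        self.playing = False  # to-be-played state
        self.displayed_volume = volume_percent(system_volume, max_volume)

preview = PreviewInterface(system_volume=9)
```

Opening the preview with a system level of 9 out of 15 would display 60%, letting the user judge the volume before pressing play.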
In one possible implementation, the media to be played is a media file, the display interface includes a play control used to control playback of the media to be played, and displaying the system volume on the display interface of the media to be played includes:
displaying the system volume on the icon corresponding to the play control.
In the embodiments of the present application, displaying the media volume on the play control improves the flexibility of volume display.
In one possible implementation, after the system volume is displayed on the icon corresponding to the play control, the method further includes:
using a color on the icon corresponding to the play control to alert the user that the system volume is high.
In the embodiments of the present application, alerting the user to a high volume through color provides an effective reminder and improves the user experience.
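The color reminder can be sketched as a simple threshold check. The 70% threshold and the two colors are illustrative assumptions; the patent does not specify concrete values:

```python
WARN_THRESHOLD = 70  # assumed percentage above which volume counts as "high"

def icon_color(volume_percent: int) -> str:
    """Pick the play-control icon color used to warn about high volume.

    Below the threshold the icon keeps its normal color; at or above it,
    a warning color alerts the user before playback starts.
    """
    return "red" if volume_percent >= WARN_THRESHOLD else "white"
```

With this sketch, a preview showing 80% volume would render the play icon in the warning color, prompting the user to turn it down first.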
In one possible implementation, the method further includes:
in response to detecting a second operation of the user adjusting the volume, updating the system volume displayed on the icon corresponding to the play control.
In the embodiments of the present application, adjusting and displaying the media volume on the play control improves the efficiency of volume adjustment.
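The second operation (for example, a volume-key press while the preview is shown) can be sketched as clamped arithmetic whose result is what the icon then redisplays. The 0..15 range is an assumed device maximum:

```python
def adjust_volume(current: int, step: int, max_level: int = 15) -> int:
    """Apply a volume adjustment and clamp to the valid range.

    `step` is +1 for volume-up and -1 for volume-down; the returned value
    is the level the play-control icon would then redisplay.
    """
    return max(0, min(current + step, max_level))
```

Repeated presses therefore saturate at the ends of the range instead of wrapping or overflowing.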
To improve the flexibility of volume display, in one possible implementation, the media to be played is a media file, and displaying the system volume on the display interface of the media to be played includes:
displaying a volume bar on the display interface of the media to be played, where the volume bar represents the system volume.
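A real implementation would draw a widget; the sketch below only shows the mapping from a volume percentage to filled segments of a bar (the 10-segment count and the characters used are illustrative assumptions):

```python
def render_volume_bar(volume_percent: int, segments: int = 10) -> str:
    """Render the system volume as a text volume bar.

    The number of filled segments is proportional to the volume, so the
    bar conveys the current level at a glance before playback starts.
    """
    volume_percent = max(0, min(volume_percent, 100))  # clamp input
    filled = round(segments * volume_percent / 100)
    return "#" * filled + "-" * (segments - filled)
```

For example, 60% volume renders as six filled and four empty segments.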
To improve the flexibility of volume display, in one possible implementation, the display interface includes a speaker control used to control the system volume state, where the system volume state includes a system mute state and a system non-mute state. When the media to be played is in the system non-mute state, displaying the system volume on the display interface of the media to be played includes:
displaying the system volume on the icon corresponding to the speaker control.
In one possible implementation, the method further includes:
in response to detecting a third operation of the user adjusting the system volume state, displaying a system mute state identifier on the icon corresponding to the speaker control, where the system mute state identifier indicates that the media to be played is in a mute state.
In the embodiments of the present application, setting mute through the speaker control improves the efficiency of volume adjustment.
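The mute/non-mute toggle can be sketched as a small pure state transition. Restoring the previous level on un-mute is an assumed convenience, not something the patent prescribes; the state is a `(muted, volume, restore_level)` tuple:

```python
def toggle_mute(state):
    """Flip the system volume state for the speaker control.

    `state` is (muted, volume, restore_level). Muting drops the volume to 0
    and remembers the old level; un-muting restores the remembered level,
    at which point the icon would again show the system volume instead of
    the mute-state identifier.
    """
    muted, volume, restore = state
    if muted:
        return (False, restore, restore)  # back to non-mute state
    return (True, 0, volume)              # enter mute state, remember level
```

Toggling twice from any non-mute state returns to the original volume.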
To improve the flexibility of volume display, in one possible implementation, the media to be played is a page media stream, and displaying the system volume on the display interface of the media to be played includes:
if the current page is detected to contain a play control, displaying the system volume in the current page.
In one possible implementation, displaying the system volume in the current page includes:
displaying the system volume on the icon corresponding to the play control.
In one possible implementation, displaying the system volume in the current page includes:
displaying a volume bar in the current page, where the volume bar represents the system volume.
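For the page-media-stream case, the detection step can be sketched as follows. The list of element-type strings stands in for real DOM inspection, which is outside the scope of this sketch, and the overlay string format is an illustrative assumption:

```python
def volume_overlay_for_page(page_elements, system_volume_percent: int):
    """Show the system volume in a web page only if it embeds a play control.

    Returns the overlay text to display, or None when the page contains no
    playable media stream and nothing should be shown.
    """
    if "play_control" in page_elements:
        return f"volume:{system_volume_percent}%"
    return None
```

A page with an embedded video would get the overlay; a plain article page would not.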
In a second aspect, an embodiment of the present application provides a volume display device, including:
a first display module configured to display, in response to detecting a first operation of a user, a display interface of media to be played, where the media to be played is in a to-be-played state;
a second display module configured to acquire a system volume and display the system volume on the display interface of the media to be played, where the system volume represents the current media volume of the media to be played.
In one possible implementation, the media to be played is a media file, the display interface includes a play control used to control playback of the media to be played, and the second display module is further configured to display the system volume on the icon corresponding to the play control.
In one possible implementation, the volume display device further includes:
a reminder module configured to use a color on the icon corresponding to the play control to alert the user that the system volume is high.
In one possible implementation, the volume display device further includes:
an update module configured to update, in response to detecting a second operation of the user adjusting the volume, the system volume displayed on the icon corresponding to the play control.
In one possible implementation, the media to be played is a media file, and the second display module is further configured to display a volume bar on the display interface of the media to be played, where the volume bar represents the system volume.
In one possible implementation, the display interface includes a speaker control used to control the system volume state, where the system volume state includes a system mute state and a system non-mute state; when the media to be played is in the system non-mute state, the second display module is further configured to display the system volume on the icon corresponding to the speaker control.
In one possible implementation, the volume display device further includes:
a third display module configured to display, in response to detecting a third operation of the user adjusting the system volume state, a system mute state identifier on the icon corresponding to the speaker control, where the system mute state identifier indicates that the media to be played is in a mute state.
In one possible implementation, the media to be played is a page media stream, and the second display module is further configured to display the system volume in the current page if the current page is detected to contain a play control.
In one possible implementation, the second display module is further configured to display the system volume on the icon corresponding to the play control.
In one possible implementation, the second display module is further configured to display a volume bar in the current page, where the volume bar represents the system volume.
In a third aspect, an embodiment of the present application provides an electronic device, including:
a memory for storing computer program code, the computer program code including instructions that, when read from the memory and executed by the electronic device, cause the electronic device to perform the following steps:
in response to detecting a first operation of a user, displaying a display interface of media to be played, where the media to be played is in a to-be-played state;
acquiring a system volume and displaying the system volume on the display interface of the media to be played, where the system volume represents the current media volume of the media to be played.
In one possible implementation, the media to be played is a media file, and the display interface includes a play control used to control playback of the media to be played; when the instructions are executed by the electronic device, the step of displaying the system volume on the display interface of the media to be played includes:
displaying the system volume on the icon corresponding to the play control.
In one possible implementation, when the instructions are executed by the electronic device, after performing the step of displaying the system volume on the icon corresponding to the play control, the electronic device further performs the following step:
using a color on the icon corresponding to the play control to alert the user that the system volume is high.
In one possible implementation, when the instructions are executed by the electronic device, the electronic device further performs the following step:
in response to detecting a second operation of the user adjusting the volume, updating the system volume displayed on the icon corresponding to the play control.
In one possible implementation, the media to be played is a media file, and when the instructions are executed by the electronic device, the step of displaying the system volume on the display interface of the media to be played includes:
displaying a volume bar on the display interface of the media to be played, where the volume bar represents the system volume.
In one possible implementation, the display interface includes a speaker control used to control the system volume state; when the instructions are executed by the electronic device, the step of displaying the system volume on the display interface of the media to be played includes:
displaying the system volume on the icon corresponding to the speaker control.
In one possible implementation, when the instructions are executed by the electronic device, the electronic device further performs the following step:
in response to detecting a third operation of the user adjusting the system volume state, displaying a system mute state identifier on the icon corresponding to the speaker control, where the system mute state identifier indicates that the media to be played is in a mute state.
In one possible implementation, the media to be played is a page media stream, and when the instructions are executed by the electronic device, the step of displaying the system volume on the display interface of the media to be played includes:
if the current page is detected to contain a play control, displaying the system volume in the current page.
In one possible implementation, when the instructions are executed by the electronic device, the step of displaying the system volume in the current page includes:
displaying the system volume on the icon corresponding to the play control.
In one possible implementation, when the instructions are executed by the electronic device, the step of displaying the system volume in the current page includes:
displaying a volume bar in the current page, where the volume bar represents the system volume.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium having stored thereon a computer program, which, when run on a computer, causes the computer to perform the method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program which, when executed by a computer, causes the computer to perform the method according to the first aspect.
In a possible design, the program of the fifth aspect may be stored entirely or partly on a storage medium packaged with the processor, or entirely or partly on a memory not packaged with the processor.
Drawings
Fig. 1 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a schematic flow chart illustrating a volume display method according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of a media file preview interface display provided in an embodiment of the present application;
fig. 4a to fig. 4d are schematic display diagrams of a play control provided in the embodiment of the present application;
fig. 5 is a schematic flow chart illustrating another embodiment of a volume display method provided in the present application;
fig. 6a and fig. 6b are schematic diagrams illustrating a speaker control display provided in an embodiment of the present application;
fig. 7 is a schematic flow chart illustrating a volume display method according to still another embodiment of the present disclosure;
fig. 8 is a schematic display diagram of a volume bar of a media stream of a page according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a volume display device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the drawings. In the description of the embodiments, unless otherwise specified, "/" means "or"; for example, A/B may mean A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone.
In the following, the terms "first", "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, "a plurality" means two or more unless otherwise specified.
With the development of information technology, the capabilities of smart terminals such as mobile phones have become increasingly diverse. When playing a media file such as a video on a smart terminal, a user can adjust the volume through a mechanical button on the terminal or a touch control on the media interface. Before the media file is played, however, the user cannot intuitively know the current volume, so playback may start at an excessively high volume, degrading the user experience.
Based on the above problem, the embodiments of the present application provide a volume display method applied to the electronic device 100. The electronic device 100 may be a mobile terminal having a display screen and a speaker. A mobile terminal may also be called a terminal device, user equipment (UE), an access terminal, a subscriber unit, a subscriber station, a mobile station, a remote terminal, a mobile device, a user terminal, a wireless communication device, or a user agent. The mobile terminal may also be a wearable device, e.g., a smart watch or a smart bracelet. The embodiments of the present application do not particularly limit the specific form of the electronic device 100 implementing the technical solution.
An exemplary electronic device provided in the following embodiments of the present application is first described below with reference to fig. 1. Fig. 1 shows a schematic structural diagram of an electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer components than shown, combine certain components, split certain components, or use a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs the instructions or data again, it can fetch them directly from this memory, avoiding repeated accesses and reducing the waiting time of the processor 110, thereby improving system efficiency.
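The cache behavior described above, serving recently used items from fast storage instead of fetching them again, can be illustrated with a minimal sketch (the class and the fetch callback are hypothetical, and a real processor cache is of course hardware with fixed capacity and eviction):

```python
class TinyCache:
    """Minimal illustration of reuse: fetch once, serve repeats locally."""
    def __init__(self, fetch):
        self._fetch = fetch   # slow path (stands in for a main-memory access)
        self._store = {}      # fast path (stands in for the processor cache)
        self.misses = 0       # counts how many slow accesses were needed

    def get(self, key):
        if key not in self._store:        # miss: fetch and keep for reuse
            self.misses += 1
            self._store[key] = self._fetch(key)
        return self._store[key]           # hit: no repeated access

cache = TinyCache(lambda k: k * 2)
```

Two requests for the same key cost only one slow fetch, which is the efficiency gain the paragraph describes.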
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose-input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus comprising a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, a charger, a flash, a camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement a touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 through an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through the I2S interface, so as to implement a function of receiving a call through a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communication. The bus may be a bidirectional communication bus, converting data between serial and parallel forms for transmission. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example, the processor 110 communicates with a Bluetooth module in the wireless communication module 160 through a UART interface to implement a Bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the UART interface, so as to implement the function of playing music through a Bluetooth headset.
The MIPI interface may be used to connect the processor 110 with peripheral devices such as the display screen 194, the camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. It can also be used to connect a headset and play audio through the headset. The interface may further be used to connect other electronic devices, such as AR devices.
It should be understood that the connection relationships between the modules in this embodiment are merely illustrative and do not constitute a structural limitation on the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners from the above embodiments, or a combination of multiple interface connection manners.
The charging management module 140 is configured to receive a charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may also be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves via the antenna 2 for radiation.
In some embodiments, antenna 1 of the electronic device 100 is coupled to the mobile communication module 150 and antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite-based augmentation system (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a user takes a picture, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, an optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and converting into an image visible to the naked eye. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then passed to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by drawing on the structure of biological neural networks, for example the transfer mode between neurons of the human brain, and can also continuously learn by itself. Applications such as intelligent recognition of the electronic device 100 can be implemented by the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like. The data storage area may store data created during use of the electronic device 100 (such as audio data, a phone book, etc.), and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like. The processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert an audio electrical signal into a sound signal. The electronic device 100 can play music or conduct a hands-free call through the speaker 170A.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mouthpiece", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal by speaking close to the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensor 180A, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. A capacitive pressure sensor may comprise at least two parallel plates of electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic device 100 detects the intensity of the touch operation via the pressure sensor 180A. The electronic device 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations applied to the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is smaller than a first pressure threshold acts on the short message application icon, an instruction for viewing a short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
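The pressure-dependent dispatch described above can be sketched as follows. This is a minimal illustration only, not the patent's implementation; the threshold value, icon name, and function names are all hypothetical.

```python
# Hypothetical sketch: mapping the pressure of a touch on an icon to
# different operation instructions, as in the short-message example above.
FIRST_PRESSURE_THRESHOLD = 0.5  # assumed normalized pressure value

def dispatch_touch(icon: str, pressure: float) -> str:
    """Return the instruction triggered by a touch of the given pressure."""
    if icon != "short_message":
        return "open_application"
    if pressure < FIRST_PRESSURE_THRESHOLD:
        return "view_short_message"   # lighter press: view the short message
    return "new_short_message"        # firmer press: create a new short message
```

A pressure exactly at the threshold is treated as "greater than or equal to", matching the wording of the paragraph above.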
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects a shake angle of the electronic device 100, calculates a distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking upon opening the flip cover can then be set according to the detected open or closed state of the holster or the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically along three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. The acceleration sensor 180E can also be used to recognize the posture of the electronic device, and is applied in landscape/portrait switching, pedometers, and other applications.
The distance sensor 180F is used to measure distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, in a shooting scene, the electronic device 100 may use the distance sensor 180F for ranging to achieve fast focusing.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 100 emits infrared light outward through the light-emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, the electronic device 100 can determine that there is an object nearby; when insufficient reflected light is detected, the electronic device 100 can determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear for a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect fingerprints. The electronic device 100 may use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint-based photographing, fingerprint-based call answering, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature falls below another threshold, to avoid abnormal shutdown of the electronic device 100 due to low temperature. In still other embodiments, when the temperature falls below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
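The temperature processing strategy above can be sketched as a simple threshold policy. The numeric thresholds and action names below are assumptions for illustration only; the patent does not specify concrete values.

```python
# Hypothetical thermal policy; all threshold values are illustrative.
HIGH_TEMP_C = 45       # above this: throttle the processor near the sensor
LOW_TEMP_C = 0         # below this: heat the battery
CRITICAL_LOW_C = -10   # below this: also boost the battery output voltage

def thermal_actions(temp_c: float) -> list:
    """Return the protective actions for a reported temperature."""
    actions = []
    if temp_c > HIGH_TEMP_C:
        actions.append("throttle_processor")   # reduce power, thermal protection
    if temp_c < LOW_TEMP_C:
        actions.append("heat_battery")         # avoid low-temperature shutdown
    if temp_c < CRITICAL_LOW_C:
        actions.append("boost_battery_voltage")
    return actions
```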
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided via the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of a bone block vibrated by the human vocal part. The bone conduction sensor 180M may also contact the human pulse to receive a blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be provided in a headset, integrated into a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal of the bone block vibrated by the vocal part, acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor may parse out heart rate information based on the blood pressure pulsation signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming-call vibration cues as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also produce different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. A SIM card can be brought into and out of contact with the electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the multiple cards may be the same or different. The SIM card interface 195 is also compatible with different types of SIM cards. The SIM card interface 195 is also compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calling and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The volume display method provided by the embodiment of the present application will now be described with reference to fig. 2 to 8.
Fig. 2 is a schematic flow chart of an embodiment of a volume display method provided in the embodiment of the present application, including:
step 201, acquiring a media file, and displaying a preview interface of the media file on a display screen of the electronic device 100.
Specifically, the media file may be an audio or video file. The user may operate on the electronic device 100 to find a media file to be played. For example, the user may search for a song to be played in a recorder playlist, search for a video to be played in a gallery, or search for an audio or video file to be played in a file manager. After the user determines the media file to be played, for example after the user selects a media file, a preview interface of the media file may be displayed on the display screen of the electronic device 100 in response to the user's selection operation. Fig. 3 is a schematic diagram of a preview interface of a media file. Referring to fig. 3, the interface 300 is a preview interface of a media file, and the interface 300 may include a play control 301, where the play control 301 is used to trigger playback of the media file.
Step 202, acquiring the media volume, and displaying the media volume on a preview interface of the media file.
Specifically, the media volume may be the current system media volume in the electronic device 100, that is, the media volume of the system most recently set by the user. In a specific implementation, the electronic device 100 may invoke a media volume interface to obtain the media volume of the current system.
After the electronic device 100 acquires the media volume of the current system, the media volume may be displayed on the preview interface of the media file. In a specific implementation, the media volume may be displayed through a play control on the preview interface. For example, the electronic device 100 may obtain the media volume of the current system and calculate, based on the media volume of the current system and the maximum media volume, the percentage of the current system's media volume relative to the maximum media volume. The electronic device 100 may then display this volume percentage on the icon corresponding to the play control in a region-filling manner.
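The percentage computation described in this step can be sketched as follows. The function name and the clamping of out-of-range values are illustrative assumptions, not part of the patent.

```python
def volume_fill_percent(current_volume: int, max_volume: int) -> float:
    """Percentage of the maximum media volume, used to fill the icon area."""
    if max_volume <= 0:
        raise ValueError("max_volume must be positive")
    # Clamp the current volume into [0, max_volume] before computing the ratio.
    current_volume = max(0, min(current_volume, max_volume))
    return 100.0 * current_volume / max_volume
```

For example, a system volume of 8 out of a maximum of 16 yields a 50% fill of the play control's icon area.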
The display of the volume of the media will now be described with reference to fig. 4a to 4 d. As shown in fig. 4a, the play control 401 represents the volume of the media by means of region filling. Referring to fig. 4a, the blank area 4011 (the part with lighter color in the playing control 401) occupies a larger area of the whole area of the playing control 401, and the non-blank area 4012 (the part with darker color in the playing control 401) occupies a smaller area of the whole area of the playing control 401, which means that the media volume of the current system is smaller. As shown in fig. 4b, the blank area 4021 (the light color part of the play control 402) occupies a smaller area of the entire area of the play control 402, and the non-blank area 4022 (the dark color part of the play control 402) occupies a larger area of the entire area of the play control 402, which indicates that the media volume of the current system is larger. It is to be understood that fig. 4a and 4b illustrate only an exemplary manner of region filling, and do not limit the embodiments of the present application, and in some embodiments, the media volume may be characterized by other region filling manners.
As shown in fig. 4c, the play control 403 includes an outer frame area 4031 and an inner frame area 4032. In the outer frame area 4031, the blank area 40311 (the lighter-colored part of the outer frame area 4031) occupies a larger share of the entire outer frame area 4031, and the non-blank area 40312 (the darker-colored part of the outer frame area 4031) occupies a smaller share, which indicates that the media volume of the current system is relatively small. As shown in fig. 4d, the play control 404 includes an outer frame area 4041 and an inner frame area 4042. In the outer frame area 4041, the blank area 40411 (the lighter-colored part of the outer frame area 4041) occupies a smaller share of the entire outer frame area 4041, and the non-blank area 40412 (the darker-colored part of the outer frame area 4041) occupies a larger share, which indicates that the media volume of the current system is relatively large.
It should be noted that fig. 4a to 4d are only exemplary, the shape of the outer frame is not limited to be circular, and in some embodiments, the shape of the outer frame may also be square.
Further, to highlight the effect, the media volume can also be characterized by means of color filling. For example, if the current media volume exceeds a preset threshold (e.g., 80%) of the maximum volume, the non-blank area may be filled in red to remind the user that the current media volume is too large, so as to avoid disturbing the surrounding environment. It should be noted that red is only an exemplary illustration and does not constitute a limitation to the embodiments of the present application; in some embodiments, other colors may be used.
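The color-filling rule above can be sketched as a small selection function. The 80% threshold and the color names are illustrative; the patent leaves both configurable.

```python
LOUD_THRESHOLD = 0.8  # preset threshold: 80% of the maximum media volume

def fill_color(current_volume: int, max_volume: int) -> str:
    """Pick the fill color for the non-blank area of the volume icon."""
    if max_volume > 0 and current_volume / max_volume > LOUD_THRESHOLD:
        return "red"      # warn the user that the volume may be too loud
    return "default"
```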
In step 203, the media volume is adjusted in response to the detected user operation.
Specifically, the user operation may be an operation of adjusting a media volume, and the detection manner of the user operation of adjusting the media volume may include detecting a touch screen sliding event or a button pressing event. The touch screen sliding may be a user sliding on the touch screen of the electronic device 100 with a finger to adjust the media volume. The touch screen sliding event may include left-right sliding, up-down sliding, clockwise sliding, and counterclockwise sliding events. The button press may be a user pressing a mechanical key (e.g., a volume key) on the electronic device 100 with a finger for adjusting the media volume. It should be understood that the pressing may be a short pressing or a long pressing, and the pressing mode is not particularly limited in the embodiments of the present application.
In response to the detected user operation to adjust the media volume, the electronic device 100 may acquire the adjusted media volume and may adjust the current media volume based on the acquired media volume. Further, the electronic device may also display the adjusted media volume on a preview interface of the media file, that is, after the media volume is updated, the media volume on a play control in the preview interface may also be updated synchronously. Optionally, after the user adjusts the current media volume, the current media file may be directly played according to the media volume adjusted by the user.
Taking fig. 4a and 4b above as an example, referring to fig. 4a, the non-blank area 4012 of the play control 401 is smaller and the blank area 4011 is larger, which indicates that the media volume of the current system is relatively small. Suppose that in this scenario a rightward sliding operation increases the volume and a leftward sliding operation decreases it. After the user slides rightward on the touch screen of the electronic device 100 with a finger, the current media volume is increased, yielding the play control 402 shown in fig. 4b. Referring to fig. 4b, the non-blank area 4022 of the play control 402 is larger and the blank area 4021 is smaller, which shows that the volume has increased after the user performs the volume-increasing operation.
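The slide-to-adjust behavior above can be sketched as follows, assuming a horizontal slide whose displacement dx determines the direction. The step size and the clamping to the valid range are illustrative assumptions.

```python
def adjust_volume(current_volume: int, max_volume: int, dx: float,
                  step: int = 1) -> int:
    """Adjust the media volume for a horizontal slide with displacement dx."""
    if dx > 0:
        current_volume += step   # slide right: volume-increasing operation
    elif dx < 0:
        current_volume -= step   # slide left: volume-decreasing operation
    # Keep the volume within the valid range [0, max_volume].
    return max(0, min(current_volume, max_volume))
```

After each adjustment, the region fill of the play control would be recomputed from the new volume so the icon updates synchronously.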
Fig. 5 is a schematic flow chart of another embodiment of a volume display method provided in the embodiment of the present application, including:
step 501, acquiring a media file, and displaying a preview interface of the media file on a display screen of the electronic device 100.
Specifically, the media file may be an audio or video file. The user may operate on the electronic device 100 to find a media file to be played. For example, the user may search for a song to be played in a recorder playlist, search for a video to be played in a gallery, or search for an audio or video file to be played in a file manager. After the user determines the media file to be played, for example after the user selects a media file, a preview interface of the media file may be displayed on the display screen of the electronic device 100 in response to the user's selection operation. For the preview interface, reference may be made to the embodiment shown in fig. 3, and details are not repeated here.
Step 502, acquiring the volume of the media, and displaying the volume of the media on a preview interface of the media file.
Specifically, the media volume may be the current system media volume in the electronic device 100, that is, the media volume of the system most recently set by the user. In a specific implementation, the electronic device 100 may invoke a media volume interface to obtain the media volume of the current system.
After the electronic device 100 obtains the media volume of the current system, the media volume may be displayed on the preview interface of the media file. In a specific implementation, the media volume may be displayed through a speaker control on the preview interface. It will be appreciated that the speaker control may be used to control the media volume, while the play control described above may be used to control both the playback of the media file and the media volume. For example, the electronic device 100 may obtain the media volume of the current system and calculate, based on the media volume of the current system and the maximum media volume, the percentage of the current system's media volume relative to the maximum media volume. The electronic device 100 may then display this volume percentage on the icon corresponding to the speaker control in a region-filling manner. For example, a blank area and a non-blank area may be provided on the icon corresponding to the speaker control, and the media volume may be represented by the ratio of the blank area to the non-blank area within the icon area. For the manner of displaying the volume through region filling, reference may be made to the embodiments shown in fig. 4a to 4d, and details are not repeated here.
Further, in addition to representing the media volume, the speaker control can represent a mute state or a non-mute state, which makes user operation more convenient; for example, one-key mute can be realized. In a specific implementation, the electronic device 100 may invoke the media volume interface to obtain the state of the current system media volume. If the current system media volume is in a mute state, the mute state can be displayed on the speaker control; if it is in a non-mute state, the non-mute state can be displayed on the speaker control.
The state of the media volume is now described with reference to fig. 6a and 6b. As shown in fig. 6a, the preview interface 600 includes a play control 601. If the electronic device 100 detects that the current system media volume is in a mute state, a speaker control 602 in the mute state may be displayed in the preview interface 600, for example, on the play control 601. As shown in fig. 6b, the preview interface 610 includes a play control 611. If the electronic device 100 detects that the current system media volume is in a non-mute state, a speaker control 612 in the non-mute state may be displayed in the preview interface 610, for example, on the play control 611. It can be appreciated that, in the non-mute state, the current media volume can be displayed through either the play control 611 or the speaker control 612. That is, if it is preset that the current media volume is displayed through the play control, then in the mute state, the icons of the speaker control and the play control may be as shown in fig. 6a; in the non-mute state, the icon of the speaker control may be as shown by the speaker control 612 in fig. 6b, and the icon of the play control may be as shown by the play control 403 in fig. 4c. If it is preset that the current media volume is displayed through the speaker control, then in the mute state, the icons of the speaker control and the play control may be as shown in fig. 6a; in the non-mute state, the icon of the speaker control may display the current media volume through region filling, and the icon of the play control may be as shown by the play control 611 in fig. 6b.
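The two preset display modes just described can be summarised as a small decision table. The function name, dictionary keys, and labels below are illustrative assumptions, not terms from the patent.

```python
def icon_config(preset: str, muted: bool) -> dict:
    """Decide which icon each control shows for a given preset and mute state.

    `preset` is either "play_control" or "speaker_control", i.e. which
    control is preset to display the current media volume.
    """
    if muted:
        # In the mute state, both presets show the fig. 6a icons.
        return {"speaker": "muted", "play": "plain"}
    if preset == "play_control":
        # Fig. 6b speaker icon; play control carries the volume fill (fig. 4c).
        return {"speaker": "unmuted", "play": "volume_fill"}
    # Speaker control carries the volume fill; play control stays plain.
    return {"speaker": "volume_fill", "play": "plain"}
```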
It should be noted that the embodiments shown in fig. 6a and fig. 6b only exemplarily show the position of the speaker control in the preview interface and do not constitute a limitation to the embodiments of the present application; in some embodiments, the speaker control may be located in areas other than the play control.
Step 503, in response to the detected operation of the user for adjusting the volume state, displaying the adjusted volume state on the preview interface of the media file.
Specifically, the operation of the user adjusting the volume state may be detected as a touch screen click event. The touch screen click may be a click made by the user's finger on the speaker control on the touch screen of the electronic device 100, and is used for adjusting the media volume state. For example, if the current system media volume is in a mute state, then after the user clicks the speaker control, the system media volume is adjusted to a non-mute state; at this time, the user may further adjust the media volume through a touch screen sliding operation or a key pressing operation. If the current system media volume is in a non-mute state, then after the user clicks the speaker control, the system media volume is adjusted to a mute state; at this time, the user cannot further adjust the media volume.
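The mute-toggle behaviour of this step can be modelled as a small state machine. The class and method names below are illustrative assumptions, not part of the patent.

```python
from dataclasses import dataclass


@dataclass
class SpeakerControl:
    """Illustrative model of the speaker control's one-key mute toggle."""
    muted: bool = False

    def on_click(self) -> bool:
        # A tap on the speaker control flips between mute and non-mute,
        # and returns the new mute state.
        self.muted = not self.muted
        return self.muted

    def can_adjust_volume(self) -> bool:
        # Slide or key-press volume adjustment is only possible when un-muted.
        return not self.muted
```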
Taking fig. 6a and fig. 6b as an example: referring to fig. 6a, since the current system media volume is in a mute state, the preview interface 600 displays the speaker control 602 in the mute state. At this time, if the user clicks the speaker control 602, the preview interface 610 shown in fig. 6b may be displayed. Because of the user's click operation, the media volume is adjusted from the mute state to the non-mute state, and the icon of the speaker control 602 is therefore updated to the speaker control 612. In response to the user's operation of adjusting the mute state, the electronic device 100 may invoke the media volume interface to obtain the current system media volume, so that the user may further perform a volume adjustment operation (for example, a sliding operation) in the preview interface 610, and the current media volume may then be adjusted according to the user's volume adjustment operation. For the manner of adjusting the current media volume through the user's volume adjustment operation, reference may be made to step 203, which is not described again here.
Fig. 7 is a schematic flow chart of a volume display method according to still another embodiment of the present application, including:
step 701, obtaining a current page.
Specifically, the page may be a page containing digital multimedia, and the digital multimedia may be text, pictures, audio, video, and the like. Illustratively, the page may be a web page. When the user opens an application or a browser, a page may be displayed on the display screen of the electronic device 100 in response to the user's operation. The user may then perform sliding or clicking operations in the application or browser to browse different pages.
Step 702, detecting whether the current page contains a play control.
Specifically, if a page includes a media stream (for example, an audio stream or a video stream), the page includes a play control for triggering playback of the media stream. Therefore, the electronic device 100 may determine whether the current page contains a media stream by detecting a play control in the current page: if the current page includes a play control, it may be considered that the current page contains a media stream; otherwise, it may be considered that the current page does not contain a media stream. In a specific implementation, whether the page includes a play control may be detected by capturing a screenshot of the current page and recognizing the captured page image to determine whether a play icon exists. If a play icon is recognized, it is determined that the current page includes a play control; if no play icon is recognized, it is determined that the current page does not include a play control.
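The screenshot-based detection can be sketched as a naive template match over a pixel grid. This is a toy sketch under the assumption that the play icon appears pixel-exactly in the screenshot; a real implementation would likely use tolerant matching or a trained recogniser.

```python
def contains_play_icon(screenshot, template):
    """Return True if `template` occurs anywhere in `screenshot`.

    Both arguments are 2-D lists of pixel values. Under the scheme above,
    a page is treated as containing a media stream iff its screenshot
    contains a play icon.
    """
    sh, sw = len(screenshot), len(screenshot[0])
    th, tw = len(template), len(template[0])
    # Slide the template over every position where it fully fits.
    for y in range(sh - th + 1):
        for x in range(sw - tw + 1):
            if all(screenshot[y + dy][x + dx] == template[dy][dx]
                   for dy in range(th) for dx in range(tw)):
                return True
    return False
```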
And step 703, displaying the volume of the media based on the detection result.
Specifically, when it is detected that the current page includes a play control, the current system media volume may be obtained and displayed on the current page. The current system media volume may be obtained by invoking the media volume interface, and the media volume may be displayed in the form of a volume bar. It should be understood that the media volume may also be displayed in other manners; the display manner of the media volume is not particularly limited in the embodiments of the present application.
It should be noted that the volume bar in the embodiments of the present application is only an exemplary illustration and does not constitute a limitation; that is, the current media volume may also be displayed through the play control rather than through a volume bar. Likewise, in the embodiments shown in fig. 2 and fig. 5, the current media volume may also be displayed through a volume bar rather than through the play control. The display manner of the media volume is not particularly limited in the present application.
Fig. 8 is a schematic diagram of a volume bar display of a media stream of a page. Referring to fig. 8, a page 800 contains a media stream 801, the media stream 801 containing a play control 802. When the electronic device 100 detects that the page 800 includes the play control 802, a volume bar 803 may be displayed in the page 800. The volume bar 803 is used to display the media volume of the current system. Furthermore, the user can also operate on the volume bar to adjust the volume of the media.
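The flow of steps 701 to 703 can be sketched end to end: detect the play control, fetch the system media volume, and render a volume bar. The function names, the ten-segment text bar, and the default maximum volume of 15 are illustrative assumptions, not values from the patent.

```python
def render_volume_bar(current_volume: int, max_volume: int,
                      segments: int = 10) -> str:
    """Render the system media volume as a simple text volume bar."""
    filled = round(segments * current_volume / max_volume)
    return "[" + "#" * filled + "-" * (segments - filled) + "]"


def display_for_page(page_has_play_control: bool, get_system_volume,
                     max_volume: int = 15):
    """Steps 702/703: show a volume bar only when the page has a play control."""
    if not page_has_play_control:
        return None  # no media stream detected, so nothing to display
    return render_volume_bar(get_system_volume(), max_volume)
```

Here `get_system_volume` stands in for the media volume interface; on a real device it would query the system's current media volume.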
Fig. 9 is a schematic structural diagram of an embodiment of a volume display device according to the present application. As shown in fig. 9, the volume display device 90 may include a first display module 91 and a second display module 92, wherein:
the first display module 91 is configured to display a display interface of the media to be played in response to a detected first operation of the user, where the media to be played is in a to-be-played state;
the second display module 92 is configured to obtain the system volume and display the system volume on the display interface of the media to be played, where the system volume is used to represent the current media volume of the media to be played.
In one possible implementation manner, the media to be played is a media file, the display interface includes a play control, the play control is used to control playing of the media to be played, and the second display module 92 is further configured to display the system volume on the icon corresponding to the play control.
In one possible implementation manner, the volume display device 90 further includes:
A reminding module 93, configured to remind the user, by using color on the icon corresponding to the play control, that the system volume is high.
In one possible implementation manner, the volume display device 90 further includes:
An updating module 94, configured to update the system volume displayed on the icon corresponding to the play control in response to a detected second operation of the user adjusting the volume.
In one possible implementation manner, the second display module 92 is further configured to display a volume bar on the display interface of the media to be played, where the volume bar is used to represent the system volume.
In one possible implementation manner, the display interface includes a speaker control, the speaker control is used to control the system volume state, the system volume state includes a system mute state and a system non-mute state, and when the media to be played is in the system non-mute state, the second display module 92 is further configured to display the system volume on the icon corresponding to the speaker control.
In one possible implementation manner, the volume display device 90 further includes:
A third display module 95, configured to display a system mute state identifier on the icon corresponding to the speaker control in response to a detected third operation of the user adjusting the system volume state, where the system mute state identifier is used to represent that the media to be played is in a mute state.
In one possible implementation manner, the media to be played is a page media stream, and the second display module 92 is further configured to display the system volume in the current page if it is detected that the current page includes a play control.
In one possible implementation manner, the second display module 92 is further configured to display the system volume on the icon corresponding to the play control.
In one possible implementation manner, the second display module 92 is further configured to display a volume bar in the current page, where the volume bar is used to represent the system volume.
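The cooperation of the first and second display modules in fig. 9 can be illustrated with a toy composition. The class and method names below are hypothetical sketches for illustration and do not come from the patent; `get_system_volume` stands in for the media volume interface.

```python
class VolumeDisplayDevice:
    """Toy model of the volume display device 90 of fig. 9."""

    def __init__(self, get_system_volume):
        self._get_system_volume = get_system_volume
        self.interface_shown = False
        self.displayed_volume = None

    def first_display(self) -> None:
        # First display module 91: show the display interface of the media
        # to be played in response to the user's first operation.
        self.interface_shown = True

    def second_display(self):
        # Second display module 92: obtain the system volume and display it
        # on the interface, once that interface is shown.
        if self.interface_shown:
            self.displayed_volume = self._get_system_volume()
        return self.displayed_volume
```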
It should be understood that the connection relationship between the modules illustrated in the embodiment of the present application is only an exemplary illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
It is to be understood that the electronic device 100 and the like described above include hardware structures and/or software modules for performing the respective functions in order to realize the functions described above. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is performed in hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present application.
In the embodiment of the present application, the electronic device 100 and the like may be divided into functional modules according to the method example, for example, each functional module may be divided according to each function, or two or more functions may be integrated into one processing module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. It should be noted that, in the embodiment of the present application, the division of the module is schematic, and is only one logic function division, and there may be another division manner in actual implementation.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions. For the specific working processes of the system, the apparatus and the unit described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described here again.
Each functional unit in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially implemented or make a contribution to the prior art, or all or part of the technical solutions may be implemented in the form of a software product stored in a storage medium and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: various media that can store program code, such as flash memory, removable hard drive, read-only memory, random-access memory, magnetic or optical disk, etc.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (12)

1. A method of displaying volume, the method comprising:
responding to the detected first operation of the user, and displaying a display interface of the media to be played; wherein, the media to be played is in a state to be played;
acquiring system volume, and displaying the system volume on a display interface of the media to be played; and the system volume is used for representing the current media volume of the media to be played.
2. The method of claim 1, wherein the media to be played is a media file, the display interface includes a play control, the play control is used to control playing of the media to be played, and the displaying the system volume on the display interface of the media to be played comprises:
and displaying the system volume on the icon corresponding to the playing control.
3. The method of claim 2, wherein after displaying the system volume on the icon corresponding to the play control, the method further comprises:
and reminding the user of the larger volume of the system by using colors on the icon corresponding to the playing control.
4. The method of claim 2, further comprising:
and in response to the detected second operation of adjusting the volume by the user, updating and displaying the system volume on the icon corresponding to the play control.
5. The method of claim 1, wherein the media to be played is a media file, and wherein displaying the system volume on the display interface of the media to be played comprises:
displaying a volume bar on a display interface of the media to be played; wherein the volume bar is used for representing the volume of the system.
6. The method of claim 1, wherein the display interface comprises a speaker control, the speaker control being configured to control a system volume state, the system volume state comprising a system mute state and a system un-mute state, and when the media to be played is in the system un-mute state, the displaying the system volume on the display interface of the media to be played comprises:
displaying the system volume on an icon corresponding to the speaker control.
7. The method of claim 6, further comprising:
responding to a detected third operation of adjusting the system volume state by the user, and displaying a system mute state identifier on an icon corresponding to the loudspeaker control; the system mute state identifier is used for representing that the media to be played is in a mute state.
8. The method of claim 1, wherein the media to be played is a page media stream, and wherein displaying the system volume on the display interface of the media to be played comprises:
and if the current page is detected to contain the playing control, displaying the system volume in the current page.
9. The method of claim 8, wherein the displaying the system volume in the current page comprises:
and displaying the system volume on the icon corresponding to the playing control.
10. The method of claim 8, wherein the displaying the system volume in the current page comprises:
displaying a volume bar in the current page; wherein the volume bar is used for representing the volume of the system.
11. An electronic device, comprising:
a processor; and
a memory for storing instructions executable by the processor;
the processor is configured to, when executing the instructions, cause the electronic device to implement the method of any of claims 1-10.
12. A computer-readable storage medium comprising computer instructions that, when executed on an electronic device, cause the electronic device to perform the method of any of claims 1-10.
CN202110958713.9A 2021-08-20 2021-08-20 Volume display method, electronic device and storage medium Pending CN115712368A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110958713.9A CN115712368A (en) 2021-08-20 2021-08-20 Volume display method, electronic device and storage medium
PCT/CN2022/112437 WO2023020420A1 (en) 2021-08-20 2022-08-15 Volume display method, electronic device, and storage medium


Publications (1)

Publication Number Publication Date
CN115712368A true CN115712368A (en) 2023-02-24


Country Status (2)

Country Link
CN (1) CN115712368A (en)
WO (1) WO2023020420A1 (en)




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination