CN116954770A - Display method and electronic equipment - Google Patents

Display method and electronic equipment

Info

Publication number
CN116954770A
CN116954770A (application CN202210392551.1A)
Authority
CN
China
Prior art keywords
application
interface
vehicle
electronic device
gear
Prior art date
Legal status
Pending
Application number
CN202210392551.1A
Other languages
Chinese (zh)
Inventor
周星辰
卞苏成
徐亮
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202210392551.1A
Priority to PCT/CN2023/087332 (published as WO2023197999A1)
Publication of CN116954770A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of this application provides a display method and an electronic device. The method is applied to the electronic device and includes: displaying an interface of a first application in full screen; acquiring first state information and determining that the first state information meets a first preset condition, where the first state information is used to indicate a state of the electronic device or of a device connected to the electronic device; and displaying the interface of the first application in a non-full-screen display mode, continuing to run the first application in the background, or ending the process of the first application, and displaying an interface of a second application in full screen; or displaying the interfaces of the first application and the second application in split screen. With this technical solution, the electronic device can preferentially recommend the content to be displayed according to the state of the electronic device or of the device connected to it, without requiring the user to interact with the electronic device, which improves the user experience.

Description

Display method and electronic equipment
Technical Field
The embodiment of the application relates to the field of terminals, in particular to a display method and electronic equipment.
Background
With the development of electronic devices, users' demands on electronic devices are increasing; for example, users often want to use multiple applications (Apps) on an electronic device to implement multiple functions at the same time. When a user uses multiple Apps on an electronic device, the electronic device generally displays only the interface of the App that the user operated most recently. If the user then wants to use a function of another App on the electronic device (such as a vehicle-mounted device), the user needs to operate the electronic device manually to call up the interface of that App, which results in a high operation cost and a poor user experience.
Disclosure of Invention
The embodiment of the application provides a display method and electronic equipment.
In a first aspect, a display method is provided. The method is applied to an electronic device and includes: displaying an interface of a first application in full screen; acquiring first state information and determining that the first state information meets a first preset condition, where the first state information is used to indicate a state of the electronic device or of a device connected to the electronic device; and displaying the interface of the first application in a non-full-screen display mode, continuing to run the first application in the background, or ending the process of the first application, and displaying an interface of a second application in full screen; or displaying the interfaces of the first application and the second application in split screen.
In the technical solution of this application, when the electronic device is displaying the interface of the first application in full screen and detects that the state of the electronic device or of a device connected to it meets the first preset condition, the electronic device can display the interface of the first application in a non-full-screen mode, continue to run the first application in the background, or end the process of the first application, and display the interface of the second application in full screen or split screen. Therefore, without requiring the user to interact with the electronic device, the electronic device can preferentially recommend the content displayed on it according to the state of the electronic device or of the connected device, which improves the user experience.
In addition, if the interface of the first application is displayed in a non-full-screen mode or the first application continues to run in the background, the user can intuitively see the interface of the second application on the electronic device; that is, the user can use the functions of the second application while continuing to use the functions of the first application, which further improves the user experience.
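As a rough illustration of this branching, the following Kotlin sketch enumerates the possible treatments of the first application once the first preset condition is met; every name, type, and message in it is a hypothetical placeholder introduced here for clarity and is not an API from the patent.

```kotlin
// Hypothetical outline of the first-aspect flow; names and strings are illustrative only.
enum class FirstAppTreatment { NON_FULL_SCREEN, KEEP_IN_BACKGROUND, END_PROCESS }

fun onFirstStateInfo(firstPresetConditionMet: Boolean, treatment: FirstAppTreatment) {
    if (!firstPresetConditionMet) return   // condition not met: keep the first application full screen
    when (treatment) {
        FirstAppTreatment.NON_FULL_SCREEN ->
            println("show the first app in a non-full-screen mode; show the second app full screen or split screen")
        FirstAppTreatment.KEEP_IN_BACKGROUND ->
            println("keep the first app running in the background; show the second app full screen")
        FirstAppTreatment.END_PROCESS ->
            println("end the first app's process; show the second app full screen")
    }
}
```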
With reference to the first aspect, in some implementations of the first aspect, the second application is a default application, or the second application is an application that was last run on the electronic device before the interface of the first application was displayed full screen.
With reference to the first aspect, in certain implementations of the first aspect, the first application includes a non-map-class application and the second application includes a map-class application.
With reference to the first aspect, in certain implementation manners of the first aspect, the first state information is used to indicate a condition of a moving speed of the electronic device, and the first preset condition includes: the moving speed of the electronic equipment is greater than or equal to a first threshold value; and/or, the difference value of the moving speeds of the electronic equipment in the target time is greater than or equal to a second threshold value.
With reference to the first aspect, in certain implementation manners of the first aspect, the first state information is used to indicate a condition of a gear of a vehicle connected to the electronic device, and the first preset condition includes: the gear of the vehicle is in the D gear; and/or the gear of the vehicle is adjusted from the P gear to the D gear; and/or the gear of the vehicle is in R gear.
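The speed-based and gear-based conditions above can be read as a simple predicate over the first state information. The following Kotlin sketch illustrates one possible form of that predicate; the threshold values, gear codes, and field names are assumptions introduced here and are not specified by the patent.

```kotlin
// Hypothetical sketch of the first preset condition; thresholds, gear codes, and
// field names are assumptions for illustration, not values from the patent.
data class FirstStateInfo(
    val speedKmh: Double,          // moving speed of the electronic device
    val speedDeltaKmh: Double,     // change of the moving speed within the target time
    val gear: Char?,               // gear of the connected vehicle: 'P', 'R', 'N', 'D', or null if no vehicle
    val previousGear: Char? = null
)

const val FIRST_THRESHOLD_KMH = 20.0   // first threshold (assumed value)
const val SECOND_THRESHOLD_KMH = 10.0  // second threshold (assumed value)

fun meetsFirstPresetCondition(s: FirstStateInfo): Boolean {
    // Speed-based condition: speed at or above the first threshold, and/or speed change at or above the second threshold.
    val speedCondition = s.speedKmh >= FIRST_THRESHOLD_KMH || s.speedDeltaKmh >= SECOND_THRESHOLD_KMH
    // Gear-based condition: vehicle in D, shifted from P to D, or in R.
    val gearCondition = s.gear == 'D' || (s.previousGear == 'P' && s.gear == 'D') || s.gear == 'R'
    return speedCondition || gearCondition
}
```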
With reference to the first aspect, in certain implementation manners of the first aspect, the manner of non-full screen display includes: a floating window, a picture-in-picture, a split screen, a floating ball, and/or a card.
With reference to the first aspect, in certain implementation manners of the first aspect, the manner of non-full screen display is related to a degree of influence of the first application on the electronic device.
Setting the non-full-screen display mode of the first application according to the degree of influence of the first application on the electronic device allows the user to use the functions of the first application and those of the second application at the same time, which further helps to improve the user experience.
With reference to the first aspect, in some implementations of the first aspect, the influence degrees include a low influence degree, a medium influence degree, and a high influence degree, where an influence value corresponding to the low influence degree is less than or equal to a third threshold, an influence value corresponding to the medium influence degree is greater than the third threshold and less than a fourth threshold, and an influence value corresponding to the high influence degree is greater than or equal to the fourth threshold. When the influence degree of the first application on the electronic device is the low influence degree, the non-full-screen display mode is a floating window; when the influence degree is the medium influence degree, the non-full-screen display mode is picture-in-picture or split screen; when the influence degree is the high influence degree, the non-full-screen display mode is a floating ball and/or a card.
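The mapping from influence degree to display mode can be illustrated with a small selection function. In the following Kotlin sketch, the third and fourth thresholds and the choice between picture-in-picture and split screen are placeholder assumptions.

```kotlin
// Hypothetical mapping from the first application's influence value to a non-full-screen
// display mode; the third and fourth thresholds below are placeholder values.
enum class NonFullScreenMode { FLOATING_WINDOW, PICTURE_IN_PICTURE, SPLIT_SCREEN, FLOATING_BALL, CARD }

const val THIRD_THRESHOLD = 30   // assumed value
const val FOURTH_THRESHOLD = 70  // assumed value

fun chooseNonFullScreenMode(influenceValue: Int, preferSplitScreen: Boolean = false): NonFullScreenMode =
    when {
        // Low influence degree: influence value <= third threshold.
        influenceValue <= THIRD_THRESHOLD -> NonFullScreenMode.FLOATING_WINDOW
        // Medium influence degree: third threshold < influence value < fourth threshold.
        influenceValue < FOURTH_THRESHOLD ->
            if (preferSplitScreen) NonFullScreenMode.SPLIT_SCREEN else NonFullScreenMode.PICTURE_IN_PICTURE
        // High influence degree: influence value >= fourth threshold.
        else -> NonFullScreenMode.FLOATING_BALL   // a card (or both) could be used instead
    }
```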
With reference to the first aspect, in certain implementations of the first aspect, in a case where the interface of the first application displayed in full screen includes video content and audio content, the interface of the first application displayed as the floating ball and/or the card includes only the audio content; alternatively, in a case where the interface of the first application displayed in full screen includes audio content and the first application continues to run in the background, the method further includes: playing the audio content corresponding to the running first application.
If the electronic device detects that the state of the electronic device or of a device connected to it meets the first preset condition, the electronic device can present only the audio content of the first application in the form of a floating ball and/or a card, so that as many of the first application's functions as possible remain available to the user, which further improves the user experience.
With reference to the first aspect, in certain implementation manners of the first aspect, the interface of the first application displayed in a non-full screen display manner includes at least one control, where the at least one control is used to manage the first application.
In the case of displaying the interface of the first application in a non-full screen display manner, at least one control may also be displayed on the interface of the first application, so that the user may manage the first application through the at least one control, which further helps to improve the user experience.
With reference to the first aspect, in certain implementations of the first aspect, the method further includes: detecting an operation on the interface of the first application displayed in a non-full-screen display mode; and, in response to the operation, displaying the interface of the first application in another non-full-screen display mode.
With this technical solution, the user can operate the interface of the first application displayed in a non-full-screen mode as needed, so as to adjust the non-full-screen display mode of the first application, which further improves the user experience.
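One possible reading of this behavior is a state transition driven by the user's operation on the demoted interface. The Kotlin sketch below is illustrative only; the operation names and the transitions between modes are assumptions, not defined by the patent.

```kotlin
// Hypothetical reaction to a user operation on the demoted interface, for example
// dragging a floating window to the screen edge to turn it into a floating ball.
enum class DemotedMode { FLOATING_WINDOW, PICTURE_IN_PICTURE, SPLIT_SCREEN, FLOATING_BALL, CARD }

fun nextModeAfterOperation(current: DemotedMode, operation: String): DemotedMode = when (operation) {
    "drag_to_edge" -> DemotedMode.FLOATING_BALL    // collapse into a floating ball
    "expand"       -> DemotedMode.SPLIT_SCREEN     // enlarge into a split-screen window
    "shrink"       -> DemotedMode.FLOATING_WINDOW  // shrink back to a floating window
    else           -> current                      // unrecognized operation: keep the current mode
}
```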
With reference to the first aspect, in certain implementations of the first aspect, after the interface of the second application is displayed in full screen or split screen, the method further includes: acquiring second state information and determining that the second state information meets a second preset condition, where the second state information is used to indicate a state of the electronic device or of a device connected to the electronic device; and displaying the interface of the first application in full screen.
In the technical solution of this application, when the electronic device is displaying the interface of the first application in a non-full-screen mode, continuing to run the first application in the background, or has ended the process of the first application, and is displaying the interface of the second application in full screen or split screen, if the electronic device detects that the state of the electronic device or of a device connected to it meets the second preset condition, the electronic device can display the interface of the first application in full screen. Therefore, without requiring the user to interact with the electronic device, the electronic device can preferentially recommend the content displayed on it according to the state of the electronic device or of the connected device, which further improves the user experience.
With reference to the first aspect, in certain implementation manners of the first aspect, the second state information is used to indicate a condition of a moving speed of the electronic device, and the second preset condition includes: the moving speed of the electronic equipment is smaller than a first threshold value; and/or the difference value of the moving speeds of the electronic equipment in the target time is smaller than a second threshold value.
With reference to the first aspect, in certain implementation manners of the first aspect, the second state information is used to indicate a condition of a gear of a vehicle connected to the electronic device, and the second preset condition includes: the gear of the vehicle is in the P gear; and/or the gear of the vehicle is adjusted from the D gear to the P gear; and/or the gear of the vehicle is in N gear.
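Mirroring the first preset condition, the second preset condition can be sketched as the predicate below; again, all thresholds, gear codes, and names are illustrative assumptions rather than values taken from the patent.

```kotlin
// Hypothetical sketch of the second preset condition that restores the first application
// to full screen; thresholds and gear codes are placeholder assumptions.
data class SecondStateInfo(
    val speedKmh: Double,
    val speedDeltaKmh: Double,
    val gear: Char?,
    val previousGear: Char? = null
)

const val FIRST_THRESHOLD = 20.0   // same first threshold as in the first condition (assumed)
const val SECOND_THRESHOLD = 10.0  // same second threshold (assumed)

fun meetsSecondPresetCondition(s: SecondStateInfo): Boolean {
    // Speed-based condition: speed below the first threshold, and/or speed change below the second threshold.
    val speedCondition = s.speedKmh < FIRST_THRESHOLD || s.speedDeltaKmh < SECOND_THRESHOLD
    // Gear-based condition: vehicle in P, shifted from D to P, or in N.
    val gearCondition = s.gear == 'P' || (s.previousGear == 'D' && s.gear == 'P') || s.gear == 'N'
    return speedCondition || gearCondition
}
```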
In a second aspect, there is provided an electronic device comprising: one or more processors; a memory; a plurality of applications; and one or more programs, wherein the one or more programs are stored in the memory, which when executed by the processor, cause the electronic device to perform the steps of: displaying an interface of the first application in a full screen manner; acquiring first state information, and determining that the first state information meets a first preset condition, wherein the first state information is used for indicating the state of the electronic equipment or equipment connected with the electronic equipment; displaying an interface of the first application in a non-full screen display mode, or continuously running the first application in the background, or ending the process of the first application, and displaying an interface of a second application in a full screen mode; or split-screen displaying the interfaces of the first application and the second application.
With reference to the second aspect, in some implementations of the second aspect, the second application is a default application, or the second application is an application that was last run on the electronic device before the interface of the first application was displayed full screen.
With reference to the second aspect, in certain implementations of the second aspect, the first application includes a non-map-class application, and the second application includes a map-class application.
With reference to the second aspect, in certain implementations of the second aspect, the first state information is used to indicate a condition of a moving speed of the electronic device, and the first preset condition includes: the moving speed of the electronic equipment is greater than or equal to a first threshold value; and/or, the difference value of the moving speeds of the electronic equipment in the target time is greater than or equal to a second threshold value.
With reference to the second aspect, in certain implementations of the second aspect, the first state information is used to indicate a condition of a gear of a vehicle connected to the electronic device, and the first preset condition includes: the gear of the vehicle is in the D gear; and/or the gear of the vehicle is adjusted from the P gear to the D gear; and/or the gear of the vehicle is in R gear.
With reference to the second aspect, in some implementations of the second aspect, the manner of non-full screen display includes: a floating window, a picture-in-picture, a split screen, a floating ball, and/or a card.
With reference to the second aspect, in some implementations of the second aspect, the manner of non-full screen display is related to a degree of influence of the first application on the electronic device.
With reference to the second aspect, in some implementations of the second aspect, the influence degrees include a low influence degree, a medium influence degree, and a high influence degree, where an influence value corresponding to the low influence degree is less than or equal to a third threshold, an influence value corresponding to the medium influence degree is greater than the third threshold and less than a fourth threshold, and an influence value corresponding to the high influence degree is greater than or equal to the fourth threshold. When the influence degree of the first application on the electronic device is the low influence degree, the non-full-screen display mode is a floating window; when the influence degree is the medium influence degree, the non-full-screen display mode is picture-in-picture or split screen; when the influence degree is the high influence degree, the non-full-screen display mode is a floating ball and/or a card.
With reference to the second aspect, in certain implementations of the second aspect, the instructions, when executed by the one or more processors, cause the electronic device to perform the following steps: in a case where the interface of the first application displayed in full screen includes video content and audio content, the interface of the first application displayed as the floating ball and/or the card includes only the audio content; or, in a case where the interface of the first application displayed in full screen includes audio content and the first application continues to run in the background, playing the audio content corresponding to the running first application.
With reference to the second aspect, in some implementations of the second aspect, the interface of the first application displayed in a non-full screen display manner includes at least one control, where the at least one control is used to manage the first application.
With reference to the second aspect, in certain implementations of the second aspect, the instructions, when executed by the one or more processors, cause the electronic device to perform the following steps: detecting an operation on the interface of the first application displayed in a non-full-screen display mode; and, in response to the operation, displaying the interface of the first application in another non-full-screen display mode.
With reference to the second aspect, in certain implementations of the second aspect, the instructions, when executed by the one or more processors, cause the electronic device to perform the following steps: acquiring second state information after the interface of the second application is displayed in full screen or split screen, and determining that the second state information meets a second preset condition, where the second state information is used to indicate a state of the electronic device or of a device connected to the electronic device; and displaying the interface of the first application in full screen.
With reference to the second aspect, in certain implementations of the second aspect, the second state information is used to indicate a condition of a moving speed of the electronic device, and the second preset condition includes: the moving speed of the electronic equipment is smaller than a first threshold value; and/or the difference value of the moving speeds of the electronic equipment in the target time is smaller than a second threshold value.
With reference to the second aspect, in certain implementations of the second aspect, the second state information is used to indicate a condition of a gear of a vehicle connected to the electronic device, and the second preset condition includes: the gear of the vehicle is in the P gear; and/or the gear of the vehicle is adjusted from the D gear to the P gear; and/or the gear of the vehicle is in N gear.
In a third aspect, there is provided an apparatus comprising: the display unit is used for displaying the interface of the first application in a full screen manner; the processing unit is used for acquiring first state information and determining that the first state information meets a first preset condition, wherein the first state information is used for indicating the state of the electronic equipment or equipment connected with the electronic equipment; the display unit is further used for displaying an interface of the first application in a non-full screen display mode, or continuously running the first application in the background or ending the process of the first application, and displaying an interface of the second application in a full screen mode; or the display unit is further used for displaying the interfaces of the first application and the second application in a split screen mode.
With reference to the third aspect, in some implementations of the third aspect, the second application is a default application, or the second application is an application that was last run on the electronic device before the interface of the first application was displayed full screen.
With reference to the third aspect, in certain implementations of the third aspect, the first application includes a non-map-class application and the second application includes a map-class application.
With reference to the third aspect, in certain implementations of the third aspect, the first state information is used to indicate a condition of a moving speed of the electronic device, and the first preset condition includes: the moving speed of the electronic equipment is greater than or equal to a first threshold value; and/or, the difference value of the moving speeds of the electronic equipment in the target time is greater than or equal to a second threshold value.
With reference to the third aspect, in certain implementations of the third aspect, the first state information is used to indicate a condition of a gear of a vehicle connected to the electronic device, and the first preset condition includes: the gear of the vehicle is in the D gear; and/or the gear of the vehicle is adjusted from the P gear to the D gear; and/or the gear of the vehicle is in R gear.
With reference to the third aspect, in some implementations of the third aspect, the manner of non-full screen display includes: a floating window, a picture-in-picture, a split screen, a floating ball, and/or a card.
With reference to the third aspect, in some implementations of the third aspect, the manner of non-full screen display is related to a degree of influence of the first application on the electronic device.
With reference to the third aspect, in some implementations of the third aspect, the influence degrees include a low influence degree, a medium influence degree, and a high influence degree, where an influence value corresponding to the low influence degree is less than or equal to a third threshold, an influence value corresponding to the medium influence degree is greater than the third threshold and less than a fourth threshold, and an influence value corresponding to the high influence degree is greater than or equal to the fourth threshold. When the influence degree of the first application on the electronic device is the low influence degree, the non-full-screen display mode is a floating window; when the influence degree is the medium influence degree, the non-full-screen display mode is picture-in-picture or split screen; when the influence degree is the high influence degree, the non-full-screen display mode is a floating ball and/or a card.
With reference to the third aspect, in some implementations of the third aspect, the display unit is further specifically configured to: in a case where the interface of the first application displayed in full screen includes video content and audio content, display the interface of the first application as the floating ball and/or the card including only the audio content; alternatively, the apparatus further includes a playing unit, where the playing unit is configured to: play the audio content corresponding to the running first application in a case where the interface of the first application displayed in full screen includes audio content and the first application continues to run in the background.
With reference to the third aspect, in some implementations of the third aspect, the interface of the first application displayed in a non-full screen manner includes at least one control, where the at least one control is used to manage the first application.
With reference to the third aspect, in certain implementations of the third aspect, the processing unit is further configured to: detect an operation on the interface of the first application displayed in a non-full-screen display mode; and, in response to the operation, display the interface of the first application in another non-full-screen display mode.
With reference to the third aspect, in certain implementations of the third aspect, the processing unit is further configured to: acquire second state information after the interface of the second application is displayed in full screen or split screen, and determine that the second state information meets a second preset condition, where the second state information is used to indicate a state of the electronic device or of a device connected to the electronic device; and the display unit is further configured to display the interface of the first application in full screen.
With reference to the third aspect, in certain implementations of the third aspect, the second state information is used to indicate a condition of a moving speed of the electronic device, and the second preset condition includes: the moving speed of the electronic equipment is smaller than a first threshold value; and/or the difference value of the moving speeds of the electronic equipment in the target time is smaller than a second threshold value.
With reference to the third aspect, in certain implementations of the third aspect, the second state information is used to indicate a condition of a gear of a vehicle connected to the electronic device, and the second preset condition includes: the gear of the vehicle is in the P gear; and/or the gear of the vehicle is adjusted from the D gear to the P gear; and/or the gear of the vehicle is in N gear.
In a fourth aspect, there is provided a computer program product comprising instructions which, when run on an electronic device, cause the electronic device to perform the display method described in the first aspect or any of the realizable modes of the first aspect.
In a fifth aspect, a computer readable storage medium is provided, which may be non-volatile. The storage medium includes instructions that, when executed on an electronic device, cause the electronic device to perform the display method described in the first aspect or any of the achievable manners of the first aspect.
In a sixth aspect, there is provided a chip system comprising: a processor for calling and running a computer program from a memory, so that an electronic device on which the chip system is mounted performs the display method described in the above first aspect or any of the realizable modes of the first aspect.
Drawings
Fig. 1 is a schematic hardware structure of an electronic device according to an embodiment of the present application.
Fig. 2 is a software structural block diagram of an electronic device according to an embodiment of the present application.
Fig. 3 is an exemplary diagram of an application scenario 200 provided in an embodiment of the present application.
Fig. 4 is a set of GUIs of the in-vehicle apparatus provided by the embodiment of the present application.
Fig. 5 is another set of GUIs of the in-vehicle apparatus provided by the embodiment of the present application.
Fig. 6 is a further set of GUIs of the in-vehicle apparatus provided by the embodiment of the present application.
Fig. 7 is a further set of GUIs of the in-vehicle apparatus provided by the embodiment of the present application.
Fig. 8 is a schematic flowchart of an example of a display method according to an embodiment of the present application.
Fig. 9 is a schematic block diagram of an example apparatus provided in an embodiment of the present application.
Fig. 10 is a schematic structural diagram of an example of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the accompanying drawings. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist together, or B exists alone. In addition, in the description of the embodiments of the present application, "plural" or "plurality" means two or more.
The terms "first" and "second" are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present embodiment, unless otherwise specified, the meaning of "plurality" is two or more.
The method provided by the embodiment of the application can be applied to electronic equipment such as mobile phones, tablet computers, wearable equipment, vehicle-mounted equipment, augmented reality (augmented reality, AR)/Virtual Reality (VR) equipment, notebook computers, ultra-mobile personal computer (UMPC), netbooks, personal digital assistants (personal digital assistant, PDA) and the like, and the embodiment of the application does not limit the specific type of the electronic equipment.
By way of example, fig. 1 shows a schematic diagram of an electronic device 100. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the electronic device 100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, a charger, a flash, the camera 193, etc., respectively, through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 communicates with the touch sensor 180K through the I2C bus interface to implement the touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the photographing functions of electronic device 100. The processor 110 and the display 194 communicate via a DSI interface to implement the display functionality of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices, etc.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as moving picture experts group (MPEG)1, MPEG2, MPEG3, and MPEG4.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 implements various functional applications and data processing of the electronic device 100 by executing the instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, an App required for at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data created during use of the electronic device 100 (such as audio data and a phone book), and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a non-volatile memory such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS).
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
A receiver 170B, also referred to as a "earpiece", is used to convert the audio electrical signal into a sound signal. When electronic device 100 is answering a telephone call or voice message, voice may be received by placing receiver 170B in close proximity to the human ear.
Microphone 170C, also referred to as a "mic" or "sound transducer", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak with the mouth close to the microphone 170C to input a sound signal into the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to implement sound signal collection, noise reduction, sound source identification, directional recording functions, and the like.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are various types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates with conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
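The pressure-dependent behavior on the short message icon can be illustrated as follows; the threshold value and the two printed actions are assumptions made purely for clarity, not the device's actual values.

```kotlin
// Illustrative sketch of intensity-dependent handling of a touch on the Messages icon;
// the threshold and the two actions are placeholder assumptions.
const val FIRST_PRESSURE_THRESHOLD = 0.5f   // normalized touch pressure, assumed value

fun onMessagesIconTouched(pressure: Float) {
    if (pressure < FIRST_PRESSURE_THRESHOLD) {
        println("execute: view short messages")         // light press
    } else {
        println("execute: create a new short message")  // firm press
    }
}
```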
The gyro sensor 180B may be used to determine the motion posture of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for anti-shake during photographing. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates, according to the angle, the distance that the lens module needs to compensate, and lets the lens counteract the shake of the electronic device 100 through reverse motion, thereby implementing anti-shake. The gyro sensor 180B may also be used in navigation and somatosensory gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip cover using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D, and then set features such as automatic unlocking upon flip opening according to the detected open or closed state of the leather case or of the flip.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. The acceleration sensor 180E may also be used to identify the posture of the electronic device, and is applied to applications such as switching between landscape and portrait modes and pedometers.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, the electronic device 100 may range using the distance sensor 180F to achieve quick focus.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode. The electronic device 100 detects infrared light reflected from a nearby object by using the photodiode. When sufficient reflected light is detected, it may be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100. The electronic device 100 can detect, by using the proximity light sensor 180G, that the user holds the electronic device 100 close to the ear, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in a leather case mode or a pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may use the collected fingerprint features to implement fingerprint-based unlocking, application lock access, fingerprint-based photographing, fingerprint-based incoming call answering, and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 executes a temperature processing strategy based on the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to prevent the low temperature from causing the electronic device 100 to shut down abnormally. In other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by the low temperature.
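As an aid to reading, the following Kotlin sketch expresses such a temperature processing strategy; the concrete threshold values and the ordering of the checks are assumptions for illustration only, not values from the embodiment.

```kotlin
// Illustrative sketch of a temperature processing strategy with three assumed thresholds.
enum class ThermalAction { NONE, REDUCE_PROCESSOR_PERFORMANCE, HEAT_BATTERY, BOOST_BATTERY_VOLTAGE }

fun thermalActionFor(temperatureCelsius: Int): ThermalAction = when {
    temperatureCelsius > 45 -> ThermalAction.REDUCE_PROCESSOR_PERFORMANCE // over-temperature: throttle the nearby processor
    temperatureCelsius < -5 -> ThermalAction.BOOST_BATTERY_VOLTAGE        // far below the lower threshold: boost battery output voltage
    temperatureCelsius < 5  -> ThermalAction.HEAT_BATTERY                 // below the lower threshold: heat the battery
    else -> ThermalAction.NONE
}

fun main() {
    listOf(50, 3, -10, 25).forEach { t -> println("$t C -> ${thermalActionFor(t)}") }
}
```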
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touchscreen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may transfer the detected touch operation to the application processor to determine the type of the touch event. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a location different from that of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of a vibrating bone of a human vocal part. The bone conduction sensor 180M may also be in contact with the human pulse to receive a blood pressure beating signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset to form a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal, of the vibrating bone of the vocal part, acquired by the bone conduction sensor 180M, to implement a voice function. The application processor may parse out heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, to implement a heart rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys, or may be touch keys. The electronic device 100 may receive key inputs, and generate key signal inputs related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration prompt. The motor 191 may be used for an incoming call vibration prompt and for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. Touch operations acting on different areas of the display screen 194 may also correspond to different vibration feedback effects of the motor 191. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, and may be used to indicate a charging status, a change in battery level, a message, a missed call, a notification, and the like.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195 or removed from the SIM card interface 195 to come into contact with or be separated from the electronic device 100. The electronic device 100 may support one or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, and the like. Multiple cards may be inserted into the same SIM card interface 195 simultaneously. The types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with an external memory card. The electronic device 100 interacts with a network through the SIM card to implement functions such as calling and data communication. In some embodiments, the electronic device 100 uses an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
It should be understood that the phone cards in embodiments of the present application include, but are not limited to, SIM cards, eSIM cards, universal subscriber identity modules (universal subscriber identity module, USIM), universal integrated circuit cards (universal integrated circuit card, UICC), and the like.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiment of the application, taking an Android system with a layered architecture as an example, a software structure of the electronic device 100 is illustrated.
Fig. 2 is a software structure block diagram of the electronic device 100 according to an embodiment of the present application. The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime (Android runtime) and system library layer, and a kernel layer. The application layer may include a series of application packages.
As shown in fig. 2, the application package may include applications for cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used to manage window programs. The window manager may obtain the size of the display screen, determine whether there is a status bar, lock the screen, take a screenshot, and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100, for example, management of a call status (including connected, hung up, and the like).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar, and may be used to convey a notification-type message that can automatically disappear after a short stay without user interaction. For example, the notification manager is used to notify download completion, provide a message alert, and the like. The notification manager may also present a notification that appears in the system top status bar in the form of a chart or scroll bar text, for example, a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, and an indicator light blinks.
The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine converts the java files of the application layer and the application framework layer into binary files for execution. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media library (media library), three-dimensional graphics processing library (e.g., openGL ES), 2D graphics engine (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a plurality of commonly used audio and video formats, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
It should be understood that the technical solutions in the embodiments of the present application can be used in systems such as Android, iOS, and HarmonyOS.
A user may use multiple apps of an electronic device to implement multiple functions. For example, a user can realize a positioning or navigation function through a map App on the electronic device, and the user can also realize an audio/video listening function through an audio/video App on the electronic device.
The App may run in the foreground of the electronic device or in the background of the electronic device. When the App runs in the foreground of the electronic device, the electronic device 100 may display a user interface of the App on the display screen, and the user may interact with the App through controls in the App's user interface. A case in which the App runs in the electronic device but does not run in the foreground is a case in which the App runs in the background of the electronic device. The case in which the App runs in the background of the electronic device may include a case in which a process of the App exists in the electronic device but the user interface of the App is not displayed on the display screen. Because the user interface of an App running in the background is not displayed on the display screen, the user typically cannot directly interact with an App running in the background. The background may also be referred to as the "non-foreground".
There may be one or more Apps running in the foreground of the electronic device, and there may be one or more Apps running in the background of the electronic device. While one or more Apps are running in the foreground of the electronic device, one or more Apps may be running in the background of the electronic device.
With the development of electronic devices, users place increasingly high requirements on electronic devices. For example, users often wish to use multiple Apps of an electronic device to implement multiple functions simultaneously. When a user is using one function of the electronic device and wants to use another function of the electronic device, the user needs to open the App corresponding to that function. For example, a user may wish to use the audio/video playback function of an electronic device while using its navigation function; in this case, the user also needs to open the audio/video App. The App opened later then occupies the whole display interface of the electronic device. When the user wants to see the display interfaces of other Apps on the electronic device, that is, when the user wants to operate other Apps on the electronic device, the user needs to interact with the electronic device, for example, manually perform some operations on the other Apps (for example, an operation of switching the App opened later to run in the background, or an operation of ending the process of the App opened later). As a result, the operation cost is high and the user experience is poor.
Therefore, an embodiment of the present application provides a display method that can be applied to an electronic device. According to the display method, based on the state of the electronic device or a device connected to the electronic device, the user interface of the App that the user most wants to see is displayed on the electronic device in a full-screen manner, and other Apps are displayed in a non-full-screen manner, switched to run in the background, or have their processes directly ended. In this way, the user can have the user interface of the App that the user most wants to see displayed in full screen on the electronic device without interacting with the electronic device, thereby improving user experience.
Fig. 3 is an exemplary diagram of an application scenario 200 provided in an embodiment of the present application. For example, as shown in fig. 3, the application scenario 200 includes a vehicle 210 and a driver 220. The driver 220 is in the driving position of the vehicle 210. The vehicle 210 is equipped with an in-vehicle apparatus 230, and the in-vehicle apparatus 230 has a display screen.
The embodiment of the present application is not limited to a specific form of the in-vehicle apparatus 230 as long as it has a display screen.
Alternatively, in some embodiments, the in-vehicle device 230 may be an in-vehicle infotainment product installed inside the vehicle 210.
Alternatively, in some embodiments, the in-vehicle device 230 may also be a device that interfaces with an in-vehicle infotainment product installed in the vehicle 210, such as a cell phone.
It should be noted that the scenario shown in fig. 3 is only an example, and should not limit the present application. For example, more electronic devices or people may be included in the scene.
Next, with reference to fig. 4 to 7, a change in the graphical user interface (graphical user interface, GUI) of the in-vehicle apparatus 230 will be described in detail. Fig. 4 to 6 each illustrate an example in which the in-vehicle apparatus 230 initially displays a display interface of the in-vehicle map application. In fig. 7, the in-vehicle device 230 initially displays a display interface for a video call.
Fig. 4 shows a set of graphical user interfaces (graphical user interface, GUI) provided by an embodiment of the present application.
For example, as shown in (a) of fig. 4, the in-vehicle apparatus 230 displays a display interface 231A of the in-vehicle map application.
Note that, the manner of triggering the in-vehicle device 230 to display the display interface 231A of the in-vehicle map application is not limited in the embodiment of the present application. For example, the display interface 231A of the in-vehicle map application may be a display interface that a user brings up through interaction with the in-vehicle apparatus 230. For another example, the display interface 231A of the in-vehicle map application may be an interface that the in-vehicle device 230 displays by default in the standby state.
The embodiment of the present application is not limited to the content specifically displayed in the display interface 231A. For example, as shown in fig. 4 (a), the display interface 231A includes avatar information 2311 of a user account logged in on the vehicle, a Bluetooth function icon 2312, a Wi-Fi function icon 2313, a cellular network signal icon 2314, a vehicle-mounted map application search box 2315, a card 2316A for switching to display of the vehicle-mounted music application, a display card 2317 for displaying the remaining battery level and remaining mileage of the vehicle, and a display card 2318 for a 360-degree (°) surround view function of the vehicle. The vehicle-mounted map application search box 2315 may include a go-home control 23151 and a go-to-work control 23152 set by the user.
Optionally, in some embodiments, as shown in (a) of fig. 4, the in-vehicle device 230 may further display a function bar 232, and the function bar 232 may display one or more icons. For example, as shown in fig. 4 (a), the function bar 232 includes an icon 2321 for switching to display of the desktop of the in-vehicle apparatus 230, an in-vehicle air circulation icon 2322, a driver seat heating function icon 2323, a driver area air conditioning temperature display icon 2324, a front passenger area air conditioning temperature display icon 2325, a front passenger seat heating function icon 2326, and a volume setting icon 2327.
Note that, in the embodiment of the present application, the number of icons displayed in the function bar 232 and which icons are specifically displayed are not limited.
Optionally, in some embodiments, the in-vehicle device 230 may also display the vehicle speed of the vehicle 210.
The manner in which the vehicle-mounted device 230 displays the vehicle speed is not limited in the embodiment of the present application.
For example, as shown in (a) of fig. 4, the in-vehicle apparatus 230 may display the vehicle speed (for example, 20 km/h) in the display interface 231A displaying the in-vehicle map application.
For another example, the in-vehicle device 230 may also display the vehicle speed in the display card 2317.
If the driver 220 wants to listen to music, the driver 220 can turn on the music App through interaction with the in-vehicle device 230. The manner in which the driver 220 opens the music App is not limited in the embodiments of the present application. For example, as shown in fig. 4 (a), the driver 220 may turn on the music App by clicking on a music icon in the card 2316A.
After the driver 220 turns on the music App of the in-vehicle apparatus 230, the in-vehicle apparatus 230 may display the play interface 233A of the music application. The play interface 233A includes song information, which may be the last song played by the driver 220 or a default song for a music application.
The embodiment of the present application does not limit the specific content of the song information displayed on the play interface 233A. For example, as shown in (b) of fig. 4, song information displayed by the play interface 233A includes: song title "Axxx", singer title "singer: xxx ", lyric text information" axxx "and" bxxxxx ", and a play progress bar.
Optionally, in some embodiments, the play interface 233A may also include one or more controls. For example, as shown in fig. 4 (b), the play interface 233A may also include a control 2331A for collecting a currently playing song, a control 2332A for switching to a previous song, a control 2333A for pausing the currently playing song, and a control 2334A for switching to a next song.
Alternatively, in some embodiments, as shown in (b) of fig. 4, the in-vehicle device 230 may display a function bar 232 in addition to the play interface 233A of the music application. The description of the function bar 232 may be referred to above and will not be repeated here.
In some embodiments, if the driver 220 suddenly presses the accelerator, the speed of the vehicle 210 increases from 20 km/h to 40 km/h within 1 s. In this case, when the in-vehicle device 230 detects that the vehicle speed of the vehicle 210 has risen from 20 km/h to 40 km/h and the vehicle speed difference is greater than 5 km/h (other threshold values may also be used; this is only an example), the in-vehicle device 230 may handle the process of the music application and display, in a full-screen manner, the display interface of the previous application of the music application or the default display interface, so as not to affect the driver 220 viewing the display interface of the previous application of the music application or the default display interface (for example, the display interface 231A of the in-vehicle map application). At this time, the driver 220 may see the display interface of the previous application of the music application or the default display interface. For example, the in-vehicle device 230 displays the display interface 231A of the in-vehicle map application in a full-screen manner.
In other embodiments, if the vehicle 210 is traveling, the vehicle 210 has a certain speed. In this case, when the in-vehicle device 230 detects that the vehicle speed of the vehicle 210 is 40 km/h, which is greater than 30 km/h (other thresholds may be used, for example, 0 km/h or 10 km/h; this is only an example), the in-vehicle device 230 may also handle the process of the music application and display, in a full-screen manner, the display interface of the previous application of the music application or the default display interface, so as not to affect the driver 220 viewing the display interface of the previous application of the music application or the default display interface (for example, the display interface 231A of the in-vehicle map application). At this time, the driver 220 may see the display interface of the previous application of the music application or the default display interface. For example, the in-vehicle device 230 displays the display interface 231A of the in-vehicle map application in a full-screen manner.
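A minimal Kotlin sketch of this speed-based trigger is given below, assuming speed samples taken roughly one second apart; the 5 km/h and 30 km/h thresholds follow the examples above, and the type and function names are illustrative assumptions.

```kotlin
// Illustrative sketch: decide whether to demote the foreground (music) application
// based on the vehicle speed, using the example thresholds from the embodiment above.
data class SpeedSample(val kmPerHour: Double)

const val SPEED_DIFF_THRESHOLD_KMH = 5.0   // speed increase within ~1 s
const val ABSOLUTE_SPEED_THRESHOLD_KMH = 30.0

fun shouldDemoteForegroundApp(previous: SpeedSample, current: SpeedSample): Boolean {
    val suddenAcceleration = (current.kmPerHour - previous.kmPerHour) > SPEED_DIFF_THRESHOLD_KMH
    val drivingFast = current.kmPerHour > ABSOLUTE_SPEED_THRESHOLD_KMH
    return suddenAcceleration || drivingFast
}

fun main() {
    // 20 km/h -> 40 km/h within 1 s: both conditions hold, demote the music application.
    println(shouldDemoteForegroundApp(SpeedSample(20.0), SpeedSample(40.0))) // true
    // 20 km/h -> 22 km/h: neither condition holds, keep the music application full screen.
    println(shouldDemoteForegroundApp(SpeedSample(20.0), SpeedSample(22.0))) // false
}
```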
The manner in which the in-vehicle device 230 handles the process of the music application is not limited in the embodiment of the present application.
In one example, the in-vehicle device 230 may directly end the process of the music application, such as directly closing the music application.
In another example, the in-vehicle device 230 may run the music application in the background, i.e., no interface for the music application may be displayed on the in-vehicle device 230.
Alternatively, in this example, while the music application is running in the background, the in-vehicle device 230 may also play the audio content corresponding to the music application running in the background at the same time.
In yet another example, the in-vehicle device 230 may display the play interface 233A of the music application in a non-full screen display. For example, as shown in (c) of fig. 4, the in-vehicle device 230 may display a floating window 234A, in which a playback interface of the music application is displayed in the floating window 234A.
Note that, in the embodiment of the present application, the display position of the floating window 234A on the in-vehicle device 230 is not limited. For example, if the driving safety of the driver 220 is not affected, the floating window 234A may be displayed on the in-vehicle device 230 at a side position close to the co-driver, that is, the floating window 234A may be displayed on the in-vehicle device 230 at a side position away from the driver 220. For another example, the floating window 234A may be displayed at a side of the in-vehicle device 230 near the driver 220 for convenience of the driver 220 to further operate the music application.
Alternatively, in some embodiments, as shown in (c) of fig. 4, the in-vehicle device 230 may also display the vehicle speed (e.g., 40 km/h) in the display interface 231A displaying the in-vehicle map application.
The content displayed by the playback interface of the music application in the floating window 234A is not limited in the embodiment of the present application. For example, the content displayed by the playback interface of the music application in the floating window 234A may be the display content of the playback interface 233A in fig. 4.
Optionally, in some embodiments, as shown in fig. 4 (c), the floating window 234A may further include a control A for closing the floating window 234A. Thus, when the user clicks on control A, the in-vehicle device 230 no longer displays the floating window 234A.
Optionally, in some embodiments, the user may also operate the floating window 234A to close the floating window 234A or change the floating window 234A to another way of non-full screen display. The embodiment of the present application is not limited to this operation, and for example, this operation may be an operation of pulling up or pulling down the floating window 234A.
It should be noted that the diagram shown in (c) of fig. 4 is described by using an example in which the vehicle-mounted device 230 displays the playback interface 233A of the music application in the floating window 234A as the non-full-screen display manner, and the embodiment of the present application does not limit the specific non-full-screen form in which the vehicle-mounted device 230 displays the playback interface 233A of the music application.
A specific description of a non-full screen display manner may be found in the following description of fig. 8, and will not be repeated here.
Optionally, in some embodiments, after the in-vehicle device 230 handles the process of the music application and displays the display interface of the previous application of the music application or the default display interface in a full-screen manner, if the in-vehicle device 230 detects that the vehicle speed difference of the vehicle 210 within 1 s is less than 5 km/h (other threshold values may also be used; this is only an example), the in-vehicle device 230 may resume the previous display manner of the music application, for example, display the display interface of the music application in a full-screen manner.
Alternatively, in other embodiments, after the in-vehicle device 230 handles the process of the music application and displays the display interface of the previous application of the music application or the default display interface in a full-screen manner, if the in-vehicle device 230 detects that the vehicle speed of the vehicle 210 is less than or equal to 30 km/h (other threshold values may be used, and this is only an example; the condition may also be that the vehicle speed of the vehicle 210 is equal to 0), the in-vehicle device 230 may resume the previous display manner of the music application, for example, display the playback interface of the music application in a full-screen manner.
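The two alternative restore conditions described above can be sketched in the same hedged style; the thresholds follow the examples in the text, and the function names are illustrative assumptions.

```kotlin
// Illustrative sketch: two alternative conditions for resuming the previous display
// manner of the demoted (music) application, per the embodiments above.
fun restoreBySpeedDifference(speedDiffWithin1sKmh: Double): Boolean =
    speedDiffWithin1sKmh < 5.0        // speed no longer rising sharply

fun restoreByAbsoluteSpeed(currentSpeedKmh: Double): Boolean =
    currentSpeedKmh <= 30.0           // vehicle slow enough (0 km/h also satisfies this)

fun main() {
    println(restoreBySpeedDifference(2.0))  // true: resume the full-screen playback interface
    println(restoreByAbsoluteSpeed(40.0))   // false: keep the demoted display manner
}
```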
In the set of GUIs shown in fig. 4 provided in the embodiment of the present application, in the case where the vehicle-mounted device displays the interface for playing music in full screen, if the vehicle-mounted device detects that the vehicle speed of the vehicle on which it is mounted rises from 20 km/h to 40 km/h within 1 s, the vehicle-mounted device may display the interface for playing music in a non-full-screen display manner and display the display interface of the vehicle-mounted map application in full screen. In this way, the user can see the display interface of the vehicle-mounted map application (the display interface of the previous application of the music application or the default display interface) on the vehicle-mounted device without interacting with the vehicle-mounted device, which helps improve user experience. In addition, if the interface for playing music is displayed in a non-full-screen display manner or the music application continues to run in the background, the user can intuitively see the display interface of the vehicle-mounted map application on the vehicle-mounted device and can continue to use the music application. This satisfies, to the greatest extent, the driver's needs for safety reminders (such as displaying a navigation interface) and music listening during driving, further improving user experience.
FIG. 5 illustrates another set of GUIs provided by an embodiment of the present application.
For example, as shown in (a) of fig. 5, the in-vehicle device 230 displays the display interface 231B of the in-vehicle map application.
For the description of the display interface 231B, reference may be made to the description related to the display interface 231A, and the card 2316A for switching to displaying the vehicle-mounted music application included in the display interface 231A is replaced by the card 2316B for switching to displaying the vehicle-mounted video application, which is not described herein.
Optionally, in some embodiments, as shown in (a) of fig. 5, the vehicle device 230 may further display a function field 232, and the description of the function field 232 may refer to the related description above, which is not repeated herein.
Optionally, in some embodiments, the in-vehicle device 230 may also display the speed of the vehicle 210 and/or the current gear of the vehicle 210.
The manner in which the vehicle-mounted device 230 displays the vehicle speed and/or the gear is not limited in the embodiment of the present application.
For example, as shown in (a) of fig. 5, the in-vehicle apparatus 230 may display the vehicle speed (for example, 0 km/h) in the display interface 231B displaying the in-vehicle map application.
If the driver 220 wants to see the video, the driver 220 can turn on the video App through interaction with the in-vehicle device 230. The manner in which the driver 220 opens the video App is not limited in this embodiment of the present application. For example, as shown in fig. 5 (a), the driver 220 may turn on the video App by clicking on the video icon in the card 2316B.
After the driver 220 turns on the video App of the in-vehicle device 230, the in-vehicle device 230 may display the play interface 233B of the video application. The play interface 233B includes video information, which may be the last video played by the driver 220 or a default video for the video application.
The embodiment of the present application does not limit the specific content of the video information displayed on the playing interface 233B. For example, as shown in (B) of fig. 5, the video information displayed by the playback interface 233B includes: video picture and playing progress bar.
Optionally, in some embodiments, the play interface 233B may also include one or more controls. For example, as shown in fig. 5 (B), the play interface 233B may also include a control 2331B for pausing the currently playing video.
Alternatively, in some embodiments, as shown in (b) of fig. 5, the in-vehicle device 230 may display the function bar 232 in addition to the play interface 233B of the video application. The description of the function bar 232 may be referred to above and will not be repeated here.
Thereafter, if the driver 220 starts the vehicle 210, the gear of the vehicle 210 is shifted from the P gear to the D gear, and the driver suddenly presses the accelerator so that the vehicle speed of the vehicle 210 increases from 0 km/h to 35 km/h within 1.5 s. In this case, if the in-vehicle device 230 detects that the gear of the vehicle 210 is shifted from the P gear to the D gear and the vehicle speed accelerates from 0 km/h to 35 km/h, so that the vehicle speed difference is greater than 8 km/h, the in-vehicle device 230 may handle the process of the video application and display, in a full-screen manner, the display interface of the previous application of the video application or the default display interface, so as not to affect the driver 220 viewing the display interface of the previous application of the video application or the default display interface (for example, the display interface 231B of the in-vehicle map application). At this time, the driver 220 may see the display interface of the previous application of the video application or the default display interface. For example, the in-vehicle device 230 displays the display interface 231B of the in-vehicle map application in a full-screen manner.
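A hedged Kotlin sketch of this combined gear-and-speed trigger follows; the gear enumeration, the 8 km/h threshold, and the 1.5 s window reflect the example above, while all names are illustrative assumptions.

```kotlin
// Illustrative sketch: trigger a full-screen switch to the previous/default interface
// when the gear shifts from P to D and the speed jumps sharply within ~1.5 s.
enum class Gear { P, R, N, D }

data class VehicleState(val gear: Gear, val speedKmh: Double)

const val SPEED_JUMP_THRESHOLD_KMH = 8.0 // speed difference within ~1.5 s

fun shouldDemoteVideoApp(previous: VehicleState, current: VehicleState): Boolean {
    val shiftedFromPToD = previous.gear == Gear.P && current.gear == Gear.D
    val speedJumped = (current.speedKmh - previous.speedKmh) > SPEED_JUMP_THRESHOLD_KMH
    return shiftedFromPToD && speedJumped
}

fun main() {
    val before = VehicleState(Gear.P, 0.0)
    val after = VehicleState(Gear.D, 35.0)
    println(shouldDemoteVideoApp(before, after)) // true: demote the video application
}
```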
The manner in which the in-vehicle device 230 handles the process of the video application is not limited in the embodiment of the present application.
In one example, in-vehicle device 230 may directly end the process of the video application, such as directly closing the video application.
In another example, the in-vehicle device 230 may run the video application in the background, i.e., no interface for the video application is displayed on the in-vehicle device 230.
Alternatively, in this example, if the video application running in the background has audio content, the in-vehicle device 230 may also play the audio content corresponding to the video application running in the background at the same time as the video application running in the background.
In yet another example, the in-vehicle device 230 may display the play interface 233B of the video application in a non-full-screen display manner. For example, as shown in (c) of fig. 5, the in-vehicle device 230 may display a card 234B, and the card 234B displays a playback interface of an audio application that plays the audio corresponding to the video shown in (b) of fig. 5.
At this time, the driver 220 may see a display interface of a previous application of the video application or a default display interface. For example, the in-vehicle device 230 displays the display interface 231B of the in-vehicle map application in a full-screen manner.
Note that, in the embodiment of the present application, the display position of the card 234B on the in-vehicle device 230 is not limited. For example, in order not to affect the driving safety of the driver 220, the card 234B may be displayed on the in-vehicle device 230 at a side position close to the co-driver, that is, at a side position away from the driver 220. For another example, for the convenience of the driver 220 in further operating the audio application, the card 234B may be displayed at a side of the in-vehicle device 230 near the driver 220.
The content displayed by the playing interface of the video application in the card 234B is not limited in the embodiment of the present application.
Illustratively, the content displayed by the playback interface of the video application in card 234B may include audio information and/or at least one control. For example, the audio information may include audio text information "Bxxx". The controls may include a control to collect the currently playing audio, a control to switch to the last audio, a control to pause the currently playing audio, a control to switch to the next audio, and so on.
Alternatively, in some embodiments, as shown in (c) of fig. 5, the in-vehicle device 230 may also display the vehicle speed (e.g., 35 km/h) in the display interface 231B displaying the in-vehicle map application.
Optionally, in some embodiments, the card 234B may also include controls for closing the card 234B.
Optionally, in some embodiments, the user may also operate on card 234B to turn off card 234B or change card 234B to another way of non-full screen display. The embodiment of the present application is not limited to this operation, and for example, this operation may be an operation of pulling up or pulling down the card 234B.
It should be noted that the diagram shown in fig. 5 (c) is described by using an example in which the vehicle-mounted device 230 displays, in the card 234B, the content corresponding to the play interface 233B of the video application as the non-full-screen display manner, and the embodiment of the present application does not limit the specific non-full-screen form in which the vehicle-mounted device 230 displays the play interface of the audio application.
A specific description of a non-full screen display manner may be found in the following description of fig. 8, and will not be repeated here.
Optionally, in some embodiments, after the in-vehicle device 230 handles the process of the video application and displays the display interface of the previous application of the video application or the default display interface in a full-screen manner, if the in-vehicle device 230 detects that the vehicle speed difference of the vehicle 210 within 1.5 s is less than 8 km/h and the gear of the vehicle 210 is shifted from the D gear to the P gear, the in-vehicle device 230 may resume the previous display manner of the video application, for example, display the display interface of the video application in a full-screen manner.
In the set of GUIs shown in fig. 5 provided in the embodiment of the present application, in the case where the vehicle-mounted device displays the interface for playing the video in full screen, if the vehicle-mounted device detects that the gear of the vehicle on which it is mounted is shifted from the P gear to the D gear and the vehicle speed suddenly increases from 0 km/h to 35 km/h, the vehicle-mounted device may continue playing the video in a non-full-screen display manner and display the display interface of the vehicle-mounted map application in full screen. In this way, the user can see the display interface of the vehicle-mounted map application (the display interface of the previous application of the video application or the default display interface) on the vehicle-mounted device without interacting with the vehicle-mounted device, which helps improve user experience. In addition, if the interface of the video application is displayed in a non-full-screen display manner or the video application continues to run in the background, the user can intuitively see the display interface of the vehicle-mounted map application on the vehicle-mounted device and can continue to use the video application. This satisfies, to the greatest extent, the driver's needs for safety reminders (such as displaying a navigation interface) and for listening to the audio corresponding to the video during driving, further improving user experience.
It should be noted that, in the embodiment shown in fig. 5, the case in which the gear of the vehicle is shifted from the P gear to the D gear and the vehicle speed suddenly increases from 0 km/h to 35 km/h is used as an example of triggering the vehicle-mounted device 230 to display the video application in a non-full-screen manner and display the display interface of the map application in a full-screen manner, which should not constitute a limitation on the present application. For example, in other embodiments, when the gear of the vehicle 210 is shifted from the P gear to the D gear but the speed of the vehicle 210 has not changed, the in-vehicle device 230 may be triggered to display the video application in a non-full-screen manner and display the display interface of the in-vehicle map application in a full-screen manner. Accordingly, when the gear of the vehicle 210 is shifted from the D gear to the P gear, the in-vehicle device 230 may also resume the previous display manner of the video application without the condition that the vehicle speed difference of the vehicle 210 is less than a certain threshold, or that the current vehicle speed of the vehicle 210 is less than a certain threshold, being satisfied.
The speed according to the present application may be an absolute speed or a relative speed, which is not limited in the embodiment of the present application.
The speed according to the present application may be an instantaneous speed or an average speed over a certain period of time, which is not limited in the embodiment of the present application.
FIG. 6 illustrates yet another set of GUIs provided by an embodiment of the present application.
For example, as shown in (a) of fig. 6, the in-vehicle apparatus 230 displays a display interface 231C of the in-vehicle map application.
For the description of the display interface 231C, reference may be made to the description related to the display interface 231A, except that the card 2316A for switching to display of the vehicle-mounted music application included in the display interface 231A is replaced with a card 2316C for switching to display of the vehicle-mounted chat tool application. Details are not described herein again.
Optionally, in some embodiments, as shown in (a) of fig. 6, the vehicle device 230 may further display a function field 232, and the description of the function field 232 may refer to the related description above, which is not repeated herein.
Optionally, in some embodiments, the in-vehicle device 230 may also display the current gear of the vehicle 210.
The manner in which the shift position is displayed by the in-vehicle device 230 is not limited in the embodiment of the present application.
Thereafter, if someone initiates a video call to the driver through the chat tool application of the in-vehicle apparatus 230, as shown in (b) of fig. 6, the in-vehicle apparatus 230 displays, in full screen, the display interface 233C1 for waiting to answer the video call, and the display interface 233C1 may display information of the person initiating the video call.
The embodiment of the application does not limit the specific content of the information of the incoming video call participants. For example, as shown in (b) of fig. 6, the information of the incoming video talker may include avatar information C11 and title text information "wife" used in the chat tool application of the incoming video talker.
Optionally, in some embodiments, the display interface 233C1 may also display at least one control that may be used to further process the video call. For example, as shown in fig. 6 (b), the display interface 233C1 may further include an answer control C12 for answering a video call and a hang-up control C13 for rejecting answering a video call.
Alternatively, in some embodiments, as shown in (b) of fig. 6, the in-vehicle device 230 may display the function bar 232 in addition to the display interface 233C1 for waiting for the video call. The description of the function bar 232 may be referred to above and will not be repeated here.
If the driver 220 wants to answer the video call, the driver 220 can answer the video call through interaction with the in-vehicle device 230. The manner in which the driver 220 answers the video call is not limited in the embodiments of the present application. For example, as shown in (b) of fig. 6, the driver 220 can answer the video call by clicking the answer control C12 for answering the video call.
After the driver 220 answers the video call, the in-vehicle device 230 may display the call interface 233C2 of the video call in full screen. The call interface 233C2 includes information of the video caller.
The embodiment of the present application does not limit the specific content of the information of the video call party displayed on the call interface 233C2. For example, as shown in (c) of fig. 6, the information of the video call parties displayed on the call interface 233C2 includes video images of both parties of the video call.
The number of people in the video image is not limited, and it mainly depends on the collection range of the collection apparatus (such as a camera) that collects the video image.
Optionally, in some embodiments, the call interface 233C2 may also include one or more controls. For example, as shown in fig. 6 (C), the call interface 233C2 may also include a control C14 for closing the camera, a control C15 for hanging up a video call, and a control C16 for muting the video call.
Alternatively, in some embodiments, as shown in (C) of fig. 6, the in-vehicle device 230 may display the function bar 232 in addition to the call interface 233C 2. The description of the function bar 232 may be referred to above and will not be repeated here.
Thereafter, if the driver 220 starts the vehicle 210, the gear of the vehicle 210 is shifted from the P gear to the D gear. In this case, if the in-vehicle device 230 detects that the gear of the vehicle 210 is shifted from the P gear to the D gear, the in-vehicle device 230 may handle the process of the chat tool application and display, in a full-screen manner, the display interface of the previous application of the chat tool application or the default display interface, so as not to affect the driver 220 viewing the display interface of the previous application of the chat tool application or the default display interface (for example, the display interface 231C of the in-vehicle map application). At this time, the driver 220 can see the display interface of the previous application of the chat tool application or the default display interface. For example, the in-vehicle device 230 displays the display interface 231C of the in-vehicle map application in a full-screen manner.
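A hedged Kotlin sketch of this gear-only trigger is given here; the handling options mirror the examples discussed in the following paragraphs, and the type and function names are illustrative assumptions.

```kotlin
// Illustrative sketch: when the gear shifts from P to D, decide how to handle the
// process of the foreground (chat tool) application.
enum class Gear { P, R, N, D }

enum class AppHandling { END_PROCESS, RUN_IN_BACKGROUND, SHOW_NON_FULL_SCREEN }

fun onGearChanged(previous: Gear, current: Gear, preferred: AppHandling): AppHandling? =
    if (previous == Gear.P && current == Gear.D) preferred else null // null: keep full screen

fun main() {
    println(onGearChanged(Gear.P, Gear.D, AppHandling.SHOW_NON_FULL_SCREEN)) // SHOW_NON_FULL_SCREEN
    println(onGearChanged(Gear.D, Gear.D, AppHandling.SHOW_NON_FULL_SCREEN)) // null
}
```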
The manner in which the in-vehicle device 230 processes the chat tool application is not limited in this embodiment of the present application.
In one example, in-vehicle device 230 may directly end the process of the chat tool application, such as directly closing the chat tool application.
In another example, in-vehicle device 230 may run the chat tool application in the background, i.e., no interface for the chat tool application is displayed on in-vehicle device 230.
Optionally, in this example, if the chat tool application running in the background has audio content, the in-vehicle device 230 may also play the audio content corresponding to the chat tool application running in the background at the same time as the chat tool application running in the background.
In yet another example, the in-vehicle device 230 may display the call interface 233C2 in a non-full screen display. For example, as shown in (d) of fig. 6, the in-vehicle device 230 may display the call interface 234C in a picture-in-picture form.
At this point, driver 220 may see the display interface of the previous application of the chat tool or a default display interface. For example, the in-vehicle device 230 displays the display interface 231C of the in-vehicle map application in a full-screen manner.
Note that, in the embodiment of the present application, the display position of the call interface 234C on the in-vehicle device 230 is not limited. For example, if the driving safety of the driver 220 is not affected, the call interface 234C may be displayed on the in-vehicle device 230 at a side position close to the co-driver, that is, the call interface 234C may be displayed on the in-vehicle device 230 at a side position away from the driver 220. For another example, if the driver 220 further operates the call interface 234C for convenience, the call interface 234C may be displayed at a side of the in-vehicle device 230 near the driver 220.
The embodiment of the present application is not limited to what is displayed in the call interface 234C.
In one example, the content displayed in the talk interface 234C may be the same as the content displayed in the talk interface 233C 2. I.e., the content displayed in the call interface 234C includes information about the video caller (e.g., video images of both parties to the call) and one or more controls.
In another example, the content displayed in the call interface 234C includes only part or all of the content displayed in the main screen (screen occupying the largest area) in the call interface 233C 2.
For example, as shown in fig. 6 (d), the content displayed in the call interface 234C includes only all of the content displayed in the main screen of the call interface 233C2, that is, the content displayed in the call interface 234C includes only the video image and the one or more controls in the content displayed in the main screen of the call interface 233C2.
For another example, the content displayed in the call interface 234C includes only video images among the content displayed in the main screen (screen occupying the largest area) in the call interface 233C 2.
Optionally, in some embodiments, the talk interface 234C may also include a control for closing the talk interface 234C.
Optionally, in some embodiments, the user may also operate the talk interface 234C to turn off the talk interface 234C or change the talk interface 234C to another way of non-full screen display. The embodiment of the present application is not limited to this operation, and for example, this operation may be an operation of pulling up or pulling down the call interface 234C.
Note that, the diagram shown in (d) of fig. 6 is described by taking the example of the picture-in-picture of the call interface 233C2 of the chat tool application of the vehicle-mounted device 230 in a non-full screen display manner, and the embodiment of the present application does not limit the specific form of the call interface 233C2 of the vehicle-mounted device 230 in a non-full screen display manner.
A specific description of a non-full screen display manner may be found in the following description of fig. 8, and will not be repeated here.
Optionally, in some embodiments, after the in-vehicle device 230 handles the process of the chat tool application and displays the display interface of the previous application of the chat tool application or the default display interface in a full-screen manner, if the in-vehicle device 230 detects that the gear of the vehicle 210 is no longer in the D gear, the in-vehicle device 230 may resume the previous display manner of the chat tool application, for example, display the interface of the chat tool application in a full-screen manner.
In the set of GUIs as shown in fig. 6 provided by the embodiment of the present application, if the in-vehicle device detects that the gear of the vehicle on which the in-vehicle device is mounted is adjusted from the P-gear to the D-gear under the condition that the in-vehicle device displays the interface of the video call in full screen, the in-vehicle device may display the interface of the video call in a non-full screen display manner and display the display interface of the in-vehicle map application in full screen. In this way, the user can see the display interface of the vehicle map application (the display interface of the previous application of the chat tool or the default display interface) on the vehicle-mounted device without interacting with the vehicle-mounted device, thereby being beneficial to improving the experience of the user. In addition, if the interface of the chat tool application is displayed in a non-full screen display mode or the background continues to run the chat tool application, a user can intuitively see the display interface of the vehicle-mounted map application on the vehicle-mounted device and can continue to use the chat tool application, the requirements of safety performance reminding (such as displaying a navigation interface) and video communication of a driver in the driving process are met to the greatest extent, and the user experience is further improved.
It should be noted that, in the embodiment shown in fig. 6, the case in which the shift of the gear from the P gear to the D gear directly triggers the vehicle-mounted device to display the video call interface in a non-full-screen manner and display the display interface of the vehicle-mounted map application in a full-screen manner is used as an example, which should not constitute a limitation on the present application. For example, in other embodiments, when the gear of the vehicle 210 is shifted from the P gear to the D gear and the vehicle speed difference is greater than or equal to a certain threshold, or the current vehicle speed is greater than or equal to a certain threshold, the vehicle-mounted device may be triggered to display the video call interface in a non-full-screen manner and display the display interface of the vehicle-mounted map application in a full-screen manner. Correspondingly, when the gear of the vehicle 210 is shifted from the D gear to the P gear and the vehicle speed difference is smaller than a certain threshold, or the current vehicle speed is smaller than a certain threshold, the vehicle-mounted device 230 may also resume the previous display manner of the video call interface. Reference may be made to the relevant description in the embodiment shown in fig. 5, and details are not described herein again.
FIG. 7 illustrates yet another set of GUIs provided by an embodiment of the present application.
For example, as shown in fig. 7 (a), the driver 220 is making a video call with another person through the chat tool application of the in-vehicle apparatus 230, and the in-vehicle apparatus 230 can display the call interface 233C2 of the video call in full screen. The description of the call interface 233C2 may be referred to the description related to (C) in fig. 6, and will not be repeated here.
At this time, if the driver 220 wants to reverse, the driver adjusts the gear of the vehicle 210 to the R gear. The in-vehicle device 230 can detect that the gear of the vehicle 210 is in the R gear, and in order not to affect the driver 220 viewing the default display interface (for example, the display interface of the reverse image), the in-vehicle device 230 can handle the process of the chat tool application and display the default display interface. At this time, the driver 220 may see the default display interface.
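The reverse-gear case can be sketched along the same lines; the split-screen layout corresponds to the example in (b) of fig. 7, and the type names, side assignment, and interface labels are illustrative assumptions.

```kotlin
// Illustrative sketch: when the gear is in R, show the default (reverse image) interface,
// optionally in a split screen together with the demoted video call interface.
enum class Gear { P, R, N, D }

data class SplitScreenLayout(val left: String, val right: String)

fun layoutForGear(gear: Gear, foregroundInterface: String): SplitScreenLayout? =
    if (gear == Gear.R) {
        // Reverse image occupies one side; the ongoing call is kept on the side
        // away from the driver (assumed left-hand-drive vehicle).
        SplitScreenLayout(left = "reverse image interface 2421", right = foregroundInterface)
    } else {
        null // no split screen required; keep the current full-screen interface
    }

fun main() {
    println(layoutForGear(Gear.R, "call interface 2422"))
    // SplitScreenLayout(left=reverse image interface 2421, right=call interface 2422)
    println(layoutForGear(Gear.D, "call interface 2422")) // null
}
```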
The manner in which the in-vehicle device 230 processes the chat tool application is not limited in this embodiment of the present application.
In one example, in-vehicle device 230 may directly end the process of the chat tool application, such as directly closing the chat tool application.
In another example, in-vehicle device 230 may run the chat tool application in the background, i.e., no interface for the chat tool application is displayed on in-vehicle device 230.
Optionally, in this example, if the chat tool application running in the background has audio content, the in-vehicle device 230 may also play the audio content corresponding to the chat tool application running in the background at the same time as the chat tool application running in the background.
In yet another example, the in-vehicle device 230 may display the call interface 233C2 in a non-full-screen display manner. For example, as shown in (b) of fig. 7, the in-vehicle device 230 may display the call interface 2422 and the default display interface (the display interface 2421 of the reverse image) in a split screen.
It should be noted that, in the embodiment of the present application, the display position of the call interface 2422 on the in-vehicle device 230 is not limited. For example, in order not to affect the driving safety of the driver 220, the call interface 2422 may be displayed on the side of the in-vehicle device 230 close to the front passenger, that is, on the side away from the driver 220. For another example, to make it convenient for the driver 220 to further operate the call interface 2422, the call interface 2422 may be displayed on the side of the in-vehicle device 230 close to the driver 220.
The form of the split-screen display is not limited in the embodiment of the present application. For example, the split-screen display may be a left-and-right split-screen display.
The size of each interface displayed by the split screen is not limited in the embodiment of the application.
Optionally, in some embodiments, the call interface 2422 may also include a control for closing the call interface 2422.
Optionally, in some embodiments, the user may also perform an operation on the call interface 2422 to close the call interface 2422 or to change the call interface 2422 to another non-full-screen display manner. The specific form of this operation is not limited in the embodiment of the present application; for example, the operation may be pulling up or pulling down the call interface 2422.
It should be noted that (b) in fig. 7 takes split-screen display as an example of displaying the call interface 233C2 of the chat tool application in a non-full-screen manner on the in-vehicle device 230; the specific non-full-screen display form of the call interface 233C2 is not limited in the embodiment of the present application.
A specific description of a non-full screen display manner may be found in the following description of fig. 8, and will not be repeated here.
Optionally, in some embodiments, after the in-vehicle device 230 processes the process of the chat tool application and displays the default display interface, if the in-vehicle device 230 detects that the gear of the vehicle 210 is in the P gear, the in-vehicle device 230 may restore the previous display manner of the chat tool application, for example, display the interface of the chat tool application in full screen.
In the set of GUIs shown in fig. 7 provided by the embodiment of the present application, if the in-vehicle device detects, while displaying the video call interface in full screen, that the gear of the vehicle carrying the in-vehicle device is in the R gear, the in-vehicle device may display the video call interface and the display interface of the reverse image in a split-screen display manner. Therefore, the user can see the display interface of the reverse image on the in-vehicle device without interacting with the in-vehicle device, which improves the user experience. In addition, if the interface of the chat tool application is displayed in a non-full-screen display manner or the chat tool application continues to run in the background, the user can intuitively see the display interface of the reverse image on the in-vehicle device and can continue to use the chat tool application. This satisfies, to the greatest extent, the driver's need for safety-related prompts (such as displaying the reverse image interface) and for video calls during driving, further improving the user experience.
In the embodiment shown in fig. 7, shifting the gear of the vehicle to the R gear directly triggers the in-vehicle device to display the video call interface and the display interface of the reverse image in a split-screen display manner; this is merely an example and should not limit the present application. For example, in other embodiments, the in-vehicle device may be triggered to display the video call interface and the display interface of the reverse image in a split-screen display manner only when the gear of the vehicle 210 is shifted to the R gear and the vehicle speed difference is greater than or equal to a certain threshold, or the current vehicle speed is greater than or equal to a certain threshold. Correspondingly, when the gear of the vehicle 210 is in the P gear and the vehicle speed difference is smaller than a certain threshold, or the current vehicle speed is smaller than a certain threshold, the in-vehicle device 230 may restore the previous display manner of the chat tool application. For details, reference may be made to the related description in the embodiment shown in fig. 5, which is not repeated here.
Next, a display method provided by an embodiment of the present application is described with reference to fig. 8. The method may be performed by an electronic device. The electronic device may be, for example, the in-vehicle device 230 described above in fig. 3 to 7.
For example, as shown in fig. 8, the display method 300 includes S310 to S340. S310 to S340 are described in detail below.
S310, displaying the interface of the first application in a full screen mode.
The embodiment of the application does not limit the type of the first application. Illustratively, the first application is a non-map-like application.
For example, the first application may be a music application as described in fig. 4. At this time, the playing interface 233A shown in (b) in fig. 4 is an interface of the first application displayed in full screen.
As another example, the first application may be a video application as described in fig. 5. At this time, the playing interface 233B shown in (B) in fig. 5 is an interface of the first application displayed in full screen.
As another example, the first application may be a chat tool application as described in fig. 6 or fig. 7. At this time, the call interface 233C2 as shown in fig. 6 (C) or fig. 7 (a) is an interface of the first application displayed in full screen.
It should be noted that a display mode in which the difference between the screen area and the area of the displayed interface of the first application is smaller than a preset value may be referred to as full-screen display. The specific value of the preset value is not limited in the embodiment of the present application.
It should be noted that, in full-screen display, the displayed content may occupy the whole screen of the display, or may occupy the whole screen except for black borders.
It should be noted that, optionally, in some embodiments, in the case of displaying the application interface in full screen, the display screen may further include content of other applications, which is not limited by the embodiment of the present application. For example, in the case of displaying the interface of the first application in full screen, a floating window, a picture-in-picture, a card, or the like corresponding to the second application may also be displayed on the display screen.
Optionally, in some embodiments, the electronic device may display other content on a display screen of the electronic device in addition to the interface of the first application being displayed full screen.
For example, as shown in (b) of fig. 4, the in-vehicle apparatus 230 may display the function bar 232 in addition to the play interface 233A.
As another example, as shown in (B) of fig. 5, the in-vehicle apparatus 230 may display the function bar 232 in addition to the play interface 233B.
As another example, as shown in fig. 6 (C) or fig. 7 (a), the in-vehicle device 230 may display the function field 232 in addition to the call interface 233C 2.
S320, acquiring first state information, and determining that the first state information meets a first preset condition.
The first state information is used for indicating the state of the electronic device or the device connected with the electronic device.
In one example, the first status information is used to indicate a condition of a movement speed of the electronic device. The first preset condition includes: the moving speed of the electronic equipment is greater than or equal to a first threshold value; and/or the difference of the moving speeds of the electronic device in the target time is greater than or equal to a second threshold.
The specific value of the first threshold is not limited in the embodiment of the application. For example, the first threshold may be 30km/h.
The specific value of the second threshold is not limited in the embodiment of the present application. For example, the second threshold may be 5 km/h.
The target time in the embodiment of the present application may be any duration, which is not limited in the embodiment of the present application.
For example, in the embodiment described above with respect to fig. 4, the first state information is used to indicate the moving speed of the in-vehicle device 230. The first preset condition includes: the difference in the moving speed of the in-vehicle device 230 within 1 s (an example of the target time) is greater than or equal to 5 km/h (an example of the second threshold). Thus, in the embodiment illustrated in fig. 4, when the speed of the vehicle 210 carrying the in-vehicle device 230 rises from 20 km/h to 40 km/h within 1 s, that is, the difference in the moving speed of the in-vehicle device 230 within the target time (for example, 1 s) is 20 km/h, which is greater than 5 km/h, the in-vehicle device 230 can be considered to satisfy the first preset condition.
In another example, the first status information is used to indicate a condition of a gear of a vehicle connected to the electronic device. The first preset condition includes: the gear of the vehicle is in the D gear; and/or the gear of the vehicle is adjusted from the P gear to the D gear; and/or the gear of the vehicle is in R gear.
For example, in the embodiment described above with respect to fig. 6, the first state information is used to indicate the gear of the vehicle connected to the in-vehicle device 230. The first preset condition includes: the gear of the vehicle is adjusted from the P gear to the D gear. Thus, in the embodiment illustrated in fig. 6, when the gear of the vehicle 210 carrying the in-vehicle device 230 is shifted from the P gear to the D gear, the in-vehicle device 230 can be considered to satisfy the first preset condition. For another example, in the embodiment described above with respect to fig. 7, the first state information is used to indicate the gear of the vehicle connected to the in-vehicle device 230. The first preset condition includes: the gear of the vehicle is in the R gear. Thus, in the embodiment illustrated in fig. 7, when the gear of the vehicle 210 carrying the in-vehicle device 230 is in the R gear, the in-vehicle device 230 can be considered to satisfy the first preset condition.
In yet another example, the first status information is used to indicate not only a condition of a moving speed of the electronic device but also a condition of a gear of a vehicle connected to the electronic device. The first preset condition includes a first sub-preset condition and a second sub-preset condition. The first sub-preset condition comprises: the moving speed of the electronic equipment is greater than or equal to a first threshold value; and/or the difference of the moving speeds of the electronic device in the target time is greater than or equal to a second threshold. The second sub-preset condition includes: the gear of the vehicle is in the D gear; and/or the gear of the vehicle is adjusted from the P gear to the D gear; and/or the gear of the vehicle is in R gear.
It should be noted that, if the electronic device is the vehicle-mounted device 230 described in fig. 3 to 7, the moving speed of the vehicle-mounted device 230 is the moving speed of the vehicle 210 carrying the vehicle-mounted device 230.
For example, in the embodiment described above with respect to fig. 5, the first state information is used to indicate not only the moving speed of the in-vehicle device 230 but also the gear of the vehicle connected to the in-vehicle device 230. The first sub-preset condition includes: the difference in the moving speed of the in-vehicle device 230 within 1.5 s (another example of the target time) is greater than or equal to 8 km/h (another example of the second threshold). The second sub-preset condition includes: the gear of the vehicle is adjusted from the P gear to the D gear. Thus, in the embodiment shown in fig. 5, when the speed of the vehicle 210 carrying the in-vehicle device 230 rises from 0 km/h to 35 km/h within 1.5 s, that is, the difference in the moving speed of the in-vehicle device 230 within 1.5 s is 35 km/h, which is greater than 8 km/h, the in-vehicle device 230 can be considered to satisfy the first sub-preset condition. When the gear of the vehicle 210 carrying the in-vehicle device 230 is adjusted from the P gear to the D gear, the in-vehicle device 230 can be considered to satisfy the second sub-preset condition. When the in-vehicle device 230 satisfies both the first sub-preset condition and the second sub-preset condition, the in-vehicle device 230 is considered to satisfy the first preset condition.
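As an illustration only, the combined check of the first sub-preset condition and the second sub-preset condition could be sketched as follows; the type names, field names, and threshold values (VehicleState, Gear, FIRST_THRESHOLD_KMH, and so on) are assumptions made for this sketch and are not part of the embodiment.

```kotlin
// Illustrative sketch only: type names, field names and thresholds are assumptions, not the
// embodiment's API. Thresholds use the example values given in the description.
enum class Gear { P, R, N, D }

data class VehicleState(val speedKmh: Float, val gear: Gear, val timestampMs: Long)

const val FIRST_THRESHOLD_KMH = 30f      // example first threshold
const val SECOND_THRESHOLD_KMH = 5f      // example second threshold
const val TARGET_TIME_MS = 1_000L        // example target time (1 s)

// First sub-preset condition: based on the moving speed.
fun meetsSpeedCondition(previous: VehicleState, current: VehicleState): Boolean {
    val withinTargetTime = current.timestampMs - previous.timestampMs <= TARGET_TIME_MS
    val speedDelta = current.speedKmh - previous.speedKmh
    return current.speedKmh >= FIRST_THRESHOLD_KMH ||
        (withinTargetTime && speedDelta >= SECOND_THRESHOLD_KMH)
}

// Second sub-preset condition: based on the gear of the connected vehicle.
fun meetsGearCondition(previous: VehicleState, current: VehicleState): Boolean =
    current.gear == Gear.D ||
        (previous.gear == Gear.P && current.gear == Gear.D) ||
        current.gear == Gear.R

// Combined first preset condition: both sub-conditions hold, as in the fig. 5 example.
fun meetsFirstPresetCondition(previous: VehicleState, current: VehicleState): Boolean =
    meetsSpeedCondition(previous, current) && meetsGearCondition(previous, current)
```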
The connection manner between the electronic device and the vehicle is not limited in the embodiment of the present application.

In one example, the electronic device may be connected to the vehicle through a wired connection. For example, the electronic device may be connected to the vehicle through a controller area network (controller area network, CAN) bus.
In another example, the electronic device may be connected to the vehicle through a wireless network. For example, the electronic device may be connected to the vehicle via bluetooth or an in-vehicle network.
The manner of acquiring the moving speed of the electronic device and/or the gear of the vehicle connected to the electronic device is not limited in the embodiment of the present application.
In one example, if the electronic device is connected to the vehicle through the CAN bus, a sensor sends speed information to the CAN bus when the vehicle speed changes, and the electronic device can obtain the speed data of the electronic device through the CAN bus. Similarly, when the gear of the vehicle changes, a sensor sends the gear information of the vehicle to the CAN bus, and the electronic device can obtain the gear information of the vehicle connected to the electronic device through the CAN bus.
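On an Android Automotive-based head unit, one possible way to obtain the vehicle speed and gear is to subscribe to the corresponding vehicle properties; the following sketch assumes such a system and the availability of the car-property permissions, and is not the embodiment's own implementation.

```kotlin
// Sketch assuming an Android Automotive head unit with car-property access; not the
// embodiment's own implementation.
import android.car.Car
import android.car.VehiclePropertyIds
import android.car.hardware.CarPropertyValue
import android.car.hardware.property.CarPropertyManager
import android.content.Context

class VehicleStateSource(context: Context) {
    private val car = Car.createCar(context)
    private val propertyManager =
        car.getCarManager(Car.PROPERTY_SERVICE) as CarPropertyManager

    fun start(onSpeedKmh: (Float) -> Unit, onGear: (Int) -> Unit) {
        val callback = object : CarPropertyManager.CarPropertyEventCallback {
            override fun onChangeEvent(value: CarPropertyValue<*>) {
                when (value.propertyId) {
                    // PERF_VEHICLE_SPEED is reported in m/s; convert to km/h here.
                    VehiclePropertyIds.PERF_VEHICLE_SPEED ->
                        onSpeedKmh((value.value as Float) * 3.6f)
                    // GEAR_SELECTION reports the currently selected gear as an integer constant.
                    VehiclePropertyIds.GEAR_SELECTION ->
                        onGear(value.value as Int)
                }
            }
            override fun onErrorEvent(propertyId: Int, areaId: Int) {
                // Log and ignore in this sketch.
            }
        }
        propertyManager.registerCallback(
            callback, VehiclePropertyIds.PERF_VEHICLE_SPEED, CarPropertyManager.SENSOR_RATE_NORMAL)
        propertyManager.registerCallback(
            callback, VehiclePropertyIds.GEAR_SELECTION, CarPropertyManager.SENSOR_RATE_ONCHANGE)
    }
}
```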
In another example, if a positioning module (for example, a global positioning system (Global Positioning System, GPS) module) is built into the electronic device, the positioning module can sense the positioning data of the electronic device. The electronic device can obtain its positioning data from the positioning module and derive its speed data from the positioning data.
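A minimal sketch of deriving speed data from positioning data is shown below; it assumes an Android device with location permission granted and an API level where only onLocationChanged needs to be implemented, and the helper name observeSpeed is purely illustrative.

```kotlin
// Sketch only: speed derived from positioning data when no vehicle bus signal is available.
// Assumes location permission is granted and an API level where only onLocationChanged
// must be implemented.
import android.annotation.SuppressLint
import android.content.Context
import android.location.Location
import android.location.LocationListener
import android.location.LocationManager

@SuppressLint("MissingPermission")
fun observeSpeed(context: Context, onSpeedKmh: (Float) -> Unit) {
    val manager = context.getSystemService(Context.LOCATION_SERVICE) as LocationManager
    var last: Location? = null
    val listener = object : LocationListener {
        override fun onLocationChanged(location: Location) {
            val speedMps = if (location.hasSpeed()) {
                location.speed                            // GNSS-reported speed in m/s
            } else {
                val previous = last
                if (previous != null) {
                    val dt = (location.time - previous.time) / 1000f
                    if (dt > 0f) previous.distanceTo(location) / dt else 0f
                } else 0f
            }
            last = location
            onSpeedKmh(speedMps * 3.6f)                   // convert m/s to km/h
        }
    }
    manager.requestLocationUpdates(LocationManager.GPS_PROVIDER, 1_000L, 0f, listener)
}
```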
S330, displaying an interface of the first application in a non-full screen display mode, or continuing to run the first application in the background, or ending the process of the first application.
In some embodiments, continuing to run the first application in the background may be understood as not displaying an interface of the first application on the electronic device.
Optionally, in some embodiments, where the interface of the first application displayed full screen includes audio content and the first application continues to be run in the background, the display method 300 further includes: and playing the audio content corresponding to the running first application.
For example, the playback interface 233A of the music application (an example of the first application) described in (b) of fig. 4 includes audio content. When the in-vehicle device 230 detects that the speed of the vehicle 210 connected to the in-vehicle device 230 accelerates from 20 km/h to 40 km/h (an example of satisfying the first preset condition), in order not to prevent the driver 220 from viewing the display interface 231B of the in-vehicle map application, the in-vehicle device 230 may switch the music application to run in the background. At this time, the user can still hear the audio played by the music application.
In some embodiments, ending the process of the first application may be understood as ending the process corresponding to the interface of the first application displayed in full screen in S310. In other embodiments, ending the process of the first application may be understood as ending all processes of the first application, that is, closing the first application.
The manner of ending the process of the first application is not limited in the embodiment of the present application.
It should be noted that, the embodiment of the present application is not limited to the specific form of the non-full screen display manner.
For example, ways of non-full screen display may include a floating window (which may also be referred to as a small window), a picture-in-picture, split screens, floating balls, and/or cards.
Optionally, in some embodiments, the non-full-screen display manner corresponding to each type of application may be pre-configured. The correspondence between applications and their non-full-screen display manners is not limited in the embodiment of the present application.
In order to further improve user experience, different non-full screen display modes can be configured for different applications according to the influence degree of the applications on the electronic equipment. That is, the manner in which the non-full screen display is displayed is related to the degree of impact of the application on the electronic device.
It should be noted that, the degree of influence of the application on the electronic device can be understood as: the degree of influence of the display interface of the application on the user when displayed on the electronic device can be said to be the degree of interference of the display interface of the application displayed on the electronic device on the user when the user uses the electronic device.
By way of example, the degree of influence of an application on an electronic device may be classified into three classes, such as a low degree of influence, a medium degree of influence, and a high degree of influence, according to the value of influence of the application on the electronic device. Wherein, the influence value corresponding to the low influence degree is smaller than or equal to the third threshold value, the influence value corresponding to the medium influence degree is larger than the third threshold value and smaller than the fourth threshold value, and the influence value corresponding to the high influence degree is larger than or equal to the fourth threshold value.
For example, the music-like application or the radio-like application may have a low degree of influence on the electronic device; the degree of influence of the chat-like application on the electronic device may be a medium degree of influence; the degree of impact of the video playback type application on the electronic device may be a high degree of impact.
In addition, in the case that the degree of influence of the application on the electronic device is the low degree of influence, the non-full-screen display manner is a floating window; in the case that the degree of influence of the application on the electronic device is the medium degree of influence, the non-full-screen display manner is picture-in-picture or split screen; in the case that the degree of influence of the application on the electronic device is the high degree of influence, the non-full-screen display manner is a floating ball and/or a card.
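A possible way to express this pre-configured mapping is sketched below; the enum names, the classify helper, and the concrete mapping simply mirror the examples above and are assumptions rather than a fixed specification.

```kotlin
// Illustrative mapping sketch mirroring the examples above; the enum names, the classify
// helper, and the concrete mapping are assumptions rather than a fixed specification.
enum class ImpactLevel { LOW, MEDIUM, HIGH }

enum class NonFullScreenMode { FLOATING_WINDOW, PICTURE_IN_PICTURE, SPLIT_SCREEN, FLOATING_BALL, CARD }

// Classify an application's impact value using the third and fourth thresholds.
fun classify(impactValue: Int, thirdThreshold: Int, fourthThreshold: Int): ImpactLevel = when {
    impactValue <= thirdThreshold -> ImpactLevel.LOW
    impactValue < fourthThreshold -> ImpactLevel.MEDIUM
    else -> ImpactLevel.HIGH
}

// Pre-configured non-full-screen display manners for each impact level.
fun modesFor(level: ImpactLevel): List<NonFullScreenMode> = when (level) {
    ImpactLevel.LOW -> listOf(NonFullScreenMode.FLOATING_WINDOW)
    ImpactLevel.MEDIUM -> listOf(NonFullScreenMode.PICTURE_IN_PICTURE, NonFullScreenMode.SPLIT_SCREEN)
    ImpactLevel.HIGH -> listOf(NonFullScreenMode.FLOATING_BALL, NonFullScreenMode.CARD)
}
```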
Optionally, in some embodiments, where the interface of the first application displayed in full screen includes video content and audio content, the interface of the first application displayed as the floating ball and/or the card includes only the audio content. For example, the playback interface 233B of the video application (an example of the first application) described in (B) of fig. 5 includes both video content and the audio content corresponding to the video content. When the in-vehicle device 230 detects that the vehicle 210 connected to the in-vehicle device 230 is shifted from the P gear to the D gear (an example of satisfying the second sub-preset condition) and the vehicle speed accelerates from 0 km/h to 35 km/h (an example of satisfying the first sub-preset condition), in order not to prevent the driver 220 from viewing the display interface 231B of the in-vehicle map application, the in-vehicle device 230 displays the interface of the video application as the card 234B shown in (c) of fig. 5. At this time, the interface of the video application displayed by the card 234B may include only the audio content rather than the video content, that is, the user can hear the audio corresponding to the video played by the video application but cannot see the played video frames.
Optionally, in some embodiments, the user may modify the manner in which the respective applications are displayed non-fully through setting options in the electronic device.
Optionally, in some embodiments, the interface of the first application displayed in a non-full screen display includes at least one control for managing the first application.
For example, as shown in fig. 4 (c), the floating window 234A may further include a control a for closing the floating window 234A.
Optionally, in some embodiments, the display method 300 further includes S350 and S360 to change a non-full screen display manner of the interface of the first application described in S330.
S350, detecting operation of an interface of the first application displayed in a non-full screen display mode.
The specific form of the operation described in S350 is not limited in the embodiment of the present application. For example, the operation may be pulling up or pulling down the interface of the first application displayed in the non-full-screen display manner described in S330.
S360, responding to the operation described in S350, and displaying the interface of the first application in another non-full-screen display mode.
For example, if the interface of the first application in S330 is displayed in the form of a floating ball, the interface of the first application may be displayed in the form of a card after the user pulls it up.
Through S350 and S360, the user can operate the interface of the first application displayed in the non-full-screen display manner as needed to adjust the non-full-screen display manner of the first application, which further improves the user experience.
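The pull-up/pull-down operation in S350 could, for instance, be detected as a vertical swipe on the non-full-screen view, as in the following sketch; the threshold value and the floating ball-to-card transition are illustrative assumptions.

```kotlin
// Sketch of detecting a pull-up / pull-down on a non-full-screen view to switch its display
// form; the gesture threshold and the floating ball-to-card transition are assumptions.
import android.view.MotionEvent
import android.view.View

fun attachPullGesture(view: View, onPullUp: () -> Unit, onPullDown: () -> Unit) {
    var downY = 0f
    val slopPx = 80f   // assumed minimum swipe distance
    view.setOnTouchListener { _, event ->
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> {
                downY = event.rawY
                true
            }
            MotionEvent.ACTION_UP -> {
                val dy = event.rawY - downY
                when {
                    dy < -slopPx -> onPullUp()     // e.g. floating ball -> card
                    dy > slopPx -> onPullDown()    // e.g. card -> floating ball
                }
                true
            }
            else -> true
        }
    }
}
```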
S340, displaying the interface of the second application in full screen or in split screen.
Specifically, when the interface of the first application is displayed in a non-full-screen display manner, the first application continues to run in the background, or the process of the first application is ended, the interface of the second application is displayed in full screen; alternatively, when the interface of the first application is displayed in a non-full-screen display manner, the interfaces of the first application and the second application are displayed in a split screen.
It should be noted that, the execution sequence of S330 and S340 is not limited in the embodiment of the present application. For example, S330 may be performed before S340, or S330 may be performed after S340, or S330 may be performed simultaneously with S340.
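For illustration, the branching of S330 and S340 can be summarized in the following sketch; the strategy names and the placeholder helper functions are assumptions introduced for readability and do not correspond to any API defined by the embodiment.

```kotlin
// Illustrative sketch of the S330/S340 branching; the strategy names and placeholder
// helpers are assumptions introduced for readability, not APIs of the embodiment.
enum class FirstAppStrategy { NON_FULL_SCREEN, RUN_IN_BACKGROUND, END_PROCESS }

// Placeholder actions standing in for the actual window-management calls.
fun showFullScreen(app: String) = println("full screen: $app")
fun showNonFullScreen(app: String) = println("non-full screen: $app")
fun showSplitScreen(a: String, b: String) = println("split screen: $a | $b")
fun moveToBackground(app: String) = println("background: $app")
fun endProcess(app: String) = println("end process: $app")

fun onFirstPresetConditionMet(
    strategy: FirstAppStrategy,
    splitScreen: Boolean,
    firstApp: String = "chat tool",
    secondApp: String = "in-vehicle map"
) {
    when (strategy) {
        FirstAppStrategy.NON_FULL_SCREEN ->
            if (splitScreen) {
                showSplitScreen(firstApp, secondApp)       // S330 + S340, split-screen branch
            } else {
                showNonFullScreen(firstApp)                // S330
                showFullScreen(secondApp)                  // S340
            }
        FirstAppStrategy.RUN_IN_BACKGROUND -> {
            moveToBackground(firstApp)                     // S330
            showFullScreen(secondApp)                      // S340
        }
        FirstAppStrategy.END_PROCESS -> {
            endProcess(firstApp)                           // S330
            showFullScreen(secondApp)                      // S340
        }
    }
}
```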
The embodiment of the application does not limit the type of the second application.
In one example, the second application may be a default application. That is, regardless of whether the second application on the electronic device is in a running state, if the electronic device determines that the first state information satisfies the first preset condition, the electronic device runs the second application and displays the interface of the second application in full screen or in split screen.
By way of example, the default application may be a map-like application.
For example, after the call interface 233C2 of the chat tool application (an example of the first application) described in fig. 7 is displayed in full screen, since the vehicle carrying the in-vehicle device 230 is in the reversing state (the gear of the vehicle is in the R gear), the reverse image application is displayed by default on the in-vehicle device 230 in a split-screen manner, and the reverse image application is the second application.
For another example, after the playback interface 233A of the music application (another example of the first application) shown in fig. 4 is displayed in full screen, since the speed of the vehicle carrying the in-vehicle device 230 rises from 20 km/h to 40 km/h within 1 s, the in-vehicle map application is displayed in full screen by default on the in-vehicle device 230, and the in-vehicle map application is the second application.
In another example, the second application is an application that was last run on the electronic device prior to displaying the interface of the first application full screen. At this time, before S310, the electronic device displays the interface of the second application full screen or split screen.
For example, before the playing interface 233A of the music application (an example of the first application) as described in fig. 4 is displayed in full screen, the application that has been executed last time on the electronic device is the vehicle map application, and the vehicle map application is the second application.
Through S310 to S360, according to the state of the electronic device itself or of the device connected to the electronic device, the user interface of the application that the user most wants to see can be displayed in full screen on the electronic device, while other applications are displayed in a non-full-screen manner, switched to run in the background, or closed directly. Thus, without interacting with the electronic device, the user sees the user interface of the application that the user most wants to see displayed in full screen on the electronic device, which helps improve the user experience. In addition, if the interface of the first application is displayed in a non-full-screen display manner or the first application continues to run in the background, the user can intuitively see the interface of the second application on the electronic device, that is, the user can use the functions of the second application and can continue to use the functions of the first application, further improving the user experience.
Optionally, in some embodiments, after S340, the display method 300 further includes: s370 and S380 to determine whether to restore the display interface of the first application referred to in S330.
S370, acquiring second state information, and determining that the second state information meets a second preset condition.
The second status information is used for indicating the status of the electronic device or the device connected with the electronic device.
In one example, the second status information is used to indicate a condition of a movement speed of the electronic device. The second preset condition includes: the moving speed of the electronic equipment is smaller than a first threshold value; and/or the difference of the moving speeds of the electronic device in the target time is smaller than a second threshold value.
For example, in the embodiment described above with respect to fig. 4, the second state information is used to indicate the condition of the moving speed of the in-vehicle apparatus 230. The second preset condition includes: the difference in the moving speed of the in-vehicle device 230 within 1s (an example of the target time) is less than 5km/h (an example of the second threshold value). Thus, in the embodiment illustrated in fig. 4, the in-vehicle apparatus 230 can be considered to satisfy the second preset condition when the difference in the moving speeds of the vehicle 210 carrying the in-vehicle apparatus 230 within 1s is less than 5 km/h.
In another example, the second status information is used to indicate a condition of a gear of a vehicle connected to the electronic device. The second preset condition includes: the gear of the vehicle is in the P gear; and/or the gear of the vehicle is adjusted from the D gear to the P gear; and/or the gear of the vehicle is in N gear.
For example, in the embodiment described above with respect to fig. 6, the second state information is used to indicate the condition of the shift position of the vehicle connected to the in-vehicle device 230. The second preset condition includes: the gear of the vehicle is changed from D gear to P gear. Thus, in the embodiment illustrated in fig. 6, when the gear of the vehicle 210 carrying the in-vehicle apparatus 230 is shifted from the D gear to the P gear, the in-vehicle apparatus 230 can be considered to satisfy the second preset condition.
Also for example, in the embodiment described above with respect to fig. 7, the second state information is used to indicate the condition of the shift position of the vehicle connected to the in-vehicle device 230. The second preset condition includes: the gear of the vehicle is in the P gear. Thus, in the embodiment illustrated in fig. 7, when the gear of the vehicle 210 carrying the in-vehicle apparatus 230 is in the P range, the in-vehicle apparatus 230 can be considered to satisfy the second preset condition.
In yet another example, the second state information is used to indicate not only the moving speed of the electronic device but also the gear of the vehicle connected to the electronic device. The second preset condition includes a third sub-preset condition and a fourth sub-preset condition. The third sub-preset condition includes: the moving speed of the electronic device is smaller than the first threshold; and/or the difference in the moving speed of the electronic device within the target time is smaller than the second threshold. The fourth sub-preset condition includes: the gear of the vehicle is in the P gear; and/or the gear of the vehicle is adjusted from the D gear to the P gear; and/or the gear of the vehicle is in the N gear.
For example, in the embodiment described above with respect to fig. 5, the second state information is used not only to indicate the condition of the moving speed of the in-vehicle apparatus 230 but also to indicate the condition of the shift position of the vehicle connected to the in-vehicle apparatus 230. The third sub-preset condition includes: the difference in the moving speed of the in-vehicle apparatus 230 within 1.5s (another example of the target time) is smaller than 8km/h (another example of the second threshold value). The fourth sub-preset condition includes: the gear of the vehicle is changed from D gear to P gear. Thus, in the embodiment illustrated in fig. 5, when the difference in the moving speeds of the vehicle 210 carrying the in-vehicle apparatus 230 within 1.5s is less than 8km/h, the in-vehicle apparatus 230 can be considered to satisfy the third sub-preset condition. And when the gear of the vehicle 210 carrying the in-vehicle device 230 is shifted from D-gear to P-gear, the in-vehicle device 230 can be considered to satisfy the fourth sub-preset condition. In the case where the in-vehicle apparatus 230 satisfies both the third sub-preset condition and the fourth sub-preset condition, it is considered that the in-vehicle apparatus 230 satisfies the second preset condition.
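A corresponding sketch of the restore check (the second preset condition) is given below, mirroring the earlier condition sketch; the parameter names and default threshold values are assumptions taken from the examples in the description.

```kotlin
// Sketch of the restore check (second preset condition); parameter names and default
// threshold values are assumptions taken from the examples in the description.
fun meetsSecondPresetCondition(
    speedKmh: Float,
    speedDeltaKmh: Float,        // speed difference within the target time
    gear: Char,                  // 'P', 'R', 'N' or 'D'
    previousGear: Char,
    firstThresholdKmh: Float = 30f,
    secondThresholdKmh: Float = 5f
): Boolean {
    // Speed-related part: moving speed below the first threshold and/or
    // speed difference within the target time below the second threshold.
    val speedOk = speedKmh < firstThresholdKmh || speedDeltaKmh < secondThresholdKmh
    // Gear-related part: P gear, a D-to-P shift, or N gear.
    val gearOk = gear == 'P' || (previousGear == 'D' && gear == 'P') || gear == 'N'
    return speedOk && gearOk
}
```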
In the embodiment of the present application, the content indicated by the first state information is related to the content included in the first preset condition, and the content indicated by the second state information is related to the content included in the second preset condition. However, whether the content indicated by the second state information is related to the content indicated by the first state information is not limited, and whether the content included in the second preset condition is related to the content included in the first preset condition is also not limited.
For example, the first state information is used to indicate a condition of a moving speed of the electronic device, and the first preset condition includes that the moving speed of the electronic device is greater than or equal to a first threshold, that is, the first state information and the first preset condition are both related to the moving speed of the electronic device.
For another example, the second state information is used to indicate a condition of a gear of a vehicle connected to the electronic device, and the second preset condition includes that the gear of the vehicle is in P range, that is, the second state information and the second preset condition are both related to the gear of the vehicle connected to the electronic device.
For another example, the first state information is used to indicate a condition of a moving speed of the electronic device, and the first preset condition includes the moving speed of the electronic device being greater than or equal to a first threshold. The second state information is used for indicating the condition of a gear of a vehicle connected with the electronic device, and the second preset condition includes that the gear of the vehicle is in a P gear. At this time, the contents included in the first state information and the second state information are irrelevant, and the contents included in the first preset condition and the second preset condition are irrelevant.
For another example, the first state information and the second state information are both used for indicating a condition of a moving speed of the electronic device, the first preset condition includes that the moving speed of the electronic device is greater than or equal to a first threshold, and the second preset condition includes that a difference value of the moving speeds of the electronic device in a target time is greater than or equal to a second threshold. At this time, the contents included in the first state information and the second state information are identical, so the contents included in the first state information and the second state information are related. And the first preset condition and the second preset condition include contents which are not identical but are both related to the moving speed of the electronic device, so that the first preset condition and the second preset condition can be considered to be related.
For another example, the first state information is used to indicate the moving speed of the electronic device, the second state information is used to indicate the gear of the vehicle connected to the electronic device, the first preset condition includes the moving speed of the electronic device being greater than or equal to the first threshold, and the second preset condition includes the gear of the vehicle being in the R gear. In this case, the first state information and the second state information are unrelated, and the first preset condition and the second preset condition are unrelated.

S380, displaying the interface of the first application in a full-screen manner.
The principle of switching between full screen display and floating window/split screen display, picture-in-picture display and audio-listening display, respectively, is described in detail below.
(1) Principle of switching between full screen display and floating window display/split screen display
The activity manager (ActivityManager) provides an interface. When activity A jumps to activity B, the flag (Flag) of the intent (Intent) is set to contain FLAG_ACTIVITY_NEW_TASK, that is, activity B starts in a new window. At the same time, the windowing mode of activity B is set to a freeform small-window mode or a full-screen mode. After the jump to activity B, the windowing mode of activity B may also be modified again to the small-window mode or the full-screen mode. If it is modified to the small-window mode, the display position of the window may be set at the same time; the window may be displayed at the position used the last time it was in small-window mode, and if there is no previous position information, it may be displayed by default at a certain position on the left or the right. If it is set to the full-screen mode, the display position of the window may be set to the whole screen according to the width and height of the screen.
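On Android, the closest public approximation of this mechanism is launching the target activity in a new task with launch bounds, as sketched below; setting the windowing mode directly is a system-level capability, so this sketch is only an approximation under that assumption and not the embodiment's actual interface.

```kotlin
// Sketch of launching an activity in a new task either full screen or as a freeform small
// window. Assumes a system image that supports freeform windows; the direct "set windowing
// mode" interface described above is a system capability and is only approximated here with
// the public ActivityOptions.setLaunchBounds API.
import android.app.ActivityOptions
import android.content.Context
import android.content.Intent
import android.graphics.Rect

fun launchInNewWindow(context: Context, target: Intent, smallWindowBounds: Rect?) {
    // Activity B starts in a new task/window.
    target.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)
    val options = ActivityOptions.makeBasic()
    // Non-null bounds: freeform small-window mode at the given position (for example the last
    // small-window position, or a default rectangle on the left or right).
    // Null bounds: the window is laid out according to the full screen size.
    options.setLaunchBounds(smallWindowBounds)
    context.startActivity(target, options.toBundle())
}
```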
(2) Principle of switching between full screen display and picture-in-picture display
When the first state information satisfies the first preset condition, during the running of the first application, the system creates a floating window and a virtual screen, moves the first application to the virtual screen, and displays the picture composited on the virtual screen in the floating window; at this time, the floating window displays the interface of the first application. At the same time, according to the positional correspondence between the floating window and the virtual screen, click events on the floating window are injected into the virtual screen, so that the interface of the first application can be controlled through the floating window. At this time, the home screen displays the interface of the second application. When the second state information satisfies the second preset condition, the first application is moved back to the home screen, and the floating window and the virtual screen are closed.
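A simplified sketch of this floating-window-plus-virtual-screen approach on Android is shown below; it assumes the caller holds the privileges needed to add an overlay window and to launch an activity on another display, and it omits the touch-injection step.

```kotlin
// Simplified sketch of the floating-window-plus-virtual-screen approach; assumes the caller
// may add an overlay window and launch activities on another display, and omits the
// touch-injection step described above.
import android.app.ActivityOptions
import android.content.Context
import android.content.Intent
import android.hardware.display.DisplayManager
import android.view.Surface

fun moveAppToFloatingWindow(
    context: Context,
    appIntent: Intent,
    floatingSurface: Surface,     // surface backing the floating window's content view
    widthPx: Int,
    heightPx: Int,
    densityDpi: Int
) {
    val displayManager = context.getSystemService(Context.DISPLAY_SERVICE) as DisplayManager
    // 1. Create a virtual screen whose composited frames are rendered into the floating window.
    val virtualDisplay = displayManager.createVirtualDisplay(
        "first-app-virtual-screen", widthPx, heightPx, densityDpi, floatingSurface,
        DisplayManager.VIRTUAL_DISPLAY_FLAG_OWN_CONTENT_ONLY or
            DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC)
    // 2. Move (relaunch) the first application onto the virtual screen.
    val options = ActivityOptions.makeBasic()
        .setLaunchDisplayId(virtualDisplay.display.displayId)
    appIntent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)
    context.startActivity(appIntent, options.toBundle())
    // 3. Click events on the floating window would then be mapped to virtual-screen
    //    coordinates and injected, keeping the first application controllable (not shown).
}
```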
(3) Principle of switching between full screen display and listening to audio
(a) Application adaptation is required (e.g. video playback applications require support for listening to video)
When the first state information satisfies the first preset condition, the first application is notified, and in response to the event the first application automatically switches from the current video playback mode to an audio mode, and vice versa.
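On the application side, such an adaptation could look like the following sketch, which uses a plain MediaPlayer as an assumed stand-in for whatever player the video application actually embeds; the class and callback names are illustrative.

```kotlin
// Application-side adaptation sketch: on the "switch to audio" notification the player detaches
// its video output and keeps playing audio. MediaPlayer is an assumed stand-in for whatever
// player the video application actually embeds; class and method names are illustrative.
import android.media.MediaPlayer
import android.view.SurfaceHolder

class ListenableVideoPlayer(private val player: MediaPlayer) {

    fun onEnterAudioOnlyMode() {
        // Drop the video surface; frames are no longer rendered while audio playback continues.
        player.setDisplay(null)
    }

    fun onExitAudioOnlyMode(holder: SurfaceHolder) {
        // Reattach the surface when the state allows video again (for example, the P gear).
        player.setDisplay(holder)
    }
}
```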
(b) Application adaptation is not required (e.g., whether or not the video playback class application supports listening to video is not limited)
When the first state information satisfies the first preset condition, a virtual screen is created and the application is moved to the virtual screen; a floating ball (or small window) is created on the home screen to carry the content composited on the virtual screen. At this time, the first application continues to run, the home screen displays the interface of the second application, and the sound of the first application is output.
Through S370 and S380, if the electronic device detects that the state of the electronic device or of the device connected to the electronic device satisfies the second preset condition while the interface of the first application is displayed in a non-full-screen manner, or the first application continues to run in the background, or the process of the first application has ended, and the interface of the second application is displayed in full screen or in split screen, the electronic device may display the interface of the first application in full screen again. Therefore, without requiring the user to interact with the electronic device, the electronic device can preferentially recommend the content displayed on the electronic device according to the state of the electronic device or of the device connected to the electronic device, further improving the user experience.
Next, an apparatus provided by an embodiment of the present application will be described with reference to fig. 9 to 10.
Fig. 9 is a schematic block diagram of an apparatus provided by an embodiment of the present application. The apparatus may be provided to the in-vehicle device 230 described in fig. 3 to 7 or the electronic device described in fig. 8 above.
For example, as shown in fig. 9, the apparatus 400 includes: a display unit 410 and a processing unit 420.
In one implementation, the display unit 410 is configured to perform the steps related to the display of the in-vehicle device 230 in the embodiments described above in fig. 3 to 7; the processing unit 420 is configured to perform the steps related to the processing of the in-vehicle apparatus 230 in the embodiments described above in fig. 3 to 7.
Optionally, in some embodiments, the apparatus may further include a playing unit, where the playing unit is configured to perform the steps related to playing of the in-vehicle device 230 in the embodiments described above in fig. 3 to 7.
In another implementation, the display unit 410 is configured to perform the steps related to display of the electronic device in the embodiment described above in fig. 8; the processing unit 420 is configured to perform the steps associated with the processing of the electronic device in the embodiment described above in fig. 8.
Optionally, in some embodiments, the apparatus may further comprise a playing unit for performing the steps related to playing of the electronic device in the embodiment described in fig. 8 above.
Fig. 10 shows a schematic structural diagram of an electronic device 500 provided by an embodiment of the present application.
For example, as shown in fig. 10, the electronic device 500 includes: one or more processors 510 and one or more memories 520, where the one or more memories 520 store one or more computer programs, and the one or more computer programs include instructions. When the instructions are executed by the one or more processors 510, the in-vehicle device 230 described above in fig. 3 to 7 is caused to perform the technical solutions described in the embodiments of fig. 3 to 7, or the electronic device described in the method 300 is caused to perform the technical solutions described in the method 300.
An embodiment of the present application provides a computer program product, which when executed on an electronic device, causes the electronic device to execute the technical solution in the foregoing embodiment. The implementation principle and technical effects are similar to those of the related embodiments of the method, and are not repeated here.
An embodiment of the present application provides a readable storage medium, where the readable storage medium contains instructions, and when the instructions are executed in an electronic device, the instructions cause the electronic device to execute the technical solution of the foregoing embodiment. The implementation principle and technical effect are similar, and are not repeated here.
The embodiment of the application provides a chip for executing instructions, and when the chip runs, the technical scheme in the embodiment is executed. The implementation principle and technical effect are similar, and are not repeated here.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (30)

1. A display method, wherein the method is applied to an electronic device, the method comprising:
displaying an interface of the first application in a full screen manner;
acquiring first state information, and determining that the first state information meets a first preset condition, wherein the first state information is used for indicating the state of the electronic equipment or equipment connected with the electronic equipment;
displaying an interface of the first application in a non-full screen display mode, or continuously running the first application in the background, or ending the process of the first application, and displaying an interface of a second application in a full screen mode; or
And displaying interfaces of the first application and the second application in a split screen mode.
2. The display method of claim 1, wherein the second application is a default application or is an application that was last run on the electronic device prior to displaying the interface of the first application full screen.
3. A display method according to claim 1 or 2, wherein the first application comprises a non-map-like application and the second application comprises a map-like application.
4. The display method according to any one of claims 1 to 3, wherein the first status information is used to indicate a status of a moving speed of the electronic device,
the first preset condition includes:
the moving speed of the electronic equipment is greater than or equal to a first threshold value; and/or,
the difference of the moving speeds of the electronic equipment in the target time is larger than or equal to a second threshold value.
5. The display method according to any one of claims 1 to 4, wherein the first status information is used to indicate a condition of a gear of a vehicle to which the electronic apparatus is connected,
the first preset condition includes:
the gear of the vehicle is in the D gear; and/or,
the gear of the vehicle is adjusted from the P gear to the D gear; and/or,
the gear of the vehicle is in R gear.
6. The display method according to any one of claims 1 to 5, wherein the manner of non-full screen display includes: a floating window, a picture-in-picture, a split screen, a floating ball, and/or a card.
7. The display method according to any one of claims 1 to 6, wherein the manner of non-full screen display is related to the degree of influence of the first application on the electronic device.
8. The display method according to claim 7, wherein the influence degree includes a low influence degree, a medium influence degree, and a high influence degree, the low influence degree corresponds to an influence value smaller than or equal to a third threshold value, the medium influence degree corresponds to an influence value larger than the third threshold value and smaller than a fourth threshold value, the high influence degree corresponds to an influence value larger than or equal to the fourth threshold value,
in the case that the influence degree of the first application on the electronic equipment is low, the non-full screen display mode is a floating window;
in the case that the influence degree of the first application on the electronic equipment is a medium influence degree, the non-full screen display mode is picture-in-picture or split screen;
in the case that the influence degree of the first application on the electronic equipment is high, the non-full screen display mode is a floating ball and/or a card.
9. The display method according to claim 6 or 8, wherein in the case where the interface of the first application displayed in full screen includes video content and audio content, the interface of the first application displayed in the floating ball and/or the card includes only audio content; or,
In the case where the interface of the first application displayed full screen includes audio content and the first application continues to be run in the background, the method further includes:
and playing the audio content corresponding to the running of the first application.
10. The display method according to any one of claims 1 to 9, wherein the interface of the first application displayed in a non-full screen display manner includes at least one control for managing the first application.
11. The display method according to any one of claims 1 to 10, characterized in that the method further comprises:
detecting operation of an interface of the first application displayed in a non-full screen display mode;
responsive to the operation, displaying an interface of the first application in another non-full screen display.
12. The display method according to any one of claims 1 to 11, wherein after the interface of the second application is displayed full screen or split screen, the method further comprises:
acquiring second state information, and determining that the second state information meets a second preset condition, wherein the second state information is used for indicating the state of the electronic equipment or equipment connected with the electronic equipment;
And displaying the interface of the first application in a full screen mode.
13. The display method according to claim 12, wherein the second status information is used to indicate a status of a moving speed of the electronic device,
the second preset condition includes:
the moving speed of the electronic equipment is smaller than a first threshold value; and/or,
the difference in the moving speeds of the electronic device in the target time is smaller than a second threshold.
14. The display method according to claim 12 or 13, wherein the second status information is used to indicate a condition of a gear of a vehicle to which the electronic device is connected,
the second preset condition includes:
the gear of the vehicle is in the P gear; and/or,
the gear of the vehicle is adjusted from D gear to P gear; and/or,
the gear of the vehicle is in N gear.
15. An electronic device, the electronic device comprising: one or more processors; a memory; a plurality of applications; and one or more programs, wherein the one or more programs are stored in the memory, which when executed by the processor, cause the electronic device to perform the steps of:
Displaying an interface of the first application in a full screen manner;
acquiring first state information, and determining that the first state information meets a first preset condition, wherein the first state information is used for indicating the state of the electronic equipment or equipment connected with the electronic equipment;
displaying an interface of the first application in a non-full screen display mode, or continuously running the first application in the background, or ending the process of the first application, and displaying an interface of a second application in a full screen mode; or
And displaying interfaces of the first application and the second application in a split screen mode.
16. The electronic device of claim 15, wherein the second application is a default application or is an application that was last run on the electronic device prior to displaying the interface of the first application full screen.
17. The electronic device of claim 15 or 16, wherein the first application comprises a non-map-like application and the second application comprises a map-like application.
18. The electronic device according to any one of claims 15 to 17, wherein the first status information is used to indicate a condition of a moving speed of the electronic device,
The first preset condition includes:
the moving speed of the electronic equipment is greater than or equal to a first threshold value; and/or,
the difference of the moving speeds of the electronic equipment in the target time is larger than or equal to a second threshold value.
19. The electronic device of any one of claims 15 to 18, wherein the first status information is used to indicate a condition of a gear of a vehicle to which the electronic device is connected,
the first preset condition includes:
the gear of the vehicle is in the D gear; and/or,
the gear of the vehicle is adjusted from the P gear to the D gear; and/or,
the gear of the vehicle is in R gear.
20. The electronic device of any one of claims 15-19, wherein the manner of non-full screen display comprises: a floating window, a picture-in-picture, a split screen, a floating ball, and/or a card.
21. The electronic device of any of claims 15-20, wherein the manner of non-full screen display is related to a degree of impact of the first application on the electronic device.
22. The electronic device according to claim 21, wherein the degree of influence includes a low degree of influence, a medium degree of influence, and a high degree of influence; the low degree of influence corresponds to an influence value less than or equal to a third threshold, the medium degree of influence corresponds to an influence value greater than the third threshold and less than a fourth threshold, and the high degree of influence corresponds to an influence value greater than or equal to the fourth threshold, wherein
in the case where the degree of influence of the first application on the electronic device is the low degree of influence, the non-full-screen display mode is a floating window;
in the case where the degree of influence of the first application on the electronic device is the medium degree of influence, the non-full-screen display mode is picture-in-picture or split screen;
in the case where the degree of influence of the first application on the electronic device is the high degree of influence, the non-full-screen display mode is a floating ball and/or a card.
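
As an illustration of the threshold mapping in claim 22, the following Kotlin sketch assigns a non-full-screen display mode from a hypothetical influence value; the numeric thresholds are placeholders, and the medium and high branches pick one of the alternatives the claim allows (picture-in-picture or split screen; floating ball and/or card).

    // Placeholder values; the third and fourth thresholds are not fixed by the claim.
    const val THIRD_THRESHOLD = 30
    const val FOURTH_THRESHOLD = 70

    enum class NonFullScreenMode { FLOATING_WINDOW, PICTURE_IN_PICTURE, SPLIT_SCREEN, FLOATING_BALL, CARD }

    // Low degree of influence -> floating window; medium degree -> picture-in-picture
    // (split screen would also satisfy the claim); high degree -> floating ball
    // (a card would also satisfy the claim).
    fun modeForInfluenceValue(influenceValue: Int): NonFullScreenMode = when {
        influenceValue <= THIRD_THRESHOLD -> NonFullScreenMode.FLOATING_WINDOW
        influenceValue < FOURTH_THRESHOLD -> NonFullScreenMode.PICTURE_IN_PICTURE
        else -> NonFullScreenMode.FLOATING_BALL
    }
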
23. The electronic device according to claim 20 or 22, wherein the one or more programs, when executed by the one or more processors, cause the electronic device to perform the steps of:
in the case where the interface of the first application displayed in full screen includes video content and audio content, displaying the interface of the first application as a floating ball and/or a card that includes only the audio content; or
in the case where the interface of the first application displayed in full screen includes audio content and the first application continues to run in the background, playing the audio content corresponding to the first application.
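
A minimal sketch of the content reduction described in claim 23, assuming a hypothetical AppContent model: video content of the former full-screen interface is dropped, while audio content is kept and continues to play.

    // Hypothetical content model for the first application's full-screen interface.
    data class AppContent(val hasVideo: Boolean, val hasAudio: Boolean)

    // When the first application is reduced to a floating ball and/or card, or keeps
    // running in the background, only its audio content is retained.
    fun contentKeptWhenReduced(fullScreenContent: AppContent): AppContent =
        AppContent(hasVideo = false, hasAudio = fullScreenContent.hasAudio)
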
24. The electronic device according to any one of claims 15 to 23, wherein the interface of the first application displayed in the non-full-screen display mode includes at least one control for managing the first application.
25. The electronic device according to any one of claims 15 to 24, wherein the one or more programs, when executed by the one or more processors, cause the electronic device to perform the steps of:
detecting an operation on the interface of the first application displayed in the non-full-screen display mode;
in response to the operation, displaying the interface of the first application in another non-full-screen display mode.
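
One possible reading of claim 25, sketched in Kotlin under the assumption that the detected operation promotes the first application to a larger non-full-screen form; the claim itself only requires that another non-full-screen display mode be used.

    enum class ReducedMode { FLOATING_WINDOW, PICTURE_IN_PICTURE, SPLIT_SCREEN, FLOATING_BALL, CARD }

    // Hypothetical policy: an operation (for example a tap) on the current reduced
    // surface switches the first application to another, larger non-full-screen form.
    fun nextModeOnOperation(current: ReducedMode): ReducedMode = when (current) {
        ReducedMode.FLOATING_BALL, ReducedMode.CARD -> ReducedMode.FLOATING_WINDOW
        ReducedMode.FLOATING_WINDOW -> ReducedMode.PICTURE_IN_PICTURE
        ReducedMode.PICTURE_IN_PICTURE -> ReducedMode.SPLIT_SCREEN
        ReducedMode.SPLIT_SCREEN -> ReducedMode.SPLIT_SCREEN
    }
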
26. The electronic device according to any one of claims 15 to 25, wherein the one or more programs, when executed by the one or more processors, cause the electronic device to perform the steps of:
after the interface of the second application is displayed in full screen or in split screen, acquiring second state information, and determining that the second state information meets a second preset condition, wherein the second state information is used to indicate a state of the electronic device or of a device connected to the electronic device;
displaying the interface of the first application in a full-screen manner.
27. The electronic device according to claim 26, wherein the second state information is used to indicate a condition of a moving speed of the electronic device, and
the second preset condition includes:
the moving speed of the electronic device is less than a first threshold; and/or
the difference of the moving speeds of the electronic device within the target time is less than a second threshold.
28. The electronic device according to claim 26 or 27, wherein the second state information is used to indicate a condition of a gear of a vehicle to which the electronic device is connected, and
the second preset condition includes:
the gear of the vehicle is in the P gear; and/or
the gear of the vehicle is adjusted from the D gear to the P gear; and/or
the gear of the vehicle is in the N gear.
29. A computer-readable storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the display method according to any one of claims 1 to 14.
30. A chip system, comprising: a processor for calling and running a computer program from a memory, so that an electronic device on which the chip system is mounted performs the display method according to any one of claims 1 to 14.
CN202210392551.1A 2022-04-15 2022-04-15 Display method and electronic equipment Pending CN116954770A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210392551.1A CN116954770A (en) 2022-04-15 2022-04-15 Display method and electronic equipment
PCT/CN2023/087332 WO2023197999A1 (en) 2022-04-15 2023-04-10 Display method and electronic device

Publications (1)

Publication Number Publication Date
CN116954770A true CN116954770A (en) 2023-10-27

Family

ID=88328997

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210392551.1A Pending CN116954770A (en) 2022-04-15 2022-04-15 Display method and electronic equipment

Country Status (2)

Country Link
CN (1) CN116954770A (en)
WO (1) WO2023197999A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101580850B1 (en) * 2014-03-18 2015-12-30 주식회사 오비고 Method for configuring dynamic user interface of head unit in vehicle by using mobile terminal, and head unit and computer-readable recoding media using the same
CN107487274A (en) * 2017-03-30 2017-12-19 宝沃汽车(中国)有限公司 Control the method, apparatus and vehicle of vehicle-carrying display screen
US10097684B1 (en) * 2018-03-19 2018-10-09 Google Llc Maintaining an automobile configuration of a mobile computing device while switching between automobile and non-automobile user interfaces
CN108833707A (en) * 2018-06-22 2018-11-16 深圳市万普拉斯科技有限公司 Incoming display method, device, computer readable storage medium and mobile device

Also Published As

Publication number Publication date
WO2023197999A1 (en) 2023-10-19


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination