CN114666435A - Method for using enhanced function of electronic equipment and related device


Info

Publication number
CN114666435A
CN114666435A (application CN202210131964.4A)
Authority
CN
China
Prior art keywords
electronic device
enhanced
application
function
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210131964.4A
Other languages
Chinese (zh)
Other versions
CN114666435B (en)
Inventor
陶强
陈俊
于雪松
高光远
张跃
韩静
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202210131964.4A
Publication of CN114666435A
Application granted
Publication of CN114666435B
Legal status: Active
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485 Scrolling or panning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/445 Program loading or initiating
    • G06F 9/44505 Configuring for program initiating, e.g. using registry, configuration files
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/72409 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72469 User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method and related apparatus for using an enhanced function of an electronic device. In the method, the electronic device provides an enhanced function interface for a third-party application that does not itself have the enhanced function, and the third-party application can call the enhanced function of the electronic device through the enhanced function interface. The electronic device provides different enhanced function options for different third-party applications, and thereby provides different enhanced functions. With this technical solution, when the user uses different third-party applications, the different enhanced functions provided by the electronic device can be fully utilized, thereby improving the user experience.

Description

Method for using enhanced function of electronic equipment and related device
The present application is a divisional application of the original application with application number 201910319978.7, filed on April 19, 2019, the entire content of which is incorporated by reference in the present application.
Technical Field
The present application relates to the field of terminal and communication technologies, and in particular, to a method for using an enhanced function of an electronic device and a related apparatus.
Background
With the development of terminal technology, electronic devices such as smart phones and tablet computers have gained more and more functions. Many manufacturers produce electronic devices, and most of them equip their devices with certain enhanced functions to remain competitive. An enhanced function is a function developed by a manufacturer itself that electronic devices produced by other manufacturers do not have, for example the "large-aperture shooting" function and the "ultra-wide-angle shooting" function provided by a HUAWEI mobile phone when taking photos.
The enhanced functions of an electronic device can only be called by applications (APPs) independently developed by the manufacturer, and cannot be called by third-party applications. For example, the "large aperture shooting" function and the "ultra-wide angle shooting" function of a mobile phone can be called by the camera application pre-installed on the mobile phone, but cannot be called by third-party applications such as WeChat, Facebook, or Skype. As a result, the user cannot fully utilize the enhanced functions of the electronic device, and the user experience when using third-party applications installed on the electronic device is poor.
Disclosure of Invention
The application provides a method and a related apparatus for using an enhanced function of an electronic device. The enhanced function of the electronic device can be called when the electronic device launches a third-party application, which is equivalent to the enhanced function of the electronic device being callable by the third-party application, so that the user can make full use of the enhanced functions provided by the electronic device, and the user's experience when using the third-party application is improved.
In a first aspect, the present application provides a method for using an enhanced function of an electronic device, the method being applied to an electronic device that provides one or more enhanced function interfaces for a third-party application, the third-party application being an application program provided by a manufacturer other than a first manufacturer, the first manufacturer being the manufacturer of the electronic device. The method comprises: in response to a first operation on the third-party application, the electronic device displays a first user interface, the first user interface including one or more enhanced function options for providing enhanced functions of the electronic device to the third-party application; and in response to a user operation on an enhanced function option, the electronic device provides the enhanced function corresponding to the enhanced function option to the third-party application through the enhanced function interface.
By implementing the method provided in the first aspect, the electronic device can call its enhanced functions when the third-party application is launched, so that the user can make full use of the enhanced functions provided by the electronic device, and the user's experience when using the third-party application is improved.
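For illustration only, the following plain-Java sketch models what such an enhanced function interface could look like from the third-party application's side. The interface name, method signatures, package names, and option identifiers are assumptions made for this sketch and are not defined by the patent.

```java
import java.util.List;

// Hypothetical pass-through interface exposed by the device vendor to
// third-party applications; all names here are illustrative assumptions.
interface EnhancedFunctionInterface {
    // Enhanced functions the device is willing to offer the calling app
    // in its current usage scenario (e.g. "beauty", "blurring").
    List<String> availableFunctions(String packageName, String usageScenario);

    // Ask the device to apply one enhanced function at a given level; the
    // device routes the request to its own algorithms and/or hardware.
    boolean enableFunction(String packageName, String functionId, int level);
}

// Example caller: a video-call app requesting the "beauty" function.
class ThirdPartyVideoCallApp {
    private final EnhancedFunctionInterface api;

    ThirdPartyVideoCallApp(EnhancedFunctionInterface api) {
        this.api = api;
    }

    void onVideoCallStarted() {
        List<String> options = api.availableFunctions("com.example.chat", "video_call");
        if (options.contains("beauty")) {
            api.enableFunction("com.example.chat", "beauty", 5); // mid-level beautification
        }
    }
}
```

Note that in the patent's own description the options are surfaced to the user through the first user interface rather than requested programmatically by the application; the sketch only illustrates the general shape of a pass-through call.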
The enhanced function refers to a function developed by the manufacturer of the electronic device itself that electronic devices manufactured by other manufacturers do not have. That is, the enhanced functions are points of distinction between electronic devices produced by different manufacturers. The enhanced functions may be developed by the manufacturer based on the unique hardware and/or software of the electronic device. For example, a HUAWEI mobile phone is equipped with a Leica camera, and the mobile phone can provide a "large aperture shooting" function, an "ultra wide angle shooting" function, and the like when taking a picture using the camera.
With reference to the first aspect, in some embodiments, the first operation on the third-party application includes: an operation for starting the third-party application, or an operation for switching between a plurality of user interfaces provided by the third-party application. Illustratively, the third-party application is WeChat, and the first operation can be an operation in which the user selects a contact to initiate a video call in a user interface provided by WeChat, or an operation in which the user clicks the icon of a live-streaming third-party application on the main interface (Home screen).
With reference to the first aspect, in some embodiments, in response to the first operation on the third-party application, the electronic device may display or call up the first user interface in two ways:
1. in response to a first operation on the third-party application, the electronic device displays a second user interface, the second user interface including an enhanced functionality control; in response to a user operation on the enhanced functionality control, the electronic device displays the first user interface.
In some embodiments, the enhanced functionality control may be hidden as desired by the user. Specifically, in response to a gesture on the enhanced functionality control, the electronic device displays an interactive element for hiding the enhanced functionality control on the second user interface; the gesture comprises a long press gesture, a swipe gesture, a double tap gesture, or a heavy press gesture. In response to a user operation on the interactive element for hiding the enhanced functionality control, the electronic device ceases to display the enhanced functionality control. Understandably, hiding the enhanced functionality control does not affect the enhanced functions provided by the electronic device to the user.
In some embodiments, the enhanced functionality control may be recalled as desired by the user after being hidden. Specifically, the user may click on a blank area of the second user interface to recall the enhanced functionality control. A minimal code sketch of this show/hide behaviour is given after the second manner below.
2. In response to the first operation on the third-party application, the electronic device displays a third user interface; in response to a swipe gesture in a first direction on the third user interface, the electronic device displays the first user interface. The first direction may be a direction from the right side of the display screen to the left side, a direction from the top of the display screen to the bottom, a direction from the left side of the display screen to the right side, and the like.
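As referenced above, the following is a minimal plain-Java sketch of the show/hide behaviour of the floating enhanced functionality control described in the first manner; the gesture names, the overlay class, and its state fields are assumptions for illustration, not an API defined by the patent.

```java
// Illustrative state model of the floating "enhanced functionality" control
// drawn over the third-party application's second user interface.
enum Gesture { LONG_PRESS, SWIPE, DOUBLE_TAP, HEAVY_PRESS, TAP_BLANK_AREA }

class EnhancedFunctionOverlay {
    private boolean controlVisible = true;      // floating control shown on the second UI
    private boolean hidePromptVisible = false;  // interactive element used to hide the control

    void onGesture(Gesture g) {
        switch (g) {
            case LONG_PRESS:
            case SWIPE:
            case DOUBLE_TAP:
            case HEAVY_PRESS:
                // A gesture on the control brings up the element used to hide it.
                if (controlVisible) hidePromptVisible = true;
                break;
            case TAP_BLANK_AREA:
                // Clicking a blank area recalls the control after it was hidden.
                controlVisible = true;
                hidePromptVisible = false;
                break;
        }
    }

    void onHideConfirmed() {
        // Hiding the control does not disable the enhanced functions themselves.
        controlVisible = false;
        hidePromptVisible = false;
    }

    boolean isControlVisible() { return controlVisible; }
}
```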
In combination with the first aspect, in some embodiments, the method further comprises: in response to the user operation on the enhanced function option, the electronic device displays an enhanced function menu corresponding to the enhanced function option on the first user interface, wherein the enhanced function menu is used for adjusting the level of the enhanced function provided by the electronic device for the third-party application. For example, in response to a user operation on the "beauty" function option, the electronic device may display a slider bar for adjusting the beauty level on the first user interface. This may enable the user to adjust the level of enhanced functionality provided by the electronic device according to his or her needs.
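As a purely illustrative companion to the slider example above, the sketch below (reusing the hypothetical EnhancedFunctionInterface from the earlier sketch) shows how a slider value on the enhanced function menu might be clamped and handed to the interface as the beautification level; the 0-10 range and all names are assumptions.

```java
// Illustrative enhanced-function menu backing a "beauty level" slider.
class BeautyLevelMenu {
    private final EnhancedFunctionInterface api; // hypothetical interface from the earlier sketch
    private final String packageName;            // package of the currently running third-party app

    BeautyLevelMenu(EnhancedFunctionInterface api, String packageName) {
        this.api = api;
        this.packageName = packageName;
    }

    // Called whenever the user drags the slider shown on the first user interface.
    void onSliderChanged(int level) {
        int clamped = Math.max(0, Math.min(10, level)); // keep the level in the assumed 0-10 range
        api.enableFunction(packageName, "beauty", clamped);
    }
}
```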
With reference to the first aspect, in some embodiments, the enhanced functions corresponding to the one or more enhanced function options, or the enhanced functions currently provided by the electronic device to the user, may be determined by the electronic device in any one of the following manners (a code sketch of these three manners follows the list):
1. The electronic device determines all of its enhanced functions to be the enhanced functions currently provided to the user. With the first determination manner, when the electronic device launches any third-party application, all enhanced functions of the electronic device can be provided to the user. Every third-party application installed on the electronic device can call the enhanced functions of the electronic device, and the user can make full use of the enhanced functions provided by the electronic device, so that the user's experience when using third-party applications is improved.
2. The electronic device determines, among all of its enhanced functions, the enhanced functions that can be used by the currently launched third-party application as the enhanced functions currently provided to the user. The enhanced functions that can be used by the currently launched third-party application may include: the enhanced functions corresponding to one or more usage scenarios provided by the third-party application currently launched by the electronic device.
With the second determination manner, the electronic device can provide different enhanced functions to the user when launching different third-party applications. That is, the electronic device may provide the user with the enhanced functions appropriate for the currently launched third-party application. The user may use different enhanced functions when using different third-party applications on the electronic device.
3. The electronic device determines the enhanced functions applicable to the current usage scenario as the enhanced functions currently provided to the user. In some embodiments, the enhanced functions applicable to the current usage scenario may be the enhanced functions corresponding to the current usage scenario. In other embodiments, the enhanced functions applicable to the current usage scenario may refer to: among the enhanced functions corresponding to the current usage scenario, the enhanced functions compatible with the capabilities currently provided by the electronic device.
With the third determination manner, when the electronic device launches the third-party application, different enhanced functions can be provided to the user under different usage scenarios provided by the third-party application. That is, the electronic device may provide the user with the enhanced functions suitable for the current usage scenario. The user may use different enhanced functions when opening different interfaces of the same third-party application on the electronic device.
In the third determination manner above, if the usage scenario currently provided by the third-party application is a video call scenario, the enhanced functions currently provided to the user as determined by the electronic device may include one or more of the following: a "beauty" function, a "body shaping" function, a "blurring" function, a "large aperture shooting" function, or an "ultra wide angle shooting" function.
In the third determination manner above, if the usage scenario currently provided by the third-party application is a live-streaming scenario, the enhanced functions currently provided to the user as determined by the electronic device may include one or more of the following: a "network acceleration" function, a "coding optimization" function, or a "tone beautification" function.
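The following sketch is one possible, simplified reading of these three determination manners, assuming the device keeps a set of all enhanced functions, a per-application table, and a per-scenario table; the class, its data structures, and the mode names are illustrative assumptions rather than the patent's implementation.

```java
import java.util.Collections;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Illustrative matcher implementing the three determination manners above.
class EnhancedFunctionMatcher {
    enum Mode { ALL_FUNCTIONS, PER_APPLICATION, PER_SCENARIO }

    private final Set<String> allFunctions;                      // every enhanced function of the device
    private final Map<String, Set<String>> functionsByApp;       // package name -> usable functions
    private final Map<String, Set<String>> functionsByScenario;  // usage scenario -> matching functions

    EnhancedFunctionMatcher(Set<String> allFunctions,
                            Map<String, Set<String>> functionsByApp,
                            Map<String, Set<String>> functionsByScenario) {
        this.allFunctions = allFunctions;
        this.functionsByApp = functionsByApp;
        this.functionsByScenario = functionsByScenario;
    }

    Set<String> determine(Mode mode, String packageName, String scenario,
                          Set<String> currentCapabilities) {
        switch (mode) {
            case ALL_FUNCTIONS:      // manner 1: expose everything
                return allFunctions;
            case PER_APPLICATION:    // manner 2: only what this application can use
                return functionsByApp.getOrDefault(packageName, Collections.emptySet());
            case PER_SCENARIO: {     // manner 3: only what fits the current usage scenario
                Set<String> byScenario =
                        functionsByScenario.getOrDefault(scenario, Collections.emptySet());
                Set<String> result = new HashSet<>(byScenario);
                // Second variant of manner 3: additionally keep only the functions that are
                // compatible with the capabilities the device can currently provide.
                result.retainAll(currentCapabilities);
                return result;
            }
            default:
                return Collections.emptySet();
        }
    }
}
```

Under this reading, the per-scenario table might map a video call scenario to functions such as "beauty" or "blurring", and a live-streaming scenario to functions such as "network acceleration", matching the examples given above.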
With reference to the first aspect, in some embodiments, before the electronic device displays the first user interface, the method further includes: in response to the first operation on the third-party application, the electronic device displays a fourth user interface, the fourth user interface comprising prompt information and a first interactive element, the prompt information being used to prompt the user about: how to turn on or off the permission for the third-party application to use the enhanced functions provided by the electronic device, and/or a comparison of the effect before and after the third-party application uses the enhanced functions provided by the electronic device; and in response to a user operation on the first interactive element, the electronic device displays the first user interface.
In some embodiments, in response to the first operation on the third-party application, the electronic device displays the first user interface after determining that the third-party application has permission to use the enhanced functions provided by the electronic device. That is, the electronic device displays the first user interface upon determining that the currently launched third-party application has permission to use the enhanced functions provided by the electronic device.
In some embodiments, the permission for the third-party application to use the enhanced functions provided by the electronic device may be turned on by default by the electronic device. For example, the electronic device may by default grant some or all third-party applications the right to use the enhanced functions provided by the electronic device.
In other embodiments, the right of the third-party application to use the enhanced functions provided by the electronic device may be set by the user. Specifically, before the electronic device displays the first user interface, the method further includes: the electronic device displays a fifth user interface, the fifth user interface including a second interactive element; and in response to a user operation on the second interactive element, the permission for the third-party application to use the enhanced functions provided by the electronic device is turned on. In this way, the user can set, according to his or her own needs, the permission of each third-party application to use the enhanced functions of the electronic device.
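As an illustration only, the following plain-Java sketch models this permission behaviour: some packages may be granted by default, and the user can grant or revoke the permission per application from a settings-style interface. The class and method names are assumptions for the sketch.

```java
import java.util.Collection;
import java.util.HashSet;
import java.util.Set;

// Illustrative per-application permission store for the enhanced functions.
class EnhancedFunctionPermissions {
    private final Set<String> grantedPackages = new HashSet<>();

    EnhancedFunctionPermissions(Collection<String> defaultGrantedPackages) {
        grantedPackages.addAll(defaultGrantedPackages); // permissions turned on by default
    }

    // Invoked from the second interactive element on the settings (fifth) user interface.
    void setPermission(String packageName, boolean granted) {
        if (granted) {
            grantedPackages.add(packageName);
        } else {
            grantedPackages.remove(packageName);
        }
    }

    // Checked before the first user interface (the options panel) is shown.
    boolean canUseEnhancedFunctions(String packageName) {
        return grantedPackages.contains(packageName);
    }
}
```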
With reference to the first aspect, in some embodiments, the method further comprises: in response to a second operation on the third-party application, the electronic device displays a sixth user interface, the sixth user interface including one or more enhanced function options, where the one or more enhanced function options in the sixth user interface are different from the one or more enhanced function options in the first user interface. An exemplary implementation of the second operation may be an operation in which the user selects a contact in a user interface provided by WeChat and sends a text message to that contact. An exemplary implementation of the sixth user interface may be a text chat interface. That is, in different usage scenarios provided by the third-party application, the electronic device can dynamically refresh the enhanced function options provided to the user in accordance with the usage scenario.
With reference to the first aspect, in some embodiments, the providing, by the electronic device, of the enhanced function corresponding to the enhanced function option to the third-party application through the enhanced function interface specifically includes: the electronic device issues a first parameter to a hardware abstraction layer and a kernel layer of the electronic device through the enhanced function interface; and the hardware abstraction layer and the kernel layer of the electronic device call the corresponding algorithms and/or hardware devices according to the first parameter, so as to provide the enhanced function corresponding to the enhanced function option to the third-party application.
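The sketch below illustrates, under heavy simplification, the pass-through step just described: a "first parameter" is assembled and handed down so that lower layers can select the matching algorithm and/or hardware. The layer interface, parameter keys, and class names are assumptions, not the patent's API.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative pass-through from the enhanced function interface to lower layers.
class EnhancedFunctionPassThrough {
    interface LowerLayer {                       // stands in for the HAL plus the kernel layer
        void apply(Map<String, Object> firstParameter);
    }

    private final LowerLayer lowerLayer;

    EnhancedFunctionPassThrough(LowerLayer lowerLayer) {
        this.lowerLayer = lowerLayer;
    }

    // Called when the user selects an enhanced function option, e.g. "beauty" at level 5.
    void issue(String functionId, int level) {
        Map<String, Object> firstParameter = new HashMap<>();
        firstParameter.put("function", functionId);
        firstParameter.put("level", level);
        lowerLayer.apply(firstParameter);        // lower layers pick the algorithm or hardware
    }
}
```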
In a second aspect, the present application provides an electronic device comprising an application layer, an application program interface, an enhanced function matching module, one or more enhanced function interfaces, a hardware abstraction layer, a kernel layer, and a display module, wherein (an illustrative sketch of how these modules cooperate follows this list):
the application layer includes a function assistant application and one or more third-party applications; the function assistant application is an application program provided by a first manufacturer, the third-party application is an application program not provided by the first manufacturer, and the first manufacturer is the manufacturer of the electronic device;
the application program interface is used for communication between the third-party application and the hardware abstraction layer and the kernel layer;
the enhanced function matching module is used for determining the enhanced function provided by the electronic equipment to the currently running third-party application;
the enhanced function pass-through module (that is, the one or more enhanced function interfaces) is used for providing the enhanced functions of the electronic device to the currently running third-party application;
the hardware abstraction layer and the kernel layer are used for calling the algorithms and/or hardware devices of the electronic device so as to enable the corresponding functions of the electronic device;
the display module is used for displaying, in response to a first operation on the currently running third-party application, a first user interface, the first user interface including one or more enhanced function options corresponding to the enhanced functions that the enhanced function matching module has determined the electronic device provides to the currently running third-party application; the enhanced function options are provided by the function assistant application.
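As referenced above, the following sketch wires together simplified stand-ins for these modules, reusing the hypothetical EnhancedFunctionMatcher and EnhancedFunctionPassThrough classes from the earlier sketches; the coordinator class and its method names are assumptions introduced only to show one plausible cooperation flow.

```java
import java.util.Set;

// Illustrative cooperation flow: first operation -> matching -> display -> pass-through.
class EnhancedFunctionCoordinator {
    interface DisplayModule {                    // simplified stand-in for the display module
        void showOptions(Set<String> enhancedFunctionOptions);
    }

    private final EnhancedFunctionMatcher matcher;          // see the matching sketch above
    private final DisplayModule display;
    private final EnhancedFunctionPassThrough passThrough;  // see the pass-through sketch above

    EnhancedFunctionCoordinator(EnhancedFunctionMatcher matcher,
                                DisplayModule display,
                                EnhancedFunctionPassThrough passThrough) {
        this.matcher = matcher;
        this.display = display;
        this.passThrough = passThrough;
    }

    // First operation on the currently running third-party application.
    void onFirstOperation(String packageName, String scenario, Set<String> currentCapabilities) {
        Set<String> options = matcher.determine(
                EnhancedFunctionMatcher.Mode.PER_SCENARIO, packageName, scenario, currentCapabilities);
        display.showOptions(options); // first user interface with the enhanced function options
    }

    // User operation on one of the displayed enhanced function options.
    void onOptionSelected(String functionId, int level) {
        passThrough.issue(functionId, level);
    }
}
```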
The electronic device provided in the second aspect can call its enhanced functions when a third-party application is launched, so that the user can make full use of the enhanced functions provided by the electronic device, and the user's experience when using the third-party application is improved.
With reference to the second aspect, in some embodiments, the first operation on the currently running third-party application includes: an operation for starting the currently running third-party application, or an operation for switching between a plurality of user interfaces provided by the currently running third-party application. Illustratively, the third-party application is WeChat, and the first operation can be an operation in which the user selects a contact to initiate a video call in a user interface provided by WeChat, or an operation in which the user clicks the icon of a live-streaming third-party application on the main interface (Home screen).
With reference to the second aspect, in some embodiments, the display module is specifically configured to display a second user interface in response to a first operation on the currently running third-party application, the second user interface including an enhanced functionality control; displaying the first user interface in response to a user operation on the enhanced functionality control; wherein the enhanced functionality control is provided by the functionality assistant application.
In some embodiments, the display module is further configured to display an interactive element on the second user interface for hiding the enhanced functionality control in response to a gesture on the enhanced functionality control, the gesture comprising a long press gesture, a swipe gesture, a double tap gesture, or a heavy press gesture; and to cease displaying the enhanced functionality control in response to a user operation on the interactive element for hiding the enhanced functionality control. That is, the enhanced functionality control may be hidden according to the user's needs. Understandably, hiding the enhanced functionality control does not affect the enhanced functions provided by the electronic device to the user.
In some embodiments, the display module may also recall the enhanced features control in response to a user clicking on a blank area on the second user interface.
With reference to the second aspect, in some embodiments, the display module is specifically configured to display a third user interface in response to a first operation on the currently running third-party application, and to display the first user interface in response to a swipe gesture in a first direction on the third user interface. The first direction may be a direction from the right side of the display screen to the left side, a direction from the top of the display screen to the bottom, a direction from the left side of the display screen to the right side, and the like.
With reference to the second aspect, in some embodiments, the display module is further configured to display, on the first user interface, an enhanced function menu corresponding to the enhanced function option in response to a user operation on the enhanced function option, where the enhanced function menu is used to adjust the level of the enhanced function provided by the electronic device for the third-party application; the enhanced function menu is provided by the function assistant application. For example, in response to a user operation on the "beauty" function option, the display module may display a slider bar for adjusting the beauty level on the first user interface. This enables the user to adjust the level of the enhanced function provided by the electronic device according to their own needs.
With reference to the second aspect, in some embodiments, the enhanced function matching module is specifically configured to determine all enhanced functions provided by the electronic device as the enhanced functions provided to the currently running third-party application.
With reference to the second aspect, in other embodiments, the electronic device further includes an application identification module, configured to identify the third-party application currently running; the enhanced function matching module is specifically configured to determine, as an enhanced function provided for the currently running third-party application, an enhanced function that can be used by the currently running third-party application among all enhanced functions that are provided by the electronic device.
With reference to the second aspect, in further embodiments, the electronic device further includes a scenario identification and parameter collection module, configured to identify the usage scenario currently provided by the launched third-party application and the functions currently provided by the electronic device; the enhanced function matching module is specifically configured to determine the enhanced functions applicable to the usage scenario as the enhanced functions provided to the currently running third-party application. The enhanced functions applicable to the usage scenario include: the enhanced functions corresponding to the usage scenario, or, among the enhanced functions corresponding to the usage scenario, the enhanced functions compatible with the functions currently provided by the electronic device.
In some embodiments, if the usage scenario currently provided by the third-party application is a video call scenario, the enhanced functions currently provided to the user as determined by the enhanced function matching module may include one or more of: a "beauty" function, a "body shaping" function, a "blurring" function, a "large aperture shooting" function, or an "ultra wide angle shooting" function. Alternatively,
in some embodiments, if the usage scenario currently provided by the third-party application is a live-streaming scenario, the enhanced functions currently provided to the user as determined by the enhanced function matching module may include one or more of: a "network acceleration" function, a "coding optimization" function, or a "tone beautification" function.
With reference to the second aspect, in some embodiments, the display module is further configured to display a fourth user interface in response to the first operation on the third-party application, the fourth user interface including prompt information and a first interactive element, the prompt information being used to prompt the user about: how to turn on or off the permission for the third-party application to use the enhanced functions provided by the electronic device, and/or a comparison of the effect before and after the third-party application uses the enhanced functions provided by the electronic device; and to display the first user interface in response to a user operation on the first interactive element; the prompt information and the first interactive element are provided by the function assistant application.
With reference to the second aspect, in some embodiments, the function assistant application is further configured to determine whether the currently running third-party application has the right to use the enhanced functions provided by the electronic device. The display module is specifically configured to display the first user interface when the function assistant application determines that the currently launched third-party application has permission to use the enhanced functions provided by the electronic device.
In some embodiments, the permission for the currently running third-party application to use the enhanced functions provided by the electronic device is turned on by default by the electronic device.
In other embodiments, the right of the third-party application to use the enhanced functions provided by the electronic device may be set by the user. Specifically, the application layer includes a "settings" application; the display module is further configured to display a fifth user interface, where the fifth user interface includes a second interactive element, and the second interactive element is used to monitor a user operation for turning on or off the permission of the currently running third-party application to use the enhanced functions provided by the electronic device; the fifth user interface is provided by the "settings" application.
With reference to the second aspect, in some embodiments, the display module is further configured to display a sixth user interface in response to a second operation on the third-party application; the sixth user interface includes one or more enhanced function options, and the one or more enhanced function options in the sixth user interface are different from the one or more enhanced function options in the first user interface; the one or more enhanced function options in the sixth user interface are provided by the function assistant application. An exemplary implementation of the second operation may be the user selecting a contact in a user interface provided by WeChat and sending a text message to that contact. An exemplary implementation of the sixth user interface may be a text chat interface. That is to say, in different usage scenarios provided by the third-party application, the electronic device may dynamically refresh the enhanced function options displayed by the display module according to the current usage scenario.
With reference to the second aspect, in some embodiments, the enhanced function interface is specifically configured to issue a first parameter to the hardware abstraction layer and the kernel layer, where the first parameter is set by the function assistant application; the hardware abstraction layer and the kernel layer are specifically configured to invoke a corresponding algorithm and/or hardware device according to the first parameter, so as to provide a corresponding enhanced function for the currently running third-party application.
In a third aspect, the present application provides an electronic device providing one or more enhanced function interfaces for a third-party application, where the third-party application is an application program provided by a non-first manufacturer, and the first manufacturer is a manufacturer of the electronic device, and the electronic device includes: one or more processors, memory, and a display screen; the memory is coupled with the one or more processors, the memory for storing computer program code comprising computer instructions, the one or more processors invoking the computer instructions to cause the electronic device to perform the method as described in the first aspect and any possible implementation of the first aspect.
In a fourth aspect, the present application provides a chip applied to an electronic device, where the chip includes one or more processors, and the processor is configured to invoke computer instructions to cause the electronic device to execute the method described in the first aspect and any possible implementation manner of the first aspect.
In a fifth aspect, the present application provides a computer program product containing instructions that, when run on an electronic device, cause the electronic device to perform the method as described in the first aspect and any possible implementation manner of the first aspect.
In a sixth aspect, the present application provides a computer-readable storage medium, which includes instructions that, when executed on an electronic device, cause the electronic device to perform the method described in the first aspect and any possible implementation manner of the first aspect.
According to the above technical solutions, the electronic device can call its enhanced functions when a third-party application is launched, which is equivalent to the enhanced functions of the electronic device being callable by the third-party application, so that the user can make full use of the enhanced functions provided by the electronic device, and the user's experience when using the third-party application is improved.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
FIGS. 2A-2B are schematic diagrams of a manner of turning on the "function assistant" provided by an embodiment of the present application;
FIGS. 3A-3C are schematic diagrams of another manner of turning on the "function assistant" provided by an embodiment of the present application;
FIGS. 4A-4H are some schematic diagrams of human-computer interaction provided in a video call scenario according to an embodiment of the present application;
FIGS. 5A-5F are schematic diagrams of further human-computer interactions provided in a video call scenario according to an embodiment of the present application;
FIGS. 6A-6H are some schematic diagrams of human-computer interaction provided in a live-streaming scenario according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a software structure of an electronic device according to an embodiment of the present application;
FIG. 8 is a schematic diagram of the data flow involved when a third-party application calls an enhanced function of the electronic device in a video call scenario according to an embodiment of the present application;
FIG. 9 is a schematic diagram of the cooperation flow of the software modules of an electronic device when a third-party application calls an enhanced function of the electronic device in a video call scenario according to an embodiment of the present application.
Detailed Description
The terminology used in the following embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to limit the present application. As used in the specification of the present application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the listed items.
The following embodiments of the application provide a method and a related apparatus for using an enhanced function of an electronic device, in which the enhanced function of the electronic device can be called by the electronic device when a third-party application is launched, which is equivalent to the enhanced function of the electronic device being callable by the third-party application, so that the user can make full use of the enhanced functions provided by the electronic device, thereby improving the user's experience when using the third-party application.
In the following embodiments of the present application, the enhanced functions of an electronic device refer to functions developed by the manufacturer of the electronic device itself that electronic devices produced by other manufacturers do not have. That is, the enhanced functions are points of distinction between electronic devices produced by different manufacturers. The enhanced functions may be developed by the manufacturer based on the unique hardware and/or software of the electronic device. For example, a HUAWEI mobile phone is equipped with a Leica camera, and the mobile phone can provide a "large aperture shooting" function, an "ultra wide angle shooting" function, and the like when taking a picture using the camera. For another example, a mobile phone manufacturer may independently develop algorithms to provide a "beauty" function, a "body shaping" function, a "blurring" function and the like during photographing; an anti-shake function and an ultra-wide-angle shooting function when recording short videos; a "super resolution" imaging function, a "color enhancement" function, and the like when playing a video; and a "sound beautification" function when recording music, etc. It can be seen that an electronic device may have one or more enhanced functions.
It is to be understood that "enhanced function" is only a word used in the present embodiment, and its representative meaning has been described in the present embodiment, and its name does not set any limit to the present embodiment. In other embodiments of the present application, "enhanced functionality" may also be referred to by other terms such as "proprietary technology," "proprietary functionality," and the like.
In the embodiments of the present application, a third-party application refers to an application program provided or developed by a party other than the manufacturer of the electronic device. The manufacturer of an electronic device may include the maker, supplier, provider, or operator of the electronic device, etc. A maker may refer to a manufacturer that manufactures electronic devices from parts and materials that are either self-made or purchased. The supplier may refer to a manufacturer that provides the complete machine, stock, or parts of the electronic device. The operator may refer to a vendor responsible for the distribution of the electronic device. In the embodiments of the present application, the manufacturer of the electronic device may also be referred to as the first manufacturer. In some embodiments, a third-party application may also refer to an application that is not pre-installed on the electronic device. For example, for a HUAWEI mobile phone, APPs such as WeChat, Facebook, Skype, Messenger, WhatsApp, and Taobao are third-party applications, while APPs such as the HUAWEI wearable APP, the HUAWEI mall APP, and HUAWEI mobile services are non-third-party applications.
In the following embodiments of the present application, under a condition that a "function assistant" of an electronic device such as a smartphone is turned on, when the electronic device recognizes a scene in which a user uses a third-party application, the electronic device may actively provide an enhanced function for the user to use. That is, the electronic device may extend the enhanced function into the third-party application, so that the user may use the enhanced function of the electronic device when using the third-party application. Therefore, when the user uses the third-party application, the enhanced function of the electronic equipment can be fully utilized, and the user experience is good.
The "function assistant" may be a service or function provided by the electronic device, and may be installed on the electronic device in the form of an APP. In the embodiments of the present application, the "function assistant" enables the electronic device to provide enhanced functions when the user uses a third-party application. Here, the electronic device providing enhanced functions when the user uses the third-party application means that, when the electronic device launches the third-party application, some or all of the enhanced functions of the electronic device may be provided to the user, and the user may choose to enable one or more of them. The manner in which the "function assistant" supports determining the enhanced functions provided by the electronic device when the user uses the third-party application may refer to the related description of the subsequent embodiments, and is not repeated here.
In this embodiment, after the "function assistant" of the electronic device is turned on, part or all of the third-party applications may use the enhanced function of the electronic device, that is, the part or all of the third-party applications have the right to use the enhanced function of the electronic device, and the part or all of the third-party applications having the right may refer to the relevant description of the subsequent embodiments. Through the function assistant, the user can use the enhanced functions of the electronic device when using the third-party application. Therefore, the user can fully utilize the enhanced function of the electronic equipment, and the user experience is good when the third-party application installed on the electronic equipment is used.
It should be understood that "function assistant" is only a word used in this embodiment, and its representative meaning is described in this embodiment, and its name does not set any limit to this embodiment.
An exemplary electronic device 100 provided in the following embodiments of the present application is first described below.
Fig. 1 shows a schematic structural diagram of an electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
In an embodiment of the present application, the processor 110 may be configured to determine whether a third-party application currently launched by the electronic device 100 has an authority to use an enhanced function of the electronic device. In some embodiments, the processor 110 may be further configured to determine the enhanced functionality currently provided to the user if the currently launched third-party application has permission to use the enhanced functionality of the electronic device. The manner in which the processor 110 determines the enhanced functions currently provided to the user can refer to the related description of the subsequent embodiments, and will not be described herein.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K via an I2C interface, such that the processor 110 and the touch sensor 180K communicate via an I2C bus interface to implement the touch functionality of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. It may also be used to connect an earphone and play audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices and the like.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low-frequency baseband signal to a baseband processor for processing. The low-frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be independent of the processor 110 and disposed in the same device as the mobile communication module 150 or other functional modules.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), code division multiple access (CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro LED, a Micro OLED, a quantum dot light-emitting diode (QLED), and the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, with N being a positive integer greater than 1.
In this embodiment, the display screen 194 may be used to display a control, and the control may be used to monitor an operation of expanding and displaying the controls corresponding to the enhanced functions currently provided by the electronic device. In response to this operation, the display screen 194 may be used to display the controls corresponding to the enhanced functions currently provided by the electronic device. A control corresponding to an enhanced function currently provided by the electronic device may be used to monitor an operation of enabling the corresponding enhanced function. The manner in which the electronic device determines the enhanced functions currently provided to the user may refer to the related description of the subsequent embodiments, and is not described herein again.
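A minimal sketch of this collapsed-entry/expanded-panel pattern is given below, assuming an Android-style view hierarchy. The names EnhancedFunction, EnhancedFunctionBar, and onEnable are hypothetical and only illustrate the control behavior described above, not the actual implementation of this embodiment.

```kotlin
import android.view.View
import android.widget.ImageButton
import android.widget.LinearLayout

data class EnhancedFunction(val id: String, val label: String)

class EnhancedFunctionBar(
    private val entryControl: ImageButton,        // the collapsed control displayed on the screen
    private val panel: LinearLayout,              // container for the per-function controls
    private val onEnable: (EnhancedFunction) -> Unit
) {
    fun bind(functions: List<EnhancedFunction>) {
        panel.visibility = View.GONE
        entryControl.setOnClickListener {
            // Expand: display one control per enhanced function currently provided.
            panel.removeAllViews()
            functions.forEach { f ->
                val button = ImageButton(panel.context).apply {
                    contentDescription = f.label
                    setOnClickListener { onEnable(f) }  // enable the corresponding enhanced function
                }
                panel.addView(button)
            }
            panel.visibility = View.VISIBLE
        }
    }
}
```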
The electronic device 100 may implement a photographing function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, and the application processor, etc.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor may perform a Fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like. The processor 110 executes various functional applications and data processing of the electronic device 100 by running instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into analog audio signals for output, and also used to convert analog audio inputs into digital audio signals. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a hands-free call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mike" or "mic," is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal to the microphone 170C by moving the mouth close to the microphone 170C and speaking. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further include three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The headphone interface 170D is used to connect a wired headphone. The headset interface 170D may be the USB interface 130, or may be a 3.5mm open mobile electronic device platform (OMTP) standard interface, a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic device 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that are applied to the same touch position but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation whose intensity is smaller than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
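The threshold dispatch described above can be sketched as follows. The threshold value, the normalized pressure input, and the two action functions are placeholders chosen for illustration, not values used by the device.

```kotlin
// Assumed normalized pressure threshold; the real first pressure threshold is device-specific.
const val FIRST_PRESSURE_THRESHOLD = 0.5f

fun onMessageIconPressed(pressure: Float) {
    if (pressure < FIRST_PRESSURE_THRESHOLD) {
        viewShortMessages()       // lighter press: execute "view short message"
    } else {
        createNewShortMessage()   // firmer press: execute "create new short message"
    }
}

fun viewShortMessages() { /* open the short message list */ }
fun createNewShortMessage() { /* open the short message composer */ }
```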
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance that the lens module needs to compensate for according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation and motion-sensing game scenarios.
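As a rough illustration of the "compensation distance from shake angle" step, one common approximation is that the image shift to cancel is proportional to the focal length times the tangent of the shake angle. This is an assumption for illustration only, not the patented anti-shake algorithm.

```kotlin
import kotlin.math.tan

// Illustrative approximation: lens compensation ≈ focal length * tan(shake angle).
fun lensCompensationMm(focalLengthMm: Double, shakeAngleRad: Double): Double =
    focalLengthMm * tan(shakeAngleRad)
```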
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the electronic device 100 calculates the altitude from the barometric pressure value measured by the barometric pressure sensor 180C, to assist in positioning and navigation.
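For the altitude-from-pressure step, the standard Android helper can be used as a sketch, assuming a sea-level reference pressure; this is shown only to illustrate the calculation, not as the device's actual implementation.

```kotlin
import android.hardware.SensorManager

// Convert a measured pressure (hPa) to an altitude estimate in meters,
// using the standard-atmosphere sea-level pressure as the reference.
fun altitudeMeters(pressureHpa: Float): Float =
    SensorManager.getAltitude(SensorManager.PRESSURE_STANDARD_ATMOSPHERE, pressureHpa)
```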
The magnetic sensor 180D includes a hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking upon opening may then be set according to the detected open or closed state of the holster or of the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. The acceleration sensor 180E may also be used to recognize the posture of the electronic device, and is applied to applications such as landscape/portrait switching and pedometers.
The distance sensor 180F is used for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, in a shooting scenario, the electronic device 100 may use the distance sensor 180F to measure distance for fast focusing.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light to the outside through the light emitting diode. The electronic device 100 detects infrared light reflected from a nearby object using the photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear for a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
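A minimal sketch of the "near ear, screen off" decision, assuming an Android-style proximity sensor callback; the threshold value and the turnScreenOff/turnScreenOn callbacks are hypothetical.

```kotlin
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener

class EarProximityListener(
    private val turnScreenOff: () -> Unit,
    private val turnScreenOn: () -> Unit,
    private val nearThresholdCm: Float = 5f   // assumed "object is near" distance
) : SensorEventListener {
    override fun onSensorChanged(event: SensorEvent) {
        if (event.sensor.type == Sensor.TYPE_PROXIMITY) {
            // Sufficient reflected light -> small reported distance -> object (ear) is near.
            if (event.values[0] < nearThresholdCm) turnScreenOff() else turnScreenOn()
        }
    }
    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) { /* not needed for this sketch */ }
}
```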
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold, to avoid an abnormal shutdown of the electronic device 100 caused by the low temperature. In other embodiments, when the temperature is lower than a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown due to the low temperature.
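The shape of this three-threshold policy can be sketched as below. The temperature limits and the three action functions are assumptions chosen only to illustrate the branching, not values used by the electronic device 100.

```kotlin
// Assumed thresholds for illustration only.
const val HOT_LIMIT_C = 45f        // throttle the nearby processor above this
const val COLD_LIMIT_C = 0f        // heat the battery below this
const val VERY_COLD_LIMIT_C = -10f // boost battery output voltage below this

fun applyThermalPolicy(tempC: Float) {
    when {
        tempC > HOT_LIMIT_C -> reduceProcessorPerformance()
        tempC < VERY_COLD_LIMIT_C -> boostBatteryOutputVoltage()
        tempC < COLD_LIMIT_C -> heatBattery()
    }
}

fun reduceProcessorPerformance() { /* lower clocks to cut power and protect against heat */ }
fun heatBattery() { /* warm battery 142 to avoid abnormal low-temperature shutdown */ }
fun boostBatteryOutputVoltage() { /* compensate voltage sag at very low temperature */ }
```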
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive a key input, and generate a key signal input related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. Touch operations applied to different areas of the display screen 194 may also correspond to different vibration feedback effects of the motor 191. Different application scenarios (such as time reminding, receiving information, alarm clock, game, and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be attached to and detached from the electronic device 100 by being inserted into the SIM card interface 195 or being pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
Some exemplary User Interfaces (UIs) provided by the electronic device 100 are described below. The term "user interface" in the description and claims and drawings of the present application is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form acceptable to the user. A commonly used presentation form of the user interface is a Graphical User Interface (GUI), which refers to a user interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
Fig. 2A illustrates an exemplary user interface 21 on the electronic device 100 for exposing applications installed by the electronic device 100.
The user interface 21 may include: status bar 201, calendar indicator 202, weather indicator 203, tray 204 with common application icons, navigation bar 205, and other application icons. Wherein:
the status bar 201 may include: one or more signal strength indicators 201A for mobile communication signals (which may also be referred to as cellular signals), an operator name (e.g., "china mobile") 201B, one or more signal strength indicators 201C for wireless fidelity (Wi-Fi) signals, a battery status indicator 201D, and a time indicator 201E.
Calendar indicator 202 may be used to indicate the current time, such as the date, day of the week, time division information, and the like.
The weather indicator 203 may be used to indicate a weather type, such as cloudy sunny, light rain, etc., and may also be used to indicate information such as temperature, etc.
The tray 204 with the common application icons may show: phone icon 204A, contact icon 204B, short message icon 204C, camera icon 204D.
The navigation bar 205 may include: system navigation keys such as a back key 205A, a home screen key 205B, a multitasking key 205C, and the like. When it is detected that the user clicks the back key 205A, the electronic device 100 may display the page previous to the current page. When it is detected that the user clicks the home screen key 205B, the electronic device 100 may display the home interface. When it is detected that the user clicks the multitasking key 205C, the electronic device 100 may display the tasks recently opened by the user. The navigation keys may also have other names, which is not limited in this application. Not limited to virtual keys, each navigation key in the navigation bar 205 may also be implemented as a physical key.
Other application icons may be, for example: an icon 206 of WeChat, an icon 207 of QQ, an icon 208 of Twitter, an icon 209 of Facebook, an icon 210 of Mailbox, an icon 211 of Cloud Sharing, an icon 212 of Memo, an icon 213 of Alipay, an icon 214 of Gallery, and an icon 215 of Settings. The user interface 21 may also include a page indicator 216. The other application icons may be distributed across multiple pages, and the page indicator 216 may be used to indicate which page of applications the user is currently browsing. The user may slide left or right in the area of the other application icons to browse the application icons on other pages.
In some embodiments, the user interface 21 illustratively shown in FIG. 2A may be a Home screen (Home Screen).
In other embodiments, the electronic device 100 may also include a physical home screen key. The home screen key may be used to receive a user's instruction to return the currently displayed UI to the home interface, so that the user can conveniently view the home screen at any time. The instruction may be an operation instruction in which the user presses the home screen key once, an operation instruction in which the user presses the home screen key twice in a short time, or an operation instruction in which the user presses and holds the home screen key. In other embodiments of the present application, the home screen key may also incorporate a fingerprint recognizer, so that a fingerprint is collected and recognized when the home screen key is pressed.
It is understood that fig. 2A only illustrates the user interface on the electronic device 100, and should not be construed as limiting the embodiments of the present application.
Several ways of turning on the "function assistant" on the electronic device 100 provided by the embodiments of the present application are described below.
Fig. 2A and 2B illustrate an operation of turning on a "function assistant" on the electronic device 100.
As shown in fig. 2A, when the electronic device 100 detects a slide-down gesture on the status bar 201, in response to the gesture, the electronic device 100 may display a window 217 on the user interface 21. As shown in FIG. 2B, the window 217 may display a switch control 217A for the "function assistant", and may also display switch controls for other functions (e.g., Wi-Fi, Bluetooth, flashlight, etc.). When an operation on the switch control 217A in the window 217 (e.g., a touch operation on the switch control 217A) is detected, the electronic device 100 may turn on the "function assistant" in response to the operation.
That is, the user may make a downward swipe gesture at the status bar 201 to open the window 217, and may click the switch control 217A of the "function assistant" in the window 217 to conveniently turn on the "function assistant". The switch control 217A of the "function assistant" may be represented in the form of text information or an icon.
In the embodiment of the present application, after the "function assistant" is turned on through the operations shown in fig. 2A and fig. 2B, part or all of the third-party applications installed in the electronic device have the right to use the enhanced functions of the electronic device. Which third-party applications are included may be a default setting of the electronic device, or may be set by the user; this is not limited in the embodiment of the present application.
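A minimal sketch of this flow, assuming an Android-style Switch stands in for the switch control 217A in window 217; the names switch217A, grantDefaultApps, and revokeAll are hypothetical and only illustrate how toggling the switch could grant or revoke the enhanced-function right for third-party applications.

```kotlin
import android.widget.CompoundButton
import android.widget.Switch

fun bindFunctionAssistantSwitch(
    switch217A: Switch,
    grantDefaultApps: () -> Unit,   // give part or all third-party apps the enhanced-function right
    revokeAll: () -> Unit
) {
    switch217A.setOnCheckedChangeListener { _: CompoundButton, isChecked: Boolean ->
        if (isChecked) grantDefaultApps() else revokeAll()
    }
}
```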
Fig. 3A-3C illustrate two other operations of turning on the "function assistant" on the electronic device 100.
The user interface 31 exemplarily shown in fig. 3A may be one implementation of a "setup interface". The user interface 31 may be provided by a "settings" application. The "setting" application is an application program installed on an electronic device such as a smart phone or a tablet computer and used for setting various functions of the electronic device, and the name of the application program is not limited in the embodiment of the application. The user interface 31 may be a user interface opened by a user clicking on the setting icon 215 in the user interface 21 shown in fig. 2A.
As shown in fig. 3A, the user interface 31 may include: a status bar 301, a title bar 302, a search box 303, an icon 304, an area 305 including one or more setting items.
The status bar 301 can refer to the status bar 201 in the user interface 21 shown in fig. 2A, and is not described in detail here.
The title bar 302 may include a current page indicator 302A, which current page indicator 302A may be used to indicate the current page, e.g., the textual information "settings" may be used to indicate that the current page is used to present one or more settings. Not limited to text information, the current page indicator 302A may also be an icon.
The search box 303 can be used to listen for an operation (e.g., a touch operation) of searching for a setting item through text. In response to this operation, the electronic device may display a text input box so that the user can enter, in the input box, the setting item to be searched for.
The icon 304 can be used to listen to an operation (e.g., a touch operation) for searching for a setting item by voice. In response to the operation, the electronic device may display a voice input interface so that the user inputs a voice in the voice input interface to search for the setting item.
The area 305 includes one or more setting items, which may include: an account login setting item, a wireless and network setting item, a device connection setting item, an application and notification setting item, a battery setting item, a display setting item, a sound setting item, a function assistant setting item 305A, a security and privacy setting item, a user and account setting item, and the like. The representation of each setting item may include an icon and/or text, which is not limited in this application. Each setting item can be used to listen for an operation (e.g., a touch operation) that triggers display of the setting content of the corresponding setting item, and in response to the operation, the electronic device can open a user interface for displaying the setting content of the corresponding setting item.
In some embodiments, upon detecting an operation (e.g., a touch operation) on the function assistant setting item 305A in the user interface 31 shown in FIG. 3A, the electronic device may display the user interface 32 shown in FIG. 3B.
As shown in fig. 3B, the user interface 32 is used to display the corresponding contents of the function assistant setting items. The user interface 32 may include: status bar 306, title bar 307, switch controls 308 for "function assistant", and prompt information 309.
The status bar 306 can refer to the status bar 201 in the user interface 21 shown in fig. 2A, and is not described in detail here.
The title bar 307 may include: a return key 307A, and current page indicators 307B and 307C. The return key 307A is an APP-level return key, used to return to the upper-level menu. The upper-level page of the user interface 32 may be the user interface 31 shown in fig. 3A. The current page indicators 307B and 307C may be used to indicate the current page; for example, the text information "settings" and "function assistant" may be used to indicate that the current page is used to show the corresponding content of the function assistant setting item. Not limited to text information, the current page indicators 307B and 307C may also be icons.
The switch control 308 is used to listen for operations (e.g., touch operations) to turn "function assistant" on/off. As shown in fig. 3B, when an operation on the switch control 308 (e.g., a touch operation on the switch control 308) is detected, the electronic device 100 may turn on "function assistant" in response to the operation. The switch control 308 may be represented in the form of a text message or an icon.
The prompt 309 may be used to introduce a "function assistant" that prompts the user for the role of the "function assistant". The presentation of the reminder 309 may be a text message or an icon.
In the embodiment of the present application, as in the case where the "function assistant" is turned on by the operation shown in fig. 2A and 2B, after the "function assistant" is turned on by the operation shown in fig. 3B, part or all of the third-party applications installed in the electronic device have the right to use the enhanced function of the electronic device. The part of the third-party application may be set by the electronic device as a default, or may be set by the user, which is not limited in the embodiment of the present application.
In other embodiments, upon detecting an operation (e.g., a touch operation) on the function assistant setting item 305A in the user interface 31 shown in FIG. 3A, the electronic device may display the user interface 33 shown in FIG. 3C.
As shown in fig. 3C, the user interface 33 is used to display the corresponding content of the function assistant setting item. The user interface 33 may include: a status bar 310, a title bar 311, prompt information 312, and "function assistant" switch controls 313-316 respectively corresponding to one or more third-party applications.
The status bar 310 can refer to the status bar 201 in the user interface 21 shown in fig. 2A, and is not described in detail here.
The title bar 311 may include: a return key 311A, and current page indicators 311B and 311C. The return key 311A is an APP-level return key, used to return to the upper-level menu. The upper-level page of the user interface 33 may be the user interface 31 shown in fig. 3A. The current page indicators 311B and 311C may be used to indicate the current page; for example, the text information "settings" and "function assistant" may be used to indicate that the current page is used to show the corresponding content of the function assistant setting item. Not limited to text information, the current page indicators 311B and 311C may also be icons.
The prompt 312 may refer to the prompt 309 in the user interface 32 shown in fig. 3B, and is not described here.
The "function assistant" switch controls 313 and 316 respectively corresponding to one or more third-party applications can be used to monitor operations (e.g., touch operations) for turning on/off the "function assistant". The one or more third party applications may be all third party applications installed in the electronic device. As shown in fig. 3C, when an operation on the switch control 313 and 316 (e.g., a touch operation on the switch control 313 and 316) is detected, in response to the operation, the electronic device may turn on/off the right of the corresponding third-party application to use the enhanced function of the electronic device. For example, as shown in fig. 3C, when an operation on the switch control 316 is detected, the electronic device may allow a third party application Facebook (Facebook) to use some or all of the enhanced functionality provided with the electronic device. The representation of the switch controls 313-316 can be text information or icons.
In some embodiments, after the electronic device opens the right of the third-party application to use the "function assistant" through the operation shown in fig. 3C, the third-party application is added to the "function assistant" white list.
The "function helper" may be turned on in some embodiments by other means, not limited to the several ways of turning on the "function helper" shown in fig. 2A-2B and 3A-3C. In other embodiments, the electronic device may also turn on the "function assistant" by default, for example, automatically after power-on.
In some embodiments of the present application, after the electronic device turns on the "function assistant", a prompt that the "function assistant" has been turned on may also be displayed in the status bar 201. For example, an icon of "function assistant" is displayed in the status bar 201 or the text "function assistant" or the like is directly displayed.
In some embodiments of the present application, the electronic device may maintain a "function assistant" whitelist and add third-party applications that are permitted to use the enhanced functions of the electronic device to the "function assistant" whitelist. That is, third-party applications in the "function assistant" whitelist have permission to use the enhanced functions of the electronic device. If the electronic device turns on the "function assistant" in the manner shown in fig. 2A-2B, the electronic device may add some or all of the third-party applications to the "function assistant" whitelist. If the electronic device turns on the "function assistant" in the manner shown in fig. 3A and 3B, the electronic device may add some or all of the third-party applications to the "function assistant" whitelist. If the electronic device turns on the "function assistant" in the manner shown in fig. 3A and 3C, the electronic device may add the corresponding third-party applications to the "function assistant" whitelist.
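A minimal sketch of such a whitelist, assuming it is simply a set of package names; how the real system persists or enforces it is not specified by this embodiment, and the class and method names are illustrative only.

```kotlin
class FunctionAssistantWhitelist {
    private val packages = mutableSetOf<String>()

    fun add(packageName: String) { packages += packageName }      // per-app switch turned on (fig. 3C)
    fun remove(packageName: String) { packages -= packageName }   // per-app switch turned off

    // Checked before a third-party application is allowed to use an enhanced function.
    fun hasPermission(packageName: String): Boolean = packageName in packages
}
```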
It can be understood that, without being limited to the manner of turning on the "function assistant" shown in fig. 2A to 2B and fig. 3A to 3C to open some or all of the rights of the third-party application to use the enhanced functions of the electronic device, the rights of the third-party application to use the enhanced functions of the electronic device may also be opened in other manners in the embodiment of the present application. For example, the user may set the right to use the enhanced function of the electronic device for a third party application each time the third party application is newly downloaded.
Some embodiments of the graphical user interface implemented on the electronic device 100 when a user enables enhanced functionality provided by the electronic device through a "feature assistant" during use of a third-party application are described below in conjunction with two different usage scenarios.
Use scenario one: video call scenario
In the UI embodiments exemplarily illustrated in fig. 4A-4H below, the electronic device may provide enhanced functionality for the user to use when the user manipulates a third-party application installed on the electronic device. That is, a third party application on the electronic device may invoke enhanced functionality of the electronic device. The following embodiments of fig. 4A to 4H take a usage scenario as a video call scenario as an example to describe a method for a third-party application on an electronic device to invoke an enhanced function of the electronic device.
The following describes a "video call interface" provided by a third-party application installed on the electronic device exemplarily shown in fig. 4A-4H.
In some embodiments, a "video call interface" may be used to display images of both video call parties, one or more controls associated with the video call. The image of one party of the video call (i.e. the image of the local user of the electronic device) may be an image collected by the camera 193 of the electronic device, and the camera may be a front camera or a rear camera. The image of the other party to the video call (i.e., the image of the peer user) may be an image sent by the electronic device via other electronic devices that the electronic device receives via a third party application that provides the "video call interface. The associated control at the time of the video call may be used to receive a user operation (e.g., a touch operation), in response to which the electronic device may perform one or more of: switching from a video call to a voice call, hanging up the video call, or switching cameras, etc.
The user interface 41 exemplarily shown in fig. 4A-4H may be one implementation of a "video call interface". The user interface 41 may be provided by an instant messaging type third party application, such as WeChat. In some embodiments, the user interface 41 may be a user interface that is opened after the user clicks the WeChat icon 206 in FIG. 2A and selects a contact to initiate a video call. In other embodiments, the user interface 41 may be a user interface that is opened after the user clicks the WeChat icon 206 in FIG. 2A and accepts a video call initiated by a contact. In still other embodiments, it may also be a user interface that the user invokes by voice.
As shown in fig. 4A-4H, the user interface 41 may include: a status bar 401, a display area 402 of an image of an opposite end user, a display area 403 of an image of a home end user, a call duration indicator 404, a control 405, a control 406, and a control 407. In some embodiments, the user interface 41 may also include a navigation bar (not shown) that may be hidden, which may refer to the navigation bar 205 in FIG. 2A.
The status bar 401 can refer to the status bar 201 in the user interface 21 shown in fig. 2A, and is not described in detail here.
The image of the end user display area 402 is used to display an image of the end user of the video call. The image of the peer user is an image sent by the other electronic device that the electronic device receives through the third party application providing the user interface 41.
The home user image display area 403 is used for displaying an image of the home user of the video call. The image of the home user is an image captured by the camera 193 of the electronic device. The camera is a front camera or a rear camera.
The call duration indicator 404 is used to indicate the duration of the video call. The call duration indicator 404 may be the text "00:50", indicating that the duration of the current video call is 50 seconds.
Control 405 is used to monitor the operation of switching the video call to a voice call. In response to a detected operation (e.g., a touch operation) acting on the control 405, the electronic device may switch a video call between the current home user and the peer user to a voice call.
Control 406 is used to listen for an operation to hang up the video call. In response to a detected operation (e.g., a touch operation) acting on control 406, the electronic device can hang up the video call between the current home user and the peer user.
The control 407 is used to monitor the operation of switching cameras. In response to a detected operation (e.g., a touch operation) acting on the control 407, the electronic device may switch the currently enabled camera, e.g., switch from the front camera to the rear camera, or from the rear camera to the front camera.
The representations of controls 405, 406, and 407 may be icons and/or textual information.
In the case where the "feature assistant" is not turned on, the user may turn on the "feature assistant". The user may turn on "feature assistant" in window 217 as shown in FIG. 2B, or in the setup interface shown in FIG. 3B or FIG. 3C. Reference is made to the preceding description of embodiments. Without being limited thereto, in some embodiments, the electronic device may also turn "function assistant" on by default.
In some embodiments, the user may also be prompted to turn on the "feature assistant" when the electronic device displays the user interface 41 as shown in FIG. 4A, in the event that the "feature assistant" is not turned on, i.e., when the third-party application currently launched by the electronic device does not have the right to use the enhanced features of the electronic device. The manner in which the electronic device prompts the user to turn on the "function assistant" may include, but is not limited to: prompt information (not shown in the figure), such as text information, icons, etc., sounding a prompt tone, vibrating, etc., is displayed on the user interface 41.
In the case where the "function assistant" is turned on, i.e., when the third-party application currently launched by the electronic device has the right to use the enhanced functions of the electronic device, the electronic device may display a user interface 41 as shown in fig. 4B. The user interface 41 as shown in FIG. 4B may include a small window 408 for introducing a "function assistant". After the "function assistant" is turned on, the electronic device may display the widget 408 only when the user interface provided by the third-party application is opened for the first time, or may display the widget 408 when the user interface provided by the third-party application is opened each time, which is not limited in this application.
As shown in fig. 4B, the small window 408 includes: effect display information 408A before the "function assistant" is turned on, effect display information 408B after the "function assistant" is turned on, prompt information 408C, a control 408D, and a control 408E.
The effect presentation information 408A and 408B may present different effects brought by the "function assistant" before and after the activation, so that the user intuitively feels the effect of the "function assistant". The presentation form of the effect presentation information 408A and 408B may be a picture and/or text.
The prompt 408C is used to introduce the function and the activation mode of the "function assistant". The presentation of the reminder 408C may be text, such as the text shown in FIG. 4B.
Control 408D may be used to listen for operations that stop displaying this portlet 408 and close the currently open third-party application's right to use the enhanced functionality of the electronic device. In response to a detected operation (e.g., a touch operation) acting on control 408D, the electronic device can cease displaying widget 408 and close the currently open third-party application (e.g., WeChat) permission to use the enhanced functionality of the electronic device. In some embodiments, the electronic device may only close the permission of the third-party application to use the enhanced functions of the electronic device this time, and restore the permission of the third-party application to use the enhanced functions of the electronic device when the third-party application is opened next time. In other embodiments, the electronic device may turn off the third-party application's permission to use the enhanced functionality of the electronic device, moving the third-party application out of the "function assistant" white list. The control 408D may be represented in the form of an icon or text (e.g., the text "cancel" in fig. 4B).
Control 408E can be used to listen for operations that stop displaying this widget 408 and present the enhanced functions provided by the electronic device. In response to a detected operation (e.g., a touch operation) acting on control 408E, the electronic device can cease displaying widget 408 and present the enhanced functions currently provided by the electronic device. The control 408E may be represented in the form of an icon or text (e.g., the text "start experience" in fig. 4B).
The user interface 41 illustratively shown in fig. 4C-4H may present one implementation of the enhanced functionality provided for the electronic device.
In response to a detected operation (e.g., a touch operation) acting on the control 408E, the electronic device may display the user interface 41 as shown in fig. 4C. As shown in fig. 4C, a control 409 may be included in the user interface 41. The control 409 can be used for monitoring an operation of expanding and displaying the controls corresponding to the enhanced functions currently provided by the electronic device. The representation of control 409 may include icons and/or text. The control 409 is not limited to being displayed at the middle position on the right side of the user interface 41; the control 409 may also be displayed at other positions, such as the middle position on the left side of the screen, the middle position at the top of the screen, and the like. In some embodiments, control 409 may also be dragged by the user to any location on the screen. In some embodiments, the user interface shown in FIG. 4C may also include a prompt message 410. The prompt 410 may be used to prompt the user about how to view the enhanced functions currently provided by the electronic device, such as the text "click on icon, view more functions" shown in bubble form in fig. 4C. The electronic device may display the prompt message 410 only when the control 409 is displayed on the user interface 41 for the first time, or may display the prompt message 410 each time the control 409 is displayed on the user interface 41, which is not limited in this application. Not limited to the prompt 410, in other embodiments, the electronic device may prompt the user by voice or the like to view the enhanced functions currently provided by the electronic device.
Here, the enhanced functions currently provided by the electronic device may be determined in any of the following manners: 1. the enhanced functions currently provided by the electronic device may be all of the enhanced functions provided by the electronic device; 2. the enhanced functions currently provided by the electronic device may be the enhanced functions that can be used by the currently launched third-party application, such as WeChat; 3. the enhanced functions currently provided by the electronic device may be the enhanced functions adapted to the current usage scenario (e.g., a video call scenario). The determination of the enhanced functions provided when the electronic device runs a third-party application will be described in detail in the following embodiments, and details are not repeated here.
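The three manners above can be sketched as a simple selection function. The function catalogs and the package name below are assumptions for illustration and are not the device's actual configuration or decision logic.

```kotlin
enum class Scenario { VIDEO_CALL, PHOTO, OTHER }

val allEnhancedFunctions = setOf("beauty", "body beautification", "blurring")
val perAppFunctions = mapOf("com.example.wechat" to setOf("beauty", "body beautification", "blurring"))
val perScenarioFunctions = mapOf(Scenario.VIDEO_CALL to setOf("beauty", "body beautification", "blurring"))

fun currentlyProvidedFunctions(manner: Int, packageName: String, scenario: Scenario): Set<String> =
    when (manner) {
        1 -> allEnhancedFunctions                            // manner 1: all enhanced functions of the device
        2 -> perAppFunctions[packageName] ?: emptySet()      // manner 2: functions usable by the launched app
        else -> perScenarioFunctions[scenario] ?: emptySet() // manner 3: functions adapted to the scenario
    }
```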
The user interface 41 shown in FIG. 4D illustrates one implementation in which the electronic device expands and displays the controls corresponding to the currently provided enhanced functions. In the fig. 4D embodiment, the enhanced functions currently provided by the electronic device include: a "beauty" (face beautification) function, a "body beautification" function, and a "blurring" function. The user interface 41 may be a user interface opened by the electronic device in response to a detected operation (e.g., a click operation) acting on the control 409 shown in fig. 4C. The user interface 41 may also be a user interface that is opened after the user clicks the WeChat icon 206 in fig. 2A and selects a contact to initiate a video call or accepts a video call initiated by a contact, or may be a user interface that is opened by the electronic device in response to the user's voice.
As shown in fig. 4D, the user interface 41 may include: control 411A, control 411B, and control 411C.
The control 411A is used to listen for an operation to turn on the "beauty" function. In response to a detected operation (e.g., a touch operation) acting on the control 411A, the electronic device may turn on the "beauty" function. The "beauty" function may be used to beautify the captured facial image of the home user in the current video call interface, and the beautification may include, for example, skin whitening, skin smoothing (e.g., removing pockmarks, freckles, wrinkles, etc. on a person's face), and so on.
The control 411B is used to listen for an operation to turn on the "body beautification" function. In response to a detected operation (e.g., a touch operation) acting on the control 411B, the electronic device may turn on the "body beautification" function. The "body beautification" function may be used to beautify the captured body image of the home user in the current video call interface, and may include, for example, beautifying the body proportion (e.g., lengthening legs, widening shoulders, etc.) and adjusting body shape (e.g., slimming the waist, legs, belly, or buttocks, or plumping body parts, etc.).
The control 411C is used to listen for an operation to turn on the "blurring" function. In response to a detected operation (e.g., a touch operation) acting on the control 411C, the electronic device may turn on the "blurring" function. The "blurring" function may be used to blur the captured image of the home user in the current video call interface, which may include, for example, making the depth of field of the image shallow and focusing on a subject.
The representations of controls 411A, 411B, and 411C may include icons and/or text. The method is not limited to the controls 411A to 411C, and the electronic device may also provide controls corresponding to other enhanced functions, which is not limited in the present application.
In some embodiments, the user interface 41 shown in fig. 4D may further include prompt information 411D, prompt information 411E, and prompt information 411F corresponding to the control 411A, the control 411B, and the control 411C, respectively. The prompt messages can be used for introducing the enhanced functions corresponding to the corresponding controls. For example, the prompt information 411D is used to introduce that the enhanced function corresponding to the control 411A is the "beauty" function, the prompt information 411E is used to introduce that the enhanced function corresponding to the control 411B is the "body beautification" function, and the prompt information 411F is used to introduce that the enhanced function corresponding to the control 411C is the "blurring" function. The presentation of the prompt information 411D-411F may be in the form of text presented as bubbles. The electronic device may display the prompt information 411D-411F only when the controls 411A-411C are displayed on the user interface 41 for the first time, or may display the prompt information 411D-411F each time the controls 411A-411C are displayed on the user interface 41, which is not limited in this application.
The user interface 41 exemplarily shown in FIG. 4E illustrates one implementation in which a third-party application invokes or enables the "body beautification" function provided by the electronic device.
Referring to fig. 4E, the user interface 41 shown in fig. 4E may be a user interface that is opened by the electronic device in response to a detected operation (e.g., a click operation) acting on the control 411B shown in fig. 4D. As shown in fig. 4E, the user interface 41 may include: a "body beautification" icon 412A, a control 412B, and a body beauty level indicator 412C.
"beauty" icon 412A may be used to indicate that the enhanced functionality currently enabled by the user's selection includes "beauty" functionality. In some embodiments, the "beauty" icon may also be highlighted to indicate that the enhanced functionality currently selected for use by the user includes "beauty functions". Without limitation to highlighting, the "beauty" icon may also present other display states (e.g., underlined or shaded, etc.) to indicate that the currently selected enhanced functionality includes "beauty" functionality.
Control 412B may be a slider. The control 412B may be used to receive an adjustment operation, by the user, on the body beauty level of the person being photographed (i.e., the home user of the video call); the adjustment operation may be a sliding operation applied to the control 412B. Here, the body beauty level is the degree to which the body of the photographed person is beautified, and may also be referred to as the body beauty grade. For example, there may be 11 body beauty grades from grade 0 to grade 10; the higher the grade, the greater the degree of beautification. Body beauty grade 0 may indicate that the body image of the photographed person is not beautified, in which case the body shape represented by the body image displayed in the display area 403 of the image of the home user is identical to the actual body shape of the photographed person, i.e., no beautification occurs. Body beauty grade 10 may indicate that the body of the photographed person is beautified to a greater extent, in which case the body image displayed in the display area 403 shows a figure beautified relative to the actual body shape of the photographed person.
The body beauty level indicator 412C is used to indicate the body beauty level currently selected by the user. The indicator 412C may be represented in the form of text, such as the text "7" in fig. 4E, which indicates that the body beauty level currently selected by the user is 7. As shown in fig. 4E, the body image of the person being photographed displayed in the home-end user image display area 403 is beautified compared with that in figs. 4A-4D.
The user interface 41 shown in fig. 4F may be one implementation in which the electronic device stops displaying, or collapses, the controls or other interactive elements corresponding to the currently provided enhanced functions. In some embodiments, the user interface 41 shown in fig. 4F may be displayed if the electronic device does not receive a user operation on the control 412B of the user interface 41 shown in fig. 4E within a preset time. In other embodiments, the electronic device may display the user interface 41 shown in fig. 4F in response to a detected operation (e.g., a touch operation) on a blank area of the user interface 41 shown in fig. 4E. Here, the blank area of the user interface 41 refers to the areas other than controls and other interactive elements, and may be, for example, the home-end user image display area 403.
As shown in fig. 4F, the user interface 41 may stop displaying or collapse the controls or other interactive elements corresponding to the enhanced functions currently provided by the electronic device, i.e., stop displaying the "body beauty" icon 412A, the control 412B, and the body beauty level indicator 412C, and display the control 409. The control 409 in fig. 4F is the same as the control 409 shown in fig. 4C; reference may be made to the related description of the fig. 4C embodiment.
In some embodiments, the electronic device can also stop displaying the control 409. Figs. 4F-4H illustrate one implementation in which the electronic device stops displaying the control 409.
The control 409 in fig. 4F may also be used to listen for an operation (e.g., a long-press gesture) for stopping the display of the control 409. In response to an operation (e.g., a long-press gesture) acting on the control 409 shown in fig. 4F, the electronic device can display the control 414 shown in fig. 4G. The control 414 can be used to receive a user operation (e.g., a long-press gesture), and the electronic device can stop displaying the control 409 in response to the user operation received on the control 414. Illustratively, after responding to the user operation received on the control 414, the electronic device may display the user interface 41 shown in fig. 4H, in which the control 409 is no longer displayed. Although the electronic device no longer displays the control 409 on the user interface 41 shown in fig. 4H, it continues to provide enhanced functions and keeps enabled the enhanced functions selected by the user. After the control 409 in the user interface 41 is hidden, the electronic device has more display area for the image of the opposite-end user or the image of the home-end user, which makes the video call more convenient and improves user experience.
In some embodiments, a prompt 413 may also be included in fig. 4F. The prompt 413 may be used to prompt the user with the way to stop displaying the control 409, such as the bubble-form text "long press icon, hide function assistant" shown in fig. 4F.
In some embodiments, after the user hides the control 409, if the user wants to view the control 409 again and use or enable enhanced functionality provided by the electronic device through the control 409, the control 409 can be recalled by user operation. The user action to recall the control 409 may be: an operation (e.g., a touch operation or a click operation) that acts on a blank area in the user interface 41 shown in fig. 4H. Here, the blank area may refer to other areas besides the control and other interactive elements, and may be, for example, the home end user image display area 403.
The page layout of the "video call interface" is not limited to the layouts shown in figs. 4A-4H and may take other forms; this is not limited in this embodiment of the present application. Nor is the third-party application limited to WeChat: other social communication third-party applications (e.g., Skype, WhatsApp, etc.) may also provide similar "video call interfaces", and when they do, the electronic device may likewise provide enhanced functions in a manner similar to that described above with respect to figs. 4A-4H.
The implementation shown in fig. 4D, in which the electronic device expands and displays the controls corresponding to the currently provided enhanced functions in response to a detected operation acting on the control 409, is not the only option. In this embodiment of the application, the electronic device may also expand and display these controls in other manners; for example, the user may call out a sidebar, a pull-down bar, or a pull-up bar by sliding, to view the controls corresponding to the enhanced functions currently provided by the electronic device. When the user slides to call out the sidebar to view these controls, the sliding direction is not limited in this application.
In the UI embodiments exemplarily shown in fig. 5D-5F, the user may call up the sidebar through a gesture of sliding to the left to view a control corresponding to the enhanced function currently provided by the electronic device. The user interfaces exemplarily shown in fig. 5D-5F are explained below.
The user interface 51 exemplarily shown in fig. 5A-5B may refer to the user interface 41 shown in fig. 4A-4B, and is not repeated herein.
In some embodiments, in response to a detected operation (e.g., a touch operation) acting on the text "start experience" shown in fig. 5B, the electronic device may display the user interface 51 shown in fig. 5C. As shown in fig. 5C, the user interface 51 may include a prompt 501. The prompt 501 may be used to prompt the user with the way to view the enhanced functions currently provided by the electronic device, such as the bubble-form text "slide left, function assistant" shown in fig. 5C.
As shown in fig. 5C, the electronic device may detect an operation of sliding left on the screen and display the user interface 51 shown in fig. 5D. The user interface 51 shown in fig. 5D illustrates one implementation of the electronic device expanding and displaying the controls corresponding to the enhanced functions currently provided. In the fig. 5D embodiment, the enhanced functions currently provided by the electronic device include: the "beauty" function, the "body beauty" function, and the "blurring" function.
As shown in fig. 5D, the user interface 51 includes: control 502A, control 502B, and control 502C. The control 502A, the control 502B, and the control 502C may refer to the control 411A, the control 411B, and the control 411C in fig. 4D, which are not described herein again. In some embodiments, the user interface 51 may further include: hint information 502D, hint information 502E, and hint information 502F. The prompt information 502D, the prompt information 502E, and the prompt information 502F may refer to the prompt information 411D, the prompt information 411E, and the prompt information 411F in fig. 4D, which are not described herein again.
The user interface 51 exemplarily shown in fig. 5E may be one implementation of invoking or enabling, for a third-party application, the "body beauty" function provided by the electronic device. Referring to fig. 5E, the user interface 51 shown in fig. 5E may be a user interface that the electronic device opens in response to a detected operation (e.g., a click operation) acting on the control 502B shown in fig. 5D. As shown in fig. 5E, the user interface 51 may include: a "body beauty" icon 503A, a control 503B, and a body beauty level indicator 503C. The "body beauty" icon 503A, the control 503B, and the body beauty level indicator 503C may refer to the "body beauty" icon 412A, the control 412B, and the body beauty level indicator 412C in fig. 4E, which are not described herein again.
The user interface 51 shown in fig. 5F may be one implementation of the electronic device stopping displaying or collapsing displaying a control or other interactive element corresponding to the currently provided enhanced function. In some embodiments, if the electronic device does not receive a user operation on the control 503B of the user interface 51 shown in fig. 5E within a preset time, the user interface 51 shown in fig. 5F may be displayed. In other embodiments, the electronic device may display the user interface 51 as shown in fig. 5F in response to a detected operation (e.g., a touch operation) of a blank area on the user interface 51 as shown in fig. 5E. Here, the blank area on the user interface 51 may refer to other areas besides the control and other interactive elements, and may be, for example, a home-end user image display area.
As shown in fig. 5F, the user interface 51 may stop displaying or collapse the controls or other interactive elements corresponding to the enhanced functions currently provided by the electronic device, that is, stop displaying the "body beauty" icon 503A, the control 503B, and the body beauty level indicator 503C.
With the embodiments exemplarily shown in figs. 4A-4H or figs. 5A-5F, the electronic device may provide enhanced functions in a video call scenario; that is, a third-party application may invoke the enhanced functions of the electronic device in the video call scenario. In other words, the electronic device extends its enhanced functions into the third-party application, so that the user can also use them when using the third-party application. The enhanced functions of the electronic device are therefore fully utilized when the user uses the third-party application, which improves user experience.
The usage scenario is not limited to a video call scenario; when the electronic device runs a third-party application, the enhanced functions can also be provided in other usage scenarios. These usage scenarios may include some or all of the scenarios that can be provided by the third-party applications installed on the electronic device. Illustratively, these usage scenarios may include, but are not limited to: a social communication scenario (e.g., text or voice chat), a video call scenario, a video playing scenario, a news information scenario, a video recording scenario, a game scenario, a shopping scenario, a live broadcast scenario, an audio recording scenario, and the like.
(II) usage scenario II: live broadcast scene
In the UI embodiments exemplarily illustrated in figs. 6A-6H below, the electronic device may provide enhanced functions for the user to use when the user operates a third-party application installed on the electronic device; that is, a third-party application on the electronic device may invoke the enhanced functions of the electronic device. The embodiments of figs. 6A-6H take a live broadcast scenario as an example of the usage scenario to describe a method for a third-party application on the electronic device to invoke the enhanced functions of the electronic device. A live broadcast scenario refers to a scenario in which the electronic device uses the internet and multimedia communication technology to access a network live broadcast platform that integrates audio, video, desktop sharing, document sharing, and interaction, so that voice, video, and data are communicated and interacted online. The live broadcast scenarios involved in figs. 6A-6H are scenarios that the electronic device provides to the party viewing the live broadcast.
The embodiment of fig. 6A-6H differs from the embodiment of fig. 4A-4H in that the electronic device has different usage scenarios for providing enhanced functionality, which are described in detail below.
The following describes a "live interface" provided by a third-party application installed on the electronic device exemplarily illustrated in fig. 6A-6H.
In some embodiments, a "live interface" may be used to display images of a main broadcast (a user publishing voice, video, data on a live platform) acquired by an electronic device from a network, and associated controls while the electronic device is live. The related control at the time of live broadcast may be used to receive a user operation (e.g., a touch operation), and in response to the user operation, the electronic device may perform one or more of the following: opening a comment frame, displaying information, displaying live friends, sharing links of the live interface and the like.
The user interface 61 exemplarily shown in figs. 6A-6H may be one implementation of a "live interface". The user interface 61 may be provided by a live-broadcast third-party application (e.g., Douyu Live, Huya Live, Panda Live, etc.). In some embodiments, the user interface 61 may be a user interface opened by the electronic device in response to a user operation (e.g., a click operation) on an icon (not shown) of the live-broadcast third-party application in fig. 2A, or may be a user interface opened in response to a voice instruction from the user.
As shown in fig. 6A-6H, the user interface 61 may include: status bar 601, current page indicator 602, live image display area 603, comment display area 604, controls 605A-605D. In some embodiments, the user interface 61 may also include a navigation bar (not shown) that may be hidden, which may refer to the navigation bar 205 in FIG. 2A.
The status bar 601 can refer to the status bar 201 in the user interface 21 shown in fig. 2A, and is not described in detail here.
The current page indicator 602 may be used to indicate a current page, such as the text "live room No. 1" shown in fig. 6A.
The live image display area 603 is used for displaying a live image received by the electronic device through a network.
The comment display area 604 is used to display comments posted by one or more home users or other users.
Control 605A may be used to listen for a user operation (e.g., a touch operation), in response to which the electronic device may display a comment box on the user interface 61 for the user to enter a comment. Control 605B may be used to listen for a user operation (e.g., a touch operation), in response to which the electronic device may display messages (e.g., private messages) received by the home-end user on the user interface 61. Control 605C may be used to listen for a user operation (e.g., a touch operation), in response to which the electronic device may display the home-end user's live-broadcast friends on the user interface 61. Control 605D may be used to listen for a user operation (e.g., a touch operation), in response to which the electronic device may share the link of the "live interface" to other devices.
In the case where the "feature assistant" is not turned on, the user may turn on the "feature assistant". The user may turn on "feature assistant" in window 217 as shown in FIG. 2B, or may turn on "feature assistant" in the setup interface shown in FIG. 3B or FIG. 3C. Reference is made to the preceding description of embodiments. Without being limited thereto, in some embodiments, the electronic device may also turn "function assistant" on by default.
In some embodiments, when the electronic device displays the user interface 61 shown in fig. 6A while the "function assistant" is not turned on, i.e., when the third-party application currently launched by the electronic device does not have permission to use the enhanced functions of the electronic device, the user may also be prompted to turn on the "function assistant". The manner in which the electronic device prompts the user may include, but is not limited to: displaying prompt information (not shown in the figure) such as text or icons on the user interface 61, sounding a prompt tone, vibrating, and the like.
In the case where the "function assistant" is turned on, i.e., when the third-party application currently launched by the electronic device has the right to use the enhanced functions of the electronic device, the electronic device may display a user interface 61 as shown in fig. 6B. The user interface 61 as shown in FIG. 6B may include a small window 606 for introducing a "function assistant". After the "function assistant" is turned on, the electronic device may display the small window 606 only when the user interface provided by the third-party application is opened for the first time, or may display the small window 606 when the user interface provided by the third-party application is opened each time, which is not limited in this application. The widget 606 may refer to the widget 408 in the user interface 41 shown in fig. 4B, and will not be described herein.
The user interface 61 exemplarily shown in figs. 6C-6H may be one implementation of presenting the enhanced functions provided by the electronic device.
In response to a detected operation (e.g., a touch operation) acting on the control 606A in the small window 606, the electronic device may display the user interface 61 shown in fig. 6C. As shown in fig. 6C, the user interface 61 may include a control 607. The control 607 can refer to the control 409 in fig. 4C and is not described in detail here. In some embodiments, the user interface shown in fig. 6C may also include prompt information 608. The prompt 608 refers to the prompt 410 in fig. 4C and is not described herein.
Here, the enhanced functions currently provided by the electronic device may be determined in the following manners: 1. The enhanced functions currently provided by the electronic device may be all of the enhanced functions possessed by the electronic device. 2. The enhanced functions currently provided by the electronic device may be the enhanced functions available to the currently launched third-party application (e.g., Douyu Live). 3. The enhanced functions currently provided by the electronic device may be the enhanced functions available in the current usage scenario (e.g., a live broadcast scenario). The determination manners of the enhanced functions provided when the electronic device runs a third-party application are described in detail in the following embodiments and are not repeated here.
The user interface 61 shown in fig. 6D illustrates one implementation of the electronic device expanding and displaying the controls corresponding to the enhanced functions currently provided. In the fig. 6D embodiment, the enhanced functions currently provided by the electronic device include: the "network acceleration" function, the "coding optimization" function, and the "sound beautification" function. The user interface 61 may be a user interface opened by the electronic device in response to a detected operation (e.g., a click operation) acting on the control 607 shown in fig. 6C. The user interface 61 may also be a user interface opened by the electronic device in response to a user operation (e.g., a click operation) on an icon (not shown in the figure) of the live-broadcast third-party application in fig. 2A, or in response to a voice instruction from the user.
As shown in fig. 6D, the user interface 61 may include: control 609A, control 609B, and control 609C.
The control 609A is used to listen for an operation of turning on the "network acceleration" function. In response to a detected operation (e.g., a touch operation) acting on the control 609A, the electronic device may turn on the "network acceleration" function. The "network acceleration" function can be used to accelerate the acquisition of live images in the current live interface, so that the user can watch the live images more smoothly without stuttering.
Control 609B is used to listen for an operation to turn on the "encode optimize" function. In response to a detected operation (e.g., a touch operation) acting on control 609B, the electronic device may turn on a "code optimization" function. The function of "coding optimization" can be used for improving the visual quality effect of live images in the current live interface.
The control 609C is used to listen for an operation of turning on the "sound beautification" function. In response to a detected operation (e.g., a touch operation) acting on the control 609C, the electronic device may turn on the "sound beautification" function. The "sound beautification" function can be used to beautify the audio corresponding to the live images in the live interface; the beautification may include processing the pitch, timbre, and the like of the audio.
The representations of control 609A, control 609B, and control 609C may include icons and/or text.
In some embodiments, when the control 609A, the control 609B, or the control 609C is selected by the user, i.e., the user selects to enable the corresponding enhanced functionality of the electronic device, the display state of the control may be used to indicate that the control is selected. The display state may include: adding background color, underlining, highlighting, etc.
The user interface 61 exemplarily shown in fig. 6E may be one implementation of a third-party application invoking the "coding optimization" function provided by the electronic device. As shown in fig. 6E, the control 609B is selected, i.e., the user has selected to enable the "coding optimization" function provided by the electronic device, and the background color of the control 609B may be gray. As shown in fig. 6E, the visual quality of the live image displayed in the live image display area 603 is improved.
The user interface 61 shown in fig. 6F may be one implementation of the electronic device stopping displaying or collapsing displaying a control or other interactive element corresponding to the currently provided enhanced function. In some embodiments, the user interface 61 shown in FIG. 6F may be displayed if the electronic device does not receive a user operation on the controls 609A-609C of the user interface 61 shown in FIG. 6E within a preset time. In other embodiments, the electronic device may display the user interface 61 as shown in fig. 6F in response to a detected operation (e.g., a touch operation) of a blank area on the user interface 61 as shown in fig. 6E. Here, the blank area on the user interface 61 may refer to other areas besides the control and other interactive elements, and may be, for example, the home-end live image display area 603.
As shown in fig. 6F, the user interface 61 may stop displaying or collapse displaying the controls or other interactive elements corresponding to the enhanced functions currently provided by the electronic device, i.e., stop displaying the controls 609A-609C.
In some embodiments, the electronic device can also stop displaying the control 607. Figs. 6F-6H illustrate one implementation of the electronic device stopping the display of the control 607. The manner of stopping the display of the control 607 is the same as the manner in which the electronic device stops displaying the control 409 shown in figs. 4F-4H; reference may be made to the related description, which is not repeated herein.
It is understood that, without being limited to the video call scenario or the live broadcast scenario mentioned in the above embodiments, in the embodiments of the present application the electronic device may also provide enhanced functions in other usage scenarios. The manner in which the electronic device provides enhanced functions in other usage scenarios is similar to the embodiments of figs. 4A-4H, figs. 5A-5F, and figs. 6A-6H; reference may be made to the related descriptions, and examples are not listed one by one here.
The determination manner of the enhanced functions provided to the user when the electronic device starts a third-party application is described in detail below. Here, the enhanced functions provided to the user when the electronic device starts a third-party application refer to the enhanced functions that the electronic device can offer for the user to select and enable after the third-party application is started, for example, the "beauty" function, the "body beauty" function, and the "blurring" function provided by the electronic device in the embodiments of figs. 4A-4H and figs. 5A-5F, and the "network acceleration" function, the "coding optimization" function, and the "sound beautification" function provided by the electronic device in the embodiments of figs. 6A-6H.
(I) In a first determination manner, the enhanced functions provided to the user by the electronic device include: all of the enhanced functions possessed by the electronic device.
The enhanced functions may be developed by the manufacturer based on the unique hardware and/or software of the electronic device. For example, a HUAWEI mobile phone is equipped with a Leica camera, based on which the mobile phone can provide a "large aperture shooting" function, an "ultra wide angle shooting" function, and the like when taking photos. For another example, a mobile phone manufacturer may independently develop algorithms to provide a "beauty" function, a "body beauty" function, a "blurring" function, and the like when taking photos; an "anti-shake" function and an "ultra wide angle shooting" function when recording short videos; a "super-resolution imaging" function, a "color enhancement" function, and the like when playing videos; a "sound beautification" function when recording music; and so on. It can be seen that there may be one or more enhanced functions of the electronic device.
Through the first determination mode, when the electronic device starts each third-party application, all enhanced functions of the electronic device can be provided for the user. The third-party application installed on the electronic equipment can call the enhanced function of the electronic equipment, and the user can make full use of the enhanced function provided by the electronic equipment, so that the experience of the user when using the third-party application is improved.
(II) In a second determination manner, the enhanced functions provided to the user by the electronic device include: among all the enhanced functions of the electronic device, the enhanced functions that can be used by the third-party application currently launched by the electronic device.
In the second determination manner, the enhanced functions that can be used by the third-party application currently launched by the electronic device may include: the enhanced functions corresponding to the one or more usage scenarios provided by that third-party application.
Specifically, the usage scenarios provided by a third-party application are generally not single: one third-party application may provide multiple usage scenarios, and different third-party applications may provide different usage scenarios. Illustratively, referring to table 1, table 1 exemplarily lists several usage scenarios provided by third-party applications. As shown in table 1, the usage scenarios provided by instant messaging third-party applications such as WeChat, Skype, and WhatsApp may include: social communication scenarios (such as text or voice chat), short video recording scenarios, video playing scenarios, news reading scenarios, audio playing scenarios, and the like; the usage scenarios provided by shopping third-party applications such as Taobao and Amazon may include shopping scenarios, live broadcast scenarios, and the like.
TABLE 1 usage scenarios provided by different third-party applications
It can be understood that, in the embodiment of the present application, the usage scenarios may be classified according to user requirements, and may also be classified according to other criteria, for example, resources called when the electronic device provides the usage scenarios, and the present application does not limit this.
In the embodiment of the application, each enhanced function of the electronic device has a corresponding use scene. In general, the enhancement function is used only in the corresponding usage scenario, and is not used in other usage scenarios. That is, different usage scenarios may correspond to different enhanced functionality.
In the embodiment of the application, the enhanced functions respectively corresponding to different usage scenarios may be configured in the electronic device in advance. Illustratively, referring to table 2, table 2 exemplarily lists the enhanced functions respectively corresponding to several usage scenarios. As shown in table 2, the "large aperture" function, the "super wide angle shooting" function, the "beauty" function, the "body beauty" function, and the like are applicable to a shooting scenario; the "super-resolution imaging" function, the "color enhancement" function, and the like are applicable to a video playing scenario; the "sound beautification" function is applicable to an audio recording scenario; and so on.
TABLE 2 enhanced functions respectively corresponding to different usage scenarios
In the second determination manner, the enhanced functions that can be used by the third-party application currently started by the electronic device may include: the enhanced functions corresponding to the one or more usage scenarios provided by the currently started third-party application. In some embodiments, the electronic device may match or determine the enhanced functions corresponding to the one or more usage scenarios provided by the third-party application when the third-party application is installed. In other embodiments, the electronic device may match or determine them when the third-party application is started.
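For illustration only, the pre-configured correspondence between usage scenarios and enhanced functions described with reference to table 2 could be expressed as a lookup table that the enhanced function matching module queries when a third-party application is installed or launched. The following Java sketch is an assumption rather than the disclosed implementation: the class, method, and constant names are illustrative, and the scenario-to-function entries merely reproduce the examples given above.

```java
import java.util.EnumMap;
import java.util.EnumSet;
import java.util.Map;
import java.util.Set;

// A minimal sketch of a pre-configured "usage scenario -> enhanced functions" table.
public final class EnhancedFunctionTable {

    public enum UsageScenario { SHOOTING, VIDEO_CALL, VIDEO_PLAYBACK, AUDIO_RECORDING, LIVE_VIEWING }

    public enum EnhancedFunction {
        LARGE_APERTURE, SUPER_WIDE_ANGLE, BEAUTY, BODY_BEAUTY, BLURRING,
        SUPER_RESOLUTION_IMAGING, COLOR_ENHANCEMENT, SOUND_BEAUTIFICATION,
        NETWORK_ACCELERATION, CODING_OPTIMIZATION
    }

    private static final Map<UsageScenario, Set<EnhancedFunction>> TABLE =
            new EnumMap<>(UsageScenario.class);

    static {
        TABLE.put(UsageScenario.SHOOTING, EnumSet.of(
                EnhancedFunction.LARGE_APERTURE, EnhancedFunction.SUPER_WIDE_ANGLE,
                EnhancedFunction.BEAUTY, EnhancedFunction.BODY_BEAUTY));
        TABLE.put(UsageScenario.VIDEO_CALL, EnumSet.of(
                EnhancedFunction.BEAUTY, EnhancedFunction.BODY_BEAUTY, EnhancedFunction.BLURRING));
        TABLE.put(UsageScenario.VIDEO_PLAYBACK, EnumSet.of(
                EnhancedFunction.SUPER_RESOLUTION_IMAGING, EnhancedFunction.COLOR_ENHANCEMENT));
        TABLE.put(UsageScenario.AUDIO_RECORDING, EnumSet.of(EnhancedFunction.SOUND_BEAUTIFICATION));
        TABLE.put(UsageScenario.LIVE_VIEWING, EnumSet.of(
                EnhancedFunction.NETWORK_ACCELERATION, EnhancedFunction.CODING_OPTIMIZATION,
                EnhancedFunction.SOUND_BEAUTIFICATION));
    }

    // Returns the enhanced functions usable by an application that provides the
    // given usage scenarios (the second determination manner).
    public static Set<EnhancedFunction> functionsFor(Set<UsageScenario> scenarios) {
        Set<EnhancedFunction> result = EnumSet.noneOf(EnhancedFunction.class);
        for (UsageScenario s : scenarios) {
            result.addAll(TABLE.getOrDefault(s, EnumSet.noneOf(EnhancedFunction.class)));
        }
        return result;
    }
}
```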
For example, referring to table 1 and table 2, the enhanced functions that WeChat can use may include the enhanced functions corresponding to the usage scenarios WeChat provides, for example: the "red packet acceleration" function, the "interesting chat" function, the "large aperture" function, the "super wide angle shooting" function, the "beauty" function, the "slimming" function, the "super-resolution imaging" function, the "color enhancement" function, the "read later" function, the "intelligent screen recognition" function, and the like.
As another example, the usage scenarios provided by Taobao include shopping scenarios and game scenarios, and the enhanced functions that Taobao can use may include: the "commodity historical price display" function, the "rapid commodity comparison" function, the "network acceleration" function, the "coding optimization" function, the "sound beautification" function, and the like.
In a second determination manner, the electronic device may first determine a currently started third-party application, then determine an enhanced function that can be used by the third-party application, and provide the enhanced function that can be used by the third-party application for the user to use. The third-party application currently launched by the electronic device may be a foreground application of the electronic device, and the foreground application may occupy screen focus to provide a graphical user interface for a user.
The electronic device may determine the currently launched third-party application in any one of the following manners: 1. Check the name of the task process at the top of the RunningTask stack; this name can be used to indicate the currently launched third-party application. 2. Obtain a list of currently running processes, traverse each process in the list, and determine whether the process is a foreground process; if so, the name of that process indicates the currently launched third-party application. 3. Obtain the usage of all applications through UsageStatsManager to determine the currently launched third-party application.
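As an illustration of manner 3 above, the following sketch uses the Android UsageStatsManager to approximate the currently launched (foreground) third-party application from recent usage statistics. It assumes the PACKAGE_USAGE_STATS permission has been granted by the user; apart from the Android APIs themselves, the class and method names are illustrative.

```java
import android.app.usage.UsageStats;
import android.app.usage.UsageStatsManager;
import android.content.Context;
import java.util.List;

// A sketch of inferring the foreground (currently launched) application from
// recent usage statistics.
public final class ForegroundAppDetector {

    // Returns the package name of the most recently used application within the
    // last few seconds, which approximates the currently launched third-party app.
    public static String currentForegroundPackage(Context context) {
        UsageStatsManager usm =
                (UsageStatsManager) context.getSystemService(Context.USAGE_STATS_SERVICE);
        long end = System.currentTimeMillis();
        long begin = end - 10_000; // look back 10 seconds
        List<UsageStats> stats =
                usm.queryUsageStats(UsageStatsManager.INTERVAL_DAILY, begin, end);
        if (stats == null || stats.isEmpty()) {
            return null;
        }
        UsageStats latest = null;
        for (UsageStats s : stats) {
            if (latest == null || s.getLastTimeUsed() > latest.getLastTimeUsed()) {
                latest = s;
            }
        }
        return latest.getPackageName();
    }
}
```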
Through the second determination mode, the electronic device can provide different enhanced functions for the user when starting different third-party applications. That is, the electronic device may provide the user with enhanced functionality appropriate for the currently launched third-party application. The user may use different enhanced features when using different third party applications on the electronic device.
(III) In a third determination manner, the enhanced functions provided to the user by the electronic device include: the enhanced functions applicable to the current usage scenario.
In some embodiments, the enhanced functions applicable to the current usage scenario may be the enhanced functions corresponding to the current usage scenario, as described with reference to table 2 and the related descriptions. For example, if the current usage scenario is a video call scenario, the enhanced functions provided by the electronic device to the user include: the "beauty" function, the "body beauty" function, and the "blurring" function; if the current usage scenario is a video playing scenario, the enhanced functions provided by the electronic device to the user include: the "super-resolution imaging" function and the "color enhancement" function. The electronic device may read, in one or more of the kernel layer, the system layer, or the application framework layer, the category of resources occupied by the currently started third-party application and/or the user interface currently provided by the third-party application, and thereby determine the current usage scenario.
For example, if the hardware resources occupied by the third-party application include a camera and a microphone, it may be determined that the current usage scenario is a video recording scenario. If the third-party application calls the processor (CPU) to perform video decoding and occupies an audio playing device (e.g., the speaker 170A), it may be determined that the current usage scenario is a video playing scenario. If the third-party application uses the graphics processing unit (GPU) to perform graphics rendering, combined with recognition of the current interface name (for example, when the third-party application currently running on the electronic device is "Honor of Kings", the electronic device may capture the interface name "com.tencent.tmgp.sgame.SGameActivity" from the background), it may be determined that the current usage scenario is a game scenario. If the third-party application calls the processor (CPU) to perform video encoding and decoding, calls the camera and the microphone, and the wireless communication module or the mobile communication module has data transmission (i.e., exchanges data with a network server), combined with the recognized current interface name (for example, when the third-party application "WeChat" currently running on the electronic device provides a video call scenario, the electronic device may capture the corresponding interface name "com.…"), it may be determined that the current usage scenario is a video call scenario. Here, the current interface refers to the graphical user interface, provided by the currently launched third-party application, that takes the screen focus. The electronic device recognizes the current interface in a manner similar to the manner of determining the currently launched third-party application; reference may be made to the related description.
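The scenario-inference rules described above could be expressed, purely as an illustration, as a rule-based classifier over the resource occupancy information and the recognized interface name. The boolean inputs and the substring checks in the sketch below are assumptions standing in for the information reported by the kernel layer, system layer, or application framework layer.

```java
// A simplified sketch of the scenario-inference rules described above; all names
// and the string checks are illustrative assumptions.
public final class UsageScenarioClassifier {

    public enum Scenario { VIDEO_RECORDING, VIDEO_PLAYBACK, GAME, VIDEO_CALL, UNKNOWN }

    public static Scenario classify(boolean cameraInUse,
                                    boolean micInUse,
                                    boolean cpuVideoCodecActive,
                                    boolean audioOutputActive,
                                    boolean gpuRenderingActive,
                                    boolean networkStreaming,
                                    String currentInterfaceName) {
        String name = currentInterfaceName == null ? "" : currentInterfaceName.toLowerCase();
        if (cpuVideoCodecActive && cameraInUse && micInUse && networkStreaming
                && name.contains("voip")) {
            return Scenario.VIDEO_CALL;       // codec + camera + mic + network traffic
        }
        if (cameraInUse && micInUse) {
            return Scenario.VIDEO_RECORDING;  // camera and microphone occupied
        }
        if (cpuVideoCodecActive && audioOutputActive) {
            return Scenario.VIDEO_PLAYBACK;   // video decoding plus audio playback device
        }
        if (gpuRenderingActive && name.contains("game")) {
            return Scenario.GAME;             // GPU rendering plus a game-like interface name
        }
        return Scenario.UNKNOWN;
    }
}
```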
In other embodiments, the enhanced functions applicable to the current usage scenario may refer to: among the enhanced functions corresponding to the current usage scenario, the enhanced functions that are compatible with the capabilities currently provided by the electronic device. That is, removing, from the enhanced functions corresponding to the current usage scenario, the enhanced functions that are mutually exclusive with the capabilities currently provided by the electronic device yields the enhanced functions applicable to the current usage scenario.
It is understood that due to resource conflicts or other reasons, some of the functions (including enhanced functions and non-enhanced functions) of the electronic device are mutually exclusive, and the electronic device cannot provide mutually exclusive functions at the same time. For example, the electronic device cannot provide an enhanced function "skin makeup" function and a non-enhanced function "anti-shake" function at the same time, cannot provide an enhanced function "blurring" function and a non-enhanced function "high dynamic range imaging (HDR)" function at the same time, and cannot provide an enhanced function "ultra wide angle shooting" function and a non-enhanced function "anti-shake" function at the same time.
The electronic device may read the parameters of the currently occupied resources to determine the functions that have currently been provided to the user. The resource parameters currently occupied by the electronic device may include, but are not limited to: camera parameters, codec parameters, display parameters, network parameters, audio parameters, and the like. The camera parameters may include resolution, frame rate, focus mode, and anti-shake mode; the codec parameters may include bit rate and I-frame interval; the display parameters may include resolution, display mode (e.g., SurfaceView or TextureView), and the like. For example, the electronic device may determine, from the read camera parameters (e.g., resolution and focus parameters), that a "skin makeup" function or an "anti-shake" function has already been provided.
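As a sketch of the exclusion step described above, the following code removes, from the enhanced functions matching the current usage scenario, those that conflict with functions the device is already providing. The exclusion pairs are taken from the examples in the preceding paragraphs; the class and identifier names are assumptions.

```java
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// A sketch of filtering out enhanced functions that are mutually exclusive with
// functions already provided (as read from currently occupied resource parameters).
public final class MutualExclusionFilter {

    // Example exclusion pairs drawn from the description: "skin makeup" vs "anti-shake",
    // "blurring" vs HDR, "ultra wide angle shooting" vs "anti-shake".
    private static final Map<String, Set<String>> EXCLUSIONS = Map.of(
            "SKIN_MAKEUP", Set.of("ANTI_SHAKE"),
            "BLURRING", Set.of("HDR"),
            "ULTRA_WIDE_ANGLE", Set.of("ANTI_SHAKE"));

    public static Set<String> applicableFunctions(Set<String> scenarioFunctions,
                                                  Set<String> alreadyProvided) {
        Set<String> result = new HashSet<>(scenarioFunctions);
        result.removeIf(candidate -> {
            for (String conflict : EXCLUSIONS.getOrDefault(candidate, Set.of())) {
                if (alreadyProvided.contains(conflict)) {
                    return true; // drop candidates that conflict with an active function
                }
            }
            return false;
        });
        return result;
    }
}
```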
Through the third determination mode, when the electronic device starts the third-party application, different enhanced functions can be provided for the user under different use scenes provided by the third-party application. That is, the electronic device may provide the user with enhanced functions suitable for the current usage scenario. The user may use different enhanced functionality when opening different interfaces of the same third party application on the electronic device.
In this application, the control for listening for an operation of turning on the enhanced function may be referred to as an enhanced function option, and the user interface for presenting the enhanced function option may be referred to as a first user interface. Exemplary implementations of the first user interface may include the user interface 41 shown in fig. 4D, the user interface 51 shown in fig. 5D, or the user interface 61 shown in fig. 6D. Exemplary implementations of enhanced functionality options may include control 411A, control 411B, control 411C, as shown in fig. 4D, or control 502A, control 502B, control 502C, as shown in fig. 5D, or control 609A, control 609B, control 609C, as shown in fig. 6D.
In some embodiments of the present application, an operation for starting a third-party application, or an operation for switching between a plurality of user interfaces provided by the third-party application, may be referred to as a first operation on the third-party application. Exemplary implementations of the first operation on the third-party application may include an operation of the user selecting a contact to initiate a video call in a user interface provided by WeChat (WeChat), or an operation of the user clicking on an icon of a live-type third-party application on a Home screen (Home screen).
In some embodiments of the present application, the control for listening to the operation of deploying the enhanced functionality options may be referred to as an enhanced functionality control, and the user interface for presenting the enhanced functionality control may be referred to as a second user interface. Exemplary implementations of the second user interface may include the user interface 41 shown in fig. 4C or the user interface 61 shown in fig. 6C. Exemplary implementations of the enhanced functionality control may include the control 409 in fig. 4C or the control 607 in fig. 6C.
In some embodiments of the present application, an exemplary implementation of an interactive element for listening to operations of a hidden enhanced functionality control may include control 414 shown in FIG. 4G or control 611 shown in FIG. 6G. The user's operation to evoke the interactive element may include a gesture on the enhanced functionality control, which may include a long press gesture, a swipe gesture, a double tap gesture, or a heavy press gesture, among others.
In some embodiments of the present application, the user interface 51 shown in FIG. 5C may be referred to as a third user interface. In response to a swipe gesture in a first direction of the third user interface, the electronic device displays the first user interface. The first direction may be a direction from the right side of the display screen to the left side, a direction from the top of the display screen to the bottom, a direction from the left side of the display screen to the right side, and the like.
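Purely as an illustration of detecting the first-direction swipe (here taken as a right-to-left fling) that expands the enhanced function options, the following sketch uses the Android GestureDetector. The distance and velocity thresholds and the callback are assumptions, not part of the disclosed implementation.

```java
import android.content.Context;
import android.view.GestureDetector;
import android.view.MotionEvent;

// A sketch of recognizing the "swipe in the first direction" that calls out the
// sidebar with the enhanced function options.
public class SwipeToShowOptions extends GestureDetector.SimpleOnGestureListener {

    private static final float MIN_DISTANCE_PX = 120f;
    private static final float MIN_VELOCITY_PX_PER_S = 200f;
    private final Runnable showEnhancedFunctionOptions;

    public SwipeToShowOptions(Runnable showEnhancedFunctionOptions) {
        this.showEnhancedFunctionOptions = showEnhancedFunctionOptions;
    }

    @Override
    public boolean onFling(MotionEvent e1, MotionEvent e2, float velocityX, float velocityY) {
        float dx = e2.getX() - e1.getX();
        // A leftward fling (negative dx) that is long and fast enough counts as
        // the first-direction swipe and expands the enhanced function options.
        if (dx < -MIN_DISTANCE_PX && Math.abs(velocityX) > MIN_VELOCITY_PX_PER_S) {
            showEnhancedFunctionOptions.run();
            return true;
        }
        return false;
    }

    public static GestureDetector attach(Context context, Runnable onSwipe) {
        return new GestureDetector(context, new SwipeToShowOptions(onSwipe));
    }
}
```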
In some embodiments of the present application, the control for adjusting the level of enhanced functionality provided by the electronic device may be referred to as an enhanced functionality menu. An exemplary implementation of the enhanced functionality menu may include a control 412B as shown in fig. 4E.
In some embodiments of the present application, the user interface for displaying the prompt information and the first interactive element may be referred to as a fourth user interface. Exemplary implementations of the prompt information may include the prompt information 408C in the small window 408 shown in fig. 4B, the prompt information in the small window shown in fig. 5B, or the prompt information in the small window 606 shown in fig. 6B. An exemplary implementation of the first interactive element may include the prompt 408E in the small window 408 shown in fig. 4B, or the control 606A in the small window 606 shown in fig. 6B.
In some embodiments of the present application, the user interface 21 shown in fig. 2B, the user interface 32 shown in fig. 3B, or the user interface 33 shown in fig. 3C may be referred to as a fifth user interface. The control in the fifth user interface for turning on the "function assistant" may be referred to as a second interactive element, and exemplary implementations of the second interactive element may include control 217A in user interface 21 shown in FIG. 2B, control 308 in user interface 32 shown in FIG. 3B, or controls 313-316 in user interface 33 shown in FIG. 3C.
In some embodiments of the present application, the electronic device may provide different enhanced functions in different usage scenarios. For example, the electronic device may display a sixth user interface in response to the second operation of the third-party application, the sixth user interface including one or more enhanced functionality options. A second operation on the third-party application may include an operation to switch between multiple user interfaces provided by the third-party application, and an exemplary implementation of the second operation may include an operation for the user to select and send a text message to a contact in a user interface provided by WeChat (WeChat). An exemplary implementation of the sixth user interface may include a text chat interface. The one or more enhanced feature options in the sixth user interface are different from the one or more enhanced feature options in the first user interface.
In the embodiment of the present application, the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of the electronic device 100.
Referring to fig. 7, fig. 7 shows a block diagram of a software structure of the electronic device 100 exemplarily provided in the embodiment of the present application. The electronic device 100 may invoke enhanced functionality of the electronic device when the third party application is enabled.
As shown in fig. 7, the electronic device may include: an application layer, an Application Programming Interface (API), an enhanced function matching module, an enhanced function penetration module, a Hardware Abstraction Layer (HAL) layer, and a kernel layer (kernel). Wherein:
The application layer includes a series of application packages, such as the "function assistant" and third-party applications. The "function assistant" may be a service or function provided by the electronic device; refer to the related description of the foregoing embodiments. The "function assistant" may support providing the enhanced functions of the electronic device when a third-party application is enabled; the enhanced functions of the electronic device may refer to the related descriptions of the foregoing embodiments, and may include, for example, a "beauty" function, a "large aperture" function, a "wide angle" function, an "anti-shake" function, and the like. The third-party applications may refer to the related descriptions of the foregoing embodiments, and may include, for example, WeChat, Messenger, Skype, Douyin, etc.
In some embodiments, a "function assistant" may be used to determine whether a third-party application currently launched by an electronic device has permission to use enhanced functions of the electronic device.
The application program interface is used for realizing communication between the application program layer and the HAL layer and the kernel layer (kernel). For example, communication between third party applications and a HAL layer and kernel layer (kernel) may be provided, among others.
The enhanced function matching module is used for determining enhanced functions provided for the user by the electronic equipment when the third-party application is started, namely determining the enhanced functions provided for the currently running third-party application by the electronic equipment.
In some embodiments, the enhanced function matching module may determine all enhanced functions provided by the electronic device as the enhanced functions provided to the user by the electronic device when the third-party application is started. Here, the manner in which the enhanced function matching module determines the enhanced function to be provided to the user may refer to the aforementioned first determination manner.
In other embodiments, the electronic device may further include an application identification module. The application identification module may be used to identify a third party application currently launched by the electronic device. The enhanced function matching module may determine, of all enhanced functions possessed by the electronic device, an enhanced function that can be used by a third-party application currently started by the electronic device, as the enhanced function provided to the user when the electronic device starts the third-party application. Here, the manner in which the enhanced function matching module determines the enhanced function to be provided to the user may refer to the aforementioned second determination manner.
In still other embodiments, the electronic device may also include a scene identification and parameter gathering module. The scenario identification and parameter gathering module may be used to identify the usage scenario currently provided by the electronic device and the functions that are currently already provided. The enhanced function matching module is used for determining the enhanced functions suitable for the current use scene as the enhanced functions provided for the user by the electronic equipment when the third-party application is started. Here, the manner in which the enhanced function matching module determines the enhanced function provided to the user may refer to the aforementioned third determination manner, and the enhanced function applicable to the current usage scenario may refer to the relevant description in the aforementioned third determination manner.
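The division of labor among the enhanced function matching module, the application identification module, and the scene identification and parameter gathering module could be sketched as follows. This is an assumption for illustration only; the interfaces and the Mode enum are not part of the disclosed software structure.

```java
import java.util.Set;

// A sketch of how the matching module might combine the three determination manners.
public final class EnhancedFunctionMatcher {

    public enum Mode { ALL_FUNCTIONS, BY_APPLICATION, BY_CURRENT_SCENARIO }

    public interface AppIdentifier { String foregroundPackage(); }

    public interface SceneIdentifier {
        String currentScenario();
        Set<String> functionsAlreadyProvided();
    }

    public interface Catalogue {
        Set<String> allFunctions();
        Set<String> functionsForApp(String packageName);
        Set<String> functionsForScenario(String scenario);
        Set<String> removeMutuallyExclusive(Set<String> candidates, Set<String> provided);
    }

    private final Mode mode;
    private final AppIdentifier appIdentifier;
    private final SceneIdentifier sceneIdentifier;
    private final Catalogue catalogue;

    public EnhancedFunctionMatcher(Mode mode, AppIdentifier app,
                                   SceneIdentifier scene, Catalogue catalogue) {
        this.mode = mode;
        this.appIdentifier = app;
        this.sceneIdentifier = scene;
        this.catalogue = catalogue;
    }

    // Returns the enhanced functions that the "function assistant" should present.
    public Set<String> match() {
        switch (mode) {
            case ALL_FUNCTIONS:        // first determination manner
                return catalogue.allFunctions();
            case BY_APPLICATION:       // second determination manner
                return catalogue.functionsForApp(appIdentifier.foregroundPackage());
            case BY_CURRENT_SCENARIO:  // third determination manner
            default:
                Set<String> candidates =
                        catalogue.functionsForScenario(sceneIdentifier.currentScenario());
                return catalogue.removeMutuallyExclusive(
                        candidates, sceneIdentifier.functionsAlreadyProvided());
        }
    }
}
```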
A "function helper" in the application layer may present the enhanced functions to the user after the enhanced function matching module determines the enhanced functions provided to the user for the user to select the enhanced functions that the user wants to enable. The manner in which the "function assistant" presents the enhanced functionality to the user may include displaying controls corresponding to the enhanced functionality on a screen, etc. The manner in which the "function assistant" presents available functions to the user can be found in relation to the embodiments of fig. 4A-4H, 5A-5F, and 6A-6H. The user may choose to invoke or enable one or more of the enhanced features after the "feature assistant" provides the enhanced features.
The enhanced function penetration module is used for applying one or more enhanced functions selected and started by a user to the third-party application, namely enabling the electronic equipment to call the enhanced functions of the electronic equipment when the third-party application is used. Specifically, the enhanced function penetration module is configured to send parameters related to the called enhanced function to the HAL layer and the kernel layer (kernel), so that the HAL layer and the kernel layer (kernel) perform corresponding operations, thereby enabling the enhanced function selected by the user. In some embodiments of the present application, a parameter involved in an enhanced function invoked by an electronic device may be referred to as a first parameter. After the enhanced function penetration module applies one or more enhanced functions selected and started by the user to the third-party application, the electronic equipment can present effects corresponding to the enhanced functions selected by the user, such as 'beauty' and 'body' effects of the character image in the screen and the like, in a user interface provided by the third-party application.
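One conceivable way (an assumption, not the disclosed mechanism) for an enhanced function penetration module to hand the first parameter down toward the camera HAL is through a vendor-defined capture-request key in the Android Camera2 API, as sketched below. The vendor key name "com.example.bodybeauty.level" is hypothetical; real vendor tags are defined by the device manufacturer's HAL.

```java
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CaptureRequest;
import android.os.Handler;
import android.view.Surface;

// A sketch of passing an enhanced-function parameter to the camera HAL via a
// vendor capture-request key.
public final class BeautyPenetration {

    private static final CaptureRequest.Key<Integer> BODY_BEAUTY_LEVEL =
            new CaptureRequest.Key<>("com.example.bodybeauty.level", Integer.class); // assumed vendor tag

    // Rebuilds the preview request with the enhanced-function parameter applied,
    // so the HAL processes frames with the selected body beauty level.
    public static void applyBodyBeauty(CameraDevice device,
                                       CameraCaptureSession session,
                                       Surface previewSurface,
                                       int level,
                                       Handler handler) throws Exception {
        CaptureRequest.Builder builder =
                device.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        builder.addTarget(previewSurface);
        builder.set(BODY_BEAUTY_LEVEL, level);   // e.g. body beauty grade 0-10
        session.setRepeatingRequest(builder.build(), null, handler);
    }
}
```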
The HAL layer and the kernel layer (kernel) are used for executing corresponding operations in response to the enhanced function invoked through the enhanced function penetration module. For example, the electronic device may invoke an algorithm of the HAL layer to enable the enhanced function. For another example, the electronic device may invoke a corresponding driver to enable the enhanced function through a hardware device (e.g., a camera). Illustratively, taking a video call scenario as an example, if the user selects to enable the "blurring" enhanced function provided by the electronic device, the electronic device may call a camera algorithm through the HAL layer to perform blurring processing on the captured image of the home-end user; if the user selects to enable the "anti-shake" enhanced function provided by the electronic device, the electronic device may use the camera driver of the kernel layer (kernel) to make the camera perform anti-shake processing on the captured image of the home-end user. As another example, if the user selects to enable the "color enhancement" enhanced function provided by the electronic device, the electronic device may call a liquid crystal display (LCD) algorithm through the HAL layer to perform color enhancement processing on the image displayed on the screen; if the user selects to enable the "super-resolution imaging" enhanced function provided by the electronic device, the electronic device may call an image super-resolution algorithm through the HAL layer to perform super-resolution processing on the image displayed on the screen.
The HAL layer and the kernel layer (kernel) are also used for reporting the state change occurring after the enhanced function is enabled to a third-party application (for example, WeChat (WeChat), Skype, etc.) of the application layer. The state change includes activation/deactivation of the respective enhanced function, an activation level of the enhanced function, and the like. After receiving the state change, the third-party application may present the start/stop state of each enhanced function and the activation level of the enhanced function to the user. For example, as shown in fig. 6E, if the user has enabled the "encode optimization" function, the electronic device may change the display state of control 609B.
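The state-change reporting path could be modeled, for illustration, as a listener interface that the third-party application registers so that enabling/disabling events and activation levels reported from the HAL layer and kernel layer reach its user interface. The interface below is an assumption, not part of the disclosed structure.

```java
// A sketch of the state-change callbacks a third-party application could register
// to refresh controls such as 609B after an enhanced function is enabled.
public interface EnhancedFunctionStateListener {

    // functionId identifies the enhanced function, e.g. "CODING_OPTIMIZATION".
    void onFunctionEnabled(String functionId);

    void onFunctionDisabled(String functionId);

    // level reports the activation level, e.g. a body beauty grade from 0 to 10.
    void onFunctionLevelChanged(String functionId, int level);
}
```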
In the embodiment of the present application, the number of the software modules is not limited to one, and may be multiple. For example, the electronic device may have one or more enhanced-function penetration modules. In some embodiments, the electronic device may be configured with a plurality of enhanced functionality penetration modules, each enhanced functionality penetration module corresponding to an enhanced functionality provided by the electronic device for acting the enhanced functionality in a third party application.
In this application, the enhanced function penetration module may also be referred to as an enhanced function interface.
It should be noted that the functional architecture of the electronic device shown in fig. 7 is only one implementation manner of the embodiment of the present application, and in practical applications, the electronic device may further include more or fewer software modules, which is not limited herein.
In the embodiment of the present application, the electronic device may further include a display module, based on the software structure shown in fig. 7, and the display module is configured to display a corresponding user interface according to the operation of each software module. The user interface displayed by the display module can refer to the embodiments shown in fig. 4A-4H, fig. 5A-5F, and fig. 6A-6H. The display module may be embodied as the display screen 194 in fig. 1.
Based on the functional architecture diagram shown in fig. 7, a method for invoking an enhanced function of an electronic device provided in an embodiment of the present application is described below as a specific example. Fig. 8 illustrates a data flow involved in invoking an enhanced function of an electronic device provided by an embodiment of the present application. The usage scenario of this embodiment may be the video call scenario in fig. 4A-4H or fig. 5A-5F, in which the electronic device enables the camera.
The correspondence relationship between the respective modules is described below with reference to fig. 7 and 8.
The application layer in FIG. 7 corresponds to the application layer in FIG. 8;
the application interface in FIG. 7 corresponds to the application framework layer in FIG. 8;
the enhanced function matching module in fig. 7 corresponds to the system service (SystemServer) in fig. 8;
the enhanced function penetration module in fig. 7 corresponds to the camera function service extension (CameraServiceExtra) in fig. 8;
the application identification module in fig. 7 corresponds to the system service (SystemServer) in fig. 8;
the scene recognition and parameter collection module in fig. 7 corresponds to a camera function service extension (CameraServiceExtra) in fig. 8.
In some embodiments, a "feature assistant" may be used to configure the permissions of third-party applications providing the video-call scenario to use the enhanced features of the electronic device.
First, the electronic device starts a third-party application (e.g., WeChat, Skype, etc.) that can provide a video call scene, and opens a video call interface based on the third-party application to provide the video call scene. Referring to the third party application original parameter flow, the third party application of the application layer issues the original parameter flow to the HAL layer and the kernel layer (kernel). Here, the original parameter stream includes parameters involved when the third-party application provides the video call scene, such as camera parameters (e.g., resolution, frame rate, focus mode, anti-shake mode), and the like. After receiving the parameter, the HAL layer and the kernel layer (kernel) may execute corresponding operations to provide a video call scenario for the user. For example, after receiving an original parameter stream of a third-party application, a kernel layer (kernel) may invoke a corresponding driver, and provide a video call scene for a user through a hardware device (e.g., a camera).
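For reference, the kind of original parameter stream mentioned above corresponds to ordinary Camera2 usage by the third-party application: the target surface fixes the resolution, and the capture request carries the frame rate, focus mode, and anti-shake mode. The sketch below only illustrates these parameters; the helper class is an assumption.

```java
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CaptureRequest;
import android.os.Handler;
import android.util.Range;
import android.view.Surface;

// A sketch of the camera parameters a third-party video-call application issues
// toward the HAL and kernel layers as part of its original parameter stream.
public final class VideoCallPreview {

    public static void startPreview(CameraDevice camera,
                                    CameraCaptureSession session,
                                    Surface previewSurface, // sized to the requested resolution
                                    Handler handler) throws Exception {
        CaptureRequest.Builder builder =
                camera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        builder.addTarget(previewSurface);
        builder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, new Range<>(30, 30)); // frame rate
        builder.set(CaptureRequest.CONTROL_AF_MODE,
                CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_VIDEO);                     // focus mode
        builder.set(CaptureRequest.CONTROL_VIDEO_STABILIZATION_MODE,
                CaptureRequest.CONTROL_VIDEO_STABILIZATION_MODE_ON);                  // anti-shake mode
        session.setRepeatingRequest(builder.build(), null, handler);
    }
}
```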
Second, when the third-party application initiates a video call, the system service (SystemServer) may send a notification to the "function assistant", the notification indicating the third-party application currently being launched by the electronic device. The notification may be a broadcast. For the data flow in which the system service (SystemServer) sends the notification to the "function assistant", reference may be made to the white-list listening data flow shown in fig. 8.
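A minimal sketch of the listening side is shown below, modelling the notification as an Android broadcast. The action string, extra key, and class name are assumptions made for illustration; the patent does not specify the actual broadcast used by SystemServer.

```java
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;

// Hypothetical sketch: the "function assistant" listening for the notification
// (modelled here as a broadcast) sent when a third-party application starts a video call.
public class FunctionAssistantReceiver extends BroadcastReceiver {
    static final String ACTION_APP_LAUNCHED = "com.example.ACTION_APP_LAUNCHED"; // placeholder
    static final String EXTRA_PACKAGE_NAME = "packageName";                      // placeholder

    @Override
    public void onReceive(Context context, Intent intent) {
        if (ACTION_APP_LAUNCHED.equals(intent.getAction())) {
            String launchedPackage = intent.getStringExtra(EXTRA_PACKAGE_NAME);
            // The package name would then be handed to the white-list check
            // described in the next paragraph.
        }
    }
}
```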
The "function assistant" then determines whether the third-party application currently initiating the video call has permission to use the enhanced functions of the electronic device. In some embodiments, the "function assistant" may check whether the third-party application currently initiating the video call is on the "function assistant" white list; if so, the third-party application has permission to use the enhanced functions of the electronic device. For the "function assistant" white list, reference may be made to the related description of the previous embodiments. If the third-party application currently initiating the video call has permission to use the enhanced functions of the electronic device, the "function assistant" may provide the enhanced functions to the user. For the manner in which the "function assistant" provides the enhanced functions, reference may be made to the previous description of the embodiment of fig. 4A-4H or the embodiment of fig. 5A-5F. Illustratively, the electronic device may display a control 409 as shown in fig. 4A on the screen, for the user to view and invoke or enable the enhanced functions provided by the electronic device. The enhanced functions may be determined in the manner described in the foregoing embodiments.
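A minimal sketch of the white-list check is given below, assuming the white list is simply a set of package names; the package names, class, and method names are illustrative, not the disclosed implementation.

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

// Hypothetical sketch of the "function assistant" white-list check.
public class FunctionAssistantWhiteList {
    private final Set<String> whiteList = new HashSet<>(
            Arrays.asList("com.tencent.mm", "com.skype.raider")); // e.g., WeChat, Skype

    // Returns true if the application currently initiating the video call is
    // allowed to use the enhanced functions of the electronic device.
    public boolean hasEnhancedFunctionPermission(String packageName) {
        return whiteList.contains(packageName);
    }

    public static void main(String[] args) {
        FunctionAssistantWhiteList assistant = new FunctionAssistantWhiteList();
        System.out.println(assistant.hasEnhancedFunctionPermission("com.tencent.mm")); // true
    }
}
```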
After the user selects to enable an enhanced function provided by the electronic device, the "function assistant" issues the parameters corresponding to the enhanced function selected by the user to the HAL layer and the kernel layer (kernel). The HAL layer and the kernel layer (kernel) may then execute corresponding operations according to the parameters corresponding to the enhanced function: for example, the HAL layer may call an algorithm to start the enhanced function, and the kernel layer (kernel) may call a driver to enable the enhanced function through a hardware device (for example, a camera). After the HAL layer and the kernel layer (kernel) execute the corresponding operations according to the parameters corresponding to the enhanced function, the electronic device may present the effect of the enabled enhanced function.
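By way of illustration only, the parameter hand-off for an enabled enhanced function might be packaged as sketched below; the key names, the LowerLayer interface, and the idea of a numeric activation level are assumptions, not the actual HAL/kernel interface.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch: the "function assistant" packaging the parameters of the
// enhanced function the user enabled and handing them to the lower layers.
public class EnhancedFunctionParameterFlow {
    interface LowerLayer {
        void apply(Map<String, Object> enhancedParams); // HAL/kernel side, stubbed here
    }

    public static void enable(String functionName, int level, LowerLayer hal, LowerLayer kernel) {
        Map<String, Object> enhancedParams = new LinkedHashMap<>();
        enhancedParams.put("function", functionName); // e.g., "blurring", "beauty"
        enhancedParams.put("enabled", true);
        enhancedParams.put("level", level);           // activation level chosen by the user
        hal.apply(enhancedParams);    // HAL layer calls the corresponding algorithm
        kernel.apply(enhancedParams); // kernel layer drives the hardware (e.g., camera)
    }
}
```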
For example, taking the video call scene as an example, if the user selects to enable the "blurring" enhanced function provided by the electronic device, the electronic device may call a camera algorithm through the HAL layer to blur the captured image of the local (home-terminal) user and display the blurred image of the local user on the display screen. For another example, if the user selects to enable the "beauty" function and the "body beauty" function provided by the electronic device, the electronic device may call an image processing algorithm through the HAL layer to perform beauty or body-shaping processing on the captured image of the local user, so that the person displayed on the screen presents the "beauty" and "body beauty" effects, and the like. For the data flow in which the "function assistant" sends the parameters corresponding to the enhanced functions selected and enabled by the user to the driver of the kernel layer (kernel), reference may be made to the function assistant parameter flow shown in fig. 8.
After receiving the parameters sent by the "function assistant", the driver of the kernel layer (kernel) reports the state change that occurs after the corresponding operation is executed according to the parameters to the third-party application (for example, WeChat, Skype, etc.) of the application layer. The state change includes the activation/deactivation of each enhanced function, the activation level of the enhanced function, and the like. After receiving the state change, the third-party application may present the start/stop state of each enhanced function and the activation level of the enhanced function to the user. For example, as shown in fig. 6E, if the user has enabled the "encode optimization" function, the electronic device may change the display state of control 609B. For the data flow in which the driver of the kernel layer (kernel) reports the state change to the third-party application (e.g., WeChat, Skype, etc.) of the application layer, reference may be made to the callback data flow shown in fig. 8.
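The callback path can be sketched as below: a reported state change carries the function name, its start/stop state, and its activation level, and the third-party application updates the display state of the corresponding control (cf. control 609B in fig. 6E). All class, field, and method names here are illustrative assumptions.

```java
// Hypothetical sketch of the state-change callback from the kernel-layer driver
// to the third-party application's user interface.
public class EnhancedFunctionCallback {
    public static class StateChange {
        public final String function;  // e.g., "encode optimization"
        public final boolean enabled;  // start/stop state
        public final int level;        // activation level
        public StateChange(String function, boolean enabled, int level) {
            this.function = function;
            this.enabled = enabled;
            this.level = level;
        }
    }

    public interface ThirdPartyAppUi {
        void updateControlState(StateChange change); // e.g., redraw the option's control
    }

    // Called when the driver reports that the corresponding operation has been executed.
    public static void onStateChangeReported(StateChange change, ThirdPartyAppUi appUi) {
        appUi.updateControlState(change);
    }
}
```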
Based on the functional structures of the electronic device shown in fig. 7 and fig. 8, fig. 9 shows a flowchart of a method in which a third-party application invokes an enhanced function of the electronic device. Here, a portion shown by a dotted line in fig. 9 indicates that the corresponding module is optional or that the corresponding step is optional. For the implementation of each step in fig. 9, reference may be made to the related descriptions of fig. 7 and fig. 8, which are not repeated here.
As used in the above embodiments, the term "when …" may be interpreted to mean "if …", "after …", "in response to determining …", or "in response to detecting …", depending on the context. Similarly, depending on the context, the phrase "upon determining …" or "if (a stated condition or event) is detected" may be interpreted to mean "if it is determined …", "in response to determining …", "upon detecting (the stated condition or event)", or "in response to detecting (the stated condition or event)".
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, the embodiments may be implemented wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions described in accordance with the embodiments of the present application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, radio, microwave, etc.). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device, such as a server or a data center, that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid state disk), among others.
One of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by instructing relevant hardware through a computer program. The program may be stored in a computer-readable storage medium and, when executed, may include the processes of the above method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as a ROM, a RAM, a magnetic disk, or an optical disk.

Claims (21)

1. An electronic device, characterized in that the electronic device comprises:
at least one processor;
at least one memory coupled with the at least one processor;
and a computer program, stored on the at least one memory, which, when executed by the at least one processor, causes the electronic device to perform:
in response to a first user input for a first application, displaying a first interface of the first application, displaying a first enhanced function option on the first interface;
providing, by the first application, a first enhanced function corresponding to the first enhanced function option in response to a second user input for the first enhanced function option;
in response to a third user input for a second application, displaying a second interface of the second application, displaying a second enhanced function option on the second interface;
providing, by the second application, a second enhanced function corresponding to the second enhanced function option in response to a fourth user input for the second enhanced function option;
the first application is a third-party application installed on the electronic device, the second application is another third-party application installed on the electronic device, the third-party application is an application provided by a third party other than a manufacturer of the electronic device, the first enhanced function option and the second enhanced function option are different enhanced function options, and the first enhanced function and the second enhanced function are different enhanced functions provided by the electronic device.
2. The electronic device of claim 1, wherein the electronic device further performs:
in response to a fifth user input for a third application, displaying a third interface of the third application, displaying a third enhanced function option on the third interface;
providing, by the third application, a third enhanced function corresponding to the third enhanced function option in response to a sixth user input for the third enhanced function option;
the third application is a third-party application installed on the electronic device, and the third enhanced function is an enhanced function provided by the electronic device.
3. The electronic device of claim 2, wherein the third enhanced function option is different from both the first enhanced function option and the second enhanced function option, and the third enhanced function is different from both the first enhanced function and the second enhanced function.
4. The electronic device of claim 1 or 2,
the first application corresponds to a first type of scene, wherein the electronic device determines that the first application corresponds to the first type of scene by the first application occupying a first hardware resource of the electronic device;
the second application corresponds to a second type of scene, wherein the electronic device determines that the second application corresponds to the second type of scene by the second application occupying a second hardware resource of the electronic device.
5. The electronic device of claim 3,
the third application corresponds to a third type of scene, wherein the electronic device determines that the third application corresponds to the third type of scene by the third application occupying a third hardware resource of the electronic device.
6. The electronic device of claim 4,
the electronic device determines that the first application corresponds to the first type of scene based on the first application occupying the first hardware resource of the electronic device and the current interface being the first interface.
7. The electronic device of claim 6,
the electronic device determines that the second application corresponds to the second type of scene based on the second application occupying the second hardware resource of the electronic device and the current interface being the second interface.
8. The electronic device of claim 5,
the electronic device determines that the third application corresponds to the third type of scene based on the third application occupying the third hardware resource of the electronic device and the current interface being the third interface.
9. The electronic device of any of claims 1-8, wherein the first user input comprises: a first user operation for opening the first application, or a first user operation for switching between a plurality of interfaces provided by the first application.
10. The electronic device of claim 9,
the first enhanced function option is a wide-angle function option, and the second enhanced function option is an anti-shake function option.
11. A method of using enhanced functionality of an electronic device, the method being applied to the electronic device, the method comprising:
in response to a first user input for a first application, displaying a first interface of the first application, displaying a first enhanced function option on the first interface;
providing, by the first application, a first enhanced function corresponding to the first enhanced function option in response to a second user input for the first enhanced function option;
in response to a third user input for a second application, displaying a second interface of the second application, displaying a second enhanced function option on the second interface;
providing, by the second application, a second enhanced function corresponding to the second enhanced function option in response to a fourth user input for the second enhanced function option;
the first application is a third-party application installed on the electronic device, the second application is another third-party application installed on the electronic device, the third-party application is an application provided by a third party other than a manufacturer of the electronic device, the first enhanced function option and the second enhanced function option are different enhanced function options, and the first enhanced function and the second enhanced function are different enhanced functions provided by the electronic device.
12. The method of claim 11, further comprising:
in response to a fifth user input for a third application, displaying a third interface of the third application, displaying a third enhanced function option on the third interface;
providing, by the third application, a third enhanced function corresponding to the third enhanced function option in response to a sixth user input for the third enhanced function option;
the third application is a third-party application installed on the electronic device, and the third enhanced function is an enhanced function provided by the electronic device.
13. The method of claim 12,
the third enhanced function option is different from both the first enhanced function option and the second enhanced function option, and the third enhanced function is different from both the first enhanced function and the second enhanced function.
14. The method according to claim 11 or 12,
the first application corresponds to a first type of scene, wherein the electronic device determines that the first application corresponds to the first type of scene by the first application occupying a first hardware resource of the electronic device;
the second application corresponds to a second type of scene, wherein the electronic device determines that the second application corresponds to the second type of scene by the second application occupying a second hardware resource of the electronic device.
15. The method of claim 13,
the third application corresponds to a third type of scene, wherein the electronic device determines that the third application corresponds to the third type of scene by the third application occupying a third hardware resource of the electronic device.
16. The method of claim 14,
the electronic device determines that the first application corresponds to the first type of scene based on the first application occupying the first hardware resource of the electronic device and the current interface being the first interface.
17. The method of claim 16,
the electronic device determines that the second application corresponds to the second type of scene based on the second application occupying the second hardware resource of the electronic device and the current interface being the second interface.
18. The method of claim 15,
the electronic device determines that the third application corresponds to the third type of scene based on the third application occupying the third hardware resource of the electronic device and the current interface being the third interface.
19. The method of claim 18,
the first enhanced function option is a wide-angle function option, and the second enhanced function option is an anti-shake function option.
20. A chip comprising one or more processors for invoking a computer program to cause an electronic device to perform the method of any of claims 11-19.
21. A computer-readable storage medium comprising a computer program, which, when run on an electronic device, causes the electronic device to perform the method of any one of claims 11-19.
CN202210131964.4A 2019-04-19 2019-04-19 Method for using enhanced function of electronic device, chip and storage medium Active CN114666435B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210131964.4A CN114666435B (en) 2019-04-19 2019-04-19 Method for using enhanced function of electronic device, chip and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910319978.7A CN110113483B (en) 2019-04-19 2019-04-19 Method for using enhanced function of electronic equipment and related device
CN202210131964.4A CN114666435B (en) 2019-04-19 2019-04-19 Method for using enhanced function of electronic device, chip and storage medium

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201910319978.7A Division CN110113483B (en) 2019-04-19 2019-04-19 Method for using enhanced function of electronic equipment and related device

Publications (2)

Publication Number Publication Date
CN114666435A 2022-06-24
CN114666435B 2023-03-28

Family

ID=67485970

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202210131964.4A Active CN114666435B (en) 2019-04-19 2019-04-19 Method for using enhanced function of electronic device, chip and storage medium
CN201910319978.7A Active CN110113483B (en) 2019-04-19 2019-04-19 Method for using enhanced function of electronic equipment and related device

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201910319978.7A Active CN110113483B (en) 2019-04-19 2019-04-19 Method for using enhanced function of electronic equipment and related device

Country Status (2)

Country Link
CN (2) CN114666435B (en)
WO (1) WO2020211735A1 (en)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114666435B (en) * 2019-04-19 2023-03-28 华为技术有限公司 Method for using enhanced function of electronic device, chip and storage medium
CN110784592A (en) * 2019-09-29 2020-02-11 华为技术有限公司 Biological identification method and electronic equipment
CN110769096A (en) * 2019-10-21 2020-02-07 Oppo(重庆)智能科技有限公司 Motor vibration method, terminal and storage medium
CN110941821A (en) * 2019-12-09 2020-03-31 Oppo广东移动通信有限公司 Data processing method, device and storage medium
CN110990088B (en) * 2019-12-09 2023-08-11 Oppo广东移动通信有限公司 Data processing method and related equipment
CN111061524A (en) * 2019-12-09 2020-04-24 Oppo广东移动通信有限公司 Application data processing method and related device
CN111062025B (en) * 2019-12-09 2022-03-01 Oppo广东移动通信有限公司 Application data processing method and related device
CN111259441B (en) * 2020-01-14 2023-02-28 Oppo广东移动通信有限公司 Device control method, device, storage medium and electronic device
CN111338545A (en) 2020-02-24 2020-06-26 北京字节跳动网络技术有限公司 Image processing method, assembly, electronic device and storage medium
CN111314617B (en) 2020-03-17 2023-04-07 北京达佳互联信息技术有限公司 Video data processing method and device, electronic equipment and storage medium
CN111246032B (en) * 2020-03-27 2021-07-30 北京小米移动软件有限公司 Call management method and device
CN111565282A (en) * 2020-05-11 2020-08-21 Oppo(重庆)智能科技有限公司 Shooting control processing method, device, equipment and storage medium
CN114500822B (en) * 2020-11-11 2024-03-05 华为技术有限公司 Method for controlling camera and electronic equipment
CN112689086B (en) * 2020-12-10 2022-08-19 联想(北京)有限公司 Information determination method, electronic equipment and computer readable storage medium
CN114697348B (en) * 2020-12-25 2023-08-22 华为终端有限公司 Distributed implementation method, distributed system, readable medium and electronic device
CN115220828A (en) * 2021-04-19 2022-10-21 Oppo广东移动通信有限公司 Sidebar display method and device, terminal and storage medium
CN113220179A (en) * 2021-05-07 2021-08-06 Oppo广东移动通信有限公司 Sidebar display method and device, terminal and storage medium
CN113225429B (en) * 2021-05-19 2022-08-30 Tcl通讯(宁波)有限公司 Display effect optimization method and system and intelligent terminal
CN113284500B (en) * 2021-05-19 2024-02-06 Oppo广东移动通信有限公司 Audio processing method, device, electronic equipment and storage medium
CN115567630B (en) * 2022-01-06 2023-06-16 荣耀终端有限公司 Electronic equipment management method, electronic equipment and readable storage medium
CN116451264A (en) * 2022-01-10 2023-07-18 华为技术有限公司 Application program management method and related device
CN116708886B (en) * 2022-11-22 2024-05-14 荣耀终端有限公司 Video processing method, device and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018027679A1 (en) * 2016-08-10 2018-02-15 华为技术有限公司 Notification message management method and terminal
CN109274575A (en) * 2018-08-08 2019-01-25 阿里巴巴集团控股有限公司 Message method and device and electronic equipment
CN109313530A (en) * 2017-05-16 2019-02-05 苹果公司 Equipment, method and graphic user interface for carrying out navigating and interacting with control object between user interface

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6993722B1 (en) * 1999-02-08 2006-01-31 Cirrus Logic, Inc. User interface system methods and computer program products for multi-function consumer entertainment appliances
US20080246733A1 (en) * 2007-04-04 2008-10-09 Henty David L TV interface control system and method with automatic text entry
US9060152B2 (en) * 2012-08-17 2015-06-16 Flextronics Ap, Llc Remote control having hotkeys with dynamically assigned functions
US10324704B2 (en) * 2015-05-27 2019-06-18 Google Llc Online marketplace of plugins for enhancing dialog systems
CN109495688B (en) * 2018-12-26 2021-10-01 华为技术有限公司 Photographing preview method of electronic equipment, graphical user interface and electronic equipment
CN114666435B (en) * 2019-04-19 2023-03-28 华为技术有限公司 Method for using enhanced function of electronic device, chip and storage medium

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018027679A1 (en) * 2016-08-10 2018-02-15 华为技术有限公司 Notification message management method and terminal
CN109313530A (en) * 2017-05-16 2019-02-05 苹果公司 Equipment, method and graphic user interface for carrying out navigating and interacting with control object between user interface
CN109274575A (en) * 2018-08-08 2019-01-25 阿里巴巴集团控股有限公司 Message method and device and electronic equipment

Also Published As

Publication number Publication date
CN114666435B (en) 2023-03-28
CN110113483A (en) 2019-08-09
WO2020211735A1 (en) 2020-10-22
CN110113483B (en) 2022-02-25

Similar Documents

Publication Publication Date Title
CN110113483B (en) Method for using enhanced function of electronic equipment and related device
CN109951633B (en) Method for shooting moon and electronic equipment
WO2021129326A1 (en) Screen display method and electronic device
CN110381282B (en) Video call display method applied to electronic equipment and related device
CN110114747B (en) Notification processing method and electronic equipment
CN111046680B (en) Translation method and electronic equipment
WO2021036770A1 (en) Split-screen processing method and terminal device
EP4002144A1 (en) File sharing method and device for mobile terminal
EP3879401A1 (en) Automatic screen-splitting method, graphical user interface, and electronic device
WO2022037726A1 (en) Split-screen display method and electronic device
CN113452945A (en) Method and device for sharing application interface, electronic equipment and readable storage medium
CN114115770A (en) Display control method and related device
CN113141483B (en) Screen sharing method based on video call and mobile device
CN110286975B (en) Display method of foreground elements and electronic equipment
CN111492678B (en) File transmission method and electronic equipment
EP4181498A1 (en) Photographing method and electronic device
CN112449101A (en) Shooting method and electronic equipment
CN115543145A (en) Folder management method and device
CN112532508B (en) Video communication method and video communication device
CN114173005B (en) Application layout control method and device, terminal equipment and computer readable storage medium
CN113645595A (en) Equipment interaction method and device
WO2024114212A1 (en) Cross-device focus switching method, electronic device and system
CN114584652B (en) User graphical interface display method, device, computer equipment and storage medium
CN118092834A (en) Method for switching focus across devices, electronic device and system
CN113973152A (en) Unread message quick reply method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant