WO2022095906A1 - A key mapping method, electronic device, and system

A key mapping method, electronic device, and system

Info

Publication number
WO2022095906A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
control
user interface
user
pixel
Prior art date
Application number
PCT/CN2021/128486
Other languages
English (en)
French (fr)
Inventor
张田甜
韩金晓
冉冬
高光远
赵磊
李宏宇
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2022095906A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20 Input arrangements for video game devices
    • A63F 13/24 Constructional details thereof, e.g. game controllers with detachable joystick handles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T 3/4007 Scaling of whole images or parts thereof, e.g. expanding or contracting based on interpolation, e.g. bilinear interpolation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/20 Image enhancement or restoration using local operators
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/90 Determination of colour characteristics
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/70 Services for machine-to-machine communication [M2M] or machine type communication [MTC]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 76/00 Connection management
    • H04W 76/10 Connection setup
    • H04W 76/14 Direct-mode setup

Definitions

  • The present invention relates to the field of terminal technologies, and in particular, to a key mapping method, an electronic device, and a system.
  • A mobile phone can generally be connected to a peripheral handle device, so that a user can use the peripheral handle to control a button-based application on the mobile phone (such as a game application).
  • a wireless connection can be established between the mobile phone and the peripheral handle through Bluetooth.
  • The screen of the mobile phone may display a button library interface for the peripheral handle, and the button library interface may include a plurality of button icons. The user needs to select a button icon and then manually drag it to a suitable position on the phone screen; after the button icon hovers at the specified position, the user releases the finger to complete the adaptation between the mobile phone and the peripheral handle.
  • In this process, the user needs to drag button icons multiple times, and each icon must be dragged to a designated position in the application. If the hovering position of a button icon is inaccurate, the button mapping relationship between the mobile phone and the peripheral controller may fail to be established. The operation steps of the prior art are therefore cumbersome and inconvenient for the user, the key-mapping matching efficiency is low, and the user experience is poor.
  • The purpose of this application is to provide a key mapping method, electronic device, and system that make the process of establishing key mappings between a first electronic device and a second electronic device more intuitive, simple, and effective, greatly improve the efficiency of key mapping, simplify the user's operation steps, and improve the user experience.
  • In a first aspect, the present application provides a key mapping method. The method may include: a first electronic device establishes a first connection with a second electronic device; the first electronic device then displays a first user interface, which may display a plurality of controls, and the plurality of controls may include a first control.
  • The first electronic device detects a first user operation (for example, the user clicks an icon with a finger); in response to the first user operation, the first electronic device may recognize a plurality of controls in the first user interface and select a first control among the identified controls.
  • In response to a first signal received while the first control is in the selected state (the first signal being generated by the second electronic device when its first physical key is pressed), the first electronic device establishes a mapping relationship between the first physical key and the first control.
  • the second electronic device may be a gamepad.
  • The gamepad has one or more physical buttons, and also has a Bluetooth (BT) module and/or a wireless local area network (WLAN) module.
  • The Bluetooth (BT) module can provide one or more Bluetooth communication solutions, including classic Bluetooth (the Bluetooth 2.1 standard) or Bluetooth low energy (BLE).
  • The WLAN module can provide one or more WLAN communication solutions, including wireless fidelity direct (Wi-Fi direct), wireless fidelity local area network (Wi-Fi LAN), or wireless fidelity software access point (Wi-Fi softAP).
  • The first user interface may be a user interface of a game application. A game application is an application on an electronic device such as a smartphone or tablet computer that provides recreation and entertainment for the user.
  • This application does not limit the name of the application; that is, the game application can be any game application on the market that users can acquire and control.
  • Implementing the method of the first aspect enables the first electronic device to identify the controls in the first user interface more efficiently and quickly and to establish mapping relationships with the physical keys on the second electronic device, which improves the efficiency and precision of key mapping and improves the user experience.
  • With reference to the first aspect, the first connection may be a wireless connection established between the first electronic device and the second electronic device through one or more wireless communication technologies among Bluetooth, Wi-Fi direct, or Wi-Fi softAP; it may also be a wired connection between the two devices through a universal serial bus (USB). After the first electronic device establishes the communication connection with the second electronic device, it can send data to and/or receive data from the second electronic device through one or more of USB, Bluetooth, Wi-Fi direct, or Wi-Fi softAP.
  • In some embodiments, after establishing the mapping relationship between the first physical button of the second electronic device and the first control, the first electronic device can select a second control among the plurality of controls. The first electronic device can then receive, through the first connection, a second signal sent by the second electronic device and, in response to the second signal received while the second control is in the selected state, establish a mapping relationship between the second physical button and the second control. The second signal is generated by the second electronic device when the second physical key is pressed by the user.
  • In some embodiments, after the mapping relationships are established, the first electronic device executes the function corresponding to the first control in response to the first signal received through the first connection.
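  • As a rough illustration of this flow (a minimal sketch, not the patented implementation), the Python code below uses hypothetical names (Control, inject_tap) to show how a device might bind a physical key to the currently selected control and, once bound, dispatch a mapped key signal to the control's on-screen position.

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class Control:
    """A recognized on-screen control, reduced to a name and a center point."""
    name: str
    center: tuple  # (x, y) pixel coordinates of the control's center

def inject_tap(x: int, y: int) -> None:
    """Hypothetical helper: inject a touch event at (x, y) on the touchscreen."""
    print(f"tap at ({x}, {y})")

class KeyMapper:
    def __init__(self) -> None:
        self.selected: Optional[Control] = None
        self.bindings: Dict[int, Control] = {}  # physical key code -> control

    def select(self, control: Control) -> None:
        """Mark a control as selected (e.g., highlighted under the cursor)."""
        self.selected = control

    def on_key_signal(self, key_code: int) -> None:
        """Handle the signal generated when a physical key is pressed."""
        if self.selected is not None:
            # Mapping phase: bind the pressed key to the selected control.
            self.bindings[key_code] = self.selected
            self.selected = None
        elif key_code in self.bindings:
            # Play phase: trigger the function of the mapped control.
            x, y = self.bindings[key_code].center
            inject_tap(x, y)

# Usage: bind key 0x01 to a "jump" control, then trigger it.
mapper = KeyMapper()
mapper.select(Control("jump", (880, 640)))
mapper.on_key_signal(0x01)  # establishes the mapping
mapper.on_key_signal(0x01)  # executes the control's function
```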
  • In some embodiments, the first electronic device performs grayscale processing on the first user interface to obtain a first image; the first electronic device can then identify the control boundaries contained in the first user interface from the first image through an edge detection algorithm.
  • The plurality of controls in the first user interface may include a first control; the first control may include a plurality of boundary pixels, and the plurality of boundary pixels may include a first pixel. The more frequently the touch position corresponding to the first pixel in the first user interface is touched by the user, the higher the probability that the first pixel is identified as a boundary pixel of the first control.
  • Specifically, the first electronic device may use an edge operator to calculate the gradient vector of each pixel in the first image. Then, the first electronic device can use linear interpolation to compare the gradient value of the first pixel's gradient vector with the gradient values of the other pixels lying along the direction of that gradient vector; if the first pixel's gradient value is the largest along that direction, the first electronic device retains it and sets the gradient values of the other pixels to zero. Next, the first electronic device may set a first threshold: if the gradient value of the first pixel is greater than the first threshold, the first pixel is retained, and all retained pixels constitute the boundary of the first control in the first user interface.
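  • These steps mirror a Canny-style edge detector: an edge operator for per-pixel gradient vectors, non-maximum suppression using linear interpolation along the gradient direction, and a threshold. The following is a minimal, unvectorized Python/OpenCV sketch of that pipeline; the Sobel operator, blur parameters, and threshold value are assumptions for illustration, not values taken from the patent.

```python
import cv2  # OpenCV: grayscale conversion, Gaussian blur, Sobel gradients
import numpy as np

def control_boundaries(ui_bgr: np.ndarray, threshold: float = 50.0) -> np.ndarray:
    """Approximate control boundaries in a captured UI frame (H x W x 3, BGR)."""
    # 1. Grayscale the first user interface to obtain the "first image",
    #    then suppress Gaussian noise before differentiating.
    gray = cv2.cvtColor(ui_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    gray = cv2.GaussianBlur(gray, (5, 5), 1.4)

    # 2. Edge operator: per-pixel gradient vector, magnitude, and direction.
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
    mag = np.hypot(gx, gy)
    ang = np.arctan2(gy, gx)
    h, w = mag.shape

    def sample(fy: float, fx: float) -> float:
        """Gradient magnitude at a fractional position, by bilinear interpolation."""
        y0 = min(max(int(np.floor(fy)), 0), h - 2)
        x0 = min(max(int(np.floor(fx)), 0), w - 2)
        ty, tx = fy - y0, fx - x0
        return ((1 - ty) * ((1 - tx) * mag[y0, x0] + tx * mag[y0, x0 + 1])
                + ty * ((1 - tx) * mag[y0 + 1, x0] + tx * mag[y0 + 1, x0 + 1]))

    # 3. Non-maximum suppression: keep a pixel only if its gradient value is
    #    the largest among the interpolated neighbors along its gradient
    #    direction; otherwise its gradient value is set to zero.
    kept = np.zeros_like(mag)
    for y in range(1, h - 1):          # didactic double loop, not optimized
        for x in range(1, w - 1):
            dy, dx = np.sin(ang[y, x]), np.cos(ang[y, x])
            m = mag[y, x]
            if m >= sample(y + dy, x + dx) and m >= sample(y - dy, x - dx):
                kept[y, x] = m

    # 4. First threshold: the retained pixels form the control boundaries.
    return (kept > threshold).astype(np.uint8) * 255
```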
  • In some embodiments, when the first electronic device executes the function corresponding to the first control, the first electronic device displays a second user interface.
  • the second user interface may be different in part or in whole from the aforementioned first user interface.
  • For example, the change from the first user interface to the second user interface may be a jump of the game application interface, the movement of characters in the game interface, a change of the game scene, and the like.
  • When the first control in the first user interface is in the selected state, the first control may be highlighted, a cursor may be displayed in the first control's area, and/or the first control may blink; this application does not limit how the user is prompted that the first control is in the selected state.
  • In some embodiments, after the first electronic device displays the first user interface, a first floating control may be displayed on the first user interface. The first electronic device can detect a first user operation (e.g., a click) acting on the first floating control.
  • an embodiment of the present application provides a communication method, which is applied to a communication system, where the communication system includes a first electronic device and a second electronic device.
  • the first electronic device may establish a first connection with the second electronic device.
  • the first electronic device may display a first user interface, the first user interface may include a plurality of controls, and the plurality of controls include the first control.
  • the first electronic device may detect a first user operation, and in response to the first user operation, the first electronic device may identify a plurality of controls in the first user interface.
  • The first electronic device may select a first control among the plurality of controls. The second electronic device may detect that a first physical key is pressed by the user, generate a first signal, and send the first signal to the first electronic device through the first connection.
  • the first electronic device may establish a mapping relationship between the first physical button and the first control in response to the first signal received when the first control is in the selected state.
  • Implementing the method of the second aspect enables the first electronic device to identify the controls in the first user interface more efficiently and quickly and to establish mapping relationships with the physical keys on the second electronic device, which improves the efficiency and precision of key mapping and improves the user experience.
  • In some embodiments, the first electronic device may select a second control among the plurality of controls; when the second electronic device detects that a second physical key is pressed by the user, it generates a second signal and sends the second signal to the first electronic device through the first connection.
  • the first electronic device may establish a mapping relationship between the second physical button and the second control in response to the second signal received when the second control is in the selected state.
  • the second electronic device in the communication system may be a gamepad.
  • For the gamepad, reference may be made to the description of the second electronic device provided in the foregoing first aspect; details are not repeated here.
  • In some embodiments, the first electronic device executes the function corresponding to the first control in response to the first signal received through the first connection.
  • In some embodiments, the first electronic device performs grayscale processing on the first user interface to obtain a first image; the first electronic device can then identify the control boundaries contained in the first user interface from the first image through an edge detection algorithm.
  • The plurality of controls in the first user interface may include a first control; the first control may include a plurality of boundary pixels, and the plurality of boundary pixels may include a first pixel. The more frequently the touch position corresponding to the first pixel in the first user interface is touched by the user, the higher the probability that the first pixel is identified as a boundary pixel of the first control.
  • Specifically, the first electronic device may use an edge operator to calculate the gradient vector of each pixel in the first image. Then, the first electronic device can use linear interpolation to compare the gradient value of the first pixel's gradient vector with the gradient values of the other pixels lying along the direction of that gradient vector; if the first pixel's gradient value is the largest along that direction, the first electronic device retains it and sets the gradient values of the other pixels to zero. Next, the first electronic device may set a first threshold: if the gradient value of the first pixel is greater than the first threshold, the first pixel is retained, and all retained pixels constitute the boundary of the first control in the first user interface.
  • In some embodiments, when the first electronic device executes the function corresponding to the first control, the first electronic device displays a second user interface.
  • the second user interface may be different in part or in whole from the aforementioned first user interface.
  • For example, the change from the first user interface to the second user interface may be a jump of the game application interface, the movement of characters in the game interface, a change of the game scene, and the like.
  • When the first control in the first user interface is in the selected state, the first control may be highlighted, a cursor may be displayed in the first control's area, and/or the first control may blink; this application does not limit how the user is prompted that the first control is in the selected state.
  • In some embodiments, after the first electronic device displays the first user interface, a first floating control may be displayed on the first user interface. The first electronic device can detect a first user operation (e.g., a click) acting on the first floating control.
  • In a third aspect, an embodiment of the present application provides an electronic device, which may include: a communication device, a touch screen, a memory, and a processor coupled to the memory, where computer-executable instructions are stored in the memory.
  • the communication device can be used to establish a first connection with the second electronic device.
  • the touch screen may be used to display a first user interface, the first user interface may include a plurality of controls, and the plurality of controls may include the first control.
  • the touch screen can also be used to detect the first user operation.
  • the processor may be used to identify a plurality of controls in the first user interface, and may also be used to select a first control of the plurality of controls.
  • the communication apparatus may also be used to receive the first signal sent by the second electronic device through the first connection.
  • The processor may also be configured to, in response to the first signal received when the first control is in the selected state, establish a mapping relationship between the first physical button of the second electronic device and the first control, where the first signal is generated by the second electronic device when the first physical key is pressed by the user.
  • the processor may be further configured to select a second control among the plurality of controls after establishing the mapping relationship between the first physical button of the second electronic device and the first control.
  • the communication apparatus can also be used to receive a second signal sent by the second electronic device through the first connection.
  • The processor may also be configured to establish a mapping relationship between the second physical button of the second electronic device and the second control in response to the second signal received when the second control is in the selected state, where the second signal is generated by the second electronic device when the second physical key is pressed by the user.
  • the second electronic device may be a gamepad.
  • For the gamepad, reference may be made to the description of the second electronic device provided in the foregoing first aspect; details are not repeated here.
  • The processor may be further configured to, after the mapping relationships between the plurality of controls and the plurality of physical keys of the second electronic device are established, execute the function corresponding to the first control in response to the first signal received through the first connection.
  • The processor may also be specifically configured to perform grayscale processing on the first user interface to obtain a first image, and then identify the control boundaries included in the first user interface from the first image through an edge detection algorithm.
  • The plurality of controls in the first user interface may include a first control; the first control may include a plurality of boundary pixels, and the plurality of boundary pixels may include a first pixel. The more frequently the touch position corresponding to the first pixel in the first user interface is touched by the user, the higher the probability that the first pixel is identified as a boundary pixel of the first control.
  • The processor may also be specifically configured to, when recognizing the boundary of the first control in the first user interface from the first image through edge detection, use an edge operator to calculate the gradient vector of each pixel in the first image. The processor may then use linear interpolation to compare the gradient value of the first pixel's gradient vector with the gradient values of the other pixels lying along that direction; if the first pixel's gradient value is the largest in that direction, the processor retains it and sets the gradient values of the other pixels to zero. Next, the processor may set a first threshold: if the gradient value of the first pixel is greater than the first threshold, the first pixel is retained, and all retained pixels constitute the boundary of the first control in the first user interface.
  • In some embodiments, when the processor executes the function corresponding to the first control, the second user interface is displayed.
  • the second user interface may be different in part or in whole from the aforementioned first user interface.
  • For example, the change from the first user interface to the second user interface may be a jump of the game application interface, the movement of characters in the game interface, a change of the game scene, and the like.
  • When the first control in the first user interface is in the selected state, the first control may be highlighted, a cursor may be displayed in the first control's area, and/or the first control may blink; this application does not limit how the user is prompted that the first control is in the selected state.
  • In some embodiments, after the first user interface is displayed, a first floating control may be displayed on the first user interface.
  • The touch screen may be specifically used to detect a first user operation (e.g., a click) acting on the first floating control.
  • In a fourth aspect, an embodiment of the present invention provides a computer storage medium in which a computer program is stored. The computer program includes executable instructions that, when executed by a processor, cause the processor to perform the operations corresponding to the methods provided in the first and second aspects.
  • FIG. 1 is a schematic diagram of the architecture of a communication system provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a hardware structure of an electronic device provided by an embodiment of the present application.
  • FIG. 3 is a schematic diagram of a hardware structure of another electronic device provided by an embodiment of the present application.
  • FIG. 4 is a software framework diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 5 is a software module architecture diagram of a communication system provided by an embodiment of the present application.
  • FIG. 6A is a schematic diagram of a user interface provided by an embodiment of the present application.
  • FIG. 6B is a schematic diagram of a user interface provided by an embodiment of the present application.
  • FIG. 6C is a schematic diagram of a user interface provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of a user interface provided by an embodiment of the present application.
  • FIG. 8 is a schematic diagram of a user interface provided by an embodiment of the present application.
  • FIG. 9 is a schematic diagram of a user interface provided by an embodiment of the present application.
  • FIG. 10 is a schematic diagram of a user interface provided by an embodiment of the present application.
  • FIG. 11 is a schematic diagram of a user interface provided by an embodiment of the present application.
  • FIG. 12 is a schematic diagram of a user interface provided by an embodiment of the present application.
  • FIG. 13 is a schematic diagram of a user interface provided by an embodiment of the present application.
  • FIG. 14A is a schematic diagram of a user interface provided by an embodiment of the present application.
  • FIG. 14B is a schematic diagram of a user interface provided by an embodiment of the present application.
  • FIG. 16 is a flowchart of a method for identifying a key area provided by an embodiment of the present application.
  • FIG. 17 is a schematic diagram of identifying a pixel point in an adjacent area provided by an embodiment of the present application.
  • the present application provides a key mapping method, which can be applied to the electronic device provided by the present application.
  • the electronic device may be a mobile phone, a tablet computer, a personal computer (PC), a smart TV, or other electronic device, and the application does not make any limitation on the specific type of the electronic device.
  • The method can establish a mapping relationship between a virtual key element in an application program on the electronic device and a physical key (also called a physical button) of a peripheral electronic device (for example, a gamepad); the user can then use the peripheral electronic device to control the button of the application and trigger the function corresponding to that button.
  • the electronic device can identify the key area in the game application through the image processing algorithm.
  • In some embodiments, the electronic device may further correct the image processing algorithm in combination with the user's touch frequency on the game screen. After all the key regions are identified, each key region is numbered; at the same time, a cursor that the user can control may appear on the electronic device's interface, and the cursor can move within the identified key regions.
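  • The patent text does not spell out the correction rule. One plausible sketch, assuming the edge detector yields a per-pixel boundary-confidence map in [0, 1] and the device keeps a per-pixel touch-count heatmap, is to lower the acceptance threshold where the user touches often, so frequently touched positions are more readily kept as boundary pixels:

```python
import numpy as np

def correct_boundaries(edge_conf: np.ndarray, touch_counts: np.ndarray,
                       base_thresh: float = 0.5, boost: float = 0.3) -> np.ndarray:
    """edge_conf: per-pixel boundary confidence in [0, 1] (assumed).
    touch_counts: per-pixel count of user touches at that screen position."""
    freq = touch_counts / max(float(touch_counts.max()), 1.0)  # normalize to [0, 1]
    # A frequently touched pixel gets a lower threshold, so it is more likely
    # to be identified as a control-boundary pixel.
    per_pixel_thresh = base_thresh - boost * freq
    return (edge_conf >= per_pixel_thresh).astype(np.uint8)
```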
  • In some embodiments, the user can select the virtual button icon to be matched with a physical button of the gamepad by moving the cursor over the identified button areas. For example, when the cursor stays on a first game virtual key, the user can press a first physical key on the gamepad, so that the first physical key and the first game virtual key establish a mapping relationship. After the user has matched all the virtual button icons in the game application with the physical buttons of the gamepad, the mapping relationships are established; the user can then trigger the virtual buttons in the game application by pressing the physical buttons on the gamepad, thereby triggering the functions corresponding to those buttons, such as character movement, skill casting, scene switching, and so on.
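  • The rule by which the cursor moves between numbered key regions is left open by the text. A minimal sketch, assuming each region is reduced to its center point, is to jump to the nearest region whose center lies in the pressed joystick direction:

```python
def next_region(centers: list, current: int, direction: tuple) -> int:
    """Move the cursor from the current numbered key region to the nearest
    region lying in the pressed direction (e.g., (1, 0) for 'right')."""
    cx, cy = centers[current]
    dx, dy = direction
    best, best_dist = current, float("inf")
    for i, (x, y) in enumerate(centers):
        if i == current:
            continue
        if (x - cx) * dx + (y - cy) * dy <= 0:
            continue  # region is not in the requested direction
        dist = (x - cx) ** 2 + (y - cy) ** 2
        if dist < best_dist:
            best, best_dist = i, dist
    return best

# Usage: three identified regions; from region 0, pushing right selects region 1.
centers = [(100, 500), (400, 500), (100, 200)]
print(next_region(centers, 0, (1, 0)))  # -> 1
```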
  • Implementing the technical solution of the present application can reduce the user's operation steps in the process of matching the gamepad with the game application keys, making operation more convenient and matching more accurate, and greatly improving the user experience.
  • A user interface is a medium for interaction and information exchange between an application program or operating system and the user; it converts between the internal form of information and a form acceptable to the user.
  • The user interface of an application is defined by source code written in specific computer languages such as Java and extensible markup language (XML); the interface source code is rendered on the electronic device and finally presented as content the user can recognize, such as pictures, text, buttons, and other controls.
  • In an unsigned integer, the leftmost bit is not used to represent the sign; it joins the following bits to represent the magnitude of the integer, so positive and negative cannot be distinguished and the value can only be non-negative.
  • Grayscale refers to the use of black tones to represent objects, that is, using black as the base color and displaying images with different saturations of black.
  • Each grayscale object has a brightness value ranging from 0% (white) to 100% (black).
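  • As a sketch of such a conversion, assuming the common luminosity weights (the patent does not fix the formula):

```python
import numpy as np

def to_grayscale(rgb: np.ndarray) -> np.ndarray:
    """Convert an H x W x 3 RGB image to a single-channel grayscale image."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Weighted sum approximating perceived brightness (assumed weights).
    return (0.299 * r + 0.587 * g + 0.114 * b).astype(np.uint8)
```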
  • Noise refers to pixels or pixel blocks that are abruptly out of place in an image; noise interferes with the image, making it unclear or obscuring the details of the observed image.
  • Gaussian noise means that the probability density function of the noise amplitude follows a Gaussian distribution (also called a normal distribution). That is to say, if the amplitude of a noise obeys a Gaussian distribution and its power spectral density is uniformly distributed, it is called Gaussian noise.
  • Gaussian filtering is a linear smoothing filter, suitable for removing Gaussian noise, and is widely used in the noise reduction process of image processing.
  • Gaussian filtering is a process of weighted averaging of the entire image. The value of each pixel is obtained by the weighted average of itself and other pixel values in the neighborhood.
  • The specific operation of Gaussian filtering is to scan each pixel in the image with a template (or Gaussian kernel) and replace the value of the pixel at the template's center with the weighted average gray value of the pixels in the neighborhood determined by the template.
  • Gaussian filters are very effective against noise that obeys a normal distribution.
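  • A minimal sketch of both ideas, assuming a square template whose extent is set by the blur radius; the kernel size, sigma, and noise level below are illustrative:

```python
import numpy as np

def gaussian_kernel(radius: int, sigma: float) -> np.ndarray:
    """Build a (2*radius+1) x (2*radius+1) Gaussian template; weights sum to 1."""
    ax = np.arange(-radius, radius + 1)
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return k / k.sum()

def gaussian_filter(img: np.ndarray, radius: int = 2, sigma: float = 1.4) -> np.ndarray:
    """Scan the image with the template: each output pixel is the weighted
    average of its neighborhood, which suppresses Gaussian noise."""
    k = gaussian_kernel(radius, sigma)
    padded = np.pad(img.astype(np.float32), radius, mode="edge")
    out = np.zeros(img.shape, dtype=np.float32)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            out[y, x] = (padded[y:y + 2 * radius + 1, x:x + 2 * radius + 1] * k).sum()
    return out.astype(img.dtype)

# Usage: corrupt a flat image with Gaussian noise, then smooth it away.
clean = np.full((32, 32), 128, dtype=np.uint8)
noisy = np.clip(clean + np.random.normal(0, 10, clean.shape), 0, 255).astype(np.uint8)
smoothed = gaussian_filter(noisy)
```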
  • RGB refers to the color model with three channels: red (R), green (G), and blue (B).
  • Electronic devices obtain a variety of colors by varying these three color channels and superimposing them on each other.
  • This color representation standard covers almost all colors perceivable by human vision and is one of the most widely used color systems at present.
  • The blur radius refers to how far a given pixel's neighborhood extends outward during the Gaussian filtering process.
  • Gradient: in vector calculus, the gradient at a point in a scalar field points in the direction in which the scalar field increases fastest, and its magnitude is the largest rate of change in that direction.
  • Intuitively, the gradient generalizes the derivative: for a function of one variable it is the slope of the line, i.e., how much the surface slopes in a given direction; the magnitude of the gradient vector is sometimes itself called the gradient.
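  • In standard vector-calculus notation (not taken verbatim from the patent), for a scalar field f(x, y) such as a grayscale image:

```latex
\nabla f(x, y) = \left( \frac{\partial f}{\partial x},\; \frac{\partial f}{\partial y} \right),
\qquad
\lVert \nabla f \rVert = \sqrt{ \left( \frac{\partial f}{\partial x} \right)^{2}
                              + \left( \frac{\partial f}{\partial y} \right)^{2} },
\qquad
\theta = \arctan \frac{\partial f / \partial y}{\partial f / \partial x}
```

  • The vector points in the direction of fastest increase; its magnitude is the "gradient value" that the edge-detection steps above suppress and threshold, and θ is the gradient direction used for non-maximum suppression.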
  • the communication system 10 may include an electronic device 100 and an electronic device 200 .
  • the electronic device 100 may be an electronic device such as a mobile phone, a tablet computer, a PC, and a smart TV.
  • the electronic device 100 may have one or more of a Bluetooth (bluetooth) module and a WLAN module.
  • The electronic device 100 can transmit signals through one or more of the Bluetooth module and the WLAN module to detect and scan devices near the electronic device 100, so that it can discover nearby devices (e.g., the electronic device 200) using one or more wireless communication technologies of Bluetooth or WLAN, establish a wireless communication connection with a nearby device, and transmit data to that device through one or more of these wireless communication technologies.
  • the Bluetooth module can provide a solution including one or more Bluetooth communications in classic Bluetooth (Bluetooth 2.1 standard) or Bluetooth Low Energy (BLE).
  • the WLAN module can provide a solution including one or more WLAN communications in Wi-Fi direct, Wi-Fi LAN or Wi-Fi softAP.
  • the electronic device 200 may be a peripheral handle with a Bluetooth module, and/or a WLAN module, and/or a data line interface.
  • the electronic device 200 may receive or transmit wireless signals through one or more of a Bluetooth module and a WLAN module.
  • The Bluetooth module can provide a solution including one or more Bluetooth communications in classic Bluetooth or Bluetooth low energy.
  • the WLAN module can provide a solution including one or more WLAN communications in Wi-Fi direct, Wi-Fi LAN or Wi-Fi softAP.
  • the electronic device 200 may further include a joystick button 201A, a start button 201B (identified with the letter S), an A function button 201C, a B function button 201D, a C function button 201E, and a D function button 201F.
  • The joystick button 201A can be used to control movement in a direction (e.g., up, down, etc.) on the user interface of the electronic device 100, and the start button 201B can be used to turn the electronic device 200 on or off.
  • The A function button 201C, B function button 201D, C function button 201E, and D function button 201F can each be mapped to a function button on the user interface of the electronic device 100; when the user presses these function buttons, the electronic device 100 can be triggered to generate the corresponding function event.
  • the electronic device 200 may establish a first connection with the electronic device 100 .
  • The first connection may be one or more wireless communication connections among Bluetooth, Wi-Fi direct, or Wi-Fi softAP, or may be a wired connection, such as a universal serial bus (USB) connection.
  • The electronic device 100 and the electronic device 200 can transmit data information to each other through the first connection.
  • the structure of the electronic device 200 shown in the embodiments of the present application does not constitute a specific limitation on the communication system 10 .
  • The electronic device 200 may have more or fewer buttons than shown in the figure; for example, the electronic device 200 may have multiple joystick buttons 201A.
  • the positions of the keys on the electronic device 200 may be located on the side, the back, etc. of the electronic device 200 , which are not facing the user. This application does not limit this.
  • the communication system 10 may include more or fewer devices than shown.
  • the communication system 10 may also include a plurality of mobile phones, or a plurality of different types of electronic devices, such as a display with a communication function, a tablet computer, a PC, and the like. This application does not limit this.
  • FIG. 2 shows a schematic diagram of the hardware structure of the electronic device 100 .
  • The electronic device 100 may be a cell phone, tablet computer, desktop computer, laptop computer, handheld computer, notebook computer, ultra-mobile personal computer (UMPC), netbook, cellular telephone, personal digital assistant (PDA), augmented reality (AR) device, virtual reality (VR) device, artificial intelligence (AI) device, wearable device, in-vehicle device, smart home device, and/or smart city device; the embodiments of the present application do not specifically limit the type of the electronic device.
  • The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structures illustrated in the embodiments of the present invention do not constitute a specific limitation on the electronic device 100 .
  • The electronic device 100 may include more or fewer components than shown, combine some components, split some components, or arrange the components differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example, the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), controller, video codec, digital signal processor (digital signal processor, DSP), baseband processor, and/or neural-network processing unit (neural-network processing unit, NPU), etc. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • The memory in the processor 110 is a cache. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and thereby improves system efficiency.
  • the processor 110 may include one or more interfaces.
  • The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may contain multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flash, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate with each other through the I2C bus interface, so as to realize the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • the processor 110 may contain multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communications, sampling, quantizing and encoding analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is typically used to connect the processor 110 with the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 110 communicates with the camera 193 through a CSI interface, so as to realize the photographing function of the electronic device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to implement the display function of the electronic device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • the GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiment of the present invention is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 charges the battery 142 , it can also supply power to the electronic device through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the wireless communication solution provided by the mobile communication module 150 may enable the electronic device to communicate with a device (eg, a server) in the network.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194 .
  • the modem processor may be a separate device.
  • the modulation and demodulation processor may be independent of the processor 110, and be provided in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area network (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technology.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2 .
  • the electronic device 100 can detect or scan devices near the electronic device 100 by transmitting signals through the Bluetooth module and the WLAN module in the wireless communication module 160, and establish a wireless communication connection with the nearby devices and transmit data.
  • The Bluetooth module can provide a solution including one or more Bluetooth communications in classic Bluetooth (the Bluetooth 2.1 standard) or Bluetooth low energy.
  • the WLAN module can provide a solution including one or more WLAN communications in Wi-Fi direct, Wi-Fi LAN or Wi-Fi softAP.
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite-based augmentation system (SBAS).
  • the electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on.
  • the electronic device 100 may include one or N display screens 194 , where N is a positive integer greater than one.
  • the electronic device 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193 .
• when the shutter is opened, light is transmitted through the lens to the camera photosensitive element, which converts the optical signal into an electrical signal; the photosensitive element then transmits the electrical signal to the ISP for processing, and the ISP converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object is projected through the lens to generate an optical image onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • a digital signal processor is used to process digital signals, in addition to processing digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy and so on.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos of various encoding formats, such as: Moving Picture Experts Group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4 and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the internal memory 121 may include one or more random access memories (RAM) and one or more non-volatile memories (NVM).
• Random access memory can include static random-access memory (SRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM; for example, the fifth-generation DDR SDRAM is generally called DDR5 SDRAM), and the like.
• Non-volatile memory may include magnetic disk storage devices and flash memory.
• Flash memory can be divided into NOR FLASH, NAND FLASH, 3D NAND FLASH, etc. according to the operating principle; into single-level cell (SLC), multi-level cell (MLC), triple-level cell (TLC), quad-level cell (QLC), etc. according to the number of storage cell potential levels; and into universal flash storage (UFS), embedded multimedia memory card (eMMC), etc. according to the storage specification.
  • the random access memory can be directly read and written by the processor 110, and can be used to store executable programs (eg, machine instructions) of an operating system or other running programs, and can also be used to store data of users and application programs.
  • the non-volatile memory can also store executable programs and store data of user and application programs, etc., and can be loaded into the random access memory in advance for the processor 110 to directly read and write.
  • the external memory interface 120 can be used to connect an external non-volatile memory, so as to expand the storage capacity of the electronic device 100 .
• the external non-volatile memory communicates with the processor 110 through the external memory interface 120 to realize the data storage function; for example, files such as music and videos are saved in the external non-volatile memory.
  • the electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playback, recording, etc.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
• The speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
• The receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals.
  • the voice can be answered by placing the receiver 170B close to the human ear.
• The microphone 170C, also called a "mic", is used to convert sound signals into electrical signals.
• When making a sound, the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the earphone jack 170D is used to connect wired earphones.
  • the earphone interface 170D can be the USB interface 130, or can be a 3.5mm open mobile terminal platform (OMTP) standard interface, a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • OMTP open mobile terminal platform
  • CTIA cellular telecommunications industry association of the USA
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • the capacitive pressure sensor may be comprised of at least two parallel plates of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than the first pressure threshold acts on the short message application icon, the instruction for viewing the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, the instruction to create a new short message is executed.
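• By way of illustration only, the following is a minimal Java sketch of the pressure-threshold dispatch described above; the threshold value and the two action methods are hypothetical placeholders, not part of this application.

    // Sketch: dispatch different instructions for the same touch position
    // depending on the detected touch intensity (first pressure threshold).
    public class PressureDispatcher {
        private static final float FIRST_PRESSURE_THRESHOLD = 0.5f; // assumed value

        void onMessageIconTouched(float pressure) {
            if (pressure < FIRST_PRESSURE_THRESHOLD) {
                viewShortMessage();      // light press: view the short message
            } else {
                createNewShortMessage(); // firm press: create a new short message
            }
        }

        void viewShortMessage() { /* open the message list */ }
        void createNewShortMessage() { /* open the message composer */ }
    }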
  • the gyro sensor 180B may be used to determine the motion attitude of the electronic device 100 .
• In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyro sensor 180B detects the shaking angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to offset the shaking of the electronic device 100 through reverse motion to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenarios.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist in positioning and navigation.
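• As an illustrative aside, the Android platform exposes this barometric conversion directly; a minimal sketch, assuming the measured pressure comes from the air pressure sensor in hPa:

    import android.hardware.SensorManager;

    // Sketch: convert a measured air pressure (hPa) into an altitude (m)
    // using the standard-atmosphere model built into SensorManager.
    public final class AltitudeHelper {
        public static float altitudeMeters(float measuredPressureHpa) {
            return SensorManager.getAltitude(
                    SensorManager.PRESSURE_STANDARD_ATMOSPHERE, // 1013.25 hPa
                    measuredPressureHpa);
        }
    }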
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 can detect the opening and closing of the flip holster using the magnetic sensor 180D.
• the electronic device 100 can detect the opening and closing of the flip cover according to the magnetic sensor 180D, and can further set features such as automatic unlocking of the flip cover according to the detected opening and closing state of the holster or the flip cover.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally three axes).
• the magnitude and direction of gravity can be detected when the electronic device 100 is stationary. The acceleration sensor 180E can also be used to identify the posture of the electronic device, and can be used in applications such as landscape/portrait switching and pedometers.
  • the electronic device 100 can measure the distance through infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 can use the distance sensor 180F to measure the distance to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the electronic device 100 emits infrared light to the outside through the light emitting diode.
  • Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100 . When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100 .
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
• The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in the pocket to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking pictures with fingerprints, answering incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect the temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the electronic device 100 reduces the performance of the processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection.
  • the electronic device 100 when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid abnormal shutdown of the electronic device 100 caused by the low temperature.
  • the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
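• A minimal Java sketch of this thermal strategy follows; the two temperature thresholds and the three actions are illustrative assumptions only, not values given in this application.

    // Sketch: temperature processing strategy driven by the reported temperature.
    public class ThermalPolicy {
        private static final float HIGH_TEMP_C = 45f; // assumed upper threshold
        private static final float LOW_TEMP_C  = 0f;  // assumed lower threshold

        void onTemperatureReported(float celsius) {
            if (celsius > HIGH_TEMP_C) {
                reduceProcessorPerformance(); // throttle to cut power consumption and heat
            } else if (celsius < LOW_TEMP_C) {
                heatBattery();                // avoid low-temperature abnormal shutdown
                boostBatteryOutputVoltage();  // keep the supply stable in the cold
            }
        }

        void reduceProcessorPerformance() { }
        void heatBattery() { }
        void boostBatteryOutputVoltage() { }
    }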
• The touch sensor 180K is also called a "touch device".
  • the touch sensor 180K may be disposed on the display screen 194 , and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the location where the display screen 194 is located.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice.
  • the bone conduction sensor 180M can also contact the pulse of the human body and receive the blood pressure beating signal.
  • the bone conduction sensor 180M can also be disposed in the earphone, combined with the bone conduction earphone.
  • the audio module 170 can analyze the voice signal based on the vibration signal of the vocal vibration bone block obtained by the bone conduction sensor 180M, so as to realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 180M, and realize the function of heart rate detection.
• the keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys.
  • the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
  • Motor 191 can generate vibrating cues.
  • the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the motor 191 can also correspond to different vibration feedback effects for touch operations on different areas of the display screen 194 .
• Different application scenarios (for example, time reminders, receiving information, alarm clocks, games, etc.) may also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, which can be used to indicate the charging state, the change of the power, and can also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be contacted and separated from the electronic device 100 by inserting into the SIM card interface 195 or pulling out from the SIM card interface 195 .
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the plurality of cards may be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as call and data communication.
• the electronic device 100 employs an eSIM, i.e., an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
  • FIG. 3 exemplarily shows the hardware structure of the electronic device 200 provided by the present application.
  • the electronic device 200 may include a button 201, a processor (central processing unit, CPU) 202, a memory 203, a bus 204, an input and output interface 205, a motor 206, an indicator light 207, an audio module 208, and a sensor module 209 , a communication interface 210 , a wireless communication module 211 , a power management module 212 , and an antenna 3 .
  • the sensor module 209 may include a pressure sensor 209A, an angle sensor 209B, a gravity sensor 209C, a gyroscope sensor 209D, an acceleration sensor 209E, and the like.
  • the communication interface 210 may include a USB interface 210A, a wireless communication interface 210B, and the like.
  • the wireless communication module 211 may include a Bluetooth communication module 211A, a Wi-Fi communication module 211B, and the like.
  • the processor 202, the communication interface 210, the wireless communication module 211, and the power management module 212 may be connected through the bus 204 or in other ways.
  • FIG. 3 takes the connection through the bus 204 as an example.
  • the structures illustrated in the embodiments of the present invention do not constitute a specific limitation on the electronic device 200 .
• the electronic device 200 may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the buttons 201 may include a start button 201B, a rocker button 201A, various function buttons, and the like as shown in FIG. 1 .
  • the keys 201 may be mechanical keys.
  • the electronic device 100 may receive the key signal of the electronic device 200 and generate touch events for corresponding application keys on the display screen of the electronic device 100 .
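• One plausible way to realize this on Android is to synthesize a MotionEvent at the mapped coordinates; the following Java sketch assumes the mapping lookup and target view come from elsewhere, and is an illustration rather than the definitive implementation of this application.

    import android.os.SystemClock;
    import android.view.MotionEvent;
    import android.view.View;

    // Sketch: turn a received gamepad key signal into a synthetic tap
    // (down + up) at the screen position mapped to that key.
    public class KeyToTouchInjector {
        void injectTap(View targetView, float x, float y) {
            long now = SystemClock.uptimeMillis();
            MotionEvent down = MotionEvent.obtain(now, now,
                    MotionEvent.ACTION_DOWN, x, y, 0);
            MotionEvent up = MotionEvent.obtain(now, now + 50,
                    MotionEvent.ACTION_UP, x, y, 0);
            targetView.dispatchTouchEvent(down); // deliver the press
            targetView.dispatchTouchEvent(up);   // deliver the release
            down.recycle();
            up.recycle();
        }
    }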
  • the processor 202 may include one or more processing units, for example, the processor 202 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), controller, video codec, digital signal processor (digital signal processor, DSP), baseband processor, and/or neural-network processing unit (neural-network processing unit, NPU), etc. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 202 for storing instructions and data.
  • the memory in processor 202 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 202 . If the processor 202 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 202 is reduced, thereby increasing the efficiency of the system.
  • the processor 202 may include one or more interfaces.
• the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the memory 203 may include one or more random access memories (RAM) and one or more non-volatile memories (NVM).
• Random access memory can include static random-access memory (SRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM; for example, the fifth-generation DDR SDRAM is generally called DDR5 SDRAM), and the like.
• Non-volatile memory may include magnetic disk storage devices and flash memory.
• Flash memory can be divided into NOR FLASH, NAND FLASH, 3D NAND FLASH, etc. according to the operating principle; into single-level cell (SLC), multi-level cell (MLC), triple-level cell (TLC), quad-level cell (QLC), etc. according to the number of storage cell potential levels; and into universal flash storage (UFS), embedded multimedia memory card (eMMC), etc. according to the storage specification.
  • the random access memory can be directly read and written by the processor 202, and can be used to store executable programs (eg, machine instructions) of an operating system or other running programs, and can also be used to store data of users and application programs.
  • the non-volatile memory can also store executable programs and store data of users and application programs, etc., and can be loaded into the random access memory in advance for the processor 202 to directly read and write.
  • Motor 206 can generate vibrating cues.
• the motor 206 can be used for touch vibration feedback. Touch operations (e.g., presses) acting on different keys 201 (e.g., the start key, function keys, etc.) can correspond to different vibration feedback effects. In some embodiments, the touch vibration feedback effect may also support customization.
  • the indicator 207 may be an indicator light, which may be used to indicate the charging state, the change of the electric quantity, or may be used to indicate the touch operation of the key 201 and the like.
  • the audio module 208 is used to convert digital audio information to analog audio signal output, and also to convert analog audio input to digital audio signal. Audio module 208 may also be used to encode and decode audio signals. In some embodiments, the audio module 208 may be provided in the processor 202 , or some functional modules of the audio module 208 may be provided in the processor 202 .
  • the pressure sensor 209A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 209A may be disposed at the bottom of the button 201 .
  • the capacitive pressure sensor may be comprised of at least two parallel plates of conductive material. When a force is applied to the pressure sensor 209A, the capacitance between the electrodes changes.
  • the electronic device 200 determines the intensity of the pressure according to the change in capacitance.
• When a touch operation (e.g., a press) acts on the key 201, the electronic device 200 detects the intensity of the touch operation according to the pressure sensor 209A.
  • the electronic device 200 can also calculate the touched position according to the detection signal of the pressure sensor 209A.
  • touch operations that act on the same touch position but have different touch operation durations may correspond to different operation instructions. For example, when the electronic device 200 is in a running state, and a touch operation whose duration is less than the first duration threshold acts on the start button, the instruction for the electronic device 200 to sleep is executed. When a touch operation whose duration is greater than or equal to the first duration threshold acts on the start button, the instruction for shutting down the electronic device 200 is executed.
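• A minimal Java sketch of this duration-based dispatch follows; the threshold value and the sleep/shutdown methods are hypothetical placeholders.

    // Sketch: same start button, different instructions by press duration.
    public class StartButtonHandler {
        private static final long FIRST_DURATION_THRESHOLD_MS = 2000; // assumed value

        void onStartButtonReleased(long pressDurationMs, boolean deviceRunning) {
            if (!deviceRunning) return;
            if (pressDurationMs < FIRST_DURATION_THRESHOLD_MS) {
                sleepDevice();    // shorter press: put the gamepad to sleep
            } else {
                shutdownDevice(); // longer press: shut the gamepad down
            }
        }

        void sleepDevice() { }
        void shutdownDevice() { }
    }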
• the angle sensor 209B may be used to detect the angle. In a specific implementation, there may be a hole in the middle of the angle sensor 209B to match the corresponding mechanical shaft.
• the angle sensor 209B counts once for every 1/16 of a revolution of the mechanical shaft: when the shaft turns in one direction the count increases, and when the turning direction changes the count decreases. The count is relative to the initial position of the angle sensor 209B; its count value is set to 0 when the angle sensor is initialized, and the angle sensor can be reset programmatically if desired.
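• The counting behavior lends itself to a very small sketch; the conversion to degrees (360°/16 per count) is an inference from the 1/16-revolution step described above.

    // Sketch: signed step counter for the angle sensor.
    public class AngleSensorCounter {
        private static final double DEGREES_PER_STEP = 360.0 / 16.0; // 22.5 degrees
        private long count; // 0 after initialization or reset

        void onStep(int direction) { // +1 in one rotation direction, -1 in the other
            count += direction;
        }

        void reset() { count = 0; } // programmatic reset, as described above
        double angleDegrees() { return count * DEGREES_PER_STEP; }
    }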
  • the gravity sensor 209C may be used to collect gravitational acceleration data of the electronic device 200 to determine the motion state of the electronic device 200 .
  • the gravity sensor 209C can be used in scenarios such as somatosensory games.
• the gyroscope sensor 209D can be used to determine the motion attitude of the electronic device 200 and send the attitude-related signal to the electronic device 100, so that the corresponding control icon on the display screen of the electronic device 100 can display the same motion attitude as the electronic device 200.
• In some embodiments, the angular velocity of the electronic device 200 about three axes (i.e., the x, y, and z axes) may be determined by the gyroscope sensor 209D.
• In some embodiments, the gyroscope sensor 209D can be used in somatosensory games, racing games, and other scenarios.
  • the acceleration sensor 209E can detect the magnitude of the acceleration of the electronic device 200 in various directions (generally three axes). Therefore, the acceleration sensor 209E can be used to detect motion information of the electronic device 200 . When the electronic device 200 is stationary, the acceleration sensor 209E can also detect the magnitude and direction of gravity.
  • the USB interface 210A is an interface that conforms to the USB standard specification, and can specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 210A can be used to connect a charger to charge the electronic device 200, and can also be used to transmit data between the electronic device 200 and peripheral devices.
  • the interface can also be used to connect other electronic devices, such as AR devices.
• the USB interface 210A may also be an on-the-go (OTG) interface, which is mainly used for connections between various devices to exchange data, and can extend the USB interface to smart terminal accessories to enrich the functions of the smart terminal.
  • OTG interface can connect the electronic device 200 with the peripheral device through an OTG data line having a USB interface at one end and a Type C interface at the other end, and can transmit data and the like through this connection.
  • the wireless communication interface 210B is an interface that conforms to a wireless communication protocol, and specifically may be an 802.11 wireless interface or the like.
  • the wireless communication interface 210B can be used to establish a wireless connection between the electronic device 200 and a peripheral device, and perform data transmission between the devices through the wireless connection.
• the wireless communication module 211 may include one or more of the Bluetooth communication processing module 211A and the WLAN communication processing module 211B, and may be used to monitor signals transmitted by other devices (e.g., the electronic device 100), such as probe requests and scan signals, and may send response signals, such as probe responses and scan responses, so that the other devices can discover the electronic device 200, establish a wireless communication connection with it, and communicate with it through one or more wireless communication technologies in Bluetooth or WLAN.
• In some embodiments, one or more of the Bluetooth communication processing module and the WLAN communication processing module may also transmit signals, such as broadcast Bluetooth signals or beacon signals, so that other devices (e.g., the electronic device 100) can discover the electronic device 200, establish a wireless communication connection with it, and communicate with it through one or more wireless communication technologies in Bluetooth or WLAN.
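• For the Bluetooth case, the standard Android discovery flow looks roughly like the following sketch; permission handling and pairing are omitted, and this is not presented as the implementation used by this application.

    import android.bluetooth.BluetoothAdapter;
    import android.bluetooth.BluetoothDevice;
    import android.content.BroadcastReceiver;
    import android.content.Context;
    import android.content.Intent;
    import android.content.IntentFilter;

    // Sketch: discover nearby Bluetooth devices such as the gamepad.
    public class GamepadDiscovery {
        private final BroadcastReceiver receiver = new BroadcastReceiver() {
            @Override public void onReceive(Context context, Intent intent) {
                if (BluetoothDevice.ACTION_FOUND.equals(intent.getAction())) {
                    BluetoothDevice device =
                            intent.getParcelableExtra(BluetoothDevice.EXTRA_DEVICE);
                    // device.getName() / device.getAddress() identify the peer;
                    // pairing and connection setup would follow here.
                }
            }
        };

        void start(Context context) {
            context.registerReceiver(receiver,
                    new IntentFilter(BluetoothDevice.ACTION_FOUND));
            BluetoothAdapter.getDefaultAdapter().startDiscovery(); // requires Bluetooth permissions
        }
    }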
  • the antenna 3 is used for transmitting and receiving electromagnetic wave signals, which can cover single or multiple communication frequency bands.
  • the antenna may be used in conjunction with a tuning switch.
  • the electronic device 200 may further include multiple antennas, which is not limited in this application.
  • the power management module 212 may include a battery and charging management module.
  • the power management module 212 can supply power to the processor 202, the memory 203, the wireless communication module 211, and the like.
  • the power management module 212 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 212 may also be provided in the processor 202 .
  • the charging management modules in the power management module 212 may also be provided in different devices.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
• the embodiments of the present application take the Android system with a layered architecture as an example to exemplarily describe the software architecture of the electronic device 100.
  • FIG. 4 is a block diagram of the software structure of the electronic device 100 according to the embodiment of the present application.
• the software structure of the electronic device 100 may include: an application layer (applications, APP), an application framework layer (application framework, FWK), the Android runtime and system libraries, a kernel layer, and a hardware layer.
  • the application layer can include a series of application packages.
  • the application package can include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message and so on.
  • the application layer may further include a game application and a gamepad application.
• a game application is an application on an electronic device such as a smart phone or a tablet computer that can be used for recreation and entertainment by the user; this application does not limit the name of the application. That is to say, the game application can be any game application on the market that users can acquire and control, which is not limited in this application.
  • the gamepad application can be used to manage and configure the application of the electronic device 200 (eg, a peripheral controller).
• the gamepad application can be used to set and/or adjust parameters such as button sensitivity, button icon transparency, and combo rate, and can also be used to identify the button area on the user interface of the application program for which the electronic device 100 needs to perform button mapping.
• the gamepad application may, in response to the user's operation, establish mapping information between the virtual key icon that currently needs to be mapped and the physical key on the electronic device 200, and save the mapping information in the internal storage of the electronic device 100 and/or a cloud server.
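• The mapping information itself can be pictured as a small table from physical key values to icon coordinates; the following Java sketch is illustrative only, and persistence to internal storage or a cloud server is omitted.

    import java.util.HashMap;
    import java.util.Map;

    // Sketch: mapping information between gamepad key values and the
    // pixel coordinates of the virtual key icons they are mapped to.
    public class KeyMappingTable {
        private final Map<Integer, float[]> table = new HashMap<>();

        void map(int keyValue, float x, float y) {
            table.put(keyValue, new float[] {x, y}); // key value -> {x, y}
        }

        float[] lookup(int keyValue) { return table.get(keyValue); }
    }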
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, a patch package, and the like.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
  • Content providers are used to store and retrieve data and make these data accessible to applications.
  • the data may include video, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications.
  • a display interface can consist of one or more views.
  • the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide the communication function of the electronic device 100 .
• For example, the management of call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for the application, such as localization strings, icons, pictures, layout files, video files and so on.
  • the notification manager enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a brief pause without user interaction.
  • the notification manager is used to notify download completion, message reminders, etc.
• the notification manager can also display notifications in the status bar at the top of the system in the form of graphs or scroll-bar text, such as notifications of applications running in the background, and can display notifications on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is made, the electronic device vibrates, the indicator light flashes, and so on.
• The Android runtime includes core libraries and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.
• the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • a system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), patch engine, etc.
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
• the kernel layer contains at least the display driver, camera driver, audio driver, and sensor driver, as well as WLAN and Bluetooth capabilities and basic communication protocols.
• When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is sent to the kernel layer.
  • the kernel layer processes touch operations into raw input events (including touch coordinates, timestamps of touch operations, etc.).
  • Raw input events are stored at the kernel layer.
• the application framework layer obtains the raw input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation being a click operation and the control corresponding to the click operation being a camera application icon as an example: the camera application calls the interface of the application framework layer to start the camera application, then starts the camera driver by calling the kernel layer, and captures a still image or video through the camera 193.
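• Conceptually, this flow reduces to the following sketch: a raw input event carrying coordinates and a timestamp is matched against the controls on screen. The types and the hit-testing interface are hypothetical simplifications, not the framework's actual API.

    // Sketch: resolve a raw input event to the control under the touch point.
    public class RawInputPipeline {
        static class RawInputEvent {
            final float x, y;     // touch coordinates
            final long timestamp; // timestamp of the touch operation
            RawInputEvent(float x, float y, long timestamp) {
                this.x = x; this.y = y; this.timestamp = timestamp;
            }
        }

        interface Control { boolean contains(float x, float y); void onClick(); }

        void dispatch(RawInputEvent event, Iterable<Control> controls) {
            for (Control c : controls) {
                if (c.contains(event.x, event.y)) {
                    c.onClick(); // e.g., the camera icon starts the camera application
                    return;
                }
            }
        }
    }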
  • FIG. 5 is a block diagram of the software architecture of the communication system 10 .
  • the software architecture of the communication system 10 may include software modules of the electronic device 100 and software modules of the electronic device 200 .
  • the software modules of the electronic device 100 may include a processing module 510 , a communication module 520 , and a display module 530 .
  • the electronic device 200 may include a communication module 540 and a processing module 550 .
  • the processing module 510 includes a key recognition module 511 , a key mapping module 512 , and a key management module 513 .
  • the button identification module 511 can be used to identify the button icon area on the current user interface of the electronic device 100 after the electronic device 100 establishes a communication connection with the electronic device 200 and starts an application that needs to perform button mapping.
  • the button management module 513 can be used for the user to manage and adjust the parameter configuration of the gamepad, such as button sensitivity, button icon transparency, combo rate and so on.
  • the key management module 513 can be used to obtain the coordinate position of the key icon on the user interface of the electronic device 100, and obtain the key value corresponding to the physical key touched by the user on the electronic device 200, and generate and save the coordinate position of the aforementioned key icon corresponding to the physical key. Key-value mapping information.
• When the electronic device 100 detects a user's touch operation on the icon for enabling the key area identification module (for example, a click on the "start matching" icon), the key recognition module 511, in response to the operation, recognizes the key icon area on the current user interface of the electronic device 100.
• The communication module 520 may be used to manage USB-based wired connections, and/or to manage wireless connections based on one or more wireless communication technologies of Bluetooth and/or WLAN.
  • the Bluetooth (BT) module can provide a solution including one or more Bluetooth communications in classic Bluetooth (Bluetooth 2.1) or Bluetooth Low Energy (BLE).
  • the WLAN module can provide a solution including one or more WLAN communications in Wi-Fi direct, Wi-Fi LAN or Wi-Fi softAP.
  • the display module 530 may be used to display a user interface, such as images and videos, on the electronic device 100 so that a user can interact with the electronic device 100 .
  • the communication module 540 is a module used by the electronic device 200 to communicate with other devices. For details, reference may be made to the description of the communication module 520, which will not be repeated here.
  • the electronic device 200 and the electronic device 100 may perform data interaction through the first connection established between the communication module 520 and the communication module 540 .
• For the first connection, reference may be made to the description in FIG. 1, which will not be repeated here.
• the processing module 550 may be used for signal processing of the gamepad. Specifically, for example, in response to a user's touch operation (such as a press) on the electronic device 200, the electronic device 200 may receive a physical signal (such as a pressure signal), and the processing module 550 can convert the physical signal into an electrical signal through a sensor in the aforementioned sensor module 209 (e.g., the pressure sensor 209A). The processing module 550 can also collect the data in the aforementioned sensor module 209 for processing.
• For example, when the gravity sensor 209C in the sensor module 209 detects gravitational acceleration moving downward from the left side, the processing module 550 converts the physical signal into an electrical signal, processes it accordingly (for example, into an appropriate frequency), and sends it to the electronic device 100 through the communication module 540.
  • the display module 530 displays the corresponding touch events (for example, the controllable icon elements on the electronic device 100 are also shifted to the left accordingly).
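• On the gamepad side, reading the gravity sensor and reducing it to a tilt signal could look like the following Java sketch; the threshold, the sign convention, and the forwarding hook are illustrative assumptions.

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;

    // Sketch: report left/right tilt of the gamepad to the phone.
    public class TiltReporter implements SensorEventListener {
        private static final float TILT_THRESHOLD = 2.0f; // m/s^2, assumed

        void start(SensorManager sensorManager) {
            Sensor gravity = sensorManager.getDefaultSensor(Sensor.TYPE_GRAVITY);
            sensorManager.registerListener(this, gravity,
                    SensorManager.SENSOR_DELAY_GAME);
        }

        @Override public void onSensorChanged(SensorEvent event) {
            float gx = event.values[0]; // x-axis gravity component, sign convention assumed
            if (gx > TILT_THRESHOLD) {
                sendToPhone("LEFT");    // icon on the phone shifts left accordingly
            } else if (gx < -TILT_THRESHOLD) {
                sendToPhone("RIGHT");
            }
        }

        @Override public void onAccuracyChanged(Sensor sensor, int accuracy) { }

        void sendToPhone(String direction) { /* over the established first connection */ }
    }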
  • FIG. 6A shows an exemplary user interface 30 .
  • the user interface 30 may include a status bar 301, a tray 302 with frequently used application icons, a calendar indicator 303, a page indicator 304, and other application icons, among others.
  • Status bar 301 may include: one or more signal strength indicators 301A for mobile communication signals (also referred to as cellular signals), one or more signal strength indicators 301B for wireless fidelity (Wi-Fi) signals, Battery status indicator 301C.
  • a tray 302 with icons of frequently used applications may display: a camera icon 302A, an address book icon 302B, a phone icon 302C, and an information icon 302D.
  • Calendar indicator 303 may be used to indicate the current time, such as date, day of the week, hour and minute information, and the like.
  • the page indicator 304 may be used to indicate the application in which page the user is currently viewing. Users can swipe left and right in the area of other application icons to browse application icons in other pages.
  • Other application icons may be, for example: music icon 305 , calculator icon 306 , game icon 307 , settings icon 308 .
• the user interface 30 exemplarily shown in FIG. 6A may be a home screen.
  • the electronic device may also include a home screen key.
  • the home screen key can be a physical key or a virtual key.
  • the home screen button can be used to receive the user's instruction and return the currently displayed UI to the home interface, so that it is convenient for the user to view the home screen at any time.
• the above instruction may be an operation instruction of the user pressing the home screen key once, an operation instruction of the user pressing the home screen key twice consecutively within a short period of time, or an operation instruction of the user pressing and holding the home screen key for a predetermined period of time.
  • the home screen key may also be integrated with a fingerprint reader, so that when the user presses the home screen key, fingerprint collection and identification are performed accordingly.
  • FIG. 6A only exemplarily shows a user interface on the electronic device 100 , and should not constitute a limitation to the embodiments of the present application.
  • FIG. 6A , FIG. 6B , FIG. 6C and FIG. 7 show related user interfaces for establishing a communication connection between the electronic device 100 and the electronic device 200 .
  • the embodiment of the present application takes the communication between the electronic device 100 and the electronic device 200 through Bluetooth as an example.
  • FIG. 6A exemplarily shows an operation of turning on Bluetooth on the electronic device 100 .
• When a downward swipe gesture at the status bar 301 is detected, the electronic device 100 may display the user interface 31 in response to the gesture.
  • the user interface 31 may include a window 311 and some or all of the same interface elements as the aforementioned user interface 30 (eg, controls, icons, text, etc.).
  • the window 311 may display a "Bluetooth" switch control 312, and may also display switch controls with other functions (eg, Wi-Fi control, torch control, location information control, gamepad control 313, etc.).
  • the electronic device 100 may turn on the Bluetooth function.
  • the user can make a downward swipe gesture at the status bar 301 to open the window 311, and can click the switch control 312 of “Bluetooth” in the window 311 to conveniently turn on the Bluetooth.
  • the electronic device 100 can discover nearby devices through the Bluetooth communication technology.
  • FIG. 6B exemplarily shows another operation of enabling Bluetooth.
• In response to a touch operation (e.g., a click) on the settings icon 308, the electronic device 100 may display the user interface 32.
• the user interface 32 may include one or more setting items, and the one or more setting items may include: an airplane mode setting item, a Wi-Fi setting item, a Bluetooth setting item 321, a mobile network setting item, a gamepad setting item 322, a Do Not Disturb mode setting item, a display and brightness setting item, a Huawei account setting item, and so on.
  • Each setting item on the user interface 32 has a corresponding title.
• For example, the title corresponding to the airplane mode setting item is "Airplane Mode", the title corresponding to the Wi-Fi setting item is "Wi-Fi", the title corresponding to the Bluetooth setting item 321 is "Bluetooth", the title corresponding to the mobile network setting item is "Mobile Network", the title corresponding to the gamepad setting item 322 is "Gamepad", the title corresponding to the Do Not Disturb mode setting item is "Do Not Disturb Mode", the title corresponding to the display and brightness setting item is "Display and Brightness", and the title corresponding to the Huawei account setting item is "Huawei Account".
  • Each setting item may be used to monitor an operation (eg, a touch operation) that triggers displaying the setting content of the corresponding setting item, and in response to the operation, the electronic device 100 may open a user interface for displaying the setting content of the corresponding setting item.
• In some embodiments, the user interface 32 may include additional setting items, for example, "Assistant", "Biometrics and Passwords", and the like, and the setting items in the user interface 32 may also have corresponding text descriptions.
• In other embodiments, the user interface 32 may also omit some setting items, and the titles corresponding to the setting items may be different. The representation of each setting item may include an icon and/or text. This application does not limit this.
• In response to a touch operation (e.g., a click), the electronic device 100 can turn on the Bluetooth function. After Bluetooth is turned on, the electronic device 100 can discover nearby devices through the Bluetooth communication technology.
• In some embodiments, the electronic device 100 can also discover nearby devices through communication technologies such as Wi-Fi direct (e.g., Wi-Fi P2P), Wi-Fi softAP, and Wi-Fi LAN. This application does not limit this.
  • FIG. 6C exemplarily shows the user interface 33 for Bluetooth settings.
• In response to a touch operation (e.g., a click) on the Bluetooth setting item 321, the electronic device 100 displays the user interface 33 shown in FIG. 6C.
  • the user interface 33 may include the status bar 301 shown in the aforementioned user interface 30 , and the status bar may refer to the description of the aforementioned FIG. 6A , which will not be repeated here.
• User interface 33 may also include a current page indicator 331, a Bluetooth state control 332, a gamepad 200 device option item 323, and other interface elements (e.g., icons, controls, text, etc.).
• the current page indicator 331 may be used to indicate the current page; for example, the text information "Bluetooth" may be used to indicate that the current page is the main interface of Bluetooth settings. Not limited to text information, the current page indicator 331 may also be an icon.
• the Bluetooth state control 332 can be used to monitor touch operations (e.g., clicks) acting on the control. In response to this operation, the electronic device 100 may turn on or off the Bluetooth function.
  • the gamepad 200 device options item 323 may be used to listen for touch operations (eg, clicks) acting on the item.
  • the electronic device 100 can establish a Bluetooth wireless communication connection with the gamepad 200 .
  • the user interface 33 may display more device option items, such as cell phone device option items, tablet computer device option items, and the like. This application does not limit this.
  • FIG. 7 exemplarily shows the user interface 40 displayed by the electronic device 100 when the electronic device 100 and the electronic device 200 successfully establish a Bluetooth communication connection.
• the user interface 40 may include a Bluetooth icon 401, a handle icon 402, a handle suspension control 403, and some or all of the same interface elements (such as controls, icons, text content, etc.) displayed in the user interface 30 shown in FIG. 6A.
• the Bluetooth icon 401 is used to prompt the user that the electronic device 100 and the electronic device 200 have established a wireless communication connection through Bluetooth.
  • the handle icon 402 is used to prompt the user that the electronic device 100 and the electronic device 200 have successfully established a communication connection.
  • the handle hovering control 403 can be used to monitor a touch operation acting on the control, and in response to the operation, the electronic device 100 can display a user interface for setting the game handle.
• In some embodiments, the handle suspension control 403 may also display text information, such as "gamepad", etc., which is not limited in this application.
  • the electronic device 100 may display a function debugging interface for the electronic device 200 .
  • the function debugging interface can be used to display one or more function debugging options for the electronic device 200 , and the function debugging options can be used to set and modify parameters related to the electronic device 200 or the electronic device 100 .
  • the electronic device 100 can store the setting and modification of this parameter, so that when the electronic device 100 and the electronic device 200 establish a communication connection again, the parameter setting can be directly used without the user having to manually debug.
  • the parameters that can be used for debugging by the user include but are not limited to the following options: click mode, associated mouse, roulette mode, gesture mode, etc., which are not limited in this application.
  • FIG. 8 , FIG. 9 , FIG. 10 , and FIG. 11 exemplarily show an implementation manner of the electronic device 100 for functional debugging of the electronic device 200 .
• the electronic device 100 may respond to a user's touch operation (e.g., a click) on the gamepad control 313, the gamepad setting item 322, or the handle hovering control 403, and display the user interface 41 shown in FIG. 8.
  • User interface 41 may include a current page indicator 411, a key transparency setting entry 412, a key sensitivity setting entry 413, a combo mode setting entry 414, a save control 415, a cancel control 416, and the like.
  • the current page indicator 411 may be used to indicate the current page, for example, the text information "gamepad” may be used to indicate that the current page is used to display the main interface of the gamepad setting. Not limited to text information, the current page indicator 411 may also be an icon.
• the key transparency setting item 412 may include the corresponding title "Key Transparency", the text information "25", and a key transparency adjustment control. The text information can change according to the operation (e.g., dragging) acting on the key transparency adjustment control: when the control is dragged to the right, the number in the text information may increase; when it is dragged to the left, the number may decrease.
• the key transparency adjustment control in the key transparency setting item 412 can be used to monitor the operation (e.g., dragging) acting on the control, and in response to the operation, the electronic device 100 can display the virtual key icon on the user interface with the corresponding degree of transparency.
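• Binding such a slider to icon transparency is straightforward on Android; the following sketch assumes a 0-100 scale and that a higher value means a more transparent icon, which is an interpretation rather than a statement of this application.

    import android.view.View;
    import android.widget.SeekBar;

    // Sketch: drive the alpha of a virtual key icon from a transparency slider.
    public class TransparencyBinder {
        void bind(SeekBar slider, final View keyIconView) {
            slider.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
                @Override public void onProgressChanged(SeekBar s, int progress,
                                                        boolean fromUser) {
                    keyIconView.setAlpha(1f - progress / 100f); // 100 = fully transparent
                }
                @Override public void onStartTrackingTouch(SeekBar s) { }
                @Override public void onStopTrackingTouch(SeekBar s) { }
            });
        }
    }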
  • the key sensitivity setting item 413 may include the corresponding title "Key Sensitivity", text information "46", and key sensitivity adjustment controls.
• the text information can change according to the operation (e.g., dragging) acting on the key sensitivity adjustment control. For example, when the control is dragged to the right, the number in the text information may increase; when it is dragged to the left, the number may decrease.
  • the key sensitivity adjustment control in the key sensitivity setting item 413 can be used to monitor the operation (eg dragging) acting on the control, and in response to the operation, can be used to adjust the triggering force of the physical key on the electronic device 200 .
  • the combo rate setting entry 414 may include the corresponding title "Combo Rate”, text information "46", and combo rate switch controls 414A and combo rate adjustment controls 414B.
  • the text information can be changed according to the operation (eg dragging) acting on the button sensitivity adjustment control.
  • the operation for the combo rate adjustment control 414B is dragging to the right, the number in the text message may become larger.
  • the operation for the combo rate adjustment control 414B is dragging to the left, the numbers in the text information can be reduced.
  • the combo rate switch control 414A in the combo rate setting item 414 can be used to monitor a touch operation (such as a click) acting on the control, and in response to the operation, the electronic device 100 turns on the combo mode.
  • the combo rate adjustment control 414B in the key sensitivity setting item 413 can be used to monitor an operation (eg, dragging) acting on the control, and in response to the operation, the electronic device 100 can be used for each time the combo mode is triggered interval.
  • the user interface 41 may include more setting items or fewer setting items, and the titles corresponding to the setting items may also be different.
  • the representation of each setting item may include icons and/or text. This application does not limit this.
  • the setting operation may also be performed in other scenarios.
  • FIG. 9 and FIG. 10 illustrate an example of setting operations in a game application scenario.
  • the electronic device 100 displays a user interface 60 (which may also be referred to as a first user interface).
  • the user interface 60 is a user interface of a game application exemplarily shown in the embodiments of the present application.
  • the user interface 60 may include: text information and graphic elements, a game character 601, a direction key area 602, a function key area 603, and a handle floating control 403. Wherein:
  • the text information and graphic elements may include the text information "Breakthrough Game" for prompting the user with the current game name, the text information "Physical Strength: 48" for prompting the user with the current user's data information in the game, and other prompt text information such as "Level 1-9 Dungeon", "Perform Mission", "Mission", "Daoxiang Village: 1440 battle strength suggested for breakthrough", and so on.
  • the game character 601 is the main body operated by the user in the game, and can respond to touch operations (eg, clicks) acting on the direction key controls and/or skill key controls in the user interface 60.
  • the game character 601 can perform a movement action in response to a touch operation on a direction key, and can also cast the corresponding skill in response to a touch operation on a skill key.
  • the directional key area 602 may include an up key icon 602A, a right key icon 602B, a down key icon 602C, and a left key icon 602D.
  • the direction key area 602 may be used to receive touch operations (eg, click operations) performed by the user on the keys in this area.
  • the electronic device 100 may display the game character icon 601 moving in the corresponding direction (eg, up, down, left, right, etc.) in response to the operation. For example, when the electronic device 100 detects a user's touch operation (eg, a long press) on the up key icon 602A in the direction key area 602, the electronic device 100 displays a scene in which the game character icon 601 moves upward in the game scene.
  • the direction key area 602 may include more or fewer direction keys than shown.
  • the direction key area 602 may further include a diagonally upper-right direction key, a diagonally lower-right direction key, and the like.
  • the direction key area 602 may also be a circular icon used to monitor the user's touch operation on the circular icon (for example, long pressing the circular icon and dragging it in any direction); in response to this operation, the electronic device 100 displays a user interface in which the game character icon 601 moves in any direction in the game scene. This application does not limit this.
  • the skill button area 603 may include an equipment button icon 603A, a map button icon 603B, a basic attack button icon 603C, and a big move button icon 603D.
  • the skill button area 603 may be used to receive touch operations (eg, click operations) performed by the user on the keys in this area.
  • the electronic device 100 may display the user interface of the corresponding skill in response to the operation.
  • the skill key area may also include more or fewer skill keys than shown.
  • the skill key area 603 may further include a "setting" skill key, a "collecting equipment" skill key, and the like, which are not limited in this application.
  • the user interface 60 of the game application may also be the scene of another game application.
  • the user interface 60 may be the user interface of the game application described in the application layer of FIG. 4.
  • the user can also obtain the user interface 60 of the game application in other ways. For example, the user can obtain the user interface 60 of the corresponding game application by searching for a certain game applet in a social application that also provides users with various types of applets (such as shopping, games, news, etc.). The applet mentioned above is an application that can be used without downloading; users can experience the applets developed by developers through QR codes, search, and so on. This application does not limit this.
  • FIG. 10 shows a setting main interface displayed by the electronic device 100 in an exemplary game application scenario.
  • the electronic device 100 may display the user interface 61 .
  • the user interface 61 may include a function debugging bar 611 and interface elements (eg, controls, icons, text content, etc.) displayed on the user interface 60 shown in FIG. 9 .
  • the function debugging bar 611 may include a game identification icon 611A, a show/hide icon 611B, a setting icon 611C, and a problem feedback icon 611D.
  • the game identification icon 611A can be used to monitor the user's touch operation acting on the icon, and in response to the operation, the electronic device 100 can identify the direction key area and the skill key area in the user interface 60. Subsequent embodiments will describe the detailed steps of the key recognition provided by the electronic device 100, which are not repeated here.
  • the show/hide icon 611B can be used to monitor the user's touch operation acting on the icon, and in response to the operation, the electronic device 100 can show or hide the virtual key icons in the user interface.
  • the setting icon 611C may be used to monitor the user's touch operation acting on the icon, and in response to the operation, the electronic device 100 may display a user interface for basic general settings of the electronic device 200.
  • the options of the basic general settings may include but are not limited to: icon transparency setting, key transparency setting, combo mode, restoring default settings, saving, returning to the game interface, and so on.
  • the problem feedback icon 611D may be used to monitor the user's touch operation acting on the icon, and in response to the operation, the electronic device 100 may display a user interface for the user to give feedback on related problems.
  • the icons in the aforementioned function debugging bar 611 can also display corresponding text information, such as "start matching", "show/hide", "setting", "reset key matching" and so on.
  • some or all of the icon controls in the function debugging bar 611 may not be displayed on the touch screen of the electronic device 100, but may instead be physical keys provided on the electronic device 200; the electronic device 100 may respond to a user's touch operation (eg, pressing) acting on a physical key of the electronic device 200 by displaying the corresponding function debugging user interface. It can be understood that this application does not limit how the electronic device 100 is triggered to display the corresponding function debugging interface.
  • the electronic device 100 may display the user interface 66 .
  • the user interface 66 may display a general setting window 661, which may include a key transparency setting item 662, a key sensitivity setting item 663, a combo mode setting item 664, a restore default settings control 665, a save control 666, and a return-to-game-interface control 667. Wherein:
  • for the combo mode setting item 664, reference may be made to the combo mode setting item 414 in the user interface 41 of FIG. 8, which is not described here.
  • the restore default settings control 665 can monitor a touch operation acting on the control, and in response to the operation, the electronic device 100 can clear the data of each setting item adjusted by the user in the general setting window and restore each setting item to its default data.
  • the save control 666 can monitor a touch operation acting on the control, and in response to the operation, the electronic device 100 can save the data of each setting item adjusted by the user.
  • the return-to-game-interface control 667 can monitor a touch operation acting on the control, and in response to the operation, the electronic device 100 can display the user interface 60 of the game application.
  • FIG. 12 shows a schematic interface diagram of the electronic device 100 identifying virtual keys in the game interface in one embodiment.
  • in response to a touch operation (eg, a click) acting on the game identification icon 611A, the electronic device 100 may display the user interface 62 when the key areas in the user interface have been identified.
  • the user interface 62 may include the aforementioned interface elements (eg, controls, icons, textual content, etc.) displayed by the user interface 60 shown in FIG. 9 .
  • the user interface 62 displays the highlighted direction key area and skill key area, and may also display prompt information 621, a re-identify icon 622, and an establish-gamepad-key-mapping icon 623. Wherein:
  • the prompt information 621 is used to prompt the user that the key area identification has been completed, and may be the text information "the automatic identification of game keys has been completed". In other embodiments, the prompt information 621 may also be voice information or an icon. This application does not limit this.
  • the re-identify icon 622 may be used to monitor a touch operation (eg, a click) acting on the icon, and in response to the operation, the electronic device 100 re-identifies the direction key area and the skill key area in the user interface.
  • the establish-gamepad-key-mapping icon 623 may be used to monitor a touch operation (eg, a click) acting on the icon, and in response to the operation, the electronic device 100 may establish a key mapping relationship with the electronic device 200.
  • the prompt information 621, the re-identify icon 622, and the establish-gamepad-key-mapping icon 623 are independent of the user interface of the game application; that is to say, the above prompt information and icons are not related to the user interface of the game application, and when the user interface of the game application changes, they do not change accordingly.
  • FIG. 13 shows a schematic interface diagram of an implementation process of establishing a button mapping between the electronic device 100 and the electronic device 200 in an embodiment.
  • the electronic device 100 may display the user interface 63 when the establishment of the button mapping between the electronic device 100 and the electronic device 200 is started.
  • a cursor icon 631 that can be manipulated by the user is displayed in the user interface 63 .
  • the cursor icon 631 can be used to move on the identified key area.
  • the user moves the cursor to the designated virtual key icon, so that the cursor hovers over the virtual key icon.
  • the cursor icon 631 hovers over the identified and highlighted basic attack skill button 625C.
  • in response to a touch operation (eg, pressing) acting on the function key 201E, the electronic device 200 sends a first signal containing the key value of the function key 201E to the electronic device 100.
  • the electronic device 100 establishes a mapping relationship between the basic attack skill button 625C and the function key 201E, and the basic attack skill button 625C is no longer highlighted.
  • the electronic device 100 displays the user interface 64 .
  • the cursor icon 631 moves to the next directional key area 624 marked as highlighted.
  • after detecting a touch operation (eg, pressing) acting on the joystick key 201A of the electronic device 200, the electronic device 100 establishes a mapping relationship between the direction key area 624 and the joystick key 201A, and the direction key area 624 is no longer highlighted.
  • the electronic device 100 displays the user interface 65 in which the cursor icon 631 disappears and all the keys are no longer highlighted.
  • the cursor icon 631 may move to the highlighted key area by itself, or may be moved to the highlighted key area designated by the user by the user, which is not limited in this application.
  • the electronic device 100 may perform legality detection on the key mapping. That is, while the user is in the process of establishing the key mapping relationship, in response to a touch operation (for example, pressing) acting on a physical key on the electronic device 200, the electronic device 200 sends a first signal containing the key value of that physical key to the electronic device 100. When the electronic device 100 receives the first signal containing the key value of the physical key, it detects whether the key value of the physical key has already established a mapping relationship with another virtual key icon on the electronic device 100. If so, prompt information is displayed to remind the user that the physical key has already been mapped to another virtual key icon.
  • the prompt information may be text information, for example, "This key has been mapped, please select again", or may be voice information, which is not limited in this application.
  • the basic attack skill button 625C on the electronic device 100 has established a mapping relationship with the function key 201E on the electronic device 200, and at this time the cursor icon 631 has moved to a key that has not yet been mapped.
  • in response to a touch operation (eg, pressing) acting on the function key 201E on the electronic device 200, the electronic device 200 sends a first signal containing the key value of the function key 201E to the electronic device 100.
  • when the electronic device 100 receives the first signal containing the key value of the function key 201E, it detects that the key value of the function key 201E has already established a mapping relationship with the basic attack skill button 625C, and displays the text prompt information "This key has been mapped, please select again" to remind the user that the physical key has already been mapped to a virtual key icon.
  • the electronic device 100 may also perform reasonableness detection on the key mapping. That is, while the user is in the process of establishing the key mapping relationship, in response to a touch operation (for example, pressing) acting on a physical key on the electronic device 200, the electronic device 200 sends a first signal containing the key value of that physical key to the electronic device 100. When the electronic device 100 receives the first signal containing the key value of the physical key, it detects whether that key value can establish a correct mapping relationship with the selected virtual key icon. If not, prompt information is displayed to prompt the user to make a new selection.
  • the prompt information may be text information, such as "This key cannot be matched with the selected key icon, please select again", or may be voice information, which is not limited in this application.
  • for example, the cursor icon 631 moves to the direction key area 624, for which key mapping has not yet been performed.
  • in response to a touch operation (eg, pressing) acting on the function key 201E, the electronic device 200 sends a first signal containing the key value of the function key 201E to the electronic device 100.
  • when the electronic device 100 receives the first signal containing the key value of the function key 201E, it detects that the key value of the function key 201E cannot establish a correct mapping relationship with the direction key area 624, and displays the text prompt information "This key cannot be matched with the selected key icon, please select again" to prompt the user to re-select a physical key that can be correctly mapped to the direction key area 624 (for example, the joystick key 201A).
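  • the legality and reasonableness checks described above can be summarized with a minimal sketch (in Python; the key names, the can_map rule, and the prompt strings are illustrative assumptions rather than the patent's actual implementation):

    # Hypothetical sketch of the legality / reasonableness checks.
    mapping = {}  # physical key value -> virtual key icon

    def can_map(physical_key: str, virtual_key: str) -> bool:
        # Assumed reasonableness rule: only a joystick-type physical key
        # may be mapped to a direction key area.
        if virtual_key.startswith("direction"):
            return physical_key.startswith("joystick")
        return True

    def try_map(physical_key: str, virtual_key: str) -> str:
        if physical_key in mapping:                 # legality detection
            return "This key has been mapped, please select again"
        if not can_map(physical_key, virtual_key):  # reasonableness detection
            return "This key cannot be matched with the selected key icon, please select again"
        mapping[physical_key] = virtual_key
        return "mapped"

    print(try_map("function_201E", "basic_attack_625C"))  # mapped
    print(try_map("function_201E", "direction_624"))      # already mapped
    print(try_map("function_201F", "direction_624"))      # not reasonable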
  • FIG. 14A and FIG. 14B are schematic diagrams of the effect interfaces generated when the gamepad controls the application buttons in the game application after the electronic device 100 and the electronic device 200 have successfully established the key mapping.
  • the electronic device 100 can display a new user interface (also referred to as the second user interface), which may be partially or completely different from the user interface displayed by the electronic device 100 before the physical key on the electronic device 200 was touched; for example, the user interface is refreshed by a jump of the game application interface, the movement of a character in the game application interface, a change of the game scene, and so on.
  • FIG. 14A and FIG. 14B exemplarily show the above-mentioned technical effect interface of the present application.
  • a mapping relationship has been established between the function key 201E in the electronic device 200 and the "basic attack" button icon 625C on the electronic device 100.
  • when the electronic device 100 detects a touch operation (eg, pressing) of the user's finger 712 on the function key 201E of the electronic device 200, the electronic device 100 generates a touch event of the "basic attack" button icon 625C, and the electronic device 100 displays the user interface 71 in which the game character icon 601 casts the basic attack skill.
  • a mapping relationship is established between the joystick button 201A in the electronic device 200 and the direction button area 624 on the electronic device 100 .
  • when the electronic device 100 detects a touch operation of the user's finger 732 acting on the joystick key 201A of the electronic device 200 (eg, pushing it to the right), the electronic device 100 generates a touch event of the right direction key 624C in the direction key area 624, and the electronic device 100 displays the user interface 73 in which the game character icon 601 moves to the right in the game scene.
  • FIG. 15 shows a flowchart of a key mapping method provided by the present application.
  • the present application takes the electronic device 100 as a mobile phone and the electronic device 200 as a gamepad as an example to describe the method in detail, and the method may include:
  • the mobile phone and the gamepad may have one or more of a Bluetooth (BT) module and a WLAN module.
  • the Bluetooth (BT) module can provide a solution including one or more Bluetooth communications in classic Bluetooth (Bluetooth 2.1) or Bluetooth Low Energy (BLE).
  • the WLAN module can provide a solution including one or more WLAN communications in Wi-Fi direct, Wi-Fi LAN or Wi-Fi softAP.
  • the mobile phone may establish a first connection with the gamepad using one or more wireless communication technologies in Bluetooth or WLAN.
  • the mobile phone displays a first user interface.
  • the first user interface may include a plurality of game buttons (also referred to as game controls), and the plurality of game buttons may include a first game button (also referred to as a first control).
  • the user interface may include a basic attack skill button 603C.
  • the mobile phone detects a first user operation.
  • the mobile phone detects a first user operation that the user requests to establish a game button mapping, such as a touch operation (eg, click) performed by the user on the game identification icon 611A as shown in FIG. 12 .
  • the mobile phone scans the first user interface, and identifies a plurality of game buttons in the first user interface through the first image processing algorithm.
  • the key area in the game screen is separate from the game screen; that is to say, the key area does not change as the game screen changes, and the color contrast between the key area and the surrounding game screen is large. Therefore, the mobile phone can identify the key areas in the game screen through the first image processing algorithm. Then, in order to improve the recognition accuracy of the key areas, the mobile phone can correct the first image processing algorithm in combination with the acquired touch frequency of the user on the game application screen, so as to find the key areas in the game screen accurately and effectively.
  • the first image processing algorithm may be an edge detection algorithm. How the edge detection algorithm recognizes the game buttons will be described later, and will not be repeated here.
  • the edge detection algorithm is only an example, and the present application does not impose special restrictions on the first image processing algorithm, and other methods that can realize the identification of game keys are acceptable.
  • a corresponding virtual button can be generated at the position of each game button.
  • for example, the basic attack skill button 625C displays the first cursor 631.
  • the first virtual key may also be displayed in a highlighted state, a blinking state, etc., as long as the user can recognize it as being in a to-be-selected state.
  • the gamepad detects that the user presses the first physical button.
  • for example, the gamepad detects that the user presses the first physical key, namely the function key 201E.
  • the gamepad sends a first signal that the first physical key is pressed.
  • the first signal may carry the identifier of the first physical key.
  • the mobile phone can obtain the position information of the basic attack skill button 625C over which the cursor icon 631 hovers, as well as the identifier or key value of the first physical key (the function key 201E), and form a mapping relationship between the two.
  • the other virtual buttons on the mobile phone and the physical buttons on the gamepad also establish a mapping relationship one by one.
  • the first cursor can be moved from the first game button to the second game button.
  • the present application does not limit the order in which the cursor moves when the buttons are matched, and the cursor can move to any game button that has not yet been mapped.
  • a first mapping table may be generated and stored in the mobile phone; the first mapping table records the mapping relationships between the game buttons in the mobile phone interface and the physical keys of the gamepad.
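  • as a minimal sketch, the first mapping table might be represented as follows (Python; the identifiers and screen coordinates are illustrative assumptions):

    # Hypothetical first mapping table: gamepad physical-key identifiers
    # mapped to virtual game buttons and their on-screen positions.
    first_mapping_table = {
        "function_201E": {"button": "basic_attack_625C", "pos": (1650, 880)},
        "joystick_201A": {"button": "direction_area_624", "pos": (300, 850)},
    }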
  • when the mapping relationships between all the game buttons on the mobile phone and the physical keys on the gamepad have been established, the mobile phone returns to displaying the game interface.
  • the gamepad detects that the user presses the first physical button.
  • for example, the gamepad detects that the user presses the first physical key, namely the function key 201E.
  • the gamepad sends a first signal that the first physical key is pressed to the mobile phone.
  • the mobile phone receives a signal that the first physical button is pressed, and triggers the click of the first game button according to the first mapping relationship.
  • the mobile phone displays a second user interface, where the second user interface is a user interface displayed when the game function corresponding to the first game button is triggered.
  • for example, the mobile phone receives the signal that the first physical key (the function key 201E) is pressed, and triggers a click event of the basic attack function button 625C.
  • the user interface 71 displayed when the corresponding basic attack skill is triggered is then displayed.
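  • a minimal sketch of this dispatch step (Python; inject_tap stands in for whatever touch-event injection mechanism the system actually provides and is purely an assumed placeholder):

    # Hypothetical dispatch: on receiving the first signal, look up the
    # mapped virtual button and inject a tap at its position.
    first_mapping_table = {
        "function_201E": {"button": "basic_attack_625C", "pos": (1650, 880)},
    }

    def inject_tap(pos):
        print(f"tap injected at {pos}")  # placeholder for real event injection

    def on_first_signal(key_value: str):
        entry = first_mapping_table.get(key_value)
        if entry is not None:
            inject_tap(entry["pos"])  # triggers the mapped game function

    on_first_signal("function_201E")  # -> tap injected at (1650, 880)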
  • the electronic device 100 may also set a first duration threshold: when the cursor hovers over a virtual key icon for longer than the first duration threshold, the electronic device 100 confirms that this virtual key icon is the virtual key icon that needs to be adapted, and obtains the position information of the virtual key icon. It can be understood that this application does not limit how the electronic device 100 confirms which virtual key icon in the game application screen is the virtual key icon that needs to be adapted.
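  • the first-duration-threshold idea can be sketched as follows (Python; the 0.5 s value is an assumption for illustration):

    import time

    FIRST_DURATION_THRESHOLD = 0.5  # seconds; assumed value

    def hover_confirmed(hover_started_at: float) -> bool:
        # The hovered virtual key icon is confirmed for adaptation only
        # once the cursor has stayed on it longer than the threshold.
        return time.monotonic() - hover_started_at >= FIRST_DURATION_THRESHOLD

    t0 = time.monotonic()
    time.sleep(0.6)
    print(hover_confirmed(t0))  # True: the icon is treated as selected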
  • the steps of identifying the key area in the game application screen by the electronic device 100 will be described by taking the edge detection algorithm as an example. As shown in Figure 16, the method steps may include:
  • the electronic device 100 performs grayscale processing on the first image to obtain a second image.
  • the first image is an image converted from the first user interface, and the first user interface is a game interface.
  • the first image is a color image
  • the second image is a gray image obtained by performing grayscale processing on the first image.
  • the current mainstream standard image representation is a 24-bit mode, that is, an RGB value encoded with 24 bits per pixel (bits per pixel, BPP). It uses three 8-bit unsigned integers (0 to 255) to represent the intensities of red, green and blue.
  • the 24-bit mode is used for "true color" and for common color interchange in image file formats such as the Joint Photographic Experts Group (JPEG) image format or the Tag Image File Format (TIFF) image file format. It can produce 16 million color combinations, many of which are indistinguishable to the human eye.
  • the electronic device 100 first stores the red R (8 bits), then the green G (8 bits), and finally the blue B (8 bits), for a total of 24 bits with 256 gradations per color, stored interleaved in the file as RGBRGBRGB....
  • JPEG is an international image compression standard.
  • the JPEG image compression algorithm can provide good compression performance and good reconstruction quality, and is widely used in the field of image and video processing.
  • TIFF is a flexible bitmap format primarily used to store images including photographs and artwork.
  • the grayscale image has only an 8-bit image depth, so in image processing a grayscale image requires less computation than a color image. Although some color levels are lost, the grayscale second image still reflects the overall and local distribution characteristics of the chrominance and luminance levels of the entire image in the same way as the color first image.
  • the electronic device 100 may perform grayscale processing on the first image. According to the importance of the R, G and B components and other indicators, the three different components are weighted and averaged. Since the human eye has the highest sensitivity to green and the lowest sensitivity to blue, a reasonable grayscale image can be obtained by weighting the three RGB components according to formula 1 (a commonly used weighting is Gray = 0.299·R + 0.587·G + 0.114·B). The electronic device 100 can obtain the grayscale value of each pixel in the first image according to formula 1, so as to obtain the second image after grayscale processing.
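  • a minimal sketch of this grayscale step (Python with NumPy; the 0.299/0.587/0.114 weights are the commonly used luminance coefficients and are an assumption in place of the patent's exact formula 1):

    import numpy as np

    def to_grayscale(rgb: np.ndarray) -> np.ndarray:
        # rgb: H x W x 3 array of 8-bit unsigned integers (24-bit mode)
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        gray = 0.299 * r + 0.587 * g + 0.114 * b  # green weighted highest
        return gray.astype(np.uint8)

    frame = np.random.randint(0, 256, (4, 4, 3), dtype=np.uint8)
    print(to_grayscale(frame).shape)  # (4, 4): a single 8-bit channel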
  • the electronic device 100 performs Gaussian filtering on the second image.
  • performing Gaussian filtering on the second image means performing a weighted average over the grayscale values of the second image; that is to say, the grayscale value of each pixel in the second image is replaced by a weighted average of its own value and the other grayscale values in its neighborhood, finally yielding the grayscale value of the pixel after Gaussian filtering.
  • Gaussian filtering can be divided into two steps: 1. Obtain a Gaussian template (that is, a weight template). 2. Perform the weighted average.
  • since the blur radius in the exemplary embodiment of this application is 1, when the Gaussian filter calculation is performed on a pixel, only the surrounding 8 pixels need to be included in the weighted average.
  • for example, the grayscale values of the 9 pixels are shown in Table 4.
  • the grayscale value range is 0-255.
  • the pixel in the center of Table 4 is the pixel to be processed this time.
  • by repeating the foregoing process for each pixel in the second image, the electronic device 100 obtains the Gaussian-filtered image.
  • x and y in the two-dimensional Gaussian function may also have other values, that is to say, the weight matrix finally obtained may also be other values, which are not limited in this application.
  • in other embodiments, the first image may skip grayscale processing, and the electronic device 100 may perform Gaussian filtering directly on the first image. That is to say, the electronic device 100 may perform Gaussian filtering separately on each of the three RGB channels of the first image.
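  • a minimal sketch of the radius-1 Gaussian filtering described above (Python with NumPy; the sigma value is an assumption, and image borders are left unfiltered for brevity):

    import numpy as np

    def gaussian_kernel_3x3(sigma: float = 1.0) -> np.ndarray:
        ax = np.array([-1.0, 0.0, 1.0])
        xx, yy = np.meshgrid(ax, ax)
        k = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
        return k / k.sum()  # normalize so the 9 weights sum to 1

    def gaussian_filter(gray: np.ndarray, sigma: float = 1.0) -> np.ndarray:
        k = gaussian_kernel_3x3(sigma)
        h, w = gray.shape
        out = gray.astype(float).copy()
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                # weighted average of the pixel and its 8 neighbours
                out[y, x] = np.sum(gray[y - 1:y + 2, x - 1:x + 2] * k)
        return out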
  • the electronic device 100 acquires the user touch frequency.
  • the electronic device 100 may obtain first count data of touch operations (eg clicks) performed by the user on the screen interface of the electronic device 100 in the game application scenario.
  • acquiring the count of touch operations (eg, clicks) performed by the user on the screen interface of the electronic device 100 in the game application scenario may be performed by the gamepad application.
  • the gamepad application can obtain the first count data of the user's touch operations (eg clicks) on the screen interface of the electronic device 100 in the game application scenario from the internal storage space of the electronic device 100.
  • the first count data may be historical count data of touch operations (such as clicks) performed by the user on the screen interface of the electronic device 100 in the game application scenario; that is to say, the first count data may be the count of such touch operations recorded before the gamepad application performs a key recognition operation or a key matching operation.
  • the electronic device 100 may acquire a user's touch frequency for an application that needs to perform key matching when the electronic device 100 establishes a communication connection with the electronic device 200 .
  • the electronic device 100 may also acquire the user's touch frequency for the application that needs key matching after performing grayscale processing on the first image. That is to say, the step in which the electronic device 100 obtains the user's touch frequency for the application that needs key matching only needs to be completed before the electronic device 100 uses the linear interpolation formula in the edge detection processing; this application does not limit this.
  • the electronic device 100 performs edge detection on the second image in combination with the obtained user touch frequency.
  • the electronic device 100 uses an edge detection algorithm to perform edge detection on the second image.
  • the edge detection operator of the edge detection algorithm may use the Sobel edge difference operator or other operators.
  • the Sobel edge difference operator can detect the edge according to the phenomenon that the gray-scale weighted difference of the upper and lower, left and right adjacent points of the pixel reaches the extreme value at the edge.
  • the Sobel edge difference operator calculates the difference Gx in the horizontal direction and the difference Gy in the vertical direction, from which the gradient magnitude (also called gradient strength) G and the direction θ of the pixel can be determined as G = sqrt(Gx^2 + Gy^2) and θ = arctan(Gy / Gx), where G is the gradient strength (also known as the gradient value), θ is the direction, and arctan is the arc tangent function.
  • when the gradient strength G exceeds a given threshold, the pixel is considered an edge point.
  • Sx represents the Sobel operator in the x direction, which is used to detect edges in the y direction; Sy represents the Sobel operator in the y direction, which is used to detect edges in the x direction (the edge direction is perpendicular to the gradient direction).
  • the gradient values of the first pixel e in the x and y directions can be written as Gx = sum(Sx * E) and Gy = sum(Sy * E), where E is the 3×3 neighborhood of pixels centered on e, * is the convolution symbol, and sum represents adding together all the elements of the resulting matrix.
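  • a minimal sketch of the Sobel step (Python with NumPy; borders are skipped, and the sign convention of the kernels does not affect the magnitude G):

    import numpy as np

    SX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)  # x direction
    SY = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], dtype=float)  # y direction

    def sobel(gray: np.ndarray):
        h, w = gray.shape
        G = np.zeros((h, w))       # gradient strength
        theta = np.zeros((h, w))   # gradient direction
        g = gray.astype(float)
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                win = g[y - 1:y + 2, x - 1:x + 2]  # 3x3 neighbourhood E
                gx = np.sum(SX * win)  # elementwise product, then summed
                gy = np.sum(SY * win)
                G[y, x] = np.hypot(gx, gy)        # sqrt(Gx^2 + Gy^2)
                theta[y, x] = np.arctan2(gy, gx)  # direction of the gradient
        return G, theta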
  • the edges extracted based on the gradient values alone are still very blurred. Therefore, edge refinement needs to be performed on the calculated gradient edges; that is to say, the local maximum gradient value is kept, and all gradient values other than the local maximum are suppressed to 0.
  • this part of the algorithm is divided into two steps: 1. Compare the gradient strength of the current pixel with that of several pixels along the positive and negative gradient directions. 2. If the gradient strength of the current pixel is the largest compared with the other pixels, the pixel is kept as an edge point; otherwise, the pixel is suppressed. Usually, for a more accurate calculation, linear interpolation between adjacent pixels across the gradient direction is used to obtain the gradient strengths to be compared.
  • in the interpolation, distance(S1, S2) represents the distance between two points S1 and S2, and the weight coefficient w can be calculated from the gradient direction.
  • in addition, a weight m is applied to areas with a high touch frequency: n is the number of times the current point S is clicked by the user per minute and serves as the correction value for the gradient strength of this area, which is then compared with the gradient strengths of the other points Q1, Q2, Q3 and Q4.
  • if the gradient strength of the current pixel S is the largest compared with the gradient strengths of the other points Q1, Q2, Q3 and Q4 in the same direction, its value is kept; otherwise, the gradient of the current pixel S is suppressed, that is, set to 0.
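  • a minimal sketch of the non-maximum suppression with the touch-frequency correction (Python with NumPy; the interpolation across the gradient direction is simplified here to the two nearest pixels, and the weight m is an assumed constant):

    import numpy as np

    def nms_with_touch(G, theta, clicks_per_min, m=0.1):
        h, w = G.shape
        out = np.zeros_like(G)
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                d = theta[y, x]
                # neighbours along the +/- gradient direction
                dx, dy = int(round(np.cos(d))), int(round(np.sin(d)))
                q1, q2 = G[y + dy, x + dx], G[y - dy, x - dx]
                # touch-frequency correction: m * n boosts frequently
                # clicked areas before the comparison
                s = G[y, x] + m * clicks_per_min[y, x]
                out[y, x] = G[y, x] if s >= max(q1, q2) else 0.0
        return out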
  • after refining the gradient edges as described above, the remaining pixels can represent the actual edges in the image more accurately. However, some edge pixels caused by noise and color changes still remain in the image. To address these spurious responses, edge pixels with weak gradient values must be filtered out and edge pixels with high gradient values preserved, which can be achieved by choosing high and low thresholds. A histogram of the gradient strengths of all pixels in the whole image is computed; the gradient strength below which 75% of the pixels in the histogram fall is selected as the high threshold (also called the first threshold), and the gradient strength corresponding to 25% of the histogram total is the low threshold. If the gradient value of a pixel is higher than the high threshold, the pixel is kept; if the gradient value of a pixel is less than the low threshold, the pixel is excluded.
  • the values of the high threshold and the low threshold may also be selected by other methods, which are not limited in this application.
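  • a minimal sketch of selecting the two thresholds from the gradient-strength histogram (Python with NumPy):

    import numpy as np

    def pick_thresholds(G: np.ndarray):
        high = np.percentile(G, 75)  # high threshold (first threshold)
        low = np.percentile(G, 25)   # low threshold
        return high, low

    def apply_thresholds(G, high, low):
        kept = G >= high   # strong edge pixels are preserved
        dropped = G < low  # weak edge pixels are excluded
        return kept, dropped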
  • the electronic device 100 acquires multiple frames of images, and repeats the above steps for each frame of images.
  • the electronic device 100 may acquire a group of images that are continuous on the time axis, and repeat Gaussian filtering and edge detection for each frame of images in the group. In other embodiments, the electronic device 100 may also acquire a set of images according to a certain time interval. It can be understood that this application does not limit how the electronic device 100 acquires multiple frames of images.
  • the electronic device 100 compares the results of multiple sets of images, and acquires the edge of the key area.
  • the electronic device 100 may obtain the output results of Gaussian filtering and edge detection performed on multiple sets of images, and compare the results.
  • the electronic device 100 can obtain the repeated identification positions after comparing the output results of multiple sets of images, and the repeated identification positions are the key regions that the electronic device 100 needs to obtain.
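  • a minimal sketch of comparing edge maps across frames to find the static key regions (Python with NumPy; the 0.9 stability ratio is an assumed value):

    import numpy as np

    def stable_key_edges(edge_maps, min_ratio=0.9):
        # edge_maps: list of boolean H x W arrays, one per frame; positions
        # identified as edges in (nearly) every frame are the key regions,
        # since the key areas do not move with the game scene.
        votes = np.sum(np.stack(edge_maps), axis=0)
        return votes >= min_ratio * len(edge_maps)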
  • the term "when" may be interpreted to mean "if", "after", "in response to determining..." or "in response to detecting...", depending on the context.
  • similarly, the phrases "when determining..." or "if (the stated condition or event) is detected" can be interpreted to mean "if it is determined...", "in response to determining...", "when (the stated condition or event) is detected" or "in response to detecting (the stated condition or event)".
  • the above-mentioned embodiments may be implemented in whole or in part by software, hardware, firmware or any combination thereof.
  • when implemented by software, they may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, all or part of the processes or functions described in the embodiments of the present application are generated.
  • the computer may be a general purpose computer, special purpose computer, computer network, or other programmable device.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from a website, computer, server or data center to another website, computer, server or data center by wire (eg, coaxial cable, optical fiber, digital subscriber line) or wirelessly (eg, infrared, radio, microwave, etc.).
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that includes an integration of one or more available media.
  • the usable media may be magnetic media (eg, floppy disks, hard disks, magnetic tapes), optical media (eg, DVDs), or semiconductor media (eg, solid state drives), and the like.
  • the process can be completed by a computer program instructing the relevant hardware, and the program can be stored in a computer-readable storage medium.
  • when the program is executed, it may include the processes of the foregoing method embodiments.
  • the aforementioned storage medium includes: a ROM or random access memory (RAM), a magnetic disk, an optical disk, or other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of this application provide a key mapping method, electronic device and system. According to the method provided by this application, an electronic device can identify the key areas in a game application through an image processing algorithm. In addition, in order to improve the accuracy of key area identification, the electronic device can also correct the image processing algorithm in combination with the user's touch frequency on the game screen. After all key areas are identified, the user can match all the virtual keys in the game application with the physical keys of a gamepad one by one to establish a mapping relationship; the user can then trigger the functions corresponding to the virtual keys in the game application, such as character movement, skill casting and scene switching, by operating the physical keys on the gamepad. Implementing the method provided by this application simplifies the steps for the user to match gamepad keys with game application keys, improves the efficiency and accuracy of key matching, and improves the user experience.

Description

Key mapping method, electronic device and system
This application claims priority to the Chinese patent application filed with the Chinese Patent Office on November 5, 2020, with application number 202011223709.X and entitled "Key mapping method, electronic device and system", the entire contents of which are incorporated into this application by reference.
Technical Field
The present invention relates to the field of terminal technologies, and in particular to a key mapping method, electronic device and system.
Background
At present, mobile phones can generally be connected to peripheral handle (gamepad) devices, so that users can use the peripheral handle to operate applications with keys on the phone (such as game applications). In the prior art, to adapt the phone to the peripheral handle, a wireless connection can first be established between the phone and the peripheral handle via Bluetooth. Then, the phone screen can display a key library interface of the peripheral handle, which may include multiple key icons. The user needs to select a key icon and manually drag the selected key icon to an appropriate position on the phone screen. After the key icon hovers at the designated position, the user releases the finger, thereby completing the adaptation between the phone and the peripheral handle.
It can be seen that, in the above prior art, the user needs to drag key icons multiple times, and also needs to drag each key icon to a designated position in the application. If the position at which the key icon hovers is inaccurate, the key mapping relationship between the phone and the peripheral handle may fail to be established. Therefore, the operation steps of this prior art are cumbersome and very inconvenient for the user, the key mapping matching efficiency is low, and the user experience is poor.
Summary of the Invention
The purpose of this application is to provide a key mapping method, electronic device and system, which can make the process of establishing key mapping between a first electronic device and a second electronic device more intuitive, simple and effective, greatly improve the efficiency of key mapping, simplify the user's operation steps, and improve the user experience.
The above and other objectives are achieved by the features of the independent claims. Further implementations are embodied in the dependent claims, the description and the drawings.
In a first aspect, this application provides a key mapping method, which may include: a first electronic device may establish a first connection with a second electronic device; the first electronic device may then display a first user interface, in which multiple controls may be displayed, and the multiple controls may include a first control. The first electronic device detects a first user operation (for example, the user taps an icon with a finger); in response to the first user operation, the first electronic device may identify the multiple controls in the first user interface, and may select the first control among the identified controls. Then, in response to a first signal received while the first control is in the selected state, the first electronic device establishes a mapping relationship between a first physical key and the first control.
In this application, the second electronic device may be a gamepad. The gamepad has one or more physical keys, and also has a Bluetooth (BT) module and/or a wireless local area networks (WLAN) module. The Bluetooth (BT) module can provide a solution including one or more Bluetooth communications in classic Bluetooth (the Bluetooth 2.1 standard) or Bluetooth Low Energy (BLE). The WLAN module can provide a solution including one or more WLAN communications in wireless fidelity direct (Wi-Fi direct), wireless fidelity local area networks (Wi-Fi LAN) or wireless fidelity software access point (Wi-Fi softAP).
In this application, the first user interface may be a user interface of a game application, which is an application on electronic devices such as smartphones and tablet computers for user entertainment; this application does not limit the name of the application. That is to say, the game application may be any game application on the market that the user can obtain and operate, and this application does not limit this.
Implementing the method of the first aspect enables the first electronic device to identify the controls in the first user interface more efficiently and quickly and to establish mapping relationships with the physical keys on the second electronic device, which improves the efficiency of key mapping, improves the user experience, and also makes the key mapping more precise.
With reference to the first aspect, in some embodiments, the first connection may be a wireless connection established between the first electronic device and the second electronic device through one or more wireless communication technologies among Bluetooth, Wi-Fi direct or Wi-Fi softAP, or may be a wired connection established through a universal serial bus (USB). After the first electronic device establishes the communication connection with the second electronic device, the first electronic device may send data information to the second electronic device and/or receive data information from the second electronic device through one or more communication technologies among USB, Bluetooth, Wi-Fi direct or Wi-Fi softAP.
With reference to the first aspect, in some embodiments, after establishing the mapping relationship between the first physical key of the second electronic device and the first control, the first electronic device may select a second control among the multiple controls. Then, the first electronic device may receive, through the first connection, a second signal sent by the second electronic device, and in response to the second signal received while the second control is in the selected state, establish a mapping relationship between the second physical key of the second electronic device and the second control. The second signal is generated by the second electronic device when the second physical key is pressed by the user.
With reference to the first aspect, in some embodiments, after the mapping relationships between the multiple controls on the first electronic device and the multiple physical keys on the second electronic device are established, the first electronic device executes the function corresponding to the first control upon receiving the first signal through the first connection.
With reference to the first aspect, in some embodiments, the first electronic device performs grayscale processing on the first user interface to obtain a first image, and then the first electronic device may identify, from the first image through an edge detection algorithm, the boundaries of the controls contained in the first user interface.
With reference to the first aspect, in some embodiments, the multiple controls in the first user interface may include the first control, the first control may include multiple boundary pixels, and the multiple boundary pixels may include a first pixel; the higher the frequency at which the touch position corresponding to the first pixel in the first user interface is touched by the user, the higher the probability that the first pixel is identified as a boundary pixel of the first control.
With reference to the first aspect, in some embodiments, when the first electronic device identifies the boundary of the first control in the first user interface from the first image through the edge detection method, the first electronic device may use an edge operator to calculate the gradient vector of each pixel in the first image. Then, the first electronic device may use linear interpolation to compare the gradient value of the first gradient vector of the first pixel with the gradient values of other pixels in the same direction as the first gradient vector of the first pixel; if the gradient value of the first pixel is the largest in said direction, the first electronic device may keep the gradient value of the first pixel and set the gradient values of the other pixels to zero. Then, the first electronic device may set a first threshold; if the gradient value of the first pixel is greater than the first threshold, the first pixel is retained, and all retained pixels constitute the boundary of the first control in the first user interface.
With reference to the first aspect, in some embodiments, when the first electronic device executes the function corresponding to the first control, the first electronic device displays a second user interface. The second user interface may be partially or entirely different from the aforementioned first user interface. Specifically, the second user interface may be a jump of the game application interface, the movement of a character in the game application interface, a change of the game scene, and so on.
With reference to the first aspect, in some embodiments, when the first control in the first user interface is in the selected state, the first control may be highlighted, a cursor may be displayed in the area of the first control, and/or the first control may be in a blinking state; that is to say, this application does not limit how the user is prompted that the first control is in the selected state.
With reference to the first aspect, in some embodiments, after displaying the first user interface, the first electronic device may display a first floating control on the first user interface. The first electronic device may detect a first user operation (such as a click) acting on the first floating control.
In a second aspect, embodiments of this application provide a communication method applied to a communication system, where the communication system includes a first electronic device and a second electronic device. The first electronic device may establish a first connection with the second electronic device. The first electronic device may display a first user interface, which may contain multiple controls including a first control. The first electronic device may detect a first user operation and, in response to the first user operation, identify the multiple controls in the first user interface. Then, the first electronic device may select the first control among the multiple controls, while the second electronic device may detect that a first physical key is pressed by the user, generate a first signal, and send the first signal to the first electronic device through the first connection. Then, in response to the first signal received while the first control is in the selected state, the first electronic device may establish a mapping relationship between the first physical key and the first control.
Implementing the method of the second aspect enables the first electronic device to identify the controls in the first user interface more efficiently and quickly and to establish mapping relationships with the physical keys on the second electronic device, which improves the efficiency of key mapping, improves the user experience, and also makes the key mapping more precise.
With reference to the second aspect, in some embodiments, after the mapping relationship between the first physical key and the first control is established, the first electronic device may select a second control among the multiple controls; then, the second electronic device may detect that a second physical key is pressed by the user, generate a second signal, and send the second signal to the first electronic device through the first connection. Then, in response to the second signal received while the second control is in the selected state, the first electronic device may establish a mapping relationship between the second physical key and the second control.
With reference to the second aspect, in some embodiments, the second electronic device in the communication system may be a gamepad. For a description of the gamepad, reference may be made to the description of the second electronic device provided in the first aspect, which is not repeated here.
With reference to the second aspect, in some embodiments, after the mapping relationships between the multiple controls on the first electronic device and the multiple physical keys on the second electronic device are established, the first electronic device executes the function corresponding to the first control upon receiving the first signal through the first connection.
With reference to the second aspect, in some embodiments, the first electronic device performs grayscale processing on the first user interface to obtain a first image, and then the first electronic device may identify, from the first image through an edge detection algorithm, the boundaries of the controls contained in the first user interface.
With reference to the second aspect, in some embodiments, the multiple controls in the first user interface may include the first control, the first control may include multiple boundary pixels, and the multiple boundary pixels may include a first pixel; the higher the frequency at which the touch position corresponding to the first pixel in the first user interface is touched by the user, the higher the probability that the first pixel is identified as a boundary pixel of the first control.
With reference to the second aspect, in some embodiments, when the first electronic device identifies the boundary of the first control in the first user interface from the first image through the edge detection method, the first electronic device may use an edge operator to calculate the gradient vector of each pixel in the first image. Then, the first electronic device may use linear interpolation to compare the gradient value of the first gradient vector of the first pixel with the gradient values of other pixels in the same direction as the first gradient vector of the first pixel; if the gradient value of the first pixel is the largest in said direction, the first electronic device may keep the gradient value of the first pixel and set the gradient values of the other pixels to zero. Then, the first electronic device may set a first threshold; if the gradient value of the first pixel is greater than the first threshold, the first pixel is retained, and all retained pixels constitute the boundary of the first control in the first user interface.
With reference to the second aspect, in some embodiments, when the first electronic device executes the function corresponding to the first control, the first electronic device displays a second user interface. The second user interface may be partially or entirely different from the aforementioned first user interface. Specifically, the second user interface may be a jump of the game application interface, the movement of a character in the game application interface, a change of the game scene, and so on.
With reference to the second aspect, in some embodiments, when the first control in the first user interface is in the selected state, the first control may be highlighted, a cursor may be displayed in the area of the first control, and/or the first control may be in a blinking state; that is to say, this application does not limit how the user is prompted that the first control is in the selected state. With reference to the second aspect, in some embodiments, after displaying the first user interface, the first electronic device may display a first floating control on the first user interface. The first electronic device may detect a first user operation (such as a click) acting on the first floating control.
In a third aspect, embodiments of this application provide an electronic device, which may include: a communication apparatus, a touch screen, a memory, and a processor coupled to the memory, where computer-executable instructions are stored in the memory. The communication apparatus may be configured to establish a first connection with a second electronic device. The touch screen may be configured to display a first user interface, which may contain multiple controls including a first control. The touch screen may also be configured to detect a first user operation. The processor may be configured to identify the multiple controls in the first user interface and may also be configured to select the first control among the multiple controls. The communication apparatus may also be configured to receive, through the first connection, a first signal sent by the second electronic device. The processor may also be configured to establish, in response to the first signal received while the first control is in the selected state, a mapping relationship between a first physical key in the second electronic device and the first control, where the first signal is generated by the second electronic device when the first physical key is pressed by the user.
With reference to the third aspect, in some embodiments, the processor may also be configured to select a second control among the multiple controls after establishing the mapping relationship between the first physical key of the second electronic device and the first control. The communication apparatus may also be configured to receive, through the first connection, a second signal sent by the second electronic device. The processor may also be configured to establish, in response to the second signal received while the second control is in the selected state, a mapping relationship between a second physical key of the second electronic device and the second control; the second signal is generated by the second electronic device when the second physical key is pressed by the user.
With reference to the third aspect, in some embodiments, the second electronic device may be a gamepad. For a description of the gamepad, reference may be made to the description of the second electronic device provided in the first aspect, which is not repeated here.
With reference to the third aspect, in some embodiments, the processor may also be configured to execute the function corresponding to the first control when the first signal is received through the first connection after the mapping relationships between the multiple controls and the multiple physical keys of the second electronic device are established.
With reference to the third aspect, in some embodiments, the processor may be specifically configured to perform grayscale processing on the first user interface to obtain a first image, and then the processor may identify, from the first image through an edge detection algorithm, the boundaries of the controls contained in the first user interface.
With reference to the third aspect, in some embodiments, the multiple controls in the first user interface may include the first control, the first control may include multiple boundary pixels, and the multiple boundary pixels may include a first pixel; the higher the frequency at which the touch position corresponding to the first pixel in the first user interface is touched by the user, the higher the probability that the first pixel is identified as a boundary pixel of the first control.
With reference to the third aspect, in some embodiments, the processor may also be specifically configured to: when identifying the boundary of the first control in the first user interface from the first image through the edge detection method, use an edge operator to calculate the gradient vector of each pixel in the first image. Then, the processor may use linear interpolation to compare the gradient value of the first gradient vector of the first pixel with the gradient values of other pixels in the same direction as the first gradient vector of the first pixel; if the gradient value of the first pixel is the largest in said direction, the processor may keep the gradient value of the first pixel and set the gradient values of the other pixels to zero. Then, the processor may set a first threshold; if the gradient value of the first pixel is greater than the first threshold, the first pixel is retained, and all retained pixels constitute the boundary of the first control in the first user interface.
With reference to the third aspect, in some embodiments, when the processor executes the function corresponding to the first control, the processor displays a second user interface. The second user interface may be partially or entirely different from the aforementioned first user interface. Specifically, the second user interface may be a jump of the game application interface, the movement of a character in the game application interface, a change of the game scene, and so on.
With reference to the third aspect, in some embodiments, when the first control in the first user interface is in the selected state, the first control may be highlighted, a cursor may be displayed in the area of the first control, and/or the first control may be in a blinking state; that is to say, this application does not limit how the user is prompted that the first control is in the selected state.
With reference to the third aspect, in some embodiments, after the touch screen displays the first user interface, a first floating control may be displayed on the first user interface. The touch screen may be specifically configured to detect a first user operation (such as a click) acting on the first floating control.
In a fourth aspect, embodiments of the present invention provide a computer storage medium, in which a computer program is stored; the computer program includes executable instructions that, when executed by a processor, cause the processor to perform operations corresponding to the methods provided in the first aspect and the second aspect.
Brief Description of the Drawings
FIG. 1 is a schematic architectural diagram of a communication system provided by an embodiment of this application;
FIG. 2 is a schematic diagram of the hardware structure of an electronic device provided by an embodiment of this application;
FIG. 3 is a schematic diagram of the hardware structure of another electronic device provided by an embodiment of this application;
FIG. 4 is a software framework diagram of an electronic device provided by an embodiment of this application;
FIG. 5 is a software module architecture diagram of a communication system provided by an embodiment of this application;
FIG. 6A is a schematic diagram of a user interface provided by an embodiment of this application;
FIG. 6B is a schematic diagram of a user interface provided by an embodiment of this application;
FIG. 6C is a schematic diagram of a user interface provided by an embodiment of this application;
FIG. 7 is a schematic diagram of a user interface provided by an embodiment of this application;
FIG. 8 is a schematic diagram of a user interface provided by an embodiment of this application;
FIG. 9 is a schematic diagram of a user interface provided by an embodiment of this application;
FIG. 10 is a schematic diagram of a user interface provided by an embodiment of this application;
FIG. 11 is a schematic diagram of a user interface provided by an embodiment of this application;
FIG. 12 is a schematic diagram of a user interface provided by an embodiment of this application;
FIG. 13 is a schematic diagram of a user interface provided by an embodiment of this application;
FIG. 14A is a schematic diagram of a user interface provided by an embodiment of this application;
FIG. 14B is a schematic diagram of a user interface provided by an embodiment of this application;
FIG. 15 is a flowchart of a key mapping method provided by an embodiment of this application;
FIG. 16 is a flowchart of a method for identifying key areas provided by an embodiment of this application;
FIG. 17 is a schematic diagram of identifying pixels within a neighboring area provided by an embodiment of this application.
Detailed Description of Embodiments
The terms used in the following embodiments of this application are only for the purpose of describing specific embodiments and are not intended to limit this application. As used in the specification and the appended claims of this application, the singular expressions "a", "an", "the", "the above", "said" and "this" are intended to also include plural expressions, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" used in this application refers to and includes any or all possible combinations of one or more of the listed items. In the embodiments of this application, the terms "first" and "second" are used for descriptive purposes only and cannot be understood as implying relative importance or implicitly indicating the number of the indicated technical features. Thus, a feature defined with "first" or "second" may explicitly or implicitly include one or more of such features. In the description of the embodiments of this application, unless otherwise stated, "multiple" means two or more.
This application provides a key mapping method, which can be applied to the electronic devices provided by this application. The electronic device may be a mobile phone, tablet computer, personal computer (PC), smart TV or other electronic device; this application does not impose any limitation on the specific type of the electronic device. The method can establish a mapping relationship between virtual key elements in an application on the electronic device and physical keys (also called entity keys) of a peripheral electronic device (for example, a peripheral handle); the user can then use the peripheral electronic device to operate the keys of the application to trigger the functions corresponding to the keys. For example, taking a game application as an example, the key area in the game application is separate from the game screen; that is to say, the key area does not change as the game screen changes, and there is a large color contrast between the key area and the surrounding game screen. Therefore, according to the method in the embodiments of this application, the electronic device can identify the key areas in the game application through an image processing algorithm. In addition, to improve the accuracy of key area identification, the electronic device can also correct the image processing algorithm in combination with the user's touch frequency on the game screen. After all key areas are identified, each key area is numbered; at the same time, a cursor that the user can control may appear on the electronic device interface, and the cursor can move within the range of the identified key areas. By controlling the movement of the cursor over the identified key areas, the user can select, in the application interface, the virtual key icon to be matched with a physical key of the gamepad. For example, while the cursor stays on a first game virtual key, the user can press a first physical key on the gamepad, so that a mapping relationship is established between the first physical key and the first game virtual key. After the user has matched all the virtual key icons in the game application with the physical keys of the gamepad one by one, the mapping relationships are established, and the user can trigger the virtual keys in the game application, and thus the functions corresponding to the virtual keys, such as character movement, skill casting and scene switching, by operating the physical keys on the gamepad.
Compared with the prior art, implementing the technical solution of this application can reduce the user operation steps in the process of matching gamepad keys with game application keys, making user operations simpler and the matching more precise, which greatly improves the user experience.
The following introduces some terms and concepts related to this application.
A user interface (UI) is a medium interface for interaction and information exchange between an application or operating system and a user; it converts between the internal form of information and a form acceptable to the user. The user interface of an application is source code written in specific computer languages such as Java and the extensible markup language (XML); the interface source code is parsed and rendered on the terminal device and finally presented as content that the user can recognize, such as pictures, text, buttons and other controls.
Unsigned integer: in the binary counting of a computer, if the leftmost bit is not used to indicate the sign but is instead combined with the following bits to represent the integer, then it is impossible to distinguish whether the number is positive or negative; the number can only be positive, and this is an unsigned integer.
Grayscale refers to representing objects using black tones, that is, using black as the base color and displaying the image in black of different saturations. Each grayscale object has a brightness value from 0% (white) to 100% (black).
Noise refers to pixels or pixel blocks that appear very abrupt in an image; it interferes with the image, makes the image unclear, or affects the observation of image details.
Gaussian noise refers to noise whose probability density function of the noise pixels follows a Gaussian distribution (also called a normal distribution). That is to say, if the amplitude of a noise follows a Gaussian distribution and its power spectral density is uniformly distributed, it is called Gaussian noise.
Gaussian filtering is a linear smoothing filter suitable for eliminating Gaussian noise and is widely used in the noise reduction process of image processing. In plain terms, Gaussian filtering is a process of weighted averaging over the entire image: the value of each pixel is obtained by a weighted average of itself and the other pixel values in its neighborhood. The specific operation of Gaussian filtering is to scan each pixel in the image with a template (also called a Gaussian kernel) and replace the value at the template center with the weighted average grayscale value of the pixels in the neighborhood determined by the template. The Gaussian filter is very effective for noise that follows a normal distribution.
RGB refers to the colors of the three channels red (R), green (G) and blue (B); an electronic device obtains all kinds of colors by varying these three color channels and superimposing them on each other. This color representation standard covers almost all colors perceivable by human vision and is one of the most widely used color systems at present.
The blur radius refers to the value by which a pixel expands outward during Gaussian filtering.
Gradient: in vector calculus, the gradient at a point in a scalar field points in the direction in which the scalar field grows fastest, and the gradient strength is the largest rate of change in that direction. In the case of a single-variable real-valued function, the gradient is simply the derivative; for a linear function, the gradient is the slope of the line, that is, the degree of inclination of a surface along a given direction. The numerical value of the gradient is sometimes also called the gradient.
First, the communication system 10 provided by an embodiment of this application is introduced.
As exemplarily shown in FIG. 1, the communication system 10 may include an electronic device 100 and an electronic device 200.
The electronic device 100 may be a mobile phone, tablet computer, PC, smart TV or other electronic device. Specifically, the electronic device 100 may have one or more of a Bluetooth module and a WLAN module. The electronic device 100 can transmit signals through one or more of the Bluetooth module and the WLAN module to detect and scan devices near the electronic device 100, so that the electronic device 100 can discover nearby devices (such as the electronic device 200) using one or more wireless communication technologies among Bluetooth or WLAN, establish a wireless communication connection with a nearby device, and transmit data to the nearby device (such as the electronic device 200) through one or more wireless communication technologies among Bluetooth or WLAN. The Bluetooth module can provide a solution including one or more Bluetooth communications in classic Bluetooth (the Bluetooth 2.1 standard) or Bluetooth Low Energy (BLE). The WLAN module can provide a solution including one or more WLAN communications in Wi-Fi direct, Wi-Fi LAN or Wi-Fi softAP.
The electronic device 200 may be a peripheral handle with a Bluetooth module, and/or a WLAN module, and/or a data cable interface. The electronic device 200 can receive or transmit wireless signals through one or more of the Bluetooth module and the WLAN module. The Bluetooth module can provide a solution including one or more Bluetooth communications in classic Bluetooth or Bluetooth Low Energy. The WLAN module can provide a solution including one or more WLAN communications in Wi-Fi direct, Wi-Fi LAN or Wi-Fi softAP.
The electronic device 200 may further include a joystick key 201A, a start key 201B (marked with the letter S), an A function key 201C, a B function key 201D, a C function key 201E and a D function key 201F. After the electronic device 100 and the electronic device 200 complete the key mapping, the joystick key 201A can be used to control the direction keys on the user interface of the electronic device 100 to move in a direction (for example, up, down, etc.), and the start key 201B can be used to turn the electronic device 200 on or off. The A function key 201C, B function key 201D, C function key 201E and D function key 201F can be mapped to the function keys on the user interface of the electronic device 100, and when the user presses these function keys, the electronic device 100 can be triggered to generate the corresponding function events.
As shown in FIG. 1, the electronic device 200 can establish a first connection with the electronic device 100. Specifically, the first connection may be one or more wireless communication connections among Bluetooth, Wi-Fi direct or Wi-Fi softAP, or may be a wired connection, such as a universal serial bus (USB) connection. After the electronic device 100 and the electronic device 200 establish the first connection, data information can be transmitted between them through the first connection.
It can be understood that the structure of the electronic device 200 shown in the embodiments of this application does not constitute a specific limitation on the communication system 10. In other embodiments of this application, the electronic device 200 may have more or fewer keys than shown; for example, the electronic device 200 may have multiple joystick keys 201A. In other implementations of this application, the keys on the electronic device 200 may be located on the side, back or other positions of the electronic device 200 that do not face the user. This application does not limit this.
It can be understood that the illustrated structure shown in the embodiments of this application does not constitute a specific limitation on the communication system 10. In other embodiments of this application, the communication system 10 may include more or fewer devices than shown. For example, the communication system 10 may also include multiple mobile phones, or multiple electronic devices of different types, such as displays with communication functions, tablet computers, PCs, etc. This application does not limit this.
Next, the exemplary electronic device 100 provided in the embodiments of this application is introduced.
FIG. 2 shows a schematic diagram of the hardware structure of the electronic device 100.
The electronic device 100 may be a mobile phone, tablet computer, desktop computer, laptop computer, handheld computer, notebook computer, ultra-mobile personal computer (UMPC), netbook, cellular phone, personal digital assistant (PDA), augmented reality (AR) device, virtual reality (VR) device, artificial intelligence (AI) device, wearable device, vehicle-mounted device, smart home device and/or smart city device; the embodiments of this application do not impose special limitations on the specific type of the electronic device.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and so on.
It can be understood that the structure illustrated in the embodiments of the present invention does not constitute a specific limitation on the electronic device 100. In other embodiments of this application, the electronic device 100 may include more or fewer components than shown, combine some components, split some components, or have a different component arrangement. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices or may be integrated in one or more processors.
The controller can generate operation control signals according to instruction operation codes and timing signals to complete the control of fetching and executing instructions.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. This memory can hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, they can be called directly from the memory. Repeated accesses are avoided and the waiting time of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, a charger, a flash, the camera 193, etc. through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to realize the touch function of the electronic device 100.
The I2S interface can be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 through an I2S bus to realize communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface to realize the function of answering calls through a Bluetooth headset.
The PCM interface can also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface to realize the function of answering calls through a Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communication. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, the UART interface is usually used to connect the processor 110 and the wireless communication module 160. For example, the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to realize the Bluetooth function. In some embodiments, the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface to realize the function of playing music through a Bluetooth headset.
The MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193. The MIPI interface includes a camera serial interface (CSI), a display serial interface (DSI), etc. In some embodiments, the processor 110 and the camera 193 communicate through the CSI interface to realize the shooting function of the electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to realize the display function of the electronic device 100.
The GPIO interface can be configured by software. The GPIO interface can be configured as a control signal or as a data signal. In some embodiments, the GPIO interface can be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, etc. The GPIO interface can also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, etc. The USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones and play audio through the headphones. The interface can also be used to connect other electronic devices, such as AR devices.
It can be understood that the interface connection relationships between the modules illustrated in the embodiments of the present invention are only schematic illustrations and do not constitute a structural limitation on the electronic device 100. In other embodiments of this application, the electronic device 100 may also adopt interface connection methods different from those in the above embodiments, or a combination of multiple interface connection methods.
The charging management module 140 is used to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive the charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. While charging the battery 142, the charging management module 140 can also supply power to the electronic device through the power management module 141.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charging management module 140 and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, the wireless communication module 160, etc. The power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle count, and battery health status (leakage, impedance). In some other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may also be provided in the same device.
The wireless communication function of the electronic device 100 can be realized through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, etc.
The antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the electronic device 100 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization. For example, the antenna 1 can be reused as a diversity antenna of a wireless local area network. In other embodiments, an antenna can be used in combination with a tuning switch.
The mobile communication module 150 can provide solutions for wireless communications including 2G/3G/4G/5G applied on the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc. The mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation. The mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves through the antenna 1 for radiation. In some embodiments, at least some functional modules of the mobile communication module 150 may be provided in the processor 110. In some embodiments, at least some functional modules of the mobile communication module 150 and at least some modules of the processor 110 may be provided in the same device. In some embodiments, the wireless communication solution provided by the mobile communication module 150 can enable the electronic device to communicate with devices in the network (such as servers).
The modem processor may include a modulator and a demodulator. The modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency signal. The demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be an independent device. In other embodiments, the modem processor may be independent of the processor 110 and provided in the same device as the mobile communication module 150 or other functional modules.
The wireless communication module 160 can provide solutions for wireless communications applied on the electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), etc. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency-modulates and filters the electromagnetic wave signal, and sends the processed signal to the processor 110. The wireless communication module 160 can also receive the signal to be sent from the processor 110, frequency-modulate and amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation. In some embodiments, the electronic device 100 can transmit signals through the Bluetooth module or the WLAN module in the wireless communication module 160 to detect or scan devices near the electronic device 100, establish a wireless communication connection with a nearby device, and transmit data. The Bluetooth module can provide a solution including one or more Bluetooth communications in classic Bluetooth (the Bluetooth 2.1 standard) or Bluetooth Low Energy. The WLAN module can provide a solution including one or more WLAN communications in Wi-Fi direct, Wi-Fi LAN or Wi-Fi softAP.
In some embodiments, the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technologies. The wireless communication technologies may include the global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS) and/or satellite based augmentation systems (SBAS).
电子设备100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode的,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备100可以包括1个或N个显示屏194,N为大于1的正整数。
电子设备100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备100可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号，除了可以处理数字图像信号，还可以处理其他数字信号。例如，当电子设备100在频点选择时，数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样,电子设备100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备100的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
内部存储器121可以包括一个或多个随机存取存储器(random access memory,RAM)和一个或多个非易失性存储器(non-volatile memory,NVM)。
随机存取存储器可以包括静态随机存储器(static random-access memory,SRAM)、动态随机存储器(dynamic random access memory,DRAM)、同步动态随机存储器(synchronous dynamic random access memory,SDRAM)、双倍资料率同步动态随机存取存储器(double data rate synchronous dynamic random access memory,DDR SDRAM,例如第五代DDR SDRAM一般称为DDR5 SDRAM)等;
非易失性存储器可以包括磁盘存储器件、快闪存储器(flash memory)。
快闪存储器按照运作原理划分可以包括NOR FLASH、NAND FLASH、3D NAND FLASH等,按照存储单元电位阶数划分可以包括单阶存储单元(single-level cell,SLC)、多阶存储单元(multi-level cell,MLC)、三阶储存单元(triple-level cell,TLC)、四阶储存单元(quad-level cell,QLC)等,按照存储规范划分可以包括通用闪存存储(英文:universal flash storage,UFS)、嵌入式多媒体存储卡(embedded multi media Card,eMMC)等。
随机存取存储器可以由处理器110直接进行读写,可以用于存储操作系统或其他正在运行中的程序的可执行程序(例如机器指令),还可以用于存储用户及应用程序的数据等。
非易失性存储器也可以存储可执行程序和存储用户及应用程序的数据等,可以提前加载到随机存取存储器中,用于处理器110直接进行读写。
外部存储器接口120可以用于连接外部的非易失性存储器,实现扩展电子设备100的存储能力。外部的非易失性存储器通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部的非易失性存储器中。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。电子设备100可以通过扬声器170A收听音乐,或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当电子设备100接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C，也称“话筒”，“传声器”，用于将声音信号转换为电信号。当拨打电话或发送语音信息时，用户可以通过人嘴靠近麦克风170C发声，将声音信号输入到麦克风170C。电子设备100可以设置至少一个麦克风170C。在另一些实施例中，电子设备100可以设置两个麦克风170C，除了采集声音信号，还可以实现降噪功能。在另一些实施例中，电子设备100还可以设置三个，四个或更多麦克风170C，实现采集声音信号，降噪，还可以识别声音来源，实现定向录音功能等。
耳机接口170D用于连接有线耳机。耳机接口170D可以是USB接口130,也可以是3.5mm的开放移动电子设备平台(open mobile terminal platform,OMTP)标准接口,美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。压力传感器180A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器180A,电极之间的电容改变。电子设备100根据电容的变化确定压力的强度。当有触摸操作作用于显示屏194,电子设备100根据压力传感器180A检测所述触摸操作强度。电子设备100也可以根据压力传感器180A的检测信号计算触摸的位置。在一些实施例中,作用于相同触摸位置,但不同触摸操作强度的触摸操作,可以对应不同的操作指令。例如:当有触摸操作强度小于第一压力阈值的触摸操作作用于短消息应用图标时,执行查看短消息的指令。当有触摸操作强度大于或等于第一压力阈值的触摸操作作用于短消息应用图标时,执行新建短消息的指令。
陀螺仪传感器180B可以用于确定电子设备100的运动姿态。在一些实施例中,可以通过陀螺仪传感器180B确定电子设备100围绕三个轴(即,x,y和z轴)的角速度。陀螺仪传感器180B可以用于拍摄防抖。示例性的,当按下快门,陀螺仪传感器180B检测电子设备100抖动的角度,根据角度计算出镜头模组需要补偿的距离,让镜头通过反向运动抵消电子设备100的抖动,实现防抖。陀螺仪传感器180B还可以用于导航,体感游戏场景。
气压传感器180C用于测量气压。在一些实施例中,电子设备100通过气压传感器180C测得的气压值计算海拔高度,辅助定位和导航。
磁传感器180D包括霍尔传感器。电子设备100可以利用磁传感器180D检测翻盖皮套的开合。在一些实施例中,当电子设备100是翻盖机时,电子设备100可以根据磁传感器180D检测翻盖的开合。进而根据检测到的皮套的开合状态或翻盖的开合状态,设置翻盖自动解锁等特性。
加速度传感器180E可检测电子设备100在各个方向上(一般为三轴)加速度的大小。当电子设备100静止时可检测出重力的大小及方向。还可以用于识别电子设备姿态,应用于横竖屏切换,计步器等应用。
距离传感器180F,用于测量距离。电子设备100可以通过红外或激光测量距离。在一些实施例中,拍摄场景,电子设备100可以利用距离传感器180F测距以实现快速对焦。
接近光传感器180G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。发光二极管可以是红外发光二极管。电子设备100通过发光二极管向外发射红外光。电子设备100使用光电二极管检测来自附近物体的红外反射光。当检测到充分的反射光时,可以确定电子设备100附近有物体。当检测到不充分的反射光时,电子设备100可以确定电子设备100附近没有物体。电子设备100可以利用接近光传感器180G检测用户手持电子设备100贴近耳朵通话,以便自动熄灭屏幕达到省电的目的。接近光传感器180G也可用于皮套模式,口袋模式自动解锁与锁屏。
环境光传感器180L用于感知环境光亮度。电子设备100可以根据感知的环境光亮度自适应调节显示屏194亮度。环境光传感器180L也可用于拍照时自动调节白平衡。环境光传感器180L还可以与接近光传感器180G配合，检测电子设备100是否在口袋里，以防误触。
指纹传感器180H用于采集指纹。电子设备100可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。
温度传感器180J用于检测温度。在一些实施例中,电子设备100利用温度传感器180J检测的温度,执行温度处理策略。例如,当温度传感器180J上报的温度超过阈值,电子设备100执行降低位于温度传感器180J附近的处理器的性能,以便降低功耗实施热保护。在另一些实施例中,当温度低于另一阈值时,电子设备100对电池142加热,以避免低温导致电子设备100异常关机。在其他一些实施例中,当温度低于又一阈值时,电子设备100对电池142的输出电压执行升压,以避免低温导致的异常关机。
触摸传感器180K,也称“触控器件”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备100的表面,与显示屏194所处的位置不同。
骨传导传感器180M可以获取振动信号。在一些实施例中,骨传导传感器180M可以获取人体声部振动骨块的振动信号。骨传导传感器180M也可以接触人体脉搏,接收血压跳动信号。在一些实施例中,骨传导传感器180M也可以设置于耳机中,结合成骨传导耳机。音频模块170可以基于所述骨传导传感器180M获取的声部振动骨块的振动信号,解析出语音信号,实现语音功能。应用处理器可以基于所述骨传导传感器180M获取的血压跳动信号解析心率信息,实现心率检测功能。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。电子设备100可以接收按键输入,产生与电子设备100的用户设置以及功能控制有关的键信号输入。
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照,音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于显示屏194不同区域的触摸操作,马达191也可对应不同的振动反馈效果。不同的应用场景(例如:时间提醒,接收信息,闹钟,游戏等)也可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。
指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和电子设备100的接触和分离。电子设备100可以支持1个或N个SIM卡接口,N为大于1的正整数。SIM卡接口195可以支持Nano SIM卡,Micro SIM卡,SIM卡等。同一个SIM卡接口195可以同时插入多张卡。所述多张卡的类型可以相同,也可以不同。SIM卡接口195也可以兼容不同类型的SIM卡。SIM卡接口195也可以兼容外部存储卡。电子设备100通过SIM卡和网络交互,实现通话以及数据通信等功能。在一些实施例中,电子设备100采用eSIM,即:嵌入式SIM卡。eSIM卡可以嵌在电子设备100中,不能和电子设备100分离。
图3示例性示出了本申请提供的电子设备200的硬件结构。
如图3所示，电子设备200可以包括按键201、处理器(central processing unit,CPU)202、存储器203、总线204、输入输出接口205、马达206、指示灯207、音频模块208、传感器模块209、通信接口210、无线通信模块211、电源管理模块212、天线3。其中，传感器模块209可以包括压力传感器209A，角度传感器209B、重力传感器209C，陀螺仪传感器209D，加速度传感器209E等。通信接口210可以包括USB接口210A和无线通信接口210B等。无线通信模块211可以包括蓝牙通信模块211A和Wi-Fi通信模块211B等。处理器202、通信接口210、无线通信模块211、电源管理模块212可以通过总线204或者其他方式连接。图3以通过总线204连接为例。
可以理解的是,本发明实施例示意的结构并不构成对电子设备200的具体限定。在本申请另一些实施例中,电子设备200可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
按键201可以包括如图1中所示的启动按键201B、摇杆按键201A、各个功能按键等。按键201可以是机械按键。电子设备100可以接收电子设备200的按键信号,产生针对电子设备100显示屏上相应的应用按键的触控事件。
处理器202可以包括一个或多个处理单元,例如:处理器202可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器202中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器202中的存储器为高速缓冲存储器。该存储器可以保存处理器202刚用过或循环使用的指令或数据。如果处理器202需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器202的等待时间,因而提高了系统的效率。
在一些实施例中,处理器202可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
存储器203可以包括一个或多个随机存取存储器(random access memory,RAM)和一个或多个非易失性存储器(non-volatile memory,NVM)。
随机存取存储器可以包括静态随机存储器(static random-access memory,SRAM)、动态随机存储器(dynamic random access memory,DRAM)、同步动态随机存储器(synchronous dynamic random access memory,SDRAM)、双倍资料率同步动态随机存取存储器(double data rate synchronous dynamic random access memory,DDR SDRAM,例如第五代DDR SDRAM一般称为DDR5 SDRAM)等;
非易失性存储器可以包括磁盘存储器件、快闪存储器(flash memory)。
快闪存储器按照运作原理划分可以包括NOR FLASH、NAND FLASH、3D NAND FLASH等，按照存储单元电位阶数划分可以包括单阶存储单元(single-level cell,SLC)、多阶存储单元(multi-level cell,MLC)、三阶储存单元(triple-level cell,TLC)、四阶储存单元(quad-level cell,QLC)等，按照存储规范划分可以包括通用闪存存储(英文:universal flash storage,UFS)、嵌入式多媒体存储卡(embedded multi media Card,eMMC)等。
随机存取存储器可以由处理器202直接进行读写,可以用于存储操作系统或其他正在运行中的程序的可执行程序(例如机器指令),还可以用于存储用户及应用程序的数据等。
非易失性存储器也可以存储可执行程序和存储用户及应用程序的数据等,可以提前加载到随机存取存储器中,用于处理器202直接进行读写。
马达206可以产生振动提示。例如，马达206可以用于触摸振动反馈，用户作用于不同按键201（例如启动按键、功能按键等）的触摸操作（例如按压），可以对应不同的振动反馈效果。在一些实施例中，触摸振动反馈效果还可以支持自定义。
指示器207可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示按键201的触摸操作等。
音频模块208用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块208还可以用于对音频信号编码和解码。在一些实施例中,音频模块208可以设置于处理器202中,或将音频模块208的部分功能模块设置于处理器202中。
压力传感器209A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器209A可以设置于按键201底部。压力传感器209A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器209A,电极之间的电容改变。电子设备200根据电容的变化确定压力的强度。当有触摸操作(例如按压)作用于按键201,电子设备200根据压力传感器209A检测所述触摸操作强度。电子设备200也可以根据压力传感器209A的检测信号计算触摸的位置。在一些实施例中,作用于相同触摸位置,但不同触摸操作持续时长的触摸操作,可以对应不同的操作指令。例如:当电子设备200处于运行状态时,有触摸操作持续时长小于第一时长阈值的触摸操作作用于启动按键时,执行电子设备200休眠的指令。当有触摸操作持续时长大于或等于第一时长阈值的触摸操作作用于启动按键时,执行关闭电子设备200的指令。
角度传感器209B可以用于检测角度。具体实现中，角度传感器209B中间可以有一个孔，配合相应的机械轴。当机械轴每转过1/16圈时，角度传感器209B就会计数一次。往一个方向转动时，计数增加，转动方向改变时，计数减少。计数与角度传感器209B的初始位置有关，当初始化角度传感器时，它的计数值被设置为0，如果需要，可以使用编程将角度传感器重新复位。
重力传感器209C可以用于采集电子设备200的重力加速度数据,确定电子设备200的运动状态。重力传感器209C可以用于体感游戏等场景。
陀螺仪传感器209D可以用于确定电子设备200的运动姿态，并将该运动姿态相关信号发送至电子设备100，从而实现电子设备100显示屏上相应的操控元素图标显示出与电子设备200相同的运动姿态。在一些实施例中，可以通过陀螺仪传感器209D确定电子设备200围绕三个轴（即，x，y和z轴）的角速度。陀螺仪传感器209D可以用于体感游戏、赛车游戏等场景。
加速度传感器209E可检测电子设备200在各个方向上（一般为三轴）加速度的大小，因此，加速度传感器209E可以用于检测电子设备200的运动信息。当电子设备200静止时，加速度传感器209E还可以检测出重力的大小及方向。
USB接口210A是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口210A可以用于连接充电器为电子设备200充电,也可以用于电子设备200与外围设备之间传输数据。该接口还可以用于连接其他电子设备,例如AR设备等。在另一些实施例中,USB接口210A还可以是即插即用(on-the-go,OTG)接口,主要应用于各种不同的设备间连接,进行数据交换,可以给智能终端扩展USB接口配件以丰富智能终端的功能。OTG接口可以通过一端具有USB接口,另一端具有Type C接口的OTG数据线将电子设备200与外围设备进行连接,并可以通过该连接方式传输数据等。
无线通信接口210B是符合无线通信协议的接口,具体可以是802.11无线接口等。无线通信接口210B可以用于电子设备200与外围设备间建立无线连接,并通过该无线连接进行设备间的数据传输。
无线通信处理模块211可以包括蓝牙通信处理模块211A、WLAN通信处理模块211B中的一项或多项,可以用于监听其他设备(例如电子设备100)发射的信号,如探测请求、扫描信号等等,并可以发送响应信号,如探测响应、扫描响应等,使得其他设备(例如电子设备100)可以发现电子设备200,并与其他设备(例如电子设备100)建立无线通信连接,通过蓝牙或WLAN中的一种或多种无线通信技术与其他设备(例如电子设备100)进行通信。
在另外一些实施例中,蓝牙通信处理模块、WLAN通信处理模块中的一项或多项也可以发射信号,如广播蓝牙信号、信标信号,使得其他设备(例如电子设备100)可以发现电子设备200,并与其他设备(例如电子设备100)建立无线通信连接,通过蓝牙或WLAN中的一种或多种无线通信技术与其他设备(例如电子设备100)进行通信。
天线3用于发射和接收电磁波信号,可覆盖单个或多个通信频带。在另外一些实施例中,天线可以和调谐开关结合使用。
在另一些实施例中,电子设备200还可以包括有多个天线,本申请对此不作限制。
电源管理模块212可以包括有电池与充电管理模块。电源管理模块212可以为处理器202,存储器203,和无线通信模块211等供电。电源管理模块212还可以用于监测电池容量,电池循环次数,电池健康状态(漏电,阻抗)等参数。在其他一些实施例中,电源管理模块212也可以设置于处理器202中。在另一些实施例中,电源管理模块212中的充电管理模块也可以设置于不同的器件中。
电子设备100的软件系统可以采用分层架构，事件驱动架构，微核架构，微服务架构，或云架构。本发明实施例以分层架构的Android系统为例，示例性说明电子设备100的软件架构。
图4是本申请实施例的电子设备100的软件结构框图。如图4所示，该电子设备100的软件结构可以包括：应用程序层(applications,APP)、应用程序框架层(application framework,FWK)、安卓运行时(Android runtime)和系统库(libraries)、内核层(kernel)以及硬件层(hardware)。
应用程序层可以包括一系列应用程序包。
如图4所示,应用程序包可以包括相机,图库,日历,通话,地图,导航,WLAN,蓝牙,音乐,视频,短信息等应用程序。
在本申请实施例中,应用程序层还可以包括游戏应用程序和游戏手柄应用程序。
游戏应用程序是智能手机、平板电脑等电子设备上的可供用户消遣娱乐的应用程序，本申请对该应用程序的名称不作限制。也即是说，该游戏应用程序可以是市面上任意一款用户可以获取并进行操控的游戏应用程序，本申请对此不作限制。
游戏手柄应用程序是可以用于管理及设置电子设备200（例如外设手柄）的应用程序。例如，游戏手柄应用程序可以用于设置和/或调整按键灵敏度、按键图标透明度、连击速率等等参数，也可以用于对电子设备100需要进行按键映射的应用程序用户界面上的按键区域进行识别。按键区域识别步骤完成后，游戏手柄应用程序可以响应于用户的操作，将当前需要进行映射的虚拟按键图标与电子设备200上的物理按键建立映射信息，并将该映射信息保存于电子设备100的内部存储和/或云端服务器中。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图4所示,应用程序框架层可以包括窗口管理器,内容提供器,视图系统,电话管理器,资源管理器,通知管理器、补丁包等。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成的。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
电话管理器用于提供电子设备100的通信功能。例如通话状态的管理(包括接通,挂断等)。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,头戴式显示设备振动,指示灯闪烁等。
Android Runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)、补丁引擎等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。
2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层至少包含显示驱动，摄像头驱动，音频驱动，传感器驱动、以及WLAN蓝牙能力和基本通信协议。
下面结合捕获拍照场景,示例性说明电子设备100软件以及硬件的工作流程。
当触摸传感器180K接收到触摸操作，相应的硬件中断被发给内核层。内核层将触摸操作加工成原始输入事件（包括触摸坐标，触摸操作的时间戳等信息）。原始输入事件被存储在内核层。应用程序框架层从内核层获取原始输入事件，识别该输入事件所对应的控件。以该触摸操作是单击操作、该单击操作所对应的控件为相机应用图标为例，相机应用调用应用框架层的接口，启动相机应用，进而通过调用内核层启动摄像头驱动，通过摄像头193捕获静态图像或视频。
下面,介绍通信系统10的软件模块。
图5是通信系统10的软件架构框图。如图5所示,通信系统10的软件架构可以包括电子设备100的软件模块和电子设备200的软件模块。电子设备100的软件模块可以包括处理模块510、通信模块520、显示模块530。电子设备200可以包括通信模块540和处理模块550。
处理模块510包括按键识别模块511、按键映射模块512、按键管理模块513。其中，按键识别模块511可以用于确定电子设备100在与电子设备200建立通信连接，并启动需要进行按键映射的应用后，对电子设备100当下用户界面上的按键图标区域进行识别。按键管理模块513可以用于供用户管理和调整游戏手柄的参数配置，例如按键灵敏度、按键图标透明度、连击速率等等。按键映射模块512可以用于获取电子设备100用户界面上按键图标的坐标位置、以及获取用户在电子设备200上触控的实体按键对应的键值，生成并保存前述按键图标坐标位置与实体按键对应键值的映射信息。
在另一些实施例中,电子设备100在与电子设备200建立通信连接,并启动需要进行按键映射的应用后,电子设备100检测到用户作用在用于开启按键区域识别模块的图标的触控操作(例如点击“开始匹配”图标)的操作,按键识别模块511响应于该操作,对电子设备100当下用户界面上的按键图标区域进行识别。
通信模块520可以用于管理基于USB的有线连接，和/或管理基于蓝牙和/或WLAN中一项或多项无线通信技术的无线连接。其中，蓝牙（BT）模块可以提供包括经典蓝牙（蓝牙2.1）或蓝牙低功耗（BLE）中一项或多项蓝牙通信的解决方案。WLAN模块可以提供包括Wi-Fi direct、Wi-Fi LAN或Wi-Fi softAP中一项或多项WLAN通信的解决方案。
显示模块530可以用于在电子设备100上显示用户界面,比如图像、视频,以便用户能与电子设备100进行交互。
通信模块540是电子设备200用于跟其他设备通信的模块,具体可以参考上述通信模块520的描述,在此不再赘述。
如图所示,电子设备200和电子设备100可以通过通信模块520和通信模块540之间建立的第一连接进行数据的交互。该第一连接可以参考前述图1中的描述,在此不再赘述。
处理模块550可以用于游戏手柄的信号处理。具体的，例如，响应于用户作用于电子设备200的触控操作（例如按压），电子设备200可以接收到物理信号（例如压力信号），然后，处理模块550可以通过前述传感器模块209中的传感器（例如压力传感器209A）将物理信号转化为电信号。处理模块550还可以采集前述传感器模块209中的数据进行处理。例如，当电子设备200向左偏移时，传感器模块209中的重力传感器209C检测到由左侧向下运动的重力加速度，处理模块550将该物理信号转化为电信号，并经过相应的处理（例如处理成合适的频率），通过通信模块540发送至电子设备100。电子设备100经由通信模块520接收并进行相应的处理后，在显示模块530显示出相应的触控事件（例如电子设备100上可控制的图标元素也相应地向左偏移）。
下面介绍电子设备100上的用于应用程序菜单的示例性用户界面。
图6A示出了一种示例性用户界面30。
用户界面30可以包括:状态栏301、具有常用应用程序图标的托盘302、日历指示符303、页面指示符304以及其他应用程序图标等等。
状态栏301可以包括：移动通信信号（又可以称为蜂窝信号）的一个或多个信号强度指示符301A、无线高保真(wireless fidelity,Wi-Fi)信号的一个或多个信号强度指示符301B、电池状态指示符301C。
具有常用应用程序图标的托盘302可以展示:相机图标302A、通讯录图标302B、电话图标302C、信息图标302D。
日历指示符303可以用于指示当前时间,例如日期、星期几、时分信息等。
页面指示符304可以用于指示用户当前浏览的是哪一个页面中的应用程序。用户可以左右滑动其他应用程序图标的区域，来浏览其他页面中的应用程序图标。
其他应用程序图标可以例如:音乐图标305、计算器图标306、游戏图标307、设置图标308。
在一些实施例中，图6A示例性所示的用户界面30可以为主界面(Home screen)。
在其他一些实施例中,电子设备还可以包括主屏幕键。该主屏幕键可以是实体按键,也可以是虚拟按键。该主屏幕按键可以用于接收用户的指令,将当前显示的UI返回到主界面,这样可以方便用户随时查看主屏幕。上述指令具体可以是用户单次按下主屏幕键的操作指令,也可以是用户在短时间内连续两次按下主屏幕键的操作指令,还可以是用户在预定时间内长按主屏幕键的操作指令。在本申请其他一些实施例中,主屏幕键还可以集成指纹识别器,以便用户在按下主屏幕键的时候,随之进行指纹采集和识别。
可以理解的是,图6A仅仅示例性示出了电子设备100上的用户界面,不应构成对本申请实施例的限定。
下面分别描述本申请涉及的应用场景以及电子设备100上实现的用户界面的一些实施例。
图6A、图6B、图6C及图7示出了电子设备100和电子设备200建立通信连接的相关用户界面。
本申请实施例以电子设备100与电子设备200通过蓝牙进行通信为例。图6A示例性示出了电子设备100上的一种开启蓝牙的操作。
如图6A所示,当检测到在状态栏301上的向下滑动手势时,响应于该手势,电子设备100可以显示出用户界面31。用户界面31可以包括有窗口311以及与前述用户界面30相同的部分或全部界面元素(例如控件、图标、文字等)。窗口311可以显示有“蓝牙”的开关控件312,还可以显示有其他功能的开关控件(如Wi-Fi控件、手电筒控件、位置信息控件、游戏手柄控件313等等)。当检测到在窗口311中的蓝牙开关控件312上的操作(如在蓝牙开关控件312上的触摸操作)时,响应于该操作,电子设备100可以开启蓝牙功能。也即是说,用户可以在状态栏301处做一个向下滑动的手势来打开窗口311,并可以在窗口311中点击“蓝牙”的开关控件312来方便地开启蓝牙。蓝牙被开启后,电子设备100可以通过蓝牙通信技术发现附近的设备。
图6B示例性示出了另外一种开启蓝牙的操作。当检测到用户作用于设置图标308的触控操作(例如点击)时,响应于该操作,电子设备100可以显示出用户界面32。
用户界面32可以包括一个或多个设置项,该一个或多个设置项可以包括:飞行模式设置条目、Wi-Fi设置条目、蓝牙设置条目321、移动网络设置条目、游戏手柄设置条目322、勿扰模式设置条目、显示与亮度设置条目、华为账号设置条目等等。
在用户界面32上的每个设置条目对应有相应的标题。例如,飞行模式设置条目对应的标题为“飞行模式”,Wi-Fi设置条目对应的标题为“Wi-Fi”,蓝牙设置条目321对应的标题为“蓝牙”,移动网络设置条目对应的标题为“移动网络”,游戏手柄设置条目322对应的标题为“游戏手柄”,勿扰模式设置条目对应的标题为“勿扰模式”,显示与亮度设置条目对应的标题为“显示与亮度”,华为账号设置条目对应的标题为“华为账号”。各个设置项可用于监听触发显示对应设置项的设置内容的操作(如触摸操作),响应于该操作,电子设备100可打开用于显示对应设置项的设置内容的用户界面。
在另一些实施例中，该用户界面32可以增加设置条目，例如，“辅助助手”、“生物识别和密码”等。在另一些实施例中，用户界面32中的设置条目也可以有相应的文字说明。该用户界面32也可以减少一些条目，设置条目对应的标题也可以不同。各个设置项的表现形式可包括图标和/或文本。本申请对此不作限制。
在用户界面32的场景下,检测到用户作用于蓝牙设置条目321的触控操作(例如点击),响应于该操作,电子设备100可以开启蓝牙功能。蓝牙被开启后,电子设备100可以通过蓝牙通信技术发现附近的设备。
在另一些实施例中，电子设备100还可以通过Wi-Fi直连（如Wi-Fi p2p）、Wi-Fi softAP、Wi-Fi LAN等通信技术发现该电子设备附近的设备，本申请对此不作限制。
图6C示例性示出了用于蓝牙设置的用户界面33。当检测到用户作用于蓝牙开关控件312或蓝牙设置条目321的触控操作（例如点击），响应于该操作，电子设备100显示出如图6C所示的用户界面33。
用户界面33可以包括前述用户界面30所示的状态栏301,该状态栏可以参照前述图6A的描述,在此不再赘述。用户界面33还可以包括当前页面指示符331、蓝牙状态控制控件332、游戏手柄200设备选项条目323以及其他界面元素(例如图标、控件、文字等等)。
当前页面指示符331可用于指示当前页面，例如文本信息“蓝牙”可用于指示当前页面用于展示蓝牙设置主界面。不限于文本信息，当前页面指示符331还可以是图标。
蓝牙状态控制控件332可以用于监听作用于该控件的触控操作(例如点击)。响应于该操作,电子设备100可以开启或关闭蓝牙功能。
游戏手柄200设备选项条目323可以用于监听作用于该条目的触控操作(例如点击)。响应于该操作,电子设备100可以和该游戏手柄200建立蓝牙无线通信连接。
在另一些实施例中,用户界面33可以显示出更多的设备选项条目,例如手机设备选项条目、平板电脑设备选项条目等等。本申请对此不作限制。
图7示例性示出了电子设备100与电子设备200成功建立蓝牙通信连接时,电子设备100显示出的用户界面40。
用户界面40可以包括蓝牙图标401、手柄图标402、手柄悬浮控件403，以及与前述图6A所示的用户界面30显示的界面元素（如控件、图标、文字内容等等）相同的部分或所有界面元素。其中，蓝牙图标401用于提示用户电子设备100与电子设备200通过蓝牙建立无线通信连接，手柄图标402用于提示用户电子设备100与电子设备200已经成功建立通信连接。手柄悬浮控件403可以用于监听作用于该控件的触控操作，响应于该操作，电子设备100可以显示出用于设置游戏手柄的用户界面。
在其他一些实施例中，该手柄悬浮控件403也可以显示出文本信息，例如“游戏手柄”等，本申请对此不作限制。
在电子设备100与电子设备200成功建立通信连接后,电子设备100可以显示出针对电子设备200的功能调试界面。
在一些实施例中,该功能调试界面可以用于显示一个或多个针对于电子设备200的功能调试选项,该功能调试选项可以用于对电子设备200或电子设备100相关的参数进行设置和修改。电子设备100可以存储该项参数的设置和修改,以便于当电子设备100与电子设备200再次建立通信连接时,直接使用该项参数设置,而不必用户重新手动进行调试。其中,可以用于用户进行调试的参数包括但不限于如下选项:点击模式、关联鼠标、轮盘模式、手势模式等等,本申请对此不作限制。
图8、图9、图10、图11示例性示出了一种电子设备100针对于电子设备200功能调试的实现方式。
如图8示例性所示,电子设备100可以响应于用户作用于游戏手柄控件313、或游戏手柄设置条目322、或手柄悬浮控件403的触控操作(例如点击),显示出图8示例性所示的功能调试用户界面41。
用户界面41可以包括当前页面指示符411、按键透明度设置条目412、按键灵敏度设置条目413、连击模式设置条目414、保存控件415、取消控件416等等。
当前页面指示符411可用于指示当前页面,例如文本信息“游戏手柄”可用于指示当前页面用于展示游戏手柄设置主界面。不限于文本信息,当前页面指示符411还可以是图标。
按键透明度设置条目412可以包括对应的标题“按键透明度”、文字信息“25”及按键透明度调整控件。其中,该文字信息可以根据作用于按键透明度调整控件的操作(例如拖动)发生变化。例如,当作用于按键透明度调整控件的操作为向右拖动,该文字信息中的数字可以变大。当作用于按键透明度调整控件的操作为向左拖动,该文字信息中的数字可以变小。该按键透明度设置条目412中的按键透明度调整控件可以用于监听作用于该控件上的操作(例如拖动),响应于该操作,电子设备100可以显示出用户界面上虚拟按键图标相应的透明程度。
按键灵敏度设置条目413可以包括对应的标题“按键灵敏度”、文字信息“46”及按键灵敏度调整控件。其中,该文字信息可以根据作用于按键灵敏度调整控件的操作(例如拖动)发生变化。例如,当作用于按键灵敏度调整控件的操作为向右拖动,该文字信息中的数字可以变大。当作用于按键灵敏度调整控件的操作为向左拖动,该文字信息中的数字可以变小。该按键灵敏度设置条目413中的按键灵敏度调整控件可以用于监听作用于该控件上的操作(例如拖动),响应于该操作,可以用于调整电子设备200上物理按键的触发力度。
连击模式设置条目414可以包括对应的标题“连击速率”、文字信息“46”及连击速率开关控件414A与连击速率调整控件414B。其中，该文字信息可以根据作用于连击速率调整控件414B的操作（例如拖动）发生变化。例如，当作用于连击速率调整控件414B的操作为向右拖动，该文字信息中的数字可以变大。当作用于连击速率调整控件414B的操作为向左拖动，该文字信息中的数字可以变小。该连击模式设置条目414中的连击速率开关控件414A可以用于监听作用于该控件上的触控操作（例如点击），响应于该操作，电子设备100开启连击模式。该连击模式设置条目414中的连击速率调整控件414B可以用于监听作用于该控件上的操作（例如拖动），响应于该操作，电子设备100可以调整每一次连击触发的时间间隔。
在另一些实施例中,该用户界面41可以增加设置条目,也可以减少一些条目,设置条目对应的标题也可以不同。各个设置项的表现形式可包括图标和/或文本。本申请对此不作限制。
在另外一些实施例中,设置操作可以在其他场景下施行。例如,以游戏应用场景为例,图9和图10示出了一种在游戏应用场景下进行设置操作的示例。
如图9所示,响应于游戏图标307的触控操作(例如点击),电子设备100显示出用户界面60(也可以称作第一用户界面)。该用户界面60是本申请实施例示例性所示的一种游戏应用的用户界面。
如图9所示,用户界面60可以包括:文字信息与图形元素、游戏角色601、方向按键区域602、功能按键区域603、手柄悬浮控件403。其中:
文字信息与图形元素可以包括用于提示用户当前游戏名称的文字信息“闯关游戏”,用于提示用户当前用户在游戏中的数据信息的文字信息“体力:48”,以及其他提示性的文字信息“1-9级副本”、“执行任务”、“任务”、“稻香村:闯关战力建议1440”等等。
游戏角色601是用户在游戏中所操作的主体,可以响应作用在用户界面60中的方向按键控件、和/或技能按键控件的触控操作(例如点击)。游戏角色601可以响应于方向按键的触控操作,作出移动动作,也可以响应于技能按键的触控操作,施放对应的技能等。
方向按键区域602可以包括向上按键图标602A、向右按键图标602B、向下按键图标602C、向左按键图标602D。方向按键区域602可以用于接收用户作用于该区域按键的触控操作(例如点击操作)。电子设备100响应于该操作,可以显示游戏人物图标601在相应方向上(例如向上、向下、向左、向右等等)的移动。例如,电子设备100检测到用户作用于方向按键区域602中的向上按键图标602A的触控操作(例如长按),则电子设备100显示出游戏人物图标601在游戏场景中向上移动的场景。
在一些实施例中,方向按键区域602可以包括比图示更多或更少的方向按键。例如,方向按键区域602还可以包括斜向右上方向按键、斜向右下方向按键等等。在另一些实施例中,按键区域602可以是圆形图标,用于监听用户作用于该圆形图标的触控操作(例如长按圆形图标并在任意方向上拖动),电子设备100响应于该操作,显示出游戏人物图标601在游戏场景中进行任意方向移动的用户界面。本申请对此不作限制。
示例性地，技能按键区域603可以包括装备按键图标603A、地图按键图标603B、普攻按键图标603C、大招按键图标603D。技能按键区域603可以用于接收用户作用于该区域按键的触控操作（例如点击操作）。电子设备100响应于该操作，可以显示相应技能的用户界面。
在另外一些实施例中,技能按键区域还可以包括比图示更多或更少的技能按键。例如,技能按键区域603还可以包括“设置”技能按键、“收集装备”技能按键等等,本申请对此不作限制。
手柄悬浮控件403可以参考前述图7中手柄悬浮控件403的描述,在此不再赘述。
可以理解的是,该游戏应用的用户界面60可以是其他游戏应用的场景,例如,用户界面60可以是前述图4应用程序层中所描述的游戏应用的用户界面。
不限于此，在另一种可能的实施例中，用户还可以通过其他方式获取游戏应用的用户界面60，例如，用户可以在某社交类应用程序中通过搜索某款游戏小程序来获取相应的游戏应用的用户界面60。该社交类应用程序同时也可以提供用户各种类型的小程序应用（例如购物、游戏、新闻资讯等等）。其中，上述所提及的小程序是该社交类应用程序提供的一种不用下载就能使用的应用，用户可以通过二维码、搜索等方式体验到开发者们开发的小程序。本申请对此不作限制。
图10示出了一种在示例性游戏应用场景下,电子设备100显示出的设置主界面。
如图10所示，响应于作用在手柄悬浮控件403（也可以称为第一悬浮控件）的触控操作（例如点击），电子设备100可以显示出用户界面61。其中，用户界面61可以包括功能调试栏611，以及前述图9所示的用户界面60显示的界面元素（例如控件、图标、文字内容等等）。功能调试栏611可以包括游戏识别图标611A、显示/隐藏图标611B、设置图标611C、问题反馈图标611D。
游戏识别图标611A可以用于监听用户作用于该图标的触控操作，响应于该操作，电子设备100可以对用户界面60中的方向按键区域及技能按键区域进行识别。后续实施例将详细描述电子设备100提供的按键识别的详细步骤，在此暂不赘述。
显示/隐藏图标611B可以用于监听用户作用于该图标的触控操作，响应于该操作，电子设备100可以显示或隐藏用户界面中的虚拟按键图标。
设置图标611C可以用于监听用户作用于该图标的触控操作，响应于该操作，电子设备100可以显示出针对电子设备200的基本通用设置的用户界面。该基本通用设置的选项可以包括但不限于：图标透明度设置、按键透明度设置、连击模式、恢复默认设置、保存、返回游戏界面等等。
问题反馈图标611D可以用于监听用户作用于该图标的触控操作，响应于该操作，电子设备100可以显示出用于用户反馈相关问题的用户界面。
在其他一些实施例中，前述功能调试栏611中的图标也可以显示出相应的文本信息，例如“开始匹配”、“显示/隐藏”、“设置”、“重置按键匹配”等。在另外一些实施例中，前述功能调试栏611中的部分或全部图标控件也可以不显示在电子设备100的触摸屏上，而是设置于电子设备200上的物理按键，电子设备100可以响应于用户作用于电子设备200上的物理按键的触控操作（例如按压），而显示出相对应的功能调试用户界面。可以理解的是，对于如何触发电子设备100显示出相应的功能调试界面，本申请对此不作限制。
可以理解的是,上述实施场景以及功能调试用户界面仅用于对本申请作示例性说明,并不对本申请构成限制。
如图11所示,响应于作用在设置图标611C的触控操作(例如点击),电子设备100可以显示出用户界面66。该用户界面66可以显示出通用设置窗口661,该通用设置窗口661可以包括按键透明度设置条目662、按键灵敏度设置条目663、连击模式设置条目664、恢复默认设置控件665、保存控件666、返回游戏界面控件667。其中:
按键透明度设置条目662可以参考前述图8用户界面41中的按键透明度设置条目412,在此暂不赘述。
按键灵敏度设置条目663可以参考前述图8用户界面41中的按键灵敏度设置条目413,在此暂不赘述。
连击模式设置条目664可以参考前述图8用户界面41中的连击模式设置条目414,在此暂不赘述。
恢复默认设置控件665可以监听作用于该控件的触控操作,响应于该操作,电子设备100可以清除在通用设置窗口中用户所调整的各设置条目的数据,恢复电子设备100默认的各项设置条目的最先设置数据。
保存控件666可以监听作用于该控件的触控操作，响应于该操作，电子设备100可以保存用户所调整的各设置条目的数据。
返回游戏界面控件667可以监听作用于该控件的触控操作,响应于该操作,电子设备100可以显示该游戏应用的用户界面60。
图12示出了一个实施例中电子设备100识别游戏界面中虚拟按键的界面示意图。
如图12所示，响应于作用在游戏识别图标611A上的触控操作（例如点击），开始识别用户界面中的按键区域时，电子设备100可以显示出用户界面62。
用户界面62可以包括前述图9所示的用户界面60显示的界面元素(例如控件、图标、文字内容等等)。在电子设备100识别出前述方向按键区域与技能按键区域后,用户界面62显示出被标记为高亮的方向按键区域及技能按键区域,还可以显示出提示信息621、重新识别图标622、建立游戏手柄按键映射图标623。其中:
该提示信息621用于提示用户已经完成按键区域识别，可以是文字信息“已完成自动识别游戏按键”。在另一些实施例中，该提示信息621还可以是语音信息，或者是图标。本申请对此不作限制。
重新识别图标622可以用于监听作用于该图标上的触控操作(例如点击),响应于该操作,电子设备100重新识别用户界面中的方向按键区域和技能按键区域。
建立游戏手柄按键映射图标623可以用于监听作用于该图标的触控操作(例如点击),响应于该操作,电子设备100可以建立与电子设备200的按键映射关系。
可以理解的是,提示信息621、重新识别图标622、建立游戏手柄按键映射图标623独立于游戏应用的用户界面,也即是说,上述提示信息与图标并不与游戏应用的用户界面相关,当游戏应用的用户界面发生改变时,上述提示信息与图标并不随之发生改变。
图13示出了一个实施例中,电子设备100与电子设备200建立按键映射的实施过程的界面示意图。
如图13所示，响应于作用在建立游戏手柄按键映射图标623上的触控操作（例如点击），开始建立电子设备100与电子设备200的按键映射时，电子设备100可以显示出用户界面63。当游戏虚拟按键与游戏手柄物理按键建立映射关系时，用户界面63中显示出可以供用户操控的光标图标631。该光标图标631可以用于在已经识别出的按键区域上移动，当需要对某个虚拟按键进行适配时，用户将光标移动到该指定虚拟按键图标上，使得该光标悬停在该虚拟按键图标上。
例如，如图13所示，该光标图标631悬停在已被标识为高亮的普攻技能按键625C上。响应于作用在电子设备200上功能按键201E的触控操作（例如按压），电子设备200向电子设备100发送包含有功能按键201E键值的第一信号。当接收到包含有功能按键201E键值的第一信号时，电子设备100建立普攻技能按键625C与功能按键201E的映射关系，该普攻技能按键625C不再高亮显示。电子设备100完成普攻技能按键625C与功能按键201E的映射后，电子设备100显示出用户界面64。该用户界面64中，光标图标631移动到下一个被标记成为高亮的方向按键区域624。检测到用户作用于电子设备200摇杆按键201A的触控操作（例如按压），电子设备100建立方向按键区域624与摇杆按键201A的映射关系，该方向按键区域624不再高亮显示。当所有按键都与电子设备200建立映射关系后，电子设备100显示出光标图标631消失、所有按键都不再高亮显示的用户界面65。
在一些实施例中,光标图标631可以自行移动到高亮按键区域,也可以由用户操控移动至用户所指定的高亮按键区域,本申请对此不作限制。
在另一些实施例中,电子设备100可以对按键映射进行合法性检测,也即是说,当用户在建立按键映射关系的过程中,响应于作用于电子设备200上的物理按键的触控操作(例如按压),电子设备200向电子设备100发送包含有物理按键键值的第一信号,当电子设备100接收到包含有该物理按键键值的第一信号时,会检测该物理按键键值是否已与电子设备100上的其他虚拟按键图标建立了映射关系。若是,则显示出提示信息以提示用户该物理按键已与其他虚拟按键图标建立映射。该提示信息可以是文本信息,例如“该按键已建立映射,请重新进行选择”,也可以是语音信息,本申请对此不作限制。
具体的，例如，如图13所示，电子设备100上的普攻技能按键625C已与电子设备200上的功能按键201E建立了映射关系，此时，光标图标631移动至还未进行按键映射的方向按键区域624，若响应于作用于电子设备200上的功能按键201E的触控操作（例如按压），电子设备200向电子设备100发送包含有功能按键201E键值的第一信号，当电子设备100接收到包含有功能按键201E键值的第一信号时，检测到该功能按键201E键值已与普攻技能按键625C建立映射关系，则显示出文本提示信息“该按键已建立映射，请重新进行选择”，以提示用户该物理按键已与虚拟按键图标建立映射。
在另一些实施例中,电子设备100还可以对按键映射进行合理性检测,也即是说,当用户在建立按键映射关系的过程中,响应于作用于电子设备200上的物理按键的触控操作(例如按压),电子设备200向电子设备100发送包含有物理按键键值的第一信号,当电子设备100接收到包含有该物理按键键值的第一信号时,会检测该物理按键键值是否能与该选定的虚拟按键图标建立正确的映射关系。若否,则显示出提示信息以提示用户重新进行选择。该提示信息可以是文本信息,例如“该按键不能与选定按键图标进行匹配,请重新进行选择”,也可以是语音信息,本申请对此不作限制。
具体的，例如，如图13所示，光标图标631移动至还未进行按键映射的方向按键区域624，若响应于作用于电子设备200上的功能按键201E的触控操作（例如按压），电子设备200向电子设备100发送包含有功能按键201E键值的第一信号，当电子设备100接收到包含有功能按键201E键值的第一信号时，检测到该功能按键201E键值不能与方向按键区域624建立正确的映射关系，则显示出文本提示信息“该按键不能与选定按键图标进行匹配，请重新进行选择”，以提示用户重新选择可以与方向按键区域624进行正确映射的物理按键（例如摇杆按键201A）。
图14A、图14B示出了电子设备100与电子设备200成功建立按键映射后的游戏手柄控制游戏应用中应用按键所产生的效果界面示意图。
当电子设备100上的虚拟按键图标与电子设备200上的物理按键建立映射关系后,响应于用户作用在电子设备200上物理按键的触控操作(例如按压),电子设备100可以显示出新的用户界面(又可称为第二用户界面),该用户界面可以部分或全部不同于未触控电子设备200上的物理按键时电子设备100所显示的用户界面,例如游戏应用界面的跳转、游戏应用界面中人物的移动、游戏场景的改变等等导致的用户界面的刷新。具体的,如图14A、图14B示例性示出了本申请上述的技术效果界面。
如图14A所示，电子设备200中的功能按键201E与电子设备100上的“普攻”按键图标625C建立了映射关系。当电子设备100检测到用户手指712作用于电子设备200上的功能按键201E的触控操作（例如按压）时，电子设备100即产生该“普攻”按键图标625C的触控事件，电子设备100显示出游戏人物图标601发动普攻技能的用户界面71。
如图14B所示，电子设备200中的摇杆按键201A与电子设备100上的方向按键区域624建立了映射关系。当电子设备100检测到用户手指732作用于电子设备200上的摇杆按键201A的触控操作（例如向右推动）时，电子设备100即产生该方向按键区域624中向右方向按键624C的触控事件，电子设备100显示出游戏人物图标601在游戏场景中向右移动的用户界面73。
可以理解的是,本实施例图14A、图14B所示出的技术效果界面仅仅用作解释本申请实施例,并不对本申请构成具体限制。
基于上述示例性实施例及示例性应用场景,下面详细说明电子设备100与电子设备200建立按键映射步骤。
图15示出了本申请所提供的一种按键映射方法的流程图。如图15所示,本申请以电子设备100是手机,电子设备200是游戏手柄为例,详细描述该方法,该方法可以包括:
S101、手机与游戏手柄建立第一连接。
具体的,手机和游戏手柄可以具有蓝牙(BT)模块、WLAN模块中的一项或多项。其中,蓝牙(BT)模块可以提供包括经典蓝牙(蓝牙2.1)或蓝牙低功耗(BLE)中一项或多项蓝牙通信的解决方案。WLAN模块可以提供包括Wi-Fi direct、Wi-Fi LAN或Wi-Fi softAP中一项或多项WLAN通信的解决方案。
手机可以使用蓝牙或WLAN中的一种或多种无线通信技术与游戏手柄建立第一连接。
S102、该手机显示第一用户界面。
具体的,该第一用户界面可以包含多个游戏按键(又可称为游戏控件),这多个游戏按键中可以包括第一游戏按键(又可称为第一控件)。如图9所示的用户界面60,该用户界面可以包括普攻技能按键603C。
S103、该手机检测到第一用户操作。
具体的,该手机检测到用户请求建立游戏按键映射的第一用户操作,如图12所示的用户作用于游戏识别图标611A的触控操作(例如点击)。
S104、该手机扫描第一用户界面,通过第一图像处理算法,识别出第一用户界面中的多个游戏按键。
具体的，如图12所示，游戏画面中的按键区域相对于游戏画面是相对独立的，也即是说，按键区域不会随着游戏画面的变化而变化，并且，按键区域与周边游戏画面的颜色反差较大。因此，手机可以通过第一图像处理算法，将游戏画面中的按键区域识别出来。然后，为了提高按键区域的识别精准度，手机可以结合获取到的用户在该游戏应用画面中的触控频率，对第一图像处理算法进行校正，从而准确有效地找到游戏画面中的按键区域。
在一些实施例中,第一图像处理算法可以是边缘检测算法,关于边缘检测算法是如何识别游戏按键的,会在后文进行说明,此处先不赘述。
边缘检测算法仅为一个示例,本申请不对第一图像处理算法进行特殊限制,其他可以实现识别游戏按键的方法均可。
S105、在每个游戏按键的位置上生成对应的虚拟按键,即在第一游戏按键的位置上生成第一虚拟按键。
具体的,如图12所示,当手机成功识别出用户界面中的游戏按键位置后,在每个游戏按键的位置上可以生成对应的虚拟按键。
S106、在第一虚拟按键处显示第一光标。
具体的,如图13所示,普攻技能按键625C显示出第一光标631。不限于显示第一光标,第一虚拟按键还可以显示高亮状态、闪烁状态等,只要可以被用户识别为待选定状态即可。
S107、游戏手柄检测到用户按压第一物理按键。
具体的,如图13所示,游戏手柄检测到用户按压第一物理按键功能按键201E的操作。
S108、游戏手柄发送第一物理按键被按压的第一信号。
第一信号可以携带有第一物理按键的标识。
S109、成功建立第一游戏按键和第一物理按键的第一映射关系。同理,其余按键也一一建立映射关系。
具体的,如图13所示,手机可以获取有光标图标631悬停的普攻技能按键625C的位置信息,以及第一物理按键功能按键201E的标识或键值,并形成两者间的映射关系。同理,手机上其余虚拟按键和游戏手柄上的物理按键也一一建立映射关系。
在第一游戏按键与第一物理按键建立完映射关系后,第一光标可以从第一游戏按键处移至第二游戏按键处,本申请对按键匹配时光标移动的顺序不作限制,光标可以移至任意一个还未建立映射的游戏按键处。
在手机界面中的游戏按键与游戏手柄的物理按键建立映射关系后,手机中可以生成并存储有第一映射表,该第一映射表中记录了手机界面中的游戏按键与游戏手柄的物理按键之间的映射关系。
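示例性的，下述Python代码给出了第一映射表与按键信号分发逻辑的一种示意性实现。其中的数据结构、函数名（如bind、on_first_signal、inject_touch_event）以及坐标数值均为便于理解流程而引入的假设，并非本申请所限定的实现方式：

```python
# 示意性实现：用字典模拟第一映射表（物理按键标识 -> 虚拟按键中心坐标）
key_mapping = {}

def bind(physical_key_id, control_center_xy):
    """建立一条物理按键与虚拟按键的映射关系（对应步骤S109）。"""
    if physical_key_id in key_mapping:
        # 合法性检测：该物理按键已与其他虚拟按键图标建立映射关系
        raise ValueError("该按键已建立映射，请重新进行选择")
    key_mapping[physical_key_id] = control_center_xy

def inject_touch_event(xy):
    """假设的触控事件注入接口，实际实现依赖具体系统。"""
    print("在坐标 %s 处产生触控事件" % (xy,))

def on_first_signal(physical_key_id):
    """收到第一信号后，根据第一映射关系触发对应游戏按键被点击（对应步骤S113）。"""
    xy = key_mapping.get(physical_key_id)
    if xy is not None:
        inject_touch_event(xy)

# 用法示例：将“普攻”按键与功能按键201E建立映射，随后模拟收到按压信号
bind("201E", (1650, 860))
on_first_signal("201E")
```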
S110、在所有游戏按键与游戏手柄的物理按键的映射关系建立完成之后,手机返回显示第一用户界面。
具体的,当手机上所有游戏按键和游戏手柄上的物理按键的映射关系建立完成后,手机返回显示游戏界面。
S111、游戏手柄检测到用户按压第一物理按键。
如图14A所示,游戏手柄检测到用户按压第一物理按键功能按键201E的操作。
S112、游戏手柄向手机发送第一物理按键被按压的第一信号。
S113、手机接收到第一物理按键被按压的信号,根据第一映射关系,触发第一游戏按键被点击。
S114、手机显示第二用户界面,第二用户界面为第一游戏按键对应的游戏功能被触发所展示的用户界面。
如图14A所示，当游戏手柄上的第一物理按键功能按键201E被按压时，手机接收到第一物理按键功能按键201E被按压的信号，触发普攻功能按键625C被点击的事件，显示出对应的普攻技能被触发所展示的用户界面71。
在一些实施例中，电子设备100也可以设置第一时长阈值，当光标悬停在虚拟按键图标上的时间超过第一时长阈值时，电子设备100将该虚拟按键图标确认为需要进行适配的虚拟按键图标，并获取该虚拟按键图标的位置信息。因此，可以理解的是，对于电子设备100如何确认游戏应用画面中哪个虚拟按键图标是需要进行适配的虚拟按键图标，本申请对此并不作限制。
下面结合图16和图17,以边缘检测算法为例,说明电子设备100识别游戏应用画面中的按键区域的步骤。如图16所示,该方法步骤可以包括:
S201、电子设备100对第一图像进行灰度处理,得到第二图像。
具体的,第一图像是第一用户界面转化的图像,第一用户界面即游戏界面。第一图像为彩色图像,第二图像是第一图像进行灰度处理后得到的灰色图像。
可以理解的是，当前主流的标准图像表示方式是24比特模式，即每像素24位(bits per pixel,BPP)编码的RGB值。它是使用三个8位无符号整数（0到255）表示红色、绿色和蓝色的强度。24比特模式用于真彩色和联合图像专家小组(joint photographic experts group,JPEG)图像格式或者标签图像文件格式(tag image file format,TIFF)等图像文件格式里的通用颜色交换。它可以产生一千六百万种颜色组合，对人类的眼睛来说，其中有许多颜色已经无法确切地分辨。也即是说，电子设备100先存储红色R(8bit)，再存储绿色G(8bit)，最后存储蓝色B(8bit)，一共24bit，每个颜色256个梯度，交错地以RGBRGBRGB……这样的形式存储在文件里面。其中，JPEG是一个国际图像压缩标准，JPEG图像压缩算法能够在提供良好的压缩性能的同时，具有比较好的重建质量，被广泛应用于图像、视频处理领域。TIFF是一种灵活的位图格式，主要用来存储包括照片和艺术图在内的图像。
而灰度图只有8位的图像深度,因此在图像处理中,灰度图所需的计算量比彩色图的计算量要少。虽然丢失了一些颜色等级,但是从整幅图像的整体和局部的色彩及亮度等级分布特征来看,作为灰度图的第二图像的描述与作为彩色图的第一图像的描述是一致的。
因此,电子设备100可以将第一图像进行灰度处理。根据R、G、B的重要性及其他指标,将三个不同的分量进行加权平均。由于人眼对绿色的敏感度最高,对蓝色的敏感度最低,因此,按公式1中对RGB三个分量进行加权平均能得到较合理的灰度图像。电子设备100可以根据公式1得到第一图像中每一个像素的灰度值,从而获得灰度处理后的第二图像:
Gray = 0.3R + 0.6G + 0.1B　（公式1）
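示例性的，公式1的灰度化处理可以用如下Python代码（基于numpy）示意性实现，其中数组形状、取值范围等均为示例性假设：

```python
import numpy as np

def to_gray(rgb_image):
    """按公式1对RGB三个分量加权平均，得到灰度图（第二图像）。
    rgb_image：形状为(H, W, 3)、取值0~255的数组。"""
    r = rgb_image[..., 0].astype(np.float32)
    g = rgb_image[..., 1].astype(np.float32)
    b = rgb_image[..., 2].astype(np.float32)
    gray = 0.3 * r + 0.6 * g + 0.1 * b
    return np.clip(np.rint(gray), 0, 255).astype(np.uint8)
```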
S202、电子设备100对第二图像进行高斯滤波。
具体的,在本申请实施例中,对第二图像进行高斯滤波处理即是对第二图像的灰度值进行加权平均,也即是说,针对第二图像中每一个像素点的灰度值,都由其本身值和邻域内的其他灰度值经过加权平均,最终得到该像素点经过高斯滤波后的最终灰度值。
因此,可以看出,高斯滤波可以分为两步:1.获取高斯模板(也即是权重模板)。2.进行加权平均。
在像素点的加权平均过程中,正态分布显然是一种可取的权重分配模式,由于图像是二维的,所以需要使用二维的高斯函数,如公式2所示:
G(x,y) = (1/(2πσ²))·e^(−(x²+y²)/(2σ²))　（公式2）
在计算权重时,只需要将“中心点”作为原点,其他点按照其在正态曲线上的位置,分配权重,就可以得到一个加权平均值。
示例性的,假定σ=1.5,模糊半径为1,中心点的坐标为(0,0),那么距离它最近的8个点坐标如表1所示:
表1
(-1,1) (0,1) (1,1)
(-1,0) (0,0) (1,0)
(-1,-1) (0,-1) (1,-1)
将上述坐标中的横坐标作为二维高斯函数中的x，纵坐标作为二维高斯函数中的y，经计算，则模糊半径为1的权重矩阵如表2所示：
表2
0.0453542 0.0566406 0.0453542
0.0566406 0.0707355 0.0566406
0.0453542 0.0566406 0.0453542
这9个点的权重总和等于0.4787147。根据权重模板的特性,如果只计算这9个点的加权平均,还必须让它们的权重之和等于1,因此上面9个值还需要分别除以0.4787147,得到最终的权重矩阵,如表3所示:
表3
0.0947416 0.118318 0.0947416
0.118318 0.147761 0.118318
0.0947416 0.118318 0.0947416
由于本申请示例性的实施例的模糊半径为1,因此,在对像素点进行高斯滤波计算时,只需要取周围8个像素点进行加权平均。示例性的,该9个像素点的灰度值如表4所示。灰度值范围为0-255。位于表4中心的像素点即为本次需要处理的像素点:
表4
14 15 16
24 25 26
34 35 36
将上述表4中像素点的灰度值分别与表3中对应位置的权重相乘,如表5所示:
表5
14×0.0947416 15×0.118318 16×0.0947416
24×0.118318 25×0.147761 26×0.118318
34×0.0947416 35×0.118318 36×0.0947416
最终所获得的数值如表6所示:
表6
1.32638 1.77477 1.51587
2.83963 3.69403 3.07627
3.22121 4.14113 3.4107
将上述表6中的9个值相加,即是该中心点的高斯滤波值。该中心点的高斯滤波值计算过程如下式子所示:
1.32638+1.77477+1.51587+2.83963+3.69403+3.07627+3.22121+4.14113+3.4107=24.99999
由于灰度值为整数，因此，可以近似取为25。
由上述本申请示例性实施例可以看出，电子设备100对第二图像中每一个像素点重复上述过程，即可获得高斯滤波后的第二图像。
在另外一些实施例中,二维高斯函数中的x,y还可以有其他的取值方式,也即是说,最后所得到的权重矩阵也可以是其他数值,本申请对此不作限制。
在另外一些实施例中，第一图像可以不经过灰度化，电子设备100可以直接对第一图像进行高斯滤波。也即是说，电子设备100可以针对第一图像，分别对第一图像中的RGB三个通道进行高斯滤波。
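示例性的，上述高斯模板的生成与加权平均过程可以用如下Python代码示意性实现。取σ=1.5、模糊半径为1时，gaussian_kernel所得的归一化权重矩阵与表3一致；边界像素的填充方式（此处为复制填充）为示例性选择：

```python
import numpy as np

def gaussian_kernel(radius=1, sigma=1.5):
    """按公式2生成高斯权重模板，并归一化使权重之和为1（对应表2、表3）。"""
    ax = np.arange(-radius, radius + 1)
    x, y = np.meshgrid(ax, ax)
    kernel = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2)) / (2 * np.pi * sigma ** 2)
    return kernel / kernel.sum()

def gaussian_blur(gray, radius=1, sigma=1.5):
    """对灰度图逐像素加权平均，得到高斯滤波后的图像。"""
    k = gaussian_kernel(radius, sigma)
    padded = np.pad(gray.astype(np.float32), radius, mode='edge')
    h, w = gray.shape
    out = np.zeros((h, w), dtype=np.float32)
    for i in range(h):
        for j in range(w):
            window = padded[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            out[i, j] = np.sum(window * k)
    return np.rint(out).astype(np.uint8)
```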
S203、电子设备100获取用户触控频率。
具体的,以前述的游戏应用程序为例,电子设备100可以获取用户在电子设备100屏幕界面针对于该游戏应用场景下的触控操作(例如点击)的第一次数数据。
具体的,获取用户在电子设备100屏幕界面针对于该游戏应用场景下的触控操作(例如点击)的次数,可以由游戏手柄应用程序进行。该游戏手柄应用程序可以获取到位于电子设备100内部存储空间中的,用户在电子设备100屏幕界面针对于该游戏应用场景下的触控操作(例如点击)的第一次数数据。该第一次数数据可以是用户在电子设备100屏幕界面针对于该游戏应用场景下的触控操作(例如点击)的历史次数数据,也即是说,该第一次数数据可以是该游戏应用还未进行按键识别操作、或按键匹配操作前,用户在电子设备100屏幕界面针对于该游戏应用场景下的触控操作(例如点击)的次数数据。
在一些实施例中,电子设备100可以在电子设备100与电子设备200建立通信连接时获取用户针对需要进行按键匹配的应用的触控频率。在另一些实施例中,电子设备100也可以在对第一图像进行灰度处理后,获取用户针对需要进行按键匹配的应用的触控频率。也即是说,电子设备100获取用户针对需要进行按键匹配的应用的触控频率这一步骤,只需要在电子设备100利用线性插值公式处理边缘检测之前完成,具体发生顺序本申请对此并不做限制。
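示例性的，结合触控频率构造修正权重矩阵的过程可以如下示意性实现。其中权重系数m的取值方式、点击数据的组织形式均为便于说明而作出的假设：

```python
import numpy as np

def touch_weight_matrix(shape, click_points, clicks_per_min, m=0.1):
    """根据第一次数数据构造梯度修正权重矩阵：触控频率越高的位置权重越大。
    click_points：历史触控坐标列表[(i, j), ...]；
    clicks_per_min：与之对应的每分钟点击次数列表。"""
    weight = np.ones(shape, dtype=np.float32)
    for (i, j), n in zip(click_points, clicks_per_min):
        weight[i, j] += m * n  # 修正值随点击频率n增大
    return weight
```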
S204、电子设备100结合上述获取到的用户触控频率,对第二图像进行边缘检测。
具体的,电子设备100利用边缘检测算法对第二图像进行边缘检测。
在本申请实施例中，边缘检测算法的边缘检测算子可以采用Sobel边缘差分算子或其他算子。Sobel边缘差分算子可以根据像素点上下、左右邻点灰度加权差，在边缘处达到极值这一现象来检测边缘。Sobel边缘差分算子计算出水平方向的差分Gx、垂直方向的差分Gy，由此可以确定像素点的梯度模（也称梯度强度）G和方向θ。如公式3和公式4所示，G为梯度强度（也称为梯度值），θ表示方向，arctan为反正切函数：
G = √(Gx² + Gy²)　（公式3）
θ = arctan(Gy/Gx)　（公式4）
如果梯度强度G大于或等于某一阈值,则认为该像素点为边缘点。
下面,示例性地描述如何计算梯度强度G和方向θ。
x和y方向的Sobel算子分别为（下式中“;”分隔矩阵的行）：
Sx = [-1 0 +1; -2 0 +2; -1 0 +1]
Sy = [+1 +2 +1; 0 0 0; -1 -2 -1]
其中，Sx表示x方向的Sobel算子，用于检测y方向的边缘；Sy表示y方向的Sobel算子，用于检测x方向的边缘（边缘方向和梯度方向垂直）。
若图像中有一个3x3的窗口为A，要计算梯度的第一像素点为e，则和Sobel算子进行卷积后，第一像素点e在x方向和y方向的梯度值分别为：
Gx = sum(Sx*A)
Gy = sum(Sy*A)
其中，*为卷积符号，sum表示矩阵中所有元素相加求和。根据公式3和公式4便可以计算出第一像素点e的第一梯度向量的梯度强度和方向。
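示例性的，公式3、公式4所述的梯度强度与方向计算可以用如下Python代码示意性实现（为避免除零，方向采用arctan2计算，属于实现上的等价选择）：

```python
import numpy as np

SX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float32)
SY = np.array([[1, 2, 1], [0, 0, 0], [-1, -2, -1]], dtype=np.float32)

def sobel_gradient(img):
    """对每个像素取3x3窗口A，计算Gx=sum(Sx*A)、Gy=sum(Sy*A)，
    再按公式3、公式4得到梯度强度G和方向theta。"""
    padded = np.pad(img.astype(np.float32), 1, mode='edge')
    h, w = img.shape
    gx = np.zeros((h, w), np.float32)
    gy = np.zeros((h, w), np.float32)
    for i in range(h):
        for j in range(w):
            a = padded[i:i + 3, j:j + 3]  # 3x3窗口A
            gx[i, j] = np.sum(SX * a)
            gy[i, j] = np.sum(SY * a)
    g = np.hypot(gx, gy)        # 公式3
    theta = np.arctan2(gy, gx)  # 公式4
    return g, theta
```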
对图像进行梯度计算后,仅仅基于梯度值提取的边缘仍然很模糊,因此,需要将计算出来的梯度边缘进行边缘细化,也即是说,保留局部最大梯度值,而将其他除局部最大梯度值外的所有梯度值抑制为0。该部分的算法分为两个步骤:1.将当前像素的梯度强度与沿正负方向上的多个像素进行比较。2.如果当前像素的梯度强度与另外几个像素相比最大,则该像素点被保留为边缘点,否则,该像素点将被抑制。通常为了更加精确的计算,在跨越梯度方向的几个相邻像素之间使用线性插值来得到要比较的梯度强度。
示例性的,如图17所示,以5x5窗口为当前第一像素点e的邻近空间,比较当前点像素e的梯度强度T(e)和正负梯度直线相交的窗口上的像素点Q1、Q2、Q3、Q4的梯度强度T(Q1)、T(Q2)、T(Q3)、T(Q4)。其中,S1、S2为与Q1同一条直线上的两点,S3、S4为与Q2同一条直线上的两点,S5、S6为与Q3同一条直线上的两点,S7、S8为与Q4同一条直线上的两点。
如公式5、公式6、公式7所示，计算Q1的梯度强度T(Q1)的线性插值公式为：
d = distance(S1,S2)　（公式5）
w = distance(Q1,S2)/d　（公式6）
T(Q1) = w×T(S1) + (1−w)×T(S2)　（公式7）
其中，distance(S1,S2)表示S1、S2两点之间的距离，w系数可以通过梯度方向计算得到。同时，结合用户的使用习惯，对触控频率高的区域增加修正权重m（n为每分钟当前点被用户点击的次数，n越大则m越大），以作为该区域梯度变换的修正值。同理，可计算Q2、Q3、Q4各点的梯度强度。
若当前像素点e的梯度强度与同方向的Q1、Q2、Q3、Q4各点的梯度强度相比较为最大，则保留其值；否则将当前像素点e的梯度抑制，也即是说，将当前像素点e的梯度设置为0。
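示例性的，上述边缘细化（非极大值抑制）可以用如下简化的Python代码示意：此处将梯度方向量化为四个方向、省略了线性插值，并以可选的touch_weight矩阵体现触控频率修正，均为便于说明的简化假设：

```python
import numpy as np

def non_max_suppression(g, theta, touch_weight=None):
    """仅保留沿梯度方向的局部最大梯度值，其余抑制为0。"""
    if touch_weight is not None:
        g = g * touch_weight  # 对触控频率高的区域增加修正权重
    h, w = g.shape
    out = np.zeros_like(g)
    angle = (np.rad2deg(theta) + 180.0) % 180.0  # 归一化到[0, 180)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            a = angle[i, j]
            if a < 22.5 or a >= 157.5:      # 梯度接近水平方向
                n1, n2 = g[i, j - 1], g[i, j + 1]
            elif a < 67.5:                   # 约45度方向
                n1, n2 = g[i - 1, j + 1], g[i + 1, j - 1]
            elif a < 112.5:                  # 梯度接近垂直方向
                n1, n2 = g[i - 1, j], g[i + 1, j]
            else:                            # 约135度方向
                n1, n2 = g[i - 1, j - 1], g[i + 1, j + 1]
            if g[i, j] >= n1 and g[i, j] >= n2:
                out[i, j] = g[i, j]          # 保留局部最大梯度值
    return out
```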
在上述对梯度边缘进行细化后，剩余的像素可以更准确地表示图像中的实际边缘。然而，图像中仍然存在由于噪声和颜色变化引起的一些边缘像素。为了解决这些杂散响应，必须用弱梯度值过滤边缘像素，并保留具有高梯度值的边缘像素，这可以通过选择高低阈值来实现。统计整幅图像所有像素点的梯度强度的直方图，选取直方图中累计占比75%处所对应的梯度强度为高阈值（也可以称为第一阈值），累计占比25%处所对应的梯度强度为低阈值。如果像素点的梯度值高于高阈值，则将该像素保留；如果像素点的梯度值小于低阈值，则该像素点会被排除。
最后,所有识别出的边缘像素点依次连接构成第一图像上按键区域的边界。
在另外一些实施例中,也可以通过其他方法选取高阈值和低阈值的数值,本申请对此不作限制。
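示例性的，按直方图占比选取高低阈值并过滤边缘像素的过程可以如下示意性实现（75%、25%的占比取值与前文一致，percentile的具体插值方式为实现细节上的假设）：

```python
import numpy as np

def select_edge_pixels(g):
    """按梯度强度直方图选取高、低阈值并过滤像素。"""
    nonzero = g[g > 0]
    high = np.percentile(nonzero, 75)  # 高阈值（第一阈值）
    low = np.percentile(nonzero, 25)   # 低阈值
    keep = g >= high                   # 梯度值高于高阈值的像素被保留
    drop = (g > 0) & (g < low)         # 梯度值小于低阈值的像素被排除
    return keep, drop, high, low
```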
S205、电子设备100获取多帧图像,并对每一帧图像重复上述步骤。
具体的,电子设备100可以获取在时间轴上连续的一组图像,并对该组中的每一帧图像重复进行高斯滤波和边缘检测。在另一些实施例中,电子设备100也可以根据一定的时间间隔获取一组图像。可以理解的是,对于电子设备100如何获取多帧图像,本申请对此不作限制。
S206、电子设备100比较多组图像的结果,获取按键区域边缘。
具体的,电子设备100可以获取多组图像进行高斯滤波和边缘检测后的输出结果,并将该结果进行比较。电子设备100可以获取多组图像输出结果比较后的重复识别位置,该重复识别位置即为电子设备100需要获取的按键区域。
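示例性的，比较多帧图像的边缘检测结果、提取重复识别位置的步骤可以如下示意性实现。此处以“各帧均检测到”作为重复识别的判定条件，实际实现中也可以采用多数表决等其他判定方式：

```python
import numpy as np

def stable_button_region(edge_masks):
    """edge_masks：多帧图像各自的边缘布尔掩码列表（尺寸一致）。
    返回在所有帧中均被识别为边缘的位置，即按键区域的边缘。"""
    stacked = np.stack(edge_masks, axis=0)
    return np.all(stacked, axis=0)
```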
上述实施例中,根据上下文,术语“当…时”可以被解释为意思是“如果…”或“在…后”或“响应于确定…”或“响应于检测到…”。类似地,根据上下文,短语“在确定…时”或“如果检测到(所陈述的条件或事件)”可以被解释为意思是“如果确定…”或“响应于确定…”或“在检测到(所陈述的条件或事件)时”或“响应于检测到(所陈述的条件或事件)”。
在上述实施例中，可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时，可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时，全部或部分地产生按照本申请实施例所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中，或者从一个计算机可读存储介质向另一个计算机可读存储介质传输，例如，所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线（例如同轴电缆、光纤、数字用户线）或无线（例如红外、无线、微波等）方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质（例如，软盘、硬盘、磁带）、光介质（例如DVD）、或者半导体介质（例如固态硬盘）等。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,该流程可以由计算机程序来指令相关的硬件完成,该程序可存储于计算机可读取存储介质中,该程序在执行时,可包括如上述各方法实施例的流程。而前述的存储介质包括:ROM或随机存储记忆体RAM、磁碟或者光盘等各种可存储程序代码的介质。

Claims (30)

  1. 一种按键映射方法,其特征在于,所述方法包括:
    第一电子设备与第二电子设备建立第一连接;
    所述第一电子设备显示第一用户界面,所述第一用户界面中包含多个控件,所述多个控件中包括第一控件;
    所述第一电子设备检测到第一用户操作;
    所述第一电子设备识别出所述第一用户界面中的所述多个控件;
    所述第一电子设备选定所述多个控件中的第一控件;
    所述第一电子设备通过所述第一连接接收到所述第二电子设备发送的第一信号,并响应在所述第一控件处于选定状态时收到的所述第一信号,建立所述第二电子设备的第一物理按键与所述第一控件的映射关系;所述第一信号是所述第二电子设备在所述第一物理按键被用户按压时产生的。
  2. 如权利要求1所述的方法,其特征在于,还包括:
    在建立所述第二电子设备的第一物理按键与所述第一控件的映射关系后,所述第一电子设备选定所述多个控件中的第二控件;
    所述第一电子设备通过所述第一连接接收到所述第二电子设备发送的第二信号,并响应在所述第二控件处于选定状态时收到的所述第二信号,建立所述第二电子设备的第二物理按键与所述第二控件的映射关系;所述第二信号是所述第二电子设备在所述第二物理按键被用户按压时产生的。
  3. 如权利要求1或2所述的方法,其特征在于,所述第一用户界面为游戏应用程序的用户界面,所述第二电子设备为游戏手柄。
  4. 如权利要求1-3中任一项所述的方法,其特征在于,还包括:在所述多个控件与所述第二电子设备的多个物理按键之间的映射关系建立完成之后,所述第一电子设备通过所述第一连接接收到所述第一信号,所述第一电子设备执行所述第一控件对应的功能。
  5. 如权利要求1-4中任一项所述的方法,其特征在于,所述第一电子设备识别出所述第一用户界面中的所述多个控件,具体包括:
    所述第一电子设备通过边缘检测方法从第一图像中识别出所述第一用户界面中的所述控件的边界;所述第一图像为对所述第一用户界面进行灰度处理得到的图像。
  6. 如权利要求5所述的方法,其特征在于,所述控件的边界包括多个边界点,所述边缘检测方法具体包括:所述第一图像中的第一像素点在所述第一用户界面中对应的触控位置被用户触控的频率越高,所述第一像素点被识别为所述控件的边界点的概率越高。
  7. 如权利要求5或6所述的方法,其特征在于,所述第一电子设备通过边缘检测方法从第一图像中识别出所述第一用户界面中的所述控件的边界,具体包括:
    所述第一电子设备采用边缘算子,计算所述第一图像中每个像素点的梯度向量;
    所述第一电子设备采用线性插值法，比较第一像素点的第一梯度向量的梯度值与所述第一像素点的第一梯度向量相同方向上其他像素点的梯度值；
    如果在所述方向上,所述第一像素点的梯度值最大,那么所述第一电子设备保留所述第一像素点的梯度值,将所述其他像素点的梯度值设为零;
    所述第一电子设备设定第一阈值,如果所述第一像素点的梯度值大于所述第一阈值,则所述第一像素点被保留,所有被保留的像素点构成所述第一用户界面中的所述控件的边界。
  8. 如权利要求4-7中任一项所述的方法,其特征在于,所述第一电子设备执行所述第一控件对应的功能,具体包括:所述第一电子设备显示第二用户界面,所述第二用户界面不同于所述第一用户界面。
  9. 如权利要求1-8中任一项所述的方法,其特征在于,所述第一控件处于选定状态,具体包括一项或多项:所述第一控件处于高亮状态、所述第一控件上显示有光标、所述第一控件处于闪烁状态。
  10. 如权利要求1-9中任一项所述的方法,其特征在于,还包括:
    在所述第一电子设备显示第一用户界面之后,所述第一电子设备在所述第一用户界面中显示第一悬浮控件;
    其中,所述第一电子设备检测到第一用户操作,具体包括:检测到作用于所述第一悬浮控件上的用户操作。
  11. 一种通信方法,应用于通信系统,所述通信系统包括第一电子设备和第二电子设备,其特征在于,包括:
    所述第一电子设备与所述第二电子设备建立第一连接;
    所述第一电子设备显示第一用户界面,所述第一用户界面中包含多个控件,所述多个控件中包括第一控件;
    所述第一电子设备检测到第一用户操作;
    所述第一电子设备识别出所述第一用户界面中的所述多个控件;
    所述第一电子设备选定所述多个控件中的第一控件;
    所述第二电子设备检测到第一物理按键被用户按压,生成第一信号;
    所述第二电子设备通过所述第一连接向所述第一电子设备发送所述第一信号;
    所述第一电子设备响应在所述第一控件处于选定状态时收到的所述第一信号,建立所述第一物理按键与所述第一控件的映射关系。
  12. 如权利要求11所述的方法,其特征在于,还包括:
    在建立所述第一物理按键与所述第一控件的映射关系后,所述第一电子设备选定所述多个控件中的第二控件;
    所述第二电子设备检测到第二物理按键被用户按压,生成第二信号;
    所述第二电子设备通过所述第一连接向所述第一电子设备发送所述第二信号;
    所述第一电子设备响应在所述第二控件处于选定状态时收到的所述第二信号,建立所述第二物理按键与所述第二控件的映射关系。
  13. 如权利要求11或12所述的方法,其特征在于,所述第一用户界面为游戏应用程序的用户界面,所述第二电子设备为游戏手柄。
  14. 如权利要求11-13中任一项所述的方法,其特征在于,还包括:在所述多个控件与所述第二电子设备的多个物理按键之间的映射关系建立完成之后,所述第一电子设备通过所述第一连接接收到所述第一信号,所述第一电子设备执行所述第一控件对应的功能。
  15. 如权利要求11-14中任一项所述的方法,其特征在于,所述第一电子设备识别出所述第一用户界面中的所述多个控件,具体包括:
    所述第一电子设备通过边缘检测方法从第一图像中识别出所述第一用户界面中的所述控件的边界;所述第一图像为对所述第一用户界面进行灰度处理得到的图像。
  16. 如权利要求15所述的方法,其特征在于,所述控件的边界包括多个边界点,所述边缘检测方法具体包括:所述第一图像中的第一像素点在所述第一用户界面中对应的触控位置被用户触控的频率越高,所述第一像素点被识别为所述控件的边界点的概率越高。
  17. 如权利要求15或16所述的方法,其特征在于,所述第一电子设备通过边缘检测方法从第一图像中识别出所述第一用户界面中的所述控件的边界,具体包括:
    所述第一电子设备采用边缘算子,计算所述第一图像中每个像素点的梯度向量;
    所述第一电子设备采用线性插值法,比较第一像素点的第一梯度向量的梯度值与所述第一像素点的第一梯度向量相同方向上其他像素点的梯度值;
    如果在所述方向上,所述第一像素点的梯度值最大,那么所述第一电子设备保留所述第一像素点的梯度值,将所述其他像素点的梯度值设为零;
    所述第一电子设备设定第一阈值,如果所述第一像素点的梯度值大于所述第一阈值,则所述第一像素点被保留,所有被保留的像素点构成所述第一用户界面中的所述控件的边界。
  18. 如权利要求14-17中任一项所述的方法,其特征在于,所述第一电子设备执行所述第一控件对应的功能,具体包括:所述第一电子设备显示第二用户界面,所述第二用户界面不同于所述第一用户界面。
  19. 如权利要求11-18中任一项所述的方法,其特征在于,所述第一控件处于选定状态,具体包括一项或多项:所述第一控件处于高亮状态、所述第一控件上显示有光标、所述第一控件处于闪烁状态。
  20. 如权利要求11-19中任一项所述的方法,其特征在于,还包括:
    在所述第一电子设备显示第一用户界面之后,所述第一电子设备在所述第一用户界面中显示第一悬浮控件;
    其中,所述第一电子设备检测到第一用户操作,具体包括:检测到作用于所述第一悬浮控件上的用户操作。
  21. 一种电子设备，其特征在于，包括通信装置、触摸屏、存储器以及耦合于所述存储器的处理器，所述存储器中存储有可执行指令，其中：
    所述通信装置用于与第二电子设备建立第一连接;
    所述触摸屏用于显示第一用户界面,所述第一用户界面中包含多个控件,所述多个控件中包括第一控件;
    所述触摸屏还用于检测到第一用户操作;
    所述处理器用于识别出所述第一用户界面中的所述多个控件;
    所述处理器还用于选定所述多个控件中的第一控件;
    所述通信装置还用于通过所述第一连接接收到所述第二电子设备发送的第一信号;
    所述处理器还用于,响应在所述第一控件处于选定状态时收到的所述第一信号,建立所述第二电子设备的第一物理按键与所述第一控件的映射关系;所述第一信号是所述第二电子设备在所述第一物理按键被用户按压时产生的。
  22. 如权利要求21所述的电子设备,其特征在于,所述处理器还用于,在建立所述第二电子设备的第一物理按键与所述第一控件的映射关系后,选定所述多个控件中的第二控件;
    所述通信装置还用于通过所述第一连接接收到所述第二电子设备发送的第二信号;
    所述处理器还用于,响应在所述第二控件处于选定状态时收到的所述第二信号,建立所述第二电子设备的第二物理按键与所述第二控件的映射关系;所述第二信号是所述第二电子设备在所述第二物理按键被用户按压时产生的。
  23. 如权利要求21或22所述的电子设备,其特征在于,所述第一用户界面为游戏应用程序的用户界面,所述第二电子设备为游戏手柄。
  24. 如权利要求21-23中任一项所述的电子设备,其特征在于,所述处理器还用于,在所述多个控件与所述第二电子设备的多个物理按键之间的映射关系建立完成之后,通过所述第一连接接收到所述第一信号时,执行所述第一控件对应的功能。
  25. 如权利要求21-24中任一项所述的电子设备,其特征在于,所述处理器具体用于:
    通过边缘检测方法从第一图像中识别出所述第一用户界面中的所述控件的边界;所述第一图像为对所述第一用户界面进行灰度处理得到的图像。
  26. 如权利要求25所述的电子设备,其特征在于,所述控件的边界包括多个边界点,所述边缘检测方法具体包括:所述第一图像中的第一像素点在所述第一用户界面中对应的触控位置被用户触控的频率越高,所述第一像素点被识别为所述控件的边界点的概率越高。
  27. 如权利要求25或26所述的电子设备,其特征在于,所述处理器具体用于:
    采用边缘算子,计算所述第一图像中每个像素点的梯度向量;
    采用线性插值法,比较第一像素点的第一梯度向量的梯度值与所述第一像素点的第一梯度向量相同方向上其他像素点的梯度值;
    如果在所述方向上,所述第一像素点的梯度值最大,那么保留所述第一像素点的梯度值,将所述其他像素点的梯度值设为零;
    设定第一阈值，如果所述第一像素点的梯度值大于所述第一阈值，则所述第一像素点被保留，所有被保留的像素点构成所述第一用户界面中的所述控件的边界。
  28. 如权利要求24-27中任一项所述的电子设备,其特征在于,所述处理器具体用于,显示第二用户界面,所述第二用户界面不同于所述第一用户界面。
  29. 如权利要求21-28中任一项所述的电子设备,其特征在于,所述第一控件处于选定状态,具体包括一项或多项:所述第一控件处于高亮状态、所述第一控件上显示有光标、所述第一控件处于闪烁状态。
  30. 一种计算机可读存储介质,包括指令,其特征在于,当所述指令在电子设备上运行时,使得所述电子设备执行如权利要求1至10、权利要求11至20中任一项所述的方法。
PCT/CN2021/128486 2020-11-05 2021-11-03 一种按键映射方法、电子设备及系统 WO2022095906A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011223709.XA CN114527903A (zh) 2020-11-05 2020-11-05 一种按键映射方法、电子设备及系统
CN202011223709.X 2020-11-05

Publications (1)

Publication Number Publication Date
WO2022095906A1 true WO2022095906A1 (zh) 2022-05-12

Family

ID=81456961

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/128486 WO2022095906A1 (zh) 2020-11-05 2021-11-03 一种按键映射方法、电子设备及系统

Country Status (2)

Country Link
CN (1) CN114527903A (zh)
WO (1) WO2022095906A1 (zh)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102999167A (zh) * 2012-11-14 2013-03-27 广东欧珀移动通信有限公司 一种pc端按键操作移动终端内带有虚拟键盘应用的方法
US20150231498A1 (en) * 2014-02-17 2015-08-20 DingMedia, Ltd. Universal controller interpreter
CN106775282A (zh) * 2016-11-10 2017-05-31 宇龙计算机通信科技(深圳)有限公司 操作终端的方法及装置
CN112764564A (zh) * 2019-10-21 2021-05-07 Oppo广东移动通信有限公司 触控信息的处理方法、装置、存储介质及电子设备
CN111399920A (zh) * 2020-03-11 2020-07-10 深圳汗思凯普科技有限公司 移动终端应用程序按键自动配置的方法、装置及存储介质

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024067308A1 (zh) * 2022-09-30 2024-04-04 华为技术有限公司 智能设备控制方法、电子设备及系统

Also Published As

Publication number Publication date
CN114527903A (zh) 2022-05-24

Similar Documents

Publication Publication Date Title
WO2021129326A1 (zh) 一种屏幕显示方法及电子设备
WO2020259452A1 (zh) 一种移动终端的全屏显示方法及设备
EP3952263A1 (en) Notification message preview method and electronic device
JP7238115B2 (ja) 写真撮影シナリオおよび電子デバイスで画像を表示するための方法
WO2021213164A1 (zh) 应用界面交互方法、电子设备和计算机可读存储介质
CN113542485B (zh) 一种通知处理方法、电子设备及计算机可读存储介质
WO2020134869A1 (zh) 电子设备的操作方法和电子设备
WO2021036585A1 (zh) 一种柔性屏显示方法和电子设备
WO2021036770A1 (zh) 一种分屏处理方法及终端设备
WO2021013132A1 (zh) 输入方法及电子设备
CN111543042A (zh) 通知消息的处理方法及电子设备
WO2020107463A1 (zh) 一种电子设备的控制方法及电子设备
WO2022160991A1 (zh) 权限控制方法和电子设备
WO2020024108A1 (zh) 一种应用图标的显示方法及终端
WO2022095744A1 (zh) Vr显示控制方法、电子设备及计算机可读存储介质
CN112150499A (zh) 图像处理方法及相关装置
WO2022166435A1 (zh) 分享图片的方法和电子设备
WO2022105702A1 (zh) 保存图像的方法及电子设备
WO2022022674A1 (zh) 应用图标布局方法及相关装置
WO2022095906A1 (zh) 一种按键映射方法、电子设备及系统
CN115032640B (zh) 手势识别方法和终端设备
CN114283195B (zh) 生成动态图像的方法、电子设备及可读存储介质
WO2022179495A1 (zh) 一种隐私风险反馈方法、装置及第一终端设备
CN113610943B (zh) 图标圆角化的处理方法及装置
CN117724863A (zh) 一种目标信号处理方法和电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21888603

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21888603

Country of ref document: EP

Kind code of ref document: A1