WO2022028537A1 - Device identification method and related apparatus - Google Patents

Device identification method and related apparatus

Info

Publication number
WO2022028537A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
interface
icon
label
user
Prior art date
Application number
PCT/CN2021/110906
Other languages
English (en)
French (fr)
Inventor
徐杰
龙嘉裕
吴思举
孙科
Original Assignee
Huawei Technologies Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN202011183311.8A (publication CN114079691B)
Application filed by Huawei Technologies Co., Ltd.
Priority to EP21853577.1A (publication EP4184905A4)
Priority to JP2023507686A (publication JP2023538835A)
Publication of WO2022028537A1
Priority to US18/164,170 (publication US20230188832A1)

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/70 Services for machine-to-machine communication [M2M] or machine type communication [MTC]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/72409 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
    • H04M 1/72412 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories using two-way short-range wireless interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N 23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N 23/632 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/80 Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 76/00 Connection management
    • H04W 76/10 Connection setup
    • H04W 76/14 Direct-mode setup
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 8/00 Network data management
    • H04W 8/005 Discovery of network devices, e.g. terminals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2250/00 Details of telephonic subscriber devices
    • H04M 2250/52 Details of telephonic subscriber devices including functional features of a camera

Definitions

  • the present application relates to the field of electronic technologies, and in particular, to a device identification method and related devices.
  • Smart connected devices are becoming more and more popular. Many users own multiple electronic devices such as smartphones, computers, smart TVs, tablets, and smart speakers, and a household may also contain smart audio and video devices, routers/Wi-Fi boxes, smart cleaning devices, smart kitchen appliances, smart lighting systems, and other electronic equipment.
  • Conventionally, the target device is discovered and selected among multiple devices through a menu/list, a map, or NFC, which makes user operation complicated.
  • The embodiments of the present application provide a device identification method and a related device, which can intuitively display identification information of nearby devices through simple operations, provide an interaction approach between devices, realize coordinated control among multiple devices, and effectively improve user experience.
  • the present application provides a device identification method, which is applied to a first electronic device with a camera.
  • The method includes: the first electronic device receives a first operation; in response to the first operation, the first electronic device displays a first interface, where the first interface includes a preview image captured by the camera, and the preview image includes the second electronic device; the first electronic device obtains first position information of the second electronic device relative to the first electronic device; based on the first position information and the display area of the second electronic device in the preview screen, the first electronic device determines the display position of a first label in the preview screen and displays the first label at that display position, where the first label is used to identify the second electronic device; the first electronic device receives a second operation on the first label; in response to the second operation, the first electronic device displays a second interface including one or more controls for controlling the second electronic device.
  • The first electronic device receives the first operation, displays the first interface, activates the camera, and displays the image captured by the camera on the first interface in real time; using image recognition technology, the first electronic device recognizes the electronic devices in the image and their device types, such as speaker, computer, or tablet computer.
  • The first electronic device obtains the first position information of the second electronic device relative to the first electronic device according to a wireless positioning technology (such as UWB positioning, Bluetooth positioning, or Wi-Fi positioning).
  • the location information includes one or more of distance, direction, and angle.
  • The first electronic device determines the display position of the first label of the second electronic device in the preview screen; the first label is used to identify the second electronic device, for example by its device name, device type, and so on.
  • the display position of the first label is related to the display position of the second electronic device.
  • When the first electronic device detects a user operation on the first label, it outputs a second interface, where the second interface includes one or more controls for controlling the second electronic device.
  • the second interface may be displayed by being superimposed on the first interface, or the electronic device may jump from the first interface to display the second interface.
  • In this way, the present application presents the correspondence between the first label and the second electronic device in real time through an augmented reality display mode, realizes interaction between the first electronic device and the second electronic device through the first label, achieves coordinated control among multiple devices, and improves user experience.
  • Acquiring the first position information of the second electronic device relative to the first electronic device by the first electronic device specifically includes: the first electronic device broadcasts a detection request, where the detection request includes the identity of the first electronic device; when the first electronic device receives the detection response sent by the second electronic device based on the detection request, it determines the first position information of the second electronic device relative to the first electronic device based on the detection response, where the detection response includes the identity of the second electronic device.
  • the first position information includes relative positions of the second electronic device and the first electronic device, such as distance, direction, angle, and the like.
  • The first electronic device can calculate the distance between the second electronic device and the first electronic device according to the time difference between sending the detection request and receiving the detection response (the distance is equal to the time difference multiplied by the propagation speed of the electromagnetic wave); by calculating the arrival angle of the detection response, the first electronic device can determine the azimuth angle of the second electronic device relative to the first electronic device.
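The distance and angle computation just described can be sketched as follows. This is an illustrative reconstruction rather than the patent's implementation: it assumes the two devices' clocks are synchronized (so a one-way time difference is meaningful), and it derives the arrival angle from the phase difference between two receiving antennas, a common UWB technique; all function names and parameters are hypothetical.

```python
import math

C = 299_792_458.0  # propagation speed of the electromagnetic wave (m/s)

def distance_from_time_difference(t_sent: float, t_received: float) -> float:
    """Distance = time difference x propagation speed, assuming synchronized
    clocks so the one-way time of flight can be measured directly."""
    return (t_received - t_sent) * C

def angle_of_arrival(phase_diff_rad: float, wavelength_m: float,
                     antenna_spacing_m: float) -> float:
    """Azimuth (radians) of the incoming signal, from the phase difference
    measured between two antennas spaced a fraction of a wavelength apart."""
    return math.asin(phase_diff_rad * wavelength_m / (2 * math.pi * antenna_spacing_m))
```

For a round-trip exchange without synchronized clocks, the measured time difference would instead be halved (minus the responder's processing delay) before multiplying by the propagation speed.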
  • In some embodiments, the detection response includes the identity of the second electronic device and the first location information, and the first electronic device determines the first location information of the second electronic device relative to the first electronic device based on the detection response.
  • the second electronic device calculates the relative position of the second electronic device and the first electronic device according to the received detection request.
  • The detection request includes the sending time, and the second electronic device determines the time difference based on the sending time and the time when it receives the detection request, so as to calculate the distance between the second electronic device and the first electronic device; the second electronic device also calculates the arrival angle of the detection request based on the received detection request, from which the azimuth angle of the second electronic device relative to the first electronic device can be determined.
  • the second electronic device sends a probe response to the first electronic device, where the probe response includes the identity identifier of the second electronic device and the first location information.
  • the display position of the first label in the preview screen and the display area of the second electronic device in the preview screen partially or completely overlap.
  • the first label may be displayed in the display area of the second electronic device, may be displayed on the edge of the display area of the second electronic device, or may be displayed at a position close to the display area of the second electronic device.
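One simple way to turn the measured azimuth into a label position near the device's display area is to map the angle linearly across the camera's horizontal field of view. A real implementation would use the camera's actual projection model; treat this as a hedged sketch with illustrative parameter names.

```python
def label_x_in_preview(azimuth_deg: float, horizontal_fov_deg: float,
                       preview_width_px: int) -> int:
    """Map a device's azimuth relative to the camera axis (0 = centre,
    positive = right of centre) to a horizontal pixel position in the
    preview screen, clamping to the viewing range."""
    half_fov = horizontal_fov_deg / 2.0
    clamped = max(-half_fov, min(half_fov, azimuth_deg))
    return round((clamped + half_fov) / horizontal_fov_deg * (preview_width_px - 1))
```

The vertical coordinate can be derived the same way from the elevation angle and vertical field of view, after which the label is drawn inside, at the edge of, or next to the device's display area as described above.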
  • In some embodiments, the method further includes: the first electronic device acquires second position information of the third electronic device relative to the first electronic device; when the first electronic device detects that the preview screen does not include the third electronic device, and determines based on the second position information that the third electronic device is within the viewing range of the camera, the first electronic device determines the display position of a second label in the preview screen based on the second position information, where the second label is used to indicate one or more of the following pieces of information: identification information of the third electronic device, the obstruction blocking the third electronic device, and the second location information.
  • That is, when the third electronic device is within the viewing range of the camera but does not appear in the preview screen, the first electronic device determines that the third electronic device is blocked, and outputs the second label of the third electronic device, indicating one or more of the identification information of the third electronic device, the blocking object, and the blocked position in the preview interface.
  • In some embodiments, the method further includes: when the first electronic device detects that the preview image does not include the third electronic device, and determines based on the second position information that the third electronic device is not within the viewing range of the camera, the first electronic device determines the display position of a third label in the preview screen based on the second location information, where the third label is used to indicate one or more of the following pieces of information: identification information of the third electronic device and the second location information.
  • That is, when the first electronic device determines that the third electronic device is not within the viewing range of the camera, it outputs the third label of the third electronic device, indicating one or more of the identification information of the third electronic device and the relative position (direction, angle, distance, etc.) of the third electronic device.
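The three display cases above (device visible in the preview, device in the viewing range but blocked, device outside the viewing range) amount to a small decision rule. The sketch below uses hypothetical names and reduces "in the viewing range" to an azimuth check against the horizontal field of view:

```python
from enum import Enum

class LabelKind(Enum):
    DEVICE = "device label"        # device visible in the preview screen
    OCCLUDED = "occlusion label"   # within the viewing range but blocked by an obstruction
    OFFSCREEN = "direction hint"   # outside the viewing range; indicate relative position

def choose_label(azimuth_deg: float, horizontal_fov_deg: float,
                 detected_in_image: bool) -> LabelKind:
    """Pick the label kind for a wirelessly located device, following the
    three cases described above."""
    in_view = abs(azimuth_deg) <= horizontal_fov_deg / 2.0
    if not in_view:
        return LabelKind.OFFSCREEN
    return LabelKind.DEVICE if detected_in_image else LabelKind.OCCLUDED
```

A full implementation would also check elevation against the vertical field of view, but the branching structure stays the same.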
  • In some embodiments, the preview image includes an image of the fourth electronic device, and the method further includes: the first electronic device determines, based on the preview image, that the device type of the fourth electronic device is the first type; among the electronic devices associated or bound under the account of the first electronic device, the first electronic device determines the first target device whose device type is the first type; the first electronic device displays a fourth label, where the fourth label is used to indicate that the image of the fourth electronic device is associated with the first target device. In this manner, the fourth electronic device can still be identified when the first electronic device cannot detect its location information but its image is in the preview screen.
  • the first electronic device identifies the device type of the fourth electronic device according to the image recognition technology, and detects whether there is a target device of this device type in the devices logged into the same account (eg, Huawei account) as the first electronic device. If so, the first electronic device considers the target device to be the fourth electronic device, and the first electronic device outputs a fourth label identifying the target device.
  • In some embodiments, the preview image includes an image of the fifth electronic device, and the method further includes: the first electronic device determines, based on the preview image, that the device type of the fifth electronic device is the second type; the first electronic device obtains third position information of the first electronic device, and the first electronic device stores the correspondence between electronic devices and position information; based on the correspondence, the first electronic device determines, according to the third position information, the second target device whose device type is the second type and whose location information matches the third location information; the first electronic device displays a fifth label, where the fifth label is used to indicate that the image of the fifth electronic device is associated with the second target device. In this manner, the fifth electronic device can still be identified when the first electronic device cannot detect its location information but its image is in the preview screen.
  • Since the first electronic device stores the correspondence between electronic devices and location information (for example, smart speaker in the living room, smart desk lamp in the bedroom, computer at the office, etc.), it obtains its current geographic location, identifies the device type of the fifth electronic device using image recognition technology, and detects whether there is a target device of that device type among the devices at the same geographic location as the first electronic device. If so, the first electronic device considers that the target device is the fifth electronic device and outputs a fifth label identifying the target device.
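The location-based fallback just described can be read as a lookup in the stored device-to-location correspondence. The sketch below is illustrative only; the registry entries and field names are invented:

```python
# Stored correspondence between electronic devices and location information
# (illustrative entries, mirroring the examples above).
REGISTRY = [
    {"name": "SmartSpeaker-A", "type": "speaker", "location": "living room"},
    {"name": "DeskLamp-B", "type": "desk lamp", "location": "bedroom"},
    {"name": "Laptop-C", "type": "computer", "location": "office"},
]

def find_target_by_location(recognized_type, current_location):
    """Fallback when wireless positioning fails: return the stored device
    whose type matches the image-recognized type and whose recorded location
    matches the first electronic device's current geographic location."""
    for dev in REGISTRY:
        if dev["type"] == recognized_type and dev["location"] == current_location:
            return dev["name"]
    return None
```

The account-based fallback for the fourth electronic device works the same way, with the account's associated-device list taking the place of the location registry.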
  • In some embodiments, the first interface further includes a first icon, and the first icon is associated with the data to be shared; the method further includes: the first electronic device receives a third operation, where the third operation is an operation on the first label and/or the first icon; in response to the third operation, the first electronic device sends the data to be shared to the second electronic device.
  • The third operation includes, but is not limited to, a drag operation, a click operation, and the like; this provides a data sharing method in which the user selects, on the first interface, the second electronic device to share with, and the data to be shared is sent to that device.
  • the user operation of data sharing is simplified, the device information is displayed intuitively, and the user experience is improved.
  • Before the first electronic device receives the third operation, the method further includes: according to the data type of the data to be shared, the first electronic device displays on the first interface a first label in a first display form, where the first label in the first display form is used to prompt the user that the second electronic device supports outputting the data to be shared.
  • the first display form may be to brighten the display area of the first label (change brightness, color, etc.).
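Whether a label is shown in the first display form (for example, brightened) can be decided by checking the target device's output capabilities against the data type to be shared. The capability table below is invented for illustration; a real system would query the device's declared capabilities:

```python
# Illustrative capability table: which data types each device type can output.
CAPABILITIES = {
    "speaker": {"audio"},
    "tv": {"audio", "video", "image"},
    "printer": {"image", "document"},
}

def label_display_form(device_type, shared_data_type):
    """Return how the device's label should be drawn: the first display form
    (highlighted) when the device supports outputting the data to be shared,
    a normal/dimmed form otherwise."""
    supported = shared_data_type in CAPABILITIES.get(device_type, set())
    return "highlighted" if supported else "dimmed"
```

Dragging the shared data onto a dimmed label would then trigger the prompt message described below instead of a transfer.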
  • In some embodiments, the preview screen includes an image of the third electronic device and a third label, where the third label is associated with the third electronic device; the method further includes: the first electronic device receives a fourth operation, where the fourth operation is an operation on the third label and/or the first icon; in response to the fourth operation, the first electronic device outputs a prompt message, where the prompt message is used to prompt the user that the third electronic device does not support outputting the data to be shared.
  • The present application further provides an electronic device, comprising one or more processors and a memory; the memory stores computer instructions, and when the one or more processors invoke the computer instructions, the electronic device is caused to execute the following:
  • Based on the first relative position information and the display position of the first target device in the preview screen, determine the display position of the first label in the preview screen, where the first label is used to indicate the identification information of the first target device;
  • a second interface is displayed, the second interface including one or more controls for controlling the first target device.
  • The electronic device receives an operation and displays the interface, activates the camera, and displays the image captured by the camera in the interface in real time; using image recognition technology, the electronic device recognizes the electronic devices in the image and their device types (for example, speaker, computer, or tablet computer), such as the first target device; and the electronic device obtains the position information of the first target device relative to the electronic device according to a wireless positioning technology (such as UWB positioning, Bluetooth positioning, or Wi-Fi positioning).
  • the location information includes one or more of distance, direction, and angle.
  • the electronic device determines the display location of the label of the first target device in the preview screen.
  • the label is used to identify the first target device, such as device name and device type of the first target device.
  • the display position of the label is related to the display position of the first target device.
  • When the electronic device detects a user operation on the label, the electronic device outputs a second interface, where the second interface includes one or more controls for controlling the first target device.
  • the second interface may be displayed by being superimposed on the interface, or the electronic device may jump from the interface to display the second interface.
  • In this way, the present application presents the correspondence between the first label and the first target device in real time on the first interface of the electronic device by means of augmented reality display, realizes interaction between the electronic device and the first target device through the first label, achieves coordinated control among multiple devices, and improves user experience.
  • The process specifically includes: broadcasting a probe request, where the probe request includes the identity of the electronic device; when receiving a probe response sent by the first target device based on the probe request, determining the first relative position information with respect to the first target device based on the probe response, where the probe response includes the identity of the first target device.
  • the detection response includes the identity of the first target device and the first relative position information
  • the electronic device determines the first relative position information, such as distance, direction, angle, etc., between the first target device and the electronic device based on the detection response.
  • the first target device calculates the relative position of the first target device and the electronic device according to the received detection request.
  • The detection request includes the sending time, and the first target device determines the time difference based on the sending time and the time when it receives the detection request, thereby calculating the distance between the first target device and the electronic device; the first target device also calculates the arrival angle of the detection request based on the received detection request, from which the azimuth angle of the first target device relative to the electronic device can be determined.
  • the first target device sends a probe response to the electronic device, where the probe response includes the identity of the first target device and the first relative position information.
  • the display position of the first label in the preview screen and the display position of the first target device in the preview screen partially or completely overlap.
  • the first label may be displayed in the display area of the first target device, may be displayed on the edge of the display area of the first target device, or may be displayed at a position close to the display area of the first target device.
  • When the one or more processors invoke the computer instructions, the electronic device is caused to further execute: acquiring second relative position information with respect to the second target device; when the electronic device detects that the preview screen does not include the second target device, and determines based on the second relative position information that the second target device is within the viewing range of the camera, the electronic device determines the display position of the second label in the preview screen based on the second relative position information, where the second label is used to indicate one or more of the following pieces of information: identification information of the second target device, the obstruction blocking the second target device, and the second relative position information.
  • When the one or more processors invoke the computer instructions, the electronic device further executes: when the electronic device detects that the preview screen does not include the second target device, and determines based on the second relative position information that the second target device is not within the viewing range of the camera, the electronic device determines the display position of the third label in the preview screen based on the second relative position information, where the third label is used to indicate one or more of the following pieces of information: identification information of the second target device and the second relative position information.
  • In some embodiments, the preview screen includes an image of the third target device, and the electronic device further executes: determining, based on the preview screen, that the device type of the third target device is the first type; determining, among the electronic devices associated or bound under the account of the electronic device, the identification information of the device whose device type is the first type; and displaying a fourth label, where the fourth label is used to indicate that the image of the third target device is associated with the identification information.
  • In some embodiments, the preview screen includes an image of the fourth target device, and the electronic device further executes: determining, based on the preview screen, that the device type of the fourth target device is the second type; obtaining the position information of the electronic device, where the electronic device stores the correspondence between electronic devices and position information; and determining, according to the position information and the correspondence, the device whose device type is the second type.
  • In some embodiments, the first interface further includes a first icon, and the first icon is associated with the data to be shared; when the one or more processors invoke the computer instructions, the electronic device further executes: receiving a third operation, where the third operation is an operation on the first label and/or the first icon; in response to the third operation, sending the data to be shared to the first target device.
  • The third operation includes, but is not limited to, a drag operation, a click operation, and the like; this provides a data sharing method in which the user selects, on the first interface, the first target device to share with, and the data to be shared is sent to that device.
  • the user operation of data sharing is simplified, the device information is displayed intuitively, and the user experience is improved.
  • When the one or more processors invoke the computer instructions, before performing the receiving of the third operation, the electronic device further performs: according to the data type of the data to be shared, displaying on the first interface a first label in a first display form, where the first label in the first display form is used to prompt the user that the first target device supports outputting the data to be shared.
  • the first display form may be to brighten the display area of the first label (change brightness, color, etc.).
  • In some embodiments, the preview screen includes an image of the second target device and a third label, where the third label is associated with the second target device; when the one or more processors invoke the computer instructions, the electronic device is caused to further execute: receiving a fourth operation, where the fourth operation is an operation on the third label and/or the first icon; in response to the fourth operation, outputting a prompt message, where the prompt message is used to prompt the user that the second target device does not support outputting the data to be shared.
  • The present application further provides a method for sharing photos, applied to a first electronic device. The method includes: displaying a shooting preview interface of the first electronic device, where the shooting preview interface includes a thumbnail of a first photo and a preview image captured by the camera of the first electronic device; identifying the sixth electronic device included in the preview image; determining the relative position of the sixth electronic device and the first electronic device; based on the identified sixth electronic device and the relative position, displaying on the preview screen a label of the sixth electronic device, where the label is used to identify the sixth electronic device; receiving a fifth operation on the thumbnail of the first photo; in response to the fifth operation, moving the thumbnail of the first photo to the display area of the sixth electronic device identified by the label on the preview screen; and sending the first photo to the sixth electronic device.
  • The main interface of the camera application, displayed when the user clicks the icon of the camera application, may be referred to as a "shooting preview interface", and the screen presented in the shooting preview interface may be referred to as a "preview image" or a "preview screen".
  • The shooting preview interface in the embodiments of the present application may be an interface including a preview screen, a shooting shutter button, a local album icon, a camera switching icon, and the like. If the content displayed on the interface changes, for example, when a certain label is displayed, the interface can still be called a shooting preview interface; this will not be repeated in the following.
  • the preview image may be obtained by a front camera or a rear camera of a mobile phone, and the camera for taking pictures is not limited in this embodiment of the present application.
  • for example, if a portrait photo is obtained by the front camera of the mobile phone and the user wants to identify electronic devices through the rear camera, the user can switch by clicking the camera switch button; conversely, if the portrait photo is obtained by the rear camera and the user wants to identify electronic devices through the front camera, the user can likewise switch by clicking the camera switch button, which is not limited in this embodiment of the present application.
  • the mobile phone can determine the electronic devices included in the preview screen in advance, and when the user activates the photo sharing function, the recognized electronic device names can be quickly displayed on the interface, thereby improving the speed at which the mobile phone identifies objects in the preview screen. For example, after the mobile phone recognizes the sixth electronic device included in the current preview screen, the user can drag the thumbnail of the first photo to the sixth electronic device to be shared as needed.
  • the mobile phone can detect and recognize other electronic devices included in the preview screen through image detection, 3D scanning technology, machine vision, and the like; the manner of recognizing other electronic devices is not limited in this embodiment of the present application.
  • the thumbnail of the first photo may be a local album icon.
  • the icon of the local album displays the first photo recently taken by the user.
  • the thumbnail of the first photo may have the same style or display content as the local album icon, and the thumbnail of the first photo is displayed in a floating manner on the shooting preview interface.
  • the method further includes: receiving a sixth operation on the album icon; and in response to the sixth operation, displaying the thumbnail of the first photo in a floating manner on the shooting preview interface.
  • the fifth operation is an operation of dragging the thumbnail of the first photo
  • the sixth operation is an operation of long-pressing the local album icon
  • the photo sharing process provided in the embodiments of the present application may also be triggered through other preset operations, or other preset operations may trigger the mobile phone to identify the electronic devices in the preview screen.
  • the preset operation is not limited to long-pressing the local album icon; it may also be double-clicking the local album icon, drawing a fixed pattern on the shooting preview interface, or the like, which is not limited in this embodiment of the present application.
  • the label of the sixth electronic device is used to identify the name of the sixth electronic device, and/or the label of the sixth electronic device is used to identify the location of the sixth electronic device.
  • the mobile phone after recognizing the sixth electronic device included in the preview screen, the mobile phone can determine the position where the label of the sixth electronic device is displayed according to the display position of the sixth electronic device in the current preview screen. In a possible manner, the mobile phone may display the label of the sixth electronic device in the area where the sixth electronic device is located in the preview screen.
  • the label of the sixth electronic device may be displayed in an area close to where the sixth electronic device is located.
  • the label of the sixth electronic device may be displayed in a blank area in the preview image, and does not block other objects in the preview image.
  • the above-described icon display method can mark the identified electronic device without blocking other objects in the preview screen; it does not affect the user's vision and perception, and improves the user's visual experience.
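  • As a non-limiting illustration of the label placement strategy described above (display the label near the device without blocking other detected objects), the following sketch tries candidate positions around the device's bounding box and picks the first one that does not overlap any other object. The `place_label` helper, the rectangle format, and the candidate order are assumptions made for illustration only, not the patent's implementation.

```python
def place_label(device_box, other_boxes, label_w, label_h, margin=8):
    """device_box / other_boxes: (x, y, w, h) rectangles in preview coordinates.
    Try candidate positions around the device; return the first one that does
    not overlap any other detected object, else fall back to the device area."""
    x, y, w, h = device_box
    candidates = [
        (x + w // 2 - label_w // 2, y - label_h - margin),  # above
        (x + w + margin, y),                                # right
        (x - label_w - margin, y),                          # left
        (x + w // 2 - label_w // 2, y + h + margin),        # below
    ]

    def overlaps(a, b):
        ax, ay, aw, ah = a
        bx, by, bw, bh = b
        return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

    for cx, cy in candidates:
        cand = (cx, cy, label_w, label_h)
        if not any(overlaps(cand, ob) for ob in other_boxes):
            return cx, cy
    # Fall back: overlay the label on the device itself.
    return x + w // 2 - label_w // 2, y + h // 2 - label_h // 2
```

  • In this sketch, a label is first placed above the device; if another object occupies that area, the next free side is used, which mirrors the idea of displaying the label in a blank area of the preview image.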
  • the user can activate the device identification function and positioning function of the mobile phone through a preset operation during the process of taking a photo, and identify other electronic devices included in the preview screen of the camera in combination with the identification function and positioning function of the mobile phone.
  • the user can directly drag the photo to be shared to the area where other electronic devices are located, so as to quickly share the photo to other electronic devices around. This process simplifies the operation process of sharing photos, shortens the time for sharing photos, and improves user experience.
  • the first electronic device includes a first positioning chip
  • the sixth electronic device includes a second positioning chip
  • the sixth electronic device included in the preview screen is identified
  • determining the relative position of the sixth electronic device and the first electronic device includes: based on the first positioning chip, the second positioning chip, and the preview screen, identifying the sixth electronic device included in the preview screen, and determining the relative position of the sixth electronic device and the first electronic device, where the first positioning chip includes at least one of a Bluetooth positioning chip and an ultra-wideband (UWB) positioning chip, and the second positioning chip includes at least one of a Bluetooth positioning chip and a UWB positioning chip.
  • the mobile phone can identify other electronic devices in the preview screen through various possible positioning technologies, and locate the positions of the other electronic devices.
  • the positioning technology in the embodiments of the present application may include one of Bluetooth-based wireless sensing positioning, ultra-wideband (UWB) sensing-based wireless sensing positioning, computer-vision-based positioning, and the like, or a fusion of multiple of the positioning technologies listed above, or other positioning technologies; the embodiments of the present application do not limit the manner in which the mobile phone locates other electronic devices.
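  • As one way to visualize how a measured relative position (for example, a UWB azimuth measurement) could be turned into a display position on the preview screen, the following sketch assumes a simple pinhole-camera model; the function name, the fixed horizontal field of view, and the angle convention are illustrative assumptions, not the patent's method.

```python
import math

def azimuth_to_screen_x(azimuth_deg, screen_width_px, horizontal_fov_deg=70.0):
    """Map a device's azimuth (degrees, 0 = straight ahead, positive = right)
    to a horizontal pixel coordinate on the preview screen.
    Returns None if the device lies outside the camera's field of view."""
    half_fov = horizontal_fov_deg / 2.0
    if abs(azimuth_deg) > half_fov:
        return None  # device is outside the preview frame
    # Pinhole projection: the x offset is proportional to tan(azimuth).
    offset = math.tan(math.radians(azimuth_deg)) / math.tan(math.radians(half_fov))
    return round((offset + 1.0) / 2.0 * screen_width_px)
```

  • A device straight ahead maps to the horizontal center of the screen, while a device at the edge of the assumed field of view maps to the screen border; a distance measurement could analogously scale the label size.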
  • the shooting preview interface further includes a shooting shutter key
  • the method further includes: receiving a seventh operation on the shooting shutter key; and taking the first photo in response to the seventh operation.
  • the newly taken first photo can be directly shared with other devices, or the first photo with the latest date in the local album can be shared with other devices.
  • before displaying the thumbnail of the first photo on the shooting preview interface, the method further includes: receiving an eighth operation; in response to the eighth operation, displaying a photo list on the shooting preview interface, where the photo list includes the first photo and a plurality of second photos, and the date of each second photo is before that of the first photo; receiving a ninth operation; and in response to the ninth operation, selecting at least one second photo from the photo list; and, after moving the thumbnail of the first photo to the display area of the sixth electronic device identified by the label on the preview screen, the method further includes: sending the first photo and the selected at least one second photo to the sixth electronic device.
  • the eighth operation is a sliding operation along the preset direction starting from the local album icon
  • the ninth operation is a click operation
  • the user can activate the device identification function and positioning function of the mobile phone through a preset operation during the process of taking a photo, and identify other electronic devices included in the preview screen of the camera in combination with the identification function and positioning function of the mobile phone.
  • the user can select multiple photos to be shared, and directly drag the multiple photos to be shared to the area where other electronic devices are located, so as to quickly share the photos to other electronic devices around. This process simplifies the operation process of sharing photos, shortens the time for sharing photos, and improves user experience.
  • the photos in the photo list may be arranged according to the sequence taken by the user.
  • the first photo is the latest photo taken by the user
  • the shooting time of the second photo is earlier than the shooting time of the first photo.
  • the photos in the photo list may be arranged in other possible order.
  • for example, photos associated with the current shooting location may be displayed in the photo list, which is not limited in this embodiment of the present application.
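  • The ordering of the photo list described above (newest photo first, older second photos after it, optionally filtered by other criteria such as location) could be sketched as follows; the `Photo` type, the `build_photo_list` helper, and the location filter are hypothetical names used only for illustration.

```python
from dataclasses import dataclass

@dataclass
class Photo:
    name: str
    taken_at: float      # capture timestamp (seconds)
    location: str = ""   # optional shooting location

def build_photo_list(photos, same_location_as=None, limit=10):
    """Order photos newest-first so the first entry is the most recently
    taken photo; optionally keep only photos from a given location."""
    items = photos
    if same_location_as is not None:
        items = [p for p in items if p.location == same_location_as]
    return sorted(items, key=lambda p: p.taken_at, reverse=True)[:limit]
```

  • With this ordering, the first entry of the list is the first photo (the latest one taken), and every second photo in the list has an earlier shooting time.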
  • the first photo in the photo list may be selected by default. If the user does not wish to share the first photo, the user may click the selection box in the lower right corner of the first photo to deselect it. Similarly, if the user wishes to share the first photo and at least one second photo at the same time, the user can click the selection box in the lower right corner of each second photo to select multiple photos to be shared, which will not be repeated here.
  • the process of sharing multiple photos provided in the embodiments of the present application may also be triggered through other preset operations, or other preset operations may trigger the mobile phone to identify the electronic devices in the preview screen.
  • the preset operation is not limited to selecting the local album icon and dragging it upward; it may also be double-clicking the local album icon, drawing a fixed pattern on the shooting preview interface, or the like, which is not limited in this embodiment of the present application.
  • when the thumbnail of the first photo is moved to the display area of the sixth electronic device identified by the label, the display effect of the label changes, where the display effect includes one or more of the color, size, and animation effect of the label of the sixth electronic device.
  • the user can drag the thumbnail of the first photo to the location of the sixth electronic device and release it, and the icon of the sixth electronic device can be displayed in a different color, or show a dynamic effect such as a size change, jumping, or blinking, to remind the user that the currently taken first photo will be shared with the sixth electronic device identified in the preview screen.
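  • The drag interaction described above can be sketched as a simple hit test against the display areas of the device labels: while the thumbnail moves, the label under the finger is highlighted, and on release the photo is shared with the device whose area contains the drop point. The function names, the label-dictionary format, and the device names in the test are illustrative assumptions, not the patent's implementation.

```python
def hit_label(drag_x, drag_y, labels):
    """labels: dict mapping device name -> (x, y, w, h) label display area.
    Return the name of the device label under the drag point, or None."""
    for name, (x, y, w, h) in labels.items():
        if x <= drag_x <= x + w and y <= drag_y <= y + h:
            return name
    return None

def on_drag_move(drag_x, drag_y, labels):
    """While the thumbnail is dragged, mark the label under the finger as
    highlighted (eg, changed color/size) and show all other labels normally."""
    target = hit_label(drag_x, drag_y, labels)
    return {name: ("highlighted" if name == target else "normal")
            for name in labels}

def on_drag_release(drag_x, drag_y, labels):
    """On release, return the device to share the photo with, or None."""
    return hit_label(drag_x, drag_y, labels)
```

  • Releasing outside every label area cancels the share, which matches the idea that the display effect change only occurs when the thumbnail reaches the display area of an identified device.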
  • a reminder control may also be displayed on the preview screen.
  • the reminder control may be an arrow or the like, and the arrow may be displayed statically, in a jumping manner, or in a blinking manner to prompt the user to drag the thumbnail of the first photo to the position indicated by the arrow to realize the photo sharing function.
  • the embodiment of the present application does not limit the display manner of the reminder control.
  • when the sixth electronic device is blocked in the preview screen, or it is detected that the sixth electronic device is located outside the range corresponding to the preview screen, the method further includes: displaying prompt information on the shooting preview interface, where the prompt information is used to prompt the user of the position of the sixth electronic device, or the prompt information is used to prompt the user to adjust the position of the first electronic device so that the sixth electronic device is displayed in the preview screen of the first electronic device.
  • the mobile phone can communicate with other nearby electronic devices, for example, through Bluetooth, a wireless fidelity (WiFi) module, or other possible means, so that the mobile phone can sense the electronic devices that exist nearby.
  • the mobile phone determines, through a wireless positioning technology such as UWB, that there are other electronic devices nearby, and recognizes the type of each electronic device, which can be displayed in the shooting preview interface.
  • the embodiments of the present application do not limit the communication interaction mode and the connection establishment mode between the mobile phone and other nearby electronic devices.
  • reminder information such as text or icons can be displayed on the shooting preview interface.
  • the user can then quickly share the taken photos with the blocked electronic device, which provides a possible way for the user to share photos with a blocked electronic device and simplifies the user's photo sharing steps.
  • the mobile phone may identify through the wireless positioning technology that there is a sixth electronic device nearby, and the sixth electronic device is not displayed in the current preview screen of the mobile phone.
  • the embodiment of the present application may also display reminder information on the shooting preview interface, which is used to remind the user that there is a sixth electronic device in a certain position.
  • an icon reminder may also be included.
  • icons that mark the position of the blocked sixth electronic device, such as a statically displayed arrow, a dynamically blinking arrow, or a jumping arrow, may be included, which is not limited in this embodiment.
  • the user can turn the mobile phone according to the reminder information on the interface, so that the camera of the mobile phone captures the detected sixth electronic device, and the sixth electronic device with which the user will share the photo is displayed in the preview screen, so that the captured photo can be quickly shared with it according to the method described above.
  • the mobile phone can recognize, through the wireless positioning technology, that there are other electronic devices nearby, even if those electronic devices are not displayed in the current preview screen of the mobile phone.
  • the embodiment of the present application may also display reminder information on the shooting preview interface, which is used to remind the user that there are other electronic devices in a certain position.
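  • The out-of-frame reminder described above can be sketched as a check of the wirelessly measured device direction against the camera's field of view: when the device lies outside the preview frame, a hint tells the user which way to turn the phone. The function name, the fixed field of view, the angle convention, and the hint wording are assumptions for illustration only.

```python
def out_of_view_hint(azimuth_deg, horizontal_fov_deg=70.0):
    """azimuth_deg: direction of the detected device measured by wireless
    positioning (0 = camera optical axis, positive = to the right).
    Return None when the device already falls inside the preview frame,
    otherwise a reminder telling the user which way to turn the phone."""
    half_fov = horizontal_fov_deg / 2.0
    if abs(azimuth_deg) <= half_fov:
        return None
    side = "right" if azimuth_deg > 0 else "left"
    return f"Device detected to the {side}; turn the phone {side} to show it in the preview"
```

  • The same check could instead drive an arrow icon (static, blinking, or jumping) pointing toward the detected device, as described above.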
  • the user can activate the device identification function and the positioning function of the electronic device through a preset operation while taking a photo or running the camera application; based on these functions, other electronic devices included in the preview screen of the camera are identified, and the user can select one or more photos to be shared through a shortcut operation and directly drag the one or more photos to be shared to the area where other electronic devices are located, so as to quickly share the one or more photos with other nearby electronic devices.
  • the embodiments of the present application provide users with a user-friendly interactive interface for various scenarios, such as when other electronic devices are blocked in the preview screen, so that users can share one or more photos through quick operations; this simplifies the photo sharing operation process, shortens the time for sharing photos, and improves the user experience.
  • a first electronic device, comprising: a processor and a memory; the memory stores one or more instructions that, when executed by the processor, cause the first electronic device to perform the following steps: displaying a shooting preview interface of the first electronic device, the shooting preview interface including a thumbnail of a first photo and a preview image captured by the camera of the first electronic device; identifying a sixth electronic device included in the preview image; determining the relative position of the sixth electronic device and the first electronic device; based on the identified sixth electronic device and the relative position, displaying, on the preview screen, a label of the sixth electronic device, the label being used to identify the sixth electronic device; receiving a fifth operation on the thumbnail of the first photo; in response to the fifth operation, moving the thumbnail of the first photo to the display area of the sixth electronic device identified by the label on the preview screen; and sending the first photo to the sixth electronic device.
  • the first electronic device includes a first positioning chip
  • the sixth electronic device includes a second positioning chip
  • the first positioning chip includes at least one of a Bluetooth positioning chip and an ultra-wideband UWB positioning chip
  • the second positioning chip includes at least one of a Bluetooth positioning chip and an ultra-wideband UWB positioning chip.
  • the shooting preview interface includes an album icon, and when the one or more instructions are executed by the processor, the first electronic device is caused to perform the following steps : receive the sixth operation of the album icon; in response to the sixth operation, display the thumbnail of the first photo in a floating manner on the shooting preview interface.
  • the fifth operation is an operation of dragging the thumbnail of the first photo
  • the sixth operation is an operation of long pressing the local album icon
  • the shooting preview interface further includes a shooting shutter button, and when the one or more instructions are executed by the processor, the first electronic device is caused to perform the following steps: receiving a seventh operation on the shooting shutter key; and taking the first photo in response to the seventh operation.
  • when the one or more instructions are executed by the processor, the first electronic device is caused to perform the following steps: receiving an eighth operation; in response to the eighth operation, displaying a photo list on the shooting preview interface, where the photo list includes the first photo and a plurality of second photos, and the date of each second photo is before that of the first photo; receiving a ninth operation; in response to the ninth operation, selecting at least one second photo from the photo list; and, after moving the thumbnail of the first photo to the display area of the sixth electronic device identified by the label on the preview screen, sending the first photo and the selected at least one second photo to the sixth electronic device.
  • the eighth operation is a sliding operation along the preset direction starting from the local album icon
  • the ninth operation is a click operation
  • the label of the sixth electronic device is used to identify the name of the sixth electronic device, and/or the label of the sixth electronic device is used to identify The location where the sixth electronic device is located.
  • when the thumbnail of the first photo is moved to the display area of the sixth electronic device identified by the label, the display effect of the label changes, where the display effect includes one or more of the color, size, and animation effect of the label of the sixth electronic device.
  • when the sixth electronic device is blocked in the preview screen, and the one or more instructions are executed by the processor, the first electronic device is further caused to perform the following steps: displaying prompt information on the shooting preview interface, where the prompt information is used to prompt the user of the position of the sixth electronic device, or the prompt information is used to prompt the user to adjust the position of the first electronic device so that the sixth electronic device is displayed in the preview screen of the first electronic device.
  • the embodiments of the present application provide a computer storage medium, including computer instructions that, when executed on an electronic device, cause the electronic device to execute the method in any of the possible implementations of any of the foregoing aspects.
  • an embodiment of the present application provides a computer program product, which, when the computer program product runs on a computer, enables the computer to execute the method in any of the possible implementations of any one of the foregoing aspects.
  • FIG. 1 is a schematic diagram of a system architecture provided by an embodiment of the present application.
  • FIG. 2 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 3 is a schematic structural diagram of another electronic device provided by an embodiment of the present application.
  • FIG. 4 is a schematic diagram of a scenario of a device identification method provided by an embodiment of the present application.
  • FIGS. 5A-5H are schematic diagrams of a set of interfaces provided by an embodiment of the present application.
  • FIGS. 6A-6B are schematic diagrams of another set of interfaces provided by an embodiment of the present application.
  • FIGS. 9A-9E are schematic diagrams of another set of interfaces provided by an embodiment of the present application.
  • FIGS. 10A-10C are schematic diagrams of another set of interfaces provided by an embodiment of the present application.
  • FIGS. 11A-11D are schematic diagrams of another set of interfaces provided by an embodiment of the present application.
  • FIG. 12F is a schematic diagram of another interface provided by an embodiment of the present application.
  • FIG. 13 is a schematic diagram of a graphical user interface of an example photo sharing process.
  • FIG. 14 is a schematic diagram of a graphical user interface of an example of a photo sharing process provided by an embodiment of the present application.
  • FIG. 15 is a schematic diagram of a graphical user interface of another example of a photo sharing process provided by an embodiment of the present application.
  • FIG. 16 is a schematic diagram of a graphical user interface for receiving a photo provided by an embodiment of the present application.
  • FIG. 17 is a schematic diagram of a graphical user interface of an example of a photo sharing process provided by an embodiment of the present application.
  • FIG. 18 is a schematic diagram of another example of a graphical user interface for receiving a photo provided by an embodiment of the present application.
  • FIG. 19 is a schematic flowchart of an example of a method for sharing a photo provided by an embodiment of the present application.
  • FIG. 21 is a schematic diagram of the principle of a positioning method provided by an embodiment of the present application.
  • FIG. 22 is a schematic flowchart of a device identification method provided by an embodiment of the present application.
  • FIG. 23 is a schematic diagram of a software architecture provided by an embodiment of the present application.
  • the terms "first" and "second" are used for descriptive purposes only, and should not be construed as indicating or implying relative importance or implying the number of indicated technical features. Therefore, a feature defined as "first" or "second" may explicitly or implicitly include one or more of the features. In the description of the embodiments of the present application, unless otherwise specified, "multiple" means two or more.
  • augmented reality (AR) is a technology that superimposes virtual information and the real environment on the same screen or in the same space with the help of computer graphics and visualization technology; it integrates three-dimensional display technology, interaction technology, sensor technology, computer vision technology, and multimedia technology.
  • the electronic device 100 enters the first interface, activates the camera, and displays the image collected by the camera on the first interface in real time; at the same time, the electronic device 100 sends a detection request using wireless positioning technology, and determines, according to the received data, the nearby devices of the electronic device 100, as well as the device names and device types of the nearby devices and their physical distances and angles relative to the electronic device 100.
  • the electronic device 100 performs image recognition on the image collected by the camera, and recognizes the electronic device in the image and the device type of the electronic device (eg, a speaker, a computer, a tablet computer, etc.).
  • the electronic device 100 determines the display area of the image of the nearby device in the first interface according to the physical distance and angle of the nearby device from the electronic device 100.
  • the electronic device displays a device icon on the first interface in real time by means of augmented reality, and the device icon can be used for the electronic device 100 to interact with nearby devices.
  • the electronic device 100 detects a user operation on the device icon and, in response to the user operation, outputs the control interface of the nearby device corresponding to the device icon.
  • the method realizes the interaction between the electronic device and the nearby device, and presents the corresponding relationship between the device icon and the device in real time through the display mode of augmented reality, thereby improving the user experience.
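  • The steps above combine two sources of information: devices recognized in the camera image and devices detected by wireless positioning. One way such information could be associated, so that each device icon is displayed over the correct image region, is a greedy match between projected wireless positions and detected bounding boxes; the following sketch assumes a pinhole projection and a fixed field of view, and the `match_devices` helper is a hypothetical name used only for illustration.

```python
import math

def match_devices(detected_boxes, positioned_devices, screen_width_px, fov_deg=70.0):
    """detected_boxes: (x, y, w, h) boxes found by image recognition.
    positioned_devices: (name, azimuth_deg) pairs from wireless positioning.
    Greedily pair each positioned device with the nearest detected box by
    projected horizontal position; devices with no plausible box are
    reported separately (out of frame, or in frame but blocked)."""
    half = fov_deg / 2.0
    pairs, unmatched = {}, []
    free = list(range(len(detected_boxes)))
    for name, az in positioned_devices:
        if abs(az) > half:
            unmatched.append(name)   # outside the preview frame
            continue
        # Pinhole projection of the azimuth onto the screen's x axis.
        px = (math.tan(math.radians(az)) / math.tan(math.radians(half)) + 1) / 2 * screen_width_px
        if not free:
            unmatched.append(name)   # in frame but no box left: likely blocked
            continue
        best = min(free, key=lambda i: abs(detected_boxes[i][0] + detected_boxes[i][2] / 2 - px))
        pairs[name] = detected_boxes[best]
        free.remove(best)
    return pairs, unmatched
```

  • The unmatched devices correspond to the scenarios described later, where a detected device is blocked or lies outside the range of the preview screen and a reminder is displayed instead of a label.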
  • a device icon may also be referred to as a device label.
  • FIG. 1 exemplarily shows a schematic diagram of a communication system 10 provided in an embodiment of the present application.
  • the communication system 10 includes an electronic device 100, an electronic device 201, an electronic device 202, an electronic device 203, an electronic device 204, and the like.
  • the electronic device 100 may assist the user to select and control various electronic devices (eg, speakers, televisions, refrigerators, air conditioners, etc.).
  • the electronic device 100 may also be referred to as the first electronic device
  • the electronic device 201 (or the electronic device 202, the electronic device 203, the electronic device 204, etc.) may also be referred to as the second electronic device; wherein,
  • the electronic device (eg, the electronic device 100, 201, 202, 203, or 204) has an ultra-wideband (UWB) communication module, and may also have one or more of a Bluetooth communication module, a WLAN communication module, and a GPS communication module.
  • through one or more of UWB, Bluetooth, WLAN, and GPS, the electronic device 100 can detect and scan electronic devices (eg, the electronic device 201, 202, 203, or 204) near the electronic device 100, so that the electronic device 100 can discover nearby electronic devices, establish wireless communication connections with them, and transmit data to them.
  • the electronic device in the embodiments of the present application may be a portable device such as a mobile phone, a wearable device (eg, a smart band), a tablet computer, a laptop computer, a handheld computer, a computer, an ultra-mobile personal computer (UMPC), a cellular phone, a personal digital assistant (PDA), or an augmented reality (AR)/virtual reality (VR) device; it may also be a speaker, a television, a refrigerator, an air conditioner, a vehicle-mounted device, a printer, a projector, or another device. Exemplary embodiments of the electronic device include, but are not limited to, electronic devices equipped with an operating system.
  • the electronic device 100 , the electronic device 201 , the electronic device 202 , the electronic device 203 and the electronic device 204 may communicate directly.
  • the electronic device 100, the electronic device 201, the electronic device 202, the electronic device 203, and the electronic device 204 may be connected to a local area network (LAN) through a wired connection or a wireless fidelity (WiFi) connection.
  • the electronic device 100, the electronic device 201, the electronic device 202, the electronic device 203, and the electronic device 204 are all connected to the same electronic device 301, and may communicate indirectly through the electronic device 301.
  • the electronic device 301 may be one of the electronic device 100 , the electronic device 201 , the electronic device 202 , the electronic device 203 and the electronic device 204 , and may also be an additional third-party device, such as a router, a cloud server, a gateway, and the like.
  • the cloud server may be a hardware server, and may also be embedded in a virtualized environment.
  • the cloud server may be a virtual machine executed on a hardware server that may include one or more other virtual machines.
  • the electronic device 301 can send data to the electronic device 100, the electronic device 201, the electronic device 202, the electronic device 203, and the electronic device 204 through the network, and can also receive data sent by the electronic device 100, the electronic device 201, the electronic device 202, the electronic device 203, and the electronic device 204.
  • Electronic device 301 may include memory, a processor, and a transceiver.
  • the memory can be used to store programs related to UWB positioning; the memory can also be used to store the orientation parameters of an electronic device (for example, the electronic device 201) obtained through the UWB positioning technology; the memory can also be used to store the messages, data, and/or configurations related to the electronic device 100 and nearby devices that are exchanged via the electronic device 301.
  • the processor may be configured to determine the responding target device according to the orientation parameters of the multiple nearby devices when acquiring the orientation parameters of the multiple nearby devices in the local area network.
  • the transceiver can be used to communicate with electronic devices connected to the local area network. It should be noted that, in this embodiment of the present application, multiple nearby devices may be connected to the same local area network, or may not be connected to the same local area network, which is not specifically limited here.
  • the structure shown in this embodiment does not constitute a specific limitation on the communication system 10 .
  • the communication system 10 may include more or fewer devices than shown.
  • FIG. 2 shows a schematic structural diagram of an exemplary electronic device 100 provided by an embodiment of the present application.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2 , mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone jack 170D, sensor module 180, buttons 190, motor 191, indicator 192, camera 193, display screen 194, and Subscriber identification module (subscriber identification module, SIM) card interface 195 and so on.
• The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and so on.
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the electronic device 100 .
• The electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or use a different arrangement of components.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
• The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), among others. Different processing units may be independent devices, or may be integrated into one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
• The memory in the processor 110 is a cache. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving system efficiency.
  • the processor 110 may include one or more interfaces.
• The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may contain multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flash, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate with each other through the I2C bus interface, so as to realize the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • the processor 110 may contain multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
• The PCM interface can also be used for audio communication, to sample, quantize, and encode analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is typically used to connect the processor 110 with the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 110 communicates with the camera 193 through a CSI interface, so as to realize the photographing function of the electronic device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to implement the display function of the electronic device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • the GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiments of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 charges the battery 142 , it can also supply power to the electronic device through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140 and supplies power to the processor 110 , the internal memory 121 , the external memory, the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110, and may be provided in the same device as the mobile communication module 150 or other functional modules.
• The wireless communication module 160 can provide wireless communication solutions applied to the electronic device 100, including UWB, wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), the global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technology.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2 .
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
• The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
• The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • UWB wireless communication is a wireless personal area network communication technology with low power consumption and high-speed transmission.
  • UWB uses pulsed signals to transmit data.
• UWB utilizes nanosecond (ns) to picosecond (ps) level non-sinusoidal narrow pulse signals to transmit data, and time-modulation technology greatly increases its transmission rate. Because very short pulses are used, a UWB device's transmit power during high-speed communication is only a few percent of that of a conventional continuous-carrier system, so its power consumption is relatively low.
• Compared with traditional narrowband systems, a UWB system has the advantages of strong penetration, low power consumption, good resistance to multipath effects, high security, and low system complexity, and can provide precise positioning accuracy.
  • UWB can be applied to wireless communication applications that require high-quality services, and can be used in wireless personal area networks (WPANs), home network connections, and short-range radars.
• UWB is expected to become a technical means of resolving the conflict between the demand for high-speed wireless access in enterprises, homes, public places, and the like, and the increasingly crowded allocation of frequency resources.
  • the electronic device 100 can measure distance and received signal strength (RSSI) through a UWB antenna.
  • the electronic device 100 can implement angle of arrival (Angle of arrival, AOA) measurement through at least two UWB antennas.
  • the UWB communication module of the electronic device 100 may be in a power-on state.
  • the electronic device 100 may implement distance and AOA measurements via Bluetooth.
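The two measurements named above can be sketched with their standard formulas: range from received signal strength via a log-distance path-loss model, and angle of arrival (AOA) from the carrier phase difference between two antennas. This is a minimal illustration; the function names, the calibration value at 1 metre, and the path-loss exponent are assumptions, not values from the application.

```python
import math

def distance_from_rssi(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.0):
    """Log-distance path-loss model: rough range (metres) from RSSI.
    tx_power_dbm is the RSSI expected at 1 m (assumed calibration value)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def aoa_from_phase(phase_diff_rad, wavelength_m, antenna_spacing_m):
    """Estimate AOA (degrees from broadside) from the phase difference
    measured between two antennas spaced at most half a wavelength apart."""
    # Path-length difference implied by the measured phase difference.
    path_diff = phase_diff_rad * wavelength_m / (2 * math.pi)
    # sin(theta) = path difference / antenna spacing; clamp against noise.
    sin_theta = max(-1.0, min(1.0, path_diff / antenna_spacing_m))
    return math.degrees(math.asin(sin_theta))

# Example figures: a UWB carrier around 6.5 GHz, half-wavelength spacing.
wavelength = 3e8 / 6.5e9   # ~0.046 m
spacing = wavelength / 2
```

A zero phase difference corresponds to a signal arriving from broadside (0 degrees), while a phase difference of a full half-cycle corresponds to an endfire arrival (90 degrees).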
  • the electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
• The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on.
  • the electronic device 100 may include one or N display screens 194 , where N is a positive integer greater than one.
  • the display screen 194 displays the interface content currently output by the system.
  • the electronic device 100 cooperates with modules such as the GPU, the display screen 194 and the application processor, and then displays images, application interfaces, buttons, icons, windows, etc. on the display screen of the electronic device 100 to realize the display function of the electronic device.
  • the interface content is an interface provided by an instant messaging application.
  • the electronic device 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193 .
• When shooting, the shutter is opened, light is transmitted to the camera's photosensitive element through the lens, the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object is projected through the lens to generate an optical image onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • a digital signal processor is used to process digital signals, in addition to processing digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy and so on.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
• The electronic device 100 can play or record videos in multiple encoding formats, such as Moving Picture Experts Group (MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100 .
• The external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function, for example, to save files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing the instructions stored in the internal memory 121 .
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area may store data (such as audio data, phone book, etc.) created during the use of the electronic device 100 and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
• The electronic device 100 may implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, the application processor, and the like.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
• The speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
• The receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals.
  • the voice can be answered by placing the receiver 170B close to the human ear.
• The microphone 170C, also called a "mic" or "sound transmitter", is used to convert sound signals into electrical signals.
• The user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the earphone jack 170D is used to connect wired earphones.
• The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
• The pressure sensor 180A can be used to capture the pressure value generated when the user's finger touches the display screen and transmit that pressure value to the processor, so that the processor can identify through which finger part the user input the user operation.
• The capacitive pressure sensor may comprise at least two parallel plates of conductive material.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A. In some embodiments, acting on different touch positions may correspond to different operation instructions.
• The pressure sensor 180A may also calculate the number of touch points from the detected signal and transmit the calculated value to the processor, so that the processor recognizes whether the user input the user operation with a single finger or with multiple fingers.
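The idea that different touch intensities map to different operation instructions can be sketched as a simple threshold classifier. The function name and threshold values below are illustrative assumptions, not values specified by the application.

```python
def classify_press(pressure, light_threshold=0.2, deep_threshold=0.6):
    """Map a normalized pressure reading (0.0-1.0) to an operation type,
    mirroring the text's point that touches of different intensity at the
    same position can correspond to different instructions.
    Thresholds are illustrative only."""
    if pressure < light_threshold:
        return "tap"            # light contact: ordinary tap
    if pressure < deep_threshold:
        return "light_press"    # e.g. preview a message
    return "deep_press"         # e.g. open the message / extra menu
```

For example, a reading of 0.4 would be treated as a light press, while 0.9 would trigger the deep-press instruction.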
  • the gyro sensor 180B may be used to determine the motion attitude of the electronic device 100 .
  • the angular velocity of the electronic device 100 about three axes may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization. Exemplarily, when the shutter is pressed, the gyro sensor 180B detects the shaking angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to offset the shaking of the electronic device 100 through reverse motion to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenarios.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist in positioning and navigation.
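The altitude calculation mentioned above is commonly done with the international barometric formula; a minimal sketch follows (the function name and the standard sea-level pressure default are assumptions, not details from the application).

```python
def altitude_from_pressure(pressure_hpa, sea_level_hpa=1013.25):
    """International barometric formula: approximate altitude in metres
    from a measured air pressure in hPa, relative to sea-level pressure."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))
```

A reading of 1013.25 hPa maps to 0 m, and lower pressures map to positive altitudes, which is how a barometer reading can assist positioning and navigation.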
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 can detect the opening and closing of the flip holster using the magnetic sensor 180D.
• The electronic device 100 can detect the opening and closing of a flip cover according to the magnetic sensor 180D, and then set features such as automatic unlocking upon opening according to the detected open or closed state of the holster or flip cover.
• The acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally along three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. It can also be used to identify the posture of the electronic device, and can be applied in scenarios such as landscape/portrait switching and pedometers. In some optional embodiments of the present application, the acceleration sensor 180E may be used to capture the acceleration value generated when the user's finger touches the display screen (or the user's finger taps the rear side frame of the rear shell of the electronic device 100) and transmit that acceleration value to the processor, so that the processor can identify through which finger part the user input the operation.
• The electronic device 100 may determine its attitude change by using the gyro sensor and/or the acceleration sensor, and then recognize the user operation. For example, according to the attitude change of the electronic device 100, it is recognized that the current user operation is a lift operation. Exemplarily, the electronic device 100 initially lies flat (the display screen 194 is parallel to the horizontal direction, and the lift angle, that is, the included angle with the horizontal direction, is 0 degrees); within a preset time, the user lifts the electronic device 100 until it is vertical to the horizontal direction (at this time, the display screen 194 of the electronic device is perpendicular to the horizontal direction, and the lift angle, the included angle with the horizontal direction, is 90 degrees), so the lift change angle within the preset time is 90 degrees (90 degrees minus 0 degrees). When the electronic device 100 detects that the lift change angle within the preset time exceeds a preset angle, the electronic device 100 may consider that the current user operation is a lift operation.
  • the preset angle may be, for example, 30 degrees.
• If the electronic device 100 detects that the lift change angle within the preset time exceeds the preset angle, and the lift angle at a certain moment within the preset time falls within a preset angle range, the electronic device 100 considers that the current user operation is a lift operation.
  • the preset angle range may be 60 degrees to 90 degrees.
  • the electronic device 100 may determine the attitude change of the electronic device 100 through a gyro sensor and/or an acceleration sensor, thereby identifying a stationary state.
  • the static state may be that the angle change detected by the gyro sensor of the electronic device 100 within the preset time is within the preset range, and the speed change detected by the acceleration sensor within the preset time is smaller than the threshold.
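The lift-operation and stationary-state decisions described above can be sketched as two predicates. The 30-degree change threshold and the 60-90 degree range are the examples given in the text; the stationary-state limits and all function names are illustrative assumptions.

```python
def is_lift_operation(start_angle_deg, end_angle_deg,
                      preset_change_deg=30.0,
                      preset_range=(60.0, 90.0)):
    """Lift detection: within the preset time, the tilt relative to the
    horizontal must change by more than a preset angle, and the final tilt
    must fall inside a preset range (30 deg and 60-90 deg per the text)."""
    change = abs(end_angle_deg - start_angle_deg)
    in_range = preset_range[0] <= end_angle_deg <= preset_range[1]
    return change > preset_change_deg and in_range

def is_stationary(angle_changes_deg, speed_changes,
                  angle_limit=2.0, speed_limit=0.05):
    """Stationary state: every gyro angle change stays within a preset
    range and every accelerometer speed change stays below a threshold
    (the limits here are illustrative)."""
    return (all(abs(a) <= angle_limit for a in angle_changes_deg)
            and all(abs(v) < speed_limit for v in speed_changes))
```

Lifting the device from flat (0 degrees) to vertical (90 degrees) satisfies both conditions of the lift predicate, whereas a 20-degree tilt does not.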
• The distance sensor 180F is used to measure distance. The electronic device 100 can measure distance via infrared or laser. In some embodiments, such as when shooting a scene, the electronic device 100 can use the distance sensor 180F to measure distance to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the electronic device 100 emits infrared light to the outside through the light emitting diode.
  • Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100 . When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100 .
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the display screen to save power.
• The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
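The "sufficient reflected light" decision described above amounts to a simple threshold on the photodiode reading; a minimal sketch follows (the function name and threshold value are assumptions).

```python
def object_nearby(reflected_light, sufficient_threshold=50):
    """Proximity decision from the photodiode reading: a reading at or
    above the threshold means enough infrared reflection was detected,
    so an object is considered near the device (threshold illustrative)."""
    return reflected_light >= sufficient_threshold

def should_screen_off(reflected_light, in_call):
    """Mirror the ear-detection use case: during a call, turn the display
    off when an object (the user's ear) is detected nearby."""
    return in_call and object_nearby(reflected_light)
```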
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket, so as to prevent accidental touch.
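The adaptive brightness adjustment mentioned above can be sketched as a clamped linear mapping from ambient illuminance to a backlight level. The linear mapping, the lux full-scale value, and the 10-255 level range are assumptions for illustration, not details from the application.

```python
def display_brightness(ambient_lux, min_level=10, max_level=255,
                       full_scale_lux=1000.0):
    """Adaptive backlight: map ambient illuminance linearly onto the
    panel's brightness range, clamping at both ends."""
    frac = min(max(ambient_lux / full_scale_lux, 0.0), 1.0)
    return round(min_level + frac * (max_level - min_level))
```

In a dark room the backlight drops to the minimum level, and in bright sunlight it saturates at the maximum.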
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking pictures with fingerprints, answering incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect the temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the electronic device 100 reduces the performance of the processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection.
  • When the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid the abnormal shutdown of the electronic device 100 caused by the low temperature.
  • When the temperature is lower than still another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by the low temperature.
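The tiered temperature-processing strategy described above can be sketched as follows. The specific threshold values and action names are illustrative assumptions, not values from the application:

```python
# Sketch of the tiered thermal strategy: a high threshold triggers processor
# throttling; a low threshold triggers battery heating; a still lower
# threshold triggers boosting the battery output voltage.
HIGH_TEMP_THRESHOLD = 45.0       # hypothetical, degrees Celsius
LOW_TEMP_THRESHOLD = 0.0         # hypothetical
VERY_LOW_TEMP_THRESHOLD = -10.0  # hypothetical

def thermal_actions(temperature_c):
    """Return the list of protective actions for a reported temperature."""
    actions = []
    if temperature_c > HIGH_TEMP_THRESHOLD:
        actions.append("reduce_processor_performance")
    if temperature_c < LOW_TEMP_THRESHOLD:
        actions.append("heat_battery")
    if temperature_c < VERY_LOW_TEMP_THRESHOLD:
        actions.append("boost_battery_output_voltage")
    return actions
```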
  • The touch sensor 180K is also called a "touch panel".
  • the touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called “touch screen”.
  • The touch sensor 180K is used to detect a touch operation acting on or near it, and the touch operation refers to an operation of a user's hand, elbow, stylus, etc. touching the display screen 194.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the location where the display screen 194 is located.
  • the bone conduction sensor 180M can acquire vibration signals.
  • The bone conduction sensor 180M can acquire the vibration signal of the vibrating bone of the human vocal part.
  • The bone conduction sensor 180M can also be in contact with the human pulse and receive blood pressure beat signals.
  • The bone conduction sensor 180M can also be disposed in an earphone and combined into a bone conduction earphone.
  • the audio module 170 can analyze the voice signal based on the vibration signal of the vocal vibration bone block obtained by the bone conduction sensor 180M, so as to realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 180M, and realize the function of heart rate detection.
  • The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys.
  • the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
  • Motor 191 can generate vibrating cues.
  • the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the motor 191 can also correspond to different vibration feedback effects for touch operations on different areas of the display screen 194 .
  • Touch operations in different application scenarios (for example, time reminders, receiving messages, alarm clocks, and games) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • The indicator 192 can be an indicator light, which can be used to indicate the charging state and changes in battery level, and can also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • The SIM card can be brought into contact with or separated from the electronic device 100 by inserting it into or pulling it out of the SIM card interface 195.
  • FIG. 3 exemplarily shows a schematic structural diagram of an electronic device 202 provided by an embodiment of the present application.
  • the electronic device 201 , the electronic device 203 , and the electronic device 204 can all refer to the schematic structural diagram shown in FIG. 3 .
  • The electronic device 202 may include: a processor 401, a memory 402, a wireless communication processing module 403, an antenna 404, a power switch 405, a wired LAN communication processing module 406, a USB communication processing module 407, an audio module 408, and a display screen 409. Specifically:
  • the processor 401 may be used to read and execute computer readable instructions.
  • the processor 401 may mainly include a controller, an arithmetic unit, and a register.
  • the controller is mainly responsible for instruction decoding, and sends out control signals for the operations corresponding to the instructions.
  • The arithmetic unit is mainly responsible for performing arithmetic and logic operations, and the register is mainly responsible for saving the register operands and intermediate operation results temporarily stored during instruction execution.
  • the hardware architecture of the processor 401 may be an application specific integrated circuit (ASIC) architecture, a MIPS architecture, an ARM architecture, an NP architecture, or the like.
  • The processor 401 may be configured to parse signals received by the wireless communication module 403 and/or the wired LAN communication processing module 406, such as probe requests broadcast by the electronic device 100. The processor 401 may be configured to perform corresponding processing operations according to the parsing result, such as generating a probe response.
  • the processor 401 may also be configured to generate a signal sent out by the wireless communication module 403 and/or the wired LAN communication processing module 406, such as a Bluetooth broadcast signal.
  • Memory 402 is coupled to processor 401 for storing various software programs and/or sets of instructions.
  • memory 402 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices.
  • the memory 402 can store an operating system, such as an embedded operating system such as uCOS, VxWorks, RTLinux, and the like.
  • Memory 402 may also store communication programs that may be used to communicate with electronic device 100, one or more servers, or accessory devices.
  • The wireless communication module 403 may include one or more of a UWB communication module 403A, a Bluetooth communication module 403B, a WLAN communication module 403C, and a GPS communication module 403D.
  • the UWB communication module 403A can be integrated into a chip (System on Chip, SOC), and the UWB communication module 403A can also be integrated with other communication modules (eg, Bluetooth communication module 403B) in hardware (or software).
  • One or more of the UWB communication module 403A, the Bluetooth communication module 403B, the WLAN communication module 403C, and the GPS communication module 403D may listen to signals transmitted by other devices (eg, the electronic device 100), such as measurement signals and scan signals, and may send response signals, such as measurement responses and scan responses, so that the other devices (eg, the electronic device 100) can discover the electronic device 202 and establish a wireless communication connection with it through one or more short-range wireless communication technologies among UWB, Bluetooth, WLAN, or infrared for data transmission.
  • One or more of the UWB communication module 403A, the Bluetooth communication module 403B, the WLAN communication module 403C, and the GPS communication module 403D may also transmit signals, such as broadcasting UWB measurement signals, so that other devices (such as the electronic device 100) can discover the electronic device 202 and establish a wireless communication connection with it through one or more short-range wireless communication technologies among UWB, Bluetooth, WLAN, or infrared for data transmission.
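The listen-and-respond exchange described above can be sketched minimally as follows. The dict-based message format, the field names, and the set of supported technologies are illustrative assumptions for this sketch, not the actual signaling of this application:

```python
# Minimal sketch of the discovery exchange: electronic device 100 broadcasts
# a probe/measurement signal, and electronic device 202 answers with a
# response carrying its identity information.
def make_probe(sender, technologies):
    """Build a probe message listing the positioning technologies in use."""
    return {"type": "probe", "sender": sender, "technologies": technologies}

def handle_probe(probe, device_name, device_type):
    """Respond only to probes that use a technology this device supports;
    otherwise return None (no response is sent)."""
    supported = {"UWB", "Bluetooth", "WLAN"}
    if probe["type"] == "probe" and supported.intersection(probe["technologies"]):
        return {"type": "probe_response",
                "device_name": device_name,
                "device_type": device_type}
    return None
```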
  • the wireless communication module 403 may also include a cellular mobile communication module (not shown).
  • the cellular mobile communication processing module can communicate with other devices (such as servers) through cellular mobile communication technology.
  • Antenna 404 may be used to transmit and receive electromagnetic wave signals.
  • the antennas of different communication modules can be multiplexed or independent of each other to improve the utilization rate of the antennas.
  • For example, the antenna of the Bluetooth communication module 403B can be multiplexed as the antenna of the WLAN communication module 403C.
  • The UWB communication module 403A may use a separate UWB antenna.
  • In order to realize UWB communication, the electronic device 202 has at least one UWB antenna.
  • the power switch 405 may be used to control the supply of power to the electronic device 202 from a power source.
  • the wired LAN communication processing module 406 can be used to communicate with other devices in the same LAN through the wired LAN, and can also be used to connect to the WAN through the wired LAN, and can communicate with the devices in the WAN.
  • the USB communication processing module 407 may be used to communicate with other devices through a USB interface (not shown).
  • the audio module 408 can be used to output audio signals through the audio output interface, which enables the electronic device 202 to support audio playback.
  • the audio module can also be used to receive audio data through the audio input interface.
  • the electronic device 202 may be a media playing device such as a television.
  • Display screen 409 may be used to display images, video, and the like.
  • The display screen 409 can be a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, an active-matrix organic light-emitting diode (AMOLED) display, a flexible light-emitting diode (FLED) display, a quantum dot light-emitting diode (QLED) display, and so on.
  • the electronic device 202 may also include a serial interface such as an RS-232 interface.
  • the serial interface can be connected to other devices, such as audio amplifiers such as speakers, so that the display and audio amplifiers can cooperate to play audio and video.
  • the structure shown in FIG. 3 does not constitute a specific limitation on the electronic device 202 .
  • the electronic device 202 may include more or less components than shown, or combine some components, or separate some components, or arrange different components.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the present application provides a device identification method based on augmented reality. After the electronic device 100 detects a first operation, it enters a first interface, the electronic device 100 activates a camera, and displays a preview image collected by the camera on the first interface in real time.
  • The electronic device 100 identifies the type of the second electronic device (such as a speaker, a computer, a tablet computer, etc.) in the preview image of the first interface through computer vision technology; at the same time, the electronic device 100 determines, through wireless positioning technology (such as UWB positioning, Bluetooth positioning, WiFi positioning, or GPS positioning), the orientation information (such as longitude and latitude information, or the physical distance and angle from the electronic device 100) and identity information (such as the device name, device type, and device attributes) of the second electronic device within the communication range of the electronic device 100.
  • the electronic device 100 determines the position of the second electronic device in the preview image according to the relative distance and relative angle between the second electronic device and itself, and the shooting angle range of the camera.
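The mapping from the relative angle determined by wireless positioning to a horizontal position in the preview image can be sketched as follows. This is a minimal pinhole-projection sketch; the function name and the field-of-view and image-width parameters are assumptions for illustration, not the method actually claimed:

```python
import math

def preview_x(relative_angle_deg, fov_deg, width_px):
    """Map a device's horizontal angle relative to the camera axis
    (clockwise positive, 0 = straight ahead) to an x pixel coordinate
    in the preview image. Returns None when the device lies outside
    the camera's horizontal field of view."""
    half_fov = fov_deg / 2.0
    if abs(relative_angle_deg) > half_fov:
        return None
    # Pinhole-style projection: the offset from the image center grows with
    # tan(angle), scaled so the edge of the field of view maps to the image edge.
    offset = math.tan(math.radians(relative_angle_deg)) / math.tan(math.radians(half_fov))
    return (width_px / 2.0) * (1.0 + offset)
```

For a 60° field of view and a 1080-pixel-wide preview, a device straight ahead (0°) maps to the center (x = 540), and a device at the 30° edge maps to x = 1080.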
  • FIG. 4 includes the electronic device 100 and its nearby devices.
  • the nearby devices include electronic device 201 , electronic device 202 , electronic device 203 , and electronic device 204 .
  • The second electronic device may be any one of the nearby devices. FIG. 4 exemplarily shows the positional relationship, on the horizontal plane, between the electronic device 100 and the electronic device 201, the electronic device 202, the electronic device 203, and the electronic device 204 in some application scenarios of the present application.
  • A reference point (eg, the center position point) of the electronic device 100 can be used to represent its position in the horizontal plane.
  • The direction pointed to by a vector that starts at the center position point of the electronic device 100 and is perpendicular to the upper edge of the touch screen may be used as the reference direction of the electronic device 100, which may also be referred to as the 0-degree direction of the electronic device 100.
  • the electronic device 201 can be located at 1 m in the 0-degree direction of the electronic device 100
  • the electronic device 202 can be located at 1.2 m in the clockwise 330-degree direction of the electronic device 100
  • The electronic device 203 can be located at 0.5 m in the clockwise 330-degree direction of the electronic device 100.
  • The electronic device 204 can be located at 0.8 m in the clockwise 30-degree direction of the electronic device 100.
  • Generally, the left-right angle of a camera's shooting range is in the range of 60° to 80°, and the up-down angle is around 45°, which varies with the mobile phone brand and camera configuration. If the left-right shooting angle of the electronic device 100 is 60°, it can be seen that the electronic device 201, the electronic device 202, the electronic device 203, and the electronic device 204 are all within the shooting range of the electronic device 100. According to the length and width of each electronic device and its physical distance from the electronic device 100, it can be determined whether the electronic device 201, the electronic device 202, the electronic device 203, and the electronic device 204 are fully or partially displayed in the photographing interface of the electronic device 100.
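The check that all four devices fall within the 60° shooting range can be sketched as follows, using the example bearings as best recoverable from the FIG. 4 description (0°, 330°, 330°, and 30° clockwise); the normalization and the device-to-angle table are illustrative assumptions:

```python
def within_shooting_range(device_angle_cw_deg, fov_deg=60.0):
    """Check whether a device lies within the camera's horizontal field of
    view. Angles are measured clockwise from the 0-degree reference
    direction; an angle such as 330 degrees normalizes to -30 degrees."""
    angle = (device_angle_cw_deg + 180.0) % 360.0 - 180.0  # normalize to [-180, 180)
    return abs(angle) <= fov_deg / 2.0

# Example bearings for the four devices of FIG. 4 (clockwise degrees):
devices = {"201": 0.0, "202": 330.0, "203": 330.0, "204": 30.0}
visible = {name for name, a in devices.items() if within_shooting_range(a)}
```

With a 60° field of view, the half-angle is 30°, so 0°, ±30° (ie, 330° and 30° clockwise) are all inside the shooting range, matching the statement above.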
  • The nearby devices of the electronic device 100 are not limited to the four electronic devices in the above FIG. 4; there may be more or fewer of them.
  • FIG. 4 merely explains the present application with four electronic devices and shall not constitute a limitation.
  • The above FIG. 4 exemplarily shows the relative positional relationship between the above four electronic devices (the electronic device 201, the electronic device 202, the electronic device 203, and the electronic device 204) and the electronic device 100, which only exemplarily explains the embodiments of the present application and shall not constitute a limitation.
  • After the electronic device 100 determines the orientation information of the second electronic device, it determines the display image and display area of the second electronic device in the preview image, and displays a device icon in real time on the first interface by means of augmented reality, and the user can trigger the device icon.
  • the electronic device 100 can output the control interface of the second electronic device, so as to realize the user's interaction with the second electronic device.
  • the display area of the device icon on the first interface corresponds to the display area of the second electronic device in the preview image.
  • the first operation of the user triggers the first electronic device to enter the first interface, and the electronic device displays the device icon in real time on the first interface.
  • FIG. 5A illustrates an exemplary user interface 510 on the electronic device 100 .
  • The user interface 510 may include a status bar 511, a tray 512, and one or more application icons. The status bar 511 may include: one or more signal strength indicators 513 of a mobile communication signal (also referred to as a cellular signal), one or more signal strength indicators 514 of a wireless fidelity (Wi-Fi) signal, a Bluetooth indicator 515, a battery status indicator 516, and a time indicator 517.
  • the Bluetooth module of the electronic device 100 is turned on (ie, the electronic device supplies power to the Bluetooth module)
  • the Bluetooth indicator 515 is displayed on the display interface of the electronic device 100 .
  • the tray 512 has icons of commonly used applications, and can display, for example, a phone icon, a contact icon, a text message icon, a camera icon, and the like.
  • the one or more application icons include: a gallery icon, a browser icon, an application store icon, a setting icon, a mailbox icon, a cloud sharing icon, and a memo icon.
  • the electronic device 100 can start and run multiple application programs at the same time to provide users with different services or functions.
  • the electronic device 100 running multiple application programs at the same time means that the electronic device 100 has started multiple application programs, the multiple application programs have not been closed, and the electronic device 100 has not deleted resources such as memory occupied by the multiple application programs.
  • Each application occupies resources such as memory in the background at the same time; it does not require multiple applications to interact with the user in the foreground at the same time.
  • the electronic device 100 starts three applications of mailbox, gallery and instant messaging successively, and runs the three applications of mailbox, gallery and instant messaging at the same time.
  • When a user is using an application, if the user switches to another application or jumps to the desktop, the electronic device 100 will not kill the application previously used by the user, but will keep the previously used application in the multitasking queue as a background application.
  • When the electronic device 100 runs multiple application programs at the same time, it can generate a card corresponding to each application program according to the multitasking queue. The multiple cards on the multitasking interface are arranged horizontally side by side according to a preset ordering strategy. For example, in one ordering strategy, the electronic device 100 arranges the cards corresponding to different application programs in the chronological order in which the application programs were run.
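The chronological ordering strategy described above can be sketched as a simple sort over the multitasking queue. The record format and the "most recently used first" convention are assumptions for this sketch (the application only says cards are arranged by run order):

```python
# Sketch of the card-ordering strategy: one card per application in the
# multitasking queue, arranged side by side by the time each was run.
def order_cards(multitask_queue):
    """multitask_queue: list of (app_name, started_at) records.
    Returns app names ordered most recently run first, ie, the first
    name corresponds to the fully visible card in the interface."""
    return [name for name, _ in
            sorted(multitask_queue, key=lambda record: record[1], reverse=True)]

# Example: mailbox, gallery, and instant messaging started in succession.
queue = [("mailbox", 1), ("gallery", 2), ("instant messaging", 3)]
```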
  • the electronic device 100 displays the multitasking interface 520 after detecting a user operation instructing to open the multitasking interface 520 .
  • the multitasking interface 520 includes cards respectively corresponding to multiple application programs running on the electronic device 100 . There may be various user operations for instructing to open the multitasking interface.
  • the electronic device 100 when the electronic device 100 detects an upward swipe operation for the bottom of the electronic device 100 , in response to the operation, as shown in FIG. 5B , the electronic device 100 displays the multitasking interface 520 .
  • the multitasking interface 520 may include: a card 521 , a card 522 and a delete icon 523 . Among them, the card 521 is displayed in its entirety, and the card 522 is partially displayed.
  • the delete icon 523 can be used to close the application corresponding to the complete card displayed in the current multitasking interface 520 .
  • Closing here refers to deleting resources such as memory occupied by the application.
  • In other embodiments, the delete icon 523 may be used to close the applications corresponding to all cards in the current multitasking interface 520.
  • The multitasking interface 520 shown in the drawings refers to the interface displayed on the touch screen within the frame of the electronic device 100: the part of a card inside the frame of the electronic device 100 can be displayed by the touch screen of the electronic device 100, and the part of a card outside the frame cannot be displayed.
  • the user can switch and display cards by sliding left and right on the multitasking interface 520 .
  • When the electronic device 100 detects a rightward swipe operation on the multitasking interface 520, in response to the operation, the cards on the multitasking interface 520 move to the right in sequence; at this time, the electronic device 100 can display the card 522 completely and display the card 521 partially.
  • When the electronic device 100 detects a leftward swipe operation on the multitasking interface 520, in response to the operation, the cards on the multitasking interface 520 move to the left in sequence. Because the card 521 is the first card from the right in the multitasking interface 520, there is no other card to the right of the card 521.
  • After the electronic device 100 completely displays the card 521 and then detects a leftward sliding operation, as shown in FIG. 5C, in response to this operation the electronic device 100 partially displays a preset area 524; if the user continues to swipe left, as shown in FIG. 5D, the electronic device 100 completely displays the preset area 524. In some embodiments, at this time the electronic device 100 triggers display of the viewfinder interface corresponding to the preset area 524.
  • the viewfinder interface may be a picture captured by a rear camera of the electronic device 100 or a picture captured by a front camera.
  • FIG. 5E exemplarily shows a viewfinder interface 530 .
  • The image captured by the camera is displayed in the viewfinder interface 530 in real time. Optionally, the electronic device 100 may also send a detection request using wireless positioning technology, and the electronic device 100 determines the nearby devices of the electronic device 100 according to the received detection responses to the detection request; further, it determines one or more pieces of information among the device name, the device type, and the physical distance or angle from the electronic device 100 of each nearby device.
  • the electronic device 100 performs image recognition on the image collected by the camera, and recognizes the electronic device (eg, a speaker, a computer, a tablet computer, etc.) in the image.
  • the display content of the viewfinder interface 530 in FIG. 5E is the image captured by the camera, including a device image 531 , a device image 532 , a device image 533 and a device image 534 .
  • The device image 531 is the image of the electronic device 202 captured by the electronic device 100 and displayed in the viewfinder interface 530; the device image 532 is the image of the electronic device 201 captured by the electronic device 100 and displayed in the viewfinder interface 530; the device image 533 is the image of the electronic device 203 captured by the electronic device 100 and displayed in the viewfinder interface 530; the device image 534 is the image of the electronic device 204 captured by the electronic device 100 and displayed in the viewfinder interface 530.
  • the electronic device 100 determines the display area of the corresponding device image of each device in the viewfinder interface 530 according to the physical distance and angle of the electronic device 201 , the electronic device 202 , the electronic device 203 , and the electronic device 204 from the electronic device 100 .
  • the electronic device 100 displays the device icon on the viewfinder interface 530 in real time by means of augmented reality, and the device icon indicates the electronic device corresponding to the device image in the viewfinder interface 530 .
  • the display area of the device icon corresponds to the device image in the viewfinder interface 530 .
  • The device icon may be displayed at a fixed position on the viewfinder interface 530, or may be displayed corresponding to the device image, for example, around the corresponding device image, or at the center of the corresponding device image, etc.
  • the device icon and the display area of the corresponding device image may completely overlap, partially overlap, or not overlap (for example, displayed in the area immediately above the display area of the corresponding device image).
  • The display area of the device icon 5311 completely overlaps the device image 531, and the device icon 5311 indicates that the device name corresponding to the device image 531 is matepad (tablet computer); the display area of the device icon 5321 partially overlaps the device image 532, and the device icon 5321 indicates that the device name corresponding to the device image 532 is HUAWEI soundX (Huawei speaker); the display area of the device icon 5331 completely overlaps the device image 533, and the device icon 5331 indicates that the device name corresponding to the device image 533 is matebook (computer); the display area of the device icon 5341 partially overlaps the device image 534, and the device icon 5341 indicates that the device name corresponding to the device image 534 is matebook (computer).
  • a device icon may also be referred to as a device label.
  • the device icon 5321 may also be referred to as the first label.
  • the display area of the device icon corresponds to the position of the device's positioning chip (eg, UWB chip, Bluetooth chip) in the device image.
  • the electronic device 100 receives the detection response from the positioning chip of the electronic device 201 , and determines the orientation (physical distance and angle from the electronic device 100 ) of the positioning chip of the electronic device 201 . According to the orientation of the positioning chip of the electronic device 201, the electronic device 100 determines the corresponding position of the positioning chip of the electronic device 201 in the viewfinder interface 530, and the electronic device 100 displays the device icon of the device 201 at the corresponding position.
  • The device icon 5311 is displayed at the position of the positioning chip inside the electronic device corresponding to the device image 531.
  • The same applies to the device icon 5321 and the device icon 5331.
  • In some cases, the position of the positioning chip of a device is not in the viewfinder interface 530; for example, the positioning chip of the electronic device 204 corresponding to the device image 534 is not in the viewfinder interface 530. The electronic device 100 can calculate the physical distance and orientation, relative to the electronic device 100, of the key points (such as the four corners of the screen) of the electronic device 204 according to the position and size of the electronic device 204. When the electronic device 100 captures one or more of these key points, the electronic device 100 displays the device icon in the viewfinder interface 530.
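The key-point fallback described above can be sketched as follows: from the device's distance, bearing, and physical width, approximate the bearings of its left and right edges, and show the icon if either edge falls inside the field of view. The geometry (a flat device roughly facing the camera) and all names are simplifying assumptions for illustration:

```python
import math

def corner_bearings(distance_m, bearing_deg, width_m):
    """Approximate clockwise bearings of the left and right edge key points
    of a flat device of physical width width_m, whose center is at the
    given distance and bearing from the phone."""
    half = math.degrees(math.atan2(width_m / 2.0, distance_m))
    return bearing_deg - half, bearing_deg + half

def any_key_point_visible(distance_m, bearing_deg, width_m, fov_deg=60.0):
    """Display the device icon when at least one key point falls inside
    the camera's horizontal field of view."""
    left, right = corner_bearings(distance_m, bearing_deg, width_m)
    half_fov = fov_deg / 2.0
    return any(abs(b) <= half_fov for b in (left, right))
```

For example, a 0.3 m-wide device centered at 32° (just outside a 60° field of view) and 0.8 m away still has its left edge at about 21°, so its icon would be shown.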
  • the device icon can not only indicate the identity information of the device corresponding to the device image, but also can be associated with the control card of the device corresponding to the device image.
  • When the electronic device 100 detects a user operation on a device icon, the electronic device 100 outputs the control card of the device corresponding to the device icon.
  • The electronic device associated with the device icon 5321 is HUAWEI soundX; as shown in FIG. 5H, the electronic device 100 outputs the control card 540 of HUAWEI soundX.
  • The control card 540 may include one or more of the following: an application title bar 601, a connection card 602, a music card 603, a projection card 604, a refresh control 605, and a close control 606. Specifically:
  • The application title bar 601 indicates that the device corresponding to the control card 540 is HUAWEI soundX.
  • The connection card 602 may include indication information 602A and a connection method 602B.
  • the indication information 602A is used to represent whether the device (electronic device 201 ) corresponding to the device image 532 is currently in an online state or an offline state.
  • the online state means that the electronic device 201 is currently connected to the Internet
  • the offline state means that the electronic device 201 is not currently connected to the Internet.
  • the connection method 602B is used to indicate the current connection method between the electronic device 201 and the electronic device 100.
  • When the electronic device 201 is connected to the electronic device 100 via Bluetooth, the connection method 602B can be displayed as a Bluetooth icon.
  • When the electronic device 201 is connected to the electronic device 100 via WiFi, the connection method 602B can be displayed as a WiFi icon.
  • Music card 603 may include music title 603A, pause control 603B, previous control 603C, next control 603D, progress bar 603E, volume 603F, more control 603H.
  • the pause control 603B may receive a user's input operation (eg, a click operation), and in response to the detected user operation, the electronic device 201 pauses playing music.
  • the previous control 603C may receive a user's input operation (eg, a single-click operation), and in response to the detected user operation, the electronic device 201 may play the previous song of the currently playing song in the music list.
  • the next control 603D may receive a user's input operation (eg, a single-click operation), and in response to the detected user operation, the electronic device 201 may play the next song of the currently playing song in the music list.
  • the progress bar 603E may indicate the total duration (eg, 04:42) and the played duration (eg, 00:42) of the current song.
  • the volume 603F may receive a user's input operation (eg, a sliding operation), and in response to the detected user operation, the electronic device 201 adjusts the playback volume of the electronic device 201 .
  • the more controls 603H may receive a user's input operation (eg, a swipe operation), and in response to the detected user operation, the electronic device 100 may display more function options of the music card, such as sharing, deleting, downloading, and the like.
  • the sound projection card 604 is used to instruct the electronic device 100 to output audio to the electronic device 201 .
  • When the electronic device 100 detects a user operation on the projection card 604, in response to the operation, the audio of the electronic device 100 is output to the electronic device 201.
  • the refresh control 605 is used to refresh the display interface of the current control card 540 , and the electronic device 100 obtains the current state of the device 201 again.
  • the close control 606 is used to close the control card 540.
  • When the electronic device 100 detects a user operation on the close control 606, the control card 540 is closed, and the electronic device 100 displays the viewfinder interface 530 as shown in FIG. 5G.
  • the electronic device 100 may also interact with the device image 532 in other ways, which are not specifically limited here. For example, when the electronic device 100 detects a user operation on the device icon 5321, the electronic device 100 can directly open and jump to the application software associated with the electronic device corresponding to the device image 532, and display the application interface of the application software of the device image 532 , such as smart life, sports health and other application software.
  • the viewfinder interface 530 may also be referred to as a first interface.
  • the electronic device 100 determines the orientation information of its nearby devices through computer recognition technology and wireless positioning technology, determines the display image and display area of each nearby device in the preview image of the viewfinder interface 530, and displays the device icons on the shooting interface in real time by means of augmented reality, achieving a real-time preview effect. The user can trigger a device icon, and the electronic device 100 outputs the control interface of the corresponding electronic device, thereby realizing the user's interaction with the nearby devices.
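The augmented-reality preview described above can be sketched roughly as follows. This is an illustrative Python sketch, not the patented implementation; the 70° field of view, the 1080 px preview width, and the linear angle-to-pixel mapping are all assumptions.

```python
# Hypothetical sketch: combine a nearby device's horizontal angle, obtained
# via wireless positioning, with an assumed camera field of view to decide
# where its icon is anchored in the preview image.
def icon_x_position(device_angle_deg, camera_fov_deg=70, image_width_px=1080):
    """Map a device angle (0 = camera axis, negative = left) to an x pixel
    coordinate in the preview image; return None if outside the view."""
    half_fov = camera_fov_deg / 2
    if abs(device_angle_deg) > half_fov:
        return None  # device lies outside the camera's shooting range
    # A linear mapping is a simplification of the real camera projection.
    fraction = (device_angle_deg + half_fov) / camera_fov_deg
    return round(fraction * image_width_px)
```

A device on the camera axis would map to the middle of the preview, while a device beyond half the field of view yields no anchor, matching the off-screen case discussed later.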
  • the electronic device 100 has no application running in the background, and no application running in the multitasking queue, that is, the multitasking interface 520 does not include the card 521 and the card 522 .
  • when the electronic device 100 displays the user interface 510 and detects the corresponding user operation, the electronic device 100 displays a multitasking interface; since there is no card in the multitasking interface, the electronic device 100 directly enters the viewfinder interface 530.
  • the electronic device 100 starts the camera, and captures an image in real time through the camera and displays it on the viewfinder interface 530 .
  • after the electronic device 100 enters the viewfinder interface 530, when there is only one device in the viewfinder interface 530, the user does not need to click the device icon, and the electronic device 100 can directly enter the control interface of that device.
  • the viewfinder interface 530 in FIG. 6A includes a device image 532 , the device icon 5321 is displayed near the device image 532 , and the device icon 5321 partially overlaps the display area of the device image 532 .
  • the electronic device 100 detects that there is only one device in the viewfinder interface 530 , as shown in FIG. 6B , the electronic device 100 directly outputs the control card 540 of the device image 532 .
  • when there is only one device image in the viewfinder interface 530, it can be considered that the user wants to interact with the electronic device corresponding to that device image; the electronic device 100 omits the user's trigger operation and directly enters the control interface of the electronic device, which improves the user experience.
  • the manner of entering the viewfinder interface 530 shown in FIGS. 5A to 5D is optional, and the electronic device 100 may also enter the viewfinder interface 530 in other manners.
  • FIGS. 7A and 7B also provide a way to enter the viewfinder interface 530 .
  • a user interface 510 is shown in FIG. 7A , wherein the description of the user interface 510 may refer to the related description in the above-mentioned FIG. 5A .
  • when the electronic device 100 detects a user operation at the bottom left side of the electronic device, or when the electronic device 100 detects a user operation at the bottom right side of the electronic device, the electronic device 100 displays a user interface 710 as shown in FIG. 7B.
  • the user interface 710 may include one or more of the following: a connection device selection bar 701, a control 702A, a control 702B, a device display bar 703, and a live view control 704. Wherein:
  • the connection device selection bar 701 includes device options (also referred to as device icons) for one or more nearby devices, such as smart screen, matepad, matebook, speakers, etc.
  • the device options displayed in area 1202 can be used to trigger the action of sharing.
  • the electronic device 100 may trigger a process of sharing the selected data or task to the device corresponding to the device option selected by the operation.
  • the process may include: the electronic device 100 establishes a communication connection with the device corresponding to the selected device option, and then transmits the selected data or task to the device corresponding to the device option through the communication connection.
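The two-step sharing process above (establish a communication connection, then transmit the selected data over it) can be sketched as follows; `SharingSession`, its methods, and the device name are hypothetical names introduced for illustration.

```python
# Hypothetical sketch of the sharing flow: the phone first establishes a
# communication connection with the device chosen from the device-selection
# bar, then transmits the selected data or task over that connection.
class SharingSession:
    def __init__(self, target_device):
        self.target_device = target_device
        self.connected = False
        self.sent = []

    def connect(self):
        # Stand-in for pairing over Wi-Fi Direct / Bluetooth / NFC, etc.
        self.connected = True

    def share(self, payload):
        if not self.connected:
            self.connect()  # the connection is established on demand
        self.sent.append(payload)
        return f"sent {payload!r} to {self.target_device}"

session = SharingSession("matepad")
result = session.share("Fiction 1.pdf")
```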
  • Control 702A indicates a preset mode in which one or more devices can be controlled uniformly.
  • the preset mode is the home mode.
  • the electronic devices corresponding to the device icon 703B, the device icon 703C and the device icon 703F are automatically turned on, and the electronic devices corresponding to the device icon 703A and the device icon 703D are automatically turned off.
  • Control 702B indicates another preset mode in which one or more devices can be controlled uniformly.
  • the preset mode is the home-away mode.
  • the electronic devices corresponding to the device icon 703B, the device icon 703C and the device icon 703F are automatically turned off, and the electronic devices corresponding to the device icon 703A and the device icon 703D are automatically turned on.
  • the device display bar 703 includes a plurality of device icons.
  • For example: Huawei AI Speaker 703A, Smart TV 703B, Air Purifier 703C, Smart Desk Lamp 703D, Bluetooth Headset 703E, and Air Conditioning Companion 703F.
  • Any device icon among the plurality of device icons displayed in the device display bar 703 may receive an input operation (eg, a single-click operation) of the user, and in response to the detected input operation, the electronic device 100 displays a control interface of the device.
  • the air purifier 703C includes a control 7031, and the control 7031 is used to turn the air purifier 703C on and off.
  • the Smart Desk Lamp 703D and Air Conditioning Companion 703F also include the same control as the control 7031.
  • Devices such as Huawei AI Speaker 703A and Smart TV 703B cannot be turned on and off through the user interface 710.
  • the live view control 704 is used to trigger entry into the viewfinder interface.
  • when the electronic device 100 detects a user operation on the live view control 704, the electronic device 100 displays the viewfinder interface 530 as shown in FIG. 5F; optionally, the electronic device 100 first displays the viewfinder interface 530 as shown in FIG. 5E, and then displays the viewfinder interface 530 as shown in FIG. 5F.
  • the live view control 704 in the user interface 710 is optional, and the electronic device 100 may not display the live view control 704 .
  • when the electronic device 100 detects a lift operation, the electronic device 100 may display the viewfinder interface 530. FIG. 7C exemplarily shows a lift operation.
  • at time T1, the electronic device 100 displays the user interface 710; when the electronic device 100 detects the lift operation, at time T2 the electronic device 100 displays the viewfinder interface 530, wherein the time interval between time T1 and time T2 is less than a threshold.
  • the lifting operation is only an exemplary user operation, and the electronic device 100 can also enter the viewfinder interface 530 through other user operations.
  • the present application may also activate the camera through, for example, a camera application, thereby entering the viewfinder interface 530; or trigger entry into the viewfinder interface 530 through other applications, such as instant messaging applications, payment applications, and the like.
  • the display forms of the device icon 5311, the device icon 5321, the device icon 5331, and the device icon 5341 in the viewfinder interface 530 shown in FIG. 5F are optional.
  • FIGS. 8A to 8D also provide a display form of the device icon.
  • the device icon can change with the change of the displayed content in the viewfinder interface.
  • at a first moment, when the display area of the device is at a first position in the viewfinder interface, the device icon of the device is displayed at or close to the first position of the viewfinder interface; at a second moment, when the display area of the device is at a second position in the viewfinder interface, the device icon of the device is displayed at or close to the second position of the viewfinder interface.
  • a user interface 530 is shown in FIG. 8A , wherein, for the description of FIG. 8A , reference may be made to the related description of FIG. 5F above.
  • the viewfinder interface 530 in FIG. 8A includes a device image 534, a device icon 5341 is displayed near the device image 534 in the viewfinder interface 530, and the device icon 5341 partially overlaps the display area of the device image 534;
  • the viewfinder interface 530 includes a device image 531; the device icon 5311 is displayed near the device image 531 in the viewfinder interface 530, and the device icon 5311 completely overlaps the display area of the device image 531.
  • the display area of the device icon corresponds to the display area of the corresponding device image.
  • when the display content in the viewfinder interface is constantly changing, the electronic device 100 does not display the device icons; when the duration of the stationary state of the electronic device 100 exceeds a preset time, the electronic device 100 displays the device icons according to the display content in the viewfinder interface.
  • the electronic device 100 may determine the static state of the electronic device 100 through an acceleration sensor and/or a gyro sensor.
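A minimal sketch of such a stationary-state check, assuming the accelerometer reports the magnitude of acceleration in m/s² and that "stationary" means the magnitude stays near gravity for a preset number of consecutive samples (the tolerance and sample count are illustrative assumptions):

```python
# Hypothetical sketch: the device is considered stationary once the measured
# acceleration magnitude stays within a small band around gravity
# (about 9.8 m/s^2) for a preset number of consecutive samples; only then
# are the device icons drawn in the viewfinder interface.
def is_stationary(accel_samples, tolerance=0.3, required_samples=5, gravity=9.8):
    streak = 0
    for magnitude in accel_samples:
        if abs(magnitude - gravity) <= tolerance:
            streak += 1
            if streak >= required_samples:
                return True
        else:
            streak = 0  # motion detected, restart the stationary timer
    return False
```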
  • the display area of the device icon is related to the display area of other device icons, for example, the display areas between the device icons do not block each other.
  • the shooting direction or angle of FIG. 8B is different from that of FIG. 8A .
  • in the viewfinder interface 810 in FIG. 8B, a larger part of the device image 534 is displayed, and the device icon 5341 completely overlaps the display area of the device image 534.
  • the device image 531 and the device image 533 are partially overlapped, and the device icon 5311 is displayed above the device image 531 , close to the display area of the device image 531 .
  • the device icon 5311 does not overlap the display area of the device image 531 .
  • the display area of the device icon may change with the change of the display area of the device image.
  • the display area of the device icon may be at the center position (or any position) of the display area of the device image; the display area of the device icon may also be immediately above (below, to the left of, or to the right of) the display area of the device image, etc.
  • when the device is not within the shooting range of the camera, the viewfinder interface of the electronic device 100 does not include the device image of the device.
  • the device icon of the device can be displayed in the viewfinder interface in a specific way.
  • the electronic device 100 enters the viewfinder interface, activates the camera, and displays the image collected by the camera in real time in the viewfinder interface; at the same time, the electronic device 100 sends a detection request using wireless positioning technology and, according to the received detection responses to the detection request, determines the nearby devices of the electronic device 100 and, further, determines one or more of the device name, the device type, and the physical distance or angle relative to the electronic device 100 of each nearby device.
  • the electronic device 100 performs image recognition on the image collected by the camera, and recognizes the electronic device (eg, a speaker, a computer, a tablet computer, etc.) in the image.
  • the electronic device 100 receives four probe responses and detects that there are four electronic devices nearby, the probe responses carry the device's identity information, such as device name, device type and other information.
  • the electronic device 100 determines the device names, device types, etc. of the four electronic devices, for example, matepad (device type: tablet computer), HUAWEI soundX (device type: speaker), matebook (device type: computer), and matebook (device type: computer); and determines the orientation information (physical distance and angle relative to the electronic device 100) of the four electronic devices through wireless positioning technology.
  • the electronic device 100 determines, through the orientation information of the four electronic devices, that one of the electronic devices is not within the shooting range of the camera of the electronic device 100; alternatively, the electronic device 100 recognizes the device types of the three electronic devices in the image through computer vision technology and, combined with the device types of the four electronic devices, determines which electronic device (and its device type) is not in the image; the electronic device 100 then displays the device icon of the electronic device that is not in the image in the image in a first preset manner.
  • the first preset manner may be, for example, display at a fixed position on the viewfinder interface, or display at a position related to the orientation information.
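The matching step above (comparing the devices reported by probe responses against the device types recognized in the image to find the one not in the picture) can be sketched as a multiset difference; the function and data names are hypothetical:

```python
from collections import Counter

# Hypothetical sketch: the types reported in the probe responses are compared
# with the types recognized in the camera image; whatever is left over must be
# outside the shooting range (or occluded) and gets its icon drawn in the
# first preset manner.
def devices_not_in_image(probed, recognized_types):
    """probed: list of (name, type); recognized_types: types seen in image."""
    remaining = Counter(recognized_types)
    missing = []
    for name, dev_type in probed:
        if remaining[dev_type] > 0:
            remaining[dev_type] -= 1  # matched to an image of the same type
        else:
            missing.append(name)  # no image of this type left to match
    return missing

probed = [("matepad", "tablet"), ("HUAWEI soundX", "speaker"),
          ("matebook-1", "computer"), ("matebook-2", "computer")]
missing = devices_not_in_image(probed, ["speaker", "computer", "computer"])
```

With the four probed devices above and only three recognized images, the tablet is the leftover device, mirroring the matepad example in FIG. 8B.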
  • the device image 531 is a partial display image of the device 202; the device icon 5311 is displayed near the device image 531 in the viewfinder interface 530; the device icon 5341 completely overlaps the display area of the device image 534.
  • the shooting direction or angle of FIG. 8B is different from that of FIG. 8A .
  • the device 202 is not within the shooting range of the camera.
  • the viewfinder interface 810 in FIG. 8B does not include the device image of the device 202 .
  • the viewfinder interface 810 displays an icon 801 and a prompt 802 .
  • the prompt 802 is used to remind the user that the icon 801 is a special icon; the icon 801 is displayed on the left edge of the viewfinder interface 810 of the electronic device 100 to remind the user that the device matepad exists outside the shooting range of the camera of the electronic device 100.
  • the icon 801 can trigger the electronic device to display a control interface of the device image 531 .
  • icon 801 or prompt 802 may indicate the orientation of the device (including angle, distance, etc.).
  • the icon 801 is displayed on the left edge of the viewfinder interface 810 of the electronic device, prompting the user that there is a device matepad on the left side of the electronic device 100 outside the shooting range of the camera of the electronic device 100 .
  • the orientation of the device can also be indicated by text.
  • the icon 801 or the prompt 802 may also be referred to as the third label.
  • the device image of the device is not included in the viewfinder interface of the electronic device 100 .
  • the device icon of the device can be displayed in the viewfinder interface in a specific way.
  • the electronic device 100 receives four probe responses and detects that there are four electronic devices nearby, the probe responses carry the device's identity information, such as device name, device type and other information.
  • the electronic device 100 determines the device names, device types, etc. of the four electronic devices, for example, matepad (device type: tablet computer), HUAWEI soundX (device type: speaker), matebook (device type: computer), and matebook (device type: computer); and determines the orientation information (physical distance and angle relative to the electronic device 100) of the four electronic devices through wireless positioning technology.
  • the electronic device 100 determines, through the orientation information of the four electronic devices, that the four electronic devices are all within the shooting range of the camera of the electronic device 100, and determines that one of the electronic devices is blocked.
  • the electronic device 100 recognizes the device types of the three electronic devices in the image through computer vision technology and, combined with the device types of the four electronic devices, determines which electronic device (and its device type) is blocked; the electronic device 100 displays the device icon of the blocked electronic device in the image in a second preset manner.
  • the second preset manner may be, for example, display at a fixed position on the viewfinder interface, or display at a position related to the orientation information.
  • the device image 532 is not included in the viewfinder interface 820.
  • the viewfinder interface 820 displays an icon 803 and a prompt 804 .
  • the prompt 804 is used to prompt the user that the icon 803 is a special icon; the icon 803 is displayed in the middle area of the viewfinder interface 820 of the electronic device 100, prompting the user that there is a device HUAWEI soundX within the shooting range of the camera of the electronic device 100.
  • the icon 803 can trigger the electronic device 100 to display the control interface of the device image 532 (ie, HUAWEI soundX).
  • icon 803 or prompt 804 may indicate the orientation of the device (including angle, distance, etc.).
  • the icon 803 is displayed above the device image 533 in the viewfinder interface 820 of the electronic device 100 to remind the user that the device HUAWEI soundX is blocked by the device 203 corresponding to the device image 533.
  • the orientation of the device can also be indicated by means of text (for example, HUAWEI soundX is directly behind the device corresponding to the device image 533).
  • the display area of the icon 803 does not overlap with the display areas of other device images and device icons.
  • the electronic device 100 determines the display area of the icon 803 according to the display areas of the device image 531 , the device image 533 , the device image 534 , the device icon 5311 , the device icon 5331 , and the device icon 5341 displayed in the viewfinder interface 820 .
  • the icon 803 or the prompt 804 may also be referred to as the second label.
  • the electronic device 100 identifies the type of the other device (such as a mobile phone, tablet, TV, speaker, etc.) through computer vision, and searches whether there is a device of the corresponding device type among the devices that log in to the same account as the electronic device 100.
  • the electronic device 100 receives three probe responses and detects that there are three electronic devices nearby, the probe responses carry device identity information, such as device name, device type and other information.
  • the electronic device 100 determines that the three electronic devices are matepad (device type: tablet computer), HUAWEI soundX (device type: speaker), and matebook (device type: computer); and determines the orientation information (physical distance and angle relative to the electronic device 100) of the three electronic devices through wireless positioning technology.
  • the electronic device 100 determines the display areas of the images of the four electronic devices in the viewfinder interface through computer vision recognition technology, and determines that the device types of the four electronic devices are tablet computer, speaker, computer, and computer, respectively. The electronic device 100 then searches whether there is a computer among the devices that log in to the same account as the electronic device 100. Each electronic device has its own login account, and one account can be bound to one or more electronic devices. The electronic device 100 searches, under its own account, for a bound electronic device whose device type is computer. If such a device exists, the electronic device 100 considers that the computer is associated with the device image in the image, and displays the device icon of the computer in the image in a preset manner. The preset manner may be, for example, display at a fixed position on the viewfinder interface, or display at a position related to the display area of the device image in the image.
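A minimal sketch of the account-based lookup described above, with an assumed account-to-devices table; the account name and table contents are illustrative only:

```python
# Hypothetical sketch: look up, among the devices bound to the phone's own
# login account, one whose type matches a type recognized in the image but
# missing from the probe responses (here, "computer").
ACCOUNT_DEVICES = {  # account -> list of (device name, device type); assumed data
    "user@huawei": [("matebook", "computer"), ("HUAWEI soundX", "speaker")],
}

def find_account_device(account, wanted_type):
    for name, dev_type in ACCOUNT_DEVICES.get(account, []):
        if dev_type == wanted_type:
            return name  # considered associated with the device image
    return None  # no bound device of that type under this account
```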
  • the viewfinder interface 830 includes a device icon 805 and a prompt 806 .
  • the prompt 806 is used to indicate that the device icon 805 is an indeterminate icon, indicating that the correspondence between the device corresponding to the device image 533 and the device icon 805 is uncertain.
  • the device icon 805 or the icon 806 may also be referred to as the fourth label.
  • the electronic device 100 identifies the type of the device (such as a mobile phone, tablet, TV, speaker, etc.) through computer vision, and uses the GPS information of the electronic device 100 itself to find out whether there is a device of the corresponding device type in the same geographical location as the electronic device 100.
  • the electronic device 100 receives three probe responses and detects that there are three electronic devices nearby, the probe responses carry device identity information, such as device name, device type and other information.
  • the electronic device 100 determines that the three electronic devices are matepad (device type: tablet computer), HUAWEI soundX (device type: speaker), and matebook (device type: computer); at the same time, the electronic device 100 determines the orientation information (physical distance and angle relative to the electronic device 100) of the three electronic devices through wireless positioning technology.
  • the electronic device 100 determines the display areas of the images of the four electronic devices in the viewfinder interface through computer vision recognition technology, and determines that the device types of the four electronic devices are tablet computer, speaker, computer, and computer, respectively. Then, the electronic device 100 searches whether there is a computer among the electronic devices in the same geographical location as the electronic device 100.
  • the configuration of the geographic location may be as follows.
  • for example, when the electronic device 100 is paired and connected with a smart desk lamp, the user configures the device location of the smart desk lamp as the room in the application software associated with the smart desk lamp (such as Smart Life); when the electronic device 100 is paired and connected with a smart speaker, the user configures the device location of the smart speaker as the living room in the application software associated with the smart speaker (such as Smart Life); when the electronic device 100 is paired and connected with a computer, the user configures the device location of the computer as the company in the application software associated with the computer (such as Smart Life); and so on.
  • the electronic device 100 determines the area where it is located according to its own geographic location.
  • the electronic device 100 obtains its own location through GPS positioning, for example, the company; the electronic device 100 then searches, among the electronic devices whose device location is configured as the company, whether there is an electronic device whose device type is computer. If such a device exists, the electronic device 100 considers that the computer is associated with the device image in the image.
  • the electronic device 100 displays the device icon of the computer in the image in a preset manner.
  • the preset manner may be, for example, display at a fixed position on the viewfinder interface, or display at a position related to the display area of the device image in the image. For this part, reference may be made to the related description of FIG. 8D above.
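The geographic-location lookup can be sketched similarly, using an assumed table of user-configured device locations (room, living room, company) as described above; all names and table entries are illustrative:

```python
# Hypothetical sketch: each paired device has a user-configured location; the
# phone uses its own current location (from GPS) to filter candidate devices
# of the wanted device type.
DEVICE_LOCATIONS = [  # (device name, device type, configured location); assumed data
    ("smart desk lamp", "lamp", "room"),
    ("smart speaker", "speaker", "living room"),
    ("matebook", "computer", "company"),
]

def find_colocated_device(current_location, wanted_type):
    for name, dev_type, location in DEVICE_LOCATIONS:
        if location == current_location and dev_type == wanted_type:
            return name  # considered associated with the device image
    return None  # no device of that type configured at this location
```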
  • the device icon 805 or the icon 806 may also be referred to as the fifth label.
  • if other devices do not support an identifiable wireless positioning technology and the electronic device 100 cannot correctly identify the location information of two devices of the same type, the electronic device 100 outputs two labels for the user to select.
  • the images collected by the electronic device 100 include images of two electronic devices.
  • the electronic device 100 determines the display areas of the images of the two electronic devices in the viewfinder interface through computer vision recognition technology, and determines that the device types of the two electronic devices are both speakers.
  • the electronic device 100 does not receive a probe response and cannot determine the orientation of the two speakers.
  • the electronic device 100 can use the methods described in the above two embodiments: searching whether there is a device of the corresponding device type among the devices that log in to the same account as the electronic device 100, or searching whether there is a device of the corresponding device type among the devices in the same geographic location as the electronic device 100. If the electronic device 100 determines a device whose device type is speaker according to the two methods, the electronic device 100 displays the device icon of the speaker in the image in a preset manner.
  • the preset manner may be, for example, display at a fixed position on the viewfinder interface, or display at a position related to the display area of the device image in the image.
  • if the electronic device 100 determines, according to the two methods, two devices whose device type is speaker, since the electronic device 100 cannot match the device icons of the two speakers one-to-one with the two speaker images in the image, the electronic device 100 displays the device icons of the two speakers in the image in a preset manner.
  • the preset mode may be displayed at a fixed position on the viewfinder interface, or may be displayed on the viewfinder interface in the form of a control.
  • when the electronic device 100 detects a user operation on the control, the electronic device 100 outputs the two device icons for the user to select.
  • the present application also shows a display form of the device icons, which can achieve the effect that the display areas between the device icons do not overlap.
  • FIG. 8E shows a display form in which the device icon is displayed in the upper area of the display area of the device image via a connecting line. As shown in FIG. 8E:
  • the device image 531 and the device icon 5311 are connected by a line segment, indicating that the device icon 5311 corresponds to the device image 531; the device image 532 and the device icon 5321 are connected by a line segment, indicating that the device icon 5321 corresponds to the device image 532; the device image 533 and the device icon 5331 are connected by a line segment, indicating that the device icon 5331 corresponds to the device image 533; the device image 534 and the device icon 5341 are connected by a line segment, indicating that the device icon 5341 corresponds to the device image 534.
  • when the electronic device 100 detects that the display areas of the device icons overlap each other, or that the closest distance between the display areas of two device icons in the viewfinder interface is less than a threshold, the electronic device 100 outputs the device icons as shown in FIG. 8E, so that the display areas of the device icons do not overlap.
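A minimal sketch of the overlap test that triggers the leader-line layout of FIG. 8E; icons are modeled as axis-aligned rectangles `(x, y, width, height)` and, for brevity, only the horizontal gap is measured (a real layout would also consider the vertical dimension). The threshold value is an assumption.

```python
# Hypothetical sketch: if two icon rectangles overlap, or the gap between them
# falls below a minimum, switch to the leader-line display form of FIG. 8E.
def horizontal_gap(a, b):
    """Smallest horizontal distance between two rects; 0 if they overlap."""
    left, right = (a, b) if a[0] <= b[0] else (b, a)
    return max(0, right[0] - (left[0] + left[2]))

def needs_leader_lines(icons, min_gap=20):
    for i in range(len(icons)):
        for j in range(i + 1, len(icons)):
            if horizontal_gap(icons[i], icons[j]) < min_gap:
                return True  # icons too close: use connecting lines instead
    return False
```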
  • the present application also provides a data transmission method, where the user can quickly select data (such as pictures, documents, videos, etc.) on the viewfinder interface 530 through a sliding operation (or a click operation, etc.) and share it to other devices.
  • Application Scenario 1: in the UI embodiment exemplarily shown in FIG. 9A to FIG. 9E, the user can trigger the sharing function based on the augmented reality display in the multitasking interface, and share an application, or data of an application, in the multitasking interface to other devices.
  • a user interface 520 is shown in FIG. 9A , wherein, for the description of the user interface 520 , reference may be made to the related description of the above-mentioned FIG. 5B .
  • the electronic device 100 detects the long-press operation 901 on the card 521 , the electronic device enters the sharing interface corresponding to the card 521 .
  • the electronic device 100 extracts the application program corresponding to the card 521 and the data types that can be shared in the current interface of the card 521, and presents them on the sharing interface in the form of icons.
  • the electronic device 100 activates the camera, captures images in real time through the camera and displays them on the sharing interface 920 , and the display content in the sharing interface 920 includes the images captured by the camera.
  • FIG. 9B exemplarily shows a sharing interface 920 , and the sharing interface 920 includes a device image 531 , a device image 532 , a device image 533 and a device image 534 .
  • for the descriptions of the device image 531, the device image 532, the device image 533, the device image 534, the device icon 5311, the device icon 5321, the device icon 5331, and the device icon 5341, reference may be made to the related descriptions above, which are not repeated here.
  • the sharing interface 920 may further include one or more icons, each of the one or more icons identifies a type of shareable data, such as an application icon 902 and a file icon 903 .
  • the application icon 902 is associated with the application of the card 521 ;
  • the file icon 903 is associated with the PDF document of “Fiction 1” in the card 521 .
  • the user can drag the icon onto the display area of the corresponding device by dragging and dropping, and after the user lets go, the electronic device 100 sends the data corresponding to the icon to the device.
  • the electronic device 100 sends the PDF document of "Fiction 1" associated with the file icon 903 to the electronic device (electronic device 204) corresponding to the device image 534 through wireless communication.
  • wireless communication methods include but are not limited to ZigBee, Bluetooth, wireless broadband (Wi-Fi), ultra wideband (UWB), near field communication (NFC), Wi-Fi Direct, etc.
  • when the user drags the file icon 903 to the display area of the device image 534, the electronic device 100 increases the brightness of the display area of the device image 534 on the sharing interface 920 to indicate that the user has currently dragged the file icon 903 into the active area of the device image 534.
  • the user drags the file icon 903 to the display area of the device icon 5341, and after the user lets go, the electronic device 100 sends the PDF document of "Fiction 1" associated with the file icon 903 through wireless communication. to the electronic device (electronic device 204) corresponding to device image 534.
  • the electronic device 204 receives the PDF document of “Fiction 1” sent by the electronic device, and outputs a prompt box 1001 on the display interface 1000 of the electronic device 204.
  • the text content of the prompt box 1001 may be "A PDF file has been received from the electronic device; tap the prompt box to view it".
  • when the electronic device 204 detects the click operation on the prompt box 1001, the electronic device 204 opens the PDF document of “Fiction 1”; as shown in FIG. 9E, the display interface 1002 of the electronic device 204 displays the PDF document of “Fiction 1”.
  • FIG. 9D is optional; after the electronic device 204 receives the PDF document of “Fiction 1” sent by the electronic device 100, the electronic device 204 may directly open the document, as shown in FIG. 9E.
  • the icon to be shared may also be referred to as the first icon.
  • the user drags the file icon 903 on the sharing interface 920, and drags the file icon 903 to the effective area of the device image 534, wherein this dragging operation may also be referred to as a third operation.
  • the electronic device 100 can determine, according to the data type that the user wants to share, whether the target device supports outputting that data type. If it is not supported, the electronic device 100 outputs prompt information to prompt the user to select a device other than the target device.
  • the user drags the file icon 903 to the display area of the device image 532. Since the device type of the electronic device 201 corresponding to the device image 532 is an audio device, and the device attributes of the electronic device 201 do not include a display function, when the electronic device detects that the user drags the file icon 903 to the display area of the device image 532, the electronic device 100 outputs a prompt message 1100 "HUAWEI soundX cannot perform this task", indicating that the electronic device corresponding to the device image 532 cannot output the PDF document corresponding to the file icon 903.
  • the electronic device 100 outputs prompt information 1100 .
  • the user may be prompted for selectable devices for data sharing through the display form of device icons.
  • the user selects the file icon 903. Since the file icon 903 is associated with the PDF document of "Fiction 1", when the electronic device detects that the file icon 903 is selected, the display areas of the device icon 5311, the device icon 5331 and the device icon 5341 in the viewfinder interface 920 are highlighted (or their icon colors are changed, etc.); optionally, the display areas of the device image 531, the device image 533 and the device image 534 are highlighted.
  • the brightness (or color, etc.) of the display area of the device icon 5321 is different, indicating that the electronic device 201 corresponding to the device icon 5321 does not support outputting the PDF document associated with the file icon 903, and prompting the user not to drag the file icon 903 into the display area of the device image 532.
  • the display form of the device icon 5311, the device icon 5331 and the device icon 5341 may also be referred to as the first display form, and the display form of the device icon 5321 may also be referred to as the second display form; the device icons may also have more display forms, which is not limited in this application.
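The capability check and the two icon display forms described above can be sketched together as below. The device-attribute field (`outputs`) and the data-type strings are illustrative assumptions; the embodiment does not specify a data model.

```python
# Sketch: pick a device icon's display form from the device's attributes,
# and produce the "cannot perform this task" prompt on an invalid drop.
# Devices that can output the shared data type get the first (highlighted)
# display form; others get the second display form.

def display_form(device_attrs, shared_type):
    return "first" if shared_type in device_attrs["outputs"] else "second"

def try_drop(device_name, device_attrs, shared_type):
    if shared_type not in device_attrs["outputs"]:
        return f"{device_name} cannot perform this task"   # e.g. prompt 1100
    return f"sent to {device_name}"

sound_x = {"outputs": {"audio"}}                    # audio device, no display
matepad = {"outputs": {"audio", "pdf", "picture"}}  # device with a display

form_sound_x = display_form(sound_x, "pdf")   # second form: not a valid target
form_matepad = display_form(matepad, "pdf")   # first form: valid target
msg = try_drop("HUAWEI soundX", sound_x, "pdf")
```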
  • the user can trigger the sharing function based on the augmented reality display through a screenshot operation, and share the screenshot image to other devices.
  • FIG. 11A shows a user interface 1110.
  • the user interface 1110 may be any display interface in the electronic device.
  • when the electronic device 100 displays the user interface 1110 and receives a screenshot operation, the electronic device 100 collects the display content of the current interface and generates a picture file.
  • the screenshot operation may be triggered by one or more virtual keys, or may be triggered by one or more physical keys.
  • the electronic device 100 receives the screenshot operation, collects the displayed content of the current interface, and generates a picture file.
  • Screenshot thumbnails 1111 are displayed on the current user interface 1110 .
  • the screenshot thumbnail 1111 is associated with a corresponding image file.
  • the user presses the screenshot thumbnail 1111 for a long time.
  • the sharing function is triggered.
  • the electronic device displays a sharing interface 1120 as shown in FIG. 11D .
  • the electronic device activates the camera, captures images in real time through the camera and displays them on the sharing interface 1120 , and the display content in the sharing interface 1120 includes the images captured by the camera.
  • FIG. 11D exemplarily shows a sharing interface 1120
  • the sharing interface 1120 includes a device image 531 , a device image 532 , a device image 533 and a device image 534 .
  • the sharing interface 1120 also includes a screenshot thumbnail 1111 .
  • the user can freely drag the screenshot thumbnail 1111.
  • the electronic device 100 sends the image file associated with the screenshot thumbnail 1111 to the device.
  • the principle by which the user drags the screenshot thumbnail 1111 to the display area of another device to share is similar to that of dragging the file icon 903 to the display area of another device to share. Therefore, for the implementation of sharing by dragging the screenshot thumbnail 1111 to the display area of another device, reference may be made to the corresponding description of dragging the file icon 903, for example, the implementations and corresponding descriptions of FIG. 9C to FIG. 9E, which are not repeated here.
  • Application Scenario 3: in the UI embodiments exemplarily shown in FIGS. 12A to 12E, when the electronic device detects an operation of selecting a picture for sharing, the user can trigger the sharing function based on the augmented reality display and share one or more picture files to other devices.
  • FIG. 12A exemplarily shows a user interface 1201 .
  • the user interface 1201 may include one or more of the following areas: area 1201, area 1202, and area 1203, wherein:
  • Area 1201 may be used to display one or more pictures in the gallery, and the one or more pictures may include pictures selected by the user, such as selected picture 1205.
  • a mark 1206 may be displayed on the selected picture 1205, and the mark 1206 may indicate that the corresponding picture 1205 is selected by the electronic device 100 (ie, the picture has been selected by the user).
  • the user can make a left or right swipe gesture in the area 1201 to switch or update the picture.
  • the pictures 1205 may be thumbnails. The original images corresponding to the images displayed in this area may be stored on the electronic device 100 or stored on a cloud server.
  • One or more service options may be displayed in area 1203.
  • the application or protocol corresponding to the service option can support sharing the picture selected by the user to a contact or a server.
  • in response to a detected operation acting on a service option in area 1203 (eg, a touch operation on an "info" icon), the electronic device 100 may trigger a process of sharing the selected picture to a contact or server through the application or protocol corresponding to the service option. The process may include: the electronic device 100 opens the application or protocol and displays its user interface, detects the user's data sharing operation in the user interface, and, in response to the operation, shares the selected picture to a cloud contact or server through the application or protocol.
  • the area 1202 can be used to display nearby device options discovered by the electronic device 100, such as smart screens, mate 30 Pro, matebook X, printers, and so on.
  • the device options (eg mate 30 Pro, matebook X) displayed in area 1202 can be used to trigger the action of sharing.
  • the electronic device 100 may trigger a process of sharing the selected picture to the device corresponding to the device option selected by the operation.
  • the process may include: the electronic device 100 establishes a communication connection with the device corresponding to the selected device option, and then transmits the selected picture to the device corresponding to the device option through the communication connection.
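The two-step sharing process above (establish a communication connection with the device corresponding to the selected option, then transmit the selected pictures over that connection) can be sketched as follows; the `Connection` class and its method names are illustrative assumptions.

```python
# Sketch of the option-triggered sharing process: step 1, establish a
# communication connection with the selected peer; step 2, transmit each
# selected picture over the connection.

class Connection:
    def __init__(self, peer):
        self.peer = peer     # the device corresponding to the device option
        self.sent = []       # items transmitted over this connection

    def transmit(self, item):
        self.sent.append(item)

def share_selected(peer_name, selected_pictures):
    conn = Connection(peer_name)     # step 1: establish the connection
    for pic in selected_pictures:    # step 2: transmit the selected pictures
        conn.transmit(pic)
    return conn

conn = share_selected("mate 30 Pro", ["picture 1205"])
```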
  • the user interface 1210 further includes a live view sharing control 1204, and the live view sharing control 1204 is used to trigger entry into the sharing interface.
  • when the electronic device 100 detects a user operation on the live view control 1204, the electronic device 100 activates the camera and displays the sharing interface 1220 as shown in FIG. 12B.
  • the sharing interface includes an image captured by the camera, a device icon, and a picture column 1221 to be shared.
  • the live view control 1204 in the user interface 1210 is optional, and the electronic device 100 may not display the live view control 1204 .
  • the electronic device 100 triggers the display of the sharing interface 1220 .
  • for the lift operation of the electronic device 100, reference may be made to the description of FIG. 7C.
  • the electronic device 100 displays the user interface 1210; when the electronic device 100 detects the lift operation at time T1, then at time T2 the electronic device 100 displays the sharing interface 1220, where the time interval between time T1 and time T2 is less than a threshold.
  • the lifting operation is only an exemplary user operation, and the electronic device 100 can also enter the sharing interface 1220 through other user operations.
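The timing relationship between the lift operation at time T1 and the display of the sharing interface at time T2 can be sketched as a simple threshold check. The concrete threshold value below is an illustrative assumption; the embodiment does not fix one.

```python
# Sketch: the sharing interface should be displayed within a threshold
# interval after the lift operation (T2 - T1 < threshold).

THRESHOLD_MS = 500   # illustrative value; not specified by the embodiment

def sharing_interface_shown_in_time(t1_ms, t2_ms, threshold_ms=THRESHOLD_MS):
    """True if the interface at T2 follows the lift at T1 within the threshold."""
    return 0 <= t2_ms - t1_ms < threshold_ms

ok = sharing_interface_shown_in_time(1000, 1200)    # 200 ms later: in time
late = sharing_interface_shown_in_time(1000, 1800)  # 800 ms later: too late
```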
  • the picture bar 1221 is used to display one or more pictures in the gallery, and the one or more pictures may include pictures selected by the user, such as the selected picture 1205 .
  • a mark 1206 may be displayed on the selected picture 1205, and the mark 1206 may indicate that the corresponding picture 1205 is selected by the electronic device (ie, the picture has been selected by the user).
  • the user can make a left or right swipe gesture in the area 1201 to switch or update the picture.
  • after the user selects one or more pictures and selects any device in the sharing interface, when the electronic device 100 detects a user operation on the device icon (such as a click operation on the device icon), the electronic device 100 can trigger the sharing process.
  • the process may include: the electronic device 100 establishes a communication connection with the device corresponding to the selected device icon, and then transmits the selected picture to the device corresponding to the device icon through the communication connection.
  • the electronic device 100 detects the user operation on the device icon 5311 and sends the picture 1205 to the electronic device 202 corresponding to the device image 531 .
  • the electronic device 202 receives the picture 1205 sent by the electronic device 100, and outputs a prompt box 1211 on the display interface of the electronic device 202.
  • the text content of the prompt box 1211 can be "Received a picture from the electronic device; tap the prompt box to view it".
  • when the electronic device 202 detects a click operation on the prompt box 1211, the electronic device 202 opens the picture 1205, as shown in FIG. 12E.
  • the step shown in FIG. 12D is optional: the electronic device 202 receives the picture 1205 sent by the electronic device and directly opens the picture, as shown in FIG. 12E.
  • the user may be prompted for selectable devices for data sharing through the display of device icons.
  • the user selects the picture 1205. Since the data type of the picture 1205 is picture, when the electronic device 100 detects that the picture 1205 is selected, the display areas of the device icon 5311, the device icon 5331 and the device icon 5341 in the viewfinder interface 920 are highlighted; optionally, the display areas of the device image 531, the device image 533 and the device image 534 are highlighted. This indicates that the electronic device 202, the electronic device 203 and the electronic device 204, corresponding to the device icon 5311, the device icon 5331 and the device icon 5341 respectively, are devices that support outputting the picture 1205, and prompts the user to click the device icons of these devices to share data.
  • the brightness (or color, etc.) of the display area of the device icon 5321 is different, indicating that the electronic device 201 corresponding to the device icon 5321 does not support outputting the picture 1205 , prompting the user not to click the device icon 5321 of the device image 532 .
  • the display form of the device icon may also have more forms, which is not limited in this application.
  • an embodiment of the present application further provides a method for sharing photos, and a user can quickly share photos on a shooting preview interface of a camera application. The method of sharing photos is described in detail below.
  • FIG. 13 is an example of a schematic diagram of a graphical user interface (GUI) of a process of sharing a photo.
  • FIG. 13(a) shows the interface content 1301 currently output by the mobile phone in the unlocked state; the interface content 1301 displays a variety of application programs (applications, apps), such as Music, Settings, Photo Album and Camera. It should be understood that the interface content 1301 may also include other application programs, which is not limited in this embodiment of the present application.
  • the shooting preview interface 1302 may include a preview screen in the middle, and buttons and menu options of the camera application displayed in the top and bottom areas of the interface. In subsequent embodiments, both "shooting preview interface" and "preview screen" may be used to describe the shooting interface of the camera application; for example, "display a reminder window on the shooting preview interface" and "display a reminder window in the preview screen" are not strictly distinguished, and no further description will be given later.
  • the shooting preview interface in the embodiments of the present application may denote an interface including a preview screen, a shooting shutter button, a local album icon, a camera switching icon, and the like. If the content displayed on the interface changes, for example, a label of an identified device is displayed, the interface can still be called a shooting preview interface, which will not be described in detail later.
  • the main interface 1302 of the camera application includes various buttons and menu options, such as the shooting shutter button 31, the local album icon 32 and the camera switching button 33, etc.
  • the user can implement different operations through various buttons and menu options.
  • the user can perform operation 1 as shown in (b) of FIG. 13, click the shooting shutter button 31, and in response to the user's shooting operation, the mobile phone takes a photo and saves the taken photo in the local album.
  • the user can perform operation 2 as shown in (b) of FIG. 13: click the local album icon 32 on the main interface 1302 of the camera application; in response to the user's click operation, the mobile phone enters the photo display interface 1303.
  • the photo display interface 1303 can display the currently taken photo, as shown in (c) of FIG. 13.
  • the photo sharing interface 1304 may include a photo area and a sharing menu area; the photo area may display multiple captured photos, and the user may click the "select" box in the lower right corner of a photo to select the photo to be shared.
  • the sharing menu area can provide the user with a variety of photo sharing methods, such as "Huawei Share", "Send to a friend", "Bluetooth", "Weibo", "Messages", "Email", "Memo", etc.; different photo sharing methods can be associated with different applications (such as WeChat, etc.), and details are not repeated here.
  • the user can select the icon of the target electronic device to be shared according to his own needs, so as to share the selected photo to the target electronic device.
  • a receiving window may pop up on the target electronic device, and the receiving window may be used to select whether to receive the currently shared photo.
  • the above describes the process of sharing the photo to other electronic devices after the user takes a photo through the camera application.
  • the steps of this process are as follows: the user takes a photo, opens the gallery, selects the image, clicks share, selects the sharing method, searches for other electronic devices, selects the target electronic device, and transfers the image. This process of sharing photos between electronic devices is cumbersome, involves many interactions, and shares photos inefficiently.
  • an embodiment of the present application provides a method for sharing photos.
  • a user can quickly share photos to other electronic devices through a camera application.
  • FIG. 14 is a schematic diagram of a graphical user interface of an example of a process of sharing a photo according to an embodiment of the present application.
  • FIG. 14(a) shows the interface content 1401 currently output by the mobile phone in the unlocked state. The user clicks the icon of the camera application, and in response to the user's click operation, the mobile phone displays the shooting preview interface 1402 shown in (b) of FIG. 14. On the shooting preview interface 1402, the user clicks the shooting shutter button 31, and in response to the user's shooting operation, the mobile phone takes a photo and saves the captured photo in the local album.
  • the user performs the operation as shown in (c) of FIG. 14 and long presses the local album icon 32.
  • the mobile phone displays the interface 1404 as shown in (d) of FIG. 14 .
  • the interface 1404 displays a thumbnail photo icon 30, also referred to as a "photo thumbnail".
  • the mobile phone activates the device identification function, and identifies whether the preview image includes other electronic devices according to the preview image presented on the current shooting preview interface 1404 .
  • the mobile phone can identify the mobile phone 10 and the PC 20 in the preview screen, and the recognized name of the mobile phone 10 and the name of the PC 20 are displayed on the interface 1404; for example, the mobile phone 10 is "P40" and the PC 20 is "MateBook".
  • the mobile phone may not display the recognized names of other electronic devices in the preview screen, but only mark “device 1", “device 2", etc., which is not limited in this embodiment of the present application.
  • the preview images presented in (b) and (c) of FIG. 14 may be obtained by the front camera or the rear camera of the mobile phone; the camera used for taking pictures is not limited in this embodiment of the present application.
  • the person photo in (b) of FIG. 14 is obtained by the front camera of the mobile phone, if the user wants to identify the electronic device through the rear camera, he can switch by clicking the camera switch button 33 .
  • the person photo in (b) of FIG. 14 is obtained by the rear camera of the mobile phone, if the user wants to identify the electronic device through the front camera, he can switch by clicking the camera switch button 33 .
  • the above-mentioned embodiments take the long-press operation as an example to introduce the operation of triggering the photo sharing process by the user long-pressing the local album icon 32 .
  • alternatively, the photo sharing process provided in the embodiments of the present application may be triggered through other preset operations, or the mobile phone may be triggered through other preset operations to identify the electronic devices in the preview screen.
  • the preset operation is not limited to long-pressing the local album icon 32; it may also be, for example, double-clicking the local album icon 32 or drawing a fixed pattern on the shooting preview interface 1403, which is not limited in this embodiment of the present application.
  • the mobile phone triggers the identification function of the mobile phone after detecting the user's long-pressing operation on the local album icon 32 .
  • before the long-press operation, the objects in the preview screen may not be recognized, and the picture in (c) of FIG. 14 is displayed.
  • when the mobile phone detects the user's long-press operation on the local album icon 32, it triggers recognition of the objects in the preview screen and marks the recognized electronic device names "P40" and "MateBook", as shown in (d) of FIG. 14.
  • the above implementation manner can prevent the mobile phone from being in a state of recognizing objects in the preview screen all the time, thereby reducing the power consumption of the mobile phone.
  • alternatively, the mobile phone can keep the device identification function activated, that is, the mobile phone continuously identifies the objects in the preview screen, and after detecting the user's long-press operation on the local album icon 32, it marks the identified electronic devices, displaying the icons of "P40" and "MateBook" as shown in (d) of FIG. 14.
  • the above implementation allows the mobile phone to determine the objects included in the preview screen in advance; when the user starts the photo sharing function by long-pressing the local album icon 32, the identified names of the electronic devices are quickly displayed on the interface, which increases the speed at which the mobile phone identifies objects in the preview screen.
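The two identification strategies described above trade power against responsiveness: identifying only after the long-press scans fewer frames, while continuous identification has the labels ready the moment the user long-presses. A minimal sketch, with illustrative class and method names:

```python
# Sketch of the two identification strategies: on-demand recognition
# (triggered by the long-press, lower power consumption) versus always-on
# recognition (labels are already available when the long-press arrives).

class PreviewRecognizer:
    def __init__(self, always_on=False):
        self.always_on = always_on
        self.frames_scanned = 0   # proxy for recognition work / power cost
        self.labels = []

    def on_frame(self, devices_in_frame):
        if self.always_on:                 # continuous identification
            self.frames_scanned += 1
            self.labels = list(devices_in_frame)

    def on_long_press(self, devices_in_frame):
        if not self.always_on:             # identify only when triggered
            self.frames_scanned += 1
        self.labels = list(devices_in_frame)
        return self.labels

on_demand = PreviewRecognizer(always_on=False)
for _ in range(10):                        # 10 preview frames go by
    on_demand.on_frame(["P40", "MateBook"])
labels = on_demand.on_long_press(["P40", "MateBook"])
# On-demand mode scanned only once, for the long-press itself.
```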
  • the user can press and hold the icon 30 of the thumbnail photo according to his needs, and drag the icon 30 of the thumbnail photo to the target device to be shared.
  • the icons of “P40” and “MateBook” are displayed in the preview screen.
  • the user presses the icon 30 of the thumbnail photo for a long time, drags the icon 30 of the thumbnail photo to the icon area of the P40, and releases it.
  • the user long presses the icon 30 of the thumbnail photo, drags the icon 30 of the thumbnail photo to any position in the area where the P40 is located, and releases it.
  • the user can drag the icon 30 of the thumbnail photo to the position of the P40 icon and release it, and the P40 icon can be presented in a different color, or with dynamic effects such as display size changes, jumps, or flashing lights, to remind the user that the currently taken photo will be shared to the P40 identified in the preview screen.
  • when the user drags the icon 30 of the thumbnail photo to the location of the P40 icon, the color of the "P40" icon changes; if the user releases the icon 30 of the thumbnail photo at this time, the currently taken photo is shared to the P40.
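Deciding whether the thumbnail was released over a device label comes down to a hit test between the release point and the label's display area. A minimal sketch; the rectangle layout and coordinates below are illustrative assumptions.

```python
# Sketch: rectangle hit test used to decide whether the thumbnail icon
# was dropped inside the "P40" label's area of the preview screen.

def hit(rect, point):
    """True if point (px, py) lies inside rect (x, y, width, height)."""
    x, y, w, h = rect
    px, py = point
    return x <= px < x + w and y <= py < y + h

p40_area = (100, 200, 80, 40)        # illustrative area of the P40 label
drop_inside = hit(p40_area, (120, 220))   # release over the label: share
drop_outside = hit(p40_area, (10, 10))    # release elsewhere: no share
```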
  • a reminder control may also be displayed on the preview screen.
  • the reminder control can be an arrow 40 or the like, and the arrow 40 can be displayed statically, dynamically, or blinking, to remind the user that the icon 30 of the thumbnail photo can be dragged to the position marked by the arrow 40 to realize the photo sharing function.
  • the embodiment of the present application does not limit the display manner of the reminder control.
  • the mobile phone can detect and recognize other electronic devices included in the preview screen through image detection, 3D scanning technology, machine vision, etc.; this embodiment of the present application does not limit the manner in which the mobile phone recognizes other electronic devices included in the preview screen.
  • the mobile phone can also identify other electronic devices in the preview screen through various possible positioning technologies, and locate the positions of other electronic devices.
  • the positioning technology in this embodiment of the present application may include one or a combination of technologies such as Bluetooth-based wireless sensing positioning, ultra-wideband (UWB) sensing-based wireless sensing positioning, and computer-vision-based positioning.
  • the embodiments of the present application do not limit the manner in which the mobile phone locates other electronic devices.
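One way a wirelessly measured device direction could be mapped onto the preview screen is a simple pinhole projection from the device's horizontal angle (relative to the camera axis) to an x coordinate in the preview image. This is purely an illustrative sketch under an assumed pinhole model and field of view; the embodiment does not specify any such mapping.

```python
# Sketch (illustrative math): map a device's measured horizontal angle to
# an x pixel in the preview image, assuming a pinhole camera with a known
# horizontal field of view. Devices outside the field of view map to None.

import math

def angle_to_preview_x(angle_deg, image_width_px, hfov_deg=70.0):
    """Return the x pixel where a device at angle_deg would appear,
    or None if it lies outside the camera's field of view."""
    half = hfov_deg / 2.0
    if abs(angle_deg) > half:
        return None                       # not on the preview screen
    # Perspective projection: the x offset is proportional to tan(angle).
    scale = (image_width_px / 2.0) / math.tan(math.radians(half))
    return image_width_px / 2.0 + scale * math.tan(math.radians(angle_deg))

center = angle_to_preview_x(0.0, 1080)     # straight ahead: screen center
off_screen = angle_to_preview_x(60.0, 1080)  # beyond the 35-degree half-FOV
```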
  • after the mobile phone recognizes other electronic devices included in the preview screen, it can determine where to display the icons of the electronic devices according to the display positions of the objects in the current preview screen.
  • the mobile phone may display the icons marking other electronic devices in the areas where those electronic devices are located in the preview screen. For example, the icon marked "P40" is displayed at the recognized location of the mobile phone, and the icon marked "MateBook" is displayed where the PC is located.
  • the icon marking another electronic device may be displayed in an area close to the positioning apparatus of that electronic device.
  • for example, the mobile phone communicates with the P40 through a UWB chip to locate the position of the P40 in the preview screen. If the UWB chip of the P40 is installed in the upper right corner of the P40, the "P40" icon in (d) of FIG. 14 may be displayed in the area where the UWB chip is located, in the upper right corner of the P40, which is not limited in this embodiment of the present application.
  • the icon marking other electronic devices may be displayed in a blank area in the preview image, and does not block other objects in the preview image.
  • an icon marked “P40” is displayed on the left border of the preview screen so as not to block the right side of the P40.
  • the icon marked "MateBook” is displayed on the right border of the preview screen so as not to block the mobile phone on the left side of the MateBook.
  • the above icon display method can mark the identified electronic devices without blocking other objects in the preview screen, does not affect the user's vision and perception, and improves the user's visual experience.
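The non-occluding placement described above (put the label at a border position that does not overlap other recognized objects) can be sketched as a small search over candidate positions. The rectangle layout and candidate list are illustrative assumptions.

```python
# Sketch: place a device label at the first candidate position whose
# rectangle does not overlap any occupied area (other recognized objects),
# e.g. prefer a border position so the label blocks nothing.

def overlaps(a, b):
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def place_label(label_size, candidates, occupied):
    """Try candidate (x, y) positions in order; return the first free one."""
    w, h = label_size
    for (x, y) in candidates:
        rect = (x, y, w, h)
        if not any(overlaps(rect, o) for o in occupied):
            return (x, y)
    return candidates[0]   # fall back to the first candidate

occupied = [(300, 100, 200, 150)]              # e.g. the MateBook's area
# First candidate sits on the MateBook, second is the free left border.
pos = place_label((80, 30), [(320, 110), (0, 110)], occupied)
```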
  • the user can activate the device identification function and positioning function of the mobile phone through a preset operation during the process of taking a photo, and identify other electronic devices included in the preview screen of the camera in combination with the identification function and positioning function of the mobile phone.
  • the user can directly drag the photo to be shared to the area where other electronic devices are located, so as to quickly share the photo to other electronic devices around. This process simplifies the operation process of sharing photos, shortens the time for sharing photos, and improves user experience.
  • an embodiment of the present application further provides a method for sharing a photo, so as to quickly share the captured photo to the electronic device that is blocked in the preview screen.
  • FIG. 15 is a schematic diagram of a graphical user interface for another example of a photo sharing process provided by an embodiment of the present application.
  • the mobile phone recognizes that the PC 20 in the preview screen is a MateBook, and displays an icon marked “MateBook”.
  • the mobile phone also recognizes that there is a blocked device 1 behind the MateBook.
  • the MateBook blocks the device 1; on the shooting preview interface 1501 of the mobile phone, a reminder window 50 may be displayed, and the reminder window 50 may include text information for reminding the user of the detected device 1.
  • an icon reminder may also be included.
  • for example, icons such as statically displayed arrows, dynamically flashing arrows, or jumping arrows may be included to mark the position of the blocked electronic device, which is not limited in this embodiment.
  • for example, the reminder window 50 displays: "Device 1 is detected here. Share?"
  • the mobile phone displays an interface 1502 as shown in (b) in FIG. 15 .
  • the interface 1502 includes a photo sharing window 60, and the user can click the "Share" button of the photo sharing window 60 to determine to share the currently taken photo to the blocked device 1.
  • optionally, the mobile phone may not further display the interface in (b) of FIG. 15 and may directly share the captured photo to the blocked device 1, which is not limited in this embodiment of the present application.
  • the mobile phone can communicate with other nearby electronic devices, for example, through Bluetooth, wireless fidelity (Wi-Fi) modules, and other possible means, so the mobile phone can sense the electronic devices that exist nearby.
  • for example, the mobile phone determines through a wireless positioning technology such as UWB that there are other electronic devices nearby, recognizes the types of the electronic devices, etc., which can be displayed on the shooting preview interface.
  • the embodiments of the present application do not limit the communication interaction mode and positioning mode between the mobile phone and other nearby electronic devices.
  • reminder information such as text or icons can be displayed on the shooting preview interface.
  • the user can further quickly share the photo just taken to the blocked electronic device, which provides a possible way for the user to share photos to a blocked electronic device and simplifies the user's photo sharing steps.
  • the mobile phone may identify through the wireless positioning technology that there is another electronic device nearby, and the electronic device is not displayed in the current preview screen of the mobile phone.
  • the embodiment of the present application may also display reminder information on the shooting preview interface, which is used to remind the user that there are other electronic devices in a certain position.
  • the preview image obtained by the camera of the mobile phone does not include any electronic device, but the mobile phone may detect 3 electronic devices in the left area outside the preview image.
  • a reminder window 70 may be displayed, and the reminder window 70 may include text information for reminding the user of the detected multiple electronic devices.
  • an icon reminder may also be included.
  • for example, icons such as statically displayed arrows, dynamically blinking arrows, or jumping arrows may be included to mark the positions of the detected electronic devices, which is not limited in this embodiment.
  • for example, the reminder window 70 displays: "3 electronic devices are detected here; please rotate the camera to obtain the information of the electronic devices."
  • the mobile phone displays an interface 1504 as shown in (d) in FIG. 15 .
  • the interface 1504 includes a device list window 80, and the user can click any device in the device list window 80, such as device 3, to determine to share the currently taken photo to device 3.
  • alternatively, the user can turn the mobile phone according to the reminder information on the interface 1503 so that the camera of the mobile phone captures the three detected electronic devices and displays in the preview screen the device 3 to which the user will share the photo; the captured photo can then be quickly shared to the other electronic device according to the method introduced in FIG. 14.
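The scenarios of FIGS. 14 and 15 amount to classifying each located device into one of three cases: visible in the preview (show its label), occluded behind another object (show a "detected here, share?" reminder), or outside the field of view (prompt the user to rotate the camera). A minimal sketch; the field names and message strings are illustrative assumptions.

```python
# Sketch: choose the reminder to display for each located device, based
# on whether it is in the camera's field of view and whether it is
# occluded by another object in the preview screen.

def reminders(devices):
    out = []
    for d in devices:
        if not d["in_fov"]:
            out.append(f'{d["name"]}: rotate the camera to obtain info')
        elif d["occluded"]:
            out.append(f'{d["name"]} is detected here, share?')
        else:
            out.append(f'label {d["name"]} in the preview')
    return out

msgs = reminders([
    {"name": "MateBook", "in_fov": True,  "occluded": False},
    {"name": "Device 1", "in_fov": True,  "occluded": True},
    {"name": "Device 3", "in_fov": False, "occluded": False},
])
```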
  • reminder information such as text or icons can be displayed on the shooting preview interface to remind the user that there are other electronic devices nearby, as well as the information or locations of those electronic devices. Therefore, in the process of sharing photos, the user can quickly share the captured photo to an electronic device by dragging the photo to that electronic device in the preview screen, which provides another possible way for users to share photos and simplifies the steps for users to share photos.
  • a mobile phone is used as the sending device, and an electronic device that accepts photos shared by the user can be used as the "receiving device". In FIGS. 14 and 15, after the user drags the thumbnail photo icon 30 to a receiving device identified by the mobile phone, correspondingly, a receiving window for the photo can appear on the receiving device.
  • FIG. 16 is a schematic diagram of an example of a graphical user interface for receiving a photo provided by an embodiment of the present application.
• FIG. 16 shows a possible interface 1601 of the receiving device; it should be understood that the interface 1601 may be, but is not limited to, the main interface of the receiving device, the running interface of any application, or the like, which is not limited in the embodiments of the present application.
• the receiving device can display the interface 1602 shown in (b) of FIG. 16, which includes the photo receiving window 90.
• the photo receiving window 90 may provide the user with buttons such as "view" and "close", so that the user can quickly view the shared photo through the receiving device.
• the receiving window 90 of the photo may automatically disappear or hide into the notification bar of the receiving device after being displayed on the interface of the receiving device for a preset period of time; the user can view the photo sharing result in the notification bar through a pull-down operation, or further close the photo sharing result in the notification bar through the pull-down operation.
  • the mobile phone can transmit the currently taken photo to the receiving device.
• the transmission method may be, but is not limited to, Bluetooth transmission, WIFI transmission, near-field communication (NFC) transmission, and high-speed communication methods such as the future fifth-generation (5G) mobile communication system; the embodiments of the present application do not limit the manner of photo transmission.
• the shared photo may be the latest photo taken after the current user clicks the shutter button, a photo taken by the user before, or a picture from another source saved on the user's mobile phone, which is not limited in the embodiments of the present application.
• the user can open the camera application and, without taking a photo, directly long-press and drag the local album icon to share, with a receiving device, the photo in the local album whose shooting date is closest to the current date, or a picture from another source stored on the user's mobile phone, which is not limited in this embodiment of the present application.
  • the embodiments of the present application also provide a method for sharing a photo.
• the user can simultaneously share multiple photos to the receiving device identified in the preview screen through the camera application.
  • FIG. 17 is a schematic diagram of a graphical user interface of an example of a process of sharing a photo according to an embodiment of the present application.
  • 17(a) shows the main interface 1701 currently output by the mobile phone in the unlocking mode.
  • the user clicks the icon of the camera application on the main interface 1701.
• the mobile phone displays the interface shown in (b) of FIG. 17.
• the user clicks the shooting shutter button 31, and in response to the user's shooting operation, the mobile phone takes a photo and saves the taken photo in the local album.
  • the user performs the operation as shown in (c) in Figure 17, selects the local album icon 32 and drags the local album icon 32 up in the direction shown by the arrow.
• the mobile phone displays the interface 1704 shown in (d) of FIG. 17.
• a photo list is displayed on the interface 1704; as shown in (d) of FIG. 17, the photo list can display thumbnails of multiple photos, such as photo 1, photo 2, and photo 3.
  • the photo list can be displayed in the bottom area of the interface 1704, which does not affect the display of the preview screen in the interface 1704, so as to ensure that the user can see the content in the preview screen.
  • the photos in the photo list may be arranged according to the sequence taken by the user.
  • photo 1 is the latest photo taken by the user
  • the shooting time of photo 2 and photo 3 is earlier than the shooting time of photo 1 .
  • the photos in the photo list may be arranged in other possible order.
• for example, photos taken at the same place as the current shooting place may be displayed in the photo list, which is not limited in this embodiment of the present application.
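The default ordering described above (the most recently taken photo listed first as "photo 1") can be sketched as follows; the field names and timestamps are illustrative, not part of the embodiment:

```python
from datetime import datetime

# Sketch of the default ordering: newest photo first, so "photo 1" in
# the displayed list is the most recently taken one.
photos = [
    {"name": "photo 2", "taken": datetime(2020, 8, 5, 10, 0)},
    {"name": "photo 1", "taken": datetime(2020, 8, 5, 12, 30)},
    {"name": "photo 3", "taken": datetime(2020, 8, 4, 9, 15)},
]
photo_list = sorted(photos, key=lambda p: p["taken"], reverse=True)
print([p["name"] for p in photo_list])  # -> ['photo 1', 'photo 2', 'photo 3']
```

Other orderings (for example, by shooting place) would only change the sort key.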
  • the first photo in the photo list may be selected by default.
• as shown in the figure, the selection box in the lower right corner of photo 1 identifies it as the photo to be shared by default. If the user does not want to share photo 1, he can click the selection box in the lower right corner of photo 1 to deselect it. Similarly, if the user wishes to share photo 1, photo 2, and photo 3 at the same time, he can click the selection box in the lower right corner of each photo to select multiple photos to be shared, which will not be repeated here.
  • the mobile phone activates the device identification function, and according to the preview picture presented on the current shooting preview interface 1705, identifies whether the preview picture includes other electronic devices.
  • the thumbnail photo icon 30 may only display the thumbnail image of any one of the photo 1, the photo 2, and the photo 3 to be shared, which is not limited in this embodiment of the present application.
  • the embodiments of the present application may also trigger the process of sharing multiple photos provided in the embodiments of the present application through other preset operations, or trigger the mobile phone to identify the electronic device in the preview screen through other preset operations.
• the preset operation is not limited to selecting the local album icon 32 and dragging it upward; it may also be double-clicking the local album icon 32, drawing a fixed pattern on the shooting preview interface 1703, etc., which is not limited in this embodiment of the present application.
• the mobile phone can identify the mobile phone 10 and the PC 20 in the preview screen, and display the recognized name of the mobile phone 10 and the name of the PC 20 in the preview screen; for example, the mobile phone 10 is "P40" and the PC 20 is "MateBook".
  • the mobile phone may not display the recognized names of other electronic devices in the preview screen, but only mark “device 1", “device 2", etc., which is not limited in this embodiment of the present application.
  • the user can drag the icon 30 of the thumbnail photo to the target device to be shared according to his needs.
• the icons of "P40" and "MateBook" are displayed in the preview screen, and the user drags the icon 30 of the thumbnail photo to the icon area of the MateBook and releases it; that is, the user can share the selected photo 1, photo 2, and photo 3 to the MateBook.
• when the user drags the icon 30 of the thumbnail photo to the location of the icon of the MateBook and releases it, the icon of the MateBook can be presented in a different color, or with dynamic effects such as size changes, jumping, or blinking, to remind the user that the currently taken photo is being shared to the MateBook identified in the preview screen.
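As an illustrative sketch of the drag-and-release interaction described above, the sending device could hit-test the release point of the thumbnail icon 30 against the icon areas of the identified devices to decide the sharing target; the coordinates, names, and function below are hypothetical, not part of the embodiment:

```python
# Hedged sketch: decide which device icon the thumbnail was dropped on,
# by testing the release point against each icon's bounding box.
def hit_target(release_x, release_y, icon_areas):
    """icon_areas: device name -> (left, top, right, bottom) rectangle in
    preview pixels; returns the device to share to, or None if the icon
    was released outside every device icon area."""
    for device, (left, top, right, bottom) in icon_areas.items():
        if left <= release_x <= right and top <= release_y <= bottom:
            return device
    return None

# Illustrative icon areas for the two identified devices.
areas = {"P40": (100, 300, 260, 380), "MateBook": (600, 520, 820, 620)}
print(hit_target(700, 560, areas))  # -> MateBook
```

A release that hits no icon area would simply cancel the share, matching the behaviour where nothing happens unless the icon is dropped on a marked device.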
  • a reminder control may also be displayed on the preview screen.
• the reminder control can be an arrow 40 or the like, and the arrow 40 can be displayed statically, pulsatingly, or blinkingly, to prompt the user to drag the icon 30 of the thumbnail photo to the position marked by the arrow 40 to realize the photo sharing function.
  • the embodiment of the present application does not limit the display manner of the reminder control.
  • the receiving device in the preview interface may also be blocked.
• for details, refer to the relevant description of FIG. 15, which will not be repeated here.
  • the user can activate the device identification function and positioning function of the mobile phone through a preset operation during the process of taking a photo, and identify other electronic devices included in the preview screen of the camera in combination with the identification function and positioning function of the mobile phone.
  • the user can select multiple photos to be shared, and directly drag the multiple photos to be shared to the area where other electronic devices are located, so as to quickly share the photos to other electronic devices around. This process simplifies the operation process of sharing photos, shortens the time for sharing photos, and improves user experience.
• FIG. 18 is a schematic diagram of another example of a graphical user interface for receiving a photo provided by an embodiment of the present application. Illustratively, (a) of FIG. 18 illustrates one possible interface of the PC 20. It should be understood that the PC 20 can display the interface presented by different systems such as the Windows system and the Hongmeng system, and the interface can also be any operating interface during the use of the PC 20; the display interface is not limited.
  • the MateBook can display the photo receiving window 1801 shown in (b) of FIG. 18 .
• the photo receiving window 1801 can display the thumbnails of photo 1, photo 2, and photo 3 shared by the user; in addition, buttons such as "view" and "close" can also be provided to facilitate the user in quickly viewing the shared photos.
• the receiving window 1801 of the photo can be displayed on the interface of the receiving device for a preset duration and then automatically disappear or be hidden into the status bar at the bottom of the MateBook; the user can view the photo sharing result by clicking on the status bar, or further close the photo sharing result in the status bar. For this process, reference may be made to related operations in the prior art, which will not be repeated here.
  • the mobile phone can transfer the currently taken photo to the MateBook.
• the transmission method between the mobile phone and the MateBook may be, but is not limited to, Bluetooth transmission, WIFI transmission, near-field communication (NFC) transmission, and high-speed communication methods such as the future fifth-generation (5G) mobile communication system.
• the shared photo may be the latest photo taken after the current user clicks the shutter button, a photo taken by the user before, or a picture from another source saved on the user's mobile phone, which is not limited in the embodiments of the present application.
• the user can open the camera application, directly press and drag the local album icon without taking a photo, and share the photo in the local album whose shooting date is closest to the current date to the receiving device, which is not limited in this embodiment of the present application.
• the user can activate the device identification function and the positioning function of the electronic device through a preset operation during the process of taking a photo or running the camera application, identify, based on these functions, the other electronic devices included in the preview screen of the camera, select one or more photos to be shared through a shortcut operation, and directly drag the one or more photos to be shared to the area where another electronic device is located, so as to quickly share the one or more photos to other nearby electronic devices.
• the embodiments of the present application provide users with a user-friendly interactive interface for various scenarios, such as another electronic device being blocked in the preview screen, so that users can share one or more photos through quick operations, which simplifies the photo sharing operation process, shortens the time for sharing photos, and improves the user experience.
• the above embodiments introduce the method for sharing photos at the user interaction level.
  • the following describes the method for sharing photos provided by the embodiments of the present application from the software implementation strategy level with reference to FIG. 19. It should be understood that the method can be implemented in electronic devices (eg, mobile phones, tablets, computers, etc.) having structures such as a touch screen and a camera assembly as shown in FIG. 2 and FIG. 3 .
  • FIG. 19 is a schematic flowchart of an example of a method for sharing photos provided by an embodiment of the present application. Taking a mobile phone as an example, as shown in FIG. 19 , the method may include the following steps:
  • the camera application is launched.
  • the mobile phone starts the camera application and displays the shooting preview interface.
  • the implementation process of this step 1901 may be as shown in (a) in FIG. 14 , or as shown in (a) in FIG. 17 .
  • step 1902 is an optional step.
• the method for sharing a photo can be applied to a scene where a user takes a photo; the photo to be shared can be the latest photo taken by the user by clicking the shutter button, a photo taken by the user before, or a picture from another source saved on the user's mobile phone, which is not limited in this embodiment of the present application.
  • the icon of the thumbnail photo is triggered to be displayed, and the icon of the thumbnail photo is in a draggable mode, and the device identification function is activated at the same time.
• this embodiment of the present application can also trigger the photo sharing process provided by the embodiment of the present application through other preset operations, or trigger the mobile phone to identify the objects in the preview screen through other preset operations.
  • the preset operation is not limited to long-pressing the local album icon, double-clicking the local album icon, or drawing a fixed pattern on the shooting preview interface, etc., which are not limited in this embodiment of the present application.
• when the mobile phone does not detect the user's long-press operation on the local album icon, it may not recognize the objects in the preview screen.
• when the mobile phone detects the user's long-press operation on the local album icon, it triggers the recognition of the objects in the preview screen and marks the recognized names of the electronic devices, "P40" and "MateBook", as shown in (d) of FIG. 14.
• this method can prevent the mobile phone from always being in the state of recognizing objects in the preview screen, thereby reducing the power consumption of the mobile phone.
• the mobile phone can always keep the device identification function active, that is, it continuously identifies the objects in the preview screen, and after detecting the user's long-press operation on the local album icon, it marks the identified electronic devices with the names "P40" and "MateBook", as shown in (d) of FIG. 14. This method enables the mobile phone to determine the objects included in the preview screen in advance, so that when the user activates the photo sharing function by long-pressing the local album icon, the recognized names of the electronic devices are quickly displayed on the interface, which improves the speed at which the mobile phone identifies objects in the preview screen.
  • the mobile phone can communicate with other nearby electronic devices, for example, through Bluetooth, WIFI module, NFC and other possible ways to communicate, then the mobile phone can sense the electronic devices that exist nearby. Alternatively, the mobile phone determines that there are other electronic devices nearby through a wireless positioning technology such as UWB, and recognizes the type of the electronic device, etc., which can be displayed in the shooting preview interface.
  • the embodiments of the present application do not limit the communication interaction mode and positioning mode between the mobile phone and other nearby electronic devices.
• the mobile phone can display the icon marking another electronic device in the area where the electronic device is located in the preview screen, or in a blank area in the preview screen, without blocking other objects in the preview screen. For the specific display method, refer to the foregoing description; details are not repeated here.
  • the user can drag the icon of the thumbnail photo to the location where the icon of the other electronic device is marked and release it.
• the icon can be presented in a different color, or with dynamic effects such as size changes, jumping, or blinking, to remind the user that the currently taken photo is being shared to the other electronic device identified in the preview screen.
• when the user drags the icon 30 of the thumbnail photo to the position of the icon of the MateBook and releases it, the icon of the MateBook can be presented in a different color, or with dynamic effects such as size changes, jumping, or blinking, to remind the user that the currently taken photo is being shared to the MateBook identified in the preview screen.
• the user may expect to share multiple photos, or the photos to be shared may not be the currently taken photo.
  • the processes of steps 1905-1911 may be performed.
• if the photo to be shared is a photo from another source saved on the user's mobile phone, the user can open the camera application and, without taking a photo, directly press and drag the local album icon, then find and select the photo to be shared in the photo list, as shown in (d) and (e) of FIG. 17.
  • This embodiment of the present application does not limit this.
  • the photo list is displayed, and the first photo in the photo list may be selected by default. If the user does not want to share the photo 1, he can click the selection box in the lower right corner of the photo 1 to deselect the photo 1. Similarly, if the user wishes to share the photo 1, photo 2 and photo 3 at the same time, he can click the selection box in the lower right corner of each photo to select multiple photos to be shared, which will not be repeated here.
  • the icon of the thumbnail photo is triggered to be displayed, and the icon of the thumbnail photo is in a draggable mode, and the device identification function is activated at the same time.
• the user can drag the three photos by long-pressing the area of any one of photo 1, photo 2, and photo 3 to be shared.
  • the user can activate the device identification function and positioning function of the mobile phone through a preset operation during the process of taking a photo, and identify other electronic devices included in the preview screen of the camera in combination with the identification function and positioning function of the mobile phone.
  • the user can select multiple photos to be shared, and directly drag the multiple photos to be shared to the area where other electronic devices are located, so as to quickly share the photos to other electronic devices around. This process simplifies the operation process of sharing photos, shortens the time for sharing photos, and improves user experience.
  • the display interface and method implementation of the present application are described above.
  • the following takes the UWB wireless positioning technology as an example to describe in detail how the electronic device 100 implements ranging and angle measurement for other electronic devices.
• the electronic device 100 initiates a UWB measurement request, and determines the distance between the electronic device 100 and the electronic device 201 according to the measurement response of the electronic device 201.
  • the above device control method includes but is not limited to steps S101 to S105, wherein:
  • the electronic device 100 broadcasts a UWB measurement request, and the electronic device 201 receives the UWB measurement request.
  • the electronic device 100 initiates a UWB measurement request, and the electronic device 100 uses the ranging algorithm 3 to determine the distance of the electronic device 201 .
• Step S101 may specifically include: the electronic device 100 broadcasts the first measurement request at time T11 and records the sending time of the first measurement request as T11, where the first measurement request carries the identity information of the electronic device 100 (for example, the ID of the electronic device, the MAC address, etc.).
  • the electronic device 201 receives the first measurement request sent by the electronic device 100 at time T12, and records the reception time of the first measurement request as T12.
  • the electronic device 201 sends a first measurement response to the electronic device 100 .
• the electronic device 201 sends the first measurement response to the electronic device 100 at time T13, and the first measurement response carries T12, T13, the identity information of the electronic device 100, and the identity information of the electronic device 201.
• the electronic device 100 receives the first measurement response sent by the electronic device 201 at time T14, and records the reception time of the first measurement response as T14.
  • the electronic device 100 determines an orientation parameter of the electronic device 201 according to the measurement response sent by the electronic device 201 .
• the orientation parameter of the electronic device 201 may include one or more of the physical distance between the electronic device 201 and the electronic device 100, the signal AOA of the electronic device 201, and the RSSI of the signal sent by the electronic device 201.
  • the three orientation parameters are described in detail below:
• the physical distance between the electronic device 201 and the electronic device 100 is determined from the one-way flight time of the signal: the time difference between the sending time T11 of the first measurement request and the reception time T14 of the first measurement response is Tround1, that is, Tround1 = T14 − T11.
• the time difference between the reception time T12 of the first measurement request and the sending time T13 of the first measurement response is Trelay1, that is, Trelay1 = T13 − T12; the one-way flight time of the signal is therefore T = (Tround1 − Trelay1)/2.
• the electronic device 100 determines the one-way flight time T of the signal according to the above formula, and then determines the physical distance D between the electronic device 201 and the electronic device 100 as D = C*T, according to the product of the one-way flight time T and the electromagnetic wave propagation speed C.
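The one-way flight time computation described above can be illustrated with a short numeric sketch; the timestamp values are invented for illustration, and only the Tround1/Trelay1 relations come from the embodiment:

```python
# Single-sided two-way ranging: one-way flight time and distance.
# T11: device 100 sends the request; T12: device 201 receives it;
# T13: device 201 sends the response; T14: device 100 receives it.
C = 299_792_458.0  # electromagnetic wave propagation speed, m/s

def one_way_flight_time(t11, t12, t13, t14):
    t_round = t14 - t11   # Tround1: round-trip time measured at device 100
    t_relay = t13 - t12   # Trelay1: processing delay inside device 201
    return (t_round - t_relay) / 2.0

def distance(t11, t12, t13, t14):
    return C * one_way_flight_time(t11, t12, t13, t14)

# Illustrative timestamps in seconds for a 5 m separation.
t11 = 0.0
t12 = 5.0 / C               # signal arrives after one flight time
t13 = t12 + 200e-6          # 200 us reply delay inside device 201
t14 = t13 + 5.0 / C         # response arrives after another flight time
print(round(distance(t11, t12, t13, t14), 3))  # -> 5.0
```

Subtracting Trelay1 is what removes the (much larger) processing delay of device 201 from the measured round trip, leaving only propagation time.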
  • the electronic device 100 can calculate the receiving direction of the signal according to the phase difference of the first measurement response reaching the UWB antennas at different positions, so as to determine the direction of the electronic device 201 relative to the electronic device 100 .
• for example, the electronic device 100 receives a wireless signal sent by the electronic device 201, and the signal AOA at the electronic device 100 (that is, the incident angle θ of the wireless signal relative to the connection line between the receiving antenna 1 and the receiving antenna 2) can be determined according to the phase difference φ of the signal on the receiving antenna 1 and the receiving antenna 2 of the electronic device 100.
• the phase difference can be expressed as φ = 2πd·cos(θ)/λ + Δφ, where d is the distance between the receiving antenna 1 and the receiving antenna 2, λ is the wavelength of the signal, and Δφ is the phase difference of the antenna hardware.
• the incident angle θ, that is, the signal AOA of the electronic device 201, can be determined by the above formula. For example, if the incident angle θ is 60 degrees, the electronic device 201 is 30 degrees clockwise from the electronic device 100.
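The AOA computation can be sketched numerically as follows; the wavelength and antenna spacing are illustrative values (half-wavelength spacing is assumed, and it reproduces the 60-degree example above):

```python
import math

def incident_angle(phase_diff, wavelength, antenna_spacing, hw_offset=0.0):
    """Estimate the signal AOA from the phase difference measured between
    receiving antenna 1 and receiving antenna 2, using the model
    phase_diff = 2*pi*d*cos(theta)/lambda + hw_offset."""
    x = (phase_diff - hw_offset) * wavelength / (2 * math.pi * antenna_spacing)
    x = max(-1.0, min(1.0, x))  # clamp against measurement noise
    return math.degrees(math.acos(x))

# Half-wavelength spacing: a measured phase difference of pi/2 gives
# cos(theta) = 0.5, i.e. theta = 60 degrees, matching the example above.
wavelength = 0.046           # roughly a 6.5 GHz UWB channel, in metres
d = wavelength / 2
print(round(incident_angle(math.pi / 2, wavelength, d), 1))  # -> 60.0
```

With more than two antennas the same phase-difference model is applied pairwise to resolve the direction in two dimensions.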
• the RSSI of the signal sent by the electronic device 201.
• the electronic device 100 determines the RSSI of the signal sent by the electronic device 201 according to the average RSSI of the first measurement request and the first measurement response. In some embodiments, the electronic device 100 determines the RSSI of the signal sent by the electronic device 201 according to the RSSI of the first measurement request and the first measurement response.
• whether there is an obstruction between the electronic device 100 and the electronic device 201 can be determined according to the RSSI of the signal sent by the electronic device 201.
• a preset RSSI for the signal sent by the electronic device 201 and received by the electronic device 100 may be determined.
• if the RSSI of the received signal sent by the electronic device 201 is smaller than the preset RSSI, it is determined that there is an obstruction between the electronic device 100 and the electronic device 201; otherwise, there is no obstruction.
  • the orientation parameters of the electronic device 201 may include the physical distance between the electronic device 201 and the electronic device 100 , the signal AOA, and the first identifier.
• the first identifier of the electronic device 201 is used to represent whether there is an obstruction between the electronic device 100 and the electronic device 201. For example, the first identifier equal to 1 indicates that there is an obstruction, and the first identifier equal to 0 indicates that there is no obstruction.
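The occlusion decision and the resulting first identifier can be sketched as follows; the dBm values are illustrative:

```python
def occlusion_flag(measured_rssi_dbm, preset_rssi_dbm):
    """First identifier from the embodiment: 1 if the received RSSI is
    below the preset (unobstructed) RSSI, meaning an obstruction is
    assumed between device 100 and device 201; 0 otherwise."""
    return 1 if measured_rssi_dbm < preset_rssi_dbm else 0

# Averaging the RSSI of the first measurement request and the first
# measurement response before comparing, as one embodiment suggests:
request_rssi, response_rssi = -72.0, -68.0   # illustrative dBm values
preset = -65.0                               # expected line-of-sight RSSI
avg = (request_rssi + response_rssi) / 2
print(occlusion_flag(avg, preset))  # -> 1 (occluded)
```

In practice the preset RSSI would itself depend on the measured distance, since free-space signal strength falls off with range.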
  • the electronic device 100 sends a connection request to the electronic device 201 , and the electronic device 201 receives the connection request sent by the electronic device 100 .
  • the electronic device 201 sends first capability information and corresponding connection parameters to the electronic device 100 , where the first capability information is used to represent a communication mode that the electronic device 201 can support.
  • the corresponding connection parameters may include: device ID, pairing key and other parameters.
  • the electronic device 100 can establish a WiFi connection with the electronic device 201 based on the above-mentioned connection parameters using the connection process of the IEEE802.11 standard;
  • the corresponding connection parameters may include parameters such as a secret key, an encryption method, and a Service Set Identifier (SSID).
• the electronic device 100 may establish a Bluetooth connection with the electronic device 201 based on the above-mentioned connection parameters using the connection process of the IEEE 802.15.1 standard.
  • the electronic device 100 can preferentially use the connection process of the IEEE802.11 standard to establish a WiFi connection with the electronic device 201 based on the above-mentioned connection parameters.
  • the first measurement request may further carry second capability information, where the second capability information is used to represent all communication modes supported by the electronic device 100, such as Bluetooth, WiFi, and the like.
  • the first measurement response may also carry first capability information and corresponding connection parameters.
• the second capability information includes the first capability information, and the first capability information is determined by the electronic device 201 according to the second capability information. In this way, after step S103, the electronic device 100 can directly establish a connection with the electronic device 201 according to the first capability information and the corresponding connection parameters in the first measurement response, without sending a connection request again.
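The connection-mode selection described above (WiFi preferred over Bluetooth when both appear in the first capability information) could be sketched as follows; the dictionary layout and parameter names are assumptions, not the embodiment's data format:

```python
# Hedged sketch: pick the connection method by priority from the first
# capability information returned by device 201.
PRIORITY = ["wifi", "bluetooth"]  # WiFi (IEEE 802.11) is preferred

def choose_connection(first_capability_info):
    """first_capability_info: communication modes supported by device 201,
    each mapped to its connection parameters (device ID, key, SSID, ...).
    Returns the chosen mode and its parameters."""
    for mode in PRIORITY:
        if mode in first_capability_info:
            return mode, first_capability_info[mode]
    raise ValueError("no common communication mode")

# Illustrative capability information carried in the first measurement
# response; both modes are available, so WiFi wins.
caps = {
    "bluetooth": {"device_id": "201", "pairing_key": "..."},
    "wifi": {"ssid": "P2P-201", "key": "...", "cipher": "WPA2"},
}
print(choose_connection(caps)[0])  # -> wifi
```

Carrying the capability information inside the measurement exchange is what lets device 100 skip the separate connection request of step S104.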
• the electronic device 100 may also initiate measurement requests multiple times, and obtain the average one-way flight time and the average AOA according to the sending and receiving times of multiple measurement requests and multiple measurement responses, so as to reduce distance and angle measurement errors.
  • the electronic device 100 broadcasts a UWB measurement request, and the probe request includes the sending time.
• the electronic device 201 determines the time difference based on the sending time and the time when the electronic device 201 receives the probe request, so as to calculate the distance between the electronic device 201 and the electronic device 100 (the distance is equal to the time difference multiplied by the propagation speed of the electromagnetic wave); the electronic device 201 calculates the arrival angle of the probe request based on the received probe request, and can determine the azimuth of the electronic device 201 relative to the electronic device 100.
• the electronic device 201 sends a probe response to the electronic device 100, where the probe response includes the identity identifier of the electronic device 201 and the first location information.
  • the electronic device 100 receives the detection response, and obtains and determines the orientation parameters of the electronic device 201 relative to the electronic device 100 .
  • the above measurement request (first measurement request) may also be referred to as a probe request
  • the measurement response (first measurement response) may also be referred to as a probe response
  • the present application provides a device identification method, which is applied to a first electronic device with a camera, as shown in FIG. 22 , the method includes:
  • the first electronic device receives a first operation.
  • the first operation may be any one or more user operations in the foregoing FIGS. 5A to 5D , and may also be any one or more user operations in the foregoing FIGS. 7A to 7C .
  • the first electronic device displays a first interface, where the first interface includes a preview image captured by the camera, wherein the preview image includes the second electronic device.
  • the first interface may be the aforementioned viewfinder interface 530 .
  • the second electronic device may be, for example, the electronic device 201 corresponding to the device image 532 in FIG. 5G .
  • the first electronic device acquires first position information of the second electronic device relative to the first electronic device.
• the first electronic device determines the display position of the first label in the preview screen based on the first position information and the display area of the second electronic device in the preview screen, and displays the first label at the display position, wherein the first label is used to identify the second electronic device.
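The embodiment does not prescribe how the first position information is mapped to a display position; one possible sketch, assuming a simple pinhole camera model with a known horizontal field of view (all names and values below are hypothetical), is:

```python
import math

def label_x(azimuth_deg, preview_width_px, hfov_deg=70.0):
    """Map the azimuth of the second electronic device relative to the
    camera's optical axis (0 = straight ahead, positive = to the right)
    onto a horizontal pixel coordinate in the preview image."""
    if abs(azimuth_deg) > hfov_deg / 2:
        return None  # device lies outside the preview frame
    # Focal length in pixels for the assumed horizontal field of view.
    focal_px = (preview_width_px / 2) / math.tan(math.radians(hfov_deg / 2))
    return preview_width_px / 2 + focal_px * math.tan(math.radians(azimuth_deg))

print(label_x(0.0, 1080))    # -> 540.0 (centre of the preview)
print(label_x(60.0, 1080))   # -> None  (outside a 70-degree HFOV)
```

The vertical coordinate could be handled the same way with the vertical field of view, and the result intersected with the detected display area of the second electronic device before drawing the first label.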
  • the second electronic device may be, for example, the electronic device 201 corresponding to the device image 532 in FIG. 5G
  • the first label may be, for example, the device icon 5321 .
  • the first electronic device receives a second operation for the first tag.
  • the second operation may be the aforementioned user operation for the device icon 5321 in FIG. 5G .
  • the first electronic device displays a second interface, where the second interface includes one or more controls for controlling the second electronic device.
  • the second interface may be the display interface in FIG. 5H.
  • the second interface may be displayed by being superimposed on the first interface, or the electronic device may jump from the first interface to display the second interface.
• the present application presents the corresponding relationship between the first label and the second electronic device in real time through the augmented reality display mode, realizes the interaction between the first electronic device and the second electronic device through the first label, and realizes coordinated control among multiple devices, improving the user experience.
• acquiring the first position information of the second electronic device relative to the first electronic device by the first electronic device specifically includes: the first electronic device broadcasts a probe request, where the probe request includes the identity of the first electronic device;
• when the first electronic device receives the probe response sent by the second electronic device based on the probe request, it determines the first position information of the second electronic device relative to the first electronic device based on the probe response, where the probe response includes the identity of the second electronic device.
  • the first position information includes relative positions of the second electronic device and the first electronic device, such as distance, direction, angle, and the like.
  • the first electronic device can calculate the distance between the second electronic device and the first electronic device from the time difference between sending the probe request and receiving the probe response (the distance equals the time difference multiplied by the propagation speed of the electromagnetic wave); by calculating the arrival angle of the probe response, the first electronic device can determine the azimuth angle of the second electronic device relative to the first electronic device.
  • the probe response includes the identity identifier of the second electronic device and the first location information;
  • the first electronic device determines the first location information of the second electronic device relative to the first electronic device based on the probe response.
  • the second electronic device calculates the relative position of the second electronic device and the first electronic device according to the received detection request.
  • the detection request includes the sending time, and the second electronic device determines the time difference based on the sending time and the time when the second electronic device receives the detection request, so as to calculate the distance between the second electronic device and the first electronic device;
  • the second electronic device calculates the arrival angle of the probe request based on the received probe request, and can thereby determine the azimuth angle of the second electronic device relative to the first electronic device.
  • the second electronic device sends a probe response to the first electronic device, where the probe response includes the identity identifier of the second electronic device and the first location information.
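The ranging described above can be sketched in a few lines. The snippet below is an illustrative Python sketch only (the patent does not specify an implementation, and all names are assumed): it derives the distance from the signal's time of flight and combines it with a measured angle of arrival into a relative position.

```python
import math

# Propagation speed of the electromagnetic wave, in m/s.
C = 299_792_458.0

def distance_from_time_of_flight(t_sent: float, t_received: float) -> float:
    """Distance = time difference multiplied by the propagation speed."""
    return (t_received - t_sent) * C

def relative_position(t_sent: float, t_received: float, aoa_deg: float) -> dict:
    """Combine the computed distance and the measured angle of arrival
    (azimuth, in degrees) into a relative position of the responding device."""
    d = distance_from_time_of_flight(t_sent, t_received)
    return {
        "distance_m": d,
        "azimuth_deg": aoa_deg,
        "x": d * math.cos(math.radians(aoa_deg)),  # along the optical axis
        "y": d * math.sin(math.radians(aoa_deg)),  # lateral offset
    }
```

For example, a one-way flight time of 50 ns corresponds to roughly 15 m; a real UWB stack would additionally compensate for processing delays at the responder (two-way ranging), which this sketch omits.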
  • the display position of the first label in the preview screen and the display area of the second electronic device in the preview screen partially or completely overlap.
  • the first label may be displayed in the display area of the second electronic device, may be displayed on the edge of the display area of the second electronic device, or may be displayed at a position close to the display area of the second electronic device.
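The mapping from a measured azimuth to a label position in the preview can be illustrated with a simple pinhole-camera projection. This is a hedged sketch under assumed names and a simplified camera model, not the patent's actual rendering logic:

```python
import math

def label_x_position(azimuth_deg: float, hfov_deg: float, screen_width_px: int):
    """Map a device's azimuth (relative to the camera's optical axis) to a
    horizontal pixel position in the preview screen.

    Returns None when the device lies outside the camera's horizontal field
    of view, in which case an edge label could be shown instead."""
    half = hfov_deg / 2.0
    if abs(azimuth_deg) > half:
        return None
    # Pinhole projection: offset proportional to tan(azimuth) / tan(half FOV),
    # normalized from [-1, 1] to [0, screen_width_px].
    frac = math.tan(math.radians(azimuth_deg)) / math.tan(math.radians(half))
    return int(round((frac + 1.0) / 2.0 * screen_width_px))
```

A device straight ahead lands at the center of the preview; devices near the edge of the field of view land near the screen edge, so the label can be anchored on or next to the device's display area.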
  • the method further includes: the first electronic device acquires second position information of the third electronic device relative to the first electronic device; when the first electronic device detects that the preview screen does not include the third electronic device, and determines based on the second position information that the third electronic device is within the viewing range of the camera, the first electronic device determines the display position of the second label in the preview screen based on the second position information, where the second label is used to indicate one or more of the following: identification information of the third electronic device, the obstruction blocking the third electronic device, and the second position information.
  • in this manner, when the first electronic device detects that the relative position of the third electronic device is within the viewing range of the camera but the preview screen does not include an image of the third electronic device, the first electronic device determines that the third electronic device is blocked, and outputs the second label of the third electronic device, indicating one or more of the identification information of the third electronic device, the blocking object, and the blocked position in the preview interface.
  • the second label may be, for example, the icon 803 in FIG. 8C , the image of the third electronic device is not in the first interface, and the third electronic device is blocked by the device image 533 .
  • the method further includes: when the first electronic device detects that the preview image does not include the third electronic device and determines, based on the second position information, that the third electronic device is not within the viewing range of the camera, the first electronic device determines the display position of the third label in the preview screen based on the second location information, where the third label is used to indicate one or more of the following information: identification information of the third electronic device and the second location information.
  • in this manner, when the first electronic device detects that the relative position of the third electronic device is outside the viewing range of the camera and the preview screen does not include an image of the third electronic device, the first electronic device determines that the third electronic device is not in the viewfinder, and outputs the third label of the third electronic device, indicating one or more of the identification information of the third electronic device and the relative position (direction, angle, distance, etc.) of the third electronic device.
  • the third label may be, for example, the icon 802 in FIG. 8B , the image of the third electronic device is not in the first interface, and the third electronic device is outside the viewing range of the camera.
  • the preview image includes an image of the fourth electronic device;
  • after the first electronic device displays the first interface, the method further includes: the first electronic device determines, based on the preview image, that the device type of the fourth electronic device is the first type; the first electronic device determines, among the electronic devices associated or bound under the account of the first electronic device, a first target device whose device type is the first type; the first electronic device displays a fourth label, where the fourth label is used to indicate that the image of the fourth electronic device is associated with the first target device. This manner applies when the first electronic device cannot detect the location information of the fourth electronic device and the image of the fourth electronic device is in the preview screen.
  • the first electronic device identifies the device type of the fourth electronic device according to the image recognition technology, and detects whether there is a target device of this device type in the devices logged into the same account (eg, Huawei account) as the first electronic device. If so, the first electronic device considers the target device to be the fourth electronic device, and the first electronic device outputs a fourth label identifying the target device.
  • the fourth label may be, for example, the icon 805 in FIG. 8D; the image of the fourth electronic device is in the first interface, and the first electronic device cannot locate the position of the fourth electronic device.
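The account-based fallback described above amounts to matching the recognized device type against the devices bound to the same account. A minimal sketch (illustrative names only; the patent does not prescribe a data model):

```python
def match_by_account(recognized_type: str, account_devices: list):
    """Return the account-bound device whose type matches the type recognized
    in the preview image, if exactly one such device exists; otherwise None.

    account_devices is assumed to be a list of dicts with "name" and "type",
    e.g. devices logged into the same (hypothetical) account."""
    candidates = [d for d in account_devices if d["type"] == recognized_type]
    # With several same-type devices the match is ambiguous, so give up.
    return candidates[0] if len(candidates) == 1 else None
```

If the lookup succeeds, the matched device is treated as the fourth electronic device and its identifier is shown in the fourth label.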
  • the preview image includes an image of the fifth electronic device;
  • after the first electronic device displays the first interface, the method further includes: the first electronic device determines, based on the preview image, that the device type of the fifth electronic device is the second type; the first electronic device obtains third position information of the first electronic device, where the first electronic device stores a correspondence between electronic devices and location information; based on the correspondence, the first electronic device determines, according to the third position information, a second target device whose device type is the first type, where the location information of the target device is the same as the third location information; the first electronic device displays a fifth label, where the fifth label is used to indicate that the image of the fifth electronic device is associated with the second target device.
  • this manner applies when the first electronic device cannot detect the location information of the fifth electronic device and the image of the fifth electronic device is in the preview screen.
  • since the first electronic device stores the correspondence between electronic devices and location information (for example, smart speaker: living room, smart desk lamp: bedroom, computer: office, etc.), the first electronic device detects, based on its current geographic location and the device type of the fifth electronic device identified through image recognition technology, whether a target device of that device type exists among the devices at the same geographic location as the first electronic device. If so, the first electronic device considers that target device to be the fifth electronic device, and outputs a fifth label identifying the target device.
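The location-based fallback is a lookup keyed on the stored device-to-location correspondence. The sketch below assumes a simple registry mapping device names to (location, type) pairs; all names are illustrative:

```python
def match_by_location(recognized_type: str, current_location: str, registry: dict):
    """Return the name of the registered device whose stored location equals
    the phone's current location and whose type matches the type recognized
    in the preview image; None if no such device is registered.

    registry example: {"smart speaker": ("living room", "speaker"), ...}"""
    for name, (location, device_type) in registry.items():
        if location == current_location and device_type == recognized_type:
            return name
    return None
```

If the lookup succeeds, the matched device is treated as the fifth electronic device and its identifier is shown in the fifth label.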
  • the first interface further includes a first icon, and the first icon is associated with data to be shared;
  • the method further includes: the first electronic device receives a third operation, where the third operation is an operation on the first label and/or the first icon; in response to the third operation, the first electronic device sends the data to be shared to the second electronic device.
  • the third operation includes but is not limited to a drag operation, a tap operation, and the like; this provides a data sharing method: the user selects, on the first interface, the second electronic device to share with, and the data to be shared is sent to the second electronic device.
  • the user operation of data sharing is simplified, the device information is displayed intuitively, and the user experience is improved.
  • the first icon may be, for example, the icon 902 or the icon 903 in FIG. 9B , the first icon may also be the thumbnail 1111 in FIG. 11D ; the first icon may also be the picture 1205 in FIG. 12B .
  • before the first electronic device receives the third operation, the method further includes: the first electronic device displays, on the first interface, the first label in a first display form according to the data type of the data to be shared, where the first label in the first display form is used to prompt the user that the second electronic device supports outputting the data to be shared.
  • the first display form may be brightening the display area of the first label (changing its brightness, color, etc.).
  • the first display form may be, for example, the display forms of the device icon 5311 , the device icon 5331 , and the device icon 5341 in FIG. 10C .
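Deciding which labels to show in the highlighted display form reduces to filtering devices by whether they support the shared data type. A minimal sketch under an assumed data model (labels and capability sets are illustrative):

```python
def labels_to_highlight(data_type: str, devices: list):
    """Return the labels to render in the first (highlighted) display form:
    those of devices that support outputting the data type being shared.

    Each device is assumed to carry a "label" string and a "supported_types"
    set, e.g. {"audio", "image", "video"}."""
    return [d["label"] for d in devices if data_type in d["supported_types"]]
```

Labels not in the returned list would keep their normal display form, and an operation dropping the shared data on them could trigger the "not supported" prompt described below.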
  • the preview screen includes an image of the third electronic device and a third label, where the third label is associated with the third electronic device; the method further includes: the first electronic device receives a fourth operation, where the fourth operation is an operation on the first label and/or the third icon; in response to the fourth operation, the first electronic device outputs a prompt message, where the prompt message is used to prompt the user that the third electronic device does not support outputting the data to be shared.
  • the prompt message may be, for example, the information displayed in the dialog box 1100 in FIG. 10B.
  • the first electronic device receives the first operation, displays the first interface, starts the camera, and displays the image captured by the camera on the first interface in real time; according to image recognition technology, the first electronic device recognizes the electronic devices in the image and their device types (such as speakers, computers, tablet computers, etc.);
  • the first electronic device obtains the position information of the second electronic device relative to the first electronic device according to a wireless positioning technology (such as UWB positioning, Bluetooth positioning, WiFi positioning, etc.).
  • the location information includes one or more of distance, direction, and angle.
  • the first electronic device determines the display position, in the preview screen, of the first label of the second electronic device, where the first label is used to identify the second electronic device, for example its device name, device type, and so on.
  • the display position of the first label is related to the display position of the second electronic device.
  • the first electronic device detects a user operation on the first tag
  • the first electronic device outputs a second interface, where the second interface includes one or more controls for controlling the second electronic device.
  • the second interface may be displayed by being superimposed on the first interface, or the electronic device may jump from the first interface to display the second interface.
  • in the present application, the correspondence between the first label and the second electronic device is presented in real time through an augmented reality display mode, interaction between the first electronic device and the second electronic device is realized through the first label, and coordinated control among multiple devices is realized, improving the user experience.
  • the software system of the electronic device may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiments of the present application take the Android system with a layered architecture as an example to exemplarily describe the software structure of the electronic device 100.
  • the Android system is only a system example of the electronic device 100 in the embodiment of the present application, and the present application can also be applied to other types of operating systems, such as IOS, windows, Hongmeng, etc., which is not limited in the present application.
  • the following only takes the Android system as an example of the operating system of the electronic device 100 .
  • FIG. 23 shows a software structural block diagram of an electronic device exemplarily provided by an embodiment of the present application.
  • the electronic device can determine the azimuth parameters (such as distance, signal AOA, and RSSI) of nearby devices through UWB positioning technology, then determine the display position of the image of each nearby device in the viewfinder interface according to those azimuth parameters, and display the device icon of the nearby device; triggering the device icon realizes the interaction between the electronic device and the nearby device.
  • the electronic device can establish a wireless communication connection with the target device through one or more wireless communication protocols among UWB, Bluetooth, WLAN and infrared, and perform data transmission.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the Android system can be divided into an application layer, an application framework layer, a protocol stack, and a kernel layer from top to bottom, where:
  • the application layer includes a series of application packages, such as Smart Life, Bluetooth, WLAN, and so on; applications such as Camera, Gallery, Phone, Music, and Video may also be included.
  • the smart life APP is a software program that can select and control various smart home devices in the home, and is installed on the electronic device used by the user.
  • the smart life APP can be an application installed when the electronic device leaves the factory, or it can be an application downloaded by a user from the network or obtained from other devices during the use of the electronic device.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer mainly includes API and system services (System Server).
  • the API is used to realize the communication between the application layer and the protocol stack, the HAL layer, and the kernel layer (kernel).
  • the API may include one or more of UWB API, Bluetooth API, WLAN API, and infrared API.
  • the system service may include one or more of UWB service, Bluetooth service, WLAN service, and infrared service.
  • the electronic device 100 may call corresponding system services by calling one or more of the UWB API, the Bluetooth API, the WLAN API, and the infrared API, so as to detect the orientation parameters of the devices near the electronic device 100. You can also call the corresponding system services by calling one or more of the UWB API, Bluetooth API, WLAN API, and infrared API to establish wireless communication connections with nearby devices and perform data transmission.
  • the UWB service may specifically include one or more services, such as UWB positioning service.
  • the UWB positioning service may include position parameter measurement, where position parameter measurement includes one or more of distance measurement, AOA measurement, and RSSI measurement.
  • the electronic device 100 invokes the UWB positioning service through the UWB API to detect location parameters of devices near the electronic device 100.
  • Android Runtime includes core libraries and a virtual machine. Android runtime is responsible for scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core libraries of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • a system library can include multiple functional modules, for example: a surface manager, media libraries, a 3D graphics processing library (e.g., OpenGL for Embedded Systems, OpenGL ES), a 2D graphics engine (e.g., the Skia Graphics Library, SGL), etc.
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video coding formats, such as MPEG-4 (Motion Picture Expert Group 4), MPEG-4 AVC/H.264 (MPEG-4 Part 10 Advanced Video Coding), MP3 (MPEG Audio Layer 3), AAC (Advanced Audio Coding), AMR (Adaptive Multi-Rate), JPEG/JPG (Joint Photographic Experts Group), PNG (Portable Network Graphics), etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer may include one or more of UWB chip drivers, Bluetooth chip drivers, and WLAN drivers, and may also include display drivers, camera drivers, audio drivers, sensor drivers, and so on.
  • the kernel layer is used to perform corresponding operations in response to functions invoked by system services in the application framework layer. For example, in response to a UWB measurement instruction sent by the UWB positioning service invoking the UWB protocol stack, the UWB chip driver sends a UWB measurement request through a hardware device (eg, a UWB chip).
  • the software structure framework may be on the electronic device 100 , or may be on the electronic device 201 , the electronic device 202 , the electronic device 203 , and the electronic device 204 .
  • the following takes the device identification scene in the above embodiment as an example to illustrate the workflow of the software and hardware of the electronic device 100.
  • the accelerometer and/or gyroscope sensor detects a lift operation (eg, Figure 7C), and a corresponding hardware interrupt is issued to the kernel layer.
  • the kernel layer processes tap operations into raw input events. Raw input events are stored at the kernel layer.
  • the application framework layer obtains the original input event from the kernel layer, and identifies the input event as a paired connection to an electronic device (eg, the electronic device 201 ).
  • the smart life application calls the UWB API of the application framework layer to start the UWB location service.
  • the UWB location service sends UWB measurement commands to the UWB HAL interface in the HAL layer by calling the UWB protocol stack.
  • the UWB HAL interface sends a UWB measurement request to the kernel layer; according to the UWB measurement request, the kernel layer drives the UWB chip to broadcast the measurement request (such as the first measurement request) by calling the UWB chip driver, and uses the UWB time management module to record the timestamp of the UWB measurement request.
  • when the UWB service of the application framework layer determines the target device, it sends a connection request to the kernel layer by calling the UWB protocol stack, and the UWB chip driver of the kernel layer drives the UWB chip to send the connection request to the electronic device 201 to request establishment of a UWB communication connection and perform data transmission.
  • the UWB service of the application framework layer may also call a Bluetooth service, a WLAN service or an infrared service to send a connection request to the electronic device 201 .
  • the UWB service starts the Bluetooth service and calls the Bluetooth protocol stack through the Bluetooth service, thereby sending the first connection request to the kernel layer; the Bluetooth chip driver in the kernel layer drives the Bluetooth chip to send the connection request to the electronic device 201 to request establishment of a Bluetooth communication connection and perform data transmission.
  • the electronic device 100 may include a processing module, a storage module, and a communication module.
  • the processing module may be used to control and manage the actions of the electronic device, for example, may be used to support the electronic device to perform the steps performed by the display unit, the detection unit and the processing unit.
  • the storage module may be used to support the electronic device to execute stored program codes and data, and the like.
  • the communication module can be used to support the communication between the electronic device and other devices.
  • the processing module may be a processor or a controller. It may implement or execute the various exemplary logical blocks, modules and circuits described in connection with this disclosure.
  • the processor can also be a combination that implements computing functions, such as a combination of one or more microprocessors, a combination of a digital signal processor (DSP) and a microprocessor, and the like.
  • the storage module may be a memory.
  • the communication module may specifically be a device that interacts with other electronic devices, such as a radio frequency circuit, a Bluetooth chip, and a Wi-Fi chip.
  • the electronic device involved in this embodiment may be a device having the structure shown in FIG. 2 .
  • This embodiment also provides a computer-readable storage medium, where computer instructions are stored in the computer-readable storage medium; when the computer instructions are executed on the electronic device, the electronic device executes the above related method steps to implement the photo sharing method in the above embodiments.
  • the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
  • when implemented in software, they may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, all or part of the processes or functions described in the embodiments of the present application are generated.
  • the computer may be a general purpose computer, special purpose computer, computer network, or other programmable device.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be downloaded from a website, computer, server, or data center and transmitted to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, radio, microwave, etc.).
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that includes an integration of one or more available media.
  • the usable media may be magnetic media (eg, floppy disks, hard disks, magnetic tapes), optical media (eg, DVDs), or semiconductor media (eg, solid state drives), and the like.
  • the process can be completed by a computer program instructing the relevant hardware, and the program can be stored in a computer-readable storage medium.
  • when the program is executed, it may include the processes of the foregoing method embodiments.
  • the aforementioned storage medium includes media that can store program code, such as ROM, random access memory (RAM), magnetic disks, or optical disks.


Abstract

Disclosed are a device identification method and a related apparatus. The method includes: a first electronic device receives a first operation, displays a first interface, and starts a camera; the first interface includes a preview screen captured by the camera, and the preview screen includes a second electronic device. According to image recognition technology and wireless positioning technology, the first electronic device determines the display position, in the preview screen, of a first label of the second electronic device, where the first label is used to identify the second electronic device. When the first electronic device detects a user operation on the first label, the first electronic device outputs a second interface, and the second interface includes one or more controls for controlling the second electronic device. In the present application, the correspondence between the first label and the second electronic device is presented in real time through an augmented reality display mode, interaction between the first electronic device and the second electronic device is realized through the first label, and coordinated control among multiple devices is realized, improving the user experience.

Description

A device identification method and related apparatus
This application claims priority to the Chinese patent application No. 202010779841.2, entitled "A Device Identification Method and Related Apparatus", filed with the China National Intellectual Property Administration on August 5, 2020; to the Chinese patent application No. 202010782270.8, entitled "Method for Sharing Photos and Electronic Device", filed with the China National Intellectual Property Administration on August 5, 2020; and to the Chinese patent application No. 202011183311.8, entitled "A Device Identification Method and Related Apparatus", filed with the China National Intellectual Property Administration on October 29, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of electronic technology, and in particular, to a device identification method and related apparatus.
Background
With the development of technology, smart interconnected devices are becoming more and more popular. Many users own multiple electronic devices such as smartphones, computers, smart TVs, tablets, and smart speakers; a home may also contain other electronic devices such as smart audio-visual equipment, routers/Wi-Fi boxes, smart cleaning devices, smart kitchen appliances, and smart lighting systems.
When a user needs to select one or more specific devices for interaction (such as control, pairing, data transmission, screen projection, etc.), the target device can be discovered and selected among multiple devices through menus/lists, maps, NFC, and the like. The user operations are rather cumbersome.
Summary
Embodiments of this application provide a device identification method and related apparatus, which can intuitively display identification information of nearby devices through simple operations, provide interaction channels between devices, realize coordinated control among multiple devices, and effectively improve the user experience.
It should be noted that, in the embodiments provided in this application, the execution order of the steps may have multiple possible implementations, and some or all of the steps may be executed sequentially or in parallel.
In a first aspect, this application provides a device identification method, applied to a first electronic device with a camera. The method includes: the first electronic device receives a first operation; in response to the first operation, the first electronic device displays a first interface, the first interface including a preview screen captured by the camera, where the preview screen includes a second electronic device; the first electronic device acquires first position information of the second electronic device relative to the first electronic device; based on the first position information and the display area of the second electronic device in the preview screen, the first electronic device determines the display position of a first label in the preview screen and displays the first label at that display position, where the first label is used to identify the second electronic device; the first electronic device receives a second operation on the first label; in response to the second operation, the first electronic device displays a second interface, the second interface including one or more controls for controlling the second electronic device.
In this embodiment of the application, the first electronic device receives the first operation, displays the first interface, starts the camera, and displays the image captured by the camera on the first interface in real time. According to image recognition technology, the first electronic device recognizes the electronic devices in the image and their device types (e.g., speaker, computer, tablet computer), for example the second electronic device; and according to wireless positioning technology (e.g., UWB positioning, Bluetooth positioning, Wi-Fi positioning), the first electronic device acquires the position information of the second electronic device relative to the first electronic device. The position information includes one or more of distance, direction, and angle. Based on the position information, the first electronic device determines, in the preview screen, the display position of the first label of the second electronic device. The first label is used to identify the second electronic device, for example its device name, device type, and so on. The display position of the first label is related to the display position of the second electronic device. When the first electronic device detects a user operation on the first label, the first electronic device outputs the second interface, which includes one or more controls for controlling the second electronic device. The second interface may be displayed superimposed on the first interface, or the electronic device may jump from the first interface to display the second interface. In the present application, the correspondence between the first label and the second electronic device is presented in real time through an augmented reality display mode, interaction between the first electronic device and the second electronic device is realized through the first label, and coordinated control among multiple devices is realized, improving the user experience.
In some possible implementations, acquiring, by the first electronic device, the first position information of the second electronic device relative to the first electronic device specifically includes: the first electronic device broadcasts a probe request, the probe request including the identity of the first electronic device; when the first electronic device receives a probe response sent by the second electronic device based on the probe request, it determines the first position information of the second electronic device relative to the first electronic device based on the probe response, the probe response including the identity of the second electronic device. In this manner, the first position information includes the relative position of the second electronic device and the first electronic device, such as distance, direction, and angle. From the time difference between sending the probe request and receiving the probe response, the first electronic device can calculate the distance between the second electronic device and the first electronic device (the distance equals the time difference multiplied by the propagation speed of the electromagnetic wave); based on the probe response, the first electronic device calculates the arrival angle of the probe response and can thereby determine the azimuth angle of the second electronic device relative to the first electronic device.
Optionally, the probe response includes the identity of the second electronic device and the first position information, and the first electronic device determines the first position information of the second electronic device relative to the first electronic device based on the probe response. Specifically, the second electronic device calculates the relative position of the second electronic device and the first electronic device based on the received probe request. The probe request includes the sending time; based on the sending time and the time at which the second electronic device received the probe request, the second electronic device determines the time difference and thereby calculates the distance between the second electronic device and the first electronic device. Based on the received probe request, the second electronic device calculates the arrival angle of the probe request and can determine the azimuth angle of the second electronic device relative to the first electronic device. The second electronic device sends a probe response to the first electronic device, the probe response including the identity of the second electronic device and the first position information.
In some possible implementations, the display position of the first label in the preview screen and the display area of the second electronic device in the preview screen partially or completely overlap. The first label may be displayed within the display area of the second electronic device, on the edge of the display area of the second electronic device, or at a position close to the display area of the second electronic device.
In some possible implementations, the method further includes: the first electronic device acquires second position information of a third electronic device relative to the first electronic device; when the first electronic device detects that the preview screen does not include the third electronic device and determines, based on the second position information, that the third electronic device is within the viewing range of the camera, the first electronic device determines, based on the second position information, the display position of a second label in the preview screen, where the second label is used to indicate one or more of the following: identification information of the third electronic device, the obstruction blocking the third electronic device, and the second position information. In this manner, when the first electronic device detects that the relative position of the third electronic device is within the viewing range of the camera but the preview screen does not include an image of the third electronic device, the first electronic device determines that the third electronic device is blocked and outputs the second label of the third electronic device, indicating one or more of the identification information of the third electronic device, the obstruction, and the blocked position in the preview interface.
In some possible implementations, the method further includes: when the first electronic device detects that the preview screen does not include the third electronic device and determines, based on the second position information, that the third electronic device is not within the viewing range of the camera, the first electronic device determines, based on the second position information, the display position of a third label in the preview screen, where the third label is used to indicate one or more of the following: identification information of the third electronic device and the second position information. In this manner, when the first electronic device detects that the relative position of the third electronic device is outside the viewing range of the camera and the preview screen does not include an image of the third electronic device, the first electronic device determines that the third electronic device is not in the viewfinder and outputs the label of the third electronic device, indicating one or more of the identification information of the third electronic device and its relative position (direction, angle, distance, etc.) with respect to the first electronic device.
In some possible implementations, the preview screen includes an image of a fourth electronic device, and after the first electronic device displays the first interface, the method further includes: the first electronic device determines, based on the preview screen, that the device type of the fourth electronic device is a first type; among the electronic devices associated or bound under the account of the first electronic device, the first electronic device determines a first target device whose device type is the first type; the first electronic device displays a fourth label, where the fourth label is used to indicate that the image of the fourth electronic device is associated with the first target device. This manner applies when the first electronic device cannot detect the position information of the fourth electronic device and the image of the fourth electronic device is in the preview screen. In this case, the first electronic device identifies the device type of the fourth electronic device through image recognition technology, and detects whether a target device of that device type exists among the devices logged into the same account (e.g., a Huawei account) as the first electronic device. If so, the first electronic device considers the target device to be the fourth electronic device and outputs a fourth label identifying the target device.
In some possible implementations, the preview screen includes an image of a fifth electronic device, and after the first electronic device displays the first interface, the method further includes: the first electronic device determines, based on the preview screen, that the device type of the fifth electronic device is a second type; the first electronic device acquires third position information of the first electronic device, the first electronic device storing a correspondence between electronic devices and position information; based on the correspondence, the first electronic device determines, according to the third position information, a second target device whose device type is the first type, where the position information of the target device is the same as the third position information; the first electronic device displays a fifth label, where the fifth label is used to indicate that the image of the fifth electronic device is associated with the second target device. This manner applies when the first electronic device cannot detect the position information of the fifth electronic device and the image of the fifth electronic device is in the preview screen. In this case, since the first electronic device stores the correspondence between electronic devices and position information (for example, smart speaker: living room, smart desk lamp: bedroom, computer: office, etc.), the first electronic device detects, based on its current geographic location and the device type of the fifth electronic device identified through image recognition technology, whether a target device of that device type exists among the devices at the same geographic location as the first electronic device. If so, the first electronic device considers the target device to be the fifth electronic device and outputs a fifth label identifying the target device.
In some possible implementations, the first interface further includes a first icon, the first icon being associated with data to be shared, and the method further includes: the first electronic device receives a third operation, the third operation being an operation on the first label and/or the first icon; in response to the third operation, the first electronic device sends the data to be shared to the second electronic device. The third operation includes but is not limited to a drag operation, a tap operation, and the like. This provides a data sharing method: the user selects, on the first interface, the second electronic device to share with, and the data to be shared is sent to the second electronic device. The user operations for data sharing are simplified, the device information is displayed intuitively, and the user experience is improved.
In some possible implementations, before the first electronic device receives the third operation, the method further includes: the first electronic device displays, on the first interface, the first label in a first display form according to the data type of the data to be shared, where the first label in the first display form is used to prompt the user that the second electronic device supports outputting the data to be shared. The first display form may be brightening the display area of the first label (changing its brightness, color, etc.).
In some possible implementations, the preview screen includes an image of the third electronic device and a third label, the third label being associated with the third electronic device; the method further includes: the first electronic device receives a fourth operation, the fourth operation being an operation on the first label and/or the third icon; in response to the fourth operation, the first electronic device outputs a prompt message, the prompt message being used to prompt the user that the third electronic device does not support outputting the data to be shared.
In a second aspect, this application provides an electronic device, including: one or more processors and a memory; the memory includes computer instructions, and when the one or more processors invoke the computer instructions, the electronic device is caused to perform:
receiving a first operation;
in response to the first operation, displaying a first interface, the first interface including a preview screen captured by a camera, where the preview screen includes a first target device;
acquiring first relative position information with respect to the first target device;
based on the first relative position information and the display position of the first target device in the preview screen, determining the display position of a first label in the preview screen, where the first label is used to indicate identification information of the first target device;
receiving a second operation on the first label;
in response to the second operation, displaying a second interface, the second interface including one or more controls for controlling the first target device.
In this embodiment of the application, the electronic device receives an operation, displays an interface, starts the camera, and displays the image captured by the camera on the interface in real time. According to image recognition technology, the electronic device recognizes the electronic devices in the image and their device types (e.g., speaker, computer, tablet computer), for example the first target device; and according to wireless positioning technology (e.g., UWB positioning, Bluetooth positioning, Wi-Fi positioning), the electronic device acquires the position information of the first target device relative to the electronic device. The position information includes one or more of distance, direction, and angle. Based on the position information, the electronic device determines, in the preview screen, the display position of the label of the first target device. The label is used to identify the first target device, for example its device name, device type, and so on. The display position of the label is related to the display position of the first target device. When the electronic device detects a user operation on the label, the electronic device outputs the second interface, which includes one or more controls for controlling the first target device. The second interface may be displayed superimposed on the interface, or the electronic device may jump from the interface to display the second interface. In the present application, the correspondence between the first label and the first target device is presented in real time on the first interface of the electronic device through an augmented reality display mode, interaction between the electronic device and the first target device is realized through the first label, and coordinated control among multiple devices is realized, improving the user experience.
在一些可能的实施方式中,当一个或多个处理器调用计算机指令时,使得电子设备执行获取和第一目标设备的第一相对位置信息,具体包括:广播探测请求,探测请求包括电子设备的身份标识;接收到第一目标设备基于探测请求发送的探测响应时,基于探测响应确定和第一目标设备的第一相对位置信息,探测响应包括第一目标设备的身份标识。
可选的,探测响应中包括第一目标设备的身份标识和第一相对位置信息,电子设备基于探测响应确定第一目标设备与电子设备的第一相对位置信息,例如距离、方向、角度等。具体的,第一目标设备根据接收到的探测请求,计算第一目标设备与电子设备的相对位置。探测请求中包括发送时间,第一目标设备基于该发送时间,以及第一目标设备接收到该探测请求的时间,确定时间差,从而计算出第一目标设备与电子设备的距离;第一目标设备基于接收到的探测请求,计算出该探测请求的到达角度,可以确定出第一目标设备相对于电子设备的方位角度。第一目标设备向电子设备发送探测响应,该探测响应中包括第一目标设备的身份标识和第一相对位置信息。
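上述根据发送时间与接收时间之差计算设备间距离的过程,可以用如下示意代码说明(单程传播时延测距的简化草图,假设两设备时钟已同步,未考虑时钟偏差与多径误差):

```python
SPEED_OF_LIGHT = 299_792_458.0  # 电磁波传播速度,单位:米/秒

def distance_from_timestamps(t_send, t_receive):
    """根据探测请求的发送时刻 t_send 与第一目标设备的接收时刻 t_receive
    (单位:秒)计算传播时延,再乘以光速得到两设备间的距离(米)。"""
    time_of_flight = t_receive - t_send
    return SPEED_OF_LIGHT * time_of_flight
```

实际的UWB测距通常采用双向测距(two-way ranging)以消除时钟不同步的影响,此处仅示意"时间差乘以传播速度"这一基本关系。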
在一些可能的实施方式中,第一标签在预览画面中的显示位置和第一目标设备在预览画面中的显示位置,部分重叠或完全重叠。第一标签可以显示在第一目标设备的显示区域内,可以显示在第一目标设备的显示区域的边缘,也可以显示在紧靠第一目标设备的显示区域的位置。
在一些可能的实施方式中,当一个或多个处理器调用计算机指令时,使得电子设备还执行:获取和第二目标设备的第二相对位置信息;当电子设备检测到预览画面中不包括第二目标设备,且基于第二相对位置信息确定出第二目标设备在摄像头的取景范围内;电子设备基于第二相对位置信息,确定出第二标签在预览画面中的显示位置,其中第二标签用于指示以下一种或多种信息:第二目标设备的标识信息、第二目标设备的遮挡物、第二相对位置信息。
在一些可能的实施方式中,当一个或多个处理器调用计算机指令时,使得电子设备还执行:当电子设备检测到预览画面中不包括第二目标设备,且基于第二相对位置信息确定出第二目标设备不在摄像头的取景范围内;电子设备基于第二相对位置信息,确定出第三标签在预览画面中的显示位置,其中第三标签用于指示以下一种或多种信息:第二目标设备的标识信息、第二相对位置信息。
在一些可能的实施方式中,预览画面中包括第三目标设备的图像,当一个或多个处理器调用计算机指令时,使得电子设备执行显示第一界面之后,电子设备还执行:基于预览画面确定第三目标设备的设备类型为第一类型;在电子设备的账号下关联或绑定的电子设备中,确定出设备类型为第一类型的设备的标识信息;显示第四标签,第四标签用于指示第三目标设备的图像与标识信息关联。
在一些可能的实施方式中,预览画面中包括第四目标设备的图像,当一个或多个处理器调用计算机指令时,使得电子设备执行显示第一界面之后,电子设备还执行:基于预览画面确定第四目标设备的设备类型为第二类型;获取电子设备的第三位置信息,电子设备中存储有电子设备和位置信息的对应关系;电子设备根据第三位置信息,在对应关系中确定出设备类型为第二类型的设备的标识信息;显示第五标签,第五标签用于指示第四目标设备的图像与标识信息关联。
在一些可能的实施方式中,第一界面还包括第一图标,第一图标关联了待分享数据,当一个或多个处理器调用计算机指令时,使得电子设备还执行:接收第三操作,第三操作为针对于第一标签和/或第一图标的操作;响应于第三操作,将待分享数据发送给第一目标设备。第三操作包括但不限于拖拽操作、点击操作等。这种方式提供了一种数据分享的途径:在第一界面上选择想要分享的第一目标设备,即可将待分享的数据发送到第一目标设备,简化了数据分享的用户操作,直观地显示了设备信息,提升了用户体验。
在一些可能的实施方式中,当一个或多个处理器调用计算机指令时,使得电子设备执行接收第三操作之前,电子设备还执行:根据待分享数据的数据类型,在第一界面上显示第一显示形式的第一标签,第一显示形式的第一标签用于提示用户第一目标设备支持输出待分享数据。其中第一显示形式可以是将第一标签的显示区域提亮(改变亮度、颜色等)。
在一些可能的实施方式中,预览画面中包括第二目标设备的图像和第三标签,第三标签与第二目标设备关联;当一个或多个处理器调用计算机指令时,使得电子设备还执行:接收第四操作,第四操作为针对于第一图标和/或第三标签的操作;响应于第四操作,输出提示消息,提示消息用于提示用户第二目标设备不支持输出待分享数据。
第三方面,本申请提供了一种分享照片的方法,应用于第一电子设备,该方法包括:显示该第一电子设备的拍摄预览界面,该拍摄预览界面包括第一照片的缩略图和该第一电子设备的摄像头采集的预览画面;识别该预览画面中包括的第六电子设备;确定该第六电子设备与该第一电子设备的相对位置;基于识别出的该第六电子设备和该相对位置,在该预览画面上,显示该第六电子设备的标签,该标签用于标识该第六电子设备;接收对该第一照片的缩略图的第五操作;响应于该第五操作,移动该第一照片的缩略图至该预览画面上该标签标识的该第六电子设备的显示区域;向该第六电子设备发送该第一照片。
在本申请实施例中,用户点击相机应用的图标显示的相机应用主界面可以被称为“拍摄预览界面”,该拍摄预览界面中呈现的画面可以称为“预览图像”或者“预览画面”。
应理解,本申请实施例中拍摄预览界面可以代表包括预览画面、拍摄快门键、本地相册图标、摄像头切换图标等在内的界面,如果该界面上发生显示内容的变化,例如显示了某个识别出的设备标签等,该界面还是可以被称为拍摄预览界面,后续不再赘述。
需要说明的是,该预览画面可以是手机前置摄像头或者后置摄像头获取的,本申请实施例对拍摄照片的摄像头不作限定。例如,一张人物照片是手机前置摄像头获取的,如果用户要通过后置摄像头识别电子设备,可以通过点击摄像头切换按键进行切换。或者,该人物照片是手机后置摄像头获取的,如果用户要通过前置摄像头识别电子设备,可以通过点击摄像头切换按键进行切换,本申请实施例对此不作限定。
通过上述实现方式,手机可以提前判断预览画面中包括的电子设备,并在用户启动照片分享功能时,快速将识别出的电子设备名称显示在界面中,提高手机识别预览画面的物体的速度。例如当手机识别出当前预览画面中包括的第六电子设备之后,用户可以根据自己的需求,将该第一照片的缩略图拖动到待分享的第六电子设备。
应理解,针对上述实现过程,手机可以通过图像检测、3D扫描技术和机器视觉等多种不同的方式,检测并识别到预览画面中包括的其他电子设备,本申请实施例对手机识别预览画面中其他电子设备的方式不作限定。
一种可能的实现方式中,第一照片的缩略图可以是本地相册图标。例如,本地相册图标显示的就是用户最新拍摄的第一照片。
另一种可能的实现方式中,第一照片的缩略图可以和本地相册图标具有相同的样式或显示,且第一照片的缩略图是悬浮显示在拍摄预览界面上。结合第三方面和上述实现方式,在第三方面的某些实现方式中,该方法还包括:接收对该相册图标的第六操作;响应于该第六操作,在该拍摄预览界面悬浮显示该第一照片的缩略图。
结合第三方面和上述实现方式,在第三方面的某些实现方式中,第五操作是拖动该第一照片的缩略图的操作,该第六操作是长按该本地相册图标的操作。
上述方法中,以长按操作为例,介绍了通过用户长按本地相册图标作为触发照片分享过程的操作。应理解,本申请实施例还可以通过其他预设操作触发本申请实施例提供的照片分享过程,或者通过其他预设操作触发手机识别预览画面中的电子设备,例如该预设操作不限于长按本地相册图标、双击本地相册图标、或者在拍摄预览界面上绘制固定图案等,本申请实施例对此不作限定。
结合第三方面和上述实现方式,在第三方面的某些实现方式中,第六电子设备的标签用于标识该第六电子设备的名称,和/或该第六电子设备的标签用于标识该第六电子设备所处的位置。
在本申请实施例中,手机识别出预览画面中包括的第六电子设备之后,可以根据当前预览画面中第六电子设备的显示位置,确定显示第六电子设备的标签的位置。一种可能的方式中,手机可以将第六电子设备的标签显示在预览画面中的该第六电子设备所在的区域。
可选地,该第六电子设备的标签可以显示在靠近该第六电子设备的定位装置的区域。或者,该第六电子设备的标签可以显示在预览画面中的空白区域,不遮挡预览画面中的其他物体。
上述介绍的图标显示方式可以在不遮挡预览画面中其他物体的情况下标记识别出的电子设备,不影响用户的视觉和观感,提高了用户的视觉体验。
通过上述方法,用户可以在拍摄照片的过程中,通过预设的操作,可以启动手机的设备识别功能和定位功能,结合手机的识别功能和定位功能,识别出相机的预览画面中包括的其他电子设备,用户可以将待分享的照片直接拖动到其他电子设备所在的区域,从而快速将照片分享给周围存在的其他电子设备。该过程简化了分享照片的操作流程,缩短了分享照片的时间,提高了用户体验。
结合第三方面,在第三方面的某些实现方式中,第一电子设备包括第一定位芯片,该第六电子设备包括第二定位芯片,该识别该预览画面中包括的第六电子设备,确定该第六电子设备与该第一电子设备的相对位置,包括:基于该第一定位芯片、该第二定位芯片和该预览画面,识别该预览画面中包括的该第六电子设备,确定该第六电子设备与该第一电子设备的相对位置,其中,该第一定位芯片包括蓝牙定位芯片、超宽带UWB定位芯片中的至少一种,该第二定位芯片包括蓝牙定位芯片、超宽带UWB定位芯片中的至少一种。
本申请实施例中,手机可以通过多种可能的定位技术,识别出预览画面中的其他电子设备,并定位其他电子设备的位置。可选地,本申请实施例的定位技术可以包括基于蓝牙的无线感知定位、基于超宽带(ultra wide-band,UWB)感知的无线感知定位、基于计算机视觉的定位等技术中的一种,或者以上列举的多种定位技术的融合等,又或者其他更多的定位技术,本申请实施例对手机定位其他电子设备的方式不作限定。
结合第三方面和上述实现方式,在第三方面的某些实现方式中,拍摄预览界面上还包括拍摄快门键,该方法还包括:接收对该拍摄快门键的第七操作;响应于该第七操作,拍摄该第一照片。
可选地,该方法可以在用户通过相机应用拍照时,直接将最新拍摄的第一照片分享给其他设备。或者将本地相册中的日期最新的第一照片分享给其他设备。
结合第三方面和上述实现方式,在第三方面的某些实现方式中,在该拍摄预览界面显示该第一照片的缩略图之前,该方法还包括:接收第八操作;响应于该第八操作,在该拍摄预览界面上显示照片列表,该照片列表包括该第一照片和多张第二照片,该第二照片的日期在该第一照片之前;接收第九操作;响应于该第九操作,从该照片列表中选中至少一个第二照片;以及,该移动该第一照片的缩略图至该预览画面上该标签标识的该第六电子设备的显示区域之后,该方法还包括:向该第六电子设备发送该第一照片和选中的该至少一个第二照片。
结合第三方面和上述实现方式,在第三方面的某些实现方式中,第八操作是以该本地相册图标为起点沿着预设方向的滑动操作,该第九操作是点击操作。
通过上述方法,用户可以在拍摄照片的过程中,通过预设的操作,可以启动手机的设备识别功能和定位功能,结合手机的识别功能和定位功能,识别出相机的预览画面中包括的其他电子设备,用户可以选择多张待分享的照片,并将多张待分享的照片直接拖动到其他电子设备所在的区域,从而快速将照片分享给周围存在的其他电子设备。该过程简化了分享照片的操作流程,缩短了分享照片的时间,提高了用户体验。
一种可能的情况中,该照片列表中的照片可以按照用户拍摄的顺序进行排列。示例性的,第一照片是用户拍摄的最新照片,第二照片的拍摄时间早于第一照片的拍摄时间。
或者,照片列表中的照片可以按照其他可能的排列顺序进行排列,例如检测到拍摄地点为公司,该照片列表中可以显示拍摄地点为公司的照片,本申请实施例对此不作限定。
一种可能的情况中,当用户通过向上滑动的操作触发在界面上显示该照片列表之后,该照片列表中的第一张照片可以是默认选中的。如果用户并不期望分享该第一照片,可以点击第一照片右下角的选择框,取消选择第一照片。同样地,如果用户期望同时分享该第一照片和至少一个第二照片,可以点击每张第二照片右下角的选择框,选择多张待分享的照片,此处不再赘述。
可选地,本申请实施例还可以通过其他预设操作触发本申请实施例提供的分享多张照片过程,或者通过其他预设操作触发手机识别预览画面中的电子设备,例如该预设操作不限于选中本地相册图标并向上拖动、双击本地相册图标、或者在拍摄预览界面上绘制固定图案等,本申请实施例对此不作限定。
结合第三方面和上述实现方式,在第三方面的某些实现方式中,当移动该第一照片的缩略图至该标签标识的该第六电子设备的显示区域时,该第六电子设备的标签的显示效果发生变化,该显示效果包括该第六电子设备的标签的颜色、大小、动画效果中的一种或多种。
具体地,用户可以将该第一照片的缩略图拖动到第六电子设备所在位置之后释放,该第六电子设备的图标可以呈现为不同的颜色,或者显示出大小变化、跳动、闪烁等其他动态效果,以提醒用户将当前拍摄的第一照片分享给预览画面中识别到的第六电子设备。
或者,用户拖动该第一照片的缩略图的过程中,在预览画面上,还可以显示提醒控件。示例性的,该提醒控件可以是箭头等,该箭头可以静态显示、跳动显示或者闪烁显示,以提示用户可以将第一照片的缩略图拖动到该箭头标识的位置,实现照片分享功能。本申请实施例对提醒控件的显示方式不作限定。
结合第三方面和上述实现方式,在第三方面的某些实现方式中,当该第六电子设备在该预览画面中被遮挡时,或者检测到该第六电子设备位于该预览画面对应的范围之外的位置时,该方法还包括:在该拍摄预览界面上显示提示信息,该提示信息用于提示用户该第六电子设备的位置,或者该提示信息用于提示用户调整该第一电子设备的位置,使得该第一电子设备的该预览画面中显示该第六电子设备。
需要说明的是,手机可以和附近的其他电子设备进行通信,例如通过蓝牙、无线保真(wireless fidelity,WIFI)模块等多种可能的方式进行通信,那么手机就可以感知到附近存在的电子设备。或者,手机通过UWB等无线定位技术确定附近存在其他电子设备,并识别出该电子设备的类型等,可以显示在拍摄预览界面中。本申请实施例对手机和附近的其他电子设备的通信交互方式、建立连接的方式不作限定。
通过上述方法,当手机识别到预览画面中存在其他电子设备,且该电子设备被障碍物遮挡时,在用户分享照片的过程中,可以在拍摄预览界面上显示文字或图标等提醒信息,用于提示用户被遮挡的电子设备的位置等,用户可以进一步将拍摄的照片快速分享到被遮挡的电子设备,为用户向被遮挡的电子设备分享照片提供了一种可能的途径,简化了用户分享照片的操作步骤。
一种可能的场景中,手机可能通过无线定位技术识别到附近有第六电子设备,且该第六电子设备并没有显示在手机当前的预览画面中。针对该种场景,本申请实施例还可以在拍摄预览界面上显示提醒信息,用于提醒用户某个方位存在第六电子设备。
可选地,本申请实施例除了提醒窗口的文字提醒之外,还可以包括图标提醒。例如,在手机的拍摄预览界面上,除了该提醒窗口之外,还可以包括静态显示的箭头、动态闪烁的箭头或者跳动显示的箭头等标记被遮挡的第六电子设备的位置的图标,本申请实施例对此不作限定。
或者,另一种可能的方式中,用户可以根据界面上的提醒信息转动手机的方向,使得手机的摄像头可以获取检测到的第六电子设备,并在预览画面中显示用户将要分享照片的第六电子设备,从而可以按照上述介绍的方法,快速地将拍摄的照片分享给其他电子设备。
手机可以通过无线定位技术识别到附近有其他电子设备,且该电子设备并没有显示在手机当前的预览画面中。针对该种场景,本申请实施例还可以在拍摄预览界面上显示提醒信息,用于提醒用户某个方位存在其他电子设备。
综上所述,本申请实施例提供的分享照片的方法,用户可以在拍摄照片或者运行相机应用的过程中,通过预设的操作,启动电子设备的设备识别功能和定位功能。并基于电子设备的识别功能和定位功能,识别出相机的预览画面中包括的其他电子设备,用户可以通过快捷操作选择一张或多张待分享的照片,并直接拖动该一张或多张待分享的照片到其他电子设备所在的区域,从而快速将一张或多张照片分享给周围存在的其他电子设备。此外,本申请实施例针对预览画面中存在被遮挡的其他电子设备等多种场景,为用户提供人性化的交互界面,方便用户可以通过快捷操作分享一张或多张照片,该过程简化了分享照片的操作流程,缩短了分享照片的时间,提高了用户体验。
第四方面,提供了一种第一电子设备,包括:处理器和存储器;该存储器存储有一个或多个指令,当该一个或者多个指令被该处理器执行时,使得该第一电子设备执行以下步骤:显示该第一电子设备的拍摄预览界面,该拍摄预览界面包括第一照片的缩略图和该第一电子设备的摄像头采集的预览画面;识别该预览画面中包括的第六电子设备;确定该第六电子设备与该第一电子设备的相对位置;基于识别出的该第六电子设备和该相对位置,在该预览画面上,显示该第六电子设备的标签,该标签用于标识该第六电子设备;接收对该第一照片的缩略图的第五操作;响应于该第五操作,移动该第一照片的缩略图至该预览画面上该标签标识的该第六电子设备的显示区域;向该第六电子设备发送该第一照片。
结合第四方面,在第四方面的某些实现方式中,第一电子设备包括第一定位芯片,该第六电子设备包括第二定位芯片,当该一个或者多个指令被该处理器执行时,使得该第一电子设备执行以下步骤:基于该第一定位芯片、该第二定位芯片和该预览画面,识别该预览画面中包括的该第六电子设备,确定该第六电子设备与该第一电子设备的相对位置,其中,该第一定位芯片包括蓝牙定位芯片、超宽带UWB定位芯片中的至少一种,该第二定位芯片包括蓝牙定位芯片、超宽带UWB定位芯片中的至少一种。
结合第四方面和上述实现方式,在第四方面的某些实现方式中,拍摄预览界面包括相册图标,当该一个或者多个指令被该处理器执行时,使得该第一电子设备执行以下步骤:接收对该相册图标的第六操作;响应于该第六操作,在该拍摄预览界面悬浮显示该第一照片的缩略图。
结合第四方面和上述实现方式,在第四方面的某些实现方式中,第五操作是拖动该第一照片的缩略图的操作,该第六操作是长按该本地相册图标的操作。
结合第四方面和上述实现方式,在第四方面的某些实现方式中,拍摄预览界面上还包括拍摄快门键,当该一个或者多个指令被该处理器执行时,使得该第一电子设备执行以下步骤:接收对该拍摄快门键的第七操作;响应于该第七操作,拍摄该第一照片。
结合第四方面和上述实现方式,在第四方面的某些实现方式中,当该一个或者多个指令被该处理器执行时,使得该第一电子设备执行以下步骤:接收第八操作;响应于该第八操作,在该拍摄预览界面上显示照片列表,该照片列表包括该第一照片和多张第二照片,该第二照片的日期在该第一照片之前;接收第九操作;响应于该第九操作,从该照片列表中选中至少一个第二照片;以及,移动该第一照片的缩略图至该预览画面上该标签标识的该第六电子设备的显示区域之后,向该第六电子设备发送该第一照片和选中的该至少一个第二照片。
结合第四方面和上述实现方式,在第四方面的某些实现方式中,第八操作是以该本地相册图标为起点沿着预设方向的滑动操作,该第九操作是点击操作。
结合第四方面和上述实现方式,在第四方面的某些实现方式中,第六电子设备的标签用于标识该第六电子设备的名称,和/或该第六电子设备的标签用于标识该第六电子设备所处的位置。
结合第四方面和上述实现方式,在第四方面的某些实现方式中,当移动该第一照片的缩略图至该标签标识的该第六电子设备的显示区域时,该第六电子设备的标签的显示效果发生变化,该显示效果包括该第六电子设备的标签的颜色、大小、动画效果中的一种或多种。
结合第四方面和上述实现方式,在第四方面的某些实现方式中,当该第六电子设备在该预览画面中被遮挡时,该一个或者多个指令被该处理器执行时,该第一电子设备还用于执行以下步骤:在该拍摄预览界面上显示提示信息,该提示信息用于提示用户该第六电子设备的位置,或者该提示信息用于提示用户调整该第一电子设备的位置,使得该第一电子设备的该预览画面中显示该第六电子设备。
第五方面,本申请实施例提供了一种计算机存储介质,包括计算机指令,当计算机指令在电子设备上运行时,使得电子设备执行上述任一方面任一项可能的实现方式中的方法。
第六方面,本申请实施例提供了一种计算机程序产品,当计算机程序产品在计算机上运行时,使得计算机执行上述任一方面任一项可能的实现方式中的方法。
附图说明
图1为本申请实施例提供的一种系统架构示意图;
图2为本申请实施例提供的一种电子设备的结构示意图;
图3为本申请实施例提供的又一种电子设备的结构示意图;
图4为本申请实施例提供的一种设备识别方法的场景示意图;
图5A-图5H为本申请实施例提供的一组界面示意图;
图6A-图6B为本申请实施例提供的又一组界面示意图;
图7A-图7C为本申请实施例提供的又一组界面示意图;
图8A-图8E为本申请实施例提供的又一组界面示意图;
图9A-图9E为本申请实施例提供的又一组界面示意图;
图10A-图10C为本申请实施例提供的又一组界面示意图;
图11A-图11D为本申请实施例提供的又一组界面示意图;
图12A-图12F为本申请实施例提供的又一组界面示意图;
图13是一例分享照片过程的图形用户界面示意图;
图14是本申请实施例提供的一例分享照片过程的图形用户界面示意图;
图15是本申请实施例提供的又一例分享照片过程的图形用户界面示意图;
图16是本申请实施例提供的一例接收照片的图形用户界面示意图;
图17是本申请实施例提供的一例分享照片过程的图形用户界面示意图;
图18是本申请实施例提供的又一例接收照片的图形用户界面示意图;
图19是本申请实施例提供的一例分享照片的方法的示意性流程图;
图20为本申请实施例提供的一种定位方法的方法流程图;
图21为本申请实施例提供的一种定位方法的原理示意图;
图22为本申请实施例提供的一种设备识别方法的流程示意图;
图23为本申请实施例提供的软件架构示意图。
具体实施方式
下面将结合附图对本申请实施例中的技术方案进行描述。其中,在本申请实施例的描述中,除非另有说明,"/"表示或的意思,例如,A/B可以表示A或B;文本中的"和/或"仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况,另外,在本申请实施例的描述中,"多个"是指两个或多于两个。
以下,术语"第一"、"第二"仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有"第一"、"第二"的特征可以明示或者隐含地包括一个或者更多个该特征,在本申请实施例的描述中,除非另有说明,"多个"的含义是两个或两个以上。
本申请实施例提供了一种基于增强现实的设备识别方法。增强现实(Augmented Reality,AR)是一种借助计算机图形和可视化技术,将虚拟信息叠加到真实环境中、使二者在同一个画面或空间中同时存在的技术,综合了三维显示技术、交互技术、传感器技术、计算机视觉技术以及多媒体技术等。
所提方法中,电子设备100进入到第一界面,启动摄像头,在第一界面中实时显示通过该摄像头采集到的图像;同时发送带有无线定位技术的探测请求,电子设备100根据接收到的针对该探测请求的探测响应,确定电子设备100的附近设备,以及附近设备的设备名称、设备类型、距离电子设备100的物理距离和角度。电子设备100对摄像头采集到的图像进行图像识别,识别出图像中的电子设备以及电子设备的设备类型(例如音箱、电脑、平板电脑等)。电子设备100根据附近设备距离电子设备100的物理距离和角度,确定出附近设备在第一界面中的图像的显示区域。电子设备通过增强现实的方式在第一界面实时显示设备图标,该设备图标可以用于电子设备100与附近设备进行交互,例如电子设备100检测到针对于该设备图标的用户操作,响应于该用户操作,电子设备100输出该设备图标对应的附近设备的控制界面。本方法实现了电子设备与附近设备的交互,并且通过增强现实的显示方式实时呈现了设备图标和设备的对应关系,提升了用户体验。
本申请中,设备图标也可称为设备标签。
下面介绍本申请实施例提供的一种通信系统。
请参照图1,图1示例性地示出了本申请实施例中提供的一种通信系统10示意图。如图1所示,该通信系统10包括电子设备100、电子设备201、电子设备202、电子设备203、电子设备204等。电子设备100可以辅助用户选择和控制各种电子设备(例如音箱、电视机、冰箱、空调等等)。本申请中,电子设备100也可称为第一电子设备,电子设备201(或电子设备202、电子设备203、电子设备204等)也可称为第二电子设备;其中,
电子设备(例如电子设备100、电子设备201、电子设备202、电子设备203或电子设备204)具有超宽带(ultra wide band,UWB)通信模块,还可以具有蓝牙通信模块、WLAN通信模块和GPS通信模块中的一项或多项。以电子设备100为例,电子设备100可以通过UWB通信模块、蓝牙通信模块、WLAN通信模块和全球定位系统(Global Positioning System,GPS)通信模块中的一项或多项发射信号来探测、扫描电子设备100附近的电子设备(例如电子设备201、电子设备202、电子设备203或电子设备204),使得电子设备100可以通过UWB、蓝牙、WLAN和GPS中的一种或多种近距离无线通信协议发现附近的电子设备,并与附近的电子设备建立无线通信连接,并可以传输数据至附近的电子设备。
本申请对电子设备(例如电子设备100、电子设备201、电子设备202、电子设备203或电子设备204)的类型不做具体限定,在一些实施例中,本申请实施例中的电子设备可以是手机、可穿戴设备(例如,智能手环)、平板电脑、膝上型计算机(laptop)、手持计算机、电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、蜂窝电话、个人数字助理(personal digital assistant,PDA)、增强现实(Augmented reality,AR)/虚拟现实(virtual reality,VR)设备等便携设备。还可以是音箱、电视机、冰箱、空调、车载设备、打印机、投影仪等设备。电子设备的示例性实施例包括但不限于搭载……或者其它操作系统的电子设备。
在一种可能实现方式中,电子设备100、电子设备201、电子设备202、电子设备203和电子设备204间可以直接通信。在一种可能实现方式中,电子设备100、电子设备201、电子设备202、电子设备203和电子设备204可以通过有线或无线保真(wireless fidelity,WiFi)连接的方式连接至局域网(local area network,LAN)。例如,电子设备100、电子设备201、电子设备202、电子设备203和电子设备204均连接到同一个电子设备301,电子设备100、电子设备201、电子设备202、电子设备203和电子设备204可以通过电子设备301间接通信。该电子设备301可以是电子设备100、电子设备201、电子设备202、电子设备203和电子设备204中的一个,还可以是额外的第三方设备,例如是路由器、云端服务器、网关等。其中,云端服务器可以是硬件服务器,也可以植入虚拟化环境中,例如,云端服务器可以是在可以包括一个或多个其他虚拟机的硬件服务器上执行的虚拟机。电子设备301可以通过网络向电子设备100、电子设备201、电子设备202、电子设备203和电子设备204发送数据,也可以接收电子设备100、电子设备201、电子设备202、电子设备203和电子设备204发送的数据。
电子设备301可以包括有存储器、处理器和收发器。其中,存储器可以用于存储UWB定位的相关程序;存储器还可以用于存储通过UWB定位技术获取的电子设备(例如,电子设备201)的方位参数;存储器还可以用于存储经由电子设备301交换的消息、电子设备100和附近设备相关的数据和/或配置。处理器可以用于当获取局域网中多个附近设备的方位参数时,从多个附近设备的方位参数中确定出相应的目标设备。收发器可用于与连接到局域网的电子设备进行通信。需要说明的是,本申请实施例中,多个附近设备可以连接至同一个局域网,也可以不连接至同一个局域网,此处不做具体限定。
可以理解的,本实施例示出的结构并不构成对通信系统10的具体限定。在本申请另一些实施例中,通信系统10可以包括比图示更多或更少的设备。
下面,介绍本申请实施例中涉及的电子设备100。
参见图2,图2示出了本申请实施例提供的示例性电子设备100的结构示意图。
电子设备100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
可以理解的是,本申请实施例示意的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
其中,控制器可以是电子设备100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDA)和一根串行时钟线(derail clock line,SCL)。在一些实施例中,处理器110可以包含多组I2C总线。处理器110可以通过不同的I2C总线接口分别耦合触摸传感器180K,充电器,闪光灯,摄像头193等。例如:处理器110可以通过I2C接口耦合触摸传感器180K,使处理器110与触摸传感器180K通过I2C总线接口通信,实现电子设备100的触摸功能。
I2S接口可以用于音频通信。在一些实施例中,处理器110可以包含多组I2S总线。处理器110可以通过I2S总线与音频模块170耦合,实现处理器110与音频模块170之间的通信。在一些实施例中,音频模块170可以通过I2S接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。
PCM接口也可以用于音频通信,将模拟信号抽样,量化和编码。在一些实施例中,音频模块170与无线通信模块160可以通过PCM总线接口耦合。在一些实施例中,音频模块170也可以通过PCM接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。所述I2S接口和所述PCM接口都可以用于音频通信。
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。在一些实施例中,UART接口通常被用于连接处理器110与无线通信模块160。例如:处理器110通过UART接口与无线通信模块160中的蓝牙模块通信,实现蓝牙功能。在一些实施例中,音频模块170可以通过UART接口向无线通信模块160传递音频信号,实现通过蓝牙耳机播放音乐的功能。
MIPI接口可以被用于连接处理器110与显示屏194,摄像头193等外围器件。MIPI接口包括摄像头串行接口(camera serial interface,CSI),显示屏串行接口(display serial interface,DSI)等。在一些实施例中,处理器110和摄像头193通过CSI接口通信,实现电子设备100的拍摄功能。处理器110和显示屏194通过DSI接口通信,实现电子设备100的显示功能。
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器110与摄像头193,显示屏194,无线通信模块160,音频模块170,传感器模块180等。GPIO接口还可以被配置为I2C接口,I2S接口,UART接口,MIPI接口等。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为电子设备100充电,也可以用于电子设备100与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如AR设备等。
可以理解的是,本申请实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备100的结构限定。在本申请另一些实施例中,电子设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电的实施例中,充电管理模块140可以通过USB接口130接收有线充电器的充电输入。在一些无线充电的实施例中,充电管理模块140可以通过电子设备100的无线充电线圈接收无线充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为电子设备供电。
电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110,内部存储器121,外部存储器,显示屏194,摄像头193,和无线通信模块160等供电。电源管理模块141还可以用于监测电池容量,电池循环次数,电池健康状态(漏电,阻抗)等参数。在其他一些实施例中,电源管理模块141也可以设置于处理器110中。在另一些实施例中,电源管理模块141和充电管理模块140也可以设置于同一个器件中。
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块150或其他功能模块设置在同一个器件中。
无线通信模块160可以提供应用在电子设备100上的包括UWB,无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,WiFi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,电子设备100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
应理解,在本申请实施例中,如果要实现在两个电子设备之间分享照片,可以包括以以上列举的任意一种通信方式进行传输,例如通过蓝牙、无线保真(wireless fidelity,WIFI)模块等多种可能的方式,本申请实施例对此不作限定。
其中,UWB无线通信是一种具备低耗电与高速传输的无线个人区域网络通讯技术。与常见的通信技术使用的连续载波方式不同,UWB采用脉冲信号来传送数据。UWB利用纳秒(ns)至皮秒(ps)级的非正弦波窄脉冲信号传输数据,而时间调变技术令其传输速率可以大大提高。因为使用的是极短脉冲,在高速通信的同时,UWB设备的发射功率却很小,仅仅只有目前的连续载波系统的几百分之一,因此耗电量相对较低。
UWB系统与传统的窄带系统相比,具有穿透力强、功耗低、抗多径效果好、安全性高、系统复杂度低、能提供精确定位精度等优点。UWB可以应用于需要高质量服务的无线通信应用,可以用在无线个人区域网络(WPAN)、家庭网路连接和短距离雷达等领域。UWB将成为解决企业、家庭、公共场所等高速因特网接入的需求与越来越拥挤的频率资源分配之间的矛盾的技术手段。
本申请实施例中,电子设备100通过一个UWB天线,可以实现距离和接收信号强度(receive signal strength indicator,RSSI)的测量。电子设备100通过至少两个UWB天线可以实现到达角度(Angle of arrival,AOA)测量。
在一些实施例中,电子设备处于待机状态时,电子设备100的UWB通信模块可以处于上电状态。
在一些实施例中,电子设备100可以通过蓝牙实现距离和AOA测量。
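上文提到电子设备100通过至少两个UWB天线可实现到达角度(AOA)测量。双天线AOA的一种常见计算方式是:由两根天线接收同一信号的相位差推算入射角。下面给出该几何关系的示意代码(仅为原理性草图,并非本申请的实际实现,参数取值为假设):

```python
import math

def aoa_from_phase(delta_phi, wavelength, antenna_spacing):
    """由双天线接收信号的相位差估算到达角(弧度)。
    delta_phi: 两天线间的相位差(弧度);
    wavelength: 载波波长(米);
    antenna_spacing: 天线间距(米),通常不大于半波长以避免角度模糊。
    到达角满足 sin(theta) = delta_phi * wavelength / (2 * pi * d)。"""
    s = delta_phi * wavelength / (2 * math.pi * antenna_spacing)
    s = max(-1.0, min(1.0, s))  # 裁剪,避免浮点误差导致 asin 越界
    return math.asin(s)
```

例如,当天线间距为半波长时,相位差为 0 对应正前方(0 弧度),相位差为 π 对应 90 度方向。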
电子设备100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备100可以包括1个或N个显示屏194,N为大于1的正整数。
在本申请的一些实施例中,显示屏194中显示有系统当前输出的界面内容。电子设备100通过GPU,显示屏194以及应用处理器等模块之间相互协作,进而在电子设备100的显示屏上显示图像、应用界面、按键、图标、窗口等,实现电子设备的显示功能。例如,界面内容为即时通讯应用提供的界面。
电子设备100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备100可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样,电子设备100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备100的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行电子设备100的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储电子设备100使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。电子设备100可以通过扬声器170A收听音乐,或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当电子设备100接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风170C发声,将声音信号输入到麦克风170C。电子设备100可以设置至少一个麦克风170C。在另一些实施例中,电子设备100可以设置两个麦克风170C,除了采集声音信号,还可以实现降噪功能。在另一些实施例中,电子设备100还可以设置三个,四个或更多麦克风170C,实现采集声音信号,降噪,还可以识别声音来源,实现定向录音功能等。
耳机接口170D用于连接有线耳机。耳机接口170D可以是USB接口130,也可以是3.5mm的开放移动电子设备平台(open mobile terminal platform,OMTP)标准接口,美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。在本申请一些可选的实施例中,压力传感器180A可用于捕获用户手指部位接触显示屏时生成的压力值,并将该压力值传输给处理器,以使得处理器识别用户通过哪个手指部位输入用户操作。
压力传感器180A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以包括至少两个具有导电材料的平行板。当有力作用于压力传感器180A,电极之间的电容改变。电子设备100根据电容的变化确定压力的强度。当有触摸操作作用于显示屏194,电子设备100根据压力传感器180A检测所述触摸操作强度。电子设备100也可以根据压力传感器180A的检测信号计算触摸的位置。在一些实施例中,作用于不同触摸位置,可以对应不同的操作指令。在一些可选的实施例中,压力传感器180A还可根据检测到的信号计算触摸点的数量,并将计算值传输给处理器,以使得处理器识别用户通过单指或多指输入用户操作。
陀螺仪传感器180B可以用于确定电子设备100的运动姿态。在一些实施例中,可以通过陀螺仪传感器180B确定电子设备100围绕三个轴(电子设备的X轴、Y轴和Z轴)的角速度。陀螺仪传感器180B可以用于拍摄防抖。示例性的,当按下快门,陀螺仪传感器180B检测电子设备100抖动的角度,根据角度计算出镜头模组需要补偿的距离,让镜头通过反向运动抵消电子设备100的抖动,实现防抖。陀螺仪传感器180B还可以用于导航,体感游戏场景。
气压传感器180C用于测量气压。在一些实施例中,电子设备100通过气压传感器180C测得的气压值计算海拔高度,辅助定位和导航。
磁传感器180D包括霍尔传感器。电子设备100可以利用磁传感器180D检测翻盖皮套的开合。在一些实施例中,当电子设备100是翻盖机时,电子设备100可以根据磁传感器180D检测翻盖的开合。进而根据检测到的皮套的开合状态或翻盖的开合状态,设置翻盖自动解锁等特性。
加速度传感器180E可检测电子设备100在各个方向上(一般为三轴)加速度的大小。当电子设备100静止时可检测出重力的大小及方向。还可以用于识别电子设备姿态,应用于横竖屏切换,计步器等应用。在本申请一些可选的实施例中,加速度传感器180E可用于捕获用户手指部位接触显示屏(或者用户手指敲击电子设备100的后壳后侧边框)时生成的加速度值,并将该加速度值传输给处理器,以使得处理器识别用户通过哪个手指部位输入用户操作。
本申请实施例中,电子设备100可以通过陀螺仪传感器和/或加速度传感器确定电子设备100的姿态变化,进而识别用户操作。例如,根据电子设备100的姿态变化识别当前用户操作为抬起操作,抬起操作可以为电子设备100从平放在水平方向上(此时电子设备的显示屏194与水平方向平行,抬起角度为与水平方向的夹角,即为0度),用户在预设时间内将电子设备100抬起到竖直方向(此时电子设备的显示屏194与水平方向垂直,抬起角度为与水平方向的夹角,即为90度),此时在预设时间内的抬起变化角度为90度(90度减去0度)。电子设备100检测到在预设时间内的抬起变化角度超过预设角度,则电子设备100可以认为当前用户操作为抬起操作。预设角度例如可以是30度。
在一些实施例中,电子设备100检测到在预设时间内的抬起变化角度超过预设角度,并且在该预设时间内中某一时刻的抬起角度在预设角度范围内,则电子设备100认为当前用户操作为抬起操作。预设角度范围可以是60度~90度。
又例如,电子设备100可以通过陀螺仪传感器和/或加速度传感器确定电子设备100的姿态变化,进而识别静止状态。静止状态可以为电子设备100在预设时间内的陀螺仪传感器检测出的角度变化在预设范围内,并且在该预设时间内的加速度传感器检测出的速度变化小于阈值。
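上述抬起操作的判定条件(预设时间内抬起变化角度超过预设角度,且其间某一时刻的抬起角度落在预设角度范围内)可以用如下示意代码表达(仅为便于理解的草图,采样方式与阈值均为假设值):

```python
def is_raise_gesture(angles, min_change=30.0, target_range=(60.0, 90.0)):
    """判断一次采样序列是否构成抬起操作。
    angles: 预设时间窗内按时间顺序采样的抬起角度(度),
            抬起角度为显示屏与水平方向的夹角;
    min_change: 预设角度,窗口内角度变化需超过该值(示例取30度);
    target_range: 预设角度范围,窗口内需有某一时刻的角度落入其中。"""
    if not angles:
        return False
    change = max(angles) - min(angles)
    hit_range = any(target_range[0] <= a <= target_range[1] for a in angles)
    return change > min_change and hit_range
```

例如,从平放(0度)抬到竖直(90度)的采样序列同时满足两个条件,被判定为抬起操作;小幅晃动则不会触发。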
距离传感器180F,用于测量距离。电子设备100可以通过红外或激光测量距离。在一些实施例中,拍摄场景,电子设备100可以利用距离传感器180F测距以实现快速对焦。
接近光传感器180G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。发光二极管可以是红外发光二极管。电子设备100通过发光二极管向外发射红外光。电子设备100使用光电二极管检测来自附近物体的红外反射光。当检测到充分的反射光时,可以确定电子设备100附近有物体。当检测到不充分的反射光时,电子设备100可以确定电子设备100附近没有物体。电子设备100可以利用接近光传感器180G检测用户手持电子设备100贴近耳朵通话,以便自动熄灭显示屏达到省电的目的。接近光传感器180G也可用于皮套模式,口袋模式自动解锁与锁屏。
环境光传感器180L用于感知环境光亮度。电子设备100可以根据感知的环境光亮度自适应调节显示屏194亮度。环境光传感器180L也可用于拍照时自动调节白平衡。环境光传感器180L还可以与接近光传感器180G配合,检测电子设备100是否在口袋里,以防误触。
指纹传感器180H用于采集指纹。电子设备100可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。
温度传感器180J用于检测温度。在一些实施例中,电子设备100利用温度传感器180J检测的温度,执行温度处理策略。例如,当温度传感器180J上报的温度超过阈值,电子设备100执行降低位于温度传感器180J附近的处理器的性能,以便降低功耗实施热保护。在另一些实施例中,当温度低于另一阈值时,电子设备100对电池142加热,以避免低温导致电子设备100异常关机。在其他一些实施例中,当温度低于又一阈值时,电子设备100对电池142的输出电压执行升压,以避免低温导致的异常关机。
触摸传感器180K,也称"触控面板"。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称"触控屏"。触摸传感器180K用于检测作用于其上或附近的触摸操作,该触摸操作是指用户手部、手肘、触控笔等接触显示屏194的操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备100的表面,与显示屏194所处的位置不同。
骨传导传感器180M可以获取振动信号。在一些实施例中,骨传导传感器180M可以获取人体声部振动骨块的振动信号。骨传导传感器180M也可以接触人体脉搏,接收血压跳动信号。在一些实施例中,骨传导传感器180M也可以设置于耳机中,结合成骨传导耳机。音频模块170可以基于所述骨传导传感器180M获取的声部振动骨块的振动信号,解析出语音信号,实现语音功能。应用处理器可以基于所述骨传导传感器180M获取的血压跳动信号解析心率信息,实现心率检测功能。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。电子设备100可以接收按键输入,产生与电子设备100的用户设置以及功能控制有关的键信号输入。
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照,音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于显示屏194不同区域的触摸操作,马达191也可对应不同的振动反馈效果。不同的应用场景(例如:时间提醒,接收信息,闹钟,游戏等)也可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。
指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和电子设备100的接触和分离。
下面以电子设备202为例介绍本申请实施例提供的又一种电子设备的结构。
图3示例性的示出了本申请实施例提供的电子设备202的结构示意图。其中,电子设备201、电子设备203、电子设备204均可参考图3所示的结构示意图。
如图3所示,电子设备202可以包括:处理器401,存储器402,无线通信处理模块403,天线404,电源开关405,有线LAN通信处理模块406,USB通信处理模块407,音频模块408,显示屏409。其中:
处理器401可用于读取和执行计算机可读指令。具体实现中,处理器401可主要包括控制器、运算器和寄存器。其中,控制器主要负责指令译码,并为指令对应的操作发出控制信号。运算器主要负责保存指令执行过程中临时存放的寄存器操作数和中间操作结果等。具体实现中,处理器401的硬件架构可以是专用集成电路(ASIC)架构、MIPS架构、ARM架构或者NP架构等等。
在一些实施例中,处理器401可以用于解析无线通信模块403和/或有线LAN通信处理模块406接收到的信号,如电子设备100广播的探测请求,等等。处理器401可以用于根据解析结果进行相应的处理操作,如生成探测响应,等等。
在一些实施例中,处理器401还可用于生成无线通信模块403和/或有线LAN通信处理模块406向外发送的信号,如蓝牙广播信号。
存储器402与处理器401耦合,用于存储各种软件程序和/或多组指令。具体实现中,存储器402可包括高速随机存取的存储器,并且也可包括非易失性存储器,例如一个或多个磁盘存储设备、闪存设备或其他非易失性固态存储设备。存储器402可以存储操作系统,例如uCOS,VxWorks、RTLinux等嵌入式操作系统。存储器402还可以存储通信程序,该通信程序可用于与电子设备100,一个或多个服务器,或附件设备进行通信。
无线通信模块403可以包括UWB通信模块403A、蓝牙通信模块403B、WLAN通信模块404C、GPS通信模块404D中的一项或多项。其中,UWB通信模块403A可以集成到芯片(System on Chip,SOC)上,UWB通信模块403A在硬件上(或软件上)也可以与其他通信模块(例如,蓝牙通信模块403B)集成为一体。
在一些实施例中,UWB通信模块403A、蓝牙通信模块403B、WLAN通信模块404C、GPS通信模块404D中的一项或多项可以监听到其他设备(如电子设备100)发射的信号,如测量信号、扫描信号等等,并可以发送响应信号,如测量响应、扫描响应等,使得其他设备(如电子设备100)可以发现电子设备202,并通过UWB、蓝牙、WLAN或红外线中的一种或多种近距离无线通信技术与其他设备(如电子设备100)建立无线通信连接,来进行数据传输。
在另一些实施例中,UWB通信模块403A、蓝牙通信模块403B、WLAN通信模块404C、GPS通信模块404D中的一项或多项也可以发射信号,如广播UWB测量信号,使得其他设备(如电子设备100)可以发现电子设备202,并通过UWB、蓝牙、WLAN或红外线中的一种或多种近距离无线通信技术与其他设备(如电子设备100)建立无线通信连接,来进行数据传输。
无线通信模块403还可以包括蜂窝移动通信模块(未示出)。蜂窝移动通信处理模块可以通过蜂窝移动通信技术与其他设备(如服务器)进行通信。
天线404可用于发射和接收电磁波信号。不同通信模块的天线可以复用,也可以相互独立,以提高天线的利用率。例如:可以将蓝牙通信模块403B的天线复用为WLAN通信模块404C的天线。又例如,UWB通信模块403A使用独立的UWB天线。
本申请实施中,为实现UWB通信,电子设备202至少具有一个UWB天线。
电源开关405可用于控制电源向电子设备202的供电。
有线LAN通信处理模块406可用于通过有线LAN和同一个LAN中的其他设备进行通信,还可用于通过有线LAN连接到WAN,可与WAN中的设备通信。
USB通信处理模块407可用于通过USB接口(未示出)与其他设备进行通信。
音频模块408可用于通过音频输出接口输出音频信号,这样可使得电子设备202支持音频播放。音频模块还可用于通过音频输入接口接收音频数据。电子设备202可以为电视机等媒体播放设备。
显示屏409可用于显示图像,视频等。显示屏409可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED)显示屏,有源矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED)显示屏,柔性发光二极管(flexible light-emitting diode,FLED)显示屏,量子点发光二极管(quantum dot light emitting diodes,QLED)显示屏等等。
在一些实施例中,电子设备202还可以包括RS-232接口等串行接口。该串行接口可连接至其他设备,如音箱等音频外放设备,使得显示器和音频外放设备协作播放音视频。
可以理解的是图3示意的结构并不构成对电子设备202的具体限定。在本申请另一些实施例中,电子设备202可以包括比图示更多或更少的部件,或组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
本申请提供了一种基于增强现实的设备识别方法,电子设备100检测到第一操作后,进入第一界面,电子设备100启动摄像头,在第一界面上实时显示该摄像头采集到的预览图像。电子设备100通过计算机视觉技术,识别出第一界面的预览图像中第二电子设备的种类(例如音箱、电脑、平板、电脑等);同时,电子设备100通过无线定位技术(例如UWB定位、蓝牙定位、WiFi定位、GPS定位等),确定出在电子设备100的通信范围内的第二电子设备的方位信息(例如经纬度信息、或距离电子设备的物理距离和角度)和身份信息(例如设备名称、设备类型、设备属性等)。
电子设备100根据第二电子设备与自身的相对距离和相对角度,以及摄像头的拍摄角度范围,确定出第二电子设备在预览图像中的位置。举例来说,如图4所示,图4中包括有电子设备100和附近设备。附近设备包括电子设备201、电子设备202、电子设备203和电子设备204。第二电子设备可以是附近设备中任意一个电子设备。其中,该图4中示例性的示出了本申请的一些应用场景中,电子设备100与电子设备201、电子设备202、电子设备203和电子设备204在水平面上的位置关系。
在本申请的实施例中,为了便于说明电子设备100与附近设备的位置关系,可以将电子设备100上的一参考点(例如,可以是中心位置点)表示其在平面图中的位置。例如,可以用电子设备100的中心位置点,代表其在水平面中的位置。本申请实施例中,可以将电子设备100的中心位置点为起始点垂直于电子设备100触控屏的上边缘的向量所指方向,作为电子设备100的基准方向,也可以称为电子设备100的0度方向。
因此,如图4所示,电子设备201可以在电子设备100的0度方向1m处,电子设备202可以在电子设备100的顺时针330度方向1.2m处,电子设备203可以在电子设备100的顺时针330度方向0.5m处,电子设备204可以在电子设备100的顺时针30度方向0.8m处。
一般来说,摄像头的拍摄角度的左右夹角在60°~80°的范围,上下夹角在45°左右,根据不同手机品牌和摄像头配置会有一定的变化。若电子设备100的拍摄角度的左右夹角为60°,则可以看出电子设备201、电子设备202、电子设备203和电子设备204都在电子设备100的拍摄范围内。根据不同电子设备的长度、宽度、以及距离电子设备100的物理距离,可以判断出电子设备201、电子设备202、电子设备203和电子设备204在电子设备100的拍摄界面中完全显示还是部分显示。
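据此,判断某个附近设备是否落在摄像头取景范围内,可以简化为比较"设备相对电子设备100基准方向的夹角"与"摄像头水平视场角的一半"。以下为该判断的示意代码(以左右夹角60°为例,忽略上下夹角与设备尺寸;角度约定为假设,顺时针方向按负角处理):

```python
def in_camera_view(device_angle_deg, horizontal_fov_deg=60.0):
    """判断附近设备是否位于摄像头的水平取景范围内。
    device_angle_deg: 设备相对电子设备100的0度基准方向的夹角(度),
                      例如顺时针330度可等价地视为 -30 度;
    horizontal_fov_deg: 摄像头拍摄角度的左右夹角(度)。"""
    half_fov = horizontal_fov_deg / 2.0
    # 先将角度归一化到 (-180, 180] 区间,再与半视场角比较
    a = (device_angle_deg + 180.0) % 360.0 - 180.0
    return -half_fov <= a <= half_fov
```

按图4的位置关系,0度方向的电子设备201、顺时针330度方向的电子设备202和203、顺时针30度方向的电子设备204均落在±30°范围内,与正文"均在拍摄范围内"的结论一致。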
在本申请实施例中,电子设备100的附近设备可以不限于上述图4中的四个电子设备,还可以有更多或更少,图4中只是示例性的以四个电子设备解释本申请,不应构成限定。上述图4中示例性的示出了上述四个电子设备(电子设备201、电子设备202、电子设备203和电子设备204)与电子设备100的相对位置关系,仅仅示例性的解释本申请实施例,不应构成限定。
电子设备100确定出第二电子设备的方位信息后,在预览图像中确定第二电子设备的显示图像以及显示区域,通过增强现实的方式在第一界面实时显示设备图标,用户通过触发该设备图标,电子设备100可以输出第二电子设备的控制界面,实现用户对该第二电子设备的交互。
在一些实施例中,设备图标在第一界面上的显示区域与第二电子设备在预览图像中的显示区域对应。
下面结合应用场景,介绍本申请中提供的一种基于增强现实的设备识别方法。
在图5A-图5F示出的UI实施例中,示例性示出了用户的第一操作触发第一电子设备进入第一界面,电子设备在第一界面上实时显示设备图标的操作过程。
图5A示例性示出了电子设备100上的示例性用户界面510。用户界面510可包括:状态栏511、托盘512以及一个或多个应用程序图标,其中状态栏511可包括:移动通信信号(又可称为蜂窝信号)的一个或多个信号强度指示符513、无线高保真(wireless fidelity,Wi-Fi)信号的一个或多个信号强度指示符514,蓝牙指示符515,电池状态指示符516、时间指示符517。当电子设备100的蓝牙模块为开启状态(即电子设备为蓝牙模块进行供电)时,电子设备100的显示界面上显示蓝牙指示符515。
托盘512具有常用应用程序图标,可展示例如:电话图标、联系人图标、短信图标、相机图标等。一个或多个应用程序图标包括:图库图标、浏览器图标、应用商店图标、设置图标、邮箱图标、云共享图标、备忘录图标。
电子设备100可以启动并同时运行多个应用程序,为用户提供不同的服务或功能。其中,电子设备100同时运行多个应用程序是指,电子设备100已启动多个应用程序,该多个应用程序未关闭,且电子设备100未删除该多个应用程序占用的内存等资源,多个应用程序同时在后台占用内存等资源;而并不要求多个应用程序同时在前台与用户交互。例如,电子设备100先后启动了邮箱、图库和即时通讯这三个应用程序,并同时运行邮箱、图库和即时通讯这三个应用程序。
用户在使用某个应用程序时,若发生应用程序的切换或跳转至桌面进行操作,则电子设备100不会杀死用户之前使用的应用程序,而是将之前使用的应用程序作为后台应用程序保留在多任务队列中。
电子设备100在同时运行多个应用程序的情况下,可以根据多任务队列中的多个应用程序,生成与每个应用程序分别对应的卡片。多任务界面上的多张卡片按照预设的顺序策略横向并列设置。例如,在一种顺序策略中,电子设备100按照运行不同应用程序的时间先后顺序,排列不同应用程序对应的卡片。
电子设备100在检测到指示打开多任务界面520的用户操作后,显示多任务界面520。该多任务界面520包括电子设备100正在运行的多个应用程序分别对应的卡片。其中,指示打开多任务界面的用户操作可以有多种。
示例性的,当电子设备100检测到针对于电子设备100的底部的向上滑动操作时,响应于该操作,如图5B所示,电子设备100显示多任务界面520。
多任务界面520可包括:卡片521、卡片522和删除图标523。其中,卡片521被全部显示,卡片522被部分显示。
删除图标523可以用于关闭当前多任务界面520中显示完整的卡片对应的应用程序。这里的关闭指删除该应用程序占用的内存等资源。一些实施例中,删除图标523可以用于关闭当前多任务界面520中所有卡片对应的应用程序。
需要说明的是,附图仅是示意性说明,附图所示的多任务界面520是指电子设备100的边框内触摸屏上显示的界面,卡片在电子设备100边框内的部分能够被电子设备100的触摸屏显示,卡片在电子设备边框外的部分不能被电子设备100的触摸屏显示。
在上述多任务界面520中,用户可以通过在多任务界面520上左右滑动的方式,对卡片进行切换显示。例如,当电子设备100检测到在多任务界面520上的向右滑动操作时,响应于该操作,多任务界面520上的卡片依次向右移动,此时电子设备100可以完整显示卡片522,部分显示卡片521。当电子设备100检测到在多任务界面520上的向左滑动操作时,响应于该操作,多任务界面520上的卡片依次向左移动,由于卡片521为多任务界面520中从右数第一张卡片,卡片521的右边没有其他卡片,电子设备100完整显示卡片521后,检测到向左滑动操作,如图5C所示,响应于该操作,电子设备100部分显示预设区域524,继续向左滑动,如图5D所示,电子设备100完全显示预设区域524,在一些实施例中,此时,电子设备100触发显示该预设区域524对应的取景界面。该取景界面可以是电子设备100的后置摄像头采集的画面,也可以是前置摄像头采集的画面。
如图5E所示,图5E示例性示出了一种取景界面530。在取景界面530中实时显示通过摄像头采集到的图像;可选的,电子设备100还可以发送带有无线定位技术的探测请求,电子设备100根据接收到的针对该探测请求的探测响应,确定电子设备100的附近设备,进一步的,确定附近设备的设备名称、设备类型、距离电子设备100的物理距离或角度中一个或多个信息。电子设备100对摄像头采集到的图像进行图像识别,识别出图像中的电子设备(例如音箱、电脑、平板电脑等)。图5E中取景界面530的显示内容为摄像头采集的画面,包括设备图像531、设备图像532、设备图像533和设备图像534。
结合上述图4来说,在本申请中,设备图像531为电子设备100拍摄电子设备202在取景界面530中显示的图像;设备图像532为电子设备100拍摄电子设备201在取景界面530中显示的图像;设备图像533为电子设备100拍摄电子设备203在取景界面530中显示的图像;设备图像534为电子设备100拍摄电子设备204在取景界面530中显示的图像。
电子设备100根据电子设备201、电子设备202、电子设备203和电子设备204距离电子设备100的物理距离和角度,确定出每个设备在取景界面530中的对应的设备图像的显示区域。电子设备100通过增强现实的方式在取景界面530实时显示设备图标,设备图标指示了取景界面530中设备图像对应的电子设备。可选的,该设备图标的显示区域与取景界面530中的设备图像对应。
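设备图标(设备标签)在取景界面中的横向显示位置,可以由设备相对于拍摄方向的水平夹角线性映射得到。下面是该映射的一个简化示意(线性针孔近似,忽略镜头畸变;函数名、视场角与屏幕宽度均为假设值):

```python
def label_screen_x(device_angle_deg, horizontal_fov_deg=60.0, screen_width_px=1080):
    """将设备相对拍摄方向的水平夹角映射为设备图标在预览画面中的横向像素位置。
    取景范围最左侧对应 0,最右侧对应 screen_width_px;
    角度先归一化到 (-180, 180] 区间,再裁剪到取景范围内。"""
    half_fov = horizontal_fov_deg / 2.0
    a = (device_angle_deg + 180.0) % 360.0 - 180.0
    a = max(-half_fov, min(half_fov, a))
    return (a + half_fov) / horizontal_fov_deg * screen_width_px
```

例如,位于拍摄方向正前方(0度)的设备,其图标大致落在预览画面横向中点;位于视场边缘的设备则落在画面左右边缘附近。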
在一些可选的方式中,设备图标可以显示在取景界面530的固定位置,也可以与设备图像进行对应显示,例如显示在对应的设备图像的周围,或者显示在对应的设备图像的中心位置等;设备图标与对应的设备图像的显示区域可以完全重叠,也可以部分重叠,也可以不重叠(例如,显示在紧靠该对应的设备图像的显示区域的上方区域)。
如图5F所示,设备图标5311的显示区域与设备图像531完全重叠,设备图标5311指示了设备图像531对应的设备名称为matepad(平板电脑);设备图标5321的显示区域与设备图像532部分重叠,设备图标5321指示了设备图像532对应的设备名称为HUAWEI soundX(华为音箱);设备图标5331的显示区域与设备图像533完全重叠,设备图标5331指示了设备图像533对应的设备名称为matebook(电脑);设备图标5341的显示区域与设备图像534部分重叠,设备图标5341指示了设备图像534对应的设备名称为matebook(电脑)。
本申请中,设备图标也可称为设备标签。当设备201称为第二电子设备时,设备图标5321可也称为第一标签。
在一些实施例中,设备图标的显示区域与设备的定位芯片(例如UWB芯片、蓝牙芯片)在设备图像中的位置对应。电子设备100接收到来自电子设备201的定位芯片的探测响应,确定出电子设备201的定位芯片的方位(距离电子设备100的物理距离和角度)。根据该电子设备201的定位芯片的方位,电子设备100确定出电子设备201的定位芯片在取景界面530中的对应的位置,电子设备100将设备201的设备图标显示在该对应的位置。例如设备图标5311显示在设备图像531对应的电子设备内部的定位芯片的位置。设备图标5321和设备图标5331同理。
在一些实施例中,设备的定位芯片的位置不在取景界面530中,例如设备图像534对应的电子设备204的定位芯片不在取景界面530中。电子设备100可以根据电子设备204的位置及设备尺寸,推算出电子设备204外观关键点(如屏幕四个角)相对于电子设备100的物理距离和方位,当电子设备100拍摄到一个或多个外观关键点,电子设备100将设备图标显示在取景界面530中。
在一些应用场景中,设备图标不仅可以指示设备图像对应的设备的身份信息,设备图标还可以关联设备图像对应的设备的控制卡片。当电子设备100检测到针对于设备图标的用户操作,电子设备100输出该设备图标对应的设备的控制卡片。如图5G所示,当电子设备100检测到针对于设备图标5321的用户操作,该设备图标5321关联的电子设备为HUAWEI soundX,如图5H所示,电子设备100输出HUAWEI soundX的控制卡片540。该控制卡片540中可包括以下中的一个或多个:应用程序标题栏601,连接卡片602,音乐卡片603,投音卡片604、刷新控件605,关闭控件606。其中:
应用程序标题栏601指示了该控制卡片540的设备为HUAWEI soundX。
连接卡片602可以包括指示信息602A和连接方式602B。其中,指示信息602A用于表征设备图像532对应的设备(电子设备201)当前是在线状态还是离线状态。在线状态指电子设备201当前已连接到互联网,离线状态指电子设备201当前未连接到互联网。连接方式602B用于指示电子设备201与电子设备100当前的连接方式,当电子设备201与电子设备100当前的连接方式为蓝牙时,连接方式602B可以展现于蓝牙的图标。当电子设备201与电子设备100当前的连接方式为WiFi时,连接方式602B可以展现于WiFi的图标。
音乐卡片603可以包括音乐名称603A,暂停控件603B,上一个控件603C,下一个控件603D,进度条603E,音量603F,更多控件603H。
暂停控件603B可接收用户的输入操作(例如,单击操作),响应于检测到的用户操作,电子设备201暂停播放音乐。
上一个控件603C可接收用户的输入操作(例如,单击操作),响应于检测到的用户操作,电子设备201可以播放音乐列表中当前播放歌曲的上一首歌曲。
下一个控件603D可接收用户的输入操作(例如,单击操作),响应于检测到的用户操作,电子设备201可以播放音乐列表中当前播放歌曲的下一首歌曲。
进度条603E的可以指示当前歌曲的总时长(例如,04:42)和已播放时长(例如,00:42)。
音量603F可接收用户的输入操作(例如,滑动操作),响应于检测到的用户操作,电子设备201调整电子设备201的播放音量。
更多控件603H可接收用户的输入操作(例如,滑动操作),响应于检测到的用户操作,电子设备100可以显示音乐卡片的更多功能选项,例如,分享、删除、下载等。
投音卡片604用于指示电子设备100将音频输出到电子设备201。当电子设备100检测到针对于投音卡片604的用户操作,响应于该操作,电子设备100的音频输出到电子设备201。
刷新控件605用于刷新当前控制卡片540的显示界面,电子设备100重新获取设备201的当前状态。
关闭控件606用于关闭该控制卡片540,当电子设备100检测到针对于控件606的用户操作,卡片540关闭,电子设备100显示如图5G的取景界面530。
除了图5H所示的设备图像532的控制卡片540的方式,电子设备100还可以有其他方式与设备图像532进行交互,此处不作具体限定。例如,当电子设备100检测到针对于设备图标5321的用户操作,电子设备100可以直接打开并跳转到设备图像532对应的电子设备所关联的应用软件,显示设备图像532的应用软件的应用界面,比如智慧生活、运动健康等应用软件。
本申请中,取景界面530又可称为第一界面。电子设备100通过计算机识别技术和无线定位技术,确定电子设备100的附近设备的方位信息,并在取景界面530的预览图像中确定附近设备的显示图像以及显示区域,通过增强现实的方式在拍摄界面实时显示设备图标,达到实时预览的效果。用户可以通过触发该设备图标,电子设备100输出对应的电子设备的控制界面,实现用户对附近设备的交互。
在一些应用场景中,电子设备100没有应用程序在后台运行,多任务队列中没有应用程序运行,即多任务界面520中不包括卡片521、卡片522。当电子设备100显示用户界面510,检测到在电子设备100的底部的向上滑动操作时,响应于该操作,电子设备100显示多任务界面。由于多任务界面中没有卡片,则电子设备100直接进入取景界面530。电子设备100启动摄像头,通过摄像头实时采集图像显示在取景界面530上。
在一些应用场景中,电子设备100进入取景界面530后,当取景界面530中只有一个设备,则用户无需点击设备图标,电子设备100可以直接进入该设备的控制界面。示例性的,如图6A所示,图6A中的取景界面530中包括设备图像532,设备图标5321显示在设备图像532的附近,设备图标5321与设备图像532的显示区域部分重叠。当电子设备100检测到在该取景界面530中有唯一一个设备,如图6B所示,电子设备100直接输出该设备图像532的控制卡片540。这种实现方式,当取景界面530中有唯一一个设备图像,则可以认为用户想要针对该设备图像对应的电子设备进行交互,则电子设备100省略用户的触发操作,直接进入该电子设备的控制界面,提升了用户体验。
本申请中,上述图5A~图5D所示的进入取景界面530的方式为可选的,电子设备100还可以有其他方式进入取景界面530。例如,图7A和图7B还提供了一种进入取景界面530的方式。
如图7A所示,图7A中显示了用户界面510,其中,用户界面510的描述可以参考上述图5A中的相关描述。当电子设备100检测到针对于电子设备的底部左侧的用户操作,或者当电子设备100检测到针对于电子设备的底部右侧的用户操作,电子设备100显示如图7B所示的用户界面710。
该用户界面710中可以包括以下中的一个或多个:接续设备选择栏701、控件702A、控件702B、设备显示栏703以及实时取景控件704。其中,
接续设备选择栏701包括一个或多个附近设备的设备选项(也可称为设备图标)。如智慧屏、matepad、matebook、音箱等。接续设备选择栏701中显示的设备选项可用于触发分享的操作。响应于检测到的作用于设备选项的操作(如在设备图标上的点击操作),电子设备100可以触发分享已选定的数据或任务至该操作选定的设备选项对应的设备的过程。该过程可包括:电子设备100与已选定的设备选项对应的设备建立通信连接,然后通过该通信连接向该设备选项对应的设备传输已选定的数据或任务。
控件702A指示了一种预设模式,在该预设模式下,可以对一个或多个设备进行统一的控制。例如,该预设模式为回家模式,在回家模式下,设备图标703B、设备图标703C和设备图标703F对应的电子设备自动开启,设备图标703A、设备图标703D对应的电子设备自动关闭。
控件702B指示了另一种预设模式,在该预设模式下,可以对一个或多个设备进行统一的控制。例如,该预设模式为离家模式,在离家模式下,设备图标703B、设备图标703C和设备图标703F对应的电子设备自动关闭,设备图标703A、设备图标703D对应的电子设备自动开启。
设备显示栏703中包括多个设备图标。例如华为AI音箱703A、智能电视703B、空气净化器703C、智能台灯703D、蓝牙耳机703E、空调伴侣703F。设备显示栏703显示的多个设备图标中的任一设备图标可接收用户的输入操作(例如,单击操作),响应于检测到的输入操作,电子设备100显示该设备的控制界面。
其中,空气净化器703C中包括控件7031,该控件7031用于控制空气净化器703C的开启和关闭。智能台灯703D和空调伴侣703F也包括与控件7031相同的控件。华为AI音箱703A和智能电视703B等设备不能通过用户界面710控制开启和关闭。
实时取景控件704,用于触发进入取景界面。当电子设备100检测到针对于实时取景控件704的用户操作,电子设备100显示如图5F所示的取景界面530;可选的,电子设备100显示如图5E所示的取景界面530,再显示如图5F所示的取景界面530。
在一些实施例中,用户界面710中的实时取景控件704是可选的,电子设备100可以不显示该实时取景控件704。当电子设备100显示用户界面710,电子设备100检测到抬起操作时,电子设备100可以显示取景界面530。如图7C所示,图7C示例性的示出了一种抬起操作,在T1时刻,电子设备100显示用户界面710,当电子设备100检测到抬起操作,在T2时刻,电子设备100显示取景界面530,其中T1时刻和T2时刻之间的时间间隔小于阈值。
可以理解的,抬起操作只是一种示例性的用户操作,电子设备100还可以通过其他用户操作进入取景界面530。
不限于上述打开取景界面530的方式,本申请还可以通过例如相机应用启动摄像头,从而进入取景界面530;或者通过其他应用程序,如即时通讯应用、支付应用等,触发进入取景界面530;等等。
本申请中,上述图5F所示的取景界面530中的设备图标5311、设备图标5321、设备图标5331和设备图标5341的显示形式为可选的。图8A~图8D还提供了一种设备图标的显示形式,设备图标可以随着取景界面中显示内容的变化而变化,在第一时刻时,设备的显示区域在取景界面中的第一位置,该设备的设备图标显示在取景界面的第一位置内或紧靠第一位置;在第二时刻时,该设备的显示区域在取景界面中的第二位置,则该设备的设备图标显示在取景界面的第二位置内或紧靠第二位置。
如图8A所示,图8A中显示了用户界面530,其中,关于图8A的描述可以参考上述图5F的相关描述。示例性的,在图8A的取景界面530中包括设备图像534,设备图标5341显示在取景界面530中设备图像534的附近,设备图标5341与设备图像534的显示区域部分重叠;取景界面530中包括设备图像531,设备图标5311显示在取景界面530中设备图像531的附近,设备图标5311与设备图像531的显示区域完全重叠。
可以看出,设备图标的显示区域与其对应的设备图像的显示区域对应。
在一些实施例中,在取景界面中显示内容不断变化时,电子设备100不显示设备图标,直到电子设备100的静止状态的持续时间超过预设时间,电子设备100根据取景界面中的显示内容,显示设备图标。具体的,电子设备100可以通过加速度传感器和/或陀螺仪传感器确定电子设备100是否处于静止状态。
在一些实施例中,设备图标的显示区域与其他设备图标的显示区域有关,例如,设备图标之间的显示区域互不遮挡。
相比于图8A,图8B的拍摄方向或角度与图8A不同,在图8B中的取景界面810中,设备图像534的显示部分更多,设备图标5341与设备图像534的显示区域完全重叠。
相比于图8A,在图8C的取景界面820中,设备图像531与设备图像533部分重叠,设备图标5311显示在设备图像531的上方,紧靠设备图像531的显示区域。设备图标5311与设备图像531的显示区域不重叠。
可以看出,设备图标的显示区域可以随着设备图像的显示区域的变化而变化,例如,根据设备图像在取景界面530中的显示区域的变化,设备图标的显示区域可以是设备图像的显示区域的中心位置(或任意位置);设备图标的显示区域可以是紧靠(或紧邻)设备图像的显示区域的上方(下方、左方、右方)等。
在一些实施例中,当设备不在摄像头的拍摄范围内,电子设备100的取景界面中不包括该设备的设备图像。该设备的设备图标可以以特定的方式显示在取景界面中。
电子设备100进入到取景界面,启动摄像头,在取景界面中实时显示通过该摄像头采集到的图像;同时发送带有无线定位技术的探测请求,电子设备100根据接收到的针对该探测请求的探测响应,确定电子设备100的附近设备,进一步的,确定附近设备的设备名称、设备类型、距离电子设备的物理距离或角度中一个或多个信息。电子设备100对摄像头采集到的图像进行图像识别,识别出图像中的电子设备(例如音箱、电脑、平板电脑等)。
若电子设备100接收到四个探测响应,检测到附近有四个电子设备,该探测响应携带设备的身份信息,例如设备名称、设备类型等信息。电子设备100确定出该四个电子设备的设备名称、设备类型等,例如分别为matepad(设备类型:平板电脑)、HUAWEI soundX(设备类型:音箱)、matebook(设备类型:电脑)、和matebook(设备类型:电脑);并且通过无线定位技术确定出该四个电子设备的方位信息(与电子设备100的物理距离和角度)。
电子设备100采集到的图像中只有三个电子设备的图像。电子设备100通过该四个电子设备的方位信息判断出其中一个电子设备不在电子设备100的摄像头的拍摄范围内,或者电子设备100通过计算机视觉技术识别出该三个电子设备的设备类型,结合该四个电子设备的设备类型,确定出不在图像中的那个电子设备和设备类型;则电子设备100将不在图像中的电子设备的设备图标以第一预设方式显示在图像中。该第一预设方式例如可以是显示在取景界面的固定位置,又例如可以显示在与方位信息相关的位置。
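上述“将探测响应得到的设备列表与图像识别结果按设备类型逐一抵消,从而找出不在图像中的设备”的判断逻辑,可以用如下示意代码表达(数据结构为本文之外的假设):

```python
def find_missing_devices(detected, recognized_types):
    """detected: 由探测响应得到的 [(设备名称, 设备类型)] 列表;
    recognized_types: 计算机视觉从取景画面中识别出的设备类型列表。
    将两边的设备类型逐一抵消, 剩余未匹配的探测响应即对应
    不在图像中(被遮挡或在拍摄范围之外)的设备。"""
    remaining = list(recognized_types)
    missing = []
    for name, dtype in detected:
        if dtype in remaining:
            remaining.remove(dtype)
        else:
            missing.append((name, dtype))
    return missing
```

例如,探测到四个设备而画面中只识别出三个时,抵消后剩下的那一项即为需要以第一预设方式显示设备图标的设备。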
示例性的,如图8A所示,在图8A的取景界面530中设备图像531为设备202的部分显示图像,设备图标5311显示在取景界面530中设备图像531的附近,设备图标5341与设备图像534的显示区域完全重叠。
相比于图8A,图8B的拍摄方向或角度与图8A不同,此时设备202不在摄像头的拍摄范围内,在图8B中的取景界面810中,不包括设备202的设备图像。取景界面810显示图标801和提示符802。其中,提示符802用于提示用户该图标801为特殊图标;图标801显示在电子设备100的取景界面810的左边缘,提示用户在电子设备100的摄像头的拍摄范围之外,存在设备matepad。可选的,该图标801可以触发电子设备100显示设备matepad的控制界面。
在一些实施例中,图标801或提示符802可以指示设备的方位(包括角度、距离等)。例如,图标801显示在电子设备的取景界面810的左边缘,提示用户在电子设备100的摄像头的拍摄范围之外,在电子设备100的左边有设备matepad。可选的,还可以通过文本的方式指 示设备的方位。
本申请中,当上述设备matepad称为第三电子设备时,图标801或提示符802也可称为第三标签。
在一些实施例中,当设备被其他物体遮挡,电子设备100的取景界面中不包括该设备的设备图像。该设备的设备图标可以以特定的方式显示在取景界面中。
若电子设备100接收到四个探测响应,检测到附近有四个电子设备,该探测响应携带设备的身份信息,例如设备名称、设备类型等信息。电子设备100确定出该四个电子设备的设备名称、设备类型等,例如分别为matepad(设备类型:平板电脑)、HUAWEI soundX(设备类型:音箱)、matebook(设备类型:电脑)、和matebook(设备类型:电脑);并且通过无线定位技术确定出该四个电子设备的方位信息(与电子设备100的物理距离和角度)。
电子设备100采集到的图像中只有三个电子设备的图像。电子设备100通过该四个电子设备的方位信息,检测到该四个电子设备均在电子设备100的摄像头的拍摄范围内,则判断有电子设备被遮挡。电子设备100通过计算机视觉技术识别出图像中三个电子设备的设备类型,结合该四个电子设备的设备类型,确定出被遮挡的那个电子设备和设备类型;电子设备100将被遮挡的电子设备的设备图标以第二预设方式显示在图像中。该第二预设方式例如可以是显示在取景界面的固定位置,又例如可以是显示在与方位信息相关的位置。
示例性的,如图8C所示,相比于图8A,在图8C中的取景界面820中,不包括设备图像532。取景界面820显示图标803和提示符804。其中,提示符804用于提示用户该图标803为特殊图标;图标803显示在电子设备100的取景界面820的中间区域,提示用户在电子设备100的摄像头的拍摄范围之内,有设备HUAWEI soundX。可选的,该图标803可以触发电子设备100显示设备图像532(即HUAWEI soundX)的控制界面。
在一些实施例中,图标803或提示符804可以指示设备的方位(包括角度、距离等)。例如,图标803显示在电子设备100的取景界面820中设备图像5331的上方,用于提示用户,设备HUAWEI soundX被设备图像5331对应的设备203遮挡。可选的,还可以通过文本的方式指示设备的方位(例如HUAWEI soundX在设备图像5331的正后方);或者通过文本的方式指示设备被遮挡。
可选的,图标803的显示区域不与其他设备图像和设备图标的显示区域重叠。电子设备100根据取景界面820中显示的设备图像531、设备图像533、设备图像534、设备图标5311、设备图标5331、设备图标5341的显示区域,确定图标803的显示区域。
本申请中,当上述设备HUAWEI soundX称为第三电子设备时,图标803或提示符804也可称为第二标签。
在一些实施例中,若其他设备没有可被识别身份的无线定位技术,电子设备100通过计算机视觉识别该其他设备的类型(如手机、平板、电视、音箱等),寻找与电子设备100登录同一账号的设备是否存在对应的设备类型。
举例来说,若电子设备100接收到三个探测响应,检测到附近有三个电子设备,该探测响应携带设备的身份信息,例如设备名称、设备类型等信息。电子设备100确定出该三个电子设备分别为matepad(设备类型:平板电脑)、HUAWEI soundX(设备类型:音箱)、matebook(设备类型:电脑);并且通过无线定位技术确定出该三个电子设备的方位信息(与电子设备100的物理距离和角度)。
电子设备100采集到的图像中有四个电子设备的图像。电子设备100通过计算机视觉识别技术确定出该四个电子设备的图像在取景界面中的显示区域,以及确定出该四个电子设备的设备类型分别为平板电脑、音箱、电脑、以及电脑。则电子设备100寻找与电子设备100登录同一账号的设备中是否存在电脑。对于每个电子设备来说都有自己的登录账号,一个账号可以绑定一个或多个电子设备,电子设备100在自己的账号下寻找是否存在绑定了设备类型为电脑的电子设备。若存在,则电子设备100认为该电脑与图像中的设备图像存在关联关系。电子设备100将该电脑的设备图标以预设方式显示在图像中。该预设方式例如可以是显示在取景界面的固定位置,又例如可以是显示在与图像中设备图像的显示区域相关的位置。
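上述“在同一账号下绑定的设备中,按设备类型查找可能关联的目标设备”的查找过程,可示意如下(账号设备列表的数据结构为假设):

```python
def match_by_account(unmatched_type, account_devices):
    """account_devices: 与电子设备100登录同一账号下绑定的
    [(设备名称, 设备类型)] 列表。返回第一台类型匹配的设备名称;
    不存在时返回 None, 表示无法建立关联关系。"""
    for name, dtype in account_devices:
        if dtype == unmatched_type:
            return name
    return None
```

查找命中时,电子设备100即可为对应的设备图像输出带不确定关联关系标识的设备图标。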
示例性的,如图8D所示,相比于图8A,取景界面830中包括设备图标805和提示符806。其中,提示符806用于指示该设备图标805为不确定图标,指示了该设备图像533对应的设备与设备图标805存在不确定的关联关系。
本申请中,当上述设备图像533对应的设备称为第四电子设备时,设备图标805或提示符806也可称为第四标签。
在一些实施例中,若其他设备没有可被识别身份的无线定位技术,电子设备100通过计算机视觉识别设备的类型(如手机、平板、电视、音箱等),并且通过电子设备100自身的GPS信息,寻找与电子设备100在同一地理位置的设备是否存在对应的设备类型。
举例来说,若电子设备100接收到三个探测响应,检测到附近有三个电子设备,该探测响应携带设备的身份信息,例如设备名称、设备类型等信息。电子设备100确定出该三个电子设备分别为matepad(设备类型:平板电脑)、HUAWEI soundX(设备类型:音箱)、matebook(设备类型:电脑);同时通过无线定位技术确定出该三个电子设备的方位信息(与电子设备100的物理距离和角度)。
电子设备100采集到的图像中有四个电子设备的图像。电子设备100通过计算机视觉识别技术确定出该四个电子设备的图像在取景界面中的显示区域,以及确定出该四个电子设备的设备类型分别为平板电脑、音箱、电脑、以及电脑。则电子设备100寻找与电子设备100在同一个地理位置的电子设备是否存在电脑。
在每个电子设备的配置界面,可以包括地理位置的配置。例如,电子设备100与智能台灯配对连接时,用户在该智能台灯关联的应用软件(例如智慧生活)中,将某智能台灯的设备位置配置为房间;电子设备100与智能音箱配对连接时,用户在该智能音箱关联的应用软件(例如智慧生活)中,将智能音箱的设备位置配置为客厅;电子设备100与电脑配对连接时,用户在该电脑关联的应用软件(例如智慧生活)中,将电脑的设备位置配置为公司;等等。电子设备100根据自身的地理位置,确定出所处的区域,例如电子设备100通过GPS定位获取自身的位置在公司,则将设备位置配置在公司的电子设备中寻找是否存在设备类型为电脑的电子设备。若存在,则电子设备100认为该电脑与图像中的设备图像存在关联关系。电子设备100将该电脑的设备图标以预设方式显示在图像中。该预设方式例如可以是显示在取景界面的固定位置,又例如可以是显示在与图像中设备图像的显示区域相关的位置。该部分内容可参考上述图8D的相关描述。
本申请中,当上述设备图像533对应的设备称为第五电子设备时,设备图标805或提示符806也可称为第五标签。
在一些实施例中,若其他设备没有可被识别身份的无线定位技术,并且电子设备100无法正确识别两个相同类型设备的位置信息,电子设备输出两个标签供用户选择。
举例来说,电子设备100采集到的图像中有两个电子设备的图像。电子设备100通过计算机视觉识别技术确定出该两个电子设备的图像在取景界面中的显示区域,以及确定出该两个电子设备的设备类型都为音箱。电子设备100没有接收到探测响应,无法确定出这两个音箱的方位。
电子设备100可以采用上述的两个实施例所描述的方式,寻找与电子设备100登录同一账号的设备是否存在对应的设备类型;或者通过电子设备100自身的GPS信息,寻找与电子设备100在同一地理位置的设备是否存在对应的设备类型。若电子设备100根据该两种方式确定出一个设备类型为音箱的设备,电子设备100将该音箱的设备图标以预设方式显示在图像中。该预设方式例如可以是显示在取景界面的固定位置,又例如可以是显示在与图像中设备图像的显示区域相关的位置。
若电子设备100根据该两种方式确定出两个设备类型为音箱的设备,由于电子设备100无法将该两个音箱的设备图标与图像中的两个音箱图像一一对应,则电子设备100将该两个音箱的设备图标以预设方式显示在图像中。该预设方式例如可以是显示在取景界面的固定位置,又例如可以是以一个控件的形式呈现在取景界面,当电子设备100检测到针对于该控件的用户操作,电子设备100输出两个设备图标供用户选择。
本申请还示出了一种设备图标的显示形式,可以实现设备图标之间的显示区域不重叠的效果。如图8E所示,图8E中示出了一种通过引线将设备图标显示在设备图像的显示区域的上方区域的显示形式。如图8E所示,设备图像531和设备图标5311通过一根线段连接起来,指示了该设备图标5311与设备图像531对应;设备图像532和设备图标5321通过一根线段连接起来,指示了该设备图标5321与设备图像532对应;设备图像533和设备图标5331通过一根线段连接起来,指示了该设备图标5331与设备图像533对应;设备图像534和设备图标5341通过一根线段连接起来,指示了该设备图标5341与设备图像534对应。
在一些实施例中,当电子设备100检测到设备图标之间的显示区域存在互相重叠,或者两个设备图标之间的显示区域在取景界面中的最近距离小于阈值,则电子设备100输出图8E所示的设备图标,使设备图标之间的显示区域不重叠。
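上述“检测设备图标之间是否重叠、或最近间距是否小于阈值”的判断,可以用如下示意代码表达(矩形表示方式与阈值均为假设):

```python
def need_leader_lines(rects, min_gap=16):
    """rects: 各设备图标的显示区域 [(左, 上, 右, 下)] (像素)。
    当任意两个图标区域重叠, 或最近间距小于 min_gap 像素时返回 True,
    提示应改用图8E所示的引线式布局。min_gap 为假设的阈值。"""
    def gap(a, b):
        # 两矩形在 x/y 方向上的间隙, 重叠时为 0
        dx = max(b[0] - a[2], a[0] - b[2], 0)
        dy = max(b[1] - a[3], a[1] - b[3], 0)
        return (dx * dx + dy * dy) ** 0.5
    for i in range(len(rects)):
        for j in range(i + 1, len(rects)):
            if gap(rects[i], rects[j]) < min_gap:
                return True
    return False
```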
基于上述取景界面530,本申请还提供了一种数据传输方法,用户可以将选择的数据(例如图片、文档、视频等),在取景界面530上,通过滑动操作(或点击操作等)快捷的分享到其他设备上。这样,可以简化用户分享数据的操作步骤,提高分享数据给其他设备的效率。下面分别以三个应用场景为例,详细说明该数据传输方法。
应用场景一,在图9A-图9E示例性示出的UI实施例中,用户可以在多任务界面中触发基于增强现实显示的分享功能,将该多任务界面中的应用程序或应用程序的数据分享到其他设备。
如图9A所示,图9A中显示了用户界面520,其中,关于用户界面520的描述可以参考上述图5B的相关描述。示例性的,当电子设备100检测到针对于卡片521的长按操作901时,电子设备进入卡片521对应的分享界面。电子设备100将卡片521对应的应用程序以及卡片521当前界面中可分享的数据类型提取出来,以图标的方式呈现在分享界面上。
如图9B所示,电子设备100启动摄像头,通过摄像头实时采集图像显示在分享界面920上,分享界面920中的显示内容包括摄像头采集的画面。图9B示例性示出了分享界面920,该分享界面920中包括设备图像531、设备图像532、设备图像533和设备图像534。
分享界面920中,设备图像531、设备图像532、设备图像533、设备图像534、设备图标5311、设备图标5321、设备图标5331、设备图标5341的具体描述可以参考图5F中设备图像531、设备图像532、设备图像533、设备图像534、设备图标5311、设备图标5321、设备图标5331、设备图标5341的相关描述,此处不再赘述。
分享界面920中还可以包括一个或多个图标,该一个或多个图标中每个图标标识了一种可分享的数据,例如应用程序图标902和文件图标903。其中,应用程序图标902与卡片521的应用程序相关联;文件图标903与卡片521中“小说1”的PDF文档相关联。
用户可以通过拖拽的方式,将图标拖拽到相应的设备的显示区域上,用户松手后,电子设备100将该图标对应的数据,发送到该设备上。如图9C所示,用户选中文件图标903后,在分享界面920上拖拽该文件图标903,将该文件图标903拖拽到设备图像534的有效区域。该有效区域为可以指示电子设备100向设备图像534对应的电子设备(电子设备204)分享数据的区域。用户松手后,电子设备100通过无线通信方式将该文件图标903相关联的“小说1”的PDF文档,发送到设备图像534对应的电子设备(电子设备204)。
具体的,无线通信方式包括但不限于Zig-Bee、蓝牙(Bluetooth)、无线宽带(Wi-Fi)、超宽带(UWB)、近场通信(NFC)、Wi-Fi直连(Wi-Fi Direct)等等。
在一些实施例中,当用户将该文件图标903拖拽到设备图像534的显示区域,电子设备100在分享界面920上将设备图像534的显示区域的亮度提高,以指示用户当前文件图标903拖拽到了设备图像534的有效区域。
在一些实施例中,用户将该文件图标903拖拽到设备图标5341的显示区域,用户松手后,电子设备100通过无线通信方式将该文件图标903相关联的“小说1”的PDF文档,发送到设备图像534对应的电子设备(电子设备204)。
如图9D所示,电子设备204接收到电子设备100发送的“小说1”的PDF文档,在电子设备204的显示界面1000上输出提示框1001,该提示框1001的文字内容可以为“接收到来自电子设备100的PDF文件,点击该提示框查看”。当电子设备204检测到针对于该提示框1001的点击操作,电子设备204打开“小说1”的PDF文档,如图9E所示,电子设备204的显示界面1002上显示“小说1”的PDF文档。在一些实施例中,图9D为可选的,电子设备204接收到电子设备100发送的“小说1”的PDF文档,电子设备204直接打开该文档,如图9E所示。
在本申请中,待分享图标也可称为第一图标。用户选中文件图标903后,在分享界面920上拖拽该文件图标903,将该文件图标903拖拽到设备图像534的有效区域,其中这个拖拽操作也可称为第三操作。
在一些实施例中,电子设备100可以根据用户想要分享的数据类型,判断出目标设备是否能够支持输出该数据类型。若不支持,输出提示信息,提示用户选择该目标设备之外的其他设备。
如图10A和图10B所示,用户通过拖拽的方式,将文件图标903拖拽到设备图像532的显示区域。由于设备图像532对应的电子设备201的设备类型为音频设备,且电子设备201的设备属性不包括显示功能,则当电子设备100检测到用户将文件图标903拖拽到设备图像532的显示区域,电子设备100输出提示信息1100“HUAWEI soundX无法执行该任务”,指示该设备图像532对应的电子设备无法输出该文件图标903对应的PDF文档。可选的,当用户将文件图标903拖拽到设备图像532的显示区域,并松手后,电子设备100输出提示信息1100。
不限于上述图10A和图10B所示的方式,在一些实施例中,可以通过设备图标的显示形式来提示用户针对于数据分享的可选择的设备。
如图10C所示,用户选中了文件图标903,由于文件图标903关联了“小说1”的PDF文档,则当电子设备100检测到该文件图标903被选中,分享界面920中的设备图标5311、设备图标5331和设备图标5341的显示区域提亮(或改变图标颜色等);可选的,设备图像531、设备图像533和设备图像534的显示区域提亮。这标识了设备图标5311、设备图标5331和设备图标5341所指示的设备图像531、设备图像533和设备图像534分别对应的电子设备202、电子设备203、电子设备204,是支持输出该文件图标903所关联的PDF文档的设备,提示用户可以将文件图标903拖拽到这几个设备的显示区域中,进行数据分享。
可选的,相比于设备图标5311、设备图标5331和设备图标5341,设备图标5321的显示区域的亮度(或颜色等)不同,指示了设备图标5321对应的电子设备201不支持输出该文件图标903所关联的PDF文档,提示用户不要将文件图标903拖拽到设备图像532的显示区域中。
本申请中,设备图标5311、设备图标5331和设备图标5341的显示形式也可称为第一显示形式,设备图标5321的显示形式也可称为第二显示形式;设备图标的显示形式还可以有更多形式,本申请不作限制。
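上述“按待分享数据的类型筛选可输出该数据的目标设备,并据此决定设备图标以第一显示形式还是第二显示形式显示”的逻辑,可以用如下示意代码表达(能力表与数据类型到能力的映射均为示例性假设):

```python
# 以下能力表与映射均为示例性假设, 并非本申请限定的设备属性定义
DEVICE_CAPABILITIES = {
    "音箱": {"audio"},
    "电脑": {"audio", "display"},
    "平板电脑": {"audio", "display"},
}
REQUIRED_CAPABILITY = {"pdf": "display", "图片": "display", "音乐": "audio"}

def can_share(data_type, device_type):
    """判断目标设备类型是否具备输出该数据类型所需的能力:
    具备则其设备图标提亮(第一显示形式), 否则以第二显示形式显示。"""
    need = REQUIRED_CAPABILITY[data_type]
    return need in DEVICE_CAPABILITIES.get(device_type, set())
```

例如PDF文档需要显示能力,因此音箱类设备会被筛除,对应前述“HUAWEI soundX无法执行该任务”的提示。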
应用场景二,在图11A-图11D示例性示出的UI实施例中,用户可以通过截屏操作触发基于增强现实显示的分享功能,将截屏图像分享到其他设备。
如图11A所示,图11A显示了用户界面1110,可选的,该用户界面1110可以是电子设备中任意一个显示界面。当电子设备100显示用户界面1110时,接收到截屏操作,则电子设备100采集当前界面的显示内容,生成图片文件。其中,截屏操作可以是通过一个或多个虚拟按键触发实现,也可以是通过一个或多个实体按键触发实现。
如图11B所示,电子设备100接收到截屏操作,采集当前界面的显示内容,生成图片文件。在当前的用户界面1110上显示截屏缩略图1111。该截屏缩略图1111关联了对应的图片文件,如图11C所示,用户针对该截屏缩略图1111进行长按,当电子设备100检测到针对于截屏缩略图1111的长按操作,触发分享功能,电子设备显示如图11D所示的分享界面1120。电子设备启动摄像头,通过摄像头实时采集图像显示在分享界面1120上,分享界面1120中的显示内容包括摄像头采集的画面。如图11D所示,图11D示例性示出了分享界面1120,该分享界面1120中包括设备图像531、设备图像532、设备图像533和设备图像534。
分享界面1120还包括截屏缩略图1111。用户可以自由拖拽该截屏缩略图1111,当用户将该截屏缩略图1111拖拽到分享界面1120中任意一个设备的显示区域上,用户松手后,电子设备100向该设备发送该截屏缩略图1111关联的图片文件。
需要说明的是,基于同一发明构思,本发明实施例中提供的用户拖拽截屏缩略图1111到其他设备的显示区域分享的原理,与用户拖拽文件图标903到其他设备的显示区域分享相似,因此用户拖拽截屏缩略图1111到其他设备的显示区域分享的实施,可以参见用户拖拽文件图标903到其他设备的显示区域分享的实施对应的相应描述,例如可以参考图9C~图9E所示的实施方式以及对应描述,在此不再赘述。
应用场景三,在图12A-图12E示例性示出的UI实施例中,在电子设备检测到选择图片进行分享的操作时,用户可以触发基于增强现实显示的分享功能,将一个或多个图片文件分享到其他设备。
图12A示例性示出了一种用户界面1210。如图12A所示,用户界面1210可包括以下区域中的一个或多个:区域1201、区域1202和区域1203。其中:
区域1201可用于显示图库中的一个或多个图片,这一个或多个图片中可以包括用户选择的图片,例如已选定图片1205。在一些实施例中,已选定图片1205上可显示有标记1206,标记1206可表示其对应的图片1205被电子设备100选定(也即图片已被用户选择)。在另外一些实施例中,用户可以在区域1201中做出向左或向右的滑动手势等来切换或更新图片。图片1205可以是缩略图。区域1201中显示的图片对应的原图可以存储于电子设备100上,也可以存储于云端服务器上。
区域1203中可以显示有一个或多个服务选项(如浏览器、信息等)。服务选项对应的应用程序或协议可支持分享用户选择的图片至联系人或服务器。在一些实施例中,响应于在区域1203中检测到的作用于服务选项的操作(如在“信息”图标上的触摸操作)时,电子设备100可触发通过该服务选项对应的应用程序或协议分享已选定图片至云端联系人或服务器的过程,该过程可包括:电子设备100打开该应用程序或协议,显示其用户界面,在该用户界面中检测用户进行数据分享的操作,响应该操作,通过该应用程序或协议将已选定的图片分享至云端联系人或服务器。
区域1202可用于显示电子设备100自发现的附近设备选项,如智慧屏、mate 30 Pro、matebookX、打印机等。区域1202中显示的设备选项(如mate 30 Pro、matebookX)可用于触发分享的操作。响应于检测到的作用于设备选项的操作(如在设备图标上的触摸操作),电子设备100可以触发分享已选定的图片至该操作选定的设备选项对应的设备的过程。该过程可包括:电子设备100与已选定的设备选项对应的设备建立通信连接,然后通过该通信连接向该设备选项对应的设备传输已选定的图片。
用户界面1210还包括实时取景分享控件1204,该实时取景分享控件1204用于触发进入分享界面。当电子设备100检测到针对于实时取景分享控件1204的用户操作,电子设备100启动摄像头,显示如图12B所示的分享界面1220。该分享界面中包括摄像头采集的图像、设备图标和待分享的图片栏1221。
在一些实施例中,用户界面1210中的实时取景控件1204是可选的,电子设备100可以不显示该实时取景控件1204。当电子设备100显示用户界面1210,电子设备100检测到抬起操作时,电子设备100触发显示分享界面1220。抬起操作可以参考上述图7C的描述,本实施例中,在T1时刻,电子设备100显示用户界面1210,当电子设备100检测到抬起操作,在T2时刻,电子设备100显示分享界面1220,其中T1时刻和T2时刻之间的时间间隔小于阈值。
可以理解的,抬起操作只是一种示例性的用户操作,电子设备100还可以通过其他用户操作进入分享界面1220。
图片栏1221用于显示图库中的一个或多个图片,这一个或多个图片中可以包括用户选择的图片,例如已选定图片1205。在一些实施例中,已选定图片1205上可显示有标记1206,标记1206可表示其对应的图片1205被电子设备选定(也即图片已被用户选择)。在另外一些实施例中,用户可以在图片栏1221中做出向左或向右的滑动手势等来切换或更新图片。
用户选定一张或多张图片后,选择分享界面中任意一个设备,当电子设备100检测到针对于设备图标的用户操作(如在设备图标上的点击操作),电子设备100可以触发分享已选定的图片至该用户操作选定的设备图标对应的设备的过程。该过程可包括:电子设备100与已选定的设备图标对应的设备建立通信连接,然后通过该通信连接向该设备图标对应的设备传输已选定的图片。
如图12C所示,用户选定了图片1205后,点击设备图标5311,电子设备100检测到针对于设备图标5311的用户操作,将图片1205发送给设备图像531对应的电子设备202。如图12D所示,电子设备202接收到电子设备100发送的图片1205,在电子设备202的显示界面上输出提示框1211,该提示框1211的文字内容可以为“接收到来自电子设备100的图片,点击该提示框查看”。当电子设备202检测到针对于该提示框1211的点击操作,电子设备202打开图片1205,如图12E所示,电子设备202的显示界面上显示图片1205。在一些实施例中,图12D为可选的,电子设备202接收到电子设备100发送的图片1205,电子设备202直接打开该图片,如图12E所示。
在一些实施例中,可以通过设备图标的显示形式来提示用户针对于数据分享的可选择的设备。
如图12F所示,用户选中了图片1205,由于图片1205的数据类型为图片,则当电子设备100检测到该图片1205被选中,分享界面1220中的设备图标5311、设备图标5331和设备图标5341的显示区域提亮(或改变图标颜色等);可选的,设备图像531、设备图像533和设备图像534的显示区域提亮。这标识了设备图标5311、设备图标5331和设备图标5341所指示的设备图像531、设备图像533和设备图像534分别对应的电子设备202、电子设备203、电子设备204,是支持输出该图片1205的设备,提示用户可以选择点击这几个设备的设备图标,进行数据分享。
可选的,相比于设备图标5311、设备图标5331和设备图标5341,设备图标5321的显示区域的亮度(或颜色等)不同,指示了设备图标5321对应的电子设备201不支持输出该图片1205,提示用户不要点击设备图像532的设备图标5321。
本申请中,设备图标的显示形式还可以有更多形式,本申请不作限制。
上述示例性示出的三个场景,基于本申请实施例提供的设备识别方法实现了设备之间的数据传输,可以简化用户分享数据的操作步骤,提高分享数据给其他设备的效率。具体的,本申请实施例还提供了一种分享照片方法,用户可以在相机应用的拍摄预览界面实现对照片的快速分享。下面详细说明该分享照片方法。
在海量终端的环境下,多个终端之间的图片、文件等共享变得越来越普遍。如何快速高效的找到用户期望分享的目标终端,提升用户查找目标终端的效率和体验变得非常重要。
以手机为例,用户使用手机拍完照片之后,经常会有将照片分享给其他用户或者分享给其他电子设备的需求。在当前的用户分享照片的过程中,用户必须执行一系列的繁琐操作,例如打开图库、选中图片、点击分享、搜索其他电子设备、选中目标电子设备、传输图片等多个操作,才可以实现将图片分享给目标电子设备。该分享照片的过程操作繁琐、交互流程多,而且分享照片的效率低。
图13是一例分享照片过程的图形用户界面(graphical user interface,GUI)示意图。其中,图13中的(a)图示出了解锁模式下,手机当前输出的界面内容1301,该界面内容1301显示了多款应用程序(application,App),例如音乐、设置、相册和相机等应用程序。应理解,界面内容1301还可以包括其他更多的应用程序,本申请实施例对此不作限定。
如图13中的(a)图所示,用户点击相机应用的图标,响应于用户的点击操作,手机进入如图13中的(b)图所示相机应用主界面1302,或者称为“拍摄预览界面”,该拍摄预览界面中呈现的画面称为“预览图像”或者“预览画面”。
应理解,在本申请实施例中,如图13中的(b)图所示,拍摄预览界面1302可以包括中间的预览画面、该界面顶端区域、底端区域显示的相机应用的按键、菜单选项等,在后续实施例中,拍摄预览界面和预览画面都可以用于描述相机应用的拍摄界面,例如“在拍摄预览界面上显示提醒窗口”或“在预览画面中显示提醒窗口”不作严格区分,后续不再赘述。
还应理解,本申请实施例中拍摄预览界面可以代表包括预览画面、拍摄快门键、本地相册图标、摄像头切换图标等在内的界面,如果该界面上发生显示内容的变化,例如显示了某个识别出的设备标签等,该界面还是可以被称为拍摄预览界面,后续不再赘述。
在该相机应用主界面1302上,包括多种按键和菜单选项,例如拍摄快门键31、本地相册图标32和摄像头切换按键33等,用户可以通过多种按键和菜单选项实现不同的操作。用户可以执行如图13中的(b)图所示的操作1,点击拍摄快门键31,响应于用户的拍摄操作,手机拍摄照片并将拍摄的照片保存在本地相册。
当用户期望将当前拍摄的照片或者本地相册的其他照片分享给其他的电子设备时,用户可以执行如图13中的(b)图所示的操作2,点击该相机应用主界面1302的本地相册图标32,响应于用户的点击操作,手机进入照片显示界面1303。该照片显示界面1303可以显示当前拍摄的照片,如图13中的(c)图所示,用户点击该照片显示界面1303的“分享”按键,手机进入照片分享界面1304。
该照片分享界面1304可以包括照片区域和分享菜单区域,其中,照片区域可以显示多张拍摄的照片,用户可以点击照片右下角的“选择”框,选中期望分享的照片。分享菜单区域可以为用户提供多种照片分享方式,例如“华为分享(Huawei Share)”、“发送给朋友”、“蓝牙”、“发送给好友”、“微博”、“信息”、“电子邮件”、“备忘录”等多种照片分享方式,不同的照片分享方式可以与不同应用(例如微信等)关联,此处不再赘述。
如图13中的(d)图所示,用户点击该照片分享界面1304的“华为分享(Huawei Share)”按键,手机可以进入如图13中的(e)图所示的界面,显示多个可以分享的电子设备,例如Ma’s P30、MateBook等。用户可以根据自己的需求,选择待分享的目标电子设备的图标,从而将选中的照片分享给该目标电子设备。
相应地,用户点击期望分享照片的目标电子设备之后,在该目标电子设备上可以弹出接收窗口,该接收窗口可以供选择是否接收当前分享的照片。
以上介绍了用户通过相机应用拍摄照片之后,将照片分享给其他电子设备的过程。该过程的步骤依次经过用户拍摄照片、打开图库、选中图片、点击分享、选择分享方式、搜索其他电子设备、选中目标电子设备、传输图片等多个操作,才可以实现将拍摄的照片分享给目标电子设备。该分享照片的过程操作繁琐、交互流程多,而且分享照片的效率低。
因此,本申请实施例提供一种分享照片的方法,在图14~图18示例性示出的UI实施例中,用户可以通过相机应用,快速将照片分享给其他电子设备。
图14是本申请实施例提供的一例分享照片过程的图形用户界面示意图。其中,图14中的(a)图示出了解锁模式下,手机当前输出的界面内容1401,用户点击相机应用的图标,响应于用户的点击操作,手机显示如图14中的(b)图所示的拍摄预览界面1402。在该拍摄预览界面1402上,用户点击拍摄快门键31,响应于用户的拍摄操作,手机拍摄照片并将拍摄的照片保存在本地相册。
用户执行如图14中的(c)图所示的操作,长按本地相册图标32,响应于用户的长按操作,手机显示如图14中的(d)图所示的界面1404,在该界面1404上显示缩略照片的图标30,或者称为“照片缩略图”。同时,手机启动设备识别功能,根据当前拍摄预览界面1404上呈现的预览画面,识别该预览画面中是否包括其他的电子设备。
示例性的,如图14中的(d)图所示,如果当前呈现的预览画面中包括桌子上的手机10和个人电脑(personal computer,PC)20,手机可以识别出预览画面中的手机10和PC 20,并在界面1404中显示识别出的手机10的名称和PC 20的名称,例如手机10为“P40”,PC 20为“MateBook”等。
可选地,手机可以不显示识别出的预览画面中其他电子设备的名称,仅仅标记“设备1”、“设备2”等,本申请实施例对此不作限定。
这里需要说明的是,图14中的(b)图和图14中的(c)图呈现的预览画面可以是手机前置摄像头或者后置摄像头获取的,本申请实施例对拍摄照片的摄像头不作限定。例如,当图14中的(b)图的人物照片是手机前置摄像头获取的,如果用户要通过后置摄像头识别电子设备,可以通过点击摄像头切换按键33进行切换。又例如,当图14中的(b)图的人物照片是手机后置摄像头获取的,如果用户要通过前置摄像头识别电子设备,可以通过点击摄像头切换按键33进行切换。
这里还需要说明的是,上述实施例中以长按操作为例,介绍了通过用户长按本地相册图标32作为触发照片分享过程的操作。应理解,本申请实施例还可以通过其他预设操作触发本申请实施例提供的照片分享过程,或者通过其他预设操作触发手机识别预览画面中的电子设备,例如该预设操作不限于长按本地相册图标32、双击本地相册图标32、或者在拍摄预览界面1403上绘制固定图案等,本申请实施例对此不作限定。
一种可能的实现方式中,手机在检测到用户对本地相册图标32的长按操作之后,触发手机的识别功能。换言之,手机未检测到用户对本地相册图标32的长按操作时,可以不识别预览画面中的物体,显示如图14中的(c)图。当手机检测到用户对本地相册图标32的长按操作之后,触发识别该预览画面中的物体,并标记出识别到的电子设备的名称“P40”和“MateBook”,显示如图14中的(d)图。上述实现方式可以避免手机一直处于识别预览画面的物体的状态,从而降低手机的功耗。
另一种可能的实现方式中,手机可以一直启动设备识别功能,即手机持续识别预览画面中的物体,并在检测到用户对本地相册图标32的长按操作之后,标记出识别到的电子设备的名称,显示如图14中的(d)图所示的“P40”和“MateBook”的图标。
上述实现方式可以使得手机提前判断预览画面中包括的物体,并在用户通过长按本地相册图标32启动照片分享功能时,快速将识别出的电子设备名称显示在界面中,提高手机识别预览画面的物体的速度。
当手机识别出当前预览画面中包括的P40和MateBook之后,用户可以根据自己的需求,长按缩略照片的图标30,并将该缩略照片的图标30拖动到待分享的目标设备。
示例性的,如图14中的(d)图所示,预览画面中显示了“P40”和“MateBook”的图标,用户长按该缩略照片的图标30,将该缩略照片的图标30拖动到P40的图标区域并释放。或者,用户长按该缩略照片的图标30,将该缩略照片的图标30拖动到P40所在区域的任意位置并释放。
可选地,用户可以将该缩略照片的图标30拖动到P40的图标所在位置之后释放,该P40的图标可以呈现为不同的颜色,或者显示出大小变化、跳动、闪烁等其他动态效果,以提醒用户将当前拍摄的照片分享给预览画面中识别到的P40。示例性的,如图14中的(e)图所示,用户拖动该缩略照片的图标30到P40的图标所在位置时,该“P40”图标颜色变化,此时用户释放该缩略照片的图标30,就可以实现将当前拍摄的照片分享至P40。
又一种可能的实现方式中,用户拖动该缩略照片的图标30的过程中,在预览画面上,还可以显示提醒控件。示例性的,如图14中的(e)图所示,该提醒控件可以是箭头40等,该箭头40可以静态显示、跳动显示或者闪烁显示,以提示用户可以将缩略照片的图标30拖动到该箭头40标识的位置,实现照片分享功能。本申请实施例对提醒控件的显示方式不作限定。
应理解,针对上述实现过程,手机可以通过图像检测、3D扫描技术和机器视觉等多种不同的方式,检测并识别到预览画面中包括的其他电子设备,本申请实施例对手机识别预览画面中其他电子设备的方式不作限定。
还应理解,本申请实施例中,手机还可以通过多种可能的定位技术,识别出预览画面中的其他电子设备,并定位其他电子设备的位置。
可选地,本申请实施例的定位技术可以包括基于蓝牙的无线感知定位、基于超宽带(ultra wide-band,UWB)感知的无线感知定位、基于计算机视觉的定位等技术中的一种,或者以上列举的多种定位技术的融合等,又或者其他更多的定位技术,本申请实施例对手机定位其他电子设备的方式不作限定。
此外,在本申请实施例中,手机识别出预览画面中包括的其他电子设备之后,可以根据当前预览画面中物体的显示位置,确定显示电子设备图标的位置。
一种可能的方式中,手机可以将标记其他电子设备的图标显示在预览画面中的该电子设备所在的区域。示例性的,以对图14中的(d)图为例,手机识别出预览画面中的P40和MateBook之后,将标记“P40”的图标显示在识别出的手机所在位置,将标记“MateBook”的图标显示在PC所在位置。
可选地,该标记其他电子设备的图标可以显示在靠近该电子设备的定位装置的区域。示例性的,手机通过UWB芯片和P40进行通信,以定位P40在预览画面中的位置,如果P40的UWB芯片安装于P40的右上角,那么图14中的(d)图中包括“P40”的图标可以显示在P40的右上角的UWB芯片所在区域,本申请实施例对此不作限定。
另一种可能的方式中,该标记其他电子设备的图标可以显示在预览画面中的空白区域,不遮挡预览画面中的其他物体。示例性的,如图14中的(d)图所示,手机识别出预览画面中的P40和MateBook之后,将标记“P40”的图标显示在预览画面的左边界处,以不遮挡P40右侧的PC;同时,将标记“MateBook”的图标显示在预览画面的右边界处,以不遮挡MateBook左侧的手机。
上述介绍的图标显示方式可以在不遮挡预览画面中其他物体的情况下标记识别出的电子设备,不影响用户的视觉和观感,提高了用户的视觉体验。
通过上述方法,用户可以在拍摄照片的过程中,通过预设的操作启动手机的设备识别功能和定位功能,结合手机的识别功能和定位功能,识别出相机的预览画面中包括的其他电子设备,用户可以将待分享的照片直接拖动到其他电子设备所在的区域,从而快速将照片分享给周围存在的其他电子设备。该过程简化了分享照片的操作流程,缩短了分享照片的时间,提高了用户体验。
另一种可能的场景中,当手机在识别预览画面中的其他电子设备时,可能出现识别出的电子设备被障碍物遮挡的情况,即预览画面中不能看见该电子设备。针对该种场景,本申请实施例还提供一种分享照片的方法,以实现快速将拍摄的照片分享给预览画面中被遮挡的电子设备。
图15是本申请实施例提供的又一例分享照片过程的图形用户界面示意图。示例性的,如图15中的(a)图所示,在拍摄预览界面1501上,手机识别出预览画面中的PC 20为MateBook,并显示了标记“MateBook”的图标。此外,手机还识别出MateBook后面存在被遮挡的设备1。在该种场景中,MateBook遮挡了设备1,在手机的拍摄预览界面1501上,可以显示提醒窗口50,该提醒窗口50可以包括用于提醒用户检测到的设备1的文字信息。
可选地,本申请实施例除了提醒窗口50的文字提醒之外,还可以包括图标提醒。例如,在手机的拍摄预览界面1501上,除了该提醒窗口50之外,还可以包括静态显示的箭头、动态闪烁的箭头或者跳动显示的箭头等标记被遮挡的电子设备的位置的图标,本申请实施例对此不作限定。
示例性的,如图15中的(a)图所示,该提醒窗口50显示:此处检测到设备1,是否分享。用户点击该提醒窗口50之后,手机显示如图15中的(b)图所示的界面1502,在该界面1502上,包括照片分享窗口60,用户可以点击该照片分享窗口60的“分享”按键,确定将当前拍摄的照片分享到被遮挡的设备1。
或者,在用户点击如图15中的(a)图所示的该提醒窗口50之后,手机可以不再进一步显示图15中的(b)图的界面,直接将该拍摄的照片分享到被遮挡的设备1,本申请实施例对此不作限定。
需要说明的是,手机可以和附近的其他电子设备进行通信,例如通过蓝牙、无线保真(wireless fidelity,WIFI)模块等多种可能的方式进行通信,那么手机就可以感知到附近存在的电子设备。或者,手机通过UWB等无线定位技术确定附近存在其他电子设备,并识别出该电子设备的类型等,可以显示在拍摄预览界面中。本申请实施例对手机和附近的其他电子设备的通信交互方式、定位方式不作限定。
通过上述方法,当手机识别到预览画面中存在其他电子设备,且该电子设备被障碍物遮挡时,在用户分享照片的过程中,可以在拍摄预览界面上显示文字或图标等提醒信息,用于提示用户被遮挡的电子设备的位置等,用户可以进一步将拍摄的照片快速分享到被遮挡的电子设备,为用户向被遮挡的电子设备分享照片提供了一种可能的途径,简化了用户分享照片的操作步骤。
又一种可能的场景中,手机可能通过无线定位技术识别到附近有其他电子设备,且该电子设备并没有显示在手机当前的预览画面中。针对该种场景,本申请实施例还可以在拍摄预览界面上显示提醒信息,用于提醒用户某个方位存在其他电子设备。
示例性的,如图15中的(c)图所示,在拍摄预览界面1503上,手机的摄像头获取的预览画面中不包括任何电子设备,但是手机可能在预览画面之外的左侧区域检测到3个电子设备。在该种场景中,在界面1503上,可以显示提醒窗口70,该提醒窗口70可以包括用于提醒用户检测到的多个电子设备的文字信息。
可选地,本申请实施例除了提醒窗口70的文字提醒之外,还可以包括图标提醒。例如,在手机的拍摄预览界面1503上,除了该提醒窗口70之外,还可以包括静态显示的箭头、动态闪烁的箭头或者跳动显示的箭头等标记检测到的电子设备的方位的图标,本申请实施例对此不作限定。
示例性的,如图15中的(c)图所示,该提醒窗口70显示:此处检测到3个电子设备,请转动摄像头,获取电子设备的信息。用户点击该提醒窗口70之后,手机显示如图15中的(d)图所示的界面1504,在该界面1504上,包括设备列表窗口80,用户可以点击该设备列表窗口80中的任意一个设备,例如设备3,从而确定将当前拍摄的照片分享到设备3。
或者,另一种可能的方式中,用户可以根据界面1503上的提醒信息转动手机的方向,使得手机的摄像头可以获取检测到的3个电子设备,并在预览画面中显示用户将要分享照片的设备3,从而可以按照图14中介绍的方法,快速地将拍摄的照片分享给其他电子设备。
通过上述方法,当手机摄像头获取的预览画面中不存在其他电子设备,且手机检测到附近存在其他电子设备时,可以在拍摄预览界面上显示文字或图标等提醒信息,用于提示用户附近可以分享照片的其他电子设备的信息或位置等。从而在分享照片的过程中,用户可以通过拖动照片到预览画面中的其他电子设备的方式,将拍摄的照片快速分享到该电子设备,为用户提供了另一种可能的分享照片的途径,简化了用户分享照片的操作步骤。
在本申请实施例中,手机作为发送设备,接受用户分享照片的电子设备可以作为“接收设备”。对于上述图14和图15的分享照片的过程,当用户拖动缩略照片的图标30至手机识别出的接收设备之后,相应地,在该接收设备上可以出现该照片的接收窗口。
图16是本申请实施例提供的一例接收照片的图形用户界面示意图。示例性的,图16中的(a)图示出了接收设备的一种可能的界面1601,应理解,该界面1601不限于是该接收设备的主界面或者任意一款应用程序的运行界面等,本申请实施例对此不作限定。
以接收设备的主界面1601为例,当用户从手机上执行了分享照片的操作之后,该接收设备可以显示为图16中的(b)图所示的界面1602,该界面1602上包括照片的接收窗口90。可选地,该照片的接收窗口90可以为用户提供“查看”、“关闭”等按键,以便于用户通过接收设备快速查看该分享的照片。
可选地,该照片的接收窗口90可以在接收设备的界面上显示预设时长之后,自动消失或隐藏到接收设备的通知栏,用户可以通过下拉操作,查看通知栏的照片分享结果;或者通过下拉操作,进一步关闭通知栏的照片分享结果,此过程可以参考现有技术中的相关操作,此处不再赘述。
应理解,用户拖动该缩略照片的图标30至识别到的预览画面中的接收设备并释放该缩略照片的图标30之后,手机可以将当前拍摄的照片传输给接收设备。例如,传输方式可以不限于蓝牙传输、WIFI传输、近距离无线通讯技术(near-field communication,NFC)传输以及未来的第五代(5th generation,5G)移动通信系统等多种可能的高速率通信方式,本申请实施例对照片传输的方式不作限定。
还应理解,该分享的照片可以是当前用户点击拍摄快门键之后拍摄的最新照片,也可以是用户之前拍摄的照片,或者是用户手机上保存的其他来源的图片,本申请实施例对此不作限定。
换言之,用户可以打开相机应用,不拍摄照片,直接长按并拖动本地相册图标,将本地相册中拍摄日期最接近现在日期的第一张照片、或者用户手机上存储的其他来源的图片分享给接收设备,本申请实施例对此不作限定。
以上介绍了用户通过相机应用分享一张照片的过程,此外,本申请实施例还提供一种分享照片的方法,用户可以通过相机应用,将多张照片同时分享给预览画面中识别出的接收设备。
图17是本申请实施例提供的一例分享照片过程的图形用户界面示意图。其中,图17中的(a)图示出了解锁模式下,手机当前输出的主界面1701,用户点击主界面1701的相机应用的图标,响应于用户的点击操作,手机显示如图17中的(b)图所示的拍摄预览界面1702。在该拍摄预览界面1702上,用户点击拍摄快门键31,响应于用户的拍摄操作,手机拍摄照片并将拍摄的照片保存在本地相册。
用户执行如图17中的(c)图所示的操作,选中本地相册图标32并沿着箭头所示的方向向上拖动该本地相册图标32,响应于用户的拖动操作,手机显示如图17中的(d)图所示的界面1704。在该界面1704上显示照片列表,如图17中的(d)图所示,该照片列表可以显示多个照片的缩略图,例如照片1、照片2和照片3等。可选地,该照片列表可以显示在界面1704的底端区域,不影响该界面1704中预览画面的显示,保证用户可以看见预览画面中的内容。
一种可能的情况中,该照片列表中的照片可以按照用户拍摄的顺序进行排列。示例性的,照片1是用户拍摄的最新照片,照片2和照片3的拍摄时间早于照片1的拍摄时间。
或者,照片列表中的照片可以按照其他可能的排列顺序进行排列,例如检测到拍摄地点为公司,该照片列表中可以显示拍摄地点为公司的照片,本申请实施例对此不作限定。
一种可能的情况中,当用户执行如图17中的(c)图所示的操作显示该照片列表之后,该照片列表中的第一张照片可以是默认选中的,换言之,图17中的(d)图中的照片1的右下角默认被标识为选中的待分享的照片。如果用户并不期望分享该照片1,可以点击照片1右下角的选择框,取消选择照片1。同样地,如果用户期望同时分享该照片1、照片2和照片3,可以点击每张照片右下角的选择框,选择多张待分享的照片,此处不再赘述。
当用户选中待分享的照片1、照片2和照片3之后,手指可以长按待分享的照片1、照片2和照片3的任意区域,响应于用户的长按操作,手机显示如图17中的(e)图所示的界面1705,在该界面1705上显示缩略照片的图标30。
同时,手机启动设备识别功能,根据当前拍摄预览界面1705上呈现的预览画面,识别该预览画面中是否包括其他的电子设备。可选地,缩略照片的图标30上可以仅显示待分享的照片1、照片2和照片3中的任意一张照片的缩略图,本申请实施例对此不作限定。
可选地,本申请实施例还可以通过其他预设操作触发本申请实施例提供的分享多张照片过程,或者通过其他预设操作触发手机识别预览画面中的电子设备,例如该预设操作不限于选中本地相册图标32并向上拖动、双击本地相册图标32、或者在拍摄预览界面1703上绘制固定图案等,本申请实施例对此不作限定。
示例性的,如图17中的(e)图所示,如果当前呈现的预览画面中包括桌子上的手机10和PC 20,手机可以识别出预览画面中的手机10和PC 20,并在该预览画面中显示识别出的手机10的名称和PC 20的名称,例如手机10为“P40”,PC 20为“MateBook”等。可选地,手机可以不显示识别出的预览画面中其他电子设备的名称,仅仅标记“设备1”、“设备2”等,本申请实施例对此不作限定。
当手机识别出界面1705的预览画面中包括的P40和MateBook之后,用户可以根据自己的需求,将该缩略照片的图标30拖动到待分享的目标设备。
示例性的,如图17中的(f)图所示,预览画面中显示了“P40”和“MateBook”的图标,用户将该缩略照片的图标30拖动到MateBook的图标区域并释放,即可以实现将选中的照片1、照片2和照片3分享给MateBook。或者,用户将该缩略照片的图标30拖动到MateBook所在区域的任意位置并释放,即可以实现将选中的照片1、照片2和照片3分享给MateBook。
可选地,用户将该缩略照片的图标30拖动到MateBook的图标所在位置之后释放,该MateBook的图标可以呈现为不同的颜色,或者显示出大小变化、跳动、闪烁等其他动态效果,以提醒用户将当前拍摄的照片分享给预览画面中识别到的MateBook。
一种可能的情况中,用户拖动该缩略照片的图标30的过程中,在预览画面上,还可以显示提醒控件。示例性的,如图17中的(f)图所示,该提醒控件可以是箭头40等,该箭头40可以静态显示、跳动显示或者闪烁显示,以提示用户可以将缩略照片的图标30拖动到该箭头40标识的位置,实现照片分享功能。本申请实施例对提醒控件的显示方式不作限定。
这里需要说明的是,在该图17的实施例描述中,和图14至图15介绍的相同的操作过程以及可能的实现方式等,可以参考前述的相应描述,此处不再赘述。
同样地,在分享多张照片的过程中,也可能出现预览界面中的接收设备被遮挡的情况,具体的实现过程可以参照图15中的相关描述,此处不再赘述。
通过上述方法,用户可以在拍摄照片的过程中,通过预设的操作启动手机的设备识别功能和定位功能,结合手机的识别功能和定位功能,识别出相机的预览画面中包括的其他电子设备,用户可以选择多张待分享的照片,并将多张待分享的照片直接拖动到其他电子设备所在的区域,从而快速将照片分享给周围存在的其他电子设备。该过程简化了分享照片的操作流程,缩短了分享照片的时间,提高了用户体验。
对于上述图17的分享照片的过程,当用户拖动缩略照片的图标30至手机识别出的PC 20之后,相应地,在该PC 20上可以出现该照片的接收窗口。
图18是本申请实施例提供的又一例接收照片的图形用户界面示意图。示例性的,图18中的(a)图示出了PC 20的一种可能的界面。应理解,该PC 20可以显示使用Windows系统、鸿蒙系统等不同系统所呈现的界面,该界面还可以是PC 20的使用过程中的任意一种运行界面,本申请实施例对该PC 20的显示界面不作限定。
以MateBook使用windows系统为例,当用户从手机上执行了分享了3张照片的操作之后,该MateBook可以显示图18中的(b)图所示的照片的接收窗口1801。可选地,该照片的接收窗口1801可以显示用户分享的照片1、照片2和照片3的缩略图,此外,还可以为用户提供“查看”、“关闭”等按键,以便于用户快速查看该分享的照片。
可选地,该照片的接收窗口1801可以在接收设备的界面上显示预设时长之后,自动消失或隐藏到MateBook的底部的状态栏,用户可以通过点击状态栏的操作,查看照片分享结果;或者进一步关闭状态栏的照片分享结果,此过程可以参考现有技术中的相关操作,此处不再赘述。
应理解,用户拖动该缩略照片的图标30至识别到的预览画面中的MateBook并释放该缩略照片的图标30之后,手机可以将当前拍摄的照片传输给MateBook。例如,手机和MateBook之间的传输方式可以不限于蓝牙传输、WIFI传输、近距离无线通讯技术(near-field communication,NFC)传输以及未来的第五代(5th generation,5G)移动通信系统等多种可能的高速率通信方式,本申请实施例对此不作限定。
还应理解,该分享的照片可以是当前用户点击拍摄快门键之后拍摄的最新照片,也可以是用户之前拍摄的照片,或者是用户手机上保存的其他来源的图片,本申请实施例对此不作限定。换言之,用户可以打开相机应用,不拍摄照片,直接长按并拖动本地相册图标,将本地相册中拍摄日期最接近现在日期的第一张照片分享给接收设备,本申请实施例对此不作限定。
综上所述,本申请实施例提供的分享照片的方法,用户可以在拍摄照片或者运行相机应用的过程中,通过预设的操作,启动电子设备的设备识别功能和定位功能。并基于电子设备的识别功能和定位功能,识别出相机的预览画面中包括的其他电子设备,用户可以通过快捷操作选择一张或多张待分享的照片,并直接拖动该一张或多张待分享的照片到其他电子设备所在的区域,从而快速将一张或多张照片分享给周围存在的其他电子设备。此外,本申请实施例针对预览画面中存在被遮挡的其他电子设备等多种场景,为用户提供人性化的交互界面,方便用户可以通过快捷操作分享一张或多张照片,该过程简化了分享照片的操作流程,缩短了分享照片的时间,提高了用户体验。
上述实施例结合图14至图18,从用户交互层面介绍了分享照片的方法,下面将结合图19,从软件实现策略层面,介绍本申请实施例提供的分享照片的方法。应理解,该方法可以在如图2、图3所示的具有触摸屏和摄像头组件等结构的电子设备(例如手机、平板、电脑等)中实现。
图19是本申请实施例提供的一例分享照片的方法的示意性流程图,以手机为例,如图19所示,该方法可以包括以下步骤:
1901,启动相机应用。
具体地,手机启动相机应用,并显示拍摄预览界面。示例性的,该步骤1901的实现过程可以如图14中的(a)图所示,或者如图17中的(a)图所示。
1902,用户点击拍摄快门键,拍摄照片。
应理解,步骤1902为可选的步骤。具体地,该分享照片的方法可以应用于用户拍摄照片的场景中,该待分享的照片可以是用户点击拍摄快门键拍摄的最新照片,也可以是用户之前拍摄的照片,或者该待分享的照片还可以是用户手机上保存的其他来源的图片,本申请实施例对此不作限定。
示例性的,当待分享的照片是用户当前拍摄的照片时,可以如图14中的(b)图所示,并继续执行下述步骤1903-1904的过程。
1903,检测用户对本地相册图标的长按操作。
1904,当检测到用户对本地相册图标的长按操作时,触发显示缩略照片的图标,且该缩略照片的图标处于可拖拽模式,同时启动设备识别功能。
可选地,除了用户对本地相册图标的长按操作之外,本申请实施例还可以通过其他预设操作触发本申请实施例提供的照片分享过程,或者通过其他预设操作触发手机识别预览画面中的电子设备,例如该预设操作不限于长按本地相册图标、双击本地相册图标、或者在拍摄预览界面上绘制固定图案等,本申请实施例对此不作限定。
一种可能的实现方式中,手机未检测到用户对本地相册图标的长按操作时,可以不识别预览画面中的物体。示例性的,当手机检测到用户对本地相册图标的长按操作之后,触发识别该预览画面中的物体,并标记出识别到的电子设备的名称“P40”和“MateBook”,显示如图14中的(d)图。该方式可以避免手机一直处于识别预览画面的物体的状态,从而降低手机的功耗。
另一种可能的实现方式中,手机可以一直启动设备识别功能,即手机持续识别预览画面中的物体,并在检测到用户对本地相册图标的长按操作之后,标记出识别到的电子设备的名称“P40”和“MateBook”,显示如图14中的(d)图。该方式可以使得手机提前判断预览画面中包括的物体,并在用户通过长按本地相册图标启动照片分享功能时,快速将识别出的电子设备名称显示在界面中,提高手机识别预览画面的物体的速度。
针对步骤1901-步骤1904的场景,可以继续执行下述步骤1909-1911的过程。
1909,识别预览画面中包括的其他电子设备,并标记该识别出的电子设备。
应理解,手机可以和附近的其他电子设备进行通信,例如通过蓝牙、WIFI模块、NFC等多种可能的方式进行通信,那么手机就可以感知到附近存在的电子设备。或者,手机通过UWB等无线定位技术确定附近存在其他电子设备,并识别出该电子设备的类型等,可以显示在拍摄预览界面中。本申请实施例对手机和附近的其他电子设备的通信交互方式、定位方式不作限定。
还应理解,手机可以将标记其他电子设备的图标显示在预览画面中的该电子设备所在的区域,或者显示在预览画面中的空白区域,不遮挡预览画面中的其他物体,具体的显示方式请参考前述描述,此处不再赘述。
1910,检测到用户拖动该缩略照片的图标至识别到的预览画面中其他电子设备。
1911,分享照片至用户拖动该缩略照片的图标到达的电子设备。
可选地,用户可以将该缩略照片的图标拖动到标记其他电子设备的图标所在位置之后释放,该图标可以呈现为不同的颜色,或者显示出大小变化、跳动、闪烁等其他动态效果,以提醒用户将当前拍摄的照片分享给预览画面中识别到的其他电子设备。
示例性的,如图14中的(e)图所示,用户拖动该缩略照片的图标30到P40的图标所在位置时,该“P40”图标颜色变化,此时用户释放该缩略照片的图标30,就可以实现将当前拍摄的照片分享至P40。
或者,示例性的,如图17中的(f)图所示,用户将该缩略照片的图标30拖动到MateBook的图标所在位置之后释放,该MateBook的图标可以呈现为不同的颜色,或者显示出大小变化、跳动、闪烁等其他动态效果,以提醒用户将当前拍摄的照片分享给预览画面中识别到的MateBook。
另一种可能的场景中,用户可能期望分享多张照片,或者分享的照片并非当前拍摄的。针对这种场景,可以执行步骤1905-1911的过程。
1905,检测用户对本地相册图标的滑动操作。
1906,显示本地相册的照片列表,检测到用户从照片列表中选中多个待分享的照片。
示例性的,当待分享的照片是用户手机上保存的其他来源的图片时,用户可以打开相机应用,不拍摄照片,直接长按并拖动本地相册图标,在照片列表中找到并选中待分享的照片,可以如图17中的(d)图和(e)图所示。本申请实施例对此不作限定。
示例性的,如图17中的(c)图所示,当检测到用户对本地相册图标的滑动操作之后,显示该照片列表,该照片列表中的第一张照片可以是默认选中的。如果用户并不期望分享该照片1,可以点击照片1右下角的选择框,取消选择照片1。同样地,如果用户期望同时分享该照片1、照片2和照片3,可以点击每张照片右下角的选择框,选择多张待分享的照片,此处不再赘述。
1907,检测用户对多个待分享的照片的长按操作。
1908,当检测到用户对多个待分享的照片的长按操作时,触发显示缩略照片的图标,且该缩略照片的图标处于可拖拽模式,同时启动设备识别功能。
可选地,当用户选中待分享的照片1、照片2和照片3之后,用户手指可以长按待分享的照片1、照片2和照片3的任意区域,都可以实现拖动三张照片。
1909,识别预览画面中包括的其他电子设备,并标记该识别出的电子设备。
1910,拖动该缩略照片的图标至识别到的预览画面中其他电子设备。
1911,分享照片至用户拖动该缩略照片的图标到达的电子设备。
这里需要说明的是,该实现过程的流程可以结合前述图14至图18的具体介绍,对于部分相同的操作过程以及可能的实现方式等,可以参考前述的相应描述,此处不再赘述。
通过上述方法,用户可以在拍摄照片的过程中,通过预设的操作启动手机的设备识别功能和定位功能,结合手机的识别功能和定位功能,识别出相机的预览画面中包括的其他电子设备,用户可以选择多张待分享的照片,并将多张待分享的照片直接拖动到其他电子设备所在的区域,从而快速将照片分享给周围存在的其他电子设备。该过程简化了分享照片的操作流程,缩短了分享照片的时间,提高了用户体验。
上述介绍了本申请的显示界面和方法实现,下面以UWB无线定位技术为例,详细说明电子设备100如何实现对其他电子设备进行测距和测角。
如图20所示,以电子设备100和电子设备201为例,电子设备100发起UWB测量请求。并根据电子设备201的测量响应,确定电子设备100与电子设备201间的距离。具体的,上述设备控制方法包括但不限于步骤S101至S105,其中:
S101、电子设备100广播UWB测量请求,电子设备201接收上述UWB测量请求。
在一些实施例中,电子设备100发起UWB测量请求,并采用测距算法确定与电子设备201间的距离。
步骤S101具体可以包括:电子设备100在T11时刻广播第一测量请求,并记录第一测量请求发送时刻为T11,第一测量请求携带电子设备100的身份信息(例如电子设备的ID、mac地址等)。电子设备201在T12时刻接收到电子设备100发送的第一测量请求,并记录第一测量请求的接收时刻为T12。
S102、电子设备201向电子设备100发送第一测量响应。
电子设备201在T13时刻向电子设备100发送第一测量响应,第一测量响应携带T12、T13、电子设备100的身份信息和电子设备201的身份信息。电子设备100在T14时刻接收到电子设备201发送的第一测量响应,并记录第一测量响应的接收时刻为T14时刻。
S103、电子设备100根据电子设备201发送的测量响应确定电子设备201的方位参数。
具体的,电子设备201的方位参数可以包括电子设备201与电子设备100间的物理距离、电子设备201的信号AOA、电子设备201发送信号的RSSI中的一项或多项。下面分别对这三个方位参数进行详细说明:
一,电子设备201与电子设备100间的物理距离。第一测量请求的发送时刻T11和第一测量响应的接收时刻T14的时间差等于Tround1,第一测量请求的接收时刻T12和第一测量响应的发送时刻T13的时间差等于Trelay1,单向飞行时间T可以表示为:T=(Tround1-Trelay1)/2。
电子设备100根据上述公式确定信号单向飞行时间T,再根据单向飞行时间T与电磁波传播速度C的乘积,便可确定电子设备201与电子设备100间的物理距离D=C×T。
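上述单边双向测距的计算可以用如下示意代码表达(时刻单位为秒,函数名为假设):

```python
C = 299_792_458  # 电磁波传播速度, 单位 m/s

def single_sided_twr(t11, t12, t13, t14):
    """t11/t14: 电子设备100发送第一测量请求/接收第一测量响应的时刻;
    t12/t13: 电子设备201接收第一测量请求/发送第一测量响应的时刻。
    返回 (单向飞行时间T, 物理距离D), 即 T=(Tround1-Trelay1)/2, D=C*T。"""
    t_round = t14 - t11
    t_relay = t13 - t12
    t = (t_round - t_relay) / 2
    return t, C * t
```

该计算只依赖各设备本地记录的收发时刻之差,因此无需电子设备100与电子设备201之间进行时钟同步。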
二,电子设备201的信号AOA。电子设备100可以根据第一测量响应到达不同位置的UWB天线的相位差来计算信号的接收方向,从而确定电子设备201相对于电子设备100的方向。
示例性的,如图21所示,电子设备100接收电子设备201发送的无线信号,该信号在电子设备100的信号AOA(即相对于接收天线1和接收天线2的连接线,上述无线信号的入射角θ)可以根据该信号在电子设备100的接收天线1和接收天线2上的相位差Δφ(θ)确定。其中,Δφ(θ)可表示如下:
Δφ(θ)=2πd·cosθ/λ+φ(θ)
其中,λ为波长,d为接收天线1和接收天线2之间的间距,φ(θ)为天线硬件相位差。通过上式可以确定入射角θ,即电子设备201的信号AOA。举例来说,若电子设备的入射角θ为60度,则电子设备201在电子设备100的顺时针30度方向。
三,电子设备201发送信号的RSSI。电子设备100根据第一测量请求和第一测量响应的RSSI平均值确定电子设备201发送信号的RSSI。在一些实施例中,电子设备100根据第一测量请求和第一测量响应的RSSI确定电子设备201发送信号的RSSI。
本申请中,可以根据电子设备201发送信号的RSSI确定电子设备100与电子设备201间是否有遮挡物。
可以理解,在有遮挡的非视距(Non line-of-sight,NLOS)传播条件下,信号衰减较大,在无遮挡的视距(line-of-sight,LOS)传播条件下,信号衰减较小。同一传播条件下,距离越远,信号衰减越大。本申请实施例中,根据第一测量请求和第一测量响应的RSSI,以及电子设备201与电子设备100之间的物理距离,可以确定电子设备100与电子设备201间是否有遮挡物。
在一些实施例中,根据电子设备100与电子设备201的距离,可以确定电子设备100接收到的电子设备201发送信号的预设RSSI。当接收到的电子设备201发送信号的RSSI小于预设RSSI,则确定电子设备100与电子设备201间有遮挡物,否则无遮挡物。
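上述“按距离推算预设RSSI并与实测RSSI比较”的遮挡判断,可以用对数距离路径损耗模型示意如下(其中1米处参考RSSI、路径损耗指数与判决余量均为假设值):

```python
import math

def expected_rssi(distance_m, rssi_at_1m=-40.0, path_loss_exp=2.0):
    """对数距离路径损耗模型: 估算无遮挡(LOS)条件下、
    给定距离处应接收到的RSSI(dBm)。rssi_at_1m 为1米处参考RSSI。"""
    return rssi_at_1m - 10 * path_loss_exp * math.log10(max(distance_m, 0.01))

def is_occluded(measured_rssi, distance_m, margin_db=6.0):
    """实测RSSI比按距离预估的LOS值低出 margin_db 以上时,
    判定电子设备100与电子设备201间有遮挡物。"""
    return measured_rssi < expected_rssi(distance_m) - margin_db
```

这里的距离来自前述UWB测距结果,因而RSSI的异常衰减可以归因于遮挡而非距离。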
在一些实施例中,电子设备201的方位参数可以包括电子设备201与电子设备100间的物理距离、信号AOA以及第一标识。其中,电子设备201的第一标识用于表征电子设备100与电子设备201间是否有遮挡。例如,第一标识等于1表示有遮挡,第一标识等于0表示无遮挡。
S104、电子设备100向电子设备201发送连接请求,电子设备201接收电子设备100发送的连接请求。
S105、电子设备201向电子设备100发送第一能力信息和相应的连接参数,上述第一能力信息用于表征电子设备201能支持的通信模式。
在一些实施例中,当上述第一能力信息表征WiFi通信模式时,相应的连接参数可以包括:设备ID、配对密钥等参数。电子设备100可以使用IEEE 802.11标准的连接过程,基于上述连接参数与电子设备201建立WiFi连接;
在一些实施例中,当上述第一能力信息表征蓝牙通信模式时,相应的连接参数可以包括:密钥、加密方式、服务集标识(Service Set Identifier,SSID)等参数。电子设备100可以使用IEEE 802.15.1标准的连接过程,基于上述连接参数与电子设备201建立蓝牙连接。
在一些实施例中,当上述第一能力信息表征WiFi通信模式和蓝牙通信模式时,电子设备100可以优先使用IEEE 802.11标准的连接过程,基于上述连接参数与电子设备201建立WiFi连接。
在一些实施例中,第一测量请求还可以携带第二能力信息,第二能力信息用于表征电子设备100所能支持的所有通信模式,例如蓝牙、WiFi等。第一测量响应还可以携带第一能力信息和相应的连接参数。其中,第二能力信息包括第一能力信息,第一能力信息是电子设备201根据第二能力信息确定的。这样步骤S103之后,电子设备100可以直接根据第一测量响应中的第一能力信息和相应的连接参数,与电子设备201建立连接,无需再次发送连接请求。
在一些实施例中,电子设备100也可以多次发起测量请求,根据多次测量请求和多次测量响应的收发时间,获取单向飞行时间平均值和AOA平均值,减小距离和角度测量误差。
在本申请中,不限于上述UWB定位方法,还包括其他方式获取电子设备201相对于电子设备100的位置信息。例如,电子设备100广播UWB测量请求,该测量请求中包括发送时间,电子设备201接收到该测量请求后,基于该发送时间,以及电子设备201接收到该测量请求的时间,确定时间差,从而计算出电子设备201与电子设备100的距离(距离等于时间差乘电磁波的传播速度);电子设备201基于接收到的测量请求,计算出该测量请求的到达角度,可以确定出电子设备201相对于电子设备100的方位角度。电子设备201向电子设备100发送探测响应,该探测响应中包括电子设备201的身份标识和第一位置信息。电子设备100接收到该探测响应,获取电子设备201相对于电子设备100的方位参数。
本申请中,上述测量请求(第一测量请求)也可称为探测请求,测量响应(第一测量响应)也可称为探测响应。
在本申请中,不限于通过UWB定位,还可以通过蓝牙、WiFi、GPS的方式进行定位。
本申请提供了一种设备识别方法,应用于带有摄像头的第一电子设备,如图22所示,方法包括:
S201、第一电子设备接收第一操作。
第一操作可以是前述图5A~图5D中的任意一个或多个用户操作,还可以是前述图7A~图7C中的任意一个或多个用户操作。详细内容可以参考前述图5A~图5D,或图7A~图7C所示实施例,在此不再赘述。
S202、响应于第一操作,第一电子设备显示第一界面,第一界面包括摄像头采集的预览画面,其中预览画面中包括第二电子设备。
第一界面可以是前述取景界面530。第二电子设备例如可以是图5G中设备图像532对应的电子设备201。
S203、第一电子设备获取第二电子设备相对于第一电子设备的第一位置信息。
S204、第一电子设备基于第一位置信息,和第二电子设备在预览画面中的显示区域,确定出第一标签在预览画面中的显示位置,并在显示位置显示第一标签,其中第一标签用于标识第二电子设备。
第二电子设备例如可以是图5G中设备图像532对应的电子设备201,则第一标签例如可以是设备图标5321。
S205、第一电子设备接收针对第一标签的第二操作。第二操作可以是前述图5G中针对设备图标5321的用户操作。
S206、响应于第二操作,第一电子设备显示第二界面,第二界面包括控制第二电子设备的一个或多个控件。第二界面可以是图5H中的显示界面。其中,第二界面可以是叠加在第一界面上显示,也可以是电子设备从第一界面跳转显示第二界面。本申请通过增强现实的显示方式实时呈现了第一标签和第二电子设备的对应关系,并且通过第一标签实现了第一电子设备与第二电子设备的交互,实现多设备间的协调控制,提升了用户体验。
在一些可能的实施方式中,第一电子设备获取第二电子设备相对于第一电子设备的第一位置信息,具体包括:第一电子设备广播探测请求,探测请求包括第一电子设备的身份标识;第一电子设备接收到第二电子设备基于探测请求发送的探测响应时,基于探测响应确定第二电子设备与第一电子设备的第一位置信息,探测响应包括第二电子设备的身份标识。这种方式中,第一位置信息包括第二电子设备与第一电子设备的相对位置,例如距离、方向、角度等。第一电子设备根据发送探测请求和接收到探测响应的时间差,可以计算出第二电子设备与第一电子设备的距离(距离等于时间差乘电磁波的传播速度);第一电子设备基于该探测响应,计算出该探测响应的到达角度,可以确定出第二电子设备相对于第一电子设备的方位角度。
可选的,探测响应中包括第二电子设备的身份标识和第一位置信息,第一电子设备基于探测响应确定第二电子设备与第一电子设备的第一位置信息。具体的,第二电子设备根据接收到的探测请求,计算第二电子设备与第一电子设备的相对位置。探测请求中包括发送时间,第二电子设备基于该发送时间,以及第二电子设备接收到该探测请求的时间,确定时间差,从而计算出第二电子设备与第一电子设备的距离;第二电子设备基于接收到的探测请求,计算出该探测请求的到达角度,可以确定出第二电子设备相对于第一电子设备的方位角度。第二电子设备向第一电子设备发送探测响应,该探测响应中包括第二电子设备的身份标识和第一位置信息。
在一些可能的实施方式中,第一标签在预览画面中的显示位置和第二电子设备在预览画面中的显示区域,部分重叠或完全重叠。第一标签可以显示在第二电子设备的显示区域内,可以显示在第二电子设备的显示区域的边缘,也可以显示在紧靠第二电子设备的显示区域的位置。
在一些可能的实施方式中,方法还包括:第一电子设备获取第三电子设备相对于第一电子设备的第二位置信息;当第一电子设备检测到预览画面中不包括第三电子设备,且基于第二位置信息确定出第三电子设备在摄像头的取景范围内;第一电子设备基于第二位置信息,确定出第二标签在预览画面中的显示位置,其中第二标签用于指示以下一种或多种信息:第三电子设备的标识信息、第三电子设备的遮挡物、第二位置信息。这种方式中,当第一电子设备检测到第三电子设备的相对位置在摄像头的取景范围内,但是预览画面中不包括第三电子设备的图像,则第一电子设备判断出第三电子设备被遮挡,输出第三电子设备的第二标签,指示第三电子设备的标识信息、遮挡物和在预览界面中被遮挡的位置中的一项或多项。
第二标签例如可以是图8C中的图标803,第三电子设备的图像不在第一界面中,第三电子设备被设备图像533遮挡。
在一些可能的实施方式中,方法还包括:当第一电子设备检测到预览画面中不包括第三电子设备,且基于第二位置信息确定出第三电子设备不在摄像头的取景范围内;第一电子设备基于第二位置信息,确定出第三标签在预览画面中的显示位置,其中第三标签用于指示以下一种或多种信息:第三电子设备的标识信息、第二位置信息。这种方式中,当第一电子设备检测到第三电子设备的相对位置在摄像头的取景范围之外,并且预览画面中不包括第三电子设备的图像,则第一电子设备判断出第三电子设备不在取景框中,输出第三电子设备的第三标签,指示第三电子设备的标识信息、以及与第一电子设备的相对位置(方向、角度、距离等)中的一项或多项。
第三标签例如可以是图8B中的图标802,第三电子设备的图像不在第一界面中,第三电子设备在摄像头的取景范围之外。
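上述第一、第二、第三标签的选择逻辑,可以归纳为如下判断草图(classify_label 为本示例自拟的函数名,只考虑水平方向的取景范围):

```python
def classify_label(image_in_preview, azimuth_deg, fov_h_deg):
    """根据设备图像是否出现在预览画面、方位角是否落在取景范围内选择标签类型。

    image_in_preview: 图像识别是否在预览画面中检测到该设备的图像;
    azimuth_deg: 设备相对摄像头光轴的方位角(度);
    fov_h_deg: 摄像头水平视场角(度)。
    """
    in_viewfinder = abs(azimuth_deg) <= fov_h_deg / 2
    if image_in_preview:
        return "第一标签"   # 设备图像可见:在其显示区域附近标识设备
    if in_viewfinder:
        return "第二标签"   # 在取景范围内但图像不可见:判定为被遮挡
    return "第三标签"       # 不在取景范围内:指示标识信息与相对位置
```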
在一些可能的实施方式中,预览画面中包括第四电子设备的图像,第一电子设备显示第一界面之后,还包括:第一电子设备基于预览画面确定第四电子设备的设备类型为第一类型;第一电子设备在第一电子设备的账号下关联或绑定的电子设备中,确定出设备类型为第一类型的第一目标设备;第一电子设备显示第四标签,第四标签用于指示第四电子设备的图像与第一目标设备关联。这种方式中,当第一电子设备无法检测到第四电子设备的位置信息,且第四电子设备的图像在预览画面中时,第一电子设备根据图像识别技术识别第四电子设备的设备类型,检测与第一电子设备登录同一账号(例如华为账号)的设备中是否存在该设备类型的目标设备。若有,则第一电子设备认为该目标设备即为第四电子设备,第一电子设备输出标识该目标设备的第四标签。
第四标签例如可以是图8D中的图标805,第四电子设备的图像在第一界面中,第一电子设备无法定位第四电子设备的位置。
在一些可能的实施方式中,预览画面中包括第五电子设备的图像,第一电子设备显示第一界面之后,还包括:第一电子设备基于预览画面确定第五电子设备的设备类型为第二类型;第一电子设备获取第一电子设备的第三位置信息,第一电子设备中存储有电子设备和位置信息的对应关系;基于对应关系,第一电子设备根据第三位置信息,确定出设备类型为第二类型的第二目标设备,第二目标设备的位置信息与第三位置信息相同;第一电子设备显示第五标签,第五标签用于指示第五电子设备的图像与第二目标设备关联。这种方式中,当第一电子设备无法检测到第五电子设备的位置信息,且第五电子设备的图像在预览画面中时,由于第一电子设备中存储有电子设备和位置信息的对应关系(例如智能音箱——客厅,智能台灯——卧室,电脑——公司,等等),第一电子设备根据第一电子设备当前的地理位置,以及根据图像识别技术识别第五电子设备的设备类型,检测与第一电子设备在同一地理位置的设备中是否存在该设备类型的目标设备。若有,则第一电子设备认为该目标设备即为第五电子设备,第一电子设备输出标识该目标设备的第五标签。
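上述两种在无法定位设备时的关联策略(按同一账号下的设备类型匹配,以及按存储的设备与位置对应关系匹配),可以用如下草图示意,其中数据结构与函数名均为本示例的假设:

```python
def match_by_account(device_type, account_devices):
    """在同一账号下关联/绑定的设备中,查找设备类型一致的目标设备名称。"""
    for dev in account_devices:
        if dev["type"] == device_type:
            return dev["name"]
    return None  # 账号下没有该类型的设备

def match_by_location(device_type, current_location, location_map):
    """location_map: {设备名: (设备类型, 位置)}。

    在与本机处于同一地理位置的设备中查找同类型的目标设备名称。
    """
    for name, (dtype, loc) in location_map.items():
        if dtype == device_type and loc == current_location:
            return name
    return None
```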
在一些可能的实施方式中,第一界面还包括第一图标,第一图标关联了待分享数据,方法还包括:第一电子设备接收第三操作,第三操作为针对于第一标签和/或第一图标的操作;响应于第三操作,第一电子设备将待分享数据发送给第二电子设备。第三操作包括但不限于拖拽操作、点击操作等。这种方式提供了一种数据分享的途径:在第一界面上选择想要分享的第二电子设备,将待分享的数据发送到第二电子设备,简化了数据分享的用户操作,直观地显示了设备信息,提升了用户体验。
第一图标例如可以是图9B中的图标902或图标903,第一图标还可以是图11D中的缩略图1111;第一图标还可以是图12B中的图片1205。
在一些可能的实施方式中,第一电子设备接收第三操作之前,还包括:第一电子设备根据待分享数据的数据类型,在第一界面上显示第一显示形式的第一标签,第一显示形式的第一标签用于提示用户第二电子设备支持输出待分享数据。其中第一显示形式可以是将第一标签的显示区域提亮(改变亮度、颜色等)。第一显示形式例如可以是图10C中设备图标5311、设备图标5331、设备图标5341的显示形式。
在一些可能的实施方式中,预览画面中包括第三电子设备的图像和第三标签,第三标签与第三电子设备关联;方法还包括:第一电子设备接收第四操作,第四操作为针对于第三标签和/或第一图标的操作;响应于第四操作,第一电子设备输出提示消息,提示消息用于提示用户第三电子设备不支持输出待分享数据。提示消息例如可以是图10B中的图示框1100显示的信息。
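按待分享数据的数据类型决定标签显示形式(提亮与否)以及分享时的处理结果,可以用如下草图示意(函数名与返回的字符串均为本示例的假设):

```python
def label_style(data_type, device_supported_types):
    """目标设备支持输出该类型数据时返回"提亮",提示用户其可作为分享目标。"""
    return "提亮" if data_type in device_supported_types else "默认"

def handle_share(data_type, device_supported_types):
    """拖拽释放时:支持则发送待分享数据,不支持则输出提示消息。"""
    if data_type in device_supported_types:
        return "发送待分享数据"
    return "提示:该设备不支持输出待分享数据"
```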
本申请实施例,第一电子设备接收第一操作显示第一界面,启动摄像头,在第一界面中实时显示通过该摄像头采集到的图像;第一电子设备根据图像识别技术,识别出图像中的电子设备以及电子设备的设备类型(例如音箱、电脑、平板电脑等),例如第二电子设备;并且第一电子设备根据无线定位技术(例如UWB定位、蓝牙定位、WiFi定位等),获取第二电子设备相对于第一电子设备的位置信息。该位置信息包括距离、方向、角度中的一项或多项。第一电子设备基于该位置信息,在预览画面中,确定出第二电子设备的第一标签的显示位置,第一标签用于标识第二电子设备,例如标识第二电子设备的设备名称、设备类型等。其中,第一标签的显示位置与第二电子设备的显示位置有关。当第一电子设备检测到针对于第一标签的用户操作,第一电子设备输出第二界面,该第二界面包括控制第二电子设备的一个或多个控件。其中,第二界面可以是叠加在第一界面上显示,也可以是电子设备从第一界面跳转显示第二界面。本申请通过增强现实的显示方式实时呈现了第一标签和第二电子设备的对应关系,并且通过第一标签实现了第一电子设备与第二电子设备的交互,实现多设备间的协调控制,提升了用户体验。
在本申请实施例中,电子设备(例如,电子设备100)的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本申请实施例以分层架构的Android系统为例,示例性说明电子设备100的软件结构。其中,Android系统仅为本申请实施例中电子设备100的一种系统示例,本申请还可以适用于其他类型的操作系统,比如iOS、Windows、鸿蒙等,本申请对此不加以限制。下述仅将Android系统作为电子设备100的操作系统的示例。
参见图23,图23示出了本申请实施例示例性提供的电子设备的软件结构框图。该电子设备可以通过UWB定位技术确定附近设备的方位参数(例如距离、信号AOA及RSSI),进而根据多个附近设备的方位参数,确定取景界面中附近设备的图像的显示位置,并显示附近设备的设备图标,触发该设备图标,实现电子设备与附近设备的交互。电子设备可以通过UWB、蓝牙、WLAN和红外线中的一种或多种无线通信协议和目标设备建立无线通信连接,并进行数据传输。
如图23所示,分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,可以将Android系统从上至下分为应用程序层,应用程序框架层,协议栈,以及内核层(kernel)。其中:
应用程序层包括一系列应用程序包,例如智慧生活,蓝牙,WLAN等等。还可以包括相机,图库,通话,音乐,视频等应用程序。
其中,智慧生活APP是能够对家居中的各种智能家居设备进行选择和控制的软件程序,安装在用户使用的电子设备上。智慧生活APP可以是电子设备出厂时已安装的应用,也可以是用户在使用电子设备的过程中从网络下载或从其他设备获取的应用。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图23所示,应用程序框架层主要可以包括API和系统服务(System Server)。其中,API用于实现应用程序层和协议栈、HAL层、内核层(kernel)之间的通信。例如,可提供“智慧生活”和内核层(kernel)之间的通信等。API可以包括UWB API、蓝牙API、WLAN API、红外线API中的一项或多项,相应的,系统服务可以包括UWB服务、蓝牙服务、WLAN服务、红外线服务中的一项或多项。电子设备100可以通过调用UWB API、蓝牙API、WLAN API、红外线API中的一项或多项调用相应的系统服务,来探测电子设备100附近设备的方位参数。还可以通过调用UWB API、蓝牙API、WLAN API、红外线API中的一项或多项调用相应的系统服务,来与附近设备建立无线通信连接,以及进行数据传输。
其中,UWB服务具体可以包括一或多项服务,例如UWB定位服务。UWB定位服务可以包括方位参数测量,其中,方位参数测量包括距离测量、AOA测量、RSSI测量中的一或多项。例如,电子设备100通过UWB API调用UWB定位服务,来探测电子设备100附近设备的方位参数。
Android Runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是Java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的Java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL嵌入式系统版(OpenGL for Embedded Systems,OpenGL ES)),2D图形引擎(例如:Skia图形库(Skia Graphics Library,SGL))等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:移动影像专家组4(Motion Picture Expert Group,MPEG4),高级视频编码(MPEG-4Part 10 Advanced Video Coding,MPEG-4 AVC/H.264),动态影像专家压缩标准音频层3(MPEG Audio Layer3,MP3),高级音频编码(Advanced Audio Coding,AAC),自适应多速率(Adaptive Multi-Rate,AMR),联合图像专家组(Joint Photographic Experts Group,JPEG/JPG),便携式网络图形(Portable Network Graphics,PNG)等。
三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。
2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层可以包含UWB芯片驱动、蓝牙芯片驱动、WLAN驱动中的一或多项,还可以包括显示驱动,摄像头驱动,音频驱动,传感器驱动等等。内核层(kernel)用于响应于应用程序框架层中系统服务调用的功能执行对应的操作。例如,响应于UWB定位服务调用UWB协议栈发送的UWB测量指令,UWB芯片驱动通过硬件设备(例如UWB芯片)发送UWB测量请求。
在本申请示例中,该软件结构框架可以在电子设备100上,也可以在电子设备201、电子设备202、电子设备203、电子设备204上。
下面以上述实施例中的设备识别场景为例,示例性说明电子设备100软件以及硬件的工作流程。
加速度传感器和/或陀螺仪传感器检测到抬起操作(例如图7C),相应的硬件中断被发给内核层。内核层将该抬起操作加工成原始输入事件。原始输入事件被存储在内核层。应用程序框架层从内核层获取原始输入事件,识别该输入事件为对电子设备(例如电子设备201)的配对连接。智慧生活应用调用应用框架层的UWB API,以启动UWB定位服务。UWB定位服务通过调用UWB协议栈向HAL层中的UWB HAL接口发送UWB测量指令。UWB HAL接口向内核层发送UWB测量请求,内核层根据上述UWB测量请求通过调用UWB芯片驱动,来驱动UWB芯片广播测量请求(例如第一测量请求),同时利用UWB时间管理模块记录UWB测量请求发送时间戳。
在一些实施例中,应用框架层的UWB服务确定目标设备后通过调用UWB协议栈,将连接请求发送至内核层,内核层调用UWB芯片驱动,驱动UWB芯片向电子设备201发送上述连接请求,以请求建立UWB通信连接,并进行数据传输。可选的,应用框架层的UWB服务还可以调用蓝牙服务、WLAN服务或红外线服务,向电子设备201发送连接请求。例如,UWB服务启动蓝牙服务,通过蓝牙服务调用蓝牙协议栈,从而将第一连接请求发送至内核层,内核层调用蓝牙芯片驱动,驱动蓝牙芯片将连接请求发送至电子设备201,以请求建立蓝牙通信连接,并进行数据传输。
在采用集成的单元的情况下,电子设备100可以包括处理模块、存储模块和通信模块。其中,处理模块可以用于对电子设备的动作进行控制管理,例如,可以用于支持电子设备执行上述显示单元、检测单元和处理单元执行的步骤。存储模块可以用于支持电子设备执行存储程序代码和数据等。通信模块,可以用于支持电子设备与其他设备的通信。
其中,处理模块可以是处理器或控制器。其可以实现或执行结合本申请公开内容所描述的各种示例性的逻辑方框,模块和电路。处理器也可以是实现计算功能的组合,例如包含一个或多个微处理器组合,数字信号处理(digital signal processing,DSP)和微处理器的组合等等。存储模块可以是存储器。通信模块具体可以为射频电路、蓝牙芯片、Wi-Fi芯片等与其他电子设备交互的设备。
在一个实施例中,当处理模块为处理器,存储模块为存储器时,本实施例所涉及的电子设备可以为具有图2所示结构的设备。
本实施例还提供一种计算机可读存储介质,该计算机可读存储介质中存储有计算机指令,当该计算机指令在电子设备上运行时,使得电子设备执行上述相关方法步骤,实现上述实施例中的设备识别方法。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地产生按照本申请实施例所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线)或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质(例如软盘、硬盘、磁带)、光介质(例如DVD)、或者半导体介质(例如固态硬盘)等。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,该流程可以由计算机程序来指令相关的硬件完成,该程序可存储于计算机可读取存储介质中,该程序在执行时,可包括如上述各方法实施例的流程。而前述的存储介质包括:只读存储器(ROM)、随机存取存储器(RAM)、磁碟或者光盘等各种可存储程序代码的介质。

Claims (21)

  1. 一种设备识别方法,其特征在于,应用于带有摄像头的第一电子设备,所述方法包括:
    第一电子设备接收第一操作;
    响应于所述第一操作,所述第一电子设备显示第一界面,所述第一界面包括所述摄像头采集的预览画面,其中所述预览画面中包括第二电子设备;
    所述第一电子设备获取所述第二电子设备相对于所述第一电子设备的第一位置信息;
    所述第一电子设备基于所述第一位置信息,和所述第二电子设备在所述预览画面中的显示区域,确定出第一标签在所述预览画面中的显示位置,并在所述显示位置显示所述第一标签,其中所述第一标签用于标识所述第二电子设备;
    所述第一电子设备接收针对所述第一标签的第二操作;
    响应于所述第二操作,所述第一电子设备显示第二界面,所述第二界面包括控制所述第二电子设备的一个或多个控件。
  2. 根据权利要求1所述的方法,其特征在于,所述第一电子设备获取所述第二电子设备相对于所述第一电子设备的第一位置信息,具体包括:
    所述第一电子设备广播探测请求,所述探测请求包括所述第一电子设备的身份标识;
    所述第一电子设备接收到所述第二电子设备基于所述探测请求发送的探测响应时,基于所述探测响应确定所述第二电子设备与所述第一电子设备的第一位置信息,所述探测响应包括所述第二电子设备的身份标识。
  3. 根据权利要求1所述的方法,其特征在于,所述第一标签在所述预览画面中的显示位置和所述第二电子设备在所述预览画面中的显示区域,部分重叠或完全重叠。
  4. 根据权利要求1所述的方法,其特征在于,所述方法还包括:
    所述第一电子设备获取第三电子设备相对于所述第一电子设备的第二位置信息;
    当所述第一电子设备检测到所述预览画面中不包括所述第三电子设备,且基于所述第二位置信息确定出所述第三电子设备在所述摄像头的取景范围内;
    所述第一电子设备基于所述第二位置信息,确定出第二标签在所述预览画面中的显示位置,其中所述第二标签用于指示以下一种或多种信息:所述第三电子设备的标识信息、所述第三电子设备的遮挡物、所述第二位置信息。
  5. 根据权利要求4所述的方法,其特征在于,所述方法还包括:
    当所述第一电子设备检测到所述预览画面中不包括所述第三电子设备,且基于所述第二位置信息确定出所述第三电子设备不在所述摄像头的取景范围内;
    所述第一电子设备基于所述第二位置信息,确定出第三标签在所述预览画面中的显示位置,其中所述第三标签用于指示以下一种或多种信息:所述第三电子设备的标识信息、所述第二位置信息。
  6. 根据权利要求1所述的方法,其特征在于,所述预览画面中包括第四电子设备的图像,所述第一电子设备显示第一界面之后,还包括:
    所述第一电子设备基于所述预览画面确定所述第四电子设备的设备类型为第一类型;
    所述第一电子设备在所述第一电子设备的账号下关联或绑定的电子设备中,确定出设备类型为所述第一类型的第一目标设备;
    所述第一电子设备显示第四标签,所述第四标签用于指示所述第四电子设备的图像与所述第一目标设备关联。
  7. 根据权利要求1所述的方法,其特征在于,所述预览画面中包括第五电子设备的图像,所述第一电子设备显示第一界面之后,还包括:
    所述第一电子设备基于所述预览画面确定所述第五电子设备的设备类型为第二类型;
    所述第一电子设备获取所述第一电子设备的第三位置信息,所述第一电子设备中存储有电子设备和位置信息的对应关系;
    基于所述对应关系,所述第一电子设备根据所述第三位置信息,确定出所述设备类型为所述第二类型的第二目标设备,所述第二目标设备的位置信息与所述第三位置信息相同;
    所述第一电子设备显示第五标签,所述第五标签用于指示所述第五电子设备的图像与所述第二目标设备关联。
  8. 根据权利要求1所述的方法,其特征在于,所述第一界面还包括第一图标,所述第一图标关联了待分享数据,所述方法还包括:
    第一电子设备接收第三操作,所述第三操作为针对于所述第一标签和/或所述第一图标的操作;
    响应于所述第三操作,所述第一电子设备将所述待分享数据发送给所述第二电子设备。
  9. 根据权利要求8所述的方法,其特征在于,所述第一电子设备接收第三操作之前,还包括:
    所述第一电子设备根据所述待分享数据的数据类型,在所述第一界面上显示第一显示形式的所述第一标签,所述第一显示形式的所述第一标签用于提示用户所述第二电子设备支持输出所述待分享数据。
  10. 根据权利要求8所述的方法,其特征在于,所述预览画面中包括第三电子设备的图像和所述第三标签,所述第三标签与所述第三电子设备关联;所述方法还包括:
    第一电子设备接收第四操作,所述第四操作为针对于所述第一标签和/或所述第三图标的操作;
    响应于所述第四操作,所述第一电子设备输出提示消息,所述提示消息用于提示用户所述第三电子设备不支持输出所述待分享数据。
  11. 一种电子设备,其特征在于,包括:一个或多个处理器,存储器;所述存储器中包括计算机指令,当所述一个或多个处理器调用所述计算机指令时,使得所述电子设备执行:
    接收第一操作;
    响应于所述第一操作,显示第一界面,所述第一界面包括所述摄像头采集的预览画面,其中所述预览画面中包括第一目标设备;
    获取和所述第一目标设备的第一相对位置信息;
    基于所述第一相对位置信息,和所述第一目标设备在所述预览画面中的显示位置,确定出第一标签在所述预览画面中的显示位置,其中所述第一标签用于指示所述第一目标设备的标识信息;
    接收针对所述第一标签的第二操作;
    响应于所述第二操作,显示第二界面,所述第二界面包括控制所述第一目标设备的一个或多个控件。
  12. 根据权利要求11所述的电子设备,其特征在于,当所述一个或多个处理器调用所述计算机指令时,使得所述电子设备执行获取和所述第一目标设备的第一相对位置信息,具体包括:
    广播探测请求,所述探测请求包括所述电子设备的身份标识;
    接收到所述第一目标设备基于所述探测请求发送的探测响应时,基于所述探测响应确定和所述第一目标设备的第一相对位置信息,所述探测响应包括所述第一目标设备的身份标识。
  13. 根据权利要求11所述的电子设备,其特征在于,所述第一标签在所述预览画面中的显示位置和所述第一目标设备在所述预览画面中的显示位置,部分重叠或完全重叠。
  14. 根据权利要求11所述的电子设备,其特征在于,当所述一个或多个处理器调用所述计算机指令时,使得所述电子设备还执行:
    获取和第二目标设备的第二相对位置信息;
    当所述电子设备检测到所述预览画面中不包括所述第二目标设备,且基于所述第二相对位置信息确定出所述第二目标设备在所述摄像头的取景范围内;
    所述电子设备基于所述第二相对位置信息,确定出第二标签在所述预览画面中的显示位置,其中所述第二标签用于指示以下一种或多种信息:所述第二目标设备的标识信息、所述第二目标设备的遮挡物、所述第二相对位置信息。
  15. 根据权利要求14所述的电子设备,其特征在于,当所述一个或多个处理器调用所述计算机指令时,使得所述电子设备还执行:
    当所述电子设备检测到所述预览画面中不包括所述第二目标设备,且基于所述第二相对位置信息确定出所述第二目标设备不在所述摄像头的取景范围内;
    所述电子设备基于所述第二相对位置信息,确定出第三标签在所述预览画面中的显示位置,其中所述第三标签用于指示以下一种或多种信息:所述第二目标设备的标识信息、所述第二相对位置信息。
  16. 根据权利要求11所述的电子设备,其特征在于,所述预览画面中包括第三目标设备的图像,当所述一个或多个处理器调用所述计算机指令时,使得所述电子设备执行显示第一界面之后,所述电子设备还执行:
    基于所述预览画面确定所述第三目标设备的设备类型为第一类型;
    在所述电子设备的账号下关联或绑定的电子设备中,确定出设备类型为所述第一类型的设备的标识信息;
    显示第四标签,所述第四标签用于指示所述第三目标设备的图像与所述标识信息关联。
  17. 根据权利要求11所述的电子设备,其特征在于,所述预览画面中包括第四目标设备的图像,当所述一个或多个处理器调用所述计算机指令时,使得所述电子设备执行显示第一界面之后,所述电子设备还执行:
    基于所述预览画面确定所述第四目标设备的设备类型为第二类型;
    获取所述电子设备的位置信息,所述电子设备中存储有电子设备和位置信息的对应关系;
    所述电子设备根据所述位置信息,在所述对应关系中确定出所述设备类型为所述第二类型的设备的标识信息;
    显示第五标签,所述第五标签用于指示所述第四目标设备的图像与所述标识信息关联。
  18. 根据权利要求11所述的电子设备,其特征在于,所述第一界面还包括第一图标,所述第一图标关联了待分享数据,当所述一个或多个处理器调用所述计算机指令时,使得所述电子设备还执行:
    接收第三操作,所述第三操作为针对于所述第一标签和/或所述第一图标的操作;
    响应于所述第三操作,将所述待分享数据发送给所述第一目标设备。
  19. 根据权利要求18所述的电子设备,其特征在于,当所述一个或多个处理器调用所述计算机指令时,使得所述电子设备执行接收第三操作之前,所述电子设备还执行:
    根据所述待分享数据的数据类型,在所述第一界面上显示第一显示形式的所述第一标签,所述第一显示形式的所述第一标签用于提示用户所述第一目标设备支持输出所述待分享数据。
  20. 根据权利要求18所述的电子设备,其特征在于,所述预览画面中包括第二目标设备的图像和所述第三标签,所述第三标签与所述第二目标设备关联;当所述一个或多个处理器调用所述计算机指令时,使得所述电子设备还执行:
    接收第四操作,所述第四操作为针对于所述第一标签和/或所述第三图标的操作;
    响应于所述第四操作,输出提示消息,所述提示消息用于提示用户所述第二目标设备不支持输出所述待分享数据。
  21. 一种计算机可读介质,用于存储一个或多个程序,其中所述一个或多个程序被配置为由一个或多个处理器执行,所述一个或多个程序包括指令,所述指令用于执行如权利要求1-10中任一项所述的方法。
PCT/CN2021/110906 2020-08-05 2021-08-05 一种设备识别方法及相关装置 WO2022028537A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP21853577.1A EP4184905A4 (en) 2020-08-05 2021-08-05 DEVICE RECOGNITION METHOD AND ASSOCIATED APPARATUS
JP2023507686A JP2023538835A (ja) 2020-08-05 2021-08-05 デバイス識別方法及び関連する装置
US18/164,170 US20230188832A1 (en) 2020-08-05 2023-02-03 Device Identification Method and Related Apparatus

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
CN202010782270.8 2020-08-05
CN202010779841.2 2020-08-05
CN202010779841 2020-08-05
CN202010782270 2020-08-05
CN202011183311.8 2020-10-29
CN202011183311.8A CN114079691B (zh) 2020-08-05 2020-10-29 一种设备识别方法及相关装置

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/164,170 Continuation US20230188832A1 (en) 2020-08-05 2023-02-03 Device Identification Method and Related Apparatus

Publications (1)

Publication Number Publication Date
WO2022028537A1 true WO2022028537A1 (zh) 2022-02-10

Family

ID=80117036

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/110906 WO2022028537A1 (zh) 2020-08-05 2021-08-05 一种设备识别方法及相关装置

Country Status (5)

Country Link
US (1) US20230188832A1 (zh)
EP (1) EP4184905A4 (zh)
JP (1) JP2023538835A (zh)
CN (1) CN116489268A (zh)
WO (1) WO2022028537A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112269510B (zh) * 2020-10-29 2022-03-25 维沃移动通信(杭州)有限公司 信息处理方法、装置及电子设备
US11232193B1 (en) * 2020-11-04 2022-01-25 Malwarebytes Inc. Automated generation of a sandbox configuration for malware detection
EP4207717A4 (en) * 2020-11-11 2024-03-13 Samsung Electronics Co., Ltd. ELECTRONIC DEVICE AND USER INTERFACE DISPLAY METHOD

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120242866A1 (en) * 2011-03-22 2012-09-27 Kyocera Corporation Device, control method, and storage medium storing program
CN105843054A (zh) * 2016-03-22 2016-08-10 美的集团股份有限公司 控制家居设备的方法、智能家居系统及移动设备
CN106569409A (zh) * 2016-10-13 2017-04-19 杭州鸿雁电器有限公司 一种基于图形捕获的家居设备控制系统、家居控制设备及控制方法
CN109088803A (zh) * 2018-09-20 2018-12-25 塔普翊海(上海)智能科技有限公司 一种ar遥控装置、智能家居遥控系统及方法
CN111045344A (zh) * 2019-12-31 2020-04-21 维沃移动通信有限公司 一种家居设备的控制方法及电子设备
CN111262763A (zh) * 2020-04-02 2020-06-09 深圳市晶讯软件通讯技术有限公司 一种基于实景图的智能家居设备控制系统及方法

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9712948B2 (en) * 2014-04-30 2017-07-18 Avago Technologies General Ip (Singapore) Pte. Ltd. Image triggered pairing
US11153427B2 (en) * 2018-09-05 2021-10-19 Lg Electronics Inc. Mobile terminal and method for controlling the same
CN111432331B (zh) * 2020-03-30 2021-10-15 华为技术有限公司 一种无线连接方法、装置和终端设备


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4184905A4

Also Published As

Publication number Publication date
EP4184905A4 (en) 2024-03-20
JP2023538835A (ja) 2023-09-12
CN116489268A (zh) 2023-07-25
EP4184905A1 (en) 2023-05-24
US20230188832A1 (en) 2023-06-15


Legal Events

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21853577; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2023507686; Country of ref document: JP; Kind code of ref document: A)
ENP Entry into the national phase (Ref document number: 2021853577; Country of ref document: EP; Effective date: 20230220)
NENP Non-entry into the national phase (Ref country code: DE)