CN114079691A - Equipment identification method and related device


Info

Publication number
CN114079691A
Authority
CN
China
Prior art keywords
electronic device
interface
electronic
icon
user
Legal status
Granted
Application number
CN202011183311.8A
Other languages
Chinese (zh)
Other versions
CN114079691B (English)
Inventor
徐杰
龙嘉裕
吴思举
孙科
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Priority to CN202211294205.6A (published as CN116489268A)
Priority to PCT/CN2021/110906 (published as WO2022028537A1)
Priority to JP2023507686A (published as JP7537828B2)
Priority to EP21853577.1A (published as EP4184905A4)
Publication of CN114079691A
Application granted
Publication of CN114079691B
Priority to US18/164,170 (published as US20230188832A1)

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632 Graphical user interfaces [GUI] specially adapted for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/70 Services for machine-to-machine communication [M2M] or machine type communication [MTC]
    • H04W4/80 Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
    • H04W76/00 Connection management
    • H04W76/10 Connection setup
    • H04W76/14 Direct-mode setup
    • H04W8/00 Network data management
    • H04W8/005 Discovery of network devices, e.g. terminals

Landscapes

  • Engineering & Computer Science
  • Signal Processing
  • Computer Networks & Wireless Communication
  • Human Computer Interaction
  • Multimedia
  • Databases & Information Systems
  • User Interface Of Digital Computer

Abstract

Disclosed are a device identification method and a related device. The method comprises: a first electronic device receives a first operation, displays a first interface and starts a camera, where the first interface includes a preview picture captured by the camera and the preview picture includes a second electronic device; the first electronic device determines, by means of image recognition and wireless positioning, the display position in the preview picture of a first tag of the second electronic device, where the first tag is used to identify the second electronic device; when the first electronic device detects a user operation on the first tag, the first electronic device outputs a second interface that includes one or more controls for controlling the second electronic device. By presenting the correspondence between the first tag and the second electronic device in real time in an augmented-reality display, the application enables interaction between the first electronic device and the second electronic device through the first tag, realizes coordinated control among multiple devices, and improves user experience.

Description

Equipment identification method and related device
Technical Field
The present application relates to the field of electronic technologies, and in particular to a device identification method and a related device.
Background
With the development of technology, smart interconnected devices are increasingly popular. More and more users own several electronic devices such as smartphones, computers, smart televisions, tablets and smart speakers, and a household may further contain electronic devices such as smart audio-visual equipment, routers/Wi-Fi boxes, smart cleaning devices, smart kitchen appliances and smart lighting systems.
When a user needs to select one or more specific devices for interaction (e.g., control, pairing, data transmission or screen projection), the target device has to be found and selected among the many devices through menus/lists, maps, NFC touch and the like, which makes the user operation cumbersome.
Disclosure of Invention
The embodiment of the application provides a device identification method and a related device, which can intuitively display identification information of nearby devices through a simple operation, provide an interaction path among the devices, realize coordinated control among multiple devices, and effectively improve user experience.
It should be noted that, in each embodiment provided in the present application, the steps may be executed in several possible orders: some or all of the steps may be executed sequentially or in parallel.
In a first aspect, the present application provides a device identification method applied to a first electronic device with a camera, where the method includes: the first electronic device receives a first operation; in response to the first operation, the first electronic device displays a first interface, where the first interface includes a preview picture captured by the camera, and the preview picture includes a second electronic device; the first electronic device acquires first position information of the second electronic device relative to the first electronic device; the first electronic device determines a display position of a first tag in the preview picture based on the first position information and the display area of the second electronic device in the preview picture, and displays the first tag at the display position, where the first tag is used to identify the second electronic device; the first electronic device receives a second operation on the first tag; in response to the second operation, the first electronic device displays a second interface that includes one or more controls for controlling the second electronic device.
According to the embodiment of the application, the first electronic device receives a first operation, displays a first interface, starts a camera, and displays the image captured by the camera in real time in the first interface. The first electronic device recognizes the electronic device in the image, such as a second electronic device, and its device type (for example a speaker, a computer or a tablet) using image recognition, and obtains position information of the second electronic device relative to the first electronic device using wireless positioning (e.g., UWB positioning, Bluetooth positioning, Wi-Fi positioning). The position information includes one or more of a distance, a direction and an angle. Based on the position information, the first electronic device determines the display position in the preview picture of a first tag of the second electronic device; the first tag identifies the second electronic device, for example its device name and device type. The display position of the first tag is related to the display position of the second electronic device. When the first electronic device detects a user operation on the first tag, it outputs a second interface that includes one or more controls for controlling the second electronic device. The second interface may be superimposed on the first interface, or the electronic device may jump from the first interface to the second interface. By presenting the correspondence between the first tag and the second electronic device in real time in an augmented-reality display, the application enables interaction between the first electronic device and the second electronic device through the first tag, realizes coordinated control among multiple devices, and improves user experience.
In some possible embodiments, the first electronic device acquires the first position information of the second electronic device relative to the first electronic device as follows: the first electronic device broadcasts a probe request, where the probe request includes an identity of the first electronic device; when the first electronic device receives a probe response sent by the second electronic device based on the probe request, it determines the first position information of the second electronic device relative to the first electronic device based on the probe response, where the probe response includes an identity of the second electronic device. In this manner, the first position information includes the relative position, such as a distance, a direction or an angle, of the second electronic device with respect to the first electronic device. The first electronic device can calculate the distance between the two devices from the time difference between sending the probe request and receiving the probe response (the distance equals the time difference multiplied by the propagation speed of the electromagnetic wave), and can determine the azimuth angle of the second electronic device relative to the first electronic device by calculating the angle of arrival of the probe response.
Optionally, the probe response includes the identity of the second electronic device and the first position information, and the first electronic device determines the first position information based on the probe response. Specifically, the second electronic device calculates the relative position between itself and the first electronic device from the received probe request: the probe request includes a sending time, and the second electronic device determines a time difference based on the sending time and the time at which it receives the probe request, thereby calculating the distance between the two devices; the second electronic device also calculates the angle of arrival of the probe request, from which it can determine the azimuth angle of itself relative to the first electronic device. The second electronic device then sends a probe response to the first electronic device, where the probe response includes the identity of the second electronic device and the first position information.
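For illustration only, the ranging and bearing arithmetic described above can be sketched in Python as follows. The sketch assumes synchronized clocks for the one-way time difference and a classic two-antenna phase-difference model for the angle of arrival; all names and numeric values are hypothetical and are not part of the application.

    import math

    SPEED_OF_LIGHT = 299_792_458.0  # propagation speed of the electromagnetic wave, m/s

    def distance_from_timestamps(t_send: float, t_receive: float) -> float:
        """Distance = time difference x propagation speed: the probe request
        carries its sending time, and the receiver timestamps the arrival."""
        return (t_receive - t_send) * SPEED_OF_LIGHT

    def azimuth_from_phase(phase_diff_rad: float, antenna_spacing_m: float,
                           wavelength_m: float) -> float:
        """Azimuth from the phase difference seen by two antennas separated
        by antenna_spacing_m (a common angle-of-arrival model, assumed here)."""
        sin_theta = phase_diff_rad * wavelength_m / (2 * math.pi * antenna_spacing_m)
        return math.degrees(math.asin(max(-1.0, min(1.0, sin_theta))))

    # A probe received 20 ns after it was sent is roughly 6 m away.
    print(distance_from_timestamps(0.0, 20e-9))   # ~5.996 m
    print(azimuth_from_phase(1.2, 0.025, 0.046))  # ~20.6 degrees (UWB near 6.5 GHz)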
In some possible embodiments, the display position of the first tag in the preview picture partially or completely overlaps the display area of the second electronic device in the preview picture. The first tag may be displayed inside the display area of the second electronic device, at the edge of that display area, or at a position close to that display area.
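Purely as an illustration of such tag placement, the following sketch derives a tag anchor from the device's display area in the preview picture; the coordinate convention, the three placement modes and all names are assumptions rather than the application's implementation.

    def tag_position(bbox, tag_w, tag_h, frame_w, frame_h, mode="overlap"):
        """bbox = (x, y, w, h) of the recognized device in the preview frame;
        returns the top-left corner of the tag."""
        x, y, w, h = bbox
        if mode == "overlap":                      # tag inside the display area
            tx, ty = x + (w - tag_w) / 2, y + (h - tag_h) / 2
        elif mode == "edge":                       # tag on the upper edge of the area
            tx, ty = x + (w - tag_w) / 2, y - tag_h
        else:                                      # "near": just right of the area
            tx, ty = x + w, y + (h - tag_h) / 2
        # Clamp so the tag always stays inside the preview frame.
        tx = max(0, min(tx, frame_w - tag_w))
        ty = max(0, min(ty, frame_h - tag_h))
        return tx, ty

    print(tag_position((400, 300, 200, 150), 120, 40, 1080, 1920, mode="edge"))  # (440.0, 260)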
In some possible embodiments, the method further includes: the first electronic device acquires second position information of a third electronic device relative to the first electronic device; when the first electronic device detects that the preview picture does not include the third electronic device, it determines, based on the second position information, that the third electronic device is within the viewfinder range of the camera; the first electronic device determines the display position of a second tag in the preview picture based on the second position information, where the second tag indicates one or more of the following: identification information of the third electronic device, the obstruction blocking the third electronic device, and the second position information. In this manner, when the first electronic device detects that the relative position of the third electronic device is within the viewfinder range of the camera but the preview picture does not include an image of the third electronic device, the first electronic device determines that the third electronic device is blocked and outputs the second tag, which indicates one or more of the identification information of the third electronic device, the obstruction, and the blocked position in the preview interface.
In some possible embodiments, the method further includes: when the first electronic device detects that the preview picture does not include the third electronic device, it determines, based on the second position information, that the third electronic device is not within the viewfinder range of the camera; the first electronic device determines the display position of a third tag in the preview picture based on the second position information, where the third tag indicates one or more of the following: the identification information of the third electronic device and the second position information. In this manner, when the first electronic device detects that the relative position of the third electronic device is outside the viewfinder range of the camera and the preview picture does not include an image of the third electronic device, the first electronic device determines that the third electronic device is not in the viewfinder and outputs the third tag, which indicates one or more of the identification information of the third electronic device and its relative position (direction, angle, distance, etc.) with respect to the first electronic device.
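The choice among the first, second and third tags can be sketched as below, assuming a simple horizontal field-of-view test against the azimuth obtained by wireless positioning; the function names and the field-of-view model are illustrative assumptions.

    def classify_device(azimuth_deg, fov_deg, seen_in_preview):
        """azimuth_deg: bearing of the device relative to the camera axis
        (from wireless positioning); seen_in_preview: whether image
        recognition found the device in the current frame."""
        in_viewfinder = abs(azimuth_deg) <= fov_deg / 2
        if seen_in_preview:
            return "first tag"    # visible: label the device in place
        if in_viewfinder:
            return "second tag"   # within the view but absent from the image: blocked
        return "third tag"        # outside the viewfinder: show direction/distance

    print(classify_device(10.0, 80.0, seen_in_preview=False))  # second tag (blocked)
    print(classify_device(70.0, 80.0, seen_in_preview=False))  # third tag (out of view)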
In some possible embodiments, the preview picture includes an image of a fourth electronic device, and after the first electronic device displays the first interface, the method further includes: the first electronic device determines, based on the preview picture, that the device type of the fourth electronic device is a first type; the first electronic device determines, among the electronic devices associated or bound with the account of the first electronic device, a first target device whose device type is the first type; the first electronic device displays a fourth tag indicating that the image of the fourth electronic device is associated with the first target device. This manner applies when the first electronic device cannot detect the position information of the fourth electronic device although the image of the fourth electronic device is in the preview picture. In this case, the first electronic device identifies the device type of the fourth electronic device by image recognition and checks whether a device of that type exists among the devices logged in to the same account (for example, a HUAWEI account) as the first electronic device. If so, the first electronic device regards that target device as the fourth electronic device and outputs a fourth tag identifying the target device.
In some possible embodiments, the preview picture includes an image of a fifth electronic device, and after the first electronic device displays the first interface, the method further includes: the first electronic device determines, based on the preview picture, that the device type of the fifth electronic device is a second type; the first electronic device acquires third position information of the first electronic device, the first electronic device storing correspondences between electronic devices and position information; based on the correspondences, the first electronic device determines, according to the third position information, a second target device of the second type whose stored position information is the same as the third position information; the first electronic device displays a fifth tag indicating that the image of the fifth electronic device is associated with the second target device. This manner applies when the first electronic device cannot detect the position information of the fifth electronic device although the image of the fifth electronic device is in the preview picture. Since the first electronic device stores correspondences between electronic devices and position information (e.g., smart speaker - living room, smart desk lamp - bedroom, computer - company), it checks, according to its current geographical position and the device type of the fifth electronic device identified by image recognition, whether a device of that type exists among the devices at the same geographical position. If so, the first electronic device regards that target device as the fifth electronic device and outputs a fifth tag identifying the target device.
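A minimal sketch of the account-based (fourth tag) and location-based (fifth tag) fallback matching follows; the device table, names and locations are invented for illustration and stand in for the devices bound under the account.

    ACCOUNT_DEVICES = [                 # devices under the same account (made up)
        {"name": "HUAWEI Sound X", "type": "speaker",  "location": "living room"},
        {"name": "MateBook",       "type": "computer", "location": "company"},
    ]

    def match_by_account(recognized_type):
        """Fourth tag: match the device type recognized from the image
        against the devices logged in to the same account."""
        return [d for d in ACCOUNT_DEVICES if d["type"] == recognized_type]

    def match_by_location(recognized_type, current_location):
        """Fifth tag: additionally require the stored location to equal the
        first electronic device's current geographical position."""
        return [d for d in ACCOUNT_DEVICES
                if d["type"] == recognized_type and d["location"] == current_location]

    print(match_by_account("speaker"))               # [{'name': 'HUAWEI Sound X', ...}]
    print(match_by_location("computer", "company"))  # [{'name': 'MateBook', ...}]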
In some possible embodiments, the first interface further includes a first icon, the first icon being associated with data to be shared, and the method further includes: the first electronic device receives a third operation, where the third operation is an operation on the first tag and/or the first icon; in response to the third operation, the first electronic device sends the data to be shared to the second electronic device. The third operation includes, but is not limited to, a drag operation, a tap operation, and the like. In this data sharing manner, the second electronic device to share with is selected directly on the first interface, and the data to be shared is sent to it. This simplifies the user operation for data sharing, displays device information intuitively, and improves user experience.
In some possible embodiments, before the first electronic device receives the third operation, the method further includes: the first electronic device displays, on the first interface and according to the data type of the data to be shared, the first tag in a first display form, where the first tag in the first display form prompts the user that the second electronic device supports outputting the data to be shared. The first display form may highlight (change the brightness, color, etc. of) the display area of the first tag.
In some possible embodiments, the preview picture includes an image of the third electronic device and a third tag associated with the third electronic device, and the method further includes: the first electronic device receives a fourth operation, where the fourth operation is an operation on the third tag and/or the first icon; in response to the fourth operation, the first electronic device outputs a prompt message prompting the user that the third electronic device does not support outputting the data to be shared.
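The capability check behind the first display form and the prompt message might look like the following sketch; the capability table and the device names are invented for illustration.

    CAPABILITIES = {                     # which data types each device can output
        "HUAWEI Sound X": {"audio"},
        "Vision TV":      {"audio", "image", "video"},
    }

    def highlight_targets(data_type):
        """First display form: mark every tag whose device supports the type."""
        return [name for name, caps in CAPABILITIES.items() if data_type in caps]

    def on_drop_on_tag(device_name, data_type):
        """Drop of the first icon on a tag: send, or prompt if unsupported."""
        if data_type in CAPABILITIES.get(device_name, set()):
            return f"send {data_type} to {device_name}"
        return f"prompt: {device_name} does not support outputting {data_type}"

    print(highlight_targets("image"))                 # ['Vision TV']
    print(on_drop_on_tag("HUAWEI Sound X", "image"))  # prompt: ... does not support ...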
In a second aspect, the present application provides an electronic device, comprising: one or more processors and a memory; the memory includes computer instructions that, when invoked by the one or more processors, cause the electronic device to perform:
receiving a first operation;
in response to the first operation, displaying a first interface, where the first interface includes a preview picture captured by a camera, and the preview picture includes a first target device;
acquiring first relative position information of the first target device;
determining a display position of a first tag in the preview picture based on the first relative position information and the display position of the first target device in the preview picture, where the first tag indicates identification information of the first target device;
receiving a second operation on the first tag;
in response to the second operation, displaying a second interface, where the second interface includes one or more controls for controlling the first target device.
According to the embodiment of the application, the electronic device receives an operation, displays an interface, starts a camera, and displays the image captured by the camera in real time in the interface. The electronic device recognizes the electronic device in the image, such as a first target device, and its device type (for example a speaker, a computer or a tablet) using image recognition, and obtains position information of the first target device relative to the electronic device using wireless positioning (e.g., UWB positioning, Bluetooth positioning, Wi-Fi positioning). The position information includes one or more of a distance, a direction and an angle. Based on the position information, the electronic device determines the display position in the preview picture of a tag of the first target device; the tag identifies the first target device, for example its device name and device type. The display position of the tag is related to the display position of the first target device. When the electronic device detects a user operation on the tag, it outputs a second interface that includes one or more controls for controlling the first target device. The second interface may be superimposed on the interface, or the electronic device may jump from the interface to the second interface. By presenting the correspondence between the first tag and the first target device in real time on the first interface in an augmented-reality display, the embodiment enables interaction between the electronic device and the first target device through the first tag, realizes coordinated control among multiple devices, and improves user experience.
In some possible embodiments, when the one or more processors invoke the computer instructions, the electronic device is caused to perform the acquiring of the first relative position information of the first target device, which specifically includes: broadcasting a probe request, where the probe request includes an identity of the electronic device; when a probe response sent by the first target device based on the probe request is received, determining the first relative position information of the first target device based on the probe response, where the probe response includes the identity of the first target device.
Optionally, the probe response includes the identity of the first target device and the first relative position information, and the electronic device determines from the probe response the first relative position information, such as a distance, a direction or an angle, of the first target device. Specifically, the first target device calculates the relative position between itself and the electronic device from the received probe request: the probe request includes a sending time, and the first target device determines a time difference based on the sending time and the time at which it receives the probe request, thereby calculating the distance between the two devices; the first target device also calculates the angle of arrival of the probe request, from which it can determine the azimuth angle of itself relative to the electronic device. The first target device then sends a probe response to the electronic device, where the probe response includes the identity of the first target device and the first relative position information.
In some possible embodiments, the display position of the first tag in the preview picture partially or completely overlaps the display position of the first target device in the preview picture. The first tag may be displayed inside the display area of the first target device, at the edge of that display area, or at a position close to that display area.
In some possible implementations, the computer instructions, when invoked by the one or more processors, cause the electronic device to further perform: acquiring second relative position information of a second target device; when the electronic device detects that the preview picture does not include the second target device, determining, based on the second relative position information, that the second target device is within the viewfinder range of the camera; and determining the display position of a second tag in the preview picture based on the second relative position information, where the second tag indicates one or more of the following: identification information of the second target device, the obstruction blocking the second target device, and the second relative position information.
In some possible implementations, the computer instructions, when invoked by the one or more processors, cause the electronic device to further perform: when the electronic device detects that the preview picture does not include the second target device, determining, based on the second relative position information, that the second target device is not within the viewfinder range of the camera; and determining the display position of a third tag in the preview picture based on the second relative position information, where the third tag indicates one or more of the following: identification information of the second target device and the second relative position information.
In some possible embodiments, the preview picture includes an image of a third target device, and after displaying the first interface the electronic device further performs: determining, based on the preview picture, that the device type of the third target device is a first type; determining identification information of a device whose device type is the first type among the electronic devices associated or bound under the account of the electronic device; and displaying a fourth tag indicating that the image of the third target device is associated with that identification information.
In some possible embodiments, the preview picture includes an image of a fourth target device, and after displaying the first interface the electronic device further performs: determining, based on the preview picture, that the device type of the fourth target device is a second type; acquiring position information of the electronic device, the electronic device storing correspondences between electronic devices and position information; determining, according to the acquired position information, identification information of a device of the second type in the correspondences; and displaying a fifth tag indicating that the image of the fourth target device is associated with that identification information.
In some possible embodiments, the first interface further includes a first icon, the first icon being associated with data to be shared, and when the one or more processors invoke the computer instructions, the electronic device is further caused to perform: receiving a third operation, where the third operation is an operation on the first tag and/or the first icon; in response to the third operation, sending the data to be shared to the first target device. The third operation includes, but is not limited to, a drag operation, a tap operation, and the like. In this data sharing manner, the first target device to share with is selected directly on the first interface, and the data to be shared is sent to it. This simplifies the user operation for data sharing, displays device information intuitively, and improves user experience.
In some possible implementations, before receiving the third operation the electronic device further performs: displaying, on the first interface and according to the data type of the data to be shared, the first tag in a first display form, where the first tag in the first display form prompts the user that the first target device supports outputting the data to be shared. The first display form may highlight (change the brightness, color, etc. of) the display area of the first tag.
In some possible embodiments, the preview picture includes an image of the second target device and a third tag associated with the second target device; when the one or more processors invoke the computer instructions, the electronic device is further caused to perform: receiving a fourth operation, where the fourth operation is an operation on the third tag and/or the first icon; in response to the fourth operation, outputting a prompt message prompting the user that the second target device does not support outputting the data to be shared.
In a third aspect, the present application provides a photo sharing method applied to a first electronic device, the method including: displaying a shooting preview interface of the first electronic device, where the shooting preview interface includes a thumbnail of a first photo and a preview picture captured by a camera of the first electronic device; identifying a sixth electronic device included in the preview picture; determining the relative position of the sixth electronic device and the first electronic device; displaying a tag of the sixth electronic device on the preview picture based on the recognized sixth electronic device and the relative position, where the tag identifies the sixth electronic device; receiving a fifth operation on the thumbnail of the first photo; in response to the fifth operation, moving the thumbnail of the first photo to the display area of the sixth electronic device identified by the tag on the preview picture; and sending the first photo to the sixth electronic device.
In the embodiment of the present application, the camera application main interface displayed when the user taps the icon of the camera application may be referred to as the "shooting preview interface", and the picture presented in the shooting preview interface may be referred to as the "preview image" or "preview picture".
It should be understood that in the embodiment of the present application the shooting preview interface may denote an interface including a preview picture, a shooting shutter key, a local album icon, a camera switching icon and the like; if the display content on the interface changes, for example a tag of an identified device is displayed, the interface may still be referred to as the shooting preview interface, which will not be repeated below.
It should be noted that the preview picture may be captured by either the front camera or the rear camera of the mobile phone; the embodiment of the present application does not limit which camera is used. For example, if a portrait photo was taken with the front camera and the user wants to identify electronic devices through the rear camera, the user can switch cameras by tapping the camera switching key, and vice versa.
Through this implementation, the mobile phone can recognize the electronic devices in the preview picture in advance, so that when the user starts the photo sharing function, the names of the recognized electronic devices are displayed in the interface quickly, speeding up the recognition of objects in the preview picture. For example, after the mobile phone identifies the sixth electronic device included in the current preview picture, the user may drag the thumbnail of the first photo onto the sixth electronic device to be shared with, as needed.
It should be understood that, for the implementation process described above, the mobile phone may detect and identify the other electronic devices included in the preview picture in many different ways, such as image detection, 3D scanning technology and machine vision; the embodiment of the present application does not limit the way in which the mobile phone identifies other electronic devices in the preview picture.
In one possible implementation, the thumbnail of the first photo may be the local album icon. For example, the local album icon shows the photo the user took most recently.
In another possible implementation, the thumbnail of the first photo may have the same style or appearance as the local album icon and is displayed floating on the shooting preview interface. With reference to the third aspect and the foregoing implementations, in some implementations of the third aspect, the method further includes: receiving a sixth operation on the album icon; and in response to the sixth operation, displaying the thumbnail of the first photo floating on the shooting preview interface.
With reference to the third aspect and the foregoing implementations, in some implementations of the third aspect, the fifth operation is an operation of dragging the thumbnail of the first photo, and the sixth operation is an operation of long-pressing the local album icon.
In the above method, a long-press operation is taken as an example: the user's long press on the local album icon triggers the photo sharing process. It should be understood that the photo sharing process provided in the embodiment of the present application may also be triggered by other preset operations; for example, the preset operation is not limited to long-pressing the local album icon, and may instead be double-tapping the local album icon or drawing a fixed pattern on the shooting preview interface, which is not limited in the embodiment of the present application.
With reference to the third aspect and the foregoing implementations, in some implementations of the third aspect, the tag of the sixth electronic device identifies the name of the sixth electronic device and/or the location of the sixth electronic device.
In this embodiment of the application, after the mobile phone identifies the sixth electronic device included in the preview picture, the position at which the tag of the sixth electronic device is displayed may be determined from the display position of the sixth electronic device in the current preview picture. In one possible manner, the mobile phone displays the tag in the area of the preview picture where the sixth electronic device is located.
Alternatively, the tag of the sixth electronic device may be displayed in an area near the positioning apparatus of the sixth electronic device, or in a blank area of the preview picture, so as not to obscure other objects in the preview picture.
This display manner can mark the identified electronic device without blocking other objects in the preview picture, does not affect the user's view, and improves the user's visual experience.
By this method, while taking photos the user can start the device identification and positioning functions of the mobile phone through a preset operation, identify the other electronic devices included in the camera preview picture by combining these functions, and directly drag the photo to be shared to the area where another electronic device is located, thereby quickly sharing the photo with the electronic devices existing around. This simplifies the operation flow of photo sharing, shortens the time of sharing photos, and improves user experience.
With reference to the third aspect, in some implementations of the third aspect, the first electronic device includes a first positioning chip and the sixth electronic device includes a second positioning chip, and the identifying of the sixth electronic device included in the preview picture and the determining of the relative position of the sixth electronic device and the first electronic device include: identifying the sixth electronic device included in the preview picture and determining the relative position of the sixth electronic device and the first electronic device based on the first positioning chip, the second positioning chip and the preview picture, where each of the first and second positioning chips includes at least one of a Bluetooth positioning chip and an ultra-wideband (UWB) positioning chip.
In the embodiment of the application, the mobile phone can identify the other electronic devices in the preview picture and locate their positions through several possible positioning technologies. Optionally, the positioning technology may include one of, or a fusion of, Bluetooth-based wireless sensing positioning, ultra-wideband (UWB)-based wireless sensing positioning and computer-vision-based positioning, or other positioning technologies; the embodiment of the present application does not limit the way in which the mobile phone locates other electronic devices.
With reference to the third aspect and the foregoing implementations, in some implementations of the third aspect, the shooting preview interface further includes a shooting shutter key, and the method further includes: receiving a seventh operation on the shooting shutter key; and in response to the seventh operation, taking the first photo.
Optionally, with this method the user can directly share the most recently taken first photo to other devices while taking photos through the camera application, or share the first photo with the latest date in the local album to other devices.
With reference to the third aspect and the foregoing implementations, in some implementations of the third aspect, before the shooting preview interface displays the thumbnail of the first photo, the method further includes: receiving an eighth operation; in response to the eighth operation, displaying a photo list on the shooting preview interface, where the photo list includes the first photo and a plurality of second photos taken earlier than the first photo; receiving a ninth operation; and in response to the ninth operation, selecting at least one second photo from the photo list. After the thumbnail of the first photo is moved to the display area of the sixth electronic device identified by the tag on the preview picture, the method further includes: sending the first photo and the selected at least one second photo to the sixth electronic device.
With reference to the third aspect and the foregoing implementations, in some implementations of the third aspect, the eighth operation is a sliding operation in a preset direction starting from the local album icon, and the ninth operation is a tap operation.
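As a small illustration of assembling the multi-photo share from the photo list (newest first, with the first photo selected by default), consider the sketch below; the photo records and the selection model are assumptions.

    photos = [                      # the photo list, newest first (made up)
        {"name": "IMG_003", "taken": "2020-08-05 10:30"},  # the first photo
        {"name": "IMG_002", "taken": "2020-08-05 09:12"},  # second photos ...
        {"name": "IMG_001", "taken": "2020-08-04 18:47"},
    ]

    selected = {photos[0]["name"]}   # the first photo is selected by default

    def toggle(name):
        """Ninth operation: tapping a selection box toggles that photo."""
        selected.symmetric_difference_update({name})

    toggle("IMG_001")                # the user adds one second photo
    to_send = [p for p in photos if p["name"] in selected]
    print([p["name"] for p in to_send])  # ['IMG_003', 'IMG_001']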
By this method, while taking photos the user can start the device identification and positioning functions of the mobile phone through a preset operation and identify the other electronic devices in the camera preview picture by combining these functions; the user can then select several photos to be shared and directly drag them to the area where another electronic device is located, thereby quickly sharing the photos with the electronic devices existing around. This simplifies the operation flow of photo sharing, shortens the time of sharing photos, and improves user experience.
In one possible scenario, the photos in the photo list may be arranged in the order in which they were taken. Illustratively, the first photo is the photo the user took most recently, and the second photos were taken earlier than the first photo.
Alternatively, the photos in the photo list may be arranged in other possible orders; for example, if it is detected that the shooting location is the user's company, the photos whose shooting location is the company may be displayed in the photo list, which is not limited in this embodiment of the present application.
In one possible scenario, after the user triggers the display of the photo list on the interface through a slide-up operation, the first photo in the photo list may be selected by default. If the user does not wish to share the first photo, the user can tap the selection box at the lower right corner of the first photo to deselect it. Similarly, if the user wishes to share the first photo and at least one second photo at the same time, the user can tap the selection box at the lower right corner of each second photo to select multiple photos to be shared, which will not be repeated here.
Optionally, the process of sharing multiple photos provided in the embodiment of the present application may also be triggered by other preset operations; for example, the preset operation is not limited to pressing the local album icon and dragging it upward, and may instead be double-tapping the local album icon or drawing a fixed pattern on the shooting preview interface, which is not limited in the embodiment of the present application.
With reference to the third aspect and the foregoing implementations, in some implementations of the third aspect, when the thumbnail of the first photo is moved to the display area of the sixth electronic device identified by the tag, the display effect of the tag of the sixth electronic device changes, where the display effect includes one or more of the color, size and animation effect of the tag.
Specifically, the user may drag the thumbnail of the first photo to the position of the sixth electronic device and then release it; the icon of the sixth electronic device may then be presented in a different color, or show other dynamic effects such as a size change, jumping or a flashing light, to remind the user that the first photo just taken is being shared with the sixth electronic device identified in the preview picture.
Alternatively, while the user drags the thumbnail of the first photo, a reminder control may be displayed on the preview picture. For example, the reminder control may be an arrow displayed statically, jumping or flashing, prompting the user that the thumbnail of the first photo can be dragged to the position identified by the arrow to share the photo. The display manner of the reminder control is not limited in the embodiment of the application.
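For illustration, the drag interaction (hovering changes a tag's display effect; releasing inside a tag's area triggers the send) can be sketched as follows, with invented tag geometry and names.

    def inside(point, rect):
        """Hit test: is the drag point inside a tag's display area?"""
        x, y = point
        rx, ry, rw, rh = rect
        return rx <= x <= rx + rw and ry <= y <= ry + rh

    TAG_AREAS = {"MatePad": (100, 200, 160, 48), "Vision TV": (600, 120, 160, 48)}

    def on_drag_move(point):
        """Emphasize (color/size/animation) the tag the thumbnail hovers over."""
        return {name: ("highlight" if inside(point, rect) else "normal")
                for name, rect in TAG_AREAS.items()}

    def on_drag_release(point, photo):
        """Releasing the thumbnail inside a tag's area sends the photo."""
        for name, rect in TAG_AREAS.items():
            if inside(point, rect):
                return f"send {photo} to {name}"
        return "no target: cancel"

    print(on_drag_move((150, 220)))                # MatePad highlighted
    print(on_drag_release((150, 220), "IMG_003"))  # send IMG_003 to MatePad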
With reference to the third aspect and the foregoing implementations, in some implementations of the third aspect, when the sixth electronic device is blocked in the preview picture, or when it is detected that the sixth electronic device is located outside the range corresponding to the preview picture, the method further includes: displaying prompt information on the shooting preview interface, where the prompt information prompts the user about the position of the sixth electronic device, or prompts the user to adjust the position of the first electronic device so that the sixth electronic device appears in the preview picture of the first electronic device.
It should be noted that the mobile phone may communicate with other nearby electronic devices in several possible ways, for example through Bluetooth or a wireless fidelity (Wi-Fi) module, so that the mobile phone can sense the electronic devices existing nearby. Alternatively, the mobile phone may determine that another electronic device exists nearby by a wireless positioning technology such as UWB, recognize the type of the electronic device, and display it in the shooting preview interface. The embodiment of the application does not limit the way the mobile phone communicates and establishes connections with other nearby electronic devices.
By this method, when the mobile phone recognizes that another electronic device exists in the preview picture but is blocked by an obstacle, reminder information such as text or an icon can be displayed on the shooting preview interface during photo sharing to remind the user of the position of the blocked electronic device. The user can then still quickly share the photo just taken with the blocked electronic device. This provides a feasible way to share photos with blocked electronic devices and simplifies the user's operation steps.
In a possible scenario, the mobile phone may recognize through wireless positioning that a sixth electronic device exists nearby although it is not displayed in the current preview picture. For this scenario, the embodiment of the application can also display reminder information on the shooting preview interface to remind the user that the sixth electronic device exists in a certain direction.
Optionally, in the embodiment of the present application, besides the text reminder in the reminder window, an icon reminder may also be included. For example, in addition to the reminder window, the shooting preview interface of the mobile phone may include an icon marking the position of the blocked sixth electronic device, such as a statically displayed arrow, a dynamically flashing arrow or a jumping arrow, which is not limited in this embodiment of the application.
Alternatively, the user may rotate the mobile phone according to the reminder information on the interface so that its camera can capture the detected sixth electronic device; the sixth electronic device with which the user intends to share the photo is then displayed in the preview picture, and the photo just taken can be quickly shared with it according to the method described above.
More generally, the mobile phone can identify other nearby electronic devices through wireless positioning even if those devices are not displayed in the current preview picture; in this scenario the embodiment of the application can likewise display reminder information on the shooting preview interface to remind the user that other electronic devices exist in a certain direction.
To sum up, according to the photo sharing method provided by the embodiment of the application, the user can start the device identification and positioning functions of the electronic device through preset operations while taking photos or running the camera application. Based on these functions, the other electronic devices included in the camera preview picture are identified, and the user can select one or more photos to be shared through a quick operation and directly drag them to the area where another electronic device is located, thereby quickly sharing them with the electronic devices existing around. In addition, for scenarios such as a blocked electronic device in the preview picture, the embodiment provides a user-friendly interactive interface that makes it convenient to share one or more photos through quick operations, simplifying the operation flow of photo sharing, shortening the time of sharing photos, and improving user experience.
In a fourth aspect, a first electronic device is provided, comprising: a processor and a memory; the memory stores one or more instructions that, when executed by the processor, cause the first electronic device to perform the following steps: displaying a shooting preview interface of the first electronic device, where the shooting preview interface includes a thumbnail of a first photo and a preview picture captured by a camera of the first electronic device; identifying a sixth electronic device included in the preview picture; determining the relative position of the sixth electronic device and the first electronic device; displaying a tag of the sixth electronic device on the preview picture based on the recognized sixth electronic device and the relative position, where the tag identifies the sixth electronic device; receiving a fifth operation on the thumbnail of the first photo; in response to the fifth operation, moving the thumbnail of the first photo to the display area of the sixth electronic device identified by the tag on the preview picture; and sending the first photo to the sixth electronic device.
With reference to the fourth aspect, in some implementations of the fourth aspect, the first electronic device includes a first positioning chip and the sixth electronic device includes a second positioning chip, and the one or more instructions, when executed by the processor, cause the first electronic device to perform the following steps: identifying the sixth electronic device included in the preview picture and determining the relative position of the sixth electronic device and the first electronic device based on the first positioning chip, the second positioning chip and the preview picture, where each of the first and second positioning chips includes at least one of a Bluetooth positioning chip and an ultra-wideband (UWB) positioning chip.
With reference to the fourth aspect and the foregoing implementations, in some implementations of the fourth aspect, the shooting preview interface includes a local album icon, and the one or more instructions, when executed by the processor, cause the first electronic device to perform the following steps: receiving a sixth operation on the album icon; and in response to the sixth operation, displaying the thumbnail of the first photo floating on the shooting preview interface.
With reference to the fourth aspect and the foregoing implementation manners, in some implementation manners of the fourth aspect, the fifth operation is an operation of dragging a thumbnail of the first photo, and the sixth operation is an operation of long-pressing the local album icon.
With reference to the fourth aspect and the foregoing implementations, in some implementations of the fourth aspect, the shooting preview interface further includes a shooting shutter key, and the one or more instructions, when executed by the processor, cause the first electronic device to perform the following steps: receiving a seventh operation on the shooting shutter key; and in response to the seventh operation, taking the first photo.
With reference to the fourth aspect and the foregoing implementations, in some implementations of the fourth aspect, the one or more instructions, when executed by the processor, cause the first electronic device to perform the following steps: receiving an eighth operation; in response to the eighth operation, displaying a photo list on the shooting preview interface, where the photo list includes the first photo and a plurality of second photos taken earlier than the first photo; receiving a ninth operation; in response to the ninth operation, selecting at least one second photo from the photo list; and after moving the thumbnail of the first photo to the display area of the sixth electronic device identified by the tag on the preview picture, sending the first photo and the selected at least one second photo to the sixth electronic device.
With reference to the fourth aspect and the foregoing implementations, in some implementations of the fourth aspect, the eighth operation is a sliding operation in a preset direction starting from the local album icon, and the ninth operation is a tap operation.
With reference to the fourth aspect and the foregoing implementation manners, in some implementation manners of the fourth aspect, the label of the sixth electronic device is used to identify a name of the sixth electronic device, and/or the label of the sixth electronic device is used to identify a location where the sixth electronic device is located.
With reference to the fourth aspect and the foregoing implementation manners, in some implementation manners of the fourth aspect, when the thumbnail of the first photo is moved to the display area of the sixth electronic device identified by the label, a display effect of the label of the sixth electronic device changes, where the display effect includes one or more of a color, a size, and an animation effect of the label of the sixth electronic device.
With reference to the fourth aspect and the foregoing implementations, in some implementations of the fourth aspect, when the sixth electronic device is occluded in the preview picture, the one or more instructions, when executed by the processor, further cause the first electronic device to perform the following step: displaying prompt information on the shooting preview interface, wherein the prompt information is used for prompting a user of the position of the sixth electronic device, or the prompt information is used for prompting the user to adjust the position of the first electronic device, so that the sixth electronic device is displayed in the preview picture of the first electronic device.
In a fifth aspect, an embodiment of the present application provides a computer storage medium, which includes computer instructions that, when executed on an electronic device, cause the electronic device to perform the method in any one of the possible implementation manners of the foregoing aspect.
In a sixth aspect, the present application provides a computer program product, which when run on a computer, causes the computer to execute the method in any one of the possible implementations of any one of the above aspects.
Drawings
Fig. 1 is a schematic diagram of a system architecture according to an embodiment of the present application;
Fig. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 3 is a schematic structural diagram of another electronic device according to an embodiment of the present application;
Fig. 4 is a schematic view of a scenario of a device identification method according to an embodiment of the present application;
Figs. 5A-5H are schematic diagrams of a set of interfaces according to an embodiment of the present application;
Figs. 6A-6B are schematic diagrams of yet another set of interfaces according to an embodiment of the present application;
Figs. 7A-7C are schematic diagrams of another set of interfaces according to an embodiment of the present application;
Figs. 8A-8E are schematic diagrams of yet another set of interfaces according to an embodiment of the present application;
Figs. 9A-9E are schematic diagrams of yet another set of interfaces according to an embodiment of the present application;
Figs. 10A-10C are schematic diagrams of yet another set of interfaces according to an embodiment of the present application;
Figs. 11A-11D are schematic diagrams of another set of interfaces according to an embodiment of the present application;
Figs. 12A-12F are schematic diagrams of yet another set of interfaces according to an embodiment of the present application;
Fig. 13 is a graphical user interface diagram of an example process for sharing photos;
Fig. 14 is a graphical user interface diagram of a process for sharing photos according to an embodiment of the present application;
Fig. 15 is a graphical user interface diagram of another process for sharing photos according to an embodiment of the present application;
Fig. 16 is a graphical user interface diagram of a process for receiving photos according to an embodiment of the present application;
Fig. 17 is a graphical user interface diagram of yet another process for sharing photos according to an embodiment of the present application;
Fig. 18 is a graphical user interface diagram of another process for receiving photos according to an embodiment of the present application;
Fig. 19 is a schematic flowchart of a method for sharing photos according to an embodiment of the present application;
Fig. 20 is a schematic flowchart of a positioning method according to an embodiment of the present application;
Fig. 21 is a schematic diagram of a positioning method according to an embodiment of the present application;
Fig. 22 is a schematic flowchart of a device identification method according to an embodiment of the present application;
Fig. 23 is a schematic diagram of a software architecture according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings. In the description of the embodiments of the present application, unless otherwise specified, "/" means "or"; for example, A/B may mean A or B. "And/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may mean: A alone, both A and B, or B alone. In addition, in the description of the embodiments of the present application, "a plurality of" means two or more.
In the following, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of embodiments of the application, unless stated otherwise, "a plurality of" means two or more.
The embodiment of the application provides an augmented reality (AR) based device identification method. AR is a technology that, by means of computer graphics and visualization technologies, superimposes virtual information onto the real environment so that both coexist in the same picture or space, and it integrates three-dimensional display technology, interaction technology, sensor technology, computer vision technology, multimedia technology, and the like.
In the method, the electronic device 100 enters a first interface, starts a camera, and displays an image acquired by the camera in the first interface in real time. Meanwhile, the electronic device 100 sends a probe request using a wireless positioning technology, and determines, from the received probe responses to the probe request, the nearby devices of the electronic device 100 as well as their device names, device types, and physical distances and angles from the electronic device 100. The electronic device 100 performs image recognition on the image captured by the camera, and identifies the electronic devices in the image and their device types (e.g., a speaker, a computer, a tablet computer, etc.). The electronic device 100 then determines, according to the physical distance and angle between a nearby device and the electronic device 100, the display area of that device's image in the first interface. The electronic device displays a device icon on the first interface in real time in an augmented reality manner, where the device icon may be used for the electronic device 100 to interact with the nearby device; for example, when the electronic device 100 detects a user operation directed at the device icon, the electronic device 100 outputs, in response, a control interface of the nearby device corresponding to the device icon. The method realizes interaction between the electronic device and nearby devices, and presents the correspondence between device icons and devices in real time through augmented reality display, thereby improving user experience.
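For illustration only, the following minimal Python sketch shows one way the flow described above could be organized: probe responses obtained through wireless positioning supply device names, types, distances, and angles, while image recognition supplies typed bounding boxes, and the two sets are matched to decide where device icons are overlaid. Every name and data shape here is an assumption made for this sketch; the patent does not disclose such an API.

# Hypothetical sketch of the device-identification flow; all helper names
# and data shapes are assumptions for illustration, not APIs from the patent.
from dataclasses import dataclass

@dataclass
class ProbeResponse:      # parsed from a wireless-positioning probe response
    name: str             # device name, e.g. "speaker in the living room"
    device_type: str      # device type, e.g. "speaker"
    distance_m: float     # physical distance from electronic device 100
    angle_deg: float      # angle from the 0-degree reference direction

@dataclass
class Detection:          # produced by image recognition on the camera image
    device_type: str
    bbox: tuple           # (x, y, w, h) in preview-frame pixels

def label_preview(responses, detections):
    """Pair each wirelessly discovered device with a recognized object of the
    same type, yielding (device name, bounding box) pairs for icon overlay."""
    pool = list(detections)
    labels = []
    for r in sorted(responses, key=lambda resp: resp.distance_m):
        for d in pool:
            if d.device_type == r.device_type:
                labels.append((r.name, d.bbox))
                pool.remove(d)   # each detected object is labeled at most once
                break
    return labels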
In this application, a device icon may also be referred to as a device tag.
A communication system provided in an embodiment of the present application is described below.
Referring to fig. 1, fig. 1 schematically illustrates a communication system 10 provided in an embodiment of the present application. As shown in fig. 1, the communication system 10 includes an electronic device 100, an electronic device 201, an electronic device 202, an electronic device 203, an electronic device 204, and the like. The electronic device 100 may assist a user in selecting and controlling various electronic devices (e.g., speakers, televisions, refrigerators, air conditioners, etc.). In this application, the electronic device 100 may also be referred to as a first electronic device, and the electronic device 201 (or the electronic device 202, the electronic device 203, the electronic device 204, etc.) may also be referred to as a second electronic device; wherein,
an electronic device (e.g., the electronic device 100, the electronic device 201, the electronic device 202, the electronic device 203, or the electronic device 204) has an Ultra Wide Band (UWB) communication module and may also have one or more of a bluetooth communication module, a WLAN communication module, and a GPS communication module. Taking the electronic device 100 as an example, the electronic device 100 may detect and scan electronic devices (e.g., the electronic device 201, the electronic device 202, the electronic device 203, or the electronic device 204) near the electronic device 100 by transmitting signals through one or more of a UWB communication module, a bluetooth communication module, a WLAN communication module, and a Global Positioning System (GPS) communication module, so that the electronic device 100 may discover the nearby electronic devices through one or more near-field wireless communication protocols of UWB, bluetooth, WLAN, and GPS, establish wireless communication connections with the nearby electronic devices, and transmit data to the nearby electronic devices.
The present application does not limit the type of the electronic device (e.g., the electronic device 100, the electronic device 201, the electronic device 202, the electronic device 203, or the electronic device 204). In some embodiments, the electronic device in the embodiments of the present application may be a mobile phone, a wearable device (e.g., a smart band), a tablet computer, a laptop computer (laptop), a handheld computer, a computer, an ultra-mobile personal computer (UMPC), a cellular phone, a personal digital assistant (PDA), an augmented reality (AR)/virtual reality (VR) device, or another portable device. The device may also be a speaker, a television, a refrigerator, an air conditioner, a vehicle-mounted device, a printer, a projector, or the like. Exemplary embodiments of the electronic device include, but are not limited to, devices running
Figure BDA0002750755210000121
or other operating systems.
In one possible implementation, electronic device 100, electronic device 201, electronic device 202, electronic device 203, and electronic device 204 may communicate directly with each other. In one possible implementation, the electronic device 100, the electronic device 201, the electronic device 202, the electronic device 203, and the electronic device 204 may be connected to a local area network (LAN) by way of a wired connection or a wireless fidelity (WiFi) connection. For example, the electronic device 100, the electronic device 201, the electronic device 202, the electronic device 203, and the electronic device 204 are all connected to the same electronic device 301, and the electronic device 100, the electronic device 201, the electronic device 202, the electronic device 203, and the electronic device 204 may communicate indirectly through the electronic device 301. The electronic device 301 may be one of the electronic device 100, the electronic device 201, the electronic device 202, the electronic device 203, and the electronic device 204, or may be an additional third-party device, such as a router, a cloud server, or a gateway. The cloud server may be a hardware server or may be embedded in a virtualization environment; for example, the cloud server may be a virtual machine executing on a hardware server that may include one or more other virtual machines. The electronic device 301 may transmit data to the electronic device 100, the electronic device 201, the electronic device 202, the electronic device 203, and the electronic device 204 via a network, and may receive data transmitted from them.
The electronic device 301 may include a memory, a processor, and a transceiver. The memory may be used to store programs related to UWB positioning; the memory may also be used to store the orientation parameters of an electronic device (e.g., electronic device 201) acquired via UWB positioning technology; the memory may also be used to store messages exchanged via the electronic device 301 and data and/or configurations related to the electronic device 100 and nearby devices. The processor may be configured to determine, when the location parameters of multiple nearby devices in the local area network are obtained, a target device to respond according to those location parameters. The transceiver may be used to communicate with the electronic devices connected to the local area network. It should be noted that, in the embodiment of the present application, the multiple nearby devices may or may not be connected to the same local area network, which is not specifically limited herein.
It is to be understood that the configuration shown in the present embodiment does not constitute a specific limitation to the communication system 10. In other embodiments of the present application, communication system 10 may include more or fewer devices than those shown.
Next, the electronic apparatus 100 related to the embodiment of the present application is described.
Referring to fig. 2, fig. 2 shows a schematic structural diagram of an exemplary electronic device 100 provided in an embodiment of the present application.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K via an I2C interface, such that the processor 110 and the touch sensor 180K communicate via an I2C bus interface to implement the touch functionality of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. The interface may also be used to connect an earphone and play audio through the earphone. The interface may further be used to connect other electronic devices, such as AR devices.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including UWB, Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (WiFi) network), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
It should be understood that, in this embodiment of the present application, if photos are to be shared between two electronic devices, the transmission may be performed in any one of the communication manners listed above, for example through Bluetooth, a wireless fidelity (WiFi) module, or other possible manners, which is not limited in this embodiment of the present application.
Among them, UWB wireless communication is a wireless personal area network communication technology with low power consumption and high-speed transmission. Unlike the continuous-carrier approach used by common communication technologies, UWB employs pulse signals to transmit data: it uses non-sine-wave narrow pulse signals on the order of nanoseconds (ns) to picoseconds (ps), and its time modulation technology can greatly improve the transmission rate. Because very short pulses are used, UWB devices transmit very little power at high speed, only a few percent of that of current continuous-carrier systems, and thus consume relatively little power.
Compared with the traditional narrow-band system, the UWB system has the advantages of strong penetrating power, low power consumption, good multipath resistance effect, high safety, low system complexity, capability of providing accurate positioning precision and the like. UWB may be applied to wireless communication applications requiring high quality services, and may be used in the fields of Wireless Personal Area Networks (WPANs), home network connections, short-range radars, and the like. UWB will become a technological means to solve the contradiction between the demand for high-speed internet access in enterprises, homes, public places, etc. and the increasingly crowded allocation of frequency resources.
In the embodiment of the present application, the electronic device 100 may implement measurement of a distance and a Received Signal Strength (RSSI) through one UWB antenna. The electronic device 100 may enable Angle of arrival (AOA) measurements through at least two UWB antennas.
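As a rough illustration of how such measurements can be made, the sketch below applies the textbook single-sided two-way-ranging formula and the phase-difference angle-of-arrival formula. The patent does not disclose its exact algorithms, so both functions and all numeric values are assumptions.

# Textbook UWB ranging and AOA formulas, shown for illustration only.
import math

C = 299_792_458.0  # speed of light, m/s

def two_way_range(t_round_s: float, t_reply_s: float) -> float:
    """Distance from one round trip: the signal travels the path twice,
    minus the responder's known reply delay."""
    return C * (t_round_s - t_reply_s) / 2

def angle_of_arrival(phase_diff_rad: float, wavelength_m: float,
                     antenna_spacing_m: float) -> float:
    """AOA in degrees from the phase difference between two UWB antennas."""
    sin_theta = phase_diff_rad * wavelength_m / (2 * math.pi * antenna_spacing_m)
    return math.degrees(math.asin(max(-1.0, min(1.0, sin_theta))))

# A target about 3 m away (20 ns of extra round-trip time):
print(round(two_way_range(20e-9 + 2e-6, 2e-6), 2), "m")       # 3.0 m
# Half-wavelength antenna spacing, 90-degree phase difference:
print(round(angle_of_arrival(math.pi / 2, 0.046, 0.023), 1))  # 30.0 degrees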
In some embodiments, the UWB communication module of the electronic device 100 may be in a powered-on state while the electronic device is in a standby state.
In some embodiments, electronic device 100 may implement range and AOA measurements via bluetooth.
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
In some embodiments of the present application, the interface content currently output by the system is displayed in the display screen 194. The electronic device 100 displays images, application interfaces, keys, icons, windows, and the like on the display screen of the electronic device 100 through mutual cooperation among the modules of the GPU, the display screen 194, the application processor, and the like, thereby implementing a display function of the electronic device. For example, the interface content is an interface provided by an instant messaging application.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data (such as audio data and a phone book) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a voice signal to the microphone 170C by speaking the user's mouth near the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further include three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The headphone interface 170D is used to connect a wired headphone. The headset interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. In some alternative embodiments of the present application, the pressure sensor 180A may be configured to capture the pressure value generated when a user's finger portion contacts the display screen and transmit the pressure value to the processor, so that the processor can identify the finger portion with which the user entered the user operation.
The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic device 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations acting on different positions may correspond to different operation instructions. In some alternative embodiments, the pressure sensor 180A may also calculate the number of touch points from the detected signals and transmit the calculated values to the processor, so that the processor can recognize whether the user entered the operation through single-finger or multi-finger input.
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (the X, Y, and Z axes of the electronic device) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for anti-shake during photographing. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyro sensor 180B may also be used for navigation and somatosensory gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
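As a concrete example, altitude can be estimated from the measured pressure with the international barometric formula; this is one common approximation, not a formula disclosed in the patent.

# International barometric formula, one common approximation for altitude.
def altitude_m(pressure_pa: float, sea_level_pa: float = 101_325.0) -> float:
    return 44_330.0 * (1.0 - (pressure_pa / sea_level_pa) ** (1.0 / 5.255))

print(round(altitude_m(95_000.0), 1))  # about 540 m for 950 hPa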
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking upon flipping open can then be set according to the detected open or closed state of the holster or of the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically along three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. The sensor can also be used for recognizing the posture of the electronic device, and is applied to horizontal/vertical screen switching, pedometers, and other applications. In some alternative embodiments of the present application, the acceleration sensor 180E may be configured to capture the acceleration value generated when a user's finger portion touches the display screen (or when the user's finger strikes the rear bezel of the electronic device 100), and transmit the acceleration value to the processor, so that the processor can identify the finger portion with which the user entered the user operation.
In this embodiment, the electronic device 100 may determine its posture change through a gyroscope sensor and/or an acceleration sensor, so as to recognize a user operation. For example, a current user operation is recognized as a lifting operation according to the posture change of the electronic device 100. In the lifting operation, the electronic device 100 may initially be placed flat in the horizontal direction (the display screen 194 of the electronic device is parallel to the horizontal plane, and the lifting angle, i.e., the included angle with the horizontal direction, is 0 degrees). Within a preset time, the user lifts the electronic device 100 until it is perpendicular to the horizontal direction (the display screen 194 of the electronic device is perpendicular to the horizontal plane, and the lifting angle is 90 degrees), so the change in lifting angle within the preset time is 90 degrees (90 degrees minus 0 degrees). When the electronic device 100 detects that the change in lifting angle within the preset time exceeds a preset angle, the electronic device 100 may consider the current user operation to be a lifting operation. The preset angle may be, for example, 30 degrees.
In some embodiments, when the electronic device 100 detects that the change in lifting angle within the preset time exceeds the preset angle and the lifting angle at a certain moment within the preset time falls within a preset angle range, the electronic device 100 regards the current user operation as a lifting operation. The preset angle range may be 60 to 90 degrees.
For another example, the electronic device 100 may determine a change in the posture of the electronic device 100 through a gyro sensor and/or an acceleration sensor, thereby recognizing the stationary state. The static state may be that the angle change detected by the gyro sensor of the electronic device 100 within a preset time is within a preset range, and the speed change detected by the acceleration sensor within the preset time is less than a threshold value.
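Combining the rules in the preceding paragraphs, a minimal sketch of the lifting-operation check might look as follows. It assumes the tilt angle (in degrees from the horizontal) has already been derived from the gyroscope and/or acceleration sensor readings, and it uses the example thresholds above (a 30-degree preset angle and a 60-to-90-degree preset range).

# Sketch of the lifting-operation rule; tilt angles are assumed to be
# pre-computed from gyroscope/accelerometer data within the preset time.
def is_lift_operation(angles_deg: list,
                      preset_change_deg: float = 30.0,
                      preset_range: tuple = (60.0, 90.0)) -> bool:
    if not angles_deg:
        return False
    change = max(angles_deg) - angles_deg[0]           # change in lifting angle
    in_range = any(preset_range[0] <= a <= preset_range[1] for a in angles_deg)
    return change > preset_change_deg and in_range

# Phone raised from flat (0 degrees) to nearly upright (88 degrees):
print(is_lift_operation([0.0, 20.0, 55.0, 88.0]))  # True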
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, taking a picture of a scene, electronic device 100 may utilize range sensor 180F to range for fast focus.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 100 emits infrared light to the outside through the light-emitting diode and detects infrared light reflected from nearby objects using the photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 may determine that there is no object near it. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear for a call, so as to automatically turn off the display screen and save power. The proximity light sensor 180G may also be used in a holster mode or a pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold, to avoid an abnormal shutdown of the electronic device 100 caused by low temperature. In still other embodiments, when the temperature is lower than a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or thereabout, which is an operation of a user's hand, elbow, stylus, or the like contacting the display screen 194. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The electronic apparatus 100 may receive a key input, and generate a key signal input related to user setting and function control of the electronic apparatus 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming-call vibration cues as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also produce different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenarios (such as time reminders, receiving information, alarm clocks, and games) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the electronic apparatus 100 by being inserted into the SIM card interface 195 or being pulled out of the SIM card interface 195.
Next, a structure of another electronic device provided in the embodiment of the present application is described by taking the electronic device 202 as an example.
Fig. 3 schematically shows a structural diagram of an electronic device 202 provided in an embodiment of the present application. The electronic device 201, the electronic device 203, and the electronic device 204 may all refer to the schematic structural diagram shown in fig. 3.
As shown in fig. 3, the electronic device 202 may include: the device comprises a processor 401, a memory 402, a wireless communication processing module 403, an antenna 404, a power switch 405, a wired LAN communication processing module 406, a USB communication processing module 407, an audio module 408 and a display screen 409. Wherein:
Processor 401 may be used to read and execute computer-readable instructions. In a specific implementation, the processor 401 may mainly include a controller, an arithmetic unit, and registers. The controller is mainly responsible for instruction decoding and for sending out control signals for the operations corresponding to the instructions. The arithmetic unit is mainly responsible for saving register operands, intermediate operation results, and the like that are temporarily stored during instruction execution. In a specific implementation, the hardware architecture of the processor 401 may be an application-specific integrated circuit (ASIC) architecture, a MIPS architecture, an ARM architecture, an NP architecture, or the like.
In some embodiments, the processor 401 may be configured to parse signals received by the wireless communication module 403 and/or the wired LAN communication processing module 406, such as probe requests broadcast by the electronic device 100. The processor 401 may be used to perform corresponding processing operations according to the parsing result, such as generating a probe response.
In some embodiments, the processor 401 may also be configured to generate a signal, such as a bluetooth broadcast signal, that is sent out by the wireless communication module 403 and/or the wired LAN communication processing module 406.
A memory 402 is coupled to the processor 401 for storing various software programs and/or sets of instructions. In particular implementations, memory 402 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices. The memory 402 may store an operating system, such as an embedded operating system like uCOS, VxWorks, RTLinux, etc. The memory 402 may also store communication programs that may be used to communicate with the electronic device 100, one or more servers, or accessory devices.
The wireless communication module 403 may include one or more of a UWB communication module 403A, a Bluetooth communication module 403B, a WLAN communication module 404C, and a GPS communication module 404D. The UWB communication module 403A may be integrated into a system on chip (SoC), and the UWB communication module 403A may also be integrated with another communication module (e.g., the Bluetooth communication module 403B) in hardware (or software).
In some embodiments, one or more of the UWB communication module 403A, the Bluetooth communication module 403B, the WLAN communication module 404C, and the GPS communication module 404D may listen to signals transmitted by other devices (such as the electronic device 100), for example measurement signals or scanning signals, and may transmit response signals, such as measurement responses or scanning responses, so that the other devices (such as the electronic device 100) can discover the electronic device 202, establish a wireless communication connection with it via one or more of UWB, Bluetooth, WLAN, or infrared, and transmit data.
In other embodiments, one or more of the UWB communication module 403A, the Bluetooth communication module 403B, the WLAN communication module 404C, and the GPS communication module 404D may also transmit signals, such as broadcast UWB measurement signals, so that other devices (e.g., the electronic device 100) can discover the electronic device 202 and establish a wireless communication connection with it via one or more of UWB, Bluetooth, WLAN, or infrared for data transfer.
The wireless communication module 403 may also include a cellular mobile communication module (not shown). The cellular mobile communication processing module may communicate with other devices, such as servers, via cellular mobile communication technology.
The antenna 404 may be used to transmit and receive electromagnetic wave signals. The antennas of different communication modules can be multiplexed or can be mutually independent, so as to improve antenna utilization. For example, the antenna of the Bluetooth communication module 403B may be multiplexed as the antenna of the WLAN communication module 404C. As another example, the UWB communication module 403A may use a separate UWB antenna.
In this embodiment, the electronic device 202 has at least one UWB antenna for implementing UWB communication.
The power switch 405 may be used to control the power supply of the power source to the electronic device 202.
The wired LAN communication processing module 406 may be used to communicate with other devices in the same LAN through a wired LAN, and may also be used to connect to a WAN through a wired LAN, communicating with devices in the WAN.
The USB communication processing module 407 may be used to communicate with other devices through a USB interface (not shown).
The audio module 408 may be configured to output audio signals via the audio output interface, which may enable the electronic device 202 to support audio playback. The audio module may also be configured to receive audio data via the audio input interface. The electronic device 202 may be a media player device such as a television.
Display 409 may be used to display images, video, etc. The display screen 409 may be a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED) display screen, an active-matrix organic light-emitting diode (AMOLED) display screen, a flexible light-emitting diode (FLED) display screen, a quantum dot light-emitting diode (QLED) display screen, or the like.
In some embodiments, the electronic device 202 may also include a serial interface such as an RS-232 interface. The serial interface may be connected to other devices, such as audio output devices like a sound box, so that the display and the audio output device can cooperatively play audio and video.
It is to be understood that the configuration illustrated in fig. 3 does not constitute a specific limitation on the electronic device 202. In other embodiments of the present application, the electronic device 202 may include more or fewer components than illustrated, combine certain components, split certain components, or have a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The present application provides an augmented reality-based device identification method. After detecting a first operation, the electronic device 100 enters a first interface, starts a camera, and displays the preview image acquired by the camera on the first interface in real time. The electronic device 100 identifies the type of the second electronic device (e.g., a sound box, a computer, a tablet computer, etc.) in the preview image of the first interface through computer vision; meanwhile, the electronic device 100 determines orientation information (e.g., longitude and latitude, or the physical distance and angle from the electronic device 100) and identity information (e.g., device name, device type, device attributes, etc.) of the second electronic device within its communication range through a wireless positioning technology (e.g., UWB positioning, Bluetooth positioning, WiFi positioning, GPS positioning, etc.).
The electronic device 100 determines the position of the second electronic device in the preview image according to the relative distance and relative angle between the second electronic device and the electronic device 100, together with the shooting angle range of the camera. For example, fig. 4 includes the electronic device 100 and nearby devices: the electronic device 201, the electronic device 202, the electronic device 203, and the electronic device 204. The second electronic device may be any one of the nearby devices. Fig. 4 schematically shows the positional relationship of the electronic device 100, the electronic device 201, the electronic device 202, the electronic device 203, and the electronic device 204 on a horizontal plane in some application scenarios of the present application.
In the embodiments of the present application, for convenience of explaining the positional relationship between the electronic device 100 and the nearby devices, a reference point on the electronic device 100 (for example, its center position point) may be used to represent its position in the horizontal plane. The direction indicated by a vector that starts at the center position point of the electronic device 100 and is perpendicular to the upper edge of its touch screen may be used as the reference direction of the electronic device 100, which may also be referred to as the 0-degree direction of the electronic device 100.
Thus, as shown in fig. 4, the electronic device 201 may be 1 m away in the 0-degree direction of the electronic device 100, the electronic device 202 may be 1.2 m away in the 330-degree clockwise direction, the electronic device 203 may be 0.5 m away in the 330-degree clockwise direction, and the electronic device 204 may be 0.8 m away in the 30-degree clockwise direction.
Generally, the horizontal shooting angle (left-right included angle) of a camera is in the range of 60 to 80 degrees, and the vertical shooting angle (up-down included angle) is about 45 degrees; both may vary with the phone brand and camera configuration. Assuming the horizontal shooting angle of the electronic device 100 is 60 degrees, the electronic device 201, the electronic device 202, the electronic device 203, and the electronic device 204 are all within the shooting range of the electronic device 100. According to the length and width of each electronic device and its physical distance from the electronic device 100, it can be determined whether the electronic device 201, the electronic device 202, the electronic device 203, and the electronic device 204 are completely or partially displayed in the shooting interface of the electronic device 100.
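The mapping from a device's measured distance and angle to a position in the preview can be illustrated with a minimal sketch. This is only an assumed pinhole-camera model, not the method claimed by the application; the function name, the preview width of 1080 pixels, and the sign convention (the 330-degree clockwise direction expressed as -30 degrees) are hypothetical choices for illustration:

```python
import math

def preview_x(angle_deg, horizontal_fov_deg, image_width_px):
    """Map a device's bearing (degrees from the 0-degree direction, positive
    clockwise) to a horizontal pixel coordinate in the preview image.
    Returns None when the device is outside the shooting range."""
    half_fov = horizontal_fov_deg / 2.0
    if abs(angle_deg) > half_fov:
        return None
    # Pinhole-camera model: the image-plane offset grows with tan(angle).
    offset = math.tan(math.radians(angle_deg)) / math.tan(math.radians(half_fov))
    return round((offset + 1) / 2 * image_width_px)

# The four devices of fig. 4 with a 60-degree horizontal shooting angle:
for name, angle in [("201", 0), ("202", -30), ("203", -30), ("204", 30)]:
    print(name, preview_x(angle, 60, 1080))  # 201 centered; 202/203 at the left edge; 204 at the right edge
```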
In the embodiments of the present application, the nearby devices of the electronic device 100 are not limited to the four electronic devices in fig. 4; there may be more or fewer nearby devices. Fig. 4 uses four electronic devices, and the relative positional relationship shown between them and the electronic device 100, only to explain the embodiments of the present application by way of example, and should not be construed as limiting.
After the electronic device 100 determines the orientation information of the second electronic device, it determines the display image and display area of the second electronic device in the preview image and displays a device icon on the first interface in real time in an augmented reality manner. By triggering the device icon, the user can bring up a control interface of the second electronic device, thereby realizing interaction between the user and the second electronic device.
In some embodiments, the display area of the device icon on the first interface corresponds to the display area of the second electronic device in the preview image.
The augmented reality-based device identification method provided in the present application is introduced below in combination with an application scenario.
In the UI embodiments shown in fig. 5A-5F, an operation process is exemplarily shown in which a first operation of a user triggers a first electronic device to enter a first interface, and the electronic device displays a device icon on the first interface in real time.
Fig. 5A illustrates an exemplary user interface 510 on the electronic device 100. The user interface 510 may include: a status bar 511, a tray 512, and one or more application icons. The status bar 511 may include: one or more signal strength indicators 513 for mobile communication signals (which may also be referred to as cellular signals), one or more signal strength indicators 514 for wireless fidelity (Wi-Fi) signals, a bluetooth indicator 515, a battery status indicator 516, and a time indicator 517. When the bluetooth module of the electronic device 100 is in an on state (i.e., the electronic device supplies power to the bluetooth module), the bluetooth indicator 515 is displayed on the display interface of the electronic device 100.
The tray 512 holds commonly used application icons, for example: a phone icon, a contacts icon, a messages icon, and a camera icon. The one or more application icons include: a gallery icon, a browser icon, an application store icon, a settings icon, a mailbox icon, a cloud sharing icon, and a memo icon.
The electronic device 100 may launch and run multiple applications simultaneously to provide different services or functions for the user. Running multiple applications simultaneously means that the electronic device 100 has started multiple applications and has not closed them: the electronic device 100 does not delete resources such as the memory occupied by these applications, and the applications occupy resources in the background at the same time. It does not require the applications to interact with the user in the foreground at the same time. For example, the electronic device 100 sequentially starts the mailbox, gallery, and instant messaging applications, and then runs all three simultaneously.
When a user uses an application, if the application is switched or jumps to the desktop for operation, the electronic device 100 does not kill the application previously used by the user, but keeps the previously used application in the multitasking queue as a background application.
When the electronic device 100 runs multiple applications simultaneously, a card corresponding to each application may be generated from the multitasking queue. The cards on the multitasking interface are arranged horizontally side by side according to a preset ordering strategy. For example, in one ordering strategy, the electronic device 100 arranges the cards corresponding to the different applications in the chronological order in which the applications were run.
Upon detecting a user operation indicating opening of the multitasking interface 520, the electronic device 100 displays the multitasking interface 520. The multitasking interface 520 comprises cards corresponding to a plurality of application programs which are running by the electronic device 100. There may be various user operations for instructing the opening of the multi-tasking interface.
Illustratively, when the electronic device 100 detects a slide-up operation with respect to the bottom of the electronic device 100, in response to the operation, the electronic device 100 displays a multitasking interface 520 as shown in fig. 5B.
Multitasking interface 520 may include: card 521, card 522, and delete icon 523. In this case, the card 521 is displayed in its entirety and the card 522 is displayed in part.
The delete icon 523 may be used to close the application corresponding to the card currently displayed in the multitasking interface 520. Here, closing refers to deleting resources such as the memory occupied by the application. In some embodiments, the delete icon 523 may be used to close the applications corresponding to all cards in the current multitasking interface 520.
It should be noted that the drawings are only schematic illustrations: the multitasking interface 520 shown in the drawings refers to the interface displayed on the touch screen within the frame of the electronic device 100. The portion of a card inside the frame of the electronic device 100 can be displayed by the touch screen, while the portion of a card outside the frame cannot.
In the multitasking interface 520, the user can switch the displayed cards by sliding left and right. For example, when the electronic device 100 detects a rightward sliding operation on the multitasking interface 520, the cards on the multitasking interface 520 sequentially move rightward in response, and the electronic device 100 can completely display the card 522 and partially display the card 521. When the electronic device 100 detects a leftward sliding operation, the cards sequentially move leftward in response. Since the card 521 is the first card from the right in the multitasking interface 520 and there is no other card on its right side, when the electronic device 100 detects a further leftward sliding operation after completely displaying the card 521, it partially displays a preset area 524 in response, as shown in fig. 5C. If the sliding continues leftward, the electronic device 100 completely displays the preset area 524, as shown in fig. 5D. In some embodiments, at this point the electronic device 100 triggers display of a viewfinder interface corresponding to the preset area 524. The viewfinder interface may show a picture captured by a rear camera of the electronic device 100, or a picture captured by a front camera.
As shown in fig. 5E, fig. 5E illustrates a viewing interface 530. The image collected by the camera is displayed in the viewing interface 530 in real time. Optionally, the electronic device 100 may further send a probe request using a wireless positioning technology; the electronic device 100 determines its nearby devices according to the received probe responses to the probe request, and further determines one or more of the device name, device type, and physical distance or angle from the electronic device 100 of each nearby device. The electronic device 100 performs image recognition on the image captured by the camera and recognizes the electronic devices (e.g., a sound box, a computer, a tablet computer, etc.) in the image. The display content of the viewing interface 530 in fig. 5E is a picture captured by the camera, and includes a device image 531, a device image 532, a device image 533, and a device image 534.
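The discovery step described above can be sketched as follows. The `radio` object and its `broadcast_probe_request`/`collect_probe_responses` methods are hypothetical stand-ins for the UWB/Bluetooth/WLAN positioning module, and the response fields are assumptions based on the information the text says a probe response carries:

```python
from dataclasses import dataclass

@dataclass
class NearbyDevice:
    name: str          # identity information, e.g. "HUAWEI soundX"
    dev_type: str      # e.g. "sound box", "tablet computer", "computer"
    distance_m: float  # physical distance from the electronic device 100
    angle_deg: float   # angle from the 0-degree direction, clockwise positive

def discover_nearby_devices(radio, timeout_s=1.0):
    """Broadcast one probe request and turn each probe response into a
    NearbyDevice record."""
    radio.broadcast_probe_request()
    return [
        NearbyDevice(
            name=resp["device_name"],
            dev_type=resp["device_type"],
            distance_m=resp["distance_m"],  # e.g. from UWB time of flight
            angle_deg=resp["angle_deg"],    # e.g. from UWB angle of arrival
        )
        for resp in radio.collect_probe_responses(timeout_s)
    ]
```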
In conjunction with fig. 4, the device image 531 is the image of the electronic device 202 captured by the electronic device 100 and displayed in the viewfinder interface 530; the device image 532 is the image of the electronic device 201 captured by the electronic device 100 and displayed in the viewfinder interface 530; the device image 533 is the image of the electronic device 203 captured by the electronic device 100 and displayed in the viewfinder interface 530; the device image 534 is the image of the electronic device 204 captured by the electronic device 100 and displayed in the viewfinder interface 530.
The electronic device 100 determines a display area of a corresponding device image of each device in the view interface 530 according to the physical distances and angles from the electronic device 201, the electronic device 202, the electronic device 203, and the electronic device 204 to the electronic device 100. The electronic device 100 displays a device icon in the viewing interface 530 in real time in an augmented reality manner, and the device icon indicates an electronic device corresponding to the device image in the viewing interface 530. Optionally, the display area of the device icon corresponds to the device image in the viewing interface 530.
In some alternative manners, the device icon may be displayed at a fixed position of the viewing interface 530, or may be displayed corresponding to the device image, for example, at the periphery of the corresponding device image, or at the center of the corresponding device image; the device icon may or may not completely overlap with the display area of the corresponding device image (for example, be displayed in an area immediately above the display area of the corresponding device image).
As shown in fig. 5F, the display area of the device icon 5311 completely overlaps the device image 531, and the device icon 5311 indicates that the device name corresponding to the device image 531 is matepad (tablet computer); the display area of the device icon 5321 partially overlaps the device image 532, and the device icon 5321 indicates that the device name corresponding to the device image 532 is HUAWEI soundX; the display area of the device icon 5331 completely overlaps the device image 533, and the device icon 5331 indicates that the device name corresponding to the device image 533 is matebook; the display area of the device icon 5341 partially overlaps the device image 534, and the device icon 5341 indicates that the device name corresponding to the device image 534 is matebook (computer).
In this application, a device icon may also be referred to as a device tag. When the device 201 is referred to as a second electronic device, the device icon 5321 may also be referred to as a first tag.
In some embodiments, the display area of a device icon corresponds to the location of the device's positioning chip (e.g., UWB chip, Bluetooth chip) in the device image. The electronic device 100 receives the probe response from the positioning chip of the electronic device 201 and determines the orientation (physical distance and angle from the electronic device 100) of that positioning chip. According to this orientation, the electronic device 100 determines the corresponding position of the positioning chip of the electronic device 201 in the viewing interface 530 and displays the device icon of the electronic device 201 at that position. For example, the device icon 5311 is shown at the position of the positioning chip inside the electronic device corresponding to the device image 531; the same applies to the device icon 5321 and the device icon 5331.
In some embodiments, the location of a device's positioning chip is not in the viewing interface 530; for example, the positioning chip of the electronic device 204 corresponding to the device image 534 is not in the viewing interface 530. The electronic device 100 may derive the physical distance and orientation of the appearance key points (e.g., the four corners of the screen) of the electronic device 204 relative to the electronic device 100 based on the location of the electronic device 204 and the device size; when the electronic device 100 captures one or more appearance key points, it displays the device icon in the viewfinder interface 530.
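A rough sketch of this appearance-key-point test, assuming the positioning layer reports the device's center bearing and distance and that the device's physical width is known from its size information; the geometry and all names are illustrative, not the application's stated algorithm:

```python
import math

def visible_corner_bearings(center_angle_deg, distance_m, half_width_m,
                            half_fov_deg=30.0):
    """Approximate the bearings of a device's left/right appearance key points
    (e.g. screen corners) from its center bearing, distance, and half width,
    keeping those that fall inside the camera's horizontal field of view."""
    spread = math.degrees(math.atan2(half_width_m, distance_m))
    corners = (center_angle_deg - spread, center_angle_deg + spread)
    return [a for a in corners if abs(a) <= half_fov_deg]

# A 0.6 m-wide device whose positioning chip sits at 35 degrees, 0.8 m away:
# the chip is outside a 60-degree field of view, but one corner is still
# captured, so a device icon can still be shown in the viewfinder interface.
print(visible_corner_bearings(35.0, 0.8, 0.3))  # -> [14.4...] (left corner only)
```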
In some application scenarios, a device icon may not only indicate identity information of the device corresponding to the device image, but may also be associated with a control card of that device. When the electronic device 100 detects a user operation directed to a device icon, the electronic device 100 outputs the control card of the corresponding device. As shown in fig. 5G, when the electronic device 100 detects a user operation directed to the device icon 5321, whose associated electronic device is the HUAWEI soundX, the electronic device 100 outputs a control card 540 of the HUAWEI soundX, as shown in fig. 5H. The control card 540 may include one or more of the following: an application title bar 601, a connection card 602, a music card 603, a projection card 604, a refresh control 605, and a close control 606. Wherein:
the application title bar 601 indicates that the device controlled by the card 540 is the HUAWEI soundX.
The connection card 602 may include indication information 602A and a connection mode 602B. The indication information 602A is used to indicate whether the device corresponding to the device image 532 (the electronic device 201) is in an online state or an offline state; online means the electronic device 201 is currently connected to the internet, and offline means it is not. The connection mode 602B indicates the current connection mode between the electronic device 201 and the electronic device 100: when the current connection is Bluetooth, the connection mode 602B may be displayed as a Bluetooth icon; when the current connection is WiFi, the connection mode 602B may be displayed as a WiFi icon.
The music card 603 may include a music title 603A, a pause control 603B, a previous control 603C, a next control 603D, a progress bar 603E, a volume 603F, and more controls 603H.
The pause control 603B may receive an input operation (e.g., a one-click operation) by the user, and in response to the detected user operation, the electronic device 201 pauses the playing of music.
The previous control 603C may receive an input operation (e.g., a single-click operation) by the user, and in response to the detected user operation, the electronic device 201 may play a song that is previous to the currently playing song in the music list.
The next control 603D may receive an input operation (e.g., a single-click operation) by the user, and in response to the detected user operation, the electronic device 201 may play a song next to the currently playing song in the music list.
The progress bar 603E may indicate the total duration (e.g., 04:42) and the played duration (e.g., 00:42) of the current song.
The volume 603F may receive an input operation (e.g., a slide operation) by the user, and in response to the detected user operation, the electronic device 201 adjusts its playback volume.
The more control 603H may receive an input operation (e.g., a sliding operation) by the user, and in response to the detected user operation, the electronic device 100 may display more function options of the music card, such as sharing, deleting, downloading, and the like.
The projection card 604 is used to instruct the electronic device 100 to output audio to the electronic device 201. When the electronic device 100 detects a user operation directed to the projection card 604, in response to the operation, the audio of the electronic device 100 is output to the electronic device 201.
The refresh control 605 is used for refreshing the display interface of the current control card 540, and the electronic device 100 retrieves the current state of the device 201.
The closing control 606 is used to close the control card 540, and when the electronic device 100 detects a user operation directed to the control 606, the card 540 is closed, and the electronic device 100 displays the viewing interface 530 as shown in fig. 5G.
In addition to the control card 540 of the device image 532 shown in fig. 5H, the electronic device 100 may also interact with the device corresponding to the device image 532 in other ways, which are not limited in this application. For example, when the electronic device 100 detects a user operation directed to the device icon 5321, the electronic device 100 may directly open and jump to the application software associated with the electronic device corresponding to the device image 532, and display an application interface of that application software, such as the Smart Life or Sports Health application.
In this application, the viewing interface 530 may also be referred to as a first interface. The electronic device 100 determines the orientation information of its nearby devices through computer vision and wireless positioning technologies, determines the display image and display area of each nearby device in the preview image of the viewing interface 530, and displays device icons on the shooting interface in real time in an augmented reality manner, achieving a real-time preview effect. The user may trigger a device icon, and the electronic device 100 outputs the control interface of the corresponding electronic device, thereby implementing interaction between the user and the nearby device.
In some application scenarios, no application is running in the background and no application is running in the multitasking queue in the electronic device 100, i.e. the cards 521 and 522 are not included in the multitasking interface 520. When the electronic apparatus 100 displays the user interface 510 and a slide-up operation at the bottom of the electronic apparatus 100 is detected, the electronic apparatus 100 displays a multitasking interface in response to the operation. Since there is no card in the multitasking interface, the electronic device 100 directly enters the viewfinder interface 530. The electronic device 100 starts the camera, and captures an image in real time through the camera to be displayed on the viewing interface 530.
In some application scenarios, after electronic device 100 enters viewfinder interface 530, when there is only one device in viewfinder interface 530, the user does not need to click a device icon, and electronic device 100 can directly enter the control interface of the device. Illustratively, as shown in fig. 6A, the viewing interface 530 in fig. 6A includes a device image 532 therein, a device icon 5321 is displayed in the vicinity of the device image 532, and the device icon 5321 partially overlaps with a display area of the device image 532. When the electronic device 100 detects that there is only one device in the viewfinder interface 530, as shown in fig. 6B, the electronic device 100 directly outputs the control card 540 of the device image 532. In this implementation manner, when there is only one device image in the viewing interface 530, it may be considered that the user wants to interact with the electronic device corresponding to the device image, and the electronic device 100 omits the trigger operation of the user, directly enters the control interface of the electronic device, and improves user experience.
In this application, the manner of entering the viewing interface 530 shown in fig. 5A to 5D is optional, and the electronic device 100 may also enter the viewing interface 530 in other manners. For example, fig. 7A and 7B also provide a way to access the viewing interface 530.
As shown in FIG. 7A, a user interface 510 is shown in FIG. 7A, wherein the description of the user interface 510 may refer to the related description of FIG. 5A above. When the electronic device 100 detects a user operation directed to the bottom left side of the electronic device, or when the electronic device 100 detects a user operation directed to the bottom right side of the electronic device, the electronic device 100 displays a user interface 710 as shown in fig. 7B.
One or more of the following may be included in the user interface 710: a connection device selection bar 701, a control 702A, a control 702B, a device display bar 703, and a live view control 704. Wherein,
the connection device selection bar 701 includes device options (also referred to as device icons) for one or more nearby devices, for example, a smart screen, matepad, matebook, a sound box, etc. The device options displayed in the selection bar 701 may be used to trigger a sharing operation. In response to a detected operation on a device option (e.g., a click operation on a device icon), the electronic device 100 may trigger a process of sharing the selected data or task to the device corresponding to the selected option. The process may include: the electronic device 100 establishes a communication connection with the device corresponding to the selected device option, and then transmits the selected data or task to that device through the communication connection.
The control 702A indicates a preset mode in which one or more devices can be controlled in a unified manner. For example, the preset mode is a home mode, and in the home mode, the electronic devices corresponding to the device icon 703B, the device icon 703C, and the device icon 703F are automatically turned on, and the electronic devices corresponding to the device icon 703A and the device icon 703D are automatically turned off.
The control 702B indicates another preset mode in which one or more devices can be controlled in a unified manner. For example, the preset mode is an away-from-home mode, and in the away-from-home mode, the electronic devices corresponding to the device icon 703B, the device icon 703C, and the device icon 703F are automatically turned off, and the electronic devices corresponding to the device icon 703A and the device icon 703D are automatically turned on.
The device display bar 703 includes a plurality of device icons, for example: a Huawei AI sound box 703A, a smart TV 703B, an air purifier 703C, a smart desk lamp 703D, a bluetooth headset 703E, and an air conditioner companion 703F. Any device icon displayed in the device display bar 703 may receive an input operation (e.g., a click operation) by the user, and in response to the detected input operation, the electronic device 100 displays a control interface of the corresponding device.
The air purifier 703C includes a control 7031, which is used to turn the air purifier 703C on or off. The smart desk lamp 703D and the air conditioner companion 703F also include the same control as the control 7031. Devices such as the Huawei AI sound box 703A and the smart TV 703B cannot be turned on or off through the user interface 710.
The live view control 704 is used to trigger entry into a viewing interface. When the electronic device 100 detects a user operation directed to the live view control 704, the electronic device 100 displays the viewing interface 530 as shown in fig. 5F; optionally, the electronic device 100 first displays the viewing interface 530 as shown in fig. 5E, and then displays the viewing interface 530 as shown in fig. 5F.
In some embodiments, the live view control 704 in the user interface 710 is optional, and the electronic device 100 may not display it. When the electronic device 100 displays the user interface 710 and detects a lift operation, the electronic device 100 may display the viewfinder interface 530. Fig. 7C exemplarily illustrates a lift operation: at time T1, the electronic device 100 displays the user interface 710; when the electronic device 100 detects the lift operation, at time T2 it displays the viewfinder interface 530, where the time interval between time T1 and time T2 is less than a threshold.
It is to be appreciated that the lift operation is merely an exemplary user operation and that electronic device 100 may also access viewing interface 530 via other user operations.
Not limited to the above manners of opening the viewing interface 530, the present application may also enter the viewing interface 530 by starting the camera, for example, through a camera application; or through other application programs, such as an instant messaging application or a payment application; and so on.
In this application, the display forms of the device icon 5311, the device icon 5321, the device icon 5331, and the device icon 5341 in the viewing interface 530 shown in fig. 5F are optional. Figs. 8A-8D also provide a display form of a device icon that may change as the content displayed in the viewing interface changes: at a first time, the display area of the device is at a first location in the viewing interface, and the device icon of the device is displayed within or in close proximity to the first location; at a second time, the display area of the device is at a second location in the viewing interface, and the device icon of the device is displayed within or in close proximity to the second location.
As shown in fig. 8A, the viewing interface 530 is shown in fig. 8A; for its description, reference may be made to the related description of fig. 5F above. Illustratively, a device image 534 is included in the viewing interface 530 of fig. 8A, and a device icon 5341 is displayed near the device image 534, partially overlapping its display area; a device image 531 is also included, and a device icon 5311 is displayed near the device image 531, completely overlapping its display area.
It can be seen that the display area of the device icon corresponds to the display area of its corresponding device image.
In some embodiments, the electronic device 100 does not display device icons while the display content in the viewfinder interface is changing; only after the duration of the stationary state of the electronic device 100 exceeds a preset time does the electronic device 100 display the device icons according to the display content in the viewfinder interface. Specifically, the electronic device 100 may determine its stationary state through an acceleration sensor and/or a gyro sensor.
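One possible reading of this stationary-state check, as a hedged sketch: the thresholds and the preset time are assumed values, and the sensor readings are passed in already preprocessed (deviation from gravity, angular rate):

```python
import time

ACCEL_STILL_THRESHOLD = 0.2  # m/s^2 deviation from gravity (assumed value)
GYRO_STILL_THRESHOLD = 0.05  # rad/s (assumed value)
PRESET_STILL_TIME_S = 0.5    # the "preset time" of this embodiment (assumed)

still_since = None

def should_display_icons(accel_deviation, gyro_rate, now=None):
    """Return True once both readings have stayed below their thresholds for
    the preset time, i.e. the duration of the stationary state exceeds it."""
    global still_since
    now = time.monotonic() if now is None else now
    if accel_deviation > ACCEL_STILL_THRESHOLD or gyro_rate > GYRO_STILL_THRESHOLD:
        still_since = None           # the view is moving: hide device icons
        return False
    if still_since is None:
        still_since = now            # just came to rest: start timing
    return now - still_since >= PRESET_STILL_TIME_S
```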
In some embodiments, the display area of the device icons is related to the display area of other device icons, e.g., the display areas between the device icons are not obscured from each other.
Compared with fig. 8A, the shooting direction or angle of fig. 8B is different: in the viewing interface 810 of fig. 8B, a larger portion of the device image 534 is displayed, and the device icon 5341 completely overlaps the display area of the device image 534.
In contrast to fig. 8A, in the viewfinder interface 820 of fig. 8C, the device image 531 partially overlaps the device image 533, and the device icon 5311 is displayed above the device image 531, in close proximity to the display area of the device image 531. The device icon 5311 does not overlap the display area of the device image 531.
It can be seen that the display area of a device icon may change with the display area of its device image. For example, as the display area of the device image changes in the viewing interface 530, the device icon may be displayed at the center (or an arbitrary position) of the display area of the device image, or immediately adjacent to the display area of the device image (above, below, left, or right), or the like.
In some embodiments, when the device is not within the capture range of the camera, the device image of the device is not included in the viewing interface of the electronic device 100. The device icon for the device may be displayed in a viewing interface in a particular manner.
When the electronic device 100 enters a viewing interface, it starts the camera and displays the image acquired by the camera in the viewing interface in real time; meanwhile, it sends a probe request using a wireless positioning technology. The electronic device 100 determines its nearby devices according to the received probe responses to the probe request, and further determines one or more of the device name, device type, and physical distance or angle from the electronic device 100 of each nearby device. The electronic device 100 performs image recognition on the image captured by the camera and recognizes the electronic devices (e.g., a sound box, a computer, a tablet computer, etc.) in the image.
If the electronic device 100 receives four probe responses, it detects that there are four electronic devices nearby; the probe responses carry identity information of the devices, such as device names and device types. The electronic device 100 determines the device names, device types, and so on of the four electronic devices, for example, matepad (device type: tablet computer), HUAWEI soundX (device type: sound box), matebook (device type: computer), and matebook (device type: computer); and determines the orientation information (physical distance and angle from the electronic device 100) of the four electronic devices through wireless positioning technology.
The electronic device 100 acquires images of only three electronic devices. From the orientation information of the four electronic devices, the electronic device 100 determines that one of them is not within the shooting range of its camera; alternatively, the electronic device 100 identifies the device types of the three electronic devices in the image through computer vision and, by comparison with the device types of the four electronic devices, determines which electronic device, and which device type, is not in the image. The electronic device 100 displays the device icon of the electronic device that is not in the image in a first preset manner, for example at a fixed position of the viewing interface, or at a position related to the orientation information.
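A minimal sketch of this out-of-range branch, reusing the `NearbyDevice` records from the discovery sketch above; the 30-degree half field of view matches the 60-degree example, and `edge_for` mirrors the left-edge placement of icon 801 described below (all names are hypothetical):

```python
def split_by_shooting_range(positioned_devices, half_fov_deg=30.0):
    """Partition the devices reported by wireless positioning into those whose
    bearing lies inside the camera's horizontal field of view and those
    outside it; an out-of-range device gets an edge icon such as icon 801."""
    in_range, out_of_range = [], []
    for dev in positioned_devices:
        (in_range if abs(dev.angle_deg) <= half_fov_deg else out_of_range).append(dev)
    return in_range, out_of_range

def edge_for(dev):
    """Which screen edge the special icon is shown at (cf. icon 801 on the left)."""
    return "left" if dev.angle_deg < 0 else "right"
```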
Illustratively, as shown in fig. 8A, the device image 531 is a partial display image of the device 202 in the viewing interface 530 of fig. 8A; the device icon 5311 is displayed in the viewing interface 530 in the vicinity of the device image 531 and completely overlaps the display area of the device image 531.
In contrast to fig. 8A, the shooting direction or angle of fig. 8B differs such that the device 202 is not within the shooting range of the camera, and the viewfinder interface 810 in fig. 8B does not include a device image of the device 202. The viewfinder interface 810 displays an icon 801 and a prompt 802. The prompt 802 is used to prompt the user that the icon 801 is a special icon; the icon 801 is displayed at the left edge of the viewing interface 810 of the electronic device 100 and prompts the user that the device matepad exists outside the shooting range of the camera of the electronic device 100. Optionally, the icon 801 may trigger the electronic device 100 to display a control interface of the device corresponding to the device image 531.
In some embodiments, the icon 801 or the prompt 802 may indicate the orientation (including angle, distance, etc.) of the device. For example, the icon 801 is displayed at the left edge of the viewfinder interface 810 of the electronic device, prompting the user that the device matepad is outside the shooting range of the camera of the electronic device 100 and to the left of the electronic device 100. Alternatively, the orientation of the device may be indicated textually.
In this application, when the device matepad is referred to as a third electronic device, the icon 801 or the prompt 802 may also be referred to as a third tag.
In some embodiments, when a device is occluded by other objects, the device image for the device is not included in the viewing interface of electronic device 100. The device icon for the device may be displayed in a viewing interface in a particular manner.
If the electronic device 100 receives four probe responses, it detects that there are four electronic devices nearby; the probe responses carry identity information of the devices, such as device names and device types. The electronic device 100 determines the device names, device types, and so on of the four electronic devices, for example, matepad (device type: tablet computer), HUAWEI soundX (device type: sound box), matebook (device type: computer), and matebook (device type: computer); and determines the orientation information (physical distance and angle from the electronic device 100) of the four electronic devices through wireless positioning technology.
The electronic device 100 acquires images of only three electronic devices. From the orientation information of the four electronic devices, the electronic device 100 detects that all four are within the shooting range of its camera, and therefore determines that one electronic device is occluded. The electronic device 100 identifies the device types of the three electronic devices in the image through computer vision and, in combination with the device types of the four electronic devices, determines the occluded electronic device and its device type. The electronic device 100 displays the device icon of the occluded electronic device in a second preset manner in the image, for example at a fixed position of the viewing interface, or at a position related to the orientation information.
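The occlusion branch can be sketched as a type-matching step between the positioned devices and the types recognized by computer vision; this counting approach is an assumption about how the leftover device is singled out, not the application's stated algorithm:

```python
from collections import Counter

def find_occluded(positioned_devices, recognized_types):
    """All positioned devices are inside the shooting range, yet computer
    vision recognized fewer device images; the devices whose types cannot be
    matched against the recognized types are taken to be occluded."""
    remaining = Counter(recognized_types)
    occluded = []
    for dev in positioned_devices:
        if remaining[dev.dev_type] > 0:
            remaining[dev.dev_type] -= 1   # matched to a recognized image
        else:
            occluded.append(dev)           # no image of this type left over
    return occluded

# Four positioned devices, but the image yields only three recognized types:
# find_occluded(devices, ["tablet computer", "computer", "computer"])
# would single out the sound box (HUAWEI soundX) as occluded.
```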
Illustratively, as shown in fig. 8C, in contrast to fig. 8A, the viewfinder interface 820 in fig. 8C does not include the device image 532. The viewing interface 820 displays an icon 803 and a prompt 804. The prompt 804 is used to prompt the user that the icon 803 is a special icon; the icon 803 is displayed in the middle area of the viewfinder interface 820 of the electronic device 100, prompting the user that there is a device, the HUAWEI soundX, within the shooting range of the camera of the electronic device 100. Optionally, the icon 803 may trigger the electronic device 100 to display a control interface of the device corresponding to the device image 532 (i.e., the HUAWEI soundX).
In some embodiments, the icon 803 or the prompt 804 may indicate the orientation (including angle, distance, etc.) of the device. For example, the icon 803 is displayed above the device image 533 in the viewfinder interface 820 of the electronic device 100 to prompt the user that the device HUAWEI soundX is occluded by the device 203 corresponding to the device image 533. Optionally, the orientation of the device may also be indicated textually (e.g., the HUAWEI soundX is directly behind the device corresponding to the device image 533), or text may indicate that the device is occluded.
Optionally, the display area of the icon 803 does not overlap the display areas of the other device images and device icons. The electronic device 100 determines the display area of the icon 803 from the display areas of the device image 531, the device image 533, the device image 534, the device icon 5311, the device icon 5331, and the device icon 5341 displayed in the viewing interface 820.
In this application, when the aforementioned device, the HUAWEI soundX, is referred to as a third electronic device, the icon 803 or the prompt 804 may also be referred to as a second tag.
In some embodiments, if another device lacks an identifiable wireless positioning capability, the electronic device 100 identifies the type of that device (e.g., a mobile phone, a tablet, a television, a sound box, etc.) through computer vision, and searches whether a device of the corresponding type exists among the devices logged in to the same account as the electronic device 100.
For example, if the electronic device 100 receives three probe responses, it detects that there are three electronic devices nearby, and the probe responses carry device identity information, such as device name, device type, and the like. The electronic device 100 determines that the three electronic devices are matepad (device type: tablet computer), HUAWEI soundX (device type: sound box), and matebook (device type: computer), respectively; and determines orientation information (physical distance and angle from the electronic device 100) of the three electronic devices through wireless location technology.
The electronic device 100 acquires images of four electronic devices. Through computer vision recognition, the electronic device 100 determines the display areas of the four device images in the viewing interface and determines that their device types are a tablet computer, a sound box, a computer, and a computer, respectively. The electronic device 100 then searches whether a computer exists among the devices logged in to the same account as the electronic device 100. Each electronic device has its own login account, and one account may be bound to one or more electronic devices; the electronic device 100 checks whether its own account is bound to a device of the computer type. If such a device exists, the electronic device 100 considers that this computer is associated with the unmatched device image in the image, and displays the device icon of the computer in a preset manner in the image, for example at a fixed position of the viewing interface, or at a position related to the display area of the device image in the image.
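A hedged sketch of this same-account association: `bound_devices` is a hypothetical list of the devices bound to the account of the electronic device 100, and the first type match is taken as the tentative ("uncertain") association shown with prompt 806:

```python
def associate_by_account(unmatched_image_types, bound_devices):
    """For device images with no usable probe response, look for a device of
    the same type among the devices bound to the same login account; a hit
    yields only a tentative association, shown with an 'uncertain' mark."""
    associations = {}
    for img_type in unmatched_image_types:
        hits = [d for d in bound_devices if d.dev_type == img_type]
        if hits:
            associations[img_type] = hits[0]  # tentative, not confirmed
    return associations

# e.g. associate_by_account(["computer"], account_devices) may map the extra
# computer image to a matebook bound to the account of the electronic device 100.
```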
Illustratively, as shown in FIG. 8D, in contrast to FIG. 8A, a device icon 805 and a prompt 806 are included in the viewfinder interface 830. The prompt 806 is used to indicate that the device icon 805 is an uncertain icon, indicating that there is an uncertain association between the device corresponding to the device image 533 and the device icon 805.
In this application, when the device corresponding to the device image 533 is referred to as a fourth electronic device, the device icon 805 or the prompt 806 may also be referred to as a fourth tag.
In some embodiments, if another device lacks an identifiable wireless positioning capability, the electronic device 100 identifies the type of the device (e.g., mobile phone, tablet, TV, sound box, etc.) through computer vision, and uses its own GPS information to search whether a device of the corresponding type exists in the same geographical location as the electronic device 100.
For example, if the electronic device 100 receives three probe responses, it detects that there are three electronic devices nearby, and the probe responses carry device identity information, such as device name, device type, and the like. The electronic device 100 determines that the three electronic devices are matepad (device type: tablet computer), HUAWEI soundX (device type: sound box), and matebook (device type: computer), respectively; while orientation information (physical distance and angle from the electronic device 100) for the three electronic devices is determined via wireless location technology.
The electronic device 100 acquires images of four electronic devices. Through computer vision recognition, the electronic device 100 determines the display areas of the four device images in the viewing interface and determines that their device types are a tablet computer, a sound box, a computer, and a computer, respectively. The electronic device 100 then searches for a computer among the electronic devices in the same geographical location as the electronic device 100.
The configuration interface of each electronic device may include a configuration of its geographic location. For example, when the electronic device 100 is paired with a smart desk lamp, the user configures the device location of the smart desk lamp as the room in the application software (e.g., Smart Life) associated with it; when the electronic device 100 is paired with a smart sound box, the user configures the device location of the smart sound box as the living room in the application software associated with it; when the electronic device 100 is paired with a computer, the user configures the device location of the computer as the company in the application software associated with it; and so on. The electronic device 100 determines its own area according to its geographical location; for example, the electronic device 100 obtains through GPS positioning that it is located at the company, and then searches among the electronic devices whose configured location is the company for a device of the computer type. If such a device exists, the electronic device 100 considers that this computer is associated with the device image in the image, and displays the device icon of the computer in a preset manner in the image, for example at a fixed position of the viewing interface, or at a position related to the display area of the device image in the image. For this part, reference may be made to the related description of fig. 8D above.
In this application, when the device corresponding to the device image 533 is referred to as a fifth electronic device, the device icon 805 or the prompt 806 may also be referred to as a fifth tag.
In some embodiments, if other devices lack an identifiable wireless positioning capability and the electronic device 100 cannot correctly determine the location information of two devices of the same type, the electronic device outputs two tags for the user to choose between.
For example, the electronic device 100 acquires images of two electronic devices. Through computer vision recognition, the electronic device 100 determines the display areas of the two device images in the viewing interface and determines that both device types are sound boxes. The electronic device 100 receives no probe response and cannot determine the orientation of either sound box.
The electronic device 100 may, in the manners described in the above two embodiments, search whether a device of the corresponding type exists among the devices logged in to the same account as the electronic device 100, or use its own GPS information to search whether a device of the corresponding type exists in the same geographical location as the electronic device 100. If the electronic device 100 determines in these two manners that exactly one device is a sound box, the electronic device 100 displays the device icon of that sound box in a preset manner in the image, for example at a fixed position of the viewing interface, or at a position related to the display area of the device image in the image.
If the electronic device 100 determines in these two manners that both devices are sound boxes, then because the electronic device 100 cannot match the device icons of the two sound boxes one-to-one to the two sound box images in the image, the electronic device 100 displays the two device icons in a preset manner in the image. The preset manner may be, for example, a fixed position of the viewing interface, or a control presented on the viewing interface; when the electronic device 100 detects a user operation directed to the control, the electronic device 100 outputs the two device icons for the user to select.
The present application also provides a display form of device icons in which the display areas of the device icons do not overlap one another. As shown in fig. 8E, each device icon is displayed, via a leader line, in an area above the display area of its device image: the device image 531 and the device icon 5311 are connected by a line segment, indicating that the device icon 5311 corresponds to the device image 531; the device image 532 and the device icon 5321 are connected by a line segment, indicating that the device icon 5321 corresponds to the device image 532; the device image 533 and the device icon 5331 are connected by a line segment, indicating that the device icon 5331 corresponds to the device image 533; and the device image 534 and the device icon 5341 are connected by a line segment, indicating that the device icon 5341 corresponds to the device image 534.
In some embodiments, when the electronic device 100 detects that the display areas of device icons overlap, or that the closest distance between the display areas of two device icons in the viewing interface is less than a threshold, the electronic device 100 switches to the device icon display form shown in fig. 8E, so that the display areas of the device icons do not overlap.
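The overlap test that triggers the leader-line layout might be sketched as follows; the rectangle representation and the 8-pixel threshold are assumptions for illustration:

```python
import math

def rect_gap(r1, r2):
    """Shortest distance between two icon rectangles (x, y, w, h);
    0 if they touch or overlap."""
    gap_x = max(r1[0] - (r2[0] + r2[2]), r2[0] - (r1[0] + r1[2]), 0)
    gap_y = max(r1[1] - (r2[1] + r2[3]), r2[1] - (r1[1] + r1[3]), 0)
    return math.hypot(gap_x, gap_y)

def needs_leader_lines(icon_rects, threshold_px=8):
    """Switch to the leader-line layout of fig. 8E as soon as any two device
    icons overlap or their display areas come closer than the threshold."""
    for i in range(len(icon_rects)):
        for j in range(i + 1, len(icon_rects)):
            if rect_gap(icon_rects[i], icon_rects[j]) < threshold_px:
                return True
    return False
```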
Based on the above viewing interface 530, the present application also provides a data transmission method, so that a user can quickly share selected data (for example, pictures, documents, videos, etc.) to other devices through a sliding operation (or a clicking operation, etc.) on the viewing interface 530. Therefore, the operation steps of sharing data by the user can be simplified, and the efficiency of sharing data to other equipment is improved. The following describes the data transmission method in detail by taking three application scenarios as examples.
In an application scenario one, in the UI embodiment exemplarily shown in fig. 9A to 9E, a user may trigger a sharing function based on augmented reality display in a multitasking interface, and share an application program or data of the application program in the multitasking interface to another device.
As shown in fig. 9A, the multitasking interface 520 is shown in fig. 9A; for its description, reference may be made to the related description of fig. 5B above. For example, when the electronic device 100 detects a long-press operation 901 on the card 521, the electronic device 100 enters a sharing interface corresponding to the card 521. The electronic device 100 extracts the application corresponding to the card 521 and the data types that can be shared from the current interface of the card 521, and presents them on the sharing interface as icons.
As shown in fig. 9B, the electronic device 100 starts the camera, and captures an image in real time through the camera to display on the sharing interface 920, where the display content in the sharing interface 920 includes a picture captured by the camera. Fig. 9B illustrates a sharing interface 920, where the sharing interface 920 includes a device image 531, a device image 532, a device image 533, and a device image 534.
In the sharing interface 920, specific descriptions of the device image 531, the device image 532, the device image 533, the device image 534, the device icon 5311, the device icon 5321, the device icon 5331, and the device icon 5341 may refer to the descriptions related to the device image 531, the device image 532, the device image 533, the device image 534, the device icon 5311, the device icon 5321, the device icon 5331, and the device icon 5341 in fig. 5F, which are not described herein again.
Sharing interface 920 may also include one or more icons, each of which identifies a type of sharable data, such as application icon 902 and file icon 903. Wherein the application icon 902 is associated with an application of the card 521; the file icon 903 is associated with the PDF document of "novel 1" in card 521.
The user can drag an icon to the display area of the corresponding device, and after the user releases his hand, the electronic device 100 sends the data corresponding to the icon to that device. As shown in fig. 9C, after the user selects the file icon 903, the user drags it on the sharing interface 920 to an effective area of the device image 534. The effective area is an area that can instruct the electronic device 100 to share data with the electronic device corresponding to the device image 534 (the electronic device 204). After the user releases his hand, the electronic device 100 transmits the PDF document of "novel 1" associated with the file icon 903 to the electronic device corresponding to the device image 534 (the electronic device 204) by wireless communication.
Specifically, the wireless communication modes include, but are not limited to, ZigBee, Bluetooth, wireless fidelity (Wi-Fi), ultra-wideband (UWB), near field communication (NFC), Wi-Fi Direct, and the like.
In some embodiments, when the user drags the file icon 903 to the display area of the device image 534, the electronic device 100 increases the brightness of the display area of the device image 534 on the sharing interface 920 to indicate that the user currently drags the file icon 903 to the active area of the device image 534.
In some embodiments, the user drags the file icon 903 to the display area of the device icon 5341, and after the user releases his hand, the electronic device 100 sends the PDF document of "novel 1" associated with the file icon 903 to the electronic device (electronic device 204) corresponding to the device image 534 in a wireless communication manner.
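A minimal sketch of the drop handling on release: the effective areas are hypothetical rectangles covering each device image (or its device icon), and `send` stands in for the already-established wireless connection:

```python
def on_drag_release(payload, drop_point, effective_areas, send):
    """Hit-test the release point of a dragged share icon (e.g. file icon 903)
    against each device's effective area and send the data to the hit device."""
    x, y = drop_point
    for device, (rx, ry, rw, rh) in effective_areas.items():
        if rx <= x <= rx + rw and ry <= y <= ry + rh:
            send(device, payload)   # e.g. the PDF document of "novel 1"
            return device
    return None  # released outside every effective area: nothing is sent
```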
As shown in fig. 9D, the electronic device 204 receives the PDF document of "novel 1" sent by the electronic device 100 and outputs a prompt box 1001 on its display interface 1000. The text of the prompt box 1001 may be "A PDF file is received from the electronic device 100; click the prompt box to view". When the electronic device 204 detects a click operation on the prompt box 1001, the electronic device 204 opens the PDF document of "novel 1", and as shown in fig. 9E, the PDF document of "novel 1" is displayed on the display interface 1002 of the electronic device 204. In some embodiments, fig. 9D is optional: the electronic device 204 receives the PDF document of "novel 1" sent by the electronic device 100 and directly opens the document, as shown in fig. 9E.
In this application, the icon to be shared may also be referred to as a first icon. The operation in which the user selects the file icon 903 and drags it on the sharing interface 920 into the effective area of the device image 534 may also be referred to as a third operation.
In some embodiments, the electronic device 100 may determine, according to the data type that the user wants to share, whether the target device supports outputting that data type. If not, the electronic device 100 outputs prompt information to prompt the user to select a device other than the target device.
As shown in fig. 10A and 10B, the user drags the file icon 903 to the display area of the device image 532. The device type of the electronic device 201 corresponding to the device image 532 is an audio device, and the device attribute of the electronic device 201 does not include a display function. Therefore, when the electronic device 100 detects that the user drags the file icon 903 to the display area of the device image 532, the electronic device 100 outputs a prompt message 1100 "HUAWEI soundX cannot execute the task", indicating that the electronic device corresponding to the device image 532 cannot output the PDF document corresponding to the file icon 903. Optionally, the electronic device 100 outputs the prompt message 1100 when the user drags the file icon 903 to the display area of the device image 532 and releases the finger.
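The capability check described above can be sketched as a simple match between the data type and the target device's attributes. In the following Kotlin sketch, the attribute fields and type names are assumptions introduced for illustration, not the schema used by the embodiments:

```kotlin
// Minimal capability check: the data type to be shared is matched against
// the target device's attributes before the transfer is allowed.
enum class DataType { DOCUMENT, PICTURE, VIDEO, AUDIO }

data class DeviceAttributes(val name: String, val hasDisplay: Boolean, val hasSpeaker: Boolean)

fun supportsOutput(device: DeviceAttributes, type: DataType): Boolean = when (type) {
    DataType.DOCUMENT, DataType.PICTURE, DataType.VIDEO -> device.hasDisplay
    DataType.AUDIO -> device.hasSpeaker
}

fun main() {
    // An audio device without a display, like the electronic device 201 above.
    val soundX = DeviceAttributes("HUAWEI soundX", hasDisplay = false, hasSpeaker = true)
    if (!supportsOutput(soundX, DataType.DOCUMENT)) {
        println("${soundX.name} cannot execute the task")  // prompt message 1100
    }
}
```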
The prompting manner is not limited to that shown in fig. 10A and 10B; in some embodiments, the display form of a device icon may be used to prompt the user which devices are selectable for data sharing.
As shown in fig. 10C, the user selects the file icon 903. Because the file icon 903 is associated with the PDF document "novel 1", when the electronic device detects that the file icon 903 is selected, the display areas of the device icon 5311, the device icon 5331, and the device icon 5341 in the sharing interface 920 are highlighted (or their icon colors are changed, etc.); optionally, the display areas of the device image 531, the device image 533, and the device image 534 are highlighted. This indicates that the electronic device 202, the electronic device 203, and the electronic device 204, which respectively correspond to the device image 531 (device icon 5311), the device image 533 (device icon 5331), and the device image 534 (device icon 5341), are devices that support outputting the PDF document associated with the file icon 903, and prompts the user to drag the file icon 903 to the display areas of these devices for data sharing.
Optionally, the display area of the device icon 5321 differs in brightness (or color, etc.) from those of the device icon 5311, the device icon 5331, and the device icon 5341, which indicates that the electronic device 201 corresponding to the device icon 5321 does not support outputting the PDF document associated with the file icon 903, and prompts the user not to drag the file icon 903 into the display area of the device image 532.
In this application, the display form of the device icon 5311, the device icon 5331, and the device icon 5341 may also be referred to as a first display form, and the display form of the device icon 5321 may also be referred to as a second display form. The device icons may also have more display forms, which is not limited in this application.
In the UI embodiment exemplarily shown in fig. 11A to 11D, a user may trigger a sharing function based on augmented reality display through a screen capture operation to share a screen capture image with other devices.
As shown in fig. 11A, fig. 11A shows a user interface 1110, and optionally, the user interface 1110 may be any display interface in an electronic device. When the electronic device 100 displays the user interface 1110 and receives a screen capture operation, the electronic device 100 collects display content of a current interface and generates a picture file. The screen capture operation can be triggered by one or more virtual keys or one or more physical keys.
As shown in fig. 11B, the electronic device 100 receives the screen capture operation, collects the display content of the current interface, and generates a picture file. A screenshot thumbnail 1111 is displayed on the current user interface 1110 and is associated with the corresponding picture file. As shown in fig. 11C, the user presses the screenshot thumbnail 1111 for a long time; when the electronic device 100 detects the long-press operation on the screenshot thumbnail 1111, the sharing function is triggered, and the electronic device displays the sharing interface 1120 shown in fig. 11D. The electronic device starts the camera and captures an image in real time for display on the sharing interface 1120, so that the display content of the sharing interface 1120 includes the picture captured by the camera. As shown in fig. 11D, the sharing interface 1120 includes a device image 531, a device image 532, a device image 533, and a device image 534.
The sharing interface 1120 also includes the screenshot thumbnail 1111. The user can freely drag the screenshot thumbnail 1111; when the user drags the screenshot thumbnail 1111 to the display area of any device in the sharing interface 1120 and releases the finger, the electronic device 100 sends the picture file associated with the screenshot thumbnail 1111 to that device.
It should be noted that, based on the same inventive concept, the principle of sharing data by dragging the screenshot thumbnail 1111 to the display area of another device is similar to that of sharing data by dragging the file icon 903 described above. Therefore, for the implementation, refer to the embodiments and corresponding descriptions shown in fig. 9C to 9E; details are not described herein again.
In application scenario three, in the UI embodiments exemplarily shown in fig. 12A to 12E, when the electronic device detects an operation of selecting a picture for sharing, the user may trigger the sharing function based on augmented reality display, to share one or more picture files with other devices.
Fig. 12A exemplarily shows a user interface 1210. As shown in fig. 12A, the user interface 1210 may include one or more of the following areas: an area 1201, an area 1202, and an area 1203. Wherein:
The area 1201 may be used to display one or more pictures in the gallery, which may include a picture selected by the user, such as the selected picture 1205. In some embodiments, a marker 1206 may be displayed on the selected picture 1205, and the marker 1206 may indicate that the corresponding picture 1205 is selected by the electronic device 100 (i.e., the picture has been selected by the user). In still other embodiments, the user may make a left or right swipe gesture in the area 1201 to switch or update the displayed pictures. The picture 1205 may be a thumbnail; the original image corresponding to a picture displayed in the area 1201 may be stored on the electronic device 100, or may be stored on a cloud server.
One or more service options (e.g., browser, information, etc.) may be displayed in the area 1203. The application or protocol corresponding to a service option can support sharing the picture selected by the user with a contact or a server. In some embodiments, in response to a detected operation on a service option in the area 1203 (e.g., a touch operation on the "information" icon), the electronic device 100 may trigger a process of sharing the selected picture with a cloud contact or a cloud server through the application or protocol corresponding to the service option. The process may include: the electronic device 100 opens the application or protocol and displays its user interface, detects a data sharing operation performed by the user in that user interface, and in response to the operation, shares the selected picture with the cloud contact or the server through the application or protocol.
The area 1202 may be used to display nearby device options that the electronic device 100 discovers by itself, such as a smart screen, a Mate 30 Pro, a MateBook X, and a printer. A device option (e.g., Mate 30 Pro or MateBook X) displayed in the area 1202 may be used to trigger a sharing operation. In response to a detected operation acting on a device option (e.g., a touch operation on a device icon), the electronic device 100 may trigger a process of sharing the selected picture with the device corresponding to the device option selected by the operation. The process may include: the electronic device 100 establishes a communication connection with the device corresponding to the selected device option, and then transmits the selected picture to that device through the communication connection.
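The two-step process described above (establish a connection, then transmit) can be sketched as follows in Kotlin. The interfaces and names here are assumptions introduced for illustration, not the APIs of the embodiments:

```kotlin
// Sketch of the sharing process: first establish a communication connection
// with the device corresponding to the selected option, then send the picture.
interface Connection {
    fun send(data: ByteArray)
    fun close()
}

fun interface Connector {
    fun connect(deviceId: String): Connection
}

fun shareSelectedPicture(connector: Connector, deviceId: String, picture: ByteArray) {
    val connection = connector.connect(deviceId)   // step 1: establish the connection
    try {
        connection.send(picture)                   // step 2: transmit the picture
    } finally {
        connection.close()
    }
}

fun main() {
    // A fake connector standing in for an actual transport (e.g. Wi-Fi or Bluetooth).
    val fake = Connector { id ->
        object : Connection {
            override fun send(data: ByteArray) = println("sent ${data.size} bytes to $id")
            override fun close() = println("connection to $id closed")
        }
    }
    shareSelectedPicture(fake, "Mate 30 Pro", ByteArray(2048))
}
```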
The user interface 1210 also includes a live view sharing control 1204, which is used for triggering entry into the sharing interface. When the electronic device 100 detects a user operation directed to the live view sharing control 1204, the electronic device 100 starts the camera and displays a sharing interface 1220 as shown in fig. 12B. The sharing interface includes the image captured by the camera, device icons, and a to-be-shared photo bar 1221.
In some embodiments, the live view sharing control 1204 in the user interface 1210 is optional, and the electronic device 100 may not display the control 1204. When the electronic device 100 displays the user interface 1210 and detects a lift operation, the electronic device 100 triggers display of the sharing interface 1220. For the lift operation, refer to the description of fig. 7C. In this embodiment, at time T1, the electronic device 100 displays the user interface 1210; when the electronic device 100 detects the lift operation, at time T2, the electronic device 100 displays the sharing interface 1220, where the time interval between time T1 and time T2 is less than a threshold.
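The timing condition above can be captured in a few lines. The following Kotlin sketch is purely illustrative; the class name and threshold value are assumptions:

```kotlin
// The sharing interface 1220 is displayed only if the lift operation arrives
// within a threshold after the user interface 1210 is shown (T2 - T1 < threshold).
class LiftTrigger(private val thresholdMs: Long) {
    private var shownAtMs: Long = Long.MIN_VALUE

    fun onInterfaceDisplayed(nowMs: Long) {           // time T1
        shownAtMs = nowMs
    }

    fun shouldOpenSharing(liftAtMs: Long): Boolean =  // time T2
        liftAtMs - shownAtMs < thresholdMs
}

fun main() {
    val trigger = LiftTrigger(thresholdMs = 2000)
    trigger.onInterfaceDisplayed(nowMs = 10_000)
    println(trigger.shouldOpenSharing(liftAtMs = 11_200))  // true: within threshold
    println(trigger.shouldOpenSharing(liftAtMs = 15_000))  // false: too late
}
```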
It is to be appreciated that the lift operation is merely an exemplary user operation, and that electronic device 100 may also access sharing interface 1220 via other user operations.
The to-be-shared photo bar 1221 is used to display one or more pictures in the gallery, which may include a picture selected by the user, such as the selected picture 1205. In some embodiments, a marker 1206 may be displayed on the selected picture 1205, and the marker 1206 may indicate that the corresponding picture 1205 is selected by the electronic device (i.e., the picture has been selected by the user). In still other embodiments, the user may make a left or right swipe gesture in the photo bar 1221 to switch or update the displayed pictures.
After the user selects one or more pictures and then selects any device in the sharing interface, when the electronic device 100 detects a user operation on the corresponding device icon (e.g., a click operation on the device icon), the electronic device 100 may trigger a process of sharing the selected pictures with the device corresponding to that device icon. The process may include: the electronic device 100 establishes a communication connection with the device corresponding to the selected device icon, and then transmits the selected pictures to that device through the communication connection.
As shown in fig. 12C, with the picture 1205 selected, the user clicks the device icon 5311; the electronic device 100 detects the user operation on the device icon 5311 and sends the picture 1205 to the electronic device 202 corresponding to the device image 531. As shown in fig. 12D, the electronic device 202 receives the picture 1205 sent by the electronic device 100 and outputs a prompt box 1211 on its display interface, where the text content of the prompt box 1211 may be "A picture is received from the electronic device 100. Tap the prompt box to view it." When the electronic device 202 detects a click operation on the prompt box 1211, the electronic device 202 opens the picture 1205, and as shown in fig. 12E, the picture 1205 is displayed on the display interface of the electronic device 202. In some embodiments, the step shown in fig. 12D is optional: the electronic device 202 receives the picture 1205 sent by the electronic device 100 and directly opens the picture, as shown in fig. 12E.
In some embodiments, the display form of a device icon may be used to prompt the user which devices are selectable for data sharing.
As shown in fig. 12F, the user selects the picture 1205, whose data type is a picture. When the electronic device 100 detects that the picture 1205 is selected, the display areas of the device icon 5311, the device icon 5331, and the device icon 5341 in the sharing interface 1220 are highlighted (or their icon colors are changed, etc.); optionally, the display areas of the device image 531, the device image 533, and the device image 534 are highlighted. This indicates that the electronic device 202, the electronic device 203, and the electronic device 204, which respectively correspond to the device image 531 (device icon 5311), the device image 533 (device icon 5331), and the device image 534 (device icon 5341), are devices that support outputting the picture 1205, and prompts the user to click the device icons of these devices for data sharing.
Optionally, the display area of the device icon 5321 differs in brightness (or color, etc.) from those of the device icon 5311, the device icon 5331, and the device icon 5341, which indicates that the electronic device 201 corresponding to the device icon 5321 does not support outputting the picture 1205, and prompts the user not to click the device icon 5321 of the device image 532.
In this application, the device icons may also have more display forms, which is not limited in this application.
In the three scenarios exemplarily shown above, data transmission between devices is implemented based on the device identification method provided in the embodiments of this application, which can simplify the operation steps for a user to share data and improve the efficiency of sharing data with other devices. Further, an embodiment of this application provides a method for sharing photos, by which a user can quickly share photos from the shooting preview interface of a camera application. The method for sharing photos is described in detail below.
In an environment with a massive number of terminals, sharing pictures, files, and the like among multiple terminals is becoming more and more common. Quickly and efficiently finding the target terminal with which the user desires to share is therefore very important for improving the user's efficiency and experience.
Taking a mobile phone as an example, after a user takes a photo with the mobile phone, the user often needs to share the photo with other users or other electronic devices. In the current photo sharing process, the user must perform a series of tedious operations, such as opening the gallery, selecting a picture, clicking to share, searching for other electronic devices, selecting the target electronic device, and transmitting the picture, before the picture is shared with the target electronic device. This photo sharing process involves complex operations and multiple interaction flows, and its photo sharing efficiency is low.
FIG. 13 is a Graphical User Interface (GUI) diagram illustrating an example of a process for sharing photos. Fig. 13 (a) shows interface content 1301 currently output by the mobile phone in the unlock mode, where the interface content 1301 displays a plurality of applications (apps), such as music, settings, photo album, and camera. It should be understood that the interface content 1301 may also include other more application programs, which is not limited in this embodiment of the application.
As shown in fig. 13 (a), the user clicks an icon of the camera application, and in response to the user's clicking operation, the cellular phone enters a main interface 1302 of the camera application as shown in fig. 13 (b), which is alternatively referred to as a "shooting preview interface", and a screen presented in the shooting preview interface is referred to as a "preview image" or a "preview screen".
It should be understood that, in this embodiment of this application, as shown in fig. 13 (b), the shooting preview interface 1302 may include a preview screen in the middle, and keys, menu options, and the like of the camera application displayed in the top and bottom regions of the interface. In subsequent embodiments, both "shooting preview interface" and "preview screen" may be used to describe the shooting interface of the camera application; for example, "display a reminder window on the shooting preview interface" and "display a reminder window in the preview screen" are not strictly distinguished, and details are not described later.
It should also be understood that, in this embodiment of this application, the shooting preview interface may represent an interface including a preview screen, a shooting shutter key, a local album icon, a camera switching icon, and the like. If the display content on the interface changes, for example, a tag of an identified device is displayed, the interface may still be referred to as the shooting preview interface; this is not described in detail later.
On the camera application home interface 1302, various keys and menu options are included, such as a shooting shutter key 31, a local album icon 32, a camera switching key 33, and the like, and a user can perform different operations through the various keys and menu options. The user can perform operation 1 as shown in (b) of fig. 13, click the shooting shutter key 31, and in response to the user's shooting operation, the mobile phone takes a picture and saves the taken picture in the local album.
When the user desires to share the currently taken photo or other photos of the local album with other electronic devices, the user may perform operation 2 shown in fig. 13 (b), click the local album icon 32 of the camera application home interface 1302, and in response to the click operation of the user, the mobile phone enters the photo display interface 1303. The photo display interface 1303 may display a currently taken photo, as shown in fig. 13 (c), and the user clicks a "share" button of the photo display interface 1303, and the mobile phone enters the photo sharing interface 1304.
The photo sharing interface 1304 may include a photo area and a sharing menu area. The photo area may display multiple taken photos, and the user may click the "select" box at the lower right corner of a photo to select the photo desired to be shared. The sharing menu area may provide the user with multiple photo sharing manners, such as "Huawei Share", "send to friend", "Bluetooth", "microblog", "information", "email", and "memo"; different photo sharing manners may be associated with different applications (e.g., WeChat, etc.), which is not described herein again.
As shown in fig. 13 (d), when the user clicks the "Huawei Share" button of the photo sharing interface 1304, the mobile phone may enter the interface shown in fig. 13 (e) and display multiple electronic devices with which photos can be shared, such as a P30 and a MateBook. The user can select the icon of the target electronic device as needed, so as to share the selected picture with the target electronic device.
Accordingly, after the user clicks the target electronic device with which the photos are to be shared, a receiving window may pop up on the target electronic device, and the receiving window may be used to select whether to receive the currently shared photos.
The above describes the process of sharing a picture with other electronic devices after the user takes the picture through the camera application. In this process, the user must sequentially perform multiple operations, such as taking a picture, opening the gallery, selecting the picture, clicking to share, selecting a sharing manner, searching for other electronic devices, selecting the target electronic device, and transmitting the picture, before the taken picture is shared with the target electronic device. This photo sharing process involves complex operations and multiple interaction flows, and its photo sharing efficiency is low.
Therefore, embodiments of this application provide a method for sharing photos; in the UI embodiments exemplarily shown in fig. 14 to 18, a user may quickly share photos with other electronic devices through the camera application.
Fig. 14 is a schematic diagram of an exemplary graphical user interface for a process of sharing photos according to an embodiment of the present disclosure. In fig. 14, (a) shows interface contents 1401 currently output by the mobile phone in the unlock mode, and the user clicks an icon of the camera application, and in response to a click operation by the user, the mobile phone displays a shooting preview interface 1402 shown in fig. 14 (b). On the shooting preview interface 1402, the user clicks the shooting shutter key 31, and in response to the user's shooting operation, the mobile phone takes a picture and saves the taken picture in a local album.
The user performs the operation shown in fig. 14 (c) and long-presses the local album icon 32. In response to the long-press operation, the cell phone displays an interface 1404 shown in fig. 14 (d), on which an icon 30 of a thumbnail photo, alternatively referred to as a "photo thumbnail", is displayed. Meanwhile, the mobile phone starts the device identification function and identifies, according to the preview screen presented on the current shooting preview interface 1404, whether the preview screen includes other electronic devices.
Illustratively, as shown in (d) of fig. 14, if the currently presented preview screen includes the cell phone 10 and the personal computer (PC) 20 on the desk, the cell phone may recognize the cell phone 10 and the PC 20 in the preview screen and display their recognized names in the interface 1404, for example, the cell phone 10 is "P40" and the PC 20 is "MateBook".
Alternatively, the mobile phone may not display the names of the other electronic devices in the identified preview screen, and only mark "device 1", "device 2", and the like, which is not limited in this embodiment of the application.
It should be noted that the preview screens presented in fig. 14 (b) and fig. 14 (c) may be obtained by the front camera or the rear camera of the mobile phone; the camera used for taking pictures is not limited in this embodiment of this application. For example, if the person photo in fig. 14 (b) is taken by the front camera and the user wants to recognize an electronic device through the rear camera, the user can switch cameras by clicking the camera switching key 33. Likewise, if the photo is obtained by the rear camera and the user wants to recognize an electronic device through the front camera, the user can also switch cameras by clicking the camera switching key 33.
It should be further noted that the above embodiment uses the long-press operation on the local album icon 32 as an example of the operation for triggering the photo sharing process. It should be understood that the photo sharing process provided in this embodiment of this application may also be triggered by other preset operations, and identification of the electronic devices in the preview screen may likewise be triggered by other preset operations; for example, the preset operation is not limited to long-pressing the local album icon 32, and may be double-clicking the local album icon 32, drawing a fixed pattern on the shooting preview interface 1403, or the like. This is not limited in this embodiment of this application.
In one possible implementation, the mobile phone triggers its identification function only after detecting the long-press operation on the local album icon 32. In other words, if the mobile phone does not detect the long-press operation on the local album icon 32, it may not recognize the objects in the preview screen, and the display may be as shown in fig. 14 (c). When the mobile phone detects that the user presses the local album icon 32 for a long time, the mobile phone triggers recognition of the objects in the preview screen, marks the names "P40" and "MateBook" of the recognized electronic devices, and displays the interface shown in (d) of fig. 14. This implementation can prevent the mobile phone from always being in the state of recognizing objects in the preview screen, thereby reducing the power consumption of the mobile phone.
In another possible implementation, the mobile phone may keep the device identification function enabled, that is, the mobile phone continuously recognizes the objects in the preview screen; after detecting the long-press operation on the local album icon 32, it marks the names of the recognized electronic devices and displays the "P40" and "MateBook" icons as shown in (d) of fig. 14.
This implementation enables the mobile phone to determine the objects in the preview screen in advance and to quickly display the recognized device names in the interface when the user starts the photo sharing function by long-pressing the local album icon 32, thereby increasing the speed of recognizing objects in the preview screen.
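For illustration, the two strategies can be contrasted in a small Kotlin sketch. The recognizer interface below is an assumption introduced here, not part of the embodiments:

```kotlin
// On-demand recognition saves power; always-on recognition lets the labels
// appear immediately when the long press is detected.
fun interface Recognizer {
    fun recognize(frame: ByteArray): List<String>
}

class PreviewRecognition(private val recognizer: Recognizer, private val alwaysOn: Boolean) {
    private var cachedLabels: List<String> = emptyList()

    fun onPreviewFrame(frame: ByteArray) {
        if (alwaysOn) cachedLabels = recognizer.recognize(frame)  // runs on every frame
    }

    fun onLongPressAlbumIcon(frame: ByteArray): List<String> =
        if (alwaysOn) cachedLabels           // already available: display at once
        else recognizer.recognize(frame)     // triggered only now: lower power use
}

fun main() {
    val recognizer = Recognizer { listOf("P40", "MateBook") }
    val onDemand = PreviewRecognition(recognizer, alwaysOn = false)
    println(onDemand.onLongPressAlbumIcon(ByteArray(0)))  // [P40, MateBook]
}
```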
After the mobile phone recognizes the P40 and the MateBook included in the current preview screen, the user may, as needed, press the thumbnail icon 30 for a long time and drag it to the target device to be shared with.
For example, as shown in fig. 14 (d), the icons "P40" and "MateBook" are displayed on the preview screen; the user holds the thumbnail icon 30, drags it to the icon area of the P40, and releases it. Alternatively, the user holds the thumbnail icon 30, drags it to an arbitrary position in the area where the P40 is located, and releases it.
Optionally, after the user drags the thumbnail icon 30 to the position of the "P40" icon, the "P40" icon may be presented in a different color, or exhibit dynamic effects such as a size change, jumping, or flashing, to remind the user that the currently taken photo will be shared with the P40 recognized in the preview screen. For example, as shown in fig. 14 (e), when the user drags the thumbnail icon 30 to the position of the "P40" icon, the color of the "P40" icon changes; at this time, the user releases the thumbnail icon 30, and the currently taken photo is shared with the P40.
In another possible implementation, a reminder control may be displayed on the preview screen while the user drags the thumbnail icon 30. Illustratively, as shown in fig. 14 (e), the reminder control may be an arrow 40, and the arrow 40 may be displayed statically, in a jumping manner, or in a flashing manner, to prompt the user to drag the thumbnail icon 30 to the position indicated by the arrow 40 to implement the photo sharing function. The display manner of the reminder control is not limited in this embodiment of this application.
It should be understood that, for the implementation process described above, the mobile phone may detect and identify other electronic devices included in the preview screen in many different ways, such as image detection, 3D scanning technology, machine vision, and the like, and the embodiment of the present application does not limit the way in which the mobile phone identifies other electronic devices in the preview screen.
It should also be understood that, in the embodiment of the present application, the mobile phone may further identify other electronic devices in the preview screen through a plurality of possible positioning technologies, and position the positions of the other electronic devices.
Optionally, the positioning technology in this embodiment of this application may include one of Bluetooth-based wireless sensing positioning, ultra wide band (UWB)-based wireless sensing positioning, computer vision-based positioning, and the like, or a fusion of multiple positioning technologies listed above, or other positioning technologies; the manner in which the mobile phone positions other electronic devices is not limited in this embodiment of this application.
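Purely for illustration, a fusion of two position estimates for the same device (e.g., one from UWB ranging and one from computer vision) could look like the following Kotlin sketch. The fixed weighting is an assumption introduced here and is not the fusion method used by the embodiments:

```kotlin
// Weighted combination of two estimates of a device's position in the preview.
data class Position(val x: Double, val y: Double)

fun fuse(uwb: Position, vision: Position, uwbWeight: Double = 0.6): Position {
    val visionWeight = 1.0 - uwbWeight
    return Position(
        uwbWeight * uwb.x + visionWeight * vision.x,
        uwbWeight * uwb.y + visionWeight * vision.y
    )
}

fun main() {
    // Roughly Position(x=1.04, y=2.06), up to floating-point rounding.
    println(fuse(Position(1.00, 2.10), Position(1.10, 2.00)))
}
```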
In addition, in the embodiment of the present application, after the mobile phone identifies other electronic devices included in the preview screen, the position at which the icon of the electronic device is displayed may be determined according to the display position of the object in the current preview screen.
In one possible manner, the mobile phone may display the icon marking another electronic device in the area where that electronic device is located in the preview screen. For example, taking (d) in fig. 14 as an example, after the mobile phone recognizes the P40 and the MateBook in the preview screen, the icon marked "P40" is displayed at the recognized location of the cell phone 10, and the icon marked "MateBook" is displayed at the location of the PC 20.
Alternatively, the icon marking another electronic device may be displayed in an area near the positioning chip of that electronic device. For example, the mobile phone communicates with the P40 through a UWB chip to locate the position of the P40 in the preview screen; if the UWB chip of the P40 is installed at its upper right corner, the "P40" icon in (d) of fig. 14 may be displayed in the area of the UWB chip at the upper right corner of the P40. This is not limited in this embodiment of this application.
In another possible manner, the icons marking other electronic devices may be displayed in blank areas of the preview screen, so as not to obstruct other objects in the preview screen. Exemplarily, as shown in (d) of fig. 14, after the cell phone recognizes the P40 and the MateBook in the preview screen, the icon marked "P40" is displayed at the left boundary of the preview screen so as not to obscure the PC on the right side of the P40; meanwhile, the icon marked "MateBook" is displayed at the right boundary of the preview screen so as not to obscure the cell phone on the left side of the MateBook.
This icon display manner can mark the recognized electronic devices without obscuring other objects in the preview screen, does not affect the user's view, and improves the user's visual experience.
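The placement choice described above can be sketched as follows; the geometry, label size, and fallback rule in this Kotlin sketch are assumptions for illustration, not the layout algorithm of the embodiments:

```kotlin
// Place the device label at the device's position if it does not overlap
// another recognized object; otherwise move it to the nearer blank boundary.
data class Box(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    fun overlaps(other: Box) =
        left < other.right && other.left < right && top < other.bottom && other.top < bottom
}

fun placeLabel(device: Box, otherObjects: List<Box>, screenWidth: Int): Box {
    val labelW = 120
    val labelH = 40
    val atDevice = Box(device.left, device.top - labelH, device.left + labelW, device.top)
    if (otherObjects.none { it.overlaps(atDevice) }) return atDevice
    // Fall back to the nearer screen boundary at the same height (a blank area).
    return if (device.left < screenWidth / 2)
        Box(0, device.top - labelH, labelW, device.top)
    else
        Box(screenWidth - labelW, device.top - labelH, screenWidth, device.top)
}

fun main() {
    val p40 = Box(200, 400, 400, 800)
    val pc = Box(150, 300, 900, 800)  // overlaps the default label spot above the P40
    println(placeLabel(p40, listOf(pc), screenWidth = 1080))
    // Box(left=0, top=360, right=120, bottom=400): label moved to the left boundary
}
```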
By the foregoing method, in the process of taking a photo, the user can start the device identification function and the positioning function of the mobile phone through a preset operation; the mobile phone identifies, by combining these functions, other electronic devices included in the preview screen of the camera, and the user can directly drag the photo to be shared to the area where another electronic device is located, so as to quickly share the photo with the surrounding electronic devices. This process simplifies the operation flow of sharing photos, shortens the time of sharing photos, and improves user experience.
In another possible scenario, when the mobile phone identifies other electronic devices in the preview screen, an identified electronic device may be blocked by an obstacle, that is, the electronic device cannot be seen in the preview screen. For this scenario, the embodiment of this application further provides a method for sharing photos, so that a taken photo can be quickly shared with an electronic device that is blocked in the preview screen.
Fig. 15 is a schematic diagram of a graphical user interface of another process for sharing photos according to an embodiment of this application. Illustratively, as shown in fig. 15 (a), on the shooting preview interface 1501, the cell phone recognizes that the PC 20 in the preview screen is a MateBook and displays an icon labeled "MateBook". In addition, the cell phone also recognizes that there is a blocked device 1 behind the MateBook. In this scenario, the MateBook blocks the device 1, and a reminder window 50 may be displayed on the shooting preview interface 1501 of the mobile phone, where the reminder window 50 may include text information for reminding the user of the detected device 1.
Optionally, in addition to the text reminder in the reminder window 50, this embodiment of this application may further include an icon reminder. For example, in addition to the reminder window 50, the shooting preview interface 1501 may include icons marking the position of the blocked electronic device, such as a statically displayed arrow, a dynamically flashing arrow, or a jumping arrow; this embodiment of this application is not limited thereto.
Illustratively, as shown in fig. 15 (a), the reminder window 50 displays: "A device 1 is detected here. Share to it?" After the user clicks the reminder window 50, the mobile phone displays an interface 1502 as shown in (b) of fig. 15; the interface 1502 includes a picture sharing window 60, and the user can click the "share" button of the picture sharing window 60 to confirm sharing the currently taken photo with the blocked device 1.
Alternatively, after the user clicks the reminder window 50 shown in fig. 15 (a), the mobile phone may directly share the taken photo with the blocked device 1 without further displaying the interface shown in fig. 15 (b); this is not limited in this embodiment of this application.
It should be noted that the mobile phone may communicate with other nearby electronic devices in multiple possible manners, such as Bluetooth or a wireless fidelity (WIFI) module, so that the mobile phone can sense the electronic devices existing nearby. Alternatively, the mobile phone may determine, by using a wireless positioning technology such as UWB, that another electronic device exists nearby, recognize the type of that electronic device, and display it in the shooting preview interface. The communication interaction manner and the positioning manner between the mobile phone and other nearby electronic devices are not limited in this embodiment of this application.
By this method, when the mobile phone recognizes that another electronic device exists in the preview screen but is blocked by an obstacle, reminder information such as text or an icon can be displayed on the shooting preview interface during the photo sharing process, to remind the user of the position of the blocked electronic device and the like. The user can then quickly share the taken photo with the blocked electronic device. This provides a possible way for the user to share photos with a blocked electronic device and simplifies the user's photo sharing operation steps.
In yet another possible scenario, the mobile phone may identify other nearby electronic devices through wireless positioning technology while those electronic devices are not displayed in the current preview screen. For this scenario, the embodiment of this application may also display reminder information on the shooting preview interface, to remind the user that other electronic devices exist in a certain direction.
Illustratively, as shown in (c) of fig. 15, on the shooting preview interface 1503, the preview screen acquired by the camera of the cell phone does not include any electronic device, but the cell phone may detect 3 electronic devices in the region to the left of the preview screen. In such a scenario, a reminder window 70 may be displayed on the interface 1503, where the reminder window 70 may include text information for reminding the user of the detected electronic devices.
Optionally, in addition to the text reminder in the reminder window 70, this embodiment of this application may further include an icon reminder. For example, in addition to the reminder window 70, the shooting preview interface 1503 may include icons marking the directions of the detected electronic devices, such as a statically displayed arrow, a dynamically flashing arrow, or a jumping arrow; this is not limited in this embodiment of this application.
Illustratively, as shown in (c) of fig. 15, the reminder window 70 displays: "3 electronic devices are detected here. Turn the camera to acquire information about them." After the user clicks the reminder window 70, the mobile phone displays an interface 1504 as shown in (d) of fig. 15; the interface 1504 includes a device list window 80, and the user can click any device in the device list window 80, for example, the device 3, to confirm sharing the currently taken photo with the device 3.
Alternatively, in another possible manner, the user may turn the mobile phone according to the reminder information on the interface 1503, so that the camera of the mobile phone can capture the 3 detected electronic devices and display, in the preview screen, the device 3 with which the user intends to share the photo; the taken photo can then be quickly shared with that electronic device according to the method described in fig. 14.
By this method, when no other electronic device exists in the preview screen acquired by the camera of the mobile phone but the mobile phone detects that other electronic devices exist nearby, reminder information such as text or icons can be displayed on the shooting preview interface, to remind the user of the information or positions of the nearby electronic devices with which photos can be shared. In this way, during photo sharing, the user can quickly share the taken photo by dragging it to another electronic device in the preview screen. This provides another possible way of sharing photos and simplifies the user's photo sharing operation steps.
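The reminder decisions in the two scenarios above (a blocked device inside the frame, and devices outside the frame) can be sketched as a small classification. The following Kotlin sketch is illustrative; the projection input and function names are assumptions introduced here:

```kotlin
// A device whose wirelessly located position projects inside the preview frame
// but is not visually recognized is treated as blocked (reminder window 50);
// a device located outside the frame produces a direction reminder (window 70).
data class ProjectedPosition(val x: Int, val y: Int)

fun reminderFor(
    position: ProjectedPosition,
    frameWidth: Int,
    frameHeight: Int,
    visuallyRecognized: Boolean
): String {
    val insideFrame = position.x in 0 until frameWidth && position.y in 0 until frameHeight
    return when {
        insideFrame && visuallyRecognized -> "label the device in the preview screen"
        insideFrame -> "show reminder window 50: a blocked device is detected here"
        else -> "show reminder window 70: devices detected outside the preview"
    }
}

fun main() {
    println(reminderFor(ProjectedPosition(500, 400), 1080, 1920, visuallyRecognized = false))
    println(reminderFor(ProjectedPosition(-300, 400), 1080, 1920, visuallyRecognized = false))
}
```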
In this embodiment of this application, the mobile phone serves as the sending device, and the electronic device that receives the pictures shared by the user serves as the receiving device. For the photo sharing processes in fig. 14 and 15 above, after the user drags the thumbnail icon 30 to a receiving device recognized by the mobile phone, a photo receiving window may correspondingly appear on the receiving device.
Fig. 16 is a schematic diagram of an exemplary graphical user interface for receiving photos according to an embodiment of this application. For example, fig. 16 (a) illustrates a possible interface 1601 of a receiving device. It should be understood that the interface 1601 is not limited to the home interface of the receiving device, and may be an operation interface of any application or the like; this is not limited in this embodiment of this application.
Taking the home interface 1601 of the receiving device as an example, after the user performs the photo sharing operation on the mobile phone, the receiving device may display an interface 1602 shown in (b) of fig. 16, where the interface 1602 includes a photo receiving window 90. Optionally, the photo receiving window 90 may provide the user with buttons such as "view" and "close", so that the user can quickly view the shared photo on the receiving device.
Optionally, after being displayed on the interface of the receiving device for a preset duration, the photo receiving window 90 may automatically disappear or be hidden in the notification bar of the receiving device, and the user may view the photo sharing result in the notification bar through a pull-down operation, or further close the photo sharing result in the notification bar; for this process, refer to related operations in the prior art, and details are not described herein again.
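For illustration, the auto-hide behavior can be sketched with a plain JVM timer. The window and callback names in this Kotlin sketch are assumptions, not the receiving device's actual implementation:

```kotlin
import java.util.Timer
import kotlin.concurrent.schedule

// The receive window is dismissed (or moved to the notification bar)
// after a preset duration.
class ReceiveWindow(private val onHide: () -> Unit) {
    private val timer = Timer(true)  // daemon thread so it never blocks JVM exit

    fun showFor(presetDurationMs: Long) {
        println("show photo receive window 90")
        timer.schedule(presetDurationMs) { onHide() }  // auto-hide after the preset duration
    }
}

fun main() {
    ReceiveWindow { println("hide the window; keep the result in the notification bar") }
        .showFor(presetDurationMs = 3000)
    Thread.sleep(3500)  // keep the demo process alive until the timer fires
}
```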
It should be understood that after the user drags the thumbnail icon 30 to a recognized receiving device in the preview screen and releases the icon, the mobile phone may transmit the currently taken photo to the receiving device. For example, the transmission manner is not limited to Bluetooth transmission, WIFI transmission, near-field communication (NFC) transmission, or high-speed communication manners such as the fifth generation (5G) mobile communication system; the photo transmission manner is not limited in this embodiment of this application.
It should also be understood that the shared picture may be the latest picture taken after the user clicks the shooting shutter key, a picture taken before by the user, or a picture from another source stored on the user's mobile phone; this is not limited in this embodiment of this application.
In other words, the user may open the camera application and, without taking a picture, directly long-press and drag the local album icon, to share with the receiving device the first picture in the local album whose shooting date is closest to the current date, or a picture from another source stored on the user's mobile phone; this is not limited in this embodiment of this application.
In addition, an embodiment of this application further provides a method for sharing photos, by which the user can share multiple photos with a receiving device recognized in the preview screen through the camera application.
Fig. 17 is a schematic diagram of an exemplary graphical user interface for a process of sharing photos according to an embodiment of this application. In fig. 17, (a) shows the home interface 1701 currently output by the cell phone in the unlock mode; the user clicks the icon of the camera application on the home interface 1701, and in response to the click operation, the cell phone displays a shooting preview interface 1702 shown in fig. 17 (b). On the shooting preview interface 1702, the user clicks the shooting shutter key 31, and in response to the shooting operation, the mobile phone takes a photo and saves it in the local album.
The user performs the operation shown in fig. 17 (c): the user selects the local album icon 32 and drags it upward in the direction shown by the arrow. In response to the drag operation, the cell phone displays the interface 1704 shown in fig. 17 (d). A photo list is displayed on the interface 1704; as shown in fig. 17 (d), the photo list may display thumbnails of multiple photos, such as photo 1, photo 2, and photo 3. Optionally, the photo list may be displayed in the bottom area of the interface 1704 without affecting the display of the preview screen, so as to ensure that the user can see the content of the preview screen.
In one possible scenario, the photos in the photo list may be arranged in the order in which they were taken by the user. Illustratively, photograph 1 is the most recent photograph taken by the user, and photographs 2 and 3 are taken earlier than photograph 1.
Alternatively, the photos in the photo list may be arranged in another possible order; for example, if it is detected that the shooting location is the user's company, the photos whose shooting location is the company may be displayed in the photo list. This is not limited in this embodiment of this application.
In one possible case, after the user performs the operation shown in fig. 17 (c) to display the photo list, the first photo in the photo list may be selected by default; in other words, the selection box at the lower right corner of photo 1 in fig. 17 (d) is checked by default, identifying it as the selected photo to be shared. If the user does not desire to share photo 1, the user may click the selection box at the lower right corner of photo 1 to deselect it. Similarly, if the user desires to share photos 1, 2, and 3 at the same time, the user can click the selection box at the lower right corner of each photo to select multiple photos to be shared; details are not described herein again.
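The selection behavior described above can be modeled in a few lines. The following Kotlin sketch is illustrative only; the class name and indexing scheme are assumptions:

```kotlin
// The first photo in the list is selected by default, and tapping a photo's
// selection box toggles its selected state.
class PhotoSelection(photoCount: Int) {
    private val selected = BooleanArray(photoCount).also {
        if (photoCount > 0) it[0] = true   // photo 1 is selected by default
    }

    fun toggle(index: Int) {
        selected[index] = !selected[index]
    }

    fun selectedPhotos(): List<Int> = selected.indices.filter { selected[it] }
}

fun main() {
    val selection = PhotoSelection(3)   // photo 1 (index 0) selected by default
    selection.toggle(1)                 // user also checks photo 2
    selection.toggle(2)                 // ... and photo 3
    println(selection.selectedPhotos()) // [0, 1, 2] -> photos 1, 2, and 3
}
```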
After the user selects the photos 1, 2, and 3 to be shared, the user can long-press any area of the photos 1, 2, and 3; in response to the long-press operation, the mobile phone displays an interface 1705 as shown in (e) of fig. 17, and an icon 30 of a thumbnail photo is displayed on the interface 1705.
Meanwhile, the mobile phone starts the device identification function and identifies, according to the preview screen presented on the current shooting preview interface 1705, whether the preview screen includes other electronic devices. Optionally, the thumbnail icon 30 may display only a thumbnail of any one of the photos 1, 2, and 3 to be shared; this is not limited in this embodiment of this application.
Optionally, the process of sharing multiple photos provided in this embodiment of this application may also be triggered by other preset operations, and identification of the electronic devices in the preview screen may likewise be triggered by other preset operations; for example, the preset operation is not limited to selecting the local album icon 32 and dragging it upward, and may be double-clicking the local album icon 32, drawing a fixed pattern on the shooting preview interface 1703, or the like. This is not limited in this embodiment of this application.
Illustratively, as shown in (e) of fig. 17, if the cell phone 10 and the PC 20 on the desk are included in the currently presented preview screen, the cell phone may recognize the cell phone 10 and the PC 20 in the preview screen and display their recognized names in the preview screen, for example, the cell phone 10 is "P40" and the PC 20 is "MateBook". Alternatively, the mobile phone may not display the names of the recognized electronic devices in the preview screen and only mark them "device 1", "device 2", and the like; this is not limited in this embodiment of this application.
After the mobile phone recognizes the P40 and the MateBook included in the preview screen of the interface 1705, the user may drag the thumbnail icon 30 to the target device to be shared with, as needed.
For example, as shown in fig. 17 (f), the icons "P40" and "MateBook" are displayed on the preview screen; the user drags the thumbnail icon 30 to the icon area of the MateBook and releases it, so that the selected photos 1, 2, and 3 are shared with the MateBook. Alternatively, the user drags the thumbnail icon 30 to any position in the area where the MateBook is located and releases it, which also shares the selected photos 1, 2, and 3 with the MateBook.
Optionally, after the user drags the thumbnail icon 30 to the position of the MateBook icon and releases it, the MateBook icon may be presented in a different color, or exhibit dynamic effects such as a size change, jumping, or flashing, to remind the user that the currently taken photos will be shared with the MateBook recognized in the preview screen.
In one possible case, a reminder control may be displayed on the preview screen while the user drags the thumbnail icon 30. Illustratively, as shown in (f) of fig. 17, the reminder control may be an arrow 40 or the like, and the arrow 40 may be displayed statically, in a jumping manner, or in a flashing manner, to prompt the user to drag the thumbnail icon 30 to the position indicated by the arrow 40 to implement the photo sharing function. The display manner of the reminder control is not limited in this embodiment of this application.
It should be noted that, in the description of the embodiment in fig. 17, the same operation procedure and possible implementation manners as those described in fig. 14 to fig. 15 may refer to the corresponding description above, and are not described again here.
Similarly, in the process of sharing multiple photos, a situation that the receiving device in the preview interface is blocked may also occur, and a specific implementation process may refer to related description in fig. 15, which is not described herein again.
By this method, in the process of taking photos, the user can start the device identification function and the positioning function of the mobile phone through a preset operation, and the mobile phone identifies, by combining these functions, other electronic devices in the preview screen of the camera. The user can thus select multiple photos to be shared and directly drag them to the area where another electronic device is located, so as to quickly share the photos with the surrounding electronic devices. This process simplifies the operation flow of sharing photos, shortens the time of sharing photos, and improves user experience.
In the photo sharing process of fig. 17, after the user drags the thumbnail icon 30 to the PC 20 recognized by the mobile phone, a photo receiving window may correspondingly appear on the PC 20.
Fig. 18 is a schematic diagram of a graphical user interface for receiving photos according to another embodiment of this application. Illustratively, a possible interface of the PC 20 is shown in (a) of fig. 18. It should be understood that the PC 20 may display interfaces presented by different systems, such as a Windows system or a HarmonyOS (HongMeng) system, and the interface may also be any running interface during use of the PC 20; the display interface of the PC 20 is not limited in this embodiment of this application.
Taking a MateBook running the Windows system as an example, after the user performs the operation of sharing 3 photos from the mobile phone, the MateBook may display the photo receiving window 1801 shown in (b) of fig. 18. Optionally, the photo receiving window 1801 may display thumbnails of the photos 1, 2, and 3 shared by the user, and may further provide the user with buttons such as "view" and "close", so that the user can quickly view the shared photos.
Optionally, after being displayed on the interface of the receiving device for a preset duration, the photo receiving window 1801 may automatically disappear or be hidden in the status bar at the bottom of the MateBook, and the user may view the photo sharing result by clicking the status bar, or further close the photo sharing result in the status bar; for this process, refer to related operations in the prior art, and details are not described herein again.
It should be understood that after the user drags the thumbnail icon 30 to the recognized MateBook in the preview screen and releases the icon, the mobile phone may transmit the currently taken photos to the MateBook. For example, the transmission manner between the mobile phone and the MateBook is not limited to Bluetooth transmission, WIFI transmission, near-field communication (NFC) transmission, or high-speed communication manners such as the fifth generation (5G) mobile communication system; this is not limited in this embodiment of this application.
It should also be understood that the shared pictures may be the latest photo taken after the user clicks the shooting shutter key, photos taken before by the user, or pictures from other sources stored on the user's mobile phone; this is not limited in this embodiment of this application. In other words, the user may open the camera application and, without taking a picture, directly long-press and drag the local album icon, to share with the receiving device the first picture in the local album whose shooting date is closest to the current date; this is not limited in this embodiment of this application.
To sum up, according to the method for sharing photos provided in the embodiments of this application, in the process of taking photos or running the camera application, the user can start the device identification function and the positioning function of the electronic device through a preset operation. Based on these functions, other electronic devices included in the preview screen of the camera are identified, and the user can select one or more photos to be shared through a quick operation and directly drag them to the area where another electronic device is located, so as to quickly share the photos with the surrounding electronic devices. In addition, for various scenarios, such as an electronic device being blocked in the preview screen, the embodiments of this application provide a user-friendly interactive interface, which is convenient for the user to share one or more photos through quick operations, simplifies the operation process of sharing photos, shortens the time of sharing photos, and improves user experience.
The foregoing describes the method for sharing photos from the user interaction level with reference to fig. 14 to 18. The following describes the method for sharing photos provided in the embodiments of this application from the software implementation level with reference to fig. 19. It should be understood that the method may be implemented in an electronic device (e.g., a mobile phone, a tablet, or a computer) having a touch screen, a camera assembly, and the like, with the structures shown in fig. 2 and 3.
Fig. 19 is a schematic flowchart of an example of a method for sharing photos according to an embodiment of the present application, taking a mobile phone as an example, as shown in fig. 19, the method may include the following steps:
1901, a camera application is started.
Specifically, the mobile phone starts a camera application and displays a shooting preview interface. Illustratively, the implementation process of this step 1901 may be as shown in (a) of fig. 14, or as shown in (a) of fig. 17.
1902, the user clicks a shooting shutter key to take a picture.
It should be understood that step 1902 is optional. Specifically, the method for sharing photos can be applied to a scenario in which the user takes a photo; the photo to be shared may be the latest photo taken after the user clicks the shooting shutter key, a photo taken before by the user, or a photo from another source stored on the user's mobile phone. This is not limited in this embodiment of this application.
For example, when the photo to be shared is a photo currently taken by the user, the process of the following steps 1903 and 1904 may be executed continuously as shown in (b) of fig. 14.
1903, a long press operation of the local album icon by the user is detected.
1904, when the long-press operation on the local album icon is detected, display of the thumbnail photo icon is triggered, the icon enters a draggable mode, and the device identification function is started at the same time.
Optionally, in addition to the long-press operation on the local album icon, other preset operations may trigger the photo sharing process provided in the embodiment of the present application, or trigger the mobile phone to recognize the electronic devices in the preview screen. For example, the preset operation is not limited to long-pressing the local album icon; it may also be double-clicking the local album icon or drawing a fixed pattern on the shooting preview interface, which is not limited in the embodiment of the present application.
In one possible implementation manner, when the mobile phone does not detect a long-press operation of the local album icon by the user, it may not recognize the objects in the preview screen. Illustratively, after the mobile phone detects that the user long-presses the local album icon, it triggers recognition of the objects in the preview screen and marks the names "P40" and "MateBook" of the identified electronic devices, as shown in (d) of fig. 14. This method prevents the mobile phone from constantly being in a state of recognizing objects in the preview screen, thereby reducing the power consumption of the mobile phone.
In another possible implementation manner, the mobile phone may keep the device identification function enabled, that is, the mobile phone continuously recognizes the objects in the preview screen, and after detecting the long-press operation of the user on the local album icon, it marks the names "P40" and "MateBook" of the identified electronic devices and displays the interface shown in (d) of fig. 14. With this method, the mobile phone can determine the objects in the preview screen in advance, so that when the user starts the photo sharing function by long-pressing the local album icon, the names of the recognized electronic devices are displayed in the interface quickly, which increases the speed of recognizing the objects in the preview screen.
For the scenario of steps 1901 to 1904, the following steps 1909 to 1911 may be executed.
1909, other electronic devices included in the preview screen are identified, and the identified electronic devices are marked.
It should be understood that the mobile phone can communicate with other nearby electronic devices, for example through Bluetooth, a WiFi module, or NFC, so that the mobile phone can sense nearby electronic devices. Alternatively, the mobile phone may determine that another electronic device exists nearby by means of a wireless positioning technology such as UWB, recognize the type of the electronic device, and display it in the shooting preview interface. The embodiment of the present application does not limit the communication and positioning manners between the mobile phone and other nearby electronic devices.
It should also be understood that the mobile phone may display the icon marking another electronic device in the area where that device is located in the preview screen, or in a blank area of the preview screen, so as not to obscure other objects in the preview screen. For the specific display manner, refer to the foregoing description; details are not repeated here.
1910, it is detected that the user drags the icon of the thumbnail photo to another electronic device recognized in the preview screen.
1911, the photos are shared with the electronic device to which the user dragged the thumbnail icon.
Optionally, the user may drag the thumbnail icon to the position of an icon marking another electronic device and then release it; the icon may be presented in a different color, or with dynamic effects such as size change, jumping, or blinking, to remind the user that the currently taken photo will be shared with the electronic device identified in the preview screen.
For example, as shown in (e) of fig. 14, when the user drags the thumbnail icon 30 to the position of the icon of P40, the color of the "P40" icon changes; when the user then releases the thumbnail icon 30, the currently taken photo is shared with P40.
Or, for example, as shown in (f) of fig. 17, the user drags the thumbnail icon 30 to the position of the MateBook icon and then releases it; the MateBook icon may be presented in a different color, or with dynamic effects such as size change, jumping, or blinking, to remind the user that the currently taken photo will be shared with the MateBook identified in the preview screen.
In another possible scenario, the user may desire to share multiple photos, or to share photos other than the one currently taken. For such a scenario, steps 1905 to 1911 may be performed.
1905, a user's sliding operation of the local album icon is detected.
1906, a photo list of the local photo album is displayed, and the plurality of photos to be shared selected by the user from the photo list are detected.
For example, when the photos to be shared are pictures from other sources saved on the user's mobile phone, the user may open the camera application and, without taking a photo, directly long-press and drag the local album icon, then find and select the photos to be shared in the photo list, as shown in (d) and (e) of fig. 17. The embodiment of the present application does not limit this.
Illustratively, as shown in (c) of fig. 17, after the user's sliding operation on the local album icon is detected, the photo list is displayed, and the first photo in the list may be selected by default. If the user does not desire to share photo 1, the user may click the selection box in the lower right corner of photo 1 to deselect it. Similarly, if the user desires to share photos 1, 2, and 3 at the same time, the user can click the selection box in the lower right corner of each photo to select multiple photos to be shared; details are not repeated here.
1907, detecting a long press operation of the user on a plurality of photos to be shared.
1908, when the long press operation of the user on the plurality of photos to be shared is detected, the icon of the thumbnail photos is displayed in a draggable mode, and the device identification function is started at the same time.
Optionally, after selecting the photos 1, 2, and 3 to be shared, the user may long-press any area of the selected photos with a finger to drag the three photos together.
1909, other electronic devices included in the preview screen are identified, and the identified electronic devices are marked.
1910, the icon of the thumbnail photos is dragged to another electronic device recognized in the preview screen.
1911, the photos are shared with the electronic device to which the user dragged the thumbnail icon.
It should be noted that the above implementation process may be combined with the specific descriptions of fig. 14 to fig. 18; for the operation processes and possible implementation manners that are the same, refer to the corresponding descriptions above, and details are not repeated here.
By the above method, the user can start the device identification function and the positioning function of the mobile phone through a preset operation while taking photos, and the mobile phone identifies other electronic devices in the camera preview screen by combining these functions. The user can then select multiple photos to be shared and directly drag them to the area where another electronic device is located, so as to quickly share the photos with nearby electronic devices. This process simplifies the operation flow of sharing photos, shortens the sharing time, and improves user experience.
The above describes the display interfaces and the implementation of the method of the present application. The following describes in detail, taking UWB wireless positioning technology as an example, how the electronic device 100 measures the distance and angle of other electronic devices.
As shown in fig. 20, taking the electronic device 100 and the electronic device 201 as an example, the electronic device 100 initiates a UWB measurement request and determines the distance to the electronic device 201 according to the measurement response of the electronic device 201. Specifically, the procedure includes, but is not limited to, steps S101 to S105:
s101, the electronic device 100 broadcasts a UWB measurement request, and the electronic device 201 receives the UWB measurement request.
In some embodiments, the electronic device 100 initiates a UWB measurement request and employs ranging algorithm 3 to determine the distance to the electronic device 201.
Step S101 may specifically include: the electronic device 100 broadcasts the first measurement request at time T11 and records the sending time of the first measurement request as T11, where the first measurement request carries identity information (e.g., the ID, MAC address, etc. of the electronic device) of the electronic device 100. The electronic device 201 receives the first measurement request sent by the electronic device 100 at time T12 and records the reception time of the first measurement request as T12.
S102, the electronic device 201 sends a first measurement response to the electronic device 100.
The electronic device 201 sends the first measurement response to the electronic device 100 at time T13, where the first measurement response carries T12, T13, the identity information of the electronic device 100, and the identity information of the electronic device 201. The electronic device 100 receives the first measurement response sent by the electronic device 201 at time T14 and records the reception time of the first measurement response as T14.
S103, the electronic device 100 determines the orientation parameters of the electronic device 201 according to the measurement response sent by the electronic device 201.
Specifically, the orientation parameters of the electronic device 201 may include one or more of the physical distance between the electronic device 201 and the electronic device 100, the signal AOA of the electronic device 201, and the RSSI of the signal transmitted by the electronic device 201. The three orientation parameters are described in detail below:
First, the physical distance between the electronic device 201 and the electronic device 100. The time difference between the sending time T11 of the first measurement request and the receiving time T14 of the first measurement response is Tround1, and the time difference between the receiving time T12 of the first measurement request and the sending time T13 of the first measurement response is Trelay1, so the one-way flight time T can be expressed as: T = (Tround1 - Trelay1)/2.
The electronic device 100 determines the one-way flight time of the signal according to the above formula, and then determines the physical distance between the electronic device 201 and the electronic device 100 as D = C × T, the product of the one-way flight time T and the electromagnetic wave propagation speed C.
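To make this computation concrete, the following Java sketch implements the single-sided two-way ranging formula above, assuming the four timestamps are captured in nanoseconds; the class and method names are illustrative and are not part of the patent.

    final class UwbRanging {
        // Electromagnetic wave propagation speed C, expressed in meters per nanosecond.
        static final double SPEED_OF_LIGHT_M_PER_NS = 0.299792458;

        // One-way flight time T = (Tround1 - Trelay1) / 2, in nanoseconds.
        static double timeOfFlightNs(long t11, long t12, long t13, long t14) {
            long tround1 = t14 - t11; // request sent to response received, at device 100
            long trelay1 = t13 - t12; // request received to response sent, at device 201
            return (tround1 - trelay1) / 2.0;
        }

        // Physical distance D = C * T, in meters.
        static double distanceMeters(long t11, long t12, long t13, long t14) {
            return SPEED_OF_LIGHT_M_PER_NS * timeOfFlightNs(t11, t12, t13, t14);
        }
    }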
Second, the signal AOA of the electronic device 201. The electronic device 100 may determine the orientation of the electronic device 201 relative to the electronic device 100 by calculating the reception direction of the signal from the phase difference with which the first measurement response arrives at UWB antennas located at different positions.
Illustratively, as shown in fig. 21, the electronic device 100 receives the wireless signal transmitted by the electronic device 201, and the signal AOA at the electronic device 100 (i.e., the incident angle θ of the wireless signal relative to the line connecting receiving antenna 1 and receiving antenna 2) can be determined based on the phase difference Δφ of the signal at receiving antenna 1 and receiving antenna 2 of the electronic device 100. The phase difference can be expressed as:
Δφ = (2πd/λ)·cos θ + φ(θ)
where λ is the wavelength, d is the spacing between receiving antenna 1 and receiving antenna 2, and φ(θ) is the antenna hardware phase difference. The incident angle θ, i.e., the signal AOA of the electronic device 201, can be determined from the above equation. For example, if the incident angle θ is 60 degrees, the electronic device 201 is in the 30-degree clockwise direction of the electronic device 100.
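As an illustration of this relationship, the Java sketch below solves the above equation for the incident angle; the antenna spacing and the hardware phase offset are treated as known calibration inputs, and the reconstructed formula is an assumption based on the definitions above.

    // Solves deltaPhi = (2*pi*d/lambda)*cos(theta) + phi(theta) for theta,
    // treating the hardware phase difference as a calibration constant.
    static double angleOfArrivalRad(double deltaPhiRad, double phiHwRad,
                                    double antennaSpacingM, double wavelengthM) {
        double cosTheta = wavelengthM * (deltaPhiRad - phiHwRad)
                / (2.0 * Math.PI * antennaSpacingM);
        cosTheta = Math.max(-1.0, Math.min(1.0, cosTheta)); // clamp numerical noise
        return Math.acos(cosTheta); // incident angle relative to the antenna baseline
    }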
Third, the RSSI of the signal transmitted by the electronic device 201. In some embodiments, the electronic device 100 determines the RSSI of the signal sent by the electronic device 201 according to the average RSSI of the first measurement request and the first measurement response.
In the present application, whether there is an obstruction between the electronic device 100 and the electronic device 201 can be determined according to the RSSI of the signal sent by the electronic device 201.
It can be understood that signal attenuation is greater under non-line-of-sight (NLOS) propagation conditions, where there is occlusion, and smaller under line-of-sight (LOS) propagation conditions, where there is none. Under the same propagation conditions, the farther the distance, the greater the signal attenuation. In the embodiment of the present application, whether there is an obstruction between the electronic device 100 and the electronic device 201 can be determined according to the RSSI of the first measurement request and the first measurement response together with the physical distance between the two devices.
In some embodiments, a preset RSSI for the signal transmitted by the electronic device 201, as received by the electronic device 100, may be determined according to the distance between the electronic device 100 and the electronic device 201. When the received RSSI of the signal sent by the electronic device 201 is smaller than the preset RSSI, it is determined that there is an obstruction between the electronic device 100 and the electronic device 201; otherwise, there is no obstruction.
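A minimal sketch of such a check is shown below, assuming the preset RSSI is derived from a log-distance path-loss model; the reference RSSI at 1 m, the path-loss exponent, and the NLOS margin are assumed calibration values rather than values from the patent.

    // Returns true when the measured RSSI falls far enough below the preset RSSI
    // expected at the measured distance to indicate an obstruction (NLOS).
    static boolean isOccluded(double measuredRssiDbm, double distanceM) {
        final double RSSI_AT_1M_DBM = -45.0;   // assumed LOS calibration value
        final double PATH_LOSS_EXPONENT = 2.0; // roughly 2 for line-of-sight
        final double NLOS_MARGIN_DB = 10.0;    // extra attenuation treated as occlusion
        double presetRssiDbm = RSSI_AT_1M_DBM
                - 10.0 * PATH_LOSS_EXPONENT * Math.log10(Math.max(distanceM, 0.1));
        return measuredRssiDbm < presetRssiDbm - NLOS_MARGIN_DB;
    }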
In some embodiments, the orientation parameters of the electronic device 201 may include the physical distance between the electronic device 201 and the electronic device 100, the signal AOA, and a first identifier. The first identifier of the electronic device 201 is used to indicate whether there is an obstruction between the electronic device 100 and the electronic device 201. For example, a first identifier equal to 1 indicates occlusion, and a first identifier equal to 0 indicates no occlusion.
S104, the electronic device 100 sends a connection request to the electronic device 201, and the electronic device 201 receives the connection request sent by the electronic device 100.
S105, the electronic device 201 sends first capability information and corresponding connection parameters to the electronic device 100, wherein the first capability information is used for representing a communication mode which can be supported by the electronic device 201.
In some embodiments, when the first capability information characterizes the WiFi communication mode, the corresponding connection parameters may include a device ID, a pairing key, and the like. The electronic device 100 may establish a WiFi connection with the electronic device 201 based on the above connection parameters using the connection procedure of the IEEE 802.11 standard.
In some embodiments, when the first capability information characterizes the Bluetooth communication mode, the corresponding connection parameters may include a key, an encryption method, a service set identifier (SSID), and other parameters. The electronic device 100 may establish a Bluetooth connection with the electronic device 201 based on the above connection parameters using the connection procedure of the IEEE 802.15.1 standard.
In some embodiments, when the first capability information characterizes both the WiFi communication mode and the Bluetooth communication mode, the electronic device 100 may preferentially establish a WiFi connection with the electronic device 201 based on the connection parameters, using the connection procedure of the IEEE 802.11 standard.
In some embodiments, the first measurement request may also carry second capability information, which characterizes all communication modes that the electronic device 100 can support, such as Bluetooth and WiFi. The first measurement response may also carry the first capability information and the corresponding connection parameters, where the second capability information includes the first capability information, and the first capability information is determined by the electronic device 201 according to the second capability information. After step S103, the electronic device 100 may then establish a connection with the electronic device 201 directly according to the first capability information and the corresponding connection parameters in the first measurement response, without sending a separate connection request.
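The transport preference described above can be captured by a small selection routine; the following Java sketch is illustrative only, and the Transport type is an assumption rather than a real Android API.

    import java.util.Set;

    final class TransportSelector {
        enum Transport { WIFI, BLUETOOTH }

        // WiFi is preferred whenever it appears in the peer's first capability information.
        static Transport choose(Set<Transport> firstCapabilityInfo) {
            if (firstCapabilityInfo.contains(Transport.WIFI)) {
                return Transport.WIFI;
            }
            if (firstCapabilityInfo.contains(Transport.BLUETOOTH)) {
                return Transport.BLUETOOTH;
            }
            throw new IllegalArgumentException("no supported transport advertised");
        }
    }

For example, choose(EnumSet.of(Transport.WIFI, Transport.BLUETOOTH)) returns WIFI, matching the preference rule above.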
In some embodiments, the electronic device 100 may also initiate the measurement request multiple times and obtain an average one-way flight time and an average AOA from the sending and receiving times of the multiple measurement requests and measurement responses, thereby reducing the distance and angle measurement errors.
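For example, reusing the UwbRanging sketch above, such averaging could look as follows; each row of the input is assumed to hold the four timestamps {T11, T12, T13, T14} of one exchange.

    // Averages the one-way flight time over several exchanges before converting to distance.
    static double averagedDistanceMeters(long[][] exchanges) {
        double sumTofNs = 0.0;
        for (long[] e : exchanges) {
            sumTofNs += UwbRanging.timeOfFlightNs(e[0], e[1], e[2], e[3]);
        }
        return UwbRanging.SPEED_OF_LIGHT_M_PER_NS * (sumTofNs / exchanges.length);
    }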
In the present application, positioning is not limited to the above-mentioned UWB positioning method; other methods may also be used to acquire the position information of the electronic device 201 relative to the electronic device 100. For example, the electronic device 100 broadcasts a UWB measurement request that includes its transmission time. After receiving the measurement request, the electronic device 201 determines a time difference based on the transmission time and the time at which it received the request, thereby calculating the distance between the electronic device 201 and the electronic device 100 (the distance is equal to the time difference multiplied by the propagation speed of the electromagnetic wave). The electronic device 201 also calculates the angle of arrival of the measurement request, from which the azimuth angle of the electronic device 201 relative to the electronic device 100 can be determined. The electronic device 201 then sends a probe response to the electronic device 100, where the probe response includes the identity of the electronic device 201 and the first position information. The electronic device 100 receives the probe response and obtains the orientation parameters that determine the orientation of the electronic device 201 relative to the electronic device 100.
In the present application, the measurement request (first measurement request) may also be referred to as a probe request, and the measurement response (first measurement response) may also be referred to as a probe response.
In the present application, positioning is not limited to UWB; positioning may also be performed by Bluetooth, WiFi, or GPS.
The application provides a device identification method, which is applied to a first electronic device with a camera, and as shown in fig. 22, the method includes:
s201, the first electronic device receives a first operation.
The first operation may be any one or more of the user operations in fig. 5A to 5D, and may also be any one or more of the user operations in fig. 7A to 7C. For details, reference may be made to the embodiments shown in fig. 5A to 5D or fig. 7A to 7C, which are not described herein again.
S202, in response to the first operation, the first electronic device displays a first interface, where the first interface includes a preview picture acquired by the camera, and the preview picture includes the second electronic device.
The first interface may be the previously described viewing interface 530. The second electronic device may be, for example, the electronic device 201 corresponding to the device image 532 in fig. 5G.
S203, the first electronic device acquires first position information of the second electronic device relative to the first electronic device.
S204, the first electronic device determines the display position of a first label in the preview picture based on the first position information and the display area of the second electronic device in the preview picture, and displays the first label at that display position, where the first label is used to identify the second electronic device.
The second electronic device may be, for example, the electronic device 201 corresponding to the device image 532 in fig. 5G, and the first label may be, for example, the device icon 5321.
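One way to place such a label is to project the measured azimuth onto the horizontal axis of the preview, assuming a simple pinhole-camera model; the Java sketch below is illustrative, and in practice the result would be fused with the display area of the detected device image, as step S204 describes.

    // Maps an azimuth (degrees, 0 = optical axis, negative = left of it) to a pixel column.
    static float labelCenterX(double azimuthDeg, double horizontalFovDeg, int previewWidthPx) {
        double half = horizontalFovDeg / 2.0;
        double clamped = Math.max(-half, Math.min(half, azimuthDeg));
        return (float) ((clamped + half) / horizontalFovDeg * previewWidthPx);
    }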
S205, the first electronic device receives a second operation on the first label. The second operation may be the user operation on the device icon 5321 described previously with reference to fig. 5G.
S206, in response to the second operation, the first electronic device displays a second interface, where the second interface includes one or more controls for controlling the second electronic device. The second interface may be the display interface in fig. 5H; it may be superimposed on the first interface, or the electronic device may jump from the first interface to the second interface. By displaying the correspondence between the first label and the second electronic device in real time in an augmented-reality manner, the present application enables interaction between the first electronic device and the second electronic device through the first label, realizes coordinated control among multiple devices, and improves user experience.
In some possible embodiments, the acquiring, by the first electronic device, of the first position information of the second electronic device relative to the first electronic device specifically includes: the first electronic device broadcasts a probe request, where the probe request includes the identity of the first electronic device; when the first electronic device receives a probe response sent by the second electronic device based on the probe request, the first electronic device determines the first position information of the second electronic device relative to the first electronic device based on the probe response, where the probe response includes the identity of the second electronic device. In this manner, the first position information includes the relative position, such as the distance, direction, and angle, of the second electronic device with respect to the first electronic device. The first electronic device can calculate the distance between the second electronic device and the first electronic device from the time difference between sending the probe request and receiving the probe response (the distance is equal to the time difference multiplied by the propagation speed of the electromagnetic wave), and can determine the azimuth angle of the second electronic device relative to the first electronic device by calculating the angle of arrival of the probe response.
Optionally, the probe response includes the identity of the second electronic device and the first position information, and the first electronic device determines the first position information based on the probe response. Specifically, the second electronic device calculates the relative position between itself and the first electronic device according to the received probe request: the probe request includes its sending time, and the second electronic device determines a time difference based on the sending time and the time at which it received the probe request, thereby calculating the distance between the second electronic device and the first electronic device; the second electronic device also calculates the angle of arrival of the probe request, from which the azimuth angle of the second electronic device relative to the first electronic device can be determined. The second electronic device then sends the probe response to the first electronic device, where the probe response includes the identity of the second electronic device and the first position information.
In some possible embodiments, the display position of the first label in the preview screen and the display area of the second electronic device in the preview screen partially overlap or completely overlap. The first label may be displayed in the display area of the second electronic device, at the edge of that display area, or at a position close to it.
In some possible embodiments, the method further includes: the first electronic device acquires second position information of a third electronic device relative to the first electronic device; when the first electronic device detects that the preview picture does not include the third electronic device, it determines, based on the second position information, that the third electronic device is within the viewing range of the camera; the first electronic device then determines the display position of a second label in the preview picture based on the second position information, where the second label is used to indicate one or more of the following information: the identification information of the third electronic device, the obstruction of the third electronic device, and the second position information. In this manner, when the first electronic device detects that the relative position of the third electronic device is within the viewing range of the camera but the preview screen does not include an image of the third electronic device, the first electronic device determines that the third electronic device is occluded, and outputs the second label of the third electronic device, which indicates one or more of the identification information of the third electronic device, the obstruction, and the occluded position in the preview interface.
The second label may be, for example, the icon 803 in fig. 8C: the image of the third electronic device is not in the first interface because the third electronic device is obscured by the device corresponding to the device image 533.
In some possible embodiments, the method further includes: when the first electronic device detects that the third electronic device is not included in the preview picture, it determines, based on the second position information, that the third electronic device is not within the viewing range of the camera; the first electronic device then determines the display position of a third label in the preview picture based on the second position information, where the third label is used to indicate one or more of the following information: the identification information of the third electronic device and the second position information. In this manner, when the first electronic device detects that the relative position of the third electronic device is outside the viewing range of the camera and the preview screen does not include an image of the third electronic device, the first electronic device determines that the third electronic device is not in the viewfinder and outputs the third label of the third electronic device, which indicates one or more of the identification information of the third electronic device and its relative position (direction, angle, distance, etc.) with respect to the first electronic device.
The third label may be, for example, the icon 802 in fig. 8B: the image of the third electronic device is not in the first interface, and the third electronic device is outside the viewing range of the camera.
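The choice among the first, second, and third labels can be summarized by a small classification routine; the Java sketch below assumes the azimuth is measured relative to the camera's optical axis and that image recognition reports whether the device was found in the preview, neither of which is spelled out in the patent.

    enum TagKind { FIRST_LABEL, SECOND_LABEL, THIRD_LABEL }

    // detectedInPreview: whether image recognition found the device in the preview picture.
    static TagKind classify(double azimuthDeg, double horizontalFovDeg,
                            boolean detectedInPreview) {
        if (detectedInPreview) {
            return TagKind.FIRST_LABEL; // identifies the device, as in S204
        }
        boolean inViewRange = Math.abs(azimuthDeg) <= horizontalFovDeg / 2.0;
        return inViewRange ? TagKind.SECOND_LABEL  // in view range but not visible: occluded
                           : TagKind.THIRD_LABEL;  // outside the camera's viewing range
    }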
In some possible embodiments, the preview screen includes an image of a fourth electronic device, and after the first electronic device displays the first interface, the method further includes: the first electronic device determines, based on the preview picture, that the device type of the fourth electronic device is a first type; the first electronic device determines, among the electronic devices associated or bound with the account of the first electronic device, a first target device whose device type is the first type; and the first electronic device displays a fourth label indicating that the image of the fourth electronic device is associated with the first target device. In this manner, when the first electronic device cannot detect the position information of the fourth electronic device but the image of the fourth electronic device is in the preview screen, the first electronic device identifies the device type of the fourth electronic device by image recognition and detects whether a target device of that device type exists among the devices logged in to the same account (for example, a Huawei account) as the first electronic device. If so, the first electronic device regards that target device as the fourth electronic device and outputs the fourth label identifying the target device.
The fourth label may be, for example, the icon 805 in fig. 8D: the image of the fourth electronic device is in the first interface, but the first electronic device cannot locate the position of the fourth electronic device.
In some possible embodiments, the preview screen includes an image of a fifth electronic device, and after the first electronic device displays the first interface, the method further includes: the first electronic device determines, based on the preview picture, that the device type of the fifth electronic device is a second type; the first electronic device acquires third position information of the first electronic device, and the first electronic device stores the correspondence between electronic devices and position information; based on the correspondence, the first electronic device determines, according to the third position information, a second target device whose device type is the second type, where the position information of the second target device is the same as the third position information; and the first electronic device displays a fifth label indicating that the image of the fifth electronic device is associated with the second target device. In this manner, when the first electronic device cannot detect the position information of the fifth electronic device but the image of the fifth electronic device is in the preview screen, the first electronic device, which stores the correspondence between electronic devices and position information (for example, smart speaker: living room; smart desk lamp: bedroom; computer: office), detects whether a target device of the recognized device type exists among the devices in the same geographical location as the first electronic device, according to the current geographical location of the first electronic device and the device type of the fifth electronic device identified by image recognition. If so, the first electronic device regards that target device as the fifth electronic device and outputs the fifth label identifying the target device.
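A minimal sketch of this lookup is shown below, assuming the stored correspondences are held in simple maps keyed by a device identifier; the helper and its data layout are hypothetical.

    import java.util.Map;

    final class SameLocationMatcher {
        // Returns the identifier of a device whose saved location equals the phone's
        // current location and whose saved type matches the recognized type, or null.
        static String findSecondTargetDevice(Map<String, String> deviceToLocation,
                                             Map<String, String> deviceToType,
                                             String currentLocation,
                                             String recognizedType) {
            for (Map.Entry<String, String> e : deviceToLocation.entrySet()) {
                if (e.getValue().equals(currentLocation)
                        && recognizedType.equals(deviceToType.get(e.getKey()))) {
                    return e.getKey();
                }
            }
            return null;
        }
    }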
In some possible embodiments, the first interface further includes a first icon, and the first icon is associated with data to be shared. The method further includes: the first electronic device receives a third operation, where the third operation is an operation on the first label and/or the first icon; and in response to the third operation, the first electronic device sends the data to be shared to the second electronic device. The third operation includes, but is not limited to, a drag operation, a click operation, and the like. In this data sharing method, the second electronic device to be shared with is selected on the first interface, and the data to be shared is sent to it; this simplifies the user operations of data sharing, visually displays the device information, and improves user experience.
The first icon may be, for example, icon 902 or icon 903 in fig. 9B, and the first icon may also be thumbnail 1111 in fig. 11D; the first icon may also be the picture 1205 in fig. 12B.
In some possible embodiments, before the first electronic device receives the third operation, the method further includes: the first electronic device displays the first label in a first display form on the first interface according to the data type of the data to be shared, where the first label in the first display form is used to prompt the user that the second electronic device supports outputting the data to be shared. The first display form may be highlighting the display area of the first label (for example, changing its brightness or color). The first display form may be, for example, the display form of the device icon 5311, the device icon 5331, and the device icon 5341 in fig. 10C.
In some possible embodiments, the preview screen includes an image of a third electronic device and a third label, where the third label is associated with the third electronic device. The method further includes: the first electronic device receives a fourth operation, where the fourth operation is an operation on the first icon and/or the third label; in response to the fourth operation, the first electronic device outputs a prompt message, where the prompt message is used to prompt the user that the third electronic device does not support outputting the data to be shared. The prompt message may be, for example, the information displayed in the prompt box 1100 in fig. 10B.
According to the embodiment of the present application, the first electronic device receives a first operation, displays a first interface, starts the camera, and displays the images acquired by the camera in the first interface in real time. The first electronic device identifies the electronic devices in the image and their device types (such as a sound box, a computer, or a tablet computer), for example the second electronic device, by image recognition; and the first electronic device obtains the position information of the second electronic device relative to the first electronic device by a wireless positioning technology (e.g., UWB positioning, Bluetooth positioning, or WiFi positioning). The position information includes one or more of a distance, a direction, and an angle. Based on the position information, the first electronic device determines the display position of the first label of the second electronic device in the preview screen, where the first label is used to identify the second electronic device, for example by its device name or device type. The display position of the first label is related to the display position of the second electronic device. When the first electronic device detects a user operation directed at the first label, it outputs a second interface that includes one or more controls for controlling the second electronic device. The second interface may be superimposed on the first interface, or the electronic device may jump from the first interface to the second interface. By displaying the correspondence between the first label and the second electronic device in real time in an augmented-reality manner, the present application enables interaction between the first electronic device and the second electronic device through the first label, realizes coordinated control among multiple devices, and improves user experience.
In an embodiment of the present application, the software system of an electronic device (e.g., the electronic device 100) may employ a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes the Android system with a layered architecture as an example to exemplarily illustrate the software structure of the electronic device 100. The Android system is only one example of the system of the electronic device 100 in the embodiment of the present application; the present application may also be applied to other types of operating systems, such as iOS, Windows, and HarmonyOS, which is not limited in this application. The following description uses the Android system only as an example of the operating system of the electronic device 100.
Referring to fig. 23, fig. 23 is a block diagram illustrating a software structure of an electronic device exemplarily provided in an embodiment of the present application. The electronic device can determine the orientation parameters (such as distance, signal AOA, and RSSI) of nearby devices through UWB positioning technology, determine the display position of the image of a nearby device in the viewing interface according to those orientation parameters, and display the device icon of the nearby device; triggering the device icon enables interaction between the electronic device and the nearby device. The electronic device can establish a wireless communication connection with the target device through one or more wireless communication protocols among UWB, Bluetooth, WLAN, and infrared, and perform data transmission.
As shown in fig. 23, the layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system may be divided from top to bottom into an application layer, an application framework layer, a protocol stack, and a kernel layer (kernel). Wherein:
the application layer includes a series of application packages such as smart life, bluetooth, WLAN, etc. Applications such as cameras, galleries, telephony, music, video, etc. may also be included.
The intelligent life APP is a software program capable of selecting and controlling various intelligent household devices in a house and is installed on electronic equipment used by a user. The smart life APP may be an application installed when the electronic device leaves a factory, or an application downloaded from a network or acquired from another device during use of the electronic device by a user.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 23, the application framework layer may mainly include APIs and system services (System Server). The APIs are used to implement communication between the application layer and the protocol stack, the HAL layer, and the kernel layer, for example communication between Smart Life and the kernel layer. The APIs may include one or more of a UWB API, a Bluetooth API, a WLAN API, and an infrared API; accordingly, the system services may include one or more of a UWB service, a Bluetooth service, a WLAN service, and an infrared service. The electronic device 100 may detect the orientation parameters of nearby devices by invoking one or more of the UWB API, Bluetooth API, WLAN API, and infrared API and the corresponding system services, and may invoke one or more of these APIs to call the corresponding system services to establish wireless communication connections with nearby devices and perform data transmission.
The UWB service may specifically include one or more services, such as a UWB positioning service. The UWB positioning service may include position parameter measurement, where the position parameter measurement includes one or more of range measurement, AOA measurement, and RSSI measurement. For example, the electronic device 100 invokes the UWB positioning service through the UWB API to detect the position parameters of devices in the vicinity of the electronic device 100.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is the functions that the java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files, and is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example: a surface manager, media libraries, a three-dimensional graphics processing library (e.g., OpenGL for Embedded Systems (OpenGL ES)), and a 2D graphics engine (e.g., Skia Graphics Library (SGL)).
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files. The media library may support a variety of audio and video encoding formats, such as MPEG-4 (Moving Picture Experts Group 4), MPEG-4 AVC/H.264 (MPEG-4 Part 10 Advanced Video Coding), MP3 (MPEG Audio Layer 3), AAC (Advanced Audio Coding), AMR (Adaptive Multi-Rate), JPEG/JPG (Joint Photographic Experts Group), and PNG (Portable Network Graphics).
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is the layer between hardware and software. The kernel layer may include one or more of a UWB chip driver, a Bluetooth chip driver, and a WLAN driver, and may further include a display driver, a camera driver, an audio driver, a sensor driver, and the like. The kernel layer responds to the functions called by the system services in the application framework layer and executes the corresponding operations. For example, in response to the UWB measurement instruction sent by the UWB positioning service through the UWB protocol stack, the UWB chip driver sends a UWB measurement request through the hardware device (e.g., the UWB chip).
In the present example, the software framework may be on the electronic device 100, or may be on the electronic devices 201, 202, 203, 204.
The following exemplifies the device identification scenario in the above embodiment, and exemplifies the workflow of the software and hardware of the electronic device 100.
The acceleration sensor and/or the gyroscope sensor detects the lift-up operation (e.g., fig. 7C), and a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the operation into a raw input event, which is stored at the kernel layer. The application framework layer obtains the raw input event from the kernel layer and identifies it as a request to pair and connect with an electronic device (e.g., the electronic device 201). The Smart Life application calls the UWB API of the application framework layer to start the UWB positioning service. The UWB positioning service sends a UWB measurement instruction to the UWB HAL interface in the HAL layer by invoking the UWB protocol stack. The UWB HAL interface sends a UWB measurement request to the kernel layer; according to the UWB measurement request, the kernel layer calls the UWB chip driver to make the UWB chip broadcast the measurement request (e.g., the first measurement request), while the UWB time management module records the transmission timestamp of the UWB measurement request.
In some embodiments, after the target device is determined, the UWB service of the application framework layer calls the UWB protocol stack to send a connection request to the kernel layer, and the UWB chip driver of the kernel layer makes the UWB chip send the connection request to the electronic device 201 to request establishment of a UWB communication connection and perform data transmission. Optionally, the UWB service of the application framework layer may also invoke a Bluetooth service, a WLAN service, or an infrared service to send the connection request to the electronic device 201. For example, the UWB service starts the Bluetooth service, the Bluetooth service calls the Bluetooth protocol stack to send a first connection request to the kernel layer, and the Bluetooth chip driver of the kernel layer makes the Bluetooth chip send the connection request to the electronic device 201, so as to request establishment of a Bluetooth communication connection and perform data transmission.
In the case where an integrated unit is employed, the electronic device 100 may include a processing module, a storage module, and a communication module. The processing module may be configured to control and manage the actions of the electronic device, for example, to support the electronic device in performing the steps performed by the display unit, the detection unit, and the processing unit. The storage module may be used to support the electronic device in storing program code and data. The communication module may be used to support communication between the electronic device and other devices.
The processing module may be a processor or a controller, which may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with the present disclosure. The processor may also be a combination implementing computing functions, for example a combination of one or more microprocessors, or a combination of a digital signal processor (DSP) and a microprocessor. The storage module may be a memory. The communication module may specifically be a radio frequency circuit, a Bluetooth chip, a Wi-Fi chip, or another device that interacts with other electronic devices.
In an embodiment, when the processing module is a processor and the storage module is a memory, the electronic device according to this embodiment may be a device having the structure shown in fig. 2.
The present embodiment also provides a computer-readable storage medium, where computer instructions are stored in the computer-readable storage medium, and when the computer instructions are executed on an electronic device, the electronic device is caused to execute the above related method steps to implement the method for sharing photos in the above embodiments.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When loaded and executed on a computer, cause the processes or functions described in accordance with the embodiments of the application to occur, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, fiber optic, digital subscriber line) or wirelessly (e.g., infrared, wireless, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device, such as a server, a data center, etc., that incorporates one or more of the available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk), among others.
One of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by hardware related to instructions of a computer program, which may be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. And the aforementioned storage medium includes: various media capable of storing program codes, such as ROM or RAM, magnetic or optical disks, etc.

Claims (21)

1. A device identification method is applied to a first electronic device with a camera, and comprises the following steps:
the first electronic equipment receives a first operation;
responding to the first operation, the first electronic equipment displays a first interface, wherein the first interface comprises a preview picture acquired by the camera, and the preview picture comprises second electronic equipment;
the first electronic equipment acquires first position information of the second electronic equipment relative to the first electronic equipment;
the first electronic device determines a display position of a first label in the preview picture based on the first position information and a display area of the second electronic device in the preview picture, and displays the first label at the display position, wherein the first label is used for identifying the second electronic device;
The first electronic device receiving a second operation for the first tag;
in response to the second operation, the first electronic device displays a second interface that includes one or more controls that control the second electronic device.
2. The method according to claim 1, wherein the acquiring, by the first electronic device, first location information of the second electronic device relative to the first electronic device specifically includes:
the first electronic device broadcasts a probe request, wherein the probe request comprises an identity of the first electronic device;
when the first electronic device receives a probe response sent by the second electronic device based on the probe request, determining first position information of the second electronic device relative to the first electronic device based on the probe response, wherein the probe response comprises an identity of the second electronic device.
3. The method according to claim 1, wherein the display position of the first label in the preview screen and the display area of the second electronic device in the preview screen partially overlap or completely overlap.
4. The method of claim 1, further comprising:
The first electronic equipment acquires second position information of third electronic equipment relative to the first electronic equipment;
when the first electronic device detects that the third electronic device is not included in the preview picture, determining that the third electronic device is within a framing range of the camera based on the second position information;
the first electronic device determines the display position of a second label in the preview picture based on the second position information, wherein the second label is used for indicating one or more of the following information: identification information of the third electronic device, an obstruction of the third electronic device, and the second position information.
5. The method of claim 4, further comprising:
when the first electronic device detects that the third electronic device is not included in the preview picture, determining that the third electronic device is not in the framing range of the camera based on the second position information;
the first electronic device determines the display position of a third label in the preview picture based on the second position information, wherein the third label is used for indicating one or more of the following information: identification information of the third electronic device and the second position information.
6. The method according to claim 1, wherein the preview screen includes an image of a fourth electronic device, and after the first electronic device displays the first interface, the method further comprises:
the first electronic equipment determines that the equipment type of the fourth electronic equipment is a first type based on the preview picture;
determining, by the first electronic device, among the electronic devices associated or bound with the account of the first electronic device, a first target device whose device type is the first type;
the first electronic device displays a fourth label indicating that an image of the fourth electronic device is associated with the first target device.
7. The method according to claim 1, wherein the preview screen includes an image of a fifth electronic device, and after the first electronic device displays the first interface, the method further comprises:
the first electronic equipment determines that the equipment type of the fifth electronic equipment is a second type based on the preview picture;
the first electronic equipment acquires third position information of the first electronic equipment, and the first electronic equipment stores the corresponding relation between the electronic equipment and the position information;
based on the correspondence, the first electronic device determines, according to the third position information, a second target device whose device type is the second type, wherein the position information of the second target device is the same as the third position information;
the first electronic device displays a fifth label indicating that an image of the fifth electronic device is associated with the second target device.
8. The method of claim 1, wherein the first interface further comprises a first icon associated with data to be shared, the method further comprising:
the first electronic equipment receives a third operation, wherein the third operation is an operation aiming at the first label and/or the first icon;
responding to the third operation, and sending the data to be shared to the second electronic equipment by the first electronic equipment.
9. The method of claim 8, wherein before the first electronic device receives the third operation, further comprising:
the first electronic device displays the first label in a first display form on the first interface according to the data type of the data to be shared, and the first label in the first display form is used for prompting a user that the second electronic device supports outputting the data to be shared.
10. The method according to claim 8, wherein the preview screen includes an image of a third electronic device and the third tag, the third tag being associated with the third electronic device; the method further comprises the following steps:
the first electronic device receives a fourth operation, wherein the fourth operation is an operation on the first icon and/or the third label;
responding to the fourth operation, the first electronic device outputs a prompt message, wherein the prompt message is used for prompting a user that the third electronic device does not support outputting the data to be shared.
11. An electronic device, comprising: a camera, one or more processors, and a memory; the memory includes computer instructions that, when invoked by the one or more processors, cause the electronic device to perform:
receiving a first operation;
in response to the first operation, displaying a first interface, wherein the first interface includes a preview picture captured by the camera, and the preview picture includes an image of a first target device;
acquiring first relative position information of the first target device;
determining a display position of a first label in the preview picture based on the first relative position information and the display position of the first target device in the preview picture, wherein the first label is used for indicating identification information of the first target device;
receiving a second operation on the first label;
in response to the second operation, displaying a second interface, wherein the second interface includes one or more controls for controlling the first target device.
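The label-placement step in claim 11 maps a device's position relative to the camera onto preview coordinates. One plausible reading is a pinhole projection of the device's bearing onto the preview's horizontal axis; the field-of-view value and the 1-D simplification below are assumptions:

```kotlin
import kotlin.math.tan

// A minimal sketch of the label-placement step (assumed model): the target's bearing
// relative to the camera axis is projected through a pinhole model onto the preview's
// horizontal axis. The 60-degree field of view is an illustrative assumption.
fun labelX(bearingDeg: Double, previewWidthPx: Int, horizontalFovDeg: Double = 60.0): Int {
    val halfFov = Math.toRadians(horizontalFovDeg / 2)
    val frac = tan(Math.toRadians(bearingDeg)) / tan(halfFov)   // -1..1 inside the view
    return (((frac + 1) / 2) * previewWidthPx).toInt().coerceIn(0, previewWidthPx - 1)
}

fun main() {
    // A device 15 degrees right of the camera axis on a 1080-px-wide preview:
    println(labelX(15.0, 1080)) // 790
}
```

The same projection run in reverse gives the overlap condition of claim 13: the label is anchored at (or near) the x coordinate where the device's image appears.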
12. The electronic device of claim 11, wherein when the one or more processors invoke the computer instructions to cause the electronic device to perform acquiring the first relative position information of the first target device, the electronic device specifically performs:
broadcasting a probe request, wherein the probe request includes an identity of the electronic device;
when a probe response sent by the first target device based on the probe request is received, determining the first relative position information of the first target device based on the probe response, wherein the probe response includes an identity of the first target device.
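Claim 12's exchange is a broadcast probe carrying the sender's identity, answered by a response carrying the target's identity, from which relative position is then derived. The claim leaves the radio and the ranging method open (UWB angle-of-arrival or Bluetooth ranging would be plausible); the sketch below substitutes a plain UDP broadcast so the message flow is runnable, and the port and message framing are invented for illustration only:

```kotlin
import java.net.DatagramPacket
import java.net.DatagramSocket
import java.net.InetAddress

// A runnable stand-in for claim 12's probe exchange. The real transport and the
// position measurement are not fixed by the claim; port and framing are assumptions.
const val PROBE_PORT = 47474

fun broadcastProbe(selfId: String) {
    DatagramSocket().use { socket ->
        socket.broadcast = true
        val payload = "PROBE:$selfId".toByteArray()
        socket.send(DatagramPacket(payload, payload.size,
            InetAddress.getByName("255.255.255.255"), PROBE_PORT))
    }
}

fun awaitProbeResponse(): String {
    DatagramSocket(PROBE_PORT).use { socket ->
        val buf = ByteArray(256)
        val packet = DatagramPacket(buf, buf.size)
        socket.receive(packet) // blocks until a response such as "RESPONSE:tv-01" arrives
        // In the claimed method, relative position would be derived alongside this,
        // e.g. from UWB angle-of-arrival measurements (an assumption, not claim text).
        return String(packet.data, 0, packet.length).removePrefix("RESPONSE:")
    }
}
```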
13. The electronic device according to claim 11, wherein the display position of the first label in the preview picture and the display position of the first target device in the preview picture partially or completely overlap.
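Claim 13's overlap condition is an axis-aligned rectangle intersection between the label's bounds and the device's bounds in the preview. A sketch with Box as a minimal stand-in for the platform rectangle type:

```kotlin
// A minimal sketch of claim 13's condition (assumed types): the label rectangle is
// placed so that it intersects the device's bounding box in the preview picture.
data class Box(val left: Int, val top: Int, val right: Int, val bottom: Int)

fun overlaps(a: Box, b: Box): Boolean =
    a.left < b.right && b.left < a.right && a.top < b.bottom && b.top < a.bottom

fun main() {
    val deviceBounds = Box(100, 100, 300, 260)
    val labelBounds = Box(280, 90, 420, 130)  // a partially overlapping placement
    println(overlaps(deviceBounds, labelBounds)) // true
}
```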
14. The electronic device of claim 11, wherein the computer instructions, when invoked by the one or more processors, cause the electronic device to further perform:
acquiring second relative position information of a second target device;
when the electronic device detects that the second target device is not included in the preview picture, determining, based on the second relative position information, that the second target device is within the viewfinder range of the camera;
determining a display position of a second label in the preview picture based on the second relative position information, wherein the second label is used for indicating one or more of the following: identification information of the second target device, an obstruction of the second target device, and the second relative position information.
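Claim 14 covers a device that answered discovery and lies within the viewfinder range, yet is absent from the preview, so the label reports the obstruction instead of anchoring on an image. A sketch assuming the obstruction is identified by scene recognition (how it is identified is not fixed by the claim):

```kotlin
// A minimal sketch of claim 14 (assumed names): relative position places the device
// inside the viewfinder, but no matching image was detected in the preview, so the
// second label names the obstruction.
data class Detection(val deviceId: String)   // devices recognized in the preview picture

fun occludedLabel(deviceId: String, obstruction: String, detections: List<Detection>): String? =
    if (detections.none { it.deviceId == deviceId })
        "$deviceId is behind the $obstruction"
    else null                                // visible: a normal label is used instead

fun main() {
    println(occludedLabel("speaker-02", "sofa", detections = emptyList()))
    // prints: speaker-02 is behind the sofa
}
```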
15. The electronic device of claim 14, wherein the computer instructions, when invoked by the one or more processors, cause the electronic device to further perform:
when the electronic device detects that the second target device is not included in the preview picture, determining, based on the second relative position information, that the second target device is not within the viewfinder range of the camera;
determining a display position of a third label in the preview picture based on the second relative position information, wherein the third label is used for indicating one or more of the following: identification information of the second target device and the second relative position information.
16. The electronic device of claim 11, wherein the preview picture includes an image of a third target device, and after the one or more processors invoke the computer instructions to cause the electronic device to perform displaying the first interface, the electronic device further performs:
determining, based on the preview picture, that the device type of the third target device is a first type;
determining identification information of a device whose device type is the first type among the electronic devices associated or bound under the account of the electronic device;
displaying a fourth label, wherein the fourth label is used for indicating that the image of the third target device is associated with the identification information.
17. The electronic device of claim 11, wherein the preview picture includes an image of a fourth target device, and after the one or more processors invoke the computer instructions to cause the electronic device to perform displaying the first interface, the electronic device further performs:
determining, based on the preview picture, that the device type of the fourth target device is a second type;
acquiring position information of the electronic device, wherein the electronic device stores a correspondence between electronic devices and position information;
determining, in the correspondence and according to the acquired position information, identification information of a device whose device type is the second type;
displaying a fifth label, wherein the fifth label is used for indicating that the image of the fourth target device is associated with the identification information.
18. The electronic device of claim 11, wherein the first interface further comprises a first icon associated with data to be shared, and the computer instructions, when invoked by the one or more processors, cause the electronic device to further perform:
receiving a third operation, wherein the third operation is an operation on the first label and/or the first icon;
in response to the third operation, sending the data to be shared to the first target device.
19. The electronic device of claim 18, wherein before the one or more processors invoke the computer instructions to cause the electronic device to perform receiving the third operation, the electronic device further performs:
displaying the first label on the first interface in a first display form according to the data type of the data to be shared, wherein the first label in the first display form is used for prompting the user that the first target device supports outputting the data to be shared.
20. The electronic device of claim 18, wherein the preview picture includes an image of a second target device and a third label, the third label being associated with the second target device; the computer instructions, when invoked by the one or more processors, cause the electronic device to further perform:
receiving a fourth operation, wherein the fourth operation is an operation on the third label and/or the first icon;
in response to the fourth operation, outputting a prompt message, wherein the prompt message is used for prompting the user that the second target device does not support outputting the data to be shared.
21. A computer-readable storage medium storing one or more programs, wherein the one or more programs are configured to be executed by one or more processors, and the one or more programs comprise instructions for performing the method of any one of claims 1 to 10.
CN202011183311.8A 2020-08-05 2020-10-29 Equipment identification method and related device Active CN114079691B (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
CN202211294205.6A CN116489268A (en) 2020-08-05 2020-10-29 Equipment identification method and related device
PCT/CN2021/110906 WO2022028537A1 (en) 2020-08-05 2021-08-05 Device recognition method and related apparatus
JP2023507686A JP7537828B2 (en) 2020-08-05 2021-08-05 Device identification method and related apparatus
EP21853577.1A EP4184905A4 (en) 2020-08-05 2021-08-05 Device recognition method and related apparatus
US18/164,170 US20230188832A1 (en) 2020-08-05 2023-02-03 Device Identification Method and Related Apparatus

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN2020107822708 2020-08-05
CN202010779841 2020-08-05
CN202010782270 2020-08-05
CN2020107798412 2020-08-05

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202211294205.6A Division CN116489268A (en) 2020-08-05 2020-10-29 Equipment identification method and related device

Publications (2)

Publication Number Publication Date
CN114079691A true CN114079691A (en) 2022-02-22
CN114079691B CN114079691B (en) 2022-11-04

Family

ID=80282822

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011183311.8A Active CN114079691B (en) 2020-08-05 2020-10-29 Equipment identification method and related device

Country Status (1)

Country Link
CN (1) CN114079691B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115267652A (en) * 2022-07-26 2022-11-01 Oppo广东移动通信有限公司 Angle measuring method, foldable device, storage medium and computer program product
WO2024164929A1 (en) * 2023-02-10 2024-08-15 华为技术有限公司 Device control system, method, and electronic device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5690404A (en) * 1995-11-07 1997-11-25 Keller; William Hidden photograph storage device
CN105843054A (en) * 2016-03-22 2016-08-10 美的集团股份有限公司 Method for controlling household devices, intelligent household system and mobile device
CN106569409A (en) * 2016-10-13 2017-04-19 杭州鸿雁电器有限公司 Graph capturing based household equipment control system, device and method
CN111083379A (en) * 2019-12-31 2020-04-28 维沃移动通信(杭州)有限公司 Shooting method and electronic equipment
CN111262763A (en) * 2020-04-02 2020-06-09 深圳市晶讯软件通讯技术有限公司 Intelligent household equipment control system and method based on live-action picture
CN111432331A (en) * 2020-03-30 2020-07-17 华为技术有限公司 Wireless connection method, device and terminal equipment

Similar Documents

Publication Publication Date Title
WO2020238871A1 (en) Screen projection method and system and related apparatus
CN113645351B (en) Application interface interaction method, electronic device and computer-readable storage medium
CN113542839B (en) Screen projection method of electronic equipment and electronic equipment
WO2022028537A1 (en) Device recognition method and related apparatus
WO2019072178A1 (en) Method for processing notification, and electronic device
CN113961157B (en) Display interaction system, display method and equipment
CN113496426A (en) Service recommendation method, electronic device and system
WO2020238759A1 (en) Interface display method and electronic device
CN113794796B (en) Screen projection method and electronic equipment
CN112383664B (en) Device control method, first terminal device, second terminal device and computer readable storage medium
CN112130788A (en) Content sharing method and device
WO2022042769A2 (en) Multi-screen interaction system and method, apparatus, and medium
CN113452945A (en) Method and device for sharing application interface, electronic equipment and readable storage medium
JP2023500656A (en) DISPLAY METHOD AND ELECTRONIC DEVICE
WO2022089122A1 (en) Screen projection method for application window and electronic devices
CN114356195B (en) File transmission method and related equipment
CN113448658A (en) Screen capture processing method, graphical user interface and terminal
CN113746961A (en) Display control method, electronic device, and computer-readable storage medium
WO2024045801A1 (en) Method for screenshotting, and electronic device, medium and program product
CN113921002A (en) Equipment control method and related device
CN114079691B (en) Equipment identification method and related device
CN115016697A (en) Screen projection method, computer device, readable storage medium, and program product
WO2021052388A1 (en) Video communication method and video communication apparatus
WO2023093778A1 (en) Screenshot capture method and related apparatus
CN114584817B (en) Screen projection method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant